<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Stefan Stöhr</title>
    <description>The latest articles on Forem by Stefan Stöhr (@xstefan).</description>
    <link>https://forem.com/xstefan</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F385338%2Fa93ef29a-fe8f-4c05-af80-decb6dd0f295.jpeg</url>
      <title>Forem: Stefan Stöhr</title>
      <link>https://forem.com/xstefan</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/xstefan"/>
    <language>en</language>
    <item>
      <title>How to generate images with Azure DALL·E 2 in JavaScript</title>
      <dc:creator>Stefan Stöhr</dc:creator>
      <pubDate>Wed, 16 Aug 2023 08:45:02 +0000</pubDate>
      <link>https://forem.com/xstefan/how-generate-images-with-azure-dalle-2-in-javascript-65i</link>
      <guid>https://forem.com/xstefan/how-generate-images-with-azure-dalle-2-in-javascript-65i</guid>
      <description>&lt;p&gt;If you're a JavaScript developer eager to access a DALL·E 2 instance on Microsoft Azure, here's a brief workaround until the official OpenAI Node Library supports it.&lt;/p&gt;

&lt;p&gt;I figured out the URLs by reverse engineering the DALL·E playground (Preview) in Azure AI Studio.&lt;/p&gt;

&lt;p&gt;This is definitely not a proper solution 😅, but it should give you an idea of how to do it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;setTimeout&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;sleep&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node:timers/promises&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;baseUrl&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https://{YOUR-AZURE-INSTANCE-NAME}.openai.azure.com/openai&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;headers&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;API-Key&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;AZURE_OPENAI_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;application/json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;requestBody&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;your dall-e 2 prompt&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;512x512&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;n&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;baseUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/images/generations:submit?api-version=2023-06-01-preview`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;POST&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="nx"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;requestBody&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// returns { "id": "xxx", "status": "notRunning" }&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;initJob&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;jobId&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;initJob&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="cm"&gt;/**
 * the image generation may take a while,
 * so you need to poll the API for its status at intervals.
 * In this case I try 5 times before giving up.
 */&lt;/span&gt;
&lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="nx"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="c1"&gt;// Wait 1.5 seconds after a request&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;sleep&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1500&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fetch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;baseUrl&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/operations/images/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;jobId&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;?api-version=2023-06-01-preview`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;method&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;GET&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="nx"&gt;headers&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;

  &lt;span class="cm"&gt;/**
   * either returns
   * { "created": 1234567, "id": "x-x-x-x-x", "status": "running" }
   * 
   * or
   * 
   * {
   *   "created": 1234567, "expires": 7654321, "id": "x-x-x-x-x",
   *   "status": "succeeded",
   *   "result": {
   *     "created": 1234567,
   *     "data": [
   *       { "url": "https://pathtotheimage" }
   *     ]
   *   }
   * }
   */&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;response&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;json&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

  &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;succeeded&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;info&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;DALL-E 2 image&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;job&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;

    &lt;span class="c1"&gt;// exit the for-loop early since we have what we wanted&lt;/span&gt;
    &lt;span class="k"&gt;break&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And that's it - hope this helps 🙂&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>node</category>
      <category>openai</category>
    </item>
    <item>
      <title>Building a “Follow Light” with Arduino: Part 2</title>
      <dc:creator>Stefan Stöhr</dc:creator>
      <pubDate>Fri, 10 Jun 2022 14:13:24 +0000</pubDate>
      <link>https://forem.com/studio_m_song/building-a-follow-light-with-arduino-part-2-lkm</link>
      <guid>https://forem.com/studio_m_song/building-a-follow-light-with-arduino-part-2-lkm</guid>
      <description>&lt;p&gt;It's been a while since we started our &lt;a href="https://dev.to/s2engineers/building-a-follow-light-with-arduino-4127"&gt;Follow Light Arduino Project&lt;/a&gt; (and it's still not installed in our corridor 😳), but I recently stumbled upon this awesome online simulator called &lt;strong&gt;&lt;a href="https://wokwi.com"&gt;Wokwi&lt;/a&gt;&lt;/strong&gt;!&lt;/p&gt;

&lt;p&gt;So I decided to tidy up our codebase a bit and put it into action there. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_HLnlwUx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3e9cszb3m9pjbk8g5foe.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_HLnlwUx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3e9cszb3m9pjbk8g5foe.gif" alt="What the simulator looks like" width="800" height="388"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Wokwi&lt;/strong&gt; doesn't support the &lt;strong&gt;TFmini Plus LiDAR&lt;/strong&gt; sensor, but it does have the &lt;strong&gt;HC-SR04&lt;/strong&gt; Ultrasonic distance sensor.&lt;/p&gt;

&lt;p&gt;And there is a (hidden) &lt;a href="https://wokwi.com/arduino/libraries/Adafruit_NeoPixel/RGBWstrandtest"&gt;Wokwi Neo Pixel Canvas&lt;/a&gt; component, which acts like an &lt;strong&gt;LED lightstrip&lt;/strong&gt;. You can only add it by editing the &lt;code&gt;diagram.json&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You can find our Follow Light simulation here:&lt;/strong&gt; &lt;a href="https://wokwi.com/projects/334003558896632403"&gt;https://wokwi.com/projects/334003558896632403&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just run the simulator and click the HC-SR04 component to change the detected distance.&lt;/p&gt;
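
&lt;p&gt;For reference, adding the canvas boils down to an extra entry in the parts array of &lt;code&gt;diagram.json&lt;/code&gt;. The snippet below is only an illustration from memory; check the exact part name and attributes in the Wokwi docs or in our linked project:&lt;/p&gt;

```json
{
  "type": "wokwi-neopixel-canvas",
  "id": "canvas1",
  "top": 20,
  "left": 20,
  "attrs": { "rows": "1", "cols": "60" }
}
```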

</description>
      <category>arduino</category>
      <category>iot</category>
      <category>microcontroller</category>
    </item>
    <item>
      <title>How do you handle Code Reviews?</title>
      <dc:creator>Stefan Stöhr</dc:creator>
      <pubDate>Wed, 28 Jul 2021 07:36:29 +0000</pubDate>
      <link>https://forem.com/studio_m_song/how-do-you-handle-code-reviews-40gh</link>
      <guid>https://forem.com/studio_m_song/how-do-you-handle-code-reviews-40gh</guid>
      <description>&lt;p&gt;That’s the question I asked my colleagues and the result was, well … not so surprising.&lt;/p&gt;

&lt;p&gt;Many teams have a &lt;strong&gt;chat channel&lt;/strong&gt; where they post the link to their merge request.&lt;/p&gt;

&lt;p&gt;If someone does a review for an MR, they add an &lt;code&gt;eyes&lt;/code&gt; 👀 emoji to the message; if it's approved, they add a &lt;code&gt;checkmark&lt;/code&gt; ✅; and if there are questions, they add a &lt;code&gt;speech bubble&lt;/code&gt; 💬 or &lt;code&gt;cross mark&lt;/code&gt; ❌.&lt;/p&gt;

&lt;p&gt;Other teams only &lt;strong&gt;assign people directly&lt;/strong&gt; in GitHub or GitLab and the ticket system.&lt;/p&gt;

&lt;p&gt;And some do a mix of both (like my team). If &lt;strong&gt;nobody reacts&lt;/strong&gt; to an MR in our chat, we &lt;strong&gt;pick one&lt;/strong&gt; 🤗&lt;/p&gt;

&lt;p&gt;What is interesting: in some teams the &lt;strong&gt;reviewer merges&lt;/strong&gt; the MR, in others it's the &lt;strong&gt;person who wrote the code&lt;/strong&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Pressing the merge button is a success experience and we wanted to leave that to the person who wrote the code&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;But you could also say: whoever merges the code is responsible for breaking the pipeline 😅&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So how are you doing it?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Message in a channel? Assigning your Team Lead? Or something completely different (drawing straws)? Let me know in the comments.&lt;/p&gt;

</description>
      <category>codereview</category>
      <category>productivity</category>
      <category>discuss</category>
    </item>
    <item>
      <title>Building a “Follow Light” with Arduino</title>
      <dc:creator>Stefan Stöhr</dc:creator>
      <pubDate>Wed, 02 Dec 2020 15:01:20 +0000</pubDate>
      <link>https://forem.com/studio_m_song/building-a-follow-light-with-arduino-4127</link>
      <guid>https://forem.com/studio_m_song/building-a-follow-light-with-arduino-4127</guid>
      <description>&lt;p&gt;At my workplace in Frankfurt/Main, I walk through a long black corridor to get to my desk, which is kinda boring.&lt;/p&gt;

&lt;p&gt;So some colleagues and I had an idea: How about making it a bit more colorful by adding some LEDs?&lt;/p&gt;

&lt;p&gt;And when you walk through, we &lt;strong&gt;light the LEDs closest to your position&lt;/strong&gt;, so it will look like it follows you - a follow light.&lt;/p&gt;

&lt;h2&gt;Measurements and components&lt;/h2&gt;

&lt;p&gt;We started by writing down a &lt;strong&gt;concept paper&lt;/strong&gt;. &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What do we want to achieve?&lt;/li&gt;
&lt;li&gt;How can we achieve that?&lt;/li&gt;
&lt;li&gt;What are the dimensions of the corridor? &lt;/li&gt;
&lt;li&gt;Which parts would we need? &lt;/li&gt;
&lt;li&gt;How much would it cost?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5filqhgh1eb5zez5c6n2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F5filqhgh1eb5zez5c6n2.jpg" alt="A photo of said corridor and a rough sketch of its dimensions"&gt;&lt;/a&gt;&lt;/p&gt;
The corridor is around 8m long and connects our welcome area with the office.



&lt;p&gt;We figured out we would need a controller, two LED strips and some kind of sensor, which can "see" you walking through. &lt;/p&gt;

&lt;p&gt;So we discussed some hardware options and possible &lt;strong&gt;solutions&lt;/strong&gt; and debated the pros and cons.&lt;/p&gt;

&lt;h3&gt;Microcontroller&lt;/h3&gt;

&lt;p&gt;We quickly agreed on an &lt;strong&gt;Arduino&lt;/strong&gt;. It's a well-known platform, and a Raspberry Pi felt like overkill after we decided against the Xbox Kinect sensor. But more about that later.&lt;/p&gt;

&lt;h3&gt;LED lightstrip&lt;/h3&gt;

&lt;p&gt;Every single LED of the lightstrip needs to be &lt;strong&gt;addressable&lt;/strong&gt;, so we can control specific LEDs. And we require a really long LED strip.&lt;/p&gt;

&lt;p&gt;To make it easier to work with the &lt;a href="https://github.com/adafruit/Adafruit_NeoPixel" rel="noopener noreferrer"&gt;Adafruit NeoPixel Library&lt;/a&gt;, we decided on two 5m "WS2812b-like" lightstrips with 300 LEDs each.&lt;/p&gt;

&lt;p&gt;But that many LEDs require a lot of power.&lt;/p&gt;

&lt;p&gt;One LED draws ~0.06 amperes, so we would need &lt;code&gt;0.06 * 600&lt;/code&gt; = 36 amperes if we want to light all LEDs in bright white.&lt;/p&gt;

&lt;p&gt;We came to the conclusion that the Arduino alone would not be able to deliver enough power.&lt;/p&gt;

&lt;p&gt;So we added an &lt;strong&gt;extra power supply&lt;/strong&gt; (5V10A) to our setup.&lt;/p&gt;

&lt;p&gt;This is not enough to light all LEDs in bright white, but bright white LEDs are really harsh on your eyes, which means we have to dim them anyway (=requires less power).&lt;/p&gt;

&lt;p&gt;We're also using colors (=requires less power) and we won't use all LEDs at the same time (=you guessed it: requires less power).&lt;/p&gt;
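
&lt;p&gt;As a rough sanity check, the current budget can be sketched in a few lines of JavaScript (the 60mA-per-LED figure is the usual worst-case estimate for WS2812-style LEDs at full white; the exact numbers depend on your strip):&lt;/p&gt;

```javascript
// Rough current-budget check for the Follow Light setup
const ledCount = 2 * 300;  // two 5m strips with 300 LEDs each
const ampsPerLed = 0.06;   // ~60mA per LED at full bright white (worst case)

// Worst case: every LED lit in bright white at once
const fullWhiteAmps = ledCount * ampsPerLed; // 36 A

// What the 5V/10A supply leaves per LED if all of them were lit
const budgetMaPerLed = (10 / ledCount) * 1000; // ~16.7 mA

console.log(fullWhiteAmps, budgetMaPerLed.toFixed(1));
```

&lt;p&gt;Since we dim the LEDs, use colors, and light only a few at a time, staying well under the 10A budget is realistic.&lt;/p&gt;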

&lt;h3&gt;Distance Sensor&lt;/h3&gt;

&lt;p&gt;Each sensor we came up with has a different &lt;strong&gt;Field of View&lt;/strong&gt; (FOV for short; basically the angle of how much area it can cover) and max distance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Xbox Kinect:&lt;/strong&gt; Huge FOV, can detect multiple people, expensive, but just medium range (~5m); the power supply is a concern and it doesn’t work well with just an Arduino.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Photoelectric Sensor:&lt;/strong&gt; Cheap, small FOV, short range, we would require quite a lot of them for tracking someone and also wiring is an issue.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ultrasonic Distance Sensor:&lt;/strong&gt; Cheap, easy setup, medium FOV, but short range (~2m).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;LiDAR Sensor:&lt;/strong&gt; Long range (~10m), easy setup, but a small FOV.&lt;/p&gt;

&lt;p&gt;Our corridor is quite long and narrow, so the trackable distance is more important than the FOV.&lt;/p&gt;

&lt;p&gt;In the end we decided in favor of the &lt;strong&gt;LiDAR Sensor&lt;/strong&gt;. This way the power supply isn’t an issue, we need fewer wires and it’s comparably cheap. (Later, we also tested the ultrasonic sensor, since one was part of the Arduino Starter Kit.)&lt;/p&gt;

&lt;h3&gt;Our Hardware&lt;/h3&gt;

&lt;p&gt;We ended up with the following components and a price tag of ~225€.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Arduino UNO R3 (together with a Starter Kit)&lt;/li&gt;
&lt;li&gt;SK6812 IC LED Strips (like WS2812B)&lt;/li&gt;
&lt;li&gt;Benewake TFmini Plus LiDAR&lt;/li&gt;
&lt;li&gt;Power Supply 5V10A (for the LED strips)&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;Creating a Prototype&lt;/h2&gt;

&lt;p&gt;After buying the hardware we started exploring.&lt;/p&gt;

&lt;p&gt;Besides the Arduino IDE, we installed the &lt;strong&gt;Arduino CLI&lt;/strong&gt; so we can compile and upload our code from the command line.&lt;/p&gt;

&lt;h3&gt;Setup&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$ &lt;/span&gt;brew update
&lt;span class="nv"&gt;$ &lt;/span&gt;brew &lt;span class="nb"&gt;install &lt;/span&gt;arduino-cli
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli core update-index

&lt;span class="c"&gt;# Install board core&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli core &lt;span class="nb"&gt;install &lt;/span&gt;arduino:avr

&lt;span class="c"&gt;# Install libs&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli lib &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="s2"&gt;"Adafruit NeoPixel"&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli lib &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="s2"&gt;"TFMPlus"&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Compile and upload&lt;/h3&gt;

&lt;p&gt;After connecting the Arduino via USB we can compile and upload our code with these commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Show connected boards with port&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli board list

&lt;span class="c"&gt;# Compile&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli compile &lt;span class="nt"&gt;--fqbn&lt;/span&gt; arduino:avr:uno followlight

&lt;span class="c"&gt;# Upload (replace /dev/cu.usbmodem144101 with your port)&lt;/span&gt;
&lt;span class="nv"&gt;$ &lt;/span&gt;arduino-cli upload &lt;span class="nt"&gt;--port&lt;/span&gt; /dev/cu.usbmodem144101 &lt;span class="nt"&gt;--fqbn&lt;/span&gt; arduino:avr:uno followlight
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;Getting each component to work&lt;/h3&gt;

&lt;p&gt;Before screwing everything together, we needed to learn and understand how we get the &lt;strong&gt;independent components&lt;/strong&gt; to work.&lt;/p&gt;

&lt;p&gt;There are quite a lot of tutorials on how to get the LED strip running, but &lt;a href="https://starthardware.org/viele-leds-mit-arduino-steuern-ws2812/" rel="noopener noreferrer"&gt;this one (German)&lt;/a&gt; helped us a lot.&lt;/p&gt;

&lt;p&gt;The most important learnings for us:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Make sure you have a wire connecting the Arduino &lt;strong&gt;ground&lt;/strong&gt; (GND) with the LED power supply.&lt;/li&gt;
&lt;li&gt;Add a &lt;strong&gt;resistor&lt;/strong&gt; of 300 to 500 Ohm between the Arduino data pin and the LED data pin. This protects the first LED's data input from voltage spikes that could damage it.&lt;/li&gt;
&lt;li&gt;It's recommended to add a &lt;strong&gt;capacitor&lt;/strong&gt; of 1000 microfarad to support the power supply of your LED strip.&lt;/li&gt;
&lt;li&gt;You should get some &lt;strong&gt;DC power connectors&lt;/strong&gt; with screw terminals to connect the lightstrip wires to the power supply, so you won't need to solder them.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Floog5cibprva95n4pnun.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Floog5cibprva95n4pnun.jpg" alt="Photo showing the Arduino connected to one LED strip through a breadboard"&gt;&lt;/a&gt;&lt;/p&gt;
The red wire conducts current (+5V), the white wire is ground (-, GND) and the green wire is our data channel.



&lt;p&gt;Getting the LiDAR Sensor to work was a bit confusing since the documentation mentioned a &lt;strong&gt;blue wire, which is actually green&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Thanks to Bud Ryerson and his &lt;a href="https://github.com/budryerson/TFMini-Plus" rel="noopener noreferrer"&gt;Arduino Library for the Benewake TFMini-Plus Lidar sensor&lt;/a&gt; we now had a working sensor setup.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbtefts44awg15mtai1br.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fbtefts44awg15mtai1br.jpg" alt="Photo showing the TFmini Plus LiDAR sensor connected to the Arduino"&gt;&lt;/a&gt;&lt;/p&gt;
The red wire conducts current (+5V), the black wire is ground (-, GND), white is receiving (RXD) and green is transmitting (TXD)






&lt;h2&gt;Putting everything together&lt;/h2&gt;

&lt;p&gt;The logic of our first prototype looked somewhat like this.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="cp"&gt;#define FOLLOW_LIGHT_LENGTH 5 // LEDs we want to light at once
&lt;/span&gt;
&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;      &lt;span class="c1"&gt;// x sensor value in cm&lt;/span&gt;
&lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;xLed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;   &lt;span class="c1"&gt;// x position of active led&lt;/span&gt;

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;loop&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;getDistanceTFMiniPlus&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="c1"&gt;// calculate the led (300 LEDs / 500cm)&lt;/span&gt;
    &lt;span class="n"&gt;xLed&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;round&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mf"&gt;0.6&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;clear&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;fill&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Color&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;xLed&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;FOLLOW_LIGHT_LENGTH&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;pixels&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;show&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The TFmini Plus LiDAR sensor returns a value in centimeters, but the NeoPixel Library only addresses individual LEDs, so we have to map the centimeters to an LED index.&lt;/p&gt;

&lt;p&gt;We have 300 LEDs equally distributed over 500cm: &lt;code&gt;300 / 500&lt;/code&gt; = 0.6 LEDs per cm. Since we don't have half LEDs, we round that to an integer value.&lt;/p&gt;

&lt;p&gt;And that's our prototype:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fpdhn0x495wkq1po6n12k.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fpdhn0x495wkq1po6n12k.gif" alt="Video Gif showing a lightstrip controlled by a distance sensor"&gt;&lt;/a&gt;The GIF actually shows our setup with an HC-SR04 Ultrasonic Distance Sensor, but it works the same with our TFmini Plus LiDAR Sensor.&lt;/p&gt;




&lt;h2&gt;What's next?&lt;/h2&gt;

&lt;p&gt;Well, we want a smooth animation, this prototype has only one LED strip, and we still need to install it in our corridor.&lt;/p&gt;

&lt;p&gt;So watch out for a part 2 about our Follow Light.&lt;/p&gt;

</description>
      <category>arduino</category>
      <category>iot</category>
      <category>microcontroller</category>
    </item>
  </channel>
</rss>
