<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Benjamin Cabé</title>
    <description>The latest articles on Forem by Benjamin Cabé (@kartben).</description>
    <link>https://forem.com/kartben</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F242779%2Fade28dc6-be41-4213-9188-e536ec96e344.jpg</url>
      <title>Forem: Benjamin Cabé</title>
      <link>https://forem.com/kartben</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/kartben"/>
    <language>en</language>
    <item>
      <title>3 Free Simulation Tools to Work Around the Global Chip Shortage</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Thu, 17 Mar 2022 19:16:14 +0000</pubDate>
      <link>https://forem.com/kartben/3-free-simulation-tools-to-work-around-the-global-chip-shortage-j5n</link>
      <guid>https://forem.com/kartben/3-free-simulation-tools-to-work-around-the-global-chip-shortage-j5n</guid>
      <description>&lt;p&gt;There is something that’s been seriously bothering me ever since I started to work with embedded devices: &lt;strong&gt;trying out new things when you don’t have an actual device at hand is hard&lt;/strong&gt;!&lt;/p&gt;

&lt;p&gt;Over the years, I’ve had the following happen more often than I care to admit:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;💡 Hear about an interesting embedded tool or library;&lt;/li&gt;
&lt;li&gt;🧑‍💻 Check out the provided code samples;&lt;/li&gt;
&lt;li&gt;😔 Realize none of them work with any combination of the—dozens!—embedded boards and sensors I own;&lt;/li&gt;
&lt;li&gt;🚶‍♂️ Give up…&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;With the ongoing &lt;a href="https://en.wikipedia.org/wiki/2020%E2%80%93present_global_chip_shortage"&gt;global chip shortage&lt;/a&gt;, the problem is just so much worse these days. Even if I were to go through the hassle of ordering the required hardware, I would have to face incredibly long lead times—sometimes well over a year!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/Screenshot-2022-03-11-170537.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TVIdMs43--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/Screenshot-2022-03-11-170537.png" alt="" width="880" height="504"&gt;&lt;/a&gt;&lt;/p&gt;
Do I really want to wait a full year until I get started with my IoT/Embedded project idea?



&lt;p&gt;Luckily, there has been a lot of innovation in recent years around making it &lt;strong&gt;easier to simulate embedded hardware&lt;/strong&gt;, and I have found myself using the following tools on a regular basis.&lt;/p&gt;

&lt;p&gt;For each tool, I will briefly explain what it does and what I like about it, and I will include a link to a quick demo so you can see first-hand how much time it can save you!&lt;/p&gt;




&lt;p&gt;&lt;a href="https://renode.io/"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--e-Rt6Quz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/renode-logo-768x116-1-300x45.png" alt="" width="300" height="45"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Renode
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://renode.io/"&gt;&lt;strong&gt;Renode&lt;/strong&gt;&lt;/a&gt; is an open-source framework that allows you to run, debug and test unmodified embedded software right from your PC.&lt;/p&gt;

&lt;p&gt;Out of the box, Renode supports a wide variety of embedded &lt;a href="https://renode.readthedocs.io/en/latest/introduction/supported-boards.html#"&gt;boards&lt;/a&gt; and &lt;a href="https://renode.readthedocs.io/en/latest/introduction/supported-boards.html#supported-peripherals"&gt;peripherals&lt;/a&gt;, with more being added regularly. If the board or peripheral you need is not in the list, there is a good chance it can be added with minimal work. Renode has a really elegant and extensible mechanism to &lt;a href="https://renode.readthedocs.io/en/latest/advanced/platform_description_format.html"&gt;describe platforms&lt;/a&gt;, as well as to &lt;a href="https://renode.readthedocs.io/en/latest/advanced/writing-peripherals.html"&gt;model peripherals&lt;/a&gt;.&lt;/p&gt;
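&lt;p&gt;To give you an idea of the format, a platform description is a simple text file that lists peripherals and where they live on the system bus. The snippet below is a purely hypothetical fragment—the peripheral types, addresses, and interrupt numbers are illustrative, not taken from an actual board:&lt;/p&gt;

```
// hypothetical fragment of a Renode .repl platform description:
// a UART registered on the system bus, wired to interrupt 38 of the NVIC
uart2: UART.STM32_UART @ sysbus 0x40004400
    -&gt; nvic@38

// an LED connected to pin 7 of GPIO port B
userLed: Miscellaneous.LED @ gpioPortB 7
```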

&lt;p&gt;I am using Renode to drastically simplify and speed up my “inner dev loop” when I build Azure RTOS applications. It can emulate pretty complex peripherals such as LCD touchscreens, which means I can even test my GUIX applications!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/guix-demo.gif"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B-h3IBvC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/guix-demo.gif" alt="" width="800" height="630"&gt;&lt;/a&gt;&lt;/p&gt;
Azure RTOS GUIX Home Automation Demo running on a simulated STM32F746G-DISCO board in Renode.



&lt;p&gt;I have to give extra bonus points to Renode for &lt;a href="https://renode.readthedocs.io/en/latest/debugging/gdb.html"&gt;allowing debugging&lt;/a&gt; of the simulated code, as well as providing a &lt;a href="https://renode.readthedocs.io/en/latest/introduction/testing.html"&gt;complete framework for automated testing&lt;/a&gt;.&lt;/p&gt;
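&lt;p&gt;For instance, attaching GDB to a simulated machine boils down to a couple of commands—the machine name, port number, and ELF file name below are illustrative:&lt;/p&gt;

```
# in the Renode monitor: start a GDB server for the simulated machine
(machine-0) machine StartGdbServer 3333

# then, from a regular terminal, attach your cross-debugger
$ arm-none-eabi-gdb app.elf
(gdb) target remote :3333
(gdb) monitor start
```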

&lt;p&gt;I have assembled in the &lt;a href="https://github.com/kartben/azure-rtos-renode-samples"&gt;GitHub repository&lt;/a&gt; below some examples of Azure RTOS applications running on Renode, including an example showing how to use the testing framework mentioned above.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--566lAguM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/kartben"&gt;
        kartben
      &lt;/a&gt; / &lt;a href="https://github.com/kartben/azure-rtos-renode-samples"&gt;
        azure-rtos-renode-samples
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;h1&gt;
Tests for Azure RTOS apps emulated in the Renode framework&lt;/h1&gt;
&lt;p&gt;This repository contains a few samples showcasing Azure RTOS-based applications simulated and/or tested using the Renode framework.&lt;/p&gt;
&lt;blockquote&gt;
&lt;p&gt;⚠️ Support for real-time operating systems on Arm chips in Renode has been vastly improved since the Renode 1.12 release.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;You will need to use a nightly build&lt;/strong&gt; until Renode 1.13 is officially released. You can download the latest nightly build at &lt;a href="https://dl.antmicro.com/projects/renode/builds/?P=*latest*" rel="nofollow"&gt;https://dl.antmicro.com/projects/renode/builds/?P=&lt;em&gt;latest&lt;/em&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;
&lt;h2&gt;
Azure RTOS ThreadX demo&lt;/h2&gt;
&lt;p&gt;You can launch the Azure RTOS ThreadX "Hello World" demo for the STM32F746G Discovery kit directly from the Renode REPL by issuing the following command:&lt;/p&gt;
&lt;div class="snippet-clipboard-content position-relative overflow-auto"&gt;&lt;pre&gt;&lt;code&gt;s @https://raw.githubusercontent.com/kartben/azure-rtos-renode-samples/master/stm32f746_azure_rtos_guix_home_automation.resc
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
&lt;p&gt;The application binary will be downloaded automatically, and the UART monitoring window will show the serial traces produced by the application.&lt;/p&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer" href="https://github.com/kartben/azure-rtos-renode-samplesassets/threadx-demo.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--j5POcdBh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/kartben/azure-rtos-renode-samplesassets/threadx-demo.png" alt="Azure RTOS ThreadX Demo running in Renode"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;
Azure RTOS GUIX Home Automation demo&lt;/h2&gt;
&lt;p&gt;You can launch the Azure RTOS GUIX Home Automation demo for the STM32F746G Discovery kit directly from the Renode REPL by issuing the following…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/kartben/azure-rtos-renode-samples"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;There is a lot more that can be done with Renode (multi-node simulation, networking, …) so I will probably write some more about it in the future.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://wokwi.com"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lnpQbxIh--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/wokwi-300x71.png" alt="" width="300" height="71"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Wokwi
&lt;/h2&gt;

&lt;p&gt;It is becoming increasingly common to &lt;strong&gt;develop right from a web browser&lt;/strong&gt;. While it is a significant paradigm shift for long-time embedded developers, it also brings tons of benefits and makes it much easier to set up reproducible (and versioned!) development environments and toolchains. In fact, I already spoke about this a while ago in &lt;a href="https://www.youtube.com/watch?v=-enIM4x-KPA"&gt;this video&lt;/a&gt; 😊.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://wokwi.com/"&gt;&lt;strong&gt;Wokwi&lt;/strong&gt;&lt;/a&gt;is a web-based hardware simulation environment that can be used to simulate a wide variety of micro-controllers (Arduino Uno, ESP32, Raspberry Pi Pico, …) , electronic parts, sensors, and actuators.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ZcZfOrPW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image-1024x701.png" alt="" width="880" height="602"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;At first sight, Wokwi feels like a tool targeting only “hobbyists”, but it is in fact really powerful and so much more than a toy (for example, it, too, comes with &lt;a href="https://docs.wokwi.com/gdb-debugging"&gt;debugging support&lt;/a&gt;)! It comes with an online web IDE that resembles the Arduino IDE, but you can also bring your own pre-compiled ELF binary if you want to keep using your usual development environment.&lt;/p&gt;

&lt;p&gt;Beyond the fact that Wokwi “just works” and the fact that &lt;strong&gt;sharing a project with someone is as simple as giving them the URL to access it&lt;/strong&gt; (!), I really love the following Wokwi features:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image-3.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_ZW_u6fH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image-3.png" alt="" width="880" height="414"&gt;&lt;/a&gt;&lt;/p&gt;
Example of Wokwi’s Logic Analyzer capturing the I2C traffic between the MCU and an RTC (real-time clock) module. An external tool such as PulseView can then read and decode the captured traffic.



&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://docs.wokwi.com/guides/logic-analyzer"&gt;&lt;strong&gt;Logic analyzer&lt;/strong&gt;&lt;/a&gt; for capturing/debugging the various signals in your system (ex. I&lt;sup&gt;2&lt;/sup&gt;C or SPI traffic)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;a href="https://docs.wokwi.com/guides/esp32-wifi#viewing-wifi-traffic-with-wireshark"&gt;&lt;strong&gt;Network traffic capture&lt;/strong&gt;&lt;/a&gt; for analyzing Wi-Fi traffic in Wireshark. You can e.g. troubleshoot the connection to your IoT cloud, something that’s pretty cumbersome or even impossible to do when running on “real” hardware.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When you use Wokwi to simulate an ESP32 application, you can actually connect to a Wi-Fi network right from within the simulation. Your application may simply connect to the simulated “Wokwi-GUEST” WLAN to access the Internet. Pretty convenient for simulating your next IoT project, eh?&lt;/p&gt;

&lt;p&gt;To see how much easier things are when you don’t have to worry about the hardware, check out &lt;a href="https://wokwi.com/projects/322313026680128082"&gt;this Wokwi project&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://wokwi.com/projects/322313026680128082"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4vVWQKXb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/wokwi-azure-iot-central-1024x520.png" alt="" width="880" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It shows the &lt;a href="https://github.com/Azure/azure-sdk-for-c-arduino/tree/main/examples/Azure_IoT_Central_ESP32"&gt;official Azure IoT Central sample application for ESP32&lt;/a&gt;, running in a fully simulated environment. Getting the sample to run is really straightforward:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Create an Azure IoT Central application, if needed, and provision a new device, as per the &lt;a href="https://github.com/Azure/azure-sdk-for-c-arduino/tree/main/examples/Azure_IoT_Central_ESP32#create-the-iot-central-application"&gt;sample’s instructions&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Enter your Azure IoT Central and device information into the sample’s &lt;code&gt;iot_configs.h&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Run the simulation! The application will automatically get compiled by Wokwi, and then run within your browser.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;As the application starts, the simulated device will attach to the—simulated—wireless network, and will show up shortly thereafter in Azure IoT Central.&lt;/p&gt;

&lt;p&gt;You will be able to interact with the LEDs or the LCD display by sending commands from Azure IoT Central, as well as see the telemetry corresponding to the various sensors (you can control what “fake” temperature, acceleration data, etc. the sensors are reporting by clicking on them in the Web interface).&lt;/p&gt;




&lt;p&gt;&lt;a href="https://tinkercad.com/"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xRT4_ghH--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/1200px-Logo-tinkercad-wordmark.svg_-300x98.png" alt="" width="300" height="98"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  TinkerCAD
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image-4-1.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3UUtiEm0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2022/03/image-4-1-218x300.png" alt="" width="218" height="300"&gt;&lt;/a&gt;&lt;/p&gt;
TinkerCAD’s circuit simulator is the perfect companion to ensure you don’t fry the actual components of your next project.



&lt;p&gt;I discovered &lt;a href="https://www.tinkercad.com/"&gt;&lt;strong&gt;TinkerCAD&lt;/strong&gt;&lt;/a&gt;’s circuit simulation capabilities only recently. Compared to Renode and Wokwi, TinkerCAD gets you the closest to the “metal”. It helps validate your circuit down to making sure the wiring is 100% correct and that you do not risk frying a component due to a missing resistor!&lt;/p&gt;

&lt;p&gt;TinkerCAD lets you simulate Arduino UNO and micro:bit-based circuits, but I must admit that I haven’t spent a lot of time using it for actual projects just yet. The fact that it doesn’t allow you to simulate network communications makes it a bit impractical for IoT scenarios anyway. However, there are some great projects in the &lt;a href="https://www.tinkercad.com/things?type=circuits&amp;amp;view_mode=default"&gt;online gallery&lt;/a&gt; that should give you a sense of what the tool is capable of.&lt;/p&gt;




&lt;p&gt;I highly recommend you give these amazing (and did I mention &lt;em&gt;free&lt;/em&gt;?) tools a try.&lt;/p&gt;

&lt;p&gt;You should be able to get all the examples I shared up and running in no time, which I hope makes my point about how much of a time-saver these tools can be.&lt;/p&gt;

&lt;p&gt;I would also love to hear about other simulation/emulation tools out there that you may be using to speed up your embedded &amp;amp; IoT development workflow—just let me know in the comments!&lt;/p&gt;

</description>
      <category>embedded</category>
      <category>simulation</category>
      <category>iot</category>
    </item>
    <item>
      <title>How I Built a Connected Artificial Nose (and How You Can Too!)</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Tue, 03 Aug 2021 16:08:06 +0000</pubDate>
      <link>https://forem.com/azure/how-i-built-a-connected-artificial-nose-and-how-you-can-too-4oh3</link>
      <guid>https://forem.com/azure/how-i-built-a-connected-artificial-nose-and-how-you-can-too-4oh3</guid>
      <description>&lt;p&gt;Over the past few months, I have worked on a pretty cool project that some of you might have already heard about as it sort of went viral. I &lt;strong&gt;built a DIY, general-purpose, artificial nose&lt;/strong&gt; that can smell virtually anything you teach it to recognize!&lt;/p&gt;

&lt;p&gt;The artificial nose in action, smelling coffee ☕ and whiskey 🥃.&lt;/p&gt;

&lt;p&gt;It is powered by the &lt;a href="https://www.seeedstudio.com/Wio-Terminal-p-4509.html"&gt;&lt;strong&gt;Wio Terminal&lt;/strong&gt;&lt;/a&gt; (an Arduino-compatible prototyping platform), a super affordable &lt;strong&gt;electronic gas sensor&lt;/strong&gt;, and a &lt;strong&gt;TinyML&lt;/strong&gt; neural network that I trained using the free online tool &lt;a href="https://www.edgeimpulse.com/"&gt;&lt;strong&gt;Edge Impulse&lt;/strong&gt;&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The project was recently featured on the cover of &lt;strong&gt;Make: Magazine&lt;/strong&gt;, and I encourage you to &lt;a href="https://makezine.com/projects/second-sense-build-an-ai-smart-nose"&gt;check out the article&lt;/a&gt; I wrote for them before reading further.&lt;/p&gt;

&lt;p&gt;The Make: Magazine article covers a lot about &lt;em&gt;how&lt;/em&gt; you can build the artificial nose for yourself, so I want to use this blog post to dive deeper into why this project is so important to me. In particular, I want to share with you how it helped me &lt;strong&gt;understand more about AI&lt;/strong&gt; than I’d ever thought, and how I eventually ended up &lt;strong&gt;connecting the “nose” to an IoT platform&lt;/strong&gt; (namely, Azure IoT).&lt;/p&gt;

&lt;blockquote&gt;
&lt;ol&gt;
&lt;li&gt;Making Neural Networks Tangible&lt;/li&gt;
&lt;li&gt;Turning the nose into an IoT device&lt;/li&gt;
&lt;li&gt;Connecting the Artificial Nose to Azure IoT Central&lt;/li&gt;
&lt;li&gt;Digital Twins meet virtual senses&lt;/li&gt;
&lt;li&gt;Get started today!&lt;/li&gt;
&lt;/ol&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Making Neural Networks Tangible
&lt;/h2&gt;

&lt;p&gt;Despite my passion for all things software, Machine Learning (ML) has always been a field that’s eluded me, perhaps because it tends to be too abstract and too maths-heavy for my &lt;strong&gt;visual brain&lt;/strong&gt;?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--8MVBNPWj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1060/1%2AVAjYygFUinnygIx9eVCrQQ.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8MVBNPWj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://miro.medium.com/max/1060/1%2AVAjYygFUinnygIx9eVCrQQ.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
Sample images from the MNIST test dataset.



&lt;p&gt;Speaking of &lt;em&gt;visual&lt;/em&gt; things, every time I have opened a book promising to be an introduction to ML, the introductory examples have mostly involved &lt;strong&gt;image classification&lt;/strong&gt; (ex. automatically recognizing handwritten digits from the &lt;a href="https://en.wikipedia.org/wiki/MNIST_database"&gt;MNIST database&lt;/a&gt;). And, sadly, those innocent pixels would be anything but &lt;em&gt;visual&lt;/em&gt; to me, as they would quickly turn into abstract matrices.&lt;/p&gt;

&lt;p&gt;So when I started to think of implementing an artificial nose, I didn’t initially approach it as a Machine Learning problem. Instead, &lt;strong&gt;I tried to use my intuition: “What characterizes a smell?”&lt;/strong&gt;. My intuition was telling me that somehow I needed to establish a correlation between the concentrations of the various gases measured by the gas sensor (carbon monoxide, ethyl alcohol, etc.) and the associated smell. However, a single reading of the gas concentrations at a given point in time would probably not cut it: how would it tell the difference between a really strong alcohol smell and one that was maybe more volatile?&lt;/p&gt;

&lt;p&gt;Quickly, I realized that &lt;strong&gt;acquiring a couple seconds of sensor data&lt;/strong&gt; would probably be just enough to “capture” the olfactory fingerprint of each smell. With these few seconds of sensor data, I could &lt;strong&gt;look at the variation&lt;/strong&gt; (min, max, average, etc.) of the concentration of each gas, and this would hopefully characterize each smell.&lt;/p&gt;
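&lt;p&gt;As a rough illustration of this idea, here is a minimal sketch—in plain Python, with made-up channel names and sample values—of how a short window of gas-sensor readings can be turned into a min/max/average “olfactory fingerprint”:&lt;/p&gt;

```python
# Hypothetical sketch: turn a ~1.5 s window of gas-sensor readings into
# a flat "olfactory fingerprint" of min/max/average features per gas.
# Channel names and sample values are made up for illustration.

def extract_features(window):
    """window maps a gas name to the list of concentration samples
    captured over the acquisition window."""
    features = []
    for gas in sorted(window):  # sort for a stable feature ordering
        samples = window[gas]
        features.append(min(samples))
        features.append(max(samples))
        features.append(sum(samples) / len(samples))  # average
    return features

window = {
    "carbon_monoxide": [1.2, 1.4, 1.3, 1.5],
    "ethyl_alcohol":   [0.8, 2.1, 3.0, 2.7],
}
fingerprint = extract_features(window)
print([round(f, 2) for f in fingerprint])  # [1.2, 1.5, 1.35, 0.8, 3.0, 2.15]
```

&lt;p&gt;This is essentially what a feature-extraction step gives you: a fixed-size vector that a classifier can then consume.&lt;/p&gt;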

&lt;p&gt;It turns out that once I had extracted those characteristics—something that I can now refer to as &lt;strong&gt;feature extraction&lt;/strong&gt;, like the AI grown-ups, and which was really easy to do using the Edge Impulse tool suite—all that was left was to effectively establish the correlation between them and the expected smells. However, I didn’t really know what kind of &lt;strong&gt;neural network architecture&lt;/strong&gt; I would need, let alone what a neural network was anyway. So, once again, I leveraged the Edge Impulse environment.&lt;/p&gt;

&lt;p&gt;It turns out the kind of &lt;strong&gt;classification problem&lt;/strong&gt; I was looking at was reasonably simple: given the minimum/maximum/average/… concentration of each gas over a given time period (I found 1.5s to be the sweet spot), what is the predicted smell? And one simple way to “solve” that equation is to use a so-called &lt;strong&gt;fully-connected neural network&lt;/strong&gt;, like the one you see below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2021/07/nn-3.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TrKhhf_A--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2021/07/nn-3-1024x611.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;During the &lt;strong&gt;training phase&lt;/strong&gt;, the training data represents the ground truth (ex. “&lt;em&gt;This&lt;/em&gt; is 100% coffee!”) and is used to tweak the parameters of the equation—the weights of the neurons—based on how much each characteristic (ex. the average concentration of NO2) contributes to each smell.&lt;/p&gt;

&lt;p&gt;Once the model has been trained, and during the &lt;strong&gt;inference phase&lt;/strong&gt;, a given input/olfactory fingerprint entering the network (left-hand side of the diagram) ends up being “routed” to the appropriate output bucket (right-hand side), effectively giving a prediction of what smell it corresponds to.&lt;/p&gt;
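&lt;p&gt;To make that inference step a bit more concrete, here is a tiny, self-contained sketch of a one-layer fully-connected classifier. The weights, biases, and smell labels are made up for illustration—they are not taken from the actual trained model:&lt;/p&gt;

```python
# Minimal sketch of inference in a fully-connected classifier: the
# olfactory fingerprint is multiplied through a layer of weights, and
# softmax turns the resulting scores into per-smell probabilities.
# All numbers below are illustrative, not an actual trained model.
import math

def dense_softmax(features, weights, biases):
    # one raw score per smell: dot(features, row) + bias
    scores = [sum(f * w for f, w in zip(features, row)) + b
              for row, b in zip(weights, biases)]
    exps = [math.exp(s - max(scores)) for s in scores]  # stable softmax
    total = sum(exps)
    return [e / total for e in exps]

labels = ["ambient", "coffee", "whiskey"]
weights = [[0.1, -0.2, 0.0],   # one row of weights per output label
           [0.9, 0.4, -0.1],
           [-0.3, 0.8, 0.5]]
biases = [0.0, -0.2, 0.1]
fingerprint = [1.35, 2.15, 0.4]  # e.g. average gas concentrations

probs = dense_softmax(fingerprint, weights, biases)
print(max(zip(probs, labels)))  # highest-probability smell
```

&lt;p&gt;Training is then “just” the process of adjusting those weights and biases so that known fingerprints land in the right bucket.&lt;/p&gt;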

&lt;h4&gt;
  
  
  Building an actual nose
&lt;/h4&gt;

&lt;p&gt;When I initially shared my project on social media back in May last year, I quickly realized lots of people were interested in it.&lt;/p&gt;


&lt;blockquote class="ltag__twitter-tweet"&gt;
    &lt;div class="ltag__twitter-tweet__media ltag__twitter-tweet__media__two-pics"&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_NCJCGPJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/media/EXggS1PXQAA3-bp.jpg" alt="unknown tweet media content"&gt;
    &lt;/div&gt;

  &lt;div class="ltag__twitter-tweet__main"&gt;
    &lt;div class="ltag__twitter-tweet__header"&gt;
      &lt;img class="ltag__twitter-tweet__profile-image" src="https://res.cloudinary.com/practicaldev/image/fetch/s--yiyxkIG8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://pbs.twimg.com/profile_images/1374848691576713218/bZv67WCW_normal.jpg" alt="Benjamin Cabé profile image"&gt;
      &lt;div class="ltag__twitter-tweet__full-name"&gt;
        Benjamin Cabé
      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__username"&gt;
        &lt;a class="mentioned-user" href="https://dev.to/kartben"&gt;@kartben&lt;/a&gt;

      &lt;/div&gt;
      &lt;div class="ltag__twitter-tweet__twitter-logo"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ir1kO05j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-f95605061196010f91e64806688390eb1a4dbc9e913682e043eb8b1e06ca484f.svg" alt="twitter logo"&gt;
      &lt;/div&gt;
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__body"&gt;
      Long weekend hacking project: &lt;a href="https://twitter.com/hashtag/TinyML"&gt;#TinyML&lt;/a&gt; powered artificial nose that identifies booze, using &lt;a href="https://twitter.com/seeedstudio"&gt;@seeedstudio&lt;/a&gt;'s Wio Terminal and @edgeimpulse. I'm literally mind-blown by the accuracy: it can actually discern different kinds of rum or scotch. 
    &lt;/div&gt;
    &lt;div class="ltag__twitter-tweet__date"&gt;
16:12 - 08 May 2020
    &lt;/div&gt;


    &lt;div class="ltag__twitter-tweet__actions"&gt;
      &lt;a href="https://twitter.com/intent/tweet?in_reply_to=1258791793073815552" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fFnoeFxk--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-reply-action-238fe0a37991706a6880ed13941c3efd6b371e4aefe288fe8e0db85250708bc4.svg" alt="Twitter reply action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/retweet?tweet_id=1258791793073815552" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--k6dcrOn8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-retweet-action-632c83532a4e7de573c5c08dbb090ee18b348b13e2793175fea914827bc42046.svg" alt="Twitter retweet action"&gt;
      &lt;/a&gt;
      &lt;a href="https://twitter.com/intent/like?tweet_id=1258791793073815552" class="ltag__twitter-tweet__actions__button"&gt;
        &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SRQc9lOp--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/twitter-like-action-1ea89f4b87c7d37465b0eb78d51fcb7fe6c03a089805d7ea014ba71365be5171.svg" alt="Twitter like action"&gt;
      &lt;/a&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/blockquote&gt;


&lt;p&gt;This motivated me to go further and to turn my initial prototype into an &lt;strong&gt;actual nose&lt;/strong&gt;! I had never done that before, so I ended up teaching myself how to use 3D CAD software so that I could design an actual enclosure for my device. I picked &lt;a href="https://www.blender.org/"&gt;Blender&lt;/a&gt;—which I would &lt;em&gt;not&lt;/em&gt; recommend for pure CAD work, as there are better alternatives out there, ex. &lt;a href="https://www.tinkercad.com/"&gt;TinkerCAD&lt;/a&gt;—and 3D-printed the resulting plastic enclosure.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.thingiverse.com/thing:4493907"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Xghqb5_M--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2021/07/figure-B-nose-enclosure-on-thingiverse-931x1024.png" alt='A screen capture from the thingiverse.com website titled "Artificial Nose Enclosure" that show a blue 3D rendering of a nose.'&gt;&lt;/a&gt;The Nose Enclosure on Thingiverse.&lt;/p&gt;

&lt;h2&gt;
  
  
  Turning the nose into an IoT device
&lt;/h2&gt;

&lt;p&gt;An interesting aspect of TinyML is that it enables scenarios where your low-power, constrained, microcontroller-based equipment is &lt;strong&gt;completely autonomous&lt;/strong&gt; when it comes to performing machine learning inference (ex. guessing a smell). It is very powerful, as it means your &lt;strong&gt;sensor data never has to leave your device&lt;/strong&gt; and you don’t need to rely on any sort of cloud-based AI service. But on the other hand, it also means that your &lt;em&gt;smart&lt;/em&gt; device might not be so smart if it ends up living in its own echo chamber, right?&lt;/p&gt;

&lt;p&gt;At the heart of an IoT solution is often the “thing” itself, and it makes a lot of sense to design it to be as smart as possible, since there are many reasons why relying on any form of network communication or cloud-based processing is at best impractical, and sometimes plain impossible.&lt;/p&gt;

&lt;h3&gt;
  
  
  Connecting the Artificial Nose to Azure IoT Central
&lt;/h3&gt;

&lt;p&gt;The Artificial Nose is effectively an &lt;a href="https://docs.microsoft.com/en-us/azure/iot-develop/overview-iot-plug-and-play"&gt;IoT Plug and Play&lt;/a&gt; device.&lt;/p&gt;

&lt;p&gt;As soon as I was happy with how it performed at smelling things, and once I had completed the development of the graphical user interface, I used the Azure IoT SDK (and &lt;a href="https://dev.to/azure/connecting-the-wio-terminal-to-azure-iot-3o9n"&gt;some of the work&lt;/a&gt; I had done last year) to enable the nose to talk to the Azure IoT services.&lt;/p&gt;

&lt;p&gt;It means you can very easily connect the device to &lt;strong&gt;Azure IoT Central&lt;/strong&gt; (using the Wio Terminal’s Wi-Fi module), and get access to gas sensor telemetry in near real time, see what the device is smelling, etc.&lt;/p&gt;
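&lt;p&gt;For illustration only, the telemetry for this kind of scenario could be a simple JSON document along these lines—the property names below are hypothetical, as the actual names are defined by the device’s IoT Plug and Play model:&lt;/p&gt;

```python
# Hypothetical sketch of the kind of telemetry payload a device like
# the nose might publish. Property names are made up; the real schema
# is defined by the device's IoT Plug and Play model.
import json

def make_telemetry(smell, confidence, gas_readings):
    payload = {"smell": smell, "confidence": confidence}
    payload.update(gas_readings)  # one property per gas channel
    return json.dumps(payload)

msg = make_telemetry("coffee", 0.93, {"no2": 1.1, "ethanol": 2.4})
print(msg)
```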

&lt;p&gt;More importantly, you can automatically &lt;strong&gt;trigger rules&lt;/strong&gt; when, for example, a &lt;strong&gt;bad smell is detected&lt;/strong&gt;, therefore allowing the nose to be much smarter than if it were just a standalone, offline device.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2021/08/artificial_nose_iot_central.gif"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--SAdhG3Vq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2021/08/artificial_nose_iot_central.gif" alt=""&gt;&lt;/a&gt;Connecting the Artificial Nose to Azure IoT Central – Real-time telemetry.&lt;/p&gt;

&lt;p&gt;If you built the artificial nose for yourself—and I hope many of you will consider doing so!—here are the &lt;strong&gt;simple steps&lt;/strong&gt; for you &lt;strong&gt;to connect it to Azure IoT Central&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First, make sure that your Wio Terminal is running an up-to-date WiFi firmware by following &lt;a href="https://wiki.seeedstudio.com/Wio-Terminal-Network-Overview/"&gt;these instructions&lt;/a&gt;;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/iot-central/core/quick-deploy-iot-central#create-an-application"&gt;Create a new Azure IoT Central application&lt;/a&gt; (if you already have one you want to use, that works too!) ;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the &lt;strong&gt;Administration&lt;/strong&gt; section of the IoT Central application, look for the &lt;strong&gt;Device Connection&lt;/strong&gt; menu.&lt;/li&gt;
&lt;li&gt;Open the &lt;strong&gt;SAS-IoT-Devices&lt;/strong&gt; enrollment group and take note of the two credentials you will need to connect your AI nose(s): the &lt;strong&gt;ID Scope&lt;/strong&gt; and the &lt;strong&gt;SAS Primary Key&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Flash the Wio Terminal with the &lt;a href="https://github.com/kartben/artificial-nose/releases/latest"&gt;latest Artificial Nose firmware&lt;/a&gt; (or deploy your own custom build) ;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;While the Wio Terminal is powered, keep the three buttons (A, B, C) at the top pressed, and slide the reset button. The device should now be showing a black screen ;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;

&lt;p&gt;Connect to the Wio Terminal over serial and check that it’s running the configuration prompt by typing &lt;code&gt;help&lt;/code&gt;, which should show you the list of supported commands. Type the following commands to configure the WiFi connection and the Azure IoT credentials:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;set_wifissid &amp;lt;your_wifi_ssid&amp;gt;&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;set_wifipwd &amp;lt;your_wifi_password&amp;gt;&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;set_az_iotc &amp;lt;id_scope&amp;gt; &amp;lt;sas_primary_key&amp;gt; &amp;lt;device_id&amp;gt;&lt;/code&gt; (id_scope and sas_primary_key as per earlier, and device_id being the ID you want to give your device in Azure IoT Central)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Reset the Wio Terminal, and voila! You should now see a new device popping up in the &lt;strong&gt;Devices&lt;/strong&gt; section of your IoT Central application.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
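&lt;p&gt;If you have several noses to provision, the same configuration steps can be scripted over the serial port. Here is a minimal sketch in Python using the pyserial package—the port name, baud rate, and prompt behavior are assumptions, so adjust them for your setup:&lt;/p&gt;

```python
# Sketch: scripted provisioning of the Artificial Nose over serial.
# Assumes the device is already in configuration mode (black screen)
# and shows up as /dev/ttyACM0 -- both are assumptions to adjust.

def build_commands(ssid, password, id_scope, sas_key, device_id):
    """Build the three configuration commands expected by the firmware prompt."""
    return [
        f"set_wifissid {ssid}",
        f"set_wifipwd {password}",
        f"set_az_iotc {id_scope} {sas_key} {device_id}",
    ]

def provision(port="/dev/ttyACM0", **credentials):
    import serial  # pip install pyserial
    with serial.Serial(port, 115200, timeout=2) as s:
        for cmd in build_commands(**credentials):
            s.write((cmd + "\r\n").encode())
            # Echo whatever the configuration prompt prints back
            print(s.read_until(b"\n").decode(errors="replace"))
```

&lt;p&gt;Calling &lt;code&gt;provision(ssid="…", password="…", id_scope="…", sas_key="…", device_id="nose-01")&lt;/code&gt; would then push all the settings in one go.&lt;/p&gt;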

&lt;h2&gt;
  
  
  Digital Twins meet virtual senses
&lt;/h2&gt;

&lt;p&gt;As I mentioned above, having the &lt;em&gt;nose&lt;/em&gt; talk to an IoT platform enables scenarios where, for example, you trigger an alert when a bad smell is picked up. But what is a bad smell anyway? This might depend on a lot of different factors, just like the final destination for the actual alert might be highly dynamic.&lt;/p&gt;

&lt;p&gt;Let me try to illustrate this with an example of a &lt;strong&gt;real estate cleaning company&lt;/strong&gt; in charge of buildings all around the city of Chicago. Their information system already allows them to &lt;strong&gt;keep track of their personnel and associated cleaning schedules&lt;/strong&gt;, but in a pretty &lt;strong&gt;static way&lt;/strong&gt;: cleaning staff go to their assigned location once a day, no matter what. From time to time, it turns out that the location doesn’t really require urgent cleaning (hello, COVID-19 and empty office spaces!), in which case the cleaning staff would have been better off going to a place that actually required servicing.&lt;/p&gt;

&lt;p&gt;Beyond the apparent buzzword, the concept of &lt;strong&gt;Digital Twins&lt;/strong&gt; consists of nothing more than &lt;strong&gt;augmenting the information system&lt;/strong&gt; (staff directory, building inventory, cleaning schedules, etc.) and overall knowledge graph of the cleaning company with entities that correspond to physical, connected assets.&lt;/p&gt;

&lt;p&gt;With that in mind, a mere “it doesn’t smell so good in here” signal sent by a sniffing device sitting in an office building can immediately be contextualized, and appropriate actions can be taken. Based on where the device is actually located, it becomes easy to figure out who is responsible for cleaning that space on that particular day, and to notify them accordingly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2021/07/digital_twins_animation.gif"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--pVD2u_Ji--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2021/07/digital_twins_animation.gif" alt=""&gt;&lt;/a&gt;Connecting the Artificial Nose to a Digital Twins environment.&lt;/p&gt;

&lt;h2&gt;
  
  
  Get started today!
&lt;/h2&gt;

&lt;p&gt;Many people have already started to build the device for themselves and to experiment with what adding “virtual smell” to their devices and applications could mean. If this blog post inspired you to join them, I will leave you with the only two links that you really need to get started:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can &lt;a href="https://www.seeedstudio.com/Tiny-ML-powered-Artificial-Nose-Project-kit-with-Wio-Terminal-p-4999.html"&gt;get your artificial nose hardware kit&lt;/a&gt; directly from Seeed Studio.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--l6ahWHtM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://media-cdn.seeedstudio.com/media/catalog/product/cache/9d0ce51a71ce6a79dfa2a98d65a0f0bd/a/r/artificial_nose_kit_1_.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--l6ahWHtM--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://media-cdn.seeedstudio.com/media/catalog/product/cache/9d0ce51a71ce6a79dfa2a98d65a0f0bd/a/r/artificial_nose_kit_1_.png" alt=""&gt;&lt;/a&gt;TinyML powered Artificial Nose Project kit with Wio Terminal&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You can find the source code, 3D files for the nose enclosure, initial AI model, etc. &lt;a href="https://github.com/kartben/artificial-nose"&gt;on my GitHub repository&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i3JOwpme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/kartben"&gt;
        kartben
      &lt;/a&gt; / &lt;a href="https://github.com/kartben/artificial-nose"&gt;
        artificial-nose
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Instructions, source code, and misc. resources needed for building a Tiny ML-powered artificial nose.
    &lt;/h3&gt;
  &lt;/div&gt;
&lt;/div&gt;


</description>
      <category>iot</category>
      <category>ai</category>
      <category>tinyml</category>
      <category>azureiot</category>
    </item>
    <item>
      <title>Deploying a LoRaWAN network server on Azure</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Tue, 01 Dec 2020 17:07:03 +0000</pubDate>
      <link>https://forem.com/azure/deploying-a-lorawan-network-server-on-azure-b6d</link>
      <guid>https://forem.com/azure/deploying-a-lorawan-network-server-on-azure-b6d</guid>
      <description>&lt;p&gt;There is something oddly fascinating about radio waves, radio communications, and the sheer amount of innovations they’ve enabled since the end of the 19th century.&lt;/p&gt;

&lt;p&gt;What I find even more fascinating is that it is now very easy for anyone to get hands-on experience with radio technologies such as &lt;strong&gt;LPWAN&lt;/strong&gt; (&lt;strong&gt;Low-Power Wide Area Network&lt;/strong&gt;, a technology that allows connecting pieces of equipment over a &lt;strong&gt;low-power, long-range, secure radio network&lt;/strong&gt;) in the context of building connected products.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TAgwdZMm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/11/Heinrich_Rudolf_Hertz-150x150.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TAgwdZMm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/11/Heinrich_Rudolf_Hertz-150x150.jpg" alt="A portrait of Heinrich Rudolf Hertz"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;“&lt;em&gt;It’s of no use whatsoever […] this is just an experiment that proves Maestro Maxwell was right—we just have these mysterious electromagnetic waves that we cannot see with the naked eye. But they are there.&lt;/em&gt;”&lt;/p&gt;

&lt;p&gt;&lt;cite&gt;&lt;strong&gt;— Heinrich Hertz&lt;/strong&gt;, about the practical importance of his radio wave experiments&lt;/cite&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;p&gt;Nowadays, not only is there a wide variety of hardware developer kits, gateways, and radio modules to help you with the hardware/radio aspect of LPWAN radio communications, but there is also open-source software that allows you to &lt;strong&gt;build and operate your very own network&lt;/strong&gt;. Read on as I will be giving you some insights into what it takes to set up a full-blown &lt;strong&gt;LoRaWAN network server in the cloud&lt;/strong&gt;!&lt;/p&gt;

&lt;h2&gt;
  
  
  A quick refresher on LoRaWAN
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://en.wikipedia.org/wiki/LoRaWAN"&gt;LoRaWAN&lt;/a&gt; is a &lt;strong&gt;low-power wide-area network&lt;/strong&gt; (LPWAN) technology that uses the &lt;strong&gt;&lt;a href="https://en.wikipedia.org/wiki/LoRa"&gt;LoRa&lt;/a&gt; radio protocol&lt;/strong&gt; to allow long-range transmissions between IoT devices and the Internet. LoRa itself uses a form of chirp spread spectrum modulation which, combined with error correction techniques, allows for very high link budgets—in other terms: the ability to cover &lt;em&gt;very&lt;/em&gt; long ranges!&lt;/p&gt;

&lt;p&gt;Data sent by LoRaWAN end devices gets picked up by &lt;strong&gt;gateways&lt;/strong&gt; nearby and is then routed to a so-called &lt;strong&gt;network server.&lt;/strong&gt; The network server de-duplicates packets (several gateways may have “seen” and forwarded the same radio packet), performs security checks, and eventually routes the information to its actual destination, i.e. the &lt;strong&gt;application&lt;/strong&gt; the devices are sending data to.&lt;/p&gt;
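&lt;p&gt;The de-duplication step can be sketched in a few lines—this is a toy version of what a real network server does (a real one also works over a short reception window and verifies the frame’s message integrity code first):&lt;/p&gt;

```python
# Toy version of the de-duplication a LoRaWAN network server performs:
# several gateways may forward the same uplink frame; keep a single copy
# and remember which gateway heard it best (handy for routing downlinks).

def deduplicate(uplinks):
    """Collapse duplicate uplinks, keeping the copy with the best RSSI.

    Each uplink is a dict with 'dev_addr', 'fcnt', 'payload',
    'gateway', and 'rssi' keys (a simplification of a real frame)."""
    best = {}
    for u in uplinks:
        key = (u["dev_addr"], u["fcnt"], u["payload"])
        if key not in best or u["rssi"] > best[key]["rssi"]:
            best[key] = u
    return list(best.values())
```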

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hE4ius8m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/12/boschparkingsensor-300x189.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hE4ius8m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/12/boschparkingsensor-300x189.png" alt=""&gt;&lt;/a&gt;Bosch’s Smart Parking Lot sensor.&lt;/p&gt;

&lt;p&gt;LoRaWAN end nodes are usually pretty “dumb”, battery-powered devices (e.g. soil moisture sensors, parking occupancy sensors, …) that have very limited knowledge of their radio environment. For example, a node may be in close proximity to a gateway, and yet transmit radio packets with much more power than necessary, wasting precious battery energy in the process. Therefore, one of the duties of a LoRaWAN network server is to consolidate various metrics collected from the field gateways to optimize the network. If a gateway is telling the network server it is getting a really strong signal from a sensor, it might make sense to send a &lt;em&gt;downlink&lt;/em&gt; packet to that device so that it can try using slightly less power for future transmissions.&lt;/p&gt;
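&lt;p&gt;That decision logic can be sketched as follows—a drastically simplified take on LoRaWAN adaptive data rate (ADR), with purely illustrative thresholds (real implementations typically adjust the data rate before touching transmit power):&lt;/p&gt;

```python
# Drastically simplified sketch of the decision described above: when a
# device is heard with much more SNR than its data rate requires, tell
# it to transmit with less power. Thresholds are purely illustrative;
# real ADR implementations typically raise the data rate first.

def suggested_power_reduction_db(measured_snr_db, required_snr_db,
                                 safety_margin_db=10.0, step_db=3.0):
    headroom = measured_snr_db - required_snr_db - safety_margin_db
    steps = max(0, int(headroom // step_db))
    return steps * step_db
```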

&lt;p&gt;As LoRa uses unlicensed spectrum, anyone can freely connect LoRa devices, or even operate their own network, provided they follow their local radio regulations.&lt;/p&gt;

&lt;h2&gt;
  
  
  My private LoRaWAN server, why?
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://lora-alliance.org/lorawan-for-developers"&gt;LoRaWAN specification&lt;/a&gt; puts a really strong focus on &lt;strong&gt;security&lt;/strong&gt; , and by no means do I want to make you think that rolling out your own networking infrastructure is mandatory to make your LoRaWAN solution secure. In fact, LoRaWAN has a pretty elegant way of &lt;strong&gt;securing communications, while keeping the protocol lightweight&lt;/strong&gt;. There is &lt;a href="https://lora-alliance.org/resource-hub/lorawanr-secure-implementation-matters"&gt;a lot of literature on the topic&lt;/a&gt; that I encourage you to read but, in a nutshell, &lt;strong&gt;the protocol makes it almost impossible for malicious actors to impersonate your devices&lt;/strong&gt; (messages are signed and protected against replay attacks) &lt;strong&gt;or access your data&lt;/strong&gt; (your application data is seen by the network server as an opaque, ciphered, payload).&lt;/p&gt;

&lt;p&gt;So why would you bother rolling out your own LoRaWAN network server anyway?&lt;/p&gt;

&lt;h3&gt;
  
  
  Coverage where you need it
&lt;/h3&gt;

&lt;p&gt;In most cases, &lt;strong&gt;relying on a public network operator means being dependent on their coverage&lt;/strong&gt;. While some operators might allow a hybrid model where you can attach your own gateways to their network, and hence extend the coverage right where you need it, oftentimes you don’t get to decide how well a particular geographical area will be covered by a given operator.&lt;/p&gt;

&lt;p&gt;When rolling out your own network server, you end up &lt;strong&gt;managing your own fleet of gateways&lt;/strong&gt;, bringing you &lt;strong&gt;more flexibility in terms of coverage&lt;/strong&gt;, network redundancy, etc.&lt;/p&gt;

&lt;h3&gt;
  
  
  Data ownership
&lt;/h3&gt;

&lt;p&gt;While operating your own server will not necessarily add a lot in terms of pure security (after all, your LoRaWAN packets spend a good chunk of their lifetime traveling through the open air anyway!), being your own operator definitely brings you more flexibility to know and control what happens to your data once it’s reached the Internet.&lt;/p&gt;

&lt;h3&gt;
  
  
  What about the downsides?
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://i.giphy.com/media/10KIsXhwdoerHW/giphy.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://i.giphy.com/media/10KIsXhwdoerHW/giphy.gif" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It goes without saying that &lt;strong&gt;operating your own network is no small feat&lt;/strong&gt;, and you should obviously do your due diligence with regard to the potential challenges, risks, and costs associated with keeping your network up and running.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;That being said, it’s now high time I tell you how you’d go about rolling out your own LoRaWAN network, right?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Things Stack on Azure
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://thethingsstack.io/"&gt;The Things Stack&lt;/a&gt; is an open-source LoRaWAN network server that supports all versions of the LoRaWAN specification and operation modes. It is actively being maintained by &lt;a href="https://www.thethingsindustries.com/"&gt;The Things Industries&lt;/a&gt; and is the underlying core of their commercial offerings.&lt;/p&gt;

&lt;p&gt;A typical/minimal deployment of The Things Stack network server relies on roughly three pillars:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A &lt;strong&gt;Redis in-memory data store&lt;/strong&gt; for supporting the operation of the network ;&lt;/li&gt;
&lt;li&gt;An &lt;strong&gt;SQL database&lt;/strong&gt; (PostgreSQL or CockroachDB are supported) for storing information regarding the gateways, devices, and users of the network ;&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;actual stack&lt;/strong&gt;, running the different services that power the web console, the network server itself, etc.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The deployment model recommended for someone interested in quickly testing out The Things Stack is to &lt;a href="https://thethingsstack.io/getting-started/installation/"&gt;use their Docker Compose configuration&lt;/a&gt;. It fires up all the services mentioned above as Docker containers on the same machine. Pretty cool for testing, but not so much for a production environment: who is going to keep those Redis and PostgreSQL services available 24/7, properly backed up, etc.?&lt;/p&gt;

&lt;p&gt;I have put together a set of &lt;strong&gt;instructions and a deployment template&lt;/strong&gt; that aim at &lt;strong&gt;showing what a LoRaWAN server based on The Things Stack and running in Azure could look like&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2020/11/deployment-diagram-1.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uaNvVusA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/11/deployment-diagram-1.png" alt=""&gt;&lt;/a&gt;The Things Stack running on Azure – Deployment diagram.&lt;/p&gt;

&lt;p&gt;The instructions in the &lt;a href="https://github.com/kartben/thethingsstack-on-azure"&gt;GitHub repository&lt;/a&gt; linked below should be all you need to get your very own server up and running!&lt;/p&gt;

&lt;p&gt;In fact, you only have a handful of parameters to tweak (what fancy nickname to give your server, credentials for the admin user, …) and the deployment template will do the rest!&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i3JOwpme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/kartben"&gt;
        kartben
      &lt;/a&gt; / &lt;a href="https://github.com/kartben/thethingsstack-on-azure"&gt;
        thethingsstack-on-azure
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Instructions and deployment scripts for setting up a LoRaWAN network server on Azure cloud, using the Things Stack.
    &lt;/h3&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;h2&gt;
  
  
  OK, I deployed my network server in Azure, now what?
&lt;/h2&gt;

&lt;p&gt;Just to enumerate a few, here are some of the things that having your own network server, running in your own Azure subscription, will enable. Some will sound oddly specific if you don’t have a lot of experience with LoRaWAN yet, but they are important nevertheless. You can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;benefit from &lt;strong&gt;managed Redis and PostgreSQL services&lt;/strong&gt;, and not have to worry about potential security fixes that would need to be rolled out, or about performing regular backups, etc. ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;control which LoRaWAN gateways can connect to your network server&lt;/strong&gt;, as you can tweak your Network Security Group to only allow specific IPs to connect to the UDP packet forwarder endpoint of your network server ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;completely isolate the internals of your network server from the public Internet&lt;/strong&gt; (including the Application Server if you wish), putting you in a better position to control and secure your business data ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;scale your infrastructure up or down&lt;/strong&gt; as the size and complexity of the fleet that you are managing evolves ;&lt;/li&gt;
&lt;li&gt;… and there is probably so much more. I’m actually curious to hear in the comments below about other benefits (or downsides, for that matter) you’d see.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I started to put together an &lt;a href="https://github.com/kartben/thethingsstack-on-azure#faq"&gt;FAQ&lt;/a&gt; in the GitHub repository so, hopefully, your most obvious questions are already answered there. However, there is one that I thought was worth calling out in this post: &lt;em&gt;“&lt;strong&gt;How big of a fleet can I connect?&lt;/strong&gt;”&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;It turns out that even a reasonably small VM like the one used in the deployment template—2 vCPUs, 4GB of RAM—can already handle thousands of nodes, and hundreds of gateways. You may find this &lt;a href="https://github.com/kartben/lorawan-node-simulator"&gt;LoRaWAN traffic simulation tool&lt;/a&gt; that I wrote helpful in case you’d want to conduct your own stress testing experiments.&lt;/p&gt;
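&lt;p&gt;Before running any actual stress test, a quick back-of-envelope estimate of the average uplink rate your fleet generates is useful—the figures below are assumptions, not measurements:&lt;/p&gt;

```python
# Back-of-envelope load estimate: how many uplinks per second would a
# fleet generate? Both figures below are assumptions, not measurements.

def uplinks_per_second(num_devices, period_seconds):
    return num_devices / period_seconds

# 10,000 nodes each reporting every 10 minutes is only ~17 uplinks/s,
# which helps explain why a small VM copes with fleets this size.
load = uplinks_per_second(10_000, 600)
```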

&lt;h2&gt;
  
  
  What’s next?
&lt;/h2&gt;

&lt;p&gt;You should definitely expect more LoRaWAN-related articles from me in the future. From leveraging &lt;a href="https://github.com/Azure/opendigitaltwins-dtdl"&gt;DTDL&lt;/a&gt; to simplify end-application development and interoperability with other solutions, to integrating with Azure IoT services, there’s definitely a lot more to cover. Stay tuned, and please let me know in the comments about other related topics you’d like to see covered!&lt;/p&gt;

</description>
      <category>iot</category>
      <category>lorawan</category>
      <category>lpwan</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Using Github Codespaces for Embedded Development</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Wed, 02 Sep 2020 14:45:12 +0000</pubDate>
      <link>https://forem.com/kartben/using-github-codespaces-for-embedded-development-2bbg</link>
      <guid>https://forem.com/kartben/using-github-codespaces-for-embedded-development-2bbg</guid>
      <description>&lt;p&gt;&lt;strong&gt;Managing an embedded development environment can be pretty painful and error-prone&lt;/strong&gt; , from properly checking out the codebase and all its dependencies, to making sure the correct (and often pretty big!) toolchains are setup and used, to having the developers’ IDE use the right set of extensions and plugins.&lt;/p&gt;

&lt;p&gt;When you start &lt;strong&gt;thinking of containers as a technology that can be used not only at runtime&lt;/strong&gt; (e.g. for packaging microservices) &lt;strong&gt;but also at development time&lt;/strong&gt;, it becomes possible to easily describe the entirety of the required development environment for a particular project. Make this description part of your source code repository and you end up with a versioned, fully reproducible dev environment! And with a &lt;strong&gt;cloud-based IDE&lt;/strong&gt;, you should even be able to code straight from your web browser, right?&lt;/p&gt;
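&lt;p&gt;Concretely, this description boils down to a &lt;code&gt;devcontainer.json&lt;/code&gt; file checked into the repository. Here is a minimal sketch—the image name, extension IDs, and post-create command are placeholders, not the ones used in the video below:&lt;/p&gt;

```json
{
  "name": "my-embedded-project",
  "image": "mcr.microsoft.com/vscode/devcontainers/cpp",
  "extensions": [
    "ms-vscode.cpptools",
    "marus25.cortex-debug"
  ],
  "postCreateCommand": "scripts/fetch-toolchain.sh"
}
```

&lt;p&gt;Anyone opening the repository in a container-aware IDE then gets the same toolchain and the same editor extensions, with no manual setup.&lt;/p&gt;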

&lt;p&gt;I recently gave &lt;a href="https://github.com/features/codespaces"&gt;GitHub Codespaces&lt;/a&gt; a try to get a sense of the benefits of the approach. &lt;strong&gt;Spoiler alert:&lt;/strong&gt; there is already a lot that can be done (debugging embedded code from your web browser anyone?), so I am really excited to see what’s ahead of us in terms of making embedded development even more seamless.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/-enIM4x-KPA"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;I highly encourage you to &lt;a href="https://github.com/features/codespaces"&gt;give Codespaces a try&lt;/a&gt; and see for yourself what you think might be missing in the picture. I would love to hear about it!&lt;/p&gt;

&lt;p&gt;A good way for you to get started, if you have an &lt;a href="https://www.st.com/en/evaluation-tools/b-l475e-iot01a.html"&gt;STM32L4 developer kit&lt;/a&gt; handy, would be to go with the &lt;a href="https://github.com/azure-rtos/getting-started/tree/master/STMicroelectronics/STM32L4_L4%2B"&gt;Azure RTOS getting started&lt;/a&gt; example, as I did in the video. Don’t forget to check the &lt;a href="https://github.com/azure-rtos/getting-started/blob/master/docs/debugging.md"&gt;debugging instructions&lt;/a&gt;—they complement what you see in the video nicely.&lt;/p&gt;

</description>
      <category>codespaces</category>
      <category>github</category>
      <category>embedded</category>
    </item>
    <item>
      <title>Connecting the Wio Terminal to Azure IoT</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Wed, 05 Aug 2020 18:59:40 +0000</pubDate>
      <link>https://forem.com/azure/connecting-the-wio-terminal-to-azure-iot-3o9n</link>
      <guid>https://forem.com/azure/connecting-the-wio-terminal-to-azure-iot-3o9n</guid>
      <description>&lt;p&gt;It’s been a few months now since I started playing with the &lt;a href="https://www.seeedstudio.com/Wio-Terminal-p-4509.html"&gt;Wio Terminal from Seeed Studio&lt;/a&gt;. It is a pretty complete device that can be used to power a wide range of IoT solutions—just look at its specifications!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/WioT-Hardware-Overview.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O3Xxzao6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/WioT-Hardware-Overview-300x174.png" alt=""&gt;&lt;/a&gt;Wio Terminal Features&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cortex-M4F running at 120MHz&lt;/strong&gt; (can be overclocked to 200MHz) from Microchip (&lt;a href="https://www.microchip.com/wwwproducts/en/ATSAMD51P19A"&gt;ATSAMD51P19&lt;/a&gt;) ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;192 KB of RAM&lt;/strong&gt;, &lt;strong&gt;4 MB of Flash&lt;/strong&gt; ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Wireless connectivity&lt;/strong&gt;: WiFi 2.4 &amp;amp; 5 GHz (802.11 a/b/g/n), BLE 5.0, powered by a &lt;a href="https://www.seeedstudio.com/Realtek8720DN-2-4G-5G-Dual-Bands-Wireless-and-BLE5-0-Combo-Module-p-4442.html"&gt;Realtek RTL8720DN&lt;/a&gt; module ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;2.4″ LCD screen&lt;/strong&gt;, 320×240 pixels ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;microSD card reader&lt;/strong&gt; ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Built-in sensors and actuators&lt;/strong&gt;: light sensor, &lt;a href="https://www.st.com/en/mems-and-sensors/lis3dh.html"&gt;LIS3DH accelerometer&lt;/a&gt;, infrared emitter, microphone, buzzer, 5-way switch ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Expansion ports&lt;/strong&gt;: 2x Grove ports, 1x Raspberry Pi-compatible 40-pin header.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Wireless connectivity, extensibility, processing power… on paper, the Wio Terminal must be the ideal platform for IoT development, right? Well, ironically, &lt;strong&gt;one thing it doesn’t do out-of-the-box is actually connect to an IoT cloud platform&lt;/strong&gt;!&lt;/p&gt;

&lt;p&gt;You will have guessed it by now… In this blog post, you’ll learn how to &lt;strong&gt;connect your Wio Terminal to Azure IoT&lt;/strong&gt;. More importantly, you will learn about the steps I followed, giving you all the information you need in order to &lt;strong&gt;port the Azure IoT Embedded C libraries to your own IoT device&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Connecting your Wio Terminal to Azure IoT
&lt;/h2&gt;

&lt;p&gt;I have put together a &lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample"&gt;sample application&lt;/a&gt; that should get you started in no time.&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--i3JOwpme--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev.to/assets/github-logo-ba8488d21cd8ee1fee097b8410db9deaa41d0ca30b004c0c63de0a479114156f.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/kartben"&gt;
        kartben
      &lt;/a&gt; / &lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample"&gt;
        wioterminal-azureiothub-sample
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      This repository contains a sample application showing how to connect a Wio Terminal to Azure IoT Hub to send telemetry and receive commands.
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;h1&gt;
Welcome to wioterminal-azureiothub-sample 👋
&lt;/h1&gt;
&lt;p&gt;&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample/LICENSE"&gt;&lt;img src="https://camo.githubusercontent.com/78f47a09877ba9d28da1887a93e5c3bc2efb309c1e910eb21135becd2998238a/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f4c6963656e73652d4d49542d79656c6c6f772e737667" alt="License: MIT"&gt;&lt;/a&gt;
&lt;a href="https://twitter.com/kartben" rel="nofollow"&gt;&lt;img src="https://camo.githubusercontent.com/21485bcfeaa2ad8d58d32952bb3cccd163d8e1ce0f83de9b5446854661bcfd19/68747470733a2f2f696d672e736869656c64732e696f2f747769747465722f666f6c6c6f772f6b61727462656e2e7376673f7374796c653d736f6369616c" alt="Twitter: kartben"&gt;&lt;/a&gt;
&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample/actions?query=workflow%3A%22PlatformIO+CI%22"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--8ZhaL9WV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://github.com/kartben/wioterminal-azureiothub-sample/workflows/PlatformIO%2520CI/badge.svg" alt="PlatformIO CI"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;This sample application shows you how to connect your &lt;a href="https://www.seeedstudio.com/Wio-Terminal-p-4509.html" rel="nofollow"&gt;Wio Terminal&lt;/a&gt; from Seeed Studio to &lt;a href="https://azure.microsoft.com/services/iot-hub" rel="nofollow"&gt;Azure IoT Hub&lt;/a&gt;. It is built on top of the &lt;a href="https://github.com/Azure/azure-sdk-for-c"&gt;Azure SDK for Embedded C&lt;/a&gt;, a small footprint, easy-to-port library for communicating with Azure services.&lt;/p&gt;
&lt;p&gt;As the Wio Terminal is one of PlatformIO's (many!) supported platforms, the sample is conveniently made available as a PlatformIO project. This means that you don't have to worry about installing the &lt;a href="https://wiki.seeedstudio.com/Wio-Terminal-Network-Overview/" rel="nofollow"&gt;multiple Arduino libraries&lt;/a&gt; the Wio Terminal requires for Wi-Fi &amp;amp; TLS, and you don't need to manually install any other third-party library either! All dependencies are automatically fetched from Github by the PlatformIO Library Manager.&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#running-the-sample"&gt;Running the sample&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#testing-the-sample"&gt;Testing the sample&lt;/a&gt;
&lt;ul&gt;
&lt;li&gt;&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#testing-that-telemetry-is-correctly-sent-to-azure-iot-hub"&gt;Testing that telemetry is correctly sent to Azure IoT Hub&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#sending-a-command-from-azure-iot-hub"&gt;Sending a command from Azure IoT Hub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#a-few-words-on-the-azure-sdk-for-embedded-c-and-how-its-been-ported-to-wio-terminal"&gt;A few words on the Azure SDK for Embedded C and how it's been&lt;/a&gt;…&lt;/li&gt;
&lt;/ul&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/kartben/wioterminal-azureiothub-sample"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;You will need a &lt;strong&gt;Wio Terminal&lt;/strong&gt;, of course, an &lt;strong&gt;Azure IoT Hub&lt;/strong&gt; instance, and a working &lt;strong&gt;Wi-Fi connection&lt;/strong&gt;. The Wio Terminal will need to be connected to your computer over USB—&lt;em&gt;kudos to Seeed Studio for providing a USB-C port, by the way!&lt;/em&gt;—so it can be programmed.&lt;/p&gt;

&lt;p&gt;Here are the steps you should follow to get your Wio Terminal connected to Azure IoT Hub:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;If you don’t have an &lt;strong&gt;Azure subscription&lt;/strong&gt; , &lt;strong&gt;&lt;a href="https://azure.microsoft.com/free"&gt;create one for free&lt;/a&gt;&lt;/strong&gt; before you begin.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-create-through-portal#create-an-iot-hub"&gt;Create an IoT Hub&lt;/a&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-create-through-portal#register-a-new-device-in-the-iot-hub"&gt;register a new device&lt;/a&gt;&lt;/strong&gt; (i.e. your Wio Terminal). Using the Azure portal is probably the most beginner-friendly method, but you can also use the &lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-create-using-cli"&gt;Azure CLI&lt;/a&gt; or the &lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-create-use-iot-toolkit"&gt;VS Code extension&lt;/a&gt;. The sample uses symmetric keys for auhentication.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample"&gt;Clone&lt;/a&gt;&lt;/strong&gt; and open the sample repository in VS Code, making sure you have the &lt;strong&gt;&lt;a href="https://marketplace.visualstudio.com/items?itemName=platformio.platformio-ide"&gt;PlatformIO extension&lt;/a&gt;&lt;/strong&gt; installed.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Update the application settings&lt;/strong&gt; (&lt;code&gt;include/config.h&lt;/code&gt;) file with your Wi-Fi, IoT Hub URL, and device credentials.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Flash your Wio Terminal&lt;/strong&gt;. Use the command palette (Windows/Linux: Ctrl+Shift+P / macOS: ⇧⌘P) to execute the &lt;code&gt;PlatformIO: Upload&lt;/code&gt; command. The operation will probably take a while to complete as the Wio Terminal toolchain and the dependencies of the sample application are downloaded, and the code is compiled and uploaded to the device.&lt;/li&gt;
&lt;li&gt;Once the code has been uploaded successfully, your Wio Terminal LCD should turn on and start logging connection traces.
You can also open the PlatformIO serial monitor to check the logs of the application (&lt;code&gt;PlatformIO: Serial Monitor&lt;/code&gt; command).
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;gt; Executing task: C:\Users\kartben\.platformio\penv\Scripts\platformio.exe device monitor &amp;lt;
-------- Available filters and text transformations: colorize, debug, default, direct, hexlify, log2file, nocontrol, printable, send_on_enter, time
-------- More details at http://bit.ly/pio-monitor-filters
-------- Miniterm on COM4  9600,8,N,1 ---
-------- Quit: Ctrl+C | Menu: Ctrl+T | Help: Ctrl+T followed by Ctrl+H ---
Connecting to SSID: WiFi-Benjamin5G
......
&amp;gt; SUCCESS.
Connecting to Azure IoT Hub...
&amp;gt; SUCCESS.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your device should now be &lt;strong&gt;sending&lt;/strong&gt; its &lt;strong&gt;accelerometer sensor values&lt;/strong&gt; to Azure IoT Hub every 2 seconds, and be ready to &lt;strong&gt;receive commands&lt;/strong&gt; remotely sent &lt;strong&gt;to ring its buzzer&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Please &lt;a href="https://github.com/kartben/wioterminal-azureiothub-sample#testing-the-sample"&gt;refer to the application’s README&lt;/a&gt; to learn how to test that the sample is working properly using &lt;a href="https://github.com/Azure/azure-iot-explorer"&gt;Azure IoT Explorer&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/azure-iot-explorer-send-command.gif"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CuQCj8Ji--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/azure-iot-explorer-send-command.gif" alt="An animated screen capture showing how to send IoT commands in the Azure IoT Explorer application"&gt;&lt;/a&gt;Azure IoT Explorer – Sending Commands&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/azure-iot-explorer-telemetry.gif"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--idijxT7f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/azure-iot-explorer-telemetry.gif" alt="An animated screen capture showing how to display IoT Telemetry in the Azure IoT Explorer application"&gt;&lt;/a&gt;Azure IoT Explorer – Monitoring Telemetry&lt;/p&gt;

&lt;p&gt;It is important to mention that this sample application is &lt;strong&gt;compatible with &lt;a href="https://docs.microsoft.com/en-us/azure/iot-pnp/overview-iot-plug-and-play"&gt;IoT Plug and Play&lt;/a&gt;&lt;/strong&gt;. This means that there is a clear, documented contract describing the kinds of messages the Wio Terminal may send (telemetry) or receive (commands).&lt;/p&gt;

&lt;p&gt;You can see the model of this contract below—it is rather straightforward. It’s been authored using the &lt;a href="https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.vscode-dtdl"&gt;dedicated VS Code extension&lt;/a&gt; for &lt;a href="https://github.com/Azure/opendigitaltwins-dtdl/blob/master/DTDL/v2/dtdlv2.md"&gt;DTDL&lt;/a&gt;, the &lt;strong&gt;Digital Twin Description Language&lt;/strong&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"@context"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"dtmi:dtdl:context;2"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"@id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"dtmi:seeed:wioterminal;1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"@type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Interface"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Seeed Studio Wio Terminal"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"contents"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"@type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Telemetry"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Acceleration"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"unit"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"gForce"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"imu"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"@type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Object"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"x"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"IMU X"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"double"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"IMU Y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"double"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"IMU Z"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"double"&lt;/span&gt;&lt;span class="w"&gt;
          &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"@type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Command"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"ringBuzzer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Ring buzzer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Rings the Wio Terminal's built-in buzzer"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"request"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"duration"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"displayName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Duration"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Number of milliseconds to ring the buzzer for."&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"schema"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"integer"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
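&lt;p&gt;To make the contract concrete: a telemetry message conforming to this model boils down to a small JSON object with a single &lt;code&gt;imu&lt;/code&gt; field. Here is an illustrative Python sketch (the actual sample builds its payload in C; the helper name is mine):&lt;/p&gt;

```python
import json


def build_imu_telemetry(x: float, y: float, z: float) -> str:
    """Serialize one telemetry message matching the DTDL model above:
    a single "imu" telemetry of schema Object, with double fields x, y, z."""
    return json.dumps({"imu": {"x": x, "y": y, "z": z}})


# Example payload, as it would be published to IoT Hub every 2 seconds:
payload = build_imu_telemetry(0.01, -0.98, 0.12)
```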



&lt;p&gt;When connecting to IoT Hub, the Wio Terminal sample application “introduces itself” as conforming to the &lt;code&gt;dtmi:seeed:wioterminal;1&lt;/code&gt; model.&lt;/p&gt;

&lt;p&gt;This allows you (or anyone who will be creating IoT applications integrating with your device, really) to &lt;strong&gt;be sure there won’t be any impedance mismatch between the way your device &lt;em&gt;talks&lt;/em&gt; and expects to be &lt;em&gt;talked to&lt;/em&gt;&lt;/strong&gt;, and what your IoT application does.&lt;/p&gt;

&lt;p&gt;A great example of why being able to automagically match a device to a corresponding DTDL model is useful can be illustrated with the way we used the Azure IoT Explorer earlier. Since the device “introduced itself” when connecting to IoT Hub, and since Azure IoT Explorer has a local copy of the model, it automatically showed us a dedicated UI for sending the &lt;code&gt;ringBuzzer&lt;/code&gt; command!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EHvWV1Ly--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/image-1024x382.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EHvWV1Ly--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/08/image-1024x382.png" alt=""&gt;&lt;/a&gt;Thanks to IoT Plug and Play, any application or tool can easily leverage the model that describes a device’s capabilities to. &lt;br&gt;Here, Azure IoT Explorer uses the model to help the user send commands that the device can actually understand.&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure SDK for Embedded C
&lt;/h2&gt;

&lt;p&gt;In the past, adding Azure IoT support to an IoT device using the C programming language meant either using the rather monolithic &lt;a href="https://github.com/Azure/azure-iot-sdk-c"&gt;Azure IoT C SDK&lt;/a&gt; (ex. it is not trivial to bring your own TCP/IP or TLS stack), or implementing everything from scratch using the public documentation of &lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-mqtt-support"&gt;Azure IoT’s MQTT front-end&lt;/a&gt; for devices.&lt;/p&gt;

&lt;p&gt;Enter the &lt;strong&gt;Azure SDK for Embedded C&lt;/strong&gt;!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The &lt;a href="https://github.com/Azure/azure-sdk-for-c"&gt;Azure SDK for Embedded C&lt;/a&gt; is designed to allow small embedded (IoT) devices to communicate with Azure services.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The Azure SDK team has &lt;a href="https://devblogs.microsoft.com/azure-sdk/azure-sdk-release-june-2020/"&gt;recently&lt;/a&gt; started to put together a &lt;strong&gt;C SDK that specifically targets embedded and constrained devices&lt;/strong&gt;. It provides a generic, platform-independent infrastructure for manipulating buffers, logging, JSON serialization/deserialization, and more. On top of this lightweight infrastructure, client libraries for e.g. Azure Storage or Azure IoT have been developed.&lt;/p&gt;

&lt;p&gt;You can read more on the Azure IoT client library &lt;a href="https://github.com/Azure/azure-sdk-for-c/tree/master/sdk/docs/iot#azure-iot-clients"&gt;here&lt;/a&gt;, but in a nutshell, here’s what I had to implement in order to use it on the Wio Terminal:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;As the sample uses &lt;strong&gt;symmetric keys&lt;/strong&gt; to authenticate, we need to be able to generate a &lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-devguide-security#security-tokens"&gt;security token&lt;/a&gt;. 

&lt;ul&gt;
&lt;li&gt;The token needs to have an expiration date (typically set to a few hours in the future), so &lt;strong&gt;we need to know the current date and time&lt;/strong&gt;. We use an &lt;strong&gt;&lt;a href="https://github.com/sstaub/NTP"&gt;NTP&lt;/a&gt; library&lt;/strong&gt; to get the current time from a time server.&lt;/li&gt;
&lt;li&gt;The token includes an &lt;strong&gt;HMAC-SHA256 signature string that needs to be base64-encoded&lt;/strong&gt;. Luckily, the &lt;a href="https://wiki.seeedstudio.com/Wio-Terminal-Network-Overview/#libraries-installation"&gt;recommended WiFi+TLS stack&lt;/a&gt; for the Wio Terminal already includes &lt;strong&gt;Mbed TLS&lt;/strong&gt;, making it relatively simple to &lt;strong&gt;compute HMAC signatures&lt;/strong&gt; (ex. &lt;code&gt;mbedtls_md_hmac_starts&lt;/code&gt;) and perform base64 encoding (ex. &lt;code&gt;mbedtls_base64_encode&lt;/code&gt;).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;li&gt;The Azure IoT client library helps with crafting MQTT topics that follow the &lt;a href="https://docs.microsoft.com/en-us/azure/iot-hub/iot-hub-mqtt-support"&gt;Azure IoT conventions&lt;/a&gt;. However, you still need to &lt;strong&gt;provide your own MQTT implementation&lt;/strong&gt;. In fact, this is a major difference from the historical Azure IoT C SDK, which has its MQTT implementation baked in. Since it is widely supported and just works out-of-the-box, the sample application uses the &lt;a href="https://github.com/knolleary/pubsubclient"&gt;&lt;code&gt;PubSubClient&lt;/code&gt;&lt;/a&gt; MQTT library from &lt;a href="https://github.com/knolleary"&gt;Nick O’Leary&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;And of course, one must implement their own &lt;strong&gt;application logic&lt;/strong&gt;. In the context of the sample application, this meant using the Wio Terminal’s IMU driver to get acceleration data every 2 seconds, and hooking up the &lt;code&gt;ringBuzzer&lt;/code&gt; command to actual embedded code that… rings the buzzer.&lt;/li&gt;
&lt;/ul&gt;
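&lt;p&gt;For the curious, the security-token step above can be sketched in a few lines of Python, mirroring what the C code does with Mbed TLS. This is an illustrative sketch only (the function names are mine), following the documented Azure IoT SAS token format:&lt;/p&gt;

```python
import base64
import hashlib
import hmac
import time
import urllib.parse


def generate_sas_token(resource_uri: str, device_key_b64: str, ttl_s: int = 3600) -> str:
    """Build an Azure IoT Hub shared access signature (SAS) token.

    The signature is an HMAC-SHA256 over the URL-encoded resource URI,
    a newline, and the expiry timestamp, keyed with the base64-decoded
    device key; the resulting digest is base64-encoded into the token.
    """
    expiry = int(time.time()) + ttl_s
    to_sign = f"{urllib.parse.quote_plus(resource_uri)}\n{expiry}".encode()
    key = base64.b64decode(device_key_b64)
    sig = base64.b64encode(hmac.new(key, to_sign, hashlib.sha256).digest()).decode()
    return (
        "SharedAccessSignature "
        f"sr={urllib.parse.quote_plus(resource_uri)}"
        f"&sig={urllib.parse.quote_plus(sig)}&se={expiry}"
    )


def telemetry_topic(device_id: str) -> str:
    # Azure IoT Hub convention for device-to-cloud telemetry messages.
    return f"devices/{device_id}/messages/events/"
```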

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I hope you found this post useful! I will soon publish additional articles that go beyond the simple “Hey, my Wio Terminal can send accelerometer data to the cloud!” to more advanced use cases such as remote firmware upgrade. Stay tuned! 🙂&lt;/p&gt;

&lt;p&gt;Let me know in the comments what you’ve done (or will be doing!) with your Wio Terminal, and also don’t hesitate to ask any burning question you may have. You can also always &lt;a href="https://twitter.com/kartben"&gt;find me on Twitter&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>iot</category>
      <category>azure</category>
      <category>platformio</category>
      <category>mqtt</category>
    </item>
    <item>
      <title>Eliminating Vanity Metrics From Your Developer Program</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Thu, 23 Jul 2020 17:16:25 +0000</pubDate>
      <link>https://forem.com/kartben/eliminating-vanity-metrics-from-your-developer-program-4l1p</link>
      <guid>https://forem.com/kartben/eliminating-vanity-metrics-from-your-developer-program-4l1p</guid>
      <description>&lt;p&gt;I have been, directly or indirectly, responsible for growing and nurturing several developer communities for over a decade now. Along the way, I’ve come to realize that there are lots of misconceptions in terms of what characterizes successful developer engagement programs, and how to effectively measure their impact.&lt;/p&gt;

&lt;p&gt;A lot has already been said on the reasons why vanity metrics are dangerous, so why should you bother reading further? Well, what I had originally planned as a short &lt;em&gt;brain dump&lt;/em&gt; ended up covering pretty extensively &lt;strong&gt;the pitfalls of vanity metrics in the specific context of developer engagement&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;This article will help you identify some areas where you can improve, and new indicators that you will want to start tracking. I also hope it will help change your mindset so that you can actually &lt;strong&gt;start becoming proud of your not-so bright metrics and what you have learned from them&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;I would really like to hear about your own experience in the comments below. You can also follow me or &lt;a href="https://twitter.com/kartben"&gt;ping me on Twitter&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Fear the Green Giant… Dashboard
&lt;/h2&gt;

&lt;p&gt;Who doesn’t like a performance dashboard filled with green indicators? Well… I don’t!&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If your dashboard shows a majority of green indicators, you’re doing it wrong&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Whether it’s intentional or not, if your metrics and KPIs are designed to make you “look good”, you’re probably not looking at the right thing, or at least not with the right level of granularity.&lt;/p&gt;

&lt;p&gt;A “green” dashboard is not inherently bad—who would I be to question the fact that your community is growing anyway? What I am quite confident &lt;em&gt;is&lt;/em&gt; bad, though, is a dashboard that does not capture and highlight &lt;strong&gt;the things that are not working&lt;/strong&gt; … and there are always a few behind even the most stellar aggregated metrics.&lt;/p&gt;

&lt;p&gt;The rest of this article will cover several ways you can refine your metrics to capture better the things that can be improved.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/Picture3.png"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hbL9Wjua--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/Picture3-1024x617.png" alt=""&gt;&lt;/a&gt;&lt;strong&gt;&lt;span&gt;Note&lt;/span&gt;:&lt;/strong&gt; the number of views on your videos is actually &lt;span&gt;not&lt;/span&gt; a good metric to track. Read on to learn why!&lt;/p&gt;

&lt;p&gt;As a rule of thumb, always make sure that all the activities accruing to a given (green) indicator do it equally so. Just think about it: if out of four things you’re doing successfully overall—maybe you even exceeded your initial goal!—one is in fact really lagging behind, you might as well focus your time and effort on the ones that work, right? Or at the very least, you’ll want to analyze what is making that one activity unsuccessful, in order to do better next time…&lt;/p&gt;

&lt;h2&gt;
  
  
  Learn from the outliers
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MpvNmhgX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/pexels-photo-3965671-300x200.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MpvNmhgX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/pexels-photo-3965671-300x200.jpeg" alt="person in red and brown jacket holding magnifying glass"&gt;&lt;/a&gt;Photo by Andrea Piacquadio on &lt;a href="https://www.pexels.com/photo/person-in-red-and-brown-jacket-holding-magnifying-glass-3965671/" rel="nofollow"&gt;Pexels.com&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I can’t emphasize this enough: you will learn a lot by &lt;strong&gt;making sure your metrics have the right granularity&lt;/strong&gt;, and by digging into your “outliers”, i.e. those articles/social posts/videos that are performing particularly well–or not, for that matter.&lt;/p&gt;

&lt;p&gt;Whenever I’m faced with a piece of content that is in appearance successful, I always start by trying to answer these two related questions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Is this an actual success, or are my metrics somehow biased or, worse, simply inaccurate?&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;What made this piece perform so well?&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--e5UWS6mq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/3-questions-1024x368.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--e5UWS6mq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/3-questions-1024x368.png" alt=""&gt;&lt;/a&gt;Three questions you should ask yourself to learn more from your successes.&lt;/p&gt;

&lt;p&gt;More specifically, when it comes to deciding whether I should celebrate an actual success, I usually ask myself:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Has the content been promoted as part of a paid campaign?&lt;/strong&gt; If so, it is worth looking at its organic traffic stats, and how they compare to your average article. A sub-par article can easily be flagged as impactful when, in reality, you’ve only paid for getting more eyeballs on it without particularly generating attention or engagement. (More on the topic of engagement below.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;What are the high-level demographics of the people who viewed or relayed my content?&lt;/strong&gt; Would you call impactful an article that got shared or liked by 100 people, 95 of whom you either know personally or who happen to be direct or indirect colleagues? Personally, &lt;strong&gt;I’d rather have ten times less engagement if the people involved happen to spread the word in more distant and uncharted social circles&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Who, specifically, promoted and shared my content?&lt;/strong&gt; Chances are that your content has been picked up and amplified by some media outlets or key influencers in your community. &lt;strong&gt;Find out who these are, and always try to personally reach out and engage.&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the opposite side of the spectrum, there are those “meh” articles or videos that didn’t seem to find an audience and that can also teach you a lot:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--08FI_owQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://upload.wikimedia.org/wikipedia/commons/9/94/Gartner_Hype_Cycle.svg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--08FI_owQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://upload.wikimedia.org/wikipedia/commons/9/94/Gartner_Hype_Cycle.svg" alt=""&gt;&lt;/a&gt;Gartner Hype Cycle.&lt;br&gt;&lt;em&gt;Jeremykemp at English Wikipedia / CC BY-SA.&lt;/em&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The success, or lack thereof, of your content is often going to be correlated to &lt;strong&gt;where in the &lt;a href="https://en.wikipedia.org/wiki/Hype_cycle"&gt;hype cycle&lt;/a&gt; the technology you’re covering stands&lt;/strong&gt;.
If you’re covering bleeding edge technology, an underperforming article should not necessarily be a cause for disappointment. However, you will want to look for signals showing that it piqued the curiosity of at least &lt;em&gt;some&lt;/em&gt; folks!&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Don’t underestimate the impact of SEO and optimizing for social media&lt;/strong&gt;. Sometimes, the only explanation as to why some content is lagging behind is that you didn’t take the time to create a nice visual/card for catching people’s attention when your post pops up in their timeline.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Eyeballs are nice, engagement is better
&lt;/h2&gt;

&lt;p&gt;A metric that often contributes to the “green dashboard symptom” is &lt;strong&gt;the mythical &lt;em&gt;pageview&lt;/em&gt;&lt;/strong&gt;, and all its variations (ex. Twitter impressions).&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Is there something truly useful to your business that you can deduce from how many pageviews your technical article got?&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;You may argue that tracking pageviews allows you to measure your thought leadership and your reach. However, and at the very least, that’s assuming you have a good understanding of the size of your overall potential audience, otherwise you’re just making a wild guess about what a “good” number should be…&lt;/p&gt;

&lt;p&gt;In most cases, you will be better off looking at the &lt;strong&gt;actual engagement&lt;/strong&gt; of your audience. Rather than pageviews, I tend to look at the following instead:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Impressions click-through rate (CTR)&lt;/strong&gt;. Out of 100 people presented with the thumbnail of my YouTube video, or the link to my post in their Twitter feed, how many did I convince to click to learn more?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Number of comments&lt;/strong&gt;. If I’m getting tens of thousands of views and not a single person bothers to comment—even to simply say “Thanks!” or “Cool stuff!”—or to ask a question, I usually start worrying about the relevance of my article, or at least wondering whether I did all I could to foster engagement from my audience.&lt;/li&gt;
&lt;/ul&gt;
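&lt;p&gt;In case it helps, the click-through rate mentioned above is just clicks divided by impressions. A quick sketch, with purely hypothetical numbers:&lt;/p&gt;

```python
def click_through_rate(clicks: int, impressions: int) -> float:
    """CTR: of everyone shown the thumbnail or link, the share who clicked."""
    if impressions == 0:
        return 0.0
    return clicks / impressions


# Hypothetical numbers: 50,000 impressions, 1,200 clicks = 2.4% CTR.
ctr = click_through_rate(1200, 50000)
```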

&lt;h2&gt;
  
  
  Trends over absolute numbers
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---_U2kBhL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/cyumacqmyvi-300x200.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---_U2kBhL--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/cyumacqmyvi-300x200.jpg" alt="Several white arrows pointing upwards on a wooden wall"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@oowgnuj?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText"&gt;Jungwoo Hong&lt;/a&gt; on &lt;a href="https://unsplash.com/s/photos/increase?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;People you will share your metrics with likely have &lt;strong&gt;no idea if getting 50,000 views per month&lt;/strong&gt; on your YouTube channel, &lt;strong&gt;or 70 retweets&lt;/strong&gt; on your Twitter campaign &lt;strong&gt;is any good&lt;/strong&gt;. &lt;strong&gt;In fact, you probably don’t either&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;However, if you are able to &lt;strong&gt;show a trend over the past 7, 30, and 365 days&lt;/strong&gt; of how a particular metric has evolved, this will make it much easier to evaluate the impact of your various activities.&lt;/p&gt;

&lt;p&gt;What’s more, this will also keep you from resting on your laurels, by helping you spot absolute numbers that seemed huge a couple of years ago but have, in fact, been stagnating ever since.&lt;/p&gt;

&lt;h2&gt;
  
  
  There’s always room for improvement
&lt;/h2&gt;

&lt;p&gt;Like everyone else, I like celebrating a successful article or video, and so should you. However, &lt;strong&gt;even your most successful content has downsides if you analyze it carefully&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Remember that contest you ran with a bunch of partners and that was super successful? Well, try and do the exercise of looking for that particular metric that might not shine as much as the others. By looking at your referral traffic, for example, you may notice that the impact of the promotion activities of one of the partners is lagging behind. Why is that? Maybe this partner’s community isn’t the right target for you? Maybe the tone you usually use just needs to be tweaked for this particular crowd?&lt;/p&gt;

&lt;p&gt;It might sound like nitpicking to look for things that didn’t work, but trust me, you will learn a lot by paying attention to these.&lt;/p&gt;

&lt;h2&gt;
  
  
  Don’t set (arbitrary) goals too early
&lt;/h2&gt;

&lt;p&gt;It is very tempting to look at some of the metrics your existing tools are giving you access to (ex. pageviews), increase them by an arbitrary ratio, and then use this number as your goal for the upcoming period. This is just wrong.&lt;/p&gt;

&lt;p&gt;First, unless you’ve already given them serious thought, &lt;strong&gt;I doubt the goals that you initially set will reflect tangible and actionable insights&lt;/strong&gt;. Congratulations, you have 20% more unique visitors on your web property! Now what? Are these visitors directly driving 20% more usage of your products? Are you even aiming for increased adoption in the first place? What if I told you that your competitor saw 100% growth during the same period? Is that good or bad?&lt;/p&gt;

&lt;p&gt;Once you’ve narrowed down some of the trends you are going to monitor, it becomes much easier to &lt;strong&gt;adapt your programs and tactics to make sure you’re aiming for continual improvement and growth&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your community ≠ your official channels
&lt;/h2&gt;

&lt;p&gt;A common mistake when looking after a developer community is to limit the breadth of monitored channels to your official/corporate ones. It usually stems from a pure tooling limitation: &lt;strong&gt;we naturally tend to only pay attention to the channels that can easily and automatically be tracked&lt;/strong&gt; (see previous paragraph), since we directly own them.&lt;/p&gt;

&lt;p&gt;However, your community lives in many places, and I would be surprised if your goal were only to grow traffic and engagement on your own properties. Whether or not you have tools that allow you to do this automatically, you should make sure you track metrics related to your performance on third-party channels and platforms.&lt;/p&gt;

&lt;p&gt;At a minimum, in particular if you’re finding it cumbersome to collect information for the properties you don’t directly own, you should always &lt;strong&gt;make referral traffic one of your key indicators&lt;/strong&gt;. This way, you can directly evaluate how much your content has been shared or linked to from third party channels.&lt;/p&gt;

&lt;h2&gt;
  
  
  Empower your authors
&lt;/h2&gt;

&lt;p&gt;For many organizations, the people creating the content are not necessarily the ones responsible for actually publishing and promoting it. This is of course how organizations scale and how people stay focused, but it comes with a major flaw. In order to truly meet their audience, &lt;strong&gt;your authors need to be able to see first-hand how their content performed.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;While not everyone is an expert at Google Analytics or social media tactics, you should aim to give your authors direct access to the tools that will allow them to quickly assess whether their message landed with their intended audience.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Don’t underestimate the impact empowered authors can have on your content creation activities&lt;/strong&gt; and your overall organization. That feature owner who did their best to write a series of blog posts about a new release, actively promoting their piece in key communities and seeking developer engagement? They are the ones who will get tons of valuable first-hand feedback from their &lt;strong&gt;actual users&lt;/strong&gt;, as they will have been able to &lt;strong&gt;meet them where they are&lt;/strong&gt;. And they are the &lt;strong&gt;thought leaders&lt;/strong&gt; you need to establish trust with your developer community.&lt;/p&gt;

&lt;h2&gt;
  
  
  Automation should never replace your own judgment
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Q-_2zyN8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/ykw0jjp7rlu-225x300.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Q-_2zyN8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://blog.benjamin-cabe.com/wp-content/uploads/2020/07/ykw0jjp7rlu-225x300.jpg" alt="white and brown human robot illustration"&gt;&lt;/a&gt;Photo by &lt;a href="https://unsplash.com/@franckinjapan?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText"&gt;Franck V.&lt;/a&gt; on &lt;a href="https://unsplash.com/?utm_source=unsplash&amp;amp;utm_medium=referral&amp;amp;utm_content=creditCopyText"&gt;Unsplash&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;From Google Analytics to Adobe Analytics to your favorite content marketing tool, you probably have at your disposal a &lt;strong&gt;ton of metrics that are automatically collected, and consolidated into nice reports&lt;/strong&gt;. This is great and can save you a lot of effort every time you need to share an activity report with your stakeholders.&lt;/p&gt;

&lt;p&gt;That being said, not only should you &lt;strong&gt;not trust these metrics blindly&lt;/strong&gt; (remember to pay special attention to outliers), but you should also &lt;strong&gt;make sure to complement them with your own manual findings&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;As an example, here are some of the things I do to give my reports more context:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For &lt;strong&gt;social media amplification&lt;/strong&gt;, I always dig into the demographics of the people who ended up sharing or re-sharing something.
As I mentioned before, I will tend to value an article that has been shared less if the people who shared it are neither direct members of my community nor colleagues.&lt;/li&gt;
&lt;li&gt;For &lt;strong&gt;video content&lt;/strong&gt;, ex. on YouTube, I try to compare the number of comments or likes (and dislikes!) that key videos are getting to the numbers that videos from similar communities, or close competitors, are getting. You will likely have to collect these numbers manually, but it should only take a few minutes.&lt;/li&gt;
&lt;li&gt;I often try to manually &lt;strong&gt;capture and quote a couple key comments/posts/tweets&lt;/strong&gt; from the community (both positive and negative ones!). If you have access to &lt;strong&gt;sentiment analysis&lt;/strong&gt; tools, do not hesitate to use them to help you look in the right direction.&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Promoting content is not an exact science.&lt;/p&gt;

&lt;p&gt;&lt;cite&gt;Stephanie Morillo (&lt;a href="https://twitter.com/radiomorillo" rel="noreferrer noopener"&gt;@radiomorillo&lt;/a&gt;), Developer’s Guide to Content Creation. &lt;/cite&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Not all indicators come in the form of tangible numbers, and you won’t always be able to directly include them in your report tables or to track their evolution over time. However, they are instrumental in reminding you that you should not overlook the human aspects, and the importance of personal interactions, in your developer community.&lt;/p&gt;

&lt;p&gt;Once again, if you found this article useful, or if you’ve had other experiences, I would really love to hear from you in the comments. In the meantime, I’ll leave you with a few links to some really good resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Useful links &amp;amp; resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://www.tableau.com/learn/articles/vanity-metrics"&gt;The definition of vanity metrics and how to identify them&lt;/a&gt;. A great article from Tableau, including a great list of alternatives to the typical vanity metrics.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://gumroad.com/l/YAmIh"&gt;The Developer’s Guide to Content Creation&lt;/a&gt;, by Stephanie Morillo (&lt;a href="https://twitter.com/radiomorillo"&gt;@radiomorillo&lt;/a&gt;).&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://medium.com/@ashleymcnamara/what-is-developer-advocacy-3a92442b627c"&gt;What is Developer Advocacy?&lt;/a&gt;, by Ashley Willis (&lt;a href="https://twitter.com/ashleymcnamara"&gt;@ashleymcnamara&lt;/a&gt;).&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>devrel</category>
      <category>community</category>
      <category>metrics</category>
    </item>
    <item>
      <title>How to run TensorFlow Lite on the MXChip AZ3166 IoT Devkit</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Mon, 27 Apr 2020 12:57:41 +0000</pubDate>
      <link>https://forem.com/kartben/how-to-run-tensorflow-lite-on-the-mxchip-az3166-iot-devkit-23lb</link>
      <guid>https://forem.com/kartben/how-to-run-tensorflow-lite-on-the-mxchip-az3166-iot-devkit-23lb</guid>
      <description>&lt;p&gt;This post will be a short one in my ongoing series about TinyML and IoT (check out my previous post and video demo &lt;a href="https://dev.to/kartben/quickly-train-your-ai-model-with-mxchip-iot-devkit-edge-impulse-30i8"&gt;here&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;In a nutshell, you will learn how to run the TensorFlow Lite “Hello World” sample on an &lt;a href="https://microsoft.github.io/azure-iot-developer-kit/"&gt;MXChip IoT developer kit&lt;/a&gt;. You can jump directly to the end of the post for a video tutorial.&lt;/p&gt;

&lt;p&gt;You may not have realized it, but unless you are looking at real-time object tracking or complex image processing (think CSI-style crazy image resolution enhancement, which turns out to be more real than you may have initially thought!), &lt;strong&gt;there is a lot that can be accomplished with fairly small neural networks&lt;/strong&gt;, and therefore reasonable amounts of memory and compute power. Does this mean you can run neural networks on tiny microcontrollers? I certainly hope so, since this is the whole point of this series!&lt;/p&gt;

&lt;h2&gt;
  
  
  TensorFlow on microcontrollers?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--xJTTkoTm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/04/tensorflow-lite-logo-social-e1587565210408.png%3Fresize%3D251%252C107%26ssl%3D1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--xJTTkoTm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/04/tensorflow-lite-logo-social-e1587565210408.png%3Fresize%3D251%252C107%26ssl%3D1" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.tensorflow.org/lite"&gt;TensorFlow Lite&lt;/a&gt; is an open-source deep learning framework that enables on-device inference on a wide range of equipment, from mobile phones to the kind of microcontrollers that may be found in IoT solutions. It is, as the name suggests, a lightweight version of TensorFlow.&lt;/p&gt;

&lt;p&gt;While &lt;strong&gt;TensorFlow&lt;/strong&gt; positions itself as a rich framework for creating, training, and running potentially complex neural networks, &lt;strong&gt;TensorFlow Lite&lt;/strong&gt; focuses solely on &lt;strong&gt;inference&lt;/strong&gt;. It aims at providing low latency and a small model/executable size, making it an ideal candidate for constrained devices.&lt;/p&gt;

&lt;p&gt;In fact, there is even &lt;a href="https://www.tensorflow.org/lite/microcontrollers"&gt;a version of TensorFlow Lite&lt;/a&gt; that is &lt;strong&gt;specifically targeted at microcontrollers&lt;/strong&gt;, with a runtime footprint of just a few dozen kilobytes on e.g. an Arm Cortex-M3. Just like a regular TensorFlow runtime can rely on e.g. a GPU to train or evaluate a model faster, TensorFlow Lite for Microcontrollers can leverage hardware acceleration built into the microcontroller (ex. &lt;a href="https://arm-software.github.io/CMSIS_5/DSP/html/index.html"&gt;CMSIS-DSP&lt;/a&gt; on Arm chips, which provides a collection of APIs for fast math, matrix operations, etc.).&lt;/p&gt;

&lt;p&gt;A simplified workflow for getting TensorFlow Lite to run inference using your own model would be as follows:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--meGk5_Xz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i1.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/04/tflite-flowchart-1.png%3Fresize%3D474%252C234%26ssl%3D1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--meGk5_Xz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i1.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/04/tflite-flowchart-1.png%3Fresize%3D474%252C234%26ssl%3D1" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, you need to build and train your model (❶). Note that TensorFlow is only one of the many options you have for doing so, and nothing prevents you from using PyTorch, DeepLearning4j, etc. Then, the trained model needs to be &lt;a href="https://www.tensorflow.org/lite/guide/get_started#2_convert_the_model_format"&gt;converted&lt;/a&gt; into the TFLite format (❷) before you can use it (➌) in your embedded application. The first two steps typically happen on a “regular” computer while, of course, the end goal is for the third step to happen right on your embedded chip.&lt;/p&gt;

&lt;p&gt;In practice, and as &lt;a href="https://www.tensorflow.org/lite/microcontrollers#developer_workflow"&gt;highlighted in the TensorFlow documentation&lt;/a&gt;, you will probably need to convert your TFLite model into a C array to help with inclusion in your final binary, and you will, of course, need the TensorFlow Lite library for microcontrollers. Luckily for us, this library is made available in the form of an Arduino library, so it should be pretty easy to get it to work with our MXChip AZ3166 devkit!&lt;/p&gt;
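&lt;p&gt;For reference, the C-array conversion step is commonly done with &lt;code&gt;xxd -i model.tflite&lt;/code&gt;. As a minimal sketch of what that transformation does (variable names like &lt;code&gt;g_model&lt;/code&gt; are illustrative, not mandated by TensorFlow), here is a Python equivalent:&lt;/p&gt;

```python
# Minimal stand-in for `xxd -i`: turn a .tflite flatbuffer into a C array
# source snippet that can be compiled straight into an embedded project.
# `g_model` and `model.tflite` are illustrative names.

def tflite_to_c_array(data: bytes, var_name: str = "g_model") -> str:
    lines = [f"const unsigned char {var_name}[] = {{"]
    for i in range(0, len(data), 12):  # 12 bytes per line, like xxd's output
        chunk = data[i:i + 12]
        lines.append("  " + ", ".join(f"0x{b:02x}" for b in chunk) + ",")
    lines.append("};")
    lines.append(f"const unsigned int {var_name}_len = {len(data)};")
    return "\n".join(lines)

if __name__ == "__main__":
    # In real use you would read the converted model file:
    #     with open("model.tflite", "rb") as f: data = f.read()
    # Here, a few fake bytes mimicking a flatbuffer header stand in.
    data = bytes([0x1c, 0x00, 0x00, 0x00, 0x54, 0x46, 0x4c, 0x33])
    print(tflite_to_c_array(data))
```

&lt;p&gt;The printed snippet can then be saved as a &lt;code&gt;.cc&lt;/code&gt;/&lt;code&gt;.h&lt;/code&gt; file and referenced from the sketch, exactly as the Hello World example does with its bundled model data.&lt;/p&gt;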

&lt;h2&gt;
  
  
  TensorFlow Lite on MXChip AZ3166?
&lt;/h2&gt;

&lt;p&gt;I will let you watch the video below for a live demo/tutorial of how to actually run TensorFlow Lite on your MXChip devkit, using the &lt;a href="https://www.tensorflow.org/lite/microcontrollers/get_started#the_hello_world_example"&gt;Hello World example&lt;/a&gt; as a starting point.&lt;/p&gt;

&lt;p&gt;Spoiler alert: it pretty much just works out of the box! The only issue you will encounter is &lt;a href="https://github.com/tensorflow/tensorflow/blob/master/tensorflow/lite/micro/examples/hello_world/README.md"&gt;described in this GitHub issue&lt;/a&gt;, which is why you will see me disabling the &lt;code&gt;min()&lt;/code&gt; and &lt;code&gt;max()&lt;/code&gt; macros in my Arduino sketch.&lt;/p&gt;

&lt;h2&gt;
  
  
  Live demo
&lt;/h2&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/B_DcpRzkAiM"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

</description>
      <category>tinyml</category>
      <category>iot</category>
      <category>ai</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>Quickly train your AI model with MXChip IoT DevKit &amp; Edge Impulse</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Thu, 02 Apr 2020 18:10:21 +0000</pubDate>
      <link>https://forem.com/kartben/quickly-train-your-ai-model-with-mxchip-iot-devkit-edge-impulse-30i8</link>
      <guid>https://forem.com/kartben/quickly-train-your-ai-model-with-mxchip-iot-devkit-edge-impulse-30i8</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--V6MI4tmc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i0.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/Trojan_Room_coffee_pot_xcoffee.png%3Fw%3D474%26ssl%3D1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--V6MI4tmc--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i0.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/Trojan_Room_coffee_pot_xcoffee.png%3Fw%3D474%26ssl%3D1" alt=""&gt;&lt;/a&gt;The first IoT device (albeit not so “smart”)? The &lt;a href="https://en.wikipedia.org/wiki/Trojan_Room_coffee_pot" rel="noreferrer noopener"&gt;Trojan Room coffee pot&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For the past few weeks, I’ve been spending some time digging into what some people call &lt;em&gt;AIoT&lt;/em&gt;, Artificial Intelligence of Things. As often in the vast field of the Internet of Things, a lot of the technology that is powering it is not new. For example, the term &lt;em&gt;Machine Learning&lt;/em&gt; actually dates back to 1959(!), and surely we didn’t wait for IoT to become a thing to connect devices to the Internet, right?&lt;/p&gt;

&lt;p&gt;In the next few blog posts, I want to share part of my journey into &lt;em&gt;AIoT&lt;/em&gt;, and in particular I will try to help you understand how you can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Quickly and efficiently train an AI model that uses sensor data ;&lt;/li&gt;
&lt;li&gt;Run an AI model with very limited processing power (think MCU) ;&lt;/li&gt;
&lt;li&gt;Remotely operate your TinyML* solution, i.e. evolve from AI to AIoT.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--H_m6pPsT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i0.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/tinyml-book.jpg%3Fresize%3D115%252C150%26ssl%3D1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--H_m6pPsT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i0.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/tinyml-book.jpg%3Fresize%3D115%252C150%26ssl%3D1" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;* TinyML&lt;/strong&gt;: &lt;em&gt;the ability to run a neural network model at an energy cost of below 1 mW.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;cite&gt;Pete Warden (&lt;a rel="noreferrer noopener" href="https://twitter.com/petewarden"&gt;@petewarden&lt;/a&gt;&lt;em&gt;)&lt;/em&gt;,&lt;br&gt;&lt;a rel="noreferrer noopener" href="https://amzn.to/33IYwkh"&gt;TinyML: Machine Learning with TensorFlow Lite on Arduino and Ultra-Low-Power Microcontrollers&lt;/a&gt;&lt;/cite&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Simplifying data capture and model training
&lt;/h2&gt;

&lt;p&gt;According to Wikipedia, &lt;a href="https://en.wikipedia.org/wiki/Supervised_learning"&gt;supervised learning&lt;/a&gt; is the machine learning task of learning a function that maps an input to an output based on example input-output pairs.&lt;/p&gt;

&lt;p&gt;As an &lt;strong&gt;example&lt;/strong&gt;, you may want to use input data in the form of vibration information (that you can measure using, for example, an accelerometer) to &lt;strong&gt;predict when a bearing is starting to wear out&lt;/strong&gt;. &lt;/p&gt;

&lt;p&gt;You will build a model (think: a mathematical function on steroids!) that will be able to look at say 1 second of vibration information (the &lt;em&gt;input&lt;/em&gt;) and tell you what the vibration corresponds to (the &lt;em&gt;output&lt;/em&gt; – for example: “bearing OK” / “bearing worn out”). For your model to be accurate, you will “teach” it how to best correlate the inputs to the outputs, by providing it with a training dataset. For this example, this would be a few minutes/hours worth of vibration data, together with the associated label (i.e., the expected outcome).&lt;/p&gt;
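&lt;p&gt;To make this input/output mapping concrete, here is a deliberately tiny, toy “model” in Python: not a neural network, just a single learned threshold on the RMS amplitude of a vibration window (both the data and the feature choice are made up for illustration):&lt;/p&gt;

```python
import math

def rms(window):
    """Root-mean-square amplitude of a short vibration window."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def train_threshold(windows, labels):
    """'Train' a one-parameter model: the midpoint between the mean RMS
    of each class. A toy stand-in for a real classifier, only meant to
    make the input -> output mapping concrete."""
    ok = [rms(w) for w, l in zip(windows, labels) if l == "bearing OK"]
    worn = [rms(w) for w, l in zip(windows, labels) if l == "bearing worn out"]
    return (sum(ok) / len(ok) + sum(worn) / len(worn)) / 2

def predict(window, threshold):
    return "bearing worn out" if rms(window) > threshold else "bearing OK"

# Fake training set: low-amplitude windows are healthy, noisy ones are worn.
windows = [[1, -1, 1, -1], [2, -2, 1, -2], [9, -8, 10, -9], [11, -10, 9, -11]]
labels = ["bearing OK", "bearing OK", "bearing worn out", "bearing worn out"]
t = train_threshold(windows, labels)
print(predict([1, -2, 1, -1], t))     # bearing OK
print(predict([10, -9, 11, -10], t))  # bearing worn out
```

&lt;p&gt;A real model would of course learn far richer features than a single RMS threshold, but the workflow is the same: labeled examples in, a decision function out.&lt;/p&gt;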

&lt;p&gt;Adding some AI into your IoT project will often follow a similar pattern:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Capture and label sensor data&lt;/strong&gt; coming from your actual “thing” ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Design a neural network classifier&lt;/strong&gt;, including the steps potentially needed to process the signal (ex. filter, extract frequency characteristics, etc.) ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Train and test a model&lt;/strong&gt; ;&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Export a model&lt;/strong&gt; to use it in your application.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;All those steps might not be anything out of the ordinary for people with a background in data science, but for a vast majority—including yours truly!—this is just too big of a task. Luckily, there are quite a few great tools out there that can help you get from zero to having a pretty good model, even if you have close to zero skills in neural networks!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Xb7AQe4z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/edgeimpulse-screenshot-studio.png%3Fresize%3D225%252C197%26ssl%3D1" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Xb7AQe4z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/edgeimpulse-screenshot-studio.png%3Fresize%3D225%252C197%26ssl%3D1" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Enter &lt;a href="https://edgeimpulse.com/"&gt;Edge Impulse&lt;/a&gt;. Edge Impulse provides a pretty complete set of tools and libraries that offer a user-friendly (read: no need to be a data scientist) way to go through all of the steps listed above, from data acquisition all the way to model export.&lt;/p&gt;

&lt;p&gt;They have &lt;a href="https://docs.edgeimpulse.com/docs/continuous-motion-recognition"&gt;great tutorials&lt;/a&gt; based on an STM32 developer kit, but since I didn’t have one at hand when initially looking at their solution, I created a quick tool for capturing accelerometer and gyroscope data from my MXChip AZ3166 Developer Kit.&lt;/p&gt;

&lt;p&gt;In order to build an accurate model, you will want to acquire tons of data points. Since IoT devices are often pretty constrained, you need to be a bit creative when capturing this data: your device likely won’t let you simply store megabytes’ worth of data locally, so you’ll have to offload some of the data collection.&lt;/p&gt;

&lt;p&gt;Edge Impulse exposes a set of APIs to minimize the number of manual steps needed to acquire the data you need to train your model:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;ingestion service&lt;/strong&gt; is used to send new device data to Edge Impulse ;&lt;/li&gt;
&lt;li&gt;The &lt;strong&gt;remote management service&lt;/strong&gt; provides a way to remotely trigger the acquisition of data from a device. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As indicated in the Edge Impulse &lt;a href="https://docs.edgeimpulse.com/reference#remote-management"&gt;documentation&lt;/a&gt;, “devices can either connect directly to the remote management service over a WebSocket, or can connect through a proxy”. The WebSocket-based remote management protocol is not incredibly complex, but porting it to your IoT device might be overkill when you can likely just use your computer as a proxy that receives sensor data from your IoT device on the one hand, and communicates with the Edge Impulse backend on the other.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;So how does it work in practice should you want to capture and label sensor data coming from your MXChip developer kit?&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Custom MXChip firmware
&lt;/h2&gt;

&lt;p&gt;You can directly head over to this &lt;a href="https://github.com/kartben/mxchip-serial-capture"&gt;GitHub repo&lt;/a&gt; and &lt;a href="https://raw.githubusercontent.com/kartben/mxchip-serial-capture/master/mxchip-serial-capture.bin"&gt;download&lt;/a&gt; a ready-to-use firmware that you can directly copy to your MXChip devkit. As soon as you have this firmware installed on your MXChip, its only purpose in life will be to dump on its serial interface the raw values acquired from its accelerometer and gyroscope sensors as fast as possible (~150 Hz). If you were to look at the serial output from your MXChip, you’d see tons of traces similar to this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;…
[67,-24,1031,1820,-2800,-70]
[68,-24,1030,1820,-2730,-70]
[68,-24,1030,1820,-2730,-70]
[68,-23,1030,1820,-2730,-70]
[68,-24,1030,1820,-2800,-70]
[68,-24,1030,1820,-2800,-70]
[68,-24,1031,1820,-2800,-70]
[69,-22,1030,1820,-2730,-70]
[69,-22,1030,1820,-2800,-70]
…
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
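&lt;p&gt;Each trace is a plain array of six integers, which conveniently happens to be valid JSON. Assuming the accelerometer axes come first and the gyroscope axes last, parsing a line on the host side is nearly a one-liner:&lt;/p&gt;

```python
import json

def parse_sample(line: str):
    """Parse one serial trace like '[68,-24,1030,1820,-2730,-70]' into
    (accel, gyro) tuples. The accel-first ordering is assumed here."""
    values = json.loads(line)  # each line happens to be valid JSON
    if len(values) != 6:
        raise ValueError(f"expected 6 values, got {len(values)}")
    return tuple(values[:3]), tuple(values[3:])

accel, gyro = parse_sample("[68,-24,1030,1820,-2730,-70]")
print(accel, gyro)  # (68, -24, 1030) (1820, -2730, -70)
```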



&lt;p&gt;There are probably tons of better options for exposing the sensor data over serial more elegantly or efficiently (ex. &lt;a href="https://github.com/firmata/protocol/blob/master/protocol.md"&gt;Firmata&lt;/a&gt;, binary encodings such as &lt;a href="http://cbor.io/"&gt;CBOR&lt;/a&gt;, etc.), but I settled on something quick :)&lt;/p&gt;

&lt;h2&gt;
  
  
  Serial bridge to Edge Impulse
&lt;/h2&gt;

&lt;p&gt;To quickly feed sensor data into Edge Impulse, I’ve developed a very simple Node.js app that reads input from the serial port on the one hand and talks to the Edge Impulse remote management API on the other. As soon as you install and start the bridge (and assuming, of course, that you have an MXChip connected to your machine), you’ll be able to remotely trigger the acquisition of sensor data right from the Edge Impulse portal. You will need to create an Edge Impulse account and project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;npm install serial-edgeimpulse-remotemanager -g
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The tool should be configured using the following environment variables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;EI_APIKEY&lt;/code&gt;: EdgeImpulse API key (ex. ‘ei_e48a5402eb9ebeca5f2806447218a8765196f31ca0df798a6aa393b7165fad5fe’) for your project ;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;EI_HMACKEY&lt;/code&gt;: EdgeImpulse HMAC key (ex. ‘f9ef9527860b28630245d3ef2020bd2f’) for your project ;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;EI_DEVICETYPE&lt;/code&gt;: EdgeImpulse Device Type (ex. ‘MXChip’) ;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;EI_DEVICEID&lt;/code&gt;: EdgeImpulse Device ID (ex. ‘mxchip001’) ;&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;SERIAL_PORT&lt;/code&gt;: Serial port (ex: ‘COM3’, ‘/dev/tty.usbmodem142303’, …).&lt;/li&gt;
&lt;/ul&gt;
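&lt;p&gt;If you want to fail fast on a missing variable rather than chase a cryptic connection error later, a small validation helper can gather these settings before the bridge starts. This is a sketch of my own, not part of the &lt;code&gt;serial-edgeimpulse-remotemanager&lt;/code&gt; package:&lt;/p&gt;

```python
import os

# The five variables the bridge expects, as listed above.
REQUIRED = ["EI_APIKEY", "EI_HMACKEY", "EI_DEVICETYPE", "EI_DEVICEID", "SERIAL_PORT"]

def load_config(env=os.environ):
    """Collect the bridge's settings, failing fast on anything missing."""
    missing = [name for name in REQUIRED if not env.get(name)]
    if missing:
        raise RuntimeError("missing environment variables: " + ", ".join(missing))
    return {name: env[name] for name in REQUIRED}

if __name__ == "__main__":
    demo_env = {  # placeholder values, same shape as the examples above
        "EI_APIKEY": "ei_0123…", "EI_HMACKEY": "f9ef…",
        "EI_DEVICETYPE": "MXChip", "EI_DEVICEID": "mxchip001",
        "SERIAL_PORT": "/dev/tty.usbmodem142303",
    }
    print(load_config(demo_env)["EI_DEVICEID"])  # mxchip001
```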

&lt;p&gt;Once all the environment variables have been set (you may &lt;a href="https://www.npmjs.com/package/dotenv"&gt;declare them in a &lt;code&gt;.env&lt;/code&gt; file&lt;/a&gt;), you can run the tool:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;serial-edgeimpulse-remotemanager
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;From that point, your MXChip device will be accessible in your Edge Impulse project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/screencapture-studio-edgeimpulse-studio-256-devices-2020-03-31-15_05_06.png?ssl=1"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Pp1Eg3ip--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i2.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/screencapture-studio-edgeimpulse-studio-256-devices-2020-03-31-15_05_06.png%3Fw%3D474%26ssl%3D1" alt=""&gt;&lt;/a&gt;Edge Impulse &amp;gt; Device Explorer&lt;/p&gt;

&lt;p&gt;&lt;a href="https://i1.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/screencapture-studio-edgeimpulse-studio-256-acquisition-training-2020-03-31-15_05_58.png?fit=1209%2C898&amp;amp;ssl=1"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GN-RRNGJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i1.wp.com/blog.benjamin-cabe.com/wp-content/uploads/2020/03/screencapture-studio-edgeimpulse-studio-256-acquisition-training-2020-03-31-15_05_58.png%3Fw%3D474%26ssl%3D1" alt=""&gt;&lt;/a&gt;&lt;em&gt;Edge Impulse &amp;gt; &lt;/em&gt;Capture&lt;/p&gt;

&lt;p&gt;You can now very easily start capturing and labeling data, build &amp;amp; train a model based on this data, and even test the accuracy of your model once you’ve actually trained it.&lt;/p&gt;

&lt;p&gt;In fact, let’s check the end-to-end experience with the video tutorial below.&lt;/p&gt;

&lt;p&gt;&lt;iframe width="710" height="399" src="https://www.youtube.com/embed/Dan8TOWg30o"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  TensorFlow on an MCU?!
&lt;/h2&gt;

&lt;p&gt;Now that we’ve trained a model that turns sensor data into meaningful insights, we’ll see in a future article how to run that very model directly on the MXChip. You didn’t think we were training that model just for fun, did you?&lt;/p&gt;

</description>
      <category>iot</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>tinyml</category>
    </item>
    <item>
      <title>Top 5 VS Code Extensions for IoT Developers</title>
      <dc:creator>Benjamin Cabé</dc:creator>
      <pubDate>Fri, 06 Mar 2020 09:45:21 +0000</pubDate>
      <link>https://forem.com/kartben/top-5-vs-code-extensions-for-iot-developers-597k</link>
      <guid>https://forem.com/kartben/top-5-vs-code-extensions-for-iot-developers-597k</guid>
      <description>&lt;p&gt;In just a few years, &lt;a href="https://code.visualstudio.com/"&gt;Visual Studio Code&lt;/a&gt; has conquered the hearts of a wide variety of developers. It took off very quickly in the web development communities, but it has now also become the IDE of choice for Java, Python, or C/C++ developers as well, whether they run Linux, MacOS, or Windows. In fact, in Stack Overflow’s most recent &lt;a href="https://insights.stackoverflow.com/survey/2019"&gt;developer survey&lt;/a&gt;, VS Code is ranked at  &lt;strong&gt;over 50% market share&lt;/strong&gt;  among the 90,000+ developers who responded.&lt;/p&gt;

&lt;p&gt;Whether you’re just getting into IoT or have been working on IoT solutions for some time already, you’ve probably realized that “full-stack developer” is a term that often applies to IoT as well. You may very well be spending most of your days developing and testing the firmware of your connected embedded device in C. Still, once in a while, you may want to tune some Python scripts used for your build system, or use a command-line tool to check that your IoT backend services are up and running.&lt;/p&gt;

&lt;p&gt;Rather than having to switch from one development environment or command-line terminal to another, I wouldn’t be surprised if, just like me, you’d be interested in doing most of your work without ever leaving your IDE.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;In this article, we look at some essential VS Code extensions that will help you become a more productive IoT developer.&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  VS Code extension for Arduino
&lt;/h2&gt;

&lt;p&gt;It’s been a very long time since I last opened the Arduino IDE on my computer. It is a great tool, especially for helping newcomers get started with the Arduino ecosystem, but it lacks some key features for anyone interested in doing more than just blinking an LED or running basic programs. And now that &lt;a href="https://en.wikipedia.org/wiki/List_of_Arduino_boards_and_compatible_systems"&gt;more and more platforms&lt;/a&gt; are compatible with Arduino, from RISC-V developer kits such as the HiFive1, to the ESP32 or the STM32 Nucleo family, there are even more reasons to look for a better IDE for Arduino development.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.vscode-arduino"&gt;VS Code extension for Arduino&lt;/a&gt; is built on top of the official Arduino IDE—which you need to install once but will probably never open ever again—and provides you with all the features you’d expect to find in the classic IDE (e.g. browsing code samples or monitor your serial port).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--kh76T32G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8wg4r60cdb1agebuxcgr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--kh76T32G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8wg4r60cdb1agebuxcgr.png" alt="The VS Code extension for Arduino in action."&gt;&lt;/a&gt;The VS Code extension for Arduino in action.&lt;/p&gt;

&lt;p&gt;What makes the extension particularly powerful, in my opinion, is the fact that it builds on top of the VS Code C/C++ tools to provide you with full-blown &lt;strong&gt;Intellisense&lt;/strong&gt; and &lt;strong&gt;code navigation&lt;/strong&gt; for your code, which proves very useful.&lt;/p&gt;

&lt;p&gt;I vividly remember the first time I put my hands on and soldered an Arduino-compatible board, circa 2010, at the TechShop Menlo Park. It’s been incredible to see the Arduino ecosystem grow over the years. Equally incredible is to think that, until very recently, debugging a so-called sketch was reserved for the most adventurous programmers. If there were only one reason for you to try out the VS Code extension for Arduino, it would have to be the fact that it makes &lt;strong&gt;debugging&lt;/strong&gt; Arduino programs so much easier (no more &lt;code&gt;Serial.println&lt;/code&gt; traces, yay!).&lt;/p&gt;
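&lt;p&gt;As an illustration, here is the kind of minimal sketch you can now step through with breakpoints instead of peppering it with &lt;code&gt;Serial.println&lt;/code&gt; calls (the pin and counter are arbitrary choices for the example):&lt;/p&gt;

```cpp
// Minimal Arduino sketch; requires the Arduino core for your board.
// Set a breakpoint inside loop() from VS Code and watch blinkCount
// change, instead of relying on Serial.println traces.
int blinkCount = 0;  // variable to inspect in the debugger

void setup() {
  pinMode(LED_BUILTIN, OUTPUT);
  Serial.begin(115200);
}

void loop() {
  digitalWrite(LED_BUILTIN, HIGH);  // breakpoint here
  delay(500);
  digitalWrite(LED_BUILTIN, LOW);
  delay(500);
  blinkCount++;
}
```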

&lt;p&gt;Behind the scenes, the extension leverages common debug interfaces such as CMSIS-DAP, J-Link, and ST-Link. If your device already has an onboard debugging chip implementing one of these interfaces, you’re all set! If not, you will simply need to use an external debug probe that’s compatible with your chip.&lt;/p&gt;




&lt;h2&gt;
  
  
  PlatformIO IDE
&lt;/h2&gt;

&lt;p&gt;As I mentioned in the previous section, more and more platforms tap into the Arduino paradigm, but there is, of course, more to embedded development than the Arduino ecosystem.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://platformio.org/"&gt;PlatformIO&lt;/a&gt; originated as an open-source command-line tool to support IoT and embedded developers by providing a uniform mechanism for toolchain provisioning, library management, debugging, etc. It quickly evolved to integrate tightly with VS Code, and the &lt;a href="https://marketplace.visualstudio.com/items?itemName=platformio.platformio-ide"&gt;PlatformIO IDE extension for VS Code&lt;/a&gt; is now one of the most popular ones on the Visual Studio Marketplace.&lt;/p&gt;

&lt;p&gt;PlatformIO supports 30+ platforms (e.g. Atmel AVR, Atmel SAM, Espressif ESP32 and ESP8266, Kendryte K210, Freescale Kinetis, etc.), 20+ frameworks (Arduino, ESP-IDF, Arm Mbed, Zephyr, …), and over 750 different boards! For each of these platforms, the extension helps you write your code (code completion, code navigation), manage your dependencies, build and debug, and interact with your device using the serial port monitor.&lt;/p&gt;

&lt;p&gt;Another interesting feature is the ability to convert an existing Arduino project to the PlatformIO format, essentially making it much easier to share with your coworkers (and the world!), since it can then leverage PlatformIO’s &lt;a href="https://docs.platformio.org/en/latest/librarymanager/index.html"&gt;advanced library management features&lt;/a&gt;. For example, it can automatically pull in your 3&lt;sup&gt;rd&lt;/sup&gt;-party libraries based solely on the header files you include in your code.&lt;/p&gt;
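&lt;p&gt;For reference, a project’s configuration lives in a single &lt;code&gt;platformio.ini&lt;/code&gt; file. The sketch below is a hypothetical example for an ESP32 board running the Arduino framework; the board name and library are illustrative, not prescriptive:&lt;/p&gt;

```ini
; platformio.ini -- one [env:...] section per target board
[env:esp32dev]
platform = espressif32
board = esp32dev
framework = arduino
monitor_speed = 115200
; 3rd-party libraries declared here are fetched automatically
lib_deps =
    knolleary/PubSubClient
```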




&lt;h2&gt;
  
  
  Azure IoT Tools
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://marketplace.visualstudio.com/items?itemName=vsciot-vscode.azure-iot-tools"&gt;Azure IoT Tools extension&lt;/a&gt; for VS Code is essentially an extension bundle that installs in one single click the Azure IoT Hub Toolkit, the IoT Edge extension, and the Device Workbench.&lt;/p&gt;

&lt;p&gt;As you look at connecting your devices to the cloud, Azure IoT Hub provides you with all you need to manage your devices, collect their telemetry and route it to consuming services, and more. Using the Azure IoT Hub extension, you can easily provision an IoT Hub instance in your Azure subscription, provision your devices, monitor the data they are sending, etc., all without having to leave your IDE!&lt;/p&gt;

&lt;p&gt;If you are interested in using a container-based architecture for making your IoT gateways &lt;em&gt;smart&lt;/em&gt;, chances are IoT Edge can help you! Thanks to the dedicated extension, you can easily build your custom IoT Edge modules, and deploy them to your edge devices connected to IoT Hub, either real ones or simulated ones running on your development machine.&lt;/p&gt;

&lt;p&gt;Finally, the Device Workbench can help you get started very quickly with actual devices. It provides a set of tools to help with building your own &lt;a href="https://docs.microsoft.com/en-us/azure/iot-pnp/"&gt;IoT plug-and-play&lt;/a&gt; device, or simply to try out Azure IoT with an actual device, using one of the many examples bundled with the workbench.&lt;/p&gt;

&lt;p&gt;What do I like most about the Azure IoT Tools extension? Every few weeks, you get &lt;a href="https://devblogs.microsoft.com/visualstudio/tag/iot/"&gt;tons of awesome updates&lt;/a&gt; and new features, as the extension is actively developed.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;By the way, if you don’t have an Azure subscription and want to get started with IoT on Azure, you can create a free trial account &lt;a href="https://azure.microsoft.com/free/iot/"&gt;here&lt;/a&gt;!&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Remote Development extension pack
&lt;/h2&gt;

&lt;p&gt;IoT development is much more than writing code for embedded devices. Frequently, you will find yourself in a situation where you want to interact with a folder that lives in a container, on a remote edge gateway, or on a cloud server. You can certainly use SSH and/or SCP to keep your local and remote development environments in sync, but this can be pretty painful and error-prone.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---Q20N2tI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/cwuchsp1yzaqkwqvvnhs.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---Q20N2tI--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/cwuchsp1yzaqkwqvvnhs.gif" alt="Remote Development Extension pack screencast"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://marketplace.visualstudio.com/items?itemName=ms-vscode-remote.vscode-remote-extensionpack"&gt;Remote Development extension pack&lt;/a&gt; allows you to open any folder in a container or on a remote machine and to then just use VS Code’s as if you were manipulating local resources.&lt;/p&gt;
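&lt;p&gt;For instance, dropping a &lt;code&gt;.devcontainer/devcontainer.json&lt;/code&gt; file such as the hypothetical sketch below into your project lets VS Code reopen the folder inside a container, with your toolchain and extensions preinstalled (the image name and extension list are purely illustrative):&lt;/p&gt;

```jsonc
// .devcontainer/devcontainer.json
{
  "name": "iot-firmware-dev",          // arbitrary display name
  "image": "ubuntu:20.04",             // any image containing your toolchain
  "extensions": [
    "platformio.platformio-ide"        // installed inside the container
  ]
}
```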




&lt;h2&gt;
  
  
  REST Client
&lt;/h2&gt;

&lt;p&gt;If you are like me, your go-to tool for testing REST APIs is probably Postman. It is indeed a great tool for creating and testing REST, SOAP, or GraphQL requests and it even allows you to save queries in the cloud and to share them with your colleagues. However, I recently found myself in a situation where I wanted to share some sample queries with people during a training session, and I didn’t want them to have to copy-paste unnecessarily from the training instructions to Postman; instead, I wanted the queries to be part of the actual training material!&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://marketplace.visualstudio.com/items?itemName=humao.rest-client"&gt;REST Client extension&lt;/a&gt; turns any file with an .http or .rest extension into an executable notebook, where you can very easily execute all the queries contained in it.&lt;/p&gt;

&lt;p&gt;As you build an end-to-end IoT solution, it is more than likely that you will rely on 3&lt;sup&gt;rd&lt;/sup&gt; party services along the way, and that you will interact with them using some form of REST API. For example, you may rely on a weather service as part of your predictive maintenance computations. Below is an example of how I shared with my students a few queries showing how to use the Azure Maps API to compute routes or render map tiles.&lt;/p&gt;
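&lt;p&gt;A REST Client file is just plain text: requests are separated by &lt;code&gt;###&lt;/code&gt;, and file variables are declared with &lt;code&gt;@&lt;/code&gt;. Below is a hypothetical sketch of such a file for the Azure Maps APIs; treat the endpoints and query parameters as an approximation and check the Azure Maps documentation for the exact ones:&lt;/p&gt;

```http
@key = YOUR_AZURE_MAPS_SUBSCRIPTION_KEY

### Compute a driving route between two lat,lon pairs
GET https://atlas.microsoft.com/route/directions/json
    ?api-version=1.0
    &amp;subscription-key={{key}}
    &amp;query=47.6062,-122.3321:47.6205,-122.3493

### Render a map tile
GET https://atlas.microsoft.com/map/tile/png
    ?api-version=1.0
    &amp;subscription-key={{key}}
    &amp;layer=basic&amp;style=main
    &amp;zoom=12&amp;x=656&amp;y=1425
```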


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
 &lt;br&gt;
And now for the same queries (except for the subscription key which has been replaced by a real one 🙂) executed in real-time thanks to the REST Client extension:

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uBmTwuaK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/7w7tc0yhqc7ftl84gwkp.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uBmTwuaK--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/7w7tc0yhqc7ftl84gwkp.gif" alt="VS Code REST Client extension"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How about you? Are there other VS Code extensions that you’ve found useful for your IoT projects? If so, I would love to hear about them in the comments.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can also always &lt;a href="https://twitter.com/kartben"&gt;find me on Twitter&lt;/a&gt; to continue the conversation.&lt;/p&gt;

</description>
      <category>iot</category>
      <category>vscode</category>
      <category>arduino</category>
      <category>azureiot</category>
    </item>
  </channel>
</rss>
