<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Oleksandr Pliekhov</title>
    <description>The latest articles on Forem by Oleksandr Pliekhov (@oleks_pv_26).</description>
    <link>https://forem.com/oleks_pv_26</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3846629%2F99f112da-3969-4634-90a2-3b8569923361.jpg</url>
      <title>Forem: Oleksandr Pliekhov</title>
      <link>https://forem.com/oleks_pv_26</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/oleks_pv_26"/>
    <language>en</language>
    <item>
      <title>I extended an open-source BLE mesh messenger with on-device AI for disaster response</title>
      <dc:creator>Oleksandr Pliekhov</dc:creator>
      <pubDate>Fri, 27 Mar 2026 19:16:53 +0000</pubDate>
      <link>https://forem.com/oleks_pv_26/i-extended-an-open-source-ble-mesh-messenger-with-on-device-ai-for-disaster-response-1gn0</link>
      <guid>https://forem.com/oleks_pv_26/i-extended-an-open-source-ble-mesh-messenger-with-on-device-ai-for-disaster-response-1gn0</guid>
      <description>&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://hackernoon.com/when-the-internet-dies-your-phone-can-still-be-smart-building-ai-powered-disaster-communication" rel="noopener noreferrer"&gt;HackerNoon&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/AleksPlekhov/ai-mesh-emergency-communication-platform" rel="noopener noreferrer"&gt;https://github.com/AleksPlekhov/ai-mesh-emergency-communication-platform&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;The Problem Nobody Talks About&lt;/h2&gt;

&lt;p&gt;Every major disaster follows the same pattern. The earthquake hits. The hurricane makes landfall. The wildfire jumps the firebreak. And within minutes - sometimes seconds - the cell towers go down.&lt;/p&gt;

&lt;p&gt;Not because they're damaged. Because 50,000 people are all trying to call their families at the same time.&lt;/p&gt;

&lt;p&gt;First responders pull out their phones. Nothing. Rescue coordinators open their apps. No signal. Meanwhile, somewhere in the rubble, someone is typing "trapped under debris, 3rd floor, need help" into a mesh messaging app - and that message is sitting in a queue behind a hundred check-ins, status updates, and "is everyone okay?" messages from people who are fine.&lt;/p&gt;

&lt;p&gt;This is the information overload problem. And for offline mesh networks, it has remained largely unsolved.&lt;/p&gt;

&lt;h2&gt;Mesh Networking Works. Triage Doesn't.&lt;/h2&gt;

&lt;p&gt;Apps like Meshtastic and BitChat have proven that device-to-device mesh communication works without internet infrastructure. Your phone talks to my phone via Bluetooth. My phone relays to the next phone. Messages hop across a network of humans - no towers, no servers, no connectivity required.&lt;/p&gt;

&lt;p&gt;The problem is that these systems treat every message equally. A cardiac arrest report and a "checking in, all good" message arrive with identical priority. In a network serving hundreds of nodes during an active disaster, a rescue coordinator might receive dozens of messages per minute with zero mechanism to distinguish life-threatening signals from ambient traffic.&lt;/p&gt;

&lt;p&gt;The result is cognitive overload at precisely the moment when a human being needs to make fast, accurate decisions.&lt;/p&gt;

&lt;p&gt;We needed AI. But we had a constraint: the internet was gone.&lt;/p&gt;

&lt;h2&gt;On-Device AI: The Only Option That Actually Works&lt;/h2&gt;

&lt;p&gt;Cloud NLP is off the table the moment infrastructure fails. You cannot call an API when there is no network. You cannot run inference on a remote server when that server is unreachable.&lt;/p&gt;

&lt;p&gt;What you can do is run a small, fast, quantized neural model directly on a commodity Android smartphone - no connectivity required, no cloud dependency, no specialized hardware.&lt;/p&gt;

&lt;p&gt;This is the core insight behind ResQMesh AI Platform: an open-source Android platform that integrates on-device machine learning directly into the BLE mesh communication stack.&lt;/p&gt;

&lt;p&gt;The entire AI layer runs locally. Always. Even when every tower in a 50-mile radius is dark.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1hgdi2jg068q0epf53l.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn1hgdi2jg068q0epf53l.jpg" alt="ResQMesh AI chat interface showing emergency message classification" width="800" height="1644"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;How the Classification Pipeline Works&lt;/h2&gt;

&lt;p&gt;The heart of the system is a two-stage composite classifier I call M1.&lt;/p&gt;

&lt;h3&gt;Stage 1: Keyword Classifier&lt;/h3&gt;

&lt;p&gt;Before any neural inference runs, every incoming message passes through a deterministic rule engine covering approximately 90 lexical patterns derived from FEMA Incident Command System guidelines. Patterns like "trapped inside," "can't breathe," "water rising," "cardiac arrest."&lt;/p&gt;

&lt;p&gt;If a CRITICAL or HIGH pattern matches, the message is tagged and surfaced immediately - no neural model invocation required. This guarantees sub-millisecond classification for the most time-sensitive messages, and it works on any Android device regardless of hardware capability.&lt;/p&gt;
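
&lt;p&gt;&lt;em&gt;A minimal sketch of this stage - two illustrative patterns and hypothetical names, not the actual ResQMesh rule set:&lt;/em&gt;&lt;/p&gt;

```kotlin
// Stage 1 sketch: deterministic keyword triage before any neural inference.
// The pattern list is a tiny illustrative excerpt; the real engine covers ~90.
enum class Priority { CRITICAL, HIGH, NORMAL, LOW }

val rules: List<Pair<Regex, Priority>> = listOf(
    Regex("""\b(trapped|can'?t breathe|cardiac arrest|not breathing)""", RegexOption.IGNORE_CASE) to Priority.CRITICAL,
    Regex("""\b(water rising|building collapse|severe bleeding)""", RegexOption.IGNORE_CASE) to Priority.HIGH,
)

// Returns a priority as soon as any pattern matches, or null to fall
// through to the Stage 2 neural classifier.
fun keywordClassify(message: String): Priority? =
    rules.firstOrNull { (pattern, _) -> pattern.containsMatchIn(message) }?.second

fun main() {
    println(keywordClassify("Trapped under debris, 3rd floor, need help"))  // CRITICAL
    println(keywordClassify("checking in, all good"))                       // null
}
```

&lt;p&gt;Because the rules are plain regex matches, this stage needs no model file, no warm-up, and no particular hardware - which is what makes the sub-millisecond guarantee possible.&lt;/p&gt;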

&lt;h3&gt;Stage 2: Neural Classifier&lt;/h3&gt;

&lt;p&gt;Messages that don't match any keyword pattern go to a quantized Conv1D TensorFlow Lite model. The architecture is deliberately lightweight: an embedding layer, a single Conv1D layer with global max pooling, a dense layer with dropout, and a 10-class softmax output.&lt;/p&gt;

&lt;p&gt;The model classifies messages into nine emergency categories - Medical, Collapse, Fire, Flood, Security, Weather, Missing Person, Infrastructure, Resource Request - plus a Non-Emergency class. Each category maps to one of four priority tiers: CRITICAL, HIGH, NORMAL, or LOW.&lt;/p&gt;

&lt;p&gt;Total model size: 420KB. Inference time on mid-range hardware: single-digit milliseconds.&lt;/p&gt;

&lt;p&gt;A confidence threshold of τ = 0.25 filters out uncertain predictions. Messages shorter than three tokens skip neural classification entirely.&lt;/p&gt;
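
&lt;p&gt;&lt;em&gt;A sketch of that post-inference gating. The TFLite call itself is elided - &lt;code&gt;softmax&lt;/code&gt; stands in for the model's 10-class output - and the category-to-tier mapping shown is an assumption for illustration, not the repository's actual table:&lt;/em&gt;&lt;/p&gt;

```kotlin
// Gating logic around the Stage 2 model: short-message skip, confidence
// threshold, and category-to-tier mapping (mapping values are assumed).
enum class Priority { CRITICAL, HIGH, NORMAL, LOW }

val categories = listOf(
    "Medical", "Collapse", "Fire", "Flood", "Security",
    "Weather", "MissingPerson", "Infrastructure", "ResourceRequest", "NonEmergency",
)

val tierOf = mapOf(
    "Medical" to Priority.CRITICAL, "Collapse" to Priority.CRITICAL,
    "Fire" to Priority.HIGH, "Flood" to Priority.HIGH, "Security" to Priority.HIGH,
    "MissingPerson" to Priority.HIGH, "Weather" to Priority.NORMAL,
    "Infrastructure" to Priority.NORMAL, "ResourceRequest" to Priority.NORMAL,
    "NonEmergency" to Priority.LOW,
)

const val TAU = 0.25f  // confidence threshold from the article

fun gate(message: String, softmax: FloatArray): Priority {
    // Messages shorter than three tokens skip neural classification entirely
    // (falling back to NORMAL here is an assumption, not documented behavior).
    if (message.trim().split(Regex("""\s+""")).size < 3) return Priority.NORMAL
    val best = softmax.indices.maxByOrNull { softmax[it] }!!
    if (softmax[best] < TAU) return Priority.NORMAL  // uncertain prediction filtered out
    return tierOf.getValue(categories[best])
}
```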

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftqah2zs9ghyw4o1v5c9m.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftqah2zs9ghyw4o1v5c9m.jpg" alt="Emergency Feed showing messages sorted by priority categories" width="800" height="1644"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;The Part That Surprised Us: The BLE Priority Queue&lt;/h2&gt;

&lt;p&gt;Classification is only half the problem. The other half is radio-layer delivery.&lt;/p&gt;

&lt;p&gt;I replaced the FIFO broadcaster with a &lt;code&gt;java.util.PriorityQueue&lt;/code&gt; protected by a &lt;code&gt;Mutex&lt;/code&gt;, driven by a &lt;code&gt;CONFLATED&lt;/code&gt; signal actor. Every outgoing packet gets a priority field assigned synchronously by the classifier before enqueuing. CRITICAL packets preempt everything else at the radio layer.&lt;/p&gt;
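
&lt;p&gt;&lt;em&gt;The ordering itself can be sketched without coroutines. The module uses a &lt;code&gt;Mutex&lt;/code&gt;-guarded &lt;code&gt;PriorityQueue&lt;/code&gt; plus a conflated signal channel; a &lt;code&gt;PriorityBlockingQueue&lt;/code&gt; keeps this example dependency-free while showing the same preemption behavior:&lt;/em&gt;&lt;/p&gt;

```kotlin
import java.util.concurrent.PriorityBlockingQueue

// Sketch of priority-ordered dispatch at the radio layer (simplified
// stand-in for the Mutex + conflated-channel design described above).
enum class Priority { CRITICAL, HIGH, NORMAL, LOW }  // ordinal doubles as rank

data class Packet(val priority: Priority, val seq: Long, val payload: String)

// Lower ordinal drains first; seq keeps FIFO order inside each tier.
val outbound = PriorityBlockingQueue<Packet>(
    64,
    compareBy({ it.priority.ordinal }, { it.seq }),
)

fun enqueue(p: Packet) = outbound.put(p)      // priority assigned by the classifier
fun nextForRadio(): Packet = outbound.take()  // CRITICAL always surfaces first
```

&lt;p&gt;A CRITICAL packet enqueued behind a thousand LOW check-ins is still the next thing the radio loop sees - that is the whole mechanism behind the latency reduction below.&lt;/p&gt;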

&lt;p&gt;The benchmark result: &lt;strong&gt;1001x reduction in delivery latency&lt;/strong&gt; for CRITICAL messages under simulated mesh load - 1,000 packets queued at 100µs intervals.&lt;/p&gt;

&lt;p&gt;One thousand and one times faster delivery for the message that says someone is dying.&lt;/p&gt;

&lt;h2&gt;What Else the Platform Does&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;M2&lt;/strong&gt; - Offline Speech Recognition via Vosk Android. Voice-to-text that works completely offline, powered by a deep neural acoustic model.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;M3&lt;/strong&gt; - FEMA ICS-213 Report Generator. Automatically aggregates classified messages into ICS-213-compatible situation reports - rendered as HTML, printable via Android's PrintManager.&lt;/p&gt;
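
&lt;p&gt;&lt;em&gt;M3's aggregation step, reduced to a sketch - the field set is abbreviated (a real ICS-213 form carries To/From/Subject/Date blocks and more), and the priority ranking is an assumed ordering:&lt;/em&gt;&lt;/p&gt;

```kotlin
// Sketch: classified mesh messages rendered as a printable HTML table,
// CRITICAL rows first (rank map is an illustrative assumption).
data class Classified(val category: String, val priority: String, val text: String)

val rank = mapOf("CRITICAL" to 0, "HIGH" to 1, "NORMAL" to 2, "LOW" to 3)

fun ics213Html(incident: String, messages: List<Classified>): String = buildString {
    append("<html><body><h1>ICS-213 General Message: $incident</h1><table>")
    append("<tr><th>Priority</th><th>Category</th><th>Message</th></tr>")
    for (m in messages.sortedBy { rank[it.priority] ?: 4 }) {
        append("<tr><td>${m.priority}</td><td>${m.category}</td><td>${m.text}</td></tr>")
    }
    append("</table></body></html>")
}
```

&lt;p&gt;A string like this can then be handed to a WebView print adapter and sent through Android's PrintManager, which is the printing path the module describes.&lt;/p&gt;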

&lt;p&gt;&lt;strong&gt;M4&lt;/strong&gt; - Energy Optimizer. A battery-aware relay policy engine that adapts packet forwarding probability based on device power state. M4 enforces the guarantee that CRITICAL traffic always relays - with 21 unit tests.&lt;/p&gt;
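
&lt;p&gt;&lt;em&gt;The shape of such a policy, sketched with illustrative thresholds and probabilities (not M4's real constants) - only the CRITICAL-always-relays invariant comes from the description above:&lt;/em&gt;&lt;/p&gt;

```kotlin
import kotlin.random.Random

// Battery-aware relay policy sketch. Numeric thresholds are assumptions;
// the hard invariant is that CRITICAL traffic always relays.
enum class Priority { CRITICAL, HIGH, NORMAL, LOW }

fun relayProbability(priority: Priority, batteryPct: Int): Double = when {
    priority == Priority.CRITICAL -> 1.0   // hard guarantee: never drop CRITICAL
    batteryPct > 50 -> 1.0                 // healthy battery: relay everything
    batteryPct > 20 -> if (priority == Priority.HIGH) 1.0 else 0.5
    else -> if (priority == Priority.HIGH) 0.8 else 0.1
}

fun shouldRelay(p: Priority, batteryPct: Int, rng: Random = Random.Default): Boolean =
    rng.nextDouble() < relayProbability(p, batteryPct)
```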

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn06mo1xdll14q2yf7abq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fn06mo1xdll14q2yf7abq.jpg" alt="ICS-213 FEMA report generated from mesh network messages" width="800" height="1644"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;The Architecture Decision That Made Everything Possible&lt;/h2&gt;

&lt;p&gt;All four modules live in a self-contained Gradle library module - &lt;code&gt;:resqmesh-ai&lt;/code&gt; - with zero dependency on the application module. The entire AI layer can be unit tested independently of the Android UI.&lt;/p&gt;
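
&lt;p&gt;&lt;em&gt;In Gradle Kotlin DSL terms the split looks roughly like this (file names and module list assumed):&lt;/em&gt;&lt;/p&gt;

```kotlin
// settings.gradle.kts - the AI layer lives in its own module
include(":app", ":resqmesh-ai")

// app/build.gradle.kts - the app depends on the library, never the reverse,
// which is what keeps :resqmesh-ai unit-testable without the Android UI
dependencies {
    implementation(project(":resqmesh-ai"))
}
```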

&lt;h2&gt;Why This Matters Beyond Disaster Response&lt;/h2&gt;

&lt;p&gt;The pattern I've built - lightweight on-device neural classification combined with deterministic rule-based fallback, running over an encrypted peer-to-peer mesh, with radio-layer priority preemption - is generalizable.&lt;/p&gt;

&lt;p&gt;Anywhere connectivity is unreliable or absent. Rural areas. Underground facilities. Ships at sea.&lt;/p&gt;

&lt;p&gt;A 420KB quantized Conv1D model running on a mid-range Android phone can classify emergency messages faster than a human can read them. You don't need a data center. You don't need an API key. You don't need internet.&lt;/p&gt;

&lt;h2&gt;Open Source and What's Next&lt;/h2&gt;

&lt;p&gt;ResQMesh AI Platform is released under GPL-3.0:&lt;br&gt;
&lt;a href="https://github.com/AleksPlekhov/ai-mesh-emergency-communication-platform" rel="noopener noreferrer"&gt;https://github.com/AleksPlekhov/ai-mesh-emergency-communication-platform&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Planned next steps include federated learning for in-field model improvement, a real-time situational awareness map on offline OpenStreetMap tiles, adaptive BLE/Wi-Fi Direct switching, and bridging to FEMA IPAWS when connectivity is restored.&lt;/p&gt;

&lt;p&gt;If you're working on offline communication, emergency response systems, or on-device ML for constrained environments - open an issue or start a discussion on GitHub.&lt;/p&gt;

</description>
      <category>android</category>
      <category>opensource</category>
      <category>machinelearning</category>
      <category>kotlin</category>
    </item>
  </channel>
</rss>
