<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Elimihele God's favour</title>
    <description>The latest articles on Forem by Elimihele God's favour (@favourite1975).</description>
    <link>https://forem.com/favourite1975</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F662145%2F4b0c1821-8d70-40fa-ab9d-6d2da0f1e8d1.jpg</url>
      <title>Forem: Elimihele God's favour</title>
      <link>https://forem.com/favourite1975</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/favourite1975"/>
    <language>en</language>
    <item>
      <title>The Future of Compose UI with JetStart</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Sun, 05 Apr 2026 15:23:08 +0000</pubDate>
      <link>https://forem.com/favourite1975/the-future-of-compose-ui-with-jetstart-3ef8</link>
      <guid>https://forem.com/favourite1975/the-future-of-compose-ui-with-jetstart-3ef8</guid>
      <description>&lt;p&gt;Jetpack Compose completely changed how we write Android UIs. It brought declarative programming to an ecosystem desperate for it. But while the code changed, the tooling didn't. &lt;/p&gt;

&lt;p&gt;Android developers were writing elegant, modern Compose code but compiling it through a bloated pipeline that took minutes per iteration. We built JetStart to bring the tooling into the modern era.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where JetStart is Going
&lt;/h2&gt;

&lt;p&gt;We successfully replaced our old DSL parser with raw DEX compilation. This allows developers to use genuine Compose functions without any syntax hacks. What's next for the JetStart ecosystem?&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Expanded Desktop &amp;amp; iOS Support
&lt;/h3&gt;

&lt;p&gt;Kotlin Multiplatform (KMP) has taken the world by storm. Compose isn't just for Android anymore—it's for iOS and Desktop natively. JetStart currently focuses heavily on the Android execution path, but we are working on abstracting our WebSocket injection layer to target &lt;code&gt;Compose Multiplatform&lt;/code&gt; apps on iOS and macOS/Windows.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Live State Injection
&lt;/h3&gt;

&lt;p&gt;Live hot reload is amazing, but it can lose local UI state if an enclosing class is redefined. We are building a memory-graph inspector that captures the Compose tree's state payload right before the hot-reload byte replacement and injects it back after the class loads.&lt;/p&gt;
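
&lt;p&gt;Conceptually, that's a snapshot-and-restore pass around the class swap. Here is a toy Python sketch of the idea (not JetStart's actual implementation, which lives on-device; the node shape and ids are hypothetical):&lt;/p&gt;

```python
def snapshot_state(compose_nodes):
    # Conceptual sketch: capture each node's local state keyed by a stable
    # node id, taken just before the hot-reload byte replacement.
    return {node["id"]: dict(node["state"]) for node in compose_nodes}

def restore_state(compose_nodes, snapshot):
    # After the new class loads, re-inject any captured state for nodes
    # that survived the reload.
    for node in compose_nodes:
        if node["id"] in snapshot:
            node["state"].update(snapshot[node["id"]])
```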

&lt;h3&gt;
  
  
  3. VS Code &amp;amp; IntelliJ Extensions
&lt;/h3&gt;

&lt;p&gt;We are finalizing our &lt;code&gt;JetStart VS Code Extension&lt;/code&gt; and planning an IntelliJ plugin to tightly bind the &lt;code&gt;OverrideGenerator&lt;/code&gt; warnings, connection statuses, and logs directly into your IDE.&lt;/p&gt;

&lt;p&gt;JetStart is more than just a hot-reload CLI—it is a philosophical shift in how Android apps are compiled and tested. Try &lt;code&gt;npx jetstart create my-app&lt;/code&gt; today and see the future of Android development.&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>typescript</category>
      <category>android</category>
      <category>mobile</category>
    </item>
    <item>
      <title>Why Build Systems Matter: Ditching Gradle for WebSockets</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Sun, 05 Apr 2026 15:13:44 +0000</pubDate>
      <link>https://forem.com/favourite1975/why-build-systems-matter-ditching-gradle-for-websockets-3elo</link>
      <guid>https://forem.com/favourite1975/why-build-systems-matter-ditching-gradle-for-websockets-3elo</guid>
      <description>&lt;p&gt;Gradle is an incredible piece of engineering. It manages complex dependency trees, supports endless plugins, and orchestrates build tasks in a way that scales well for enterprise applications.&lt;/p&gt;

&lt;p&gt;But for the inner development loop—the seconds between writing code and seeing the result—Gradle is an obstacle. If you simply want to change a padding value on a Compose UI screen, running a full Gradle evaluation is overkill. The overhead of task execution graphs, even with caching enabled, causes noticeable delays.&lt;/p&gt;

&lt;h2&gt;
  
  
  The JetStart Solution: Sidestepping Gradle
&lt;/h2&gt;

&lt;p&gt;We realized that to get instant, state-preserving hot reloads on physical Android devices, we couldn't rely on Gradle. We had to build our own compiler pipeline specifically designed for the edit-refresh cycle.&lt;/p&gt;

&lt;p&gt;When you run &lt;code&gt;jetstart dev&lt;/code&gt;, Gradle is nowhere to be found. Instead, we run a custom Node.js server that binds a file system watcher to your &lt;code&gt;.kt&lt;/code&gt; files. When an edit occurs, we run a raw &lt;code&gt;kotlinc&lt;/code&gt; command on that single file. We bypass the build system entirely.&lt;/p&gt;

&lt;p&gt;Once &lt;code&gt;kotlinc&lt;/code&gt; outputs JVM &lt;code&gt;.class&lt;/code&gt; files, we run &lt;code&gt;d8&lt;/code&gt; to generate DEX bytecode. And because ADB (Android Debug Bridge) is too slow for our needs, we built a WebSocket connection between our dev server and the physical device. The device listens on port &lt;code&gt;8766&lt;/code&gt; for raw byte arrays and loads them into a custom ClassLoader instantly.&lt;/p&gt;
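
&lt;p&gt;To make the delivery step concrete, here is a Python sketch of what the server-side send might look like. The JSON-plus-base64 envelope is an assumption for illustration (JetStart's real dev server is Node.js and its wire format isn't documented here):&lt;/p&gt;

```python
import base64
import json

def make_reload_message(dex_bytes, class_name):
    # Hypothetical wire format: wrap the d8 output in a small JSON envelope
    # so the on-device client knows which class the DEX payload replaces.
    return json.dumps({
        "type": "dex",
        "className": class_name,
        "payload": base64.b64encode(dex_bytes).decode("ascii"),
    })
```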

&lt;h2&gt;
  
  
  The Best of Both Worlds
&lt;/h2&gt;

&lt;p&gt;We didn't completely kill Gradle—we just put it in its place. When you run &lt;code&gt;jetstart build&lt;/code&gt;, we hand the process back to Gradle to compile an APK, manage dependencies, and digitally sign your app. &lt;/p&gt;

&lt;p&gt;JetStart gives you the blinding speed of WebSockets for UI work, and the reliable build process of Gradle for production releases. This hybrid approach represents the future of Android development.&lt;/p&gt;

</description>
      <category>gradle</category>
      <category>websocket</category>
      <category>android</category>
      <category>androiddev</category>
    </item>
    <item>
      <title>Android Emulator vs Physical Device: Picking Your Development Target</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Sun, 05 Apr 2026 15:05:52 +0000</pubDate>
      <link>https://forem.com/favourite1975/android-emulator-vs-physical-device-picking-your-development-target-1ij4</link>
      <guid>https://forem.com/favourite1975/android-emulator-vs-physical-device-picking-your-development-target-1ij4</guid>
      <description>&lt;p&gt;JetStart gives Android developers a choice. I wanted to make sure you could hot reload your Kotlin Compose UIs exactly how you prefer whether that's directly onto a physical Android device or using a high-performance Android Emulator.&lt;/p&gt;

&lt;p&gt;Which one is right for your workflow? Here is how I built both paths to ensure you never have to wait for Gradle again.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Gold Standard: Physical Device Hot Reload
&lt;/h2&gt;

&lt;p&gt;There is no substitute for feeling an app in your hands. To achieve live, state-preserving speeds on a physical device, I had to develop a custom injection architecture. JetStart intercepts your file edits, runs &lt;code&gt;kotlinc&lt;/code&gt; and &lt;code&gt;d8&lt;/code&gt; in the background, and pushes raw DEX bytecode via WebSocket directly to a custom &lt;code&gt;ClassLoader&lt;/code&gt; inside your running app.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pros:&lt;/strong&gt; It is the "real" environment. Colors, native Android choreography, and touch latency are 100% accurate.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cons:&lt;/strong&gt; Requires a physical device connected via USB or the same Wi-Fi network.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Virtual Powerhouse: Android Emulator
&lt;/h2&gt;

&lt;p&gt;If you don't have a device handy, the Android Emulator is the next best thing. I’ve fully integrated AVD (Android Virtual Device) management into the JetStart CLI so you don't even need to open Android Studio.&lt;/p&gt;

&lt;p&gt;By running &lt;code&gt;npx jetstart android-emulator&lt;/code&gt;, you can create "JetStart-optimized" emulators that are pre-configured for speed. When you run &lt;code&gt;npx jetstart dev --emulator&lt;/code&gt;, the CLI automatically detects the running emulator, installs the client, and establishes the hot reload connection for you.&lt;/p&gt;
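
&lt;p&gt;The detection part is simpler than it sounds: running emulators show up in &lt;code&gt;adb devices&lt;/code&gt; output with an &lt;code&gt;emulator-&lt;/code&gt; serial. A minimal sketch of that parsing (the JetStart CLI is Node.js; this Python version is just illustrative):&lt;/p&gt;

```python
import subprocess

def find_running_emulator(adb_output=None):
    # "adb devices" prints a header line, then one "serial  state" line per
    # attached device; running AVDs have serials like "emulator-5554".
    if adb_output is None:
        adb_output = subprocess.run(
            ["adb", "devices"], capture_output=True, text=True
        ).stdout
    for line in adb_output.splitlines()[1:]:
        parts = line.split()
        if len(parts) == 2 and parts[0].startswith("emulator-") and parts[1] == "device":
            return parts[0]
    return None
```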

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pros:&lt;/strong&gt; No cables required. It stays on your screen alongside your IDE. Supports multiple configurations and API levels.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cons:&lt;/strong&gt; Requires hardware acceleration (VT-x/AMD-V) and can be resource-intensive on older machines.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Experimental Fallback: The Web Emulator
&lt;/h2&gt;

&lt;p&gt;I also included a Web Emulator as a tactical fallback. By running &lt;code&gt;jetstart dev --web&lt;/code&gt;, the CLI starts a secondary pipeline that transpiles your Compose code to JavaScript (ES modules) and renders it in the browser.&lt;/p&gt;

&lt;p&gt;I consider this &lt;strong&gt;experimental&lt;/strong&gt;. It’s perfect for quick UI scaffolding when you’re working from a laptop in a cafe and can't run a heavy emulator, but it isn't a replacement for a true Android environment.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Pros:&lt;/strong&gt; Opens instantly in any browser. No virtualization required.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cons:&lt;/strong&gt; It's an abstraction layer; it won't render complex third-party Android-specific views.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  The Verdict
&lt;/h2&gt;

&lt;p&gt;For final polish and performance testing, use a &lt;strong&gt;Physical Device&lt;/strong&gt;. For your day-to-day development loop, the &lt;strong&gt;Android Emulator&lt;/strong&gt; is the perfect companion. If you're on the move and need to save battery or resources, the &lt;strong&gt;Web Emulator&lt;/strong&gt; has your back. &lt;/p&gt;

&lt;p&gt;JetStart gives you the freedom to build at the speed of thought, regardless of your hardware setup. Give it a spin and let me know which one you prefer!&lt;/p&gt;

</description>
      <category>android</category>
      <category>kotlin</category>
      <category>jetpackcompose</category>
      <category>jetstart</category>
    </item>
    <item>
      <title>How to Achieve Instant Android Hot Reload</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Sun, 05 Apr 2026 14:55:02 +0000</pubDate>
      <link>https://forem.com/favourite1975/how-to-achieve-instant-android-hot-reload-1pk0</link>
      <guid>https://forem.com/favourite1975/how-to-achieve-instant-android-hot-reload-1pk0</guid>
      <description>&lt;p&gt;Android development is plagued by long feedback loops. If you make a tweak to a padding value, an alignment, or a color, Gradle will make you wait. Sometimes seconds, sometimes minutes. This single bottleneck ruins the developer "flow state."&lt;/p&gt;

&lt;p&gt;We wanted a web-like developer experience on physical Android devices. We wanted an instant UI hot reload. Here is how we achieved it in JetStart.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture of Speed
&lt;/h2&gt;

&lt;p&gt;Standard Android development compiles your whole app, signs an APK, and pushes it via ADB. This is fundamentally slow. JetStart circumvents this process entirely during development.&lt;/p&gt;

&lt;p&gt;Here is the step-by-step pipeline we use:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;File System Watching:&lt;/strong&gt; A Chokidar process watches your project directory. As soon as a &lt;code&gt;.kt&lt;/code&gt; file changes, we grab it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Targeted Compilation:&lt;/strong&gt; We do &lt;em&gt;not&lt;/em&gt; run Gradle. We invoke &lt;code&gt;kotlinc&lt;/code&gt; directly, passing only the changed file and compiling it into Java classes using the Compose compiler plugin.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Instant Run Hooks:&lt;/strong&gt; Our &lt;code&gt;OverrideGenerator&lt;/code&gt; injects standard &lt;code&gt;$Override&lt;/code&gt; companion logic into the class so the Android runtime natively swaps the implementation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DEX Translation:&lt;/strong&gt; We pipe those &lt;code&gt;.class&lt;/code&gt; files into Google's &lt;code&gt;d8&lt;/code&gt; tool, spitting out a raw &lt;code&gt;.dex&lt;/code&gt; payload.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;WebSocket Delivery:&lt;/strong&gt; ADB is too slow, so we rely on WebSockets. The DEX is base64-encoded and blasted to your device over a local Wi-Fi or USB connection on port 8766.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Live Execution:&lt;/strong&gt; The JetStart client app running on your device intercepts the DEX payload, feeds it to a custom &lt;code&gt;ClassLoader&lt;/code&gt;, and forces an instant recomposition of your UI.&lt;/li&gt;
&lt;/ol&gt;
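
&lt;p&gt;Steps 2 and 4 above boil down to two tool invocations. Here is a rough sketch of how those commands could be assembled; only the basic flags are shown, the class-file path is a placeholder, and the real pipeline also wires in the Compose compiler plugin and the app classpath:&lt;/p&gt;

```python
import os

def build_commands(kt_file, class_file, build_dir):
    # Step 2: compile just the changed .kt file with kotlinc.
    classes_dir = os.path.join(build_dir, "classes")
    kotlinc_cmd = ["kotlinc", kt_file, "-d", classes_dir]
    # Step 4: translate the resulting .class file to DEX with Google's d8.
    dex_dir = os.path.join(build_dir, "dex")
    d8_cmd = ["d8", "--output", dex_dir, class_file]
    return kotlinc_cmd, d8_cmd
```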

</description>
      <category>android</category>
      <category>mobile</category>
      <category>productivity</category>
      <category>tooling</category>
    </item>
    <item>
      <title>Why I Killed My DSL Parser for Raw DEX Compilation</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Sun, 05 Apr 2026 14:38:30 +0000</pubDate>
      <link>https://forem.com/favourite1975/why-i-killed-my-dsl-parser-for-raw-dex-compilation-4gfl</link>
      <guid>https://forem.com/favourite1975/why-i-killed-my-dsl-parser-for-raw-dex-compilation-4gfl</guid>
      <description>&lt;p&gt;When I first built JetStart, the goal was simple: make Android UI development fast. I started with a DSL (Domain Specific Language) approach. I parsed Kotlin-like syntax, mapped it to Compose UI elements, and sent a JSON payload to the app to render. It worked, and it was fast—but it wasn't &lt;em&gt;real&lt;/em&gt;. &lt;/p&gt;

&lt;p&gt;Developers hit a wall the moment they needed to use their own custom state, complex animations, or third-party Compose libraries. The DSL approach was too brittle, too restrictive, and ultimately, a dead end for a serious developer tool.&lt;/p&gt;

&lt;p&gt;So, I made a hard pivot. &lt;/p&gt;

&lt;h2&gt;
  
  
  The Pivot to DEX
&lt;/h2&gt;

&lt;p&gt;Instead of faking it with a DSL, I decided to compile the actual Kotlin code. Here is how the new architecture works:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;KotlinCompiler:&lt;/strong&gt; JetStart uses &lt;code&gt;kotlinc&lt;/code&gt; with the Compose compiler plugin to compile your &lt;code&gt;.kt&lt;/code&gt; files into Java &lt;code&gt;.class&lt;/code&gt; files.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DexGenerator:&lt;/strong&gt; It passes those &lt;code&gt;.class&lt;/code&gt; files through Android's &lt;code&gt;d8&lt;/code&gt; tool to create raw &lt;code&gt;classes.dex&lt;/code&gt; bytecode.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;WebSocket Push:&lt;/strong&gt; This base64 DEX payload is streamed to the connected Android device over our custom WebSocket protocol.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Live Injection:&lt;/strong&gt; A custom &lt;code&gt;ClassLoader&lt;/code&gt; on the device loads the new classes instantly. The UI recomposes with the new logic—no full app rebuild, no re-installation.&lt;/li&gt;
&lt;/ol&gt;
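
&lt;p&gt;On the receiving side, it's worth sanity-checking the payload before class loading: every &lt;code&gt;.dex&lt;/code&gt; file begins with the magic bytes &lt;code&gt;dex\n&lt;/code&gt; followed by a version. A small decode-and-verify sketch (the base64 envelope is my assumption about the wire format; the real check runs in the on-device Kotlin client):&lt;/p&gt;

```python
import base64

DEX_MAGIC = b"dex\n"  # every .dex file starts with this magic, then a version like 035

def decode_dex_payload(b64_payload):
    # Decode the base64 payload pushed over the WebSocket and confirm it
    # actually looks like DEX bytecode before handing it to the ClassLoader.
    raw = base64.b64decode(b64_payload)
    if not raw.startswith(DEX_MAGIC):
        raise ValueError("payload is not DEX bytecode")
    return raw
```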

&lt;h2&gt;
  
  
  The Result: Real Compose, Sub-100ms
&lt;/h2&gt;

&lt;p&gt;By dropping the DSL and embracing true Kotlin compilation + DEX generation, JetStart now supports &lt;em&gt;any&lt;/em&gt; valid &lt;code&gt;@Composable&lt;/code&gt; function. You aren't writing "JetStart Code" anymore; you're writing pure Android code, but experiencing web-like hot reload speed.&lt;/p&gt;

&lt;p&gt;It’s instantly injected. It’s real Compose. It’s the Android development experience I’ve always wanted. Have you tried it yet? Let me know what you think on Discord or X!&lt;/p&gt;

</description>
      <category>kotlin</category>
      <category>jetpackcompose</category>
      <category>android</category>
      <category>androiddev</category>
    </item>
    <item>
      <title>I Built an AI Tutor That Actually Sees Your Homework — Here's How</title>
      <dc:creator>Elimihele God's favour</dc:creator>
      <pubDate>Thu, 12 Mar 2026 23:03:01 +0000</pubDate>
      <link>https://forem.com/favourite1975/i-built-an-ai-tutor-that-actually-sees-your-homework-heres-how-1hd3</link>
      <guid>https://forem.com/favourite1975/i-built-an-ai-tutor-that-actually-sees-your-homework-heres-how-1hd3</guid>
      <description>&lt;p&gt;A few weeks ago I was watching my younger cousin struggle through a physics worksheet.&lt;br&gt;
She kept typing questions into ChatGPT, getting a wall of text back, and still looking confused. It hit me:&lt;br&gt;
why can't she just &lt;em&gt;show&lt;/em&gt; the problem to an AI and have it &lt;em&gt;talk&lt;/em&gt; her through it like a real tutor would?&lt;/p&gt;

&lt;p&gt;That question became &lt;strong&gt;VisionSolve(SolveTutor)&lt;/strong&gt;, and I built it for the &lt;a href="https://devpost.com/software/visionsolve" rel="noopener noreferrer"&gt;Gemini Live Agent Challenge&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Idea: What if AI Could See and Speak?
&lt;/h2&gt;

&lt;p&gt;Most AI tutoring tools work through text boxes. You type your question, you get a text response. But that's not how tutoring works in real life. A real tutor looks at your paper, listens to your confusion, and talks you through it step by step. They notice when you're lost and adjust.&lt;/p&gt;

&lt;p&gt;So I wanted to build exactly that — an AI tutor that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sees&lt;/strong&gt; your homework through your camera&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Listens&lt;/strong&gt; to your questions through your microphone&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speaks&lt;/strong&gt; explanations back to you, naturally&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;No typing required. Just point your phone at a math problem and start talking.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Gemini Live API Was Perfect for This
&lt;/h2&gt;

&lt;p&gt;I'd been playing around with different LLM APIs, and when I found the Gemini Live API, it clicked immediately. Most APIs are request-response — you send text, you get text back. Gemini's Live API is completely different. It opens a persistent bidirectional stream where you can send audio and video frames &lt;em&gt;continuously&lt;/em&gt;, and the model responds in real-time audio.&lt;/p&gt;

&lt;p&gt;The killer feature for me was &lt;strong&gt;native audio output&lt;/strong&gt;. The model doesn't generate text that gets piped through a TTS engine — it produces audio directly. The voice sounds natural, with proper pacing and intonation. When Sol (my tutor agent) explains a math concept, it actually sounds like someone talking to you, not a robot reading a script.&lt;/p&gt;

&lt;p&gt;The other thing that sold me was &lt;strong&gt;barge-in support&lt;/strong&gt;. Students interrupt. That's normal. They'll say "wait, what?" in the middle of an explanation. With other APIs you'd have to manage complex state to handle that. With Gemini Live, the student can just... talk. The model handles it gracefully.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Stack
&lt;/h2&gt;

&lt;p&gt;Here's what I ended up using:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Backend:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Google ADK (Agent Development Kit)&lt;/strong&gt; — this was huge. Instead of wiring up raw API calls, I defined my agent with a system instruction, gave it tools, and ADK handled the session management. The agent framework made it easy to add Google Search grounding so Sol can verify facts on the fly.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;FastAPI + WebSockets&lt;/strong&gt; — the frontend connects over a WebSocket, and the backend proxies audio/video to Gemini Live and streams audio responses back.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Firebase Firestore&lt;/strong&gt; — for storing session transcripts so students can review past sessions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Frontend:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Next.js + TypeScript&lt;/strong&gt; — nothing fancy here, just a clean, mobile-responsive interface with a webcam feed, audio visualizer, and chat transcript.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Firebase Auth&lt;/strong&gt; — Google Sign-In for authentication.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Model:&lt;/strong&gt; &lt;code&gt;gemini-2.5-flash-native-audio&lt;/code&gt; — the latest native audio model. Fast enough for real-time conversation, capable enough to understand handwritten math from a shaky phone camera.&lt;/p&gt;

&lt;h2&gt;
  
  
  Things That Surprised Me
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The vision capabilities are seriously good
&lt;/h3&gt;

&lt;p&gt;I expected the model to struggle with handwritten math, especially messy student handwriting. It doesn't. I tested it with all kinds of problems: my cousin's scribbled algebra homework, a printed calculus worksheet, even a badly drawn chemistry diagram. It identified them correctly almost every time, and it copes when the camera is slightly angled or the lighting isn't great.&lt;/p&gt;

&lt;h3&gt;
  
  
  Natural interruptions just work
&lt;/h3&gt;

&lt;p&gt;This was the feature I was most nervous about. In my head, I had this complex state machine planned out for handling when a student interrupts. Turns out, I didn't need any of it. The Live API's barge-in support means when the student starts talking, the model stops, listens, and responds. It's seamless.&lt;/p&gt;

&lt;h3&gt;
  
  
  The hardest part wasn't the AI
&lt;/h3&gt;

&lt;p&gt;Honestly, the AI side was smoother than I expected thanks to ADK and the Live API. &lt;br&gt;
The hardest part wasn't the AI at all.&lt;br&gt;
It was WebSocket audio streaming.&lt;br&gt;
Browsers hate it. Microphone permissions break. Safari behaves weirdly. Classic web dev pain.&lt;/p&gt;
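
&lt;p&gt;For anyone about to fight the same battle: underneath the browser quirks, the streaming side is mostly slicing microphone PCM into small WebSocket messages. A stripped-down sketch, assuming the Live API's 16 kHz, 16-bit mono PCM input format; the 100 ms frame size is my own choice, not an API requirement:&lt;/p&gt;

```python
import base64

SAMPLE_RATE = 16000      # Gemini Live expects 16-bit, 16 kHz mono PCM input
BYTES_PER_SAMPLE = 2
CHUNK_MS = 100           # frame duration is an arbitrary choice, not an API rule

def chunk_pcm(pcm_bytes):
    # Slice raw PCM into fixed-duration chunks and base64-encode each one
    # for transport over the browser-to-backend WebSocket.
    chunk_size = SAMPLE_RATE * BYTES_PER_SAMPLE * CHUNK_MS // 1000
    return [
        base64.b64encode(pcm_bytes[i:i + chunk_size]).decode("ascii")
        for i in range(0, len(pcm_bytes), chunk_size)
    ]
```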

&lt;h2&gt;
  
  
  Google Cloud Deployment
&lt;/h2&gt;

&lt;p&gt;I deployed the whole thing on Google Cloud:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Backend runs on &lt;strong&gt;Cloud Run&lt;/strong&gt; with Vertex AI integration&lt;/li&gt;
&lt;li&gt;Frontend is on &lt;strong&gt;Firebase Hosting&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;CI/CD pipeline through &lt;strong&gt;GitHub Actions&lt;/strong&gt; — push a tag and everything deploys automatically&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The deploy workflow builds a Docker image, pushes it to GCR, deploys to Cloud Run, then builds the frontend with the new backend URL injected and deploys it to Firebase. The whole pipeline takes about 4 minutes, which is pretty nice for not having to think about deployments at all.&lt;/p&gt;

&lt;p&gt;You can check out the full pipeline &lt;a href="https://github.com/dev-phantom/VisionSolve/blob/master/.github/workflows/deploy.yml" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I'd Do Differently
&lt;/h2&gt;

&lt;p&gt;If I had more time, I'd add:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Drawing/annotation support&lt;/strong&gt; — let Sol highlight parts of the image while explaining&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Progress tracking&lt;/strong&gt; — track which topics the student struggles with over time&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Try It Out
&lt;/h2&gt;

&lt;p&gt;The project is open source: &lt;strong&gt;&lt;a href="https://github.com/dev-phantom/VisionSolve" rel="noopener noreferrer"&gt;github.com/dev-phantom/VisionSolve&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The README has full setup instructions if you want to run it locally. You'll need a Gemini API key (free from Google AI Studio) and a Firebase project.&lt;/p&gt;

&lt;p&gt;If you're thinking about building something with the Gemini Live API, I'd say go for it. The combination of real-time audio + vision is genuinely different from anything else available right now. It opens up use cases that just weren't possible with traditional request-response APIs.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Built for the #GeminiLiveAgentChallenge using Google Gemini, ADK, Firebase, and Cloud Run.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>gemini</category>
      <category>geminiliveagentchallenge</category>
      <category>ai</category>
      <category>googlecloud</category>
    </item>
  </channel>
</rss>
