<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Nilesh Arnaiya</title>
    <description>The latest articles on Forem by Nilesh Arnaiya (@nilesharnaiya).</description>
    <link>https://forem.com/nilesharnaiya</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F391551%2F8d464b94-7f6a-4a1e-bcd6-0454f0753435.jpeg</url>
      <title>Forem: Nilesh Arnaiya</title>
      <link>https://forem.com/nilesharnaiya</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/nilesharnaiya"/>
    <language>en</language>
    <item>
      <title>I Was Spending 30% of My Research Time on LaTeX. Then I Found Bibby AI — and Got That Time Back.</title>
      <dc:creator>Nilesh Arnaiya</dc:creator>
      <pubDate>Mon, 02 Mar 2026 08:10:56 +0000</pubDate>
      <link>https://forem.com/nilesharnaiya/i-was-spending-30-of-my-research-time-on-latex-then-i-found-bibby-ai-and-got-that-time-back-42ba</link>
      <guid>https://forem.com/nilesharnaiya/i-was-spending-30-of-my-research-time-on-latex-then-i-found-bibby-ai-and-got-that-time-back-42ba</guid>
      <description>&lt;h2&gt;
  
  
  I Was Spending 30% of My Research Time on LaTeX. Then I Found Bibby — and Got That Time Back.
&lt;/h2&gt;

&lt;p&gt;&lt;em&gt;A researcher's honest take on the AI LaTeX editor that quietly changed how I write papers.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Let me paint you a picture.&lt;/p&gt;

&lt;p&gt;It's 11:47 PM. My conference deadline is in 13 hours. My methodology section is solid, my results are clean — and I am &lt;em&gt;deep&lt;/em&gt; in a rabbit hole trying to figure out why my &lt;code&gt;\bibliography&lt;/code&gt; command is throwing a cryptic &lt;code&gt;I found no \citation commands&lt;/code&gt; error while my &lt;code&gt;.bib&lt;/code&gt; file is sitting right there, staring at me, completely fine.&lt;/p&gt;

&lt;p&gt;I lost two hours that night. Not to research. Not to writing. To &lt;em&gt;LaTeX plumbing&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;That was the moment I started seriously looking for something better. That's how I found &lt;strong&gt;&lt;a href="https://trybibby.com" rel="noopener noreferrer"&gt;Bibby&lt;/a&gt;&lt;/strong&gt; — and honestly? I wish I'd found it years earlier.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why LaTeX Is Both Necessary and Maddening
&lt;/h2&gt;

&lt;p&gt;If you're a researcher, you already know LaTeX isn't optional. Journals expect it. Conference templates demand it. Your collaborators use it. The output is beautiful, the math renders perfectly, and nothing else comes close for typesetting a 40-page paper with 200 citations.&lt;/p&gt;

&lt;p&gt;But the &lt;em&gt;experience&lt;/em&gt; of writing LaTeX? It's like driving a Formula 1 car where you also have to rebuild the engine every 20 minutes.&lt;/p&gt;

&lt;p&gt;The Bibby team clearly gets this. Their pitch — &lt;em&gt;"Write research, not code"&lt;/em&gt; — hit me somewhere deep in my exhausted researcher soul.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Even Is Bibby?
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://trybibby.com" rel="noopener noreferrer"&gt;Bibby&lt;/a&gt; is an &lt;strong&gt;AI-powered LaTeX editor&lt;/strong&gt; built specifically for academic researchers. Not a general-purpose writing tool with LaTeX bolted on. Not a glorified text editor with syntax highlighting. It's a purpose-built research environment where every single feature — from citation management to peer review simulation — exists to answer one question: &lt;em&gt;how do we give researchers their time back?&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;It's used by 9,920+ researchers from institutions like MIT, Yale, Berkeley, Harvard, and Max Planck. Sure, that number comes straight off the landing page — but it's also a signal that people with serious research workflows have found this useful enough to keep using.&lt;/p&gt;

&lt;p&gt;Here's a deep dive into every feature that's actually changed how I work.&lt;/p&gt;




&lt;h2&gt;
  
  
  🤖 Feature 1: Bibby Chat — The AI Writing Assistant That Lives In Your Editor
&lt;/h2&gt;



&lt;p&gt;This is the centerpiece, and it earns that position.&lt;/p&gt;

&lt;p&gt;Bibby Chat is a context-aware AI assistant (currently powered by &lt;strong&gt;Gemini 3.1 Pro&lt;/strong&gt;, and updated as newer models arrive) that understands your &lt;em&gt;entire document&lt;/em&gt; — not just the line you're on. You can ask it to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Draft a skeleton introduction based on your topic&lt;/li&gt;
&lt;li&gt;Rewrite a paragraph to sound more formal&lt;/li&gt;
&lt;li&gt;Generate TikZ figures from a description&lt;/li&gt;
&lt;li&gt;Explain a cryptic LaTeX error in plain English &lt;strong&gt;and fix it&lt;/strong&gt;
&lt;/li&gt;
&lt;li&gt;Suggest improvements to your argument structure&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The "context-aware" part is what separates it from just pasting your document into ChatGPT. Bibby knows your LaTeX environment, your bibliography, your package list. When it suggests a fix, it's not guessing — it knows what's already in your preamble.&lt;/p&gt;

&lt;p&gt;I used it last week to completely restructure my related work section. What would've taken me 3 hours of shuffling paragraphs took 25 minutes of back-and-forth with Bibby Chat.&lt;/p&gt;




&lt;h2&gt;
  
  
  📚 Feature 2: Smart Citation Search &amp;amp; Insert
&lt;/h2&gt;



&lt;p&gt;This one might be my single biggest time-saver.&lt;/p&gt;

&lt;p&gt;Before Bibby, my citation workflow was: open Google Scholar → find paper → copy title → search Semantic Scholar → grab BibTeX → paste into &lt;code&gt;.bib&lt;/code&gt; file → manually write &lt;code&gt;\cite{key}&lt;/code&gt; in my document → compile → pray the key matched. Rinse and repeat for every single reference.&lt;/p&gt;

&lt;p&gt;With Bibby, I type what I'm looking for, and it searches &lt;strong&gt;across Semantic Scholar, CrossRef, and more&lt;/strong&gt; simultaneously. It auto-generates the BibTeX entry with correct formatting and inserts the &lt;code&gt;\cite{}&lt;/code&gt; command directly into my document with one click.&lt;/p&gt;
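&lt;p&gt;To make that concrete, here is the shape of what a citation workflow like this produces. This is my own illustrative sketch (the paper is the real Vaswani et al. 2017 paper, but the key name and field formatting are placeholders, not output copied from Bibby):&lt;/p&gt;

```latex
% references.bib -- the auto-generated entry (key name is illustrative)
@inproceedings{vaswani2017attention,
  author    = {Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and others},
  title     = {Attention Is All You Need},
  booktitle = {Advances in Neural Information Processing Systems},
  year      = {2017}
}

% main.tex -- the one-click insert drops in the matching command
Transformer attention~\cite{vaswani2017attention} underpins most
recent architectures.
```

&lt;p&gt;Keeping the key in the &lt;code&gt;.bib&lt;/code&gt; file and the key in the &lt;code&gt;\cite{}&lt;/code&gt; command in sync is exactly the error-prone step the one-click insert removes.&lt;/p&gt;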

&lt;p&gt;It also does something I didn't expect: it &lt;strong&gt;suggests relevant papers based on what I'm writing&lt;/strong&gt;. Like, I'll be drafting a paragraph about transformer attention mechanisms and Bibby will surface three papers I hadn't considered. That's not just saving time — that's actively improving the quality of my literature coverage.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;200 million+ papers searchable. One click to cite. No more tab-switching.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  🧮 Feature 3: AI Table &amp;amp; Equation Generator — Snap → Perfect LaTeX
&lt;/h2&gt;



&lt;p&gt;I want to tell you about the moment I fell in love with Bibby.&lt;/p&gt;

&lt;p&gt;I had a handwritten equation from a whiteboard session with my advisor. A gnarly multi-line thing with summations, subscripts, and a matrix embedded in it. I took a photo. I uploaded it. Bibby gave me publication-ready LaTeX in about 4 seconds.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Four seconds.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The Equation from Image feature converts handwritten or printed math to LaTeX through image upload. It handles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Multi-line equations with &lt;code&gt;align&lt;/code&gt; environments&lt;/li&gt;
&lt;li&gt;Matrices and vectors&lt;/li&gt;
&lt;li&gt;Complex summations and integrals&lt;/li&gt;
&lt;li&gt;Fractions nested inside fractions (you know the ones)&lt;/li&gt;
&lt;/ul&gt;
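&lt;p&gt;For a sense of what the converter has to produce, here is a small hand-rolled example of that kind of output: a multi-line &lt;code&gt;align&lt;/code&gt; block with an embedded matrix (my own illustration, requiring &lt;code&gt;amsmath&lt;/code&gt;, not actual Bibby output):&lt;/p&gt;

```latex
% Requires \usepackage{amsmath} in the preamble
\begin{align}
  \mathbf{h}_t &= \sigma\left( W \mathbf{x}_t + U \mathbf{h}_{t-1} + \mathbf{b} \right) \\
  W &=
  \begin{bmatrix}
    w_{11} & w_{12} \\
    w_{21} & w_{22}
  \end{bmatrix},
  \qquad
  \mathcal{L} = \sum_{t=1}^{T} \ell(\mathbf{h}_t, y_t)
\end{align}
```

&lt;p&gt;Typing that out by hand, alignment markers and all, is precisely the fiddly work the photo upload skips.&lt;/p&gt;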

&lt;p&gt;But it goes further. You can also &lt;strong&gt;describe a table or equation in plain English&lt;/strong&gt; and Bibby generates the LaTeX. "Give me a 3-column table comparing BERT, GPT-2, and T5 on parameters, training data, and benchmark scores" → done. No more fighting with &lt;code&gt;tabular&lt;/code&gt; column specifications.&lt;/p&gt;
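&lt;p&gt;That prompt would map to roughly the following skeleton (my own sketch; the cell values are deliberately left as placeholders for you to fill in):&lt;/p&gt;

```latex
\begin{table}[ht]
  \centering
  \begin{tabular}{lrrr}
    \hline
    Model & Parameters & Training data & Benchmark \\
    \hline
    BERT  & \ldots & \ldots & \ldots \\
    GPT-2 & \ldots & \ldots & \ldots \\
    T5    & \ldots & \ldots & \ldots \\
    \hline
  \end{tabular}
  \caption{Comparison of pretrained language models.}
\end{table}
```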




&lt;h2&gt;
  
  
  🔍 Feature 4: AI Paper Reviewer — Peer Review Before the Peer Review
&lt;/h2&gt;

&lt;p&gt;One of Bibby's most unique features, and honestly one of the most valuable things I've used as a researcher.&lt;/p&gt;

&lt;p&gt;Upload your manuscript PDF and Bibby gives you structured feedback modeled after &lt;strong&gt;real peer reviews&lt;/strong&gt; — with venue-specific criteria for NeurIPS, ICML, ICLR, CVPR, IEEE, ACM, and more.&lt;/p&gt;

&lt;p&gt;The feedback isn't generic. It evaluates:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Novelty&lt;/strong&gt; — is your contribution actually new?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Technical quality&lt;/strong&gt; — are your proofs/experiments solid?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Clarity&lt;/strong&gt; — is this readable for someone outside your subfield?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Presentation&lt;/strong&gt; — figures, tables, formatting&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Impact&lt;/strong&gt; — will this matter to the community?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And it gives you actionable recommendations, not just scores.&lt;/p&gt;

&lt;p&gt;I submitted a workshop paper last quarter and ran it through Bibby's reviewer first. It caught a weakness in my experimental setup that I'd been handwaving — the kind of thing a real reviewer would've written a scathing paragraph about. I fixed it. The paper got in.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://trybibby.com/paper-review" rel="noopener noreferrer"&gt;Try AI Paper Review →&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  ✍️ Feature 5: AI Abstract Generator
&lt;/h2&gt;

&lt;p&gt;Writing abstracts is deceptively hard. You're compressing months of work into 150-250 words while making it compelling, accurate, and keyword-optimized for your target venue.&lt;/p&gt;

&lt;p&gt;Bibby's Abstract Generator takes your full paper content (or a PDF upload) and generates a publication-ready abstract that hits the standard structure: motivation, problem, approach, results, impact. You can then refine it with follow-up prompts — "make it less technical," "shorten by 30 words," "emphasize the dataset contribution more."&lt;/p&gt;

&lt;p&gt;It's not a replacement for your voice as a researcher. Think of it as a first draft generator that gets you 80% of the way there in 30 seconds instead of 45 minutes.&lt;/p&gt;




&lt;h2&gt;
  
  
  📖 Feature 6: AI Literature Review Generator
&lt;/h2&gt;

&lt;p&gt;This one is wild.&lt;/p&gt;

&lt;p&gt;You describe your research topic. Bibby drafts a literature review section with &lt;strong&gt;real, cited references&lt;/strong&gt; — grouped by theme and methodology, written in coherent paragraphs, formatted with &lt;code&gt;\cite{}&lt;/code&gt; commands, ready for you to edit.&lt;/p&gt;

&lt;p&gt;It's not making up papers. It's pulling from actual academic databases and synthesizing them. The output isn't perfect — you'll want to verify every citation and add your own framing — but as a starting scaffold, it saves hours of work.&lt;/p&gt;

&lt;p&gt;I used it to bootstrap a grant proposal's background section. What typically takes me a full day of reading and organizing took about 2 hours total (including my editing pass on top of Bibby's draft).&lt;/p&gt;




&lt;h2&gt;
  
  
  🔬 Feature 7: Deep Research Assistant
&lt;/h2&gt;



&lt;p&gt;Sometimes you need to go deep on a topic you're not fully familiar with. Maybe you're pivoting research directions, exploring an adjacent field, or trying to understand the state-of-the-art before formulating a new hypothesis.&lt;/p&gt;

&lt;p&gt;Bibby's Deep Research mode does multi-source academic exploration — reading across papers, identifying gaps in the literature, suggesting research directions and hypotheses, and presenting everything with a full citation trail.&lt;/p&gt;

&lt;p&gt;It's like having a well-read research assistant who can synthesize a field overview in minutes. Every claim is traceable. Every citation is real. And it points you toward where the interesting open questions actually are.&lt;/p&gt;




&lt;h2&gt;
  
  
  ⚠️ Feature 8: LaTeX Error Detection &amp;amp; Auto-Fix
&lt;/h2&gt;



&lt;p&gt;Remember the 11:47 PM nightmare I opened with? This feature would have ended that in seconds.&lt;/p&gt;

&lt;p&gt;Bibby detects compilation errors in real time as you type, explains them in plain English ("You have a missing closing brace on line 47 — here's what the corrected code looks like"), and offers &lt;strong&gt;one-click fixes&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;It handles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Missing or conflicting packages&lt;/li&gt;
&lt;li&gt;Mismatched braces and environments
&lt;/li&gt;
&lt;li&gt;Undefined references and citations&lt;/li&gt;
&lt;li&gt;Font encoding issues&lt;/li&gt;
&lt;li&gt;Multi-file project errors&lt;/li&gt;
&lt;/ul&gt;
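&lt;p&gt;A quick illustration of the mismatched-environment case (my own example, not Bibby's output). Stock LaTeX reports this as &lt;code&gt;\begin{align} ended by \end{equation}&lt;/code&gt;, and the one-click fix is simply making the names match:&lt;/p&gt;

```latex
% Broken: the environment is opened as align but closed as equation
\begin{align}
  E = mc^2
\end{equation}

% Fixed: opening and closing names match
\begin{align}
  E = mc^2
\end{align}
```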

&lt;p&gt;No more Googling cryptic TeX error messages at midnight. No more Stack Exchange rabbit holes. Bibby just tells you what's wrong and fixes it.&lt;/p&gt;




&lt;h2&gt;
  
  
  🤝 Feature 9: Real-Time Collaboration (Without the Chaos)
&lt;/h2&gt;



&lt;p&gt;If you've ever tried to co-author a paper over email with 4 collaborators, you know what "merge conflict hell" feels like. Version 7_FINAL_v3_JohnEdits_ACTUALLYFINAL.tex is a horror story we've all lived.&lt;/p&gt;

&lt;p&gt;Bibby's real-time collaboration is Google Docs-style live editing but for LaTeX, with:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Unlimited collaborators&lt;/strong&gt; on Pro plans&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;In-line comments&lt;/strong&gt; so you can discuss changes without breaking the document&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI-resolved merge conflicts&lt;/strong&gt; — when two people edit the same section, Bibby handles it&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Version control&lt;/strong&gt; so you can time-travel through every edit&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;My last paper had 3 co-authors across 2 time zones. We used Bibby the whole way. Zero version conflicts. First time that's happened in 6 years of collaborative research.&lt;/p&gt;




&lt;h2&gt;
  
  
  📁 Feature 10: GitHub Sync
&lt;/h2&gt;



&lt;p&gt;For researchers who live in Git (or whose institutions require it), this is a dream feature.&lt;/p&gt;

&lt;p&gt;Two-way sync with any GitHub repository — public, private, or institutional. Push directly from Bibby. Pull updates back. Branch support for parallel drafts. Automatic commit messages generated from your edit descriptions.&lt;/p&gt;

&lt;p&gt;Your paper is always versioned, always backed up, always accessible from any machine. And your collaborators who prefer to edit locally in VS Code or Vim can still do that — they're just syncing to the same repo.&lt;/p&gt;




&lt;h2&gt;
  
  
  ⏱️ Feature 11: Version History — Time-Travel Through Your Work
&lt;/h2&gt;



&lt;p&gt;Every edit session is automatically saved. Bibby keeps a complete visual diff history — you can see exactly what changed between any two versions and restore with one click.&lt;/p&gt;

&lt;p&gt;The named snapshots feature is my favorite part: before a major rewrite, I create a snapshot called something like "Before restructuring intro — NeurIPS draft." Three days later when I've convinced myself the rewrite made things worse, I'm back in 30 seconds.&lt;/p&gt;




&lt;h2&gt;
  
  
  📄 Feature 12: 5,000+ Journal-Ready Templates
&lt;/h2&gt;



&lt;p&gt;Bibby ships with 5,000+ publisher-approved templates covering IEEE, Nature, APA, ACM, NeurIPS, ICML, Cambridge thesis formats, and hundreds more. Every margin, every font size, every column layout — already configured and ready to go.&lt;/p&gt;

&lt;p&gt;Stop spending 45 minutes trying to get the IEEE two-column format right from scratch. Open the template, start writing.&lt;/p&gt;
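&lt;p&gt;If you've never opened one, an IEEE conference document boils down to a skeleton along these lines (a minimal hand-written sketch of the standard &lt;code&gt;IEEEtran&lt;/code&gt; class, not Bibby's template verbatim). The point is that the template hands you the fully configured version of this:&lt;/p&gt;

```latex
\documentclass[conference]{IEEEtran}
\usepackage{amsmath,graphicx,cite}

\title{Your Paper Title}
\author{\IEEEauthorblockN{Your Name}
        \IEEEauthorblockA{Your Institution}}

\begin{document}
\maketitle

\begin{abstract}
% abstract text goes here
\end{abstract}

% sections go here

\end{document}
```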




&lt;h2&gt;
  
  
  🆓 Bonus: Free Tools for Everyone
&lt;/h2&gt;

&lt;p&gt;Bibby also offers genuinely useful free tools with no account required — because they believe in open science:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;LaTeX to Word converter&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;LaTeX Table Generator&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Citation Generator&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;LaTeX to HTML&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PDF to Abstract&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;LaTeX Word Counter&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Common LaTeX Symbols reference&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Conference Deadlines tracker&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are legitimately useful standalone tools, not crippled demos designed to upsell you.&lt;/p&gt;




&lt;h2&gt;
  
  
  What's the Catch? (Honest Pricing Take)
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Free plan:&lt;/strong&gt; Unlimited projects, limited compiles, 2 AI completions/month, 10+ basic templates. No collaboration. Good for trying it out, not enough for serious research.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Pro plan:&lt;/strong&gt; $9/month (currently 50% off from $18). Unlimited everything — projects, collaborators, AI assist, templates, citations. Version control, Git sync, equation from image, deep research, paper review, priority support, offline mode, export in all formats. 30-day money-back guarantee.&lt;/p&gt;

&lt;p&gt;For context: I used to spend ~6 hours per paper on formatting and citation management alone. At even a modest valuation of my time, Bibby Pro pays for itself in the first paper. It's not a close calculation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Institution plan:&lt;/strong&gt; Custom pricing for research teams and universities, with SSO, centralized billing, analytics, API access, and on-premise deployment options.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Thing That Surprised Me Most: Privacy
&lt;/h2&gt;

&lt;p&gt;I was skeptical about an AI tool that sits inside my unpublished manuscripts. Bibby's answer:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Your research data is never used to train AI models. Ever.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;They're explicit about this: institutional-grade privacy by design. You can enable or disable any AI feature. Your documents are yours. For anyone working on pre-publication research, this matters enormously — and it's not a throwaway privacy policy line, it's a core product commitment.&lt;/p&gt;




&lt;h2&gt;
  
  
  My Honest Assessment
&lt;/h2&gt;

&lt;p&gt;Here's what I keep coming back to: &lt;strong&gt;Bibby was built by people who understand what research actually feels like.&lt;/strong&gt; Every feature addresses a real pain point. The AI isn't bolted on for marketing — it's woven into the core workflow.&lt;/p&gt;

&lt;p&gt;I went from dreading the "writing phase" of research (mostly because it meant fighting LaTeX for hours) to actually looking forward to it. That's not a small thing.&lt;/p&gt;

&lt;p&gt;If you're a researcher who uses LaTeX — or who &lt;em&gt;should&lt;/em&gt; be using LaTeX but keeps avoiding it because of the overhead — I'd genuinely encourage you to give Bibby a try. The free tier will let you feel the editor. Spring for a month of Pro and actually use the AI features before deciding.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://trybibby.com" rel="noopener noreferrer"&gt;Start writing free at trybibby.com →&lt;/a&gt;&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;What's your biggest LaTeX pain point? Drop it in the comments — curious if others have found creative workarounds, or if Bibby handles it in a way I haven't explored yet.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Tags:&lt;/strong&gt; &lt;code&gt;#latex&lt;/code&gt; &lt;code&gt;#research&lt;/code&gt; &lt;code&gt;#productivity&lt;/code&gt; &lt;code&gt;#ai&lt;/code&gt; &lt;code&gt;#academicwriting&lt;/code&gt; &lt;code&gt;#phd&lt;/code&gt; &lt;code&gt;#tools&lt;/code&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>writing</category>
      <category>latexwriting</category>
      <category>bibbyai</category>
    </item>
    <item>
      <title>Creating your Rust Functions in Node.js Using SSVM and docker</title>
      <dc:creator>Nilesh Arnaiya</dc:creator>
      <pubDate>Fri, 31 Jul 2020 04:41:12 +0000</pubDate>
      <link>https://forem.com/nilesharnaiya/creating-your-rust-functions-in-node-js-using-ssvm-and-docker-5470</link>
      <guid>https://forem.com/nilesharnaiya/creating-your-rust-functions-in-node-js-using-ssvm-and-docker-5470</guid>
      <description>&lt;p&gt;Head over to &lt;a href="https://github.com/second-state/ssvm-nodejs-starter" rel="noopener noreferrer"&gt;https://github.com/second-state/ssvm-nodejs-starter&lt;/a&gt; to get started. &lt;/p&gt;

&lt;p&gt;Use Docker: &lt;br&gt;
&lt;code&gt;$ docker pull secondstate/ssvm-nodejs-starter:v1&lt;br&gt;
$ docker run -p 3000:3000 --rm -it -v $(pwd):/app secondstate/ssvm-nodejs-starter:v1&lt;br&gt;
(docker) # cd /app&lt;br&gt;
(docker) # ssvmup build&lt;br&gt;
(docker) # node node/app.js&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Copy the functions example - &lt;a href="https://github.com/second-state/wasm-learning/tree/master/nodejs/functions" rel="noopener noreferrer"&gt;https://github.com/second-state/wasm-learning/tree/master/nodejs/functions&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;Replace the files and run your node app.&lt;/p&gt;
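&lt;p&gt;For reference, the functions in that example are ordinary Rust. Here is a minimal sketch of the pattern (my own illustration of the starter's shape; in the real &lt;code&gt;src/lib.rs&lt;/code&gt; the function carries a &lt;code&gt;#[wasm_bindgen]&lt;/code&gt; attribute so &lt;code&gt;ssvmup&lt;/code&gt; can expose it to Node — shown as a comment here so the sketch compiles on its own):&lt;/p&gt;

```rust
// In the real starter this function is annotated with #[wasm_bindgen]
// (from the wasm_bindgen crate); the attribute is commented out here
// so the sketch builds without that dependency.
// #[wasm_bindgen]
pub fn say(name: &str) -> String {
    format!("hello {}", name)
}

fn main() {
    // After `ssvmup build`, Node calls the exported function through
    // the generated pkg module instead of this main().
    println!("{}", say("SSVM"));
}
```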

&lt;p&gt;If you want to try out Machine learning with Rust - &lt;a href="https://github.com/second-state/wasm-learning/tree/master/nodejs/ml" rel="noopener noreferrer"&gt;https://github.com/second-state/wasm-learning/tree/master/nodejs/ml&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Tensorflow Models on Android - Absolute Beginner</title>
      <dc:creator>Nilesh Arnaiya</dc:creator>
      <pubDate>Thu, 21 May 2020 17:20:56 +0000</pubDate>
      <link>https://forem.com/nilesharnaiya/tensorflow-models-on-android-absolute-beginner-ia8</link>
      <guid>https://forem.com/nilesharnaiya/tensorflow-models-on-android-absolute-beginner-ia8</guid>
      <description>&lt;h1&gt;
  
  
  What is &lt;strong&gt;Tensorflow&lt;/strong&gt; Lite?
&lt;/h1&gt;

&lt;ul&gt;
&lt;li&gt;TensorFlow Lite is an open-source framework from the TensorFlow team, optimized to run inference on mobile devices📱 as well as edge devices. 
&lt;strong&gt;TFLite&lt;/strong&gt; is used for deploying pre-trained models on Android, iOS, and even your latest &lt;a href="https://www.amazon.in/gp/product/B07XT1QJ4S/ref=as_li_tl?ie=UTF8&amp;amp;tag=masks4all-21&amp;amp;camp=3638&amp;amp;creative=24630&amp;amp;linkCode=as2&amp;amp;creativeASIN=B07XT1QJ4S&amp;amp;linkId=7137f29fa680dcec8bfaca80fa1ed81a" rel="noopener noreferrer"&gt;Raspberry Pi&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This article is an introduction to TensorFlow Lite: you will learn how to train a model, convert it, and run inference on your mobile device.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Requirements&lt;/strong&gt;
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Android Studio&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's Create a &lt;strong&gt;Face Mask&lt;/strong&gt; 😷 Classifier right in the browser using Google's &lt;a href="https://teachablemachine.withgoogle.com/train/image" rel="noopener noreferrer"&gt;Teachable Machine&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;All thanks to Prajna Bhandary for the dataset - &lt;a href="https://github.com/prajnasb/observations" rel="noopener noreferrer"&gt;Download&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;strong&gt;Let's Train&lt;/strong&gt;
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Let's use the dataset and upload it to the Teachable Machine&lt;br&gt;
 &lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw12w1br9p5i0mrryanyo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw12w1br9p5i0mrryanyo.png" alt="Training" width="800" height="506"&gt;&lt;/a&gt;&lt;br&gt;
You can add as many classes, with as many images, as you want.&lt;br&gt;
The more images, the longer the training takes. &lt;br&gt;
Click &lt;strong&gt;Train Model&lt;/strong&gt; and wait until training completes. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;The Advanced Tab&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;This gives you more freedom to select the basic &lt;strong&gt;hyperparameters&lt;/strong&gt; that help the Neural network learn. &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Epochs&lt;/strong&gt; - One epoch is one pass over every sample in your data; keep increasing it until you get the desired predictions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Batch Size&lt;/strong&gt; - Your samples are divided into groups of the size you choose, and each group is one batch. For example: 112 samples / 16 = 7 batches&lt;/li&gt;
&lt;li&gt;The model processes those 7 batches to finish 1 epoch&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Learning Rate&lt;/strong&gt; - The size of the step the model takes when adjusting its weights. Steps that are too small make learning slow, while a high learning rate can overshoot the best weights entirely. So tweak carefully. &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftbau93xs57whs7v4ccor.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftbau93xs57whs7v4ccor.png" alt="Results" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;Export the model&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Choose a quantized model to reduce model size and accelerate operations on mobile devices, at the cost of a very small loss in accuracy. 
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvskrghz668mhfzar4u6n.png" alt="Export" width="765" height="507"&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;h3&gt;
  
  
  &lt;strong&gt;Deploying on Android Devices&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Let's create a &lt;strong&gt;labels.txt&lt;/strong&gt; file with all our classes and each class on a separate line&lt;/li&gt;
&lt;li&gt;We will use this &lt;a href="https://github.com/tensorflow/examples/tree/master/lite/examples/image_classification/android" rel="noopener noreferrer"&gt;MobileNet Repository&lt;/a&gt; for our Mask Detector and update the files respectively.&lt;/li&gt;
&lt;li&gt;Download or clone the repo and open the project in Android Studio&lt;/li&gt;
&lt;li&gt;Add the labels.txt file to the &lt;code&gt;src/main/assets/&lt;/code&gt; folder.
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Replace the model in the &lt;strong&gt;assets/&lt;/strong&gt; folder with our trained model.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Change the MODEL_PATH in MainActivity.java
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;code&gt;private static final String MODEL_PATH = &lt;br&gt;
"mobilenet_quant_v1_224.tflite";&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect the USB and RUN the app&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Enjoy👏 the glory of creating the Face Mask Detection model and running it on mobile devices📱. &lt;/p&gt;

&lt;h2&gt;
  
  
  Additional Resources
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;If you are using a Custom trained model, use this for conversion
&lt;a href="https://colab.research.google.com/github/tensorflow/examples/blob/master/courses/udacity_intro_to_tensorflow_lite/tflite_c04_exercise_convert_model_to_tflite_solution.ipynb" rel="noopener noreferrer"&gt;Udacity Tflite conversion&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;If you are using Keras - use this 
&lt;a href="https://colab.research.google.com/drive/1rYqJmgI_luomxO2D59Q4vtHn580ywxm1?usp=sharing" rel="noopener noreferrer"&gt;Keras .pb to .tflite&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Also, Check out the Udacity Course - &lt;a href="https://www.udacity.com/course/intro-to-tensorflow-lite--ud190" rel="noopener noreferrer"&gt;Intro to Tensorflow Lite&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;A Project built on top of TFLite - &lt;a href="https://buildawn.com/blog/2020/05/18/smart-mirror-diy-project-under-100/" rel="noopener noreferrer"&gt;Smart Mirror&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>android</category>
      <category>machinelearning</category>
      <category>tensorflow</category>
      <category>python</category>
    </item>
  </channel>
</rss>
