<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: kohei</title>
    <description>The latest articles on Forem by kohei (@0xkohe).</description>
    <link>https://forem.com/0xkohe</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3809660%2F22ed7afd-94fc-406b-bbfd-82d05bf7eaa8.png</url>
      <title>Forem: kohei</title>
      <link>https://forem.com/0xkohe</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/0xkohe"/>
    <language>en</language>
    <item>
      <title>Built tasuki — an AI CLI Orchestrator that Seamlessly Hands Off Between Tools</title>
      <dc:creator>kohei</dc:creator>
      <pubDate>Sun, 19 Apr 2026 08:49:32 +0000</pubDate>
      <link>https://forem.com/0xkohe/built-tasuki-an-ai-cli-orchestrator-that-seamlessly-hands-off-between-tools-519o</link>
      <guid>https://forem.com/0xkohe/built-tasuki-an-ai-cli-orchestrator-that-seamlessly-hands-off-between-tools-519o</guid>
      <description>&lt;p&gt;I built and open-sourced &lt;strong&gt;tasuki&lt;/strong&gt;, an AI CLI orchestrator that automatically rotates between &lt;strong&gt;Claude Code / Codex CLI / GitHub Copilot CLI&lt;/strong&gt; based on priority.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Repository: &lt;a href="https://github.com/0xkohe/tasuki" rel="noopener noreferrer"&gt;https://github.com/0xkohe/tasuki&lt;/a&gt;
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkcw64lvyid0gouvx7wpy.png" alt="Terminal screenshot showing tasuki orchestrating multiple AI CLIs and switching providers based on usage" width="800" height="367"&gt;
&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Motivation
&lt;/h2&gt;

&lt;p&gt;I’m the kind of person who prefers subscribing to multiple AI tools at $20–$30 each rather than committing to a single $100–$200 plan. I want to explore different tools and keep a feel for how the models differ.&lt;/p&gt;

&lt;p&gt;However, this approach means hitting usage limits fairly quickly, which interrupts my workflow and forces me to switch tools by hand, something I found tedious.&lt;/p&gt;

&lt;p&gt;Each AI has its strengths: Codex for review, Claude for conversation, and so on. But unless you’re extremely opinionated, there’s value in rotating through the tools and watching how they evolve; if you stick to one, you may miss the moment another becomes better at something.&lt;/p&gt;

&lt;p&gt;The switching decision itself is trivial labor — so I wanted to automate it.&lt;br&gt;
That’s why I built tasuki.&lt;/p&gt;

&lt;p&gt;Instead of treating rate limits as a constraint, I treat them as an opportunity to move to another model.&lt;/p&gt;




&lt;h2&gt;
  
  
  Intended Workflow
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Prioritize Claude Code and Codex CLI (5-hour windows)&lt;/strong&gt;&lt;br&gt;
Since their reset cycles are short, it’s more cost-efficient to exhaust them first.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Once both 5-hour windows are exhausted, switch to GitHub Copilot CLI&lt;/strong&gt;&lt;br&gt;
Copilot operates on a monthly quota, so it works well as a “bridge.”&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;When Claude / Codex reset, switch back automatically&lt;/strong&gt;&lt;br&gt;
Monthly quota should be preserved as much as possible.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;
  
  
  What It Does
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Native UI stays intact&lt;/strong&gt;&lt;br&gt;
Each CLI runs inside a wrapped PTY, preserving the original interactive experience.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Automatic switching based on usage&lt;/strong&gt;&lt;br&gt;
Adapters monitor CLI output streams and trigger a handoff when a threshold (default: 95%) is reached.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Context-aware handoff&lt;/strong&gt;&lt;br&gt;
Progress is written to &lt;code&gt;.tasuki/handoff.md&lt;/code&gt; and injected into the next provider.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
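
&lt;p&gt;As a concrete illustration of the handoff idea, here is the kind of note that might be carried between providers. This is only a sketch: the actual schema of &lt;code&gt;.tasuki/handoff.md&lt;/code&gt; is defined by tasuki and may differ.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Write an illustrative handoff note (not tasuki's real schema)
mkdir -p .tasuki
printf '%s\n' \
  '- Task: refactor the auth middleware' \
  '- Done: extracted token parsing into a helper' \
  '- Next: add tests for expired tokens' | tee .tasuki/handoff.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;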




&lt;h2&gt;
  
  
  Try It in 5 Minutes
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Install
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;go &lt;span class="nb"&gt;install &lt;/span&gt;github.com/0xkohe/tasuki/cmd/tasuki@latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  First Run
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;path/to/your/project
tasuki
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7aybiohnxwcpd9zlzc7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo7aybiohnxwcpd9zlzc7.png" alt="Terminal output showing tasuki switching from Claude to Codex after hitting usage threshold" width="800" height="744"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fplufeba5hj9w73saebum.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fplufeba5hj9w73saebum.png" alt="Terminal output showing tasuki continuing execution seamlessly on another provider" width="800" height="744"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Closing
&lt;/h2&gt;

&lt;p&gt;Use everything. Keep exploring.&lt;br&gt;
tasuki reduces the friction of doing that.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;GitHub: &lt;a href="https://github.com/0xkohe/tasuki" rel="noopener noreferrer"&gt;https://github.com/0xkohe/tasuki&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Issues: &lt;a href="https://github.com/0xkohe/tasuki/issues" rel="noopener noreferrer"&gt;https://github.com/0xkohe/tasuki/issues&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>cli</category>
      <category>go</category>
      <category>programming</category>
    </item>
    <item>
      <title>I built git-ai – generate Conventional Commit messages via Claude/Copilot/Codex CLI</title>
      <dc:creator>kohei</dc:creator>
      <pubDate>Fri, 06 Mar 2026 10:17:51 +0000</pubDate>
      <link>https://forem.com/0xkohe/i-built-git-ai-generate-conventional-commit-messages-via-claudecopilotcodex-cli-4bhk</link>
      <guid>https://forem.com/0xkohe/i-built-git-ai-generate-conventional-commit-messages-via-claudecopilotcodex-cli-4bhk</guid>
      <description>&lt;p&gt;I built a small  cli called git-ai that adds a &lt;code&gt;git ai commit&lt;/code&gt; command to your workflow.&lt;br&gt;
It reads &lt;code&gt;git diff --staged&lt;/code&gt;, sends the diff to your preferred AI CLI (Claude, GitHub Copilot, or OpenAI Codex), and generates a Conventional Commit message automatically. A spinner animation shows the progress, and if one model isn't installed or fails, it silently falls back to the next available one.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;$ git add .
$ git ai commit
  ⠹ Generating commit message with copilot...
✅ Generated by copilot

Proposed: feat: add user authentication middleware
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;




&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7gcljp0lv3fzpxvr062i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7gcljp0lv3fzpxvr062i.png" alt="Terminal output of git ai commit generating a Conventional Commit message" width="744" height="511"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Supports Claude CLI, GitHub Copilot CLI, and OpenAI Codex CLI&lt;/li&gt;
&lt;li&gt;Auto-detects which model to use (claude → copilot → codex)&lt;/li&gt;
&lt;li&gt;&lt;code&gt;--edit&lt;/code&gt; / &lt;code&gt;-e&lt;/code&gt; flag to review and modify the message in vi before committing&lt;/li&gt;
&lt;li&gt;Pure Bash, no dependencies beyond the AI CLI itself&lt;/li&gt;
&lt;li&gt;One-line install via &lt;code&gt;install.sh&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
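
&lt;p&gt;The auto-detection above can be sketched in a few lines of shell. This is a minimal illustration of the documented fallback order (claude → copilot → codex), not the actual script; the function name &lt;code&gt;pick_model&lt;/code&gt; is made up for this example.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pick_model() {
  # Try each CLI in the documented priority order
  for m in claude copilot codex; do
    if [ -n "$(command -v "$m")" ]; then
      echo "$m"   # first installed CLI wins
      return 0
    fi
  done
  echo "none"     # nothing installed; the real script would report an error
}
pick_model
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;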

&lt;p&gt;The motivation was simple: I kept writing the same kinds of commit messages over and over. With git-ai, I just stage my changes and let the AI handle the wording — it's surprisingly good at following Conventional Commits format.&lt;/p&gt;

&lt;p&gt;GitHub: &lt;a href="https://github.com/0xkohe/git-ai" rel="noopener noreferrer"&gt;https://github.com/0xkohe/git-ai&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>cli</category>
      <category>git</category>
      <category>showdev</category>
    </item>
  </channel>
</rss>
