<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: krit.k83 (ΚρητικόςIGB)</title>
    <description>The latest articles on Forem by krit.k83 (ΚρητικόςIGB) (@krit83).</description>
    <link>https://forem.com/krit83</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3862758%2F9597463f-b5b3-4745-ad96-b0eee21ca618.png</url>
      <title>Forem: krit.k83 (ΚρητικόςIGB)</title>
      <link>https://forem.com/krit83</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/krit83"/>
    <language>en</language>
    <item>
      <title>I launched an open source CLI tool with zero audience — here's what happened in 10 days</title>
      <dc:creator>krit.k83 (ΚρητικόςIGB)</dc:creator>
      <pubDate>Sat, 11 Apr 2026 09:42:38 +0000</pubDate>
      <link>https://forem.com/krit83/i-launched-an-open-source-cli-tool-with-zero-audience-heres-what-happened-in-10-days-1d6g</link>
      <guid>https://forem.com/krit83/i-launched-an-open-source-cli-tool-with-zero-audience-heres-what-happened-in-10-days-1d6g</guid>
      <description>&lt;p&gt;Ten days ago I posted on LinkedIn about a tool I had built. I had no prior posts and no history of sharing my work online. I just wanted to see if anyone cared.&lt;/p&gt;

&lt;p&gt;The tool is &lt;a href="https://github.com/gekap/fast-copy" rel="noopener noreferrer"&gt;fast-copy&lt;/a&gt; — a Python CLI for high-speed file transfers with block-order I/O, content-aware deduplication, and SSH tar streaming. I wrote about the technical details in my &lt;a href="https://dev.to/krit83/i-built-a-faster-alternative-to-cp-and-rsync-heres-how-it-works-39fa"&gt;previous article&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This post is about what happened after I hit publish.&lt;/p&gt;

&lt;h2&gt;
  
  
  The numbers
&lt;/h2&gt;

&lt;p&gt;After 10 days, with no ads, no influencers, and no growth hacks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;557 total clones&lt;/strong&gt; from &lt;strong&gt;202 unique users&lt;/strong&gt; on GitHub&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;100 unique visitors&lt;/strong&gt; to the repo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;14 people&lt;/strong&gt; read the full source code&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;175 readers&lt;/strong&gt; on my dev.to article&lt;/li&gt;
&lt;li&gt;Organic traffic from Google, DuckDuckGo, Bing, Kagi, Reddit, and Hacker News&lt;/li&gt;
&lt;li&gt;People I've never met shared it in Slack channels and Teams groups at their workplaces&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All from one LinkedIn post, one dev.to article, and one Hacker News submission.&lt;/p&gt;

&lt;h2&gt;
  
  
  What actually worked
&lt;/h2&gt;

&lt;h3&gt;
  
  
  LinkedIn surprised me
&lt;/h3&gt;

&lt;p&gt;I expected zero engagement. Instead, the post reached 676 people and got almost 1,000 impressions. The "problem-first" format worked — I opened with "I got tired of watching &lt;code&gt;cp -r&lt;/code&gt; crawl on a 60K-file directory" and that hooked people who recognized the pain.&lt;/p&gt;

&lt;h3&gt;
  
  
  Dev.to was the slow burn winner
&lt;/h3&gt;

&lt;p&gt;My article here didn't get much traction in the first few days. Then around day 5, it spiked to 70 readers in a single day. Google and DuckDuckGo started indexing it and now send 20+ views each without me doing anything. This is the platform that will keep working months from now.&lt;/p&gt;

&lt;h3&gt;
  
  
  Hacker News was humbling
&lt;/h3&gt;

&lt;p&gt;I submitted a Show HN post. It got 2 points and zero comments. But it still sent 14 unique visitors to my repo — and those were high-quality visitors who dug deep into the source code and releases page. HN is hard to crack, but even a "failed" post there has impact.&lt;/p&gt;

&lt;h3&gt;
  
  
  Reddit was a rollercoaster
&lt;/h3&gt;

&lt;p&gt;My r/python post got removed by the spam filter. I messaged the mods, waited days, and it eventually got approved. It's slowly getting traction now — 6 unique visitors so far. Patience is key.&lt;/p&gt;

&lt;h2&gt;
  
  
  What I learned
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;1. People don't comment — they clone.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Across all platforms, I got almost zero comments. For days I thought nobody cared. But 200+ people downloaded the code. Developers are silent users — they evaluate in their terminal, not in the comment section.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. One post triggers a chain reaction.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;LinkedIn → Hacker News → dev.to → search engines → Reddit. Each platform fed the next. You don't need to go viral on one platform — you need presence on several.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Search engines are the long game.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The first week was all social media traffic. By day 10, Google and DuckDuckGo were sending consistent daily traffic. That will compound. An article or repo that ranks for "rsync alternative" or "fast file copy Python" will bring visitors for years.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Benchmarks &amp;gt; descriptions.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The single most effective thing in my posts was the real benchmark: "92K files, 888MB, copied in 17.9 seconds." People respond to numbers, not feature lists.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Ship binaries.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Adding pre-built executables for Linux, macOS, and Windows using GitHub Actions made a visible difference. The releases page is one of the most visited pages on the repo. Lower the barrier and people will try your tool.&lt;/p&gt;
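&lt;p&gt;For anyone curious what that setup looks like, here is a minimal sketch of a cross-platform build workflow. It assumes PyInstaller and the file layout shown here, which may not match what fast-copy actually uses:&lt;/p&gt;

```yaml
# Sketch only: build a single-file binary on each OS with PyInstaller
# (assumed here; the actual fast-copy workflow may differ).
name: build-binaries
on:
  release:
    types: [published]
jobs:
  build:
    strategy:
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.12"
      - run: pip install pyinstaller
      - run: pyinstaller --onefile fast_copy.py
      - uses: actions/upload-artifact@v4
        with:
          name: fast-copy-${{ matrix.os }}
          path: dist/*
```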

&lt;h2&gt;
  
  
  What's next
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Retry Hacker News with better timing&lt;/li&gt;
&lt;li&gt;Add a comparison table to the README&lt;/li&gt;
&lt;li&gt;Keep shipping improvements and posting about them&lt;/li&gt;
&lt;li&gt;Maybe a GUI someday&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're building something and hesitating to share it — just do it. The right people will find it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://github.com/gekap/fast-copy" rel="noopener noreferrer"&gt;fast-copy on GitHub&lt;/a&gt;&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>python</category>
      <category>showdev</category>
      <category>beginners</category>
    </item>
    <item>
      <title>I built a faster alternative to cp and rsync — here's how it works</title>
      <dc:creator>krit.k83 (ΚρητικόςIGB)</dc:creator>
      <pubDate>Sun, 05 Apr 2026 20:27:08 +0000</pubDate>
      <link>https://forem.com/krit83/i-built-a-faster-alternative-to-cp-and-rsync-heres-how-it-works-39fa</link>
      <guid>https://forem.com/krit83/i-built-a-faster-alternative-to-cp-and-rsync-heres-how-it-works-39fa</guid>
      <description>&lt;p&gt;I'm a systems engineer. I spend a lot of time copying files — backups to USB drives, transfers to NAS boxes, moving data between servers over SSH. And I kept running into the same frustrations:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;cp -r&lt;/code&gt; is painfully slow on HDDs when you have tens of thousands of small files&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;rsync&lt;/code&gt; is powerful but complex, and still slow for bulk copies&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;scp&lt;/code&gt; and SFTP top out at 1-2 MB/s on transfers that should be much faster&lt;/li&gt;
&lt;li&gt;No tool tells you upfront if the destination even has enough space&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;So I built &lt;strong&gt;fast-copy&lt;/strong&gt; — a Python CLI that copies files at maximum sequential disk speed.&lt;/p&gt;

&lt;h2&gt;
  
  
  The core idea
&lt;/h2&gt;

&lt;p&gt;When you run &lt;code&gt;cp -r&lt;/code&gt;, files are read in directory order — which is essentially random on disk. Every file seek on an HDD costs 5-10ms. Multiply that by 60,000 files and you're spending 5-10 minutes on head movement alone.&lt;/p&gt;

&lt;p&gt;fast-copy does something different: it resolves the physical disk offset of every file before copying. On Linux it uses &lt;code&gt;FIEMAP&lt;/code&gt;, on macOS &lt;code&gt;fcntl&lt;/code&gt;, on Windows &lt;code&gt;FSCTL&lt;/code&gt;. Then it sorts files by block position and reads them sequentially.&lt;/p&gt;

&lt;p&gt;That alone makes a big difference. But there's more.&lt;/p&gt;
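&lt;p&gt;To make that concrete, here is a minimal, Linux-only sketch of the sort-by-physical-offset step. It uses the simpler &lt;code&gt;FIBMAP&lt;/code&gt; ioctl as a stand-in for the tool's &lt;code&gt;FIEMAP&lt;/code&gt; path, and the helper names are mine, not fast-copy's:&lt;/p&gt;

```python
# Sketch only (Linux). FIBMAP maps a file's logical block 0 to its
# physical block number; FIEMAP (what the article describes) returns
# full extent maps but needs more ctypes plumbing.
import fcntl
import struct

FIBMAP = 1  # ioctl request number on Linux

def physical_block(path):
    """Physical block of the file's first logical block; 0 if unavailable.

    FIBMAP requires root (CAP_SYS_RAWIO); without it the ioctl fails
    and we return 0, leaving those files in their original order.
    """
    try:
        with open(path, "rb") as f:
            buf = struct.pack("i", 0)  # ask about logical block 0
            res = fcntl.ioctl(f.fileno(), FIBMAP, buf)
            return struct.unpack("i", res)[0]
    except OSError:
        return 0

def sorted_by_disk_order(paths):
    """Sort paths by physical position so reads are sequential on an HDD."""
    return sorted(paths, key=physical_block)
```

&lt;p&gt;Sorting is cheap next to the seeks it avoids: one extra ioctl per file versus a 5-10ms head movement.&lt;/p&gt;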

&lt;h2&gt;
  
  
  Deduplication
&lt;/h2&gt;

&lt;p&gt;Many directories have duplicate files — node_modules across projects, cached downloads, backup copies. fast-copy hashes every file with xxHash-128 (or SHA-256 as fallback), copies each unique file once, and creates hard links for duplicates.&lt;/p&gt;

&lt;p&gt;In my test with 92K files, over half were duplicates — saving 379 MB and a lot of I/O time.&lt;/p&gt;

&lt;p&gt;It also keeps a SQLite database of hashes, so repeated copies to the same destination skip files that were already copied in previous runs.&lt;/p&gt;
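&lt;p&gt;A stripped-down sketch of the hash-then-hard-link idea (SHA-256 only, no SQLite cache; the function names are illustrative, not the tool's actual API):&lt;/p&gt;

```python
# Sketch only: copy each unique file once, hard-link every later file
# whose content hash has already been seen.
import hashlib
import os
import shutil

def file_hash(path, chunk=1048576):
    """SHA-256 of a file, read in 1 MB chunks (xxHash-128 would be faster)."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def copy_with_dedup(sources, dest_dir):
    """Copy files into dest_dir; duplicates become hard links, not copies."""
    seen = {}  # hash -> destination path of the first copy
    copied = linked = 0
    os.makedirs(dest_dir, exist_ok=True)
    for src in sources:
        dst = os.path.join(dest_dir, os.path.basename(src))
        digest = file_hash(src)
        if digest in seen:
            os.link(seen[digest], dst)  # duplicate: hard link, no data written
            linked += 1
        else:
            shutil.copy2(src, dst)      # unique content: real copy
            seen[digest] = dst
            copied += 1
    return copied, linked
```

&lt;p&gt;Hard links cost no data I/O, which is where the 379 MB saving in the test above comes from.&lt;/p&gt;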

&lt;h2&gt;
  
  
  SSH tar streaming
&lt;/h2&gt;

&lt;p&gt;This is the part I'm most proud of. Instead of using SFTP (which has significant protocol overhead), fast-copy streams files as chunked ~100 MB tar batches over raw SSH channels.&lt;/p&gt;

&lt;p&gt;The remote side runs &lt;code&gt;tar xf -&lt;/code&gt; and files land directly on disk — no temp files, no SFTP overhead. This even works on servers that have SFTP disabled, like some Synology NAS configurations.&lt;/p&gt;

&lt;p&gt;Three modes are supported:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Local → Remote&lt;/li&gt;
&lt;li&gt;Remote → Local&lt;/li&gt;
&lt;li&gt;Remote → Remote (relay through your machine)&lt;/li&gt;
&lt;/ul&gt;
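&lt;p&gt;The local-to-remote direction can be sketched with the system &lt;code&gt;ssh&lt;/code&gt; client instead of paramiko channels (hosts and paths are placeholders, and the real tool batches into ~100 MB chunks rather than one long stream):&lt;/p&gt;

```python
# Sketch only: stream a directory as an uncompressed tar into
# "tar xf -" on the remote side. One shot, no chunking; remote_dir
# is assumed shell-safe.
import subprocess
import tarfile

def stream_tar(local_dir, fileobj):
    """Write local_dir into fileobj as a streaming (non-seekable) tar."""
    with tarfile.open(fileobj=fileobj, mode="w|") as tar:
        tar.add(local_dir, arcname=".")

def push_dir_over_ssh(local_dir, host, remote_dir):
    """Pipe the tar stream straight into tar on the remote host."""
    proc = subprocess.Popen(
        ["ssh", host, f"mkdir -p {remote_dir}; cd {remote_dir}; tar xf -"],
        stdin=subprocess.PIPE,
    )
    stream_tar(local_dir, proc.stdin)
    proc.stdin.close()
    return proc.wait()  # 0 on success
```

&lt;p&gt;Because the receiver is plain &lt;code&gt;tar&lt;/code&gt;, this works even where SFTP is disabled, as noted above.&lt;/p&gt;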

&lt;h2&gt;
  
  
  Real benchmarks
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Local copy — 92K files to USB:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;44,718 unique files copied + 47,146 hard-linked&lt;/li&gt;
&lt;li&gt;509.8 MB written, 378.9 MB saved by dedup&lt;/li&gt;
&lt;li&gt;17.9 seconds, 28.5 MB/s&lt;/li&gt;
&lt;li&gt;All files verified after copy&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Remote to local — 92K files over LAN:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;509.8 MB downloaded in 14 minutes&lt;/li&gt;
&lt;li&gt;46,951 duplicates detected, saving 378.5 MB of transfer&lt;/li&gt;
&lt;li&gt;3x faster than SFTP&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Getting started
&lt;/h2&gt;

&lt;p&gt;The simplest way — just run the Python script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python fast_copy.py /source /destination
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or download a standalone binary (no Python needed) from the Releases page — available for Linux, macOS, and Windows.&lt;/p&gt;

&lt;p&gt;For SSH transfers, install paramiko:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;paramiko
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For faster hashing:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;xxhash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;GitHub: &lt;a href="https://github.com/gekap/fast-copy" rel="noopener noreferrer"&gt;https://github.com/gekap/fast-copy&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;License: Apache 2.0&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I'd love to hear feedback — especially from anyone dealing with large file transfers or backup workflows. What tools are you currently using? What's missing from them?&lt;/p&gt;




</description>
      <category>python</category>
      <category>linux</category>
      <category>opensource</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
