<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Dimension AI Technologies</title>
    <description>The latest articles on Forem by Dimension AI Technologies (@dimension-ai).</description>
    <link>https://forem.com/dimension-ai</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F11572%2F4d313bb1-8824-41f4-aa30-185e7c42dda8.png</url>
      <title>Forem: Dimension AI Technologies</title>
      <link>https://forem.com/dimension-ai</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/dimension-ai"/>
    <language>en</language>
    <item>
      <title>Just what IS Nim, anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sun, 15 Mar 2026 09:57:24 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-nim-anyway-4n15</link>
      <guid>https://forem.com/dimension-ai/just-what-is-nim-anyway-4n15</guid>
      <description>&lt;p&gt;&lt;em&gt;The language that separated logic from its execution environment&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Imagine your team has built a real-time analytics engine – event ingestion, aggregation, threshold detection – and the core logic is correct, fast, and well tested.&lt;/p&gt;

&lt;p&gt;Now it has to live in three places: the Linux backend service is C, the Windows desktop monitoring tool is C++, and the browser dashboard is TypeScript. The same aggregation algorithm has therefore been implemented three times, and when the threshold logic changes, three codebases must be updated, retested, and redeployed independently. The maintenance cost is no longer proportional to the complexity of the logic – it is proportional to the number of environments the logic must inhabit.&lt;/p&gt;

&lt;p&gt;TypeScript demonstrated part of this idea — the language sits above its runtime and generates JavaScript — but targets a single managed environment. Nim applies the same principle to C, with a critical difference: it generates code that can manage raw pointers, manual memory, and direct hardware access, and it does so across three active backends — C, C++, and JavaScript (with Objective-C still present as a legacy option).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nim c   app.nim   &lt;span class="c"&gt;# native binary via C&lt;/span&gt;
nim cpp app.nim   &lt;span class="c"&gt;# C++ integration&lt;/span&gt;
nim js  app.nim   &lt;span class="c"&gt;# browser application via JavaScript&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The same source file produces three different outputs for three different execution environments, because the backend is not the language's identity – it is a parameter.&lt;/p&gt;
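&lt;p&gt;As a minimal sketch of what such shared logic might look like – the proc names and the aggregation itself are illustrative, not taken from a real codebase – the following Nim contains nothing backend-specific and compiles unchanged under all three commands:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# aggregate.nim -- hypothetical shared aggregation logic
proc movingAverage(samples: seq[float], window: int): seq[float] =
  ## trailing moving average; pure logic, no platform calls
  result = newSeq[float](samples.len)
  var acc = 0.0
  for i, s in samples:
    acc += s
    if i &amp;gt;= window:
      acc -= samples[i - window]
    result[i] = acc / float(min(i + 1, window))

proc exceedsThreshold(avg: seq[float], threshold: float): bool =
  ## threshold detection shared by backend, desktop, and browser builds
  for v in avg:
    if v &amp;gt; threshold:
      return true
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Nothing in this file mentions C, C++, or JavaScript; the choice of environment is deferred entirely to the compile command.&lt;/p&gt;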




&lt;h2&gt;
  
  
  Logic versus physics
&lt;/h2&gt;

&lt;p&gt;Most systems languages bind the logic of a program to the physics of its execution. In C, Rust, or Odin, the algorithm and the execution model are tightly coupled. Ownership rules, ABI conventions, memory layout, and runtime behaviour are not optional layers that can be swapped out; they are part of what the language &lt;em&gt;is&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Nim separates these layers. The durable artefact in its compilation model is the abstract syntax tree – the structured representation of the program before it is lowered into any particular environment. Backend selection determines how that logic is realised, but the logic itself remains constant.&lt;/p&gt;

&lt;p&gt;Earlier articles in this series used "physics" to describe C's fixed execution rules – memory model, ABI, and calling conventions. Nim's architecture treats those rules as variable, selectable at compile time. That variability is precisely what separates Nim from every other language in the series.&lt;/p&gt;

&lt;p&gt;A language defined by its AST rather than by a single runtime becomes a &lt;em&gt;generator&lt;/em&gt; of programmes — capable of inhabiting different execution environments because the logic is treated as primary and the physics as contingent.&lt;/p&gt;




&lt;h2&gt;
  
  
  C as infrastructure
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;first article&lt;/a&gt; in this series argued that C's lasting importance lies not in the language itself but in the infrastructure around it: its ABI, its portability, and its toolchain. Nim attaches itself to exactly that infrastructure. By generating C, it inherits the entire C ecosystem – compilers, optimisers, debuggers, profilers, and library access – without building any of it. This is commensal architecture: Nim gains reach by living within the surrounding ecosystem without attempting to replace it.&lt;/p&gt;

&lt;p&gt;The contrast with &lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig&lt;/a&gt; is sharp: Zig rebuilt the C toolchain and positioned itself as a better C compiler, while Nim leaves the existing toolchain intact and emits code for compilers developers already have. Both languages treat C as central to their strategy, but they occupy opposite sides of the compilation boundary.&lt;/p&gt;

&lt;p&gt;The cost of that dependence is real: Nim does not own its compilation pipeline, and debug information maps to the generated C rather than cleanly back to the Nim source. The language is, structurally, a guest in someone else's toolchain.&lt;/p&gt;




&lt;h2&gt;
  
  
  Reach versus depth
&lt;/h2&gt;

&lt;p&gt;This distinction clarifies Nim's position in the series.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-odin-anyway"&gt;Odin&lt;/a&gt; represents a strategy of depth. Its design asks how data sits in memory, how cache lines are used, and how programmes align themselves with hardware behaviour. The &lt;code&gt;#soa&lt;/code&gt; annotation, built-in vector types, and the implicit context system all follow from a single organising principle: data layout over control flow.&lt;/p&gt;

&lt;p&gt;Nim represents a strategy of reach. Its design asks how one body of logic can survive contact with many execution environments without being rewritten — and unlike JVM or .NET languages, which achieve cross-platform deployment through a shared runtime, Nim achieves it through compile-time generation, producing native code for each target without requiring a runtime beneath it. The multi-backend compiler, configurable memory management, and powerful macro system all follow from a different organising principle: the separation of logic from the physics of its execution.&lt;/p&gt;

&lt;p&gt;Odin narrows around one strong constraint. Nim stays broad because it must map onto different runtime models – native binaries via C, library integration via C++, and browser applications via JavaScript. A language that wants to inhabit all three cannot afford to be defined too tightly by any one of them, and that breadth is not a failure of focus but the direct consequence of the architectural choice that defines the language.&lt;/p&gt;




&lt;h2&gt;
  
  
  Memory management as configuration
&lt;/h2&gt;

&lt;p&gt;Nim's most unusual technical feature follows directly from the logic/physics separation, and it is the clearest evidence that the separation is real rather than theoretical.&lt;/p&gt;

&lt;p&gt;Most modern systems languages choose one memory management philosophy and embed it in the language design: Rust enforces ownership and borrow checking, Go uses a tracing garbage collector, and Zig requires explicit allocator parameters. Each bakes its approach into the grammar and type system.&lt;/p&gt;

&lt;p&gt;Nim treats memory management as configuration.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;nim c &lt;span class="nt"&gt;--mm&lt;/span&gt;:arc  app.nim    &lt;span class="c"&gt;# automatic reference counting&lt;/span&gt;
nim c &lt;span class="nt"&gt;--mm&lt;/span&gt;:none app.nim    &lt;span class="c"&gt;# manual memory management&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The same source code, compiled with different flags, produces different runtime behaviour. Consider a function that allocates a buffer:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight nim"&gt;&lt;code&gt;&lt;span class="k"&gt;proc &lt;/span&gt;&lt;span class="nf"&gt;processData&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
  &lt;span class="k"&gt;var&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;newSeq&lt;/span&gt;&lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="n"&gt;byte&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4096&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;
  &lt;span class="c"&gt;# scope exits here — what happens to buf?&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Under &lt;code&gt;--mm:arc&lt;/code&gt;, the generated C contains reference-counting machinery. The compiler inserts a cleanup call at scope exit that frees &lt;code&gt;buf&lt;/code&gt; automatically – deterministic destruction, similar in effect to C++'s RAII. Under &lt;code&gt;--mm:none&lt;/code&gt;, that machinery is absent. Lifetime responsibility shifts entirely to the programmer. To see the difference, look at what the compiler actually emits:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight c"&gt;&lt;code&gt;&lt;span class="cm"&gt;/* Generated C under --mm:arc (simplified) */&lt;/span&gt;
&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;processData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;void&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;NimSeq&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;newSeq&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4096&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="o"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="n"&gt;nimDecRefAndFree&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;   &lt;span class="cm"&gt;/* ← inserted by the compiler */&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="cm"&gt;/* Generated C under --mm:none (simplified) */&lt;/span&gt;
&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;processData&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;void&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;NimSeq&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;newSeq&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4096&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="o"&gt;-&amp;gt;&lt;/span&gt;&lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;42&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="cm"&gt;/* ← nothing. Lifetime management is the programmer's responsibility. */&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Nim source is identical in both cases; only the generated C changes.&lt;/p&gt;

&lt;p&gt;Nim is not interesting here because it supports reference counting – C++ already has deterministic destruction. Nim is interesting because it allows the choice of lifetime strategy to move out of the language's identity and into compilation configuration. Memory management becomes part of the execution physics rather than part of the programme's logic.&lt;/p&gt;

&lt;p&gt;That flexibility buys reach. A library compiled with &lt;code&gt;--mm:arc&lt;/code&gt; can ship as a managed component; the same library compiled with &lt;code&gt;--mm:none&lt;/code&gt; can be embedded in a bare-metal environment where no runtime overhead is acceptable. But the same flexibility forfeits the compile-time guarantees that Rust can make only because it refuses such flexibility: Rust's ownership model can prove the absence of use-after-free and data races precisely because the memory rules are fixed. Nim's switchable model cannot, because the rules change depending on the flag.&lt;/p&gt;




&lt;h2&gt;
  
  
  Macros as the adapter layer
&lt;/h2&gt;

&lt;p&gt;In a single-backend language, macros are a convenience. In a multi-backend language, they become the primary mechanism for adapting high-level intent to backend-specific implementations.&lt;/p&gt;

&lt;p&gt;Nim macros operate directly on the abstract syntax tree during compilation, using ordinary Nim syntax rather than a separate metalanguage. The simplest form of backend adaptation is conditional compilation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight nim"&gt;&lt;code&gt;&lt;span class="k"&gt;when&lt;/span&gt; &lt;span class="n"&gt;defined&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;js&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
  &lt;span class="k"&gt;proc &lt;/span&gt;&lt;span class="nf"&gt;getTimestamp&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt; &lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{.&lt;/span&gt;&lt;span class="n"&gt;importjs&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"Date.now()"&lt;/span&gt;&lt;span class="p"&gt;.}&lt;/span&gt;
&lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
  &lt;span class="k"&gt;proc &lt;/span&gt;&lt;span class="nf"&gt;getTimestamp&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt; &lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
    &lt;span class="c"&gt;# native implementation via C's clock_gettime&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;But the deeper capability is AST transformation. A macro can receive a block of code as a syntax tree, inspect its structure, and emit a rewritten version – all at compile time:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight nim"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="n"&gt;macros&lt;/span&gt;

&lt;span class="k"&gt;macro&lt;/span&gt; &lt;span class="n"&gt;serialise&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;untyped&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt; &lt;span class="n"&gt;untyped&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
  &lt;span class="c"&gt;# receives the body as an AST node&lt;/span&gt;
  &lt;span class="c"&gt;# walks the tree, finds field declarations&lt;/span&gt;
  &lt;span class="c"&gt;# emits read/write procedures for each field&lt;/span&gt;
  &lt;span class="c"&gt;# the generated code compiles against whichever backend is active&lt;/span&gt;
  &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;buildSerialiser&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;body&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The distinction matters: conditional compilation chooses between existing code paths, whereas AST macros &lt;em&gt;generate&lt;/em&gt; code paths that did not exist in the source. A macro could, for instance, accept a high-level concurrency intent and emit &lt;code&gt;pthreads&lt;/code&gt; calls for the C backend but &lt;code&gt;Web Workers&lt;/code&gt; setup for the JavaScript backend — same logical operation, different execution physics, generated from a single source definition. Without this capability, adapting a single codebase to multiple backends would collapse into layers of manual wrappers and platform-specific branches.&lt;/p&gt;
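&lt;p&gt;A minimal, self-contained illustration of that generative step – the &lt;code&gt;traced&lt;/code&gt; macro below is a toy example, not the concurrency adapter or the &lt;code&gt;serialise&lt;/code&gt; sketch above – receives a call as an AST node and emits a rewritten version that logs before executing:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import std/macros

macro traced(call: untyped): untyped =
  # `call` arrives as a syntax tree; repr turns it back into source text
  let name = newLit(call.repr)
  # quote builds a new AST: a log statement followed by the original call
  result = quote do:
    echo "calling ", `name`
    `call`

proc greet(name: string) = echo "hello ", name

traced(greet("Nim"))   # logs the call text, then runs greet("Nim")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The log statement did not exist anywhere in the source; the macro generated it at compile time from the structure of the call it was given.&lt;/p&gt;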

&lt;p&gt;The cost is the same tension the &lt;a href="https://dev.to/dimension-ai/just-what-is-clojure-anyway-42np"&gt;Clojure article&lt;/a&gt; in this series identified: expressive metaprogramming that individual experts wield productively and teams struggle to maintain.&lt;/p&gt;




&lt;h2&gt;
  
  
  Where Nim's design fits
&lt;/h2&gt;

&lt;p&gt;Nim's architecture works best where deployment diversity is unavoidable.&lt;/p&gt;

&lt;p&gt;The strongest production example is the Ethereum ecosystem. The Nimbus clients, developed by Status, are large Nim codebases that produce efficient native binaries while operating inside an infrastructure dominated by C and Go implementations. Status has also used Nim across multiple components of its distributed messaging platform, demonstrating that the language can sustain substantial production workloads.&lt;/p&gt;

&lt;p&gt;Scientific and numerical computing provides a second family of examples. Arraymancer is a tensor library aimed at CPU, CUDA, and OpenCL workloads. SciNim exists as an explicit initiative to build scientific computing infrastructure around the language. Both exploit Nim's ability to generate C that links directly against established numerical libraries.&lt;/p&gt;

&lt;p&gt;What these cases share is a specific engineering condition: a core algorithm that must operate inside ecosystems built in other languages. Nim's value is clearest when the alternative is maintaining parallel implementations across those ecosystems.&lt;/p&gt;




&lt;h2&gt;
  
  
  Limitations and the adoption paradox
&lt;/h2&gt;

&lt;p&gt;Languages that achieve institutional adoption usually do so by imposing visible constraints: Rust tells teams what they cannot do with memory, Go tells teams roughly how everyone else will write, and Zig tells teams where hidden behaviour is not allowed. These constraints make codebases legible across teams and give organisations confidence that new hires will produce code consistent with the existing base.&lt;/p&gt;

&lt;p&gt;Nim offers a larger design space. Configurable memory models, macro-based DSL construction, and multiple backends give expert developers unusual power. They also make standardisation harder. Two experienced Nim developers may write code that looks nothing alike, using different paradigms, different macro patterns, and different memory strategies. Large organisations often adopt constraints precisely because constraints make codebases predictable. Nim's flexibility works against that institutional need.&lt;/p&gt;

&lt;p&gt;That is the deeper adoption problem – not marketing, but institutional trust.&lt;/p&gt;

&lt;p&gt;Nim has been stable and capable for over a decade. It has not achieved critical mass. At some point the question shifts from "what still needs to be improved?" to "does the market have room for another general-purpose compiled language whose main virtue is flexibility?" The honest answer in 2026 is: probably not at large scale, unless the language finds a narrow domain it can own completely.&lt;/p&gt;

&lt;p&gt;The practical limitations compound this. The package ecosystem is smaller than Rust's, Go's, or Python's. Debugging can expose generated C rather than Nim source, reminding developers that the language operates as a layer above another toolchain. LLM training data for Nim is thin compared with mainstream languages, making AI-assisted development less reliable – a compounding disadvantage in an era where coding agents are increasingly central to developer productivity.&lt;/p&gt;




&lt;h2&gt;
  
  
  Competitors
&lt;/h2&gt;

&lt;p&gt;Zig and Nim both treat C as central to their strategy, but from opposite directions: Zig became a C compiler, absorbing the toolchain and offering to build C code better than GCC does, while Nim became a C generator, emitting code for the existing toolchain and letting it handle the rest. Both keep C's physics, but they differ on which side of the compilation boundary they occupy.&lt;/p&gt;

&lt;p&gt;No other mainstream compiled language separates the programming language from its compilation target to the degree Nim does.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Programming languages are usually defined by where their programmes run – the compilation target determines the runtime behaviour, the tooling, and the surrounding ecosystem.&lt;/p&gt;

&lt;p&gt;Nim reversed that relationship. By treating the backend as variable and the AST as canonical, it separated the logic of the programme from the physics of its execution more decisively than any other language in this series.&lt;/p&gt;

&lt;p&gt;Within the landscape explored by this series, Odin pursues depth, Zig pursues control over the toolchain, and Nim pursues reach.&lt;/p&gt;

&lt;p&gt;Its achievement is real: it demonstrates that one body of logic can inhabit many environments without surrendering its identity at the source level. Its paradox is equally real: a language that can inhabit almost any role eventually struggles to claim one of its own.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This article is part of an ongoing series examining what programming languages actually are and why they matter.&lt;/em&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Argument&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;C&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The irreplaceable foundation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-python-anyway-7ce"&gt;Python&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The approachable ecosystem&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-rust-anyway-28bh"&gt;Rust&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Safe systems programming&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-clojure-anyway-42np"&gt;Clojure&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Powerful ideas, niche language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Rebuild the toolchain, keep the physics&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-odin-anyway"&gt;Odin&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Data layout over control flow&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Nim&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Logic separated from execution&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

</description>
      <category>nim</category>
      <category>c</category>
      <category>javascript</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>Just What IS Odin, Anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Wed, 11 Mar 2026 17:17:42 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-odin-anyway-542a</link>
      <guid>https://forem.com/dimension-ai/just-what-is-odin-anyway-542a</guid>
      <description>&lt;p&gt;&lt;em&gt;The language that treats the cache line as a design principle&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Imagine you are a game-graphics programmer writing what's known as a "particle system" – a technique used in real-time graphics to render large numbers of small, short-lived visual objects such as sparks, smoke, fire, and rain. Each particle is a lightweight data record with position, velocity, colour, lifetime, and a handful of flags, and a typical system manages tens of thousands of them simultaneously, updated sixty times per second. It is one of the clearest cases in software where data layout dominates performance.&lt;/p&gt;

&lt;p&gt;You have written the obvious code: a struct containing all the fields, an array of ten thousand structs, a loop that updates position from velocity each frame.&lt;/p&gt;

&lt;p&gt;The profiler tells you the update loop is slow. Not algorithmically slow – the work per particle is trivial – but &lt;em&gt;memory&lt;/em&gt; slow. The CPU is stalling, waiting for data to arrive from main memory. The reason is a classic data layout problem: Array of Structures (AOS) versus Structure of Arrays (SOA). You are iterating through an array of structs, so every cache line fetched carries position, velocity, colour, lifetime, and every flag, even though the update only touches position and velocity. The rest is dead weight, dragged through the cache on every iteration, evicting data you actually need.&lt;/p&gt;

&lt;p&gt;You know the fix. You've known it for years. Convert the AOS to an SOA: one contiguous array of positions, another of velocities, another of colours. Now the update loop streams through two tightly packed arrays and the cache line contains nothing but data the loop actually reads.&lt;/p&gt;

&lt;p&gt;The problem is that in most mainstream languages, this fix becomes a &lt;em&gt;refactoring project&lt;/em&gt;. In C, you tear apart the struct and rewrite every access pattern. In C++, you write a template library or reach for an ECS framework. In Rust, you restructure your data model and update every borrow. In Zig, you write the SOA layout manually and adjust the allocator calls. The conceptual change is one sentence: "store the data by field, not by record." The implementation change is days of work and a significant source of bugs.&lt;/p&gt;

&lt;p&gt;Now imagine a language where that change is a single annotation on the struct declaration, and the compiler handles the rest.&lt;/p&gt;

&lt;p&gt;That language exists. It is called Odin. And the annotation – &lt;code&gt;#soa&lt;/code&gt; – is not a library feature or a compiler extension. It is part of the language, because Odin is explicitly designed around how data moves through hardware, rather than how logic flows through code.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why data layout matters
&lt;/h2&gt;

&lt;p&gt;Modern CPUs do not fetch individual bytes from memory. They fetch cache lines – typically 64 bytes at a time. If a data structure contains ten fields but a loop only touches two of them, eight fields' worth of memory are loaded and discarded on every iteration. The cost is not the computation; it is the wait. On modern hardware, a cache miss can cost 100–200 CPU cycles – the CPU can perform several hundred arithmetic operations in the time it takes to fetch a single cache line from main memory.&lt;/p&gt;

&lt;p&gt;Game development discovered this problem earlier than most of the software industry, because games run fixed-budget loops at 60 or 120 frames per second. A few hundred unnecessary cache misses per frame can be the difference between smooth rendering and visible stutter. The discipline that emerged – data-oriented design, articulated most influentially by Mike Acton in his widely cited 2014 CppCon talk – reorganises software around memory access patterns rather than object hierarchies.&lt;/p&gt;

&lt;p&gt;Data-oriented design has been practised in C and C++ for decades, but always as a discipline imposed &lt;em&gt;on top of&lt;/em&gt; the language. The language itself provides no support for it. Arrays of C structs are always arrays-of-structures. C++ classes encourage bundling behaviour with data, which pushes against cache-friendly layout in many codebases. Rust and Zig are primarily organised around control flow and procedure structure; both leave layout optimisation to the programmer.&lt;/p&gt;

&lt;p&gt;Odin was designed by Bill Hall (known as gingerBill), who had been doing this work manually for years and decided the language itself should handle it. The result is a systems language that keeps C's fundamental execution rules – its memory model, its programmer-managed resources, its C-compatible ABI – but reorganises everything above that foundation around data layout.&lt;/p&gt;

&lt;p&gt;(This series uses "physics" to mean a language's fundamental execution rules – memory model, execution model, and binary interface. Odin keeps C's physics, as &lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig&lt;/a&gt; does; the difference is what each language builds on top of that shared foundation.)&lt;/p&gt;




&lt;h2&gt;
  
  
  How data-orientation shapes the language
&lt;/h2&gt;

&lt;p&gt;Mainstream systems languages foreground control flow: functions, conditionals, loops, call stacks, scope rules. The programmer thinks in terms of "what happens next." Odin foregrounds data layout instead: how structures sit in memory, how they travel through cache lines, how they're accessed in bulk. That single reorientation explains every distinctive feature of the language.&lt;/p&gt;

&lt;h3&gt;
  
  
  SOA as a built-in
&lt;/h3&gt;

&lt;p&gt;In Odin, converting an Array of Structures to a Structure of Arrays is a directive on the type, not a restructuring of the codebase:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Array of Structures (default)
particles: [10000]Particle

// Structure of Arrays — same data, cache-friendly layout
particles: #soa [10000]Particle
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The compiler rearranges the memory layout. Crucially, access syntax remains identical – &lt;code&gt;particles[i].position&lt;/code&gt; works the same way regardless of whether the underlying storage is AOS or SOA. The programmer changes one annotation; the data moves through the cache differently; every line of code that reads or writes the data compiles without modification.&lt;/p&gt;

&lt;p&gt;To appreciate what this eliminates, consider what the same transformation requires in C:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight c"&gt;&lt;code&gt;&lt;span class="c1"&gt;// C: the struct must be torn apart into parallel arrays&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;pos_x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;pos_y&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;pos_z&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;vel_x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;vel_y&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt; &lt;span class="n"&gt;vel_z&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="n"&gt;lifetime&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
&lt;span class="kt"&gt;uint32_t&lt;/span&gt; &lt;span class="n"&gt;flags&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
&lt;span class="c1"&gt;// Every access pattern that touched the original struct must be rewritten.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In C, the refactoring is structural and pervasive. In C++, you would typically reach for a template library or an Entity Component System framework to manage the transformation. In Rust, restructuring the data model propagates through every borrow and lifetime annotation that touches it. In Odin, it is a single keyword.&lt;/p&gt;

&lt;p&gt;This is the clearest illustration of the thesis. In a control-flow-oriented language, layout is a consequence of how the programmer defines types. In Odin, layout is a first-class design decision that the language supports directly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Built-in vector and matrix types
&lt;/h3&gt;

&lt;p&gt;Most languages treat vectors and matrices as library types. Odin treats them as primitives:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pos: [3]f32            // a 3-element vector
mat: matrix[3, 3]f32   // a 3×3 matrix
result := mat * pos    // native operation, no library call
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In C++, Eigen or GLM provide these operations, but as template libraries with their own compilation costs and diagnostic noise. In Rust, basic linear algebra requires importing a crate – nalgebra, glam, or similar – and often involves working around the type system to ensure correct alignment. In Odin, vectors and matrices are native types that the compiler can map efficiently to SIMD instructions, with no external library dependency and without the extra indirection of a library-defined numeric type. Because the compiler knows these are vector types rather than arbitrary structs, it can make alignment and packing guarantees that library-based solutions must work around.&lt;/p&gt;

&lt;p&gt;This choice only makes sense if the language is designed around data. If your organising principle is control flow, vector types are a library concern. If your organising principle is how data is arranged and moved, they are as fundamental as integers.&lt;/p&gt;

&lt;h3&gt;
  
  
  No constructors, no destructors
&lt;/h3&gt;

&lt;p&gt;Odin has neither. Structs are inert data – they zero-initialise by default, and the programmer manages lifecycle explicitly:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Odin: data is passive. No constructor. No destructor. Zero-initialised.
p: Particle  // all fields are zero
p.lifetime = 5.0
// When p goes out of scope, nothing happens. No Drop, no destructor, no hidden call.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Compare with C++, where constructing an object may invoke an arbitrary constructor, and where scope exit triggers type-defined destruction logic that is not visible at the declaration site:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// C++: what happens when this object is created? When it's destroyed?&lt;/span&gt;
&lt;span class="c1"&gt;// The answer depends on the class definition, which may be in another file.&lt;/span&gt;
&lt;span class="n"&gt;Particle&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;      &lt;span class="c1"&gt;// constructor runs — doing what?&lt;/span&gt;
&lt;span class="c1"&gt;// ... use p ...&lt;/span&gt;
&lt;span class="c1"&gt;// scope exit: destructor runs — doing what?&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The absence of constructors and destructors is perhaps the most consequential omission in the language, and it follows directly from the data-oriented model: if data is passive and the programme's job is to transform it in bulk, then attaching behaviour to individual records is often the wrong abstraction for that workload. It hides per-record work inside what should be a bulk operation, and it couples data to behaviour in precisely the way data-oriented design seeks to avoid. Odin achieves simplicity not by adding more semantic machinery, but by refusing to attach lifecycle behaviour to the data in the first place.&lt;/p&gt;




&lt;h2&gt;
  
  
  The context system
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig article&lt;/a&gt; in this series explained explicit allocator parameters as a consequence of the transparency principle: every function that allocates memory declares that fact in its signature. Odin solves the same problem – who controls memory allocation – but arrives at a different answer.&lt;/p&gt;

&lt;p&gt;Every Odin procedure has access to an implicit &lt;code&gt;context&lt;/code&gt; value, passed on the stack, containing the current allocator, temporary allocator, logger, and assertion handler. Any procedure can override the context for its callees by modifying its own copy:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Odin: switch the allocator for everything called within this scope
context.allocator = my_arena_allocator
process_batch(particles)
// process_batch and everything it calls will use my_arena_allocator
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;No function signatures change. No explicit parameters are threaded through the call graph. The allocator flows through the programme as environmental data.&lt;/p&gt;

&lt;p&gt;The equivalent operation in Zig requires the allocator to appear in every function signature in the call chain:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;&lt;span class="c"&gt;// Zig: the allocator must be passed explicitly at every level&lt;/span&gt;
&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="n"&gt;process_batch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;allocator&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="py"&gt;mem&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Allocator&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;particles&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;&lt;span class="n"&gt;Particle&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c"&gt;// every sub-call that allocates also needs the allocator parameter&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="n"&gt;process_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;allocator&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;particles&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="n"&gt;chunk_size&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In codebases where allocators and scratch state are pervasive – which describes most data-heavy systems – threading an allocator parameter through every function signature adds noise to every call site. Odin chooses ambient context because it reduces that noise. The practical justification is straightforward: in a codebase processing ten thousand particles per frame, the allocator is environmental infrastructure, not per-function state. The context system treats it accordingly.&lt;/p&gt;

&lt;p&gt;The trade-off is real. Context is implicit. A function's signature does not tell you whether it allocates memory. This is the inverse of Zig's guarantee. Odin sacrifices signature transparency for lower per-function noise, and developers coming from Zig or Rust will notice the loss.&lt;/p&gt;




&lt;h2&gt;
  
  
  Where Odin's design assumptions match reality
&lt;/h2&gt;

&lt;p&gt;Three kinds of work align with Odin's architecture.&lt;/p&gt;

&lt;p&gt;The first is real-time graphical software: game engines, simulation tools, renderers – systems where the frame budget is fixed and cache performance dominates. This is Odin's strongest territory and the domain that produced it. JangaFX's EmberGen, a real-time volumetric fluid simulation tool, is built in Odin and is the most prominent production example.&lt;/p&gt;

&lt;p&gt;The second is bulk data transformation: any system that processes large homogeneous datasets where the access pattern is "stream through a large array and transform each element." Physics simulation, audio DSP, rendering pipelines, GPU data preparation, and signal processing all share this characteristic. SOA layout and SIMD-friendly primitives provide direct, measurable performance gains in these contexts, and they are domains where data layout routinely dominates performance.&lt;/p&gt;

&lt;p&gt;The third is systems work where Rust's constraints impose excessive friction and Zig's explicitness imposes excessive noise. Odin occupies a middle ground: more guardrails than C (bounds-checked arrays, zero-initialisation), fewer constraints than Rust (no borrow checker), less ceremony than Zig (implicit context rather than explicit allocator parameters). For developers who find Rust over-constrained and Zig over-explicit for their problem domain, Odin is a coherent alternative.&lt;/p&gt;

&lt;p&gt;Where the assumptions do not match: if formal memory safety is required – safety-critical systems, security-sensitive code – Rust remains the strongest mainstream option. If C ABI stewardship and toolchain integration are the priority, Zig's infrastructure-first approach is stronger. If the workload is not data-heavy – business logic, string processing, network protocol handling – Odin's data-oriented features provide no advantage and its smaller ecosystem becomes a net cost.&lt;/p&gt;




&lt;h2&gt;
  
  
  Limitations
&lt;/h2&gt;

&lt;p&gt;Odin's limitations are real and should be weighed honestly.&lt;/p&gt;

&lt;p&gt;The package ecosystem is substantially smaller than Rust's, Go's, or Zig's. Many domains have no established library. Developers frequently need to write bindings or implement functionality from scratch.&lt;/p&gt;

&lt;p&gt;Odin remains largely the vision of a single designer – Bill Hall. This produces unusual design coherence but also concentrates project risk. There is no Odin Foundation, no corporate sponsor, and no clear succession plan. For organisations evaluating multi-year commitments, the trade-off between coherence and institutional continuity is real.&lt;/p&gt;

&lt;p&gt;The language provides practical safety improvements over C – bounds checking, zero initialisation – but no compile-time memory safety model. For domains where safety guarantees are contractually or regulatorily required, Odin is not a candidate.&lt;/p&gt;

&lt;p&gt;IDE support, debugging, and profiling tools are functional but less mature than the equivalents for Rust, Go, or Zig. The language server (OLS) is under active development but not yet at the level of rust-analyzer or ZLS.&lt;/p&gt;

&lt;p&gt;Odin's association with game development, while accurate to its origins, creates a perception problem. Developers outside the games industry may dismiss it as domain-specific without evaluating its broader applicability. The language is general-purpose in capability but niche in reputation.&lt;/p&gt;

&lt;p&gt;As with Zig, LLM training data for Odin is thin. AI-assisted development – an increasingly material factor in language productivity – is less reliable for Odin than for mainstream languages, and this gap compounds the ecosystem and documentation limitations.&lt;/p&gt;




&lt;h2&gt;
  
  
  Competitors
&lt;/h2&gt;

&lt;p&gt;Zig occupies the closest territory – the same layer of the stack, the same "better C" positioning – but its organising principle is infrastructure and transparency, not data layout. The &lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig article&lt;/a&gt; in this series argued that Zig rebuilt the C toolchain; Odin redesigned the programming model above the C foundation.&lt;/p&gt;

&lt;p&gt;Hare and C3 remain closer to C's original procedure-oriented design and do not prioritise data layout as a language-level concern. Rust's data layout control (&lt;code&gt;repr(C)&lt;/code&gt;, manual packing) exists but is secondary to its ownership model.&lt;/p&gt;

&lt;p&gt;Odin's nearest conceptual relative is not another programming language but the Entity Component System pattern in game engine architecture – a design pattern that Odin promotes to a language-level primitive, chiefly through &lt;code&gt;#soa&lt;/code&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;For fifty years, systems languages have been designed around the question: "What should the programme do next?" Odin asks a different question: "Where does the data live, and how does it get to the processor?"&lt;/p&gt;

&lt;p&gt;That reorientation produces a language that looks superficially like C – manual memory, explicit control, no garbage collector – but is organised around a fundamentally different principle. SOA as a built-in, vectors as primitives, context as implicit data flow, and the absence of per-object lifecycle machinery all follow from the decision to treat the cache line, not the call stack, as the primary unit of design.&lt;/p&gt;

&lt;p&gt;Whether that principle has broad applicability beyond its origins in real-time graphics is an open question. But as hardware continues to widen the gap between compute speed and memory latency, the question Odin asks is becoming harder for the rest of the industry to ignore.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This article is part of an ongoing series examining what programming languages actually are and why they matter.&lt;/em&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Argument&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;C&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The irreplaceable foundation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-python-anyway-7ce"&gt;Python&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The approachable ecosystem&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-rust-anyway-28bh"&gt;Rust&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Safe systems programming&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-clojure-anyway-42np"&gt;Clojure&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Powerful ideas, niche language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-zig-anyway"&gt;Zig&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Rebuild the toolchain, keep the physics&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Odin&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Data layout over control flow&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;Coming next: Nim – where Odin's data-oriented design is a strategy for depth, Nim's compilation to C is a strategy for reach.&lt;/em&gt;&lt;/p&gt;


</description>
      <category>rust</category>
      <category>zig</category>
      <category>languagecomparison</category>
      <category>c</category>
    </item>
    <item>
      <title>Just what IS Zig, Anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Wed, 11 Mar 2026 08:42:11 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-zig-anyway-epa</link>
      <guid>https://forem.com/dimension-ai/just-what-is-zig-anyway-epa</guid>
      <description>&lt;p&gt;&lt;em&gt;And why are people using it without writing a line of it?&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Most programming languages spread because developers write code in them. A developer tries the syntax, likes the ergonomics, builds something, and tells a colleague. Some languages gain traction through adjacent tooling or frameworks – TypeScript through Angular, Kotlin through Android – but the language itself is still the point. The tooling serves the code.&lt;/p&gt;

&lt;p&gt;Zig has followed a different path. One of its most widely discussed real-world uses is &lt;code&gt;zig cc&lt;/code&gt; – a drop-in C and C++ cross-compilation toolchain that teams adopt to compile code written in &lt;em&gt;other&lt;/em&gt; languages, without writing or intending to write a single line of Zig. Uber engineers have adopted it for C++ cross-compilation. The Bun JavaScript runtime chose Zig partly for the language but substantially because the toolchain solved cross-compilation problems that were otherwise intractable. These are prominent examples rather than proof of universal adoption – the evidence base is still developing – but the pattern is consistent enough to demand explanation, particularly for a project that has not yet reached version 1.0. The language enters organisations through the compiler, not through the syntax.&lt;/p&gt;

&lt;p&gt;The standard description of Zig – "a better C" – is accurate in the narrow sense that Zig occupies the same layer of the software stack and shares C's execution model.&lt;/p&gt;

&lt;p&gt;But it obscures the more interesting truth: where Rust fixes C by replacing the language with a fundamentally different &lt;strong&gt;memory model&lt;/strong&gt;, Zig fixes C by rebuilding its &lt;strong&gt;infrastructure&lt;/strong&gt; – the compiler, the build system, the cross-compilation machinery – into a hermetic, portable package, while leaving C's raw execution model and ABI intact.&lt;/p&gt;

&lt;p&gt;Rust replaces C's &lt;strong&gt;code&lt;/strong&gt; but keeps C's &lt;strong&gt;interfaces&lt;/strong&gt;; Zig rebuilds C's &lt;strong&gt;tools&lt;/strong&gt; but keeps C's &lt;strong&gt;physics&lt;/strong&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  The toolchain
&lt;/h2&gt;

&lt;p&gt;The &lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;first article&lt;/a&gt; in this series argued that C's importance derives from the ABI – the universal binary interface that every other language targets at its boundaries. If that is true, then the tool that manages that ABI most competently acquires strategic importance. Zig understood this before most of its competitors did.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;zig cc&lt;/code&gt; bundles a patched Clang/LLVM frontend with platform headers into a single, portable binary. A developer can cross-compile C or C++ to any supported target from any host machine without installing platform-specific toolchains, sysroots, or header packages. The crucial detail is that Zig ships what it calls bundled libc targets – musl and glibc headers for many architectures – inside its own distribution. That is what makes the cross-compilation hermetic rather than merely convenient.&lt;/p&gt;

&lt;p&gt;Without Zig, building a C program for Linux ARM64 from a macOS x86 host typically requires installing a cross-compiler, obtaining the correct sysroot, configuring include and library paths, and debugging linker errors. The process is fragile, poorly documented, and different for every target triple.&lt;/p&gt;

&lt;p&gt;Traditional cross-compilation for Linux ARM64 from macOS:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install a cross-compiler, obtain a sysroot, configure paths...&lt;/span&gt;
brew &lt;span class="nb"&gt;install &lt;/span&gt;aarch64-linux-gnu-gcc
&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;SYSROOT&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;/path/to/aarch64-linux-gnu/sysroot
aarch64-linux-gnu-gcc &lt;span class="nt"&gt;-o&lt;/span&gt; myapp main.c &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;--sysroot&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nv"&gt;$SYSROOT&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-I&lt;/span&gt;&lt;span class="nv"&gt;$SYSROOT&lt;/span&gt;/usr/include &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nt"&gt;-L&lt;/span&gt;&lt;span class="nv"&gt;$SYSROOT&lt;/span&gt;/usr/lib
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The same operation with Zig:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;zig cc &lt;span class="nt"&gt;-target&lt;/span&gt; aarch64-linux-gnu &lt;span class="nt"&gt;-o&lt;/span&gt; myapp main.c
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;zig cc&lt;/code&gt; reduces this to a single command with the target specified as a flag.&lt;/p&gt;

&lt;p&gt;The strategic consequence is unusual. Every other "better C" language asks developers to leave the C ecosystem. Zig absorbs it. A team can adopt &lt;code&gt;zig cc&lt;/code&gt; without writing a single line of Zig, without changing any source code, and without altering any interfaces. The cost of adoption is close to zero. The language becomes available as an option once the toolchain is already in use. The toolchain opens the door; the language follows.&lt;/p&gt;

&lt;p&gt;The contrast with &lt;a href="https://dev.to/dimension-ai/just-what-is-rust-anyway-28bh"&gt;Rust&lt;/a&gt; is instructive. Both languages interoperate with the C ABI, and both use LLVM as a backend. The real difference is the memory model: Rust replaces it; Zig retains it. Rust's FFI requires &lt;code&gt;extern "C"&lt;/code&gt; blocks, unsafe wrappers, and often a binding generator such as bindgen:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="c1"&gt;// Rust: calling C's strlen requires an unsafe block and a manual declaration&lt;/span&gt;
&lt;span class="k"&gt;extern&lt;/span&gt; &lt;span class="s"&gt;"C"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;strlen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="nb"&gt;c_char&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;usize&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;length&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;CStr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;usize&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;unsafe&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nf"&gt;strlen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="nf"&gt;.as_ptr&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;  &lt;span class="c1"&gt;// every call crosses an unsafe boundary&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Zig can &lt;code&gt;@cImport&lt;/code&gt; a C header directly, which works by invoking &lt;code&gt;zig translate-c&lt;/code&gt; under the hood – a mechanical translation step that parses C declarations into Zig types:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;&lt;span class="c"&gt;// Zig: import the C header; call the function as though it were Zig&lt;/span&gt;
&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;@cImport&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;@cInclude&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"string.h"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;

&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="n"&gt;length&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="kt"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="kt"&gt;usize&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;c&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;strlen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  &lt;span class="c"&gt;// no wrapper, no unsafe block, no binding generator&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The interop cost is structurally lower because Zig kept C's memory model and calling conventions.&lt;/p&gt;




&lt;h2&gt;
  
  
  Transparency
&lt;/h2&gt;

&lt;p&gt;If the toolchain is Zig's adoption strategy, transparency is its design philosophy. Zig's stated principle is: no hidden control flow, no hidden allocations. If you do not see a function call in the code, no function call occurs. If you do not see a memory allocation, no memory is allocated. Every operation is visible at the call site.&lt;/p&gt;

&lt;p&gt;This is easier to understand through a concrete comparison. Consider a routine operation: appending an element to a dynamic array.&lt;/p&gt;

&lt;p&gt;In C, &lt;code&gt;realloc&lt;/code&gt; may be called inside a library function. The caller cannot tell from the call site whether memory will be allocated, how much, or what happens on failure. The convention is to check a return value, but nothing in the language enforces this.&lt;/p&gt;

&lt;p&gt;In C++, &lt;code&gt;std::vector::push_back&lt;/code&gt; may reallocate the backing buffer, invoke copy or move constructors, and – if an exception is thrown – unwind the stack through destructors. None of this is visible at the call site. The programmer must know the type's implementation to reason about what happens.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// C++: what happens behind this line?&lt;/span&gt;
&lt;span class="n"&gt;items&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;push_back&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="c1"&gt;// Answer: possibly realloc, possibly copy/move constructors on every element,&lt;/span&gt;
&lt;span class="c1"&gt;// possibly an exception, possibly stack unwinding through destructors.&lt;/span&gt;
&lt;span class="c1"&gt;// The call site tells you none of this.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In Rust, &lt;code&gt;Vec::push&lt;/code&gt; may reallocate. When the old buffer is dropped, &lt;code&gt;Drop&lt;/code&gt; runs automatically. The borrow checker prevents dangling references, but the programmer cannot see from the call site alone what cleanup will occur or when.&lt;/p&gt;

&lt;p&gt;In Zig, the function takes an &lt;code&gt;Allocator&lt;/code&gt; parameter explicitly. The append call returns an error union. If allocation fails, the error is a normal return value in the caller's control flow. No destructors run. No exceptions are thrown. No implicit function calls occur. The call site is a complete description of what the machine will do.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="n"&gt;appendItem&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;list&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ArrayList&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;u32&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;allocator&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;std&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="py"&gt;mem&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="py"&gt;Allocator&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;u32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="n"&gt;list&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;allocator&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;value&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The point is not that the other languages are wrong. Each hides a different amount of machinery behind the call site, and each has reasons for doing so. Zig hides the least. The cognitive load of reasoning about what a line of Zig does is lower than the equivalent line in C++, because there is less invisible behaviour to account for.&lt;/p&gt;

&lt;h3&gt;
  
  
  The allocator pattern
&lt;/h3&gt;

&lt;p&gt;The explicit allocator is central to how this works in practice. Zig's standard library functions that allocate memory require an &lt;code&gt;Allocator&lt;/code&gt; to be passed as a parameter. This single design decision has several consequences.&lt;/p&gt;

&lt;p&gt;Any function signature that includes an &lt;code&gt;Allocator&lt;/code&gt; parameter declares, at the type level, that it may allocate memory. Any function signature that does not include one is guaranteed not to allocate. Tests can substitute a different allocator – a failing allocator, a tracking allocator, a fixed-buffer allocator – without changing the code under test. In systems with strict memory budgets – embedded devices, databases, game engines – the allocator pattern makes resource consumption auditable at the API boundary. And out-of-memory is a normal error return, not an abort or an exception. The caller decides what to do.&lt;/p&gt;

&lt;p&gt;In C, allocation failure is easy to ignore:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight c"&gt;&lt;code&gt;&lt;span class="c1"&gt;// C: nothing forces the programmer to check this&lt;/span&gt;
&lt;span class="kt"&gt;char&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;malloc&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4096&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="n"&gt;memcpy&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;buf&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;src&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;len&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;  &lt;span class="c1"&gt;// if malloc returned NULL, this is undefined behaviour&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In Zig, the error is part of the return type and the compiler will not let the caller ignore it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;&lt;span class="c"&gt;// Zig: the caller must handle the error or explicitly propagate it&lt;/span&gt;
&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;buf&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;allocator&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;alloc&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;4096&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;catch&lt;/span&gt; &lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="c"&gt;// OOM is a normal control flow path, not an abort&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;err&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Resource exhaustion becomes a first-class engineering concern rather than a runtime surprise.&lt;/p&gt;

&lt;p&gt;The trade-off must be stated honestly. Zig does not prevent use-after-free, dangling pointers, or data races at compile time. It keeps C's physics – including the dangers that come with them – and the programmer remains responsible for memory correctness. Zig's position is that if every operation is visible and every allocation is explicit, the programmer can reason about correctness directly – and that this is preferable, for certain classes of work, to the indirection and constraint of a borrow checker. Whether that bet pays off at scale is an open question. Zig's largest production users – TigerBeetle and Bun among them – suggest it can, but the evidence base is still narrow compared to Rust's.&lt;/p&gt;




&lt;h2&gt;
  
  
  Comptime
&lt;/h2&gt;

&lt;p&gt;Comptime is the mechanism that keeps the language small despite its toolchain ambitions. Zig's toolchain does a great deal; the language itself does remarkably little, because comptime collapses several distinct features into one.&lt;/p&gt;

&lt;p&gt;Any expression or block in Zig can be marked &lt;code&gt;comptime&lt;/code&gt;, which instructs the compiler to evaluate it during compilation rather than at runtime. The code that runs at compile time is ordinary Zig – the same syntax, the same semantics, the same standard library. There is no separate macro language, no template metalanguage, no preprocessor.&lt;/p&gt;
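
&lt;p&gt;A minimal illustration – the function below is ours, not from the standard library – of the same code running in both worlds:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;// Ordinary Zig: same syntax and semantics at compile time and at runtime.
fn fib(n: u64) u64 {
    return if (n &amp;lt; 2) n else fib(n - 1) + fib(n - 2);
}

// Evaluated during compilation; the result is baked into the binary.
const fib_20 = comptime fib(20);

comptime {
    // Checked by the compiler, not at runtime.
    if (fib_20 != 6765) @compileError("arithmetic went wrong");
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;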

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Compile-time mechanisms&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;Preprocessor (&lt;code&gt;#define&lt;/code&gt;, &lt;code&gt;#ifdef&lt;/code&gt;)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;Templates, constexpr, macros&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;Generics, proc macros, const fn, build.rs&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;comptime (one mechanism)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The structural consequence is most visible in how generics work. Zig does not have a generics subsystem. In Rust, generics are a distinct wing of the compiler architecture with their own syntax, trait bounds, and monomorphisation rules. In C++, templates are an entire sub-language with its own error messages, its own instantiation model, and its own debugging challenges:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// C++: a generic linked list requires the template sub-language&lt;/span&gt;
&lt;span class="k"&gt;template&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;typename&lt;/span&gt; &lt;span class="nc"&gt;T&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="nc"&gt;LinkedList&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="nc"&gt;Node&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;T&lt;/span&gt; &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="n"&gt;Node&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;nullptr&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;
    &lt;span class="n"&gt;Node&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;nullptr&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="c1"&gt;// Errors in templates produce notoriously unreadable diagnostics.&lt;/span&gt;
&lt;span class="c1"&gt;// The template system is a distinct language-within-a-language.&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In Zig, a generic data structure is a function that takes a &lt;code&gt;type&lt;/code&gt; parameter at comptime and returns a struct:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="n"&gt;LinkedList&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;comptime&lt;/span&gt; &lt;span class="n"&gt;T&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;type&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="n"&gt;Node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;struct&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;T&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="n"&gt;next&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;?*&lt;/span&gt;&lt;span class="n"&gt;Node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;?*&lt;/span&gt;&lt;span class="n"&gt;Node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
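
&lt;p&gt;Using the definition above, instantiation is just a call – the function is evaluated once at compile time and the resulting struct type is memoised:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;// An ordinary function call that happens to return a type.
const IntList = LinkedList(i32);

var list = IntList{};                   // head defaults to null
var node = IntList.Node{ .data = 42 };
list.head = &amp;amp;node;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;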



&lt;p&gt;That entire architectural wing – generics, template specialisation, trait resolution – is deleted and replaced by a function call that happens to run at compile time. One mechanism does the work of four. This is simplicity in the sense the &lt;a href="https://dev.to/dimension-ai/just-what-is-clojure-anyway-42np"&gt;Clojure article&lt;/a&gt; in this series discussed – fewer entangled concepts, fewer distinct syntactic forms – rather than mere ease of use.&lt;/p&gt;

&lt;p&gt;The build system is where comptime meets the toolchain. Zig's build system is a Zig program – not a separate tool like Make, not a separate language like CMake, not a domain-specific configuration format like Cargo.toml. Because the build system is Zig, it has full access to comptime, to the standard library, and to the same cross-compilation targets as the compiler. It can orchestrate mixed C/C++/Zig builds, conditional compilation, dependency fetching, and platform-specific logic – all in one language, in one file. This is the structural link between the toolchain story and the language design: the build system is where they meet.&lt;/p&gt;
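
&lt;p&gt;A minimal &lt;code&gt;build.zig&lt;/code&gt; sketch, written against the 0.13-era API – the build interface has changed between pre-1.0 releases, and the names and paths here are hypothetical, so treat it as illustrative:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight zig"&gt;&lt;code&gt;const std = @import("std");

pub fn build(b: *std.Build) void {
    // Target and optimisation mode come from the command line.
    const target = b.standardTargetOptions(.{});
    const optimize = b.standardOptimizeOption(.{});

    const exe = b.addExecutable(.{
        .name = "analytics",
        .root_source_file = b.path("src/main.zig"),
        .target = target,
        .optimize = optimize,
    });

    // Mixed C/Zig builds: compile and link the C parts directly.
    exe.addCSourceFile(.{ .file = b.path("src/legacy.c"), .flags = &amp;amp;.{"-O2"} });
    exe.linkLibC();

    b.installArtifact(exe);
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The file is an ordinary Zig program: plain control flow and the standard library replace the conditional logic that Make or CMake would express in their own languages.&lt;/p&gt;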




&lt;h2&gt;
  
  
  Where Zig's design assumptions match reality
&lt;/h2&gt;

&lt;p&gt;Three conditions align with Zig's architecture.&lt;/p&gt;

&lt;p&gt;The first is an existing C or C++ codebase that must be maintained, not rewritten. The codebase is too large to port to Rust. The build system is fragile. Cross-compilation is manual and error-prone. Zig enters as the toolchain first – &lt;code&gt;zig cc&lt;/code&gt; replaces the existing compiler without changing any source code. New modules can then be written in Zig and linked against the existing C code with no FFI layer, because Zig can &lt;code&gt;@cImport&lt;/code&gt; C headers directly. This is a gradual adoption path with near-zero switching cost, and of the three conditions, it is the strongest match for Zig's design.&lt;/p&gt;
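
&lt;p&gt;The entry point can be a single substitution in the existing build – no source changes required (the use of &lt;code&gt;make&lt;/code&gt; here is illustrative; any build system that honours &lt;code&gt;CC&lt;/code&gt; works the same way):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Drop-in: zig cc accepts the familiar clang/gcc flags
CC="zig cc" make

# The same sources, cross-compiled, with no extra toolchain installed
CC="zig cc -target aarch64-linux-gnu" make
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;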

&lt;p&gt;The second is deterministic, resource-constrained systems: databases, game engines, embedded firmware, custom allocators – systems where garbage collection pauses are unacceptable and where the borrow checker's constraints impose friction on the data structures the problem requires. Intrusive linked lists, arena allocators, and memory-mapped I/O are all straightforward in Zig and often require &lt;code&gt;unsafe&lt;/code&gt; abstractions in Rust. The explicit allocator pattern makes resource budgets visible and OOM handling a normal code path.&lt;/p&gt;

&lt;p&gt;The third is cross-platform binary distribution from a single build environment. A CLI tool, a library, or a runtime that must ship to multiple OS/architecture combinations from a single CI pipeline. Zig's hermetic cross-compilation produces static binaries for each target without requiring platform-specific toolchains on the build machine – and notably, it makes static linking against glibc significantly easier than standard toolchains do, which has historically been a major pain point for C developers targeting Linux. What would otherwise be a multi-day infrastructure project becomes a build flag.&lt;/p&gt;
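
&lt;p&gt;Concretely, shipping to several targets from one machine is a matter of flags (the file name is hypothetical):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Three binaries from a single build host, no cross-toolchains installed
zig build-exe tool.zig -O ReleaseSafe -target x86_64-linux-musl
zig build-exe tool.zig -O ReleaseSafe -target x86_64-windows-gnu
zig build-exe tool.zig -O ReleaseSafe -target aarch64-macos
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;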

&lt;p&gt;Where Zig's assumptions do not match: if compile-time memory safety guarantees are required, Rust provides them and Zig does not. If a mature ecosystem with extensive library coverage is needed, Rust and Go are years ahead. If API stability between releases is essential, Zig is not there yet. These are not minor caveats.&lt;/p&gt;




&lt;h2&gt;
  
  
  Limitations
&lt;/h2&gt;

&lt;p&gt;Zig's limitations are significant and should be weighed honestly.&lt;/p&gt;

&lt;p&gt;The language is pre-1.0. The standard library APIs have broken between releases and will continue to do so until the specification stabilises. Code written against Zig 0.11 may not compile on 0.13. For production systems, this imposes a maintenance cost that must be weighed against the language's benefits.&lt;/p&gt;

&lt;p&gt;Documentation is incomplete. The standard library documentation is auto-generated and sparse. The language reference is thorough but dense. There is no equivalent of the Rust Book or Go Tour – no structured onboarding path for new developers. The primary learning resource remains reading the standard library source code, which is a high barrier to entry.&lt;/p&gt;

&lt;p&gt;The package ecosystem is young. Many common libraries – HTTP, JSON, database drivers – exist but are maintained by small teams or individuals. The depth and breadth of crates.io or PyPI is years away. IDE support via ZLS (the Zig Language Server) is functional but less complete than rust-analyzer or gopls. Debugging and profiling support exists through LLVM but is not as well documented as for C or Rust.&lt;/p&gt;

&lt;p&gt;There is also the question of AI-assisted development. LLM coding tools are trained predominantly on C, Python, JavaScript, Java, and Rust. Zig's representation in training data is thin. AI-assisted development – an increasingly material factor in language productivity – is less reliable for Zig than for mainstream languages. This gap is likely to narrow, but it has not done so yet, and for teams that rely heavily on agentic coding workflows the practical impact is real.&lt;/p&gt;




&lt;h2&gt;
  
  
  Competitors
&lt;/h2&gt;

&lt;p&gt;Zig is not the only language attempting to improve on C. Odin prioritises data-oriented design and developer ergonomics – while Zig rebuilds the toolchain for the programmer's environment, Odin builds a better language for the programmer's data. Hare prioritises minimalism and stability, with its own compiler backend (QBE) and a stated intention to freeze the language specification at 1.0. Nim transpiles to C and therefore operates as part of the C infrastructure by definition, though it relies on the host's C compiler rather than shipping its own. C3 stays closest to C's syntax while fixing the preprocessor and adding contracts.&lt;/p&gt;

&lt;p&gt;Each made different strategic choices about the toolchain question. Zig is the most prominent language to ship a drop-in C/C++ compiler as a first-class, maintained component of its distribution.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Zig's strategy for entering the systems programming landscape was to recognise that C's dominance rests on infrastructure – the ABI, the toolchain, the build process – not on the quality of the language. Rather than building a better language and hoping the infrastructure would follow, Zig rebuilt the infrastructure and let the language follow.&lt;/p&gt;

&lt;p&gt;Whether that strategy succeeds long-term depends on reaching 1.0, stabilising the API, and building an ecosystem deep enough to sustain production use at scale. Those are open questions. But the approach itself – rebuilding the toolchain while keeping the physics – is the most original strategic move in systems language design since Rust replaced the memory model while keeping the ABI.&lt;/p&gt;

&lt;p&gt;In systems programming, the compiler has always mattered more than the language. Zig is built on that conviction.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This article is part of an ongoing series examining what programming languages actually are and why they matter.&lt;/em&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Argument&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;C&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The irreplaceable foundation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-python-anyway-7ce"&gt;Python&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The approachable ecosystem&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-rust-anyway-28bh"&gt;Rust&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Safe systems programming&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-clojure-anyway-42np"&gt;Clojure&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Powerful ideas, niche language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Zig&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Rebuild the toolchain, keep the physics&lt;/strong&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;Coming next: Odin – a language for developers who ship software, not papers about shipping software.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>zig</category>
      <category>c</category>
      <category>rust</category>
      <category>languagecomparison</category>
    </item>
    <item>
      <title>Just What IS Clojure, Anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sun, 08 Mar 2026 17:44:46 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-clojure-anyway-42np</link>
      <guid>https://forem.com/dimension-ai/just-what-is-clojure-anyway-42np</guid>
      <description>&lt;p&gt;Look at this line of code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="nf"&gt;processCustomerOrder&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;customer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;orderItems&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Any developer with six months of experience knows roughly what that does. The name is explicit, the structure is familiar, the intent is readable. Now look at this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;reduce&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;map&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;xs&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The reaction most developers have is immediate and unfavourable. Parentheses everywhere. No obvious structure. It looks less like a programming language and more like a typographer's accident. The old joke writes itself: LISP stands for Lost In Stupid Parentheses.&lt;/p&gt;

&lt;p&gt;That joke is, technically, a backronym. John McCarthy named it LISP as a contraction of &lt;em&gt;LISt Processing&lt;/em&gt; when he created it in 1958. The sardonic expansion came later, coined by programmers who had opinions about the aesthetic choices involved. Those opinions have not mellowed with time.&lt;/p&gt;

&lt;p&gt;And yet Clojure – a modern descendant of Lisp – ranked as one of the highest-paying languages in the Stack Overflow Developer Survey for several consecutive years around 2019. Developers walked away from stable Java and C# positions to build production systems in it. A Brazilian fintech used it to serve tens of millions of customers. Something requires explaining.&lt;/p&gt;




&lt;h2&gt;
  
  
  The ancestry: Lisp reborn
&lt;/h2&gt;

&lt;p&gt;Clojure only makes sense against the background of Lisp, and Lisp only makes sense as what it actually was: not merely a programming language, but a direct implementation of mathematical ideas about computation.&lt;/p&gt;

&lt;p&gt;McCarthy's 1958 creation introduced concepts that took the rest of the industry decades to absorb. Garbage collection, conditional expressions, functional programming, symbolic computation – all present in Lisp before most working developers today were born. Many programmers encounter Lisp's descendants daily without being aware of it.&lt;/p&gt;

&lt;p&gt;The defining feature is the S-expression:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Everything is written as a list. This is not merely a syntactic preference. Because code and data share the same underlying structure, a Lisp program can manipulate other programs directly. This property – &lt;em&gt;homoiconicity&lt;/em&gt; – is the technical foundation of Lisp macros: code that generates and transforms other code at compile time, with a flexibility that few conventional infix languages match. It is the reason serious Lisp practitioners regard the syntax not as a historical curiosity but as a genuine technical advantage.&lt;/p&gt;
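
&lt;p&gt;A small illustration of what homoiconicity buys – the macro below is a textbook example, not part of Clojure's core library:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;;; A macro receives code as data and returns new code as data.
(defmacro unless [test then]
  `(if (not ~test) ~then))

(unless false (println "this runs"))
;; expands, before evaluation, to: (if (not false) (println "this runs"))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;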

&lt;p&gt;Lisp also, however, developed a reputation for producing work that individual experts could write brilliantly and teams could not maintain at all. The tension between expressive power and collective readability never fully resolved. Clojure inherits this tradition knowingly, aware of the cost.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Clojure actually is
&lt;/h2&gt;

&lt;p&gt;Rich Hickey created Clojure in 2007. His central design decision was not to build a new runtime from scratch but to attach Lisp to an existing ecosystem.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Layer&lt;/th&gt;
&lt;th&gt;Technology&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Runtime&lt;/td&gt;
&lt;td&gt;JVM&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Libraries&lt;/td&gt;
&lt;td&gt;Java ecosystem&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Language model&lt;/td&gt;
&lt;td&gt;Lisp&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;This host strategy gave Clojure immediate access to decades of mature Java libraries without needing to rebuild any of them. A Clojure developer can call Java code directly. The same logic drove two later variants: ClojureScript, which compiles to JavaScript and found real traction in teams already working with React, and ClojureCLR, which runs on .NET. Rather than fight the unwinnable battle of building its own ecosystem from scratch, Clojure attached itself to three of the largest ones that already existed.&lt;/p&gt;

&lt;p&gt;Clojure does not attempt to displace existing ecosystems. It operates inside them.&lt;/p&gt;

&lt;p&gt;Central to how Clojure development actually works is the REPL – Read–Eval–Print Loop. Rather than the standard write–compile–run–crash cycle, developers send code fragments to a running system and modify it live. Functions are redefined while the application continues executing. For experienced practitioners this is a material productivity difference: the feedback loop is short, and the distance between an idea and a tested result is small. Experienced Clojure developers report unusually low defect rates, a claim that is plausible given the constraints immutability places on the ways a programme can fail.&lt;/p&gt;




&lt;h2&gt;
  
  
  The Hickey doctrine: simple versus easy
&lt;/h2&gt;

&lt;p&gt;Hickey's 2011 Strange Loop talk &lt;em&gt;Simple Made Easy&lt;/em&gt; is the philosophical engine behind every design choice in Clojure. It draws a distinction that most language design ignores.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Term&lt;/th&gt;
&lt;th&gt;Meaning&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Easy&lt;/td&gt;
&lt;td&gt;Familiar; close to what you already know&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Simple&lt;/td&gt;
&lt;td&gt;Not intertwined; concerns kept separate&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Most languages pursue &lt;em&gt;easy&lt;/em&gt;. They aim to resemble natural language, minimise cognitive friction at the point of learning, and reduce the effort required to write the first working programme. This is also why the languages that read most naturally to humans tend to be the hardest to write parsers and compilers for.&lt;/p&gt;

&lt;p&gt;Clojure instead pursues &lt;em&gt;simple&lt;/em&gt;. Its goal is to minimise tangled interdependencies in the resulting system, even at the cost of an unfamiliar surface. Writing parsers for Lisps is comparatively straightforward, at the cost of human readability.&lt;/p&gt;

&lt;p&gt;Hickey's specific target is what he calls place-oriented programming: the treatment of variables as named locations in memory whose values change over time – mutability, in more formal terms. His argument is that conflating a &lt;em&gt;value&lt;/em&gt; with a &lt;em&gt;location&lt;/em&gt; generates incidental complexity at scale, particularly in concurrent systems. When you cannot be certain what a variable contains at a given moment, reasoning about a programme becomes difficult in proportion to the programme's size.&lt;/p&gt;

&lt;p&gt;The design of Clojure follows directly from this diagnosis. Immutable data, functional composition, minimal syntax, and data structures in place of object hierarchies are all consequences of the same underlying position. The language may not feel easy. The resulting systems are intended to be genuinely simpler to reason about.&lt;/p&gt;




&lt;h2&gt;
  
  
  The real innovation: data and immutability
&lt;/h2&gt;

&lt;p&gt;Clojure's core model is data-oriented. Rather than building class hierarchies, programmes pass simple structures through functions:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;assoc&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="no"&gt;:name&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;"Alice"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="no"&gt;:age&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="no"&gt;:city&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s"&gt;"London"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This creates a new map. The original is untouched. That is the default behaviour across all of Clojure's data structures – values do not change; new versions are produced instead.&lt;/p&gt;

&lt;p&gt;This is made practical by &lt;em&gt;persistent data structures&lt;/em&gt;, which use structural sharing. When a new version of a data structure is produced, it shares most of its internal memory with the previous version rather than copying it entirely. The comparison that makes this intuitive for most developers: Git does not delete your previous commits when you push a new one. It stores only the difference, referencing unchanged content from before. Clojure applies the same principle to in-memory data.&lt;/p&gt;
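
&lt;p&gt;The behaviour is easy to demonstrate at the REPL:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;(def alice {:name "Alice" :age 30})
(def alice-in-london (assoc alice :city "London"))

alice            ;; =&amp;gt; {:name "Alice", :age 30} – unchanged
alice-in-london  ;; =&amp;gt; {:name "Alice", :age 30, :city "London"}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Internally the two maps share structure; only the difference is stored.&lt;/p&gt;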

&lt;p&gt;The consequence for concurrency follows directly. Race conditions require mutable shared state. If data cannot be mutated, the precondition for the most common class of concurrency bug does not exist. This was Clojure's most compelling practical argument during the multicore boom of the 2010s, when writing correct concurrent code had become a routine industrial concern rather than a specialist one. Clojure let developers eliminate that entire class of problem.&lt;/p&gt;




&lt;h2&gt;
  
  
  The functional programming wave – and why easy beat rigorous
&lt;/h2&gt;

&lt;p&gt;Between roughly 2012 and 2020, functional programming moved from academic discussion to genuine industry interest. The drivers were concrete: multicore processors created pressure to write concurrent code correctly; distributed data systems required reasoning about transformation pipelines rather than mutable state; and the sheer complexity of large-scale software made the promise of mathematical rigour appealing.&lt;/p&gt;

&lt;p&gt;Clojure was among the most visible representatives of this movement, alongside Haskell, Scala, and F#. Conference talks filled. Engineering blogs ran long series on immutability and monads. For a period it seemed plausible that functional languages might displace the mainstream ones.&lt;/p&gt;

&lt;p&gt;What actually happened was different. Mainstream languages absorbed the useful ideas and continued. And the majority of working programmers, it turned out, rarely needed to reason about threading and concurrency at all.&lt;/p&gt;

&lt;p&gt;Java gained streams and lambdas in Java 8. JavaScript acquired map, filter, and reduce as first-class patterns, and React popularised unidirectional data flow. C# extended its functional capabilities across successive versions. Rust built immutability and ownership into its type system from the outset. The industry did not convert to functional programming – it extracted what it needed and kept the syntax it already knew.&lt;/p&gt;

&lt;p&gt;A developer who can obtain most of functional programming's benefits inside a language they already know will rarely conclude that switching entirely is justified.&lt;/p&gt;

&lt;p&gt;The deeper reason functional languages lost the mainstream argument is not technical. It is sociological. Python won because it is, in the most precise sense, the Visual Basic of the current era. That comparison is not an insult – Visual Basic dominated the 1990s because it made programming accessible to people who had no intention of becoming professional developers, and that accessibility produced an enormous, self-reinforcing community. Python did exactly the same thing for data scientists, academics, hobbyists, and beginners, and for precisely the same reason: it is easy to learn, forgiving of error, and immediately rewarding to write. Network effects took care of the rest. Libraries multiplied. Courses proliferated. Employers specified it. The ecosystem became self-sustaining.&lt;/p&gt;

&lt;p&gt;Clojure is the antithesis of this process. It is a language for connoisseurs – genuinely, not dismissively. Its internal consistency is elegant, its theoretical foundations are sound, and developers who master it frequently describe it with something approaching aesthetic appreciation. Mathematical beauty, however, has never been a reliable route to mass adoption. Narrow appeal does not generate network effects. And Clojure, by design, operates as something of a lone wolf: it rides atop the JVM rather than integrating natively with the broader currents of modern computing – the web-first tooling, the AI infrastructure, the vast collaborative ecosystems built around Python and JavaScript. At a moment when the decisive advantages in software development come from connectivity, interoperability, and the accumulated weight of shared tooling, a language that demands a clean break from everything a developer already knows is swimming directly against the tide.&lt;/p&gt;

&lt;p&gt;Compare this with Kotlin or TypeScript, both of which succeeded in part because they offered a graduated path. A developer new to Kotlin can write essentially Java-style code and improve incrementally. A developer new to TypeScript can begin with plain JavaScript and add types as confidence grows. Both languages have, in effect, a beginner mode. Clojure has no such thing. You either think in Lisp or you do not write Clojure at all.&lt;/p&gt;




&lt;h2&gt;
  
  
  Where Clojure succeeded
&lt;/h2&gt;

&lt;p&gt;Despite remaining a specialist language, Clojure has real industrial presence.&lt;/p&gt;

&lt;p&gt;The most prominent example is Nubank, a Brazilian fintech that reached a valuation of approximately $45 billion at its NYSE listing in December 2021. Nubank runs significant portions of its backend in Clojure, and in 2020 acquired Cognitect – the company that stewards the language. That acquisition was considerably more than a gesture; it was a statement of long-term commitment from an organisation operating at scale.&lt;/p&gt;

&lt;p&gt;ClojureScript found parallel influence in the JavaScript ecosystem. The Reagent and re-frame frameworks attracted serious production use, demonstrating that the Clojure model could be applied to front-end development at scale and not merely to backend data pipelines.&lt;/p&gt;

&lt;p&gt;The pattern that emerges from successful Clojure deployments is consistent: small, experienced teams working on data-intensive systems where correctness and concurrency matter more than onboarding speed. That is a narrow niche. It was also, not coincidentally, a well-paid one – for a time.&lt;/p&gt;




&lt;h2&gt;
  
  
  Verdict: the ideas won
&lt;/h2&gt;

&lt;p&gt;Clojure, despite its brilliance and initial buzz, has not become a mainstream language. By any measure of adoption – survey rankings, job advertisements, GitHub repositories – it remains firmly in specialist territory. Even F#, a functional rival with the full weight of Microsoft's backing – and a gentler syntax – has not broken through.&lt;/p&gt;

&lt;p&gt;But the arguments Clojure made in 2007 have largely been vindicated. Immutability is now a design principle in Rust, Swift, and Kotlin. Functional composition is standard across modern JavaScript and C#. Data-oriented design has become an explicit architectural pattern in game development and systems programming. The industry did not adopt Clojure, but it has been grateful for Hickey's ideas and has in fact hungrily absorbed them.&lt;/p&gt;
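&lt;p&gt;A small illustration of how far the idea travelled. In Rust – one of the languages named above – immutability is the default and mutation must be requested explicitly, a direct inversion of the C and Java tradition:&lt;/p&gt;

```rust
fn main() {
    let x = 5;        // immutable by default: writing `x = 6` here would not compile
    let mut y = x;    // mutation is opt-in, marked with `mut`
    y += 1;
    assert_eq!(y, 6); // only `y`, explicitly declared mutable, has changed
}
```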

&lt;p&gt;What did not transfer was the syntax – and behind the syntax lay an economic problem that no philosophical vindication could resolve.&lt;/p&gt;

&lt;p&gt;Developers and Heads of Technology might love Clojure for its expressive power and productivity. But a CTO and their C-suite peers evaluating a language do not ask only whether it is technically sound. The questions are: how large is the available talent pool? How long does onboarding take? What happens when a key developer leaves? Does it leave us exposed to key person risk? Clojure's answers to all of these questions are uncomfortable.&lt;/p&gt;

&lt;p&gt;There is a further cost that rarely appears in language comparisons. A developer with ten years of experience in Java, C#, or Python carries genuine accumulated capital: hard-won familiarity with idioms, libraries, failure modes, and tooling. Switching to a Lisp-derived language does not extend that knowledge – it resets it. Clojure keeps the JVM underneath but discards almost everything a developer has learned about how to structure solutions idiomatically. The ten-year veteran spends their first six months feeling like a junior again. Recursion replaces loops. Immutable pipelines replace stateful objects. The mental models that took years to build are, at best, partially transferable. Some veterans, after decades in imperative and OOP styles, never complete the transition to functional thinking at all. That cost and those risks are real and largely invisible in adoption discussions, and they fall on precisely the experienced developers an organisation most wants to retain. Knowledge compounds most effectively when it is built upon incrementally. Clojure does not permit that. It demands a clean break, and most organisations and most developers are not willing to pay that price.&lt;/p&gt;

&lt;p&gt;The high wages Clojure commanded were not, from a management perspective, a straightforward mark of quality. They were a warning of risk, of "god programmer" lock-in. They reflected something less flattering than productivity: the classic dynamic of the expert who becomes indispensable by writing systems that only they can maintain. At its worst this approaches a form of institutional capture – a codebase so entangled with one person's idiom that replacing them becomes prohibitively expensive, something uncomfortably close to ransomware in its commercial effect.&lt;/p&gt;

&lt;p&gt;That position has been further undermined by the rise of agentic coding tools. The practical value of writing in a mainstream language has dramatically increased, because AI coding assistants are trained on the accumulated body of code that exists – and that body is overwhelmingly Python, JavaScript, Java, and C#. The effect is concrete: ask a capable model to produce a complex data transformation in Python and it draws on an enormous foundation of high-quality examples. Ask it to do the same in idiomatic Clojure and the results are less reliable, the suggestions thinner, the tooling shallower. It might have to go through several rounds of trial-and-error. A language's effective learnability in 2026 is no longer a matter only of human cognition; it is also a function of training density. Niche languages are niche in the training data too, and that gap compounds. The expert moat – already questionable on organisational grounds – is being drained from two directions at once.&lt;/p&gt;

&lt;p&gt;Clojure's ideas spread widely and rapidly through the languages that absorbed them and left the parentheses behind. Its practitioners, once among the best-paid developers in the industry, now find that the scarcity premium they commanded rested partly on barriers that no longer hold.&lt;/p&gt;




&lt;p&gt;So, just what is Clojure, anyway? It is a language that was correct about the most important questions in software design, arrived a decade before the industry was ready to hear the answers, and expressed those answers in a notation the industry was not willing to learn.&lt;/p&gt;

&lt;p&gt;The language was right about the future of programming. It might not be part of that future now that it has arrived. But it rightly deserves a place in computing history.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;This article is part of an ongoing series examining what programming languages actually are and why they matter.&lt;/em&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Argument&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;C&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The irreplaceable foundation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-ai/just-what-is-python-anyway-7ce"&gt;Python&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;The approachable language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;a href="https://dev.to/dimension-zero/just-what-is-rust-anyway-28bh"&gt;Rust&lt;/a&gt;&lt;/td&gt;
&lt;td&gt;Safe systems programming&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Clojure&lt;/td&gt;
&lt;td&gt;Powerful ideas, niche language&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;Coming next: Zig, Odin, and Nim – three languages that think C's job could be done better, and have very different ideas about how.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>clojure</category>
    </item>
    <item>
      <title>Just What IS Rust, Anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sat, 07 Mar 2026 17:20:42 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-rust-anyway-28bh</link>
      <guid>https://forem.com/dimension-ai/just-what-is-rust-anyway-28bh</guid>
      <description>&lt;p&gt;&lt;em&gt;Any why does everyone say it's hard?&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Every mainstream language fits a mental slot. C is a systems language. Python is a runtime ecosystem that excels at orchestration. JavaScript is a browser language that escaped into the server.&lt;/p&gt;

&lt;p&gt;Not quite: Rust doesn't fit the usual slots.&lt;/p&gt;

&lt;p&gt;Where Python is hard to classify because it does so many things loosely, Rust is hard to classify for the opposite reason: it does one thing with unusual rigour.&lt;/p&gt;

&lt;p&gt;Developers who haven't used it have heard the same three adjectives: fast, safe, and difficult. That combination is no accident; it is entirely deliberate. Understanding why those three things travel together is the key to understanding what Rust actually is.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://dev.to/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j"&gt;first article&lt;/a&gt; in this series argued that C's importance derives not from the language itself but from the ABI — the universal binary interface that every other language targets at its boundaries. It dominates, half a century on, through network effects – much as English became the world's language.&lt;/p&gt;

&lt;p&gt;The &lt;a href="https://dev.to/dimension-ai/just-what-is-python-anyway-7ce"&gt;second&lt;/a&gt; argued that Python is best understood as a runtime ecosystem (alongside .NET and the JVM) rather than a language with a clean output artefact.&lt;/p&gt;

&lt;p&gt;This third article explores how Rust sits at the intersection of both arguments. It is a systems language that takes the C ABI seriously, and its design is most clearly understood once that context is established.&lt;/p&gt;




&lt;h2&gt;
  
  
  The problem Rust was built to solve
&lt;/h2&gt;

&lt;p&gt;The dominant categories of serious software vulnerability — buffer overflows, use-after-free, dangling pointers, data races — are not programming errors in the usual sense. They are &lt;em&gt;permitted operations&lt;/em&gt; in C and C++. The language allows them. The programmer is the last line of defence, and programmers are not reliable at scale.&lt;/p&gt;

&lt;p&gt;This is not a minor concern. Microsoft disclosed in 2019 that roughly 70% of the CVEs it addressed in the preceding decade were memory safety issues. Google reported similar figures for the Chrome codebase. These are not exotic edge cases; they are the default failure mode of systems written in C and C++.&lt;/p&gt;

&lt;p&gt;Languages that solved this earlier — Java, C#, Go — did so with garbage collectors and managed runtimes. That works, but the cost is real: non-deterministic pauses, memory overhead, and loss of control over how data is laid out. For application development those costs are usually acceptable. For kernels, databases, real-time systems, and network infrastructure they frequently are not.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Approach&lt;/th&gt;
&lt;th&gt;Examples&lt;/th&gt;
&lt;th&gt;Cost of safety&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Human discipline&lt;/td&gt;
&lt;td&gt;C, C++&lt;/td&gt;
&lt;td&gt;Vulnerabilities (CVEs)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Runtime (GC / interpreter)&lt;/td&gt;
&lt;td&gt;Python, Java, C#, Go&lt;/td&gt;
&lt;td&gt;Latency and memory overhead&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Static analysis (type system)&lt;/td&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;Development time and compile time&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Rust proposes that the third row is achievable: memory safety enforced by the compiler, with no garbage collector, no managed runtime, and no runtime cost. The question is what that actually requires.&lt;/p&gt;




&lt;h2&gt;
  
  
  The organising insight: correctness moves into the type system
&lt;/h2&gt;

&lt;p&gt;Every language makes a choice about &lt;em&gt;when&lt;/em&gt; to catch errors. Dynamic languages catch them at runtime — when the programme is already running. Statically typed compiled languages catch type errors at compile time. Rust extends this principle as far as it can be extended: memory safety, aliasing rules, and concurrency correctness are all encoded in the type system and verified before the programme runs.&lt;/p&gt;

&lt;p&gt;This is not merely a technical decision. It represents a different understanding of what a compiler is &lt;em&gt;for&lt;/em&gt;. In most languages, the compiler translates code that looks correct into code that runs. In Rust, the compiler's job includes refusing to translate code that could behave incorrectly at runtime — even if the same code would compile without complaint in C or C++.&lt;/p&gt;

&lt;p&gt;The Rust compiler is therefore closer in spirit to a formal verification tool than to a conventional translator. Ownership, borrowing, and lifetimes are not quirks of the language's syntax or design accidents. They are the mechanism by which the type system reasons about memory at compile time.&lt;/p&gt;




&lt;h2&gt;
  
  
  The three mechanisms
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Ownership
&lt;/h3&gt;

&lt;p&gt;Every value in memory has exactly one owner at any given time. When ownership is transferred to another binding, the original becomes invalid. The compiler enforces this statically.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;String&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"hello"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;t&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;                      &lt;span class="c1"&gt;// ownership moves to t&lt;/span&gt;
&lt;span class="nd"&gt;println!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"{}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;              &lt;span class="c1"&gt;// compile error: s is no longer valid&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is not a runtime check. The compiler refuses to produce this programme. The consequence is that double-free errors become structurally impossible: if only one binding owns the memory, only one thing can free it, and that happens automatically when the owner goes out of scope.&lt;/p&gt;
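&lt;p&gt;The error above has two idiomatic resolutions: borrow the value instead of moving it, or clone it when an independent copy is genuinely needed. A minimal sketch (the helper function is illustrative):&lt;/p&gt;

```rust
// Borrowing grants temporary read access; ownership stays with the caller.
fn describe(s: &str) -> usize {
    s.len()
}

fn main() {
    let s = String::from("hello");
    let n = describe(&s); // immutable borrow: `s` is still valid afterwards
    let t = s.clone();    // independent copy: `s` and `t` each own their own memory
    println!("{s} {t} {n}");
}
```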

&lt;h3&gt;
  
  
  Borrowing
&lt;/h3&gt;

&lt;p&gt;Rather than transferring ownership, code can borrow temporary access to a value. Rust permits either multiple simultaneous read-only borrows, or a single read-write borrow — but never both at once.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;v&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nd"&gt;vec!&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;                     &lt;span class="c1"&gt;// immutable borrow begins here&lt;/span&gt;
&lt;span class="n"&gt;v&lt;/span&gt;&lt;span class="nf"&gt;.push&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;                      &lt;span class="c1"&gt;// compile error: cannot mutate while borrowed&lt;/span&gt;
&lt;span class="nd"&gt;println!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"{:?}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This rule is the compile-time equivalent of a readers-writer lock. It eliminates data races before the programme runs: if the compiler can see that two threads could hold conflicting access to the same data, it refuses to compile the code. In an era where every server runs on multi-core hardware and concurrent execution is the norm rather than the exception, this guarantee matters. C requires the programmer to reason about thread safety manually. Python sidesteps the problem with the Global Interpreter Lock, which prevents true parallelism. Rust encodes the constraint in the type system and enforces it at compile time, at no runtime cost.&lt;/p&gt;
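&lt;p&gt;In practice the compiler does not forbid shared mutable state across threads; it forces it behind provably thread-safe types. A sketch of the pattern it pushes you towards – handing a bare &lt;code&gt;&amp;amp;mut Vec&lt;/code&gt; to another thread would simply not compile:&lt;/p&gt;

```rust
use std::sync::{Arc, Mutex};
use std::thread;

// Four threads each push one element into a shared vector.
// `Arc` provides shared ownership across threads; `Mutex` provides
// exclusive access — both requirements enforced by the type system.
fn concurrent_push() -> usize {
    let v = Arc::new(Mutex::new(vec![1, 2, 3]));
    let handles: Vec<_> = (0..4)
        .map(|i| {
            let v = Arc::clone(&v);
            thread::spawn(move || {
                v.lock().unwrap().push(i); // lock guarantees exclusive mutation
            })
        })
        .collect();
    for h in handles {
        h.join().unwrap();
    }
    let len = v.lock().unwrap().len();
    len // 3 original elements + 4 pushed
}

fn main() {
    assert_eq!(concurrent_push(), 7);
}
```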

&lt;h3&gt;
  
  
  Lifetimes
&lt;/h3&gt;

&lt;p&gt;The compiler tracks how long each reference remains valid. If a reference might outlive the value it points to, the programme does not compile.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;first_word&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;  &lt;span class="c1"&gt;// compiler verifies this reference cannot outlive s&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Dangling pointers — references to memory that has already been freed — are structurally prevented. The compiler requires that the proof of safety be expressible in terms it can verify; if it cannot be expressed, the code does not compile.&lt;/p&gt;
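&lt;p&gt;In the example above the lifetime is inferred. When the compiler cannot infer one – say, a function whose returned reference could come from either of two inputs – the programmer must name the lifetime explicitly so the relationship becomes checkable. A standard illustration:&lt;/p&gt;

```rust
// `'a` declares that the returned reference lives no longer than
// the shorter-lived of the two inputs — a constraint the compiler
// then verifies at every call site.
fn longest<'a>(a: &'a str, b: &'a str) -> &'a str {
    if a.len() >= b.len() { a } else { b }
}

fn main() {
    let s = String::from("borrow checker");
    let result = longest(&s, "short");
    println!("{result}");
    // Using `result` after `s` is dropped would be a compile error.
}
```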

&lt;p&gt;Together, these three mechanisms mean that a Rust programme which compiles cannot exhibit the class of memory error that accounts for the majority of CVEs in C and C++ codebases. The bugs are not caught at runtime. They are made structurally unavailable.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Rust is difficult
&lt;/h2&gt;

&lt;p&gt;The difficulty follows directly from the mechanisms above. Most languages ask the programmer to write code that runs. Rust asks the programmer to write code the compiler can &lt;em&gt;prove is safe&lt;/em&gt; — and that proof must be expressed explicitly enough for a static analyser to verify it.&lt;/p&gt;

&lt;p&gt;Developers trained in languages that hide memory management have never been required to reason about ownership, aliasing, or reference lifetimes. These concerns exist in every language; they are simply invisible. Rust makes them explicit and mandatory.&lt;/p&gt;

&lt;p&gt;This reframing matters in practice. The borrow checker — the component of the compiler that enforces borrowing and lifetime rules — is a common source of frustration for new Rust programmers. Experienced Rust programmers tend to describe the same component as a collaborator. The difference is not in the tool; it is in whether the programmer understands the exchange being made. The difficulty is the price of the guarantee. Once code compiles, an entire class of runtime failure has been ruled out by construction.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why Rust compiles slowly
&lt;/h2&gt;

&lt;p&gt;Rust compiles slowly relative to most comparable languages, and this has a specific cause.&lt;/p&gt;

&lt;p&gt;Most compilers translate code into bytecode or directly into machine code. Rust's compiler also performs static analysis: it verifies ownership transfers, checks borrow constraints, validates lifetimes across the entire programme, and confirms the absence of data races — all before emitting machine code. This extra layer has no direct equivalent in C, Go, or Java compilation. It is where the magic of Rust happens.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Memory/concurrency checking at compile time&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C / C++&lt;/td&gt;
&lt;td&gt;None — left to the programmer&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;Escape analysis (basic); race detector available at runtime&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Java / C#&lt;/td&gt;
&lt;td&gt;Null safety (partial); no memory ownership model&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;Ownership, borrows, lifetimes, and data races — all static&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The compilation overhead is the upfront cost of eliminating the runtime memory and thread errors that C/C++ freely permit. Think of it as an extra round of proofreading: a Rust program that eventually runs has already been formally checked in ways that no other mainstream systems language provides.&lt;/p&gt;




&lt;h2&gt;
  
  
  More on Rust and C
&lt;/h2&gt;

&lt;p&gt;The C article in this series argued that C's importance derives from the ABI — the binary interface that every language targets when it needs to call outside itself. Rust in the Linux kernel still speaks C at the boundary, building safer implementations behind a C-compatible façade.&lt;/p&gt;

&lt;p&gt;We need to say more about that observation here, because it identifies what makes Rust different from every previous attempt at a memory-safe systems language.&lt;/p&gt;

&lt;p&gt;Java and C# solve memory safety, but they require a managed runtime. That runtime sits between the language and the machine. It manages its own heap, controls its own memory, and cannot easily share ownership with C code. Calling a C library from Java requires a bridge — JNI — that is notoriously difficult and imposes overhead. The managed runtime is the reason these languages never became genuine C replacements in systems programming: the boundary between the managed world and the unmanaged world is too expensive to cross.&lt;/p&gt;

&lt;p&gt;Rust conversely has no managed runtime and no garbage collector. Its memory layout is directly controllable. It can call C code, and be called by C code, through the same ABI mechanism that all other languages use — but without a runtime in the middle.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;extern&lt;/span&gt; &lt;span class="s"&gt;"C"&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="nf"&gt;strlen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="k"&gt;const&lt;/span&gt; &lt;span class="nb"&gt;u8&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;usize&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;  &lt;span class="c1"&gt;// calls a C function directly via the platform ABI&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That single declaration is enough for Rust to call any C function with no marshalling layer and no overhead. The reverse is equally straightforward: a Rust library can expose a C-compatible interface and be consumed by Python, Ruby, Go, or any other language as though it were a C library.&lt;/p&gt;
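&lt;p&gt;The reverse direction is just as terse. Compiled as a C-compatible library (a &lt;code&gt;cdylib&lt;/code&gt; crate), a function like the following – the name and signature here are illustrative – is callable from Python's &lt;code&gt;ctypes&lt;/code&gt;, Go's cgo, or plain C, exactly as a C symbol would be:&lt;/p&gt;

```rust
// `#[no_mangle]` keeps the symbol name intact in the compiled library;
// `extern "C"` selects the platform's C calling convention.
#[no_mangle]
pub extern "C" fn add(a: i32, b: i32) -> i32 {
    a + b
}

fn main() {
    // Callable from Rust too, of course:
    assert_eq!(add(2, 3), 5);
}
```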

&lt;p&gt;Rust therefore occupies a position no other memory-safe language has held: it can replace C module-by-module, function-by-function, without replacing the interfaces that the rest of the system depends on – and without either the errors that C and C++ permit or the cost of a managed runtime and garbage collector. No previous language has managed that.&lt;/p&gt;

&lt;p&gt;The Linux kernel illustrates this in practice. Rust has been accepted into the kernel — first experimentally, then as a first-class language alongside C in December 2025 — and Rust modules interoperate with fifty years of existing C kernel code through C-compatible interfaces. Rust doesn't discard or compete with the C ABI: it was designed to work within it.&lt;/p&gt;




&lt;h2&gt;
  
  
  Where Rust is used
&lt;/h2&gt;

&lt;p&gt;Operating systems, databases (TiKV, sled), web infrastructure (Cloudflare's core networking stack, Amazon's Firecracker virtualisation engine), embedded systems, and the Linux kernel. The common factor is consistent across all of them: memory safety without a managed runtime.&lt;/p&gt;

&lt;p&gt;It is for these reasons that Stack Overflow's developer survey found Rust the most admired language for nine consecutive years, from 2016 to 2024. The gap between admiration and adoption has however been wide and consistent: developers recognise the value of the guarantees but find the onboarding cost high. That gap has been closing a little faster since roughly 2022, as tooling, documentation, and the Cargo package manager have matured, and as the consequences of decades of memory-unsafe systems code have become harder to ignore; but large amounts of C and C++ are not about to be rewritten in Rust any time soon.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Rust actually is
&lt;/h2&gt;

&lt;p&gt;Rust is a systems language that moved the programmer's memory discipline into the compiler. That single decision accounts for everything unusual about it: the strict type system, the slow compilation, the learning curve, the purported difficulty, and the guarantees.&lt;/p&gt;

&lt;p&gt;The language is internally consistent once its central ambition is clear. The borrow checker is not arbitrary strictness; it is the mechanism by which ownership rules are enforced. The slow compilation is not a tooling failure; it is the cost of static analysis that other compilers do not perform. The new way of thinking is not poor design; it is the price of eliminating at compile time the errors that every other systems language discovers at runtime — or in a CVE report.&lt;/p&gt;

&lt;p&gt;Rust is not a general-purpose language competing with Python or Go for application development. It is the first plausible answer, after fifty years, to the question of whether C-level performance and memory safety can coexist without a runtime or garbage collection. The answer, it turns out, is yes — provided the compiler is willing to do enough of the work, and the programmer is willing to let it.  It is to C's credit that it took half a century for anyone to work this out. For all these reasons, Rust is an enormous advance, whose benefits are still in their infancy. Give it another 50 years to play out?&lt;/p&gt;

</description>
      <category>rust</category>
    </item>
    <item>
      <title>MVVM Made Easy</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Fri, 06 Mar 2026 16:58:35 +0000</pubDate>
      <link>https://forem.com/dimension-ai/mvvm-made-easy-44l8</link>
      <guid>https://forem.com/dimension-ai/mvvm-made-easy-44l8</guid>
      <description>&lt;p&gt;The architectural pattern "Model-View-ViewModel" is now over 20 years old yet is still notoriously hard to understand.&lt;/p&gt;

&lt;p&gt;A developer might implement &lt;code&gt;INotifyPropertyChanged&lt;/code&gt;, wire up bindings, make controls respond to state changes and ship a working application. But ask that developer to explain – not &lt;em&gt;how&lt;/em&gt; MVVM works, but &lt;em&gt;why&lt;/em&gt; it is structured the way it is – and many with years of experience can offer only a detailed technical answer that leaves the questioner none the wiser. They know the plumbing; explaining the concept remains hard.&lt;/p&gt;

&lt;p&gt;To be fair, you can use MVVM perfectly well knowing only the how, without ever understanding why it came to be – or being able to explain it simply.&lt;/p&gt;

&lt;p&gt;Two things make MVVM genuinely hard to understand:&lt;/p&gt;

&lt;p&gt;First, the name: "Model, View, ViewModel" describes the three components, not what they do, not how they relate, and not why the separation matters. It is like naming the parts of a car "Metal, Glass, Rubber" and expecting that to teach someone to drive.&lt;/p&gt;

&lt;p&gt;Second, almost every tutorial leads with binding syntax and interface implementation – the hardest, most fiddly part of the pattern – before establishing the underlying idea. This is exactly backwards.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;[A side-note on scale before going further. If you are binding a single text box to a string property, MVVM is pointless overhead. It is the kind of pattern that earns its complexity only when &lt;strong&gt;multiple&lt;/strong&gt; views share the same data, react to each other's state and need to remain independently testable. If you are building a dashboard where changing one filter should re-query several data sets and update four panels simultaneously, MVVM is not over-engineered – it is the only approach that remains intelligible at maintenance time.]&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Why MVVM exists
&lt;/h2&gt;

&lt;p&gt;Let's start by elucidating what problem MVVM was introduced to solve, in three parts:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testability.&lt;/strong&gt; A ViewModel contains no UI framework code. It can be unit tested in isolation: no window instantiation, no simulated button clicks, no rendered elements to inspect. You call a method, you assert on state. The logic that drives your UI is testable in the same way any other class is testable. Before this, automated testing of UIs was hard, if not impossible – and relied on expensive, grumpy humans.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Separation.&lt;/strong&gt; The View is a dumb rendering surface and nothing more. Domain logic lives outside it. When you need to retarget the same underlying data to a different interface – a mobile layout, a web front end, a printed report – the ViewModel and Model are unchanged. The View is the only thing that needs to change. This makes cross-platform support (Windows/macOS/Linux, mobile/tablet/desktop) a cinch.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Binding engine compatibility.&lt;/strong&gt; In declarative UI frameworks, the binding engine expects properties and observable state, not imperative UI code. A ViewModel is exactly the component that provides that interface. It is not incidental to the pattern; it is part of the reason the pattern takes the shape it does.&lt;/p&gt;

&lt;p&gt;Without understanding these three drivers, MVVM might look like pointless ceremony.&lt;/p&gt;

&lt;p&gt;A developer who already knows Model-View-Controller ("MVC") might reasonably ask: why not just put this logic in the View? The answer is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;because you cannot unit test a View&lt;/li&gt;
&lt;li&gt;because tying domain logic to a UI framework makes it fragile and non-portable, and&lt;/li&gt;
&lt;li&gt;because declarative binding engines need something to bind &lt;em&gt;to&lt;/em&gt;.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  The SQL analogy
&lt;/h2&gt;

&lt;p&gt;The clearest route into MVVM for most developers is something they already understand: relational databases and SQL.&lt;/p&gt;

&lt;p&gt;Think of the mapping like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Model&lt;/strong&gt; = the database engine and its tables – the entire data and domain layer, including services and repositories. Not merely a DTO or a row; the full source of truth.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;ViewModel&lt;/strong&gt; = a parameterised &lt;code&gt;SELECT&lt;/code&gt; statement plus the logic that drives it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;View&lt;/strong&gt; = the rendered output of that query.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The forward direction is intuitive: data flows from the table, through the query, to the displayed result – exactly as it does in SQL Server Management Studio.&lt;/p&gt;

&lt;p&gt;A qualification now needs making: a &lt;code&gt;SELECT&lt;/code&gt; statement is stateless and declarative – it runs and gives output; whereas a ViewModel is stateful and procedural – it holds data in memory and can be updated. The SQL analogy thus holds for the &lt;em&gt;shape&lt;/em&gt; of the relationship, not the implementation detail. A ViewModel also carries UI-only state that has no equivalent anywhere in the data layer – loading flags (&lt;code&gt;IsBusy&lt;/code&gt;), selection state (&lt;code&gt;IsSelected&lt;/code&gt;), temporary form edits, pagination position. The most accurate framing is that a ViewModel is the domain state &lt;em&gt;plus&lt;/em&gt; UI interaction state, not merely a query.&lt;/p&gt;
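
&lt;p&gt;A minimal sketch of that framing, with hypothetical names and data: the ViewModel below combines a query-like projection of the Model with UI-only state the database never sees.&lt;/p&gt;

```python
# Hypothetical sketch: a ViewModel holds both a query-like projection of
# the Model and UI-only state (is_busy, selected) with no database analogue.
class CustomerListViewModel:
    def __init__(self, model_rows):
        self._rows = model_rows          # the Model: source of truth
        self.region_filter = None        # query parameter
        self.is_busy = False             # UI-only state
        self.selected = None             # UI-only state

    @property
    def customers(self):
        # The "SELECT ... WHERE" part: a stateful, re-runnable projection.
        if self.region_filter is None:
            return list(self._rows)
        return [r for r in self._rows if r["region"] == self.region_filter]

rows = [{"name": "Acme", "region": "North"},
        {"name": "Beta", "region": "South"}]
vm = CustomerListViewModel(rows)
vm.region_filter = "North"
assert [c["name"] for c in vm.customers] == ["Acme"]
```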

&lt;p&gt;What the SQL analogy does capture very well is the relationship between the three layers, and the point at which the analogy breaks is exactly the point at which MVVM becomes interesting.&lt;/p&gt;

&lt;p&gt;In SQL Server Management Studio (or, indeed, any database query application), the result grid is &lt;strong&gt;&lt;em&gt;read-only&lt;/em&gt;&lt;/strong&gt; with respect to the underlying query. You can run a &lt;code&gt;SELECT&lt;/code&gt; and see a result set, but to change the data you must edit individual rows directly, where the table permits it. This creates three limitations. First, those edits do not change the query or cause it to re-run. Second, they do not update the results displayed in any other query window. Third, for the updated data to percolate through to the viewable output, you have to run the SQL query again yourself.&lt;/p&gt;

&lt;p&gt;MVVM evolved to address all three of these issues.&lt;/p&gt;

&lt;p&gt;Consider what it would mean if SQL Server Management Studio worked differently:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="c1"&gt;-- Query 1: you click "GROCERY STORES" in this result set&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;ProviderClass&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Amount&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;Entries&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;ProviderClass&lt;/span&gt;

&lt;span class="c1"&gt;-- Query 2: automatically re-runs, filtered by your selection&lt;/span&gt;
&lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;OurClass&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;COUNT&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;*&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="k"&gt;SUM&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Amount&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;Entries&lt;/span&gt;
&lt;span class="k"&gt;WHERE&lt;/span&gt; &lt;span class="n"&gt;ProviderClass&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s1"&gt;'GROCERY STORES'&lt;/span&gt;
&lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;OurClass&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;What if selecting a row in one result automatically re-executes another query with a &lt;code&gt;WHERE&lt;/code&gt; clause derived from your selection? The two result sets would then &lt;strong&gt;react&lt;/strong&gt; to each other. No database tool does this natively. This is precisely what MVVM, with its binding system, steps in to provide.&lt;/p&gt;
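
&lt;p&gt;That hypothetical behaviour can be mimicked in a few lines. The names and data below are invented for illustration, but the structure is the same: selecting a row in the first "query" changes a parameter that the second "query" reads, so its result reacts automatically.&lt;/p&gt;

```python
from collections import Counter

# Invented data standing in for the Entries table.
entries = [
    {"provider_class": "GROCERY STORES", "our_class": "Food", "amount": 10},
    {"provider_class": "GROCERY STORES", "our_class": "Household", "amount": 5},
    {"provider_class": "FUEL", "our_class": "Travel", "amount": 40},
]

class DashboardViewModel:
    def __init__(self, entries):
        self.entries = entries
        self.selected_provider_class = None   # set by "clicking" Query 1's output

    def by_provider_class(self):              # Query 1
        return Counter(e["provider_class"] for e in self.entries)

    def by_our_class(self):                   # Query 2: reacts to the selection
        return Counter(
            e["our_class"] for e in self.entries
            if e["provider_class"] == self.selected_provider_class
        )

vm = DashboardViewModel(entries)
vm.selected_provider_class = "GROCERY STORES"  # the "click"
assert vm.by_our_class() == Counter({"Food": 1, "Household": 1})
```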

&lt;p&gt;Under MVVM, the flow of information is &lt;strong&gt;bidirectional&lt;/strong&gt; and looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Forward:  Model → ViewModel → View
Backward: View  → ViewModel → Model
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The forward pass is the query and its output – like executing a query in SQL Server Management Studio. The backward pass is the new MVVM part: the user edits the rendered output (e.g. double-clicks and edits a field) and the edit flows back, changing either the underlying data (in the database) or the parameters of the query itself.&lt;/p&gt;

&lt;p&gt;The nearest database analogy for the full bidirectional picture is a materialised view with triggers: the view renders derived data, but user actions fire triggers that modify the underlying query parameters, which re-materialise all the other views. Databases do not natively do this. That gap is what MVVM fills.&lt;/p&gt;




&lt;h2&gt;
  
  
  A worked example
&lt;/h2&gt;

&lt;p&gt;Abstract descriptions of data flow are easy to nod along to and hard to retain. A small end-to-end example makes both directions of the flow concrete.&lt;/p&gt;

&lt;p&gt;Imagine a customer management screen with a region filter, a customer grid, an orders panel, and a summary panel.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The user selects "North" from the region filter (View interaction).&lt;/li&gt;
&lt;li&gt;The ViewModel updates its &lt;code&gt;SelectedRegion&lt;/code&gt; property and re-queries the Model. The customer grid updates to show only northern customers (forward pass).&lt;/li&gt;
&lt;li&gt;The user clicks a customer row (View interaction).&lt;/li&gt;
&lt;li&gt;The ViewModel sets &lt;code&gt;SelectedCustomer&lt;/code&gt; and loads that customer's orders from the Model. The orders panel updates (forward pass triggered by backward selection).&lt;/li&gt;
&lt;li&gt;The user clicks an order in the orders panel (View interaction).&lt;/li&gt;
&lt;li&gt;The ViewModel writes the selected order back to the Model and recalculates the summary. The summary panel updates (full round trip).&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Each step either writes state backward through the ViewModel to the Model, or reads updated state forward from the Model through the ViewModel to the View – often both within the same interaction. This is the case where MVVM earns its structure. At this scale, without the pattern, you are managing a web of manual event handlers and direct control references. With it, each layer has a single, defined responsibility.&lt;/p&gt;
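
&lt;p&gt;The six steps can be compressed into a sketch. All names and data here are illustrative: each property setter is a backward pass, and each re-computed property is the forward pass that follows it.&lt;/p&gt;

```python
# Illustrative sketch of the round trip: setters carry the backward pass,
# re-computed properties carry the forward pass.
class CustomerScreenViewModel:
    def __init__(self, model):
        self.model = model                 # dict: region -> {customer: orders}
        self.selected_region = None
        self.selected_customer = None

    @property
    def customers(self):                   # forward: grid contents
        return sorted(self.model.get(self.selected_region, {}))

    @property
    def orders(self):                      # forward: orders panel
        customers = self.model.get(self.selected_region, {})
        return customers.get(self.selected_customer, [])

model = {"North": {"Acme": [120, 80]}, "South": {"Zed": [15]}}
vm = CustomerScreenViewModel(model)
vm.selected_region = "North"               # steps 1-2
assert vm.customers == ["Acme"]
vm.selected_customer = "Acme"              # steps 3-4
assert vm.orders == [120, 80]
assert sum(vm.orders) == 200               # steps 5-6: summary recalculates
```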




&lt;h2&gt;
  
  
  Bindings and commands – the wiring, and where it goes wrong
&lt;/h2&gt;

&lt;p&gt;Bindings and commands are the mechanism that automates the flow described above. Neither is part of the &lt;em&gt;idea&lt;/em&gt; of MVVM; both are the entire &lt;em&gt;implementation&lt;/em&gt; of it. This distinction matters because tutorials that begin with binding syntax, before the concept is established, guarantee the confusion described at the start of this article.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Bindings&lt;/strong&gt; propagate data state changes between the ViewModel and the View automatically, in both directions. The reactive binding layer is the concept; the specific syntax varies by framework. MVVM applies to WPF, MAUI, Avalonia, SwiftUI, and reactive front-end frameworks alike – the pattern is the same, only the binding expression syntax differs. In WPF/XAML terms: &lt;code&gt;INotifyPropertyChanged&lt;/code&gt; drives the forward pass, notifying the View when ViewModel properties change; &lt;code&gt;UpdateSourceTrigger&lt;/code&gt; controls when the backward pass fires, pushing user edits back through the ViewModel to the Model.&lt;/p&gt;
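
&lt;p&gt;The forward-pass mechanism can be imitated outside any framework. The following is a rough, invented stand-in for &lt;code&gt;INotifyPropertyChanged&lt;/code&gt; – not the real .NET interface, just the observable-property idea it embodies.&lt;/p&gt;

```python
# Rough stand-in for INotifyPropertyChanged: setting a property notifies
# subscribers (the binding engine's role), so the View can refresh itself.
class Observable:
    def __init__(self):
        self._handlers = []

    def subscribe(self, handler):
        self._handlers.append(handler)

    def _notify(self, prop_name):
        for handler in self._handlers:
            handler(prop_name)

class RegionViewModel(Observable):
    def __init__(self):
        super().__init__()
        self._selected_region = None

    @property
    def selected_region(self):
        return self._selected_region

    @selected_region.setter
    def selected_region(self, value):
        self._selected_region = value
        self._notify("selected_region")    # the PropertyChanged event

changed = []
vm = RegionViewModel()
vm.subscribe(changed.append)               # the "binding" listening for changes
vm.selected_region = "North"
assert changed == ["selected_region"]
```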

&lt;p&gt;&lt;strong&gt;Commands&lt;/strong&gt; handle action flow rather than data flow. If a binding is a pipe for state, a command is a trigger for action. A button click does not write a property value back through the ViewModel; it fires a command. Commands are defined on the ViewModel – they live in the same layer as the observable properties – and are exposed to the View through bindings. The &lt;code&gt;ICommand&lt;/code&gt; interface in .NET is the standard mechanism; other frameworks have direct equivalents.&lt;/p&gt;
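
&lt;p&gt;A rough equivalent of &lt;code&gt;ICommand&lt;/code&gt; – &lt;code&gt;Execute&lt;/code&gt; plus &lt;code&gt;CanExecute&lt;/code&gt; – again as an invented sketch rather than the real .NET interface: the command carries intent, and its gate determines whether the bound button is enabled.&lt;/p&gt;

```python
# Sketch of an ICommand-like object: execute() carries the intent,
# can_execute() gates whether the bound button is enabled.
class Command:
    def __init__(self, execute, can_execute=lambda: True):
        self._execute = execute
        self._can_execute = can_execute

    def can_execute(self):
        return self._can_execute()

    def execute(self):
        if self.can_execute():
            self._execute()

saved = []
dirty = {"flag": False}
save_command = Command(
    execute=lambda: saved.append("saved"),
    can_execute=lambda: dirty["flag"],    # only enabled when there are edits
)
save_command.execute()                    # nothing happens: form is not dirty
assert saved == []
dirty["flag"] = True
save_command.execute()
assert saved == ["saved"]
```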

&lt;p&gt;Understanding both primitives before reading about either is the correct order. Bindings carry state. Commands carry intent. Together they constitute the entire backward pass.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Where things go wrong.&lt;/strong&gt; Using XAML as the concrete example, the most common failure modes are these:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Silent path failure.&lt;/strong&gt; A mistyped property name in a binding expression fails without a compiler error. The UI simply does not update. There is no exception; there is a runtime warning in the Output window, if you know to look for it and have the right diagnostic verbosity set. Developers accustomed to type-checked, compiled code find this genuinely jarring.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;DataContext absent or late.&lt;/strong&gt; If the &lt;code&gt;DataContext&lt;/code&gt; is not set, or is set after bindings have already attempted to resolve, the binding has nothing to point at. Again, this fails silently.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Wrong binding direction.&lt;/strong&gt; &lt;code&gt;OneWay&lt;/code&gt; when &lt;code&gt;TwoWay&lt;/code&gt; was needed, or vice versa. The data flows correctly in one direction and not at all in the other, which can look like a ViewModel bug when it is a binding configuration issue.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Incomplete &lt;code&gt;INotifyPropertyChanged&lt;/code&gt;.&lt;/strong&gt; If property change notification is missing for some properties, the forward pass fires inconsistently. Some controls update; others do not. Debugging this without understanding the mechanism is an exercise in frustration.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;XAML binding expressions are terse and unforgiving. An expression such as &lt;code&gt;{Binding SelectedRegion, UpdateSourceTrigger=PropertyChanged, Mode=TwoWay}&lt;/code&gt; is compact, but one wrong token causes the whole expression to fail at runtime without a compiler error. Getting bindings right is a skill that takes time to acquire, which is part of why MVVM has a reputation for being hard. The pattern itself is not hard. The implementation surface is genuinely fiddly.&lt;/p&gt;




&lt;h2&gt;
  
  
  Three other ways to think about it
&lt;/h2&gt;

&lt;p&gt;The SQL analogy is perhaps the easiest for developers with a database background. The underlying pattern, however, is general enough to be seen in many other contexts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Spreadsheet.&lt;/strong&gt; Cell values are the Model, formulas are the ViewModel, and the displayed grid is the View. Edit a cell value and every formula that depends on it recalculates immediately; the display updates without any manual refresh. Edit a displayed value directly and it writes back to the underlying cell, propagating through all dependent formulas. This is perhaps the most widely encountered example of reactive bidirectional data flow in everyday computing.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Music mixing desk.&lt;/strong&gt; The multitrack recording is the Model, each channel strip – EQ settings, gain, pan position – is a ViewModel, and the audio output is the View. Multiple ViewModels operate on the same Model simultaneously, and adjusting one channel affects the audible result of the others. Muting a channel (View interaction) changes the ViewModel state, which changes the output. This captures the multi-view reactive case: several independent lenses on the same underlying data, each influencing what the others produce.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Map application.&lt;/strong&gt; Geographic data is the Model, the current zoom level, pan position, and active layer filters are the ViewModel, and the rendered map is the View. Panning or zooming (View interaction) updates coordinates and scale in the ViewModel, which re-renders the visible tiles. Switching to satellite view or adding a traffic overlay applies a different ViewModel to the same underlying Model. Multiple panels – street view, satellite, elevation – are multiple Views of the same data.&lt;/p&gt;

&lt;p&gt;The table below maps all four analogies across the key dimensions.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Mental model&lt;/th&gt;
&lt;th&gt;Model (the truth)&lt;/th&gt;
&lt;th&gt;ViewModel (the lens)&lt;/th&gt;
&lt;th&gt;View (the output)&lt;/th&gt;
&lt;th&gt;Bidirectional element&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;SQL / database&lt;/td&gt;
&lt;td&gt;Tables + data layer&lt;/td&gt;
&lt;td&gt;Parameterised query + driving logic&lt;/td&gt;
&lt;td&gt;Rendered result grid&lt;/td&gt;
&lt;td&gt;Selecting output re-queries; edits can write back to table&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Spreadsheet&lt;/td&gt;
&lt;td&gt;Cell values&lt;/td&gt;
&lt;td&gt;Formulas&lt;/td&gt;
&lt;td&gt;Displayed grid&lt;/td&gt;
&lt;td&gt;Editing a displayed value writes back to the cell; recalculates all dependents&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Mixing desk&lt;/td&gt;
&lt;td&gt;Multitrack recording&lt;/td&gt;
&lt;td&gt;Channel strip settings (EQ, gain, pan)&lt;/td&gt;
&lt;td&gt;Audio output&lt;/td&gt;
&lt;td&gt;Adjusting a channel control updates the output; muting feeds back to change the mix&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Map application&lt;/td&gt;
&lt;td&gt;Geographic data&lt;/td&gt;
&lt;td&gt;Zoom / pan / layer parameters&lt;/td&gt;
&lt;td&gt;Rendered map&lt;/td&gt;
&lt;td&gt;Panning or filtering updates ViewModel parameters, which re-renders all panels&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  The concept, then the plumbing
&lt;/h2&gt;

&lt;p&gt;The reason MVVM takes years to click for many developers is straightforward: standard explanations describe the parts rather than the logic. Once the concept is in place – a source of truth, a lens that shapes and drives it, a rendered output through the lens, and the ability to reach back through the lens in both directions – the plumbing becomes interpretable. The binding expressions, the &lt;code&gt;INotifyPropertyChanged&lt;/code&gt; implementations, the &lt;code&gt;ICommand&lt;/code&gt; wiring: these are no longer arbitrary boilerplate. They are the mechanical realisation of a flow you already understand.&lt;/p&gt;

&lt;p&gt;Get the concept first. The implementation follows. At that point, MVVM stops looking like ceremony and starts looking like a disciplined way to manage reactive UI state at scale.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Seven further mental models for MVVM – accounting ledger, kitchen, wardrobe, stock trading terminal, library catalogue, security camera control room, and orchestra conductor's score – are set out in the footnote below for readers who find alternative framings useful.&lt;/em&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Appendix: seven additional mental models
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Mental model&lt;/th&gt;
&lt;th&gt;Model (the truth)&lt;/th&gt;
&lt;th&gt;ViewModel (the lens)&lt;/th&gt;
&lt;th&gt;View (the output)&lt;/th&gt;
&lt;th&gt;Bidirectional element&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Accounting ledger&lt;/td&gt;
&lt;td&gt;Journal entries&lt;/td&gt;
&lt;td&gt;Trial balance / P&amp;amp;L (different aggregations of the same transactions)&lt;/td&gt;
&lt;td&gt;Printed reports&lt;/td&gt;
&lt;td&gt;Post a correcting entry and all reports regenerate&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Kitchen&lt;/td&gt;
&lt;td&gt;Ingredients in the larder&lt;/td&gt;
&lt;td&gt;Recipe (selects, transforms, and combines)&lt;/td&gt;
&lt;td&gt;Plated dish&lt;/td&gt;
&lt;td&gt;Head chef tasting and calling for more salt: a View-level judgement that modifies the ViewModel parameters&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Wardrobe&lt;/td&gt;
&lt;td&gt;All clothes owned&lt;/td&gt;
&lt;td&gt;"What to wear today" (filtered by weather, occasion, and what is clean)&lt;/td&gt;
&lt;td&gt;Mirror&lt;/td&gt;
&lt;td&gt;Deciding in front of the mirror to swap the jacket writes a change back to the selection logic&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Stock trading terminal&lt;/td&gt;
&lt;td&gt;Market tick data&lt;/td&gt;
&lt;td&gt;Watchlist with filters and calculated columns (P/E, % change, position size)&lt;/td&gt;
&lt;td&gt;Screen display&lt;/td&gt;
&lt;td&gt;Placing a buy order writes back through the ViewModel to the Model, updating the portfolio and all dependent calculations&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Library catalogue&lt;/td&gt;
&lt;td&gt;Books on the shelves&lt;/td&gt;
&lt;td&gt;Search query with active filters&lt;/td&gt;
&lt;td&gt;Results list&lt;/td&gt;
&lt;td&gt;Reserving a book marks it unavailable in the Model, updating results visible to every other user&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Security camera control room&lt;/td&gt;
&lt;td&gt;Camera feeds&lt;/td&gt;
&lt;td&gt;Operator's selection of which feeds appear on which monitors&lt;/td&gt;
&lt;td&gt;Monitor wall&lt;/td&gt;
&lt;td&gt;Switching a monitor feed updates the ViewModel; a camera going offline updates the Model, greying out that feed on all monitors&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Orchestra conductor's score&lt;/td&gt;
&lt;td&gt;Composed music&lt;/td&gt;
&lt;td&gt;Conductor's interpretation (tempo, dynamics, section balance)&lt;/td&gt;
&lt;td&gt;Sound the audience hears&lt;/td&gt;
&lt;td&gt;Conductor adjusts in real time based on what they hear: a feedback loop from View back through ViewModel, changing the next forward pass&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

</description>
      <category>modelviewviewmodel</category>
      <category>mvvm</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Something Big Is Happening: a Response</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sat, 21 Feb 2026 16:13:06 +0000</pubDate>
      <link>https://forem.com/dimension-ai/something-big-is-happening-a-response-21lm</link>
      <guid>https://forem.com/dimension-ai/something-big-is-happening-a-response-21lm</guid>
      <description>&lt;p&gt;&lt;em&gt;A response to a viral essay – agreeing with the urgency, adding precision about what these systems are and what they are not. TL;DR at bottom.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  I. Yes, Something Big Is Happening
&lt;/h2&gt;

&lt;p&gt;Matt Shumer's excellent and thought-provoking article "&lt;a href="https://shumer.dev/something-big-is-happening" rel="noopener noreferrer"&gt;Something Big Is Happening&lt;/a&gt;" captures something that many of us who work with AI daily rarely articulate: paid-tier "AI" models in early 2026 are qualitatively different from those of eighteen months ago; and the gap between what they can do and what is commonly thought they can do is wide and still widening.&lt;/p&gt;

&lt;p&gt;Coding workflows have been transformed. Legal, financial, analytical and programming tasks that once took hours or even days now take minutes. Anyone who dismissed these tools after a brief encounter with free-tier ChatGPT in 2023 owes it to themselves to look again, because they'll find capabilities stunningly stronger than what they remember.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faoefp45fvrzrm8ijvsp3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faoefp45fvrzrm8ijvsp3.png" alt="The Steam-Powered Robot of 1868" width="610" height="257"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We share Shumer's view that something important is happening and that underestimation of it is widespread. Where we'd like to add some supporting perspective is on &lt;em&gt;what, precisely&lt;/em&gt;, is happening — to help clarity and form emerge from the dust-cloud of early-stage change. We suggest that "cognitive automation is scaling rapidly" does not automatically mean "autonomous intelligence is emerging". This may sound like a technicality, but grasping it will shape predictions, policy and career decisions through a period of change as fundamental as any in history.&lt;/p&gt;

&lt;p&gt;Our view is that this new wave of technology is a profound expansion of &lt;strong&gt;cognitive automation&lt;/strong&gt; — possibly the most significant technology in human history. Understanding what that means matters for anyone making life-decisions: everyone.&lt;/p&gt;




&lt;h2&gt;
  
  
  II. A Framework For Understanding: Nine Ideas
&lt;/h2&gt;

&lt;p&gt;What follows is our own account, augmenting Shumer's essay by identifying nine developments that, taken together, amount to a genuine transformation in what computers can do and what humans will need to do.&lt;/p&gt;

&lt;p&gt;They fall into three groups:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;An Interface Revolution: how computers and humans can now talk to each other and what they can do with data.&lt;/li&gt;
&lt;li&gt;Labour Reallocation: how the relationship between people and software is being restructured.&lt;/li&gt;
&lt;li&gt;"The Great Levelling": an economic re-balancing where power has suddenly shifted and what this means for who can build what and where.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Group 1 – The Interface Revolution: What Computers Can Now Do
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Computers have learned to speak human language.&lt;/strong&gt; For sixty years, humans had to learn the languages of computers — programming languages, query syntax and command-line interfaces — to make them do useful work. But not everyone can be – or wants to be – a computer scientist. That relationship has now reversed: machines can receive and produce natural human language and be functionally useful across a vast range of tasks. This sounds simple, but its consequences are enormous: it removes the translation layer that has, for over half a century, stood between human intent and machine execution, and with it the cognitive burden of expressing your thoughts in computer code.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Unstructured data has become operable.&lt;/strong&gt; This may be the most consequential development of all, yet it receives surprisingly little public attention. The majority of the world's information is unstructured. It sits in formats that computers can store but cannot meaningfully process: documents, emails, PDFs, meeting transcripts, legal filings, medical records. This enormous corpus of knowledge has never been queryable by machine beyond simple word-matching or the limited tools of traditional Natural Language Processing, which delivered useful results in narrow domains but never achieved general operability over arbitrary text.&lt;/p&gt;

&lt;p&gt;Large Language Models have upended this. You can now point a system at a body of unstructured text and ask questions on a pseudo-semantic basis — approximating an understanding of meaning and context, rather than merely matching keywords. The result is imperfect but genuinely transformative: legal discovery, regulatory compliance, financial due diligence and medical literature review are already being reshaped by the fact that machines can now read documents and do something useful with what they find. Much of the real economic value is already accumulating here — in the mundane task of making the world's information accessible to computation for the first time. We do not need true Artificial Intelligence for this to happen and to be enormously valuable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. The mechanical work of computing is being automated.&lt;/strong&gt; Coding, configuration, diagnosis and debugging — the production layer of software development — are increasingly handled by machines. Tasks that would have occupied a developer for a full day now emerge in minutes, with quality that often requires minimal revision. This human time can be reallocated to areas where machines still can't match humans such as creativity and relationships. This sweeping change already extends well beyond the software industry, because code is the substrate of the digital world we all use. Faster, cheaper code production accelerates every domain that depends on software — which in 2026 means all of them.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Interlude: Why This Is Not How Human Intelligence Writes&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Human writing is typically goal-directed: you begin a sentence with an intended point, maintain an argument, and aim toward an endpoint you can roughly see. You can revise mid-stream, but the act is guided by a plan, even if only a loose one.&lt;/p&gt;

&lt;p&gt;An LLM does not work that way. It does not start a sentence knowing where the sentence will end, or start a paragraph holding an argument in mind and steering toward a conclusion. It generates one token at a time, conditioned on the text so far, with no privileged access to "what it is about to say" beyond whatever patterns are statistically activated by the current context.&lt;/p&gt;

&lt;p&gt;This is hugely important, because it explains a recurring phenomenon: outputs that look like coherent argumentation can still drift, contradict themselves, or smuggle in unstated assumptions. The model is not "pursuing" a claim. It is producing locally plausible continuations that often resemble purposeful reasoning because it has learned the surface form of purposeful reasoning from human text.&lt;/p&gt;
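
&lt;p&gt;The token-at-a-time behaviour can be illustrated with a toy generator. Real models condition on vastly richer context via learned probabilities, not a hand-written table – the table below is invented – but the structural point holds: each step sees only the text so far, never a planned ending.&lt;/p&gt;

```python
# Toy autoregressive generator: each token is chosen only from what has
# been produced so far - there is no plan for how the sentence will end.
bigrams = {
    "the": ["model", "text"],
    "model": ["predicts"],
    "predicts": ["the"],
}

def generate(start, steps):
    tokens = [start]
    for _ in range(steps):
        options = bigrams.get(tokens[-1])
        if not options:
            break
        tokens.append(options[0])   # greedy: take the likeliest continuation
    return tokens

assert generate("the", 3) == ["the", "model", "predicts", "the"]
```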

&lt;h3&gt;
  
  
  Group 2 – Labour Reallocation: How This Changes the Relationship Between People and Software
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;4. The abstraction layer has moved up.&lt;/strong&gt; Software engineering has a well-known framework called the V-Model of Testing: at the bottom, writing and unit-testing code; moving upward through integration, system design, requirements specification, and acceptance testing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96mbwm1tp3994fn53mm7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96mbwm1tp3994fn53mm7.png" alt="The V-Model of Testing emerged in the 1980s and 1990s" width="459" height="443"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Historically, most computing professionals spent their time near the bottom, doing mechanical production work. And a lot of developers will say they actively dislike coding: it is cognitively difficult and requires long hours that are generally not compatible with family life. Many technologists find themselves moving away from hands-on coding somewhere in their 30s as a result.&lt;/p&gt;

&lt;p&gt;Language models have promoted the entire profession upward. Human work increasingly sits at the upper levels — conceiving, specifying, designing, reviewing, testing, and accepting — while the translation of a clear specification into working software is automated. The human role is shifting from production to direction and judgment: a change in kind, rather than merely degree. It has also extended careers: technologists can continue supervising and reading code well beyond their thirties. And it is even bringing former programmers back into active service, to help with older languages such as COBOL or C++.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. The barriers to entry have dropped, and the specification gap has narrowed sharply.&lt;/strong&gt; When the barrier to producing working software drops from "years of training in programming languages" to "the ability to describe what you want in plain English," the pool of people who can participate expands enormously. A domain expert who understands a problem deeply can now build a working tool to address it, without learning Python or hiring a developer as intermediary. Being expert in computer systems architecture or the latest frameworks and libraries shifts from "must-have" to "nice-to-have".&lt;/p&gt;

&lt;p&gt;This connects directly to one of the oldest problems in software: the gap between what the customer wants and what the developer builds. They often talk past each other. Requirements are written, discussed, revised, misinterpreted, built wrong, sent back, and revised again — round after round that often ends with something that technically matches the brief but misses the point entirely. It is rather like walking into a pub, ordering a pint of bitter, and having the barman hand you a glass of Chardonnay because, in his professional judgment, that's what you probably need.&lt;/p&gt;

&lt;p&gt;This gap is now far cheaper to surface and close, because iteration is cheap. A domain expert can sit down with an AI tool and produce a working example directly: here is what I want, here is how it should behave, here is what it should look like. There is no intermediary to misunderstand, no ping-pong of specification documents that accumulate ambiguity with every round, no techies trying to understand the creatives. The person with the problem can illustrate the solution themselves, in concrete and testable form. The specification gap has always been one of the largest hidden costs in software — responsible for wasted effort, failed projects, and the pervasive frustration of receiving something other than what was asked for. Cheap, rapid prototyping has compressed it dramatically. There is a new risk to be honest about, though: we may be trading a translation gap for a wisdom gap. If the user is unskilled in logic or systems thinking, they may describe a solution that is precisely what they asked for but fundamentally broken in ways a human developer would have caught. The intermediary was sometimes also a sanity check. As the barriers drop, the need for clear thinking about what you actually want — and whether it makes sense — becomes more important, not less.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. This will change who goes into computing.&lt;/strong&gt; The shifts described in points 3 through 5, taken together, will attract different people with different skills into technology. The field is likely to become more diverse, more domain-expert-led, more creative, more experimental, and less reliant on traditional computer science training, as people with deep knowledge of law, medicine, finance, logistics, or education find they can build tools for their own domains directly. This long-tailing of talent and ideas is one of the most positive long-term consequences of the current wave of AI development, and another that is so far little discussed.&lt;/p&gt;

&lt;h3&gt;
  
  
  Group 3 – The Economic Re-Levelling: What This Means More Broadly
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;7. Enterprise capability is now available to everyone: the Great Levelling.&lt;/strong&gt; For decades, complex software development — data integration, sophisticated analytics, multi-system workflows, automated testing and CI/CD — was the preserve of large corporations with dedicated IT departments and enterprise budgets. Small businesses could see what was possible, but the cost placed it out of reach. They were locked out of the technology revolution, noses pressed against the window.&lt;/p&gt;

&lt;p&gt;That gap is now closing. The same AI tools reducing headcount in large engineering teams are simultaneously putting enterprise-grade capabilities into the hands of small businesses and indie developers, anywhere in the world, at a fraction of the former cost. A two-person consultancy can now build tools that would have required a dedicated team and a six-figure budget three years ago. A sole practitioner in a regional town has access to the same capabilities as a team of fifty in a City of London office. A mid-size manufacturer can implement in two weeks what an external consultant said might take two years.&lt;/p&gt;

&lt;p&gt;This is also where an old economic observation may reassert itself: &lt;a href="https://en.wikipedia.org/wiki/Jevons_paradox" rel="noopener noreferrer"&gt;Jevons' Paradox&lt;/a&gt;. When the effective cost of a capability collapses, total consumption of that capability tends to rise rather than fall. Cheaper computation produced vastly more computation; cheaper bandwidth produced vastly more bandwidth consumption; and the two combined to create entirely new categories of activity – social media, streaming and today's AI explosion among them.&lt;/p&gt;

&lt;p&gt;There is a related and underappreciated fact about computing more broadly. For over half a century, the price of computation per unit has fallen relentlessly, while the economic value created by computation has risen just as relentlessly. The cost per transistor, per floating-point operation, per gigabyte stored or transmitted has trended toward zero in real terms. Yet total spending on computing infrastructure — hardware, software, cloud services, data centres — has continued to rise.&lt;/p&gt;

&lt;p&gt;Far from being a contradiction, this reflects a structural property of general-purpose technologies. If a capability becomes cheaper, it does not become less valuable; it becomes embedded in more processes, more industries, and more decisions. Electricity, telecommunications and computing are all following this pattern.&lt;/p&gt;

&lt;p&gt;If AI reduces the marginal cost of producing software and analysis, we should not expect a shrinking of the digital economy. We should expect its further expansion. Lower unit prices and higher aggregate value can coexist for decades, provided the capability continues to unlock new domains of use.&lt;/p&gt;

&lt;p&gt;If the barriers to coding fall far enough, we should not assume "less software" or "less enterprise IT". We should expect more: more internal tools, more automation, more bespoke workflows, more integrations, and more experimentation inside small firms that previously could not justify any of it. Capability becomes omnipresent – almost ambient – and usage expands to fill it.&lt;/p&gt;

&lt;p&gt;We think of this as "The Great Levelling". If giant corporations need fewer developers, the talent freed up flows outward — to smaller firms, new ventures, and parts of the economy that have struggled to keep pace with technology. The transition will feel like a rupture at times, and we should be under no illusions about that. But the longer shift is toward something vastly more positive: a world where the ability to build powerful software is no longer gated by organisational scale or geography. When the cost of producing value drops by orders of magnitude, more of it gets produced and the risk of being wrong all but vanishes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;8. The combined effect is larger than any single prior computing development.&lt;/strong&gt; Each development above is individually significant. Together, they represent a bigger shift than the personal computer, the database, email, the spreadsheet or the internet, considered individually.&lt;/p&gt;

&lt;p&gt;We make this claim for a specific reason: natural language as a programming interface lowers the barrier to entry; unstructured data becoming operable unlocks the majority of the world's information; automated code production compresses delivery timescales; cheap prototyping closes the specification gap; and the levelling of capability between large and small firms redistributes who can build what. No single prior computing innovation did all of these things simultaneously. Acting in concert, these changes are reshaping the economics of knowledge work at a pace and scale that may rival or exceed the impact of the internet to date.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9. This is a revolution.&lt;/strong&gt; Taken together, the developments above form a transformation whose full scale we are only beginning to appreciate. Whether it ultimately surpasses the internet in economic impact remains to be seen — but the breadth of what is changing, and the speed at which it is changing, already has no precedent in the history of computing.&lt;/p&gt;




&lt;h2&gt;
  
  
  III. A Distinction Worth Making: Capability and Intention
&lt;/h2&gt;

&lt;p&gt;There is a distinction that deserves more attention in the current public conversation about AI, because it clarifies a great deal: the difference between what these systems can &lt;em&gt;do&lt;/em&gt; and what they &lt;em&gt;are&lt;/em&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  What the Architecture Actually Does
&lt;/h3&gt;

&lt;p&gt;At their core, large language models predict the next token in a sequence, and they do this extraordinarily well — well enough to produce outputs that are, across many tasks, indistinguishable from competent human work.&lt;/p&gt;
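&lt;p&gt;The mechanism can be sketched in miniature. The toy below is not a real language model – the vocabulary and the raw scores (logits) for the context are invented for illustration – but it shows the bare shape of next-token prediction: scores over candidate tokens, a softmax to turn them into probabilities, and a decoding rule that picks the continuation:&lt;/p&gt;

```python
import math

# Toy next-token predictor. NOT a real LLM: the vocabulary and the logits
# (raw scores) for the context "The cat sat on the" are invented here.
# A real model computes logits over a vocabulary of ~100k tokens.
def softmax(logits):
    """Turn raw scores into a probability distribution summing to 1."""
    m = max(logits.values())                                  # for stability
    exps = {tok: math.exp(v - m) for tok, v in logits.items()}
    total = sum(exps.values())
    return {tok: e / total for tok, e in exps.items()}

logits = {"mat": 4.1, "sofa": 2.3, "moon": -1.0}   # hypothetical scores
probs = softmax(logits)
next_token = max(probs, key=probs.get)  # greedy decoding: most likely token
```

&lt;p&gt;Everything a model "says" is produced by repeating this step, one token at a time, with the chosen token appended to the context before the next prediction.&lt;/p&gt;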

&lt;p&gt;The mechanism is statistical pattern-matching over vast quantities of human-generated text. During training, the model learns associations between language patterns — associations rich enough to capture something resembling understanding of meaning, style, argumentation, and domain knowledge. When the output looks like good legal reasoning, elegant code, or sound medical advice, it is drawing on these learned associations, and the results can be remarkable. But without that training on human-produced knowledge, they are themselves incapable of anything. They merely reflect us back at ourselves, albeit very cleverly.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Syntax-Semantics Gap: Symbols vs. Reality
&lt;/h3&gt;

&lt;p&gt;It is vital to remember that these models operate entirely within the realm of syntax: pure text. They have no understanding of what the text means. They are extraordinarily sophisticated at manipulating symbols — words, numbers and code — based on their statistical relationships to other symbols. These relationships have been defined by humans, not machines. They possess no "grounded" model of the physical or causal world. They are, in essence, merely shuffling paper.&lt;/p&gt;

&lt;p&gt;If you ask an "AI" to describe the trajectory of a falling glass, it is unable to simulate gravity or calculate the structural integrity of the floor. It will merely predict the most likely words a human would use to describe that event. This means they lack what engineers call &lt;a href="https://en.wikipedia.org/wiki/Causal_reasoning" rel="noopener noreferrer"&gt;Causal Reasoning&lt;/a&gt;. Because they don't understand why things happen in the real world — only how people talk about them — they are prone to "&lt;a href="https://en.wikipedia.org/wiki/Hallucination_(artificial_intelligence)" rel="noopener noreferrer"&gt;hallucinations&lt;/a&gt;" that are syntactically perfect but physically or logically impossible. They don't even know if they're right or wrong.&lt;/p&gt;

&lt;h3&gt;
  
  
  How Modern Systems Create the Appearance of Intention
&lt;/h3&gt;

&lt;p&gt;Modern AI systems layer additional capabilities on top of this base. &lt;a href="https://en.wikipedia.org/wiki/Instruction_tuning" rel="noopener noreferrer"&gt;Instruction tuning&lt;/a&gt; teaches the model to follow directions. Tool use lets it interact with search engines, code interpreters, and databases. &lt;a href="https://en.wikipedia.org/wiki/Intelligent_agent" rel="noopener noreferrer"&gt;Agent architectures&lt;/a&gt; decompose complex instructions into sub-tasks, execute them sequentially, evaluate intermediate results, and iterate. Feedback from human evaluators shapes the model's behaviour toward outputs that people judge to be helpful and accurate.&lt;/p&gt;

&lt;p&gt;The result is behaviour that strongly resembles intentional action. When an agent receives a complex brief, breaks it into steps, executes each step, checks its own work, and revises its approach, the resemblance to purposeful human work is close enough that many might mistake it — understandably — for "judgment" and "taste."&lt;/p&gt;

&lt;p&gt;This resemblance is worth acknowledging honestly. These systems &lt;em&gt;behave as if&lt;/em&gt; they have goals, and for many practical purposes the distinction between "has a goal" and "behaves as if it has a goal" may seem immaterial. If the output is good, does it matter why? We should concede the functionalist point here: in terms of &lt;em&gt;impact&lt;/em&gt;, a system that behaves with perfect agency is indistinguishable from one that possesses it. If an AI agent autonomously completes a week-long project, the economic and social consequences are the same whether or not there is "&lt;a href="https://en.wikipedia.org/wiki/What_Is_It_Like_to_Be_a_Bat%3F" rel="noopener noreferrer"&gt;something it is like&lt;/a&gt;" (a "&lt;a href="https://en.wikipedia.org/wiki/Qualia" rel="noopener noreferrer"&gt;quale&lt;/a&gt;") to be that agent. Distinguishing between capability and intention is enormously important for understanding the source of risk and for designing governance — but it does not diminish the scale of disruption.&lt;/p&gt;

&lt;p&gt;But three things are worth noting. First, many researchers working on long-horizon planning, robustness and grounded world-models view LLMs as incomplete foundations for AGI, even while acknowledging their economic impact. Second, we do not need true AI for these tools to be extremely useful. Third, performance is uneven: these systems can look astonishing on unfamiliar ground and mediocre on your home turf. People are typically greatly impressed by LLMs' pronouncements on unfamiliar topics, but mightily unimpressed by their output in subjects where the user is expert. We must use these tools carefully, without overstating what they are and without worrying that rapid task compression automatically implies near-term wholesale role deletion. We are still useful.&lt;/p&gt;

&lt;h3&gt;
  
  
  Where the Direction Comes From
&lt;/h3&gt;

&lt;p&gt;The goals, evaluation criteria, and decisions about what to ask an AI to build next all originate with human researchers, human institutions, and human capital allocation. The system pursues whatever goal it has been given and, as of early 2026, that goal always comes from outside: from a human who has a need. LLMs are not capable of "self-generating thought" or "self-direction".&lt;/p&gt;

&lt;p&gt;A thermostat maintains temperature — without wanting warmth. Evolution produces adaptation — without wanting survival. &lt;a href="https://en.wikipedia.org/wiki/Gradient_descent" rel="noopener noreferrer"&gt;Gradient descent&lt;/a&gt; reduces loss — without wanting improvement. In each case there is directionality, optimisation, and behaviour that looks purposeful. Yet in each case the direction is supplied by the structure of the system, rather than by any internal experience of desire. The system has no awareness of what it is doing. It is entirely mechanical. But — and this is important — a thermostat connected to a global energy grid with a bug in its optimisation function can freeze a city without ever wanting to. Scale and connectivity transform the consequences of mechanical systems, even absent intention. The same applies here.&lt;/p&gt;
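&lt;p&gt;The thermostat point can be made concrete in a few lines of code. This sketch (with invented temperatures and deliberately crude room physics) behaves as if it "wants" the room at 20 degrees, yet the whole of its "purpose" is a single mechanical comparison:&lt;/p&gt;

```python
# A thermostat: behaviour that looks goal-directed ("it wants the room at
# 20 degrees") produced by a purely mechanical rule. All numbers invented.
def thermostat_step(temp, setpoint=20.0):
    """Return the heater state for the current temperature. No desire involved."""
    return "heat_on" if setpoint > temp else "heat_off"

temp, history = 15.0, []
for _ in range(10):                                 # simulate a cold room
    state = thermostat_step(temp)
    history.append(state)
    temp += 1.0 if state == "heat_on" else -0.5     # crude room physics
```

&lt;p&gt;Run the loop and the temperature homes in on the setpoint – directionality and apparently purposeful behaviour, supplied entirely by the structure of the system.&lt;/p&gt;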

&lt;p&gt;Current AI systems sit in this same category. They are extraordinarily capable optimisation engines but the optimisation they perform is directed by human choices: human instructions, human-designed &lt;a href="https://en.wikipedia.org/wiki/Reinforcement_learning_from_human_feedback" rel="noopener noreferrer"&gt;reward functions&lt;/a&gt;, human-defined objectives. The direction they travel is, in every meaningful sense, set by us. Human needs, human life.&lt;/p&gt;

&lt;p&gt;Calling this "lack of intention" does not mean the systems are harmless. It means the risks come from mis-specified objectives, misuse and institutional incentives rather than from spontaneous self-motivated behaviour. The danger is human error and human recklessness, not machine volition.&lt;/p&gt;

&lt;h3&gt;
  
  
  This Matters for the Future
&lt;/h3&gt;

&lt;p&gt;If these systems were genuinely self-directing — choosing their own objectives and determining their own trajectory — then the future of AI would unfold according to its own logic, and exponential extrapolations about autonomous capability would apply.&lt;/p&gt;

&lt;p&gt;But if the trajectory depends on continued human decisions about funding, energy infrastructure, regulation and research direction, then the future is fundamentally a story about human choices, subject to all the familiar constraints that shape every other technology. There is still a human hand on the tiller.&lt;/p&gt;

&lt;p&gt;We should be candid, though, about the limits of that reassurance. A hand on the tiller matters less if the vessel is a supertanker that takes five miles to turn. Market pressure, military competition, the sunk-cost logic of trillion-dollar infrastructure investments, and the sheer speed at which AI accelerates its own development cycle all create structural momentum that may prove stronger than any individual act of steering. The intention may be ours, but the current is powerful, and it has its own dynamics. Saying "humans are in control" is true today. Whether it remains practically true if development velocity continues to increase depends on whether the machines do develop the capability for self-direction and self-generated thought.&lt;/p&gt;

&lt;p&gt;We think the evidence strongly favours this second view, and we find this encouraging, because it means the outcome is ours to shape through policy, investment, and institutional design — something that we do, rather than something that will simply be done to us.&lt;/p&gt;

&lt;h3&gt;
  
  
  A Note on "AI Building Itself"
&lt;/h3&gt;

&lt;p&gt;Shumer highlights OpenAI's statement that GPT-5.3 Codex "was instrumental in creating itself," and correctly identifies this as a pivotal moment. We'd offer a supporting insight.&lt;/p&gt;

&lt;p&gt;In concrete terms, OpenAI's engineers used the model as a tool during its own development — to debug training runs, manage deployment, and diagnose test results. This is impressive and represents a genuine acceleration of research productivity.&lt;/p&gt;

&lt;p&gt;We'd frame it as a sophisticated instance of &lt;a href="https://en.wikipedia.org/wiki/Bootstrapping_(compilers)" rel="noopener noreferrer"&gt;bootstrapping&lt;/a&gt; — a practice that has existed in computing since the 1960s, where a tool is used to build the next version of itself. The C compiler has been compiled by earlier versions of itself for decades. Bootstrapping is recursive and creates a real productivity loop, but what it creates is accelerated human-directed development rather than autonomous self-improvement.&lt;/p&gt;

&lt;p&gt;It's also related to the concept of "&lt;a href="https://en.wikipedia.org/wiki/Eating_your_own_dog_food" rel="noopener noreferrer"&gt;dogfooding&lt;/a&gt;" whereby developers of Windows at Microsoft were made to use the unfinished product they were building, to accelerate the ironing out of kinks. The developers both used and improved their own product.&lt;/p&gt;

&lt;p&gt;These ideas are important because "AI creating itself" carries a strong implication of agency: of endogenous desire, decision-making and action. We offer this description in the interest of precision: human researchers used a powerful AI tool to accelerate the human-directed process of building the next AI system. That is a remarkable productivity story, and worth understanding on those terms. But it is not truly an AI creating itself. It is an AI being used as a tool to assist in creating itself, under human direction and supervision.&lt;/p&gt;




&lt;h2&gt;
  
  
  IV. Some Cautions About Prediction
&lt;/h2&gt;

&lt;p&gt;We share the view that change is accelerating, and we think the direction of travel is clear. We'd offer some additional considerations that may help with forecasts.&lt;/p&gt;

&lt;h3&gt;
  
  
  Task Complexity Changes Character as It Grows
&lt;/h3&gt;

&lt;p&gt;The &lt;a href="https://metr.org/" rel="noopener noreferrer"&gt;METR&lt;/a&gt; data on AI task completion is genuinely impressive: from ten-minute tasks a year ago to five-hour expert-level tasks with recent models, with the doubling time apparently accelerating. This can be extrapolated forward: day-long tasks within a year, week-long within two, month-long projects within three.&lt;/p&gt;

&lt;p&gt;We add that five-hour tasks and month-long projects differ qualitatively. A month-long project involves shifting requirements, stakeholder dynamics, ambiguity that can only be resolved through human conversation, political context within organisations, and the maintenance of coherent purpose over horizons that exceed any current model's working memory by orders of magnitude (if the model can be said to have one at all). These are features of the task environment rather than the cognitive difficulty of the task itself, and they may respond to scaling in different and less predictable ways.&lt;/p&gt;

&lt;p&gt;This is a reason for care rather than scepticism: progress may continue impressively, but along a different curve than straight-line extrapolation from bounded-task benchmarks suggests.&lt;/p&gt;

&lt;h3&gt;
  
  
  Institutional Dynamics Will Shape the Pace of Adoption
&lt;/h3&gt;

&lt;p&gt;There is a recurring pattern in technology history: capability arrives suddenly but adoption, implementation and deployment take time, because they depend on institutional readiness. Institutions need to manage change carefully and avoid discontinuities, and that caution prevents them from moving at the pace of the technology.&lt;/p&gt;

&lt;p&gt;Regulatory frameworks for AI in high-stakes domains — medicine, law, finance — are being developed but remain immature. In particular, liability for AI-generated errors is legally unsettled; professional licensing regimes will take time to adapt. Organisational procurement, integration and change management are substantial even when the technology works perfectly. These factors suggest adoption will be uneven — faster in some sectors, much slower in others, and everywhere shaped by realities that operate on their own timescales regardless of how rapidly the models improve.&lt;/p&gt;

&lt;h3&gt;
  
  
  Task Compression and Role Elimination Are Different Things
&lt;/h3&gt;

&lt;p&gt;Most knowledge-work roles are bundles of tasks. A lawyer's job includes reading documents, drafting arguments, advising clients, negotiating, managing staff, building relationships, exercising judgment in ambiguous situations, and bearing personal legal accountability. AI is becoming very good at some of these tasks — but the bundle includes components that are relationship-based, political and accountability-bearing in ways that do not give way to automation in any way that we can currently envision.&lt;/p&gt;

&lt;p&gt;When some tasks are automated, the role changes rather than disappears: headcounts will be redistributed, required skills will shift and the value of remaining human contributions will be recalibrated. But we should acknowledge the tipping-point risk honestly: if AI absorbs enough of the task bundle — say, 70% or 80% of what a junior associate or analyst currently does — the remaining tasks (judgment, relationships, accountability) may not sustain the previous headcount or salary floor. Roles may persist in name while the number of people employed in them falls sharply. The distinction between "transformed" and "eliminated" is real, but it can be cold comfort to the people on the wrong side of a headcount reduction. We should all be prepared: virtually every knowledge-work role will be reshaped over the coming years, unevenly, by industry, organisation and jurisdiction – even if models do not improve any further, and even if we never reach true Artificial General Intelligence. These new tools are already powerful enough to bring vast changes. The picture will be messy, and it will unfold unevenly and in unexpected ways. But that is how economic transformation has always looked from the inside. Revolutions are nerve-wracking and exhausting – but generally work themselves out.&lt;/p&gt;




&lt;h2&gt;
  
  
  V. What We Actually Believe
&lt;/h2&gt;

&lt;p&gt;We believe this is the most significant expansion of cognitive automation in modern history and that the process has a long way yet to run. These systems are genuinely useful, increasingly capable and already changing the working reality of millions of knowledge workers. Anyone who has yet to engage with them seriously is leaving value and preparation time on the table – and maybe also themselves. Better to be at the table than on it.&lt;/p&gt;

&lt;p&gt;We also believe the systems people insist on calling "AI" are best understood as powerful optimisation engines attached to human goals — operating within architectures designed by human engineers, trained on human-generated data and deployed according to human decisions. They are tools of extraordinary and growing sophistication and the direction they travel is set by us.&lt;/p&gt;

&lt;p&gt;This means the future of AI is fundamentally a story about human choices — about funding, regulation, institutional design and the wisdom with which we integrate these tools into our economies. This perspective gives grounds for agency rather than fatalism because the outcome remains ours to shape.&lt;/p&gt;

&lt;p&gt;Shumer's practical advice is largely sound and we echo much of it: learn these tools, experiment regularly, build financial resilience, help your children develop adaptability rather than optimising for career paths that may shift beneath them. We offer this addition to his counsel: cultivate the habit of asking, with precision, what these systems actually are and how they work. The better your mental model of the technology, the better your decisions about when to trust it, when to verify, when to delegate and when to insist on human judgment.&lt;/p&gt;

&lt;p&gt;As a concrete operating model: work as if these tools are junior staff with amnesia and no accountability. Delegate to them freely — drafting, exploring, prototyping, triaging — but keep humans on the hook for correctness and responsibility. Verify what they produce. Treat their output as a first pass, not a final answer. (A fair caveat: the "amnesia" part of this metaphor is eroding. Context windows now reach millions of tokens, and retrieval-augmented architectures give models functional memory across long projects. The metaphor still holds for accountability — these systems have no stake in outcomes — but for continuity of context, the gap is narrowing fast. Adjust accordingly.) This is how the most effective practitioners already work with them, and it maps directly onto the capability-without-intention distinction: these systems will do exactly what you point them at, with impressive competence, but they have no stake in whether the result is right. You do.&lt;/p&gt;




&lt;h2&gt;
  
  
  VI. The New Moat: Competing After the Levelling
&lt;/h2&gt;

&lt;p&gt;If the "Great Levelling" makes enterprise-grade technical capability a commodity, it raises a survival question for the individual and the firm: when everyone has access to the same "super-junior staff," what constitutes a competitive advantage?&lt;/p&gt;

&lt;p&gt;When the cost of production trends toward zero, the value shifts entirely to the things that cannot be automated or replicated by statistical pattern-matching. We identify four "New Moats" that will define professional success in the post-levelling era.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. The Human Premium: Accountability and Trust&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;In a world flooded with AI-generated content and code, provenance becomes a luxury good. A machine can produce a legal brief, but it cannot go to jail for it. It can suggest a medical diagnosis, but it cannot lose its licence or feel the weight of a patient's life.&lt;/p&gt;

&lt;p&gt;The new moat is verified human accountability. Clients will pay a premium not for the "work" (which is now cheap), but for the "signature" (which remains expensive). Success will belong to those who cultivate a reputation for being the "Adult in the Room" — the person who takes the risk, provides the guarantee, and stands behind the output.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. High-Context Relationships and "Political" Intelligence&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AI is excellent at solving puzzles but poor at navigating "politics" — the complex, unstated web of human incentives, egos, and history that governs every large organisation.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Tacit knowledge:&lt;/strong&gt; knowing what the CEO actually cares about, even when it contradicts the formal brief.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Stakeholder synthesis:&lt;/strong&gt; convincing three different departments with conflicting interests to move in the same direction.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The new moat is the ability to operate in high-ambiguity human environments where the most important data isn't in the database, but in the subtext of a meeting or the "vibe" of a partnership.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Curation, Taste, and the "Editor-in-Chief" Mindset&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;When production is throttled by human effort, "more" is a strategy. When production is infinite and instant, "more" is noise.&lt;/p&gt;

&lt;p&gt;As we move from a world of scarcity to a world of glut, the value shifts from the creator to the curator. The new moat is Taste — the ability to look at ten AI-generated variations of a product, a strategy, or a design and know which one will actually resonate with a human audience. We are moving from a world of "Content Creators" to a world of "Content Editors," where the primary skill is the discernment to say "no" to 99% of what the machine produces.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Integration and Systems Thinking&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The Great Levelling provides everyone with powerful "Lego bricks," but it doesn't provide the manual for the castle.&lt;/p&gt;

&lt;p&gt;As AI handles the mechanical tasks (writing the function, drafting the clause), the human role becomes the Architect. The new moat is Systems Thinking: the ability to see how disparate pieces of technology, law, and business logic fit together into a coherent whole. While the AI focuses on the task, the human must focus on the outcome. Those who can bridge the "Wisdom Gap" by understanding how a single automation might break a larger system will be the ones who lead.&lt;/p&gt;

&lt;p&gt;Because AI lacks a model of the real world, it is fundamentally incapable of Systems Thinking. It can optimise a single line of code or a specific paragraph, but it cannot foresee the "ripple effects" that a change might have on a complex, interconnected system.&lt;/p&gt;

&lt;p&gt;A model can suggest a brilliant tax-optimization strategy (symbolic manipulation), but it cannot intuitively grasp how that strategy might alienate a specific regulator or trigger a sequence of unintended legal consequences (causal reality). The new moat is the ability to bridge this World-Model Gap. Humans must provide the "grounding" — the intuitive understanding of how the digital output will collide with the messy, physical, and political reality of the world. The machine provides the parts; the human provides the Causal Architecture.&lt;/p&gt;




&lt;h2&gt;
  
  
  VII. Closing
&lt;/h2&gt;

&lt;p&gt;Today's global conversation about AI enjoys no shortage of urgency, but it is largely driven by fear. The greatest fear of all is fear of the unknown, and for most people AI remains exactly that. This can be addressed with more precision. Even if what we have today falls short of true artificial intelligence in the fullest sense, these technologies are real. The changes they are bringing to knowledge work are significant and accelerating, and will continue for the foreseeable future. We need to engage early and thoughtfully, in order to collectively shape outcomes rather than merely react to them.&lt;/p&gt;

&lt;p&gt;Understand what these systems are: extraordinary tools for cognitive automation, directed by human purpose. Understand also that they have yet to become systems with genuine autonomy, goals and self-direction. Until that capability arrives, the gap between "here" and "yet to arrive" is where sensible preparation and constructive public conversation need to take place.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1816tvlpalzqkzhufpa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fk1816tvlpalzqkzhufpa.png" alt="The Great Levelling framework" width="800" height="459"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Something big is indeed happening. It deserves precision about what it is and what it is not.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Shumer is right: something big &lt;strong&gt;is&lt;/strong&gt; happening.&lt;/li&gt;
&lt;li&gt;It is the most significant expansion of cognitive automation in history — but not yet the emergence of true artificial intelligence.&lt;/li&gt;
&lt;li&gt;Large language models do not plan, intend or "know where they are going." They generate one token at a time, producing locally plausible continuations that can resemble reasoning without being goal-directed cognition.&lt;/li&gt;
&lt;li&gt;These systems have no goals, no intention and no self-direction. They are powerful optimisation engines whose direction is set entirely by human choices.&lt;/li&gt;
&lt;li&gt;Unless and until that changes, the future remains shaped by funding, regulation, infrastructure and institutional decisions — not machine inevitability.&lt;/li&gt;
&lt;li&gt;The barriers to building software and analysis have collapsed. Enterprise-grade capability is now available to small firms and sole practitioners.&lt;/li&gt;
&lt;li&gt;When the cost of a general-purpose capability falls, usage tends to expand rather than contract. Computing's unit price has fallen for decades while its total economic value has risen. AI is likely to follow the same pattern.&lt;/li&gt;
&lt;li&gt;Expect more software, more automation and more embedded computation — not less.&lt;/li&gt;
&lt;li&gt;Learn these tools. Use them daily. Treat them as highly capable junior staff with limited memory and no accountability.&lt;/li&gt;
&lt;li&gt;Keep humans on the hook for correctness, judgment and responsibility.&lt;/li&gt;
&lt;li&gt;Engage seriously — and be precise about what this is: transformative automation, not yet Artificial General Intelligence.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
    </item>
    <item>
      <title>Never heard of microcode? Nope, us neither. But it's really important.</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Fri, 30 Jan 2026 12:47:38 +0000</pubDate>
      <link>https://forem.com/dimension-ai/never-heard-of-microcode-nope-us-neither-but-its-really-important-14j8</link>
      <guid>https://forem.com/dimension-ai/never-heard-of-microcode-nope-us-neither-but-its-really-important-14j8</guid>
      <description>&lt;p&gt;Imagine you've been programming since the 1980s. Maybe you were a bedroom coder on a ZX Spectrum, then had a career writing BASIC and then Visual BASIC before finally transitioning to C#. Or if you were in the USA, maybe you learned on an Apple II and then learned assembly, Pascal, C, C++ and everything that came after. Four decades of programming, three for a living.&lt;/p&gt;

&lt;p&gt;If you are such a person you will likely have never heard the word "microcode".&lt;/p&gt;

&lt;p&gt;Far from being a confession of ignorance, our own surprise at never having heard of it turns out to be widespread. And many who've heard the term have never looked any further.&lt;/p&gt;

&lt;p&gt;Most professional programmers – even very senior ones – do not know about microcode in any concrete sense, for the simple reason that most never needed to.&lt;/p&gt;

&lt;p&gt;Microcode sits below the abstraction of assembler instructions, which is usually the limit of what programmers care about. If your good old ANSI C compiled to good-looking assembler, you stopped looking any further. Microcode was designed to be invisible and – for decades – it succeeded.&lt;/p&gt;

&lt;p&gt;That is now changing. Security vulnerabilities, performance limits and the sheer complexity of modern processors have forced microcode into view. If you write software today, you probably still don't need to understand microcode in detail; but you should know it exists, what it does and why it suddenly matters.&lt;/p&gt;

&lt;h2&gt;
  
  
  What microcode actually is
&lt;/h2&gt;

&lt;p&gt;Every CPU implements an Instruction Set Architecture (ISA) - the set of instructions that software can use. x86, ARM, RISC-V: these are ISAs. When you write assembly language, or when a compiler generates machine code, the result is a sequence of ISA instructions.&lt;/p&gt;

&lt;p&gt;Microcode is the layer below.&lt;/p&gt;

&lt;p&gt;Inside the CPU, some instructions are simple enough to execute directly in hardware. An integer addition, for instance, can be wired straight through: operands in, result out - done in a single cycle.&lt;/p&gt;

&lt;p&gt;Other instructions, however, are more complex. They involve multiple internal steps, conditional behaviour, memory accesses, flag updates and corner cases. Implementing all of that in pure hardware would be expensive and inflexible.&lt;/p&gt;

&lt;p&gt;Microcode provides an alternative. Instead of hardwiring every instruction, the CPU contains a small internal control program that orchestrates the hardware for complex operations. When the CPU encounters a microcoded instruction, it fetches a sequence of micro-operations from an internal store and executes them in order.&lt;/p&gt;

&lt;p&gt;Think of it as firmware for the instruction decoder. The ISA defines what the CPU promises to do. Microcode defines how it actually does it.&lt;/p&gt;

&lt;p&gt;The extent of microcode use varies by architecture:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;x86&lt;/em&gt; relies heavily on microcode because its instruction set is large, irregular and burdened with decades of backward compatibility.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;ARM&lt;/em&gt; cores vary widely – many are predominantly hardwired, while high-performance implementations such as Apple's M-series or ARM's Cortex-X designs use microcode-like structures to varying degrees.&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;RISC-V&lt;/em&gt; implementations tend toward hardwired control, though complex extensions may introduce microcode.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Why microcode exists
&lt;/h2&gt;

&lt;p&gt;Microcode originated in the 1950s as a way to simplify CPU design. Rather than creating custom hardware for every instruction, engineers could write microcode sequences that reused a common datapath. This made CPUs cheaper to design, easier to debug and simpler (and therefore cheaper) to modify.&lt;/p&gt;

&lt;p&gt;By the 1960s, microcode had become central to computer architecture. IBM's System/360, launched in 1964, used microcode extensively. This allowed IBM to sell machines with different performance characteristics – different hardware implementations – while maintaining a single ISA across the product line. Software written for one System/360 model would run on another. Microcode made that possible. It was a big deal.&lt;/p&gt;

&lt;p&gt;The pattern persists. x86 has survived for over forty years partly because microcode allows Intel and AMD to implement the same ancient instructions on radically different internal architectures. The 8086 of 1978 and a modern Zen 5 core both execute &lt;code&gt;REP MOVSB&lt;/code&gt;. The microcode behind that instruction has been rewritten many times.&lt;/p&gt;

&lt;p&gt;Modern microcode also serves as a post-silicon patching mechanism. Once a chip is fabricated, the silicon cannot be changed; but microcode can be updated. Operating systems and firmware routinely load microcode patches at boot time to fix bugs, close security holes and adjust behaviour. The physical chip stays the same; the control logic changes.&lt;/p&gt;

&lt;h2&gt;
  
  
  A concrete example
&lt;/h2&gt;

&lt;p&gt;Consider the x86 instruction &lt;code&gt;REP MOVSB&lt;/code&gt;. In assembly, it looks like a single operation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;REP MOVSB
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The architectural specification says: copy RCX bytes from the address in RSI to the address in RDI, incrementing both pointers and decrementing RCX with each byte, until RCX reaches zero.&lt;/p&gt;

&lt;p&gt;That is a lot of work for "one instruction." Internally, it involves:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Loading a byte from memory&lt;/li&gt;
&lt;li&gt;Storing it to a different memory location&lt;/li&gt;
&lt;li&gt;Incrementing RSI&lt;/li&gt;
&lt;li&gt;Incrementing RDI&lt;/li&gt;
&lt;li&gt;Decrementing RCX&lt;/li&gt;
&lt;li&gt;Checking whether RCX is zero&lt;/li&gt;
&lt;li&gt;Branching back if not&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;None of this is visible at the ISA level. The programmer sees one instruction. The CPU sees a microcode sequence, something like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;loop:
  load byte from [RSI]
  store byte to [RDI]
  RSI++
  RDI++
  RCX--
  if RCX != 0, jump loop
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Modern implementations are more sophisticated – they may copy multiple bytes per iteration, use vector registers, or special-case aligned transfers – but the principle holds. Microcode makes the architectural fiction of "one instruction" hold together.&lt;/p&gt;
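&lt;p&gt;The micro-op loop above can be sketched in ordinary code. The following Python is purely illustrative – real microcode is proprietary, hardware-level control logic, not software in this sense, and all the names here are ours – but it captures how a single architectural instruction expands into a sequence of smaller steps:&lt;/p&gt;

```python
# Illustrative only: modelling how one architectural instruction (REP MOVSB)
# expands into a loop of micro-operations. Real microcode is proprietary
# state-machine control inside the CPU; this merely mirrors its structure.

def rep_movsb(memory, rsi, rdi, rcx):
    """Execute the micro-op sequence behind the single REP MOVSB instruction."""
    while rcx != 0:
        memory[rdi] = memory[rsi]   # micro-ops: load byte from [RSI], store to [RDI]
        rsi += 1                    # micro-op: advance the source pointer
        rdi += 1                    # micro-op: advance the destination pointer
        rcx -= 1                    # micro-op: decrement the count register
    return rsi, rdi, rcx

memory = bytearray(b"hello.....")
rep_movsb(memory, 0, 5, 5)          # copy 5 bytes from offset 0 to offset 5
print(bytes(memory))                # b'hellohello'
```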

&lt;h2&gt;
  
  
  Why most programmers never encountered it
&lt;/h2&gt;

&lt;p&gt;If microcode has existed since the 1950s, why have most programmers never heard of it?&lt;/p&gt;

&lt;p&gt;Three reasons.&lt;/p&gt;

&lt;p&gt;First, microcode's place in the abstraction stack is awkward. Programming education typically covers high-level languages, then perhaps assembly, then maybe pipelines, caches and branch prediction. Microcode sits below the ISA but above transistors – a layer that courses tend to mention briefly, if at all, then move past.&lt;/p&gt;

&lt;p&gt;Second, microcode is intentionally invisible. CPU vendors treat it as proprietary. Intel and AMD do not publish microcode documentation. You cannot call microcode from software. You cannot observe it in a debugger. You cannot disassemble it (legally, at least). If something is undocumented, inaccessible and unobservable, it tends to disappear from working knowledge. Its low level of recognition is a sign of its success.&lt;/p&gt;

&lt;p&gt;Third, for most of computing history, microcode simply did not matter for application programming. Performance problems were algorithmic and bugs were logical. Portability issues lived in languages and operating systems. The hardware was a black box that honoured its documented interface and that was sufficient.&lt;/p&gt;

&lt;p&gt;Microcode only intrudes when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Instructions misbehave in ways the ISA does not explain&lt;/li&gt;
&lt;li&gt;Timing side-channels reveal internal implementation details&lt;/li&gt;
&lt;li&gt;A "hardware bug" gets fixed by a "software update"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For most programmers, those situations never arose.&lt;/p&gt;

&lt;h2&gt;
  
  
  Historical irony
&lt;/h2&gt;

&lt;p&gt;Here is an odd fact: microcode was more widely discussed in the 1960s and 1970s than in the 1990s and 2000s.&lt;/p&gt;

&lt;p&gt;IBM's System/360 made microcode famous. DEC used it heavily in the PDP-11 and VAX lines. Some machines – Xerox Alto, certain Burroughs systems – even exposed writable microcode, allowing users to define new instructions. Dangerous, but fascinating. Malware authors can only dream.&lt;/p&gt;

&lt;p&gt;Then the RISC (Reduced Instruction Set Computing) revolution promised that simpler instructions, executed faster, would outperform complex microcoded ones. The slogan was "hardwire everything." Microcode was derided as a relic of the CISC (Complex Instruction Set Computing) past.&lt;/p&gt;

&lt;p&gt;Behind the rhetoric lay genuine engineering reality. Early RISC machines – MIPS, SPARC, early ARM – were indeed largely hardwired, and performance improved. The argument seemed vindicated.&lt;/p&gt;

&lt;p&gt;But x86 survived. Intel and AMD responded not by abandoning microcode but by hiding it more effectively. Modern x86 chips translate complex ISA instructions into internal micro-operations, execute those out of order across multiple pipelines and present the illusion of sequential execution. The microcode is still there. It is just buried under so many layers of complexity that even CPU architects sometimes struggle to explain exactly what is happening.&lt;/p&gt;

&lt;p&gt;Meanwhile, the 1980s home computer generation – people who learned on the ZX Spectrum, Commodore 64, BBC Micro, Apple II – grew up with machines whose control logic was fixed and invisible. The 6502 famously had no microcode at all; its control logic was hand-drawn. The Z80's instruction sequencing was likewise baked into hardware, entirely invisible to programmers and irrelevant to how you wrote software. Either way, there was nothing to notice, so nothing to know about.&lt;/p&gt;

&lt;p&gt;A whole generation of programmers came up without ever needing to know.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why microcode matters again
&lt;/h2&gt;

&lt;p&gt;In January 2018, the Spectre and Meltdown vulnerabilities became public. These were not software bugs but flaws in how modern CPUs speculatively execute instructions – flaws that allowed attackers to read memory they should not have been able to access.&lt;/p&gt;

&lt;p&gt;The response involved operating system patches, compiler changes and – famously – microcode updates.&lt;/p&gt;

&lt;p&gt;Intel, AMD and ARM shipped new microcode that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Modified branch prediction behaviour&lt;/li&gt;
&lt;li&gt;Inserted serialisation barriers&lt;/li&gt;
&lt;li&gt;Changed how speculative execution interacts with memory protection&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Without changing the silicon of chips that were already in computers around the world, the microcode was updated and behaviour changed.&lt;/p&gt;

&lt;p&gt;This made microcode visible in a way it had not been for decades. "We fixed the CPU with a software update" is a sentence that only makes sense if you understand that CPU behaviour is partly defined by mutable control logic.&lt;/p&gt;

&lt;p&gt;In the years after Spectre and Meltdown there were many more such incidents:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Foreshadow (L1 Terminal Fault)&lt;/li&gt;
&lt;li&gt;MDS (Microarchitectural Data Sampling)&lt;/li&gt;
&lt;li&gt;TAA (TSX Asynchronous Abort)&lt;/li&gt;
&lt;li&gt;Retbleed&lt;/li&gt;
&lt;li&gt;Downfall&lt;/li&gt;
&lt;li&gt;Inception&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each required microcode mitigations, and each exposed a little more of the gap between architectural promises and microarchitectural reality.&lt;/p&gt;

&lt;h2&gt;
  
  
  What this says about modern computing
&lt;/h2&gt;

&lt;p&gt;The traditional conception is that hardware is fixed and software is mutable. You design a chip, fabricate it and its behaviour is set. Software runs on top and can be changed at will.&lt;/p&gt;

&lt;p&gt;Microcode breaks that conception: a CPU is not a fixed hardware object. Its behaviour is determined at three levels:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Architectural&lt;/strong&gt;: defined by the ISA specification&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Microarchitectural&lt;/strong&gt;: determined by the physical implementation&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Policy-driven&lt;/strong&gt;: controlled by microcode that can be updated&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This continues the working model of mainframes from the 1960s – but its security implications are new: mutable microcode becomes an attack surface; and when it defines security boundaries, microcode bugs become security vulnerabilities.&lt;/p&gt;

&lt;p&gt;CPU vendors now publish microcode updates regularly. Linux distributions ship them and Windows Update delivers them. Your BIOS may load them before the operating system even starts. The CPU you are using now is not quite the CPU you bought.&lt;/p&gt;
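&lt;p&gt;You can observe this on Linux, where each logical CPU reports its currently loaded microcode revision in /proc/cpuinfo. Here is a minimal sketch of extracting those lines – the revision value in the sample is made up for illustration; point the function at the real file on an actual machine:&lt;/p&gt;

```python
# A minimal sketch: pulling the reported microcode revision out of text in
# the format Linux exposes via /proc/cpuinfo. The sample revision below is
# invented; on a real Linux box, pass open("/proc/cpuinfo").read() instead.

def microcode_revisions(cpuinfo_text):
    """Return the set of microcode revision strings found in the text."""
    revisions = set()
    for line in cpuinfo_text.splitlines():
        if line.startswith("microcode"):
            # lines look like: "microcode\t: 0x2b000603"
            revisions.add(line.split(":")[1].strip())
    return revisions

sample = "processor\t: 0\nmicrocode\t: 0x2b000603\nprocessor\t: 1\nmicrocode\t: 0x2b000603\n"
print(microcode_revisions(sample))   # {'0x2b000603'}
```

A fleet where this set has more than one element is running different control logic on identical silicon.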

&lt;p&gt;This follows naturally from complexity. Modern CPUs are so complex – billions of transistors, speculative execution, out-of-order pipelines, multiple cache levels, simultaneous multithreading – that getting everything right in silicon is perhaps now impossible. Microcode provides a route to fixes without unpopular hardware upgrades: a way to correct mistakes after the fact, to adjust trade-offs and to respond to threats that were not anticipated during design.&lt;/p&gt;

&lt;h2&gt;
  
  
  Reframing the original surprise
&lt;/h2&gt;

&lt;p&gt;So if you have been programming for decades and only recently learned about microcode, that does not indicate a gap in your education or a failure of curiosity. It means you worked above an abstraction boundary that mostly held.&lt;/p&gt;

&lt;p&gt;This is how successful design manifests. Abstraction exists so that programmers can ignore lower layers. For most of computing history, ignoring microcode was the correct choice: it let you focus on problems that actually mattered for your work.&lt;/p&gt;

&lt;p&gt;We are now, however, in a transition: hardware is no longer just fixed silicon but patchable logic. Not fully – most programmers still do not need to understand microcode in detail – but enough that awareness matters.&lt;/p&gt;

&lt;h2&gt;
  
  
  Closing thoughts
&lt;/h2&gt;

&lt;p&gt;Microcode was always there. For most of us, for most of our careers, we did not need to know. The boundary between "software" and "hardware" was always a little more porous than programmers came to believe, but for practical purposes it held. Security research, performance engineering and the sheer complexity of modern processors have now eroded it – and sometimes we need to understand where one ends and the other begins.&lt;/p&gt;

&lt;p&gt;If you write software that cares about security, performance or correctness at the edges, you should know that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The CPU is not a fixed machine; it runs updateable control code&lt;/li&gt;
&lt;li&gt;Microcode updates can change behaviour in ways that affect your software&lt;/li&gt;
&lt;li&gt;The ISA is a contract, but the implementation beneath it is mutable&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The illusion of a fixed ISA is still useful. But there is a lot going on beneath it that you occasionally need to know about.&lt;/p&gt;




&lt;p&gt;Further reading&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Agner Fog's microarchitecture manuals &lt;a href="https://www.agner.org/optimize/microarchitecture.pdf" rel="noopener noreferrer"&gt;https://www.agner.org/optimize/microarchitecture.pdf&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Intel's optimisation reference: &lt;a href="http://www.intel.com/content/www/us/en/developer/articles/technical/intel-sdm.html" rel="noopener noreferrer"&gt;www.intel.com/content/www/us/en/developer/articles/technical/intel-sdm.html&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Academic literature on Spectre-class vulnerabilities, e.g. &lt;a href="https://css.csail.mit.edu/6.858/2023/readings/spectre-meltdown.pdf" rel="noopener noreferrer"&gt;https://css.csail.mit.edu/6.858/2023/readings/spectre-meltdown.pdf&lt;/a&gt;. There's a big rabbit-hole to go down after those.&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>microchips</category>
      <category>cpu</category>
      <category>architecture</category>
    </item>
    <item>
      <title>Just what IS Python, anyway?</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Tue, 27 Jan 2026 08:38:33 +0000</pubDate>
      <link>https://forem.com/dimension-ai/just-what-is-python-anyway-7ce</link>
      <guid>https://forem.com/dimension-ai/just-what-is-python-anyway-7ce</guid>
      <description>&lt;p&gt;&lt;em&gt;A mental model for understanding Python's role&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Every mainstream language fits a mental slot. C is a systems language. JavaScript is a browser language. Rust is a safety-focused systems language. SQL is a query language.&lt;/p&gt;

&lt;p&gt;Python doesn't fit. Experienced programmers can find it hard to grasp what Python actually is and which slot to put it in. Often they conclude "I don't like Python" and express confusion at its vast popularity in the 2020s.&lt;/p&gt;

&lt;p&gt;Like the eponymous snake, Python can be hard to pin down — and like a real python, it may end up wrapped around your codebase whether you planned it or not.&lt;/p&gt;




&lt;h3&gt;
  
  
  A slippery language
&lt;/h3&gt;

&lt;p&gt;Traditionally we're taught to classify languages along a few axes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;compiled vs interpreted&lt;/li&gt;
&lt;li&gt;scripting vs "real" languages&lt;/li&gt;
&lt;li&gt;imperative vs OO vs functional&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Python fits poorly into all of them.&lt;/p&gt;

&lt;p&gt;It isn't a compiled language in the C or Rust sense: it doesn't result in a standalone executable. It supports imperative, object-oriented and functional styles but isn't optimized for any of them. It began as a scripting language, but today it's used to build large, long-running systems.&lt;/p&gt;

&lt;p&gt;So what &lt;em&gt;IS&lt;/em&gt; Python?&lt;/p&gt;




&lt;h3&gt;
  
  
  The Key Insight: Python Is Not Defined by Its Output
&lt;/h3&gt;

&lt;p&gt;The turning point is to realize that Python is &lt;strong&gt;not defined by the artefact it produces&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;C, C++, Rust, Zig and Fortran produce binaries that can be directly run. The output is the thing. Once compiled, the language largely disappears from the execution model.&lt;/p&gt;

&lt;p&gt;Python doesn't work like that.&lt;/p&gt;

&lt;p&gt;A Python program needs a runtime ecosystem to execute: the interpreter, the object model, the garbage collector and the standard library. These are not incidental. They &lt;em&gt;are&lt;/em&gt; Python. A Python program can't run standalone unless all of these components are bundled with it.&lt;/p&gt;




&lt;h3&gt;
  
  
  Python as a Runtime Ecosystem
&lt;/h3&gt;

&lt;p&gt;In structural terms, Python sits alongside other runtime-centric language ecosystems:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;C#, F#, VB.NET on .NET&lt;/li&gt;
&lt;li&gt;Java, Scala, Kotlin, Clojure on the JVM&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The similarity is architectural role: in all these cases, the runtime is the &lt;em&gt;unit of execution&lt;/em&gt;, not any compiled artefact.&lt;/p&gt;

&lt;p&gt;Strictly speaking, Python is a language specification with multiple implementations — IronPython runs on .NET, Jython on the JVM. But in practice, CPython and its C-API-dependent package ecosystem &lt;em&gt;are&lt;/em&gt; Python. That's what made Python popular, and that's what we're discussing here.&lt;/p&gt;

&lt;p&gt;The differences from .NET and the JVM matter too.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;.NET and the JVM have JIT compilation to native code; CPython does not by default.&lt;/li&gt;
&lt;li&gt;.NET and the JVM enforce static typing as part of the compilation model; Python's type hints are advisory.&lt;/li&gt;
&lt;li&gt;.NET and the JVM produce distributable artefacts (.dll, .jar) with stable binary interfaces; Python does not, which makes it harder for other systems to call into Python. Python prefers to call other libraries rather than be called itself.&lt;/li&gt;
&lt;/ul&gt;
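&lt;p&gt;The advisory nature of type hints is easy to demonstrate: CPython records annotations but never enforces them. A static checker such as mypy would reject the second call below; the interpreter runs it without complaint:&lt;/p&gt;

```python
# Type hints are advisory: CPython stores them but does not enforce them.
# A static checker (e.g. mypy) would flag the second call; CPython runs it.

def double(x: int) -> int:
    return x * 2

print(double(2))       # 4
print(double("ab"))    # abab -- the int annotation is simply ignored at runtime
```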

&lt;p&gt;So Python belongs in the "runtime ecosystem" category, but it's a looser, more dynamic variant; it trades static type-safe guarantees for flexibility and rapid development.&lt;/p&gt;

&lt;p&gt;This structural similarity doesn't fully explain Python's success — Ruby, Perl and PHP share similar characteristics but declined while Python grew. Historical contingency matters: NumPy's timing, Google's investment in TensorFlow, and early academic adoption all played roles that had little to do with language design.&lt;/p&gt;




&lt;h3&gt;
  
  
  Where Python's Nature Is Clearest: Orchestration
&lt;/h3&gt;

&lt;p&gt;Going back to our question of "What IS Python?", the key is to realize that Python is a runtime-centric language. Its nature is clearest in numerical computing, data engineering and machine learning, where Python orchestrates work rather than performing it.&lt;/p&gt;

&lt;p&gt;The most important Python libraries—NumPy, SciPy, Pandas, PyTorch, TensorFlow—are not written in Python in any meaningful sense. Python provides the API, the glue and the control flow. The heavy computation happens in C, C++, Fortran or CUDA libraries that expose a C ABI.&lt;/p&gt;

&lt;p&gt;Python performs the same role over its libraries as:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SQL over databases&lt;/li&gt;
&lt;li&gt;shell over Unix&lt;/li&gt;
&lt;li&gt;VBA over Office&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It is an orchestration language sitting above high-performance systems. That's why it thrives in scientific computing, data pipelines and machine learning. You build rapidly and easily, with simple syntax, while the underlying libraries deliver the performance. So long as orchestration overhead is low, Python-based systems can scale surprisingly far.&lt;/p&gt;
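&lt;p&gt;The standard library offers orchestration in miniature: the sqlite3 module is a thin Python wrapper around the SQLite C library. Python issues the commands; the parsing, indexing and aggregation all happen in C:&lt;/p&gt;

```python
# Orchestration in miniature: Python drives SQLite (a C library) through the
# stdlib sqlite3 wrapper. The aggregation below runs in C, not in Python.

import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE events (value REAL)")
con.executemany("INSERT INTO events VALUES (?)", [(v,) for v in (1.5, 2.5, 6.0)])
(total,) = con.execute("SELECT SUM(value) FROM events").fetchone()
print(total)    # 10.0 -- computed by SQLite's C code
con.close()
```

The same division of labour – Python as control flow, a native library as engine – is what NumPy and PyTorch do at much larger scale.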

&lt;p&gt;This is the glamorous use case — but not necessarily the most common one.&lt;/p&gt;




&lt;h3&gt;
  
  
  That's Not the Whole Story
&lt;/h3&gt;

&lt;p&gt;The orchestration model explains Python's dominance in scientific and data-heavy domains — but most Python code written globally is web apps, scripts, automation and data munging where Python &lt;em&gt;is&lt;/em&gt; doing the work directly.&lt;/p&gt;

&lt;p&gt;In web development and business applications (think Django, Flask, FastAPI), Python handles HTTP requests, processes strings and executes business logic. Here, Python trades raw performance for development speed and ecosystem breadth. A Django application will be slower than an equivalent in Go or C#, but it may ship months earlier.&lt;/p&gt;

&lt;p&gt;For these workloads, the framing is different: Python is a productive general-purpose language that prioritizes developer time over CPU time.&lt;/p&gt;




&lt;h3&gt;
  
  
  Why Python Succeeds
&lt;/h3&gt;

&lt;p&gt;Python's popularity is no mystery once you consider this trade-off.&lt;/p&gt;

&lt;p&gt;Being able to assemble things quickly, in readable code, with vast ecosystem support, matters more for mass adoption than type-safety, compilation speed or raw performance. Make something easy, and more people will do it; make something quick to do, and more people will do it, more often.&lt;/p&gt;

&lt;p&gt;Python lowers the barrier to entry for proof-of-concept and prototype work. You can validate an idea in hours rather than days. If performance becomes critical later, you can translate hot paths into a compiled language—but you've already learned what needs building.&lt;/p&gt;

&lt;p&gt;Getting something working &lt;em&gt;at all&lt;/em&gt;, quickly, turns out to be more important than getting it working fast or elegantly. Shell scripting demonstrated this in the 1970s; Visual Basic and VBA did this in the 1990s; Python demonstrates it today. Make it easy and fast to build, and they will come and build.&lt;/p&gt;

&lt;p&gt;A note of realism: "rewrite hot paths later" is technically true but economically rare. Most prototypes never get rewritten; they become production systems. This is true of any language, but Python's low barrier to entry means more prototypes get written in the first place — and more of them survive into production.&lt;/p&gt;




&lt;h3&gt;
  
  
  When Python Is the Wrong Choice
&lt;/h3&gt;

&lt;p&gt;A fair assessment requires acknowledgement of where Python doesn't belong:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Hard real-time systems&lt;/strong&gt; — Garbage collection pauses are unacceptable when deadlines are measured in microseconds.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mobile applications&lt;/strong&gt; — Neither iOS nor Android use Python as a first-class development language.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Browser code&lt;/strong&gt; — JavaScript and WebAssembly own this space.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory-constrained embedded systems&lt;/strong&gt; — Python's runtime overhead is prohibitive on microcontrollers (although MicroPython, a cut-down implementation, has some adoption in this space).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Latency-critical network services&lt;/strong&gt; — Where every millisecond matters, Go, Rust or C++ are better choices.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;CPU-bound pure-Python workloads&lt;/strong&gt; — If you can't offload to native libraries, Python's interpreter speed becomes a genuine bottleneck.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are domains Python doesn't seriously contest. More relevant are the pain points in domains where Python &lt;em&gt;is&lt;/em&gt; used:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;The GIL&lt;/strong&gt; — The Global Interpreter Lock limits true parallelism in CPU-bound multithreaded code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Packaging and distribution&lt;/strong&gt; — pip, virtualenv, conda, poetry, and pyproject.toml represent years of fragmented solutions to dependency management.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Import system complexity&lt;/strong&gt; — Relative imports, __init__.py behaviour, and module resolution remain sources of confusion.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deployment&lt;/strong&gt; — Shipping a Python application to end users without requiring them to install Python remains harder than it should be.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Python excels at orchestration, rapid prototyping and domains with strong library support. It is not a universal solution, and it carries real operational costs.&lt;/p&gt;




&lt;h3&gt;
  
  
  Why Python still feels slippery
&lt;/h3&gt;

&lt;p&gt;Even with this framing, Python can still feel oddly unsatisfying if you come from strongly structured languages.&lt;/p&gt;

&lt;p&gt;Compared with C#/Java or their ecosystems, Python has:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;weak static guarantees&lt;/li&gt;
&lt;li&gt;loose module boundaries&lt;/li&gt;
&lt;li&gt;a simpler, leakier object model&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If you're used to the discipline of C#, the functional elegance of F# or the precision of Rust, Python can feel vague. Things work — until they don't — and the language often declines to help you reason about correctness ahead of time.&lt;/p&gt;

&lt;p&gt;That's a real cost. But as the previous section argues, for many problem domains it's a cost worth paying.&lt;/p&gt;




&lt;h3&gt;
  
  
  Clearing Up Misconceptions
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;"Python is slow."&lt;/strong&gt;&lt;br&gt;
True for CPU-bound pure-Python code. False when Python orchestrates native libraries—NumPy array operations execute at C speed regardless of Python's overhead.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"Python is a scripting language."&lt;/strong&gt;&lt;br&gt;
Historically accurate; Python originated as a scripting tool. But "scripting language" now undersells what Python has become.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"Python is interpreted."&lt;/strong&gt;&lt;br&gt;
Misleading. CPython compiles source to bytecode, then executes that bytecode on a virtual machine — much like many modern interpreters do. The distinction matters when reasoning about performance and behaviour, but it's an implementation detail rather than a defining characteristic.&lt;/p&gt;
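&lt;p&gt;The standard library's dis module makes that intermediate bytecode visible (the exact opcode names vary between CPython versions):&lt;/p&gt;

```python
# CPython compiles source to bytecode, then a virtual machine executes it.
# The dis module exposes that bytecode; opcode names vary across versions.

import dis

def add(a, b):
    return a + b

ops = [ins.opname for ins in dis.get_instructions(add)]
print(ops)   # e.g. on CPython 3.11: ['RESUME', 'LOAD_FAST', 'LOAD_FAST', 'BINARY_OP', 'RETURN_VALUE']
```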




&lt;h3&gt;
  
  
  A Better Language Taxonomy
&lt;/h3&gt;

&lt;p&gt;Python fits comfortably into this three-tier classification:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Category&lt;/th&gt;
&lt;th&gt;Examples&lt;/th&gt;
&lt;th&gt;Defining Trait&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Standalone native&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;C, C++, Rust, Zig, Fortran&lt;/td&gt;
&lt;td&gt;The binary is the product&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Runtime ecosystems&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Python, Java, C#, F#&lt;/td&gt;
&lt;td&gt;The runtime is the product&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Host-bound scripting&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Bash, PowerShell, VBA, Lua&lt;/td&gt;
&lt;td&gt;The host environment is the product&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Python belongs firmly in the second group—with the caveat that it's a more dynamic, less rigidly structured member than Java or C#.&lt;/p&gt;

&lt;p&gt;The boundaries are not perfectly clean. Go has garbage collection, a runtime and reflection, yet produces statically-linked binaries — it sits at the boundary between the first two categories. Taxonomies are useful simplifications, not natural laws.&lt;/p&gt;




&lt;h3&gt;
  
  
  A brief note for Rust and Go proponents
&lt;/h3&gt;

&lt;p&gt;A common challenge: Python's role is better served by "doing it properly" in a compiled language from the start.&lt;/p&gt;

&lt;p&gt;That view makes sense if your problem is well-specified, stable, performance-critical and worth committing to upfront architectural constraints. In such cases, Rust or Go can be excellent choices.&lt;/p&gt;

&lt;p&gt;But many real-world problems do not start that way. They begin as ill-defined, exploratory or fast-moving systems: data pipelines, research code, internal tools, integration glue. A research team needs to test an idea quickly at small scale. A business team needs a tactical solution because the problem won't wait for strategic architecture.&lt;/p&gt;

&lt;p&gt;In those contexts, using a language with strict typing, memory models or concurrency primitives can frustrate development with language-wrestling, where making the language work becomes centre-stage.&lt;/p&gt;

&lt;p&gt;Python and compiled languages are therefore not competitors but complements: Python for orchestration and discovery; Rust, Go or C# for stabilised, performance-critical components. Your Python prototype becomes your teacher—clarifying what the real system needs to do.&lt;/p&gt;

&lt;p&gt;That said, Python's actual competition in most domains isn't Rust or Go — it's JavaScript/TypeScript, Ruby, R, and Julia. Python's victory over these closer competitors owes as much to ecosystem momentum and historical timing as to language design.&lt;/p&gt;




&lt;h3&gt;
  
  
  Summary
&lt;/h3&gt;

&lt;p&gt;Python isn't confused, incoherent or a "toy" language. It simply departs from the mental models of earlier language generations.&lt;/p&gt;

&lt;p&gt;Python is a runtime-centric ecosystem that excels at orchestration, rapid prototyping and leveraging high-performance native libraries. It trades static guarantees and raw speed for flexibility, readability and development velocity.&lt;/p&gt;

&lt;p&gt;That trade-off turns out to be exactly what a large portion of programmers need — including many who aren't professional developers at all, but scientists, analysts and business users who need working code fast. It lets you deliver, quickly. And that's what makes Python incredibly useful — and wildly popular.&lt;/p&gt;

</description>
      <category>python</category>
      <category>dotnet</category>
      <category>jvm</category>
      <category>languagecategories</category>
    </item>
    <item>
      <title>13 Languages Are Challenging C. Most Fail. Only Five Stack Up.</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sat, 10 Jan 2026 13:02:17 +0000</pubDate>
      <link>https://forem.com/dimension-ai/13-languages-are-challenging-c-most-fail-only-five-stack-up-2no5</link>
      <guid>https://forem.com/dimension-ai/13-languages-are-challenging-c-most-fail-only-five-stack-up-2no5</guid>
      <description>&lt;p&gt;Bjarne Stroustrup once observed:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"There are only two kinds of languages — the ones people complain about and the ones nobody uses."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;No language exemplifies the first category as well as C.&lt;/p&gt;

&lt;p&gt;So, are there any languages that are better?&lt;/p&gt;

&lt;p&gt;That's an interesting question, but it's not the one worth answering.&lt;/p&gt;

&lt;p&gt;The better question is:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Which languages can occupy C's position in the software stack?&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The reason: C is far more than just a syntax. It is the binary and organisational substrate of modern computing. Every kernel, every runtime, every Python module, every shared library — they all pass through C at some point.&lt;/p&gt;

&lt;p&gt;So "replacing C" means replacing &lt;em&gt;that&lt;/em&gt; role, not just writing nicer code.&lt;/p&gt;

&lt;p&gt;This essay examines C against thirteen "challenger" languages that are known — by their communities or by observers — as C-adjacent, low-level or C successor candidates.&lt;/p&gt;

&lt;p&gt;We applied a systematic methodology using five filters to identify which ones are genuinely competing for C's throne — and which ones are solving different problems entirely.&lt;/p&gt;




&lt;h2&gt;
  
  
  What C actually is
&lt;/h2&gt;

&lt;p&gt;C has been reigning champ of low-level procedural programming for over 50 years. You can think of it as having an &lt;strong&gt;alias-centric memory model&lt;/strong&gt; — a metaphor, not a formal model, but a useful one.&lt;/p&gt;

&lt;p&gt;Its defining axiom can be paraphrased as:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Memory is bytes; pointers alias; mutation is unrestricted.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Everything else follows from that.&lt;/p&gt;

&lt;p&gt;Unless explicitly told otherwise (for example via &lt;code&gt;restrict&lt;/code&gt;) or able to prove it, the compiler must assume any two pointers may target the same memory. Several consequences follow:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Unrestricted mutation becomes dangerous in the presence of aliasing;&lt;/li&gt;
&lt;li&gt;Undefined behaviour is a constant hazard the programmer must actively avoid; and&lt;/li&gt;
&lt;li&gt;Concurrency requires explicit synchronisation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;(For instance, &lt;code&gt;memcpy&lt;/code&gt; assumes non-overlapping buffers; &lt;code&gt;memmove&lt;/code&gt; does not. The distinction exists because unrestricted aliasing is the default.)&lt;/p&gt;

&lt;p&gt;C's power comes from unrestricted aliasing; but so does its fragility.&lt;/p&gt;

&lt;p&gt;C is nonetheless:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The ABI lingua franca&lt;/li&gt;
&lt;li&gt;The OS interface&lt;/li&gt;
&lt;li&gt;The library boundary&lt;/li&gt;
&lt;li&gt;The toolchain anchor&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;C is the dangerous uncle everyone still invites to dinner — because he owns the house.&lt;/p&gt;




&lt;h2&gt;
  
  
  The "C-family" illusion
&lt;/h2&gt;

&lt;p&gt;What we call the "C family" is really:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;A shared grammar sitting on top of incompatible execution models.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;C-family syntax means braces, semicolons, infix operators, &lt;code&gt;if (…)&lt;/code&gt;, &lt;code&gt;for (…)&lt;/code&gt;, &lt;code&gt;while (…)&lt;/code&gt;, and block-scoped variables. These define a &lt;strong&gt;parser-level&lt;/strong&gt; classification but not a semantic one.&lt;/p&gt;

&lt;p&gt;Java, Rust, Zig, Go and C# look related because their keywords, braces and semicolons are so similar. But they are not related at the level that counts: aliasing, lifetime and memory ownership.&lt;/p&gt;

&lt;p&gt;The economic reason is simple: C syntax is the QWERTY keyboard of programming. Changing it raises training cost without improving machine-level behaviour.&lt;/p&gt;

&lt;p&gt;C syntax won; C semantics didn't. Every major descendant kept the braces but rewrote the rules underneath.&lt;/p&gt;

&lt;p&gt;This is why these languages attract endless argument: they sit on real fault lines (memory, aliasing and lifetime) and we can't avoid paying the costs. As Stroustrup's observation implies, the complaints are evidence of relevance — languages that matter enough to argue about.&lt;/p&gt;




&lt;h2&gt;
  
  
  The four real semantic families
&lt;/h2&gt;

&lt;p&gt;Once you strip syntax away, all C-inspired languages fall into four execution models, defined by &lt;strong&gt;who controls memory and aliasing&lt;/strong&gt;.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Model&lt;/th&gt;
&lt;th&gt;Who owns memory?&lt;/th&gt;
&lt;th&gt;Who enforces safety?&lt;/th&gt;
&lt;th&gt;Examples&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Unmanaged&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Programmer&lt;/td&gt;
&lt;td&gt;Nobody&lt;/td&gt;
&lt;td&gt;C, C++, Zig, Odin, Hare, C3, Carbon, Jai&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hybrid&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Programmer + runtime&lt;/td&gt;
&lt;td&gt;Partial&lt;/td&gt;
&lt;td&gt;D, Nim, V, Swift&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Managed&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Runtime&lt;/td&gt;
&lt;td&gt;GC&lt;/td&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Verified&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Compiler&lt;/td&gt;
&lt;td&gt;Type system&lt;/td&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;(Languages like Java, C#, and JavaScript also fall into the Managed category, but they don't claim to be C replacements, so we exclude them from this analysis.)&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Important caveat:&lt;/strong&gt; These buckets are coarse. They describe the default execution model, not every possible subset mode. D in &lt;code&gt;-betterC&lt;/code&gt; mode behaves differently from idiomatic D; Swift in Embedded mode behaves differently from standard Swift. The taxonomy captures where each language's gravity pulls you, not where heroic effort can push you.&lt;/p&gt;




&lt;h2&gt;
  
  
  The five filters a real C-successor must pass
&lt;/h2&gt;

&lt;p&gt;A language must be able to:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Filter&lt;/th&gt;
&lt;th&gt;Why&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Freestanding compilation&lt;/td&gt;
&lt;td&gt;Kernels, firmware, runtimes — code that runs with no OS beneath it&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Deterministic layout&lt;/td&gt;
&lt;td&gt;Structs and ABI are contracts&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Stable C interop&lt;/td&gt;
&lt;td&gt;The world is already written&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;No mandatory runtime&lt;/td&gt;
&lt;td&gt;You must own the process&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Toolchain-level visibility&lt;/td&gt;
&lt;td&gt;Debuggers, profilers, sanitizers&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Only a few languages can pass all of these filters.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Critical distinction:&lt;/strong&gt; A language passes these filters if its &lt;em&gt;default compilation mode&lt;/em&gt; meets them, or if a &lt;em&gt;first-class, maintained mode&lt;/em&gt; exists that does. Languages requiring unofficial hacks, unmaintained forks, or heroic effort to achieve freestanding operation do not pass.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scope note:&lt;/strong&gt; This analysis evaluates languages &lt;em&gt;positioned as C successors&lt;/em&gt; — that is, languages whose stated purpose or community framing includes replacing C in its substrate role. Languages that pass all five filters but are designed for different purposes (extending C, migrating from C++, or serving as C's companion) are excluded from the contender tiers. This is a scope choice based on stated intent, not a technical judgement.&lt;/p&gt;




&lt;h2&gt;
  
  
  The candidate pool
&lt;/h2&gt;

&lt;p&gt;Here are the fourteen languages we'll evaluate — C plus thirteen challengers, each discussed in public discourse as C-adjacent, low-level or a potential C successor. Each entry lists the year, core philosophy, semantic family and a characterisation.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Year&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;Semantic family&lt;/th&gt;
&lt;th&gt;Characterisation&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;1972&lt;/td&gt;
&lt;td&gt;"Trust the programmer"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;The dangerous uncle who owns the house&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;1985&lt;/td&gt;
&lt;td&gt;"Zero-cost abstractions"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C's midlife crisis&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;2015&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Verified&lt;/td&gt;
&lt;td&gt;Your compiler's disappointed parent&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;2016&lt;/td&gt;
&lt;td&gt;"No hidden control flow, no hidden allocations"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C's responsible older brother&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;2016&lt;/td&gt;
&lt;td&gt;"The joy of programming, distilled"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C for people who ship video games&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;2022&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C if Dennis Ritchie had a time machine&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;2019&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C with a sensible haircut&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;D&lt;/td&gt;
&lt;td&gt;2001&lt;/td&gt;
&lt;td&gt;"C++ without the baggage"&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;The language that tried to please everyone&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Nim&lt;/td&gt;
&lt;td&gt;2008&lt;/td&gt;
&lt;td&gt;"Write like Python, run like C"&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;Python wearing a C costume to a job interview&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;V&lt;/td&gt;
&lt;td&gt;2019&lt;/td&gt;
&lt;td&gt;"Simple, fast, done"&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;The language of bold promises&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift&lt;/td&gt;
&lt;td&gt;2014&lt;/td&gt;
&lt;td&gt;"Safety without sacrifice"&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;Objective-C's attractive replacement&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;2009&lt;/td&gt;
&lt;td&gt;"Simplicity scales"&lt;/td&gt;
&lt;td&gt;Managed&lt;/td&gt;
&lt;td&gt;Java in a hoodie&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Carbon&lt;/td&gt;
&lt;td&gt;2022&lt;/td&gt;
&lt;td&gt;"C++ migration path"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;Google's C++ apology letter&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jai&lt;/td&gt;
&lt;td&gt;~2014&lt;/td&gt;
&lt;td&gt;"Game programmers know best"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;The language equivalent of "coming soon"&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;C is the baseline. The other thirteen are the challengers.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scope note:&lt;/strong&gt; This analysis restricts itself to C-adjacent languages actively discussed as C successors in modern discourse. It excludes Ada/SPARK, Pascal variants, and other capable low-level languages that aren't typically framed as "C replacements" in contemporary conversation. Ada/SPARK in particular has decades of deployment in safety-critical domains (avionics, rail, defence) and would pass most of the filters below — its exclusion reflects contemporary framing, not capability. That's a scope choice, not a quality judgement.&lt;/p&gt;




&lt;h2&gt;
  
  
  The defining axioms
&lt;/h2&gt;

&lt;p&gt;Each language can be summarised by what it believes about memory. But first, six definitions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Memory:&lt;/strong&gt; A flat array of addressable bytes. In C's model, any byte can be read or written if you have its address.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pointer:&lt;/strong&gt; A value that holds a memory address. Pointers let you indirectly access and modify data elsewhere in memory.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Aliasing:&lt;/strong&gt; When two or more pointers refer to the same memory location. If &lt;code&gt;p&lt;/code&gt; and &lt;code&gt;q&lt;/code&gt; both point to address &lt;code&gt;0x1000&lt;/code&gt;, writing through &lt;code&gt;p&lt;/code&gt; changes what &lt;code&gt;q&lt;/code&gt; sees. This is powerful and dangerous.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mutation:&lt;/strong&gt; Changing the value stored at a memory location. Unrestricted mutation means any code with a pointer can modify that memory at any time.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Lifetime:&lt;/strong&gt; The span during which a piece of memory is valid to access — from allocation to deallocation. In C, the programmer tracks lifetimes manually; use-after-free bugs occur when code accesses memory whose lifetime has ended.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ownership:&lt;/strong&gt; The responsibility for a piece of memory's lifecycle — who allocates it, who may access it, and who deallocates it. In C, ownership is a convention. In Rust, ownership is enforced by the compiler: every value has exactly one owner, and memory is freed when the owner goes out of scope.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;C's defining axiom combines the first four: memory is bytes, pointers can alias freely, and mutation is unrestricted. Lifetimes and ownership are the programmer's problem. Every C-inspired language must decide whether to keep, constrain, or replace that axiom.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Axiom&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; mutation is unrestricted"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C++&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; mutation is unrestricted; also here are 47 ways to abstract over that"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Rust&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is owned; aliasing is controlled; mutation is permissioned"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Zig&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; but we will tell you when you mess up"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Odin&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; but the defaults should be sensible"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hare&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; but the language should be finished"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C3&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; but the syntax shouldn't fight you"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;D&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is whatever you want it to be today"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Nim&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is managed unless you insist otherwise"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;V&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is managed but we're working on making it optional"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Swift&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is reference-counted; aliasing is... complicated"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Go&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is the runtime's problem, not yours"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Carbon&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; but we're fixing C++'s mistakes"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Jai&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;"Memory is bytes; pointers alias; and we'll ship when we ship"&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  The C-ness comparison
&lt;/h2&gt;

&lt;p&gt;How close is each language to C's actual execution model?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Scoring rubric:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;C-like syntax:&lt;/strong&gt; Braces, semicolons, infix operators, familiar keywords&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;C-style pointers idiomatic:&lt;/strong&gt; Pointer arithmetic, manual aliasing, and explicit address manipulation are normal usage patterns&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Safety checks optional:&lt;/strong&gt; Runtime bounds checking, null checks, and similar guards can be disabled or are off by default. (Note: Rust scores zero here because its safety is structural and compile-time, not a runtime check that can be toggled — this measures runtime check configurability, not overall safety philosophy.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Manual control:&lt;/strong&gt; Programmer controls allocation, layout, and teardown without runtime intervention&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Star calibration:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;⭐⭐⭐⭐⭐ = Indistinguishable from C in this dimension&lt;/li&gt;
&lt;li&gt;⭐⭐⭐⭐☆ = Minor differences; C programmer would adapt quickly&lt;/li&gt;
&lt;li&gt;⭐⭐⭐☆☆ = Recognisably similar but with significant divergence&lt;/li&gt;
&lt;li&gt;⭐⭐☆☆☆ = Different default model; C-like usage possible but not idiomatic&lt;/li&gt;
&lt;li&gt;⭐☆☆☆☆ = Fundamentally different approach&lt;/li&gt;
&lt;li&gt;☆☆☆☆☆ = No meaningful similarity&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Overall = minimum of the four columns.&lt;/strong&gt; Rationale: a language is only as C-like as its weakest dimension. A language with perfect syntax but no pointer arithmetic cannot substitute for C in practice; the binding constraint dominates.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Syntax&lt;/th&gt;
&lt;th&gt;Pointers&lt;/th&gt;
&lt;th&gt;Safety&lt;/th&gt;
&lt;th&gt;Control&lt;/th&gt;
&lt;th&gt;Overall&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C++&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Zig&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Odin&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Hare&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C3&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Carbon&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Jai&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;D&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Nim&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;V&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Swift&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;⭐☆☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐☆☆&lt;/td&gt;
&lt;td&gt;⭐☆☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Rust&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Go&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;⭐☆☆☆☆&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Key insight:&lt;/strong&gt; Zig, Odin, Hare, C3, and Jai are C-family &lt;strong&gt;in physics&lt;/strong&gt;. Rust is C-family &lt;strong&gt;only in grammar&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; C++ provides C ABI compatibility via &lt;code&gt;extern "C"&lt;/code&gt; declarations, and this is how virtually all C++ libraries expose C-compatible interfaces. However, C++ cannot &lt;em&gt;replace&lt;/em&gt; C at the ABI boundary: templates, overloads, and name-mangling have no C ABI representation, so the C++ features that distinguish it from C cannot cross that boundary.&lt;/p&gt;




&lt;h2&gt;
  
  
  How C-like languages diverged
&lt;/h2&gt;

&lt;p&gt;Every descendant of C had to decide what to do about C's aliasing axiom. They took four different paths.&lt;/p&gt;

&lt;h3&gt;
  
  
  Path A — Keep C's physics
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Languages:&lt;/strong&gt; C++, Zig, Odin, Hare, C3, Carbon, Jai&lt;/p&gt;

&lt;p&gt;They kept pointer aliasing, manual lifetimes, and undefined behaviour as a feature (or at least as an explicit tradeoff). They only changed ergonomics, not semantics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;These are C-family in physics.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;(Some of these keep C's physics but still fail the five filters for non-technical reasons: maturity, release status, or scope. Carbon targets C++ migration; Jai isn't released. Semantic similarity doesn't guarantee practical usability as a C successor.)&lt;/p&gt;

&lt;h3&gt;
  
  
  Path B — Hide C's physics
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Languages:&lt;/strong&gt; Go (and outside our candidate pool: Java, C#, JavaScript)&lt;/p&gt;

&lt;p&gt;They removed pointer arithmetic, explicit lifetimes, and alias control. They replaced them with tracing garbage collection, moving heaps, and runtime checks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;These are managed VMs with curly braces.&lt;/strong&gt; They look like C but execute like Java.&lt;/p&gt;

&lt;h3&gt;
  
  
  Path C — Half-escape
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Languages:&lt;/strong&gt; D, Nim, V, Swift&lt;/p&gt;

&lt;p&gt;They offer raw pointers when you ask, GC or ARC when you don't, plus slices and bounds checking. This creates two languages inside one compiler.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;They are not C and not safe.&lt;/strong&gt; They rely on discipline plus runtime support.&lt;/p&gt;

&lt;h3&gt;
  
  
  Path D — Replace the memory model
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Languages:&lt;/strong&gt; Rust&lt;/p&gt;

&lt;p&gt;Rust throws away C's axiom — "Memory is bytes; pointers alias; mutation is unrestricted" — and replaces it with: "Memory is owned; aliasing is controlled; mutation is permissioned."&lt;/p&gt;

&lt;p&gt;That is a new machine model, enforced at compile time.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Rust does not make C safer. It makes a different kind of machine that happens to use C syntax.&lt;/strong&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Applying the five filters
&lt;/h2&gt;

&lt;p&gt;We now take all fourteen languages and run them through the five filters. C is the baseline — it passes by definition. The other thirteen are sorted into three tiers.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tier 3: Clear failures
&lt;/h3&gt;

&lt;p&gt;These languages have fundamental design conflicts with substrate programming, cannot be evaluated, or are explicitly designed for different purposes.&lt;/p&gt;

&lt;h4&gt;
  
  
  Go — "Simplicity scales"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; Java in a hoodie&lt;/p&gt;

&lt;p&gt;Go fails the "no mandatory runtime" and "freestanding" filters. It has a mandatory garbage collector, mandatory runtime scheduler, and goroutine stack management that requires runtime support.&lt;/p&gt;

&lt;p&gt;This means Go cannot occupy C's substrate role. It can, however, occupy a different and valuable role.&lt;/p&gt;

&lt;p&gt;The distinction matters:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Freestanding substrate programming&lt;/strong&gt; = you control the machine (kernels, drivers, firmware, runtimes). No runtime beneath you.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Systems infrastructure programming&lt;/strong&gt; = you build infrastructure software (containers, networking, CLI tools). Runtime handles memory and scheduling.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Go is excellent at systems infrastructure. Docker, Kubernetes, and countless networking tools prove this. But Go is structurally incapable of freestanding substrate work.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Illustrative example:&lt;/strong&gt; Imagine a hardware interrupt with a ~10μs deadline. A garbage collector that might pause for milliseconds cannot coexist with that constraint. This isn't a criticism of Go — it's a category distinction. Go solves different problems.&lt;/p&gt;

&lt;p&gt;Go replaces C in infrastructure. It does not compete for C's substrate role.&lt;/p&gt;

&lt;h4&gt;
  
  
  Carbon — "C++ migration path"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; Google's C++ apology letter&lt;/p&gt;

&lt;p&gt;Carbon's publicly stated focus is C++ interoperability and migration. It is not positioned as a freestanding substrate language today.&lt;/p&gt;

&lt;p&gt;Carbon is a C++ successor, not a C successor. Different problem.&lt;/p&gt;

&lt;h4&gt;
  
  
  Jai — "Game programmers know best"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; The language equivalent of "coming soon"&lt;/p&gt;

&lt;p&gt;Not publicly released. No stable toolchain. Cannot be evaluated against filters.&lt;/p&gt;

&lt;h4&gt;
  
  
  C++ — "Zero-cost abstractions"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; C's midlife crisis&lt;/p&gt;

&lt;p&gt;C++ passes all five technical filters — it has freestanding mode, deterministic layout, full C interop via &lt;code&gt;extern "C"&lt;/code&gt;, optional runtime features, and complete toolchain support. However, C++ is explicitly designed to &lt;em&gt;extend&lt;/em&gt; C, not &lt;em&gt;replace&lt;/em&gt; it. Its stated philosophy ("zero-cost abstractions") is about adding capabilities above C, not providing an alternative substrate.&lt;/p&gt;

&lt;p&gt;C++ is C's companion, not its successor. This is a scope exclusion based on the language's stated purpose, not a technical failure.&lt;/p&gt;




&lt;h3&gt;
  
  
  Tier 2: Conditional passes
&lt;/h3&gt;

&lt;p&gt;These languages &lt;em&gt;can&lt;/em&gt; pass the filters, but only when compiled in a restricted mode that disables significant features.&lt;/p&gt;

&lt;h4&gt;
  
  
  D (with -betterC) — "C++ without the baggage"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; The language that tried to please everyone&lt;/p&gt;

&lt;p&gt;BetterC mode removes the GC, runtime, classes, exceptions, dynamic arrays, associative arrays, nested functions with context, and most of the standard library (Phobos). What remains is closer to "C with templates" than idiomatic D.&lt;/p&gt;

&lt;p&gt;D's tragedy is that it's genuinely well-designed, but arrived too early (before Rust proved the market) and tried to be too many things. In BetterC mode, &lt;strong&gt;C++ plus garbage collection&lt;/strong&gt; becomes just &lt;strong&gt;C with better syntax&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verdict:&lt;/strong&gt; Passes filters, but the mode strips so much that it's debatable whether "D" is being used or a D-shaped subset.&lt;/p&gt;

&lt;h4&gt;
  
  
  Nim (with manual configuration) — "Write like Python, run like C"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; Python wearing a C costume to a job interview&lt;/p&gt;

&lt;p&gt;Nim can be pushed toward freestanding and embedded targets, but it becomes a "bring-your-own-runtime-stubs" exercise. Debugging largely happens at the generated-C level. The ABI is the underlying C compiler's, not Nim's.&lt;/p&gt;

&lt;p&gt;Nim is better understood as a &lt;strong&gt;C producer&lt;/strong&gt; than a &lt;strong&gt;C successor&lt;/strong&gt;. It doesn't replace C; it generates it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verdict:&lt;/strong&gt; Technically achievable. Philosophically awkward.&lt;/p&gt;

&lt;h4&gt;
  
  
  Swift (with Embedded Swift) — "Safety without sacrifice"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; Objective-C's attractive replacement&lt;/p&gt;

&lt;p&gt;Swift has an experimental Embedded Swift compilation mode that produces standalone object files with no runtime required. It disables reflection, existentials, and ABI stability. It primarily targets ARM and RISC-V embedded platforms.&lt;/p&gt;

&lt;p&gt;This is genuinely interesting. But Embedded Swift is experimental, and Apple's priorities are phones, not kernels.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verdict:&lt;/strong&gt; Passes filters in experimental mode. Ecosystem and tooling are nascent. Worth monitoring.&lt;/p&gt;

&lt;h4&gt;
  
  
  V (with -gc none) — "Simple, fast, done"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; The language of bold promises&lt;/p&gt;

&lt;p&gt;V has a manual memory management mode and community members have built experimental kernel projects. But V's autofree — its main selling point — is experimental and not production-ready. The language is pre-1.0.&lt;/p&gt;

&lt;p&gt;V promised a lot. The jury is still out on delivery.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Verdict:&lt;/strong&gt; Technically passes. Maturity concerns.&lt;/p&gt;




&lt;h3&gt;
  
  
  Tier 1: Clear passes
&lt;/h3&gt;

&lt;p&gt;These languages pass all five filters in their &lt;strong&gt;default&lt;/strong&gt; or &lt;strong&gt;primary&lt;/strong&gt; compilation mode. Their distinctive features remain available when targeting the substrate.&lt;/p&gt;

&lt;h4&gt;
  
  
  Rust — "Make invalid states unrepresentable"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; Your compiler's disappointed parent&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axiom:&lt;/strong&gt; "Memory is owned; aliasing is controlled; mutation is permissioned"&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it really is:&lt;/strong&gt; A safe systems language wearing C syntax as a compatibility layer.&lt;/p&gt;

&lt;p&gt;Rust's &lt;code&gt;no_std&lt;/code&gt; mode is first-class. The borrow checker works without the standard library. In late 2024 and through 2025, Linux kernel maintainers progressively elevated Rust's status, with key maintainers publicly stating that Rust support is moving beyond experimental toward production use. (See LKML discussions and LWN coverage from this period.)&lt;/p&gt;

&lt;p&gt;In safe Rust, the compiler enforces aliasing and lifetime rules that eliminate use-after-free and data races. This guarantee applies to safe code; &lt;code&gt;unsafe&lt;/code&gt; blocks and FFI boundaries are the programmer's responsibility.&lt;/p&gt;
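&lt;p&gt;That boundary can be seen in miniature. The sketch below is illustrative (&lt;code&gt;read_first&lt;/code&gt; is an invented helper, not from any library): an &lt;code&gt;unsafe&lt;/code&gt; operation wrapped behind a safe API that upholds its precondition, the same shape Rust code uses at FFI boundaries.&lt;/p&gt;

```rust
// Sketch: the safe/unsafe boundary. Dereferencing a raw pointer is
// something the compiler cannot verify, so it must be marked unsafe;
// the enclosing function stays safe by checking the precondition first.
fn read_first(xs: &[i32]) -> Option<i32> {
    if xs.is_empty() {
        return None;
    }
    // SAFETY: the pointer comes from a non-empty slice, so it is
    // valid, aligned, and points to an initialized i32.
    Some(unsafe { *xs.as_ptr() })
}

fn main() {
    assert_eq!(read_first(&[42, 7]), Some(42));
    assert_eq!(read_first(&[]), None);
}
```

&lt;p&gt;Callers of &lt;code&gt;read_first&lt;/code&gt; never write &lt;code&gt;unsafe&lt;/code&gt; themselves; the obligation is discharged once, inside the wrapper.&lt;/p&gt;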

&lt;p&gt;&lt;strong&gt;Trade-off:&lt;/strong&gt; Borrow checker complexity. Steep learning curve. Interop with C requires unsafe blocks.&lt;/p&gt;

&lt;h4&gt;
  
  
  Zig — "No hidden control flow, no hidden allocations"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; C's responsible older brother&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axiom:&lt;/strong&gt; "Memory is bytes; pointers alias; but we will tell you when you mess up"&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it really is:&lt;/strong&gt; C with a compiler that cares.&lt;/p&gt;

&lt;p&gt;Zig has no runtime by default. It can replace not just C code but the C &lt;em&gt;toolchain&lt;/em&gt; — it works as a drop-in C compiler and cross-compilation system. Comptime works without an OS.&lt;/p&gt;

&lt;p&gt;Zig doesn't stop you shooting yourself in the foot. It hands you the gun with the safety on and a note explaining which end is dangerous.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Trade-off:&lt;/strong&gt; Language still pre-1.0. API stability not guaranteed.&lt;/p&gt;

&lt;h4&gt;
  
  
  Odin — "The joy of programming, distilled"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; C for people who ship video games&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axiom:&lt;/strong&gt; "Memory is bytes; pointers alias; but the defaults should be sensible"&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it really is:&lt;/strong&gt; Modernised C for large codebases.&lt;/p&gt;

&lt;p&gt;Odin's freestanding target is first-class. The context system — Odin's mechanism for threading allocators and loggers through call stacks — works freestanding. Memory management is explicit.&lt;/p&gt;

&lt;p&gt;Odin is what happens when someone who actually ships software designs a language, rather than someone who writes papers about shipping software.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Trade-off:&lt;/strong&gt; Smaller ecosystem. Less tooling than Rust or Zig.&lt;/p&gt;

&lt;h4&gt;
  
  
  Hare — "Stability is a feature"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; C if Dennis Ritchie had a time machine&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axiom:&lt;/strong&gt; "Memory is bytes; pointers alias; but the language should be finished"&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it really is:&lt;/strong&gt; What C might have been with hindsight.&lt;/p&gt;

&lt;p&gt;Hare does not link to libc by default. It is designed explicitly for kernels, compilers, and system tools. The language specification will freeze at 1.0.&lt;/p&gt;

&lt;p&gt;Hare's radical proposition: what if a programming language stopped changing?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Trade-off:&lt;/strong&gt; Very early ecosystem. The project explicitly states support for Linux and FreeBSD only, on x86_64, aarch64, and riscv64 architectures. There is no intention to support proprietary platforms — this is a stated design choice, not an oversight.&lt;/p&gt;

&lt;h4&gt;
  
  
  C3 — "Evolution, not revolution"
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Characterisation:&lt;/strong&gt; C with a sensible haircut&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axiom:&lt;/strong&gt; "Memory is bytes; pointers alias; but the syntax shouldn't fight you"&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What it really is:&lt;/strong&gt; C with obvious mistakes fixed.&lt;/p&gt;

&lt;p&gt;C3 has full C ABI compatibility. You can mix C and C3 files in the same build with no friction. Contracts and slices work without libc. The compiler uses an LLVM backend.&lt;/p&gt;

&lt;p&gt;C3 asks: what if we just... fixed the preprocessor and added slices? Sometimes the boring answer is correct.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Trade-off:&lt;/strong&gt; Small community. Limited tooling. Pre-1.0.&lt;/p&gt;




&lt;h2&gt;
  
  
  Summary: The filter results
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Tier&lt;/th&gt;
&lt;th&gt;C-ness&lt;/th&gt;
&lt;th&gt;Outcome&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;Baseline&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐⭐&lt;/td&gt;
&lt;td&gt;The thing we're trying to replace&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;Clear pass&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Clear pass&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Clear pass&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Clear pass&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Clear pass&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;D&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;Conditional pass (-betterC)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Nim&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;Conditional pass (manual configuration)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;V&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;⭐⭐☆☆☆&lt;/td&gt;
&lt;td&gt;Conditional pass (-gc none)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;⭐☆☆☆☆&lt;/td&gt;
&lt;td&gt;Conditional pass (Embedded Swift)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;td&gt;&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Excluded: companion to C, not successor (scope)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;☆☆☆☆☆&lt;/td&gt;
&lt;td&gt;Clear fail: mandatory GC/runtime&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Carbon&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Excluded: wrong target (C++ successor)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jai&lt;/td&gt;
&lt;td&gt;3&lt;/td&gt;
&lt;td&gt;⭐⭐⭐⭐☆&lt;/td&gt;
&lt;td&gt;Cannot evaluate: not released&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Important distinction:&lt;/strong&gt; The C-ness column measures semantic similarity to C's execution model. The tier column measures practical capability to occupy C's substrate role. These are independent axes. Rust scores zero on C-ness but passes all five substrate filters — it can replace C &lt;em&gt;despite&lt;/em&gt; being nothing like it. Zig, Odin, Hare and C3 score high on both: they can replace C &lt;em&gt;and&lt;/em&gt; feel familiar to C programmers.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why the clear passes form the real contender set
&lt;/h2&gt;

&lt;p&gt;The distinction between Tier 1 and Tier 2 matters more than the pass/fail labels suggest.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Tier 1 (Clear Pass) languages&lt;/strong&gt; (Rust, Zig, Odin, Hare, C3) are designed such that their &lt;em&gt;distinctive features&lt;/em&gt; remain available in freestanding mode.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Rust's borrow checker works without std&lt;/li&gt;
&lt;li&gt;Zig's comptime works without an OS&lt;/li&gt;
&lt;li&gt;Odin's context system works freestanding&lt;/li&gt;
&lt;li&gt;Hare's tagged unions work in kernels&lt;/li&gt;
&lt;li&gt;C3's contracts and slices work without libc&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Tier 2 (Conditional Pass) languages&lt;/strong&gt; (D, Nim, Swift, V) &lt;em&gt;can&lt;/em&gt; target C's substrate position, but doing so requires abandoning the features that differentiate them from C.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;D without GC is just C with nicer syntax&lt;/li&gt;
&lt;li&gt;Nim without its runtime is just a C code generator&lt;/li&gt;
&lt;li&gt;Swift without reflection is just... less Swift&lt;/li&gt;
&lt;li&gt;V without autofree is still figuring out what it is&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is the meaningful distinction:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Which languages let you be productive in their idiom while targeting the substrate?&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  What each Tier 1 language is actually trying to fix about C
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;C's failure it targets&lt;/th&gt;
&lt;th&gt;Approach&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Memory safety&lt;/td&gt;
&lt;td&gt;Compile-time ownership tracking&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;"No hidden control flow"&lt;/td&gt;
&lt;td&gt;Implicit behaviour, toolchain complexity&lt;/td&gt;
&lt;td&gt;Explicit allocators, drop-in C compiler&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;"The joy of programming"&lt;/td&gt;
&lt;td&gt;Ergonomic scaling&lt;/td&gt;
&lt;td&gt;Context system, better defaults&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;Undefined behaviour, language drift&lt;/td&gt;
&lt;td&gt;Conservative redesign, frozen specification&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;Preprocessor, weak type safety&lt;/td&gt;
&lt;td&gt;Semantic macros, contracts, slices&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;This explains why they coexist: they disagree on &lt;em&gt;what's wrong with C&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  The real fault line
&lt;/h2&gt;

&lt;p&gt;The true divide is not syntax. It is this:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Who controls memory?&lt;/th&gt;
&lt;th&gt;Languages&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Programmer&lt;/td&gt;
&lt;td&gt;C, C++, Zig, Odin, Hare, C3, Carbon, Jai&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Shared: Runtime + programmer&lt;/td&gt;
&lt;td&gt;D, Nim, V, Swift&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Compiler&lt;/td&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Runtime&lt;/td&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Rust is the only language where the compiler enforces memory correctness for safe code.&lt;/p&gt;

&lt;p&gt;Everyone else either:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Trusts you&lt;/strong&gt; (Unmanaged) — "Here's a gun, try not to shoot your foot"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Hides memory from you&lt;/strong&gt; (Managed) — "What gun? There is no gun"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Does both depending on mode&lt;/strong&gt; (Hybrid) — "Here's a gun, but we've hidden most of the bullets"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Rust says: "You can have the gun, but I'm going to check your paperwork first, and also during, and also after."&lt;/p&gt;




&lt;h2&gt;
  
  
  The philosophical spectrum
&lt;/h2&gt;

&lt;p&gt;The five Tier 1 contenders can be placed on two axes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Axis 1: Compiler protection&lt;/strong&gt; — how much should the compiler constrain you?&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Maximum protection                              Minimum protection
        |                                               |
      Rust -------- C3 -------- Odin -------- Hare -------- Zig
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Axis 2: Language complexity&lt;/strong&gt; — how much machinery does the language have?&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;High complexity                                 Low complexity
        |                                               |
      Rust -------- Zig -------- C3 -------- Odin -------- Hare
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The philosophies track these positions:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;Protection&lt;/th&gt;
&lt;th&gt;Complexity&lt;/th&gt;
&lt;th&gt;Personality&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Maximum&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;td&gt;The perfectionist who won't ship until it's right&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;"No hidden control flow"&lt;/td&gt;
&lt;td&gt;Minimum&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;The pragmatist who reads the assembly output&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;"The joy of programming"&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;The game dev who's sick of fighting the language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;Minimal&lt;/td&gt;
&lt;td&gt;The greybeard who remembers when C was new&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;Medium&lt;/td&gt;
&lt;td&gt;Low&lt;/td&gt;
&lt;td&gt;The conservative who just wants C, but less annoying&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  The ecosystem reality
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;Can replace C&lt;/th&gt;
&lt;th&gt;Is replacing C&lt;/th&gt;
&lt;th&gt;Where&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Yes&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Linux kernel, Android, Windows components&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;"No hidden control flow"&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;&lt;strong&gt;Yes&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Build toolchains, Bun runtime&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;"The joy of programming"&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Partially&lt;/td&gt;
&lt;td&gt;Game engines, developer tools&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Not yet&lt;/td&gt;
&lt;td&gt;Too early (released 2022)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Not yet&lt;/td&gt;
&lt;td&gt;Too early, small community&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;C does not get replaced in theory. It gets replaced in practice, one migrated codebase at a time.&lt;/p&gt;




&lt;h2&gt;
  
  
  What this analysis does not cover
&lt;/h2&gt;

&lt;p&gt;This essay focuses on whether languages &lt;em&gt;can&lt;/em&gt; occupy C's substrate role. It does not evaluate several practical factors that matter for adoption decisions:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Compilation speed&lt;/strong&gt; — Zig and Odin emphasise fast compilation; Rust is notoriously slow&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Corporate backing and sustainability&lt;/strong&gt; — Rust (Mozilla origins, now independent foundation), Zig (foundation-backed), others vary&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Migration costs and learning curves&lt;/strong&gt; — Rust's borrow checker is steep; Zig and C3 are gentler for C programmers&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ecosystem maturity&lt;/strong&gt; — Rust has crates.io; others have smaller package ecosystems&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These factors may dominate practical language choice even when substrate capability is equal.&lt;/p&gt;




&lt;h2&gt;
  
  
  The real trajectory
&lt;/h2&gt;

&lt;p&gt;C will not be "killed".&lt;/p&gt;

&lt;p&gt;It will be:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Surrounded&lt;/strong&gt; — Rust in Linux, Zig in toolchains&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Absorbed&lt;/strong&gt; — C3's C-mixing model&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Constrained&lt;/strong&gt; — new drivers in Rust, not C&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Replaced in layers&lt;/strong&gt; — userspace first, then kernel modules, then core kernel&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Rust, Zig and Odin are already doing this. Hare and C3 show what a clean C &lt;em&gt;could have been&lt;/em&gt;.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The story is not "C vs new languages".&lt;/p&gt;

&lt;p&gt;It is:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Which languages can occupy the deepest layer of computing — the one C has held since 1973?&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Fourteen languages were evaluated. Four failed outright, were excluded on scope, or couldn't be assessed. Four passed conditionally. Five passed cleanly. One is the baseline.&lt;/p&gt;

&lt;p&gt;Each survivor has a philosophy and an axiom:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;Axiom&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;"Trust the programmer"&lt;/td&gt;
&lt;td&gt;Memory is bytes; pointers alias; mutation is unrestricted&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Memory is owned; aliasing is controlled; mutation is permissioned&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;"No hidden control flow"&lt;/td&gt;
&lt;td&gt;Memory is bytes; pointers alias; but we will tell you when you mess up&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;"The joy of programming"&lt;/td&gt;
&lt;td&gt;Memory is bytes; pointers alias; but the defaults should be sensible&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;Memory is bytes; pointers alias; but the language should be finished&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;Memory is bytes; pointers alias; but the syntax shouldn't fight you&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The substrate is diversifying. Slowly, but measurably.&lt;/p&gt;

&lt;p&gt;And the key insight:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Rust belongs to the C family only in grammar.&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Zig, Odin, Hare and C3 belong to it in physics.&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;That is why Rust feels alien to C programmers, and Zig feels familiar.&lt;/p&gt;

&lt;p&gt;They solve different problems. And that is why, as Stroustrup reminds us, people will complain about them for decades: because they matter.&lt;/p&gt;




&lt;h2&gt;
  
  
  Appendix A: Filter application detail
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Freestanding&lt;/th&gt;
&lt;th&gt;Deterministic layout&lt;/th&gt;
&lt;th&gt;C interop&lt;/th&gt;
&lt;th&gt;No mandatory runtime&lt;/th&gt;
&lt;th&gt;Toolchain visibility&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓ (via extern "C")&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;✓ (no_std)&lt;/td&gt;
&lt;td&gt;✓ (repr(C))&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;Partial (QBE backend)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓ (LLVM)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;D&lt;/td&gt;
&lt;td&gt;✓ (-betterC)&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓ (-betterC)&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Nim&lt;/td&gt;
&lt;td&gt;✓ (manual)&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;td&gt;✓ (manual)&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;V&lt;/td&gt;
&lt;td&gt;✓ (-gc none)&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;td&gt;✓ (-gc none)&lt;/td&gt;
&lt;td&gt;Via C&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift&lt;/td&gt;
&lt;td&gt;✓ (Embedded)&lt;/td&gt;
&lt;td&gt;Partial&lt;/td&gt;
&lt;td&gt;Partial&lt;/td&gt;
&lt;td&gt;✓ (Embedded)&lt;/td&gt;
&lt;td&gt;Partial&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;Partial (cgo)&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Carbon&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;C++ focus&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jai&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;td&gt;?&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
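&lt;p&gt;Rust's "deterministic layout" entry in the table above deserves a note: layout is opt-in per type. The sketch below (&lt;code&gt;Sample&lt;/code&gt; is an invented struct) shows &lt;code&gt;repr(C)&lt;/code&gt;, which pins a struct to C's field-ordering and padding rules so it can cross an FFI boundary; the default Rust representation makes no such promise.&lt;/p&gt;

```rust
// Sketch: #[repr(C)] opts this struct into C layout rules, so a C
// header can declare the same struct and both sides agree byte-for-byte.
#[repr(C)]
struct Sample {
    timestamp: u64, // offset 0
    value: i32,     // offset 8
    flags: u8,      // offset 12, then padded to a multiple of 8
}

fn main() {
    let s = Sample { timestamp: 0, value: 0, flags: 0 };
    // 16 bytes on common 64-bit targets, matching the equivalent C struct.
    assert_eq!(std::mem::size_of_val(&s), 16);
}
```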




&lt;h2&gt;
  
  
  Appendix B: Complete language characterisation
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Philosophy&lt;/th&gt;
&lt;th&gt;Characterisation&lt;/th&gt;
&lt;th&gt;Semantic family&lt;/th&gt;
&lt;th&gt;What it really is&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;"Trust the programmer"&lt;/td&gt;
&lt;td&gt;The dangerous uncle who owns the house&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;The alias-centric memory model&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;"Zero-cost abstractions"&lt;/td&gt;
&lt;td&gt;C's midlife crisis&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C with objects, templates, and regrets&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"Make invalid states unrepresentable"&lt;/td&gt;
&lt;td&gt;Your compiler's disappointed parent&lt;/td&gt;
&lt;td&gt;Verified&lt;/td&gt;
&lt;td&gt;A safe systems language wearing C syntax&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;"No hidden control flow"&lt;/td&gt;
&lt;td&gt;C's responsible older brother&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C with a compiler that cares&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;"The joy of programming"&lt;/td&gt;
&lt;td&gt;C for people who ship video games&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;Modernised C for large codebases&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;"Stability is a feature"&lt;/td&gt;
&lt;td&gt;C if Ritchie had a time machine&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;What C might have been with hindsight&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;"Evolution, not revolution"&lt;/td&gt;
&lt;td&gt;C with a sensible haircut&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C with obvious mistakes fixed&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;D&lt;/td&gt;
&lt;td&gt;"C++ without the baggage"&lt;/td&gt;
&lt;td&gt;The language that tried to please everyone&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;C++ plus garbage collection&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Nim&lt;/td&gt;
&lt;td&gt;"Write like Python, run like C"&lt;/td&gt;
&lt;td&gt;Python wearing a C costume&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;Python syntax on a C backend&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;V&lt;/td&gt;
&lt;td&gt;"Simple, fast, done"&lt;/td&gt;
&lt;td&gt;The language of bold promises&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;Fast-compile application language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift&lt;/td&gt;
&lt;td&gt;"Safety without sacrifice"&lt;/td&gt;
&lt;td&gt;Objective-C's attractive replacement&lt;/td&gt;
&lt;td&gt;Hybrid&lt;/td&gt;
&lt;td&gt;Apple's managed systems language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;"Simplicity scales"&lt;/td&gt;
&lt;td&gt;Java in a hoodie&lt;/td&gt;
&lt;td&gt;Managed&lt;/td&gt;
&lt;td&gt;Systems infrastructure language, not substrate language&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Carbon&lt;/td&gt;
&lt;td&gt;"C++ migration path"&lt;/td&gt;
&lt;td&gt;Google's C++ apology letter&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;C++ interop successor&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jai&lt;/td&gt;
&lt;td&gt;"Game programmers know best"&lt;/td&gt;
&lt;td&gt;The language equivalent of "coming soon"&lt;/td&gt;
&lt;td&gt;Unmanaged&lt;/td&gt;
&lt;td&gt;Unknown (not released)&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  Appendix C: The four divergence paths from C
&lt;/h2&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Path&lt;/th&gt;
&lt;th&gt;What they did&lt;/th&gt;
&lt;th&gt;Languages&lt;/th&gt;
&lt;th&gt;Summary&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;A: Keep C's physics&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Kept pointer aliasing, manual lifetimes, UB&lt;/td&gt;
&lt;td&gt;C++, Zig, Odin, Hare, C3, Carbon, Jai&lt;/td&gt;
&lt;td&gt;"C, but we fixed the obvious stuff"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;B: Hide C's physics&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Replaced with GC, moving heaps, runtime checks&lt;/td&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;"What if we just... didn't let you touch memory?"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;C: Half-escape&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Raw pointers when asked, GC when not&lt;/td&gt;
&lt;td&gt;D, Nim, V, Swift&lt;/td&gt;
&lt;td&gt;"Have it both ways (and debug both sets of bugs)"&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;D: Replace the model&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Static aliasing constraints, compiler verification&lt;/td&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;"What if the compiler was your mum?"&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  Appendix D: Freestanding substrate vs systems infrastructure
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Freestanding substrate programming:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You control the machine&lt;/li&gt;
&lt;li&gt;No runtime beneath you&lt;/li&gt;
&lt;li&gt;You handle interrupts, memory maps, hardware registers&lt;/li&gt;
&lt;li&gt;Examples: kernels, drivers, firmware, bootloaders, language runtimes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Systems infrastructure programming:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You build infrastructure software&lt;/li&gt;
&lt;li&gt;Runtime handles memory, scheduling, I/O&lt;/li&gt;
&lt;li&gt;You handle requests, processes, orchestration&lt;/li&gt;
&lt;li&gt;Examples: containers, networking tools, build systems, CLI utilities&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Architectural compatibility with hard real-time / ISR constraints:&lt;/strong&gt;&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Language&lt;/th&gt;
&lt;th&gt;Compatible?&lt;/th&gt;
&lt;th&gt;Notes&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;C&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Reference implementation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;With care (no exceptions in ISR)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;no_std mode&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Zig&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No runtime by default&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Odin&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Freestanding target&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Hare&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;No libc by default&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C3&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;LLVM freestanding&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;D (-betterC)&lt;/td&gt;
&lt;td&gt;Likely&lt;/td&gt;
&lt;td&gt;Requires restricted mode&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Nim (freestanding)&lt;/td&gt;
&lt;td&gt;Unclear&lt;/td&gt;
&lt;td&gt;Depends on stub implementation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;V (-gc none)&lt;/td&gt;
&lt;td&gt;Unclear&lt;/td&gt;
&lt;td&gt;Pre-1.0, limited evidence&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift (Embedded)&lt;/td&gt;
&lt;td&gt;Unclear&lt;/td&gt;
&lt;td&gt;Experimental mode&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Go&lt;/td&gt;
&lt;td&gt;No&lt;/td&gt;
&lt;td&gt;Mandatory GC/runtime&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Carbon&lt;/td&gt;
&lt;td&gt;Unclear&lt;/td&gt;
&lt;td&gt;Not yet positioned for this&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Jai&lt;/td&gt;
&lt;td&gt;Unclear&lt;/td&gt;
&lt;td&gt;Not released&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Go is excellent at systems infrastructure. It is structurally incapable of freestanding substrate work.&lt;/p&gt;




&lt;h2&gt;
  
  
  ACKNOWLEDGEMENTS
&lt;/h2&gt;

&lt;p&gt;Thanks to Paul J. Lucas for serving as technical language spec cop and pointing out an important correction.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Last updated: January 2026.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>c</category>
      <category>rust</category>
      <category>zig</category>
      <category>systemsprogramming</category>
    </item>
    <item>
      <title>Why C is still the most important programming language</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Wed, 07 Jan 2026 13:03:09 +0000</pubDate>
      <link>https://forem.com/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j</link>
      <guid>https://forem.com/dimension-ai/why-c-is-still-the-most-important-programming-language-2n9j</guid>
      <description>&lt;p&gt;&lt;em&gt;&lt;em&gt;Every programmer must learn it, even if they don't use it&lt;/em&gt;&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;Modern languages have replaced C in most application-level programming. Python dominates scripting and data science. Java and C# run enterprise systems. Go powers cloud infrastructure. Swift builds iOS applications. Rust is taking over systems programming.&lt;/p&gt;

&lt;p&gt;Yet none of these languages has displaced C at the boundary.&lt;/p&gt;

&lt;p&gt;When any of them needs to call a native library, talk to the operating system, or interoperate with code written in another language, it speaks C. Not because C is a good language – it has well-documented problems with memory safety and undefined behaviour – but because every platform defines the low-level rules for how compiled code communicates in terms of C.&lt;/p&gt;

&lt;p&gt;Understanding why this is true, and what it means, is the reason every programmer should learn C.&lt;/p&gt;

&lt;h2&gt;
  
  
  ABI and FFI
&lt;/h2&gt;

&lt;p&gt;When compiled code from one language calls compiled code from another, something must govern the mechanics of that call. Which CPU registers hold the arguments? Does the caller or the callee clean up the stack? How are structures laid out in memory? How are return values passed back?&lt;/p&gt;

&lt;p&gt;These questions are answered by the Application Binary Interface, or ABI. The ABI is not defined by any programming language; it is defined by the platform. On Windows x64, Microsoft's calling convention applies. On Linux and macOS x64, the System V AMD64 ABI applies. On ARM64, the AAPCS64 procedure call standard applies.&lt;/p&gt;

&lt;p&gt;Every language that wants to call foreign code provides a Foreign Function Interface, or FFI. Python has ctypes and cffi. Rust has &lt;code&gt;extern "C"&lt;/code&gt;. Java has JNI. Zig has &lt;code&gt;@cImport&lt;/code&gt;. The FFI handles language-level concerns: declaring signatures, marshalling strings, managing memory ownership. But when the call actually happens, the FFI must emit machine code that obeys the platform's ABI.&lt;/p&gt;

&lt;p&gt;Here is the critical point: there is no single "C calling convention" that spans all platforms. Windows x64 differs from System V AMD64 differs from AArch64. But on every platform, the C compiler's calling convention becomes the platform's standard ABI. All other languages target that ABI when they need to interoperate.&lt;/p&gt;
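&lt;p&gt;The mechanics are easy to observe from any FFI. Below is a minimal sketch (assuming CPython on a Unix-like system, where &lt;code&gt;ctypes.CDLL(None)&lt;/code&gt; exposes the symbols of the C library the interpreter is linked against) that calls &lt;code&gt;strlen&lt;/code&gt; through the platform's C ABI:&lt;/p&gt;

```python
# Minimal FFI sketch: calling the C library's strlen from Python.
# Assumes CPython on a Unix-like system, where CDLL(None) exposes
# the symbols of the C library the interpreter is linked against.
import ctypes

libc = ctypes.CDLL(None)

# Declaring the C signature tells ctypes how to marshal the call
# according to the platform's C ABI (registers, stack, return value).
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

print(libc.strlen(b"hello"))  # 5
```

&lt;p&gt;Whatever the FFI's surface syntax, the machine code it emits for this call follows the same platform conventions a C compiler would use.&lt;/p&gt;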

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;ABI&lt;/th&gt;
&lt;th&gt;Stable&lt;/th&gt;
&lt;th&gt;Documented&lt;/th&gt;
&lt;th&gt;Universal on its platform&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Platform C ABI&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;C++ ABI&lt;/td&gt;
&lt;td&gt;Varies by compiler&lt;/td&gt;
&lt;td&gt;Partially&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Rust ABI&lt;/td&gt;
&lt;td&gt;Explicitly unstable&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Swift ABI&lt;/td&gt;
&lt;td&gt;✓ (since 2019)&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;Apple platforms only&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;C++ ABIs differ between MSVC, GCC and Clang. Rust documents its ABI as unstable and reserves the right to change it. Swift's ABI is stable but was not designed for cross-language use.&lt;/p&gt;

&lt;p&gt;The platform C ABI is the only binary interface that every compiler supports, every language runtime can target, and that has remained stable for decades. It can even be targeted directly from assembly language, with no runtime required.&lt;/p&gt;

&lt;p&gt;The practical consequence: when you want to build a system spanning multiple languages – Rust for performance, Python for scripting, C# for business logic – every FFI involved will emit calls using that platform's C conventions. SQLite, OpenSSL, zlib, libgit2, LLVM, and thousands of other libraries expose C interfaces regardless of their implementation language, because the C ABI is the only interface all consumers share.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvl0bgfufk0tnhql5pk0y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvl0bgfufk0tnhql5pk0y.png" alt=" " width="800" height="545"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The operating system interface
&lt;/h2&gt;

&lt;p&gt;The binary interface is not the only reason C remains central. The operating system itself is specified in C terms.&lt;/p&gt;

&lt;p&gt;The Linux kernel is written in C. The Windows kernel is written in C. Darwin, the core of macOS and iOS, is written in C. FreeBSD, OpenBSD, and most embedded real-time operating systems are written in C.&lt;/p&gt;

&lt;p&gt;More importantly, the portable operating system interface – POSIX – is specified as C function signatures and C types. File I/O, process management, signals, threads, memory mapping: all defined in terms that map directly to the platform's C ABI. This is not a philosophical choice about C's elegance; it is a practical choice about binary stability. C types are the lingua franca because the C ABI is the lingua franca.&lt;/p&gt;

&lt;p&gt;Every language that wants to open a file, spawn a process, or allocate memory must ultimately call these interfaces. The call goes through the language's FFI, crosses into the platform's C ABI, and reaches the kernel.&lt;/p&gt;
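&lt;p&gt;As a small illustration (again assuming CPython on a Unix-like system), the same POSIX entry point can be reached through the language's standard library or through the FFI directly; both paths end at the same C-ABI symbol:&lt;/p&gt;

```python
# Two routes to the same POSIX call, getpid(2).
# Assumes CPython on a Unix-like system.
import ctypes
import os

libc = ctypes.CDLL(None)  # symbols of the process's C library

# os.getpid() goes through CPython's own wrapper; libc.getpid() goes
# through the FFI. Both resolve to the same C-ABI entry point.
assert os.getpid() == libc.getpid()
print("both paths reach the same kernel interface")
```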

&lt;h2&gt;
  
  
  The infrastructure that remains
&lt;/h2&gt;

&lt;p&gt;An enormous amount of critical infrastructure is written in C and will remain written in C for the foreseeable future.&lt;/p&gt;

&lt;p&gt;Linux, Git, curl, nginx, Apache, PostgreSQL, SQLite, OpenSSL, zlib, ffmpeg – foundational software that the modern internet depends on. These projects are actively maintained, and they are not being rewritten.&lt;/p&gt;

&lt;p&gt;Someone must maintain this code, understand it when it breaks, fix the security vulnerabilities, diagnose the edge cases, and extend the functionality. That someone needs to read C fluently.&lt;/p&gt;

&lt;p&gt;Even if you never contribute to these projects, you depend on them. Understanding the language they are written in helps you understand their behaviour, their limitations, and their failure modes.&lt;/p&gt;

&lt;h2&gt;
  
  
  The mental model
&lt;/h2&gt;

&lt;p&gt;There is a secondary reason to learn C, less original but still valid: C forces you to confront the machine.&lt;/p&gt;

&lt;p&gt;In C, memory is a flat array of bytes. Pointers are addresses. Allocation is explicit, and so is deallocation. There is no garbage collector, no hidden allocations behind convenient syntax. When you write &lt;code&gt;malloc(100)&lt;/code&gt;, you are responsible for those bytes.&lt;/p&gt;

&lt;p&gt;A programmer who understands C understands what higher-level abstractions are hiding. They can reason about cache locality, memory layout, and the cost of indirection. When a Java program runs slowly or a Python script consumes unexpected memory, the programmer who learned C has a mental model for diagnosing the problem.&lt;/p&gt;

&lt;p&gt;This is not the primary argument for learning C – the ABI argument is stronger and more specific – but it reinforces it.&lt;/p&gt;
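&lt;p&gt;The contrast is easy to make concrete. A rough sketch follows; note that CPython object sizes are implementation details and vary by version and platform:&lt;/p&gt;

```python
# What the C mental model reveals about a high-level language:
# in C, an array of 1000 ints is a few kilobytes of raw storage;
# in CPython, a list holds 1000 pointers, and each int is a separate
# heap object with its own header.
import sys

xs = list(range(1000))
list_overhead = sys.getsizeof(xs)  # the pointer array alone
per_int = sys.getsizeof(0)         # header plus value for one small int

print(list_overhead, per_int)
```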

&lt;h2&gt;
  
  
  What about Rust?
&lt;/h2&gt;

&lt;p&gt;Rust and Zig are genuine competitors to C for new systems programming. They offer memory safety, better tooling, and modern language design.&lt;/p&gt;

&lt;p&gt;In embedded and real-time domains – microcontrollers with 32 kilobytes of RAM, medical devices that cannot tolerate unpredictable latency, avionics and industrial control systems – C once had no competition. You needed a language that compiled to small, predictable, deterministic code with no garbage collector and no hidden runtime behaviour. Rust now offers this, with memory safety as well, and is a serious alternative.&lt;/p&gt;

&lt;p&gt;More significantly, Rust has now been accepted into the Linux kernel. In December 2022, Linux 6.1 merged support for Rust as a second implementation language alongside C – and in December 2025, the kernel maintainers promoted Rust from experimental to a core part of the kernel. This is a genuinely epochal shift: for over 30 years, the Linux kernel was C and C alone. The decision to admit Rust, and then to make it a first-class citizen, reflects how seriously the kernel maintainers take its safety guarantees and how mature the language has become for systems work.&lt;/p&gt;

&lt;p&gt;Even so, Rust in the kernel does not escape the C ABI. Rust kernel modules must interoperate with the vast existing C codebase through C-compatible interfaces. When a Rust library outside the kernel needs to be callable from Python, Go, C#, and Ruby, it exposes a C API. When Rust talks to the operating system, it uses C conventions. When Rust calls vendor SDKs, hardware abstraction layers, or existing libraries, it speaks C at the boundary.&lt;/p&gt;

&lt;p&gt;Rust is not replacing C at the interface level; it is building safer implementations behind a C-compatible façade.&lt;/p&gt;

&lt;h2&gt;
  
  
  The only partial challengers
&lt;/h2&gt;

&lt;p&gt;Two alternatives have attempted to displace the C ABI as the universal binary interface.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Criterion&lt;/th&gt;
&lt;th&gt;C ABI&lt;/th&gt;
&lt;th&gt;COM / WinRT&lt;/th&gt;
&lt;th&gt;WebAssembly&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Cross-platform&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;Windows only&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Mature tooling&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;Limited&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Native performance&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;Slower (5–30%)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;No runtime required&lt;/td&gt;
&lt;td&gt;✓&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;td&gt;✗&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;OS API access&lt;/td&gt;
&lt;td&gt;Direct&lt;/td&gt;
&lt;td&gt;Direct&lt;/td&gt;
&lt;td&gt;Via host&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Adoption in HPC / AI / scientific&lt;/td&gt;
&lt;td&gt;Dominant&lt;/td&gt;
&lt;td&gt;Rare&lt;/td&gt;
&lt;td&gt;Emerging&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Stable since&lt;/td&gt;
&lt;td&gt;1970s&lt;/td&gt;
&lt;td&gt;1993&lt;/td&gt;
&lt;td&gt;2017&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;COM, and its modern descendant WinRT, provides a binary object interface with versioning and cross-language support on Windows. It remains Windows-only and is rarely used in high-performance computing, scientific computing, or AI infrastructure.&lt;/p&gt;

&lt;p&gt;WebAssembly defines a portable binary format with its own ABI and is the first serious challenger in forty years. But WASM today is slower than native code, its tooling remains immature, and every operating system API still speaks C. WebAssembly may grow into a genuine alternative, but it has not done so yet.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Fifty years of programming language research have produced remarkable advances in safety, expressiveness, and developer experience. Modern languages are genuinely better than C for almost every application-level task.&lt;/p&gt;

&lt;p&gt;Yet all of them, when they need to call outside themselves, target the C ABI.&lt;/p&gt;

&lt;p&gt;This did not come about as a deliberate choice. C was simply there first, and by the time anyone thought to standardise cross-language interoperability, C's conventions were already embedded in every operating system, system library, and driver. In modern parlance: C gained network effects that make it extremely hard to dislodge.&lt;/p&gt;

&lt;p&gt;The C ABI has thus become the binary equivalent of TCP/IP: a protocol everything else is built on, invisible until you look for it and impossible to escape once you do. Kernels are written in C; OS interfaces are specified in C types; critical infrastructure is maintained in C; and the ABI that connects everything is the C ABI.&lt;/p&gt;

&lt;p&gt;You may never write a line of C. But if you write software that loads libraries, calls system APIs, embeds interpreters or links against native code, you are programming against the C ABI whether you know it or not.&lt;/p&gt;

&lt;p&gt;That is why every programmer should learn it.&lt;/p&gt;

</description>
      <category>c</category>
      <category>ffi</category>
      <category>abi</category>
      <category>interoperability</category>
    </item>
    <item>
      <title>No, Clojure: your REPL is not new – or best</title>
      <dc:creator>Dimension AI Technologies</dc:creator>
      <pubDate>Sat, 20 Dec 2025 11:07:34 +0000</pubDate>
      <link>https://forem.com/dimension-ai/no-clojure-your-repl-is-not-new-or-best-556</link>
      <guid>https://forem.com/dimension-ai/no-clojure-your-repl-is-not-new-or-best-556</guid>
      <description>&lt;p&gt;Spend any time around &lt;a href="https://en.wikipedia.org/wiki/Clojure" rel="noopener noreferrer"&gt;Clojure&lt;/a&gt;'s devoted community and you'll encounter a cluster of claims:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the Read–Eval–Print Loop ("REPL") is what makes Clojure fundamentally different (&lt;a href="https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop" rel="noopener noreferrer"&gt;REPL – Wikipedia&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;REPL-driven development supersedes the Edit–Compile–Run model typical of C-family, .NET and Java&lt;/li&gt;
&lt;li&gt;Clojure enables a uniquely live, exploratory way of building systems&lt;/li&gt;
&lt;li&gt;other languages "don't really have a REPL"&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These claims sound radical, historically grounded and quietly superior, but they are – alas – more rhetoric than fact.&lt;/p&gt;

&lt;p&gt;This article challenges established wisdom about Clojure in three areas:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the REPL long predates Clojure&lt;/li&gt;
&lt;li&gt;Clojure neither invented nor uniquely exemplifies it&lt;/li&gt;
&lt;li&gt;several alternative REPL models outperform Clojure's on important engineering criteria&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To be clear, Clojure is a powerful language with dedicated supporters; so what follows is not an attack on Clojure, but a suggestion that its mythology needs a tweak.&lt;/p&gt;




&lt;h2&gt;
  
  
  The REPL long predates Clojure
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Lisp had already established the model by 1958
&lt;/h3&gt;

&lt;p&gt;The Read–Eval–Print Loop originates in Lisp systems developed at MIT in the late 1950s (&lt;a href="https://en.wikipedia.org/wiki/History_of_Lisp" rel="noopener noreferrer"&gt;History of Lisp – Wikipedia&lt;/a&gt;). From the outset, Lisp environments supported:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;interactive evaluation&lt;/li&gt;
&lt;li&gt;incremental definition of functions&lt;/li&gt;
&lt;li&gt;inspection and modification of live runtime state&lt;/li&gt;
&lt;li&gt;persistent sessions&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The terminology itself has a long and well-documented history. The expression &lt;em&gt;"read–eval–print cycle"&lt;/em&gt; is used by L. Peter Deutsch and Edmund Berkeley in a 1964 implementation of Lisp on the PDP-1 (&lt;a href="https://softwarepreservation.computerhistory.org/projects/LISP/book/III_LispBook_Apr66.pdf" rel="noopener noreferrer"&gt;Deutsch &amp;amp; Berkeley, 1964&lt;/a&gt;). Just one month later, Project MAC published a report by Joseph Weizenbaum — later known as the creator of ELIZA, the world's first chatbot — describing a REPL-based language, OPL-1, implemented in his Fortran-SLIP language on the Compatible Time Sharing System (CTSS) (&lt;a href="https://dspace.mit.edu/handle/1721.1/149332" rel="noopener noreferrer"&gt;Weizenbaum, 1964&lt;/a&gt;; &lt;a href="https://en.wikipedia.org/wiki/Compatible_Time-Sharing_System" rel="noopener noreferrer"&gt;CTSS – Wikipedia&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;By 1974, the &lt;em&gt;Maclisp Reference Manual&lt;/em&gt; by David A. Moon explicitly refers to a &lt;em&gt;"read-eval-print loop"&lt;/em&gt; (page 89), even though the acronym "REPL" is not yet used (&lt;a href="https://www.softwarepreservation.org/projects/LISP/MIT/Moon-MACLISP_Reference_Manual-Apr_08_1974.pdf" rel="noopener noreferrer"&gt;Moon, 1974&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;From at least the early 1980s onward, the abbreviations &lt;em&gt;REP loop&lt;/em&gt; and &lt;em&gt;REPL&lt;/em&gt; are attested in the context of Scheme, where the term became standard (&lt;a href="https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop" rel="noopener noreferrer"&gt;Scheme REPL terminology history&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;Clojure deliberately situates itself within this lineage and explicitly inherits from it.&lt;/p&gt;

&lt;p&gt;Once that inheritance is acknowledged, the oft-claimed notion that the REPL distinguishes Clojure needs correction. Any feature drawn wholesale from a sixty-year-old tradition cannot serve as a defining innovation.&lt;/p&gt;




&lt;h3&gt;
  
  
  ML: a typed REPL from the early 1970s
&lt;/h3&gt;

&lt;p&gt;Alongside Lisp, the original ML language — developed in the early 1970s as part of the Edinburgh LCF project (&lt;a href="https://en.wikipedia.org/wiki/Logic_for_Computable_Functions" rel="noopener noreferrer"&gt;LCF – Wikipedia&lt;/a&gt;) — was explicitly designed for interactive use.&lt;/p&gt;

&lt;p&gt;ML's REPL supported incremental definition, immediate evaluation and full static type inference at the prompt. This was not an afterthought. ML was a &lt;em&gt;metalanguage&lt;/em&gt;, intended to be explored live while retaining formal guarantees.&lt;/p&gt;

&lt;p&gt;This matters because it shows that REPL-driven development under strong static typing is not a modern compromise or a reaction against Lisp. It is a parallel tradition, older than Clojure by decades.&lt;/p&gt;




&lt;h3&gt;
  
  
  Smalltalk pushed live systems further in the 1970s
&lt;/h3&gt;

&lt;p&gt;Smalltalk systems went beyond REPL interaction and embraced image-based development (&lt;a href="https://en.wikipedia.org/wiki/Smalltalk" rel="noopener noreferrer"&gt;Smalltalk – Wikipedia&lt;/a&gt;), where the entire system existed as a continuously mutable artefact. Programs were edited while running; the notion of a clean restart receded into the background.&lt;/p&gt;

&lt;p&gt;This approach predates Clojure by decades and represents a more radical commitment to liveness than Clojure's own model.&lt;/p&gt;

&lt;p&gt;However one judges Smalltalk today, its existence alone undermines the idea that live, interactive programming is a modern breakthrough.&lt;/p&gt;




&lt;h3&gt;
  
  
  Home computers normalised persistent REPLs
&lt;/h3&gt;

&lt;p&gt;The most consequential historical counterexample is neither Lisp nor Smalltalk. It is home computing.&lt;/p&gt;

&lt;p&gt;From the late 1970s through the mid-1980s, the overwhelming majority of home computers booted directly into a persistent BASIC environment (&lt;a href="https://en.wikipedia.org/wiki/BASIC" rel="noopener noreferrer"&gt;BASIC – Wikipedia&lt;/a&gt;). Immediate mode was the primary interface. Variables survived &lt;code&gt;RUN&lt;/code&gt;. Programs routinely relied on pre-initialised state in order to function within severe memory constraints.&lt;/p&gt;

&lt;p&gt;This behaviour was universal, not exceptional.&lt;/p&gt;

&lt;h4&gt;
  
  
  Home-computer BASIC: persistent REPL as the default interface
&lt;/h4&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Machine&lt;/th&gt;
&lt;th&gt;Year&lt;/th&gt;
&lt;th&gt;BASIC variant&lt;/th&gt;
&lt;th&gt;Persistent variables&lt;/th&gt;
&lt;th&gt;Immediate mode&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;TRS-80 Model I&lt;/td&gt;
&lt;td&gt;1977&lt;/td&gt;
&lt;td&gt;Microsoft BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Commodore PET&lt;/td&gt;
&lt;td&gt;1977&lt;/td&gt;
&lt;td&gt;Microsoft BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Apple II&lt;/td&gt;
&lt;td&gt;1977&lt;/td&gt;
&lt;td&gt;Applesoft BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Atari 400/800&lt;/td&gt;
&lt;td&gt;1979&lt;/td&gt;
&lt;td&gt;Atari BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ZX80&lt;/td&gt;
&lt;td&gt;1980&lt;/td&gt;
&lt;td&gt;Sinclair BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;VIC-20&lt;/td&gt;
&lt;td&gt;1981&lt;/td&gt;
&lt;td&gt;Commodore BASIC 2.0&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ZX81&lt;/td&gt;
&lt;td&gt;1981&lt;/td&gt;
&lt;td&gt;Sinclair BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;BBC Model A&lt;/td&gt;
&lt;td&gt;1981&lt;/td&gt;
&lt;td&gt;BBC BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;BBC Model B&lt;/td&gt;
&lt;td&gt;1981&lt;/td&gt;
&lt;td&gt;BBC BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;ZX Spectrum&lt;/td&gt;
&lt;td&gt;1982&lt;/td&gt;
&lt;td&gt;Sinclair BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Commodore 64&lt;/td&gt;
&lt;td&gt;1982&lt;/td&gt;
&lt;td&gt;Commodore BASIC 2.0&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Dragon 32/64&lt;/td&gt;
&lt;td&gt;1982–83&lt;/td&gt;
&lt;td&gt;Microsoft BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Oric-1&lt;/td&gt;
&lt;td&gt;1983&lt;/td&gt;
&lt;td&gt;Oric Extended BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Amstrad CPC&lt;/td&gt;
&lt;td&gt;1984&lt;/td&gt;
&lt;td&gt;Locomotive BASIC&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;td&gt;Yes&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Tens of millions of machines shipped with precisely this interaction model.&lt;/p&gt;

&lt;p&gt;Rather than being an esoteric Lisp technique rediscovered decades later, the "tight feedback loop" was simply how personal computing worked for an entire generation. Many senior industry figures learned their craft in that world.&lt;/p&gt;

&lt;p&gt;It is also worth noting that the BASIC world itself once abandoned the interactive model it had popularised. Early BASIC systems treated the programming environment as a live, persistent session. With the move to Visual Basic and later VB.NET (&lt;a href="https://en.wikipedia.org/wiki/Visual_Basic_(classic)" rel="noopener noreferrer"&gt;Visual Basic – Wikipedia&lt;/a&gt;; &lt;a href="https://en.wikipedia.org/wiki/Visual_Basic_.NET" rel="noopener noreferrer"&gt;VB.NET – Wikipedia&lt;/a&gt;), that model gave way to an IDE-centred, project-based workflow, where code was edited, built and executed as a discrete artefact.&lt;/p&gt;

&lt;p&gt;This shifted BASIC from language-as-environment to language-as-artefact.&lt;/p&gt;

&lt;p&gt;Interactivity did not vanish entirely, but it moved from the language to the tooling, and the REPL ceased to be the organising centre.&lt;/p&gt;




&lt;h3&gt;
  
  
  Erlang demonstrated live code in production
&lt;/h3&gt;

&lt;p&gt;To argue that Clojure uniquely enables live mutation of running systems means overlooking Erlang, the telecoms language created by Ericsson (&lt;a href="https://en.wikipedia.org/wiki/Erlang_(programming_language)" rel="noopener noreferrer"&gt;Erlang – Wikipedia&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;From the late 1980s onward, Erlang supported hot code swapping in safety-critical telephony infrastructure. Far from exploratory hacking, this was production engineering under strict uptime requirements.&lt;/p&gt;

&lt;p&gt;Live systems were already operational long before Clojure appeared.&lt;/p&gt;




&lt;h2&gt;
  
  
  What Clojure actually contributes
&lt;/h2&gt;

&lt;p&gt;Clojure's achievement lies elsewhere.&lt;/p&gt;

&lt;p&gt;It brings together:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;the Lisp REPL tradition&lt;/li&gt;
&lt;li&gt;the JVM ecosystem&lt;/li&gt;
&lt;li&gt;persistent data structures (&lt;a href="https://en.wikipedia.org/wiki/Persistent_data_structure" rel="noopener noreferrer"&gt;Persistent data structure – Wikipedia&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;modern concurrency primitives&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This synthesis is real and valuable, and the result of thoughtful engineering.&lt;/p&gt;

&lt;p&gt;Interactive programming and the REPL itself, however, sit firmly in the inherited category.&lt;/p&gt;




&lt;h2&gt;
  
  
  The real trade-off: semantic mutability
&lt;/h2&gt;

&lt;p&gt;The distinctive characteristic of Clojure's REPL is not persistence per se, but the level at which persistence operates.&lt;/p&gt;

&lt;p&gt;Early BASIC systems preserved data. Numbers, arrays and flags survived across runs. Control flow remained linear, and program meaning remained fixed.&lt;/p&gt;

&lt;p&gt;Clojure extends persistence to meaning. Functions may be redefined live. Dispatch rules can be altered. Existing call sites can acquire new behaviour without any obvious signpost.&lt;/p&gt;

&lt;p&gt;This shift increases expressive power, but the engineering bill arrives as reduced reconstructability and predictability.&lt;/p&gt;

&lt;h3&gt;
  
  
  Clojure: silent semantic drift
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight clojure"&gt;&lt;code&gt;&lt;span class="c1"&gt;;; Session start: define calculate&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;user=&amp;gt;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;defn&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;calculate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;*&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="o"&gt;#&lt;/span&gt;&lt;span class="ss"&gt;'user/calculate&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;user=&amp;gt;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;calculate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="c1"&gt;;; Later, perhaps in another file or by another developer:&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="n"&gt;user=&amp;gt;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;defn&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;calculate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;+&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="w"&gt;   &lt;/span&gt;&lt;span class="c1"&gt;; silently replaces original&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="o"&gt;#&lt;/span&gt;&lt;span class="ss"&gt;'user/calculate&lt;/span&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="n"&gt;user=&amp;gt;&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nf"&gt;calculate&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="mi"&gt;105&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="c1"&gt;;; No warning. No error. Source file unchanged.&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For a single developer, this is simply a useful tool. In a team, it becomes a coordination hazard: runtime truth can drift away from the text everyone believes they are running.&lt;/p&gt;
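&lt;p&gt;The hazard itself is not unique to Clojure: any persistent session that allows live rebinding exhibits the same drift. A minimal Python sketch of the equivalent session:&lt;/p&gt;

```python
# Silent semantic drift in a persistent interactive session.
def calculate(x):
    return x * 2

first = calculate(5)   # 10

# Later in the same session, a new definition silently replaces the old:
def calculate(x):
    return x + 100

second = calculate(5)  # 105, with no warning and no change to any source file
print(first, second)
```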




&lt;h2&gt;
  
  
  On Rich Hickey's position
&lt;/h2&gt;

&lt;p&gt;In talks such as &lt;em&gt;Simple Made Easy&lt;/em&gt; (&lt;a href="https://www.infoq.com/presentations/Simple-Made-Easy/" rel="noopener noreferrer"&gt;InfoQ video&lt;/a&gt;) and various discussions of REPL-driven development, Rich Hickey has argued that interactive development should be central, while edit–compile–run should be viewed as inefficient.&lt;/p&gt;

&lt;p&gt;The intuition behind this argument is understandable. Iteration speed matters.&lt;/p&gt;

&lt;p&gt;The framing, however, tends to obscure two facts:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;interactive development long predates Clojure&lt;/li&gt;
&lt;li&gt;many systems combine interactivity with compilation discipline&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Growing systems live accelerates exploration, but it also erodes the ability to reconstruct behaviour from source alone. That trade-off deserves to be stated explicitly.&lt;/p&gt;

&lt;p&gt;Hickey did not claim the REPL was new. The Clojure community, however, often treats it as uniquely Clojure's.&lt;/p&gt;




&lt;h2&gt;
  
  
  A structural comparison
&lt;/h2&gt;

&lt;p&gt;A clearer perspective emerges from comparing REPL models across history.&lt;/p&gt;

&lt;p&gt;Legend: ✅ Yes, ❌ No, 🟡 Limited&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Property&lt;/th&gt;
&lt;th&gt;1958–60: Lisp REPL (image-based)&lt;/th&gt;
&lt;th&gt;1964: Dartmouth BASIC&lt;/th&gt;
&lt;th&gt;1977–85: Home BASIC&lt;/th&gt;
&lt;th&gt;1973–: ML / OCaml REPL&lt;/th&gt;
&lt;th&gt;2005–: F# REPL&lt;/th&gt;
&lt;th&gt;2007–: Clojure REPL&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Interactive evaluation&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Immediate mode&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Persistent session&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Variables persist across runs&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;First-class functions&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Live function redefinition&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌*&lt;/td&gt;
&lt;td&gt;🟡&lt;/td&gt;
&lt;td&gt;🟡&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Behavioural semantics mutable live&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;🟡&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Static type checking&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Type system constrains REPL&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Silent semantic mutation possible&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Reconstructable from source alone&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Designed for long-lived systems&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;❌&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;td&gt;✅&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;em&gt;* BBC BASIC excepted; most home BASICs lacked named functions and relied on line-numbered subroutines.&lt;/em&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  A counterexample: the ML lineage (OCaml and F#)
&lt;/h2&gt;

&lt;p&gt;F# is not an outlier. It belongs to a lineage – ML and OCaml (&lt;a href="https://en.wikipedia.org/wiki/OCaml" rel="noopener noreferrer"&gt;OCaml – Wikipedia&lt;/a&gt;) – that has treated interactive, typed REPL-driven development as normal practice for over forty years.&lt;/p&gt;

&lt;p&gt;F#'s REPL chooses constraint over maximal dynamism, but the mechanism deserves to be described accurately.&lt;/p&gt;

&lt;p&gt;F# does not redefine a function in place; it shadows the old binding with a new one. When type signatures match, however, F# offers no protection against semantic drift – the same 'silent mutation' problem exists as in Clojure:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight fsharp"&gt;&lt;code&gt;&lt;span class="c1"&gt;// F#: type system provides no protection when signatures match&lt;/span&gt;
&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;;;&lt;/span&gt;
&lt;span class="k"&gt;val&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="p"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt;

&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;;;&lt;/span&gt;
&lt;span class="k"&gt;val&lt;/span&gt; &lt;span class="n"&gt;it&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;10&lt;/span&gt;

&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;&lt;span class="p"&gt;;;&lt;/span&gt;   &lt;span class="c1"&gt;// same signature, different semantics&lt;/span&gt;
&lt;span class="k"&gt;val&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="p"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt;

&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;calculate&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;;;&lt;/span&gt;
&lt;span class="k"&gt;val&lt;/span&gt; &lt;span class="n"&gt;it&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;105&lt;/span&gt;
&lt;span class="c1"&gt;// Same silent semantic drift as Clojure&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The difference is that F#'s type system provides &lt;em&gt;partial&lt;/em&gt; protection: it prevents one class of errors (type mismatches at call sites) but not semantic changes within the same type. When a function is redefined with a different type signature, the type system catches inconsistent usage at the point of application, forcing the programmer to address the incompatibility explicitly.&lt;/p&gt;
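&lt;p&gt;To make that contrast concrete, here is a hypothetical F# Interactive session (the exact compiler message varies by version) in which the redefinition changes the signature, so a stale call site fails to type-check instead of silently changing meaning:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight fsharp"&gt;&lt;code&gt;// F#: a signature change is surfaced at the call site
&amp;gt; let calculate x = x * 2;;
val calculate: x: int -&amp;gt; int

&amp;gt; let calculate (x: string) = x.Length;;   // new signature shadows the old one
val calculate: x: string -&amp;gt; int

&amp;gt; calculate 5;;
error FS0001: This expression was expected to have type 'string'
but here has type 'int'
// The drift is visible, not silent – the programmer must reconcile it
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;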

&lt;p&gt;For teams that value refactoring, long-term maintenance and source-level truth, this partial discipline often proves advantageous – though it does not eliminate the reconstructability problem entirely.&lt;/p&gt;




&lt;h2&gt;
  
  
  "But Clojure's REPL is integrated"
&lt;/h2&gt;

&lt;p&gt;A predictable rebuttal points to tooling: CIDER (&lt;a href="https://cider.mx" rel="noopener noreferrer"&gt;https://cider.mx&lt;/a&gt;), nREPL (&lt;a href="https://nrepl.org" rel="noopener noreferrer"&gt;https://nrepl.org&lt;/a&gt;) and editor integration.&lt;/p&gt;

&lt;p&gt;Better tools improve the experience. They do not alter the underlying model, which remains rooted in a design first implemented in the 1950s.&lt;/p&gt;




&lt;h2&gt;
  
  
  Why the mythology persists
&lt;/h2&gt;

&lt;p&gt;The persistence of the REPL myth is easy to explain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lisp culture has always emphasised interactivity&lt;/li&gt;
&lt;li&gt;computing history before the web is poorly remembered&lt;/li&gt;
&lt;li&gt;expressive power is often confused with novelty&lt;/li&gt;
&lt;li&gt;difficulty is frequently mistaken for depth&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;None of this indicates bad faith, but it does call for clearer framing.&lt;/p&gt;

&lt;p&gt;The shift away from everyday interactive environments took place more than a generation ago. As a consequence, the systems that once defined ordinary programming gradually slipped from the memory of both enthusiasts and professionals, and later rediscoveries of REPL-driven workflows were easily mistaken for innovations rather than revivals.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The REPL is an immensely valuable tool. Its lineage stretches back more than six decades. Clojure's indisputable power rests heavily on it, but that power arrives through inheritance rather than invention.&lt;/p&gt;

&lt;p&gt;Other languages explore the same design space differently, sometimes with stronger engineering constraints.&lt;/p&gt;

&lt;p&gt;It is also worth noting a structural echo. REPL-driven workflows treat programming as a dialogue rather than a batch process, and today's agent-driven coding tools adopt a similar stance. They extend the conversational model from interaction with a running &lt;em&gt;program&lt;/em&gt; to interaction with an entire &lt;em&gt;codebase&lt;/em&gt;. It would be a stretch to claim a direct causal link, but the resemblance is clear: both approaches reject the assumption that code must be "finished" before it can be tested against reality.&lt;/p&gt;

&lt;p&gt;Clojure's REPL is powerful, and it has attracted a great deal of attention. It has reinvigorated an old paradigm and delighted its community. Power, however, is not novelty, and enthusiasm must not rewrite history.&lt;/p&gt;




&lt;h2&gt;
  
  
  Acknowledgments
&lt;/h2&gt;

&lt;p&gt;Thanks to Frank Adrian for pointing out that the original F# example compared type-system differences rather than REPL-model differences. The corrected example now shows how F#, like Clojure, permits silent semantic drift when function signatures remain unchanged; the F# type system provides only partial protection against this class of error.&lt;/p&gt;




&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Read–Eval–Print Loop (REPL) – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;History of Lisp – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/History_of_Lisp" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/History_of_Lisp&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;L. Peter Deutsch &amp;amp; Edmund Berkeley, &lt;em&gt;LISP on the PDP-1&lt;/em&gt; (1964)&lt;br&gt;
&lt;a href="https://softwarepreservation.computerhistory.org/projects/LISP/book/III_LispBook_Apr66.pdf" rel="noopener noreferrer"&gt;https://softwarepreservation.computerhistory.org/projects/LISP/book/III_LispBook_Apr66.pdf&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Joseph Weizenbaum, &lt;em&gt;OPL-1 on CTSS&lt;/em&gt; (1964)&lt;br&gt;
&lt;a href="https://dspace.mit.edu/handle/1721.1/149332" rel="noopener noreferrer"&gt;https://dspace.mit.edu/handle/1721.1/149332&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Compatible Time-Sharing System (CTSS) – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Compatible_Time-Sharing_System" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Compatible_Time-Sharing_System&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;David A. Moon, &lt;em&gt;Maclisp Reference Manual&lt;/em&gt; (1974)&lt;br&gt;
&lt;a href="https://www.softwarepreservation.org/projects/LISP/MIT/Moon-MACLISP_Reference_Manual-Apr_08_1974.pdf" rel="noopener noreferrer"&gt;https://www.softwarepreservation.org/projects/LISP/MIT/Moon-MACLISP_Reference_Manual-Apr_08_1974.pdf&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Scheme REPL terminology history&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Read%E2%80%93eval%E2%80%93print_loop&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Edinburgh LCF – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Edinburgh_LCF" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Edinburgh_LCF&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Smalltalk – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Smalltalk" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Smalltalk&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;BASIC – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/BASIC" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/BASIC&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Visual Basic (classic) – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Visual_Basic_(classic)" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Visual_Basic_(classic)&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;VB.NET – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Visual_Basic_.NET" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Visual_Basic_.NET&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Erlang (programming language) – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Erlang_(programming_language)" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Erlang_(programming_language)&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Persistent data structure – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/Persistent_data_structure" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Persistent_data_structure&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Rich Hickey, &lt;em&gt;Simple Made Easy&lt;/em&gt; (InfoQ video)&lt;br&gt;
&lt;a href="https://www.infoq.com/presentations/Simple-Made-Easy/" rel="noopener noreferrer"&gt;https://www.infoq.com/presentations/Simple-Made-Easy/&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;OCaml – Wikipedia&lt;br&gt;
&lt;a href="https://en.wikipedia.org/wiki/OCaml" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/OCaml&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;CIDER&lt;br&gt;
&lt;a href="https://cider.mx" rel="noopener noreferrer"&gt;https://cider.mx&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;nREPL&lt;br&gt;
&lt;a href="https://nrepl.org" rel="noopener noreferrer"&gt;https://nrepl.org&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>clojure</category>
      <category>readevaluateprintloop</category>
      <category>programminghistory</category>
      <category>basic</category>
    </item>
  </channel>
</rss>
