<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Tao Wen</title>
    <description>The latest articles on Forem by Tao Wen (@taowen).</description>
    <link>https://forem.com/taowen</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F104830%2F6db99159-c1fe-4974-a6f8-d8ac6ea130ba.jpeg</url>
      <title>Forem: Tao Wen</title>
      <link>https://forem.com/taowen</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/taowen"/>
    <language>en</language>
    <item>
      <title>Technology Minimalist</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Fri, 20 May 2022 06:21:38 +0000</pubDate>
      <link>https://forem.com/taowen/technology-minimalist-3b66</link>
      <guid>https://forem.com/taowen/technology-minimalist-3b66</guid>
<description>&lt;p&gt;Instead of keeping on adding new layers, there is a trend to go in the opposite direction.&lt;/p&gt;

&lt;h2&gt;Library over Framework&lt;/h2&gt;

&lt;p&gt;We used to think reuse is a good thing. Now more and more people realise that reuse can cause more damage than good. There are good reasons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;UI design is highly volatile. More and more things become "headless" because we just want the behaviour without the "style".&lt;/li&gt;
&lt;li&gt;A big framework owns the control flow, which makes debugging mysterious.&lt;/li&gt;
&lt;li&gt;Implementing your requirements through options and callbacks provided by someone else is not fun. It takes time to memorize the options and callbacks, and sometimes the one you need does not exist.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Reuse is still going to happen, just in a more granular way. The UI will be ripped out and the framework torn apart, leaving us with a lot of highly focused libraries that each solve one problem at a time.&lt;/p&gt;

&lt;h2&gt;Boring over Fancy&lt;/h2&gt;

&lt;p&gt;We used to think adding a fancy new technology is always a good thing. Now more and more people realise they just need to get the job done with the least amount of moving parts.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Do we always need NoSQL or Hadoop? Can we get the job done with PostgreSQL plus some extensions?&lt;/li&gt;
&lt;li&gt;Do we always need an SPA? Can we get the job done with server-generated HTML?&lt;/li&gt;
&lt;li&gt;Do we always need another microservice? Can we get the job done by extracting a library?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://boringtechnology.club/"&gt;Choosing boring technology&lt;/a&gt; is not just about stability, it is also about better developer experience. With fewer compilation units, fewer runtime services, and fewer co-workers involved, the job gets a lot easier.&lt;/p&gt;

&lt;h2&gt;Copy-paste is underrated&lt;/h2&gt;

&lt;p&gt;We used to think there could be another SaaS, another product, another library building another abstraction to make reuse happen. Well, that is because we all hate starting from scratch again and again.&lt;/p&gt;

&lt;p&gt;Between starting from scratch and reusing a library, there is a middle ground called "copy-paste". We have all used this approach, but are ashamed to admit it. With the aid of GitHub Copilot, this kind of reuse will become more and more common.&lt;/p&gt;

&lt;h2&gt;Encapsulation can be cheap&lt;/h2&gt;

&lt;p&gt;We spent a lot of effort combating the increasing complexity of legacy code. We want to untangle the codebase through enforced encapsulation. So there was a fashion of starting a new project with 100 or more micro-services.&lt;/p&gt;

&lt;p&gt;Enforcing encapsulation does not have to be that expensive. With the help of TypeScript and other static type checkers, we can enforce encapsulation at compile time without paying a runtime cost. Hiding internal implementation details through package.json dependency relationships is not rocket science; it is called "dependency management".&lt;/p&gt;
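&lt;p&gt;A few lines of TypeScript can sketch the idea (all names below are made up for illustration): the module exports only one function, so the helper stays an internal detail that the type checker will not let other code import. Encapsulation is enforced at compile time, with no extra runtime service.&lt;/p&gt;

```typescript
// Sketch of compile-time encapsulation; names are hypothetical.
type Order = { id: string; total: number };

// Not exported: an implementation detail other packages cannot reach.
function applyDiscount(total: number): number {
  return total - 10; // flat discount, purely for illustration
}

// The only public surface of this module.
export function checkout(order: Order): number {
  return applyDiscount(order.total);
}
```

&lt;p&gt;Pair this with package.json dependency declarations and the boundary is as hard as a service boundary, at zero runtime cost.&lt;/p&gt;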

&lt;h2&gt;Technology Minimalist&lt;/h2&gt;

&lt;p&gt;A minimalist seeks the balance point: minimal, but still effective.&lt;/p&gt;

&lt;p&gt;A minimalist does not care that no shiny new keyword appears on their resume.&lt;/p&gt;

&lt;p&gt;A minimalist enjoys solving the problem and delivering the result, and has the fun of writing useful code along the journey.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>programming</category>
    </item>
    <item>
      <title>Use jsx as server side html template</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Wed, 18 May 2022 07:33:10 +0000</pubDate>
      <link>https://forem.com/taowen/use-jsx-as-server-side-html-template-3k40</link>
      <guid>https://forem.com/taowen/use-jsx-as-server-side-html-template-3k40</guid>
      <description>&lt;p&gt;source code: &lt;a href="https://github.com/taowen/incremental-html/tree/main/packages/jsx-to-html"&gt;https://github.com/taowen/incremental-html/tree/main/packages/jsx-to-html&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="c1"&gt;// filename: newsletter.tsx&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;jsxToHtml&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@incremental-html/jsx-to-html&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;

&lt;span class="c1"&gt;// server is an express router&lt;/span&gt;
&lt;span class="nx"&gt;server&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="kd"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/newsletter&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="na"&gt;html&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;jsxToHtml&lt;/span&gt;&lt;span class="p"&gt;(&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;hello&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;);&lt;/span&gt;
    &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="kd"&gt;set&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Content-Type&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;text/html&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;}).&lt;/span&gt;&lt;span class="nx"&gt;end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;html&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;tsconfig.json&lt;/code&gt; should be configured like this to translate &lt;code&gt;*.tsx&lt;/code&gt; using jsxToHtml:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"compilerOptions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt;//...&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"jsx"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"react"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"jsxFactory"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"jsxToHtml.createElement"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="nl"&gt;"jsxFragmentFactory"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"jsxToHtml.Fragment"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="err"&gt;//...&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;async context&lt;/h2&gt;

&lt;p&gt;We can use jsxToHtml as an alternative to node.js &lt;code&gt;async_hooks&lt;/code&gt;.&lt;br&gt;
There is no runtime trick; it works in any environment (including Deno, Cloudflare Workers, etc.)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight tsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;test&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;component with context&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;C1&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="na"&gt;props&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{},&lt;/span&gt; &lt;span class="na"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kr"&gt;string&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="k"&gt;void&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;resolve&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;resolve&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;ctx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;msg&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;context&lt;/span&gt; &lt;span class="na"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"hello"&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;C1&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;context&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="nx"&gt;expect&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;jsxToHtml&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;original msg&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;})).&lt;/span&gt;&lt;span class="nx"&gt;toBe&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;div&amp;gt;&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s1"&gt;hello&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s1"&gt;&amp;lt;/div&amp;gt;&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The context is automatically passed down the tree.&lt;br&gt;
&lt;code&gt;&amp;lt;context&amp;gt;&lt;/code&gt; is a built-in component for altering the context in the middle of the tree.&lt;/p&gt;
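&lt;p&gt;As a rough sketch of the mechanism (hypothetical helper names, not the library's API), context threading amounts to passing a ctx argument down to every child, with the built-in context element merging its overrides into the parent ctx first:&lt;/p&gt;

```typescript
// Minimal model of context propagation; not the library's source.
type Ctx = { msg: string };

// A component receives props plus the current context, as in the post.
const C1 = async (_props: {}, ctx: Ctx) => "div says: " + ctx.msg;

// Stand-in for the built-in context element: merge overrides into the
// parent context, then render the child with the merged context.
async function withContext(override: Ctx, child: typeof C1, parent: Ctx) {
  return child({}, { ...parent, ...override });
}

withContext({ msg: "hello" }, C1, { msg: "original msg" }).then(console.log);
// logs "div says: hello" -- the override wins over the outer context
```

&lt;p&gt;Because the context rides along the ordinary call stack, this style works in any JavaScript runtime, which is why no async_hooks-style machinery is needed.&lt;/p&gt;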

</description>
      <category>node</category>
      <category>javascript</category>
      <category>web</category>
    </item>
    <item>
      <title>How to auto reload node.js server</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Thu, 12 May 2022 00:13:53 +0000</pubDate>
      <link>https://forem.com/taowen/how-to-auto-reload-nodejs-server-ebm</link>
      <guid>https://forem.com/taowen/how-to-auto-reload-nodejs-server-ebm</guid>
      <description>&lt;p&gt;Source code: &lt;a href="https://github.com/taowen/vite-howto/tree/main/packages/SSR/auto-reload-node-server"&gt;https://github.com/taowen/vite-howto/tree/main/packages/SSR/auto-reload-node-server&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;Code Structure &amp;amp; Motivation&lt;/h2&gt;

&lt;p&gt;It is a node.js application, using express to listen at &lt;a href="http://localhost:3000"&gt;http://localhost:3000&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;server/server-entry.ts is the entry point, which listens on the HTTP port&lt;/li&gt;
&lt;li&gt;server/server.ts contains the main logic&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;DX Problems&lt;/h2&gt;

&lt;p&gt;The dev server should auto-reload the node.js server when we have changed the source. nodemon can monitor source code changes and restart the node process, but restarting takes time. It would be nicer to apply the change without a process restart.&lt;/p&gt;

&lt;h2&gt;UX Problems&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;vite build server&lt;/code&gt; should package every server-entry.ts dependency (except node itself), so we do not need to run &lt;code&gt;npm install&lt;/code&gt; again when deploying.&lt;/p&gt;

&lt;h2&gt;Solution Walkthrough&lt;/h2&gt;

&lt;h3&gt;build node.js application to a bundle&lt;/h3&gt;

&lt;p&gt;server/vite.config.ts&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;defineConfig&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;vite&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;defineConfig&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;build&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;ssr&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./server-entry.ts&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;outDir&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../dist&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will bundle &lt;code&gt;server/server-entry.ts&lt;/code&gt; into &lt;code&gt;dist/server-entry.js&lt;/code&gt; together with everything it references (except the node.js standard library). The output is in CommonJS format, ready to be executed in a node.js environment. &lt;code&gt;build.ssr&lt;/code&gt; is provided by vite for building node.js servers.&lt;/p&gt;

&lt;h3&gt;development server&lt;/h3&gt;

&lt;p&gt;During development, we want vite to transform server.ts on the fly when serving &lt;code&gt;http://localhost:3000/&lt;/code&gt;, so we can skip the compilation step after making changes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;express&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;express&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;createServer&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="nx"&gt;createViteServer&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;vite&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nx"&gt;main&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;app&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;express&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="c1"&gt;// auto reload in dev mode&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;vite&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;createViteServer&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
        &lt;span class="na"&gt;server&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;middlewareMode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ssr&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;watch&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="c1"&gt;// During tests we edit the files too fast and sometimes chokidar&lt;/span&gt;
                &lt;span class="c1"&gt;// misses change events, so enforce polling for consistency&lt;/span&gt;
                &lt;span class="na"&gt;usePolling&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;interval&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/(.*)&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;originalUrl&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;method&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;default&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;handle&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;vite&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ssrLoadModule&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./server/server.ts&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="nx"&gt;handle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;req&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;vite&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;ssrFixStacktrace&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;stack&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;stack&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nx"&gt;resp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;404&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;end&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;});&lt;/span&gt;
    &lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="nx"&gt;app&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;listen&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;3000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;http://localhost:3000&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="nx"&gt;main&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We use &lt;code&gt;await vite.ssrLoadModule('./server/server.ts')&lt;/code&gt; to transform the code and run it. ssrLoadModule is invoked on every request, and &lt;code&gt;server.watch&lt;/code&gt; is configured as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;vite&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;createViteServer&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;server&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;middlewareMode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ssr&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;watch&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="c1"&gt;// During tests we edit the files too fast and sometimes chokidar&lt;/span&gt;
            &lt;span class="c1"&gt;// misses change events, so enforce polling for consistency&lt;/span&gt;
            &lt;span class="na"&gt;usePolling&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;interval&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;100&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If we change the server code, we can see the effect just by refreshing the browser, which sends another request to the dev server. &lt;code&gt;vite.ssrFixStacktrace(e)&lt;/code&gt; fixes the exception stack trace so that it reports the correct original line numbers instead of the line numbers in the transformed file.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>node</category>
      <category>vite</category>
    </item>
    <item>
      <title>How to get more people to fund your open source project</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Sat, 07 May 2022 12:01:36 +0000</pubDate>
      <link>https://forem.com/taowen/how-to-get-more-people-fund-your-opensource-project-54ml</link>
      <guid>https://forem.com/taowen/how-to-get-more-people-fund-your-opensource-project-54ml</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_r-qRzEV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b0gjcuksludk6amohtkr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_r-qRzEV--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/b0gjcuksludk6amohtkr.png" alt="Image description" width="880" height="412"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Access to all Pro examples; Remove the React Flow attribution; Prioritized Bug Reports and Feature Requests&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6g--ngPS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jxmtg4nm5lffq8r35doe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6g--ngPS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jxmtg4nm5lffq8r35doe.png" alt="Image description" width="880" height="599"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;TypeScript source code is only accessible to sponsors&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--BAGGo1_a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0hpe9ubwp9kxh54dk462.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--BAGGo1_a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0hpe9ubwp9kxh54dk462.png" alt="Image description" width="880" height="1214"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;TypeScript source code is only accessible to sponsors&lt;/p&gt;

&lt;p&gt;The pattern is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;anyone can npm install the package for free&lt;/li&gt;
&lt;li&gt;examples are exclusive to sponsors&lt;/li&gt;
&lt;li&gt;source code is exclusive to sponsors&lt;/li&gt;
&lt;li&gt;sponsors have a better chance of getting their feature requests implemented&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>opensource</category>
    </item>
    <item>
      <title>Checklist for UI frameworks evaluation</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Tue, 19 Apr 2022 23:03:10 +0000</pubDate>
      <link>https://forem.com/taowen/a-checklist-for-ui-frameworks-evaluation-27kd</link>
      <guid>https://forem.com/taowen/a-checklist-for-ui-frameworks-evaluation-27kd</guid>
      <description>&lt;ul&gt;
&lt;li&gt;load the server-generated page without leaving it blank for too long; load as much as possible within a time budget and defer unfinished loading to a second roundtrip&lt;/li&gt;
&lt;li&gt;show loading indicators so other areas can render before everything is ready, but not too many indicators&lt;/li&gt;
&lt;li&gt;show errors within their area, instead of making the whole page unusable&lt;/li&gt;
&lt;li&gt;show the result of an action without a whole-page refresh, keeping uncommitted edit state in other areas&lt;/li&gt;
&lt;li&gt;show feedback while typing, saving an extra click&lt;/li&gt;
&lt;li&gt;show search results while typing, saving an extra click&lt;/li&gt;
&lt;li&gt;if a button click takes some time, show a processing indicator on the button to prevent the user from clicking twice&lt;/li&gt;
&lt;li&gt;if server processing takes time, the client may update optimistically before the server confirms&lt;/li&gt;
&lt;li&gt;show errors next to the input&lt;/li&gt;
&lt;li&gt;avoid multi-page forms; prefer minimal data entry initially and grow the form gradually as the user provides more information&lt;/li&gt;
&lt;li&gt;use infinite scroll to load more&lt;/li&gt;
&lt;li&gt;use pull-down to refresh&lt;/li&gt;
&lt;li&gt;use swipe to show/hide extra actions&lt;/li&gt;
&lt;li&gt;use drag and drop to re-order items&lt;/li&gt;
&lt;li&gt;use drag and drop to connect relationships&lt;/li&gt;
&lt;li&gt;use half-screen dialogs to replace page jumps, use inline editing to replace modal dialogs, and avoid jumping around where possible&lt;/li&gt;
&lt;li&gt;use masonry to lay out double columns and use screen space more efficiently&lt;/li&gt;
&lt;li&gt;use FLIP layout animation to avoid content suddenly appearing or disappearing&lt;/li&gt;
&lt;li&gt;preload the next page and show progress, saving the waiting time after switching&lt;/li&gt;
&lt;li&gt;show the current and next page side by side with a transition animation, if no loading is required&lt;/li&gt;
&lt;li&gt;go back to the previous page without waiting for a reload&lt;/li&gt;
&lt;li&gt;preserve unsaved forms in the browser&lt;/li&gt;
&lt;li&gt;render big pages with many DOM nodes by showing only the portion in the viewport&lt;/li&gt;
&lt;li&gt;allow multiple concurrent actions to end up in a consistent final state&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Go through the checklist, and find out how to implement each feature in the framework you are evaluating (for example &lt;a href="https://remix.run"&gt;https://remix.run&lt;/a&gt;).&lt;/p&gt;
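&lt;p&gt;To make one of the items concrete, here is a minimal, framework-free sketch of the optimistic-update item, using a hypothetical todo list (none of these names come from a real framework):&lt;/p&gt;

```typescript
// Optimistic update sketch: apply the user's action locally right away,
// then reconcile once the server confirms. All names are hypothetical.
type Todo = { id: number; done: boolean };

function optimisticToggle(todos: Todo[], id: number): Todo[] {
  // render immediately from this value, before the server responds
  return todos.map(t => (t.id === id ? { id: t.id, done: !t.done } : t));
}

function reconcile(local: Todo[], server: Todo[]): Todo[] {
  // the server's confirmed state wins once it arrives
  const byId = new Map(server.map(t => [t.id, t] as [number, Todo]));
  return local.map(t => byId.get(t.id) ?? t);
}
```

&lt;p&gt;The local toggle renders immediately; once the server responds, &lt;code&gt;reconcile&lt;/code&gt; lets the confirmed state win.&lt;/p&gt;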

</description>
      <category>frontend</category>
      <category>ui</category>
      <category>design</category>
    </item>
    <item>
      <title>Will Low-Code development tools be commercially profitable?</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Wed, 06 Jan 2021 22:03:32 +0000</pubDate>
      <link>https://forem.com/taowen/will-low-code-development-tools-commercially-profitable-34ib</link>
      <guid>https://forem.com/taowen/will-low-code-development-tools-commercially-profitable-34ib</guid>
<description>&lt;p&gt;There is an idealized story: if Low-Code development tools can be 10x more productive, they give a competitive advantage on the cost of any kind of software project. Outsourcing vendors will want to purchase such a tool to compete with others.&lt;/p&gt;

&lt;p&gt;Why doesn't this story work?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Unlike the SaaS model, where users find the vendor through searching the market, an outsourcing vendor seeks customers actively; it does not have the luxury of filtering for "good quality" customers through a funnel. To keep the utilization rate up, some less-than-ideal projects must be taken. Newly found customers come with high business-development costs, and some revenue is costly to collect long after a project finishes. Among all the costs of an outsourced project, development labor is just one, and sometimes an insignificant one.&lt;/li&gt;
&lt;li&gt;The tool cannot be fully automated; humans are needed to drive it. Humans need a period of learning, and must make design decisions wisely in daily use. Even if the tool is bug-free and well-documented, learning still takes time. And achieving engineering properties such as "low coupling, high cohesion" is still a black art that largely depends on the human designer, no matter how the development tool markets itself.&lt;/li&gt;
&lt;li&gt;Software development tools are natural monopolies, due to network effects. You do not want a private company to run your water/electricity; the monopoly will make you pay more. You do not want JavaScript/npm to be privately owned by a company, so that every npm publish incurs a charge to your credit card. Software development tools, as public infrastructure, should be open source and remain open source.
&lt;/li&gt;
&lt;li&gt;Development tool users are developers; SaaS users are ordinary corporate workers. Developers are more concerned with vendor lock-in. Why? To a SaaS user it is just a matter of choice: they will be locked in by you or by someone else. But developers feel that open source developer tools are truly in their own hands.&lt;/li&gt;
&lt;li&gt;Developers have career concerns. This is a double-edged sword: Salesforce admins, for example, will try their best to keep Salesforce in the market, given their investment in the technology.&lt;/li&gt;
&lt;li&gt;I am not saying open source === no profitable business opportunity. There are many successful commercial companies behind open source projects. But in the area of development tools, the likely model is consulting. It might be profitable, but due to its human-intensive nature it will not scale. Investing millions of dollars in Low-Code to sell copies of the tool will likely lead to commercial failure. It has a much better chance of success run as a not-for-profit open source project, like Ruby on Rails. The next DHH could be you.&lt;/li&gt;
&lt;li&gt;A tool is easier to sell if it has an immediate effect on an existing codebase. I have written a record &amp;amp; replay tool that could reproduce production errors without code modification; it won the hearts of developers overnight. Low-Code does not work for legacy code; it targets greenfield projects. A tool that cannot solve an existing problem right now is much, much harder to sell. It takes courage to trust something that might work in the far future.&lt;/li&gt;
&lt;li&gt;10x more productive does not mean a 10x drop in latency (time to market). Unlike outsourcing vendors and SaaS companies, Internet businesses do not care that much about total cost. But extreme time to market is desired, in the hope of market domination. Modifying PHP files in production is cutting corners, but it may be exactly what is needed to be extremely fast.&lt;/li&gt;
&lt;li&gt;Is 10x-more-productive Low-Code a lie? I still believe in it. But given the points above, it will not be a good business commercially.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;There have been many startups in this area recently. If they are not selling development tools directly to developers, what market should they target?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;SaaS + lowcode: Besides "truly useful to the user", stability, and page-loading speed, the fourth selling point should be "on-premise deployment and customization". A SaaS product can sell to more customers if it has Low-Code capability built in.&lt;/li&gt;
&lt;li&gt;lowcode + a family of SaaS: Instead of being sold as a development tool, lowcode can be wrapped as a family of SaaS products. It will not be feature-rich enough (or feature-bloated, should we say) to be competitive by itself, but it will appeal to those who want customization.&lt;/li&gt;
&lt;li&gt;Data-centric base: Microsoft recently launched a product called &lt;a href="https://docs.microsoft.com/en-us/powerapps/maker/data-platform/data-platform-intro"&gt;Dataverse&lt;/a&gt;. It might address the problem described by the book &lt;a href="https://www.amazon.com/Software-Wasteland-Application-Centric-Hobbling-Enterprises/dp/1634623169"&gt;"Software Wasteland"&lt;/a&gt;. I would describe the solution as "backend as a database" for the age of frontend-backend separation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can also find me on Twitter: &lt;a href="https://twitter.com/nctaowen"&gt;https://twitter.com/nctaowen&lt;/a&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Connect react svg components</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Sat, 12 Oct 2019 07:31:49 +0000</pubDate>
      <link>https://forem.com/taowen/connect-react-svg-components-l70</link>
      <guid>https://forem.com/taowen/connect-react-svg-components-l70</guid>
      <description>&lt;h1&gt;
  
  
  Connect two rectangles via a straight line
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/wvvMMpm?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;The from and to positions are hard-coded. Can we make the shapes collaboratively figure out these parameters?&lt;/p&gt;

&lt;h1&gt;
  
  
  Add from and to property
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/zYYrryG?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;This time, Rect saves its center to its datum. The Line can figure out x1, y1, x2, y2 by querying the datum of its from and to nodes.&lt;/p&gt;
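&lt;p&gt;Stripped of D3, the datum query amounts to a shared map of centers; the line derives its endpoints instead of storing them. A rough sketch (the datum layout is a hypothetical simplification of the pen's code):&lt;/p&gt;

```typescript
// Each node stores its center point as its datum; the line derives
// x1/y1/x2/y2 by querying the datum of its `from` and `to` nodes.
type Center = { cx: number; cy: number };

const datums = new Map(); // node id -> Center

function saveCenter(id: string, center: Center) {
  datums.set(id, center);
}

function lineEndpoints(from: string, to: string) {
  const a: Center = datums.get(from);
  const b: Center = datums.get(to);
  return { x1: a.cx, y1: a.cy, x2: b.cx, y2: b.cy };
}
```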

&lt;h1&gt;
  
  
  Make it draggable
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/xxxZVdd?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;We copied the code from &lt;a href="https://dev.to/taowen/make-react-svg-component-draggable-2kc"&gt;https://dev.to/taowen/make-react-svg-component-draggable-2kc&lt;/a&gt;. We can see that the line does not follow the rectangles. Let's fix it.&lt;/p&gt;

&lt;h1&gt;
  
  
  Make the connector follow dragging
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/gOOPPZv?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;We added a custom event called &lt;code&gt;moved&lt;/code&gt;. When a Rect is being dragged, the &lt;code&gt;moved&lt;/code&gt; event is handled by both the Rect itself and the connected lines. D3 requires multiple listeners to be registered in their own namespaces, so each event name has a different suffix.&lt;/p&gt;
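&lt;p&gt;D3 keeps one listener per full event name, but treats everything after the dot as a namespace, so &lt;code&gt;moved.self&lt;/code&gt; and &lt;code&gt;moved.line1&lt;/code&gt; can coexist and both fire for &lt;code&gt;moved&lt;/code&gt;. A toy model of that naming scheme (not d3's real internals):&lt;/p&gt;

```typescript
// Minimal model of d3-style namespaced events: "moved.self" and
// "moved.line1" are distinct registrations, but both fire for "moved".
const listeners = new Map(); // "type.namespace" -> handler

function on(typed: string, handler: () => void) {
  listeners.set(typed, handler); // same key overwrites, like d3's selection.on
}

function dispatch(type: string): number {
  let fired = 0;
  for (const key of listeners.keys()) {
    if (key === type || key.startsWith(type + ".")) {
      listeners.get(key)();
      fired += 1;
    }
  }
  return fired;
}
```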

&lt;h1&gt;
  
  
  Add circle
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/JjjGXJJ?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Circle is easier than Rect, as cx and cy are its center. However, since we now have two lines, each event namespace needs to be unique, so &lt;code&gt;assignId&lt;/code&gt; is introduced.&lt;/p&gt;

&lt;h1&gt;
  
  
  Draw line before drawing the rectangles
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/jOOWqZE?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;We can see the line has disappeared, because the connected rect has not been drawn yet. We need to fix this.&lt;/p&gt;

&lt;h1&gt;
  
  
  Order should not matter
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/qBBbZGY?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;We introduced another custom event, &lt;code&gt;nodeAdded&lt;/code&gt;. If the line cannot find its node, it listens for the nodeAdded event to find out when its collaborators are all ready.&lt;/p&gt;
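&lt;p&gt;The waiting logic can be sketched as a small registry: a line whose endpoint is missing parks a callback, and the nodeAdded notification later drains it (hypothetical names, mirroring the pen's approach):&lt;/p&gt;

```typescript
// Registry sketch: lines that reference a not-yet-drawn node wait for
// a "nodeAdded" notification instead of failing.
const nodes = new Map();   // id -> datum
const waiting = new Map(); // id -> callbacks to run on arrival

function findNode(id: string, onReady: (datum: any) => void) {
  const datum = nodes.get(id);
  if (datum !== undefined) {
    onReady(datum); // collaborator already drawn
    return;
  }
  const queue = waiting.get(id) ?? []; // park until nodeAdded fires
  queue.push(onReady);
  waiting.set(id, queue);
}

function nodeAdded(id: string, datum: any) {
  nodes.set(id, datum);
  for (const cb of waiting.get(id) ?? []) cb(datum);
  waiting.delete(id);
}
```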

</description>
      <category>react</category>
      <category>svg</category>
    </item>
    <item>
      <title>Make react svg component draggable</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Fri, 11 Oct 2019 13:11:23 +0000</pubDate>
      <link>https://forem.com/taowen/make-react-svg-component-draggable-2kc</link>
      <guid>https://forem.com/taowen/make-react-svg-component-draggable-2kc</guid>
      <description>&lt;h1&gt;
  
  
  Create a Rect component
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/VwwvgdB?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;It does nothing special. It is just a dummy wrapper around &lt;code&gt;&amp;lt;rect&amp;gt;&lt;/code&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Add d3-drag
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/abbvXaV?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;We use ReactDOM to find the DOM node, and d3-drag to respond to mouse down and mouse move. Although SVG does not have drag-and-drop events, d3-drag simulates them by intercepting window-wide mouse events.&lt;/p&gt;

&lt;p&gt;Setting the x and y attributes works. However, dragging causes the rect to "jump" to the position of the mouse cursor. We would rather it stay where it was grabbed.&lt;/p&gt;

&lt;h1&gt;
  
  
  Make dragging start without a "jump"
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/LYYpqKJ?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;The trick is to set the &lt;code&gt;subject&lt;/code&gt; to have x and y properties taken from the target rect. The relative position is then kept.&lt;/p&gt;
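&lt;p&gt;The effect of the &lt;code&gt;subject&lt;/code&gt; is plain arithmetic: d3 reports &lt;code&gt;event.x&lt;/code&gt; as the subject's x plus the pointer movement, so seeding the subject with the rect's own position preserves the grab offset. A hand-rolled version of that bookkeeping (not d3's actual source):&lt;/p&gt;

```typescript
// Without a subject the shape snaps to the cursor; with the shape's own
// x/y as the subject, positions are reported relative to the drag start,
// so the offset between cursor and shape is preserved.
type Point = { x: number; y: number };

function dragPosition(subject: Point, start: Point, pointer: Point): Point {
  // event.x = subject.x + (pointer.x - start.x), same for y
  return {
    x: subject.x + pointer.x - start.x,
    y: subject.y + pointer.y - start.y,
  };
}
```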

&lt;h1&gt;
  
  
  How about circle?
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/OJJyqby?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;For a circle, we need to change x to cx and y to cy. It seems to work: the relative position is still kept.&lt;/p&gt;

&lt;p&gt;We notice the code is nearly the same as Rect's. Can we make it generic?&lt;/p&gt;

&lt;h1&gt;
  
  
  Support both circle and rect
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/jOObJwm?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;When changing from rect to circle, we had to change x to cx and y to cy. This time, we instead use &lt;code&gt;translate(x, y)&lt;/code&gt; to set the coordinate transformation. The &lt;code&gt;transform&lt;/code&gt; attribute is supported by both rect and circle, so we only need one makeDraggable.&lt;/p&gt;
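&lt;p&gt;The shared piece can be sketched as a tiny position holder that emits a &lt;code&gt;transform&lt;/code&gt; string, so rect and circle need no special-casing (a hypothetical helper, not the pen's exact code):&lt;/p&gt;

```typescript
// One positioning mechanism for rect, circle, or any SVG element:
// leave the shape's own geometry attributes alone and move it with
// a translate() transform instead.
function makePositioned(x: number, y: number) {
  const pos = { x, y };
  return {
    moveBy(dx: number, dy: number) {
      pos.x += dx;
      pos.y += dy;
    },
    // works for rect and circle alike, no x/y vs cx/cy special-casing
    transform(): string {
      return "translate(" + pos.x + ", " + pos.y + ")";
    },
  };
}
```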

&lt;h1&gt;
  
  
  How about group?
&lt;/h1&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/nctaowen/embed/zYYvbyy?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;It also works on &lt;code&gt;&amp;lt;g&amp;gt;&lt;/code&gt;. However, we have to make the individual rect and circle undraggable; otherwise the dragged element would be the rect or circle instead of the group.&lt;/p&gt;

</description>
      <category>svg</category>
      <category>react</category>
      <category>d3</category>
      <category>d3drag</category>
    </item>
    <item>
      <title>Out of the Tar Pit, another approach</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Tue, 23 Jul 2019 14:20:53 +0000</pubDate>
      <link>https://forem.com/taowen/out-of-the-tar-pit-another-approach-4km4</link>
      <guid>https://forem.com/taowen/out-of-the-tar-pit-another-approach-4km4</guid>
      <description>&lt;h3&gt;
  
  
  What is the problem to solve?
&lt;/h3&gt;

&lt;p&gt;A problem well stated is half solved.&lt;/p&gt;

&lt;p&gt;The challenge was laid out by &lt;a href="https://raw.githubusercontent.com/taowen/lonely-road/master/Brooks-NoSilverBullet.pdf" rel="noopener noreferrer"&gt;“No Silver Bullet — Essence and Accident in Software Engineering”&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And &lt;a href="https://raw.githubusercontent.com/taowen/lonely-road/master/MoseleyMarks06a.pdf" rel="noopener noreferrer"&gt;“Out of the Tar Pit”&lt;/a&gt; elaborated the problem statement, and contributed a possible solution.&lt;/p&gt;

&lt;p&gt;I don’t think debating what is essential state and what is accidental state is helpful. But it is clear that the problem is about STATE.&lt;/p&gt;

&lt;p&gt;The problem to solve is always the same; it has been the elephant in the room all along: managing state with imperative programming is too hard.&lt;/p&gt;

&lt;h3&gt;
  
  
  What is the problem of imperative programming?
&lt;/h3&gt;

&lt;p&gt;We can update those states quite easily, actually. Imperative programming is straightforward; it gets things done. Just put together a bunch of CPU instructions to update state, and the CPU will execute them. The software execution can be drawn as this naive diagram:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F670%2F1%2ABkT_0dH4sNhHuMANwz5eEw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F670%2F1%2ABkT_0dH4sNhHuMANwz5eEw.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The external behavior of a piece of software is essentially an ordered series of state updates (the yellow circles in the diagram). However, the problem is:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Describing this sequence of state updates literally, with a one-to-one mapping, leads to code that is "lengthy", "tedious", and "fragmented".&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The most straightforward way, the easiest way, is not the best way. Even if we extract lots of nice functions (the blue circles), the code might look tidier, but it is still "lengthy", "tedious", "fragmented". Readability is subjective. "&lt;a href="https://link.zhihu.com/?target=https%3A//www.infoq.com/presentations/Simple-Made-Easy/" rel="noopener noreferrer"&gt;Simple Made Easy&lt;/a&gt;" suggests "simple" is objective. I will define simple by the following measurable properties:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Fewer states&lt;/strong&gt; : the number of states, smaller is better&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sequential&lt;/strong&gt; : if we have to deal with time ordered state update, let it be sequential&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Continuous&lt;/strong&gt; : this line and the next are put close together because they are causally related. Conversely, if two state updates are causally related, they should be put as close together as possible.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Isolated&lt;/strong&gt; : state updates are already hard enough to reason about. If they are not isolated, we have to pull all of them into working memory and think about them all at once.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The goal is not to have zero states, but the fewer the better. Then we keep the code sequential/continuous/isolated, to keep the complexity of the state-updating code in check. The opposites of those four properties are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Excessively stateful&lt;/strong&gt; : many, many states, even unnecessary ones (the accidental state described in the Tar Pit paper)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Concurrent or Parallel&lt;/strong&gt; : threads and coroutines are complex.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Long-range causality&lt;/strong&gt; : the application logic is fragmented, connected via a global state called the database&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Entangled&lt;/strong&gt; : we have to ignore the fact that they are components, and reason about or optimize them as one big entangled mess.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The goal is not to eliminate all states, but to keep them as few as possible. There will always be essential state that we have to manage and update with ordered logic; that is what a business process is, after all. Even TLA+ evolved into &lt;a href="https://en.wikipedia.org/wiki/PlusCal" rel="noopener noreferrer"&gt;PlusCal&lt;/a&gt;, which looks a lot like an imperative program. The imperative style is still the best-known form for expressing temporal logic for human comprehension. But we have to keep it simple, by keeping it sequential, continuous, and isolated.&lt;/p&gt;

&lt;h3&gt;
  
  
  What about OOP/DDD?
&lt;/h3&gt;

&lt;p&gt;Using objects to encapsulate state is a big part of OOP, and of DDD in particular. DDD in practice boils down to three steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The application service loads the domain model back from the database&lt;/li&gt;
&lt;li&gt;An aggregate root encapsulates all the changes&lt;/li&gt;
&lt;li&gt;The side effects take the form of modified domain models or newly issued domain events. The application service saves the models and publishes the events&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The core idea is that the "Aggregate Root" can be a black box that nothing can bypass, which keeps our state-update code encapsulated in one place. But there are two issues:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A method on an aggregate root is not very different from a free function taking the aggregate root as its first argument. OOP is just ordinary imperative programming.&lt;/li&gt;
&lt;li&gt;The interaction between objects is hard, especially for business processes involving multiple domain concepts. It is hard to decide which aggregate root is the real one, the one that should own the process. Put another way, the business process is in essence a function, and should itself be the aggregate root.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We can re-examine the four properties above&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Fewer states&lt;/strong&gt; : as long as the states have to be updated manually in temporal order, they are still there&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Sequential&lt;/strong&gt; : we still need threads or coroutines to exploit new hardware.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Continuous&lt;/strong&gt; : the business logic is scattered around. It tends to be less continuous, not more, compared to raw imperative programming.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Isolated&lt;/strong&gt; : the domain model isolates the states, but to optimize data loading we have to load them in bigger batches, to avoid the 1+N query problem.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  What about pure functional?
&lt;/h3&gt;

&lt;p&gt;In &lt;a href="https://raw.githubusercontent.com/taowen/lonely-road/master/MoseleyMarks06a.pdf" rel="noopener noreferrer"&gt;"Out of the Tar Pit"&lt;/a&gt;, the authors propose a style of programming called Functional Relational Programming. A related idea is known as event sourcing in the DDD community. I agree that the event (user input) is the only truly essential state; everything else can be derived from it, and is therefore accidental. But projecting every state from events is not as "easy" as the temporal logic expressed by imperative programming. As Rich Hickey suggests, "easy" is subjective. But we cannot ignore the fact that most people are unfamiliar with this style, because it is detached from our primary source of familiarity: the physical world.&lt;/p&gt;

&lt;p&gt;The physical world does not store state as a ledger of events. God does not derive your height and weight from all the food you have eaten since you were born. We make decisions according to the current situation; it is just a matter of whether we load that situation from mutable state, or derive it from history on demand, in memory. Reasoning about the future from the current state, coupled with calculating that current state, complicates the logic.&lt;/p&gt;

&lt;p&gt;The approach is still not popular after all these years. Here we present an alternative approach to managing state, one that might be more familiar and practical.&lt;/p&gt;

&lt;h3&gt;
  
  
  Simpler State Management
&lt;/h3&gt;

&lt;p&gt;The goal is to make it simple&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Fewer states: if a state is not necessary, eliminate it. If a state can be derived from another state, make it declarative.&lt;/li&gt;
&lt;li&gt;Make imperative programming sequential / isolated / continuous&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The code is written in TypeScript, but needs a special runtime to make it work. You can think of it as a new language with syntax identical to TypeScript.&lt;/p&gt;

&lt;h4&gt;
  
  
  Fewer states by UI binding
&lt;/h4&gt;

&lt;p&gt;UI binding is mainstream now. This demo looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F180%2F1%2A5cZd8sOhTxlQhKkCh6E93w.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F180%2F1%2A5cZd8sOhTxlQhKkCh6E93w.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The UI is powered by the Web DOM, which is a separate state. Using binding like this, we can make it a derived state:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight xml"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;Button&lt;/span&gt; &lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="na"&gt;onClick=&lt;/span&gt;&lt;span class="s"&gt;"onMinusClick"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;-&lt;span class="nt"&gt;&amp;lt;/Button&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;span&lt;/span&gt; &lt;span class="na"&gt;margin=&lt;/span&gt;&lt;span class="s"&gt;"8px"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;{{ value }}&lt;span class="nt"&gt;&amp;lt;/span&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;Button&lt;/span&gt; &lt;span class="err"&gt;@&lt;/span&gt;&lt;span class="na"&gt;onClick=&lt;/span&gt;&lt;span class="s"&gt;"onPlusClick"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&lt;/span&gt;+&lt;span class="nt"&gt;&amp;lt;/Button&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The data binds to:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;CounterDemo&lt;/span&gt; &lt;span class="kd"&gt;extends&lt;/span&gt; &lt;span class="nc"&gt;RootSectionModel&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="nx"&gt;value&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
 &lt;span class="nf"&gt;onMinusClick&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt; &lt;span class="o"&gt;-=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
 &lt;span class="p"&gt;}&lt;/span&gt;
 &lt;span class="nf"&gt;onPlusClick&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
 &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
 &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;But why does binding eliminate state? Compare code written in these two styles. First, we update the two states in temporal order:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// update state in temporal order
this.value -= 1;
this.updateView({msg: this.value})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Second, we bind the two states in advance, then update only the model:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// setup the binding in advance
&amp;lt;span margin="8px"&amp;gt;{{ value }}&amp;lt;/span&amp;gt;

// then update the state
this.value -= 1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;The difference is that binding, or any declarative style of programming, sets up a relationship that is true out of any temporal context. This removes the derived state from temporal reasoning and description. The fewer states that need to be manually maintained along the arrow of time, the better.&lt;/p&gt;
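&lt;p&gt;The contrast can be made concrete with a tiny sketch (not the article's actual runtime): the view is declared as a function of the model once, and every imperative update gets the derived value for free:&lt;/p&gt;

```typescript
// Minimal derived-state sketch: `view` is declared as a function of
// `value` once; after that, only the model is ever updated by hand.
function createCounter() {
  let value = 0;
  const view = () => "count: " + value; // binding: always true, no ordering
  return {
    minus() { value -= 1; }, // temporal logic touches one state only
    plus() { value += 1; },
    render: view,
  };
}
```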
&lt;h4&gt;
  
  
  Fewer states by even more binding
&lt;/h4&gt;

&lt;p&gt;If binding can remove state from temporal logic, we want more of it. Let's lay out all the states within the system:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F411%2F1%2AAUrnE5Z2lXkVAQLVcuU5xA.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F411%2F1%2AAUrnE5Z2lXkVAQLVcuU5xA.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Binding UI state to frontend state only solves a tiny bit of the problem. Who manages the frontend state, then? We need to question every single state: can we eliminate it? Or can we derive it from another state? There is an obvious quick win here. When presenting the UI, data is loaded from the database all the way up to the UI. There might be some transformation along the way, but it can conveniently be described as data binding as well. Given this simple application:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F566%2F1%2AvIp-oT0QzIjVd6Di3L7MwA.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F566%2F1%2AvIp-oT0QzIjVd6Di3L7MwA.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It is just a simple list view showing some data loaded from the database. We can bind the UI directly to a database query:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;p&gt;When "from" or "to" changed, the "filteredReservations" will be updated, and the "totalCount" will also be updated, and the UI will be re-rendered with latest value. Binding to database, eliminate a lot of intermediate state between the UI and DB.&lt;/p&gt;

&lt;h4&gt;
  
  
  Fewer states by tight coupling
&lt;/h4&gt;

&lt;p&gt;Frontend and backend work together to provide a user interface for the human to interact with. They should be tightly coupled.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F121%2F1%2Ax487hOwq6mNyuLdVrt84OQ.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F121%2F1%2Ax487hOwq6mNyuLdVrt84OQ.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If the RPC interface is dedicated to the functionality provided by the form, why not merge the two states? The form then becomes a shared whiteboard between user, frontend, and backend:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F411%2F1%2AMmij64sQQRZnKBlbLZMsdw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F411%2F1%2AMmij64sQQRZnKBlbLZMsdw.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Take this application for example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F398%2F1%2Aj9O-TRMGLowESQtmrfkjFQ.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F398%2F1%2Aj9O-TRMGLowESQtmrfkjFQ.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can share the form state between the frontend and the backend. Here is the code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;The method saveReservation is marked with "runAt: server". The form state is sent to the backend for calculation, and the form state updated on the backend is rendered back to the UI. Although this is unconventional, with fewer states the code should, in theory, be easier to reason about.&lt;/p&gt;

&lt;p&gt;This concludes the first part. We have seen the following ways to reduce the number of states:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Derived state: such as computed property, autorun state subscription, database view, materialized view&lt;/li&gt;
&lt;li&gt;Bind the database query, use it as if it is local state&lt;/li&gt;
&lt;li&gt;Share form between frontend and backend&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Simpler Temporal Logic
&lt;/h3&gt;

&lt;p&gt;Declarative data binding can remove some non-essential state from the system. Now we need to focus on the temporal logic itself. The first property is being sequential.&lt;/p&gt;

&lt;h4&gt;
  
  
  Sequential
&lt;/h4&gt;

&lt;p&gt;Sequential single-threaded programming is simple. Yet the CPU tries very hard to parallelize the execution of a single-threaded instruction stream: it looks ahead and speculates, as long as there is no data dependency.&lt;/p&gt;

&lt;p&gt;Most temporal logic in a business application can be single threaded, as long as the I/O can run concurrently; this is where coroutines come in handy. However, concurrent programming with coroutines still has to deal with “async” and “await”, so it remains more complex than sequential programming. Can we write the program sequentially, and let speculative execution try to run it concurrently?&lt;/p&gt;

&lt;p&gt;This is what Facebook's &lt;a href="https://wiki.haskell.org/wikiupload/c/cf/The_Haxl_Project_at_Facebook.pdf" rel="noopener noreferrer"&gt;Haxl project&lt;/a&gt; has demonstrated. The idea is simple. Given we have these two tables:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;A post has an author, and an author has an inviter. We can map these relations as computed properties in code:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;
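&lt;p&gt;A rough sketch of such a computed property (the findUser helper and the in-memory rows are stand-ins for real table access):&lt;/p&gt;

```typescript
// Hypothetical sketch: relations mapped as computed properties
const users = [
  { id: 1, name: 'alice', inviterId: 0 },
  { id: 2, name: 'bob', inviterId: 1 },
];

function findUser(id: number) {
  return users.find(function (u) { return u.id === id; });
}

const somePost = {
  title: 'hello',
  authorId: 2,
  // "post has author" expressed as a computed property
  get author() {
    return findUser(this.authorId);
  },
};

// "author has inviter" would chase inviterId the same way
console.log(somePost.author); // bob's row
```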


&lt;p&gt;If we express the UI rendering logic sequentially:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const author = somePost.author
const editor = somePost.editor
return new UI({ author, editor })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Even though the code is sequential, loading the author and the editor can be executed concurrently, as they are independent of each other. However:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const author = somePost.author
const authorInviter = author.inviter
return new UI({ author, authorInviter })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Because the inviter depends on the author, these two lines cannot be executed concurrently. We have implemented this kind of “optimization” in TypeScript, with a modified compiler and a runtime scheduler.&lt;/p&gt;
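&lt;p&gt;What the scheduler effectively does can be illustrated with plain promises (a hand-written illustration, not our compiler's actual output). In the first shape the two loads are independent, so both start before either finishes; in the second, the result of the first await feeds the second, so they must stay ordered.&lt;/p&gt;

```typescript
// Illustration: independent loads can overlap, dependent loads cannot
async function loadUser(id: number) {
  return { id: id, name: 'user' + id, inviterId: id - 1 };
}

// independent: both loads are issued before either is awaited
async function renderAuthorAndEditor() {
  const authorPromise = loadUser(1);
  const editorPromise = loadUser(2);
  return { author: await authorPromise, editor: await editorPromise };
}

// dependent: the inviter load needs the author row first
async function renderAuthorAndInviter() {
  const author = await loadUser(1);
  const inviter = await loadUser(author.inviterId);
  return { author: author, inviter: inviter };
}
```

&lt;p&gt;The compiler inserts this kind of scheduling automatically, so the source stays in the plain sequential shape shown earlier.&lt;/p&gt;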
&lt;h4&gt;
  
  
  Isolated
&lt;/h4&gt;

&lt;p&gt;The second property to make temporal logic simple is “isolated”. Here is a frequent case where isolation is broken:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F547%2F1%2ACG79mmeaJOyqCQO8psJ1DA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F547%2F1%2ACG79mmeaJOyqCQO8psJ1DA.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If we render this table from the “Post” object defined above, it leads to the so-called “1+N” problem: for each row, extra SQL queries are needed to fetch the author's name and the author's inviter. This is where we normally give up isolation and load them in batch. What we want is to batch up the load operations automatically.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2APZMnJ7aJPskb2w1zdUHs6w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F1024%2F1%2APZMnJ7aJPskb2w1zdUHs6w.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;First, all the queries in the browser need to be batched into one big HTTP request, together with all the additional data to be fetched.&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;2019-07-19T11:25:04.136927Z 27 Query START TRANSACTION
2019-07-19T11:25:04.137426Z 27 Query SELECT id, title, authorId FROM Post
2019-07-19T11:25:04.138444Z 27 Query COMMIT
2019-07-19T11:25:04.772221Z 27 Query START TRANSACTION
2019-07-19T11:25:04.773019Z 27 Query SELECT id, name, inviterId FROM User WHERE id IN (10, 9, 11)
2019-07-19T11:25:04.774173Z 27 Query COMMIT
2019-07-19T11:25:04.928393Z 27 Query START TRANSACTION
2019-07-19T11:25:04.936851Z 27 Query SELECT id, name, inviterId FROM User WHERE id IN (8, 7, 9)
2019-07-19T11:25:04.937918Z 27 Query COMMIT
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Then, in the backend, the SQL queries are rewritten as IN queries. This is implemented by controlling the evaluation of the coroutine call stacks: instead of evaluating depth-first, we evaluate breadth-first, so all I/O operations can be collected and rewritten as one batch query.&lt;/p&gt;
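&lt;p&gt;The batching idea can be sketched with a tiny DataLoader-style loader that collects the ids requested in one tick and issues a single IN query for all of them; this is a simplified stand-in for the real breadth-first coroutine scheduler:&lt;/p&gt;

```typescript
// Simplified sketch of automatic I/O batching
const issuedQueries: string[] = [];
const pending: any[] = [];
let flushScheduled = false;

function loadUser(id: number) {
  return new Promise(function (resolve) {
    pending.push({ id: id, resolve: resolve });
    if (!flushScheduled) {
      flushScheduled = true;
      Promise.resolve().then(flush); // flush once the current tick ends
    }
  });
}

function flush() {
  flushScheduled = false;
  const batch = pending.splice(0);
  const ids = batch.map(function (p: any) { return p.id; });
  // one IN query instead of N single-row queries
  issuedQueries.push('SELECT id, name FROM User WHERE id IN (' + ids.join(', ') + ')');
  batch.forEach(function (p: any) {
    p.resolve({ id: p.id, name: 'user' + p.id });
  });
}

// three seemingly isolated loads become one SQL statement
Promise.all([loadUser(10), loadUser(9), loadUser(11)]).then(function () {
  console.log(issuedQueries);
});
```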

&lt;p&gt;This way, I/O batching becomes a cross-cutting, non-functional concern that no longer complects the essential temporal logic we want to express.&lt;/p&gt;
&lt;h4&gt;
  
  
  Continuous
&lt;/h4&gt;

&lt;p&gt;The non-continuous version interleaves the temporal logic like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F524%2F1%2AHdJCZa5-3U1DQnT8rNBVAw.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F524%2F1%2AHdJCZa5-3U1DQnT8rNBVAw.jpeg"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The billing steps are closely causally related, yet they cannot be written next to each other. “Contract signed” and “assignment dispatched” share little business meaning, but because they are temporally related, they have to be written one after another in the order of time.&lt;/p&gt;

&lt;p&gt;It is essentially several timelines squashed into one mainline. The point of view constantly changes in this single-threaded storytelling. The sequential nature is no longer a blessing, but a curse. We need to separate those independent timelines and describe each in its own encapsulating block. We call such a block a "Process", as in business process.&lt;/p&gt;

&lt;p&gt;Let's look at this final example:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F371%2F1%2A1nzVbM1AJtQsTokOX7yeqA.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn-images-1.medium.com%2Fmax%2F371%2F1%2A1nzVbM1AJtQsTokOX7yeqA.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The account life-cycle can be described as a simple state machine, but the normal imperative programming style cannot express it as one continuous piece of temporal logic. Here we have contributed an alternative style:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;



&lt;p&gt;process() is a function with an infinite loop, but it is never meant to actually run forever. Every time this.recv() is called, execution breaks and control yields to other processes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;locked&lt;/strong&gt; : this.commit()&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;normal&lt;/strong&gt; : this.commit()&lt;/p&gt;

&lt;p&gt;A commit statement with a label like these marks a savepoint, just like in a game. The process can be loaded back from that savepoint and resumed. But where is the process saved? Just an ordinary MySQL table:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;p&gt;The label is saved in the "status" column. Other local variables are not saved to the database, unless there is an object property with the same name (password, retryCount).&lt;/p&gt;
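&lt;p&gt;A toy sketch of the mechanism using a generator: each yield is a labelled savepoint, and only the label plus the named properties would be persisted (the shape is hypothetical; the real implementation rewrites the coroutine):&lt;/p&gt;

```typescript
// Toy sketch: a process whose savepoint label maps to a "status" column
function* accountProcess(row: any) {
  while (true) {
    row.status = 'normal';   // like "normal: this.commit()"
    const msg = yield row.status;
    if (msg === 'lock') {
      row.retryCount = row.retryCount + 1;
      row.status = 'locked'; // like "locked: this.commit()"
      yield row.status;
    }
  }
}

// the "database row": only status and named properties survive a restart
const row: any = { status: 'normal', retryCount: 0 };
const proc = accountProcess(row);
proc.next();       // run to the first savepoint
proc.next('lock'); // recv('lock') moves the process to "locked"
console.log(row);  // { status: 'locked', retryCount: 1 }
```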

&lt;p&gt;A business transaction, or moment-interval in Color UML, is a good fit for this programming style. It also settles the question of where to put inter-object interaction in DDD: we can put it in a Process. Because code written this way is not very different from the boxes and arrows on the whiteboard, nor from the original business requirement, it facilitates communication between different groups through a shared language.&lt;/p&gt;

&lt;p&gt;Unlike BPM or workflow solutions, a Process is part of the application: it can be rendered in the UI and summed or averaged by reports. We do not need to maintain an extra copy of the data (extra states) to feed an "engine".&lt;/p&gt;

&lt;p&gt;The UI binding of this application looks like this:&lt;/p&gt;


&lt;div class="ltag_gist-liquid-tag"&gt;
  
&lt;/div&gt;


&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;We are building an aPaaS product called MulCloud to enable this kind of programming style. It has syntax identical to TypeScript and inherits a lot of familiar ideas. We believe our approach not only simplifies programming but, because we stand on the shoulders of giants, will also feel easy and familiar.&lt;/p&gt;

&lt;p&gt;There is one more caution to leave behind for future innovators:&lt;/p&gt;

&lt;p&gt;MulCloud has had to put a lot of effort into building an observability solution that lets developers debug in production. Old-school imperative programming, despite the odds, has one shining virtue: &lt;strong&gt;it has a one-to-one mapping with the instruction stream executed by the CPU&lt;/strong&gt;. By building the program in a sequential / isolated / continuous way, the source code becomes detached from the reality, which is concurrent / entangled / full of long-range causality. When something goes wrong in the wild, it is much harder to debug. Bret Victor has demonstrated a lot of cool ideas on providing more runtime feedback. We borrowed a lot of them, but that deserves another post.&lt;/p&gt;

&lt;p&gt;Simplicity is hard, when simplicity is not the reality.&lt;/p&gt;




</description>
      <category>softwaredevelopment</category>
      <category>programminglanguage</category>
      <category>softwareengineering</category>
      <category>typescript</category>
    </item>
    <item>
      <title>What represents the past, present, and future: the future(2)</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Mon, 29 Oct 2018 15:09:29 +0000</pubDate>
      <link>https://forem.com/taowen/what-represents-the-past-present-and-future-the-future2-2omm</link>
      <guid>https://forem.com/taowen/what-represents-the-past-present-and-future-the-future2-2omm</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--PR1zWfFS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47513241-68dda500-d8b0-11e8-937c-f272f883dffb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--PR1zWfFS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47513241-68dda500-d8b0-11e8-937c-f272f883dffb.png" alt="image" width="770" height="578"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Active process
&lt;/h2&gt;

&lt;p&gt;In the previous simple process, the coroutine is passive: it waits for instructions and just does what it is told. This contradicts the requirement document, which normally describes a task as if it were alive, knowing what it is doing and what it should do next.&lt;br&gt;
In this chapter, we are going to explore the ways to make the future representation more active.&lt;/p&gt;

&lt;h3&gt;
  
  
  Time scheduler
&lt;/h3&gt;

&lt;p&gt;The first example is about sleep. &lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##rVFBboMwEDyXV6xywZYIChB64y0RNZtgFWxkr6VWVd9ObYNI4dBT92TP7szO2INr5/nulCCpFVjRY@cGZNTad56Ar761t1EbzGC0jwxa87DQgNBGO5IKc4PWjb8J8g5MadqIHKhHFVuhDJIzyxVVtzG8ODQNpHZAnNIDR9scP1A4QnaKA3CCPI9ecotCq87yneIhR4CTZ0j3VtxCgy2kyUhFzO8g6YPwzJsgnIp06T6TfkocOrY6zOBr3dyU33/plP@kU6VrjuB89wPCYOtfZovFk0P88xn86OQonIr6eikur9e6eIkxd1gVsXKH1RGr5vkH"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nQ_DJrMZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488756-935f3c00-d877-11e8-9216-23e25dfba377.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nQ_DJrMZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488756-935f3c00-d877-11e8-9216-23e25dfba377.png" alt="image" width="880" height="664"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##rZDBboMwDIbvPIVvJBVD0N0qsdMeYTeEpgxcxpYmCCfdJsSzM6dA1e02aT5YSvz7s3@/qbOieuh6d2dsg/N89KZ2nTVA9Ss2XqNwit4ljBFwaHRwVhoKCL@pwU8n5LUyNtbgAUJOgszzozxRm4AaWqq4q6wmzly69HRHEEG80UMM6PxgLs/pKmIGFEUBMWnEPr7VE7qn7oTWOyEkFA@/9l5Gp4S1NQ3BDvIsy@SKn6Jos7sD8i/5c@gRG55byGpMtW2FwQ94VA6FTNtlomB2TA77PF54Xx3qBsp1xwTGdehhP1V/4e3/mXfPPHYarPHtb2xGP081z98"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wrIIXakf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488248-94439e00-d876-11e8-9fe2-cad2aed794f8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wrIIXakf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488248-94439e00-d876-11e8-9fe2-cad2aed794f8.png" alt="image" width="880" height="710"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Call with continuation
&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;coroutine.yield('sleep', {seconds=2})&lt;/code&gt; is awkward. Why can sub1_task not put itself into sleep mode directly? Because it does not have a reference to itself. Some languages provide “call with continuation”, which essentially allows the coroutine to obtain its remaining calculation as a continuation and pass it to another function as a callback. We can simulate that in Lua.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##nVHLboMwEDyXr1jlgpEICtAop3wLcswmWHFs5IfUKsq3U9uhAVTUSvXF9uzOemYsHB0GoRgVYARi35yddRrhCJKLJDk7ySxXEgzrsHUCSZaAX/wMRCq74GRgO5SxHJZGDz6vKNu4W2quOVB9MX6@pSeBhZM9ZVeymBN7lSnwA5mzSDaxChsoikguDDIlW/NsXJEd4I6a5qY05nAzl9ejTGnlLJdYaDTuhiRIWlr6Jv5tZ5ZJgGZhBUlkcrsq9D7VH7E@afvkKNqfU92pbAJpJrrXXFris7Lcm8lySI3FvkxnL4467mNox@rxG7X6P7VOR8WBtciaaaT@H18GsmT1H3J4Fme5Jtst@Mbe2XAq9@@7sjocqt1btLnA6ohVC2wfsXoYvgA"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--jhqx6Yk2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488455-fe5c4300-d876-11e8-847a-ae0dea76d2f6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--jhqx6Yk2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488455-fe5c4300-d876-11e8-847a-ae0dea76d2f6.png" alt="image" width="880" height="915"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A “future” concept is introduced: it wraps the task continuation along with its requirement. When the requirement is fulfilled, the task execution is resumed.&lt;/p&gt;

&lt;h3&gt;
  
  
  Async and await
&lt;/h3&gt;

&lt;p&gt;Notice that we did not give an es2017 version in the above example, because es2017 has its own version of “future” built in.&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##pVC9bsMgEN55itsClZuASSNLUTr1ATp0r7B9TWgxWAZiRVWf3cVpUJOxKtOJ776/e1dH5ZtB9@Heuhan6S3aJmhnwRvEnnpsnG09g08C6a1WMKB35oigPeTdDI0HtNAoY7CF@gQew4vu0MVQQDggBOU/YNTGQI2zTOywPVMHDHGwYHGE58F12iPNLrvHK5n8W8AlFtyB4JwztiVfhCh/sg38Foi1eJ09aU6fOImPS@P2dDZ7UgEpW@5/DCgrYOED9mLBzutqVDpc7lCyvyiU/1aQSSFVuupASLpwOkIfwzyJhzUXUkheVpLn3DfAupKbHOcG2FSyyi7T9A0"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UqqmyswX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488534-29df2d80-d877-11e8-9e2d-6206468f1794.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UqqmyswX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488534-29df2d80-d877-11e8-9e2d-6206468f1794.png" alt="image" width="880" height="542"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Although the code looks very different, the underlying mechanism is actually very similar. The es2017 compiler/interpreter turns “await” into a continuation passed to the given promise.&lt;/p&gt;

&lt;p&gt;We can change the Lua code to work like the JavaScript promise: it returns a function called “resolve” that captures the task continuation and registers it somewhere, then yields execution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--uLX0NAJj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488621-51ce9100-d877-11e8-86d5-fb58e0415cfd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--uLX0NAJj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488621-51ce9100-d877-11e8-86d5-fb58e0415cfd.png" alt="image" width="880" height="180"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Network client
&lt;/h3&gt;

&lt;p&gt;The time scheduler is simple: it does not need to return a value back into the task. But if the coroutine needs to play the role of a network client, it needs a way to get the result back.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##bVDbasMwDH33V@gtMayB9j3fUrJEoaau7dlyyhj79ky2u7QxNRiEpHPR0XFYV23HQcOoFRo6z5GiR@jBKC3EHM1IyhoIaKazx3FpaQjXD/D4JQXwK@AN9ZPG/bbT8//NezV9KcrIehtJGey@FeqpLcQeecE89jqPwQn28Gopfh7PSSlbkgAZ5rwy1DaB0B0b@dKqL2iCvSHMwxWT0YiBGilrihNTZNWEYddPq6PHgbDdTEjxnLHZeMOHThkeDkyLxFxwR7gMC6ZICeiyyYuiuwuqyzm/AfMhuCicdgQ1NDi2vLszOGsCNu@9SpGUuO8ipSpnmIs6qbpZaP8xp3X9Aw"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HzaShwCl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488934-f6e96980-d877-11e8-845e-5d4e409ddcbb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HzaShwCl--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47488934-f6e96980-d877-11e8-845e-5d4e409ddcbb.png" alt="image" width="880" height="731"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Returning a value is not a hard job, as it turns out. The caller and callee share a “future” object, so they can send the request and the response via this object. With es2017 await syntax, returning a value back into the coroutine is even simpler.&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://repl.it/@taowen/es2017-client"&gt;https://repl.it/@taowen/es2017-client&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Ah66vcfm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47498718-ab8f8500-d890-11e8-9229-fd8ed3a07d02.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Ah66vcfm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47498718-ab8f8500-d890-11e8-9229-fd8ed3a07d02.png" alt="image" width="880" height="793"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Network server
&lt;/h3&gt;

&lt;p&gt;At this point, we have already seen how a coroutine can compute, sleep and call an external service. Now we need to make the coroutine a server. We only present the es2017 version here; the Lua code would be pretty much the same as in the “network client” example.&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://repl.it/@taowen/es2017-server"&gt;https://repl.it/@taowen/es2017-server&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TfCur9iA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47498748-bc3ffb00-d890-11e8-9a91-f8d473a1bc56.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TfCur9iA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47498748-bc3ffb00-d890-11e8-9a91-f8d473a1bc56.png" alt="image" width="880" height="774"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The difference between client and server is that the server needs two awaits, while the client only needs one: one await for retrieving the request from the scheduler, and one for sending back the response. For a TCP socket, writing can be blocked by the kernel when the buffer is full, so sending back the response is also an async operation.&lt;/p&gt;
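&lt;p&gt;In es2017 terms the server loop needs roughly this shape (a schematic, assuming a scheduler object that exposes recv and reply as async operations; it is not the exact repl.it code):&lt;/p&gt;

```typescript
// Schematic server loop: one await to receive, one await to reply
async function serverTask(scheduler: any) {
  while (true) {
    const request = await scheduler.recv(); // park until a request arrives
    if (request === undefined) {
      return; // scheduler shut down
    }
    const response = request * 2; // the "business logic"
    await scheduler.reply(response); // may park if the send buffer is full
  }
}

// a trivial in-memory scheduler driving one round trip
const inbox: any[] = [21, undefined];
const replies: number[] = [];
const demoScheduler = {
  recv: async function () { return inbox.shift(); },
  reply: async function (r: number) { replies.push(r); },
};
serverTask(demoScheduler).then(function () { console.log(replies); }); // [ 42 ]
```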

&lt;h3&gt;
  
  
  Passive Process V.S. Active Process
&lt;/h3&gt;

&lt;p&gt;We have checked out several examples of the active process. It can do the following things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;sleep&lt;/li&gt;
&lt;li&gt;as a client to call another service&lt;/li&gt;
&lt;li&gt;as a server to be called by another service&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Compared to the passive process from the first chapter, the active process is indeed very active. It has control of its own destiny. For a direct comparison, let’s say we have two steps, and one sleep between them.&lt;/p&gt;

&lt;p&gt;Passive version: &lt;a href="https://tio.run/##lZBBDoMgEEX3nmLiRtyQ6L5naSz8pqQUDDBJe3qq0qjRVVny/3sDY3nI@c5OJeMdRQuMIkJ5p2Nb0XR8lHhDcYKol5hqkpLWDpyuqk3At@6ahvgUhR6DcUk0MWHsmnKlfPCcjIP8GFh9LvbNyTrjYtYeFQGRX/hFZ6j/G1IPaLYQLS3Q3KHLjlUBw7SK9Z/FfXxg2WO3hf1@2jYk5y8"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Active version: &lt;a href="https://tio.run/##hVHLqsMgEN37FUM3UQhC7r7fEqyZNFKjwQf0UvrtudGkNaGFOwvFOTPnzBx1FPOsrRQavEac2j6G6JCQPhoZlDVrmgbhbzUId/UMliDp2DfAGR6l5plxaZ2NQRnkvwp1lzkYQdPt2eOlaROwoalvcsoEWvmAU1OxIrUN8fAoren8uXl@1P9UHwJywC5qpGut6oEaGw6zMwgDmgyncLgk12eiSnfZbNkziItGHs0k5I0eeHKt9RzvKGNAesoonIDz3My3ydk3@4zSOT0I347WYQ2jv75Fi5cOfRxxZ9drpVfj/@vsPMlmJa6DhnQolvnfn8PIV/0aVrDwzfMf"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--D5Z-2CKg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47514486-15208b00-d8b3-11e8-81da-3dda31daf99c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--D5Z-2CKg--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47514486-15208b00-d8b3-11e8-81da-3dda31daf99c.png" alt="image" width="608" height="557"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Seen this way, “passive” versus “active” makes little difference. From sub1_task's point of view, the passive version uses “coroutine.yield” directly to wait for the scheduler to wake it up; the active version uses “sleep”, which in turn uses “coroutine.yield” to wait for the scheduler. Both versions essentially do the same thing: they call "yield" to park themselves.&lt;/p&gt;

&lt;p&gt;The difference is the predefined protocol. The active version has a protocol between scheduler and task: if the task parks itself on “sleep_future”, the scheduler has the responsibility to wake it up at the specified time. Just as when you call os.execute(“sleep 1”), the operating system has the responsibility to wake your OS thread up one second later.&lt;/p&gt;

&lt;p&gt;The benefit of the active process is that the scheduler becomes infrastructure; all unstable business logic is isolated in the task. With the passive process, both sub1_task and the scheduler need to be modified whenever the process is updated.&lt;/p&gt;

&lt;p&gt;The scheduler becomes an extensible platform, and the tasks are plugins that reuse the platform's capabilities.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--AD6X3pdS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47509355-8870cf80-d8a8-11e8-8c92-b7b8c71e4245.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--AD6X3pdS--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47509355-8870cf80-d8a8-11e8-8c92-b7b8c71e4245.png" alt="image" width="340" height="463"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Coroutine UI
&lt;/h3&gt;

&lt;p&gt;The task parked in the scheduler is anonymous. This is a problem: if we do not know what the task is waiting for, how can we know what to give it? We know sub1_task wants to know whether to step1_add or step2_add. Essentially, this is a “User Interface Problem”. &lt;/p&gt;

&lt;p&gt;Here is an example process we want to describe:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--769my4m9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47511685-4dbd6600-d8ad-11e8-9804-11ffd88d6588.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--769my4m9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47511685-4dbd6600-d8ad-11e8-9804-11ffd88d6588.png" alt="image" width="584" height="179"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When the task is launched, it will present the user with two options, either to add or sub. Then the calculation result will be displayed back. One second later, the UI will change back to the two options to let the user do the second calculation. This forms an infinite loop.&lt;/p&gt;

&lt;p&gt;This process can be represented by this code:&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://jsfiddle.net/taowen/L0p516xv/56/"&gt;https://jsfiddle.net/taowen/L0p516xv/56/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sSK-BcVQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512237-595d5c80-d8ae-11e8-9c49-8b692ff6b96e.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sSK-BcVQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512237-595d5c80-d8ae-11e8-9c49-8b692ff6b96e.png" alt="image" width="503" height="299"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Just like the scheduler can expose the APIs “sleep”, “recv” and “reply”, it can also provide “user_input” as reusable infrastructure. Of course, the scheduler cannot know what the user interface will look like, so it takes arguments defining which UI component to use and the model to render in the view. Here is the source code of the scheduler:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--l3FJCqru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512280-7003b380-d8ae-11e8-8222-bdd34092161f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--l3FJCqru--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512280-7003b380-d8ae-11e8-8222-bdd34092161f.png" alt="image" width="587" height="433"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;user_input renders the view by setting three variables:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;step_ui: which component to render the view&lt;/li&gt;
&lt;li&gt;step_ui_input: the model of the view&lt;/li&gt;
&lt;li&gt;step_callback: what to do when the user submitted the form&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;As we have already learned how “await”, “promise” and “resolve” work, it is clear that when the user submits the form, the resolve callback resumes the execution of sub1_task. The actual view rendering and form submission implementation is irrelevant to the topic here. You can play with the code at &lt;a href="https://jsfiddle.net/taowen/L0p516xv/56/"&gt;https://jsfiddle.net/taowen/L0p516xv/56/&lt;/a&gt;&lt;/p&gt;
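&lt;p&gt;The essence of user_input is small enough to sketch (hypothetical names mirroring the screenshot, not the exact fiddle code):&lt;/p&gt;

```typescript
// Sketch of scheduler.user_input: render a view, park the task,
// resume it with whatever the submitted form contains
const view: any = { step_ui: null, step_ui_input: null, step_callback: null };

function userInput(component: string, model: any) {
  return new Promise(function (resolve) {
    view.step_ui = component;     // which component to render
    view.step_ui_input = model;   // the model of the view
    view.step_callback = resolve; // resumes the task on submit
  });
}

async function sub1Task() {
  const choice = await userInput('two-buttons', { options: ['add', 'sub'] });
  return 'user chose ' + choice;
}

const result = sub1Task();
view.step_callback('add'); // simulate the user submitting the form
result.then(function (r) { console.log(r); }); // user chose add
```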

&lt;p&gt;scheduler.user_input is provided as a generic tool: it can be used regardless of the business logic we need to describe. The vocabulary of the “active process” is expanded; more kinds of “future” can now be represented by a coroutine.&lt;/p&gt;

&lt;h3&gt;
  
  
  Coroutine hibernation
&lt;/h3&gt;

&lt;p&gt;We cannot put the coroutine UI into production, because there is a serious flaw: the coroutine needs to stay alive in memory the whole time. If the user decides to click the button tomorrow, we cannot keep the process alive that long. We need a “hibernate” button, so that the state can be dumped to disk and resources saved, resuming the process only when needed.&lt;/p&gt;

&lt;p&gt;This example requires a special version of lua interpreter: &lt;a href="https://github.com/fnuecke/eris"&gt;https://github.com/fnuecke/eris&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_M0wIn56--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512433-c5d85b80-d8ae-11e8-9356-b92b182ac103.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_M0wIn56--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512433-c5d85b80-d8ae-11e8-9356-b92b182ac103.png" alt="image" width="672" height="1002"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We have expanded what the scheduler can do even further. This time, the “hibernate” function allows the task to dump itself to disk. The representation of the future is totally encapsulated in the single coroutine “fib”.&lt;/p&gt;

&lt;h3&gt;
  
  
  Coroutine database
&lt;/h3&gt;

&lt;p&gt;The hibernation implementation is also seriously flawed: “continuation.data” is written in an alien binary format. Representing the future with an object such as “order” does not have this problem; we would normally create a table called “order” to store the unfinished order processes.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DHB3Ng6b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512477-e6a0b100-d8ae-11e8-8565-7850acd09b1f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DHB3Ng6b--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512477-e6a0b100-d8ae-11e8-8565-7850acd09b1f.png" alt="image" width="565" height="148"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To represent the order process using a coroutine, it would look like:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--UvnoBHri--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512544-02a45280-d8af-11e8-94d2-5556677f49ba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--UvnoBHri--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/47512544-02a45280-d8af-11e8-94d2-5556677f49ba.png" alt="image" width="576" height="249"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The order_status column of the order table is a cursor tracking the position of execution: it is essentially the position of “coroutine.yield” in the coroutine. So we can represent order_status with a Lua label. When persisting the coroutine, the “program counter” can be translated to order_status and mapped to a database column.&lt;/p&gt;
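&lt;p&gt;A sketch of that translation, using a generator whose yielded label plays the role of the persisted program counter (illustrative only):&lt;/p&gt;

```typescript
// Illustrative: the yielded label is the persisted "program counter"
function* orderProcess(startAt: string) {
  if (startAt === 'created') {
    yield 'paid';  // persist order_status = 'paid', then park
  }
  yield 'shipped'; // persist order_status = 'shipped'
  return 'done';
}

// first run: store the label of the savepoint we reached
let orderStatus = 'created';
let proc = orderProcess(orderStatus);
orderStatus = String(proc.next().value); // goes into the order_status column

// later, in a fresh interpreter: rebuild the coroutine from the column
proc = orderProcess(orderStatus);
console.log(proc.next().value); // shipped
```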

&lt;h3&gt;
  
  
  Summary
&lt;/h3&gt;

&lt;p&gt;Now we can represent calculation logic, UI and database with the active process. The coroutine yields control to its scheduler to reuse predefined, stable atomic operations. All the unstable business logic can be encapsulated in a single coroutine. &lt;/p&gt;

&lt;p&gt;Using the active process to describe a “long-term future” is unconventional. Watching a coroutine “invoke” a UI or a database is mind-blowing. But it is a plausible representation, and it sometimes comes in handy.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>programming</category>
      <category>lua</category>
    </item>
    <item>
      <title>What represents the past, present, and future</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Tue, 23 Oct 2018 01:18:07 +0000</pubDate>
      <link>https://forem.com/taowen/what-represents-the-past-present-and-future-963</link>
      <guid>https://forem.com/taowen/what-represents-the-past-present-and-future-963</guid>
      <description>

&lt;h3&gt;What represents the past, present, and future: the future(1)&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p3cuW0L7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/800/0%2AO97GbaErKzttHXOM.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p3cuW0L7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/800/0%2AO97GbaErKzttHXOM.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Our software builds a cyberspace for intelligent agents to interact with each other. Using machine learning algorithms to build intelligent agents is a hot topic, but most of the code written is still about building the virtual environment itself. The code and runtime describe what could and would happen in this environment, which is about “the future”. To resolve ambiguity, the code and runtime must decide what is really happening at “the present”. They also need to faithfully record what has happened in “the past”. To maintain integrity and lay a solid foundation, all three things must be done properly; otherwise, we perceive the anomaly as “a bug”. The most prominent example of such a cyberspace is the online multiplayer game, as the article cover suggests.&lt;/p&gt;

&lt;p&gt;As business interest rapidly shifts to building more intelligent agents, the techniques for building cyberspace have become a commodity, and some good old wisdom might be lost and forgotten. The current mainstream approach is mature and works, but it is far from elegant. This series of three articles attempts to summarize my observations on the possible representations of “the past”, “the present”, and “the future”. Some of them deserve more attention.&lt;/p&gt;

&lt;p&gt;What represents “the future”? The question seems obvious: the so-called business logic can be expressed in many ways. We are going to walk through some very basic concepts from a “future representation” point of view, and we will see that it is not as simple as it looks.&lt;/p&gt;

&lt;h3&gt;Source code&lt;/h3&gt;

&lt;p&gt;“The future” lives in the document, in the source code, and in the runtime. The languages we are going to use are Lua 5.3 and es2017 (ECMAScript 2017). All code can be tried online at &lt;a href="https://tio.run/"&gt;https://tio.run/&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;Simple process&lt;/h3&gt;

&lt;p&gt;We are going to describe a very trivial process: result = (a+b)*3. The function “some_important_business_process” takes a and b and computes the result.&lt;/p&gt;

&lt;h3&gt;Function&lt;/h3&gt;

&lt;p&gt;The simplest way to describe “what could and would happen” is using a function.&lt;br&gt;&lt;br&gt;
lua version: &lt;a href="https://tio.run/##hY69DgIhEIR7nmLKQ@8KtbLwWQggJiTcQtjl@ZEoMcbGbWbn259Marb3RyMvMRM478HEveQqlsS4xpECsyk1@6GLXeG0wqiUvU2ogVsS3GBxhPsa@MEuL/9Zmc0BfnJplSZVge5Q6ufpvzSnFWetSo0ky/tIY9tw7f0J"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CWsg2xqC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/476/0%2ABunIN7l1MyvZ43hA.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CWsg2xqC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/476/0%2ABunIN7l1MyvZ43hA.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##hY6xDsIwDET3fMWNLRQqYGLgW6o0BBSU2lXsdEF8e6igqsTELed7sq172MmKS2HUHfHVl3LL5DQwQXjwXRhGTmpJuz5LIC/SjYnd7JVt0Nd4GsyKXpG85Ki4wGKLfsVuJqdPWheWYQO3cM2JFmpexvx8@1fj0OBYG8ckHP0@8r36ntZoW5xLeQM"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--B7f4a-6_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/479/0%2Aq2_WvsomoXtWL6ba.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B7f4a-6_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/479/0%2Aq2_WvsomoXtWL6ba.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
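&lt;p&gt;For reference, the es2017 version boils down to something like this (reconstructed from the description above, not copied from the linked snippet):&lt;/p&gt;

```javascript
// The simplest description of "what could and would happen": a function.
function some_important_business_process(a, b) {
  return (a + b) * 3;
}

console.log(some_important_business_process(1, 2)); // 9
```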

&lt;h3&gt;Partial function&lt;/h3&gt;

&lt;p&gt;In the previous function, we took both a and b as input. However, we might not know everything from the beginning. To tell the story gradually, we take “a” first and “b” later.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##hZDBbsMgDIbvPIWPsCaRtp162GFPEhlKKzRiEJjnT0lJMm5FQlj@/duf8QXX9V7IsAsEERM79PI@wDRNSkA9Phj0oEOh2296ZPgBRu3tFNH8ybMqWS6J4Oj0L/SibM5CL@/ZUnXTLN3Edk@kHBY7uyWGxEg865Id2ZznmIKpr8QBdA@abC6eKyXCBXQnmJr73ln3kj34ANPv0LIviubcRtXq43PeEA3wqcQ4NpfLlcMEYkcFt33EOXzT5ZcSMTli2dIKqvG6rk8"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--43N21qRe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/521/0%2Ad_swNOp2pKadKGHx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--43N21qRe--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/521/0%2Ad_swNOp2pKadKGHx.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##hZC/UsMwDMZ3P4VGB4J7wMTA0CfJKa5bzLlWzpJZuD57UP40l054sSX9PumTv/EH2Zc4yEumUxjHc81eImUYsEjEZM8tOOd6qvl0LBdu4NeAnhKklgwbbhXCXX3POI8pWfmK/NhrjmbRLLmZm9n6MV1DF68DFcEsXV855sDcDYW83hZb6O@zUhCdxTUJfALCM/Rb2mvmfTW8AuvjCfx@kSWrDibVNETJ@xf846WF18YcDosqsjrwlCXmitMm5sHdxNi3xijBlIJLdLFLsQFt8TGOfw"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ir5uxzU8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/520/0%2AzH8mH9OPjM9Hdmuf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ir5uxzU8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/520/0%2AzH8mH9OPjM9Hdmuf.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
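&lt;p&gt;A minimal es2017 reconstruction of the partial function (based on the description, not the linked snippet):&lt;/p&gt;

```javascript
// Take "a" first; the returned function is a continuation waiting for "b".
function some_important_business_process(a) {
  return function (b) {
    return (a + b) * 3;
  };
}

const proc = some_important_business_process(1); // "a" is captured inside proc
console.log(proc(2)); // 9
```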

&lt;p&gt;The variable “proc” captures “a” into itself. It is a continuation: execution can resume from it.&lt;/p&gt;

&lt;h3&gt;Object&lt;/h3&gt;

&lt;p&gt;Having one initial input and one final output is not enough to describe what is to come. A complete representation of the future must describe what would happen given more input later. The usual choice is an object that stores the state of the process, with a couple of methods to be called as the story unfolds.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##nZHdasMwDIXv/RS6TLY20PRqgz2LcVwVDI4cbJkOSp89808oZXQzm2@MrHP0yZKNal2t08pCcDNKMy/OsyKWUwyGMAS5eKfTDR9wvYmGaJDS0Ak/k7ihFOIcSbNx1FIOhJdO9QLS2TpFe84E5BlZsZosdtfbrlWolsjmQSW7KqFHjp7Kq0A6/aGtwLgcumzcwfRQ22OIlkt/hfQK03fSpvkHcCzAStOJcfyJu0UvoH@F14FmQntlZRGHXuz31WACKNCO2FBU@Qfi3kLOv9cBjb1YvCHuarKHZD8@UY7dE@Hbun4B"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RnqLTKoZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/613/0%2AIq8tvnXBt_TI95cT.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RnqLTKoZ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/613/0%2AIq8tvnXBt_TI95cT.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##hZDBbsIwDIbveYoc000UAacdeJbKDRYEhbiKne0w8eyd01YIgQY@xLL1@Xf8n@Eb2OcwyCrRAccxolimC3bhMlAWSNL1hUNC5m7I5DXbvfURNP8aq@EpseTihbKDZmnWkFPgFpSGqXWdXhYcNq5/4jJyiaLwMvVp@xuQUUpO99yD4Nbd69UTvCrtXm1Yqg/r36@5mipZr9fphD/vDHKbxqzX80BgC9UiCamABEqT1u0vlWlnT7aNqVZSxDbS0c1IY1VoZ555Pfk//Gsc/wA"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--M3zxFLEm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/617/0%2A2gfsAqD5tPwnh_Vf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--M3zxFLEm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/617/0%2A2gfsAqD5tPwnh_Vf.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
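&lt;p&gt;A minimal es2017 reconstruction of the object version (based on the description, not the linked snippet):&lt;/p&gt;

```javascript
// An object stores the process state; methods are invoked as the story unfolds.
class SomeImportantBusinessProcess {
  step1(a) {
    this.a = a; // remember "a" until "b" arrives
  }
  step2(b) {
    return (this.a + b) * 3;
  }
}

const proc = new SomeImportantBusinessProcess();
proc.step1(1);
console.log(proc.step2(2)); // 9
```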

&lt;p&gt;The variable “proc” here plays the same role as the previous “proc”. It is a continuation: execution can resume from it.&lt;/p&gt;

&lt;h3&gt;State machine&lt;/h3&gt;

&lt;p&gt;However, an object does not exactly represent the process: step1 and step2 are peers, and nothing states that step1 should happen before step2. A more accurate representation is a state machine:&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##rZLRasMgFIbvfQrvmmyt0PRqgz7JGGLMKRPMMeiRFUqfPVMTtox1CxvzRsz5z/9/OWqjGkfrtLI8uB6k6QfnSSHJNgaDEIIcvNNp50d@ubIVkZDSYAfnJF5RMnaKqMk4XFMKhNdK1YynNZOCPeUEoB5IkWotVJfrds1ossjNQqV29XFEOJMMBEP6vMn7flOKHih6LBoG2P2CuZhUuXHL20WwhxAtFfiCcc/bHzCaLxizwR9omkKzHKNOMYfv0ObTHde3@dDY/2HLpvOghBBLvnS5L66bgZ4@Ezwvwyfh0qQgTC45Z/05lke2r9luNzWYwBXXDslgVPk/2Pt4cv2xUDc1G7xBqqZazVP34Zbwhu5hHN8A"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--zKz70ia2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/618/0%2AiFEPjY7vjABwyQg0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zKz70ia2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/618/0%2AiFEPjY7vjABwyQg0.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##jZJNboMwEIX3nMI7TH8chay6yEmqCg3OKHFlbGSP20pVzk49QKOENk29YMTw5ntPHl7hDaIOpqdH53c4DBZJRN9hY7reBwJHTZuicRhj0wevcxVboS3k@lmIfLR3kULS5IOEam7yoYOJCrIaLlsOP6iJhH3@VHJdl6PgOD7Hhmx/gALGZCmPzNh70d7A1uVJEJBScOeghWMtzw35EnTGbP6KML/dCX09h0vW/i8ET0mlFIR9XEbpkA5@N/s@X5q8LPGTWGmwVrL0QXxTZ7djwUzeJQfE91vrluuqWK2mARMF8MLJuARkvBtZp3thzRhN1lXB/4W3qKzfy0lRiczZFL/Ir6qfhuEL"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HXmujFjC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/626/0%2AzAalUgdrhIQnyyC9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HXmujFjC--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/626/0%2AzAalUgdrhIQnyyC9.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The “next_step” variable is a cursor that keeps track of execution status. On x86-64 CPUs, the instruction pointer register “RIP” plays a similar role to “next_step” here.&lt;/p&gt;
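&lt;p&gt;A minimal es2017 sketch of the state machine (reconstructed from the description; the error messages are mine):&lt;/p&gt;

```javascript
// A state machine adds a cursor ("next_step") so the ordering is explicit.
class SomeImportantBusinessProcess {
  constructor() {
    this.next_step = 'step1';
  }
  step1(a) {
    if (this.next_step !== 'step1') throw new Error('expected ' + this.next_step);
    this.a = a;
    this.next_step = 'step2';
  }
  step2(b) {
    if (this.next_step !== 'step2') throw new Error('expected ' + this.next_step);
    this.next_step = 'done';
    return (this.a + b) * 3;
  }
}
```

&lt;p&gt;Calling step2 before step1 now fails loudly instead of silently computing with an undefined “a”.&lt;/p&gt;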

&lt;h3&gt;Coroutine&lt;/h3&gt;

&lt;p&gt;A coroutine works like a state machine but looks like a function, and it takes much less code to express the same idea. There are two styles of coroutine: one with “yield” (also known as a “generator”), and one with “async/await”. We will look at how “yield” works first.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##hZBLDsIgEIb3nGKWoNZEu3LhWRpKx4SEQsNj4elxaKutRi0bIPP9DzBJ5nxLVkXtLATXY6P7wfkobWzaFLTFEJrBO0U7l4IBLeOUNNDCFZTzLkWCjneNpuPruceQTCRIwh7acfCJT8hapIivx/tLPh92oP56oO0Ym0xK3bdyyqOMyDeeJ1hVTVodqLRylsRJlp9hi1fJ65EX7gAnMUc2h6XwD/Ys2OC1jc/GQGk12xR@UV1yfgA"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--p9Ug2x1m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/633/0%2A2YWa0WAsue2z0Ygo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--p9Ug2x1m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/633/0%2A2YWa0WAsue2z0Ygo.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##hY/RDoIgFIbveYpzKbZ05VUXPYtDPDUagpODq7WenUCdrdUmN8B/vo/93MQonBxUT3tjWwzh4o0kZU0OznZYq663AwlDdeOdMuhc3Q9Wxj0THJ4M4tJI0MAZHgp1uyYDOq8pxgJ20EzxBCyDlZMRqabbaiyHHOSv9mIsWalEJLc6HjgryxlWLjaR1pAyXqQfshQXBu@UcfbV@DM4chYVZzUW2l6zmShGoT1yiC9X7I@04ZxCeAM"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--B8k7MJvW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/632/0%2AtiTU_KLed24o1klO.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--B8k7MJvW--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/632/0%2AtiTU_KLed24o1klO.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Yield kills two birds with one stone: it returns a value to the caller and gets new input back. Visually, it looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dj0Cv4w8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/463/0%2ASmYYhp2mBrbg-njt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dj0Cv4w8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/463/0%2ASmYYhp2mBrbg-njt.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
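&lt;p&gt;A minimal es2017 generator version (reconstructed from the description; the yielded prompt strings are mine):&lt;/p&gt;

```javascript
// A generator pauses at each yield: it hands a value to the caller
// and receives fresh input when resumed.
function* some_important_business_process() {
  const a = yield 'give me a';
  const b = yield 'give me b';
  return (a + b) * 3;
}

const proc = some_important_business_process();
proc.next();    // run to the first yield
proc.next(1);   // send "a" in, run to the second yield
console.log(proc.next(2).value); // 9
```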

&lt;h3&gt;Coroutine object I&lt;/h3&gt;

&lt;p&gt;The coroutine is always resumed through the same function name, such as “next” or “resume”, which is less expressive than “step1” or “step2”. We can wrap the coroutine in an object to make the code look better.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##lZLBbsMgDIbvPAVHsqWRmp02ac@CgLgSEoEIjLap6rNnENJE7Tol5QLCn//f2JgoxtE4JQwNrgeu@8F5FBa5jEFbCIEP3qm00096vpANqOFc2w6@E7xBEnKKVqF2dotsLHwxURGa1lwpmFN2AOwBBQppgJ0v9ZZQkcjJTb5JCsp5FzFBjfIgENi1JlbY1VLe0D8aTPeH8RCiwQQK@krlErxPK9h9ci7nbblbpObDC1W79MB25bBCOdwDW55d4h4wejs1g6SkJ6YREIbjJFdTeSP2v@dEPm/TTgo7PWaD0s15vnt@1rGac3i9tj0DH@WlbUUGry1e@0wPhzSoh3DLHrDv4/gL"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--df_KbVXm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/632/0%2AwvCAy95sGUEca2Ke.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--df_KbVXm--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/632/0%2AwvCAy95sGUEca2Ke.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##lZDBbsMgDIbP4yl8JJ1K1fS0w54lItTbmChE2FSrpj175rAm2pRDNC4G@/8/bL/bqyWX/cD7mM44ji8lOvYp7oDSBTt/GVJmG7nrC/mIRN2Qk5OobQOfCuQEZOjhGW4ew3nJZKQSWNIWHqGv6Sq4FxadE8mpvhbH/bIDt7Z9qY2@jEROfBvQEONwFN48k@6nlh8ycskR@M2TifjBkv43tv2NnRexAk9cNQFEvbXOY6MOB6hiT7I0lyL7WOz0Q2XMyD/LrYU6p24bJR5KAU1Ir/pHYq42FGxA0Ce1drV6w/Q0jt8"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GN-QLyN3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2Aje87lK5U-diJAEUa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GN-QLyN3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2Aje87lK5U-diJAEUa.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
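&lt;p&gt;A minimal es2017 sketch of the wrapped coroutine (reconstructed from the description; the helper name makeProc is mine):&lt;/p&gt;

```javascript
// Wrap the generator so callers see expressive method names
// instead of a generic "next".
function* process() {
  const a = yield;
  const b = yield;
  return (a + b) * 3;
}

function makeProc() {
  const co = process();
  co.next(); // advance to the first yield
  return {
    step1(a) {
      co.next(a);
    },
    step2(b) {
      return co.next(b).value;
    },
  };
}

const proc = makeProc();
proc.step1(1);
console.log(proc.step2(2)); // 9
```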

&lt;h3&gt;Coroutine object II&lt;/h3&gt;

&lt;p&gt;Although we have added the methods “step1” and “step2”, nothing forces them to be invoked in that order. Also, what if we want two options for “step1”? Say, it could be “step1_add” or “step1_sub”. Let’s upgrade the coroutine object to version 2.&lt;/p&gt;

&lt;p&gt;lua version: &lt;a href="https://tio.run/##jVPbjoIwEH3nK/om3UUS9Wkf/JamwKgkpSW9qBvj/jo7tVhUJEhi0Jkz55y5KBzvOqFKLohRDbC6aZW2XFpWOFNLMIa1WpX4JltyuSYzoJyxWlZwRvDOydLWSqYGxC4jkjdAE4JPUDsiRPPTHuwIUO8waw8gb7/8o8E6LcnxFgBZJQ/BqMP13tBY4klzH0Id/4qJIN@APagKc172VadUWjmLjeUajGvg5jD3LWZ9IY1O/Ce5e5ibYS7hlPLHMXhmdGHAIjG3vBCQXq7ZHFGgiLaQYfBcauAW0jgXOtX6UPJbg6hGQN@8sDGIa0nvtVuyMBbaFeNVtaDPywqD9KV@9OR7WEVeRBAIA28JjStmCJeThE8VoLXS6cJJOLdQWkCfA7g/oXtbr5MIapT89bbW7yx9KhCG6Xe0SUbt9F@@SBlzE2aSQBs0pi@UPv43fDRcaH8ln9zniiYsGxz6VB53cym26ytZLtGBExUpgMQ7QI1a2jg7xGzeE63x0kbYn677Bw"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--fUD-dvCi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2AdHmec8p3Nsfu-ZjX.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--fUD-dvCi--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2AdHmec8p3Nsfu-ZjX.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;es2017 version: &lt;a href="https://tio.run/##jVNNc4IwEL3nV2xPBKs46s2O/6H3TocJYdW0mDBJ8GMcfjtNAFGhreYCyb59u/vy8sX2zHAtcjuRKsWqWheSW6EkGLXDWOxypS2TNk4KIyQaE@dacfelLIQzAbcytOAPYQWX5BG9BC8AjabIbHf0sUO7VekYmN6YT5d5EpilXdgchOVb2oBuqfzizCAExmI@i1maBsu7qF9NMcfK4LWuECUDTKKRff/Ja4rkf97J87wprplLG9LZrVYHCAqJxxy5xTToIOVVKcl22NOpr6ZYA/UweFmtmgnmQV@1p4r5m/L3uCCDkdufEfAuNuilpGH99W6IXCHb7jXaQkuQeIB3rY4n6gHjmwY3aJedefyOWjcw2jH8boGW8JJBvTp9zA2uIWs6undeeJdS9iQpwzdSEnJj8EePYhaS6bSZv74G9w4OIsta9fHIMfcNk@uTcKQdvPYdPSdLmJchOCKuCqdwgtC5nXAljcowytSGNgzRnmUF1vjJjAxZXRMPshZV9QM"&gt;https://tio.run/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--2pR4XznX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2AZPis7MPmj9v4S76a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--2pR4XznX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://cdn-images-1.medium.com/max/638/0%2AZPis7MPmj9v4S76a.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
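&lt;p&gt;A hypothetical es2017 sketch of version 2 (the protocol of yielding the allowed method names is my reconstruction of the idea, not the linked code):&lt;/p&gt;

```javascript
// Version 2: the coroutine yields the list of methods it currently accepts,
// which both enforces the order and allows two variants of "step1".
function* process() {
  const [op, a] = yield ['step1_add', 'step1_sub'];
  const b = yield ['step2'];
  return op === 'step1_add' ? (a + b) * 3 : (a - b) * 3;
}

function makeProc() {
  const co = process();
  let allowed = co.next().value; // the methods we may call right now
  function invoke(name, arg) {
    if (!allowed.includes(name)) throw new Error('unexpected step: ' + name);
    const step = name.startsWith('step1') ? co.next([name, arg]) : co.next(arg);
    allowed = step.done ? [] : step.value;
    return step.value;
  }
  return {
    step1_add(a) { return invoke('step1_add', a); },
    step1_sub(a) { return invoke('step1_sub', a); },
    step2(b) { return invoke('step2', b); },
  };
}
```

&lt;p&gt;The wrapper now rejects out-of-order calls, and the coroutine itself decides which “step1” variant it accepted.&lt;/p&gt;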

&lt;p&gt;We can see that coroutines and objects are very similar, and the coroutine is more powerful. But without direct syntax support, writing methods for a coroutine is cumbersome.&lt;br&gt;&lt;br&gt;
This concludes the first chapter. We have implemented a very simple process in these ways:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;function&lt;/li&gt;
&lt;li&gt;object =&amp;gt; state machine&lt;/li&gt;
&lt;li&gt;coroutine =&amp;gt; coroutine object&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A plain function is inadequate for real-world usage, so we choose either an object or a coroutine to represent the future. An object is explicit about its expectations but implicit about its state; a coroutine is explicit about its state but implicit about its expectations. We have tried to merge object and coroutine into one, with limited success.&lt;/p&gt;

&lt;p&gt;Along the way, we have also seen several forms of continuation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;partial function&lt;/li&gt;
&lt;li&gt;object instance&lt;/li&gt;
&lt;li&gt;coroutine instance&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Continuations will form the foundation of complex scheduling in the following chapters.&lt;/p&gt;

&lt;p&gt;See also:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/software-engineering-problems/what-makes-code-unreadable-to-human-45fa30544386?source=collection_detail----1d0a7ad41f0e-----1---------------------"&gt;What makes code unreadable&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/software-engineering-problems/what-makes-up-the-software-20f607da9155?source=collection_detail----1d0a7ad41f0e-----2---------------------"&gt;What makes good software architecture&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://medium.com/software-engineering-problems/4-major-problems-of-programming-language-c6974c24e083?source=collection_detail----1d0a7ad41f0e-----3---------------------"&gt;3 Roles of Programming Languages&lt;/a&gt;&lt;/p&gt;





</description>
      <category>javascript</category>
      <category>programming</category>
      <category>ai</category>
      <category>lua</category>
    </item>
    <item>
      <title>What makes code slow to execute</title>
      <dc:creator>Tao Wen</dc:creator>
      <pubDate>Sun, 30 Sep 2018 08:08:55 +0000</pubDate>
      <link>https://forem.com/taowen/what-makes-code-slow-to-execute-1d6j</link>
      <guid>https://forem.com/taowen/what-makes-code-slow-to-execute-1d6j</guid>
      <description>

&lt;p&gt;“Within the light cone is destiny” -- Cixin Liu&lt;/p&gt;

&lt;p&gt;Performance is primarily determined by the algorithm of the code; there is no question about that. An algorithm written in Python can be a lot slower than the same one written in C. To me, this is also an algorithm problem, if you consider the code and its language as a whole: it takes far more machine instructions to do the same thing in Python than in C, and the sequence of machine instructions is itself an algorithm.&lt;/p&gt;

&lt;p&gt;The x86 instruction set can be seen as a high-level language for the micro-code used internally by the CPU. How the CPU decodes x86 instructions into micro-code, with speculative execution, can cause significant performance differences. This can also be categorized as an algorithm problem (one of increasing concurrency), and it is not what I want to talk about here.&lt;/p&gt;

&lt;p&gt;Besides the algorithm, what else can make code slow to execute on the machine? The answer is "&lt;strong&gt;data locality&lt;/strong&gt;": it takes time to read and write the data for computation. There are multiple examples of it:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;CPU Cache&lt;/li&gt;
&lt;li&gt;GPU Architecture&lt;/li&gt;
&lt;li&gt;Distributed Map/Reduce&lt;/li&gt;
&lt;li&gt;Heap Management&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Walking through these examples, we can see why &lt;strong&gt;Heterogeneous Computing&lt;/strong&gt; is inevitable, and why the programming model of mainstream programming languages is overly simplified.&lt;/p&gt;

&lt;h1&gt;CPU Cache&lt;/h1&gt;

&lt;p&gt;Consider this example  &lt;a href="https://stackoverflow.com/questions/9936132/why-does-the-order-of-the-loops-affect-performance-when-iterating-over-a-2d-arra"&gt;https://stackoverflow.com/questions/9936132/why-does-the-order-of-the-loops-affect-performance-when-iterating-over-a-2d-arra&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Iterate matrix column-wise&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight c"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include &amp;lt;stdio.h&amp;gt;
#include &amp;lt;stdlib.h&amp;gt;
&lt;/span&gt;
&lt;span class="n"&gt;main&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Iterate matrix row-wise&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight c"&gt;&lt;code&gt;&lt;span class="cp"&gt;#include &amp;lt;stdio.h&amp;gt;
#include &amp;lt;stdlib.h&amp;gt;
&lt;/span&gt;
&lt;span class="n"&gt;main&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="kt"&gt;int&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;];&lt;/span&gt;
  &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
     &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;4000&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="o"&gt;++&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
       &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;][&lt;/span&gt;&lt;span class="n"&gt;i&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;i&lt;/span&gt; &lt;span class="o"&gt;+&lt;/span&gt; &lt;span class="n"&gt;j&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
   &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;The row-wise version is 3x to 10x faster than the column-wise version. When memory addresses are written contiguously, the writes can be combined into a single request, saving time on round-trips.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ap-MsYM_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46247871-3fae2e00-c444-11e8-9d39-550623ad4bc0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ap-MsYM_--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46247871-3fae2e00-c444-11e8-9d39-550623ad4bc0.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;According to &lt;a href="https://en.wikipedia.org/wiki/CPU_cache"&gt;https://en.wikipedia.org/wiki/CPU_cache&lt;/a&gt;, as x86 microprocessors reached clock rates of 20 MHz and above in the 386, small amounts of fast cache memory began to be featured in systems to improve performance. The biggest problem in processor design is that memory cannot keep up with the speed of the processor.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--GWbZoEl1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46323686-0a5f3700-c622-11e8-8759-12710e920251.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--GWbZoEl1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46323686-0a5f3700-c622-11e8-8759-12710e920251.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;As the chart above shows, a modern CPU can crunch a lot of numbers per second. The theoretical data transfer rate needed to keep the CPU busy is far higher than the bandwidth main memory can currently provide.&lt;/p&gt;

&lt;p&gt;The latency between the CPU and main memory is ultimately governed by the speed of light. We have to bring memory closer to the CPU to lower the latency. But on-chip memory (L1/L2/L3 cache) cannot be too large; otherwise the circuits get long, and latency gets high again.&lt;/p&gt;

&lt;h1&gt;GPU Architecture&lt;/h1&gt;

&lt;p&gt;In CPU design, the cache is invisible and memory is coherent: you do not need to handle synchronization between the caches and memory of all sockets yourself, although you do need memory barriers to declare the ordering of actions on a memory address, as if they contended on the memory itself directly. To make this happen, the CPU needs a high-speed ring/mesh interconnect to synchronize the caches across cores and sockets.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---B302QD7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46253034-bbd85e00-c4a4-11e8-82ae-575f770937f2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---B302QD7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46253034-bbd85e00-c4a4-11e8-82ae-575f770937f2.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This programming model prevents the number of cores from growing. It is too costly to maintain the illusion of directly shared memory between cores, because the latency of data transfer is inevitable: every memory change needs to be broadcast to all caches.&lt;/p&gt;

&lt;p&gt;GPU architecture is different. It tackles the memory latency issue by making the "cache" explicit: instead of calling it a "cache", it is just another layer of memory you can use.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9mKd5WuO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46254299-c3f1c700-c4bf-11e8-8f24-266ee57d7f2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9mKd5WuO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46254299-c3f1c700-c4bf-11e8-8f24-266ee57d7f2b.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this programming model, there is not just one "heap" of memory; instead, each variable needs to be explicit about its scope. For example:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;__global__ void parallel_shared_reduce_kernel(float *d_out, float *d_in){
    int myID = threadIdx.x + blockIdx.x * blockDim.x;
    int tid = threadIdx.x;
    extern __shared__ float sdata[];
    sdata[tid] = d_in[myID];
    __syncthreads();

    // each iteration splits the threads in two and adds the right half
    // onto the left half, halving the element count until one remains
    for(unsigned int s = blockDim.x / 2; s &amp;gt; 0; s &amp;gt;&amp;gt;= 1){
        if(tid &amp;lt; s){
            sdata[tid] += sdata[tid + s];
        }
        __syncthreads(); // ensure all adds in this iteration are done
    }
    if (tid == 0){
        d_out[blockIdx.x] = sdata[0]; // the block's result is in element 0
    }
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;A variable either has no qualifier, or is marked &lt;code&gt;__shared__&lt;/code&gt; or &lt;code&gt;__global__&lt;/code&gt;. In CPU programming we have to guess how the cache behaves; in GPU programming we can simply state which memory we want.&lt;/p&gt;

&lt;h1&gt;Distributed Map/Reduce&lt;/h1&gt;

&lt;p&gt;Map/Reduce can handle big data in a distributed way because it avoids data transfer: the computation runs on the node where the data originally is.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eY1iE0gE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46254553-b8080400-c4c3-11e8-9ad2-46b2ff7c5a13.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eY1iE0gE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46254553-b8080400-c4c3-11e8-9ad2-46b2ff7c5a13.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The data sent from the "map" nodes to the "reduce" nodes is substantially smaller than the original input, because partial computation has already been done.&lt;/p&gt;
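
&lt;p&gt;A single-process sketch of this shape (a hypothetical word-count job; in a real cluster each &lt;code&gt;map_phase&lt;/code&gt; would run on the node holding its input split, so only the small per-word partial counts cross the network):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight cpp"&gt;&lt;code&gt;#include &amp;lt;cassert&amp;gt;
#include &amp;lt;map&amp;gt;
#include &amp;lt;sstream&amp;gt;
#include &amp;lt;string&amp;gt;
#include &amp;lt;vector&amp;gt;

// "map": each node counts the words in its own input split, so only
// small (word, count) pairs ever leave the node
std::map&amp;lt;std::string, int&amp;gt; map_phase(const std::string&amp;amp; split) {
    std::map&amp;lt;std::string, int&amp;gt; partial;
    std::istringstream in(split);
    std::string word;
    while (in &amp;gt;&amp;gt; word) partial[word]++;
    return partial;
}

// "reduce": merge the partial counts shipped from the map nodes
std::map&amp;lt;std::string, int&amp;gt; reduce_phase(
        const std::vector&amp;lt;std::map&amp;lt;std::string, int&amp;gt;&amp;gt;&amp;amp; partials) {
    std::map&amp;lt;std::string, int&amp;gt; total;
    for (const auto&amp;amp; p : partials)
        for (const auto&amp;amp; [word, n] : p) total[word] += n;
    return total;
}

int main() {
    // two input splits, each imagined to live on its own node
    auto total = reduce_phase({map_phase("a b a"), map_phase("b c")});
    assert(total["a"] == 2 &amp;amp;&amp;amp; total["b"] == 2 &amp;amp;&amp;amp; total["c"] == 1);
    return 0;
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;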

&lt;h1&gt;Heap Management&lt;/h1&gt;

&lt;p&gt;A big heap is hard to manage. In modern C++ or Rust, we use ownership to manage memory. Essentially, instead of managing the whole heap as one connected object graph with a garbage collector, the heap is divided into smaller scopes. For example:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight cpp"&gt;&lt;code&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;widget&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;  
&lt;span class="k"&gt;private&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt;  
  &lt;span class="n"&gt;gadget&lt;/span&gt; &lt;span class="n"&gt;g&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;   &lt;span class="c1"&gt;// lifetime automatically tied to enclosing object  &lt;/span&gt;
&lt;span class="k"&gt;public&lt;/span&gt;&lt;span class="o"&gt;:&lt;/span&gt;  
  &lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="n"&gt;draw&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  
&lt;span class="p"&gt;};&lt;/span&gt;  

&lt;span class="kt"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;functionUsingWidget&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;  
  &lt;span class="n"&gt;widget&lt;/span&gt; &lt;span class="n"&gt;w&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;   &lt;span class="c1"&gt;// lifetime automatically tied to enclosing scope  &lt;/span&gt;
              &lt;span class="c1"&gt;// constructs w, including the w.g gadget member  &lt;/span&gt;
  &lt;span class="err"&gt;…&lt;/span&gt;  
  &lt;span class="n"&gt;w&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;draw&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;  
  &lt;span class="err"&gt;…&lt;/span&gt;  
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="c1"&gt;// automatic destruction and deallocation for w and w.g  &lt;/span&gt;
  &lt;span class="c1"&gt;// automatic exception safety,   &lt;/span&gt;
  &lt;span class="c1"&gt;// as if "finally { w.dispose(); w.g.dispose(); }"  &lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;In many cases, the lifetime of a memory address aligns well with function entry/exit or object construction/destruction. In other cases, an explicit annotation of the lifetime is needed:&lt;/p&gt;



&lt;div class="highlight"&gt;&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;fn&lt;/span&gt; &lt;span class="n"&gt;substr&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nv"&gt;'a&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nv"&gt;'a&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;until&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;u32&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nv"&gt;'a&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;A lifetime declaration like &lt;code&gt;'a&lt;/code&gt; is similar to &lt;code&gt;__shared__&lt;/code&gt; in CUDA. Explicit lifetime management, as in C++ and Rust, is one extreme; multi-thread-safe garbage-collected languages like Java and Go are the other. A language like Nim partitions the memory by thread, which lies in the middle of the spectrum. To ensure memory in different partitions is not accidentally shared, it must be copied when crossing the boundary. Again, minimizing the cost of that copying is a data-locality problem.&lt;/p&gt;
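
&lt;p&gt;The "copy when crossing the boundary" rule can be sketched in C++ by capturing data by value when handing it to a thread (a toy example; Nim enforces this partitioning at the language level):&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight cpp"&gt;&lt;code&gt;#include &amp;lt;cassert&amp;gt;
#include &amp;lt;future&amp;gt;
#include &amp;lt;thread&amp;gt;
#include &amp;lt;vector&amp;gt;

int main() {
    std::vector&amp;lt;int&amp;gt; data{1, 2, 3};
    std::promise&amp;lt;int&amp;gt; result;
    auto fut = result.get_future();
    // the lambda captures `data` by value: the thread works on its own
    // copy, and the only thing crossing back is the summed result
    std::thread t([data, &amp;amp;result] {
        int sum = 0;
        for (int x : data) sum += x;
        result.set_value(sum);
    });
    assert(fut.get() == 6);
    t.join();
    return 0;
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;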

&lt;p&gt;The JVM has garbage collection. Instead of annotating the source code statically, it tries to figure out at runtime which objects belong together, by dividing the big heap into rough generations:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--gl4b81P2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46323083-14cc0180-c61f-11e8-8870-4f51dec4f681.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--gl4b81P2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46323083-14cc0180-c61f-11e8-8870-4f51dec4f681.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Heap management becomes a lot easier if we do not share one big heap in the first place. There are cases where shared-memory multiprocessing is beneficial (&lt;a href="https://en.wikipedia.org/wiki/Symmetric_multiprocessing"&gt;https://en.wikipedia.org/wiki/Symmetric_multiprocessing&lt;/a&gt;). But in many cases, co-locating data and computation (the actor programming model) is easier.&lt;/p&gt;

&lt;h1&gt;Co-locate data and computation&lt;/h1&gt;

&lt;p&gt;Instructions per clock are no longer growing quickly.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s---dh7W85z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46322713-13013e80-c61d-11e8-9937-a34c1909b393.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s---dh7W85z--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46322713-13013e80-c61d-11e8-9937-a34c1909b393.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Main memory access latency is nearly flat.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_8TpIlk5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46322909-1d700800-c61e-11e8-95f5-c07549e7e67c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_8TpIlk5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/40541/46322909-1d700800-c61e-11e8-95f5-c07549e7e67c.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Having to deal with more than one processor, each with its own memory (heterogeneous computing), is likely to become pervasive. The clear trend is that in this paradigm data and computation need to be co-located. The problem is how to partition them so that data copying or message passing is minimized.&lt;/p&gt;
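
&lt;p&gt;A minimal actor sketch in C++ (names are hypothetical; real actor runtimes add scheduling and supervision): the counter state is touched only by the actor's own thread, and other threads send copied messages instead of sharing memory with it.&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight cpp"&gt;&lt;code&gt;#include &amp;lt;cassert&amp;gt;
#include &amp;lt;condition_variable&amp;gt;
#include &amp;lt;mutex&amp;gt;
#include &amp;lt;queue&amp;gt;
#include &amp;lt;thread&amp;gt;

// a minimal actor: `total_` lives only on the actor's own thread,
// so the data and the computation on it are co-located
class CounterActor {
    std::queue&amp;lt;int&amp;gt; mailbox_;
    std::mutex m_;
    std::condition_variable cv_;
    bool done_ = false;
    int total_ = 0;                   // owned exclusively by run()
    std::thread worker_;

    void run() {
        for (;;) {
            std::unique_lock&amp;lt;std::mutex&amp;gt; lk(m_);
            cv_.wait(lk, [this] { return !mailbox_.empty() || done_; });
            if (mailbox_.empty()) return;
            int msg = mailbox_.front();  // the message is copied out
            mailbox_.pop();
            lk.unlock();
            total_ += msg;            // computation happens where the data is
        }
    }

public:
    CounterActor() : worker_([this] { run(); }) {}
    void send(int msg) {              // other threads only send copies
        { std::lock_guard&amp;lt;std::mutex&amp;gt; lk(m_); mailbox_.push(msg); }
        cv_.notify_one();
    }
    int join() {                      // stop the actor and read its result
        { std::lock_guard&amp;lt;std::mutex&amp;gt; lk(m_); done_ = true; }
        cv_.notify_one();
        worker_.join();
        return total_;
    }
};

int main() {
    CounterActor actor;
    for (int i = 1; i &amp;lt;= 4; i++) actor.send(i);
    assert(actor.join() == 10);
    return 0;
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;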





</description>
      <category>hardware</category>
      <category>softwaredevelopment</category>
      <category>programming</category>
      <category>bigdata</category>
    </item>
  </channel>
</rss>
