<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Rasheem</title>
    <description>The latest articles on Forem by Rasheem (@rasheem).</description>
    <link>https://forem.com/rasheem</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1509661%2F16279695-5132-4d5f-b5b4-d9b9d06bfa27.png</url>
      <title>Forem: Rasheem</title>
      <link>https://forem.com/rasheem</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/rasheem"/>
    <language>en</language>
    <item>
      <title>How computers actually see time</title>
      <dc:creator>Rasheem</dc:creator>
      <pubDate>Thu, 11 Dec 2025 14:09:50 +0000</pubDate>
      <link>https://forem.com/rasheem/how-computers-actually-see-time-58p8</link>
      <guid>https://forem.com/rasheem/how-computers-actually-see-time-58p8</guid>
      <description>&lt;p&gt;I saved a record at 5 PM here in India.&lt;br&gt;
When I viewed that same data as a user in the UAE, it still showed 5 PM.&lt;br&gt;
It was wrong. It should have been 3:30 PM.&lt;br&gt;
That specific bug forced me to stop guessing and actually understand how machines measure time. It turns out it’s shockingly simple compared to the mess of timezones we deal with.&lt;/p&gt;

&lt;p&gt;Everything starts from one fixed moment: the Unix epoch (Jan 1, 1970, 00:00:00 UTC).&lt;br&gt;
From that exact second, computers just start counting. Every timestamp is nothing more than "how many seconds (or milliseconds, depending on the system) have passed since 1970." That’s it. Just a number.&lt;/p&gt;
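
&lt;p&gt;You can watch this happen in PostgreSQL itself. A minimal sketch (the first value depends on when you run it):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- The machine view: "now" as a raw count of seconds since
-- Jan 1, 1970, 00:00:00 UTC (the Unix epoch).
SELECT extract(epoch FROM now());   -- e.g. 1765462190.123456

-- And back again: the number 0 is always the epoch itself.
SELECT to_timestamp(0);             -- 1970-01-01 00:00:00+00 (shown in your session's zone)
&lt;/code&gt;&lt;/pre&gt;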

&lt;p&gt;The biggest realization I had is what’s not inside that number.&lt;br&gt;
There is no timezone hiding in a Unix timestamp. It’s just a raw integer. The timestamp 0 always refers to the same instant in reality. It stays the same everywhere. The timezone only gets involved when you convert that number into a readable date.&lt;/p&gt;
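
&lt;p&gt;Rendering the same instant against three zones makes this concrete. The integer never changes; only its presentation does:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- One instant (timestamp 0), three wall clocks.
SELECT to_timestamp(0) AT TIME ZONE 'UTC';           -- 1970-01-01 00:00:00
SELECT to_timestamp(0) AT TIME ZONE 'Asia/Kolkata';  -- 1970-01-01 05:30:00
SELECT to_timestamp(0) AT TIME ZONE 'Asia/Dubai';    -- 1970-01-01 04:00:00
&lt;/code&gt;&lt;/pre&gt;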

&lt;p&gt;This is exactly where I messed up with PostgreSQL.&lt;br&gt;
I was using TIMESTAMP (without time zone). I was basically saving a picture of a clock. I saved "5 PM," and the database blindly showed "5 PM" to everyone. It had no idea that my 5 PM is very different from 5 PM in Dubai.&lt;br&gt;
If I had used TIMESTAMPTZ (TIMESTAMP WITH TIME ZONE), the database would have been smarter. It would have taken my 5 PM India time, converted it to the universal machine baseline (11:30 AM UTC), and stored that.&lt;br&gt;
Then, when the UAE user asked for the data, the database would have used that session’s timezone setting and done the math automatically: 11:30 AM UTC + 4 hours = 3:30 PM.&lt;/p&gt;
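
&lt;p&gt;Here’s a minimal sketch of both behaviours side by side, assuming a hypothetical records table and one psql session per user:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;-- One naive column, one timezone-aware column.
CREATE TABLE records (
  naive_at TIMESTAMP,     -- saves a picture of the clock, no zone
  aware_at TIMESTAMPTZ    -- normalizes to UTC on the way in
);

-- The writer in India (session timezone Asia/Kolkata) saves 5 PM.
SET TIME ZONE 'Asia/Kolkata';
INSERT INTO records VALUES ('2025-12-11 17:00', '2025-12-11 17:00');

-- The reader in the UAE (session timezone Asia/Dubai) fetches it.
SET TIME ZONE 'Asia/Dubai';
SELECT naive_at, aware_at FROM records;
-- naive_at: 2025-12-11 17:00:00       (still "5 PM": the bug)
-- aware_at: 2025-12-11 15:30:00+04    (3:30 PM: the fix)
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Note that TIMESTAMPTZ doesn’t actually store a timezone either; it normalizes everything to UTC internally and converts on the way out using each session’s timezone setting.&lt;/p&gt;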

&lt;p&gt;It’s a tiny piece of engineering, but it’s the difference between a system that breaks across borders and one that keeps the whole world in sync.&lt;br&gt;
Store in UTC. Display in local time.&lt;/p&gt;

</description>
      <category>softwareengineering</category>
      <category>postgres</category>
      <category>backenddevelopment</category>
      <category>systemsthinking</category>
    </item>
  </channel>
</rss>
