<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Caf RF</title>
    <description>The latest articles on Forem by Caf RF (@caf_rf_1986).</description>
    <link>https://forem.com/caf_rf_1986</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F998119%2Fc7ffb016-3bcc-45c2-877b-d6d0ef04e36a.jpg</url>
      <title>Forem: Caf RF</title>
      <link>https://forem.com/caf_rf_1986</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/caf_rf_1986"/>
    <language>en</language>
    <item>
      <title>January 1, 1970 =&gt; January 19, 2038 Error!</title>
      <dc:creator>Caf RF</dc:creator>
      <pubDate>Sat, 24 Feb 2024 02:24:42 +0000</pubDate>
      <link>https://forem.com/caf_rf_1986/january-1-1970-january-19-2038-error-4ibj</link>
      <guid>https://forem.com/caf_rf_1986/january-1-1970-january-19-2038-error-4ibj</guid>
      <description>&lt;p&gt;There have always been key moments in history … important dates … memorable days. Hopefully, we will reach 2038, a historic moment.&lt;/p&gt;

&lt;p&gt;Have you noticed that when setting a date, whether with JavaScript or another language, the initial date is always January 1, 1970?&lt;/p&gt;

&lt;p&gt;Well, I'll explain it to you. This is because it is Unix time. But first, let's talk a little about what UNIX is.&lt;/p&gt;

&lt;p&gt;In the late 1960s, MIT, AT&amp;amp;T, and General Electric were working to create an experimental operating system called Multics, which was to run on a mainframe computer. However, the first versions had poor performance.&lt;/p&gt;

&lt;p&gt;Later, Ken Thompson of Bell Labs programmed a 'Space Travel' game but discovered that the game was slow and expensive on a General Electric machine. Thompson rewrote the program with the help of Dennis Ritchie on a DEC PDP-7 model computer.&lt;/p&gt;

&lt;p&gt;The consequence: the start of a new operating system.&lt;/p&gt;

&lt;p&gt;Later, Rudd Canaday joined, among other programmers. They developed a command interpreter and a small set of programs. Later, they achieved text processing, among other things, and with it … financing and success! In 1972, it was decided to rewrite UNIX in the C language, so that it could be adapted to work on other computers.&lt;/p&gt;

&lt;p&gt;Thanks, gamers and great programmers!&lt;/p&gt;

&lt;p&gt;The date they decided to use as a base was January 1, 1970.&lt;/p&gt;

&lt;p&gt;Briefly, UNIX is a portable, multitasking, multiuser operating system.&lt;/p&gt;

&lt;p&gt;By the way, in 1991, a Helsinki student named Linus Torvalds developed a kernel that emulated many of the functionalities of UNIX and released it as open source under the name Linux.&lt;/p&gt;

&lt;p&gt;Anyway, returning to the starting point: since January 1, 1970, seconds have been counted, and on most 32-bit systems the &lt;code&gt;time_t&lt;/code&gt; data type used to store this seconds counter is a 32-bit signed integer (it holds both positive and negative values). This means that the last second representable in this format will be 03:14:07 UTC on January 19, 2038, when the counter reaches 2,147,483,647. One second later, the counter will overflow and wrap around to -2,147,483,648, a value systems will interpret as a date back in December 1901.&lt;/p&gt;
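&lt;p&gt;A quick sketch in JavaScript (whose Date counts milliseconds since the same epoch) reproduces the limit and the wrap-around:&lt;/p&gt;

```javascript
// Last second representable by a 32-bit signed counter
const MAX_INT32 = 2 ** 31 - 1;
console.log(MAX_INT32); // 2147483647

// JavaScript Date takes milliseconds since January 1, 1970 (UTC)
const lastSecond = new Date(MAX_INT32 * 1000);
console.log(lastSecond.toISOString()); // 2038-01-19T03:14:07.000Z

// One second later, a 32-bit counter wraps to the minimum value
// (the | 0 operation forces 32-bit integer arithmetic)
const wrapped = (MAX_INT32 + 1) | 0;
console.log(wrapped); // -2147483648
```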

&lt;p&gt;It may not seem important, but this bug can cause quite a few crashes because many programs and devices have an internal clock, so … anything can fail.&lt;/p&gt;

&lt;p&gt;Would it be very irresponsible to know this and do nothing?&lt;/p&gt;

&lt;p&gt;YEAH&lt;/p&gt;

&lt;p&gt;Changing the definition of &lt;code&gt;time_t&lt;/code&gt; to use a 64-bit type would break binary support for software, data storage, and generally anything that has anything to do with the binary representation of time. Changing &lt;code&gt;time_t&lt;/code&gt; to an unsigned 32-bit integer would affect programs that do calculations with time differences. &lt;/p&gt;

&lt;p&gt;Most operating systems for 64-bit architectures use 64-bit integers for &lt;code&gt;time_t&lt;/code&gt;. The migration is still in progress but will surely be completed before 2038. Using a 64-bit signed integer pushes the problem back about 292 billion years (2.9 × 10&lt;sup&gt;11&lt;/sup&gt;), roughly 21 times the estimated age of the universe. &lt;/p&gt;
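&lt;p&gt;As a rough check (a sketch using JavaScript BigInt, so the 64-bit value does not lose precision):&lt;/p&gt;

```javascript
// Largest value of a signed 64-bit seconds counter
const MAX_INT64 = 2n ** 63n - 1n;

// Seconds in an average Gregorian year: 365.2425 days of 86,400 seconds
const SECONDS_PER_YEAR = 31556952n;

// How many years until a 64-bit time_t overflows
const years = MAX_INT64 / SECONDS_PER_YEAR;
console.log(years); // on the order of 292 billion years
```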

&lt;p&gt;But if you are one of those people who treasures an older device with an internal clock, then you will remember that historic date … or perhaps you were born on January 1, 1970.&lt;/p&gt;

&lt;p&gt;By the way, for those curious people, here is a link that shows you the current seconds: &lt;a href="https://www.unixtimestamp.com/en/index.php"&gt;HERE&lt;/a&gt;&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>learning</category>
      <category>codenewbie</category>
      <category>web3</category>
    </item>
    <item>
      <title>A Brief History of the Internet</title>
      <dc:creator>Caf RF</dc:creator>
      <pubDate>Fri, 26 Jan 2024 02:48:09 +0000</pubDate>
      <link>https://forem.com/caf_rf_1986/a-brief-history-of-the-internet-14b5</link>
      <guid>https://forem.com/caf_rf_1986/a-brief-history-of-the-internet-14b5</guid>
      <description>&lt;p&gt;The way to communicate before the Internet was the telegraph, which was invented in 1840 and was really very useful at the time.&lt;/p&gt;

&lt;p&gt;The true origin of the Internet is in 1958... seriously... back then, the USA founded the Advanced Research Projects Agency (ARPA) through its Department of Defense.&lt;/p&gt;

&lt;p&gt;The objective?&lt;/p&gt;

&lt;p&gt;To find a way to establish direct communication between two computers so that different research bases could communicate. Only that? ... of course not. In fact, it was created during the Cold War, and the intent was to make military communications less vulnerable by eliminating dependence on a central computer.&lt;/p&gt;

&lt;p&gt;To do this they needed about 200 scientists and a lot of money. In 1962, ARPA created a research program under the direction of J. C. R. Licklider, and in 1967 ARPANET (Advanced Research Projects Agency Network) was born, a computer network that compiled the best ideas from three teams: MIT, the National Physical Laboratory (UK), and the RAND Corporation. In 1969, the first 4 computers were connected through Leonard Kleinrock's Interface Message Processors, and the network began operating that same year.&lt;/p&gt;

&lt;p&gt;In 1971, ARPANET already had 23 connected points, and the first email was sent by Ray Tomlinson. The following year it was presented at the International Conference on Computer Communication in Washington, DC, where scientists demonstrated that the system worked by creating a network of 40 connected points in different places.&lt;/p&gt;

&lt;p&gt;Between 1974 and 1982, quite a few networks were created, but the following stood out:&lt;/p&gt;

&lt;p&gt;Telenet (1974), a commercial version of ARPANET&lt;br&gt;
Usenet (1979), an open system focused on newsgroup discussions that is still running!&lt;br&gt;
Bitnet (1981), which linked American universities using IBM systems&lt;br&gt;
Eunet (1982), which linked the United Kingdom, Scandinavia, and the Netherlands.&lt;/p&gt;

&lt;p&gt;Finally, on January 1, 1983, ARPANET adopted the TCP/IP protocol, and with that the Internet (a network of interconnected networks) was born.&lt;/p&gt;

&lt;p&gt;The use of the network was limited to exchanging emails and to serving as a documentary archive for storing global information. But locating and identifying information remained a fairly complex task. Fun fact: in 2014, Google mentioned that its search engine reflected only 0.004% of all information.&lt;/p&gt;

&lt;p&gt;After that, HTML was created, but you can read about it in another article &lt;a href="https://dev.to/caf_rf_1986/html-a-fighter-who-had-to-evolve-to-get-the-place-it-has-with-a-bit-of-help--gli"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The World Wide Web (WWW) was introduced in 1991. Two years later, CERN made the web technology freely available to everyone.&lt;/p&gt;

&lt;p&gt;The first web page is still active; it is very basic HTML with working links. You can access it &lt;a href="https://info.cern.ch/"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>learning</category>
      <category>codenewbie</category>
      <category>web</category>
    </item>
    <item>
      <title>VAR ... the Almighty Yet Obsolete</title>
      <dc:creator>Caf RF</dc:creator>
      <pubDate>Sun, 14 Jan 2024 01:46:45 +0000</pubDate>
      <link>https://forem.com/caf_rf_1986/var-the-almighty-yet-obsolete-40me</link>
      <guid>https://forem.com/caf_rf_1986/var-the-almighty-yet-obsolete-40me</guid>
<description>&lt;p&gt;A few days ago I read an article that caught my attention; it talked about var. I'll briefly explain it.&lt;/p&gt;

&lt;p&gt;Currently, to declare a variable in JS we use let, const, and, to a lesser extent, var. But var existed before let and const; in fact, var was the only way to declare a variable in JS.&lt;/p&gt;

&lt;p&gt;So why don't we currently use var?&lt;/p&gt;

&lt;p&gt;While var has been around since the beginning of JS, it hasn't changed much since then. JS, however, has. In fact, it has changed so much that var can now cause us problems.&lt;/p&gt;

&lt;p&gt;The reason?&lt;/p&gt;

&lt;p&gt;var ignores code blocks (because, long ago, JavaScript did not have lexical environments), it can be declared again without complaint, and it is always processed at the beginning of a function, a behavior known as hoisting.&lt;/p&gt;

&lt;p&gt;Let's first review the first case. When we use var inside an if statement or inside a loop, we can still access the variable outside the block (which does not happen with let or const).&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;if (true) {
     var testVar = true;
     let testLet = false;
}

console.log(testVar); // true
console.log(testLet); // ReferenceError: testLet is not defined
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The second case is very simple: var can be declared over and over again, unlike let or const, which would give us an error saying the variable has already been declared.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;var testVar = 0;
var testVar = 'new value';
console.log(testVar); // new value

let testLet = 'value';
let testLet = 'no new value'; // SyntaxError: Identifier 'testLet' has already been declared

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The third case, in functions, is my favorite: a var declaration can appear below the line where the variable is used, because var declarations are hoisted to the top of the function.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;function test() {
    testVar = 'some value';
    console.log(testVar); // some value
    var testVar;
}

console.log(testVar);  // undefined
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To avoid this strange behavior, programmers invented the IIFE (Immediately Invoked Function Expression).&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;(function() {
  // some code
})();

(function() {
  // some code
}());

!function() {
  // some code
}();

+function() {
  // some code
}();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
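&lt;p&gt;For contrast, here is a small sketch showing how block scope with let now covers the privacy that IIFEs used to provide (the counter and function names are just for illustration):&lt;/p&gt;

```javascript
// The old way: an IIFE keeps counter private to the returned function
var legacyNext = (function() {
  var counter = 0;
  return function() {
    counter += 1;
    return counter;
  };
})();

// The modern way: a plain block plus let gives the same privacy
let modernNext;
{
  let counter = 0;
  modernNext = function() {
    counter += 1;
    return counter;
  };
}

console.log(legacyNext()); // 1
console.log(legacyNext()); // 2
console.log(modernNext()); // 1
```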



&lt;p&gt;IIFEs are NOT commonly used today, so DO NOT write new code this way, but it is very nice to know a little history about something you are passionate about. By the way, you can read the article about var and all things JS &lt;a href="https://javascript.info/"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>learning</category>
      <category>architecture</category>
    </item>
    <item>
      <title>HTML a fighter who had to evolve to get the place It has … with a bit of help …</title>
      <dc:creator>Caf RF</dc:creator>
      <pubDate>Mon, 01 Jan 2024 02:06:58 +0000</pubDate>
      <link>https://forem.com/caf_rf_1986/html-a-fighter-who-had-to-evolve-to-get-the-place-it-has-with-a-bit-of-help--gli</link>
      <guid>https://forem.com/caf_rf_1986/html-a-fighter-who-had-to-evolve-to-get-the-place-it-has-with-a-bit-of-help--gli</guid>
<description>&lt;p&gt;Currently, when we mention HTML (HyperText Markup Language), we think of CSS and JS, which together form the basis of most web pages today. However, HTML had to evolve to survive and finally stand as a king within programming. Curiously, its origins go back to 1980, when Tim Berners-Lee, a CERN researcher, proposed a new 'hypertext' system for sharing documents.&lt;/p&gt;

&lt;p&gt;Tim later presented his system with Robert Cailliau in a call to develop a hypertext system for the Internet. They won!!! … and the first formal document describing HTML was published in 1991; you can see it &lt;a href="https://www.w3.org/History/19921103-hypertext/hypertext/WWW/MarkUp/Tags.html"&gt;HERE&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;However, two years passed, and in 1993 the IETF (Internet Engineering Task Force) made the first official proposal to turn HTML into a standard. But this time... they failed!!!&lt;/p&gt;

&lt;p&gt;Two more years had to pass. In 1995, the IETF organized an HTML working group and published the HTML 2.0 standard, the first official one. Thanks to this, HTML is as large and widely used as it is today... in part, since that alone was not enough.&lt;/p&gt;

&lt;p&gt;One more year: in 1996 the W3C (World Wide Web Consortium) took over the work on HTML 3.2, and in January 1997 it was published as the first W3C HTML Recommendation.&lt;/p&gt;

&lt;p&gt;The great 1998 arrived with an unexpected evolution to HTML 4.0, which brought in the amazing CSS and the possibility of including small scripts. You can already imagine what a great event this was, right? And even with this great leap, the last publication of HTML was in December 1999, with version 4.01.&lt;/p&gt;

&lt;p&gt;Standardization activity stopped…&lt;br&gt;
…the W3C focused on the development of the XHTML (XML-based HTML) standard…&lt;br&gt;
So why is HTML the standard today?&lt;/p&gt;

&lt;p&gt;In 2004 Apple, Mozilla and Opera organized themselves into a new association, the WHATWG (Web Hypertext Application Technology Working Group) that focused on the HTML standard. Thanks to the strength of the companies that are part of the WHATWG and the publication of drafts in 2007, the W3C resumed the HTML5 standardization activities, the first draft of which was published in 2008.&lt;/p&gt;

&lt;p&gt;Finally, thanks to Tim Berners-Lee, the IETF, the changing W3C, and the WHATWG, the final version of HTML5 was presented on October 27, 2014. Its intention is to create what has been called the Open Web Platform, where HTML5, together with CSS3 and JS, can be used for the development of cross-platform applications. You can read the 'Application Foundations' post published by Dr. Jeff Jaffe (W3C CEO) &lt;a href="https://www.w3.org/blog/2014/application-foundations-for-the-open-web-platform/"&gt;HERE&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;HTML5 was the first language with which I had contact and one of the ones that I feel is most accessible to use thanks to its syntax, a language that, if it were not for its evolution and the support of so many organizations, perhaps we would not know it today.&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>html</category>
      <category>programming</category>
      <category>learning</category>
    </item>
    <item>
      <title>About Bit, Bytes and Qubits</title>
      <dc:creator>Caf RF</dc:creator>
      <pubDate>Sun, 24 Dec 2023 22:45:17 +0000</pubDate>
      <link>https://forem.com/caf_rf_1986/about-bit-bytes-and-qubits-3a76</link>
      <guid>https://forem.com/caf_rf_1986/about-bit-bytes-and-qubits-3a76</guid>
      <description>&lt;p&gt;For some time now I have been interested in knowing a little more about how computers and the Internet work. One of the things I didn't understand was the difference between bits and bytes. The story behind all this is very interesting!&lt;/p&gt;

&lt;p&gt;The bit (binary digit) corresponds to a digit of the binary numbering system and represents the minimum unit of information. The storage capacity of a digital memory is also measured in bits. That is, a bit can be 0 or 1.&lt;/p&gt;

&lt;p&gt;Gottfried Wilhelm Leibniz invented the binary system at the end of the 17th century. The idea? To convert certain linguistic concepts to logic, that is, to interpret them as “true” or “false”.&lt;/p&gt;

&lt;p&gt;This is how a bit can represent two values such as true/false, on/off, etc. Following this logic, 2 bits can have up to 4 different combinations: 0-0, 0-1, 1-0, 1-1. 8 bits make up one octet and are equivalent to 256 different values.&lt;/p&gt;
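&lt;p&gt;The doubling rule is easy to check with a quick sketch in JavaScript:&lt;/p&gt;

```javascript
// n bits can represent 2 ** n distinct values
const combos = (n) => 2 ** n;

console.log(combos(1)); // 2 (0 or 1)
console.log(combos(2)); // 4 (00, 01, 10, 11)
console.log(combos(8)); // 256 (one octet)
```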

&lt;p&gt;Does 8 bits sound familiar to you?&lt;/p&gt;

&lt;p&gt;Currently we say that 1 byte is equivalent to 8 bits, but a byte and an octet are not necessarily the same. While an octet always has 8 bits, a byte has a fixed number of bits for a given machine, and that number need not be 8. In older computers, the byte was made up of different numbers of bits. (By the way, 'byte' is a deliberate respelling of 'bite', chosen to avoid confusion with 'bit'.)&lt;/p&gt;

&lt;p&gt;Before, computers processed information through 'words'. These had a specific number of bits that varied from machine to machine, since decimal precision was often prioritized academically.&lt;/p&gt;

&lt;p&gt;But wasting digits meant wasting resources. The German-born engineer Werner Buchholz came to the conclusion that each character had to be addressable individually, with a fixed reference of 8 adjacent bits per unit. &lt;/p&gt;

&lt;p&gt;Therefore, 1 byte can hold a number, a letter, or a symbol: it represents one character of the character code, in binary, across computing applications.&lt;/p&gt;
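&lt;p&gt;For example (a sketch in JavaScript), the letter 'A' stored as one byte:&lt;/p&gt;

```javascript
// The character code of 'A' in the ASCII / Unicode table
const code = 'A'.charCodeAt(0);
console.log(code); // 65

// The same value written as 8 binary digits: one byte
const byte = code.toString(2).padStart(8, '0');
console.log(byte); // 01000001
```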

&lt;p&gt;You can check this byte counter:&lt;br&gt;
&lt;a href="https://www.atatus.com/tools/byte-counter"&gt;Atatus&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And this string to binary converter online tool where you can count the bytes as well:&lt;br&gt;
&lt;a href="https://codebeautify.org/string-binary-converter"&gt;codebeautify&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;But everything is constantly changing and evolving, which brings me to the next question.&lt;br&gt;
Is there anything that could replace the bit in the future?&lt;/p&gt;

&lt;p&gt;This is when we reach a more complex world, the quantum world.&lt;/p&gt;

&lt;p&gt;A qubit (quantum bit) is based on quantum theory, a very interesting topic. To give a brief introduction: in the quantum world, a single system can be in different states at the same time, and so the qubit rests on two basic principles of quantum physics: superposition and entanglement.&lt;/p&gt;

&lt;p&gt;A qubit is the bit of quantum computing: it has the same two base states, 0 and 1. The big difference is that while a bit holds either 0 or 1, a qubit can be in a superposition of both values at the same time, which saves a lot of time when performing parallel calculations. The surprising thing is that the qubit is not as new as it seems: the first concepts were made known in 1968 thanks to Stephen Wiesner and his invention, 'conjugate coding'.&lt;/p&gt;

</description>
      <category>learning</category>
      <category>webdev</category>
      <category>codenewbie</category>
      <category>web</category>
    </item>
  </channel>
</rss>
