<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: silvie demie</title>
    <description>The latest articles on Forem by silvie demie (@silvidemeter).</description>
    <link>https://forem.com/silvidemeter</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F472239%2F3b604fd6-7066-4f03-aa51-287ec9544682.jpg</url>
      <title>Forem: silvie demie</title>
      <link>https://forem.com/silvidemeter</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/silvidemeter"/>
    <language>en</language>
    <item>
      <title>The way you type tells a lot about you, including your IQ</title>
      <dc:creator>silvie demie</dc:creator>
      <pubDate>Fri, 22 Jan 2021 14:47:31 +0000</pubDate>
      <link>https://forem.com/typingdna/the-way-you-type-tells-a-lot-about-you-including-your-iq-hkj</link>
      <guid>https://forem.com/typingdna/the-way-you-type-tells-a-lot-about-you-including-your-iq-hkj</guid>
      <description>&lt;p&gt;Ever since the use of papyrus in ancient times, the way people write has been a fascinating curiosity known as graphology or the study of physical characteristics and patterns of handwriting. The first book on handwriting analysis appeared in 1575 and was written by the renowned scholar Juan Huarte de San Juan. &lt;/p&gt;

&lt;p&gt;In modern times, how much does one reveal when typing on a keyboard or device? Quite a lot, it turns out.&lt;/p&gt;

&lt;h2&gt;
  
  
  The fist of the Sender – ally or enemy?
&lt;/h2&gt;

&lt;p&gt;During World War II, military intelligence successfully identified the rhythm of &lt;a href="https://www.bbc.com/news/technology-23162846"&gt;Morse code&lt;/a&gt; operators to distinguish allies from enemies. As early as the mid-1800s, telegraph operators could be identified by their keying style alone. &lt;br&gt;
How we interact with devices divulges a series of unique patterns which, when analyzed, can reveal our age, gender, personality, state of mind, IQ, and, in a nutshell, who we are: our identity.&lt;/p&gt;

&lt;p&gt;The technology that captures and analyzes typing patterns is known as typing biometrics. Over the last few years, it has seen rapid growth and is quickly gaining momentum, with use cases spanning countless industries. Its rise may even be accelerated by the decline of a flawed technology that could soon be obsolete, restricted, and replaced for good: facial recognition.&lt;/p&gt;

&lt;h2&gt;
  
  
  Facial recognition technology will be obsolete soon
&lt;/h2&gt;

&lt;p&gt;Most smartphones today incorporate facial recognition, which is a convenient alternative to other biometrics such as fingerprints. Still, more and more people now see facial recognition as a flawed method of authenticating individuals and an easily abused tool for mass surveillance. &lt;/p&gt;

&lt;p&gt;Facial recognition was first introduced to devices in 2005 by OMRON Corporation. The breakthrough authentication method quickly spread from personal devices to private and public organizations, which use it for various applications. &lt;/p&gt;

&lt;p&gt;One prominent use case was proctoring online courses and exams. It didn't take long to notice the racial bias in facial recognition technology, with several studies by government researchers and &lt;a href="https://www.media.mit.edu/projects/gender-shades/overview/"&gt;universities&lt;/a&gt; debunking the myth of secure facial recognition. A reliable, unbiased alternative was needed.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;In the worst case, the failure rate on darker female faces is over one in three for a task with a 50 percent chance of being correct. In the best case, one classifier achieves flawless performance on lighter males: 0 percent error rate.&lt;/em&gt;&lt;br&gt;
The Gender Shades Project&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Another alarming toll of this technology is the privacy concern it raises. More and more people around the globe criticize it because it can be used for mass surveillance and to infringe on human rights and civil liberties. &lt;/p&gt;

&lt;p&gt;In June 2019, law enforcement agencies held the largest share of the facial recognition technology market. And according to a new bill, the Facial Recognition and Biometric Technology Moratorium Act, which was introduced to ban the use of facial recognition technology by federal law enforcement agencies, this is set to change. &lt;/p&gt;

&lt;p&gt;Also, as of June 2020, tech giants like Microsoft, Amazon, and IBM have decided to stop researching, using, and selling facial recognition technology, at least until stronger laws have been enacted to ensure its safe deployment.&lt;/p&gt;

&lt;h2&gt;
  
  
  The alternative: Typing biometrics
&lt;/h2&gt;

&lt;p&gt;In addition to physiological biometrics (e.g., fingerprints, facial identification, and retina scans), there is another class of measurements, called behavioral biometrics, that captures characteristics of human behavior. &lt;/p&gt;

&lt;p&gt;Broadly, behavioral biometrics are uniquely identifying, measurable patterns in how individuals interact with devices. These behaviors are virtually impossible to imitate because of their innate nature. Powered by machine learning algorithms, typing biometrics technology learns a variety of pattern elements that are unique to each individual.&lt;/p&gt;
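
&lt;p&gt;To make the idea concrete, here is a minimal Python sketch of two classic typing-biometric features, dwell time and flight time. The event format and numbers are hypothetical, invented for illustration; this is not TypingDNA's actual capture API.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Each event is one keystroke: (key, key_down_ms, key_up_ms).

def extract_features(events):
    # Dwell time: how long each key is held down.
    dwell = [up - down for (_key, down, up) in events]
    # Flight time: gap between releasing one key and pressing the next.
    flight = [events[i + 1][1] - events[i][2] for i in range(len(events) - 1)]
    return dwell, flight

# Example: three keystrokes with made-up timestamps in milliseconds.
events = [("h", 0, 95), ("i", 160, 240), ("!", 410, 520)]
print(extract_features(events))  # ([95, 80, 110], [65, 170])
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Real systems combine many such statistics per key and key pair; the point is simply that the raw material is nothing more exotic than timestamps.&lt;/p&gt;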

&lt;p&gt;The future prevalence of typing biometrics is also suggested by a shift in modes of human communication. According to research by the American Psychological Association, people favor typing over talking. &lt;a href="https://news.gallup.com/poll/179288/new-era-communication-americans.aspx"&gt;Since 2014&lt;/a&gt;, much oral communication has transitioned to text, and 75% of individuals under 30 now prefer written communication over phone calls. Keyboards, meanwhile, are present in virtually every household and in most technologies today.&lt;/p&gt;

&lt;h2&gt;
  
  
  Shifting from authentication to other use cases
&lt;/h2&gt;

&lt;p&gt;While facial recognition's troubles open the field to other biometrics for authentication, typing biometrics can do far more, and that versatility and breadth of use cases is what makes the technology so intriguing. &lt;/p&gt;

&lt;h2&gt;
  
  
  Mental health and well-being
&lt;/h2&gt;

&lt;p&gt;As previously noted, the technology can be used to analyze who people are. The complexity of the human mind and body is yet to be fully understood, but the traces we leave when typing are quite revealing with respect to physical and mental traits.&lt;/p&gt;

&lt;p&gt;This enables a more complex analysis, covering not just who we are but also how we behave. Just like our DNA can tell a lot about our ancestry, reveal health conditions, and identify personality indicators, &lt;strong&gt;our typing similarly says a lot about us, including &lt;a href="https://www.typingdna.com/research-and-development.html"&gt;gender or IQ&lt;/a&gt;.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;To illustrate, a new product called &lt;a href="https://www.typingdna.com/focus"&gt;TypingDNA Focus&lt;/a&gt;, which is currently in the research phase, can analyze how people type on their keyboards and reveal information about their state of mind. &lt;strong&gt;A fitness tracker for the mind, Focus provides statistics about when users are tired, focused, or stressed, and that data can then be analyzed to enhance productivity.&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;Focus aims to help users better understand how and when they focus and gain visibility and control over the stressful periods during a day. Additionally, users get insights based on their typing behavior and see weekly trends, a breakdown of daily activity, a window into when they’re most engaged, focus levels, immediate mood analysis, and more.&lt;/p&gt;

&lt;p&gt;This use case of typing biometrics is a natural response to the widespread experience of burnout, anxiety, and stress, and the resulting inability to focus on tasks and stay productive. According to the &lt;a href="https://adaa.org/understanding-anxiety/related-illnesses/stress"&gt;Anxiety and Depression Association of America&lt;/a&gt;, stress over a long period leads to anxiety and depression, disorders associated with severe chronic diseases.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;The fin-de-siècle neurasthenic, in whom exhaustion and innervation converge, uncannily anticipates the burnout of today. They have in common an overloaded and overstimulated nervous system.&lt;/em&gt;&lt;br&gt;
Josh Cohen, psychoanalyst and Professor of Modern Literary Theory at Goldsmiths, University of London.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Reports from the World Health Organization (WHO) indicate depression will become a leading cause of disability worldwide by 2030, while more than 16% of the global population is already affected by it today. &lt;/p&gt;

&lt;h2&gt;
  
  
  Medical disorders prevention and diagnostics
&lt;/h2&gt;

&lt;p&gt;Because of its innate nature, a typing pattern is virtually impossible to imitate and is unique to each individual, which makes behavioral biometrics suitable for medical research into disease prevention. Recent medical work focuses on analyzing changes that occur in someone's typing pattern over time, which can sometimes indicate neurological disorders, damage, or other transformations. Abnormal locomotion, movement patterns, or synergies have been described after central nervous system (CNS) disorders such as stroke or traumatic brain injury. &lt;/p&gt;

&lt;p&gt;The aim here is to help patients with various diseases or disabilities rehabilitate and, in the future, even to detect anomalies that can indicate a specific disorder, helping medical professionals diagnose their patients more effectively. &lt;/p&gt;
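
&lt;p&gt;As a purely illustrative sketch of the underlying idea (one made-up metric in Python, not a medical tool or any published method), tracking how a typing feature drifts away from an earlier baseline might look like this:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;from statistics import mean

def dwell_drift(baseline, recent):
    """Relative change in average key-hold time versus an earlier baseline.
    A large, sustained drift is the kind of signal such research looks for;
    a real system would track many features, not this single toy metric."""
    return abs(mean(recent) - mean(baseline)) / mean(baseline)

# Hypothetical dwell times (ms) recorded months apart:
print(dwell_drift([95, 80, 110, 90], [140, 150, 135, 160]))  # approx. 0.56
&lt;/code&gt;&lt;/pre&gt;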

&lt;h2&gt;
  
  
  An alternative to privacy-intrusive surveillance technologies
&lt;/h2&gt;

&lt;p&gt;Over the past decade, data protection and privacy-enhancing online and offline tools have gained momentum, and various public institutions are moving to ban intrusive surveillance technologies. &lt;/p&gt;

&lt;p&gt;This momentum may seem like a win for privacy and civil rights advocates who speak out against the use of facial recognition for mass surveillance. It also sends a global signal about the effects of leaving facial biometrics unregulated and underscores the need for less privacy-invasive substitute technologies. &lt;/p&gt;

&lt;p&gt;Given this context, typing biometrics represents a viable authentication solution for mobile and desktop without putting civil liberties at risk.  &lt;/p&gt;

&lt;h2&gt;
  
  
  E-learning and student verification
&lt;/h2&gt;

&lt;p&gt;Suffice it to say that 2020 has ushered in challenges around innovation, digitalization, and adaptability. &lt;/p&gt;

&lt;p&gt;Governments have been required to take an integrated cross-sectoral approach to prevent and minimize the impact of COVID-19. According to UNESCO, the changing education imperative comes after 1.38 billion learners have been impacted by national school closures worldwide due to the coronavirus. &lt;/p&gt;

&lt;p&gt;The ability to verify a student's identity is a crucial part of an equitable and effective learning experience and process. Since most of the curriculum is currently online at all education stages, typing biometrics is a seamless, non-intrusive way to verify the course and exam participants' identities, allowing students to focus on learning—not the proctoring technology.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Looking ahead, typing biometrics will be present in our daily lives. It will also be found in the complex use cases being developed today, shaping the future for good, influencing our lives, and helping people explore the unknown areas of &lt;a href="https://www.typingdna.com/focus"&gt;who we are&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>typingbiometrics</category>
      <category>productivity</category>
      <category>typingpatterns</category>
    </item>
    <item>
      <title>AI Facts Every Dev Should Know: Artificial intelligence is older than you, probably</title>
      <dc:creator>silvie demie</dc:creator>
      <pubDate>Mon, 14 Dec 2020 15:43:20 +0000</pubDate>
      <link>https://forem.com/typingdna/ai-facts-every-dev-should-know-artificial-intelligence-is-older-than-you-probably-2j8c</link>
      <guid>https://forem.com/typingdna/ai-facts-every-dev-should-know-artificial-intelligence-is-older-than-you-probably-2j8c</guid>
      <description>&lt;h2&gt;
  
  
  The hype around AI is growing rapidly, as most research companies predict AI will take on an increasingly important role in the future.
&lt;/h2&gt;

&lt;p&gt;While business leaders are very interested in leveraging machine learning technology, there’s a talent &lt;a href="https://www.forbes.com/sites/bernardmarr/2018/06/25/the-ai-skills-crisis-and-how-to-close-the-gap"&gt;shortage&lt;/a&gt; standing in the way. &lt;/p&gt;

&lt;p&gt;It turns out that there are very few developers that have the skills needed to spearhead serious new AI projects. This means that developers who can acquire these skills will be highly in demand. &lt;/p&gt;

&lt;p&gt;With all this in mind, let’s take a look at several facts about AI every developer should know before changing their focus to machine learning, artificial intelligence, and—while we’re at it—deep learning and neural networks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Artificial intelligence is older than you, probably.
&lt;/h2&gt;

&lt;p&gt;The first recorded use of the term artificial intelligence came from John McCarthy, an American computer scientist and one of the discipline's founders. Spending most of his academic career at Stanford, he invented Lisp in the late 1950s. Based on the lambda calculus, Lisp soon became the programming language of choice for AI applications after its publication in 1960. &lt;/p&gt;

&lt;p&gt;Still, creating AI departments at Stanford and MIT didn't advance the field as much as the founders had imagined. In large part, this is because scientists encountered a myriad of issues, including limited computer power (i.e., the memory or processing speed needed to accomplish anything truly useful), intractability, the combinatorial explosion, lack of databases, and lack of the common sense knowledge and reasoning needed to train algorithms effectively. &lt;/p&gt;

&lt;p&gt;The so-called AI winter began in the 1970s as the field reached its limits and substantial funding was put on hold. It was only in the 2000s that computational power and data became widely available. The ice was finally broken in 2009 by ImageNet, a database project led by Stanford's Fei-Fei Li that stored some 15 million labeled images. At the same time, data storage became affordable, setting the stage for more AI investment.&lt;/p&gt;

&lt;h2&gt;
  
  
  The talent pool is shallow.
&lt;/h2&gt;

&lt;p&gt;Talent is in short supply in the AI industry, with various reports indicating that the worldwide market is seeking to fill millions of roles. Due to a widespread lack of education on AI skills and topics, there's a bottleneck in producing highly trained individuals. In fact, Element AI, a Montreal-based startup, estimated that there are fewer than &lt;a href="https://jfgagne.ai/talent-2019/"&gt;22,000 people&lt;/a&gt; in the world with the expertise needed to create machine learning systems.  &lt;/p&gt;

&lt;p&gt;What's more, another study, by China's Tencent Research Institute, estimates that there are 300,000 AI researchers and practitioners in the world today, of whom about 100,000 are still studying. Tencent claims the United States is far ahead when it comes to developing this talent, being home to more than 1,000 of the roughly 2,600 schools worldwide that teach machine learning and related subjects. &lt;/p&gt;

&lt;p&gt;The same report claims the U.S. is also a leading nation when it comes to the number of startups developing AI technologies. Interestingly enough, more and more academic conferences turn into playgrounds for corporate recruiters, while entire AI research departments from well-known universities are transferred to privately held companies deploying AI. &lt;/p&gt;

&lt;h2&gt;
  
  
  AI Engineers get paid very well
&lt;/h2&gt;

&lt;p&gt;Scarcity in any job market equates to higher salaries, and AI is no different. For example, DeepMind, acquired by Google for a reported $650 million in 2014, spends $138 million a year on its roughly 400 employees. The staff costs were reported by The New York Times, which reviewed the company's recently released annual financial accounts in the U.K. This translates to base salaries of between $300,000 and $500,000 a year. &lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TtCoPw2W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/carlvrus2dbok2anplf6.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TtCoPw2W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/carlvrus2dbok2anplf6.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Based on Monster.com's analysis, the median salary for data scientists, senior data scientists, artificial intelligence consultants, and machine learning managers was $127,000 in 2019.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cRNZG01o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lklzltvm4ft6mg4dibix.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cRNZG01o--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lklzltvm4ft6mg4dibix.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Over the last four years, the demand for AI talent has increased by 74%, while technology and financial service companies are currently absorbing 60% of AI talent.&lt;/p&gt;

&lt;h2&gt;
  
  
  AI/ML professionals need to possess a lot of skills
&lt;/h2&gt;

&lt;p&gt;There are two open job roles for every AI professional today, and the gap is expected to persist. Currently, the three most in-demand AI positions on the market are data scientist and algorithm developer, machine learning engineer, and deep learning engineer.&lt;br&gt;
According to the job site Indeed, the main skills and tools software developers need to be proficient in for AI projects include math, algebra, statistics, big data, data mining, data science, machine learning, cognitive computing, natural language processing (NLP), Hadoop, Spark, and many others.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--LoPbs1s2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/vxcrv1zacwo3y5rrphlq.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--LoPbs1s2--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/vxcrv1zacwo3y5rrphlq.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The programming languages AI developers use most frequently are Python, C++, Java, Lisp, and Prolog. Qualified job seekers must also be experienced with common development environments; for example, proficiency with Spark, MATLAB, and Hadoop is among the most in-demand skills. &lt;/p&gt;

&lt;h2&gt;
  
  
  The hype around AI is worth it.
&lt;/h2&gt;

&lt;p&gt;In 2018, Gartner predicted that 80% of emerging technologies would have AI foundations within three years. What’s more, the research firm Markets and Markets expects that the AI market will grow to a $190 billion industry by 2025. Beyond that, Accenture predicts that the impact of AI technologies on business will boost labor productivity by up to 40%. Also, according to IDC, the AI use cases that saw the most investment in 2019 were automated customer service agents ($4.5 billion worldwide), sales process recommendation and automation ($2.7 billion), and automated threat intelligence and prevention systems ($2.7 billion).&lt;/p&gt;

&lt;p&gt;Add it all up, and the hype surrounding AI is worth it.&lt;/p&gt;

&lt;h2&gt;
  
  
  AI has all kinds of implications
&lt;/h2&gt;

&lt;p&gt;Before wondering whether AI will replace software developers, let's take a look at what AI can actually do. &lt;/p&gt;

&lt;p&gt;The industries and use cases where AI can be deployed have surged in the past few years. &lt;br&gt;
In December 2018, the New York auction house Christie's sold Portrait of Edmond de Belamy, an algorithm-generated print in the style of 19th-century European portraiture, for $432,500. Various AI-generated works of art are now frequently exhibited; one example is the "Faceless Portraits Transcending Time" collection in New York, where Dr. Ahmed Elgammal and his creation, the AICAN AI, received the first solo gallery exhibit devoted to an AI artist. As Andy Warhol once said, art is what you can get away with. &lt;/p&gt;

&lt;p&gt;The frenzy around AI-generated art is also touching the music industry. While continuing to read these words, play this piece generated in ASCII using roughly 500 megabytes worth of famous guitar tabs of mostly classical and rock music. It's called Recurrence, and—if it's not contemporary enough for your taste—please note this "record" is already five years old.&lt;/p&gt;

&lt;p&gt;With a more substantial societal impact, AI tools are also being used to solve medical issues. Even more appealing is AI's use in medical research to identify, prevent, and cure disorders and diseases. These applications are projected to create $150 billion in annual savings for the healthcare economy by 2026. &lt;/p&gt;

&lt;p&gt;AI-based typing-pattern-matching algorithms can also verify users' identities based solely on their typing behavior. In 2016, &lt;a href="https://www.typingdna.com/"&gt;TypingDNA's&lt;/a&gt; technology was launched to analyze how humans interact with keyboards and provide accurate authentication. The breakthrough here relies on the fact that all humans are different and behave in distinctive ways. The &lt;a href="https://www.typingdna.com/#demo"&gt;demo&lt;/a&gt; of how it works can turn into an addictive challenge game, with friends trying to "fool" the system by replicating each other's typing behavior. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ExugYYdr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/58o89ojyy2xpanovacrs.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ExugYYdr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/58o89ojyy2xpanovacrs.PNG" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;
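
&lt;p&gt;For intuition only, here is a toy Python sketch of verification by pattern matching. The feature vectors, distance measure, and threshold are all invented for illustration; TypingDNA's actual matching algorithms are proprietary and far more sophisticated.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import math

def pattern_distance(a, b):
    """Euclidean distance between two equal-length typing-feature vectors,
    e.g. per-key dwell times collected while typing the same fixed phrase."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

enrolled = [95, 80, 110, 90, 100]  # stored pattern for the legitimate user
attempt = [98, 84, 105, 92, 97]    # fresh sample to verify
THRESHOLD = 30.0                   # made-up tuning value

distance = pattern_distance(enrolled, attempt)
print("match" if distance &amp;lt;= THRESHOLD else "no match")  # prints: match
&lt;/code&gt;&lt;/pre&gt;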

&lt;p&gt;Further, Google's deep learning program is accurate 89 percent of the time in detecting breast cancer, compared to a human pathologist, who is just 73 percent accurate. This is why machine learning and AI are regarded as healthcare's new nervous system.&lt;br&gt;
Finally, AI is also very smart, shedding light on its future capabilities. For example, AlphaGo Zero, a Google DeepMind project, achieved superhuman-level performance, flawlessly beating its champion predecessor AlphaGo, the first AI to defeat Ke Jie, the world's top-ranked player in the ancient Chinese strategy game Go. Interestingly, AlphaGo Zero taught itself how to play the game, given only the basic rules.&lt;/p&gt;

&lt;h2&gt;
  
  
  AI won’t replace human beings, but it will replace their jobs.
&lt;/h2&gt;

&lt;p&gt;Twenty-five years ago, Jeff Dean started working on a "brain" that mimicked biological neural networks to analyze information and learn, but its capabilities were limited. It was only in 2012 that neural networks were successfully used for machine learning, memory, perception, and symbol processing.&lt;/p&gt;

&lt;p&gt;Geoff Hinton marked a new era when he introduced neural networks that could learn tasks mostly on their own by analyzing vast amounts of data. Both Dean and Hinton are now part of Google's AI research teams. In 2017, Google announced that its AutoML project had successfully taught itself to program machine learning software on its own. By completing basic programming tasks, AutoML also marked the popularization of a new fear: because machines can learn on their own, will they replace humans? &lt;/p&gt;

&lt;p&gt;Welcome to this century's agnostophobia.&lt;/p&gt;

&lt;p&gt;Unlike narrow (weak) AI, which is designed to handle single or limited tasks that humans can perform as well, general (strong) AI stokes fears about what its capabilities could become once out of our control. Currently, AI is used mostly to assist developers, and it will probably continue to grow its role in augmenting human teams' capabilities. We see it everywhere around us: in tools that help write documentation, test code, and even identify bugs and address them. &lt;/p&gt;

&lt;p&gt;OpenAI's recent Generative Pre-trained Transformer 3 (GPT-3), an autoregressive language model with 175 billion parameters, achieves strong performance on many NLP datasets, including translation, question-answering, and cloze tasks, as well as several tasks that require on-the-fly reasoning or domain adaptation. It can even generate news articles that human evaluators have difficulty distinguishing from articles written by humans, and its researchers note that GPT-3 has the "potential to advance both the beneficial and harmful applications of language models."&lt;/p&gt;

&lt;p&gt;Researchers at MIT created a program that automatically fixed software bugs by replacing faulty code lines with working lines from other programs. Other tools that help in the process of building software products include DeepCode, Synopsys, Logojoy, and UIzard.&lt;/p&gt;

&lt;h2&gt;
  
  
  How do developers look at AI and its potential threats?
&lt;/h2&gt;

&lt;p&gt;If you fear that AI will eventually replace your role, you're not alone. Many developers around the world feel the same way. &lt;/p&gt;

&lt;p&gt;According to Evans Data, when asked to identify the most worrisome thing in their careers, the largest plurality of software developers cited this: "I and my development efforts are replaced by artificial intelligence."&lt;/p&gt;

&lt;p&gt;On a positive note, Stack Overflow research showed that 70% of respondents feel more excited about AI's possibilities than worried about its potential dangers, and most developers are eagerly looking forward to the new possibilities automation brings to the table.&lt;/p&gt;

&lt;p&gt;Just as the industrial revolution pushed humankind to develop new skills and leave agricultural labor behind, so will intelligent robots. In fact, McKinsey predicts AI could replace 30% of the human workforce globally by 2030. According to AI technology statistics, robotics could replace about 800 million jobs, making about 30% of occupations extinct. &lt;/p&gt;

&lt;p&gt;With this significant shift, nearly 400 million people will have to adapt and change careers. Forrester predicts that cognitive technologies, such as robots, AI, machine learning, and automation, will create 9% of new U.S. jobs by 2025. These new jobs include robot monitoring professionals, data scientists, automation specialists, and content curators.&lt;/p&gt;

&lt;h2&gt;
  
  
  It’s easy to start learning or teaching
&lt;/h2&gt;

&lt;p&gt;Since the skilled workers needed to build advanced AI software are still scarce, companies like &lt;a href="https://ai.facebook.com/"&gt;Facebook&lt;/a&gt; and &lt;a href="https://ai.google/education/"&gt;Google&lt;/a&gt; have prepared educational programs designed to get anyone on board, no matter their level of expertise. If you are interested in online courses to grasp the basics of AI, check out these machine learning courses from &lt;a href="https://online.stanford.edu/courses/cs229-machine-learning"&gt;Stanford&lt;/a&gt;, &lt;a href="https://programs.emeritus.org/mit-pe-machine-learning/"&gt;MIT&lt;/a&gt;, and &lt;a href="https://www.classcentral.com/course/edx-machine-learning-7231"&gt;Columbia University&lt;/a&gt;, or dive into the depths of deep learning at &lt;a href="https://www.nvidia.com/en-us/deep-learning-ai/education/"&gt;Nvidia's Deep Learning Institute&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ELpMA_xG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lp0guas1brygainpaq07.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ELpMA_xG--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lp0guas1brygainpaq07.jpeg" alt="Alt Text"&gt;&lt;/a&gt;&lt;br&gt;
For more information, you can also read this book, check out these popular open-source AI tools, and browse this list of popular AI projects. This will help you build the technology of the future and solve real-world problems.&lt;/p&gt;

&lt;p&gt;And if you’re a teacher looking to introduce your students to AI, check out Tom Vander Ark’s compelling guide on how to teach artificial intelligence.&lt;/p&gt;

&lt;p&gt;The graphics in this article can be downloaded &lt;a href="https://blog.typingdna.com/wp-content/uploads/2020/10/AI-infographic.pdf"&gt;here&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>futureofai</category>
      <category>typingbiometrics</category>
      <category>machinelearning</category>
    </item>
  </channel>
</rss>
