<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Terezija Semenski</title>
    <description>The latest articles on Forem by Terezija Semenski (@tsemenski).</description>
    <link>https://forem.com/tsemenski</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F724967%2F08bb3079-d276-4269-99bd-0a005d2803ab.jpg</url>
      <title>Forem: Terezija Semenski</title>
      <link>https://forem.com/tsemenski</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/tsemenski"/>
    <language>en</language>
    <item>
      <title>The 10 free books I recommend to every engineer learning ML math (and why order matters)</title>
      <dc:creator>Terezija Semenski</dc:creator>
      <pubDate>Wed, 06 May 2026 14:24:20 +0000</pubDate>
      <link>https://forem.com/tsemenski/the-10-free-books-i-recommend-to-every-engineer-learning-ml-math-and-why-order-matters-4c0i</link>
      <guid>https://forem.com/tsemenski/the-10-free-books-i-recommend-to-every-engineer-learning-ml-math-and-why-order-matters-4c0i</guid>
      <description>&lt;p&gt;After teaching hundreds of engineers learn machine learning last 5 years, a pattern becomes hard to ignore.&lt;/p&gt;

&lt;p&gt;Most people don’t struggle because machine learning is too difficult.&lt;/p&gt;

&lt;p&gt;They struggle because they start in the wrong place.&lt;/p&gt;

&lt;p&gt;The usual path looks like this:&lt;/p&gt;

&lt;p&gt;Take a crash course.&lt;/p&gt;

&lt;p&gt;Import a framework.&lt;/p&gt;

&lt;p&gt;Train a model.&lt;/p&gt;

&lt;p&gt;Tune hyperparameters and move on.&lt;/p&gt;

&lt;p&gt;They start with tools.&lt;/p&gt;

&lt;p&gt;Frameworks. APIs. Pretrained models.&lt;br&gt;
Everything works, until it doesn’t.&lt;/p&gt;

&lt;p&gt;At first, progress feels fast.&lt;/p&gt;

&lt;p&gt;You can reproduce a tutorial in an evening. You can train a model in an afternoon. You can get something into production surprisingly quickly.&lt;/p&gt;

&lt;p&gt;And then, slowly, friction appears.&lt;/p&gt;

&lt;p&gt;A model overfits.&lt;br&gt;
A small data change breaks performance.&lt;br&gt;
A colleague asks why a method works better than another.&lt;br&gt;
A paper introduces a “simple” idea that somehow feels impossible to follow.&lt;/p&gt;

&lt;p&gt;At that moment, many engineers quietly conclude:&lt;/p&gt;

&lt;p&gt;“I’m not a math person.”&lt;br&gt;
That conclusion is wrong.&lt;/p&gt;

&lt;p&gt;What’s actually missing is structure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Machine learning is not a collection of tricks.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Linear algebra, probability, statistics, and optimization are not “prerequisites.”&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;They are the language machine learning is written in.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Once that language is familiar, many things that seemed complex become obvious.&lt;/p&gt;

&lt;p&gt;When you skip those layers, everything above them feels fragile and mysterious.&lt;/p&gt;

&lt;p&gt;This is why so many ML practitioners:&lt;/p&gt;

&lt;p&gt;Can train models but can’t explain them&lt;br&gt;
Can follow tutorials but can’t adapt ideas&lt;br&gt;
Can use tools but struggle to reason about failure modes&lt;/p&gt;

&lt;p&gt;The solution is not learning more frameworks, libraries, and tools.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6m608nm373b9dh4p06s.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fo6m608nm373b9dh4p06s.webp" alt=" " width="363" height="437"&gt;&lt;/a&gt;It’s better foundations.&lt;/p&gt;

&lt;p&gt;And contrary to popular belief, you don’t need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;a PhD&lt;/li&gt;
&lt;li&gt;5 more years of experience&lt;/li&gt;
&lt;li&gt;years of formal math training&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;What you need are the right books, written by people who understand how learning actually happens.&lt;/p&gt;

&lt;p&gt;Books that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;respect your time&lt;/li&gt;
&lt;li&gt;explain ideas before formalism&lt;/li&gt;
&lt;li&gt;connect math directly to algorithms&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That’s what this list is about.&lt;/p&gt;

&lt;p&gt;Below are 10 free, high-quality books that quietly do what most courses fail to do:&lt;br&gt;
they help you understand machine learning, not just use it.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Mathematics for Machine Learning&lt;/strong&gt;, Marc Peter Deisenroth, A. Aldo Faisal, and Cheng Soon Ong&lt;/p&gt;

&lt;p&gt;&lt;a href="https://mml-book.github.io/book/mml-book.pdf" rel="noopener noreferrer"&gt;https://mml-book.github.io/book/mml-book.pdf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The gold standard for ML math: linear algebra, calculus, probability, clearly connected to algorithms.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Dive into Deep Learning&lt;/strong&gt;, Cambridge University Press&lt;/p&gt;

&lt;p&gt;&lt;a href="https://d2l.ai" rel="noopener noreferrer"&gt;https://d2l.ai&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A modern deep learning textbook with math, code, and intuition side-by-side.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Think Bayes&lt;/strong&gt; by Allen B. Downey&lt;/p&gt;

&lt;p&gt;&lt;a href="https://allendowney.github.io/ThinkBayes2/" rel="noopener noreferrer"&gt;https://allendowney.github.io/ThinkBayes2/&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Bayesian reasoning explained through code and real examples.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Think Stats&lt;/strong&gt; by Allen B. Downey&lt;/p&gt;

&lt;p&gt;&lt;a href="https://greenteapress.com/thinkstats2/thinkstats2.pdf" rel="noopener noreferrer"&gt;https://greenteapress.com/thinkstats2/thinkstats2.pdf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Statistics for people who want to understand data, not memorize formulas.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Machine Learning from Scratch&lt;/strong&gt; by Danny Friedman&lt;/p&gt;

&lt;p&gt;&lt;a href="https://dafriedman97.github.io/mlbook" rel="noopener noreferrer"&gt;https://dafriedman97.github.io/mlbook&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Classic ML algorithms built step by step, no black boxes.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Patterns, Predictions, and Actions&lt;/strong&gt;, M. Hardt and B. Recht&lt;/p&gt;

&lt;p&gt;&lt;a href="https://mlstory.org/pdf/patterns.pdf" rel="noopener noreferrer"&gt;https://mlstory.org/pdf/patterns.pdf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A conceptual ML book focused on generalization, optimization, and causality.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Mathematical Introduction to Deep Learning&lt;/strong&gt;, A. Jentzen, B. Kuckuck, P. von Wurstemberger&lt;/p&gt;

&lt;p&gt;&lt;a href="https://arxiv.org/pdf/2310.20360" rel="noopener noreferrer"&gt;https://arxiv.org/pdf/2310.20360&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Neural networks explained from first principles.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;8. Calculus&lt;/strong&gt; by Gilbert Strang, MIT Press&lt;/p&gt;

&lt;p&gt;&lt;a href="https://ocw.mit.edu/courses/res-18-001-calculus-fall-2023/mitres_18_001_f17_full_book.pdf" rel="noopener noreferrer"&gt;https://ocw.mit.edu/courses/res-18-001-calculus-fall-2023/mitres_18_001_f17_full_book.pdf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;A step-by-step foundation in calculus.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9. Linear Algebra for Machine Learning&lt;/strong&gt;, University of Pennsylvania&lt;br&gt;
&lt;a href="https://www.cis.upenn.edu/%7Ecis5150/linalg-I-f.pdf" rel="noopener noreferrer"&gt;https://www.cis.upenn.edu/~cis5150/linalg-I-f.pdf&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The language of data, vectors, and transformations, made practical.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10. Mathematical Theory of Deep Learning&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://arxiv.org/pdf/2407.18384" rel="noopener noreferrer"&gt;https://arxiv.org/pdf/2407.18384&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For readers who want to understand how and why deep learning works under the hood.&lt;/p&gt;

&lt;p&gt;Understanding accumulates quietly.&lt;/p&gt;

&lt;p&gt;And once it’s there, it doesn’t disappear when the tools change.&lt;/p&gt;

&lt;p&gt;Frameworks will be replaced.&lt;/p&gt;

&lt;p&gt;APIs will evolve.&lt;/p&gt;

&lt;p&gt;Terminology will shift.&lt;/p&gt;

&lt;p&gt;The underlying ideas will not.&lt;/p&gt;

&lt;p&gt;That’s why these books matter.&lt;/p&gt;

&lt;p&gt;They are not about keeping up.&lt;/p&gt;

&lt;p&gt;They are not trendy.&lt;/p&gt;

&lt;p&gt;They are not optimised for clicks.&lt;/p&gt;

&lt;p&gt;They are the kind of resources you come back to years later and think, “I finally see it now.”&lt;/p&gt;

&lt;p&gt;And in a field that changes as quickly as machine learning, that turns out to be a long-term advantage.&lt;/p&gt;

&lt;p&gt;If this was useful, I write about math and ML weekly at Math Mindset (&lt;a href="https://mathmindset.substack.com"&gt;mathmindset.substack.com&lt;/a&gt;). It's free.&lt;/p&gt;

</description>
      <category>machinelearning</category>
      <category>ai</category>
      <category>mathematics</category>
    </item>
  </channel>
</rss>
