<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Krzysztof Fraus</title>
    <description>The latest articles on Forem by Krzysztof Fraus (@krzysztof_fraus).</description>
    <link>https://forem.com/krzysztof_fraus</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3788618%2F3be18704-9b6b-435f-bacd-366699a0034c.jpg</url>
      <title>Forem: Krzysztof Fraus</title>
      <link>https://forem.com/krzysztof_fraus</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/krzysztof_fraus"/>
    <language>en</language>
    <item>
      <title>How to explain React hooks in interviews</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Mon, 16 Mar 2026 16:47:46 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/how-to-explain-react-hooks-in-interviews-odo</link>
      <guid>https://forem.com/krzysztof_fraus/how-to-explain-react-hooks-in-interviews-odo</guid>
      <description>&lt;p&gt;"Can you explain how React hooks work?"&lt;/p&gt;

&lt;p&gt;You know hooks. You use them every day. You've built entire applications with useState, useEffect, useRef, useMemo. But when this question hits you in an interview, something strange happens. You start reciting: "useState is a hook that lets you add state to functional components..."&lt;/p&gt;

&lt;p&gt;And the interviewer's eyes glaze over. Right?&lt;/p&gt;

&lt;p&gt;You just described what the React docs say. You didn't show that you &lt;em&gt;understand&lt;/em&gt; hooks. There's a gap between knowing how to use something and being able to explain it in a way that demonstrates real understanding — and that gap is where most candidates lose points.&lt;/p&gt;

&lt;p&gt;What follows are ways to talk about the core hooks that actually show you've worked with them. Not what to know — how to &lt;em&gt;say&lt;/em&gt; it.&lt;/p&gt;

&lt;h2&gt;Why docs-style answers don't work&lt;/h2&gt;

&lt;p&gt;Every blog post about &lt;a href="https://www.prepovo.com/blog/frontend-interview-questions-2026?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-explain-react-hooks-in-interviews" rel="noopener noreferrer"&gt;React interview questions&lt;/a&gt; gives you the same format:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Q: What is useState?&lt;/strong&gt;&lt;br&gt;
A: useState is a React hook that allows you to add state variables to functional components. It returns an array with the current state value and a setter function.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Technically correct. Zero signal to the interviewer.&lt;/p&gt;

&lt;p&gt;The interviewer already knows what useState does. They're not quizzing you on facts. They're evaluating three things:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Mental model&lt;/strong&gt; — Do you understand &lt;em&gt;why&lt;/em&gt; hooks exist and how they fit into React's rendering model?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Practical judgment&lt;/strong&gt; — Can you choose the right hook for the right problem?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Communication&lt;/strong&gt; — Can you explain technical decisions to teammates?&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;A docs-recitation scores zero on all three. An answer that starts with the &lt;em&gt;problem&lt;/em&gt; hooks solve scores on all three.&lt;/p&gt;

&lt;h2&gt;The "problem-first" approach&lt;/h2&gt;

&lt;p&gt;Before we get to individual hooks, here's the structure that works for explaining any technical concept in an interview. This is basically the core idea from &lt;a href="https://www.prepovo.com/blog/first-30-seconds-interview-answer?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-explain-react-hooks-in-interviews" rel="noopener noreferrer"&gt;the first 30 seconds of your interview answer&lt;/a&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Problem&lt;/strong&gt; (1-2 sentences) — What pain point does this solve?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Mechanism&lt;/strong&gt; (2-3 sentences) — How does it actually work, in your own words?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gotcha&lt;/strong&gt; (1 sentence) — What trips people up?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Real usage&lt;/strong&gt; (1-2 sentences) — When you'd reach for it and when you wouldn't.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;About 60 seconds to deliver. Under 30 feels shallow. Over 90 and you're &lt;a href="https://www.prepovo.com/blog/how-to-stop-rambling-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-explain-react-hooks-in-interviews" rel="noopener noreferrer"&gt;rambling&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;useState: stop saying "adds state to functional components"&lt;/h2&gt;

&lt;p&gt;The docs answer: "useState is a hook that lets you add state to functional components. It returns an array with the current value and a setter function. When you call the setter, the component re-renders."&lt;/p&gt;

&lt;p&gt;That's a Wikipedia entry, not an interview answer.&lt;/p&gt;

&lt;p&gt;Something more like this:&lt;/p&gt;

&lt;p&gt;"Before hooks, if you needed a piece of data that could change and should trigger a re-render — a counter, a form field, a toggle — you had to convert your function component to a class. useState solved that. It gives your component a slot in React's internal state array, tied to the position of the hook call. That's why hooks can't be called conditionally — React relies on call order to match each useState to the right slot. In practice, I use it for local UI state — things like 'is this dropdown open' or 'what has the user typed.' For anything shared across components or with complex transitions, I'll reach for useReducer or something external."&lt;/p&gt;

&lt;p&gt;The difference: you started with the problem, explained the mechanism with a mental model (slots, call order), dropped the key gotcha (conditional calls) with the &lt;em&gt;reason&lt;/em&gt; behind it, and showed practical judgment. About 45 seconds of speaking.&lt;/p&gt;
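&lt;p&gt;If you want to make the slot model concrete, it can be sketched in a few lines of plain JavaScript. This is a toy, not React's actual implementation, but it is faithful to the call-order idea:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// Toy model: hooks get slots by call order, not by name.
const slots = [];
let cursor = 0;

function useStateToy(initial) {
  const i = cursor;
  cursor += 1;
  if (slots.length === i) slots.push(initial); // first render fills the slot
  function setState(next) {
    slots[i] = next; // real React would also schedule a re-render here
  }
  return [slots[i], setState];
}

function render(component) {
  cursor = 0; // every render replays the hooks in the same order
  return component();
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Call a hook conditionally and the cursor drifts, so a later hook call reads someone else's slot. That is exactly the bug the rules of hooks exist to prevent.&lt;/p&gt;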

&lt;h2&gt;useEffect: the most misunderstood hook&lt;/h2&gt;

&lt;p&gt;This is where candidates either show they understand React's model or reveal they don't.&lt;/p&gt;

&lt;p&gt;The docs answer: "useEffect lets you perform side effects in function components. It runs after every render by default, and you can pass a dependency array to control when it runs."&lt;/p&gt;

&lt;p&gt;Mechanically correct, reveals nothing.&lt;/p&gt;

&lt;p&gt;Something better:&lt;/p&gt;

&lt;p&gt;"useEffect exists because React's render function is supposed to be pure — given the same props and state, it should return the same JSX. But real applications need side effects: fetching data, setting up subscriptions, manually touching the DOM. useEffect is React's escape hatch for that. It says 'after you're done painting this to the screen, run this code.' The dependency array tells React which values this effect depends on — React skips re-running if none of them changed."&lt;/p&gt;

&lt;p&gt;And the part that actually matters: "The biggest mistake I see is treating useEffect like a lifecycle method — trying to recreate componentDidMount with an empty dependency array. It works, but it misses the point. useEffect is about synchronizing with external systems, not about running code at specific lifecycle moments. That shift in mental model changes how you think about cleanup and when effects re-run."&lt;/p&gt;

&lt;p&gt;This shows you understand:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Why useEffect exists (purity of render)&lt;/li&gt;
&lt;li&gt;What "after painting" means (the timing model)&lt;/li&gt;
&lt;li&gt;The real purpose of the dependency array (correctness, not optimization)&lt;/li&gt;
&lt;li&gt;The conceptual trap (lifecycle thinking vs. synchronization thinking)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If the interviewer follows up on the cleanup function: "The cleanup runs before the effect re-runs and when the component unmounts. Classic example is a WebSocket subscription — you connect in the effect, disconnect in the cleanup. React cleans up the old effect before running the new one, which prevents stale-subscription bugs that were really common with class components."&lt;/p&gt;

&lt;h2&gt;useRef: not just "for DOM elements"&lt;/h2&gt;

&lt;p&gt;The docs answer: "useRef gives you a reference to a DOM element."&lt;/p&gt;

&lt;p&gt;That's like saying "a car is for going to the grocery store." True, incomplete, misses the point.&lt;/p&gt;

&lt;p&gt;Better: "useRef gives you a mutable container that persists across renders without triggering re-renders. That's the key distinction from useState — both persist values, but updating a ref doesn't cause React to re-paint. The DOM reference use case falls out of that naturally: you need a stable reference to a DOM node that doesn't change on every render. But I use refs for other things too — storing interval IDs, tracking previous prop values, keeping a mutable flag for whether a component is still mounted. Basically, anytime I need an instance variable that isn't part of the render output."&lt;/p&gt;

&lt;p&gt;The phrase "instance variable that isn't part of the render output" connects hooks back to the class component mental model in a way that signals you've thought about this at a deeper level.&lt;/p&gt;

&lt;h2&gt;useMemo and useCallback: show restraint&lt;/h2&gt;

&lt;p&gt;This is where you can really stand out — by showing you know when &lt;em&gt;not&lt;/em&gt; to use them.&lt;/p&gt;

&lt;p&gt;The docs answer: "useMemo caches a computed value, and useCallback caches a function. You should use them to optimize performance."&lt;/p&gt;

&lt;p&gt;Something that shows more judgment:&lt;/p&gt;

&lt;p&gt;"Both are memoization tools. useMemo caches the &lt;em&gt;result&lt;/em&gt; of a computation, useCallback caches a &lt;em&gt;function reference&lt;/em&gt;. You'd want a stable function reference usually to prevent unnecessary re-renders in child components that take that function as a prop, or to keep a useEffect dependency stable. But here's the thing — memoization itself has a cost. React has to store the previous value, compare dependencies on every render, manage that cache. For cheap computations, adding useMemo actually makes performance &lt;em&gt;worse&lt;/em&gt;."&lt;/p&gt;

&lt;p&gt;And then — "I only reach for useMemo when I've profiled and confirmed there's a bottleneck — an expensive filter or sort on a large list. useCallback mostly when passing callbacks to memoized children. The React team has been pretty vocal about this: don't optimize prematurely with these hooks."&lt;/p&gt;
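&lt;p&gt;What "caching the result" actually involves can be sketched in a few lines: a last-value cache plus a dependency check (a simplification, not React's code):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// Last-value memoization in the useMemo style: recompute only
// when a dependency fails an Object.is comparison.
function createMemo() {
  let prevDeps = null;
  let prevValue;
  return function memo(compute, deps) {
    let changed = true;
    if (prevDeps !== null) {
      changed = deps.some(function (d, i) {
        return !Object.is(d, prevDeps[i]);
      });
    }
    if (changed) {
      prevValue = compute();
      prevDeps = deps;
    }
    return prevValue;
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;The store-and-compare bookkeeping runs on every render. That's the cost: for a cheap computation, the comparison can outweigh just recomputing.&lt;/p&gt;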

&lt;p&gt;Most candidates talk about these hooks as pure performance wins. Showing you understand the cost of memoization and that you profile before optimizing — that's the signal.&lt;/p&gt;

&lt;h2&gt;useReducer: the underrated hook&lt;/h2&gt;

&lt;p&gt;Most candidates skip useReducer or dismiss it as "useState for complex state." That's a missed opportunity.&lt;/p&gt;

&lt;p&gt;"useReducer follows the same pattern as Redux: you dispatch actions, and a pure reducer function computes the next state. I reach for it when state transitions depend on the previous state in non-trivial ways — like a form with validation that depends on multiple fields, or a multi-step wizard where going back needs to undo several state changes. The reducer function is pure and lives outside the component, so it's trivially testable. You write unit tests for your state logic without rendering anything."&lt;/p&gt;

&lt;p&gt;The testability angle catches interviewers off guard. Most people don't mention it, and it shows you think about code organization, not just functionality.&lt;/p&gt;

&lt;p&gt;That said — I'll admit I probably underuse useReducer in my own code. I reach for useState by default even when the state is getting complex enough that a reducer would be cleaner. Old habits. Something about the action/dispatch pattern feels like overhead when you're just prototyping. Whether that's laziness or pragmatism, I'm not sure.&lt;/p&gt;

&lt;h2&gt;useContext: keep it short&lt;/h2&gt;

&lt;p&gt;Not much to say about this one. useContext reads from a React context, provider sits up the tree, any descendant can read the value without prop drilling. The only thing worth mentioning is the re-render trap: every consumer re-renders when the context value changes, which is why you don't put rapidly-changing values in context. Theme, locale, auth state — things that change infrequently. For everything else, use a proper state management tool.&lt;/p&gt;

&lt;p&gt;Don't over-explain this in an interview. Ten seconds is fine. Move on.&lt;/p&gt;

&lt;h2&gt;Custom hooks: the real test&lt;/h2&gt;

&lt;p&gt;If the interviewer asks about custom hooks, they're testing whether you see hooks as a composition mechanism, not just a React API.&lt;/p&gt;

&lt;p&gt;"Custom hooks are just functions that call other hooks. The convention is the 'use' prefix so React can enforce the rules of hooks. But the real value is that they let you extract and reuse stateful logic — not UI, just logic. For example, I've built a useDebounce hook that wraps useState and useEffect to give any component debounced input handling. The component doesn't know about the timing logic. You get reusable stateful behavior that's decoupled from any particular component's rendering."&lt;/p&gt;

&lt;p&gt;Then give one concrete example from your own work. "I built useMediaQuery that tracks viewport breakpoints" or "we had a usePolling hook that fetched data on an interval with automatic cleanup." Specific beats generic.&lt;/p&gt;

&lt;h2&gt;The 60-second hooks overview&lt;/h2&gt;

&lt;p&gt;If you get the broad "explain hooks" question, here's roughly what a complete answer looks like:&lt;/p&gt;

&lt;p&gt;"Hooks let function components do everything class components could — and more composably. Before hooks, reusing stateful logic meant higher-order components or render props, both of which created wrapper hell. With hooks, you extract stateful logic into a function and call it from any component. The core hooks map to specific needs: useState for local reactive values, useEffect for synchronizing with external systems, useRef for mutable values that don't trigger renders, useMemo/useCallback for targeted performance optimization. Custom hooks are where it gets powerful — you compose the primitive hooks into domain-specific logic that any component can use. The mental model shift is thinking in terms of &lt;em&gt;what this component synchronizes with&lt;/em&gt; rather than &lt;em&gt;what happens at each lifecycle stage&lt;/em&gt;."&lt;/p&gt;

&lt;p&gt;That's about 60 seconds. It covers the why, the what, and the mental model.&lt;/p&gt;

&lt;h2&gt;Practice out loud, not in your head&lt;/h2&gt;

&lt;p&gt;Reading these examples won't help you in an interview. You need to say them. Out loud.&lt;/p&gt;

&lt;p&gt;This is the core insight behind &lt;a href="https://www.prepovo.com/blog/why-explaining-out-loud-beats-reading?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-explain-react-hooks-in-interviews" rel="noopener noreferrer"&gt;verbal practice for interview prep&lt;/a&gt;: the gap between reading an explanation and delivering one is enormous. Your brain processes information differently when you're producing speech. You'll stumble on transitions, realize you don't actually understand the dependency array as well as you thought, and discover which parts of your explanation feel forced.&lt;/p&gt;

&lt;p&gt;Pick one hook — whichever you feel least confident explaining. Set a timer for 60 seconds. Explain it out loud using the problem-mechanism-gotcha-usage structure. Record yourself if you can stand it.&lt;/p&gt;

&lt;p&gt;Then do it again. The second take will be noticeably better. The fifth take will sound like you actually understand it, which is different from sounding like you invented it — nobody needs to sound like Dan Abramov in an interview, they just need to sound like they've actually used the thing and thought about why it works the way it does.&lt;/p&gt;

</description>
      <category>react</category>
      <category>frontend</category>
      <category>career</category>
      <category>programming</category>
    </item>
    <item>
      <title>Frontend interview questions you'll actually get in 2026</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 12 Mar 2026 10:29:07 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/frontend-interview-questions-youll-actually-get-in-2026-igh</link>
      <guid>https://forem.com/krzysztof_fraus/frontend-interview-questions-youll-actually-get-in-2026-igh</guid>
      <description>&lt;p&gt;You can probably answer most of these questions in your head. That's not the problem. The problem is that knowing the answer and being able to &lt;strong&gt;say&lt;/strong&gt; the answer clearly, under pressure, in 60-90 seconds — those are two different skills. And the second one is what actually gets tested in interviews.&lt;/p&gt;

&lt;p&gt;What follows are 15 frontend questions you're likely to encounter in 2026, organized by category. For each one: what the interviewer is probably evaluating, a rough structure for a verbal response, and the mistakes that sink candidates.&lt;/p&gt;

&lt;p&gt;This isn't a 100-question dump. It's opinionated, built for speaking out loud.&lt;/p&gt;

&lt;h2&gt;JavaScript fundamentals&lt;/h2&gt;

&lt;p&gt;These never go away. Frameworks change; the language stays. I've seen interviewers at every level of company still ask these because they reveal how deeply you actually understand the tools you use daily.&lt;/p&gt;

&lt;h3&gt;1. "Explain the difference between &lt;code&gt;let&lt;/code&gt;, &lt;code&gt;const&lt;/code&gt;, and &lt;code&gt;var&lt;/code&gt;."&lt;/h3&gt;

&lt;p&gt;This one feels like it should be easy, and that's the trap. Most candidates say "var is function-scoped, let and const are block-scoped" and stop there, which is technically fine but doesn't show much.&lt;/p&gt;

&lt;p&gt;The thing that actually trips people up: &lt;code&gt;const&lt;/code&gt; doesn't mean the value can't change. It prevents reassignment. You can still push to a &lt;code&gt;const&lt;/code&gt; array, mutate a &lt;code&gt;const&lt;/code&gt; object. If you get this wrong, the interviewer &lt;em&gt;will&lt;/em&gt; push on it, and then you're flustered for the next question.&lt;/p&gt;

&lt;p&gt;Start with scope (block vs function), give one concrete example — &lt;code&gt;var&lt;/code&gt; inside a for loop leaking — mention hoisting and the temporal dead zone, close with your default: "const by default, let when reassignment is needed, var never." About 60 seconds.&lt;/p&gt;
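&lt;p&gt;A few lines of code settle both points if the interviewer pushes; worth having at your fingertips:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// const blocks reassignment, not mutation:
const tags = [];
tags.push("react");  // fine: mutating the array
// tags = [];        // TypeError: assignment to constant variable

// var is function-scoped, so it leaks out of blocks:
function scopes() {
  if (true) {
    var a = 1; // hoisted to the whole function
    let b = 2; // exists only inside this block
  }
  return typeof a + " " + typeof b; // "number undefined"
}
&lt;/code&gt;&lt;/pre&gt;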

&lt;h3&gt;2. "What is the event loop? How does it work?"&lt;/h3&gt;

&lt;p&gt;This separates people who've debugged race conditions from people who've only read about promises.&lt;/p&gt;

&lt;p&gt;Start concrete: "JavaScript is single-threaded, so it uses an event loop to handle async operations without blocking." Name the pieces: call stack, Web APIs (or Node APIs), callback queue, microtask queue. Walk through one example — &lt;code&gt;setTimeout(() =&amp;gt; console.log('a'), 0)&lt;/code&gt; followed by &lt;code&gt;Promise.resolve().then(() =&amp;gt; console.log('b'))&lt;/code&gt; — and explain why &lt;code&gt;b&lt;/code&gt; prints first. The microtask queue drains completely before the next macrotask runs.&lt;/p&gt;
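&lt;p&gt;That example is worth having memorized exactly, because you can reproduce it anywhere:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;const order = [];
setTimeout(function () { order.push("timeout"); }, 0);          // macrotask
Promise.resolve().then(function () { order.push("promise"); }); // microtask
// once the current task ends, microtasks drain first,
// so order ends up ["promise", "timeout"]
&lt;/code&gt;&lt;/pre&gt;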

&lt;p&gt;The mistake that immediately flags you: forgetting the microtask queue entirely. If you only talk about "the callback queue," the interviewer knows your mental model has a hole in it. Promises, &lt;code&gt;queueMicrotask&lt;/code&gt;, and &lt;code&gt;MutationObserver&lt;/code&gt; callbacks all go through microtasks. About 90 seconds for this one.&lt;/p&gt;

&lt;h3&gt;3. "Explain closures. When would you use one intentionally?"&lt;/h3&gt;

&lt;p&gt;They don't want the textbook definition. They want to see if you recognize closures in real code.&lt;/p&gt;

&lt;p&gt;Define it in one sentence: "A closure is a function that retains access to variables from its outer scope even after that outer function has returned." Give a use case: a factory function like &lt;code&gt;createCounter()&lt;/code&gt; that returns an object with &lt;code&gt;increment&lt;/code&gt; and &lt;code&gt;getCount&lt;/code&gt; methods sharing a private &lt;code&gt;count&lt;/code&gt; variable. Then mention where closures bite you: stale closures in React &lt;code&gt;useEffect&lt;/code&gt; when dependencies aren't listed correctly.&lt;/p&gt;
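&lt;p&gt;Having the factory ready to rattle off (or sketch on a whiteboard) helps:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// count lives in the closure: private, no class needed.
function createCounter() {
  let count = 0;
  return {
    increment: function () { count += 1; },
    getCount: function () { return count; }
  };
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Each call to &lt;code&gt;createCounter()&lt;/code&gt; gets its own &lt;code&gt;count&lt;/code&gt;, so two counters don't interfere. That independence is the point worth saying out loud.&lt;/p&gt;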

&lt;p&gt;The mistake here is giving the definition and stopping. The interviewer asked "when would you use one" — skip the second half and you look like you memorized without understanding.&lt;/p&gt;

&lt;p&gt;For more on structuring answers about React-specific concepts, see &lt;a href="https://www.prepovo.com/blog/how-to-explain-react-hooks-in-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;how to explain React hooks in interviews&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;4. "What's the difference between &lt;code&gt;==&lt;/code&gt; and &lt;code&gt;===&lt;/code&gt;? When would you ever use &lt;code&gt;==&lt;/code&gt;?"&lt;/h3&gt;

&lt;p&gt;The sneaky part is the second question. Most candidates say "never," which is technically a safe answer but shows you've absorbed a linting rule without understanding why it exists.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;===&lt;/code&gt; checks value and type, &lt;code&gt;==&lt;/code&gt; coerces types before comparing. &lt;code&gt;"5" == 5&lt;/code&gt; is &lt;code&gt;true&lt;/code&gt;, &lt;code&gt;"5" === 5&lt;/code&gt; is &lt;code&gt;false&lt;/code&gt;. The honest answer about when to use &lt;code&gt;==&lt;/code&gt;: checking for &lt;code&gt;null&lt;/code&gt; or &lt;code&gt;undefined&lt;/code&gt; simultaneously — &lt;code&gt;value == null&lt;/code&gt; catches both, which is cleaner than &lt;code&gt;value === null || value === undefined&lt;/code&gt;. Strict equality everywhere, with the &lt;code&gt;null&lt;/code&gt; check as the one exception you'll defend. 45 seconds.&lt;/p&gt;
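&lt;p&gt;The defense writes itself in one function:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// The one place loose equality earns its keep:
function isMissing(value) {
  return value == null; // true for null and undefined only
}
&lt;/code&gt;&lt;/pre&gt;

&lt;p&gt;Falsy values like &lt;code&gt;0&lt;/code&gt; and the empty string come back &lt;code&gt;false&lt;/code&gt;, which is exactly what an "is this actually missing" check should do.&lt;/p&gt;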

&lt;h2&gt;React and framework questions&lt;/h2&gt;

&lt;p&gt;React still dominates frontend interviews in 2026. Even companies using Vue, Svelte, or Solid will often ask React questions because it's the common denominator.&lt;/p&gt;

&lt;h3&gt;5. "Walk me through what happens when state changes in a React component."&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.prepovo.com/blog/first-30-seconds-interview-answer?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;The first 30 seconds of your answer&lt;/a&gt; here really determine whether the interviewer sees you as junior or mid-level.&lt;/p&gt;

&lt;p&gt;State update is scheduled, not synchronous — React batches updates. React triggers a re-render of the component and its children. During re-render, the component function runs again, producing a new virtual DOM tree. React diffs the new tree against the previous one (reconciliation). Only the diffed changes are applied to the actual DOM. Worth mentioning that React 18+ batches updates across event handlers, timeouts, and promises — this wasn't always true.&lt;/p&gt;

&lt;p&gt;If you say state updates are synchronous, you're done. They're not. If you &lt;code&gt;setState&lt;/code&gt; and then immediately read the state variable, you'll get the old value. This is one of those things where a lot of developers use it correctly but explain it incorrectly.&lt;/p&gt;

&lt;h3&gt;6. "When would you use &lt;code&gt;useRef&lt;/code&gt; instead of &lt;code&gt;useState&lt;/code&gt;?"&lt;/h3&gt;

&lt;p&gt;&lt;code&gt;useState&lt;/code&gt; triggers a re-render when updated; &lt;code&gt;useRef&lt;/code&gt; doesn't. That's the whole thing. Use &lt;code&gt;useRef&lt;/code&gt; for values you need to persist across renders but that don't affect the UI: timer IDs, previous values for comparison, DOM element references. A concrete example: storing an &lt;code&gt;AbortController&lt;/code&gt; for a fetch request so you can cancel it on unmount without causing unnecessary re-renders.&lt;/p&gt;

&lt;p&gt;Only mentioning DOM refs is the common trap. Yes, &lt;code&gt;useRef&lt;/code&gt; gives you access to DOM elements, but that's the beginner use case. If you only say "it's for accessing DOM elements," you've answered maybe 30% of the question.&lt;/p&gt;

&lt;h3&gt;7. "How do you decide between client state and server state?"&lt;/h3&gt;

&lt;p&gt;This reveals whether you've thought about data ownership or just throw everything into Redux.&lt;/p&gt;

&lt;p&gt;Server state is data that lives on the backend and is fetched/cached on the client — user profiles, lists of items, API responses. Client state is UI-specific data that doesn't exist on the server — modal open/closed, form input before submission, sidebar collapsed. Name your tools: TanStack Query or SWR for server state, Zustand or Context for client state.&lt;/p&gt;

&lt;p&gt;Why does the distinction matter? Server state needs cache invalidation, background refetching, optimistic updates. Client state doesn't. Mixing them creates complexity that doesn't need to exist. I've worked on projects that had the entire API response shoved into Redux, and honestly, the amount of boilerplate just to keep that cache in sync with the backend was absurd. Purpose-built tools like TanStack Query handle all of that — saying "I use Redux for everything" in 2026 signals you haven't kept up.&lt;/p&gt;

&lt;h3&gt;8. "Explain React Server Components. How are they different from SSR?"&lt;/h3&gt;

&lt;p&gt;This has gone from "nice to know" to "expected" in 2026, and most candidates still confuse the two.&lt;/p&gt;

&lt;p&gt;SSR renders your components to HTML on the server, sends that HTML, then hydrates with JavaScript on the client — the component code still ships to the browser. Server Components run &lt;em&gt;only&lt;/em&gt; on the server. Their code never reaches the client bundle — zero JavaScript cost for those components.&lt;/p&gt;

&lt;p&gt;Here's the distinction that matters: SSR is a rendering strategy (same components, different execution timing). Server Components are a component type (some components never leave the server). A Server Component can directly query a database or read the filesystem. Mention the &lt;code&gt;'use client'&lt;/code&gt; directive as the boundary marker. They're complementary, not competing — you can use SSR &lt;em&gt;with&lt;/em&gt; Server Components. Saying "Server Components replaced SSR" reveals you haven't actually worked with them.&lt;/p&gt;

&lt;h2&gt;Browser and performance&lt;/h2&gt;

&lt;p&gt;These questions test whether you've ever opened DevTools for more than console logging.&lt;/p&gt;

&lt;h3&gt;9. "A page is loading slowly. Walk me through how you'd diagnose it."&lt;/h3&gt;

&lt;p&gt;They're testing your process, not your knowledge of performance APIs.&lt;/p&gt;

&lt;p&gt;Start with measurement, not guessing: open Lighthouse, check Core Web Vitals (LCP, INP, CLS). Identify the bottleneck category: is it network (large bundles, too many requests), rendering (layout thrashing, expensive repaints), or JavaScript (long tasks blocking the main thread)? For network: check the Network tab waterfall, look for render-blocking resources. For JavaScript: profile with the Performance tab, look for long tasks (&amp;gt;50ms).&lt;/p&gt;

&lt;p&gt;The mistake that kills you here: jumping straight to "I'd add lazy loading and code splitting." That's a solution, not a diagnosis. I've done this myself — reached for code splitting as a reflex before even checking whether the bundle was the problem. Turns out the issue was an unoptimized image. Embarrassing, but it taught me to measure first.&lt;/p&gt;

&lt;h3&gt;10. "What are Core Web Vitals and why should a frontend developer care?"&lt;/h3&gt;

&lt;p&gt;Name the three: LCP (Largest Contentful Paint — loading speed), INP (Interaction to Next Paint — responsiveness, replaced FID in March 2024), CLS (Cumulative Layout Shift — visual stability). Google uses them as a ranking signal, they correlate with conversion rates. Give one concrete example per metric: LCP — a hero image that takes 4 seconds to load. INP — a button click that freezes the UI for 300ms. CLS — an ad that loads late and pushes content down.&lt;/p&gt;

&lt;p&gt;Still mentioning FID instead of INP? Your knowledge looks stale. FID was replaced in March 2024. Small detail, but interviewers notice.&lt;/p&gt;

&lt;h3&gt;11. "Explain how the browser renders a page from receiving HTML to pixels on screen."&lt;/h3&gt;

&lt;p&gt;This is one of those questions where most candidates know more than they think — they just can't get the steps in order under pressure.&lt;/p&gt;

&lt;p&gt;Parse HTML to build the DOM tree. Parse CSS to build the CSSOM tree. Combine DOM + CSSOM into a Render tree (only visible elements). Layout: calculate exact position and size of each element. Paint: fill in pixels — colors, images, text, borders. Composite: layer ordering, GPU-accelerated transforms.&lt;/p&gt;

&lt;p&gt;Mention that JavaScript can block parsing unless &lt;code&gt;async&lt;/code&gt; or &lt;code&gt;defer&lt;/code&gt;, which is why script placement matters. And know the difference between reflow and repaint: changing geometry triggers layout + paint + composite; changing only color triggers paint + composite. Stopping at "it parses HTML and CSS and shows the page" is just restating the question.&lt;/p&gt;

&lt;h3&gt;12. "What's the difference between &lt;code&gt;async&lt;/code&gt; and &lt;code&gt;defer&lt;/code&gt; on a script tag?"&lt;/h3&gt;

&lt;p&gt;Both download the script in parallel with HTML parsing. &lt;code&gt;async&lt;/code&gt; executes as soon as downloaded, pausing HTML parsing — execution order is not guaranteed. &lt;code&gt;defer&lt;/code&gt; waits until HTML parsing is complete, then executes — execution order preserved. &lt;code&gt;defer&lt;/code&gt; for scripts that depend on the DOM or on each other, &lt;code&gt;async&lt;/code&gt; for independent scripts like analytics. Default choice: &lt;code&gt;defer&lt;/code&gt; is almost always what you want.&lt;/p&gt;
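&lt;p&gt;In markup, the distinction looks like this (the script names are placeholders):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;&amp;lt;!-- independent: fine to run whenever it arrives --&amp;gt;
&amp;lt;script src="analytics.js" async&amp;gt;&amp;lt;/script&amp;gt;

&amp;lt;!-- DOM-dependent, order-sensitive: runs after parsing, in order --&amp;gt;
&amp;lt;script src="vendor.js" defer&amp;gt;&amp;lt;/script&amp;gt;
&amp;lt;script src="app.js" defer&amp;gt;&amp;lt;/script&amp;gt;
&lt;/code&gt;&lt;/pre&gt;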

&lt;p&gt;The common mistakes: saying they're "the same thing," or claiming &lt;code&gt;async&lt;/code&gt; waits for the DOM — it doesn't. If your async script tries to query a DOM element that hasn't been parsed yet, it fails silently, and that's the kind of bug that takes hours to track down because there's no error.&lt;/p&gt;

&lt;h2&gt;TypeScript questions&lt;/h2&gt;

&lt;p&gt;TypeScript is no longer optional in frontend interviews. If the job posting mentions it — and in 2026, most do — expect at least one question.&lt;/p&gt;

&lt;p&gt;I'll say something that might be unpopular: TypeScript interview questions are often the weakest signal of actual TypeScript ability. The difference between someone who can explain &lt;code&gt;interface&lt;/code&gt; vs &lt;code&gt;type&lt;/code&gt; and someone who can actually model a complex domain with discriminated unions and conditional types — it's massive, and these standard questions don't really capture it. But they're what you'll get asked, so.&lt;/p&gt;

&lt;h3&gt;13. "Explain the difference between &lt;code&gt;interface&lt;/code&gt; and &lt;code&gt;type&lt;/code&gt; in TypeScript."&lt;/h3&gt;

&lt;p&gt;Both can describe object shapes — for basic use, they're interchangeable. Key differences: &lt;code&gt;interface&lt;/code&gt; supports declaration merging (declaring the same interface twice merges them). &lt;code&gt;type&lt;/code&gt; supports union types, intersection types, and mapped types. &lt;code&gt;interface&lt;/code&gt; can be extended with &lt;code&gt;extends&lt;/code&gt;; &lt;code&gt;type&lt;/code&gt; uses &lt;code&gt;&amp;amp;&lt;/code&gt; for intersection.&lt;/p&gt;
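
&lt;p&gt;A compact sketch of those three differences, worth having ready (the names are illustrative, not from any real codebase):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;// Declaration merging: the two Config declarations merge into one
interface Config { host: string }
interface Config { port: number }
const c: Config = { host: "localhost", port: 3000 };

// Unions and mapped types need `type`
type Status = "idle" | "loading" | "error";
type Flags&amp;lt;T&amp;gt; = { [K in keyof T]: boolean };

// Extension: `extends` vs intersection
interface Base { id: string }
interface Named extends Base { name: string }
type Aged = Base &amp;amp; { age: number };
&lt;/code&gt;&lt;/pre&gt;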

&lt;p&gt;Your heuristic: "I use &lt;code&gt;interface&lt;/code&gt; for object shapes that might be extended, especially in public APIs. &lt;code&gt;type&lt;/code&gt; for unions, intersections, and anything that isn't a plain object shape." Don't say "always use type" or "always use interface" without justification — the interviewer wants to see you've actually thought about it.&lt;/p&gt;

&lt;h3&gt;
  
  
  14. "What are generics? Give me a practical example."
&lt;/h3&gt;

&lt;p&gt;Generics let you write functions, classes, or types that work with multiple types while preserving type safety. A practical example: an API response wrapper — &lt;code&gt;type ApiResponse&amp;lt;T&amp;gt; = { data: T; error: string | null; loading: boolean }&lt;/code&gt; — reusable across every endpoint while keeping the &lt;code&gt;data&lt;/code&gt; field strongly typed. Mention constraints: &lt;code&gt;&amp;lt;T extends string | number&amp;gt;&lt;/code&gt; limits what types can be used, preventing runtime errors.&lt;/p&gt;
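
&lt;p&gt;One way to flesh that wrapper out in conversation — the endpoint and types here are invented, and &lt;code&gt;data&lt;/code&gt; is widened to &lt;code&gt;T | null&lt;/code&gt; to model the error case:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;type ApiResponse&amp;lt;T&amp;gt; = { data: T | null; error: string | null; loading: boolean };

interface User { id: string; name: string }

// Constrained: T must be an object shape, so primitives are rejected
async function getJson&amp;lt;T extends object&amp;gt;(url: string): Promise&amp;lt;ApiResponse&amp;lt;T&amp;gt;&amp;gt; {
  try {
    const res = await fetch(url);
    return { data: (await res.json()) as T, error: null, loading: false };
  } catch (e) {
    return { data: null, error: String(e), loading: false };
  }
}

// response.data is typed as User | null — not any
getJson&amp;lt;User&amp;gt;("/api/user/1").then((response) =&amp;gt; {
  console.log(response.data?.name);
});
&lt;/code&gt;&lt;/pre&gt;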

&lt;p&gt;Whatever you do, don't explain generics with &lt;code&gt;function identity&amp;lt;T&amp;gt;(x: T): T&lt;/code&gt;. The interviewer has seen it a thousand times and it tells them nothing about whether you use generics in real code. API wrappers, form handlers, component props — anything from actual application code.&lt;/p&gt;

&lt;h2&gt;
  
  
  Behavioral-technical crossover
&lt;/h2&gt;

&lt;p&gt;These trip people up most. They sound behavioral but require technical depth. &lt;a href="https://www.prepovo.com/blog/how-to-stop-rambling-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;Not rambling&lt;/a&gt; is especially important here.&lt;/p&gt;

&lt;h3&gt;
  
  
  15. "Tell me about a time you improved performance on a frontend application."
&lt;/h3&gt;

&lt;p&gt;This is a behavioral-technical crossover. You need a story, not a lecture. For a deeper look at answering this type of question, see &lt;a href="https://www.prepovo.com/blog/tell-me-about-a-technical-challenge?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;what interviewers mean by "tell me about a technical challenge"&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Rough structure for ~90 seconds: situation (what was slow and why it mattered), diagnosis (what you measured and what you found), action (what you specifically did — name the techniques), result (numbers). "I made it faster" means nothing. "I reduced LCP from 4.2s to 1.8s" is concrete. If you don't have exact numbers from a real project, approximate — the interviewer cares about the structure of your thinking more than the precision.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to actually prepare
&lt;/h2&gt;

&lt;p&gt;Reading these questions and nodding is worth approximately nothing. You've read through 15 questions. You understand the concepts. But here's the test: pick question #2 (the event loop) right now. Set a 90-second timer. Explain it out loud.&lt;/p&gt;

&lt;p&gt;If your explanation was clean and structured on the first try, you're ahead of most candidates.&lt;/p&gt;

&lt;p&gt;If you stumbled, repeated yourself, or went silent trying to remember the difference between the microtask queue and the callback queue — that's normal. That's the gap between reading and speaking. And that gap only closes with practice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Practice verbally, not mentally.&lt;/strong&gt; Thinking through an answer in your head uses different cognitive pathways than saying it out loud. Your mouth will betray you in an interview if you've only ever rehearsed silently. As we covered in &lt;a href="https://www.prepovo.com/blog/why-explaining-out-loud-beats-reading?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;why explaining out loud beats reading&lt;/a&gt;, the act of speaking forces you to linearize your knowledge — you can't hand-wave past the fuzzy parts.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Record yourself.&lt;/strong&gt; It's uncomfortable. Do it anyway. Listen for filler words, unexplained jargon, and moments where you lose the thread.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Time yourself.&lt;/strong&gt; After 90 seconds on a single question, you're either rambling or going too deep. Practice hitting the key points within 60-90 seconds, then stopping.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Practice daily, not in bursts.&lt;/strong&gt; One question per day for three weeks beats cramming 20 questions the night before.&lt;/p&gt;

&lt;p&gt;This is what &lt;a href="https://www.prepovo.com/interview-preparation?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=frontend-interview-questions-2026" rel="noopener noreferrer"&gt;Prepovo&lt;/a&gt; is built for. You get a technical question, explain your answer out loud, and receive AI-powered feedback on your verbal structure, accuracy, and clarity. Five minutes a day. The skill of articulating complex concepts clearly compounds faster than you'd expect.&lt;/p&gt;

&lt;p&gt;Pick three questions from this article — one from each category. Tomorrow morning, set a timer for 90 seconds and answer each one out loud. No notes. Record yourself if you can stand it.&lt;/p&gt;

&lt;p&gt;Then listen back and ask: "Would I hire someone who gave this answer?"&lt;/p&gt;

&lt;p&gt;If not — you know what to work on.&lt;/p&gt;

</description>
      <category>frontend</category>
      <category>interview</category>
      <category>javascript</category>
      <category>react</category>
    </item>
    <item>
      <title>Why explaining out loud beats reading for interview prep</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 12 Mar 2026 09:55:03 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/why-explaining-out-loud-beats-reading-for-interview-prep-3bo6</link>
      <guid>https://forem.com/krzysztof_fraus/why-explaining-out-loud-beats-reading-for-interview-prep-3bo6</guid>
      <description>&lt;p&gt;You've read the system design primer. You've gone through hundreds of LeetCode solutions. You can mentally trace through a B-tree insertion. But when the interviewer asks you to "walk me through how you'd design a URL shortener," your mind goes blank.&lt;/p&gt;

&lt;p&gt;Most developers have experienced some version of this. You spend hours reading about a topic, feel confident about it, and then someone asks you to explain it and you realize... you can't. Not clearly. Not in a way that sounds like you actually understand it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The illusion
&lt;/h2&gt;

&lt;p&gt;There's a specific thing that happens when you read about a concept: your brain registers familiarity and interprets it as understanding. You read about how database indexing works, the explanation makes sense, and you think "ok, I know this." But making sense of someone else's explanation is not the same as being able to produce your own.&lt;/p&gt;

&lt;p&gt;I noticed this when I was prepping for a system design interview a couple of years ago. I'd read about event sourcing — understood the diagrams, nodded along to the examples, felt good about it. Then I tried to explain it out loud, and what came out was: "So basically, you store events instead of current state, and then... you can replay them? And there's a read model that's eventually consistent, I think?"&lt;/p&gt;

&lt;p&gt;That's not an explanation. That's a half-remembered summary of someone else's explanation. I'm a frontend developer — I don't work with event sourcing daily. And it showed the moment I tried to produce my own words instead of recognizing someone else's.&lt;/p&gt;

&lt;p&gt;Here's what bothers me about this, though. I'd spent maybe two hours reading about event sourcing that week. Two hours. And I couldn't produce a coherent 30-second explanation. What were those two hours actually worth?&lt;/p&gt;

&lt;h2&gt;
  
  
  What happens when you say things out loud
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;The gaps become obvious immediately.&lt;/strong&gt; When you're reading, your brain fills in missing pieces automatically. When you're speaking, you hit a gap and there's just... silence. You either know the next step or you don't. No hand-waving. No skimming past the hard part.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;You build the actual skill the interview tests.&lt;/strong&gt; Interviews don't test whether you can read about distributed systems. They test whether you can explain them in real time, under mild pressure, to someone who's evaluating you. Reading practices the wrong skill. Speaking practices the right one.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your explanations develop structure.&lt;/strong&gt; The first time you try to explain something out loud, it comes out messy. By the third time, you've naturally developed a sequence: you start with the problem, explain the mechanism, mention the trade-offs. This structure doesn't come from reading about "how to structure answers" — it emerges from the act of trying to say coherent things and failing enough times.&lt;/p&gt;

&lt;p&gt;I want to be careful not to oversell this, though. Speaking out loud doesn't magically fix knowledge gaps. If you don't understand how a hash map works, saying it out loud won't help. It just makes the gap painfully visible, which is a different thing. Useful, but not the same as learning.&lt;/p&gt;

&lt;h2&gt;
  
  
  The production gap
&lt;/h2&gt;

&lt;p&gt;Musicians don't get better by listening to music. Athletes don't improve by watching game tape. But developers preparing for interviews? They mostly read. Articles, textbooks, solution write-ups. For hours.&lt;/p&gt;

&lt;p&gt;The gap between consuming information and producing explanations is the single biggest blind spot in how most people prepare. It feels productive to read for three hours. It feels uncomfortable to speak out loud for five minutes. But the five minutes of speaking will improve your interview performance more than the three hours of reading.&lt;/p&gt;

&lt;p&gt;I keep coming back to this ratio and it still seems wrong. Five minutes versus three hours. But every time I've tested it — on myself, watching candidates — it holds up.&lt;/p&gt;

&lt;h2&gt;
  
  
  What changes when you start doing this
&lt;/h2&gt;

&lt;p&gt;The first few times you try explaining something out loud, it's painful. You stumble on concepts you thought you knew. You trail off mid-sentence because you realize you don't actually understand the next step. You use the word "basically" four times in 30 seconds.&lt;/p&gt;

&lt;p&gt;After about a week of doing it regularly — even just one concept a day — something shifts. Your explanations start having a shape. You instinctively start with the problem, not the solution. You mention trade-offs without thinking about it.&lt;/p&gt;

&lt;p&gt;There's a side effect I didn't expect: you start studying differently. Instead of passively reading, you read something and immediately try to explain it out loud. If you can't, you go back and re-read. The explaining becomes the test, and the reading becomes targeted instead of aimless. Whether this actually leads to "better" understanding or just better &lt;em&gt;interview&lt;/em&gt; performance, I'm honestly not sure. Probably both. Maybe just the second one. Either way, it works for the thing you care about right now.&lt;/p&gt;

&lt;h2&gt;
  
  
  The hard part
&lt;/h2&gt;

&lt;p&gt;Talking out loud to yourself feels weird. Hearing your own stumbling explanation is uncomfortable. There's no immediate feedback — nobody's telling you "that was good" or "you lost me at the third sentence."&lt;/p&gt;

&lt;p&gt;I keep thinking there should be a way to make this less awkward. There probably isn't. The discomfort is the whole mechanism. Interviews are uncomfortable. The candidates who handle them well are the ones who've already practiced being uncomfortable.&lt;/p&gt;

&lt;p&gt;If you're preparing for an interview right now, try this: pick one concept you expect to be asked about. Don't look it up. Just explain it out loud, right now, for 60 seconds.&lt;/p&gt;

&lt;p&gt;Notice where you stumbled. That's where your prep should go tomorrow.&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>5 interview mistakes that have nothing to do with knowledge</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 12 Mar 2026 09:52:16 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/5-interview-mistakes-that-have-nothing-to-do-with-knowledge-1l4h</link>
      <guid>https://forem.com/krzysztof_fraus/5-interview-mistakes-that-have-nothing-to-do-with-knowledge-1l4h</guid>
      <description>&lt;p&gt;These are mistakes I've made as a candidate and then watched other people make when I was on the interviewer side. The pattern is always the same: the candidate knows the material, but something about how they deliver it makes me unsure whether they actually do.&lt;/p&gt;

&lt;p&gt;Completely fixable. But reading about them doesn't fix them — you have to catch yourself doing them, which requires practice under pressure.&lt;/p&gt;

&lt;h2&gt;
  
  
  1. Diving into details before anyone knows what you're building
&lt;/h2&gt;

&lt;p&gt;"How would you design a notification system?" And the candidate immediately starts talking about message queues, WebSocket connections, and database schemas. Five minutes in and I still don't know what system they're describing.&lt;/p&gt;

&lt;p&gt;The instinct makes sense — you know the answer, you want to prove it fast. But the interviewer can't evaluate your decisions without the big picture. It's like explaining a git merge conflict by talking about individual lines of code before saying which files are involved.&lt;/p&gt;

&lt;p&gt;Spend 30-60 seconds outlining the major components first. "I'd break this into three parts: something that receives notification triggers, a layer that decides how to deliver them, and per-channel delivery services." Now the interviewer has a map. Everything you say after that slots into a structure they can follow.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. Stream of consciousness answers
&lt;/h2&gt;

&lt;p&gt;I still do this in meetings. You start explaining one thing, touch on a related thing, think of an edge case, mention that, loop back to the original point, and somewhere in there the listener is gone.&lt;/p&gt;

&lt;p&gt;In your head, it all connects. Caching relates to consistency relates to CAP theorem relates to your database choice. For the listener, it's a random walk through your brain. I've been on the interviewer side of this and honestly, after about 90 seconds of stream-of-consciousness, I stop evaluating the content and start just waiting for a pause.&lt;/p&gt;

&lt;p&gt;Signal your structure. "There are three things I want to cover." "First, the data model." "Now, moving to the API design." Sounds almost too simple. It is simple. It works anyway.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. Presenting decisions without trade-offs
&lt;/h2&gt;

&lt;p&gt;"I'd use MongoDB here."&lt;/p&gt;

&lt;p&gt;OK. Why? The interviewer learns nothing from this. You might have a great reason. You might have picked it because it's the only database you know. They can't tell, and they're not going to assume the best.&lt;/p&gt;

&lt;p&gt;Interviewers care about reasoning more than the specific choice. "I'd use MongoDB because our access patterns are key-value lookups and we need horizontal scalability, though we lose strong consistency" — now they can see you've thought about it. The choice itself almost doesn't matter.&lt;/p&gt;

&lt;p&gt;This is where candidates undersell themselves constantly. They have the right instincts but don't say the reasoning out loud. And here's the thing — when I'm interviewing someone and they just state a decision without trade-offs, I have to ask a follow-up to figure out if they actually thought about it. That follow-up eats time, and sometimes the candidate interprets it as "they think I'm wrong" and gets defensive. The whole dynamic goes sideways over something that could've been avoided with one extra sentence.&lt;/p&gt;

&lt;h2&gt;
  
  
  4. Going silent while thinking
&lt;/h2&gt;

&lt;p&gt;Fifteen seconds of silence in an interview feels like fifteen minutes. The interviewer doesn't know if you're thinking deeply or completely stuck. The longer the silence, the more pressure builds, the harder it gets to think. Spiral.&lt;/p&gt;

&lt;p&gt;Think out loud instead. It doesn't have to be coherent. "Let me think about the bottlenecks here... The main concern would be write throughput during peak hours. So I'm thinking about whether to batch writes or use a write-behind cache..." Messy? Yes. But it turns silence into collaboration, and it lets the interviewer nudge you if you're heading in the wrong direction.&lt;/p&gt;

&lt;p&gt;I still struggle with this one, honestly. My natural instinct is to think silently and then present a complete answer. It takes conscious effort to narrate my thinking in real time, and I'm not sure I do it well even now. But even doing it badly is better than going silent.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. The 10-minute monologue
&lt;/h2&gt;

&lt;p&gt;Some candidates start answering and just... don't stop. Five minutes, eight minutes, ten minutes. No pauses, no check-ins. By the time they finish, I wanted to explore a different direction seven minutes ago but couldn't find an opening.&lt;/p&gt;

&lt;p&gt;This comes from anxiety. The feeling that you need to demonstrate everything you know right now, because what if they don't ask a follow-up? But a shorter answer that invites conversation is always better. After covering a major point, pause. "Does this make sense so far?" or "Should I go deeper here or move on?"&lt;/p&gt;

&lt;p&gt;That's it. That's the fix. Turns a presentation into a conversation. The interview should feel like a technical discussion between two engineers, not a lecture.&lt;/p&gt;

&lt;h2&gt;
  
  
  The pattern
&lt;/h2&gt;

&lt;p&gt;None of these are about knowing more stuff. They're about packaging — structure before details, trade-offs explicit, thinking out loud, checking in.&lt;/p&gt;

&lt;p&gt;The annoying part is that reading about these mistakes doesn't fix them. I knew about all five and still made them in actual interviews. Knowing what to do and doing it under pressure are different skills. The only thing that helped was practice — recording myself, hearing the rambling, hearing the silences, and gradually getting better at catching them in real time. Gradually. Not quickly. I'd fix the monologuing and then notice I'd started going silent instead, like some kind of communication whack-a-mole.&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>The skill that gets engineers promoted (it's not coding)</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 12 Mar 2026 09:51:54 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/the-skill-that-gets-engineers-promoted-its-not-coding-1kai</link>
      <guid>https://forem.com/krzysztof_fraus/the-skill-that-gets-engineers-promoted-its-not-coding-1kai</guid>
      <description>&lt;p&gt;At my current company, there are developers who are genuinely better at writing code than the people above them in the hierarchy. Better abstractions, cleaner implementations, faster debugging. And yet they're stuck at mid-level while less technically impressive engineers are getting promoted to senior and staff roles.&lt;/p&gt;

&lt;p&gt;The difference, almost every time, is communication. Which sounds like one of those obvious statements everybody agrees with and nobody acts on. But I want to be specific about what I mean, because "communication" is one of those words that's so broad it's almost useless.&lt;/p&gt;

&lt;h2&gt;
  
  
  What "communication" actually means here
&lt;/h2&gt;

&lt;p&gt;Not giving presentations. Not writing blog posts. The everyday stuff:&lt;/p&gt;

&lt;p&gt;Explaining a technical decision in a PR review so the other person understands the &lt;em&gt;why&lt;/em&gt;, not just the &lt;em&gt;what&lt;/em&gt;. Writing a one-page proposal that a product manager can follow. Jumping on a call during an incident and describing what's broken, what you've tried, and what you need — in under 60 seconds.&lt;/p&gt;

&lt;p&gt;At every company I've worked at, this is what the senior level actually means. The technical bar is table stakes. The thing that blocks most mid-level engineers from promotion is their ability to make their expertise &lt;em&gt;useful to other people&lt;/em&gt;. I've seen this play out with at least four or five developers I've worked with closely — technically strong, but their ideas died in their heads because they couldn't get them across.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "smart but can't explain it" problem
&lt;/h2&gt;

&lt;p&gt;You know someone like this. Maybe you &lt;em&gt;are&lt;/em&gt; someone like this. Technically strong — the person everyone goes to when something is really broken. But their design docs are hard to follow. Their PR descriptions say "fixed the thing." In meetings, they're technically accurate but confusing — jumping into implementation details when the room needs the big picture.&lt;/p&gt;

&lt;p&gt;On the other hand, there are engineers who are maybe 70% as technically strong but who can write a clear RFC, explain a trade-off in plain language, and align a cross-functional team on a direction. Those engineers get promoted. They get the interesting projects. They get asked for their opinion.&lt;/p&gt;

&lt;p&gt;This is probably unfair. I go back and forth on it. On one side, the "smart but can't explain it" person is often doing the hardest actual work. On the other side — if nobody understands your technical direction, it doesn't matter how good it is. The org can't execute on ideas that live only in your head. So is communication a proxy for value, or is it actual value? I lean toward the latter, but I get why people are frustrated by it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why this compounds
&lt;/h2&gt;

&lt;p&gt;Communication compounds in a way that technical skills don't. A framework you learn today might be irrelevant in three years. Your React expertise might become less relevant as the landscape shifts. But explaining a complex system clearly, writing a convincing proposal, breaking down a problem for someone with different context — that transfers everywhere.&lt;/p&gt;

&lt;p&gt;At the staff and principal level, communication &lt;em&gt;is&lt;/em&gt; the job. Writing documents, leading discussions, influencing technical direction across teams. The engineers who can articulate their thinking have an enormous advantage.&lt;/p&gt;

&lt;p&gt;Though I should be honest — I'm not at staff level myself. I'm extrapolating from what I've observed and from conversations with people who are. So take this with appropriate skepticism. What I &lt;em&gt;can&lt;/em&gt; say from direct experience is that communication was the biggest factor in my own jump from mid to senior, and it's the most common gap I see when I interview candidates.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to actually get better at this
&lt;/h2&gt;

&lt;p&gt;Nobody teaches you this. CS programs don't cover it. Bootcamps don't. Most companies just expect you to figure it out, which is kind of insane when you think about it — they'll send you to a $2000 conference to learn a framework but won't invest anything in the skill that actually determines your career trajectory.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Explain your work to people who don't share your context.&lt;/strong&gt; Not just to other engineers on your team — to product managers, designers, new hires. When I was building Prestonly (a language learning app I co-founded), I had to explain technical decisions to my non-technical co-founder constantly. That forced a level of clarity I never needed when talking to other devs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Write shorter.&lt;/strong&gt; I used to write long, detailed technical documents because I thought thoroughness meant quality. A one-page document that makes a clear argument is more valuable than a five-page document that covers every edge case. When I started cutting my design docs in half, my thinking got &lt;em&gt;clearer&lt;/em&gt;, not less thorough.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Notice how other people explain things.&lt;/strong&gt; When someone in a meeting gives a really clear explanation — like, the room visibly gets it — pay attention to what they did. Usually it's something simple: they started with the problem, not the solution. They used an analogy. They said one thing instead of three.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Get feedback on clarity, not just correctness.&lt;/strong&gt; Ask "was that clear?" after explaining something. Not "was that right?" A correct explanation that nobody follows is useless.&lt;/p&gt;

&lt;h2&gt;
  
  
  The uncomfortable part
&lt;/h2&gt;

&lt;p&gt;The reason most engineers don't work on communication is that it's uncomfortable. No compiler tells you whether you got it right. No test suite. You explain something in a meeting and you think it went well, but maybe three people in the room were confused and just didn't say anything. There's no feedback loop unless you actively create one.&lt;/p&gt;

&lt;p&gt;I'm still working on this myself. Writing in English as a non-native speaker (Polish is my first language) adds another layer — sometimes the communication gap isn't about clarity of thought but about vocabulary or phrasing that doesn't land the way I intended. That's a separate problem but it compounds with the same skill deficit.&lt;/p&gt;

&lt;p&gt;But yeah. It's the kind of thing where even a small amount of deliberate practice makes a noticeable difference. Not weeks-to-mastery difference. More like: you start noticing when you're losing people mid-explanation, and you start correcting in real time. That alone is worth a lot.&lt;/p&gt;

</description>
      <category>tutorial</category>
      <category>programming</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Why cramming for technical interviews doesn't work</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 12 Mar 2026 09:50:00 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/why-cramming-for-technical-interviews-doesnt-work-8fh</link>
      <guid>https://forem.com/krzysztof_fraus/why-cramming-for-technical-interviews-doesnt-work-8fh</guid>
      <description>&lt;p&gt;You just got the email: "We'd like to schedule a technical interview." Your first instinct is to block off the weekend and cram. Read every system design article. Grind LeetCode. Review all the CS fundamentals you haven't thought about since university.&lt;/p&gt;

&lt;p&gt;I've done this. More than once. And every time, I walked into the interview feeling like I knew the material — and then couldn't explain basic concepts when someone was actually sitting across from me, waiting.&lt;/p&gt;

&lt;p&gt;The thing is, interviews aren't exams. An exam tests whether you can recall facts. An interview tests whether you can &lt;em&gt;think through problems while explaining your reasoning to another human being.&lt;/em&gt; Those are completely different skills, and one of them can't be crammed.&lt;/p&gt;

&lt;h2&gt;
  
  
  What cramming actually gives you
&lt;/h2&gt;

&lt;p&gt;Fragile knowledge. You know the words — "consistent hashing," "eventual consistency," "dependency injection" — but you know them the way you know a phone number you just looked up. There for about 30 seconds, then gone.&lt;/p&gt;

&lt;p&gt;Compare that to topics you work with daily. If someone asks you about React's rendering pipeline or how you'd structure a tRPC API, you don't need to think about it — you've debugged these things, argued about them in PR reviews, explained them to teammates. That knowledge is &lt;em&gt;settled&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;The crammed version is different. You have the vocabulary but not the understanding. And interviewers can tell within the first follow-up question. They ask "why would you choose that over X?" and suddenly you're exposed, because you memorized the what but never thought about the why.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why daily practice is different
&lt;/h2&gt;

&lt;p&gt;When you spread practice across short sessions, the knowledge consolidates differently. Each session reinforces what you covered before. Your brain actually needs the gaps between sessions to process things — that's not productivity-guru talk, it's how memory consolidation works during sleep.&lt;/p&gt;

&lt;p&gt;But there's a more important benefit that people miss: you build the &lt;em&gt;performance&lt;/em&gt; skill, not just the knowledge. Five minutes a day explaining a concept out loud — actually saying the words, hearing yourself stumble, correcting mid-sentence — after a month you've done this 30 times. The act of speaking about technical concepts stops feeling abnormal. That matters more than most people realize.&lt;/p&gt;

&lt;h2&gt;
  
  
  The habit part
&lt;/h2&gt;

&lt;p&gt;Make it small enough that you can't talk yourself out of it. One question. Explained out loud. Under 5 minutes.&lt;/p&gt;

&lt;p&gt;On days when I was tired or busy, I'd do it while making coffee. On good days, I'd spend 10-15 minutes and go deeper. But the baseline was always 5 minutes, and that baseline is what kept the streak alive. Some days the 5-minute version was genuinely bad — unfocused, half-asleep, barely coherent. Those days still counted. The point is you showed up.&lt;/p&gt;

&lt;p&gt;The other thing that matters: remove the decision of &lt;em&gt;what&lt;/em&gt; to practice. If you have to browse through a question bank and pick something every morning, you'll skip it half the time. Decision fatigue kills habits faster than laziness does. The question should already be there when you sit down — whether that's a random page from a system design book, whatever concept came up at work yesterday, or a tool that picks for you.&lt;/p&gt;

&lt;h2&gt;
  
  
  The gap nobody warns you about
&lt;/h2&gt;

&lt;p&gt;After about two weeks of daily practice, something shifted. I stopped worrying about whether I knew enough. Not because I suddenly knew everything — I definitely didn't. But the &lt;em&gt;act&lt;/em&gt; of explaining things out loud had become normal. The nervousness was still there but it wasn't running the show anymore.&lt;/p&gt;

&lt;p&gt;That gap — between knowing the material and being able to deliver it under pressure — is what daily practice actually closes. Cramming can't touch it because cramming only addresses the knowledge side.&lt;/p&gt;

&lt;p&gt;Here's the uncomfortable irony: the developers who need this most — the ones who know their stuff but &lt;a href="https://www.prepovo.com/blog/why-you-freeze-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=building-a-daily-practice-habit-for-technical-interviews" rel="noopener noreferrer"&gt;freeze under pressure&lt;/a&gt; — are usually the ones who think more studying is the answer. They'll spend another weekend grinding LeetCode instead of spending 5 minutes a day just talking. I was one of them for a long time. It's hard to accept that the problem isn't what you know, it's what you can say.&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>How to practice for technical interviews alone</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Wed, 11 Mar 2026 16:00:41 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/how-to-practice-for-technical-interviews-alone-no-mock-interviewer-needed-p0</link>
      <guid>https://forem.com/krzysztof_fraus/how-to-practice-for-technical-interviews-alone-no-mock-interviewer-needed-p0</guid>
      <description>&lt;p&gt;Every piece of interview advice eventually says the same thing: "Do mock interviews with a friend." Great. Except your dev friends are busy. Or you don't have dev friends in the same time zone. Or — and this is more common than anyone admits — you're embarrassed to stumble through an explanation of event loops in front of someone you respect.&lt;/p&gt;

&lt;p&gt;So you skip the practice that matters most. You go back to reading articles and solving problems on a screen. And then in the actual interview, when you need to &lt;em&gt;talk&lt;/em&gt;, you freeze. Because &lt;a href="https://www.prepovo.com/blog/why-explaining-out-loud-beats-reading?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-practice-for-technical-interviews-alone" rel="noopener noreferrer"&gt;reading and explaining are completely different skills&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Here's the thing — solo practice can be just as effective as mock interviews. Sometimes more effective, because you're not performing for anyone. You're actually learning. The key is doing it right.&lt;/p&gt;

&lt;h2&gt;Why solo practice gets a bad reputation&lt;/h2&gt;

&lt;p&gt;Most people who "practice alone" do this: they read a question, think about the answer in their head, nod to themselves, and move on. That's not practice. That's the illusion of preparation.&lt;/p&gt;

&lt;p&gt;Real solo practice means speaking out loud, recording yourself, timing your responses, and actually evaluating what came out of your mouth. It's uncomfortable. It's supposed to be.&lt;/p&gt;

&lt;p&gt;Mock interviews work because having someone there forces you to verbalize. You can create that forcing function on your own — you just need to be more deliberate about it. Whether it's "just as effective" as having a real person, I honestly don't know. Probably depends on how disciplined you are. But it's infinitely better than reading answers in your head and calling it preparation.&lt;/p&gt;

&lt;h2&gt;The rubber duck, but for real&lt;/h2&gt;

&lt;p&gt;You probably know rubber duck debugging — explain your code to a rubber duck to find bugs. The same principle works for interview prep, but most people do it wrong.&lt;/p&gt;

&lt;p&gt;They explain to the duck the way they'd explain to themselves: skipping steps, using mental shortcuts, glossing over the parts they're fuzzy on. That defeats the purpose.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Set the scene.&lt;/strong&gt; Put something in front of you — a duck, a coffee mug, a stuffed animal, whatever. This isn't silly: a physical "audience" nudges you to treat the explanation as a conversation rather than a monologue into the air.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Start from the question, not the answer.&lt;/strong&gt; Say it out loud first: "Explain how a hash map handles collisions." Then answer as if this person has never heard of a hash map. Don't assume shared context.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Force yourself to use analogies.&lt;/strong&gt; If you can only explain something in textbook terms, you don't understand it well enough. Saying "a hash map is like a filing cabinet where each drawer is labeled with a number, and sometimes two files get assigned the same drawer" proves deeper understanding than reciting "it uses separate chaining or open addressing."&lt;/p&gt;
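&lt;p&gt;If it helps to anchor the analogy, here's a toy sketch of separate chaining — my own illustration, not production code (a real hash map would also resize itself when chains grow long):&lt;/p&gt;

```python
# A minimal separate-chaining hash map, mirroring the filing-cabinet
# analogy: each "drawer" (bucket) holds a chain of files (key-value
# pairs), and two keys can land in the same drawer.
class ChainedHashMap:
    def __init__(self, num_buckets=8):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # The drawer label: hash the key, then wrap into the drawer range.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key already filed: overwrite it
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # collision or not, append to the chain

    def get(self, key):
        # Walk the chain in this drawer until the key matches.
        for k, v in self._bucket(key):
            if k == key:
                return v
        raise KeyError(key)
```

&lt;p&gt;Being able to move between the drawer metaphor and something like this, in either direction, is usually what the interviewer is probing for.&lt;/p&gt;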

&lt;p&gt;&lt;strong&gt;Notice the silence.&lt;/strong&gt; When you pause for more than three seconds, that's a knowledge gap. Don't fill it with "umm" and push through. Stop. Write down what you got stuck on. That's your study list for later.&lt;/p&gt;

&lt;p&gt;I tried this with "explain the CAP theorem" once. In my head, I had it cold. Out loud, I said "consistency, availability, partition tolerance" and then... nothing. I knew what each word meant individually but I couldn't articulate &lt;em&gt;why&lt;/em&gt; you can only pick two, or what that actually means in practice when you're choosing between, say, DynamoDB and Postgres for a specific service. I just stood there in my kitchen talking to a coffee mug, realizing I'd been nodding along to this concept for months without actually understanding it.&lt;/p&gt;

&lt;p&gt;I still don't think I can explain CAP particularly well, honestly. I've since read multiple explanations and I get it better, but the "pick two" framing is itself misleading — it's more like "you always have to handle network partitions, so really you're choosing between C and A during a partition." Every time I try to explain it simply, I end up in this recursion of "well, but it depends on..." which, fine, that's probably more honest than a clean 30-second answer anyway.&lt;/p&gt;

&lt;h2&gt;Record yourself and actually watch it back&lt;/h2&gt;

&lt;p&gt;This is the method everyone recommends and nobody does. Because watching yourself stumble through a technical explanation is genuinely painful. Do it anyway.&lt;/p&gt;

&lt;p&gt;Use your phone's voice recorder or screen recording. Don't overthink the tooling. Pick a question, hit record, answer it as if you're in an interview. Stop recording.&lt;/p&gt;

&lt;p&gt;When you play it back, listen for specific things:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Structure.&lt;/strong&gt; Did you start with a high-level answer before diving into details? Or did you immediately get lost in implementation specifics? Good interview answers follow a pattern: headline answer, then supporting details, then trade-offs. Listen for whether you did that or just... rambled.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Filler words.&lt;/strong&gt; Count the "um"s, "like"s, and "basically"s. Everyone has them. The goal isn't zero — that would sound robotic. But if you're saying "basically" every other sentence, it signals uncertainty.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Technical accuracy.&lt;/strong&gt; Did you say anything wrong? This is where recording beats thinking in your head. When you think about an answer, your brain auto-corrects errors. When you listen to a recording, you catch yourself saying things like "TCP is connectionless" and realize you mixed up TCP and UDP.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Time.&lt;/strong&gt; How long was your answer? Most interview answers should land between one and three minutes. If you went 7 minutes on "what's the difference between a process and a thread," you're going too deep.&lt;/p&gt;

&lt;p&gt;Don't record and review in the same session. Record three answers in the morning. Listen during your commute or lunch break. The time gap helps you evaluate more objectively — you're less attached to what you said.&lt;/p&gt;

&lt;p&gt;After a week of this, you'll notice patterns. Maybe you always forget to mention trade-offs. Maybe your opening sentences are consistently weak. These patterns are invisible without recordings.&lt;/p&gt;

&lt;h2&gt;Timed verbal drills&lt;/h2&gt;

&lt;p&gt;Interviews have time pressure. Your solo practice should too.&lt;/p&gt;

&lt;p&gt;Set a timer for 2 minutes. Read a question. Start explaining immediately. When the timer goes off, stop — even mid-sentence.&lt;/p&gt;

&lt;p&gt;This builds three things at once:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Speed of retrieval.&lt;/strong&gt; In an interview, you don't have 30 seconds to collect your thoughts. The &lt;a href="https://www.prepovo.com/blog/first-30-seconds-interview-answer?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-practice-for-technical-interviews-alone" rel="noopener noreferrer"&gt;first few seconds after a question&lt;/a&gt; matter a lot — they set the interviewer's expectation. Timed drills train you to start talking quickly with a structured opener.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Prioritization.&lt;/strong&gt; Two minutes isn't enough to cover everything about "how does garbage collection work in Java." You have to decide what matters most. That decision-making is exactly what interviews evaluate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Comfort with pauses.&lt;/strong&gt; Paradoxically, timed drills teach you that brief pauses are fine. When you know you only have 2 minutes, a 3-second pause to collect your thoughts feels less catastrophic than it does when you have unlimited time and are spiraling.&lt;/p&gt;

&lt;p&gt;Some variations:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The 30-second version.&lt;/strong&gt; Explain a concept in 30 seconds or less. This forces extreme clarity. If you can explain event-driven architecture in 30 seconds, the 3-minute version becomes easy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The pivot drill.&lt;/strong&gt; Start explaining one concept. After 60 seconds, switch to a related concept without starting over. "I was explaining REST APIs, now let me contrast that with GraphQL." This simulates how real interviews flow — interviewers redirect you constantly.&lt;/p&gt;

&lt;p&gt;I'll be honest, I don't actually do the pivot drill much. It feels forced when I try it. The 30-second version though — that one genuinely helps.&lt;/p&gt;

&lt;h2&gt;Self-evaluation (the part most people skip)&lt;/h2&gt;

&lt;p&gt;Here's where most solo practice falls apart. People practice, but they don't evaluate. And &lt;a href="https://www.prepovo.com/blog/why-you-freeze-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-practice-for-technical-interviews-alone" rel="noopener noreferrer"&gt;without evaluation, you're just repeating the same mistakes&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;After each practice answer, rate yourself on a few things:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Accuracy&lt;/strong&gt; — Was the technical content correct?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Structure&lt;/strong&gt; — Did you organize your answer logically?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Clarity&lt;/strong&gt; — Would a non-expert follow your explanation?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Completeness&lt;/strong&gt; — Did you cover the key points without going overboard?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The hard part is being honest. You'll be too easy on yourself ("that was fine") or too harsh ("I'm terrible at everything"). Both are useless. I still struggle with this — my self-evaluations are all over the place depending on my mood.&lt;/p&gt;

&lt;p&gt;One thing that helped: write down your evaluation &lt;em&gt;before&lt;/em&gt; you listen to the recording. Then listen and re-evaluate. The gap between the two is where the learning happens. Sometimes the gap is embarrassingly large. You think you nailed it and then you listen back and it's just... you saying "basically" eleven times while circling the same point.&lt;/p&gt;

&lt;h2&gt;Making it a daily thing&lt;/h2&gt;

&lt;p&gt;These methods work best when you do them consistently, not in a weekend cram session. Something like 15-20 minutes a day:&lt;/p&gt;

&lt;p&gt;Pick a question. Talk through it with your rubber duck first — just to find the gaps. Then record a clean 2-3 minute answer applying what you learned. Score yourself. Listen to the recording later.&lt;/p&gt;

&lt;p&gt;That's it. Fifteen minutes a day, and you're doing more effective interview practice than someone who blocks out an entire Saturday for mock interviews once a month.&lt;/p&gt;

&lt;p&gt;The key is &lt;a href="https://www.prepovo.com/blog/building-a-daily-practice-habit-for-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-practice-for-technical-interviews-alone" rel="noopener noreferrer"&gt;building this into a habit&lt;/a&gt;, not treating it as a one-time thing. Twenty days of short sessions beats two marathon sessions. Right?&lt;/p&gt;

&lt;h2&gt;When solo practice isn't enough&lt;/h2&gt;

&lt;p&gt;I should be honest here. Solo practice has real limits.&lt;/p&gt;

&lt;p&gt;You can't simulate follow-up questions well. In a real interview, the interviewer probes your weak spots. When you practice alone, you tend to avoid the uncomfortable areas — you don't even realize you're doing it.&lt;/p&gt;

&lt;p&gt;Self-evaluation is inherently biased. You don't know what you don't know. If you misunderstand how connection pooling works, you'll confidently explain the wrong thing and give yourself high marks.&lt;/p&gt;

&lt;p&gt;And there's no feedback on how you &lt;em&gt;sound&lt;/em&gt; to someone else. You might think your explanation of microservices is crystal clear, but an actual listener might be lost by sentence three.&lt;/p&gt;

&lt;p&gt;This is where tools can help. &lt;a href="https://www.prepovo.com/interview-preparation?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-practice-for-technical-interviews-alone" rel="noopener noreferrer"&gt;Prepovo&lt;/a&gt; gives you a daily technical question, you explain the answer out loud, and AI evaluates your response on clarity, accuracy, and structure. It fills the gap between solo practice and having a real person listen — you get external feedback without the scheduling or the awkwardness.&lt;/p&gt;

&lt;p&gt;But whether you use a tool or not, the solo methods in this article work. The tool just makes the feedback loop tighter.&lt;/p&gt;

&lt;h2&gt;The real question&lt;/h2&gt;

&lt;p&gt;Why are you preparing alone in the first place?&lt;/p&gt;

&lt;p&gt;Usually it's one of three reasons: you don't have the right connections, the timing doesn't work, or you're not confident enough yet to practice in front of someone.&lt;/p&gt;

&lt;p&gt;That third reason is the most interesting. People want to get "good enough" before they'll practice with another person. But getting good enough &lt;em&gt;is&lt;/em&gt; the practice. There's no level of reading that prepares you to speak.&lt;/p&gt;

&lt;p&gt;Solo practice lowers the stakes enough to start. Nobody's watching. Nobody's judging. It's just you and a recording app and 5 minutes.&lt;/p&gt;

&lt;p&gt;Pick one technical concept you'd expect to be asked about. Explain it out loud right now. Notice where you stumble.&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>How to say 'I don't know' in a technical interview</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Wed, 11 Mar 2026 16:00:20 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/how-to-say-i-dont-know-in-a-technical-interview-ajk</link>
      <guid>https://forem.com/krzysztof_fraus/how-to-say-i-dont-know-in-a-technical-interview-ajk</guid>
      <description>&lt;p&gt;At some point in every technical interview, you hit a question you can't answer. Maybe it's a distributed systems concept you've only read about. Maybe it's a language you've never used. Maybe the interviewer just went deeper than your experience goes.&lt;/p&gt;

&lt;p&gt;That moment — when you realize you don't know — is one of the highest-signal moments in the entire interview. Not because of what you know, but because of what you do next.&lt;/p&gt;

&lt;h2&gt;Most people either freeze or fake it&lt;/h2&gt;

&lt;p&gt;Either you go silent — "I don't know" — and wait for the next question, or you start talking and hope something coherent comes out.&lt;/p&gt;

&lt;p&gt;Faking it is worse. When you bluff through an answer, the interviewer usually notices. They probe deeper. You get more tangled. The credibility you built over the last 40 minutes starts to erode. I've seen this from the interviewer side — you can almost watch the candidate realize they're in too deep, and then they just keep going anyway because stopping feels like giving up.&lt;/p&gt;

&lt;p&gt;The urge to fill silence with &lt;em&gt;something&lt;/em&gt; is hard to resist in the moment. &lt;a href="https://www.prepovo.com/blog/why-you-freeze-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-say-i-dont-know-in-interviews" rel="noopener noreferrer"&gt;Same freeze response&lt;/a&gt; that hits people who technically know the material.&lt;/p&gt;

&lt;h2&gt;What works better&lt;/h2&gt;

&lt;p&gt;There's a middle ground between "I don't know" and faking it. It's not complicated, but it takes practice to do under pressure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Name the gap, then bridge to what you do know.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;That's basically it. You acknowledge what you don't know — directly, without apologizing or hedging for thirty seconds — and then you connect it to something you &lt;em&gt;can&lt;/em&gt; reason about.&lt;/p&gt;

&lt;p&gt;Something like:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"I haven't worked with consensus protocols directly. But I've dealt with consistency trade-offs in distributed caches — I know the problem space. Let me think through how that might apply here."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Why this works: it gives the interviewer a map of your knowledge. They can see the boundary, and they can see that you don't just stop at the boundary. You reason past it. That's what engineers do on the job every day — nobody knows everything, and the good ones figure things out from adjacent knowledge.&lt;/p&gt;

&lt;p&gt;Sometimes the question is so far from your experience that there's no obvious bridge. No adjacent knowledge to draw from. In that case, just be honest: "That's outside my experience. I don't want to guess and waste your time — can we come back to it, or would you like me to reason from first principles?"&lt;/p&gt;

&lt;p&gt;I wish I could say this always goes well. Sometimes the interviewer just moves on and you can feel the score dropping. That's the reality — not every interviewer values honesty the same way. But the alternative (bluffing through something you clearly don't know) has a worse expected outcome.&lt;/p&gt;

&lt;h2&gt;Why interviewers push past your knowledge&lt;/h2&gt;

&lt;p&gt;A lot of candidates think hitting a question they can't answer means they're failing. Usually it means the interviewer is doing their job.&lt;/p&gt;

&lt;p&gt;Good interviews are supposed to find your edges. If you answer everything perfectly, the interviewer hasn't probed deep enough. They &lt;em&gt;need&lt;/em&gt; to see how you handle the gap — that's where they learn the most about how you actually work.&lt;/p&gt;

&lt;p&gt;Sometimes it's not even intentional. The interviewer asks something they think is straightforward and it happens to be outside your background. Either way, the response that works is the same: be honest, bridge to what you know, reason forward.&lt;/p&gt;

&lt;h2&gt;A couple of examples&lt;/h2&gt;

&lt;p&gt;I don't want to script out perfect responses — that's part of the problem with most interview advice. But here's roughly what this looks like:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"How would you implement rate limiting in a distributed system?"&lt;/strong&gt; (and you've only done it on a single server)&lt;/p&gt;

&lt;p&gt;"I've done token bucket rate limiting on a single server, but not the distributed case. The obvious problem is each node only sees local traffic. I'd probably start with a centralized counter in Redis — it's simple, even if it adds a network hop. Not sure if there's a better approach for high-throughput, but that's where I'd start."&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;"What's your experience with Kubernetes operators?"&lt;/strong&gt; (and you've used Kubernetes but never written one)&lt;/p&gt;

&lt;p&gt;"I've deployed stuff on Kubernetes — Helm charts, HPAs, the usual. Never written an operator. My understanding is it's basically a custom controller that watches CRDs and reconciles state. I'd probably reach for the Operator SDK if I needed to build one, but I haven't actually done it."&lt;/p&gt;

&lt;p&gt;Both of these are rough on purpose. They end with uncertainty. They don't cover everything. That's the whole point — you're showing the boundary of your knowledge, not pretending it doesn't exist.&lt;/p&gt;

&lt;h2&gt;The trap&lt;/h2&gt;

&lt;p&gt;The failure mode that's worse than "I don't know" is confidently giving a wrong answer. Interviewers can work with gaps. They can't work with someone who doesn't know what they don't know.&lt;/p&gt;

&lt;p&gt;The gap between &lt;em&gt;knowing&lt;/em&gt; you should be honest and &lt;em&gt;actually being honest&lt;/em&gt; under interview pressure is bigger than you'd think. Your brain genuinely prefers fabricating a plausible-sounding answer over sitting in the discomfort of admitting a gap. It's not a character flaw — it's how stress works.&lt;/p&gt;

&lt;p&gt;There's a cultural layer here too, at least from my experience. In Poland — and from what I've seen in a lot of European tech companies — admitting you don't know something is less stigmatized than in some US interview cultures where confidence is weighted heavily. I don't know if that's actually true or just my perception. But it's worth thinking about who's interviewing you and what they might value.&lt;/p&gt;

&lt;h2&gt;Practicing this&lt;/h2&gt;

&lt;p&gt;You can't get better at this by reading about it. You need to actually say the words out loud, in a situation where you feel at least some pressure.&lt;/p&gt;

&lt;p&gt;Pick a topic adjacent to your expertise but outside your comfort zone. Have someone ask you a question about it. The goal isn't to study the topic first — it's to practice the moment of hitting the wall and responding well.&lt;/p&gt;

&lt;p&gt;Record yourself. Listen back. You'll hear the hesitation, the filler words, &lt;a href="https://www.prepovo.com/blog/how-to-stop-rambling-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-say-i-dont-know-in-interviews" rel="noopener noreferrer"&gt;the rambling&lt;/a&gt; that happens when you're buying time.&lt;/p&gt;

&lt;p&gt;The goal isn't to eliminate "I don't know" from your interviews. It's to make sure that when it happens — and it will — you handle it in a way that shows how you think, not just what you've memorized.&lt;/p&gt;

&lt;p&gt;One last thing. I've noticed that the candidates who handle "I don't know" best are usually the ones who are most comfortable with it in their daily work too. The developer who says "I don't know, let me check" in a code review handles the interview version better than the one who always has an answer for everything. Something to think about.&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>How to stop rambling in technical interviews (the 60-second rule)</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Wed, 11 Mar 2026 15:59:45 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/how-to-stop-rambling-in-technical-interviews-the-60-second-rule-5f7d</link>
      <guid>https://forem.com/krzysztof_fraus/how-to-stop-rambling-in-technical-interviews-the-60-second-rule-5f7d</guid>
      <description>&lt;p&gt;You know the answer. You know it &lt;em&gt;well&lt;/em&gt;. So you start talking. You cover the main concept, then the edge cases, then the historical context of why it works that way, then a tangent about a production incident from last year — and somewhere around the three-minute mark, the interviewer's eyes glaze over.&lt;/p&gt;

&lt;p&gt;Rambling is the senior engineer's interview disease. Juniors tend to give answers that are too short. Seniors give answers that are too long. And the more you know about a topic, the worse the rambling gets.&lt;/p&gt;

&lt;h2&gt;Why developers ramble&lt;/h2&gt;

&lt;p&gt;Three things drive it, and they usually stack on top of each other.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Anxiety disguised as thoroughness.&lt;/strong&gt; Interview nerves create a compulsion to fill silence. Your brain interprets a pause as a failure signal, so you keep talking. You tell yourself you're being thorough. You're actually being nervous.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The proof impulse.&lt;/strong&gt; You want to demonstrate that you &lt;em&gt;really&lt;/em&gt; know this. Not just the textbook answer — you know the edge cases, the gotchas, the production reality. So you keep adding layers of evidence, hoping each one pushes your score higher. It doesn't. After a certain point, you're not adding signal. You're adding noise.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;No defined stopping point.&lt;/strong&gt; When you write code, you know when a function is done — it returns a value. When you speak in an interview, there's no natural return statement. Your mouth just keeps running until you trail off with "...so, yeah."&lt;/p&gt;

&lt;p&gt;The third one is the real problem. The first two go away if you have a structure to fall back on.&lt;/p&gt;

&lt;h2&gt;The cost of rambling&lt;/h2&gt;

&lt;p&gt;A rambling answer hurts you in ways that aren't obvious:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;It obscures your key point.&lt;/strong&gt; When everything gets equal airtime, nothing stands out.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;It eats the clock.&lt;/strong&gt; A 45-minute interview has room for maybe 4-6 substantial questions. If your answers run 5 minutes each instead of 2, the interviewer gets less data to evaluate you on.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;It signals how you'll communicate on the job.&lt;/strong&gt; Design reviews, incident responses, Slack threads — if your interview answers ramble, your daily communication probably does too. Right?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;It takes control away from the interviewer.&lt;/strong&gt; They have a plan. They want to probe specific areas. When you monologue for four minutes, you're not letting them drive. Most won't interrupt — they'll just wait it out and move on.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;The 60-second rule&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Your first answer to any interview question should be under 60 seconds.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Not 60 seconds total. 60 seconds for your &lt;em&gt;initial&lt;/em&gt; response. Then stop. Let the interviewer decide where to go next.&lt;/p&gt;

&lt;p&gt;This feels counterintuitive. Sixty seconds isn't enough to cover everything about database indexing or event-driven architecture. That's exactly the point. You're not supposed to cover everything. You're supposed to cover the &lt;em&gt;most important thing&lt;/em&gt; and then let the interviewer pull for more.&lt;/p&gt;

&lt;p&gt;Think of it like an API. Your first response is the summary endpoint. If the interviewer wants the detail endpoint, they'll ask for it. Don't return the entire database on every request.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;It forces you to prioritize.&lt;/strong&gt; When you have 60 seconds, you can't cover everything. You have to decide: what's the single most important thing the interviewer should take away?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;It creates a conversation.&lt;/strong&gt; Interviews that go well feel like technical discussions, not presentations. Short answers invite follow-up questions. Follow-ups let you demonstrate depth on the things the interviewer actually cares about — not the things you assumed they'd care about.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;It gives you feedback.&lt;/strong&gt; When the interviewer asks a follow-up, they're telling you what they want to hear about. A 4-minute monologue gives you zero feedback until it's over.&lt;/p&gt;

&lt;p&gt;I should mention — 60 seconds is arbitrary. Some questions genuinely need 90 seconds. System design walkthroughs are a different thing entirely. The number isn't the point. The habit of stopping early and letting the interviewer steer is the point.&lt;/p&gt;

&lt;h2&gt;A simple structure: claim, evidence, stop&lt;/h2&gt;

&lt;p&gt;Okay, 60 seconds. What goes in them?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Claim (about 10 seconds).&lt;/strong&gt; State your answer directly. No preamble, no "that's a great question." Lead with the point.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"The main benefit of event sourcing is auditability — you get a complete history of every state change."&lt;/li&gt;
&lt;li&gt;"I'd use a message queue here rather than synchronous HTTP calls."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is what the &lt;a href="https://www.prepovo.com/blog/first-30-seconds-interview-answer?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-stop-rambling-in-technical-interviews" rel="noopener noreferrer"&gt;first 30 seconds&lt;/a&gt; are about — anchoring the interviewer to your key point before details muddy the water.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Evidence (about 40 seconds).&lt;/strong&gt; Support your claim with one or two concrete points. Not five. Pick the strongest.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A technical reason ("Because event sourcing stores immutable events, you can replay them to debug production issues or build new read models after the fact.")&lt;/li&gt;
&lt;li&gt;A trade-off ("The downside is eventual consistency — your read models will lag behind writes, which doesn't work for every use case.")&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Two points is almost always enough. It feels like you're leaving stuff out. You are.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Stop (0 seconds).&lt;/strong&gt; This is the hard part. Actually stop talking. Don't add "and also..." Don't trail into a tangent. Just stop.&lt;/p&gt;

&lt;p&gt;The silence feels uncomfortable. Your brain will scream at you to fill it. Let it scream. The interviewer needs a moment to process what you said.&lt;/p&gt;

&lt;p&gt;Stopping mid-thought when you know there's more to say goes against every instinct. I still struggle with this in meetings — I'll make my point and then keep talking for another 30 seconds, adding nothing. In an interview the stakes are higher and the instinct is stronger.&lt;/p&gt;

&lt;h2&gt;Recognizing the ramble&lt;/h2&gt;

&lt;p&gt;Here's something worth trying. Pick a technical concept you know well — dependency injection, the event loop, whatever. Set a timer and explain it out loud.&lt;/p&gt;

&lt;p&gt;Most developers blow past 60 seconds without getting to a clear point. They start with context, then definitions, then history, and &lt;em&gt;then&lt;/em&gt; maybe get to the actual answer.&lt;/p&gt;

&lt;p&gt;That gap — between how well you understand something and how concisely you can explain it — is the whole problem. Knowing something and being able to say it clearly in real time are two different skills. The second one only improves with &lt;a href="https://www.prepovo.com/blog/why-explaining-out-loud-beats-reading?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-stop-rambling-in-technical-interviews" rel="noopener noreferrer"&gt;actual verbal practice&lt;/a&gt;, not just reading about it.&lt;/p&gt;

&lt;h2&gt;When you're unsure, you ramble more&lt;/h2&gt;

&lt;p&gt;Rambling gets especially bad when you don't fully know the answer. You talk in circles, hoping you'll stumble into something coherent, because admitting uncertainty feels dangerous.&lt;/p&gt;

&lt;p&gt;It's usually not dangerous. Saying "I'm not sure about the specifics of X, but here's how I'd approach it" is a much better signal than a three-minute non-answer. Interviewers can tell when you're stalling. Most would rather hear you name what you don't know and reason from there.&lt;/p&gt;

&lt;p&gt;There's more to say about handling the "I don't know" moment — it's its own skill. I wrote about it &lt;a href="https://www.prepovo.com/blog/how-to-say-i-dont-know-in-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=how-to-stop-rambling-in-technical-interviews" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  What happens after
&lt;/h2&gt;

&lt;p&gt;After your 60-second answer, the interviewer will do one of three things:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Ask you to go deeper.&lt;/strong&gt; Now you know exactly what they want. Give another short chunk on that specific subtopic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Move to a new question.&lt;/strong&gt; Your answer was sufficient. A lot of people interpret this as "I didn't say enough." Try reframing it: you said enough.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Push back.&lt;/strong&gt; This is actually good — it means you said something substantive enough to debate. Engage with their specific point. Don't retreat into "well, it depends."&lt;/p&gt;

&lt;h2&gt;
  
  
  Building the habit
&lt;/h2&gt;

&lt;p&gt;You can't just decide to be concise in the interview. It has to be a habit you've already built.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Time yourself.&lt;/strong&gt; Record yourself answering a question. If your first answer is over 90 seconds, it's too long. Cut it and try again. The first recording will be painful to listen to. Every recording after that gets slightly less painful.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Practice the stop.&lt;/strong&gt; The hardest part isn't talking for 60 seconds — it's stopping after 60 seconds. Practice sitting in the silence for a few seconds after you finish.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Use the structure outside of interviews.&lt;/strong&gt; Code reviews, team meetings, Slack answers. The more places you practice claim-evidence-stop, the more automatic it becomes when the stakes are real. I started doing this in PR descriptions — state the change, give the reason, stop. No three-paragraph context essay.&lt;/p&gt;

&lt;p&gt;One thing I haven't figured out: how to practice this for system design questions, where the format is fundamentally different. The 60-second rule breaks down when you're expected to lead a 30-minute discussion. Maybe it still applies to each sub-answer within the discussion? Not sure.&lt;/p&gt;

&lt;p&gt;Pick a technical topic you could talk about for ten minutes. Explain it in under 60 seconds, out loud. Record it. Play it back.&lt;/p&gt;

&lt;p&gt;Were you under 60 seconds? Did you actually stop?&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>The first 30 seconds of your interview answer matter more than the rest</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Thu, 26 Feb 2026 10:49:48 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/the-first-30-seconds-of-your-interview-answer-matter-more-than-the-rest-3fig</link>
      <guid>https://forem.com/krzysztof_fraus/the-first-30-seconds-of-your-interview-answer-matter-more-than-the-rest-3fig</guid>
      <description>&lt;p&gt;A few months ago I was interviewing a backend developer. Mid-level, solid resume, clearly knew his stuff. I asked him: "What's the difference between SQL and NoSQL databases?" And he started with: "Yeah so, that's a good question. Basically, um, there are a lot of differences, but I think the main thing is..."&lt;/p&gt;

&lt;p&gt;He talked for about two minutes. The answer was actually decent — he covered schema flexibility, horizontal scaling, trade-offs for different use cases. But by the time he got to the good stuff, I'd already written "communication — needs work" in my notes. That's unfair, probably. But that's how it works — the first impression gets made before you even get to your actual point, and most interviewers do the same thing without even thinking about it.&lt;/p&gt;

&lt;p&gt;I do the exact same thing. Not just in interviews — in meetings, when someone asks me a question I wasn't expecting. This "so basically..." warm-up before getting to the actual point. My mouth starts before my brain figures out what to say first.&lt;/p&gt;

&lt;p&gt;Your opening sentence is doing way more work than you think. Get it right, and the interviewer relaxes. Get it wrong, and you spend the next five minutes trying to recover.&lt;/p&gt;

&lt;h2&gt;
  
  
  The preamble problem
&lt;/h2&gt;

&lt;p&gt;You've heard these openers. You've probably used them. I definitely have:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;"That's a great question..."&lt;/li&gt;
&lt;li&gt;"So basically..."&lt;/li&gt;
&lt;li&gt;"Hmm, let me think... so there are many ways to look at this..."&lt;/li&gt;
&lt;li&gt;"That's actually something I've been meaning to read more about..."&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;These are filler. They tell the interviewer one of two things: you're stalling because you don't know the answer, or you haven't practiced saying what you know. Neither helps.&lt;/p&gt;

&lt;p&gt;The worst version is when you spend 20 seconds talking &lt;em&gt;about&lt;/em&gt; what you're going to talk about instead of actually talking about it. "So there are several differences between SQL and NoSQL, and I think it depends on the use case, but generally speaking, the way I think about it is..." You've said nothing. Thirty seconds gone.&lt;/p&gt;

&lt;p&gt;Once I sat on the interviewer side, the pattern became really obvious. The candidates who skip the warm-up and lead with substance stand out immediately. Not because they're smarter. Because they sound like they've done this before.&lt;/p&gt;

&lt;h2&gt;
  
  
  Lead with the answer, not the journey
&lt;/h2&gt;

&lt;p&gt;Here's a rule that fixed most of my weak openings: &lt;strong&gt;start with your conclusion.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The question is "What's the difference between SQL and NoSQL databases?" A strong opening:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;"The core difference is the data model. SQL databases enforce a fixed schema with relational tables, while NoSQL databases use flexible schemas — document, key-value, columnar, or graph — optimized for specific access patterns."&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Two sentences. The interviewer now knows you understand the fundamental distinction, you can articulate it, and you know there are multiple NoSQL categories. That's it. You've bought yourself the next two minutes.&lt;/p&gt;

&lt;p&gt;Compare that to: "So basically, SQL is like traditional databases and NoSQL is like newer databases that are more flexible..." Same general direction, but it sounds vague. The interviewer has to do the work of extracting your point. And most won't bother.&lt;/p&gt;

&lt;p&gt;This is the &lt;strong&gt;inverted pyramid&lt;/strong&gt; — the thing journalists use. Most important information first. Detail and nuance after. If the interviewer interrupts you at any point, they've already heard your strongest material.&lt;/p&gt;
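&lt;p&gt;The schema distinction in that strong answer can be made concrete in a few lines. This is a toy sketch, not a database tutorial: SQLite stands in for the relational side, and a plain Python dict stands in for a document store.&lt;/p&gt;

```python
import sqlite3

# Fixed schema: the relational side rejects data that doesn't fit the table definition.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT NOT NULL)")
db.execute("INSERT INTO users (email) VALUES (?)", ("a@example.com",))

try:
    # Unknown column: a schema-enforcing store raises immediately.
    db.execute("INSERT INTO users (nickname) VALUES (?)", ("al",))
except sqlite3.OperationalError as e:
    schema_error = str(e)

# Flexible schema: a document store (dict as a stand-in) accepts heterogeneous records.
documents = [
    {"email": "a@example.com"},
    {"email": "b@example.com", "nickname": "bee", "tags": ["admin"]},
]

print(schema_error)   # table users has no column named nickname
print(len(documents))  # 2
```

&lt;p&gt;That's the whole "fixed schema vs. flexible schema" point in executable form: one side fails fast on shape mismatches, the other pushes shape decisions to the application.&lt;/p&gt;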

&lt;p&gt;I should say — this sounds clean on paper but it's genuinely hard in the moment. When you're nervous and someone asks you about CAP theorem, your brain doesn't neatly produce a conclusion-first answer. It produces mush. The only thing that helps is having done it enough times that the structure is automatic.&lt;/p&gt;

&lt;h2&gt;
  
  
  Three patterns that handle most questions
&lt;/h2&gt;

&lt;p&gt;These handle probably 80-90% of what you'll get asked. I stumbled into them after doing enough interviews on both sides — they're not from a book.&lt;/p&gt;

&lt;h3&gt;
  
  
  The direct definition
&lt;/h3&gt;

&lt;p&gt;For "What is X?" or "Explain X" questions.&lt;/p&gt;

&lt;p&gt;Define the thing in one sentence, then immediately add &lt;em&gt;why it matters&lt;/em&gt; or &lt;em&gt;when you'd use it&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;Question: "What is a reverse proxy?"&lt;/p&gt;

&lt;p&gt;Weak: "So a reverse proxy is like... it's kind of the opposite of a regular proxy. It sits in front of your servers."&lt;/p&gt;

&lt;p&gt;Strong: "A reverse proxy sits between clients and your backend servers, forwarding requests on the client's behalf. You'd typically use one for load balancing, SSL termination, or caching — Nginx and HAProxy are the common choices."&lt;/p&gt;

&lt;p&gt;The strong version defines it, gives the use cases, and drops concrete examples. Three sentences. The interviewer already knows you've actually worked with this.&lt;/p&gt;
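&lt;p&gt;If the interviewer pushes on the load-balancing use case, the core idea fits in a few lines. This is a toy round-robin picker, not real Nginx or HAProxy behavior, and the backend addresses are made up for the example.&lt;/p&gt;

```python
from itertools import cycle

# Toy illustration of the load-balancing role a reverse proxy plays:
# requests arrive at one public address and get fanned out across backends.
# The addresses below are invented for the example.
backends = ["10.0.0.1:8080", "10.0.0.2:8080", "10.0.0.3:8080"]
next_backend = cycle(backends).__next__

def route(path: str) -> str:
    """Pick the upstream URL for an incoming request (round-robin)."""
    return f"http://{next_backend()}{path}"

print(route("/api/users"))  # http://10.0.0.1:8080/api/users
print(route("/api/users"))  # http://10.0.0.2:8080/api/users
```

&lt;p&gt;Real proxies layer health checks, SSL termination, and caching on top, but "one public endpoint, many upstreams" is the sentence the code earns you.&lt;/p&gt;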

&lt;h3&gt;
  
  
  The trade-off frame
&lt;/h3&gt;

&lt;p&gt;For "How would you..." or "Which would you choose..." questions.&lt;/p&gt;

&lt;p&gt;State your choice and the reason in one sentence. Then acknowledge the trade-off.&lt;/p&gt;

&lt;p&gt;Question: "How would you handle session management in a distributed system?"&lt;/p&gt;

&lt;p&gt;Weak: "That's a really interesting problem. There are a lot of ways to handle it. I guess it depends on the requirements..."&lt;/p&gt;

&lt;p&gt;Strong: "I'd use a centralized session store like Redis rather than sticky sessions, because it lets you scale horizontally without routing constraints. The trade-off is an extra network hop per request and a dependency on Redis availability, which you'd mitigate with replication."&lt;/p&gt;

&lt;p&gt;You've made a decision, justified it, and shown you understand the cost. That's what senior engineers do — make decisions and own the trade-offs.&lt;/p&gt;
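&lt;p&gt;If you're then asked to sketch that answer, the session-store interface is small. Here's a minimal sketch with an in-memory dict standing in for Redis (which would do the same thing with SET-plus-TTL and GET); the TTL value is an arbitrary example.&lt;/p&gt;

```python
import time
import uuid

# Dict-backed sketch of a centralized session store. In production this role
# is played by Redis; the in-memory dict just makes the interface concrete.
SESSION_TTL_SECONDS = 1800  # illustrative value

_store: dict[str, tuple[float, dict]] = {}

def create_session(user_id: str) -> str:
    """Create a session that any app server can later look up by ID."""
    session_id = uuid.uuid4().hex
    _store[session_id] = (time.time() + SESSION_TTL_SECONDS, {"user_id": user_id})
    return session_id

def get_session(session_id: str):
    """Return session data, or None if the session is missing or expired."""
    entry = _store.get(session_id)
    if entry is None or entry[0] < time.time():
        _store.pop(session_id, None)  # lazily evict expired sessions
        return None
    return entry[1]

sid = create_session("user-42")
print(get_session(sid))        # {'user_id': 'user-42'}
print(get_session("missing"))  # None
```

&lt;p&gt;The point the sketch makes: because the store is shared, any server can handle any request, which is exactly why you no longer need sticky routing.&lt;/p&gt;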

&lt;h3&gt;
  
  
  The scope-setter
&lt;/h3&gt;

&lt;p&gt;For broad questions like system design.&lt;/p&gt;

&lt;p&gt;Acknowledge the breadth, then narrow to a starting point.&lt;/p&gt;

&lt;p&gt;Question: "How would you design a URL shortener?"&lt;/p&gt;

&lt;p&gt;Weak: "OK so... a URL shortener. Let me think about this. So you need to take a long URL and make it short..."&lt;/p&gt;

&lt;p&gt;Strong: "I'll break this into three parts: the shortening algorithm and storage, the redirect service, and analytics. Let me start with the core — generating and storing short URLs — then we can expand to the other pieces."&lt;/p&gt;

&lt;p&gt;You've shown the interviewer your mental model. They know where you're going. They can relax and listen instead of wondering if you have a plan.&lt;/p&gt;
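&lt;p&gt;And if the interviewer says "great, start with the shortening algorithm," base62-encoding a database ID is the classic opening move. A minimal sketch, assuming an auto-increment row ID as the input:&lt;/p&gt;

```python
import string

# Encode an auto-increment database ID in base62 to get a compact,
# collision-free short code (0-9, a-z, A-Z = 62 characters).
ALPHABET = string.digits + string.ascii_lowercase + string.ascii_uppercase

def encode(n: int) -> str:
    """Turn a numeric row ID into a short code."""
    if n == 0:
        return ALPHABET[0]
    out = []
    while n:
        n, rem = divmod(n, 62)
        out.append(ALPHABET[rem])
    return "".join(reversed(out))

def decode(code: str) -> int:
    """Invert encode: short code back to the row ID."""
    n = 0
    for ch in code:
        n = n * 62 + ALPHABET.index(ch)
    return n

print(encode(125))                     # 21
print(decode(encode(10**9)) == 10**9)  # True
```

&lt;p&gt;Seven base62 characters cover about 3.5 trillion IDs, which is usually the follow-up question. From here the discussion expands naturally into storage and the redirect service.&lt;/p&gt;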

&lt;h2&gt;
  
  
  Why this is harder than it sounds
&lt;/h2&gt;

&lt;p&gt;Most developers who give weak openings &lt;em&gt;do&lt;/em&gt; know the answer. The problem isn't knowledge. It's that speaking your answer out loud, on the spot, with someone evaluating you — that's a completely different skill than knowing the answer in your head.&lt;/p&gt;

&lt;p&gt;I could write a perfectly clear technical document about how a load balancer works. Explain it to a colleague over coffee, no problem. But put me in an interview, and my mouth would start before my brain finished organizing. The result: "so basically, a load balancer is like..."&lt;/p&gt;

&lt;p&gt;This is the gap between &lt;a href="https://www.prepovo.com/blog/why-you-freeze-in-technical-interviews?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=first-30-seconds-interview-answer" rel="noopener noreferrer"&gt;knowing and articulating&lt;/a&gt;. Writing gives you a backspace key. Speaking doesn't. Interviews don't care about what you know — they care about what comes out of your mouth in real time.&lt;/p&gt;

&lt;p&gt;The fix isn't memorizing opening lines. It's practicing the &lt;em&gt;act of formulating a first sentence&lt;/em&gt; under time pressure, over and over, until your brain learns to organize before your mouth starts.&lt;/p&gt;

&lt;p&gt;Although — I've started to wonder if some people are just naturally better at this. The "think before you speak" thing. I know developers who can produce a clean first sentence on any topic, no practice needed. Is that a skill they built or a personality trait? No idea. Doesn't matter, I guess — for the rest of us, practice is the only lever we have.&lt;/p&gt;

&lt;h2&gt;
  
  
  Drills that actually helped me
&lt;/h2&gt;

&lt;h3&gt;
  
  
  The 10-second rule
&lt;/h3&gt;

&lt;p&gt;Pick any technical concept you know well. Set a timer. You have 10 seconds to deliver a clear opening sentence. Not a full answer — just the first sentence.&lt;/p&gt;

&lt;p&gt;Try it with "What is Docker?" Go.&lt;/p&gt;

&lt;p&gt;If your first attempt started with "So basically..." or "Docker is like...", try again. Aim for something like: "Docker packages an application and its dependencies into an isolated container that runs consistently across any environment."&lt;/p&gt;

&lt;p&gt;It's a small thing, but doing one of these per day with random topics builds the habit faster than you'd expect.&lt;/p&gt;

&lt;h3&gt;
  
  
  The rewind drill
&lt;/h3&gt;

&lt;p&gt;Record yourself answering a technical question. Play it back. Listen to just the first 15 seconds. Ask yourself:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Did I state my main point in the first sentence?&lt;/li&gt;
&lt;li&gt;Could someone interrupt me after 10 seconds and still get value?&lt;/li&gt;
&lt;li&gt;Did I use any filler phrases before my actual answer?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If the answer to any of these is no, re-record. This is where I fixed most of my rambling — at the start, not the middle.&lt;/p&gt;

&lt;h3&gt;
  
  
  The cold-start practice
&lt;/h3&gt;

&lt;p&gt;The reason interview openings are hard is that they're &lt;em&gt;cold starts&lt;/em&gt;. You go from silence to speaking with no warm-up. The best training is to simulate exactly that.&lt;/p&gt;

&lt;p&gt;Have someone throw a question at you with no warning. No prep time. You speak immediately. The first few times will be rough. That's the point. You're training your brain to organize before your mouth starts — and that only comes from repetition, not from reading about it.&lt;/p&gt;

&lt;h2&gt;
  
  
  One more thing
&lt;/h2&gt;

&lt;p&gt;Strong openings do something beyond impressing the interviewer. They organize &lt;em&gt;your own thinking&lt;/em&gt;. When you force yourself to lead with a clear statement, you're deciding what the most important point is before you speak. That cascades — you ramble less, structure better, stay on topic.&lt;/p&gt;

&lt;p&gt;Same principle behind writing a good commit message before the code. Constraints at the start create clarity downstream.&lt;/p&gt;

&lt;p&gt;Your next technical interview will have at least a dozen questions. Each one is a fresh chance to make a first impression — or to burn the first 30 seconds on filler. The candidates who get offers aren't the ones who know the most. They're the ones who &lt;a href="https://www.prepovo.com/blog/common-technical-interview-mistakes?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=first-30-seconds-interview-answer" rel="noopener noreferrer"&gt;sound like they know it&lt;/a&gt; from the first sentence.&lt;/p&gt;

&lt;p&gt;Pick a technical topic right now. Say your opening sentence out loud.&lt;/p&gt;

&lt;p&gt;Did you lead with the answer?&lt;/p&gt;

</description>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Why you know the answer but freeze in technical interviews</title>
      <dc:creator>Krzysztof Fraus</dc:creator>
      <pubDate>Tue, 24 Feb 2026 07:06:53 +0000</pubDate>
      <link>https://forem.com/krzysztof_fraus/why-you-know-the-answer-but-freeze-in-technical-interviews-25l6</link>
      <guid>https://forem.com/krzysztof_fraus/why-you-know-the-answer-but-freeze-in-technical-interviews-25l6</guid>
      <description>&lt;p&gt;I froze on a CSS flexbox question once. The interviewer asked me "what does &lt;code&gt;flex: 1&lt;/code&gt; do?" — and I went blank. Flexbox. Something I'd used literally hundreds of times in production. I knew it, obviously, but in that moment my brain just... stopped. I started throwing out different possible answers in a super chaotic way, jumping between explanations, trying to say anything that would confirm I'm good enough for this position. After that part of the interview they told me they don't want to move forward.&lt;/p&gt;

&lt;p&gt;The thing is — I knew flexbox. I'd shipped production layouts with it. But under pressure, that knowledge was locked somewhere I couldn't reach.&lt;/p&gt;

&lt;p&gt;This isn't a knowledge problem. It's a performance problem. And the fix isn't more studying.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your brain literally works differently under pressure
&lt;/h2&gt;

&lt;p&gt;Here's a simplified version of what happens when you freeze — I'm a frontend developer, not a neuroscientist, so take this with a grain of salt. When you're in a stressful situation, your brain switches into survival mode. Quick reactions over deep thinking. Great for running from danger. Terrible for explaining how a load balancer works.&lt;/p&gt;

&lt;p&gt;Two things happen:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your thinking capacity drops.&lt;/strong&gt; The part of your brain responsible for reasoning and putting thoughts into words gets fewer resources. Your brain decides "we don't need the complex thinking right now, we need to survive." Except you DO need the complex thinking — that's the whole point of the interview.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Your memory retrieval gets blocked.&lt;/strong&gt; The stress hormones make it harder to pull information from memory. The knowledge is there. You just can't access it.&lt;/p&gt;

&lt;p&gt;Same knowledge, different context. That's the entire explanation for why you can tell your friend exactly how event delegation works over lunch and then blank on it with an interviewer watching.&lt;/p&gt;

&lt;h2&gt;
  
  
  Your brain is doing too many things at once
&lt;/h2&gt;

&lt;p&gt;When you code normally, you think about one thing: the problem. In an interview, you're running like six parallel threads:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Solving the actual problem&lt;/li&gt;
&lt;li&gt;Monitoring what the interviewer thinks of you (are they nodding? frowning?)&lt;/li&gt;
&lt;li&gt;Choosing the right words to sound competent&lt;/li&gt;
&lt;li&gt;Managing your internal panic&lt;/li&gt;
&lt;li&gt;Deciding what to say next vs. what to skip&lt;/li&gt;
&lt;li&gt;Wondering if you should ask a clarifying question or if that makes you look like you don't know enough&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Working memory can hold roughly &lt;a href="https://pubmed.ncbi.nlm.nih.gov/11515286/" rel="noopener noreferrer"&gt;4 things&lt;/a&gt; at a time. When 2-3 of those slots are occupied by anxiety and self-monitoring, you've got maybe 1-2 slots left for the actual technical problem. You're trying to explain distributed systems with the mental capacity of someone who just woke up from a nap.&lt;/p&gt;

&lt;h2&gt;
  
  
  "I'm not an anxious person" — it doesn't matter
&lt;/h2&gt;

&lt;p&gt;I was interviewing a candidate for a Senior Frontend Developer position recently. First 30 minutes — genuinely impressive. Clear explanations, honest about which decisions were his vs. the team's, asked clarifying questions when he wasn't sure how to respond. All great signs.&lt;/p&gt;

&lt;p&gt;Then we got to the technical part. Something shifted. The same person who was articulate 10 minutes ago started giving short, vague answers. Like watching a different candidate.&lt;/p&gt;

&lt;p&gt;He could have done the job. I'm fairly sure of that. But I also couldn't fully override what I saw in the technical portion, even knowing it was probably nerves. That's the uncomfortable thing about being on the interviewer side — you can &lt;em&gt;suspect&lt;/em&gt; someone is freezing up, but you can't just assume the good performance is the "real" one and the bad performance is the anomaly. You don't have enough data. So the freeze costs them.&lt;/p&gt;

&lt;p&gt;This isn't about being an anxious person. It's about a specific skill that almost nobody practices: &lt;strong&gt;explaining technical concepts clearly while someone is evaluating you.&lt;/strong&gt; When you code normally, you think, pause, Google something, backtrack, try again. Nobody watches. In an interview, everything is visible and there's a timer running. Completely different skill.&lt;/p&gt;

&lt;h2&gt;
  
  
  The "study more" trap
&lt;/h2&gt;

&lt;p&gt;When developers freeze in interviews, the natural reaction is to study more. Read another system design book. Do 50 more LeetCode problems. Watch more YouTube explanations.&lt;/p&gt;

&lt;p&gt;This makes things worse in a specific way: it builds confidence without building the skill you actually need. You walk into the next interview thinking "I really know this stuff now" and then freeze again. That's more demoralizing than the first time, because now you can't even blame lack of preparation.&lt;/p&gt;

&lt;p&gt;I actually started wondering at some point whether I was just bad at interviews as a personality trait. Like maybe some people are wired for it and I'm not. That's not true, but it felt true after the third time I studied hard and still choked.&lt;/p&gt;

&lt;p&gt;The gap between knowing something and &lt;em&gt;owning&lt;/em&gt; it is where interviews are won and lost. Cramming doesn't close it. Only sustained practice under some kind of pressure does.&lt;/p&gt;

&lt;h2&gt;
  
  
  What actually works: getting used to the discomfort
&lt;/h2&gt;

&lt;p&gt;The freeze response gets weaker with repetition. Your brain learns that "someone is evaluating me" is not actually a life-threatening situation — but it needs to learn this through experience, not through reading about it.&lt;/p&gt;

&lt;p&gt;For interviews specifically, this means practicing verbal explanation under conditions that feel at least a little pressured:&lt;/p&gt;

&lt;h3&gt;
  
  
  The 60-second drill
&lt;/h3&gt;

&lt;p&gt;Pick a concept you know well. Set a timer for 60 seconds. Explain it out loud as if you're in an interview.&lt;/p&gt;

&lt;p&gt;Try it right now with "How does HTTPS work?" Go.&lt;/p&gt;

&lt;p&gt;If you actually did it, you probably noticed something: you either rushed through the TLS handshake, forgot to mention certificate validation, or spent too long on DNS and ran out of time before getting to encryption. The 60-second constraint forces you to make decisions about what to include and what to skip — which is exactly the skill you need in interviews.&lt;/p&gt;

&lt;p&gt;Do this daily with different topics. The timer is important because it creates just enough pressure to simulate the real thing.&lt;/p&gt;

&lt;h3&gt;
  
  
  Record yourself (yes, it's uncomfortable)
&lt;/h3&gt;

&lt;p&gt;Record yourself explaining a technical concept, then play it back. You'll hate it. That's the point.&lt;/p&gt;

&lt;p&gt;Most developers who try this discover they use way more filler words than they thought, and their explanations jump around without clear structure. I recorded myself explaining WebSocket vs. SSE once and realized I spent the first 30 seconds saying essentially nothing — just throat-clearing before the actual content started.&lt;/p&gt;

&lt;p&gt;A rough structure that helps: &lt;strong&gt;what problem it solves, how it works, what the tradeoffs are.&lt;/strong&gt; But honestly, just recording and listening back teaches you more than any framework.&lt;/p&gt;

&lt;h3&gt;
  
  
  Practice with interruptions
&lt;/h3&gt;

&lt;p&gt;Have a friend (or use a tool) give you a question, then interrupt you 20 seconds in with a follow-up. "Wait, can you go deeper on that part?" or "What happens if X fails?"&lt;/p&gt;

&lt;p&gt;Interruptions are where most freezes happen in real interviews. You had a plan in your head for how to explain something, someone breaks it, and suddenly you're lost. Training with interruptions builds the ability to adjust on the fly.&lt;/p&gt;

&lt;h3&gt;
  
  
  Build up gradually
&lt;/h3&gt;

&lt;p&gt;Start low-pressure and work up: explain to yourself in an empty room, then record yourself, then explain to a friend, then do a mock with someone you don't know. Each step adds social pressure. Jumping straight from solo study to a real interview is like never running more than 1km and then signing up for a marathon.&lt;/p&gt;

&lt;h2&gt;
  
  
  The pre-interview warmup (most people skip this)
&lt;/h2&gt;

&lt;p&gt;Athletes warm up before competing. Musicians play scales before performing. Developers open their laptop and hope for the best.&lt;/p&gt;

&lt;p&gt;10 minutes of warmup before an interview makes a noticeable difference. Just explain a couple of concepts out loud — something easy first (how does a hash map work?), then something moderately complex (what happens when you type a URL into a browser?). Finish with some rapid-fire retrieval: name 4 HTTP methods, 3 differences between SQL and NoSQL.&lt;/p&gt;

&lt;p&gt;The point is to walk in with your "explaining brain" already running. Cold-starting it on the first real question is a recipe for freezing.&lt;/p&gt;
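&lt;p&gt;For the hash-map warmup specifically, it helps to have a tiny mental model you could sketch on a whiteboard. A minimal chained hash map, purely illustrative: hash the key to pick a bucket, then scan that bucket's short list for the exact key.&lt;/p&gt;

```python
# Minimal chained hash map: the kind of sketch that answers the
# "how does a hash map work?" warmup question.
class TinyHashMap:
    def __init__(self, num_buckets: int = 8):
        self.buckets = [[] for _ in range(num_buckets)]

    def _bucket(self, key):
        # Hash the key to choose a bucket; collisions share a bucket.
        return self.buckets[hash(key) % len(self.buckets)]

    def put(self, key, value):
        bucket = self._bucket(key)
        for i, (k, _) in enumerate(bucket):
            if k == key:                 # key exists: overwrite in place
                bucket[i] = (key, value)
                return
        bucket.append((key, value))      # new key: append to the chain

    def get(self, key, default=None):
        for k, v in self._bucket(key):
            if k == key:
                return v
        return default

m = TinyHashMap()
m.put("lang", "python")
m.put("lang", "go")
print(m.get("lang"))     # go
print(m.get("missing"))  # None
```

&lt;p&gt;Real implementations add resizing when buckets get long, which keeps lookups near O(1) — and mentioning that is exactly the kind of rapid-fire detail the warmup is priming you to retrieve.&lt;/p&gt;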

&lt;h2&gt;
  
  
  When you freeze anyway
&lt;/h2&gt;

&lt;p&gt;Even with preparation, you'll still freeze sometimes. The goal is recovery, not prevention.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Name it internally.&lt;/strong&gt; Just thinking "ok, I'm stressed right now, that's why I can't think" actually helps. &lt;a href="https://pubmed.ncbi.nlm.nih.gov/17576282/" rel="noopener noreferrer"&gt;Labeling the emotion&lt;/a&gt; reduces its intensity. Sounds too simple. Works anyway.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Buy time.&lt;/strong&gt; "Let me think about how to structure this." Not stalling — giving your brain a moment to come back online.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Say one true thing.&lt;/strong&gt; Don't try to give the perfect answer. Just say one true statement. "At its core, this is about managing concurrent access to shared state." That single starting point often unlocks the rest. Or it doesn't, and you've at least said something correct while your brain catches up.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Draw something.&lt;/strong&gt; Shifting from verbal to visual thinking can bypass the freeze entirely. I've seen candidates who couldn't articulate an architecture grab a marker and suddenly become fluent.&lt;/p&gt;

&lt;h2&gt;
  
  
  This takes weeks, not days
&lt;/h2&gt;

&lt;p&gt;The freeze response doesn't disappear after one practice session. Expect a 3-4 week ramp if you practice daily.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 1:&lt;/strong&gt; Daily 60-second drills on concepts you already know. Focus on structure and getting comfortable hearing yourself speak.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 2:&lt;/strong&gt; Start recording yourself. Review and notice filler words, trailing off, and disorganized explanations.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 3:&lt;/strong&gt; Practice with another person. A friend, a study partner, or &lt;a href="https://www.prepovo.com?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=why-you-freeze-in-technical-interviews" rel="noopener noreferrer"&gt;a tool that gives you questions and evaluates your spoken answers&lt;/a&gt; — the point is that it can't be purely self-directed anymore. You need at least some discomfort of having an audience. Practicing alone is valuable, but at some point you need to add social pressure.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Week 4:&lt;/strong&gt; Mock interviews or rapid-fire sessions with someone you don't know well. If you've been doing daily practice, the freeze response should be noticeably weaker by now.&lt;/p&gt;

&lt;p&gt;The goal isn't to eliminate nervousness completely — some level of alertness actually helps you perform better (it's called the &lt;a href="https://en.wikipedia.org/wiki/Yerkes%E2%80%93Dodson_law" rel="noopener noreferrer"&gt;Yerkes-Dodson law&lt;/a&gt; if you're curious). The goal is to bring it down from "I can't think" to "I'm focused."&lt;/p&gt;

&lt;h2&gt;
  
  
  The real bottleneck isn't knowledge
&lt;/h2&gt;

&lt;p&gt;Most interview prep assumes you don't know enough. That's rarely the actual problem. The problem is the gap between what you know and what you can say under pressure.&lt;/p&gt;

&lt;p&gt;I froze on &lt;code&gt;flex: 1&lt;/code&gt;. A senior developer I interviewed fell apart on questions he clearly knew the answers to. Same pattern. The knowledge was there. The ability to deliver it was not.&lt;/p&gt;

&lt;p&gt;The fix is &lt;a href="https://www.prepovo.com/blog/why-explaining-out-loud-beats-reading?utm_source=devto&amp;amp;utm_medium=blog&amp;amp;utm_campaign=why-you-freeze-in-technical-interviews" rel="noopener noreferrer"&gt;opening your mouth and practicing out loud&lt;/a&gt;, repeatedly, until your brain stops treating "explain how a load balancer works" as a threat. Pick one concept. Set a 60-second timer. Explain it out loud right now. Notice where you stumble.&lt;/p&gt;

</description>
      <category>performance</category>
      <category>career</category>
      <category>programming</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
