<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Klarna</title>
    <description>The latest articles on Forem by Klarna (@klarna).</description>
    <link>https://forem.com/klarna</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Forganization%2Fprofile_image%2F1736%2F820a9d18-2711-456e-8754-d8c66e70da7f.png</url>
      <title>Forem: Klarna</title>
      <link>https://forem.com/klarna</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/klarna"/>
    <language>en</language>
    <item>
      <title>Architecture Decision Studio</title>
      <dc:creator>Thibault Jan Beyer</dc:creator>
      <pubDate>Wed, 30 Sep 2020 10:34:04 +0000</pubDate>
      <link>https://forem.com/klarna/architecture-decision-studio-2nie</link>
      <guid>https://forem.com/klarna/architecture-decision-studio-2nie</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Fast actionable collaborative decisions&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Is your team forming knowledge silos rather than working together? Do you have difficulties getting actionable outcomes from your meetings? Would you like to hold a fun and useful remote session to find solutions to problems together as a team but don't know how to start? We've designed a "remote first" method for this. But before I tell you more about it, let me quickly explain how we got there.&lt;/p&gt;

&lt;p&gt;Can’t wait? &lt;a href="https://jamboard.google.com/d/1ObmByD_H-UZ-2xSzQUFdDYyaVck3wU_a47vVnRsTFs4/edit?usp=sharing"&gt;Get the template&lt;/a&gt; and jump straight to the guide further below!&lt;/p&gt;

&lt;h2&gt;
  
  
  A lil’ Backstory
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;“At Klarna things change rapidly as we move very fast”&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;They said. How true. Within the first month of joining, we shifted focus three times: handing over some services we had created to other teams, and taking over theirs. While it was exciting to gain insight into a lot of areas from different perspectives, it also meant we had to handle a lot of tasks simultaneously. Naturally, individual knowledge silos formed around each team member. We ended up in a situation where each one of us was working on a separate feature/project. We identified the problem and tried to re-focus, to get the whole team back to working together on one single project at a time. I had an MVP running for a product that was meant to become one of our main focuses.&lt;/p&gt;

&lt;p&gt;My colleagues would avoid taking tickets regarding the next iteration of the project. I felt trapped, explaining the same thing over and over again. I wasn’t able to delegate tasks to other team members because they lacked the required knowledge and context.&lt;/p&gt;

&lt;p&gt;It was time to break the silos, regroup, and make decisions together!&lt;/p&gt;

&lt;p&gt;Usually, that is when we would lock ourselves up in the “war room” (the meeting room next door) and iterate over the project setup. Wait, a tiny room full of sweaty engineers during a pandemic? What a bad idea 🦠🦠🦠&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--MfLVkgBo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1bmdcslmdhbzrecyc2r4.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--MfLVkgBo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1bmdcslmdhbzrecyc2r4.jpeg" alt="Man with Mask"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;No whiteboard, no in-person conversations. How can you run an architecture session to align everyone and brainstorm, while making sure everyone’s included? That’s where we set sails on a mission to try out collaborative techniques online using tools like Jamboard (because it’s part of the G Suite and sounds cool).&lt;/p&gt;

&lt;h3&gt;
  
  
  Trying the Architecture Golf
&lt;/h3&gt;

&lt;p&gt;I started by just trying to squeeze the &lt;a href="https://engineering.klarna.com/architecture-golf-60fb51a6e787"&gt;Architecture Golf&lt;/a&gt;, also referred to as “Group Whiteboard Sketching”, into an online session without any big planning or preparation. Just to see how it would go.&lt;/p&gt;

&lt;p&gt;Here is how it went: I spent half the time just drawing out the MVP; I had an architecture diagram and explained it. By drawing out the steps and explaining them as we went, all team members were onboarded around the problem space and could break issues down into approaches more rapidly. It also made it easy for everyone to follow and gave us a common understanding of the problem space, which we found makes it easier for teammates to share and challenge ideas later on. Unfortunately, doing this digitally took way longer than it would have on a whiteboard.&lt;br&gt;
Then I talked about the use cases we wanted to solve, and for the rest of the time we tried to find solutions to them through an open discussion around the diagram. It was good, as people were very happy to finally, after several months of isolation, kind of work together again. It also helped to explain the service in detail once and have everyone understand the architecture and, vaguely, the challenges ahead. But it wasn’t great either. We didn’t start with the most important use case. The one we picked was not very clear, which led to a big discussion taking up the whole remaining time.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WZR1QJCr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3ecxrzw2h6havgqywjzj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WZR1QJCr--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3ecxrzw2h6havgqywjzj.png" alt="Jamboard with architecture"&gt;&lt;/a&gt;&lt;/p&gt;
Example of an &lt;a href="https://engineering.klarna.com/architecture-golf-60fb51a6e787"&gt;Architecture Golf&lt;/a&gt; session.



&lt;h3&gt;
  
  
  Trying the Lightning Decision Jam
&lt;/h3&gt;

&lt;p&gt;So with these thoughts in mind, I set up a follow-up session trying another technique: a free interpretation of the &lt;a href="https://engineering.klarna.com/stepping-down-as-a-dictator-giving-great-teams-permission-to-make-awesome-decisions-215377538a19"&gt;Lightning Decision Jam&lt;/a&gt;. This time, focusing on only the most urgent use case.&lt;/p&gt;

&lt;p&gt;Overall I enjoyed the jam. It led to a decision. However, it somewhat confused some participants: the structure was not clearly defined and was poorly prepared on my side. I asked participants to draw their idea on a piece of paper first and, when their turn came, to re-draw it digitally. That was a bad approach, as it wasted time. Because of it, we needed yet another session to refine the ideas presented, and even fewer people attended.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--F0qxrvKF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/gkwukozep5vcjwafhefm.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--F0qxrvKF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/gkwukozep5vcjwafhefm.png" alt="Jamboard with boat and pro-contra stickers"&gt;&lt;/a&gt;&lt;/p&gt;
Example of a &lt;a href="https://uxplanet.org/lightning-decision-jam-a-workshop-to-solve-any-problem-65bb42af41dc"&gt;Lightning Decision Jam&lt;/a&gt; Step 1–3.



&lt;h3&gt;
  
  
  Feedback &amp;amp; Iterate
&lt;/h3&gt;

&lt;p&gt;In order to iterate fast &amp;amp; improve, I sent out a simple Google Forms survey asking for feedback.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--66LtvMk5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/rx6p5083enzjntwrrqcy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--66LtvMk5--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/rx6p5083enzjntwrrqcy.png" alt="Survey responses"&gt;&lt;/a&gt;Example of a &lt;a href="https://www.google.com/forms/about/"&gt;Google form survey&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Knowing that my team is awesome, I was very keen on their actionable feedback. &lt;br&gt;
And they sure did not let me down:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Architecture Golf: great for sharing knowledge, understanding, and ownership of the project&lt;/li&gt;
&lt;li&gt;Lightning Decision Jam: good for finding an actual, actionable solution to one problem&lt;/li&gt;
&lt;li&gt;Generally: the sessions were missing structure, did not keep to their timelines, and were not focused enough&lt;/li&gt;
&lt;li&gt;When stuck on a topic, park it (write it down) and move forward to another topic&lt;/li&gt;
&lt;li&gt;Quiet team members, even though they might have fabulous ideas, didn’t speak. It became a discussion between the loudest team members, with the rest just watching silently.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Time to iterate: read up more on lean sessions and find ideas for conducting them remotely. Plus, I remembered something similar to the lightning jam that an awesome project manager held back at &lt;a href="https://www.edenspiekermann.com/eu/"&gt;edenspiekermann&lt;/a&gt;: the &lt;a href="https://www.edenspiekermann.com/insights/working-with-design-studios/"&gt;Design Studio Method&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Soon enough we had another use-case to tackle. So I sharpened my keyboard and tried to funnel the learnings into a new grandiose session.&lt;/p&gt;

&lt;p&gt;As it is an improved combination of Architecture Golf, Lightning Decision Jam, and Design Studio, let’s just name it:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--EWZrPHf6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/sfjldemntldiq89iznfc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--EWZrPHf6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/sfjldemntldiq89iznfc.png" alt="Trinity of users"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Architecture Decision Studio
&lt;/h2&gt;

&lt;p&gt;The Architecture Decision Studio helps you find actionable solutions to your problems. We designed it in a “remote first” fashion. It is a great tool for finding solutions to problems within existing systems: get everyone's input and produce actionable output. The first half is about reaching a decision, the second half is about the architecture.&lt;br&gt;
2x 1h meetings worked great for us.&lt;/p&gt;

&lt;p&gt;Can’t wait? &lt;a href="https://jamboard.google.com/d/1ObmByD_H-UZ-2xSzQUFdDYyaVck3wU_a47vVnRsTFs4/edit?usp=sharing"&gt;Get the template&lt;/a&gt;!&lt;/p&gt;

&lt;h3&gt;
  
  
  Guidelines when conducting the studio
&lt;/h3&gt;

&lt;h4&gt;
  
  
  Before the session
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;Research: how are other companies/teams solving this problem?&lt;/li&gt;
&lt;li&gt;Define a list of topics you need to discuss and limit yourself to these&lt;/li&gt;
&lt;li&gt;Keep the number of participants small: the more, the longer it will take. 3-5 is perfect&lt;/li&gt;
&lt;li&gt;Split the sessions into multiple meetings&lt;/li&gt;
&lt;li&gt;Consider when best to schedule meetings to get the best out of the attendees&lt;/li&gt;
&lt;li&gt;Be very clear what needs to be discussed and what the supposed outcome of the meeting is&lt;/li&gt;
&lt;li&gt;Have a narrow and well-defined scope&lt;/li&gt;
&lt;li&gt;Write a pre-read explaining the problem, and get attendees to think about the agenda items beforehand&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  During the session
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create a safe space!&lt;/strong&gt; Don’t openly challenge people's opinions. Don’t always give feedback; even if you don’t agree with an idea, let it live. Give only constructive feedback. If someone does not fully understand, cannot follow, or does not feel well, they shouldn’t be forced to participate actively; they should feel free to just observe the session and learn. If no one on your team feels comfortable, be honest about it: this might not be the right format for your team.&lt;/li&gt;
&lt;li&gt;Have an agenda&lt;/li&gt;
&lt;li&gt;Start by introducing the agenda: what is being addressed, what will happen, which decisions are going to be made, and what the next steps are&lt;/li&gt;
&lt;li&gt;Timebox discussions &amp;amp; timebox each agenda item&lt;/li&gt;
&lt;li&gt;If it looks like an item will run over time, take it offline or bump it to another meeting&lt;/li&gt;
&lt;li&gt;If an agenda item becomes blocked, feel free to park it and move on to an item where consensus is likely&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Afterward
&lt;/h4&gt;

&lt;ul&gt;
&lt;li&gt;End the meeting 5 minutes before time is up to summarize what was achieved and the next steps&lt;/li&gt;
&lt;li&gt;Ask for “anonymous” feedback on the session (survey)&lt;/li&gt;
&lt;li&gt;Create actionable next steps&lt;/li&gt;
&lt;li&gt;Ideally, send out a summary&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--ci-DvnWA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/sdsl7x4o02oiis9r4f61.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--ci-DvnWA--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/sdsl7x4o02oiis9r4f61.png" alt="Summary of the Architecture Decision Studio"&gt;&lt;/a&gt;Summary of the Architecture Decision Studio. &lt;a href="https://jamboard.google.com/d/1ObmByD_H-UZ-2xSzQUFdDYyaVck3wU_a47vVnRsTFs4/edit?usp=sharing"&gt;Get the Jamboard template&lt;/a&gt;!&lt;/p&gt;

&lt;h3&gt;
  
  
  Stepping in the Studio
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--aSLWLo8Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/jfr6ixjuboguf96si685.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--aSLWLo8Q--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/jfr6ixjuboguf96si685.png" alt="Pre-Read"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 0 — Pre-read
&lt;/h4&gt;

&lt;p&gt;Share a pre-read in the invite where people can read up details on the problem/use-case you’re trying to tackle and any information you think is useful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--_cBWXl1P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lmwjc5cbkt57gkfe3lva.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--_cBWXl1P--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/lmwjc5cbkt57gkfe3lva.png" alt="Introduction"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 1 — Introduction — 8min
&lt;/h4&gt;

&lt;p&gt;Set up a presentation where you take about 5-10 minutes to quickly share the knowledge you have and introduce the problem/feature. Inspiration:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What is the task (why this meeting)?&lt;/li&gt;
&lt;li&gt;What else might come our way in the future?&lt;/li&gt;
&lt;li&gt;Summary of previous discussions&lt;/li&gt;
&lt;li&gt;How is it done / can this be done currently?&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Afterwards, ask for questions. Make sure that everything is clear to all attendees.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--3hRXPcyJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/mg8i0cmqmgeeaq6ia5dy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--3hRXPcyJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/mg8i0cmqmgeeaq6ia5dy.png" alt="Warm-up"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 2 — Warm-up — 2min
&lt;/h4&gt;

&lt;p&gt;Warm up your audience and put them in the mood to participate. Ensure that there is a safe space for discussion. Make everyone feel comfortable. For example, brainstorming is a good idea (feel free to do something else instead): start the timer for 2-3 minutes and have everyone write stickers on a board. After the timer is over, briefly go over the stickers, organize &amp;amp; group them, and clarify uncertainties. Force yourself and others to talk about positive things, whether it is “Pros and cons of the current solution”, “What do we have already and what is missing”, or “your favourite animals”. Use it to give a bit of context. Make it generic so that everyone can participate.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O5iZyGFd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/r0dufmksq8rlsghzm8ew.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O5iZyGFd--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/r0dufmksq8rlsghzm8ew.png" alt="Pros and Cons Example"&gt;&lt;/a&gt;Example of a pros and cons warm-up&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--NIq_SM9G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/c11813pu0kshekvsq8p5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--NIq_SM9G--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/c11813pu0kshekvsq8p5.png" alt="Ideas"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 3 — Ideas — 8min
&lt;/h4&gt;

&lt;p&gt;This is where the magic happens. As an introvert, I love this part. It’s “working alone together”. In contrast to an offline meeting, online it is very important that you prepare this slide. You know who is attending, so prepare one slide per participant (the Design Studio method would suggest splitting each slide into 6 squares, but that is up to you). Within the 5-10 minutes, each participant then draws as many solutions as they can come up with on their slide. The more the merrier.&lt;/p&gt;

&lt;p&gt;This is not a drawing competition. It doesn't matter whether someone draws a sketch, an abstract idea or even just writes a phrase. As long as they can explain it later on, it’s perfect.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--r4-mnQTb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/790c286ebf8blg7i1uaj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--r4-mnQTb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/790c286ebf8blg7i1uaj.png" alt="Presentation"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 4 — Presentation — 2min/pp
&lt;/h4&gt;

&lt;p&gt;One by one, go through each slide and give each participant the opportunity to present their idea(s), 2-3 minutes per person. No feedback. No comments. No “I like/dislike this idea… It’s possible/impossible…”. This is solely about presenting the idea. Questions that help to understand the idea are allowed. By not allowing ideas to be questioned or shut down, you create a safe space and open up lines of thinking that may previously have been quickly disregarded.&lt;/p&gt;

&lt;p&gt;Make sure to give each idea a “friendly name” (2-3 words that summarize the idea); it will be useful later on. Put it on a post-it and go on to the next slide. Next person. Rinse. Repeat.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--30ntLetx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/b86nogja9zz7t84p0v2b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--30ntLetx--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/b86nogja9zz7t84p0v2b.png" alt="Voting"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 5 — Voting — 1.5min/pp
&lt;/h4&gt;

&lt;p&gt;Put all idea summaries together. From now on, the ideas belong to the group, not to individuals. Make sure not to add names to the ideas; just place the ideas up for a vote. They should stand for themselves. If you have duplicate ideas, you can group them together as one idea.&lt;/p&gt;

&lt;p&gt;Now is the time to vote. One by one, everyone has 1-2 minutes to give friendly feedback focused on the problem: take each idea and say which problems it solves or doesn’t. If you think an idea is a potential solution candidate, give it a thumbs up (a vote). There is no limit on votes.&lt;/p&gt;

&lt;p&gt;Once everyone has voted you should have a winning idea. &lt;/p&gt;

&lt;h5&gt;
  
  
  Draw?
&lt;/h5&gt;

&lt;p&gt;If there is a draw for first place, you have various options:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go back to step 3 and have another round of idea drawing where everyone can improve ideas or combine them to form a better one (this is the suggested way)&lt;/li&gt;
&lt;li&gt;You can let people discuss why they voted for one or the other idea and then re-vote&lt;/li&gt;
&lt;li&gt;The team lead can decide on a solution in order to unblock the situation.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--0w7K0WR3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3kxqbushni3ue1hvmntv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--0w7K0WR3--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3kxqbushni3ue1hvmntv.png" alt="Ideas ordered by 1st, 2nd and 3rd"&gt;&lt;/a&gt;Example of a winning idea&lt;/p&gt;

&lt;h5&gt;
  
  
  We have a winner!
&lt;/h5&gt;

&lt;p&gt;You should now have 1st-, 2nd-, and 3rd-place idea(s). Usually, the first-place idea is the solution to implement. Feel free to put the 2nd and 3rd in your backlog if they address other important issues, but focus on the 1st idea within this studio.&lt;/p&gt;

&lt;h5&gt;
  
  
  Break
&lt;/h5&gt;

&lt;p&gt;Perfect time to take a break. You may split the meeting into several meetings and stop here. Otherwise, this is a good time for a 5-15min break.&lt;/p&gt;

&lt;p&gt;If you had a long break, start the next step by recapitulating in 3 minutes what was achieved in the last session, which decisions were taken, and what you will try to achieve now.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RDbOYhVE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/w8c3cx0eeronyxdmwxrz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RDbOYhVE--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/w8c3cx0eeronyxdmwxrz.png" alt="Refinement"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 6 — Refinement — 45min
&lt;/h4&gt;

&lt;p&gt;Refine the idea and figure out the steps necessary to implement it.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Have the existing website/app layout/system pre-sketched&lt;/li&gt;
&lt;li&gt;Take turns drawing/writing what needs to change to adopt the solution idea&lt;/li&gt;
&lt;li&gt;Each participant has 1-2 minutes to:
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Change&lt;/strong&gt; the existing design where necessary, OR&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Add&lt;/strong&gt; to the design (or the sketches done by the previous participant(s)), OR&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Pass&lt;/strong&gt; their turn if they don’t have anything to add&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Ideally, whenever the design changes, take a screenshot to document the evolution&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Naturally, discussions will evolve as people sketch. That’s ok; go with the flow and don’t try to force it. However, try to keep to the time schedule and the turn-based system. It gives everyone an equal amount of speaking/sketching/writing time, makes sure that the discussion is not one-sided, gives quiet people the opportunity to have their say, and prevents the discussion from getting out of hand.&lt;/p&gt;

&lt;p&gt;If you feel that a topic is still getting out of hand, park it and continue with something else.&lt;br&gt;
Limit the refinement part to somewhere around 40-50 minutes, depending on the complexity.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--O5c15ZUD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/5l2cumw6kw22p79pg25f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--O5c15ZUD--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/5l2cumw6kw22p79pg25f.png" alt="Actionables"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Step 7 — Actionables — 10min
&lt;/h4&gt;

&lt;p&gt;Write down summaries of all the changes necessary (from the refinement). Write down the open questions. Write down the parked topics. These are your actionables. Ideally, they can be translated into tickets. If there is not enough time left over to do it together, feel free to take it offline after the session.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--b0XCRtEs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xzh09y924l1nf7v8q2mc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--b0XCRtEs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/xzh09y924l1nf7v8q2mc.png" alt="Closure"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Closure
&lt;/h4&gt;

&lt;p&gt;Summarize what was achieved and make sure to honestly thank your participants because they’re the best!&lt;/p&gt;

&lt;h3&gt;
  
  
  Template
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://jamboard.google.com/d/1ObmByD_H-UZ-2xSzQUFdDYyaVck3wU_a47vVnRsTFs4/edit?usp=sharing"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--zShKVa2s--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_66%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/549ddrg7ss6aewtk142j.gif" alt="Template"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;a href="https://jamboard.google.com/d/1ObmByD_H-UZ-2xSzQUFdDYyaVck3wU_a47vVnRsTFs4/edit?usp=sharing"&gt;Jamboard template&lt;/a&gt;



&lt;h2&gt;
  
  
  What now?
&lt;/h2&gt;

&lt;p&gt;We have already tried the Decision Studio several times. It was fun and effective, putting us on the same page, fostering collaboration, and producing actionable output.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--HyIs-765--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1ijwpnp4lm7tm3iv0xwj.jpeg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--HyIs-765--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/1ijwpnp4lm7tm3iv0xwj.jpeg" alt="Pinky and the brain"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It’s not done yet; we’ll continue to use the Decision Studio process and encourage other teams to try it, continuously improving the format. I’m planning a central gateway for it to make updates/refinements easier.&lt;/p&gt;

&lt;p&gt;However, I fear that by standardizing it, it might become boring. To fight boredom, there has to be variety! So I strongly encourage you to make tiny changes in each session and tailor the Studio exactly to your current needs to really make it your own!&lt;br&gt;
(and please share your outcome with us)&lt;/p&gt;

&lt;p&gt;That’s it, keep it &lt;em&gt;smoooth&lt;/em&gt;!&lt;/p&gt;

</description>
      <category>engineering</category>
      <category>architecture</category>
      <category>agile</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Stepping Down as a Dictator: Giving Great Teams Permission to Make Awesome Decisions.</title>
      <dc:creator>Phil Bennett</dc:creator>
      <pubDate>Wed, 17 Jun 2020 07:23:31 +0000</pubDate>
      <link>https://forem.com/klarna/stepping-down-as-a-dictator-giving-great-teams-permission-to-make-awesome-decisions-i2j</link>
      <guid>https://forem.com/klarna/stepping-down-as-a-dictator-giving-great-teams-permission-to-make-awesome-decisions-i2j</guid>
      <description>&lt;p&gt;At Klarna, we hire amazing people. This statement isn't hyperbole. I've not met one person in the year that I've been at Klarna who didn't impress me in some way.&lt;/p&gt;

&lt;p&gt;But one thing that I've learned this last year is that if you put a group of amazing people in a room, they struggle to make decisions. The deeper a team thinks about its problem space, the more it generates strong, convincing ideas and solutions. And therefore, the harder it is for the team to decide how to move forward.&lt;/p&gt;

&lt;p&gt;In this article, I will explain how we're using structured, collaborative, and fun decision making processes to help drive our team forward and have maximum impact on our problem space.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--QZGhMeZQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3hc9dd5no1mk5b2hvthn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--QZGhMeZQ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/3hc9dd5no1mk5b2hvthn.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Collective Analysis Paralysis
&lt;/h2&gt;

&lt;p&gt;Analysis Paralysis is the concept that an individual can over-analyze a problem and block themselves from moving forward with a solution. The paralysis problem is exacerbated when a group of highly analytical individuals come together as a team.&lt;/p&gt;

&lt;p&gt;Klarna hires against a set of &lt;a href="https://www.klarna.com/careers/tips-and-tricks/"&gt;leadership principles&lt;/a&gt;. These principles define how we go about delivering products. One of those principles is 'Detailed Thinkers'.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Leaders know that to be a disruptor in a competitive industry requires radical, detailed thinking. They are ambitious and communicate a daring direction that inspires results, but they also love details and have a contagious desire for knowledge. They leave no stone unturned when it comes to finding Smoooth solutions. Nothing is impossible. They are a unique combination of ambitious, free-thinking with a meticulous eye for detail.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;As a result, we end up with teams with strong analytical skills. This is what happened in my team. The result is impressive people who struggle to make collective decisions. We are great at generating concepts, ideas, and potential solutions, but agreeing on the right outcome has always been a challenge.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem with Deadlock Breaking.
&lt;/h2&gt;

&lt;p&gt;As a manager, or lead within a team, it’s your responsibility to break these kinds of stalemate situations. And there’s an easy solution to this. As the lead, you get a single overriding vote to unblock the discussion and move the team forward.&lt;/p&gt;

&lt;p&gt;This single vote is traditionally what managers do. They use their wealth of experience and knowledge to choose the right option. However, this veto vote has two significant challenges associated with it; bias, and ownership.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenge 1: Bias
&lt;/h3&gt;

&lt;p&gt;As a manager with 20 years of industry experience, I have biases. I actively try to keep them in check but the situations that I have experienced over my career have built up a set of subconscious patterns in my decision making. If I make the final call, it will be based more on my knowledge and understanding than the collective consciousness of the team.&lt;/p&gt;

&lt;h3&gt;
  
  
  Challenge 2: Ownership
&lt;/h3&gt;

&lt;p&gt;The second challenge is that, as the lead, I now own that decision. It’s mine and it doesn’t fully belong to the team. If we had multiple conflicting ideas to start with, I have dismissed the ideas of a proportion of the group.&lt;/p&gt;

&lt;p&gt;This is manageable with good leadership skills, but it would have been a whole lot better if we had collaboratively made that choice.&lt;/p&gt;

&lt;h2&gt;
  
  
  Unblocking Collaboratively in 5 Easy Steps.
&lt;/h2&gt;

&lt;p&gt;As a team, we use several tools to help us make significant decisions collaboratively. Two of these tools are &lt;a href="https://www.thesprintbook.com/how"&gt;Design Sprints&lt;/a&gt; for larger problems and &lt;a href="https://uxplanet.org/lightning-decision-jam-a-workshop-to-solve-any-problem-65bb42af41dc"&gt;Lightning Decision Jams&lt;/a&gt; for smaller ones.&lt;/p&gt;

&lt;p&gt;The Design Sprint is a week-long five-phase process that helps teams use design thinking to come up with awesome product ideas. &lt;/p&gt;

&lt;p&gt;The Lightning Decision Jam is a similar process that condenses a similar set of phases into the time slot of a single meeting.&lt;br&gt;
What both of these processes share is a tight timeline that lets you use the best elements of your team to come to fantastic decisions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--VMnZi4-X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/edh5rxn5ypy9615f0zpl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--VMnZi4-X--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/edh5rxn5ypy9615f0zpl.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Understand the Problem.
&lt;/h3&gt;

&lt;p&gt;The first step in any decision-making process is fully understanding the problem you’re trying to solve. You need a detailed outline of what you want to achieve as a team before doing any problem-solving. If you’re all trying to solve different problems, you’re already setting off on the wrong foot.&lt;/p&gt;

&lt;p&gt;When pushed for time, you can use your team to perform this step. However, when you have the luxury of time, it’s great to get domain expert views on the problem. The Design Sprint sets aside a time that allows you to invite experts into your session and interview them to gain a better understanding of the problem area.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Explosive Creativity.
&lt;/h3&gt;

&lt;p&gt;Never discredit the hidden skills in your team. In talented, deep-thinking teams anyone can come up with the best solution for your customer, not just the designers and product managers.&lt;/p&gt;

&lt;p&gt;Get everyone in on the solutions part of the process, give them all time to draft, sketch and propose ideas. Give them permission to be creative.&lt;/p&gt;

&lt;p&gt;Both of these processes use the concept of ‘working alone together’. They let each teammate work on the solutions process independently but in the same physical space.&lt;/p&gt;

&lt;p&gt;The outcome of this step is a ton of fantastic ideas.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Collect Solutions.
&lt;/h3&gt;

&lt;p&gt;Now that you have at least as many great ideas as you have great minds in your team, it’s time to collect and present the ideas.&lt;br&gt;
These ideas should be presented with minimal discussion. The concept is that the ideas should be clear and stand up on their own, without spending time digging into the details.&lt;/p&gt;

&lt;p&gt;If the idea is great, then you should need no more than a few sketches and three minutes of your time to describe it. If the concept can fit on one post-it note, all the better.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Vote.
&lt;/h3&gt;

&lt;p&gt;Now to make the decision. In all of these processes, the voting step is the simplest: everyone gets a number of voting points and spreads them between the ideas they think are best.&lt;br&gt;
The best idea wins.&lt;/p&gt;

&lt;p&gt;It’s that simple; you’ve immediately gone from several awesome ideas down to the one the team collectively thinks is the best.&lt;/p&gt;
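&lt;p&gt;If you run the session remotely, even the tally can be scripted. Here is a minimal sketch of a dot-vote tally; all names and numbers are illustrative, not part of any real workshop tooling:&lt;/p&gt;

```javascript
// Dot-voting tally: each teammate distributes a fixed budget of points
// across the proposed ideas; the idea with the most points wins.
// All idea names below are illustrative examples.

function tallyVotes(ballots) {
  // ballots: one { idea: points } map per teammate
  const totals = {};
  for (const ballot of ballots) {
    for (const [idea, points] of Object.entries(ballot)) {
      totals[idea] = (totals[idea] || 0) + points;
    }
  }
  // Sort ideas by total points, highest first
  return Object.entries(totals).sort((a, b) => b[1] - a[1]);
}

const result = tallyVotes([
  { 'dark mode': 2, 'faster checkout': 1 },
  { 'faster checkout': 3 },
  { 'dark mode': 1, 'new onboarding': 2 },
]);
console.log(result[0][0]); // → 'faster checkout'
```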

&lt;p&gt;Time to deliver!&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Own it.
&lt;/h3&gt;

&lt;p&gt;What is so amazing about this process is that involving the whole team in every step of the process breeds a fantastic level of ownership. These solutions belong to the group, not the manager, designer, or product owner.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;We are all building the thing that we all chose to create, and we’re going to smash it as a team!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Next Steps.
&lt;/h2&gt;

&lt;p&gt;To run a similar process, you only really need one thing: a facilitator. That person can be anyone in the team, so it could easily be you.&lt;/p&gt;

&lt;p&gt;Getting agreement for a team to spend a whole week on a design sprint can be challenging the first time. What I would recommend: next time you have a problem and you can’t collectively agree on the best way forward, use the Lightning Decision Jam process or define something very similar.&lt;/p&gt;

&lt;p&gt;Once you have built confidence in this process, you will find it easier to convince people that the investment of a Design Sprint is worthwhile.&lt;/p&gt;

&lt;p&gt;I can promise you that once you’ve run one Design Sprint, you’ll have plenty of evidence to convince people to do more.&lt;/p&gt;

</description>
      <category>leadership</category>
      <category>management</category>
      <category>productivity</category>
    </item>
    <item>
      <title>6 Lessons learned from optimizing the performance of a Node.js service</title>
      <dc:creator>benzaita</dc:creator>
      <pubDate>Tue, 14 Jan 2020 08:05:45 +0000</pubDate>
      <link>https://forem.com/klarna/6-lessons-learned-from-optimizing-the-performance-of-a-node-js-service-34j4</link>
      <guid>https://forem.com/klarna/6-lessons-learned-from-optimizing-the-performance-of-a-node-js-service-34j4</guid>
      <description>&lt;p&gt;Here at Klarna, we put a lot of effort into empowering our developers to deliver high-quality and secure services. One of the services we provide our developers with is a platform for running A/B tests. A critical component of this platform is a fleet of processes that for every incoming request, makes the decision: which flavor of the test (A or B) to expose the request to. That, in turn, determines what color to render a button, what layout to show the user, or even which 3rd party backend to use. These decisions have a direct impact on user experience.&lt;/p&gt;

&lt;p&gt;The performance of each process in this fleet is critical since it is used synchronously in the critical decision paths in the Klarna ecosystem. A typical requirement in such flows is to decide within a single-digit latency for 99.9% of the requests. To be confident that we keep adhering to these requirements, we developed a performance testing pipeline to load test this service.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Takeaway #1: performance testing can give us confidence that we are not degrading performance with each release.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Although we had barely seen any performance issues in the two years this platform had been in production, the tests unambiguously showed some issues. Several minutes into the test, at a moderate and stable request rate, the request duration would spike from its normal range to several seconds:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--dDyw8b5j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/x87by6rpirqmw930382g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--dDyw8b5j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/x87by6rpirqmw930382g.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We decided that although this did not happen in production yet, it was just a matter of time until the real-life load “catches up” with the synthesized load, and therefore, this is something worth investigating.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Takeaway #2: by “cranking up” the load we can expose problems before they ever reach production.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Another thing to note is that it takes around two or three minutes for the problems to appear. In the first iterations, we ran this test for only two minutes. Only after extending the duration of the test to ten minutes did we discover this problem.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Takeaway #3: long load tests can surface different kinds of problems. If everything looks OK, try extending the duration of the test.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;We normally monitor services using the following metrics: number of incoming requests per second, duration of incoming requests, and the error rate. These give a pretty good indication of whether the service is experiencing problems or not.&lt;/p&gt;

&lt;p&gt;But these metrics do not offer any insights when the service misbehaves. When things go wrong, you need to know where the bottleneck is. For that, you need to monitor the resources that the Node.js runtime uses. The obvious ones are CPU and memory utilization. But sometimes these are not the actual bottlenecks. In our case, the CPU utilization was low, and the memory utilization was low as well.&lt;/p&gt;

&lt;p&gt;Another resource that Node.js uses is the event loop. In the same way we need to know how many megabytes of memory the process is using, we also need to know how many “tasks” the event loop needs to handle. The event loop is implemented in a C++ library called “libuv” (&lt;a href="https://www.youtube.com/watch?v=GE6MpnxhW_Q"&gt;here&lt;/a&gt; is a great talk about the event loop by Kenneth Gibson). The term it uses for these “tasks” is Active Requests. Another important metric to follow is the number of Active Handles, which is the number of open file handles or sockets that the Node.js process holds (for a complete list of the kinds of handles, see the &lt;a href="http://docs.libuv.org/en/v1.x/handle.html#c.uv_handle_type"&gt;libuv documentation&lt;/a&gt;). So if the test is using 30 connections, it would make sense to see around 30 Active Handles. Active Requests is the number of operations pending on these Handles. Which operations? The full list is available in the &lt;a href="http://docs.libuv.org/en/v1.x/request.html#c.uv_req_t.type"&gt;libuv documentation&lt;/a&gt;, but these can be read/write operations, for example.&lt;/p&gt;
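&lt;p&gt;One way to put these numbers on a dashboard is to sample them periodically from inside the process. The sketch below leans on Node’s underscore-prefixed accessors, which are undocumented internals (newer Node versions expose process.getActiveResourcesInfo() as a documented alternative); the reporting callback stands in for whatever metrics client you use:&lt;/p&gt;

```javascript
// Periodically sample the size of the libuv Active Handles and Active
// Requests lists and report them as gauges.
// NOTE: process._getActiveHandles() and process._getActiveRequests()
// are undocumented Node.js internals; reportGauge is a stand-in for
// your real metrics client (StatsD, Datadog, etc.).

function sampleEventLoopResources(reportGauge) {
  const handles = process._getActiveHandles().length;
  const requests = process._getActiveRequests().length;
  reportGauge('node.active_handles', handles);
  reportGauge('node.active_requests', requests);
  return { handles, requests };
}

// Sample every 10 seconds; unref so the timer never keeps the process alive.
setInterval(() => {
  sampleEventLoopResources((name, value) => console.log(name, value));
}, 10000).unref();
```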

&lt;p&gt;Looking at the metrics reported by the service, there was something wrong. While the number of active handles is what we would expect (around 30 in this test), the number of active requests was disproportionately large — several tens of thousands:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Wh4gemaJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/2lum5vpm4qrm0qbm0hza.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Wh4gemaJ--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/2lum5vpm4qrm0qbm0hza.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We still didn’t know which types of requests were in the queue, though. After breaking down the number of active requests by their type, the picture was clearer. One type of request stood out in the reported metrics: UV_GETADDRINFO. This type of request is generated when Node.js is attempting to resolve a DNS name.&lt;/p&gt;

&lt;p&gt;But why would it generate so many DNS resolution requests? Turns out that the &lt;a href="https://github.com/brightcove/hot-shots/"&gt;StatsD client we are using&lt;/a&gt; attempts to resolve the hostname for each outgoing message. To be fair, it does offer an option to cache the DNS results, but that option does not respect the TTL of that DNS record — it caches the results indefinitely. So if that record is updated after the client already resolved it, the client will never be aware of it. Since the StatsD load balancer might be redeployed with a different IP, and we cannot force a restart of our service to update the DNS cache, this approach of indefinitely caching the results was not an option for us.&lt;/p&gt;

&lt;p&gt;The solution we came up with was to add proper DNS caching outside of the client. It’s not hard to do by monkey patching the “DNS” module. And the results were better:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--w1gRn_8j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/yc6lgnnt6w39cnflzlhe.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--w1gRn_8j--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/yc6lgnnt6w39cnflzlhe.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Takeaway #4: don’t forget to consider DNS resolution when thinking about outgoing requests. And don’t ignore the record’s TTL — it can literally break your app.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;After solving this problem, we re-enabled some more features in the service and tested again. Specifically, we enabled a piece of logic that produces a message to a Kafka topic for every incoming request. The tests revealed, again, significant spikes in response time (seconds) for significant periods:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TlwJDLUs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/zvgpo1tdqcx6y5lwz3vs.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TlwJDLUs--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/zvgpo1tdqcx6y5lwz3vs.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Looking at the metrics from the service showed an obvious problem in that very feature we just enabled — the latency of producing messages to Kafka was extremely high:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z7Bnb7F0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/o929su9vn0cnnfdj7cv0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z7Bnb7F0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/o929su9vn0cnnfdj7cv0.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We decided to try a trivial improvement — queuing the outgoing messages in memory and flushing them in a batch every second. Running the test again, we saw a clear improvement in the response times of the service:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6ogdyUk8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/uc0ftiw7qv4srxuwppf8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6ogdyUk8--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://thepracticaldev.s3.amazonaws.com/i/uc0ftiw7qv4srxuwppf8.png" alt=""&gt;&lt;/a&gt;&lt;/p&gt;
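&lt;p&gt;The batching logic itself can be as small as the sketch below; sendBatch stands in for the actual Kafka produce call, and all names are illustrative:&lt;/p&gt;

```javascript
// Batch outgoing messages in memory and flush them once per interval,
// instead of paying the I/O cost on every incoming request.
// sendBatch is a stand-in for the real Kafka producer call.

class MessageBatcher {
  constructor(sendBatch, flushIntervalMs) {
    this.sendBatch = sendBatch;
    this.queue = [];
    this.timer = setInterval(() => this.flush(), flushIntervalMs);
    this.timer.unref(); // don't keep the process alive just to flush
  }

  enqueue(message) {
    this.queue.push(message);
  }

  flush() {
    if (this.queue.length === 0) return;
    const batch = this.queue;
    this.queue = [];
    this.sendBatch(batch); // one I/O operation for the whole batch
  }
}

// Usage: call batcher.enqueue(event) per request; flushed once per second.
const batcher = new MessageBatcher((batch) => {
  console.log('producing', batch.length, 'messages in one call');
}, 1000);
```

&lt;p&gt;The trade-off is that up to one flush interval of messages lives only in memory, so a crash loses them; for analytics events this is usually acceptable, but it is worth stating explicitly.&lt;/p&gt;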

&lt;blockquote&gt;
&lt;p&gt;Takeaway #5: batch I/O operations! Even when async, I/O is expensive.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Final note: running the tests mentioned above would have been impossible without a way to run tests with reproducible and consistent results. The first iterations of our performance testing pipeline did not provide us confidence in their results since they were not consistent. Investing in a proper testing pipeline allowed us to try out things, experiment with fixes, and mostly be confident that the numbers we are looking at are not coincidental.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Takeaway #6: Before attempting any improvements, you should have a test whose results you trust.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Frequently Asked Questions (FAQ)
&lt;/h2&gt;

&lt;p&gt;I’ve received some questions about which tools were used to perform the tests here. There are a couple of tools used here:&lt;br&gt;
The load is generated by an internal tool that simplifies running Locust in &lt;a href="https://docs.locust.io/en/stable/running-locust-distributed.html"&gt;distributed mode&lt;/a&gt;. Basically, we just need to run a single command, and that tool will spin up the load generators, provide them with the test script, and collect the results to a dashboard in Grafana. These are the black screenshots in the article; they show the client’s perspective in the tests.&lt;br&gt;
The service under test reports its metrics to Datadog. These are the white screenshots in the article.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>performance</category>
      <category>node</category>
    </item>
  </channel>
</rss>
