<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: FINN Admin</title>
    <description>The latest articles on Forem by FINN Admin (@finncontenthub).</description>
    <link>https://forem.com/finncontenthub</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1096641%2Fb2ae7df9-dbb6-48cb-9027-e8496424f141.jpg</url>
      <title>Forem: FINN Admin</title>
      <link>https://forem.com/finncontenthub</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/finncontenthub"/>
    <language>en</language>
    <item>
      <title>Modern data stack: scaling people and technology at FINN</title>
      <dc:creator>FINN Admin</dc:creator>
      <pubDate>Thu, 31 Aug 2023 15:22:08 +0000</pubDate>
      <link>https://forem.com/finnauto/modern-data-stack-scaling-people-and-technology-at-finn-3fme</link>
      <guid>https://forem.com/finnauto/modern-data-stack-scaling-people-and-technology-at-finn-3fme</guid>
      <description>&lt;p&gt;&lt;em&gt;by &lt;a href="https://www.linkedin.com/in/jorrit-p-13461674"&gt;Jorrit Posor&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/felix-kreitschmann"&gt;Felix Kreitschmann&lt;/a&gt;, and &lt;a href="https://www.linkedin.com/in/kosara-g/"&gt;Kosara Golemshinska&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Building proficient data teams and providing them with the right technical tools is crucial for deriving analytical insights that drive your company forward. Understanding the skills, technologies, and roles at play forms an essential part of this process.&lt;/p&gt;

&lt;p&gt;This post offers an overview of these key components shaping a 'Modern Data Stack', which you can use to guide your hiring and strategic planning.&lt;/p&gt;

&lt;h2&gt;Context and Background&lt;/h2&gt;

&lt;p&gt;The strategies shared in this post are drawn from a successful initiative at &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;, a German scale-up. The approach enabled a significant expansion of the data teams, from three data engineers to 35 practitioners over the course of two years. Concurrently, the company quadrupled in size from 100 to 400 employees.&lt;/p&gt;

&lt;p&gt;Data is deeply integrated into &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;'s culture and processes: we utilize over 600 dashboards; 58% of our company uses Looker weekly, spending an average of 90 minutes each week on the platform. This level of engagement translates into nearly half a million queries weekly.&lt;/p&gt;

&lt;p&gt;While the insights presented here can also benefit larger organizations, they are primarily based on the experiences and challenges encountered during &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;’s growth trajectory.&lt;/p&gt;

&lt;p&gt;This post focuses on a batch-based technology stack; streaming technologies are not considered.&lt;/p&gt;

&lt;h2&gt;From Raw, Fragmented Data to Analytical Artifacts&lt;/h2&gt;

&lt;p&gt;When working with business data, your goal is analytical artifacts: key performance indicators (KPIs), dashboards, insights, and a comprehensive understanding of your business—all derived from your data. You most likely already possess raw source data in the SaaS tools you use (like HubSpot or Airtable) or in your databases. &lt;/p&gt;

&lt;p&gt;So, the pertinent question is: what transforms your source data (shown under '&lt;em&gt;Data Sources&lt;/em&gt;' in the top part of &lt;em&gt;image 1&lt;/em&gt;, in yellow) into these insightful analytical artifacts (shown under '&lt;em&gt;4. Analytical Artefacts&lt;/em&gt;' in the bottom part of &lt;em&gt;image 1&lt;/em&gt;, in purple)?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--YwRenv2f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2yyb8hx8crbff0h44m2y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--YwRenv2f--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/2yyb8hx8crbff0h44m2y.png" alt="Image 1" width="800" height="813"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Image 1. Overview of an analytics platform. What gets you from data sources to analytics artifacts?&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;Extract, Load, Transform, Analyze: The Pathway to Analytical Artifacts&lt;/h2&gt;

&lt;p&gt;Transforming your source data (at the top of &lt;em&gt;image 2&lt;/em&gt;, in yellow) into analytical artifacts (at the bottom of &lt;em&gt;image 2&lt;/em&gt;, in purple) involves a sequence of technical steps (in the middle of &lt;em&gt;image 2&lt;/em&gt;, in blue, numbered 1-3).&lt;/p&gt;

&lt;p&gt;Data is systematically extracted from various sources and moved through transformations (changes) and technical components, eventually emerging as part of the analytical artifacts.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--CnBkfkw7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tb179sb7t8fob8nicnb1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--CnBkfkw7--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/tb179sb7t8fob8nicnb1.png" alt="Image 2" width="800" height="899"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Image 2. Overview of the technical steps of the dataflow through an analytics platform, moving from source data to analytical artifacts.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Data professionals often describe this as "data flowing" from sources through data pipelines into analytical artifacts. The arrows in the diagram signify this dataflow, highlighting the key operations involved (in blue):&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Extract &amp;amp; Load Data&lt;/strong&gt;: This stage involves copying data from data sources to a data warehouse.&lt;br&gt;
The data is systematically duplicated from every relevant source, copying it table-by-table to a data warehouse (such as BigQuery). Technologies utilized include ingestion providers (SaaS tools) and custom-built data connectors. These tools extract and transfer data from various sources to a data warehouse. This process follows a specific schedule, such as loading new batches of data into the data warehouse every hour.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Data Transformations&lt;/strong&gt;: This phase involves modifying and integrating tables to generate new tables optimized for analytical use.&lt;br&gt;
Consider this example: you want to understand the purchasing behavior of customers aged between 20-30 in your online shop. This means you'll need to join product, customer, and transaction data to create a unified table for analytics. These data preparation tasks (e.g., joining fragmented data) for analysis are essentially what "Data Transformations" entail.&lt;br&gt;
At &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;, technologies utilized in this phase include &lt;a href="https://cloud.google.com/bigquery"&gt;BigQuery&lt;/a&gt; as a data warehouse, &lt;a href="https://www.getdbt.com/"&gt;dbt&lt;/a&gt; for data transformation, and a combination of &lt;a href="https://github.com/features/actions"&gt;GitHub Actions&lt;/a&gt; and &lt;a href="https://www.datafold.com/"&gt;Datafold&lt;/a&gt; for quality assurance.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Data Exposure to BI Tools&lt;/strong&gt;: This stage involves making the optimized tables from Step 2 accessible company-wide.&lt;br&gt;
Many users across your organization will want access to these tables, and they can obtain it by connecting their preferred tools (such as spreadsheets, business intelligence (BI) tools, or code) to the tables in the data warehouse. Connecting a data warehouse to a tool typically requires a few clicks, although depending on the context, it can sometimes involve more configuration or even coding.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Creation of Analytical Artifacts&lt;/strong&gt;: The final step involves the creation of analytical artifacts within these BI tools (or code).&lt;br&gt;
This work is typically done by BI users, analysts, and data scientists. These professionals take the accessible tables and transform them into actionable insights. They may create dashboards for monitoring business processes, produce KPIs and reports for strategic decisions, generate charts or plots for visual understanding, or even construct advanced predictive models for future planning.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
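&lt;p&gt;The four steps above can be sketched end-to-end in a few lines. The following is an illustrative, in-memory sketch only: plain Python dictionaries stand in for the sources, the warehouse, and the BI layer, and all data is made up.&lt;/p&gt;

```python
# Illustrative end-to-end sketch of the four steps: extract/load,
# transform, expose, analyze. Plain dictionaries stand in for the
# source systems and the data warehouse (hypothetical data).

# 1. Extract and load: copy tables from sources into a "warehouse".
sources = {
    "shop_db.customers": [
        {"id": 1, "name": "Ada", "age": 25},
        {"id": 2, "name": "Bob", "age": 41},
    ],
    "shop_db.transactions": [
        {"customer_id": 1, "product": "tyres", "amount": 120.0},
        {"customer_id": 2, "product": "wipers", "amount": 15.0},
        {"customer_id": 1, "product": "mats", "amount": 35.0},
    ],
}
warehouse = {name: list(rows) for name, rows in sources.items()}  # table-by-table copy

# 2. Transform: join customers to transactions, keep ages 20-30, and
# produce one analytics-ready table (the example from the text).
customers = {c["id"]: c for c in warehouse["shop_db.customers"]}
warehouse["analytics.purchases_20_30"] = [
    {"name": customers[t["customer_id"]]["name"], "amount": t["amount"]}
    for t in warehouse["shop_db.transactions"]
    if customers[t["customer_id"]]["age"] in range(20, 31)
]

# 3./4. Expose and analyze: a "BI tool" reads the table and derives a KPI.
revenue_20_30 = sum(r["amount"] for r in warehouse["analytics.purchases_20_30"])
```

&lt;p&gt;In a real stack, step 1 would be an ingestion tool writing to a warehouse such as BigQuery, step 2 a dbt model, and steps 3-4 a BI tool such as Looker.&lt;/p&gt;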

&lt;p&gt;This process not only unveils valuable business insights hidden within the data but also delivers information in an easy-to-understand format that supports informed decision-making across various levels of the organization.&lt;/p&gt;

&lt;h2&gt;Essential Hard Skills Along the Dataflow&lt;/h2&gt;

&lt;p&gt;Let's now have a look at the hard skills required across our analytics platform.&lt;/p&gt;

&lt;p&gt;Different types of work are required along the dataflow, so the hard skills change depending on the stage. These skills, shown along the yellow vertical lines on the left side of &lt;em&gt;image 3&lt;/em&gt;, are crucial for managing the different stages of the process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--N8X2kAUf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i13u910d2edf3qfutvwz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--N8X2kAUf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/i13u910d2edf3qfutvwz.png" alt="Image 3" width="800" height="899"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Image 3. Overview of the skills that are required to work with data in the different parts of the analytics platform.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Let’s take a closer look at what the different skills mean in the context of our analytics platform.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Software Engineering&lt;/strong&gt;: Implementing data connectors in-house, which extract data from sources, requires writing software.&lt;/p&gt;

&lt;p&gt;This skill is particularly crucial for the initial step of making data available in the data warehouse. It also requires cloud and infrastructure knowledge, as the software must operate in the cloud, follow a schedule, and consistently extract/load new data.&lt;/p&gt;
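&lt;p&gt;As a rough illustration of what such a connector does, the sketch below keeps a watermark (the last loaded timestamp) so that each scheduled run extracts only new rows. Plain lists stand in for the source API and the warehouse; the data and names are invented.&lt;/p&gt;

```python
# Minimal sketch of an incremental (watermark-based) data connector.
# The source and warehouse are plain lists here; in practice they would
# be a SaaS API and a warehouse such as BigQuery (illustrative only).

source_rows = [
    {"id": 1, "updated_at": "2023-08-01T10:00"},
    {"id": 2, "updated_at": "2023-08-01T11:00"},
    {"id": 3, "updated_at": "2023-08-01T12:00"},
]
warehouse_rows = []
state = {"watermark": ""}  # last updated_at we have already loaded

def run_scheduled_load():
    """One batch run: extract rows newer than the watermark, then load."""
    new_rows = [r for r in source_rows if r["updated_at"] > state["watermark"]]
    warehouse_rows.extend(new_rows)  # load into the warehouse table
    if new_rows:
        state["watermark"] = max(r["updated_at"] for r in new_rows)
    return len(new_rows)

first = run_scheduled_load()   # initial backfill loads everything
second = run_scheduled_load()  # next hourly run finds nothing new
source_rows.append({"id": 4, "updated_at": "2023-08-01T13:00"})
third = run_scheduled_load()   # only the new row is extracted
```

&lt;p&gt;A scheduler (for example, an hourly cron trigger) would call such a load function on a fixed cadence; the watermark is what keeps each batch cheap and idempotent.&lt;/p&gt;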

&lt;p&gt;&lt;strong&gt;Analytics Engineering&lt;/strong&gt;: Predominantly SQL-focused when working with dbt, this skill involves transforming raw data from sources into tables useful for analytics—a process labeled 'analytics engineering'.&lt;/p&gt;

&lt;p&gt;At &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;, this primarily takes place within the data warehouse (BigQuery) using a data transformation tool (dbt). From a technical standpoint, raw tables are cleaned, combined, filtered, and aggregated to create many new tables for analytics. A common practice to make tables analytics-ready is "&lt;a href="https://www.kimballgroup.com/data-warehouse-business-intelligence-resources/kimball-techniques/dimensional-modeling-techniques/"&gt;dimensional modeling&lt;/a&gt;".&lt;/p&gt;
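&lt;p&gt;As a rough, self-contained illustration of such a transformation, the following uses SQLite in place of BigQuery and a single SELECT in place of a dbt model; all table and column names are made up.&lt;/p&gt;

```python
import sqlite3

# Rough illustration of an analytics-engineering transformation: raw
# tables are joined and aggregated into an analytics-ready table, as a
# dbt model would do. SQLite stands in for BigQuery; names are made up.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE raw_customers (id INTEGER, country TEXT);
    CREATE TABLE raw_orders (customer_id INTEGER, amount REAL);
    INSERT INTO raw_customers VALUES (1, 'DE'), (2, 'DE'), (3, 'US');
    INSERT INTO raw_orders VALUES (1, 100.0), (2, 50.0), (3, 70.0), (1, 30.0);
""")

# The "model": one SELECT that joins and aggregates the raw tables.
con.execute("""
    CREATE TABLE fct_revenue_by_country AS
    SELECT c.country, SUM(o.amount) AS revenue
    FROM raw_orders o
    JOIN raw_customers c ON c.id = o.customer_id
    GROUP BY c.country
""")

rows = dict(con.execute(
    "SELECT country, revenue FROM fct_revenue_by_country"
).fetchall())
```

&lt;p&gt;In dbt, that SELECT would live in its own model file, and the tool would materialize the resulting table in the warehouse and track its dependencies on the raw tables.&lt;/p&gt;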

&lt;p&gt;&lt;strong&gt;Analytics &amp;amp; Data Science&lt;/strong&gt;: This refers to utilizing transformed tables and extracting insights from them.&lt;/p&gt;

&lt;p&gt;Analytical artifacts created at this stage include dashboards, KPIs, plots, forecasts, etc.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Business Understanding&lt;/strong&gt;: As data is the product of a business process, understanding these processes is crucial.&lt;/p&gt;

&lt;p&gt;Business understanding is required from the data transformation phase right through to the creation of analytical artifacts. Without business understanding, effectively transforming raw data into insights would be impossible.&lt;/p&gt;

&lt;p&gt;For example, accurately counting the number of converted leads requires understanding what qualifies as a 'converted lead' (considering fraud cases, credit checks, and so on).&lt;/p&gt;

&lt;h2&gt;Navigating the Implementation and Scaling of Data Teams&lt;/h2&gt;

&lt;p&gt;We've explored the process of transforming source data into analytical artifacts. Now, we face the most critical question: how can you efficiently integrate this mix of people and technology?&lt;/p&gt;

&lt;p&gt;Addressing this question involves various technical and non-technical dimensions, which, if not properly managed, can lead to costly decisions, inefficient data teams, and delays in delivering insights.&lt;/p&gt;

&lt;p&gt;Regrettably, the answers to these technical and non-technical questions change as your company grows. The journey of onboarding your initial five data practitioners differs from the leap from 30 to 35 practitioners.&lt;/p&gt;

&lt;h2&gt;How FINN Navigates the Complexities of Scaling&lt;/h2&gt;

&lt;p&gt;Understanding your organization's specific needs and aligning those with your team's structure and technical architecture is critical.&lt;/p&gt;

&lt;p&gt;The complexities, inherent in technical and non-technical aspects, aren't static; they evolve as your company grows. Thus, it's important to remember that the process is dynamic: transitioning from a small team of data practitioners to a larger one, or even adding just a few more members, can change your operational dynamics.&lt;/p&gt;

&lt;p&gt;Let's delve into the experience at &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;. As we built and scaled an analytics platform across multiple data teams, we found that our journey encompassed distinct scaling phases. Each phase introduced its own challenges and requirements, necessitating different approaches and shifts in team dynamics and responsibilities. &lt;em&gt;Image 4&lt;/em&gt; highlights how the responsibilities of roles keep changing.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--f4GWr5z9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n3jwx9uz89mjhgr8171a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--f4GWr5z9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/n3jwx9uz89mjhgr8171a.png" alt="Image 4" width="800" height="899"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Image 4. Overview of the roles required to work with data in the different parts of the analytics platform. The responsibilities of roles change while growing data teams. Initially, a small platform team has to cover data tasks end-to-end, from raw data to analytical artifacts. While growing, the roles can specialize in specific areas of the analytics platform.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Let's explore &lt;a href="https://www.finn.com/"&gt;FINN&lt;/a&gt;’s scaling phases in more detail.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Setting up the Analytics Platform Team&lt;/strong&gt;: This initial stage focuses on establishing the core analytics platform team, laying down the technical infrastructure, and delivering initial insights to stakeholders. The platform team works end-to-end, meaning it picks up raw data and delivers analytical artifacts like dashboards to stakeholders.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Onboarding Other Data Teams&lt;/strong&gt;: The second phase entails the integration of additional data roles (like analysts, analytics engineers, and data scientists), referred to as "data teams". They deliver insights using the analytics platform but don't form part of the platform team. Their introduction shifts the dynamics as they take over previously built analytical artifacts and BI-related tasks and share data transformation responsibilities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Specialization&lt;/strong&gt;: In the third phase, the platform team focuses on platform improvement and enhancing the data teams' productivity. Other data teams, meanwhile, specialize in specific business areas and data transformations. The extent of this specialization is highly context dependent and aligns with the company's unique requirements.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Navigating Conway's Law&lt;/strong&gt;: When scaling, a company may find itself in a phase filled with considerations around the balance of centralization vs decentralization, both from a technical setup perspective and from a team structure perspective. Informed by Conway's Law—which suggests that a team's structure should mirror the desired technical architecture—a company may seek to align its teams accordingly.&lt;br&gt;
For example, this could mean centralizing communication patterns when stability is needed in centralized, shared parts of the data pipeline. Keep in mind that each step towards centralization may secure some benefits at the cost of potentially surrendering those derived from decentralization, and vice versa.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Thus, navigating these trade-offs to find the sweet spot can be an intricate journey.&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;Understanding the necessary skills, roles, and technologies is crucial in the dynamic and complex journey of building and scaling effective data teams. The transformation of raw data into insightful analytical artifacts requires a broad set of hard skills and deep business understanding.&lt;/p&gt;

&lt;p&gt;The journey through different scaling phases—initial setup, onboarding, role specialization, and Conway's Law—will demand adaptability and resilience.&lt;/p&gt;

&lt;p&gt;This blog post provides a foundational understanding of these concepts, and we hope it serves as a valuable guide for your scaling journey. For deeper dives into these topics, consider subscribing to our blog; future posts will explore the "Scaling Phases" in more detail.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://jposor.substack.com/"&gt;Subscribe now&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Happy scaling! Thanks for reading!&lt;/p&gt;

&lt;p&gt;Thanks, &lt;a href="https://www.linkedin.com/in/kosara-g/"&gt;Kosara Golemshinska&lt;/a&gt; and &lt;a href="https://www.linkedin.com/in/meyns/"&gt;Chris Meyns&lt;/a&gt;, for recommendations and for reviewing drafts!&lt;/p&gt;

&lt;p&gt;&lt;em&gt;This post was originally published on &lt;a href="https://jposor.substack.com/p/modern-data-stack-scaling-people"&gt;Substack&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;Appendix: Pivotal Dimensions&lt;/h2&gt;

&lt;p&gt;This is an overview of pivotal dimensions when scaling data teams. The primary aim is to provide a broad picture, while the secondary objective is to encourage you to subscribe to this blog. Doing so will ensure that future in-depth discussions on these topics (for example, 'Modern Data Stack: Deep Dive into Pivotal Dimensions') land directly in your inbox. 😊&lt;/p&gt;

&lt;p&gt;&lt;a href="https://jposor.substack.com/"&gt;Subscribe now&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Phase Goals&lt;/strong&gt;: What goals should be set for each data team to generate business value in a given scaling phase?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Technology&lt;/strong&gt;: Does your technology empower data practitioners, or does it hinder them due to improper usage patterns, knowledge gaps, lack of support, or technical debt?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;(De)centralization Trade-offs&lt;/strong&gt;: What are the implications of fundamental decisions regarding "decentralized vs centralized" data teams in your organizational structure and technology architecture?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Technical Debt&lt;/strong&gt;: How can you mitigate the creeping technical debt that could hamper and gradually slow progress?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Policy Automation&lt;/strong&gt;: How can you implement policies (such as programming standards) with decentralized data practitioners?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Critical Skills&lt;/strong&gt;: What skills are necessary for each scaling phase?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Communication Processes&lt;/strong&gt;: How many people, on average, need to be involved to deliver insight to their stakeholders?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Knowledge Transfer&lt;/strong&gt;: Are knowledge silos emerging, creating bottlenecks? How can you efficiently distribute required business knowledge, especially when business processes evolve?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Support Culture&lt;/strong&gt;: How and when should you foster a support culture? Are data practitioners blocked due to a lack of information that others may have?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Resource Alignment&lt;/strong&gt;: How can you manage effort peaks? Can you flexibly reassign data practitioners between business units?&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Engineering &amp;lt;&amp;gt; Data Alignment&lt;/strong&gt;: How can you guarantee that changes to data sources, business processes, or technical aspects (such as a schema) don't disrupt your downstream analytical artifacts?&lt;/p&gt;

</description>
      <category>data</category>
      <category>analytics</category>
    </item>
    <item>
      <title>Driving the future of mobility with an AI Hackathon at FINN</title>
      <dc:creator>FINN Admin</dc:creator>
      <pubDate>Wed, 07 Jun 2023 12:12:53 +0000</pubDate>
      <link>https://forem.com/finnauto/driving-the-future-of-mobility-with-an-ai-hackathon-at-finn-27e0</link>
      <guid>https://forem.com/finnauto/driving-the-future-of-mobility-with-an-ai-hackathon-at-finn-27e0</guid>
      <description>&lt;p&gt;How do you get a bunch of disruptive minds together in a (virtual) room to build innovative AI applications? Organize a hackathon! During the FINN feat. AI Hackathon held on May 30, participants embarked on a one-day challenge to generate measurable business impact by leveraging AI to solve problems. Solutions developed ranged from automated vehicle damage recognition and GPT-4-powered dynamic car pricing to a bot to answer queries about vehicles directly from Slack.&lt;/p&gt;

&lt;p&gt;In this article, we’ll showcase the two winning projects: one winner from the teams competing at FINN’s base in Germany, and the other from the New York City headquarters. Get ready for some inspiration on hacking your business with AI 🧠🤖&lt;/p&gt;

&lt;h2&gt;Winner 🇩🇪: Finding your dream car with the power of AI&lt;/h2&gt;

&lt;p&gt;Picture discovering your perfect car with just a few clicks. That’s the remarkable power of personalization. Did you know that &lt;a href="https://segment.com/pdfs/State-of-Personalization-Report-Twilio-Segment-2023.pdf"&gt;56% of consumers&lt;/a&gt; state they would become repeat buyers if they get a personalized experience?&lt;/p&gt;

&lt;h3&gt;Problem&lt;/h3&gt;

&lt;p&gt;How can we at FINN leverage personalization to bring the car dealership experience to our customers on the go?&lt;/p&gt;

&lt;p&gt;That’s the question our winning team in Germany asked themselves. &lt;a href="https://www.linkedin.com/in/sofiane-zeghoud-152b15176/"&gt;Sofiane Zeghoud&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/ishtiaquezafar/"&gt;Ishtiaque Zafar&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/sofyadurneva/"&gt;Sofya Durneva&lt;/a&gt;, and &lt;a href="https://www.linkedin.com/in/robert-ghazaryan/"&gt;Robert Ghazaryan&lt;/a&gt; were driven by a shared vision: transforming the way our customers engage with car subscriptions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--nxWRKQMX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/omos62npmpjqyiv25mdu.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--nxWRKQMX--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/omos62npmpjqyiv25mdu.jpg" alt="From left to right: Sofya Durneva, Sofiane Zeghoud, Ishtiaque Zafar, and Robert Ghazaryan." width="800" height="600"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;From left to right: Sofya Durneva, Sofiane Zeghoud, Ishtiaque Zafar, and Robert Ghazaryan.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;The team noticed that we currently offer a uniform product listing page (PLP) experience to everyone. The problem? Our customers are all unique and have distinct needs and preferences. We want to meet them where they are.&lt;/p&gt;

&lt;h3&gt;Solution&lt;/h3&gt;

&lt;p&gt;In just a few hours, the team created a fully functional recommendation engine prototype for our website, delivering personalized suggestions that instantly connect with each customer.&lt;/p&gt;

&lt;p&gt;For a smooth customer experience, the engine uses already available analytics data to display the most relevant vehicles on the PLP. Imagine effortlessly navigating our website and being greeted by a vibrant display of vehicles that are exactly what you’re looking for. The “Help me pick a car” chat box becomes your trusted advisor that can always find the perfect car for your taste.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--TwUAGDCu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6epcuqkzwp7oi8o50ucf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--TwUAGDCu--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/6epcuqkzwp7oi8o50ucf.png" alt="Personalized car recommendations hand picked for the visitor’s needs." width="800" height="448"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Personalized car recommendations hand picked for the visitor’s needs.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;By offering a seamless personalized experience this solution:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;drives &lt;strong&gt;higher clickthrough rates&lt;/strong&gt; (CTR)&lt;/li&gt;
&lt;li&gt;drives &lt;strong&gt;higher conversion rates&lt;/strong&gt; (CVR) to subscription&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;reduces paid marketing costs&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is how the magic happens: Sofiane leveraged the power of deep learning to anticipate each visitor’s preferences for our top car models and list the models accordingly. Under the hood, the “Help me pick a car” feature runs on GPT-4, a powerful language model, seamlessly fusing AI and user-centric design to help customers find their dream car with ease.&lt;/p&gt;
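&lt;p&gt;The post doesn't share the model itself, but the core idea of reordering the listing by predicted preference can be sketched with a simple dot-product ranking. Everything below (features, weights, and car names) is invented for illustration.&lt;/p&gt;

```python
# Toy sketch of preference-based ranking for a product listing page.
# A real system would learn the visitor vector with a deep model; here
# the feature values and visitor weights are invented for illustration.

cars = {
    "compact_ev":   {"electric": 1.0, "suv": 0.0, "budget": 0.8},
    "family_suv":   {"electric": 0.0, "suv": 1.0, "budget": 0.4},
    "electric_suv": {"electric": 1.0, "suv": 1.0, "budget": 0.1},
}

# Inferred from analytics data (e.g., pages viewed): this visitor
# leans strongly towards electric cars, mildly towards SUVs.
visitor = {"electric": 0.9, "suv": 0.3, "budget": 0.1}

def score(features):
    """Dot product of the visitor's weights and a car's features."""
    return sum(visitor[k] * v for k, v in features.items())

# Reorder the listing page so the best matches come first.
ranked = sorted(cars, key=lambda name: score(cars[name]), reverse=True)
```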

&lt;p&gt;The winning team is already setting its sights on training the models on richer fleet data and fine-tuning the engine on more matching parameters.&lt;/p&gt;

&lt;p&gt;As we continue making mobility fun and sustainable, we invite you to stay tuned for more exciting innovations from us! But first, let’s explore the winning idea of our USA-based team.&lt;/p&gt;

&lt;h2&gt;Winner 🇺🇲: Revolutionizing user acquisition calls with AI&lt;/h2&gt;

&lt;p&gt;In the hackathon held at FINN’s NYC base, the winner was Team User Acquisition (UA), consisting of &lt;a href="https://www.linkedin.com/in/kevintallen/"&gt;Kevin Allen&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/bethanylooi/"&gt;Bethany Looi&lt;/a&gt;, &lt;a href="https://www.linkedin.com/in/anna-kohlasch-75703516a/"&gt;Anna Kohlasch&lt;/a&gt;, and &lt;a href="https://www.linkedin.com/in/ericvanthuyne/"&gt;Eric Van Thuyne&lt;/a&gt;. Team UA pitched a project to revolutionize the processing of user acquisition call information with the use of artificial intelligence.&lt;/p&gt;

&lt;h3&gt;Problem&lt;/h3&gt;

&lt;p&gt;Team UA addressed a tension: on the one hand, calls with customers are a crucial part of the work for many people in user acquisition at FINN; on the other hand, extracting actionable information from call records is quite cumbersome. Yes, calls are recorded, but re-listening to those calls would take ages. And yes, call recordings are transcribed, but again, plowing through all those transcriptions would be a huge effort. As a result, it is currently difficult to keep track of which topics came up in an individual call with a customer, or whether there are trends or common topics across multiple calls. A lot of valuable call data is therefore left unused, or takes significant effort to use, restricting the UA department’s overall efficiency.&lt;/p&gt;

&lt;h3&gt;Solution&lt;/h3&gt;

&lt;p&gt;To solve this issue, Team UA developed an AI-powered automation to facilitate quick and easy call summaries. The solution automatically summarizes individual calls, identifies customer intent (that is, whether the customer is likely to want to get a subscription), action items (such as requesting a customer’s confirmation), as well as any keywords associated with the call. With this in hand, anyone can swiftly get the gist of what was covered in a call.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DYDHvU9m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h09aunpbfsg8xsl95ovq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DYDHvU9m--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h09aunpbfsg8xsl95ovq.png" alt="Note on Hubspot with estimated customer intent, a call summary, action items, and keywords" width="800" height="259"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Note on Hubspot with estimated customer intent, a call summary, action items, and keywords&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;In addition, the keywords extracted from a call can be used to automatically summarize the contents of multiple calls over a specific period. A keyword cloud can give a visual summary of the common topics that user acquisition agents are dealing with.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--rTD6ySII--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ti1z86c8zrveyyqm133y.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--rTD6ySII--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ti1z86c8zrveyyqm133y.png" alt="A keyword cloud visually summarizes the topics of multiple calls" width="800" height="467"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;A keyword cloud visually summarizes the topics of multiple calls&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;How does the solution work in practice? The AI-driven automation consists of the following steps:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A phone call between a sales agent and a customer is recorded and transcribed via &lt;a href="https://www.cloudtalk.io/"&gt;CloudTalk&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;Using the low-code tool &lt;a href="http://www.make.com"&gt;Make&lt;/a&gt;, an automation scenario fetches the call transcript from CloudTalk, matches the call to the customer contact details on &lt;a href="https://www.hubspot.com/"&gt;Hubspot&lt;/a&gt;, and sends the transcript to &lt;a href="https://chat.openai.com"&gt;ChatGPT&lt;/a&gt; for parsing.&lt;/li&gt;
&lt;li&gt;ChatGPT uses the call transcript to create a call summary, and to extract inferred customer intent, action items, and keywords.&lt;/li&gt;
&lt;li&gt;The Make scenario adds a note with the extracted information to the relevant customer record on Hubspot, and saves the extracted keywords in a Google sheet.&lt;/li&gt;
&lt;/ol&gt;
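&lt;p&gt;The shape of that automation can be sketched in a few lines of Python. Dictionaries stand in for CloudTalk transcripts and Hubspot records, and a naive keyword counter stands in for the ChatGPT step; this is illustrative only, not the team's actual Make scenario.&lt;/p&gt;

```python
from collections import Counter

# Sketch of the four-step automation. Dictionaries stand in for
# CloudTalk transcripts and Hubspot records; a naive keyword counter
# stands in for the ChatGPT summarization step (all illustrative).

STOPWORDS = {"the", "a", "i", "to", "and", "you", "for", "is", "my"}

def parse_call(transcript):
    """Stand-in for the LLM step: extract intent and keywords."""
    words = [w.strip(".,?").lower() for w in transcript.split()]
    keywords = [w for w, _ in Counter(
        w for w in words if w not in STOPWORDS
    ).most_common(3)]
    intent = "subscribe" if "subscription" in words else "unknown"
    return {"intent": intent, "keywords": keywords}

def process_call(call, hubspot_contacts, keyword_sheet):
    """Fetch the transcript, match the contact, parse, write the note."""
    contact = hubspot_contacts[call["phone"]]     # match on "Hubspot"
    note = parse_call(call["transcript"])         # "ChatGPT" parsing
    contact.setdefault("notes", []).append(note)  # note on the record
    keyword_sheet.extend(note["keywords"])        # "Google sheet" log
    return note

hubspot = {"+1-555": {"name": "Sam"}}
sheet = []
note = process_call(
    {"phone": "+1-555",
     "transcript": "I would like a subscription for the electric suv"},
    hubspot, sheet,
)
```

&lt;p&gt;The keywords accumulated in the sheet are exactly what would feed the keyword cloud over a given period.&lt;/p&gt;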

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--cQtpALbz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9a91ncsvrf30y2lxn9pt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--cQtpALbz--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_800/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9a91ncsvrf30y2lxn9pt.png" alt="Make scenario that summarizes calls, and extracts customer intent, action items, and keywords" width="800" height="492"&gt;&lt;/a&gt;&lt;br&gt;
&lt;em&gt;Make scenario that summarizes calls, and extracts customer intent, action items, and keywords&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;Why it's so good&lt;/h3&gt;

&lt;p&gt;Team UA’s AI-driven solution is expected to offer three core direct benefits to anyone working with call data in user acquisition:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Time management&lt;/strong&gt; — Automated call summaries can reduce manual effort in figuring out what happened in a call, or across multiple calls.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Increased sales&lt;/strong&gt; — The summaries capture valuable insights that can in turn be used to train agents and enhance scripts.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Collaboration&lt;/strong&gt; — The call summaries can be useful in collaboration, by allowing the easy sharing of keywords and learnings with other departments.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In addition, in the longer term the automated call summaries and keyword clouds are also expected to offer the following advantages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Sales training&lt;/strong&gt; — Call summaries can be used to create training exercises and prepare agents for specific sales scenarios.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Capacity planning&lt;/strong&gt; — A good, quick, and easy overview of call topics will enable the department to scale, anticipate resource needs, and allocate resources effectively.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Customer service and close rates&lt;/strong&gt; — Call information can ultimately be used to increase the number of subscriptions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In fact, Team UA’s project was developed to such a high standard that it was implemented within days of winning the US hackathon 🥳&lt;/p&gt;

&lt;h2&gt;What's next?&lt;/h2&gt;

&lt;p&gt;AI hackathons can be a powerful tool when you’re seeking to drive innovation and generate tangible business outcomes. At FINN, we’re definitely going to continue implementing some more of the solutions developed during the hackathon day — and come up with new ones! How about you, have you used AI to build any business solutions? What worked well, what didn’t? Let us know in the comments.&lt;/p&gt;

</description>
      <category>finn</category>
      <category>hackathon</category>
      <category>ai</category>
    </item>
  </channel>
</rss>
