<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Adriano Avelino</title>
    <description>The latest articles on Forem by Adriano Avelino (@adrianoavelino).</description>
    <link>https://forem.com/adrianoavelino</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F31488%2F4eb823eb-a5f5-43fa-8821-2a1e206c77fa.jpeg</url>
      <title>Forem: Adriano Avelino</title>
      <link>https://forem.com/adrianoavelino</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/adrianoavelino"/>
    <language>en</language>
    <item>
      <title>📦 ASDF: Managing Language and Tool Versions in One Place</title>
      <dc:creator>Adriano Avelino</dc:creator>
      <pubDate>Wed, 27 Nov 2024 00:46:07 +0000</pubDate>
      <link>https://forem.com/adrianoavelino/asdf-gerenciando-versoes-de-linguagens-e-ferramentas-num-lugar-so-1lmh</link>
      <guid>https://forem.com/adrianoavelino/asdf-gerenciando-versoes-de-linguagens-e-ferramentas-num-lugar-so-1lmh</guid>
      <description>&lt;p&gt;Imagine que você está em um restaurante fast-food e você tem a liberdade de escolher os acompanhamentos do seu lanche. Você pode optar por um pão integral, carne de frango grelhada, queijo cheddar e tomate. Mas e se, na próxima vez, você quiser um hambúrguer com bacon e cebola caramelizada?&lt;/p&gt;

&lt;p&gt;O &lt;strong&gt;asdf&lt;/strong&gt; funciona como esse fast-food. No lugar de lanches, temos linguagens de programação e ferramentas como Python, Ruby, Node.js, etc. Em vez de ingredientes, temos versões específicas dessas linguagens.&lt;/p&gt;

&lt;p&gt;Como funciona:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;You, the customer&lt;/strong&gt;: the developer who needs a particular version of a language for a specific project.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The menu&lt;/strong&gt;: the &lt;strong&gt;asdf&lt;/strong&gt; plugin repository, where you find all the available languages and tools.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The attendant&lt;/strong&gt;: &lt;strong&gt;asdf&lt;/strong&gt; itself, which fetches the exact version of the language you chose and sets it up for you.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For example, imagine you're working on two projects: one that uses Python 3.6 and another that uses Python 3.9. With &lt;strong&gt;asdf&lt;/strong&gt;, you can install both versions and point each project at the right one, with no conflicts.&lt;/p&gt;
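&lt;p&gt;As a sketch of that two-project scenario (the project folders and patch versions below are illustrative), the commands would look like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf install python 3.6.15
asdf install python 3.9.20

cd ~/legacy-project
asdf local python 3.6.15   # pins 3.6 for this project only

cd ~/new-project
asdf local python 3.9.20   # pins 3.9 for this project only
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;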

&lt;p&gt;In short:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;asdf&lt;/strong&gt; is like a fast-food restaurant where you build your own meal, except instead of food you're "assembling" your development environment. It's a powerful, flexible tool for managing multiple versions of programming languages efficiently and in one place.&lt;/p&gt;

&lt;p&gt;In this post, we'll explore how to install and use &lt;strong&gt;asdf&lt;/strong&gt; to manage your dependencies effectively. We'll cover:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;What is asdf&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Plugins and versions&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Key asdf Commands&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Installing asdf&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;A Practical Guide to Installing Plugins and Versions with asdf&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;(Optional) Lab&lt;/strong&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;What is asdf?&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;O &lt;strong&gt;&lt;a href="https://asdf-vm.com/pt-br/" rel="noopener noreferrer"&gt;asdf&lt;/a&gt;&lt;/strong&gt; é um gerenciador de versões universal que permite que você utilize múltiplas versões de linguagens de programação e ferramentas, tudo em um só lugar. Com ele, você pode alternar entre diferentes versões de uma mesma linguagem como se estivesse trocando de roupa, facilitando seu fluxo de trabalho e evitando conflitos.&lt;/p&gt;

&lt;p&gt;Ao contrário de outros gerenciadores, como o &lt;strong&gt;&lt;a href="https://github.com/rbenv/rbenv" rel="noopener noreferrer"&gt;rbenv&lt;/a&gt;&lt;/strong&gt; para Ruby ou o &lt;strong&gt;&lt;a href="https://github.com/nvm-sh/nvm" rel="noopener noreferrer"&gt;nvm&lt;/a&gt;&lt;/strong&gt; para Node.js, o &lt;strong&gt;asdf&lt;/strong&gt; se destaca pela sua versatilidade. Ele suporta uma variedade de linguagens através de &lt;strong&gt;plugins&lt;/strong&gt;. Você pode ter várias versões do Node.js, Python ou Java instaladas e alternar entre elas sem esforço.&lt;/p&gt;

&lt;p&gt;Se ainda não ficou muito claro, Fábio Akita fala e dá algumas dicas de uso do asdf no seu vídeo &lt;a href="https://www.youtube.com/watch?v=epiyExCyb2s&amp;amp;t=2440s" rel="noopener noreferrer"&gt;The DEFINITIVE UBUNTU Guide for Beginning Devs&lt;/a&gt;, no minuto &lt;strong&gt;40:42&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Neste post, vamos explorar como instalar e usar o &lt;strong&gt;asdf&lt;/strong&gt; para gerenciar suas dependências de forma prática. Mas antes, vamos entender dois conceitos-chave: &lt;strong&gt;plugins&lt;/strong&gt; e &lt;strong&gt;versões&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Plugins and versions
&lt;/h3&gt;

&lt;p&gt;According to the &lt;a href="https://asdf-vm.com/pt-br/manage/plugins.html" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt;, &lt;strong&gt;plugins&lt;/strong&gt; are the extensions that let &lt;strong&gt;asdf&lt;/strong&gt; manage different tools, such as Node.js, Ruby, and Elixir. &lt;strong&gt;Versions&lt;/strong&gt;, in turn, are the specific releases of those tools you can use. For example, you might pick Node.js &lt;strong&gt;v20.18.0&lt;/strong&gt; for one project while using a different version for another.&lt;/p&gt;

&lt;p&gt;Ready to dive into the world of &lt;strong&gt;asdf&lt;/strong&gt;? Let's get practical!&lt;/p&gt;

&lt;h2&gt;
  
  
  Key asdf Commands
&lt;/h2&gt;

&lt;p&gt;These are the essential commands for installing, listing, and managing versions with &lt;strong&gt;asdf&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;&lt;code&gt;asdf list&lt;/code&gt;&lt;/strong&gt;: Lists every plugin and the versions already installed in your environment. Handy for quickly checking what's available.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;&lt;code&gt;asdf plugin add &amp;lt;PLUGIN-NAME&amp;gt;&lt;/code&gt;&lt;/strong&gt;: Adds a plugin so &lt;strong&gt;asdf&lt;/strong&gt; can manage a new language or tool. For example, to add the Python plugin, run:&lt;br&gt;
&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf plugin add python
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;asdf list all &amp;lt;PLUGIN-NAME&amp;gt;&lt;/code&gt;&lt;/strong&gt;: Lists every version available for a given plugin, so you can choose the one to install. For example, to see the Python versions:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf list all python
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;asdf install &amp;lt;PLUGIN-NAME&amp;gt; &amp;lt;VERSION&amp;gt;&lt;/code&gt;&lt;/strong&gt;: Installs a specific version of a language or tool. For example, to install Python 3.13.0:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;install &lt;/span&gt;python 3.13.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;asdf global &amp;lt;PLUGIN-NAME&amp;gt; &amp;lt;VERSION&amp;gt;&lt;/code&gt;&lt;/strong&gt;: Sets a specific version of a language or tool as the system-wide default. To make Python 3.13.0 the global version, run:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf global python 3.13.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;&lt;code&gt;asdf local &amp;lt;PLUGIN-NAME&amp;gt; &amp;lt;VERSION&amp;gt;&lt;/code&gt;&lt;/strong&gt;: Sets a specific version for the current project directory only. Navigate to the project folder and run the command. For example:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;local &lt;/span&gt;python 3.13.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: this command creates a file named &lt;code&gt;.tool-versions&lt;/code&gt; in the project folder, letting you pin a specific version per repository, independently of the global version.&lt;/p&gt;
&lt;/blockquote&gt;
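&lt;p&gt;For illustration, a &lt;code&gt;.tool-versions&lt;/code&gt; file is plain text with one tool and version per line (the versions below are examples):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;python 3.13.0
nodejs 23.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;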

&lt;p&gt;Now that we know the basic commands, let's power up our environment with some plugins and versions. Come along!&lt;/p&gt;

&lt;h2&gt;
  
  
  Installing asdf
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: if you're the kind of person who likes to try a tool out before configuring it in your own environment, skip ahead to the &lt;strong&gt;(Optional) Lab&lt;/strong&gt; section and evaluate asdf inside a Docker container first.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Clone the repository into your home folder:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/asdf-vm/asdf.git ~/.asdf &lt;span class="nt"&gt;--branch&lt;/span&gt; v0.14.1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run the commands below to configure your &lt;code&gt;~/.bashrc&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s1"&gt;'. "$HOME/.asdf/completions/asdf.bash"'&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; ~/.bashrc
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s1"&gt;'. "$HOME/.asdf/asdf.sh"'&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; ~/.bashrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Apply the settings to the active terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;source&lt;/span&gt; ~/.bashrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;For other shells, such as zsh or fish, &lt;a href="https://asdf-vm.com/pt-br/guide/getting-started.html#_3-adicionando-ao-seu-shell" rel="noopener noreferrer"&gt;see the official asdf documentation&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  A Practical Guide to Installing Plugins and Versions with asdf
&lt;/h2&gt;

&lt;p&gt;Let's install plugins and set versions for some popular languages and tools with &lt;strong&gt;asdf&lt;/strong&gt;! Step by step, you'll see how to set up the environment for &lt;strong&gt;Node.js&lt;/strong&gt;, &lt;strong&gt;Java&lt;/strong&gt;, &lt;strong&gt;Maven&lt;/strong&gt;, and &lt;strong&gt;Python&lt;/strong&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  🚀 Getting Started with Node.js
&lt;/h3&gt;

&lt;p&gt;To set up &lt;strong&gt;Node.js&lt;/strong&gt; in your environment, follow these steps:&lt;/p&gt;

&lt;p&gt;1- &lt;strong&gt;Install dependencies&lt;/strong&gt;: First, make sure the system has everything the plugin needs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;apt-get &lt;span class="nb"&gt;install &lt;/span&gt;dirmngr gpg curl gawk
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2- &lt;strong&gt;Add the Node.js plugin&lt;/strong&gt;: With the plugin in place, &lt;strong&gt;asdf&lt;/strong&gt; can manage Node versions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf plugin add nodejs https://github.com/asdf-vm/asdf-nodejs.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3- &lt;strong&gt;Check the installed plugin&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;nodejs
  No versions installed
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4- &lt;strong&gt;Install a Node.js version&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;Before installing, list the available versions and pick one:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf list all nodejs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You should see a list similar to the example below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;...
22.11.0
23.0.0
23.1.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Choose one of the available versions and run the command below to install it; for example, version &lt;strong&gt;23.0.0&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;install &lt;/span&gt;nodejs 23.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;5- &lt;strong&gt;Set the Node.js version&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Globally&lt;/strong&gt; (system-wide):
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf global nodejs 23.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Keep in mind: the next step is optional, but to run it you'll need the &lt;strong&gt;latest&lt;/strong&gt; version installed first, via &lt;code&gt;asdf install nodejs latest&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Locally&lt;/strong&gt; (current directory only):
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;local &lt;/span&gt;nodejs latest &lt;span class="c"&gt;# ou qualquer outra versão disponível&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;6- &lt;strong&gt;Check the Node.js version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;node &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  ☕ Setting Up Java
&lt;/h3&gt;

&lt;p&gt;1- &lt;strong&gt;Add the Java plugin&lt;/strong&gt;: This lets &lt;strong&gt;asdf&lt;/strong&gt; install and manage Java versions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf plugin add java https://github.com/halcyon/asdf-java.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2- &lt;strong&gt;Check the available Java versions&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf list all java
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3- &lt;strong&gt;Set the JAVA_HOME variable&lt;/strong&gt;: This ensures the system can locate the active Java installation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="nb"&gt;.&lt;/span&gt; ~/.asdf/plugins/java/set-java-home.bash &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; ~/.bashrc
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4- &lt;strong&gt;Install a specific Java version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;install &lt;/span&gt;java adoptopenjdk-8.0.432+6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: run &lt;code&gt;asdf list all java&lt;/code&gt; to see the available versions.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;5- &lt;strong&gt;Set the Java version globally&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf global java adoptopenjdk-8.0.432+6
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;6- &lt;strong&gt;Check the installed Java version&lt;/strong&gt; (the shim only resolves once a version is set):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;java &lt;span class="nt"&gt;-version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  🔧 Setting Up Maven
&lt;/h3&gt;

&lt;p&gt;1- &lt;strong&gt;Add the Maven plugin&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf plugin add maven
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2- &lt;strong&gt;Install the latest Maven version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;install &lt;/span&gt;maven latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3- &lt;strong&gt;Set the global Maven version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf global maven latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4- &lt;strong&gt;Check the installed Maven version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;mvn &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  🐍 Setting Up Python
&lt;/h3&gt;

&lt;p&gt;1- &lt;strong&gt;Install the build dependencies for Python&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;apt-get update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="nb"&gt;sudo &lt;/span&gt;apt-get &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
       make build-essential libssl-dev zlib1g-dev &lt;span class="se"&gt;\&lt;/span&gt;
       libbz2-dev libreadline-dev libsqlite3-dev wget curl &lt;span class="se"&gt;\&lt;/span&gt;
       llvm libncurses5-dev libncursesw5-dev &lt;span class="se"&gt;\&lt;/span&gt;
       xz-utils tk-dev libffi-dev liblzma-dev &lt;span class="se"&gt;\&lt;/span&gt;
       git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2- &lt;strong&gt;Add the Python plugin&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf plugin add python
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;3- &lt;strong&gt;Install the latest Python version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf &lt;span class="nb"&gt;install &lt;/span&gt;python latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;4- &lt;strong&gt;Set the global Python version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;asdf global python latest
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;5- &lt;strong&gt;Check the Python version&lt;/strong&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These steps are enough to install and configure the essential languages and tools with &lt;strong&gt;asdf&lt;/strong&gt;. With them in place, your environment is ready to switch between specific versions as needed.&lt;/p&gt;

&lt;h2&gt;
  
  
  (Optional) Lab
&lt;/h2&gt;

&lt;p&gt;If you're here, it's because you'd like to test asdf before setting it up on your machine. So let's go!&lt;/p&gt;

&lt;h3&gt;
  
  
  Prerequisites:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Docker&lt;/li&gt;
&lt;li&gt;Docker Compose&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Creating the environment
&lt;/h3&gt;

&lt;p&gt;To make the lab easy, we've prepared a preconfigured, ready-to-use environment in a Docker container. You just need to create the files below:&lt;/p&gt;

&lt;p&gt;File &lt;code&gt;Dockerfile&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;FROM ubuntu:24.04

&lt;span class="c"&gt;# Atualiza o sistema e instala as dependências&lt;/span&gt;
RUN apt-get update &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; apt-get &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;-y&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    make build-essential libssl-dev zlib1g-dev &lt;span class="se"&gt;\&lt;/span&gt;
    libbz2-dev libreadline-dev libsqlite3-dev wget curl &lt;span class="se"&gt;\&lt;/span&gt;
    llvm libncurses5-dev libncursesw5-dev &lt;span class="se"&gt;\&lt;/span&gt;
    xz-utils tk-dev libffi-dev liblzma-dev &lt;span class="se"&gt;\&lt;/span&gt;
    git nano &lt;span class="nb"&gt;sudo&lt;/span&gt;

&lt;span class="c"&gt;# Adiciona um novo usuário&lt;/span&gt;
RUN useradd &lt;span class="nt"&gt;-m&lt;/span&gt; &lt;span class="nt"&gt;-s&lt;/span&gt; /bin/bash usuario &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"usuario:123"&lt;/span&gt; | chpasswd &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
usermod &lt;span class="nt"&gt;-aG&lt;/span&gt; &lt;span class="nb"&gt;sudo &lt;/span&gt;usuario

&lt;span class="c"&gt;# Da permissões sudo sem senha para o novo usuário&lt;/span&gt;
RUN &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"usuario ALL=(ALL) NOPASSWD: ALL"&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; /etc/sudoers

&lt;span class="c"&gt;# Comando inicial para rodar o container no terminal do novo usuário&lt;/span&gt;
USER usuario

&lt;span class="c"&gt;# Instala e configura o asdf&lt;/span&gt;
RUN git clone https://github.com/asdf-vm/asdf.git ~/.asdf &lt;span class="nt"&gt;--branch&lt;/span&gt; v0.14.1
RUN &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s1"&gt;'. "$HOME/.asdf/completions/asdf.bash"'&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; ~/.bashrc &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
    &lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s1"&gt;'. "$HOME/.asdf/asdf.sh"'&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&amp;gt;&lt;/span&gt; ~/.bashrc

CMD &lt;span class="o"&gt;[&lt;/span&gt;&lt;span class="s2"&gt;"/bin/bash"&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;File &lt;code&gt;docker-compose.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;services:
  asdf:
    build:
      context: &lt;span class="nb"&gt;.&lt;/span&gt;  &lt;span class="c"&gt;# Indica o diretório onde está o Dockerfile&lt;/span&gt;
      dockerfile: Dockerfile
    container_name: asdf
    &lt;span class="nb"&gt;tty&lt;/span&gt;: &lt;span class="nb"&gt;true&lt;/span&gt;  &lt;span class="c"&gt;# Mantém o terminal aberto&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Building and running the container
&lt;/h3&gt;

&lt;p&gt;With the files in place, start the container with the command below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nt"&gt;--build&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Where:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;-d&lt;/code&gt;: Runs the container in the background&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;--build&lt;/code&gt;: Rebuilds the image before starting, picking up any Dockerfile changes&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 Installing all the dependencies will take a little while... ⏳ Stretch your legs and grab some water 💧!&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Once it's up, open a shell in the container with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;asdf bash
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You're now ready to follow the tutorial from the section A Practical Guide to Installing Plugins and Versions with asdf and try out the &lt;strong&gt;asdf&lt;/strong&gt; commands!&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://asdf-vm.com/pt-br/" rel="noopener noreferrer"&gt;Documentação Oficial&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/asdf-vm/asdf-nodejs" rel="noopener noreferrer"&gt;Plugin asdf para NodeJS&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/halcyon/asdf-java" rel="noopener noreferrer"&gt;Plugin asdf para Java&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/halcyon/asdf-maven" rel="noopener noreferrer"&gt;Plugin asdf para Maven&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/asdf-community/asdf-python" rel="noopener noreferrer"&gt;Plugin asdf para Python&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>asdf</category>
      <category>rbenv</category>
      <category>nvm</category>
      <category>mvn</category>
    </item>
    <item>
      <title>Kafka Lambda Sink Connector with Localstack</title>
      <dc:creator>Adriano Avelino</dc:creator>
      <pubDate>Wed, 09 Oct 2024 01:30:59 +0000</pubDate>
      <link>https://forem.com/adrianoavelino/conector-kafka-lambda-sink-com-localstack-oe0</link>
      <guid>https://forem.com/adrianoavelino/conector-kafka-lambda-sink-com-localstack-oe0</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;In corporate projects, it's common to move data from one place to another using Kafka and its &lt;em&gt;source&lt;/em&gt; and &lt;em&gt;sink&lt;/em&gt; connectors. Source connectors feed data into Kafka topics, while &lt;strong&gt;sink&lt;/strong&gt; connectors export that data to external systems, such as databases, logging tools, or AWS services like a &lt;em&gt;Lambda&lt;/em&gt;. In this tutorial we'll focus on a sink connector, specifically the &lt;strong&gt;Lambda Sink Connector&lt;/strong&gt;, using &lt;a href="https://www.localstack.cloud/" rel="noopener noreferrer"&gt;Localstack&lt;/a&gt; to simulate the AWS Lambda service.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;p&gt;Before starting, make sure the following are set up in your environment:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Docker&lt;/strong&gt;: to create the containers that will run Kafka and Localstack.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Docker Compose&lt;/strong&gt;: to orchestrate the containers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://docs.aws.amazon.com/pt_br/cli/latest/userguide/getting-started-install.html" rel="noopener noreferrer"&gt;AWS CLI&lt;/a&gt;&lt;/strong&gt;: needed to run AWS-related commands.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;cURL or a similar tool such as Insomnia&lt;/strong&gt;: to make HTTP requests.&lt;/li&gt;
&lt;li&gt;An AWS config file (&lt;code&gt;~/.aws/config&lt;/code&gt;) set up as follows:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;[&lt;/span&gt;default]
region &lt;span class="o"&gt;=&lt;/span&gt; us-east-1
output &lt;span class="o"&gt;=&lt;/span&gt; json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  About the project
&lt;/h2&gt;

&lt;p&gt;The goal of this project is to simulate a corporate environment using Kafka and AWS Lambda, accessibly and at no cost, with &lt;a href="https://www.localstack.cloud/" rel="noopener noreferrer"&gt;Localstack&lt;/a&gt;. It lets you emulate AWS services locally, making development and testing easier.&lt;/p&gt;

&lt;p&gt;In this project's flow, a &lt;strong&gt;producer&lt;/strong&gt; sends a message to a Kafka topic, which triggers an event on the Kafka Sink connector. The connector then forwards the message to a Lambda function, as shown in the diagram below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0elgyam1edcorf7quo47.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0elgyam1edcorf7quo47.png" alt="Fluxograma do conector Lambda Sink. Producer enviando mensagem para um tópico Kafka que dispara um evento para o conector que envia a mensagem para uma Lambda" width="568" height="157"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We'll use &lt;strong&gt;Docker Compose&lt;/strong&gt; to manage the Kafka and Localstack containers. The &lt;strong&gt;fast-data-dev&lt;/strong&gt; Docker image will provide Kafka, since it already bundles a web UI and tools such as &lt;strong&gt;Zookeeper&lt;/strong&gt;, a &lt;strong&gt;Kafka Cluster&lt;/strong&gt;, &lt;strong&gt;Schema Registry&lt;/strong&gt;, and &lt;strong&gt;Kafka Connect&lt;/strong&gt;. Localstack will emulate the AWS services, including Lambda.&lt;/p&gt;

&lt;p&gt;If you prefer a graphical view of the running Lambda, Localstack offers a &lt;a href="https://docs.localstack.cloud/user-guide/web-application/" rel="noopener noreferrer"&gt;web interface&lt;/a&gt;. There's also an &lt;a href="https://www.youtube.com/watch?v=1ow0NQv5Fsk" rel="noopener noreferrer"&gt;introductory video about Localstack on YouTube&lt;/a&gt; if you'd like to learn more about the tool.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqgca4cunmhq88yhhiaiy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqgca4cunmhq88yhhiaiy.png" alt="Tela da interface gráfica do Localstack Web mostrando os logs de execução de uma Lambda" width="800" height="373"&gt;&lt;/a&gt;&lt;br&gt;&lt;br&gt;
&lt;em&gt;Tela da interface gráfica do Localstack Web mostrando os logs de execução de uma Lambda.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;To configure Kafka Connect, we will use the &lt;a href="https://github.com/adrianoavelino/kafka-connect-lambda-localstack" rel="noopener noreferrer"&gt;Lambda Sink plugin&lt;/a&gt;, which supports Localstack. This plugin is a fork of the &lt;a href="https://github.com/Nordstrom/kafka-connect-lambda" rel="noopener noreferrer"&gt;kafka-connect-lambda&lt;/a&gt; project, which does not work with Localstack. If you would like to contribute, contributions are welcome!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F55nqfwqk7499ja7vx2u1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F55nqfwqk7499ja7vx2u1.png" alt="Tela da interface gráfica do fast-data-dev mostrando o conector Lambda Sink" width="800" height="550"&gt;&lt;/a&gt;Tela da interface gráfica do fast-data-dev mostrando o conector Lambda Sink.&lt;/p&gt;

&lt;p&gt;To send messages to the Kafka topic, we will use a &lt;strong&gt;producer&lt;/strong&gt;. There are several options for this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Command line&lt;/strong&gt;: using the &lt;code&gt;kafka-console-producer&lt;/code&gt; from the Kafka container.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://github.com/edenhill/kcat" rel="noopener noreferrer"&gt;Kafkacat&lt;/a&gt;&lt;/strong&gt;: a command-line tool for interacting with Kafka.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;JMeter with the Kafka plugin&lt;/strong&gt;: see the Tips and Recommendations section for more details.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;A data source connector such as &lt;a href="https://github.com/MichaelDrogalis/voluble" rel="noopener noreferrer"&gt;Voluble&lt;/a&gt;&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;A custom application written in your programming language of choice.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To keep things simple, this tutorial uses the &lt;code&gt;kafka-console-producer&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8lo7ujakbb2aj1ut66np.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8lo7ujakbb2aj1ut66np.png" alt="Tela do terminal executando Kafka Console Producer na linha de comando" width="800" height="129"&gt;&lt;/a&gt;Tela do terminal executando Kafka Console Producer na linha de comando&lt;/p&gt;
&lt;h2&gt;
  
  
  Creating the configuration files
&lt;/h2&gt;

&lt;p&gt;Before anything can run, we need to create the configuration files for the Kafka infrastructure, the Lambda, and the Lambda Sink connector. We will start by setting up &lt;strong&gt;Docker Compose&lt;/strong&gt; to orchestrate Kafka and Localstack, followed by &lt;strong&gt;CloudFormation&lt;/strong&gt; to provision the Lambda function in Localstack. Next, we define the &lt;strong&gt;Kafka connector&lt;/strong&gt; and download the &lt;strong&gt;Lambda Sink plugin&lt;/strong&gt;. Finally, we review the &lt;strong&gt;final project structure&lt;/strong&gt;, listing the files created to make sure everything is in the right place and ready to run. Let's go!&lt;/p&gt;
&lt;h3&gt;
  
  
  Docker Compose
&lt;/h3&gt;

&lt;p&gt;We start by creating the &lt;strong&gt;docker-compose.yml&lt;/strong&gt; file, which orchestrates the Kafka and Localstack containers. It defines the required services: Kafka (using the &lt;strong&gt;fast-data-dev&lt;/strong&gt; image) and Localstack to emulate AWS. Create the file &lt;code&gt;./docker-compose.yml&lt;/code&gt; with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;fast-data-dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;landoop/fast-data-dev:3.3&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;2181:2181"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3030:3030"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;8081-8083:8081-8083"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;9581-9585:9581-9585"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;9092:9092"&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;ADV_HOST&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;127.0.0.1&lt;/span&gt;
      &lt;span class="na"&gt;DEBUG&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;
      &lt;span class="na"&gt;RUNTESTS&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="m"&gt;0&lt;/span&gt;
      &lt;span class="na"&gt;AWS_ACCESS_KEY_ID&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;local&lt;/span&gt;
      &lt;span class="na"&gt;AWS_SECRET_ACCESS_KEY&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;local&lt;/span&gt;
      &lt;span class="na"&gt;CONNECT_PLUGIN_PATH&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;/var/run/connect/connectors/stream-reactor,/var/run/connect/connectors/third-party,/connectors&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;./plugins:/connectors/kafka-connect-lambda-localstack&lt;/span&gt;
    &lt;span class="na"&gt;network_mode&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;host&lt;/span&gt;

  &lt;span class="na"&gt;localstack&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;container_name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;${LOCALSTACK_DOCKER_NAME-localstack-main}"&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;localstack/localstack:2.3&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;127.0.0.1:4566:4566"&lt;/span&gt;            &lt;span class="c1"&gt;# LocalStack Gateway&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;127.0.0.1:4510-4559:4510-4559"&lt;/span&gt;  &lt;span class="c1"&gt;# external services port range&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;DEBUG=${DEBUG-}&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;DOCKER_HOST=unix:///var/run/docker.sock&lt;/span&gt;
    &lt;span class="na"&gt;volumes&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;${LOCALSTACK_VOLUME_DIR:-./volume}:/var/lib/localstack"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/var/run/docker.sock:/var/run/docker.sock"&lt;/span&gt;
    &lt;span class="na"&gt;depends_on&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="nv"&gt;fast-data-dev&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
    &lt;span class="na"&gt;network_mode&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;host&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  CloudFormation
&lt;/h3&gt;

&lt;p&gt;Next, we create a &lt;strong&gt;cloudformation.yml&lt;/strong&gt; file to provision the simulated Lambda function in Localstack. The file also defines the permissions required to run the Lambda. Create the file &lt;code&gt;./cloudformation.yml&lt;/code&gt; with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;&lt;span class="na"&gt;AWSTemplateFormatVersion&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2010-09-09'&lt;/span&gt;
&lt;span class="na"&gt;Description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Lambda&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;connector&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;example&lt;/span&gt;&lt;span class="nv"&gt; &lt;/span&gt;&lt;span class="s"&gt;function'&lt;/span&gt;
&lt;span class="na"&gt;Resources&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;ExampleFunction&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::Lambda::Function&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;FunctionName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;example-function&lt;/span&gt;
      &lt;span class="na"&gt;Handler&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;index.handler&lt;/span&gt;
      &lt;span class="na"&gt;Runtime&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;python3.7&lt;/span&gt;
      &lt;span class="na"&gt;Role&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!GetAtt&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ExampleFunctionRole.Arn'&lt;/span&gt;
      &lt;span class="na"&gt;Code&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;ZipFile&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;|&lt;/span&gt;
          &lt;span class="s"&gt;import json&lt;/span&gt;
          &lt;span class="s"&gt;def handler(event, context):&lt;/span&gt;
            &lt;span class="s"&gt;print(json.dumps(event))&lt;/span&gt;
            &lt;span class="s"&gt;return event&lt;/span&gt;

  &lt;span class="na"&gt;ExampleFunctionRole&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Type&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;AWS::IAM::Role&lt;/span&gt;
    &lt;span class="na"&gt;Properties&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;RoleName&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;example-lambda-role&lt;/span&gt;
      &lt;span class="na"&gt;AssumeRolePolicyDocument&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="na"&gt;Version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;2012-10-17'&lt;/span&gt;
        &lt;span class="na"&gt;Statement&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
        &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="na"&gt;Effect&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Allow&lt;/span&gt;
          &lt;span class="na"&gt;Principal&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
            &lt;span class="na"&gt;Service&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;lambda.amazonaws.com&lt;/span&gt;
          &lt;span class="na"&gt;Action&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;sts:AssumeRole&lt;/span&gt;
      &lt;span class="na"&gt;ManagedPolicyArns&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s"&gt;arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole&lt;/span&gt;

&lt;span class="na"&gt;Outputs&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;ExampleFunctionArn&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!GetAtt&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ExampleFunction.Arn'&lt;/span&gt;
  &lt;span class="na"&gt;ExampleFunctionRoleArn&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;Value&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="kt"&gt;!GetAtt&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ExampleFunctionRole.Arn'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Kafka connector configuration
&lt;/h3&gt;

&lt;p&gt;Now create the configuration file for the &lt;strong&gt;Lambda Sink connector&lt;/strong&gt;. This file defines the connector's properties, including the Kafka topic and the Lambda function to invoke. Create the file &lt;code&gt;./connector-localstack.json&lt;/code&gt; with the following content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"config"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"tasks.max"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"connector.class"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"com.nordstrom.kafka.connect.lambda.LambdaSinkConnector"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"topics"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-stream"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"key.converter"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"value.converter"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.region"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.function.arn"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:lambda:us-east-1:000000000000:function:example-function"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.invocation.timeout.ms"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"60000"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.invocation.mode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"SYNC"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.batch.enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"false"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"localstack.enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"true"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Downloading the Lambda Sink plugin
&lt;/h3&gt;

&lt;p&gt;Finally, &lt;strong&gt;download the Lambda Sink plugin&lt;/strong&gt; and save it in the &lt;code&gt;./plugins&lt;/code&gt; directory. Run the following command in the terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;mkdir&lt;/span&gt; ./plugins &amp;amp; curl &lt;span class="nt"&gt;-L&lt;/span&gt; &lt;span class="nt"&gt;-o&lt;/span&gt; ./plugins/kafka-connect-lambda-localstack-1.4.0.jar &lt;span class="se"&gt;\&lt;/span&gt;
https://github.com/adrianoavelino/kafka-connect-lambda-localstack/releases/download/v1.4.0/kafka-connect-lambda-localstack-1.4.0.jar
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;The latest plugin version is &lt;a href="https://github.com/adrianoavelino/kafka-connect-lambda-localstack/releases/tag/v1.4.0" rel="noopener noreferrer"&gt;1.4.0&lt;/a&gt;. For newer versions, check the &lt;a href="https://github.com/adrianoavelino/kafka-connect-lambda-localstack/releases" rel="noopener noreferrer"&gt;releases on GitHub&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Final project structure
&lt;/h3&gt;

&lt;p&gt;After completing this step, the project's file structure should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.
├── cloudformation.yml
├── connector-localstack.json
├── docker-compose.yml
├── plugins
│   └── kafka-connect-lambda-localstack-1.4.0.jar
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Creating the infrastructure
&lt;/h2&gt;

&lt;p&gt;To get our application up and running, we need to lay the groundwork: we will create the infrastructure that brings Kafka, the Lambda, and the Lambda Sink connector to life.&lt;/p&gt;

&lt;p&gt;We start by bringing up the containers. Next, we provision the Lambda function using &lt;strong&gt;CloudFormation&lt;/strong&gt;. After that, we create the &lt;strong&gt;Kafka connector&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Finally, we will check that everything was created correctly. Ready? Let's go!&lt;/p&gt;

&lt;h3&gt;
  
  
  Starting the containers
&lt;/h3&gt;

&lt;p&gt;Start the Kafka and Localstack containers with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up &lt;span class="nt"&gt;-d&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;If, like me, you enjoy following every detail in the logs, you can use the command &lt;code&gt;docker compose logs -f&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;To make sure everything is working as expected, we can run a few checks:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;docker compose ps&lt;/code&gt;: confirms that the containers are in the &lt;code&gt;healthy&lt;/code&gt; state.
&lt;/li&gt;
&lt;li&gt;Open &lt;a href="http://localhost:3030/" rel="noopener noreferrer"&gt;http://localhost:3030/&lt;/a&gt; to access the Landoop web UI and browse topics, connectors, and installed connector plugins.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;curl --url http://localhost:8083/connector-plugins/&lt;/code&gt;: lists the available connector plugins; check that the plugin with the class &lt;code&gt;com.nordstrom.kafka.connect.lambda.LambdaSinkConnector&lt;/code&gt; is present.&lt;/li&gt;
&lt;/ul&gt;
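&lt;p&gt;The last check can be scripted so it fails loudly when the plugin is missing. A small sketch (&lt;code&gt;has_lambda_plugin&lt;/code&gt; is a hypothetical helper, not part of any tool; it simply greps the JSON returned by the Kafka Connect REST API):&lt;/p&gt;

```shell
#!/bin/sh
# Succeed only if the Lambda Sink connector class appears in the
# /connector-plugins response read from stdin.
has_lambda_plugin() {
  grep -q 'com.nordstrom.kafka.connect.lambda.LambdaSinkConnector'
}

# Typical usage, once the Kafka Connect container is up on port 8083:
# curl -s http://localhost:8083/connector-plugins/ | has_lambda_plugin \
#   && echo "Lambda Sink plugin available"
```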

&lt;h3&gt;
  
  
  Creating the Lambda
&lt;/h3&gt;

&lt;p&gt;Create a Lambda in Localstack using &lt;strong&gt;CloudFormation&lt;/strong&gt; with the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws cloudformation create-stack &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--stack-name&lt;/span&gt; example-lambda-stack &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--capabilities&lt;/span&gt; CAPABILITY_NAMED_IAM &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--template-body&lt;/span&gt; file://cloudformation.yml &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--endpoint-url&lt;/span&gt; http://localhost:4566
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If the stack is created successfully, the output will look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;{&lt;/span&gt;
    &lt;span class="s2"&gt;"StackId"&lt;/span&gt;: &lt;span class="s2"&gt;"arn:aws:cloudformation:us-east-1:000000000000:stack/example-lambda-stack/d61cbd21"&lt;/span&gt;
&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To check that the Lambda was created correctly, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws lambda list-functions &lt;span class="nt"&gt;--endpoint-url&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:4566
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The expected output will be similar to the example below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"Functions"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"FunctionName"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-function"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"FunctionArn"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:lambda:us-east-1:000000000000:function:example-function"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Runtime"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"python3.7"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Role"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:iam::000000000000:role/example-lambda-role"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Handler"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"index.handler"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"CodeSize"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;1630&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Description"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Timeout"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"MemorySize"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;128&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"LastModified"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2024-09-09T01:17:54.471773+0000"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"CodeSha256"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"qDSOE5NTun0FiK+cFAAZGHPqarSjlyJtlGMCPPRpJ8Y="&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Version"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"$LATEST"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"TracingConfig"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Mode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"PassThrough"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"RevisionId"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"debe4c55-40b0-46e3-94b4-acfc1aeb17fb"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"PackageType"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Zip"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"Architectures"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="s2"&gt;"x86_64"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"EphemeralStorage"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"Size"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;512&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="nl"&gt;"SnapStart"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"ApplyOn"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"None"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
                &lt;/span&gt;&lt;span class="nl"&gt;"OptimizationStatus"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Off"&lt;/span&gt;&lt;span class="w"&gt;
            &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To test the Lambda, use the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws lambda invoke &lt;span class="nt"&gt;--function-name&lt;/span&gt; example-function &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--cli-binary-format&lt;/span&gt; raw-in-base64-out &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--payload&lt;/span&gt; &lt;span class="s1"&gt;'{"value": "my example"}'&lt;/span&gt; &lt;span class="nt"&gt;--output&lt;/span&gt; text result.txt &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--endpoint-url&lt;/span&gt; http://localhost:4566
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If everything went well, you should receive the following response:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;$LATEST&lt;/span&gt; Unhandled       200
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
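&lt;p&gt;The &lt;code&gt;invoke&lt;/code&gt; command also writes the function's response to &lt;code&gt;result.txt&lt;/code&gt;, and since our handler returns the event unchanged, that file should echo the payload we sent. A quick sanity check (&lt;code&gt;check_result&lt;/code&gt; is a hypothetical helper for illustration):&lt;/p&gt;

```shell
#!/bin/sh
# The Lambda handler returns the event unchanged, so result.txt
# should contain the payload sent with `aws lambda invoke`.
check_result() {
  grep -q '"value": "my example"' "$1"
}

# check_result result.txt && echo "Lambda echoed the payload as expected"
```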



&lt;h3&gt;
  
  
  Creating the connector
&lt;/h3&gt;

&lt;p&gt;The Lambda Sink connector can be created via the &lt;a href="https://docs.confluent.io/platform/current/connect/references/restapi.html" rel="noopener noreferrer"&gt;Kafka Connect REST API&lt;/a&gt; or through the &lt;a href="http://localhost:3030/" rel="noopener noreferrer"&gt;Landoop&lt;/a&gt; web UI. In this example we will use &lt;code&gt;curl&lt;/code&gt; to send the requests, but you can also use tools such as &lt;a href="https://insomnia.rest/download" rel="noopener noreferrer"&gt;Insomnia&lt;/a&gt; or similar.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-XPOST&lt;/span&gt; &lt;span class="nt"&gt;-H&lt;/span&gt; &lt;span class="s2"&gt;"Content-Type: application/json"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
http://localhost:8083/connectors &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;-d&lt;/span&gt; @connector-localstack.json
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Tip: in Insomnia, you can import &lt;strong&gt;curl&lt;/strong&gt; commands to generate the request automatically. Check out this &lt;a href="https://www.youtube.com/watch?v=wGzQrWcUcjc" rel="noopener noreferrer"&gt;video tutorial&lt;/a&gt; or the &lt;a href="https://docs.insomnia.rest/insomnia/import-export-data#import-data" rel="noopener noreferrer"&gt;official documentation&lt;/a&gt; for more details.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;You should receive the following response:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"config"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"tasks.max"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"connector.class"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"com.nordstrom.kafka.connect.lambda.LambdaSinkConnector"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"topics"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-stream"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"key.converter"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"value.converter"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.region"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.function.arn"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"arn:aws:lambda:us-east-1:000000000000:function:example-function"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.invocation.timeout.ms"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"60000"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.invocation.mode"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"SYNC"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"aws.lambda.batch.enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"false"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"localstack.enabled"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"true"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"tasks"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sink"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To check the connector status, run the following &lt;strong&gt;curl&lt;/strong&gt; command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;--request&lt;/span&gt; GET &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--url&lt;/span&gt; http://localhost:8083/connectors/example-lambda-connector-localstack/status 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The response should look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"connector"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"state"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNNING"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"worker_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"127.0.0.1:8083"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"tasks"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"state"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNNING"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"worker_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"127.0.0.1:8083"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sink"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; this command can also be used to diagnose integration problems between the Kafka connector and the Lambda on LocalStack.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Sending a message to Kafka
&lt;/h2&gt;

&lt;p&gt;Now that everything is configured and running, it is time to put Kafka to work for real! Let's learn how to send messages to the topic we created. We will explore two options: a quick, direct one for when you need speed, and another that lets you send multiple messages continuously. With these options you can easily interact with Kafka and start testing the communication between the components of your application. Let's go!&lt;/p&gt;

&lt;h3&gt;
  
  
  Option 1: Quickly send a single message
&lt;/h3&gt;

&lt;p&gt;Use the command below to quickly send a single message to Kafka:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"teste"&lt;/span&gt; | docker compose &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-T&lt;/span&gt; fast-data-dev &lt;span class="se"&gt;\&lt;/span&gt;
kafka-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; example-stream
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: if you run into communication problems with the broker or another service, use the command &lt;code&gt;nc -vz localhost &amp;lt;PORT&amp;gt;&lt;/code&gt;. E.g.: &lt;code&gt;nc -vz localhost 9092&lt;/code&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  Option 2: Send messages continuously
&lt;/h3&gt;

&lt;p&gt;If you prefer to keep a terminal open to send several messages to Kafka, use the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev &lt;span class="se"&gt;\&lt;/span&gt;
kafka-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; example-stream
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once it is running, you will see the &lt;code&gt;&amp;gt;&lt;/code&gt; prompt, where you can enter several messages for the Kafka topic. Example messages:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"value"&lt;/span&gt;: &lt;span class="s2"&gt;"my example"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;
&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"value"&lt;/span&gt;: &lt;span class="s2"&gt;"my example 2"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
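
&lt;p&gt;To double-check that your messages actually reached the topic, you can read them back with the console consumer. This is an optional sanity check, a sketch assuming the same &lt;code&gt;fast-data-dev&lt;/code&gt; container and topic used above:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Read every message on the topic from the beginning (Ctrl+C to stop)
docker compose exec fast-data-dev \
kafka-console-consumer \
--bootstrap-server localhost:9092 \
--topic example-stream \
--from-beginning
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;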



&lt;h2&gt;
  
  
  How to view the Lambda logs in the terminal
&lt;/h2&gt;

&lt;p&gt;To view the logs of the Lambda execution directly in the terminal, follow the steps below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;First, get the name of the Lambda's log group:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nv"&gt;LOG_GROUP&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sb"&gt;`&lt;/span&gt;aws logs describe-log-groups &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--endpoint-url&lt;/span&gt; http://localhost:4566 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--query&lt;/span&gt; &lt;span class="s2"&gt;"logGroups[0].logGroupName"&lt;/span&gt; | &lt;span class="nb"&gt;sed&lt;/span&gt; &lt;span class="s1"&gt;'s/"//g'&lt;/span&gt;&lt;span class="sb"&gt;`&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;Then, use the &lt;code&gt;aws logs tail&lt;/code&gt; command to follow the logs in real time:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;aws logs &lt;span class="nb"&gt;tail&lt;/span&gt; &lt;span class="nv"&gt;$LOG_GROUP&lt;/span&gt; &lt;span class="nt"&gt;--follow&lt;/span&gt; &lt;span class="nt"&gt;--endpoint-url&lt;/span&gt; http://localhost:4566
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The expected response looks something like the example below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;2024-09-15T18:27:34.888000+00:00 2024/09/15/[&lt;span class="nv"&gt;$LATEST&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;57661289d19ebedfe4a6782395866989 START RequestId: 61ab5b4f-569a-4348-905b-b15ceadfcc26 Version: &lt;span class="nv"&gt;$LATEST&lt;/span&gt;
2024-09-15T18:27:34.902000+00:00 2024/09/15/[&lt;span class="nv"&gt;$LATEST&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;57661289d19ebedfe4a6782395866989 &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"key"&lt;/span&gt;: &lt;span class="s2"&gt;""&lt;/span&gt;, &lt;span class="s2"&gt;"keySchemaName"&lt;/span&gt;: null, &lt;span class="s2"&gt;"value"&lt;/span&gt;: &lt;span class="s2"&gt;"teste"&lt;/span&gt;, &lt;span class="s2"&gt;"valueSchemaName"&lt;/span&gt;: null, &lt;span class="s2"&gt;"topic"&lt;/span&gt;: &lt;span class="s2"&gt;"example-stream"&lt;/span&gt;, &lt;span class="s2"&gt;"partition"&lt;/span&gt;: 0, &lt;span class="s2"&gt;"offset"&lt;/span&gt;: 1, &lt;span class="s2"&gt;"timestamp"&lt;/span&gt;: 1726424854776, &lt;span class="s2"&gt;"timestampTypeName"&lt;/span&gt;: &lt;span class="s2"&gt;"CreateTime"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;
2024-09-15T18:27:34.917000+00:00 2024/09/15/[&lt;span class="nv"&gt;$LATEST&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;57661289d19ebedfe4a6782395866989 END RequestId: 61ab5b4f-569a-4348-905b-b15ceadfcc26
2024-09-15T18:27:34.932000+00:00 2024/09/15/[&lt;span class="nv"&gt;$LATEST&lt;/span&gt;&lt;span class="o"&gt;]&lt;/span&gt;57661289d19ebedfe4a6782395866989 REPORT RequestId: 61ab5b4f-569a-4348-905b-b15ceadfcc26    Duration: 7.85 ms   Billed Duration: 8 msMemory Size: 128 MB    Max Memory Used: 128 MB
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Note:&lt;/strong&gt; you can also view the Lambda logs directly in the &lt;a href="https://app.localstack.cloud/inst/default/resources" rel="noopener noreferrer"&gt;LocalStack web UI&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Automation
&lt;/h2&gt;

&lt;p&gt;Congratulations! If you made it this far, you have come a long way and did everything by hand, or ... you were a "clever" one and found a shortcut here straight to the amusement park.&lt;/p&gt;

&lt;p&gt;Here we will not need to create the configuration files or set up the infrastructure by hand; everything is automated. By following the steps below you will have the complete environment configured to send messages to the Kafka topic and check the logs. Let's get hands-on:&lt;/p&gt;

&lt;p&gt;Clone the repository:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;git clone https://github.com/adrianoavelino/posts.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Enter the &lt;code&gt;automation&lt;/code&gt; directory:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;posts/dev.to/2024-09-01-conector-lambda-sink-localstack/automation/
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add execute permission to the script files:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;chmod&lt;/span&gt; +x init-scripts/&lt;span class="k"&gt;*&lt;/span&gt;.sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Start the containers:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose up
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now just wait for the initialization and for the Lambda and the Lambda Sink connector to be created, then start testing. If everything went as planned, you should see something like the example below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;localstack-main  | &lt;span class="c"&gt;########### script 02 - Lambda function has been invoked ###########&lt;/span&gt;
fast-data-dev-1  | Sat 28 Sep 2024 02:15:31 AM UTC  Kafka Connect listener HTTP state:  000  &lt;span class="o"&gt;(&lt;/span&gt;waiting &lt;span class="k"&gt;for &lt;/span&gt;200&lt;span class="o"&gt;)&lt;/span&gt;
fast-data-dev-1  | Sat 28 Sep 2024 02:15:36 AM UTC  Kafka Connect listener HTTP state:  000  &lt;span class="o"&gt;(&lt;/span&gt;waiting &lt;span class="k"&gt;for &lt;/span&gt;200&lt;span class="o"&gt;)&lt;/span&gt;
fast-data-dev-1  | Sat 28 Sep 2024 02:15:41 AM UTC  Kafka Connect listener HTTP state:  000  &lt;span class="o"&gt;(&lt;/span&gt;waiting &lt;span class="k"&gt;for &lt;/span&gt;200&lt;span class="o"&gt;)&lt;/span&gt;
fast-data-dev-1  | Sat 28 Sep 2024 02:15:47 AM UTC  Kafka Connect listener HTTP state:  200  &lt;span class="o"&gt;(&lt;/span&gt;waiting &lt;span class="k"&gt;for &lt;/span&gt;200&lt;span class="o"&gt;)&lt;/span&gt;
fast-data-dev-1  | 
fast-data-dev-1  | &lt;span class="nt"&gt;--&lt;/span&gt;
fast-data-dev-1  | +&amp;gt; Creating Lambda Sink Connector with avro
fast-data-dev-1  | &lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"name"&lt;/span&gt;:&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;,&lt;span class="s2"&gt;"config"&lt;/span&gt;:&lt;span class="o"&gt;{&lt;/span&gt;&lt;span class="s2"&gt;"tasks.max"&lt;/span&gt;:&lt;span class="s2"&gt;"1"&lt;/span&gt;,&lt;span class="s2"&gt;"connector.class"&lt;/span&gt;:&lt;span class="s2"&gt;"com.nordstrom.kafka.connect.lambda.LambdaSinkConnector"&lt;/span&gt;,&lt;span class="s2"&gt;"topics"&lt;/span&gt;:&lt;span class="s2"&gt;"example-stream"&lt;/span&gt;,&lt;span class="s2"&gt;"key.converter"&lt;/span&gt;:&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;,&lt;span class="s2"&gt;"value.converter"&lt;/span&gt;:&lt;span class="s2"&gt;"org.apache.kafka.connect.storage.StringConverter"&lt;/span&gt;,&lt;span class="s2"&gt;"aws.region"&lt;/span&gt;:&lt;span class="s2"&gt;"us-east-1"&lt;/span&gt;,&lt;span class="s2"&gt;"aws.lambda.function.arn"&lt;/span&gt;:&lt;span class="s2"&gt;"arn:aws:lambda:us-east-1:000000000000:function:example-function"&lt;/span&gt;,&lt;span class="s2"&gt;"aws.lambda.invocation.timeout.ms"&lt;/span&gt;:&lt;span class="s2"&gt;"60000"&lt;/span&gt;,&lt;span class="s2"&gt;"aws.lambda.invocation.mode"&lt;/span&gt;:&lt;span class="s2"&gt;"SYNC"&lt;/span&gt;,&lt;span class="s2"&gt;"aws.lambda.batch.enabled"&lt;/span&gt;:&lt;span class="s2"&gt;"false"&lt;/span&gt;,&lt;span class="s2"&gt;"localstack.enabled"&lt;/span&gt;:&lt;span class="s2"&gt;"true"&lt;/span&gt;,&lt;span class="s2"&gt;"name"&lt;/span&gt;:&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;,&lt;span class="s2"&gt;"tasks"&lt;/span&gt;:[],&lt;span class="s2"&gt;"type"&lt;/span&gt;:&lt;span class="s2"&gt;"sink"&lt;/span&gt;&lt;span class="o"&gt;}&lt;/span&gt;2024-09-28 02:15:53,267 INFO exited: logs-to-kafka &lt;span class="o"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;exit 
&lt;/span&gt;status 0&lt;span class="p"&gt;;&lt;/span&gt; expected&lt;span class="o"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run the following command to send an event to the Kafka topic:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;echo&lt;/span&gt; &lt;span class="s2"&gt;"teste"&lt;/span&gt; | docker compose &lt;span class="nb"&gt;exec&lt;/span&gt; &lt;span class="nt"&gt;-T&lt;/span&gt; fast-data-dev &lt;span class="se"&gt;\&lt;/span&gt;
kafka-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; example-stream
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now you can start running your tests and adjust the configuration as needed. If you run into a problem, don't worry: just check the container logs to spot possible errors. And let me tell you one more secret: I left several nice tips right below.&lt;/p&gt;

&lt;h2&gt;
  
  
  Errors and Solutions: Untangling Problems in Style
&lt;/h2&gt;

&lt;p&gt;Let's face it, we have all been there: something goes wrong and you sit there, staring at the screen, trying to figure out what happened. But relax, I'm here to help you sort out these headaches! 😎&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Logs: Your Best Friends
&lt;/h3&gt;

&lt;p&gt;First golden tip: always take a look at the container logs. They are like that friend who tells you what is really going on behind the scenes. You can check the logs of each service separately, like this:&lt;/p&gt;

&lt;p&gt;On LocalStack:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker compose logs -f localstack
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Or on fast-data-dev:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker compose logs -f fast-data-dev
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, if you are working with &lt;strong&gt;fast-data-dev&lt;/strong&gt; and you are not seeing all the logs, don't panic! Just run the following command to peek at the log file inside the container:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev &lt;span class="nb"&gt;cat&lt;/span&gt; /var/log/broker.log
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And if you want to learn more about fast-data-dev, take a look at &lt;a href="https://hub.docker.com/r/landoop/fast-data-dev" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;. They have everything there!&lt;/p&gt;

&lt;h3&gt;
  
  
  2. A Classic Error: "Function not found"
&lt;/h3&gt;

&lt;p&gt;Ah, the famous "Function not found arn:aws:lambda:us-east-1:000000000000:function:example-function" error. It usually means the Lambda you are trying to invoke was not found. Most likely it was created in another region. So check your AWS CLI configuration file (&lt;code&gt;~/.aws/config&lt;/code&gt;) and make sure the region is correct.&lt;/p&gt;
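
&lt;p&gt;As a quick sanity check, here is a minimal sketch of how you might inspect the configured region. The file below is a hypothetical sample standing in for your real &lt;code&gt;~/.aws/config&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Hypothetical sample file standing in for ~/.aws/config
cat &gt; /tmp/sample-aws-config &lt;&lt;'EOF'
[default]
region = sa-east-1
EOF

# Extract the region and compare it with the connector's aws.region
region=$(awk -F' = ' '/^region/ {print $2}' /tmp/sample-aws-config)
echo "$region"  # if this is not us-east-1, the connector will not find the Lambda
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;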

&lt;h3&gt;
  
  
  3. Errors in the Kafka Connector: How to Find Them?
&lt;/h3&gt;

&lt;p&gt;If the problem is in the Kafka connector, you can use the &lt;a href="https://docs.confluent.io/platform/current/connect/references/restapi.html#get--connectors-(string-name)-status" rel="noopener noreferrer"&gt;Kafka Connect API&lt;/a&gt; to find out what is going on. Here is the magic command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;--request&lt;/span&gt; GET &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--url&lt;/span&gt; http://localhost:8083/connectors/example-lambda-connector-localstack/status
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Error Example: Lambda Not Found
&lt;/h4&gt;

&lt;p&gt;If the Lambda was not created on LocalStack, you will see something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"example-lambda-connector-localstack"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"connector"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"state"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"RUNNING"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"worker_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"127.0.0.1:8083"&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"tasks"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"state"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"FAILED"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"worker_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"127.0.0.1:8083"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"trace"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"org.apache.kafka.connect.errors.ConnectException: Exiting WorkerSinkTask due to unrecoverable exception.&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:611)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerSinkTask.poll(WorkerSinkTask.java:333)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerSinkTask.iteration(WorkerSinkTask.java:234)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerSinkTask.execute(WorkerSinkTask.java:203)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerTask.doRun(WorkerTask.java:189)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerTask.run(WorkerTask.java:244)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:515)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at java.base/java.lang.Thread.run(Thread.java:829)&lt;/span&gt;&lt;span 
class="se"&gt;\n&lt;/span&gt;&lt;span class="s2"&gt;Caused by: com.nordstrom.kafka.connect.lambda.InvocationClient$InvocationException: java.util.concurrent.ExecutionException: com.amazonaws.services.lambda.model.ResourceNotFoundException: Function not found: arn:aws:lambda:us-east-1:000000000000:function:example-function (Service: AWSLambda; Status Code: 404; Error Code: ResourceNotFoundException; Request ID: 52512fac-927b-4db0-a910-907270c4166f; Proxy: null)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at com.nordstrom.kafka.connect.lambda.InvocationClient.invoke(InvocationClient.java:71)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at com.nordstrom.kafka.connect.lambda.LambdaSinkTask.invoke(LambdaSinkTask.java:190)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at com.nordstrom.kafka.connect.lambda.LambdaSinkTask.put(LambdaSinkTask.java:86)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;at org.apache.kafka.connect.runtime.WorkerSinkTask.deliverMessages(WorkerSinkTask.java:581)&lt;/span&gt;&lt;span class="se"&gt;\n\t&lt;/span&gt;&lt;span class="s2"&gt;... 10 more&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"sink"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Web UI: Because Not Everyone Loves the Command Line
&lt;/h3&gt;

&lt;p&gt;If you are the kind of person who prefers to see things visually (and who doesn't, right?), fast-data-dev has a pretty nice &lt;a href="http://localhost:3030/kafka-connect-ui/#/cluster/fast-data-dev/connector/example-lambda-connector-localstack" rel="noopener noreferrer"&gt;web UI&lt;/a&gt; for following everything that is happening with your Kafka connectors and topics.&lt;/p&gt;

&lt;p&gt;Done! Now you know how to deal with these annoying errors and move on with your project. And remember: logs are your friends, and the web UI is there to help when the command line is not enough. 🚀&lt;/p&gt;

&lt;h2&gt;
  
  
  Tips and recommendations
&lt;/h2&gt;

&lt;p&gt;In our walkthrough, we put the Lambda's source code directly inside the CloudFormation template. We did this because it is a very simple example. However, it is possible to create a Lambda with several dependencies using tools such as &lt;strong&gt;Terraform&lt;/strong&gt; or &lt;strong&gt;Serverless&lt;/strong&gt;, both of which integrate with &lt;strong&gt;LocalStack&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If you want to see a practical example, I recommend the video on the &lt;a href="https://www.youtube.com/watch?v=DFS3CnB-Z0k" rel="noopener noreferrer"&gt;LocalStack YouTube channel&lt;/a&gt;, which shows how to create a Lambda with &lt;strong&gt;hot reload&lt;/strong&gt;. This means you do not have to package all the source code and its dependencies on every change in order to test the Lambda execution on LocalStack. Here is the &lt;a href="https://github.com/localstack-samples/localstack-pro-samples/tree/master/lambda-hot-reloading/javascript-terraform" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; used in the video, as well as a &lt;a href="https://github.com/my-study-area/poc-kafka-connector-lambda" rel="noopener noreferrer"&gt;usage example&lt;/a&gt; I put together, where I tested a Lambda with an Avro converter (in our guide we used a string converter).&lt;/p&gt;

&lt;p&gt;Another way to speed up the creation of Lambdas is to use the &lt;a href="https://www.serverless.com/" rel="noopener noreferrer"&gt;Serverless Framework&lt;/a&gt;. It simplifies the deployment of serverless applications. To help, here is a &lt;a href="https://github.com/my-study-area/estudo-serverless-framework" rel="noopener noreferrer"&gt;GitHub repository with usage examples with LocalStack&lt;/a&gt;, in both &lt;strong&gt;Node.js&lt;/strong&gt; and &lt;strong&gt;Python&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;To produce messages to a Kafka topic from the command line, you can use &lt;a href="https://github.com/edenhill/kcat" rel="noopener noreferrer"&gt;kcat&lt;/a&gt;. If you prefer a GUI, &lt;a href="https://jmeter.apache.org/" rel="noopener noreferrer"&gt;JMeter&lt;/a&gt; with the &lt;a href="https://github.com/rollno748/di-kafkameter" rel="noopener noreferrer"&gt;Kafka plugin&lt;/a&gt; is a great option.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2q301jfar2qdtjp321m.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2q301jfar2qdtjp321m.png" alt="Tela da interface gráfica do JMeter com um plugin producer Kafka enviando diversas mensagens a um tópico Kafka" width="800" height="536"&gt;&lt;/a&gt;&lt;br&gt;
Tela da interface gráfica do JMeter com um plugin producer Kafka enviando diversas mensagens a um tópico Kafka.&lt;/p&gt;

&lt;p&gt;To make things easier, here are a few installation tutorials:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://stackoverflow.com/a/54181626/6415045" rel="noopener noreferrer"&gt;Instalação manual no Linux (StackOverflow)&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.youtube.com/watch?v=SP9H7Xc3oU8&amp;amp;list=PLPHt--SznmcAAankcwYa5Pdn3t1qAl8Cp&amp;amp;index=1" rel="noopener noreferrer"&gt;Instalação no Windows (YouTube)&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To install the Kafka plugin in JMeter, I recommend using the &lt;a href="https://www.youtube.com/watch?v=SP9H7Xc3oU8&amp;amp;list=PLPHt--SznmcAAankcwYa5Pdn3t1qAl8Cp&amp;amp;index=1" rel="noopener noreferrer"&gt;Plugin Manager&lt;/a&gt;, which makes the process much easier. I'm also leaving a &lt;a href="https://github.com/adrianoavelino/posts/blob/main/dev.to/2024-09-01-conector-lambda-sink-localstack/jmeter-producer-kafka.jmx"&gt;configuration file&lt;/a&gt; for the plugin to speed things up even more: download it and import the Kafka producer settings into JMeter.&lt;/p&gt;

&lt;p&gt;If you're looking for a bigger challenge, try the &lt;strong&gt;&lt;a href="https://github.com/MichaelDrogalis/voluble" rel="noopener noreferrer"&gt;Voluble&lt;/a&gt;&lt;/strong&gt; data source connector, which generates events automatically for a Kafka topic. See a usage example in the video &lt;a href="https://youtu.be/3Gj_SoyuTYk?si=H7YCHqTsz4bRHpUi&amp;amp;t=105" rel="noopener noreferrer"&gt;🎄 Twelve Days of SMT 🎄 - Day 1: InsertField (timestamp)&lt;/a&gt;, or check &lt;a href="https://github.com/my-study-area/estudo-kafka-connect?tab=readme-ov-file#twelve-days-of-smt----day-1-insertfield-timestamp" rel="noopener noreferrer"&gt;some notes&lt;/a&gt; that may help you in practice.&lt;/p&gt;

&lt;h2&gt;
  
  
  Links
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://app.diagrams.net/" rel="noopener noreferrer"&gt;Ferramenta para criar Diagrama&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://jmeter.apache.org/" rel="noopener noreferrer"&gt;JMeter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/rollno748/di-kafkameter" rel="noopener noreferrer"&gt;DI KafkaMeter plugin&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.aws.amazon.com/pt_br/cli/latest/userguide/getting-started-install.html" rel="noopener noreferrer"&gt;Instalação do AWS CLI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.localstack.cloud/" rel="noopener noreferrer"&gt;Localstack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/Nordstrom/kafka-connect-lambda" rel="noopener noreferrer"&gt;kafka-connect-lambda&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.serverless.com/" rel="noopener noreferrer"&gt;Serverless&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/localstack/terraform-local" rel="noopener noreferrer"&gt;tflocal: ferramenta para trabalhar com terraform no Localstack&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.confluent.io/platform/current/connect/references/restapi.html#get--connectors-(string-name)-status" rel="noopener noreferrer"&gt;api do Kafka Connect&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: all content was created with help from the &lt;a href="https://www.stackspot.com/pt/ai-assistente" rel="noopener noreferrer"&gt;Stackspot AI&lt;/a&gt; assistant.&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>kafka</category>
      <category>localstack</category>
      <category>conector</category>
      <category>sink</category>
    </item>
    <item>
      <title>Multiple schemas in the same Kafka topic from the command line</title>
      <dc:creator>Adriano Avelino</dc:creator>
      <pubDate>Thu, 04 Jan 2024 01:28:23 +0000</pubDate>
      <link>https://forem.com/adrianoavelino/multiplos-schemas-no-mesmo-topico-kafka-na-linha-de-comando-1o0h</link>
      <guid>https://forem.com/adrianoavelino/multiplos-schemas-no-mesmo-topico-kafka-na-linha-de-comando-1o0h</guid>
      <description>&lt;p&gt;Recently, I had the opportunity to work on a project that uses three distinct event types (schemas) in the same Kafka topic. Until then, I had only ever used a single Avro schema per topic. This was only possible thanks to the subject name strategy in the Schema Registry.&lt;/p&gt;

&lt;h2&gt;
  
  
  Theory
&lt;/h2&gt;

&lt;p&gt;A &lt;strong&gt;subject&lt;/strong&gt; in the Schema Registry is a collection of schemas associated with a topic, or with a specific schema used to serialize and deserialize data in a Kafka topic. In simple terms, a "subject" is the key under which schemas are registered in the Schema Registry. Examples of &lt;strong&gt;subject&lt;/strong&gt; names include &lt;code&gt;transacoes-value&lt;/code&gt; and &lt;code&gt;transacoes-key&lt;/code&gt;, tied respectively to the value and to the key of an event sent to a Kafka topic.&lt;/p&gt;

&lt;p&gt;The Schema Registry supports several schema types, but here we'll focus on Avro schemas, for example:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"record"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Transacao"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"valor"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"double"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"data"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"tipo"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"conta_origem"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"conta_destino"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"descricao"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"record"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"DetalhesConta"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"fields"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"conta_id"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"titular"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"saldo"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"double"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"tipo_conta"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="nl"&gt;"name"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"agencia"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"type"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"string"&lt;/span&gt;&lt;span class="p"&gt;},&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;More information about Avro files can be found in the &lt;a href="https://avro.apache.org/docs/" rel="noopener noreferrer"&gt;Apache Avro documentation&lt;/a&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Following the default subject name strategy, the &lt;strong&gt;subject&lt;/strong&gt; name for a topic called &lt;strong&gt;transacoes&lt;/strong&gt; would be &lt;code&gt;transacoes-value&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Using more than one schema in the same Kafka topic is possible through the subject name strategy. There are three options: TopicNameStrategy, RecordNameStrategy, and TopicRecordNameStrategy.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;TopicNameStrategy&lt;/strong&gt; is the default: it builds the subject name from the topic name followed by the &lt;code&gt;-&lt;/code&gt; character plus the word &lt;code&gt;key&lt;/code&gt; or &lt;code&gt;value&lt;/code&gt;. With &lt;strong&gt;RecordNameStrategy&lt;/strong&gt;, the subject name is based on the fully-qualified name of the Avro record. With &lt;strong&gt;TopicRecordNameStrategy&lt;/strong&gt;, the subject name is the topic name followed by the &lt;code&gt;-&lt;/code&gt; character plus the record name. The table below shows the names generated for a topic called &lt;strong&gt;transacoes&lt;/strong&gt; and a record named &lt;code&gt;br.com.DetalhesConta&lt;/code&gt;:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Subject name strategy&lt;/th&gt;
&lt;th&gt;Example generated subject&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;TopicNameStrategy&lt;/td&gt;
&lt;td&gt;transacoes-value or transacoes-key&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;RecordNameStrategy&lt;/td&gt;
&lt;td&gt;br.com.DetalhesConta&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;TopicRecordNameStrategy&lt;/td&gt;
&lt;td&gt;transacoes-br.com.DetalhesConta&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Following this logic, it's easy to see why we can't use more than one schema with the default strategy (TopicNameStrategy). The subject name is built the same way for both the &lt;strong&gt;Transacao&lt;/strong&gt; and &lt;strong&gt;DetalhesConta&lt;/strong&gt; schemas: when registered, both produce the same subject name, &lt;code&gt;transacoes-value&lt;/code&gt;.&lt;/p&gt;
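&lt;p&gt;The naming rules above can be sketched as a small function (an illustrative sketch only, not the Schema Registry's actual implementation; the function name and signature are made up for the example):&lt;/p&gt;

```python
def subject_name(strategy, topic, record_fullname, is_key=False):
    """Derive a Schema Registry subject name for a given naming strategy."""
    suffix = "key" if is_key else "value"
    if strategy == "TopicNameStrategy":
        return f"{topic}-{suffix}"           # ignores the record name
    if strategy == "RecordNameStrategy":
        return record_fullname               # ignores the topic name
    if strategy == "TopicRecordNameStrategy":
        return f"{topic}-{record_fullname}"  # unique per topic + record
    raise ValueError(f"unknown strategy: {strategy}")

# Both records collide under the default strategy...
print(subject_name("TopicNameStrategy", "transacoes", "br.com.Transacao"))      # transacoes-value
print(subject_name("TopicNameStrategy", "transacoes", "br.com.DetalhesConta"))  # transacoes-value
# ...but not under TopicRecordNameStrategy:
print(subject_name("TopicRecordNameStrategy", "transacoes", "br.com.DetalhesConta"))  # transacoes-br.com.DetalhesConta
```

&lt;p&gt;This makes the collision concrete: under the default strategy the subject depends only on the topic, so a second schema in the same topic always maps onto an already-registered subject.&lt;/p&gt;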

&lt;h2&gt;
  
  
  The problem
&lt;/h2&gt;

&lt;p&gt;Let's get practical, spinning up the Kafka environment with a &lt;code&gt;docker-compose.yml&lt;/code&gt; containing the following:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight yaml"&gt;&lt;code&gt;

&lt;span class="na"&gt;version&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;3.7'&lt;/span&gt;

&lt;span class="na"&gt;services&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
  &lt;span class="na"&gt;fast-data-dev&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
    &lt;span class="na"&gt;image&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;landoop/fast-data-dev:3.3&lt;/span&gt;
    &lt;span class="na"&gt;ports&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;2181:2181"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;3030:3030"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;8081-8083:8081-8083"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;9581-9585:9581-9585"&lt;/span&gt;
      &lt;span class="pi"&gt;-&lt;/span&gt; &lt;span class="s2"&gt;"&lt;/span&gt;&lt;span class="s"&gt;9092:9092"&lt;/span&gt;
    &lt;span class="na"&gt;environment&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt;
      &lt;span class="na"&gt;ADV_HOST&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;127.0.0.1&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Then run the command below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose up &lt;span class="nt"&gt;-d&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now let's open a &lt;strong&gt;bash&lt;/strong&gt; session inside the container:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev bash


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Inside the container we can create a producer, which by default registers subjects and schemas automatically. First, we create a producer for the &lt;strong&gt;Transacao&lt;/strong&gt; schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

kafka-avro-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; transacoes &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.schema&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'{ "type": "record", "name": "Transacao", "fields": [ {"name": "id", "type": "string"}, {"name": "valor", "type": "double"}, {"name": "data", "type": "string"}, {"name": "tipo", "type": "string"}, {"name": "conta_origem", "type": "string"}, {"name": "conta_destino", "type": "string"}, {"name": "descricao", "type": "string"} ] }'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; schema.registry.url&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8081


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Right after, paste the following payload:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"123456789"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"valor"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;500.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"data"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2023-01-01"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"tipo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"transferencia"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_origem"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"987654321"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_destino"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"567890123"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"descricao"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Transferência entre contas"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;We can now view the registered &lt;strong&gt;subjects&lt;/strong&gt; through the Landoop container's GUI, from the terminal with &lt;strong&gt;curl&lt;/strong&gt;, or with any other tool such as Postman or Insomnia. For the Landoop GUI, open &lt;a href="http://localhost:3030/schema-registry-ui/#/" rel="noopener noreferrer"&gt;http://localhost:3030/schema-registry-ui/#/&lt;/a&gt; and filter by the subject name &lt;code&gt;transacoes-value&lt;/code&gt;, as in the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2w5ce7km72njh8mcfgy6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F2w5ce7km72njh8mcfgy6.png" alt="Print da interface gráfica do landoop filtrando pelo termo transacoes-value"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you prefer, you can also use &lt;strong&gt;curl&lt;/strong&gt; in a new terminal:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

curl &lt;span class="nt"&gt;-X&lt;/span&gt; GET http://localhost:8081/subjects


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The result should look something like the image below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fme6bqan0uyxbp1qvnbdd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fme6bqan0uyxbp1qvnbdd.png" alt="Print com o exemplo da resposta para uma requisição na API do schema registry para listar os subjects registrados"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: the response will contain several &lt;strong&gt;subjects&lt;/strong&gt;, but we'll focus only on the &lt;code&gt;transacoes-value&lt;/code&gt; subject for this validation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Let's see what happens when we create another producer, this time for the &lt;strong&gt;DetalhesConta&lt;/strong&gt; schema. As mentioned earlier, by default the producer registers subjects and schemas automatically. Our goal with this second producer is to add another schema to the same topic. First, open a new terminal and start bash:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev bash


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now start the producer:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

kafka-avro-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; transacoes &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.schema&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'{ "type": "record", "name": "DetalhesConta", "fields": [ {"name": "conta_id", "type": "string"}, {"name": "titular", "type": "string"}, {"name": "saldo", "type": "double"}, {"name": "tipo_conta", "type": "string"}, {"name": "agencia", "type": "string"} ] }'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; schema.registry.url&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8081


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Right after, paste the following event for the &lt;code&gt;DetalhesConta&lt;/code&gt; schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"987654321"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"titular"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Joaquim"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"saldo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;1000.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"tipo_conta"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"corrente"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"agencia"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"123"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;We should get an error message similar to the one below:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

org.apache.kafka.common.errors.InvalidConfigurationException: Schema being registered is incompatible with an earlier schema for subject "transacoes-value", 
details: [Incompatibility{type:NAME_MISMATCH, location:/name, message:expected: Transacao, reader:{"type":"record","name":"DetalhesConta","fields":
[{"name":"conta_id","type":"string"},{"name":"titular","type":"string"},{"name":"saldo","type":"double"},{"name":"tipo_conta","type":"string"},
{"name":"agencia","type":"string"}]}, writer:{"type":"record","name":"Transacao","fields":[{"name":"id","type":"string"},{"name":"valor","type":"double"},
{"name":"data","type":"string"},{"name":"tipo","type":"string"},{"name":"conta_origem","type":"string"},{"name":"conta_destino","type":"string"},
{"name":"descricao","type":"string"}]}}, Incompatibility{type:READER_FIELD_MISSING_DEFAULT_VALUE, location:/fields/0, message:conta_id, reader:{"type":"record",
"name":"DetalhesConta","fields":[{"name":"conta_id","type":"string"},{"name":"titular","type":"string"},{"name":"saldo","type":"double"},{"name":"tipo_conta",
"type":"string"},{"name":"agencia","type":"string"}]}, writer:{"type":"record","name":"Transacao","fields":[{"name":"id","type":"string"},{"name":"valor",
"type":"double"},{"name":"data","type":"string"},{"name":"tipo","type":"string"},{"name":"conta_origem","type":"string"},{"name":"conta_destino",
"type":"string"},{"name":"descricao","type":"string"}]}}, Incompatibility{type:READER_FIELD_MISSING_DEFAULT_VALUE, location:/fields/1, message:titular, reader:
{"type":"record","name":"DetalhesConta","fields":[{"name":"conta_id","type":"string"},{"name":"titular","type":"string"},{"name":"saldo","type":"double"},
{"name":"tipo_conta","type":"string"},{"name":"agencia","type":"string"}]}, writer:{"type":"record","name":"Transacao","fields":[{"name":"id","type":"string"},
{"name":"valor","type":"double"},{"name":"data","type":"string"},{"name":"tipo","type":"string"},{"name":"conta_origem","type":"string"},
{"name":"conta_destino","type":"string"},{"name":"descricao","type":"string"}]}}, Incompatibility{type:READER_FIELD_MISSING_DEFAULT_VALUE, location:/fields/2, 
message:saldo, reader:{"type":"record","name":"DetalhesConta","fields":[{"name":"conta_id","type":"string"},{"name":"titular","type":"string"},{"name":"saldo",
"type":"double"},{"name":"tipo_conta","type":"string"},{"name":"agencia","type":"string"}]}, writer:{"type":"record","name":"Transacao","fields":[{"name":"id",
"type":"string"},{"name":"valor","type":"double"},{"name":"data","type":"string"},{"name":"tipo","type":"string"},{"name":"conta_origem","type":"string"},
{"name":"conta_destino","type":"string"},{"name":"descricao","type":"string"}]}}, Incompatibility{type:READER_FIELD_MISSING_DEFAULT_VALUE, location:/fields/3, 
message:tipo_conta, reader:{"type":"record","name":"DetalhesConta","fields":[{"name":"conta_id","type":"string"},{"name":"titular","type":"string"},
{"name":"saldo","type":"double"},{"name":"tipo_conta","type":"string"},{"name":"agencia","type":"string"}]}, writer:{"type":"record","name":"Transacao",
"fields":[{"name":"id","type":"string"},{"name":"valor","type":"double"},{"name":"data","type":"string"},{"name":"tipo","type":"string"},{"name":"conta_origem",
"type":"string"},{"name":"conta_destino","type":"string"},{"name":"descricao","type":"string"}]}}, Incompatibility{type:READER_FIELD_MISSING_DEFAULT_VALUE, 
location:/fields/4, message:agencia, reader:{"type":"record","name":"DetalhesConta","fields":[{"name":"conta_id","type":"string"},{"name":"titular",
"type":"string"},{"name":"saldo","type":"double"},{"name":"tipo_conta","type":"string"},{"name":"agencia","type":"string"}]}, writer:{"type":"record",
"name":"Transacao","fields":[{"name":"id","type":"string"},{"name":"valor","type":"double"},{"name":"data","type":"string"},{"name":"tipo","type":"string"},
{"name":"conta_origem","type":"string"},{"name":"conta_destino","type":"string"},{"name":"descricao","type":"string"}]}}]; error code: 409


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;The error message is long, but in short: the problem occurred while trying to automatically register a new schema using the &lt;code&gt;TopicNameStrategy&lt;/code&gt; subject name strategy, which generates a subject called &lt;code&gt;transacoes-value&lt;/code&gt;. Since this subject had already been registered, the registry treated the request as an attempt to update the existing schema by adding required fields, which produced the incompatibility errors. This behavior is governed by the schema compatibility type, whose default is &lt;code&gt;BACKWARD&lt;/code&gt;: it only allows deleting fields or adding optional fields. More information about schema compatibility can be found in the &lt;a href="https://docs.confluent.io/cloud/current/sr/fundamentals/schema-evolution.html#compatibility-types" rel="noopener noreferrer"&gt;Compatibility types documentation&lt;/a&gt;.&lt;/p&gt;
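&lt;p&gt;As a sketch, the compatibility level can be inspected (and, if needed, changed) through the Schema Registry REST API. The commands below assume this tutorial's registry at &lt;code&gt;http://localhost:8081&lt;/code&gt;; changing compatibility is shown only as an illustration, not as the fix we adopt here:&lt;/p&gt;

```shell
# Inspect the global compatibility level (BACKWARD by default):
curl -X GET http://localhost:8081/config

# Inspect the compatibility level of a specific subject, if one was set:
curl -X GET http://localhost:8081/config/transacoes-value

# Change the compatibility level of a subject (use with care):
curl -X PUT -H "Content-Type: application/vnd.schemaregistry.v1+json" \
  --data '{"compatibility": "NONE"}' \
  http://localhost:8081/config/transacoes-value
```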

&lt;blockquote&gt;
&lt;p&gt;Tip: to stop the producer in any of the open terminals, just press &lt;code&gt;Ctrl + C&lt;/code&gt;.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Solution
&lt;/h2&gt;

&lt;p&gt;To solve this problem, we should use the &lt;code&gt;TopicRecordNameStrategy&lt;/code&gt; or &lt;code&gt;RecordNameStrategy&lt;/code&gt; subject name strategy. Here we will use &lt;code&gt;TopicRecordNameStrategy&lt;/code&gt;, but first let's recreate our infrastructure in a few steps:&lt;/p&gt;
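&lt;p&gt;Before running the steps, it helps to see which subject name each strategy would derive for our topic and record. The snippet below only illustrates the naming rules; the variable names are ours, not part of any tool:&lt;/p&gt;

```shell
# Topic and Avro record names used in this tutorial:
topic="transacoes"
record="Transacao"

# TopicNameStrategy (default): one subject per topic, so two different
# record types collide on the same subject:
topic_name_subject="${topic}-value"

# RecordNameStrategy: one subject per record type, shared across topics
# (the fully qualified record name; ours has no namespace):
record_name_subject="${record}"

# TopicRecordNameStrategy: one subject per (topic, record) pair:
topic_record_name_subject="${topic}-${record}"

echo "${topic_name_subject}"        # transacoes-value
echo "${record_name_subject}"       # Transacao
echo "${topic_record_name_subject}" # transacoes-Transacao
```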

&lt;blockquote&gt;
&lt;p&gt;Note: these steps are not mandatory, but to keep the environment stateless we will recreate the infrastructure.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Stop the containers:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose down &lt;span class="nt"&gt;-v&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Recreate the infrastructure:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose up &lt;span class="nt"&gt;-d&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now let's register two schemas for the same topic, as follows:&lt;br&gt;
Open a &lt;strong&gt;bash&lt;/strong&gt; session in the container:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev bash


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Create a producer for the &lt;strong&gt;Transacao&lt;/strong&gt; schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

kafka-avro-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; transacoes &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.schema&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'{ "type": "record", "name": "Transacao", "fields": [ {"name": "id", "type": "string"}, {"name": "valor", "type": "double"}, {"name": "data", "type": "string"}, {"name": "tipo", "type": "string"}, {"name": "conta_origem", "type": "string"}, {"name": "conta_destino", "type": "string"}, {"name": "descricao", "type": "string"} ] }'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; schema.registry.url&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8081 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.subject.name.strategy&lt;span class="o"&gt;=&lt;/span&gt;io.confluent.kafka.serializers.subject.TopicRecordNameStrategy


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now paste the following event:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"123456789"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"valor"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;500.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"data"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"2023-01-01"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"tipo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"transferencia"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_origem"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"987654321"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_destino"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"567890123"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"descricao"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Transferência entre contas"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now let's create a producer for the &lt;code&gt;DetalhesConta&lt;/code&gt; schema. First, open a new terminal and start a bash session in the container:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

docker-compose &lt;span class="nb"&gt;exec &lt;/span&gt;fast-data-dev bash


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now start the producer:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

kafka-avro-console-producer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--broker-list&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; transacoes &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.schema&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s1"&gt;'{ "type": "record", "name": "DetalhesConta", "fields": [ {"name": "conta_id", "type": "string"}, {"name": "titular", "type": "string"}, {"name": "saldo", "type": "double"}, {"name": "tipo_conta", "type": "string"}, {"name": "agencia", "type": "string"} ] }'&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; schema.registry.url&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8081 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; value.subject.name.strategy&lt;span class="o"&gt;=&lt;/span&gt;io.confluent.kafka.serializers.subject.TopicRecordNameStrategy


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Right after, paste the following event for the &lt;code&gt;DetalhesConta&lt;/code&gt; schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="w"&gt;

&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"conta_id"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"987654321"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"titular"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"Joaquim"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"saldo"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="mf"&gt;1000.0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"tipo_conta"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"corrente"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="nl"&gt;"agencia"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"123"&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;


&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;With that done, we can check the subjects in the &lt;a href="http://localhost:3030/schema-registry-ui/#/cluster/fast-data-dev" rel="noopener noreferrer"&gt;Landoop UI&lt;/a&gt; by searching for the word &lt;code&gt;transacoes&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg6mhek4ltku321dnozc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg6mhek4ltku321dnozc.png" alt="Print da interface gráfica do landoop pesquisando pelos schemas que contém a palavra transacoes"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or via &lt;strong&gt;curl&lt;/strong&gt; with the following command:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

curl &lt;span class="nt"&gt;-X&lt;/span&gt; GET http://localhost:8081/subjects


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Looking at the response of the command above, we can see that two subjects were created for the &lt;strong&gt;transacoes&lt;/strong&gt; topic: &lt;code&gt;transacoes-Transacao&lt;/code&gt; and &lt;code&gt;transacoes-DetalhesConta&lt;/code&gt;.&lt;/p&gt;
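&lt;p&gt;Still assuming the registry at &lt;code&gt;http://localhost:8081&lt;/code&gt;, each subject can also be inspected individually via the Schema Registry REST API, for example to list its versions and fetch the latest registered schema:&lt;/p&gt;

```shell
# List the registered versions of the subject created by TopicRecordNameStrategy:
curl -X GET http://localhost:8081/subjects/transacoes-Transacao/versions

# Fetch the latest schema registered under that subject:
curl -X GET http://localhost:8081/subjects/transacoes-Transacao/versions/latest
```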

&lt;p&gt;We can also view the events received on the Kafka topic using the &lt;a href="http://localhost:3030/kafka-topics-ui/#/cluster/fast-data-dev/topic/n/transacoes/" rel="noopener noreferrer"&gt;Landoop UI&lt;/a&gt;:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh2ixosqagwu1rx8870wl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh2ixosqagwu1rx8870wl.png" alt="Print da interface gráfica do landoop listando os eventos do tópico transacoes"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or through a command-line consumer, in another terminal, after starting bash with the command &lt;code&gt;docker-compose exec fast-data-dev bash&lt;/code&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;

kafka-avro-console-consumer &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--bootstrap-server&lt;/span&gt; localhost:9092 &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--topic&lt;/span&gt; transacoes &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--from-beginning&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
&lt;span class="nt"&gt;--property&lt;/span&gt; schema.registry.url&lt;span class="o"&gt;=&lt;/span&gt;http://localhost:8081


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After completing this walkthrough, you will be able to automatically register two schemas on the same topic with &lt;code&gt;kafka-avro-console-producer&lt;/code&gt;, using the &lt;code&gt;TopicRecordNameStrategy&lt;/code&gt; subject name strategy to avoid subject conflicts on a single topic. You can find an example of this approach in &lt;a href="https://www.karengryg.io/2018/08/18/multi-schemas-in-one-kafka-topic/" rel="noopener noreferrer"&gt;Multi schemas in one Kafka topic&lt;/a&gt;, which uses Java, or in &lt;a href="https://github.com/my-study-area/poc-kafka-connector-lambda/blob/main/run_local/multi-schema-same-topic-avro-producer.py" rel="noopener noreferrer"&gt;another example in Python&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  References
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.confluent.io/cloud/current/sr/fundamentals/index.html" rel="noopener noreferrer"&gt;Schema Registry Key Concepts&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.confluent.io/platform/current/schema-registry/fundamentals/serdes-develop/index.html#subject-name-strategy" rel="noopener noreferrer"&gt;Subject name strategy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.confluent.io/cloud/current/sr/fundamentals/schema-evolution.html#compatibility-types" rel="noopener noreferrer"&gt;Compatibility types&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://docs.confluent.io/platform/current/clients/confluent-kafka-python/html/index.html#avroserializer" rel="noopener noreferrer"&gt;Avroserializer&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://avro.apache.org/docs/" rel="noopener noreferrer"&gt;Apache Avro&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://hub.docker.com/r/landoop/fast-data-dev" rel="noopener noreferrer"&gt;Imagem docker com o ecossitema Kafka&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://chat.openai.com/" rel="noopener noreferrer"&gt;Chat GPT&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.confluent.io/blog/put-several-event-types-kafka-topic/" rel="noopener noreferrer"&gt;Should You Put Several Event Types in the Same Kafka Topic?&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.karengryg.io/2018/08/18/multi-schemas-in-one-kafka-topic/" rel="noopener noreferrer"&gt;Multi schemas in one Kafka topic&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>kafka</category>
      <category>schema</category>
      <category>schemaregistry</category>
      <category>avro</category>
    </item>
  </channel>
</rss>
