<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Shivay Lamba</title>
    <description>The latest articles on Forem by Shivay Lamba (@shivaylamba).</description>
    <link>https://forem.com/shivaylamba</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F239651%2F0b1bcb2d-67d1-4ae8-8e79-c52ae78f09fd.jpeg</url>
      <title>Forem: Shivay Lamba</title>
      <link>https://forem.com/shivaylamba</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/shivaylamba"/>
    <language>en</language>
    <item>
      <title>The LiteLLM Supply Chain Attack: A Wake-Up Call for AI Infrastructure</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Fri, 27 Mar 2026 15:24:29 +0000</pubDate>
      <link>https://forem.com/shivaylamba/the-litellm-supply-chain-attack-a-wake-up-call-for-ai-infrastructure-2mi3</link>
      <guid>https://forem.com/shivaylamba/the-litellm-supply-chain-attack-a-wake-up-call-for-ai-infrastructure-2mi3</guid>
      <description>&lt;p&gt;A routine dependency install triggered one of the most serious supply chain incidents in the AI ecosystem. A compromised release of LiteLLM, an AI gateway with about 97 million monthly downloads, introduced malicious code that quietly extracted sensitive credentials from developers' systems. The attack needed no explicit action. Simply installing the affected package was enough to begin data exfiltration.&lt;/p&gt;

&lt;p&gt;What makes this significant is how it happened and what it exposed. The breach began upstream in the software supply chain and exploited trust in CI/CD pipelines and dependency systems. It did not go after users directly. Even well-secured environments were affected by normal development workflows.&lt;/p&gt;

&lt;p&gt;The scale of concern became clear when &lt;a href="https://www.linkedin.com/in/andrej-karpathy-9a650716/" rel="noopener noreferrer"&gt;Andrej Karpathy&lt;/a&gt;, former director of AI at Tesla and former research scientist at OpenAI, pointed out how dangerous supply chain attacks have become, with Elon Musk reinforcing the need for caution.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwpq4l8tk0uzuytf4fb4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhwpq4l8tk0uzuytf4fb4.png" alt=" "&gt;&lt;/a&gt;&lt;br&gt;
source: &lt;a href="https://x.com/karpathy/status/2036487306585268612?s=20" rel="noopener noreferrer"&gt;https://x.com/karpathy/status/2036487306585268612?s=20&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This points to a deeper issue in how modern AI infrastructure is built and trusted. In this article, let us examine what happened, how the attack unfolded, and what it means for building safer AI systems going forward.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Incident Overview: Scope, Timeline, and Entry Point&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The issue began days before the public release. On March 19, the attacker group TeamPCP changed Git tags in the Trivy GitHub Action to point to a malicious build that carried a credential harvester. &lt;/p&gt;

&lt;p&gt;Trivy runs deep within many CI/CD pipelines, including LiteLLM’s. It became a quiet but effective entry point. On March 23, a similar pattern appeared in Checkmarx KICS, and the domain &lt;code&gt;models.litellm.cloud&lt;/code&gt; was registered just before the main event.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqoolztqvexaud7yjoyvi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqoolztqvexaud7yjoyvi.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;On March 24 at 10:39 UTC, LiteLLM’s CI/CD pipeline ran the compromised Trivy scanner without version pinning. That gap exposed the PyPI publish token from the GitHub Actions environment. Within hours, two malicious versions were released and stayed available until about 16:00 UTC.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Version 1.82.7&lt;/strong&gt;: Malicious code placed in &lt;code&gt;proxy_server.py&lt;/code&gt;, triggered on import&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Version 1.82.8&lt;/strong&gt;: A &lt;code&gt;.pth&lt;/code&gt; file that runs on every Python interpreter startup, no import needed
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;pip &lt;span class="nb"&gt;install &lt;/span&gt;litellm
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A simple install like this could pull sensitive data such as SSH keys, cloud credentials, API keys, and more. The data was encrypted and quietly sent out. The breach came to light only after a bug in the attacker’s code caused a system crash, exposing activity that would have stayed hidden through a transitive dependency.&lt;/p&gt;
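&lt;p&gt;The &lt;code&gt;.pth&lt;/code&gt; vector in 1.82.8 is worth understanding: any line in a &lt;code&gt;.pth&lt;/code&gt; file that starts with &lt;code&gt;import&lt;/code&gt; is executed by Python at interpreter startup. A minimal audit sketch (illustrative, not a complete detector):&lt;/p&gt;

```python
# Hedged sketch: list .pth files in a site-packages directory and flag
# lines that execute code at startup. Lines in a .pth file beginning with
# "import" are run by Python on every interpreter start, which is the
# persistence trick version 1.82.8 relied on.
from pathlib import Path

def suspicious_pth_lines(site_packages: str) -> dict:
    """Map each .pth file to its executable (import ...) lines."""
    findings = {}
    for pth in Path(site_packages).glob("*.pth"):
        lines = [
            line.strip()
            for line in pth.read_text().splitlines()
            if line.strip().startswith("import ")
        ]
        if lines:
            findings[pth.name] = lines
    return findings
```

&lt;p&gt;Legitimate tooling (editable installs, coverage hooks) also uses this mechanism, so anything flagged needs manual review rather than automatic deletion.&lt;/p&gt;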

&lt;h2&gt;
  
  
  How the Attack Moved Through Trusted Systems
&lt;/h2&gt;

&lt;p&gt;The attack began outside LiteLLM and moved through trusted systems until it reached developers.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvhtlja6zt1vi5nn289sv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvhtlja6zt1vi5nn289sv.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Attackers changed the Trivy GitHub Action and pointed it to a malicious version&lt;/li&gt;
&lt;li&gt;LiteLLM’s CI/CD pipeline used it without pinning a specific version&lt;/li&gt;
&lt;li&gt;The scanner pulled the PyPI publishing token from the pipeline&lt;/li&gt;
&lt;li&gt;Attackers used this access to release malicious LiteLLM versions&lt;/li&gt;
&lt;li&gt;Developers installed or updated the package and received the compromised code&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each step depended on trust. The pipeline relied on the scanner, and developers relied on the package source. Nothing looked unusual at any step.&lt;/p&gt;

&lt;p&gt;The attack spread through everyday workflows, including indirect installs through dependencies. Many systems were exposed, and developers remained unaware. This points to a clear issue. Risk extends beyond application code and includes the tools and systems that support it.&lt;/p&gt;
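&lt;p&gt;Of the steps in this chain, the unpinned scanner is the one teams can close most directly: pin third-party actions to a full commit SHA instead of a mutable tag. A hedged sketch (the SHA below is a placeholder, not a real release):&lt;/p&gt;

```yaml
# Risky: a mutable ref that the publisher, or an attacker who takes over
# the repository, can repoint to different code later
- uses: aquasecurity/trivy-action@master

# Safer: a full commit SHA resolves to exactly one immutable tree
# (placeholder SHA, not a real release)
- uses: aquasecurity/trivy-action@0123456789abcdef0123456789abcdef01234567
```

&lt;p&gt;Pinning does not remove the need to review what the pinned code does, but it removes the silent-repoint path this attack used.&lt;/p&gt;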

&lt;h2&gt;
  
  
  Understanding the True Impact of the Breach
&lt;/h2&gt;

&lt;p&gt;The impact of this attack was severe. A simple install gave access to sensitive data across developer systems.&lt;/p&gt;

&lt;p&gt;Attackers could access SSH keys, cloud credentials, Kubernetes secrets, API keys, CI/CD tokens, and database passwords. Shell history, git credentials, and crypto wallets were also exposed.&lt;/p&gt;

&lt;p&gt;This data was then encrypted and sent to an external domain. The process ran quietly in the background and gave no clear signal to the user. If Kubernetes access was present, the attack extended further. It could read cluster secrets and create privileged workloads to maintain access inside the system.&lt;/p&gt;
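&lt;p&gt;A practical first response to an incident like this is simply checking which of the commonly harvested files exist on a machine. A minimal sketch (the path list is illustrative, not the malware’s actual target list):&lt;/p&gt;

```python
# Hedged sketch: check which commonly harvested credential files exist
# under a given home directory. The list below is illustrative only.
from pathlib import Path

COMMON_SECRET_PATHS = [
    ".ssh/id_rsa",
    ".ssh/id_ed25519",
    ".aws/credentials",
    ".kube/config",
    ".git-credentials",
    ".netrc",
]

def present_secret_files(home: str) -> list:
    """Return the relative paths from COMMON_SECRET_PATHS that exist."""
    base = Path(home)
    return [p for p in COMMON_SECRET_PATHS if (base / p).exists()]
```

&lt;p&gt;Every file this finds on an affected machine is a credential to rotate, not just a file to inspect.&lt;/p&gt;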

&lt;p&gt;This level of access goes beyond a single application. It opens the door to full control of infrastructure across environments. AI systems increase this risk further. Tools like LiteLLM act as a central layer for multiple provider keys and requests. Once exposed, the impact spreads across connected services.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;What This Reveals About Modern AI Systems&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;This event highlights key patterns in how modern AI systems are built and where risk enters.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Deep Dependency Chains:&lt;/strong&gt; Modern AI stacks rely on multiple external packages layered atop one another. Many dependencies are indirect, which makes them harder to track. A single weak link can affect the entire system.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Trust Across Layers:&lt;/strong&gt; CI/CD pipelines trust external tools. Build systems trust dependencies. Developers trust package registries. Each layer depends on the next, creating a chain that attackers can move through step by step.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Centralized Gateway Design:&lt;/strong&gt; AI gateways such as LiteLLM collect keys from multiple providers and route all requests through a single layer. This improves convenience but increases risk. Exposure at this layer affects every connected service.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limited Visibility:&lt;/strong&gt; Teams often lack clear insight into what runs inside pipelines or which dependencies are pulled in indirectly. This reduces early detection and slows response.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Security Scope Gaps:&lt;/strong&gt; Security efforts often focus on application code. Supporting systems such as pipelines, tools, and dependencies receive less attention, even though they carry equal risk.&lt;/li&gt;
&lt;/ul&gt;
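&lt;p&gt;The first point is easy to underestimate: a handful of direct dependencies can expand into a much larger transitive set, any member of which can run code at install or import time. A small sketch over a made-up requirements map:&lt;/p&gt;

```python
# Hedged sketch with made-up package names: compute everything a project
# actually executes, given a map of direct requirements per package.
def transitive_deps(package: str, requires: dict) -> set:
    """Return all packages reachable from `package`, excluding itself."""
    seen = set()
    stack = list(requires.get(package, []))
    while stack:
        dep = stack.pop()
        if dep not in seen:
            seen.add(dep)
            stack.extend(requires.get(dep, []))
    return seen

# Illustrative graph: three direct dependencies fan out to seven packages.
REQUIRES = {
    "my-ai-app": ["gateway", "sdk", "scanner"],
    "gateway": ["httplib", "authlib"],
    "sdk": ["httplib", "serializer"],
    "scanner": ["parser"],
}
```

&lt;p&gt;A real stack has hundreds of edges rather than seven, which is why indirect exposure is so hard to track by hand.&lt;/p&gt;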

&lt;h2&gt;
  
  
  &lt;strong&gt;The Risk Built into Modern AI Architectures&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The earlier points highlight a gap. Deep dependency chains, layered trust across systems, centralized gateways, and limited visibility increase risk across AI infrastructure.&lt;/p&gt;

&lt;p&gt;Core functions like API routing and key handling often rely on large dependency chains. As these layers grow, exposure increases. The same structure that supports speed also creates entry points for compromise. Addressing this requires architectural changes, including reducing exposure to dependencies, isolating critical components, and maintaining tighter control over execution environments.&lt;/p&gt;

&lt;p&gt;AI gateways sit at a critical layer, handling traffic, managing keys, and connecting providers, which makes their design directly tied to system risk.&lt;/p&gt;

&lt;h2&gt;
  
  
  Building Low-Exposure AI Infrastructure with Bifrost
&lt;/h2&gt;

&lt;p&gt;For teams evaluating options right now, the focus should be on how a gateway limits the impact of a compromised dependency. This depends on how the system is designed and where control is maintained across it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.getmaxim.ai/bifrost" rel="noopener noreferrer"&gt;Bifrost&lt;/a&gt; is an &lt;a href="https://github.com/maximhq/bifrost" rel="noopener noreferrer"&gt;open-source&lt;/a&gt; AI gateway that routes and manages requests across 20+ model providers through a single interface. Keys, traffic, and access stay under your control.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvvfr88k30lemtfxkxu0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsvvfr88k30lemtfxkxu0.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Its design takes a different path at the architectural level. The PyPI attack surface is removed entirely. Built in Go, Bifrost runs as a single binary or container, which removes the need for pip installs, avoids a Python runtime inside the gateway, and eliminates dependency chains that can be altered during installation.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftze50vlrmi8qjeusfdn3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ftze50vlrmi8qjeusfdn3.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Control remains within the user’s environment because Bifrost runs on private networks, where traffic and credentials remain internal. Key management is handled through direct integration with systems such as &lt;a href="https://developer.hashicorp.com/vault" rel="noopener noreferrer"&gt;HashiCorp Vault&lt;/a&gt;, AWS Secrets Manager, GCP, and Azure, where secrets stay in managed storage with controlled access and clear audit visibility.&lt;/p&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Comparative Overview: LiteLLM vs. Bifrost&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The differences become clearer when both approaches are viewed side by side across key areas of risk, control, and deployment.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Category&lt;/th&gt;
&lt;th&gt;LiteLLM&lt;/th&gt;
&lt;th&gt;Bifrost&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Language&lt;/td&gt;
&lt;td&gt;Python (PyPI)&lt;/td&gt;
&lt;td&gt;Go (binary/Docker)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Supply chain exposure&lt;/td&gt;
&lt;td&gt;High&lt;/td&gt;
&lt;td&gt;Eliminated (no pip dependency)&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Credential storage&lt;/td&gt;
&lt;td&gt;Environment/config-based&lt;/td&gt;
&lt;td&gt;Vault-integrated&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Deployment model&lt;/td&gt;
&lt;td&gt;Application-level&lt;/td&gt;
&lt;td&gt;In-VPC, isolated&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Audit capability&lt;/td&gt;
&lt;td&gt;Limited&lt;/td&gt;
&lt;td&gt;Immutable, SIEM-ready&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Migration effort&lt;/td&gt;
&lt;td&gt;—&lt;/td&gt;
&lt;td&gt;One-line configuration change&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Performance&lt;/td&gt;
&lt;td&gt;—&lt;/td&gt;
&lt;td&gt;~11 µs overhead at scale&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Compatibility&lt;/td&gt;
&lt;td&gt;Native&lt;/td&gt;
&lt;td&gt;Full LiteLLM compatibility&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;Closing Thoughts&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;The LiteLLM incident leaves a lasting takeaway. Systems built on layers of trust, dependencies, and software supply chains carry risk that spreads faster than expected. AI gateways sit at the center of this setup, which makes their design choices critical to overall security.&lt;/p&gt;

&lt;p&gt;This is a moment to rethink how these systems are structured. Reducing dependency exposure, separating critical components, and maintaining control within your own environment can limit how far an issue can travel across the supply chain.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.getmaxim.ai/bifrost" rel="noopener noreferrer"&gt;Bifrost&lt;/a&gt; follows this direction. Its design reduces exposure to dependencies, keeps credentials within managed systems, and runs in controlled environments, which helps limit the spread of similar attacks. To see how this approach can be applied, explore the &lt;a href="https://docs.getbifrost.ai/overview" rel="noopener noreferrer"&gt;Bifrost documentation&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you do not know what each dependency in your AI stack has access to, that is the first thing to fix.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>opensource</category>
      <category>security</category>
    </item>
    <item>
      <title>How I built a Content Curator CLI using GitHub Copilot CLI SDK</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Sun, 25 Jan 2026 19:44:21 +0000</pubDate>
      <link>https://forem.com/shivaylamba/how-i-built-a-content-curator-cli-using-github-copilot-cli-sdk-51ff</link>
      <guid>https://forem.com/shivaylamba/how-i-built-a-content-curator-cli-using-github-copilot-cli-sdk-51ff</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwsdj5jkhp3hmusqo6kng.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwsdj5jkhp3hmusqo6kng.png" alt=" " width="800" height="272"&gt;&lt;/a&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/github-2026-01-21"&gt;GitHub Copilot CLI Challenge&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;h1&gt;
  
  
  Building Content Curator: CLI + Copilot SDK
&lt;/h1&gt;

&lt;p&gt;I wanted a simple but practical project to test out the Copilot SDK. I ended up creating an &lt;strong&gt;AI-powered CLI&lt;/strong&gt; that uses &lt;strong&gt;Copilot’s agentic core&lt;/strong&gt; along with &lt;strong&gt;real-time web search (via Exa AI)&lt;/strong&gt; to generate &lt;strong&gt;short-form video ideas, hooks, and full scripts&lt;/strong&gt; for &lt;strong&gt;Reels, YouTube Shorts, and TikTok&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Repo:&lt;/strong&gt; &lt;code&gt;github-copilot-cli-sdk-content-curator&lt;/code&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  How Content Curator Uses GitHub Copilot SDK &amp;amp; Exa AI
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Initializing the Copilot Client
&lt;/h3&gt;

&lt;p&gt;At the heart of the app is the &lt;code&gt;CopilotService&lt;/code&gt; class, which manages the AI client lifecycle:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;CopilotClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;clientOptions&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;start&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createSession&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;CopilotClient&lt;/code&gt; connects to Copilot models through the locally installed Copilot CLI.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;client.start()&lt;/code&gt; initializes the client, preparing it for session creation.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;createSession()&lt;/code&gt; sets up a streaming session with the chosen model and system prompt.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  2. Session Management
&lt;/h3&gt;

&lt;p&gt;Each AI session represents a continuous conversation with the Copilot model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createSession&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;model&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;currentModel&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;streaming&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;systemMessage&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;append&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;SYSTEM_PROMPT_BASE&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;model&lt;/strong&gt; – Determines which Copilot model (GPT-4o, GPT-5, Claude, etc.) powers content generation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;streaming: true&lt;/strong&gt; – Enables partial outputs to be sent in real time, allowing the CLI to render content as it’s being generated.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;systemMessage&lt;/strong&gt; – Provides base instructions for the agent, ensuring that generated content matches the expected format for short-form videos.&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  3. Prompt Workflow
&lt;/h3&gt;

&lt;p&gt;Content Curator sends structured prompts to Copilot for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Content generation (&lt;code&gt;/script&lt;/code&gt;, &lt;code&gt;/ideas&lt;/code&gt;, &lt;code&gt;/hooks&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Content refinement (&lt;code&gt;/refine&lt;/code&gt;)&lt;/li&gt;
&lt;li&gt;Generating variations (&lt;code&gt;/more-variations&lt;/code&gt;)&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Example:&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;promptWithSearch&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;

---

&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;basePrompt&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="k"&gt;this&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;sendPrompt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;promptWithSearch&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;contentType&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Incorporates &lt;strong&gt;real-time search results from Exa AI&lt;/strong&gt; to ensure generated content is relevant and up-to-date.&lt;/li&gt;
&lt;li&gt;Supports &lt;strong&gt;topic- and platform-specific instructions&lt;/strong&gt;, so scripts are optimized for Instagram, TikTok, or YouTube Shorts.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;Using &lt;code&gt;CopilotClient&lt;/code&gt; directly allows Content Curator to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Maintain persistent AI sessions with context across multiple commands.&lt;/li&gt;
&lt;li&gt;Stream outputs in real time, providing an interactive user experience.&lt;/li&gt;
&lt;li&gt;Dynamically switch models without restarting the application.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Why Choose Copilot SDK Over Generic LLM Clients?
&lt;/h2&gt;

&lt;p&gt;There are several strong reasons to prefer the Copilot SDK over generic LLM clients:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;GitHub-native workflows&lt;/strong&gt; – Direct access to repos, files, and workflows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Built-in agentic behavior&lt;/strong&gt; – Conversations, tools, reasoning, and MCP support.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The kinds of apps you can build with the Copilot SDK go well beyond content curation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Summarizing PRs and generating release notes&lt;/li&gt;
&lt;li&gt;Creating learning-path threads from a repository&lt;/li&gt;
&lt;li&gt;Querying personal knowledge stores&lt;/li&gt;
&lt;li&gt;Building CLIs and custom GUIs for AI agents&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The Copilot SDK exposes the &lt;strong&gt;agentic capabilities of the Copilot CLI&lt;/strong&gt;—including planning, tool execution, and multi-turn workflows—directly in your preferred programming language. This allows you to integrate Copilot into any environment, whether you’re building:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;GUIs with AI workflows&lt;/li&gt;
&lt;li&gt;Personal productivity tools&lt;/li&gt;
&lt;li&gt;Custom internal agents for enterprise processes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Essentially, the SDK acts as an &lt;strong&gt;execution platform&lt;/strong&gt;. It provides the same agentic loop that powers the Copilot CLI, while GitHub manages &lt;strong&gt;authentication, models, MCP servers, and streaming sessions&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;You remain fully in control of what you build on top of these core capabilities.&lt;/p&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Here's the project link - &lt;a href="https://github.com/shivaylamba/github-copilot-cli-sdk-content-curator" rel="noopener noreferrer"&gt;https://github.com/shivaylamba/github-copilot-cli-sdk-content-curator&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Video Walkthrough - &lt;a href="https://www.youtube.com/watch?v=znmkkjpntKc" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=znmkkjpntKc&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  My Experience with GitHub Copilot CLI
&lt;/h2&gt;

&lt;p&gt;I used GitHub Copilot CLI extensively while building this project, primarily as a thinking partner rather than just a code generator. It helped me move faster from intent to implementation by translating natural-language prompts into concrete shell commands, code snippets, and workflow suggestions directly in the terminal.&lt;/p&gt;

&lt;p&gt;Copilot CLI was especially useful during early development and iteration. Whether it was scaffolding project structure, generating boilerplate code, debugging errors, or suggesting optimized commands, it reduced context switching between the editor, browser, and documentation. I could stay focused on the problem I was solving instead of searching for syntax or command references.&lt;/p&gt;

&lt;p&gt;One of the biggest impacts was how it accelerated experimentation. I could quickly try alternative approaches, validate ideas, and refine implementations by asking follow-up questions in the CLI itself. This made the development process more interactive and exploratory, particularly when working with unfamiliar tools or configurations.&lt;/p&gt;

&lt;p&gt;Overall, GitHub Copilot CLI significantly improved my productivity and flow. It didn’t replace my understanding or decision-making, but it acted as a powerful assistant that helped me write, test, and iterate faster—especially in moments where friction would normally slow development down.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefxafe0lmz56oo6i4fme.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fefxafe0lmz56oo6i4fme.png" alt=" " width="800" height="272"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>githubchallenge</category>
      <category>cli</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>How I Used GitLab Duo Agent Platform to Build a Conference Demo in Under an Hour</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Thu, 08 Jan 2026 22:14:31 +0000</pubDate>
      <link>https://forem.com/shivaylamba/how-i-used-gitlab-duo-agent-platorm-to-build-a-conference-demo-in-under-an-hour-1c88</link>
      <guid>https://forem.com/shivaylamba/how-i-used-gitlab-duo-agent-platorm-to-build-a-conference-demo-in-under-an-hour-1c88</guid>
      <description>&lt;p&gt;&lt;em&gt;Learn how the GitLab Duo Agent Platform helped me design and ship a complete GitLab CI pipeline, emitting OTel traces and integrating with Argo, just one hour before my KubeCon talk.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;Building a live demo for a conference can feel overwhelming. A few weeks ago, I committed myself to building an entire end-to-end CI/CD observability demo for my talk at KubeCon using Argo Workflows, Argo CD, and OpenTelemetry. But as anyone who has built live demos knows, things rarely stay simple. Between trace propagation, pipeline wiring, and environment setup, the list of moving pieces grows fast.&lt;/p&gt;

&lt;p&gt;And then came the curveball. The original demo only included Argo Workflows and Argo CD, with no CI component at all, and just one hour before my talk, my co-presenter and I decided that GitLab CI needed to be part of the story. That meant creating a pipeline, instrumenting it with OTel, wiring it to Argo, and exporting everything to SigNoz. Thankfully, I also had a secret superpower: the GitLab Duo Agent Platform. It became the key to building a fully functioning CI demo in record time.&lt;/p&gt;

&lt;h2&gt;
  
  
  Setting the Scene - What we wanted to show in our talk
&lt;/h2&gt;

&lt;p&gt;While production observability is mature, visibility into the CI/CD journey before code reaches production is often missing. That gap was the focus of the KubeCon talk my co-presenter and I gave on end-to-end CI/CD observability using OpenTelemetry. In the session, we introduced CI/CD observability with OTel, with a deep dive into Argo Workflows and Argo CD, showing how a single code change moves through GitLab CI, Argo Workflows, and Argo CD, all stitched together using OpenTelemetry’s new CI/CD semantic conventions.&lt;/p&gt;

&lt;p&gt;Our goal was to highlight how the latest OTel CI/CD Semantic Conventions provide a vendor-neutral way to instrument every stage of the delivery pipeline, from build to deployment, making CI/CD telemetry as accessible and standardized as application telemetry. We covered how to emit traces and metrics from Argo Workflows and Argo CD, and how OTel can surface critical signals like job duration, workflow execution latency, deployment failures, and promotion events.&lt;/p&gt;

&lt;p&gt;The original demo covered only Argo Workflows and Argo CD, with no CI component at all. But just an hour before our presentation, during a quick conversation with my co-presenter, we realized that the story would be incomplete without showing how GitLab CI fits into the end-to-end trace flow.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitLab Duo Agent Platform saves the day
&lt;/h2&gt;

&lt;p&gt;To keep the talk impactful, we needed a fully working end-to-end demo: a commit triggering GitLab CI, emitting spans from each stage, launching an Argo Workflow for integration testing, handing off to Argo CD for deployment, and finally showing a single trace in our SigNoz dashboard. Doing this manually would not have been possible in one hour, but with the GitLab Duo Agent Platform we were able to generate the GitLab CI integration, complete with OTel spans, otel-cli setup, trace propagation, and Argo Workflow submission, in under an hour. That last-minute addition ended up becoming the highlight of our session.&lt;/p&gt;

&lt;p&gt;To extend the talk demo to include CI, I needed GitLab CI to become part of the same end-to-end trace that already included Argo Workflows and Argo CD. So I asked Duo to help me generate the missing CI integration. Here’s the exact prompt I used:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;I already have my OTel demo working for Argo Workflows and Argo CD using OTEL CLI. Now I also want to integrate the CI layer using GitLab CI.

Use a GitLab Runner to emit OTel spans via OTEL_EXPORTER_OTLP_ENDPOINT.

Wrap the key GitLab stages (build, test, package) using OpenTelemetry SDK calls so each step appears as a span.

Export all CI traces to SigNoz (OTLP endpoint).

Finally, show how the GitLab job (build) can trigger an Argo Workflow (integration test) which then leads to an Argo CD deployment — all stitched together through trace context propagation.

The final output should create one seamless story:

Commit → GitLab CI → Argo Workflow → Argo CD → Deployment → Observability Dashboard.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This became the starting point for the GitLab CI configuration that powered my KubeCon demo. Using the Duo Agent Platform, I generated a &lt;code&gt;.gitlab-ci.yml&lt;/code&gt; in minutes. The resulting pipeline includes four stages, each emitting its own OpenTelemetry span to trace the CI/CD execution. Every job installs otel-cli and creates a span that represents its stage of the pipeline (build, test, or package). In the integration stage, the pipeline installs the Argo CLI and automatically submits an Argo Workflow. This allowed the GitLab CI trace to seamlessly connect with the downstream Argo Workflow execution, creating a unified, end-to-end observability story.&lt;/p&gt;
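
&lt;p&gt;To give a feel for the shape of such a pipeline, here is a hand-written sketch of a single stage wrapped in an otel-cli span. This is an approximation, not the generated file: the stage name, the &lt;code&gt;OTEL_CLI_URL&lt;/code&gt; variable, and the build command are assumptions, and the OTLP endpoint is expected in &lt;code&gt;OTEL_EXPORTER_OTLP_ENDPOINT&lt;/code&gt;.&lt;/p&gt;

```yaml
# Hypothetical .gitlab-ci.yml fragment (illustrative only).
# OTEL_EXPORTER_OTLP_ENDPOINT is assumed to point at the SigNoz collector,
# and OTEL_CLI_URL at an otel-cli release binary.
stages:
  - build

build:
  stage: build
  image: alpine:latest
  before_script:
    - apk add --no-cache curl
    - curl -L -o /usr/local/bin/otel-cli "$OTEL_CLI_URL"
    - chmod +x /usr/local/bin/otel-cli
  script:
    # Wrap the real build command in a span so it shows up in the trace
    - otel-cli exec --service gitlab-ci --name "build" -- make build
```

&lt;p&gt;The remaining stages (test, package, integration) follow the same pattern, with the integration stage additionally submitting the Argo Workflow via the Argo CLI.&lt;/p&gt;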

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;You can check the live demo of the CI pipeline here: &lt;/p&gt;

&lt;p&gt;

  &lt;iframe src="https://www.youtube.com/embed/nYNxtd804CI"&gt;
  &lt;/iframe&gt;


&lt;/p&gt;

&lt;p&gt;And you can find the code for the CI/CD pipeline here: &lt;a href="https://gitlab.com/shivaylamba/kubecongitlabdemo" rel="noopener noreferrer"&gt;https://gitlab.com/shivaylamba/kubecongitlabdemo&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;With just a few prompts and some light debugging, the Duo Agent Platform helped me build the entire GitLab CI flow for the KubeCon demo in about an hour, something that would normally take much longer. But the capabilities of the Duo Agent Platform don’t end at helping you create code: you can also use it with your GitLab CI environment to debug runners, optimize jobs, reason about logs, fix pipeline failures, and even explain complex YAML configurations.&lt;/p&gt;

</description>
      <category>gitlab</category>
      <category>devops</category>
      <category>ai</category>
      <category>machinelearning</category>
    </item>
    <item>
      <title>How I Built an Agentic RAG Application to Brainstorm Conference Talk Ideas</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Tue, 08 Jul 2025 23:16:09 +0000</pubDate>
      <link>https://forem.com/couchbase/how-i-built-an-agentic-rag-application-to-brainstorm-conference-talk-ideas-42oo</link>
      <guid>https://forem.com/couchbase/how-i-built-an-agentic-rag-application-to-brainstorm-conference-talk-ideas-42oo</guid>
      <description>&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9yq139beo0wj6b2w2gng.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9yq139beo0wj6b2w2gng.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  TL;DR
&lt;/h2&gt;

&lt;p&gt;I love speaking at technical conferences. But in order to get selected to speak at the event, you need to submit a strong talk proposal or abstract—one that clearly shows relevance, technical depth, and actionable takeaways for the audience attending your talk. A good abstract isn’t just about the idea itself; it needs to show why the topic matters right now and how the talk will benefit attendees. At the same time, you want to avoid repeating something that’s already been presented.&lt;/p&gt;

&lt;p&gt;To solve this, I built an AI-powered agentic application that helps me ideate and draft compelling talk abstracts. It uses a research agent to do deep research on a topic—finding the latest trends, developments, and active discussions—and combines that with fast vector search using &lt;a href="https://www.couchbase.com/" rel="noopener noreferrer"&gt;Couchbase&lt;/a&gt; over previous talks on the same subject from past conferences. In this case, the system is specifically designed for KubeCon, and in this post, I’ll walk you through how I built the full pipeline to create a conference talk brainstorming AI tool. &lt;/p&gt;

&lt;p&gt;You can find the code for this project &lt;a href="https://github.com/shivay-couchbase/conference-agentic-rag-talk-proposal-generator" rel="noopener noreferrer"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;h3&gt;
  
  
  Important Note 🚨
&lt;/h3&gt;

&lt;p&gt;The goal of the agent is only to provide a well-structured abstract idea. You shouldn't copy this AI-generated abstract and submit it directly; instead, use it as a reference and draft an original, handcrafted proposal.&lt;/p&gt;

&lt;h3&gt;
  
  
  Tech Stack
&lt;/h3&gt;

&lt;p&gt;I used a mix of tools to build this project, each handling a different part of the process. &lt;a href="https://google.github.io/adk-docs/" rel="noopener noreferrer"&gt;Google ADK&lt;/a&gt; runs the AI agents, &lt;a href="http://couchbase.com/" rel="noopener noreferrer"&gt;Couchbase&lt;/a&gt; stores past KubeCon talk data and performs the vector search, and &lt;a href="https://studio.nebius.com/" rel="noopener noreferrer"&gt;Nebius&lt;/a&gt; provides the embedding model for generating embeddings and the LLMs (for example, Qwen) that generate summaries and talk abstracts.&lt;/p&gt;

&lt;h2&gt;
  
  
  Complete Pipeline Flow / Architecture Deep Dive
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxnigc27q35z9te5srkmn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxnigc27q35z9te5srkmn.png" alt=" "&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The system is built as a modular, multi-stage pipeline that combines historical data with real-time research to generate high-quality talk proposal ideas. &lt;/p&gt;

&lt;h3&gt;
  
  
  Step 1: URL Extraction / Data Collection (&lt;code&gt;extract_events.py&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Scrape and extract all available KubeCon talk URLs from official conference schedule pages.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Save the KubeCon schedule HTML to a file, then run:&lt;/span&gt;
python extract_events.py &amp;lt; schedule.html
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;What it does&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Parses HTML content from stdin&lt;/li&gt;
&lt;li&gt;Extracts all event URLs with pattern &lt;code&gt;event/&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Merges with existing URLs in &lt;code&gt;event_urls.txt&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Outputs the count of new URLs discovered&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Output&lt;/strong&gt;: &lt;code&gt;event_urls.txt&lt;/code&gt; - Contains all unique talk URLs&lt;/p&gt;
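
&lt;p&gt;The extraction step can be sketched in a few lines of Python. This is an illustrative approximation of what &lt;code&gt;extract_events.py&lt;/code&gt; does; the function name and regex are assumptions, not code from the actual script:&lt;/p&gt;

```python
import re

def extract_event_urls(html, existing=()):
    # Pull every href whose URL contains the "event/" pattern
    found = re.findall(r'href="([^"]*event/[^"]*)"', html)
    # Merge with previously collected URLs, dropping duplicates
    merged = sorted(set(existing) | set(found))
    new_count = len(merged) - len(set(existing))
    return merged, new_count
```

&lt;p&gt;In the real script, the HTML arrives on stdin and the merged list is written back to &lt;code&gt;event_urls.txt&lt;/code&gt;, with only the count of newly discovered URLs printed.&lt;/p&gt;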

&lt;h3&gt;
  
  
  Step 2: Talk Data Crawling / Data Ingestion (&lt;code&gt;couchbase_utils.py&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Crawl each talk page, extract structured metadata (title, description, speakers, tags, etc.), and store it in Couchbase using well-defined document schemas.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python couchbase_utils.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;What it does&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reads URLs from &lt;code&gt;event_urls.txt&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Uses AsyncWebCrawler to fetch talk pages in batches&lt;/li&gt;
&lt;li&gt;Extracts structured data:

&lt;ul&gt;
&lt;li&gt;Title&lt;/li&gt;
&lt;li&gt;Description&lt;/li&gt;
&lt;li&gt;Speaker(s)&lt;/li&gt;
&lt;li&gt;Category&lt;/li&gt;
&lt;li&gt;Date&lt;/li&gt;
&lt;li&gt;Location&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Stores directly to Couchbase with document keys like &lt;code&gt;talk_&amp;lt;event_id&amp;gt;&lt;/code&gt;
&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Features&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Batch processing (5 URLs at a time)&lt;/li&gt;
&lt;li&gt;Error handling and retry logic&lt;/li&gt;
&lt;li&gt;Progress tracking with success/failure counts&lt;/li&gt;
&lt;li&gt;Automatic document key generation&lt;/li&gt;
&lt;/ul&gt;
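
&lt;p&gt;The batching and error handling described above can be sketched as follows. This is a simplified, hypothetical version of the loop in &lt;code&gt;couchbase_utils.py&lt;/code&gt; (the real one crawls asynchronously with AsyncWebCrawler and writes to Couchbase):&lt;/p&gt;

```python
def process_in_batches(urls, handler, batch_size=5):
    # Walk the URL list five at a time, tracking successes and failures
    succeeded, failed = [], []
    for start in range(0, len(urls), batch_size):
        batch = urls[start:start + batch_size]
        for url in batch:
            try:
                handler(url)  # crawl the page and store the talk document
                succeeded.append(url)
            except Exception:
                failed.append(url)  # kept around for a later retry pass
    return succeeded, failed
```

&lt;p&gt;A failed URL doesn't abort the run; it is collected so a retry pass can attempt it again, which is what keeps a long crawl resilient to the occasional flaky page.&lt;/p&gt;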

&lt;h3&gt;
  
  
  Step 3: Embedding Generation (&lt;code&gt;embeddinggeneration.py&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: Generate semantic vector embeddings from talk content (title + description + category) using the &lt;code&gt;intfloat/e5-mistral-7b-instruct&lt;/code&gt; model from &lt;a href="https://studio.nebius.com/" rel="noopener noreferrer"&gt;Nebius AI Studio&lt;/a&gt;, and store them back in Couchbase for fast vector search.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;python embeddinggeneration.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;What it does&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Queries all documents from Couchbase&lt;/li&gt;
&lt;li&gt;Combines title, description, and category into searchable text&lt;/li&gt;
&lt;li&gt;Generates embeddings using &lt;code&gt;intfloat/e5-mistral-7b-instruct&lt;/code&gt; model&lt;/li&gt;
&lt;li&gt;Updates documents with embedding vectors&lt;/li&gt;
&lt;li&gt;Enables vector search functionality&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Model&lt;/strong&gt;: Uses Nebius AI's embedding endpoint for high-quality vectors&lt;/p&gt;
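
&lt;p&gt;The "searchable text" that gets embedded is just a concatenation of the talk fields. A minimal sketch, assuming the field names match the stored documents:&lt;/p&gt;

```python
def build_embedding_text(talk):
    # Join the fields that should drive semantic similarity into one string;
    # this combined text is what gets sent to the embedding endpoint.
    parts = [talk.get("title", ""), talk.get("description", ""), talk.get("category", "")]
    return " ".join(p for p in parts if p)
```

&lt;p&gt;The resulting string is embedded via the Nebius endpoint and the vector is written back onto the same Couchbase document, so each talk carries its own embedding for later search.&lt;/p&gt;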

&lt;h3&gt;
  
  
  Step 4: Agent + RAG Application (&lt;code&gt;talk_suggestions_app.py&lt;/code&gt;)
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Purpose&lt;/strong&gt;: The user inputs a rough topic idea via the Streamlit interface. The system runs both the research agent and vector search in parallel, then combines the outputs using a Nebius AI LLM to generate a unique, well-structured abstract with key takeaways.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;streamlit run kubecon-talk-agent/talk_suggestions_app.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Core Features&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;On one side, the application performs a vector search through a database (Couchbase) of past KubeCon talks to understand what’s already been covered. On the other, it leverages a web research agent powered by Google ADK to gather the latest trends, technical developments, and community discussions around the topic. This leads into a three-stage generation process: the &lt;code&gt;Research Phase&lt;/code&gt;, where the agent collects up-to-date context; the &lt;code&gt;Retrieval Phase&lt;/code&gt;, where similar historical talks are surfaced; and the &lt;code&gt;Synthesis Phase&lt;/code&gt;, where an LLM merges both streams into a compelling proposal.&lt;/p&gt;

&lt;p&gt;Let's look a bit deeper into the three-step process: &lt;/p&gt;

&lt;h3&gt;
  
  
  Research Agent Execution
&lt;/h3&gt;

&lt;p&gt;I created a custom multi-agent research system using Google ADK (Agent Development Kit). This system is designed to autonomously research emerging trends across the CNCF ecosystem in real time, drawing on trusted sources.&lt;/p&gt;

&lt;p&gt;Here's how it works under the hood:&lt;/p&gt;

&lt;h4&gt;
  
  
  Parallel Execution for Web Research
&lt;/h4&gt;

&lt;p&gt;The first step involves spinning up multiple research agents that gather insights independently from different web sources. I use a &lt;code&gt;ParallelAgent&lt;/code&gt; to run all of these at the same time:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ExaAgent&lt;/code&gt;: Leverages the Exa API to search for recent high-quality blogs, articles, and summaries published in the past 90 days.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;TavilyAgent&lt;/code&gt; (optional): Pulls developer sentiment and discussion threads from platforms like Reddit, X (formerly Twitter), and Dev.to.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;LinkupAgent&lt;/code&gt; (optional): Surfaces curated technical posts and deep-dives from sites like GitHub and Hacker News.&lt;/p&gt;

&lt;p&gt;Each of these tools is wrapped in its own &lt;code&gt;LlmAgent&lt;/code&gt;, configured with dynamic instructions based on the user’s topic. Because they operate independently, they don’t interfere with one another and collectively reduce total response time.&lt;/p&gt;

&lt;p&gt;These agents are executed in parallel using a &lt;code&gt;ParallelAgent&lt;/code&gt;, ensuring low latency and independent execution. Once all the raw data is collected, it is passed to a &lt;code&gt;SummaryAgent&lt;/code&gt;, which synthesizes the results into a clean, structured summary using a powerful LLM (nebius/Qwen/Qwen3-235B-A22B).&lt;/p&gt;

&lt;h4&gt;
  
  
  Sequential Reasoning for Synthesis and Insight
&lt;/h4&gt;

&lt;p&gt;Once all agents (&lt;code&gt;ParallelAgent&lt;/code&gt;) complete their respective searches, I combine their outputs into a single structured flow.&lt;/p&gt;

&lt;p&gt;Apart from the search agents, the rest of the pipeline, including steps like summarization and analysis, runs sequentially, managed using ADK’s &lt;code&gt;SequentialAgent&lt;/code&gt;:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;SummaryAgent&lt;/code&gt;: This agent synthesizes the raw research results into a cohesive, structured Markdown summary. It filters out the noise, highlights common themes, and stitches together the key insights from the research agents.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;AnalysisAgent&lt;/code&gt;: This agent reviews the summary and delivers deeper insights including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Key Trends – Major developments or patterns observed&lt;/li&gt;
&lt;li&gt;Novel Angles – Unique viewpoints or underexplored ideas&lt;/li&gt;
&lt;li&gt;Unanswered Questions – What the community is still trying to figure out&lt;/li&gt;
&lt;li&gt;Contrarian Viewpoints – Active debates or non-mainstream takes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This sequential setup is intentional: the &lt;code&gt;AnalysisAgent&lt;/code&gt; depends on the clean output from the &lt;code&gt;SummaryAgent&lt;/code&gt;. Running them in parallel would reduce quality and coherence.&lt;/p&gt;

&lt;h4&gt;
  
  
  The Orchestration Layer
&lt;/h4&gt;

&lt;p&gt;The full pipeline is managed through ADK’s orchestration features:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;ParallelAgent → for running web search agents&lt;/li&gt;
&lt;li&gt;SequentialAgent → for dependent reasoning steps&lt;/li&gt;
&lt;li&gt;Runner → to execute the pipeline&lt;/li&gt;
&lt;li&gt;InMemorySessionService → for fast, stateless execution&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here's a simplified breakdown of the pipeline:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def run_adk_research(topic: str) -&amp;gt; str:
    # 1. Setup Models
    nebius_base_model = LiteLlm(model="nebius/Qwen/Qwen3-235B-A22B", api_key=os.getenv("NEBIUS_API_KEY"))

    # 2. Define Agents
    exa_agent = LlmAgent(
        name="ExaAgent",
        model=nebius_base_model,
        instruction=f"Use the exa_search_ai tool to fetch the latest news and developments about '{topic}'.",
        tools=[exa_search_ai],
        output_key="exa_results"
    )

    # 3. Summarize Results
    summary_agent = LlmAgent(
        name="SummaryAgent",
        model=nebius_base_model,
        instruction="""
            You are a meticulous research summarizer. Combine the results from 'exa_results' 
            into a cohesive markdown summary. Focus on trends, notable discussions, and 
            community sentiment.
        """,
        output_key="final_summary"
    )

    # 4. Execute Pipeline
    pipeline = SequentialAgent(
        name="AIPipelineAgent",
        sub_agents=[
            ParallelAgent(name="ParallelSearch", sub_agents=[exa_agent]),
            summary_agent
        ]
    )

    runner = Runner(agent=pipeline, app_name="adk_research_app", session_service=InMemorySessionService())

    content = types.Content(role="user", parts=[types.Part(text=f"Start analysis for {topic}")])
    events = runner.run(user_id="streamlit_user", session_id="session_xyz", new_message=content)

    for event in events:
        if event.is_final_response():
            return event.content.parts[0].text

    return "Failed to generate summary."

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Retrieval Agent Execution
&lt;/h3&gt;

&lt;p&gt;Once real-time research is complete, the system proceeds to retrieve historical context from past KubeCon talks. This is done using Couchbase vector search, which allows us to compare the semantic similarity of the user's idea with previous talk proposals.&lt;/p&gt;

&lt;h4&gt;
  
  
  What happens here?
&lt;/h4&gt;

&lt;p&gt;We take the user’s query and generate an embedding using &lt;code&gt;intfloat/e5-mistral-7b-instruct&lt;/code&gt; via Nebius' embedding API.&lt;/p&gt;

&lt;p&gt;We then perform a vector search against a &lt;code&gt;kubecontalks&lt;/code&gt; index in Couchbase that stores embeddings of historical talks.&lt;/p&gt;

&lt;p&gt;Finally, we fetch the metadata (title, speaker, category, description) for the top matching talks.&lt;/p&gt;

&lt;p&gt;This helps us:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Understand what’s already been covered.&lt;/li&gt;
&lt;li&gt;Avoid duplicate proposals.&lt;/li&gt;
&lt;li&gt;Borrow inspiration from successful submissions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Here's the sample code for the same:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;class CouchbaseConnection:
    def __init__(self):
        connection_string = os.getenv('CB_CONNECTION_STRING')
        username = os.getenv('CB_USERNAME')
        password = os.getenv('CB_PASSWORD')
        bucket_name = os.getenv('CB_BUCKET')
        collection_name = os.getenv('CB_COLLECTION')

        auth = PasswordAuthenticator(username, password)
        options = ClusterOptions(auth)
        self.cluster = Cluster(connection_string, options)
        self.bucket = self.cluster.bucket(bucket_name)
        self.scope = self.bucket.scope("_default")
        self.collection = self.bucket.collection(collection_name)
        self.search_index_name = os.getenv('CB_SEARCH_INDEX', "kubecontalks")

    def generate_embedding(self, text: str) -&amp;gt; List[float]:
        client = OpenAI(base_url=os.getenv("NEBIUS_API_BASE"), api_key=os.getenv("NEBIUS_API_KEY"))
        response = client.embeddings.create(
            model="intfloat/e5-mistral-7b-instruct",
            input=text,
            timeout=30
        )
        return response.data[0].embedding

    def get_similar_talks(self, query: str, num_results: int = 5) -&amp;gt; List[Dict[str, Any]]:
        embedding = self.generate_embedding(query)
        search_req = SearchRequest.create(MatchNoneQuery()).with_vector_search(
            VectorSearch.from_vector_query(
                VectorQuery("embedding", embedding, num_candidates=num_results)
            )
        )
        result = self.scope.search(self.search_index_name, search_req)
        rows = list(result.rows())

        similar_talks = []
        for row in rows:
            doc = self.collection.get(row.id)
            if doc and doc.value:
                talk = doc.value
                similar_talks.append({
                    "title": talk.get("title", "N/A"),
                    "description": talk.get("description", "N/A"),
                    "category": talk.get("category", "N/A"),
                    "speaker": talk.get("speaker", "N/A"),
                    "score": row.score
                })
        return similar_talks


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The results of this phase are then passed into the final synthesis stage.&lt;/p&gt;

&lt;h3&gt;
  
  
  Synthesis Phase
&lt;/h3&gt;

&lt;p&gt;The final phase brings everything together: the user’s idea, the ADK-generated real-time insights, and the similar historical talks.&lt;/p&gt;

&lt;p&gt;The goal is to produce a talk proposal idea that is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Timely – aligned with current trends.&lt;/li&gt;
&lt;li&gt;Unique – not duplicating past talks.&lt;/li&gt;
&lt;li&gt;Actionable – with clear learning objectives and audience fit.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  How it works
&lt;/h4&gt;

&lt;p&gt;We use an LLM &lt;code&gt;(Qwen/Qwen3-235B-A22B)&lt;/code&gt; to analyze: &lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;User’s raw idea&lt;/li&gt;
&lt;li&gt;Web analysis from the research agent&lt;/li&gt;
&lt;li&gt;Historical KubeCon talks from vector search&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We then ask the model to synthesize all of this into a structured format containing:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Title&lt;/li&gt;
&lt;li&gt;Abstract&lt;/li&gt;
&lt;li&gt;Key Learning Objectives&lt;/li&gt;
&lt;li&gt;Target Audience&lt;/li&gt;
&lt;li&gt;Why this talk is unique
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def generate_talk_suggestion(query: str, similar_talks: List[Dict[str, Any]], adk_research: str) -&amp;gt; str:
    historical_context = "\n\n".join([
        f"Title: {talk['title']}\nDescription: {talk['description']}\nCategory: {talk['category']}"
        for talk in similar_talks
    ]) if similar_talks else "No similar talks found."

    prompt = f"""
You are an expert in cloud-native conference planning.

User's Idea:
{query}

PART 1: Historical Talks
{historical_context}

PART 2: Web Research
{adk_research}

Your task is to generate a fresh and compelling talk proposal. Follow this structure:

**Title:**  
*A catchy title that grabs attention.*

**Abstract:**  
*2–3 paragraphs outlining the core idea, approach, and takeaways.*

**Key Learning Objectives:**  
- Bullet 1  
- Bullet 2  
- Bullet 3  

**Target Audience:**  
*Beginner SREs? Advanced Platform Engineers?*

**Why This Talk is Unique:**  
*Explain how it differs from existing talks and addresses a fresh trend or gap.*
"""

    client = OpenAI(api_key=os.getenv("NEBIUS_API_KEY"), base_url=os.getenv("NEBIUS_API_BASE"))
    response = client.chat.completions.create(
        model="Qwen/Qwen3-235B-A22B",
        messages=[
            {"role": "system", "content": "You are a cloud-native conference program advisor."},
            {"role": "user", "content": prompt}
        ],
        temperature=0.7,
        max_tokens=2048
    )
    return response.choices[0].message.content

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is where the magic happens. The model takes a dual-context approach—both fresh insights and past data—to recommend a proposal that’s:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;grounded in reality,&lt;/li&gt;
&lt;li&gt;informed by what’s already been done, and&lt;/li&gt;
&lt;li&gt;backed by real-world use cases.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;  &lt;iframe src="https://www.youtube.com/embed/Nbo_iB5zA5k"&gt;
  &lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  Final thoughts
&lt;/h2&gt;

&lt;p&gt;Building this made me realize that talk ideation is just another AI use case. Blending historical talk data with up-to-the-minute research cuts down the time and effort spent gathering the latest information and digging up previous talks on a topic.&lt;/p&gt;

&lt;p&gt;AI Agents help simplify tasks and can orchestrate complex workflows with ease. &lt;/p&gt;

&lt;p&gt;Curious to try this for your own conference? Drop me a note—I’d love to hear your ideas and evolve this further with the community!&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>couchbase</category>
      <category>python</category>
    </item>
    <item>
      <title>Using Nebius AI Models with LangChain/Langgraph via LiteLLM</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Mon, 09 Jun 2025 19:33:08 +0000</pubDate>
      <link>https://forem.com/shivaylamba/using-nebius-ai-models-with-langchainlanggraph-via-litellm-5c78</link>
      <guid>https://forem.com/shivaylamba/using-nebius-ai-models-with-langchainlanggraph-via-litellm-5c78</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;Note: there is a Python package for Nebius Langchain &lt;a href="https://pypi.org/project/langchain-nebius/" rel="noopener noreferrer"&gt;https://pypi.org/project/langchain-nebius/&lt;/a&gt;&lt;br&gt;
This approach is an alternative approach with LiteLLM&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Nebius AI Studio is a platform from Nebius that simplifies the process of building applications using AI models. It provides a suite of tools and services for developers to easily test, integrate and fine-tune various AI models, including those for text and image generation. &lt;br&gt;
You can check out the list of available models &lt;a href="https://studio.nebius.com/" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;LangChain is one of the most popular open source frameworks for building Generative AI / LLM based applications. &lt;/p&gt;

&lt;p&gt;Thus, in this blog we will cover how you can use LiteLLM to access Nebius AI Studio models when building LLM/agentic applications with LangChain and LangGraph (LangGraph is an open-source AI agent framework designed to build, deploy, and manage complex generative AI agent workflows). &lt;/p&gt;
&lt;h1&gt;
  
  
  LiteLLM Nebius AI provider
&lt;/h1&gt;

&lt;p&gt;LiteLLM is a library and proxy server that simplifies interactions with various Large Language Model (LLM) APIs, allowing developers to use a consistent interface with over 100 different LLMs. It essentially provides a standardized OpenAI API format for these different providers, making it easier to switch between them without rewriting code. &lt;/p&gt;

&lt;p&gt;LiteLLM includes Nebius AI Studio as one of the &lt;a href="https://docs.litellm.ai/docs/providers/nebius" rel="noopener noreferrer"&gt;LLM providers&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;In order to use a Nebius model with LiteLLM, ensure you have the &lt;code&gt;NEBIUS_API_KEY&lt;/code&gt; set as an environment variable:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;os.environ['NEBIUS_API_KEY'] = "insert-your-nebius-ai-studio-api-key"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This is how we specify a Nebius model:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;model="nebius/Qwen/Qwen3-235B-A22B"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h2&gt;
  
  
  Using LiteLLM with LangChain
&lt;/h2&gt;

&lt;p&gt;You can use the &lt;code&gt;ChatLiteLLM&lt;/code&gt; &lt;a href="https://docs.litellm.ai/docs/langchain/" rel="noopener noreferrer"&gt;chat model&lt;/a&gt; from the LangChain Community models. &lt;/p&gt;

&lt;p&gt;&lt;code&gt;from langchain_community.chat_models import ChatLiteLLM&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;And here's how you can use a Nebius AI model with ChatLiteLLM:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from langchain_community.chat_models import ChatLiteLLM

# Create LLM class
llm = ChatLiteLLM(model="nebius/Qwen/Qwen3-235B-A22B")
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h1&gt;
  
  
  Creating a ReAct Agent with LangGraph
&lt;/h1&gt;

&lt;p&gt;Once you have initialized the LLM, you can use it to create agents with LangGraph. &lt;/p&gt;

&lt;p&gt;In the example below, we use LangGraph to create a ReAct agent that can interact with a Couchbase database using the Model Context Protocol. ReAct agents iteratively think, use tools, and act on observations to achieve user goals, dynamically adapting their approach. LangGraph offers a prebuilt ReAct agent (&lt;code&gt;create_react_agent&lt;/code&gt;).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from langchain_mcp_adapters.tools import load_mcp_tools
from langgraph.prebuilt import create_react_agent
from langgraph.checkpoint.memory import InMemorySaver
from langchain_community.chat_models import ChatLiteLLM

import os
# env variable
os.environ['NEBIUS_API_KEY']

llm = ChatLiteLLM(
model="nebius/Qwen/Qwen3-235B-A22B",
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            # Initialize the connection
            print("Initializing connection...")
            await session.initialize()

            # Get tools
            print("Loading tools...")
            tools = await load_mcp_tools(session)

            # Create and run the agent
            print("Creating agent...")
            checkpoint = InMemorySaver()

            agent = create_react_agent(
                llm, 
                tools,
                prompt=system_prompt,
                checkpointer=checkpoint
            )

            print("-"*25, "Starting Run", "-"*25)
            await qna(agent)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The main function ties everything together to set up and run our agent:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Start &amp;amp; Connect to MCP Server: It first starts the mcp-server-couchbase process using stdio_client and establishes a communication ClientSession with it.&lt;/li&gt;
&lt;li&gt;Initialize Session &amp;amp; Load Tools: The session is initialized. Then, load_mcp_tools queries the MCP server to get the available Couchbase tools and prepares them for LangChain.&lt;/li&gt;
&lt;li&gt;Set Up Agent Memory: InMemorySaver is created to allow the agent to remember conversation history.&lt;/li&gt;
&lt;li&gt;Create ReAct Agent: The create_react_agent function builds our AI agent, providing it with the language model, the Couchbase tools, our system_prompt, and the checkpoint for memory.&lt;/li&gt;
&lt;li&gt;Run Q&amp;amp;A: Finally, it calls the qna function, passing the created agent to start the question-and-answer process with the database.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To dive deeper into this example, you can check out the &lt;a href="https://github.com/Arindam200/awesome-ai-apps/tree/main/mcp_ai_agents/langchain_langgraph_mcp_agent" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; for this project. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fruiatuzjuwrx638hfvcy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fruiatuzjuwrx638hfvcy.png" alt="Image description" width="800" height="617"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;While Nebius AI Studio doesn't yet have a direct LangChain integration, the flexibility of the AI dev ecosystem means you’re not stuck waiting. With LiteLLM, you can use Nebius models with LangChain and LangGraph to build agentic applications. &lt;/p&gt;

</description>
      <category>ai</category>
      <category>langchain</category>
      <category>programming</category>
    </item>
    <item>
      <title>Building a Multi-Agent RAG System with Couchbase, CrewAI, and Nebius AI Studio</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Thu, 05 Jun 2025 03:27:08 +0000</pubDate>
      <link>https://forem.com/couchbase/building-a-multi-agent-rag-system-with-couchbase-crewai-and-nebius-ai-studio-a8b</link>
      <guid>https://forem.com/couchbase/building-a-multi-agent-rag-system-with-couchbase-crewai-and-nebius-ai-studio-a8b</guid>
      <description>&lt;p&gt;Traditional RAG systems typically follow a linear approach: retrieve documents, generate a response, and present it to the user. While effective, this approach can lack the nuanced understanding and specialized processing that complex queries often require. In contrast, AI agents introduce a more dynamic and intelligent workflow to RAG-based operations. They can collaborate, iterate, and critique each other’s outputs, enabling deeper understanding and more coherent responses. This agent-driven approach transforms RAG from a static process into an adaptive system capable of producing higher-quality, context-aware content.&lt;/p&gt;

&lt;p&gt;In this blog post, we’ll guide you through building a powerful semantic search engine using &lt;a href="https://www.couchbase.com/" rel="noopener noreferrer"&gt;Couchbase&lt;/a&gt; as the database, CrewAI for agent-based Retrieval-Augmented Generation (RAG) operations, and Nebius AI Studio for the LLM and embedding model.&lt;/p&gt;

&lt;h1&gt;
  
  
  What is CrewAI?
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://www.crewai.com/" rel="noopener noreferrer"&gt;CrewAI&lt;/a&gt; is an open-source Python framework that supports developing and managing multi-agent AI systems. &lt;/p&gt;

&lt;p&gt;What sets CrewAI apart is its ability to facilitate collaborative task handling and goal-driven execution, enabling agents to work together efficiently toward shared outcomes. It incorporates iterative thinking and feedback loops, allowing agents to refine their decisions and outputs continuously. It also offers seamless communication between the agents. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feonesba78924l2zzwrr8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Feonesba78924l2zzwrr8.png" alt="Image description" width="634" height="473"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Building a Multi-Agent RAG System with Couchbase and CrewAI
&lt;/h1&gt;

&lt;p&gt;The foundation of this example is a powerful semantic search engine using Couchbase as the vector database and CrewAI for agent-based RAG operations. CrewAI allows us to create specialized agents that can work together to handle different aspects of the RAG workflow, from document retrieval to response generation.&lt;/p&gt;

&lt;h1&gt;
  
  
  Setting Up LLM Components
&lt;/h1&gt;

&lt;p&gt;For this project, we chose &lt;a href="https://studio.nebius.com/" rel="noopener noreferrer"&gt;Nebius AI Studio&lt;/a&gt; as our LLM and embedding provider. Nebius AI Studio is a platform that simplifies the process of building applications using AI models. It provides a suite of open-source LLM and embedding models. &lt;/p&gt;

&lt;h2&gt;
  
  
  Extending the base OpenAI LLM
&lt;/h2&gt;

&lt;p&gt;For the CrewAI LLM, we replace the &lt;code&gt;base_url&lt;/code&gt; with the Nebius AI Studio endpoint, &lt;code&gt;https://api.studio.nebius.com/v1/&lt;/code&gt;, define the model we wish to use, and provide the API key from the &lt;code&gt;NEBIUS_AI_KEY&lt;/code&gt; environment variable:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Define the LLM with Nebius AI Studio
llm = LLM(
    model="openai/meta-llama/Meta-Llama-3.1-70B-Instruct",
    base_url="https://api.studio.nebius.com/v1/",
    api_key=os.getenv('NEBIUS_AI_KEY')
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Creating the NebiusEmbeddings Class
&lt;/h2&gt;

&lt;p&gt;LangChain doesn’t natively support Nebius’s API. To bridge this gap, we created a custom &lt;code&gt;NebiusEmbeddings&lt;/code&gt; class that extends LangChain’s &lt;code&gt;Embeddings&lt;/code&gt; interface, allowing seamless integration with Nebius’s embedding model:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from langchain_core.embeddings import Embeddings
from typing import List
import requests

class NebiusEmbeddings(Embeddings):
    def __init__(self, api_key: str, model: str = "BAAI/bge-en-icl"):
        self.api_key = api_key
        self.model = model
        self.base_url = "https://api.studio.nebius.com/v1/embeddings"

    def embed_documents(self, texts: List[str]) -&amp;gt; List[List[float]]:
        """Embed a list of documents using Nebius AI Studio."""
        headers = {
            "Authorization": f"Bearer {self.api_key}",
            "Content-Type": "application/json"
        }

        payload = {
            "model": self.model,
            "input": texts
        }

        response = requests.post(self.base_url, json=payload, headers=headers)
        response.raise_for_status()

        data = response.json()
        return [item["embedding"] for item in data["data"]]

    def embed_query(self, text: str) -&amp;gt; List[float]:
        """Embed a single query using Nebius AI Studio."""
        return self.embed_documents([text])[0]

embeddings = NebiusEmbeddings(
    api_key=os.getenv('NEBIUS_AI_KEY'),  # Read from the NEBIUS_AI_KEY environment variable
    model="intfloat/e5-mistral-7b-instruct"
)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This class implements &lt;code&gt;embed_documents&lt;/code&gt; for processing batches of text and &lt;code&gt;embed_query&lt;/code&gt; for individual search queries, interacting directly with the Nebius API endpoint. Nebius AI Studio supports various embedding models, including BGE and E5-mistral. &lt;/p&gt;
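&lt;p&gt;To make the request/response contract concrete without a live API call, here is a minimal, self-contained sketch of the OpenAI-style &lt;code&gt;/embeddings&lt;/code&gt; response shape that &lt;code&gt;embed_documents&lt;/code&gt; parses. The stub response below is illustrative data only, not real Nebius output:&lt;/p&gt;

```python
# Illustrative sketch: the OpenAI-style /embeddings response shape that
# embed_documents parses. The stub below is fake data, not a live Nebius call.

def parse_embedding_response(data: dict) -> list[list[float]]:
    # Same extraction as in embed_documents: one vector per input text,
    # in the order the texts were sent.
    return [item["embedding"] for item in data["data"]]

# Stubbed response for two input texts (toy 3-dimensional vectors)
stub_response = {
    "object": "list",
    "data": [
        {"object": "embedding", "index": 0, "embedding": [0.1, 0.2, 0.3]},
        {"object": "embedding", "index": 1, "embedding": [0.4, 0.5, 0.6]},
    ],
}

vectors = parse_embedding_response(stub_response)
print(len(vectors))     # number of input texts
print(len(vectors[0]))  # embedding dimensionality (toy value here)
```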

&lt;h2&gt;
  
  
  Setting Up the Couchbase Vector Store
&lt;/h2&gt;

&lt;p&gt;We initialize the Couchbase vector store and ingest the articles in batches:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from langchain_couchbase.vectorstores import CouchbaseVectorStore

vector_store = CouchbaseVectorStore(
    cluster=cluster,
    bucket_name=CB_BUCKET_NAME,
    scope_name=SCOPE_NAME,
    collection_name=COLLECTION_NAME,
    embedding=embeddings,
    index_name=INDEX_NAME
)

batch_size = 100
articles = [article for article in unique_articles if len(article) &amp;lt;= 50000]
vector_store.add_texts(texts=articles, batch_size=batch_size)
logging.info("Document ingestion completed successfully.")

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Creating a Vector Search Tool with CrewAI
&lt;/h2&gt;

&lt;p&gt;We create a vector search tool to allow CrewAI agents to retrieve relevant documents:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from crewai.tools import tool

retriever = vector_store.as_retriever(search_type="similarity")

@tool("vector_search")
def search_tool(query: str) -&amp;gt; str:
    """Search for relevant documents using vector similarity."""
    docs = retriever.invoke(query)
    return "\n\n".join([f"Document {i+1}:\n{doc.page_content}" for i, doc in enumerate(docs)])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This tool:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Uses the vector store’s retriever to perform similarity searches.&lt;/li&gt;
&lt;li&gt;Formats results for easy consumption by AI agents.&lt;/li&gt;
&lt;/ol&gt;
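&lt;p&gt;To see the formatting step in isolation, here is a small self-contained sketch of the same join that &lt;code&gt;search_tool&lt;/code&gt; performs; the &lt;code&gt;Doc&lt;/code&gt; dataclass stands in for LangChain’s &lt;code&gt;Document&lt;/code&gt; so the snippet runs without a vector store:&lt;/p&gt;

```python
from dataclasses import dataclass

# Stand-in for LangChain's Document, just enough to show the formatting.
@dataclass
class Doc:
    page_content: str

def format_docs(docs: list[Doc]) -> str:
    # Same join as in search_tool: number each document and separate
    # them with blank lines so the agent can tell results apart.
    return "\n\n".join(
        f"Document {i+1}:\n{doc.page_content}" for i, doc in enumerate(docs)
    )

docs = [Doc("FA Cup third round draw details..."), Doc("Match preview...")]
print(format_docs(docs))
# Document 1:
# FA Cup third round draw details...
#
# Document 2:
# Match preview...
```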

&lt;h1&gt;
  
  
  Defining the Agents
&lt;/h1&gt;

&lt;p&gt;The Research Expert uses the vector store to retrieve relevant documents using the vector_search tool. The Technical Writer agent, also powered by the same model, takes the research output and generates a well-structured, reader-friendly response.&lt;/p&gt;

&lt;p&gt;Here’s how we set up the agents:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;from crewai import Agent, Task, Crew, Process

# Research Expert Agent
researcher = Agent(
    role="Research Expert",
    goal="Retrieve and analyze relevant information from the vector store to answer user queries accurately.",
    backstory="A skilled data analyst with expertise in semantic search and information retrieval.",
    tools=[search_tool],
    llm=llm,
    verbose=True
)

# Technical Writer Agent
writer = Agent(
    role="Technical Writer",
    goal="Craft clear, concise, and well-structured responses based on research findings.",
    backstory="An experienced writer specializing in turning complex data into engaging narratives.",
    llm=llm,
    verbose=True
)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Defining Tasks
&lt;/h2&gt;

&lt;p&gt;We define tasks for each agent to ensure a clear division of work:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Research Task
research_task = Task(
        description=f"Research and analyze information relevant to: {query}",
        agent=researcher,
        expected_output="A detailed analysis with key findings and supporting evidence"
    )

# Writing Task
writing_task = Task(
        description="Create a comprehensive and well-structured response",
        agent=writer,
        expected_output="A clear, comprehensive response that answers the query",
        context=[research_task]
    )

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;Crew&lt;/code&gt; object orchestrates the agents in a sequential process, where the Research Expert completes its task before passing the output to the Technical Writer.&lt;/p&gt;

&lt;h2&gt;
  
  
  Running the Crew!
&lt;/h2&gt;

&lt;p&gt;With agents and tasks defined, it's time to kick off the crew. To demonstrate the agent workflow, we process a sample query about the FA Cup third round draw:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def process_query(query, researcher, writer):
    crew = Crew(
        agents=[researcher, writer],
        tasks=[research_task, writing_task],
        process=Process.sequential,
        verbose=True,
        cache=True,
        planning=True
    )
    return crew.kickoff(inputs={"query": query})

query = "What are the key details about the FA Cup third round draw? Include information about Manchester United vs Arsenal, Tamworth vs Tottenham, and other notable fixtures."
result = process_query(query, researcher, writer)
print(result)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This query triggers the following workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;The Research Expert uses the vector_search tool to retrieve relevant BBC News articles from the Couchbase vector store, leveraging Nebius’s embeddings for semantic similarity.&lt;/li&gt;
&lt;li&gt;The agent analyzes the retrieved documents and generates a summary of key details, such as the historical context of Manchester United vs Arsenal and the potential for an upset in Tamworth vs Tottenham.&lt;/li&gt;
&lt;li&gt;The Technical Writer takes this summary and crafts a polished article.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The console output during &lt;code&gt;crew.kickoff()&lt;/code&gt; shows each agent thinking, using tools, and passing information to the next agent. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fohgam7hl3n9rdvti3wjb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fohgam7hl3n9rdvti3wjb.png" alt="Image description" width="800" height="577"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2l57nw433o8lr8kuggy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2l57nw433o8lr8kuggy.png" alt="Image description" width="800" height="652"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Final Thoughts
&lt;/h1&gt;

&lt;p&gt;Building a semantic search engine with Couchbase, CrewAI, and Nebius AI Studio is a powerful way to leverage AI Agents to build more robust and accurate RAG applications. Couchbase’s vector search enables efficient semantic retrieval, CrewAI’s agent-based architecture streamlines the RAG workflow, and Nebius AI Studio’s models provide high-quality embeddings and language generation. &lt;/p&gt;

&lt;p&gt;Agentic RAG provides a powerful foundation for building intelligent applications that can understand and respond to user queries with high relevance and accuracy.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Building Multi-Agent Workflows using Mastra AI and Couchbase</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Sun, 18 May 2025 00:43:41 +0000</pubDate>
      <link>https://forem.com/couchbase/building-multi-agent-workflows-using-mastra-ai-and-couchbase-198n</link>
      <guid>https://forem.com/couchbase/building-multi-agent-workflows-using-mastra-ai-and-couchbase-198n</guid>
      <description>&lt;p&gt;TL;DR: In this post, we explore how to use Mastra AI — a TypeScript-native agent framework — with Couchbase Vector Search to build a production-ready, multi-agent RAG (Retrieval-Augmented Generation) blog-writing assistant.&lt;/p&gt;

&lt;p&gt;2025 has marked a significant shift toward agent-based AI systems. There have been a large number of AI Agent frameworks released in the past few months. One such powerful agent framework is Mastra.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://mastra.ai" rel="noopener noreferrer"&gt;Mastra&lt;/a&gt; is an open-source TypeScript agent framework. You can use Mastra to build AI agents that have memory, can execute functions, or chain LLM calls in deterministic workflows. You can also feed them application-specific knowledge using RAG. Mastra also comes with built-in support for running evals and observability, making it very suitable for production use cases.&lt;/p&gt;

&lt;h2&gt;
  
  
  What are Mastra Workflows
&lt;/h2&gt;

&lt;p&gt;There has been growing complexity in building robust AI applications. What starts as a simple LLM call quickly becomes extremely complex with a large number of agents, prompts, and coordination logic among agents. This is where &lt;a href="https://mastra.ai/en/docs/workflows/overview" rel="noopener noreferrer"&gt;Mastra workflows&lt;/a&gt; come in. The Mastra Workflows bring structure, reliability, and developer-friendly patterns to building generative AI applications.&lt;/p&gt;

&lt;p&gt;Mastra workflows are graph-based state machines that allow you to orchestrate complex sequences of AI operations. Workflows let you define discrete steps with clear inputs, outputs, and execution logic.&lt;/p&gt;

&lt;p&gt;This functionality combines the flexibility of TypeScript with the structure of a workflow engine.&lt;/p&gt;

&lt;p&gt;With workflows providing orchestration, the next crucial component is knowledge retrieval. That’s where Couchbase Vector Store fits in.&lt;/p&gt;

&lt;h2&gt;
  
  
  Couchbase Vector Store for Mastra
&lt;/h2&gt;

&lt;p&gt;Mastra now natively supports Couchbase as a vector store for RAG workflows.&lt;/p&gt;

&lt;p&gt;In Mastra, you can process your documents into chunks, create embeddings, store them in a vector database, and then retrieve relevant context at query time.&lt;/p&gt;

&lt;p&gt;Setting up the Couchbase vector store is relatively simple:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Initialize Couchbase connection
const couchbaseStore = new CouchbaseVector({
  connectionString: process.env.CB_CONNECTION_STRING,
  username: process.env.CB_USERNAME,
  password: process.env.CB_PASSWORD,
  bucket: process.env.CB_BUCKET,
  scope: process.env.CB_SCOPE,
  collection: process.env.CB_COLLECTION
});

// Create vector search index
await couchbaseStore.createIndex({
  indexName: "research_embeddings", // name of Couchbase Vector Search Index
  dimension: 1536, // Embedding dimensionality 
  similarity: "cosine"
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note: When setting up vector search in Couchbase, developers can optimize for either recall (accuracy) or latency based on application needs. The system supports vectors with dimensions up to 4096, making it compatible with most modern embedding models.&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementing a Multi-Agent Research-Writer Workflow using Mastra AI and Couchbase
&lt;/h2&gt;

&lt;p&gt;The foundation of this example is a multi-agent architecture in which the user initially provides a search query. Based on this, the Researcher Agent performs RAG, leveraging Couchbase vector search to retrieve relevant documents. Then, the Writer Agent transforms the research output into polished content.&lt;/p&gt;

&lt;p&gt;You can check out the GitHub repo for this project &lt;a href="https://github.com/shivay-couchbase/mastra-ai-couchbase-research-writer-agent" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a Research Agent
&lt;/h3&gt;

&lt;p&gt;The research agent takes an input query from the user and retrieves relevant information using RAG:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const researchAgent = new Agent({
  name: 'Research Agent',
  instructions: `You are a research assistant that analyzes academic papers.
    Find relevant information using vector search and provide accurate answers.`,
  model: openai('gpt-4o-mini'),
  tools: {
    vectorQueryTool: createVectorQueryTool({
      vectorStoreName: 'couchbaseStore',
      indexName: 'research_embeddings',
      model: openai.embedding('text-embedding-3-small'),
    }),
  },
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This agent uses OpenAI's GPT-4o-mini model and is equipped with a vector query tool that allows it to search through embedded documents stored in Couchbase.&lt;/p&gt;

&lt;h3&gt;
  
  
  Creating a Writer Agent
&lt;/h3&gt;

&lt;p&gt;The writer agent takes the research output and transforms it into polished content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { Agent } from "@mastra/core/agent";
import { openai } from "@ai-sdk/openai";

export const writerAgent = new Agent({
  name: "Writer Assistant",
  instructions: `You are a professional blog writer that creates engaging content. 
    Your task is to write a well-structured blog post based on the research provided.
    Focus on creating high-quality, informative, and engaging content.
    Make sure to maintain a clear narrative flow and use the research effectively.
    Focus on the specific content available in the tool and acknowledge if you cannot find sufficient information to answer a question.
    Base your responses only on the content provided, not on general knowledge.`,
  model: openai("gpt-4o-mini"),
});

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Creating a Writer Workflow using Mastra Workflows
&lt;/h3&gt;

&lt;p&gt;As mentioned previously, Mastra’s workflow system provides a standardized way to define steps and link them together. In this example, we create a workflow for the Researcher and Writer Agents to connect these agents in a sequential process.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { Step, Workflow } from "@mastra/core/workflows";
import { z } from "zod";
import { researchAgent } from "../agents/researchAgent";
import { writerAgent } from "../agents/writerAgent";

// Define the research step
const researchStep = new Step({
  id: "researchStep",
  execute: async ({ context }) =&amp;gt; {
    if (!context?.triggerData?.query) {
      throw new Error("Query not found in trigger data");
    }
    const result = await researchAgent.generate(
      `Research information for a blog post about: ${context.triggerData.query}`
    );
    console.log("Research result:", result.text);
    return {
      research: result.text,
    };
  },
});

// Define the writing step
const writingStep = new Step({
  id: "writingStep",
  execute: async ({ context }) =&amp;gt; {
    const research = context?.getStepResult&amp;lt;{ research: string }&amp;gt;("researchStep")?.research;
    if (!research) {
      throw new Error("Research not found from previous step");
    }

    const result = await writerAgent.generate(
      `Write a blog post using this research: ${research}. Focus on the specific content available in the tool and acknowledge if you cannot find sufficient information to answer a question.
    Base your responses only on the content provided, not on general knowledge.`
    );
    console.log("Writing result:", result.text);
    return {
      blogPost: result.text,
      research: research,
    };
  },
});

// Create and configure the workflow
export const blogWorkflow = new Workflow({
  name: "blog-workflow",
  triggerSchema: z.object({
    query: z.string().describe("The topic to research and write about"),
  }),
});

// Run steps sequentially
blogWorkflow.step(researchStep).then(writingStep).commit(); 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This code defines the workflow with two steps: first research, then writing. Data flows between the steps, with the research output feeding directly into the writing process. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczo74ib2y9cii24jojmu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fczo74ib2y9cii24jojmu.png" alt="Image description" width="800" height="406"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;The Couchbase Vector Store integration with Mastra AI enables you to build scalable and production-ready AI agents, thanks to Mastra's agent orchestration capabilities and Couchbase's vector search.&lt;/p&gt;

&lt;p&gt;As AI agent adoption continues to evolve, we can expect even more powerful AI systems that combine the strengths of multiple specialized agents and knowledge retrieval mechanisms.&lt;/p&gt;

</description>
      <category>couchbase</category>
      <category>rag</category>
      <category>mastra</category>
      <category>genai</category>
    </item>
    <item>
      <title>Elasticsearch to Meilisearch migration guide</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Fri, 07 Apr 2023 10:06:14 +0000</pubDate>
      <link>https://forem.com/shivaylamba/elasticsearch-migration-guide-383p</link>
      <guid>https://forem.com/shivaylamba/elasticsearch-migration-guide-383p</guid>
      <description>&lt;h1&gt;
  
  
  Elasticsearch migration guide
&lt;/h1&gt;

&lt;p&gt;&lt;strong&gt;Contents&lt;/strong&gt;:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;Differences&lt;/li&gt;
&lt;li&gt;Migration script&lt;/li&gt;
&lt;li&gt;Front-end components &lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;em&gt;If you are currently using Elasticsearch and plan to migrate to Meilisearch for your application, follow this guide to help you with the transition.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Differences
&lt;/h2&gt;

&lt;p&gt;Before migrating, you may want to understand better what differentiates Elasticsearch from Meilisearch.&lt;/p&gt;

&lt;p&gt;Elasticsearch is a distributed search engine that uses clusters and nodes to optimize performance, whereas Meilisearch runs as a single instance/node with customizable startup options. Meilisearch is still evolving, and some functionality is not yet supported. You can think of Meilisearch as similar to the Elasticsearch component in the ELK Stack, but without the other components such as Kibana, Logstash, and Beats.&lt;/p&gt;

&lt;p&gt;You can see what the Meilisearch team is working on by visiting the &lt;a href="https://roadmap.meilisearch.com/tabs/4-in-progress"&gt;public roadmap&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Migration Script
&lt;/h2&gt;

&lt;p&gt;This guide shows you how to migrate data from Elasticsearch to Meilisearch using Node.js. The language you use for the migration has no bearing on which language you use with Meilisearch afterward.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Initialize project&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Start by creating a directory &lt;code&gt;elastic-meilisearch-migration&lt;/code&gt; and initialize it as an npm project:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;mkdir &lt;/span&gt;elastic-meilisearch-migration
&lt;span class="nb"&gt;cd &lt;/span&gt;elastic-meilisearch-migration
npm init &lt;span class="nt"&gt;-y&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, create a &lt;code&gt;script.js&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;touch &lt;/span&gt;script.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This file will contain our migration code.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Install required packages&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To get started, you'll need two npm packages. The first is &lt;code&gt;@elastic/elasticsearch&lt;/code&gt;, the JavaScript client for the Elasticsearch API, and the second is &lt;code&gt;meilisearch&lt;/code&gt;, the JavaScript client for the Meilisearch API. To install them, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm &lt;span class="nb"&gt;install&lt;/span&gt; &lt;span class="nt"&gt;--save&lt;/span&gt; @elastic/elasticsearch meilisearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;strong&gt;Create client objects&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;We need client instances of both Elasticsearch and Meilisearch to access their respective APIs. Paste the following code into &lt;code&gt;script.js&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;Client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;ElasticSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@elastic/elasticsearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;MeiliSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;meilisearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;esClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;ElasticSearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;cloud&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_CLOUD_ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_USER_NAME&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_PASSWORD&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;meiliClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;MeiliSearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_HOST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;First, we create an Elasticsearch client &lt;code&gt;esClient&lt;/code&gt; using &lt;a href="https://www.elastic.co/cloud/"&gt;Elastic Cloud&lt;/a&gt; credentials. You can also try other &lt;a href="https://www.elastic.co/guide/en/elasticsearch/client/javascript-api/current/client-connecting.html#client-connecting"&gt;authentication methods&lt;/a&gt;. Replace &lt;code&gt;ES_CLOUD_ID&lt;/code&gt;, &lt;code&gt;ES_USER_NAME&lt;/code&gt;, and &lt;code&gt;ES_PASSWORD&lt;/code&gt; with your own values. &lt;/p&gt;

&lt;p&gt;Then, we create a Meilisearch client &lt;code&gt;meiliClient&lt;/code&gt; by providing the host URL and API key. Likewise, replace &lt;code&gt;MEILI_HOST&lt;/code&gt; and &lt;code&gt;MEILI_API_KEY&lt;/code&gt; with their respective values.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Fetch data from Elasticsearch&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This step is more involved: we iterate over the index until every document has been fetched. In each iteration, we search for the next 10,000 documents, format them, and store them. The iteration ends when a search returns an empty result. To paginate the search results this way, we use the &lt;a href="https://www.elastic.co/guide/en/elasticsearch/reference/current/paginate-search-results.html#search-after"&gt;Search After&lt;/a&gt; method.&lt;/p&gt;

&lt;p&gt;This logic is captured in the code below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nx"&gt;getElasticSearchDocuments&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchObject&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;index&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_INDEX&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;match_all&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{},&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;sort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;id.keyword&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;asc&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;search_after&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;searchObject&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;search_after&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;esClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;searchObject&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;

    &lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
    &lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]?.&lt;/span&gt;&lt;span class="nx"&gt;sort&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
    &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;concat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;hit&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;_source&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;ES_INDEX&lt;/code&gt; with the index name you want to export.&lt;/p&gt;

&lt;h3&gt;
  
  
  Upload data to Meilisearch
&lt;/h3&gt;

&lt;p&gt;Add the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;getElasticSearchDocuments&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;meiliIndex&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;meiliClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_INDEX&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;meiliIndex&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;addDocuments&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The variable &lt;code&gt;documents&lt;/code&gt; contains an array of documents ready to be uploaded to Meilisearch.&lt;/p&gt;

&lt;p&gt;After that, we get the index instance and upload all the documents with the &lt;code&gt;addDocuments&lt;/code&gt; method. Meilisearch will create the index if it doesn't already exist. &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note: The &lt;code&gt;addDocuments&lt;/code&gt; method is &lt;a href="https://docs.meilisearch.com/learn/advanced/asynchronous_operations.html#asynchronous-operations"&gt;asynchronous&lt;/a&gt;, which means the request is not processed as soon as it is received; instead, it is added to a queue and handled in order.&lt;/p&gt;
&lt;/blockquote&gt;
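&lt;p&gt;Because indexing is queued, you may want the script to wait until the documents are actually searchable before exiting. Here is a minimal sketch, assuming meilisearch-js v0.28 or later, where &lt;code&gt;addDocuments&lt;/code&gt; returns a task descriptor with a &lt;code&gt;taskUid&lt;/code&gt;; if you have an older client installed, the method and field names differ, so check the docs for your version:&lt;/p&gt;

```javascript
// Sketch: wait for the queued indexing task to finish (assumes
// meilisearch-js v0.28+, where addDocuments returns { taskUid }).
// `client` and `index` are the Meilisearch client and index from above.
async function addAndWait(client, index, documents) {
  const task = await index.addDocuments(documents)
  // waitForTask polls the task until it reaches a terminal status
  return client.waitForTask(task.taskUid)
}
```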

&lt;p&gt;Replace &lt;code&gt;MEILI_INDEX&lt;/code&gt; with the name of the index where you would like the documents to be added.&lt;/p&gt;

&lt;p&gt;That's all! When you're ready to run the script, enter the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;node script.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Finished Script
&lt;/h3&gt;

&lt;p&gt;Here's the complete script.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;Client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;ElasticSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@elastic/elasticsearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;MeiliSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;meilisearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;esClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;ElasticSearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;cloud&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_CLOUD_ID&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="na"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;username&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_USER_NAME&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;password&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_PASSWORD&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;meiliClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;MeiliSearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_HOST&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_API_KEY&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nx"&gt;getElasticSearchDocuments&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;

  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchObject&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;index&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;ES_INDEX&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;10000&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;match_all&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{},&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="na"&gt;sort&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;id.keyword&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;asc&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;

  &lt;span class="k"&gt;do&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;search_after&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="nx"&gt;searchObject&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;search_after&lt;/span&gt;

    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;esClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;searchObject&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;

    &lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;
    &lt;span class="nx"&gt;search_after&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;]?.&lt;/span&gt;&lt;span class="nx"&gt;sort&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;
    &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;concat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;hit&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;hit&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;_source&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;isFetchingComplete&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="p"&gt;;(&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;getElasticSearchDocuments&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;meiliIndex&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;meiliClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;MEILI_INDEX&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;meiliIndex&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;addDocuments&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;})()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Front-end components
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/meilisearch/instant-meilisearch"&gt;Instant Meilisearch&lt;/a&gt; is a plugin connecting your Meilisearch instance with InstantSearch, giving you access to many (but not all) of the same front-end components as Algolia users. Here is &lt;a href="https://github.com/meilisearch/instant-meilisearch/#-api-resources"&gt;an up-to-date list of components compatible with Instant Meilisearch&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In summary, we have discussed the primary advantages and limitations of both search engines, but we have only scratched the surface. The functionalities of Meilisearch and Elasticsearch go well beyond what this article covers. For a comprehensive comparison of Meilisearch, Elasticsearch, and other search engines, please consult the dedicated section in our documentation.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>search</category>
      <category>elasticsearch</category>
    </item>
    <item>
      <title>Integrate a seamless search experience in a Next.js e-commerce application with Meilisearch</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Sat, 01 Apr 2023 02:52:24 +0000</pubDate>
      <link>https://forem.com/shivaylamba/integrate-a-seamless-search-experience-in-an-nextjs-e-commerce-application-31kl</link>
      <guid>https://forem.com/shivaylamba/integrate-a-seamless-search-experience-in-an-nextjs-e-commerce-application-31kl</guid>
      <description>&lt;p&gt;Drastically improve your users' product discovery experience with our easy-to-use and easy-to-deploy search engine.&lt;/p&gt;

&lt;h3&gt;
  
  
  Introduction
&lt;/h3&gt;

&lt;p&gt;Search is a crucial feature of any e-commerce application and directly affects conversions. A good search experience requires quick and accurate results, but building one takes time and developer resources. This is where Meilisearch comes in: an open-source search engine that is lightning fast, hyper-relevant, and typo-tolerant, with little to no setup time.&lt;/p&gt;

&lt;p&gt;In this tutorial, we'll add a seamless search experience with Meilisearch to a light e-commerce application. We will import a list of 1,000 products to Meilisearch. Users can search through these products and benefit from advanced features like filtering and sorting options.&lt;/p&gt;

&lt;p&gt;Upon completion, you will have an app similar to our demo application at &lt;a href="http://ecommerce.meilisearch.com/" rel="noopener noreferrer"&gt;http://ecommerce.meilisearch.com&lt;/a&gt;, which provides rapid search results on a catalog of over 1.4 million products.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://nodejs.org" rel="noopener noreferrer"&gt;NodeJs&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://nextjs.org/" rel="noopener noreferrer"&gt;Next.js&lt;/a&gt; - React Framework &lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.meilisearch.com/" rel="noopener noreferrer"&gt;Meilisearch&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h1&gt;
  
  
  Getting Started
&lt;/h1&gt;

&lt;p&gt;Let's get started with the installation of the necessary tools.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Install and launch Meilisearch
&lt;/h3&gt;

&lt;p&gt;There are many ways to &lt;a href="https://docs.meilisearch.com/learn/getting_started/quick_start.html#download-and-launch" rel="noopener noreferrer"&gt;install Meilisearch&lt;/a&gt;. One of them is to use &lt;a href="https://curl.se/" rel="noopener noreferrer"&gt;cURL&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;You can paste the following code into your terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# Install Meilisearch&lt;/span&gt;
curl &lt;span class="nt"&gt;-L&lt;/span&gt; https://install.meilisearch.com | sh

&lt;span class="c"&gt;# Launch Meilisearch&lt;/span&gt;
./meilisearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;These commands install Meilisearch and launch a local server at &lt;code&gt;http://localhost:7700/&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Adding data to Meilisearch
&lt;/h3&gt;

&lt;p&gt;Create and navigate to a new folder called &lt;code&gt;seed&lt;/code&gt;. Then use the following command to install &lt;a href="https://github.com/meilisearch/meilisearch-js" rel="noopener noreferrer"&gt;Meilisearch's JavaScript client&lt;/a&gt; with npm.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;install&lt;/span&gt; &lt;span class="nx"&gt;meilisearch&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We have gathered a list of 1,000 products from various Amazon datasets and compiled them in a &lt;code&gt;data.json&lt;/code&gt; file. We will add these products to the Meilisearch index.&lt;/p&gt;

&lt;p&gt;You can download this &lt;code&gt;data.json&lt;/code&gt; file from &lt;a href="https://github.com/shivaylamba/demos/tree/main/src/ecommerce/seed/data.json" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Each record is associated with a product. Each product has a &lt;em&gt;brand, category, tag, price, rating&lt;/em&gt;, and other related information. We will make these attributes sortable and filterable in our Meilisearch instance via the &lt;a href="https://docs.meilisearch.com/reference/api/filterable_attributes.html#update-filterable-attributes"&gt;&lt;code&gt;updateFilterableAttributes&lt;/code&gt;&lt;/a&gt; and &lt;a href="https://docs.meilisearch.com/reference/api/sortable_attributes.html#update-sortable-attributes"&gt;&lt;code&gt;updateSortableAttributes&lt;/code&gt;&lt;/a&gt; methods.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="cm"&gt;/* Here's an example of a product record in the data.json file */&lt;/span&gt;

&lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;id&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;711decb2a3fdcbbe44755afc5af25e2f&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;title&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Kitchenex Stainless Steel Flatware Pie Server and Pie Cutter Set of 2&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;description&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;The Kitchen Stainless Pie Server is both flashy and chic...&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Home &amp;amp; Kitchen&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;brand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Dr. Pet&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;16.84&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Kitchen&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;rating&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mf"&gt;4.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;reviews_count&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;7&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="mi"&gt;999&lt;/span&gt; &lt;span class="nx"&gt;more&lt;/span&gt; &lt;span class="nx"&gt;items&lt;/span&gt;
&lt;span class="p"&gt;]&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s create a script to add the data from the &lt;code&gt;data.json&lt;/code&gt; file to a &lt;code&gt;products&lt;/code&gt; index in Meilisearch. Here’s the code of the script:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;MeiliSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;meilisearch&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;MeiliSearch&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
  &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;http://localhost:7700&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// No API key has been set&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;INDEX_NAME&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;products&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;INDEX_NAME&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./data.json&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Adding Filterable and Sortable Attributes to "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;INDEX_NAME&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;updateFilterableAttributes&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;brand&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;tag&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;rating&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;reviews_count&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="p"&gt;]);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;updateSortableAttributes&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;reviews_count&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;rating&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;price&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]);&lt;/span&gt;

  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Adding Documents to "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;INDEX_NAME&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;updateDocuments&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Documents Added&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;})();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;If you called your script &lt;code&gt;seed.js&lt;/code&gt;, you could run the command below to start adding data to your Meilisearch instance:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;node seed.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Wait until the data is ingested into the Meilisearch instance; this usually takes a few seconds. You can then open &lt;code&gt;http://localhost:7700/&lt;/code&gt; to browse the list of 1,000 products in Meilisearch.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmys4x6y57w2crrtj0mg5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmys4x6y57w2crrtj0mg5.png" alt="Image description" width="800" height="384"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Set up the project
&lt;/h3&gt;

&lt;p&gt;Let's build an e-commerce application in &lt;a href="https://nextjs.org/" rel="noopener noreferrer"&gt;Next.js&lt;/a&gt;. &lt;em&gt;Next.js&lt;/em&gt; is an open-source React framework that enables server-side rendering and static website generation.&lt;/p&gt;

&lt;p&gt;We can set up a Next.js application using the &lt;a href="https://nextjs.org/docs/api-reference/create-next-app" rel="noopener noreferrer"&gt;create-next-app&lt;/a&gt; tool. Navigate to your project base directory in the terminal, and use the following command to install a &lt;em&gt;Next.js&lt;/em&gt; application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx create-next-app@latest ecommerce
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It may take a few minutes to complete. This command will create a folder named &lt;code&gt;ecommerce&lt;/code&gt; in your project base directory with all the boilerplate code. Navigate to the &lt;code&gt;ecommerce&lt;/code&gt; folder and install the following npm libraries:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;code&gt;react-instantsearch-dom&lt;/code&gt; - an open-source library that uses &lt;a href="https://www.algolia.com/doc/guides/building-search-ui/what-is-instantsearch/js" rel="noopener noreferrer"&gt;InstantSearch.js&lt;/a&gt; to quickly build search interfaces in front-end applications.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;@meilisearch/instant-meilisearch&lt;/code&gt; - a search client for &lt;code&gt;InstantSearch.js&lt;/code&gt; that establishes communication between &lt;em&gt;Meilisearch&lt;/em&gt; and &lt;em&gt;InstantSearch&lt;/em&gt;.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can install these libraries using the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npm i &lt;span class="nt"&gt;--save&lt;/span&gt; react-instantsearch-dom @meilisearch/instant-meilisearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You may add some pre-defined stylesheet files. You can download all the CSS modules from &lt;a href="https://github.com/shivaylamba/demos/tree/main/src/ecommerce/styles" rel="noopener noreferrer"&gt;here&lt;/a&gt; and paste all the files into the &lt;code&gt;styles&lt;/code&gt; folder in the base directory.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Important Note:&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As of Next.js v12 and above, &lt;code&gt;react-instantsearch-dom&lt;/code&gt; does not work as expected when &lt;a href="https://reactjs.org/docs/strict-mode.html" rel="noopener noreferrer"&gt;React’s Strict Mode&lt;/a&gt; is enabled (see &lt;a href="https://discourse.algolia.com/t/react-instantsearch-not-working/15053/2" rel="noopener noreferrer"&gt;this discussion&lt;/a&gt;). A quick fix is to disable Strict Mode by adding &lt;code&gt;reactStrictMode: false&lt;/code&gt; to &lt;code&gt;next.config.js&lt;/code&gt;, which you can find at the root of your project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="cm"&gt;/* next.config.js */&lt;/span&gt;

&lt;span class="nx"&gt;module&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;exports&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;reactStrictMode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;false&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="c1"&gt;// set this to false&lt;/span&gt;
  &lt;span class="na"&gt;images&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;domains&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Restart the Next.js server to apply these changes.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Building and integrating components
&lt;/h3&gt;

&lt;p&gt;We have completed the initial step of seeding the data to the Meilisearch instance and installing the necessary tools. From here, we will start integrating the components required for creating a search experience in the application.&lt;/p&gt;

&lt;h3&gt;
  
  
  4.1 Adding search box and connecting components to Meilisearch
&lt;/h3&gt;

&lt;p&gt;We can create a navigation bar that contains a search box to perform searches within the application. We will use the &lt;code&gt;SearchBox&lt;/code&gt; component from the &lt;code&gt;react-instantsearch-dom&lt;/code&gt; library to do so.&lt;/p&gt;

&lt;p&gt;You can create a &lt;code&gt;navbar.jsx&lt;/code&gt; file in the &lt;code&gt;components/layout&lt;/code&gt; folder and use the following code in it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;SearchBox&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-instantsearch-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;styles&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../../styles/nav-bar.module.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;NavBar&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;container&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;SearchBox&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;NavBar&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We must connect the &lt;code&gt;SearchBox&lt;/code&gt; component to Meilisearch. We will create a &lt;code&gt;searchClient&lt;/code&gt; with the necessary credentials and pass it to the &lt;code&gt;InstantSearch&lt;/code&gt; component as a prop along with the index name.&lt;/p&gt;

&lt;p&gt;To do this, you can create a &lt;code&gt;layout.jsx&lt;/code&gt; file in the &lt;code&gt;components/layout/&lt;/code&gt; folder and use the following code in it:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;React&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;NavBar&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./navbar&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;InstantSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-instantsearch-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;instantMeiliSearch&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;@meilisearch/instant-meilisearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;searchClient&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;instantMeiliSearch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;http://localhost:7700&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Layout&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;children&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;InstantSearch&lt;/span&gt; &lt;span class="na"&gt;indexName&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"products"&lt;/span&gt; &lt;span class="na"&gt;searchClient&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchClient&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;NavBar&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;children&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nc"&gt;InstantSearch&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;Layout&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;Next.js&lt;/code&gt; uses the &lt;a href="https://nextjs.org/docs/advanced-features/custom-app" rel="noopener noreferrer"&gt;App&lt;/a&gt; component to initialize pages, so &lt;code&gt;_app.js&lt;/code&gt; wraps every page that gets rendered. We will add the &lt;code&gt;Layout&lt;/code&gt; component and a few stylesheet imports to the &lt;code&gt;_app.js&lt;/code&gt; file. Wrapping pages in &lt;code&gt;Layout&lt;/code&gt; makes the search state available to all the child components.&lt;/p&gt;

&lt;p&gt;You may replace the code in the &lt;code&gt;_app.js&lt;/code&gt; file with the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Layout&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;../components/layout/layout&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;../styles/globals.css&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;../styles/searchBoxAIS.css&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;MyApp&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;Component&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;pageProps&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Layout&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Component&lt;/span&gt; &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;pageProps&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nc"&gt;Layout&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;MyApp&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4.2 Creating various search filters
&lt;/h3&gt;

&lt;p&gt;Let’s add a filter functionality to the application. We will need a few components from the &lt;code&gt;react-instantsearch-dom&lt;/code&gt; library:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;RefinementList&lt;/code&gt; component can be used to filter the data based on &lt;a href="https://docs.meilisearch.com/learn/advanced/filtering_and_faceted_search.html" rel="noopener noreferrer"&gt;facets&lt;/a&gt;. It also displays the number of items matching each facet value.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RefinementList&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"category"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;RatingMenu&lt;/code&gt; component creates a rating-based filter list. We need to define the maximum rating; the component then displays a menu of options &lt;em&gt;(from 1 star up to the maximum)&lt;/em&gt; for selecting a rating. Note that it only works with integers.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt; &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RatingMenu&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"rating"&lt;/span&gt; &lt;span class="na"&gt;max&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;ClearRefinements&lt;/code&gt; component provides a button that clears all the applied filters. This is handy when many filters are applied and we want to remove them all at once.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;ClearRefinements&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;/ol&gt;
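&lt;p&gt;Under the hood, these refinement components are translated by &lt;code&gt;instant-meilisearch&lt;/code&gt; into Meilisearch filter expressions over the filterable attributes we configured while seeding. As a rough sketch of what such an expression looks like, here is a small &lt;em&gt;hypothetical&lt;/em&gt; helper (not part of either library) that assembles one:&lt;/p&gt;

```javascript
// Hypothetical helper (not part of react-instantsearch-dom or
// instant-meilisearch): builds a Meilisearch filter expression from
// the kind of refinements the components above collect.
function buildFilter({ facets = {}, minRating } = {}) {
  const clauses = [];
  // Each selected facet value becomes an `attribute = "value"` clause.
  for (const [attribute, values] of Object.entries(facets)) {
    for (const value of values) {
      clauses.push(`${attribute} = "${value}"`);
    }
  }
  // A rating refinement becomes a numeric comparison.
  if (minRating !== undefined) {
    clauses.push(`rating >= ${minRating}`);
  }
  return clauses.join(' AND ');
}

console.log(buildFilter({ facets: { category: ['Laptops'], brand: ['Acme'] }, minRating: 4 }));
// category = "Laptops" AND brand = "Acme" AND rating >= 4
```

&lt;p&gt;This is only an illustration of the filter syntax (&lt;code&gt;attribute = "value"&lt;/code&gt; clauses joined with &lt;code&gt;AND&lt;/code&gt;); the libraries perform this translation for you.&lt;/p&gt;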

&lt;p&gt;We will add all three components to implement the filter functionality. You can create a new file &lt;code&gt;SearchFilters.jsx&lt;/code&gt; in the &lt;code&gt;components/home/&lt;/code&gt; folder and use the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;ClearRefinements&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;RatingMenu&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;RefinementList&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-instantsearch-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;SearchFilters&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h2&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Filters&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;ClearRefinements&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h2&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Categories&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RefinementList&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"category"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Tags&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RefinementList&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"tag"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Brands&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RefinementList&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"brand"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Rating&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h4&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;RatingMenu&lt;/span&gt; &lt;span class="na"&gt;attribute&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"rating"&lt;/span&gt; &lt;span class="na"&gt;max&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;SearchFilters&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4.3 Implementing a product card component
&lt;/h3&gt;

&lt;p&gt;We will create a component named &lt;code&gt;Hit&lt;/code&gt; to represent a single product. It renders a card view with the product's basic information: &lt;em&gt;title, description, rating, review count, price, brand,&lt;/em&gt; and &lt;em&gt;image&lt;/em&gt;. All these details are retrieved from the component’s &lt;code&gt;product&lt;/code&gt; prop. We can then display the results by looping over this component.&lt;/p&gt;
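&lt;p&gt;Note that the component below truncates the description with &lt;code&gt;description.substring(0, 50)&lt;/code&gt;, which throws if a product ever lacks a description. If your dataset may contain such records, a small defensive helper (hypothetical, not part of the tutorial code) avoids the crash:&lt;/p&gt;

```javascript
// Hypothetical helper: truncates a product description to `max`
// characters, tolerating missing or empty values.
function truncate(text, max = 50) {
  if (!text) return ''; // null, undefined, or empty string
  return text.length > max ? `${text.substring(0, max)}...` : text;
}

console.log(truncate('A very long product description that goes on and on well past fifty characters.'));
console.log(truncate(undefined)); // ''
```

&lt;p&gt;In the component you would then render &lt;code&gt;truncate(description)&lt;/code&gt; instead of calling &lt;code&gt;substring&lt;/code&gt; directly.&lt;/p&gt;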

&lt;p&gt;We can create a file &lt;code&gt;hits.jsx&lt;/code&gt; in the &lt;code&gt;components/home/&lt;/code&gt; folder. You can paste the following lines of code into the file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;styles&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;../../styles/searchResult.module.css&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Hit&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;product&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;rating&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;images&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;description&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;brand&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;price&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;reviews_count&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;product&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;card&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;
        &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResImg&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
        &lt;span class="na"&gt;style&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;backgroundImage&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`url(&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;images&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;]}&lt;/span&gt;&lt;span class="s2"&gt;)`&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
      &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResCntnt&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;h6&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResBrand&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;brand&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;h6&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResTitl&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResDesc&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;description&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;substring&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;...
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResPrice&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;$&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;price&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productResReview&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;reviews_count&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productRateWrap&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;productRate&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;reviews_count&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; review
              &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt; &lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;⭐ &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;rating&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No Review&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;
          &lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;Hit&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4.4 Designing a page layout to display the list of products
&lt;/h3&gt;

&lt;p&gt;It’s time to design the product list page with a sort feature and pagination to display a limited number of products on a page. For this, we can use the following components from the &lt;code&gt;react-instantsearch-dom&lt;/code&gt; library:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;Pagination&lt;/code&gt; component displays 20 hits per page by default. We can also pass the &lt;code&gt;showLast&lt;/code&gt; prop to display a link to the last page of results.&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Pagination&lt;/span&gt; &lt;span class="na"&gt;showLast&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;SortBy&lt;/code&gt; component sorts the search results according to the selected sort option. We need to provide a default Meilisearch index and a list of &lt;code&gt;value&lt;/code&gt;/&lt;code&gt;label&lt;/code&gt; pairs.&lt;/p&gt;

&lt;p&gt;Let's say we need to sort the &lt;code&gt;products&lt;/code&gt; index alphabetically by &lt;code&gt;category&lt;/code&gt;. You may use the following code:&lt;br&gt;
&lt;/p&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;SortBy&lt;/span&gt;
  &lt;span class="na"&gt;defaultRefinement&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"products"&lt;/span&gt;
  &lt;span class="na"&gt;items&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;category&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sort category alphabetically&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;


&lt;p&gt;The &lt;code&gt;SortBy&lt;/code&gt; component has a few properties, including the &lt;code&gt;defaultRefinement&lt;/code&gt; attribute for providing the default Meilisearch index and the &lt;code&gt;items&lt;/code&gt; attribute for providing a list of &lt;code&gt;value&lt;/code&gt;/&lt;code&gt;label&lt;/code&gt; pairs.&lt;/p&gt;

&lt;p&gt;Each &lt;code&gt;value&lt;/code&gt; references a &lt;a href="https://docs.meilisearch.com/learn/core_concepts/documents.html#structure" rel="noopener noreferrer"&gt;document attribute&lt;/a&gt;, and, as the name suggests, the &lt;code&gt;label&lt;/code&gt; provides the text shown in the dropdown list.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;The &lt;code&gt;connectStateResults&lt;/code&gt; connector will retrieve the search results from the Meilisearch instance through &lt;code&gt;InstantSearch&lt;/code&gt;. It passes several props to the wrapped component, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;searchState&lt;/code&gt; provides information about the user's input in the search box.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;searchResults&lt;/code&gt; contains the results obtained after querying the Meilisearch instance.&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;searching&lt;/code&gt; prop represents the loading state while results are being retrieved from Meilisearch.&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
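&lt;p&gt;The branching that these props drive can be distilled into a plain function. The helper below is a hypothetical illustration of the render logic we will implement next, not part of the &lt;code&gt;react-instantsearch-dom&lt;/code&gt; API:&lt;/p&gt;

```javascript
// Hypothetical helper: decide which UI branch to render from the
// props that connectStateResults passes to the wrapped component.
function resolveView({ searching, searchResults }) {
  if (searching) return 'loading'; // still waiting on Meilisearch
  if (!searchResults || searchResults.nbHits === 0) return 'empty'; // no hits
  return 'results'; // render the list of hits
}
```

&lt;p&gt;For example, &lt;code&gt;resolveView({ searching: false, searchResults: { nbHits: 12 } })&lt;/code&gt; returns &lt;code&gt;'results'&lt;/code&gt;.&lt;/p&gt;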

&lt;p&gt;We will use the &lt;code&gt;searchResults&lt;/code&gt; prop to get the list of products and loop over that list with the &lt;code&gt;map&lt;/code&gt; function, rendering a &lt;code&gt;Hit&lt;/code&gt; component for each result. We will use the &lt;code&gt;SortBy&lt;/code&gt; component to sort based on &lt;code&gt;price&lt;/code&gt; and &lt;code&gt;reviews_count&lt;/code&gt;.&lt;/p&gt;
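&lt;p&gt;With instant-meilisearch, each &lt;code&gt;SortBy&lt;/code&gt; value follows the &lt;code&gt;indexUid:attribute:direction&lt;/code&gt; pattern, e.g. &lt;code&gt;products:price:desc&lt;/code&gt;. A hypothetical helper, shown only to make that format explicit, could build these strings:&lt;/p&gt;

```javascript
// Hypothetical helper: build a SortBy `value` string in the
// `indexUid:attribute:direction` format that instant-meilisearch expects.
function sortValue(indexUid, attribute, direction) {
  if (!['asc', 'desc'].includes(direction)) {
    throw new Error(`direction must be "asc" or "desc", got "${direction}"`);
  }
  return `${indexUid}:${attribute}:${direction}`;
}
```

&lt;p&gt;Calling &lt;code&gt;sortValue('products', 'reviews_count', 'desc')&lt;/code&gt; produces the &lt;code&gt;products:reviews_count:desc&lt;/code&gt; value used in the component.&lt;/p&gt;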

&lt;p&gt;We can create a file &lt;code&gt;SearchResult.jsx&lt;/code&gt; in the &lt;code&gt;components/home/&lt;/code&gt; folder and use the following code:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;Pagination&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;SortBy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
  &lt;span class="nx"&gt;connectStateResults&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-instantsearch-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;SearchFilters&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;./SearchFilters&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Hit&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./hits.jsx&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;styles&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../../styles/searchResult.module.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;Results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;connectStateResults&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
  &lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="nx"&gt;searchState&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;searching&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;hits&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;searching&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="c1"&gt;// This will show the loading state till the results are retrieved&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Loading&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;SearchFilters&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;products&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                    &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Condition for rendering component based on results */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
          &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;nbHits&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&amp;gt;&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;resultPara&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                  Showing &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; of&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt; &lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                  &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchResults&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;nbHits&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="si"&gt;}{&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt; &lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                  &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchState&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
                    &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;searching&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
                    &lt;span class="s2"&gt;`for "&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;searchState&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;"`&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;span&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                  &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;SortBy&lt;/span&gt;
                    &lt;span class="na"&gt;defaultRefinement&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"products"&lt;/span&gt;
                    &lt;span class="na"&gt;items&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;
                      &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;products&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Sort&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt;
                      &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;products:price:desc&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                        &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Price: High to Low&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                      &lt;span class="p"&gt;},&lt;/span&gt;
                      &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;products:price:asc&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                        &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Price: Low to High&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                      &lt;span class="p"&gt;},&lt;/span&gt;
                      &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="na"&gt;value&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;products:reviews_count:desc&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                        &lt;span class="na"&gt;label&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;Most Reviews&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                      &lt;span class="p"&gt;},&lt;/span&gt;
                    &lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                  &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
                &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;grid&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
                                &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Using search result list, we will loop over the Hit component */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&amp;amp;&lt;/span&gt;
                  &lt;span class="nx"&gt;hits&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;product&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Hit&lt;/span&gt; &lt;span class="na"&gt;key&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;product&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="na"&gt;product&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;product&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
                  &lt;span class="p"&gt;))&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
              &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;/&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;paragraph&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
              No results have been found for &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;searchState&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;.
            &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;p&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
                    &lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="cm"&gt;/* Adding pagination over the results */&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;
          &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Pagination&lt;/span&gt; &lt;span class="na"&gt;showLast&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;Results&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4.5 Displaying results on the application
&lt;/h3&gt;

&lt;p&gt;We have created all the required components to display the results. We now need to add the &lt;code&gt;Results&lt;/code&gt; component to the &lt;code&gt;index.jsx&lt;/code&gt; file to display them on the screen. You can copy and paste the following code into the &lt;code&gt;index.jsx&lt;/code&gt; file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;styles&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../styles/home.module.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;Results&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;../components/home/SearchResult&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;Home&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;container&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;main&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt; &lt;span class="na"&gt;className&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;styles&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;mainContent&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
          &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Results&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
      &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;main&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;div&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
  &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;getStaticProps&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;props&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{}&lt;/span&gt;
  &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can use the following command to run the application:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;npm&lt;/span&gt; &lt;span class="nx"&gt;run&lt;/span&gt; &lt;span class="nx"&gt;dev&lt;/span&gt; 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Go to &lt;a href="http://localhost:3000" rel="noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt;. The output will resemble the image provided below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ng03flr0676gpnlsy1l.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6ng03flr0676gpnlsy1l.gif" alt="Image description" width="800" height="391"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can find the complete code at &lt;a href="https://github.com/shivaylamba/demos/tree/main/src/ecommerce" rel="noopener noreferrer"&gt;https://github.com/shivaylamba/demos/tree/main/src/ecommerce&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;We created a lightning-fast search experience for an e-commerce application. You can now take the learnings from above and integrate a similar search experience into your applications. You may play with typo-tolerance, geo-search filters, and many other features to better suit your needs.&lt;/p&gt;

&lt;p&gt;Here are some real-world examples of Meilisearch powering e-commerce experiences:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;a href="https://www.minipouce.fr/i/marketplace/explore" rel="noopener noreferrer"&gt;https://www.minipouce.fr/i/marketplace/explore&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://palmes.co/" rel="noopener noreferrer"&gt;https://palmes.co/&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;We're excited to see what you come up with. Share your experience and Meilisearch integration in your e-commerce application on the &lt;a href="https://meilicommunity.slack.com/join/shared_invite/zt-19l5kemhc-IFXAhC4T44arR6uRGSWDVQ#/shared-invite/email" rel="noopener noreferrer"&gt;Slack community&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any queries or suggestions, please let us know &lt;a href="https://slack.meilisearch.com/" rel="noopener noreferrer"&gt;on Slack&lt;/a&gt;. For more information on Meilisearch, check out our &lt;a href="https://github.com/meilisearch/meilisearch/" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt; and official &lt;a href="https://docs.meilisearch.com/" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>nextjs</category>
    </item>
    <item>
      <title>Add search in your Gatsby project using Meilisearch</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Tue, 24 May 2022 09:06:37 +0000</pubDate>
      <link>https://forem.com/shivaylamba/add-search-in-your-gatsby-project-using-meilisearch-2g3l</link>
      <guid>https://forem.com/shivaylamba/add-search-in-your-gatsby-project-using-meilisearch-2g3l</guid>
<description>&lt;p&gt;This guide will go through all the steps for adding the content of your Gatsby app to Meilisearch (&lt;a href="https://github.com/shivaylamba/meilisearch-gatsby-plugin-guide.git"&gt;https://github.com/shivaylamba/meilisearch-gatsby-plugin-guide.git&lt;/a&gt;).&lt;br&gt;
We will use the content of the &lt;a href="https://nikita.js.org/"&gt;NikitaJS documentation&lt;/a&gt;, an open-source documentation site built with Gatsby, and make it searchable using Meilisearch.&lt;br&gt;
&lt;a href="https://nikita.js.org/current/guide/tutorial/"&gt;Nikita&lt;/a&gt; is a Node.js toolkit that gathers a set of functions commonly used during system deployment. We chose this project because its documentation doesn't have search functionality.&lt;/p&gt;
&lt;h1&gt;
  
  
  Introducing our Tools
&lt;/h1&gt;
&lt;h2&gt;
  
  
  &lt;strong&gt;Meilisearch&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://github.com/meilisearch/meilisearch"&gt;Meilisearch&lt;/a&gt; is an open-source search engine that works out-of-the-box.&lt;/p&gt;

&lt;p&gt;It provides a customizable searching and indexing solution along with features like typo tolerance, filters, and synonyms.&lt;/p&gt;
&lt;h1&gt;
  
  
  Requirements
&lt;/h1&gt;

&lt;p&gt;To be able to follow this tutorial, you'll need the following:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An open &lt;a href="https://www.ionos.com/help/email/troubleshooting-mail-basicmail-business/access-the-command-prompt-or-terminal"&gt;terminal or command prompt&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://nodejs.org/en/download/"&gt;Node.js&lt;/a&gt; &amp;gt;= 14.15.X and &amp;lt;= 16.X: this makes it possible to run JS outside a browser

&lt;ul&gt;
&lt;li&gt;You can check your active version with the command &lt;code&gt;node --version&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;If your version of Node is outside this range, we recommend that you &lt;a href="https://github.com/nvm-sh/nvm/blob/master/README.md#about"&gt;install nvm&lt;/a&gt; and use it to access Node 14&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;NPM and NPX &amp;gt;= 6.x (installed with Node.js): package managers that help you access and use JS libraries&lt;/li&gt;
&lt;/ul&gt;
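&lt;p&gt;The Node.js version range above can also be checked programmatically. This is a hypothetical sketch, assuming the version string comes from &lt;code&gt;node --version&lt;/code&gt;:&lt;/p&gt;

```javascript
// Hypothetical helper: check whether a `node --version` string such as
// "v14.17.0" falls inside the range this guide requires (>= 14.15, <= 16.x).
function nodeVersionOk(versionString) {
  const [major, minor] = versionString.replace(/^v/, '').split('.').map(Number);
  if (major < 14 || major > 16) return false;  // outside supported majors
  if (major === 14 && minor < 15) return false; // below 14.15
  return true;
}
```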
&lt;h1&gt;
  
  
  Steps
&lt;/h1&gt;

&lt;ol&gt;
&lt;li&gt;Start Meilisearch&lt;/li&gt;
&lt;li&gt;Set up the Gatsby project&lt;/li&gt;
&lt;li&gt;Add the content of the Gatsby app to Meilisearch&lt;/li&gt;
&lt;li&gt;Build the frontend&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;
&lt;h1&gt;
  
  
  Start Meilisearch
&lt;/h1&gt;

&lt;p&gt;There are multiple ways to &lt;a href="https://docs.meilisearch.com/learn/getting_started/installation.html#download-and-launch"&gt;download and run a Meilisearch instance&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;For example, you can use Docker to install and run it locally:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;docker pull getmeili/meilisearch:latest 

docker run -it --rm -p 7700:7700 getmeili/meilisearch:latest ./meilisearch --master-key=masterKey
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The Meilisearch instance will run on the following host: &lt;code&gt;http://localhost:7700&lt;/code&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Set up the Gatsby project
&lt;/h1&gt;

&lt;p&gt;Let’s open a terminal and clone the NikitaJS project that we will be using for this guide:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;git clone https://github.com/adaltas/node-nikita.git
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can find the content of the Gatsby app in markdown files in the &lt;code&gt;docs/content&lt;/code&gt; folder. This content is used by Gatsby to create the application.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4DmS2ot0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/19529592/161043477-afd719ab-fe45-4f1a-ae0b-5f27e4983df6.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4DmS2ot0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/19529592/161043477-afd719ab-fe45-4f1a-ae0b-5f27e4983df6.jpg" alt="Untitled" width="880" height="335"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Adding the Meilisearch plugin&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let’s start by installing the Meilisearch plugin:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;cd node-nikita/docs/website
npm install gatsby-plugin-meilisearch
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Let’s also install all the dependencies:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;npm install
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;To see how the website is rendered, run it using &lt;code&gt;npm run serve&lt;/code&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Adding the content of the Gatsby app to Meilisearch
&lt;/h1&gt;

&lt;p&gt;Now that you have installed the plugin and have both the web app and the Meilisearch instance running, let's configure the plugin to make the content searchable.&lt;/p&gt;

&lt;p&gt;The main code for the docs website lies in the directory &lt;code&gt;docs/website&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;All the plugin configurations can be found in the &lt;code&gt;gatsby-config.js&lt;/code&gt; file in the documentation website source directory.&lt;/p&gt;

&lt;p&gt;In this example, all the markdown files will be retrieved from the documentation and added to the Meilisearch instance. This will be handled by the &lt;a href="https://www.gatsbyjs.com/plugins/gatsby-plugin-mdx/"&gt;gatsby-plugin-mdx&lt;/a&gt; module.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Configuring the plugin&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;The first step is to add your Meilisearch &lt;a href="https://docs.meilisearch.com/learn/getting_started/"&gt;credentials&lt;/a&gt; to the Meilisearch Gatsby plugin:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;resolve&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;gatsby-plugin-meilisearch,
    options: {
      host: &lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="na"&gt;http&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="c1"&gt;//localhost:7700', // your host URL goes here&lt;/span&gt;
      &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;masterKey&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;   &lt;span class="c1"&gt;// your API key goes here       &lt;/span&gt;
      &lt;span class="na"&gt;indexes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[],&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The next step is to add the content to the &lt;a href="https://github.com/meilisearch/gatsby-plugin-meilisearch#-indexes"&gt;indexes field&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;This field can be considered the heart of the plugin. This is where you store the data that needs to be added to Meilisearch.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;indexes&lt;/code&gt; field is an array that can be composed of multiple index objects. Each index object contains the following information:&lt;/p&gt;
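&lt;p&gt;As a quick sketch, an index object has the following overall shape (the field values below are placeholders; the real ones are filled in over the next steps):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const index = {
  // name of the index to create in Meilisearch
  indexUid: 'my-index',
  // GraphQL query that fetches the content to index
  query: 'query { allMdx { edges { node { id } } } }',
  // reshapes the query result into a flat array of documents
  transformer: function (data) {
    return [];
  },
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;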

&lt;p&gt;i. &lt;strong&gt;&lt;code&gt;indexUid&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This field contains the name of the index in Meilisearch. For this example, the index &lt;code&gt;nikita-api-docs&lt;/code&gt; will be created:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;indexUid: 'nikita-api-docs'&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;ii. &lt;strong&gt;&lt;code&gt;query&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is the GraphQL query that will be executed to retrieve the data, i.e. the content of the pages. The query can be very specific depending on the plugins used. Gatsby comes with built-in support for GraphQL, and you can use the GraphiQL tool (&lt;a href="http://localhost:8000/___graphql"&gt;http://localhost:8000/___graphql&lt;/a&gt;) provided by Gatsby in development mode to help you build it.&lt;/p&gt;

&lt;p&gt;Below is the GraphQL query used with the &lt;code&gt;gatsby-plugin-mdx&lt;/code&gt; plugin. &lt;code&gt;gatsby-plugin-mdx&lt;/code&gt; will fetch all the markdown files made available through the &lt;code&gt;gatsby-source-filesystem&lt;/code&gt; plugin.&lt;/p&gt;

&lt;p&gt;This will include the title, navtitle, and description of the markdown files from the &lt;a href="https://www.gatsbyjs.com/docs/how-to/routing/adding-markdown-pages/"&gt;frontmatter&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The query looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`query MyQuery {
              allMdx {
                edges {
                  node {
                    frontmatter {
                      title
                      navtitle
                      description
                    }
                    slug
                    rawBody
                    id
                  }
                }
              }
            }`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;iii. &lt;strong&gt;&lt;code&gt;transformer&lt;/code&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This is a function that transforms the fetched data (the result of the GraphQL query) before sending it to Meilisearch.&lt;/p&gt;

&lt;p&gt;Let's suppose the data returned from the query is:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;allMdx&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;edges&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
      &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;node&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="na"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Introduction&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;navtitle&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;intro&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Introduction to the Nikita.js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/introduction&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;1&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;node&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="na"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;title&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Architecture&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;navtitle&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;architecture&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="na"&gt;body&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Architechture of Nikita.js&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;/architecture&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;2&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We need to rename the keys and flatten each node into a simple object, so we can use the &lt;code&gt;map&lt;/code&gt; function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;transformer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;allMdx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;edges&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(({&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="c1"&gt;// Node property has been destructured here&lt;/span&gt;
          &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;lvl0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;lvl1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;navtitle&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;body&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
            &lt;span class="na"&gt;anchor&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="cm"&gt;/*
  It will return a list of transformed, structured objects:

    [
    {
      id: "1",
      lvl0: "Introduction",
      lvl1: "introduction",
      content: "Introduction to the Nikita.js",
      anchor: "/introduction"
    },
    {
      id: "2",
      lvl0: "Architecture",
      lvl1: "architecture",
      content: "Architecture of Nikita.js",
      anchor: "/architecture"
    }
  ]
*/&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, the query result is passed to the transformer as &lt;code&gt;data&lt;/code&gt;. The &lt;code&gt;map&lt;/code&gt; function restructures each node into a flat object that can be indexed by Meilisearch.&lt;/p&gt;

&lt;p&gt;An &lt;code&gt;id&lt;/code&gt; field is required: Meilisearch uses it as the primary key when indexing the documents.&lt;/p&gt;
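&lt;p&gt;If you want to sanity-check the transformer outside of Gatsby, the same logic can be run as plain Node.js against a small stub of the query result (the sample data below is hypothetical):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Stub of the object returned by the GraphQL query
const data = {
  allMdx: {
    edges: [
      {
        node: {
          frontmatter: { title: 'Introduction', navtitle: 'intro' },
          body: 'Introduction to the Nikita.js',
          slug: '/introduction',
          id: '1',
        },
      },
    ],
  },
};

// Same transformation as above, written as a plain function
function transformer(data) {
  return data.allMdx.edges.map(function (edge) {
    const node = edge.node;
    return {
      id: node.id,
      lvl0: node.frontmatter.title,
      lvl1: node.frontmatter.navtitle,
      content: node.body,
      anchor: node.slug,
    };
  });
}

console.log(transformer(data));
// one flat, indexable object per page, e.g.
// { id: '1', lvl0: 'Introduction', lvl1: 'intro', ... }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;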

&lt;p&gt;After filling in these fields, the Meilisearch configuration should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;
   &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;resolve&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;gatsby-plugin-meilisearch&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;options&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="na"&gt;host&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;http://localhost:7700&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;masterKey&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;batchSize&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="na"&gt;indexes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="na"&gt;indexUid&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;nikita-api-docs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="na"&gt;settings&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
              &lt;span class="na"&gt;searchableAttributes&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lvl0&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lvl1&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;lvl2&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;content&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;],&lt;/span&gt;
            &lt;span class="p"&gt;},&lt;/span&gt;
          &lt;span class="na"&gt;transformer&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt;
              &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;allMdx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;edges&lt;/span&gt;
                &lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(({&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                  &lt;span class="c1"&gt;// Have to update for versioning&lt;/span&gt;
                  &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;currentVersion&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt;
                    &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;substring&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;8&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nx"&gt;search&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;project&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
                      &lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;/current&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
                      &lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;

                  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="na"&gt;lvl0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                      &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;navtitle&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="na"&gt;lvl1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                      &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;title&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;navtitle&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="na"&gt;lvl2&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;frontmatter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;description&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;''&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="na"&gt;content&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;rawBody&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="na"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;currentVersion&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                  &lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;}),&lt;/span&gt;
            &lt;span class="na"&gt;query&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`query MyQuery {
              allMdx {
                edges {
                  node {
                    frontmatter {
                      title
                      navtitle
                      description
                    }
                    slug
                    rawBody
                    id
                  }
                }
              }
            }`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
          &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="p"&gt;],&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Build the project&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;gatsby-plugin-meilisearch&lt;/code&gt; plugin will fetch the data and send it to Meilisearch for indexation during the build process.&lt;/p&gt;

&lt;p&gt;To start the build process, run the following command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;npm run build
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;After the build, there should be a message in the terminal that your content was successfully indexed:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;success gatsby-plugin-meilisearch - x.xxxs - Documents added to Meilisearch 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can validate this by going to &lt;a href="http://localhost:7700/"&gt;http://localhost:7700&lt;/a&gt;, entering the API key, and checking that your app content has been added to Meilisearch.&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;strong&gt;Build the frontend (UI)&lt;/strong&gt;
&lt;/h1&gt;

&lt;p&gt;Now that the data is indexed, let’s proceed to build the user interface for creating the search experience.&lt;/p&gt;

&lt;p&gt;For this example, we will use Meilisearch's &lt;a href="https://github.com/meilisearch/docs-searchbar.js"&gt;docs-searchbar.js&lt;/a&gt;. It is a front-end SDK for Meilisearch that makes it easy to integrate a search bar into a documentation site. It will query the content indexed earlier in the Meilisearch instance.&lt;/p&gt;

&lt;p&gt;Let’s start by installing the docs-searchbar.js SDK in the frontend directory of the project.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="gh"&gt;# With NPM&lt;/span&gt;
npm install docs-searchbar.js
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Search Bar&lt;/strong&gt; &lt;/p&gt;

&lt;p&gt;The first step is to import the docs-searchbar stylesheet into AppBar.js. The file can be found in &lt;code&gt;website/src/components/shared&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;docs-searchbar.js/dist/cdn/docs-searchbar.css&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Next, add a &lt;code&gt;useEffect&lt;/code&gt; Hook to AppBar.js that initializes the search bar by calling the &lt;code&gt;docsSearchBar&lt;/code&gt; function:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="nx"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;window&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;){&lt;/span&gt;
      &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;docsSearchBar&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;require&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;docs-searchbar.js&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="k"&gt;default&lt;/span&gt;

      &lt;span class="nx"&gt;docsSearchBar&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
        &lt;span class="na"&gt;hostUrl&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;http://localhost:7700&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;masterKey&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;indexUid&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;nikita-api-docs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;inputSelector&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;#search-bar-input&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;meilisearchOptions&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
          &lt;span class="na"&gt;limit&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="na"&gt;debug&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;enhancedSearchInput&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="kc"&gt;true&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;docsSearchBar&lt;/code&gt; function accepts a number of parameters.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;hostUrl&lt;/code&gt; and the &lt;code&gt;apiKey&lt;/code&gt; serve as the credentials of your Meilisearch instance.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;indexUid&lt;/code&gt; is the index identifier in your Meilisearch instance where your documentation/website content is stored.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;inputSelector&lt;/code&gt; is the &lt;code&gt;id&lt;/code&gt; attribute of the HTML search input tag; docs-searchbar.js attaches the search bar to the element matching this selector.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;enhancedSearchInput&lt;/code&gt;: when set to &lt;code&gt;true&lt;/code&gt;, a theme is applied to the search box to improve its appearance. It also adds the &lt;code&gt;.searchbox&lt;/code&gt; class, which can be used to further customize the search box.&lt;/p&gt;
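&lt;p&gt;For example, since &lt;code&gt;enhancedSearchInput&lt;/code&gt; adds the &lt;code&gt;.searchbox&lt;/code&gt; class, a few CSS rules are enough to restyle the search box (the values below are just an illustration):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.searchbox {
  max-width: 300px; /* constrain the width of the search box */
}

.searchbox input {
  border-radius: 4px; /* soften the corners of the input */
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;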

&lt;p&gt;Next, add an input tag of type search with an id:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;input&lt;/span&gt; &lt;span class="na"&gt;type&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"search"&lt;/span&gt; &lt;span class="na"&gt;id&lt;/span&gt;&lt;span class="p"&gt;=&lt;/span&gt;&lt;span class="s"&gt;"search-bar-input"&lt;/span&gt; &lt;span class="p"&gt;/&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;It’s done!&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing the Implementation&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Run the command &lt;code&gt;npm run develop&lt;/code&gt; to start the site on &lt;code&gt;localhost:8000&lt;/code&gt;. A search input should appear like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4VoxSkAO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/19529592/161043686-a34453b0-e155-4f03-8ba2-2a9225190757.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4VoxSkAO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://user-images.githubusercontent.com/19529592/161043686-a34453b0-e155-4f03-8ba2-2a9225190757.jpg" alt="Untitled" width="880" height="664"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  &lt;a href="https://blog.meilisearch.com/add-meilisearch-to-your-strapi-app/#conclusion"&gt;Conclusion&lt;/a&gt;
&lt;/h1&gt;

&lt;p&gt;We hope this article was a pleasant introduction to the new &lt;a href="https://github.com/meilisearch/gatsby-plugin-meilisearch"&gt;Meilisearch plugin for Gatsby&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you have any questions, please join us &lt;a href="https://meilicommunity.slack.com/join/shared_invite/zt-c4rs8tpi-ZId_q3fw~7cqeuzFG4XHaA#/shared-invite/email"&gt;on Slack&lt;/a&gt;. For more information, see our:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="http://docs.meilisearch.com/"&gt;Documentation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/meilisearch/meilisearch"&gt;GitHub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
    </item>
    <item>
      <title>Messing with Service Meshes using Meshery</title>
      <dc:creator>Shivay Lamba</dc:creator>
      <pubDate>Wed, 20 May 2020 08:39:00 +0000</pubDate>
      <link>https://forem.com/shivaylamba/ai-recruiter-3dp6</link>
      <guid>https://forem.com/shivaylamba/ai-recruiter-3dp6</guid>
      <description>&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sQkyN6cb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y4tyuqgw8vydhytfnaty.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sQkyN6cb--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/y4tyuqgw8vydhytfnaty.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Meshery?
&lt;/h2&gt;

&lt;p&gt;Meshery is a service mesh management tool. It enables operators, developers, and service owners to realize the full potential of a service mesh...&lt;/p&gt;
&lt;h2&gt;
  
  
  But wait! What is a service mesh exactly?
&lt;/h2&gt;

&lt;p&gt;Service meshes in the Cloud Native Space are tools for adding observability, reliability, security, and features to applications by inserting them at the platform layer rather than the application layer. &lt;/p&gt;

&lt;p&gt;The control plane of a service mesh:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Provides policy, configuration, and platform integration.&lt;/li&gt;
&lt;li&gt;Takes a set of isolated stateless sidecar proxies and turns them into a service mesh.&lt;/li&gt;
&lt;li&gt;Does not touch any packets/requests in the data path.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The data plane, by contrast:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Touches every packet/request in the system.&lt;/li&gt;
&lt;li&gt;Handles service discovery, health checking, routing, and load balancing.&lt;/li&gt;
&lt;li&gt;Enforces authentication/authorization and emits observability data.&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;How does Meshery come into the picture?&lt;/h2&gt;

&lt;p&gt;Meshery provides federation, backend system integration, expanded policy and governance, continuous delivery integration, workflow, chaos engineering, and application performance tuning.&lt;/p&gt;
&lt;h2&gt;Benefits of using Meshery for service meshes&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Configuration best practices: assess your service mesh configuration against deployment and operational best practices with Meshery's configuration analyzer.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RPQmgxRO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hateig9q8wnjbwn978ma.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RPQmgxRO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/hateig9q8wnjbwn978ma.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Service Mesh Interface (SMI) conformance: operate and upgrade with confirmation of SMI compatibility. Meshery defines compliant behavior, runs a set of conformance tests, produces a compatibility matrix, and ensures the provenance of results. The tests are built into participating service meshes' release pipelines.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--lR7g7byU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vase2p2qxx39yg45bnuu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--lR7g7byU--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/vase2p2qxx39yg45bnuu.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The service mesh management plane supports lifecycle, workload, performance, and configuration management.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--JuqhHtCo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x4nzs9d2605d6dhgiix8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--JuqhHtCo--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/x4nzs9d2605d6dhgiix8.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Performance management: most importantly, you can compare the performance of different service meshes by running load tests against them.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--iI02hp5Y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/agobdpvxuz9xkudgpuln.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--iI02hp5Y--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/agobdpvxuz9xkudgpuln.png" alt="image"&gt;&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// PerformanceProfile captures the configuration of a Meshery performance test.
type PerformanceProfile struct {
    ID *uuid.UUID `json:"id,omitempty"`

    Name              string         `json:"name,omitempty"`
    LastRun           string         `json:"last_run,omitempty"`
    Schedule          *uuid.UUID     `json:"schedule,omitempty"`
    LoadGenerators    pq.StringArray `json:"load_generators,omitempty" gorm:"type:text[]"`
    Endpoints         pq.StringArray `json:"endpoints,omitempty" gorm:"type:text[]"`
    ServiceMesh       string         `json:"service_mesh,omitempty"`
    ConcurrentRequest int            `json:"concurrent_request,omitempty"`
    QPS               int            `json:"qps,omitempty"`
    Duration          string         `json:"duration,omitempty"`
    TotalResults      int            `json:"total_results,omitempty"`

    RequestHeaders string `json:"request_headers,omitempty"`
    RequestCookies string `json:"request_cookies,omitempty"`
    RequestBody    string `json:"request_body,omitempty"`
    ContentType    string `json:"content_type,omitempty"`

    UpdatedAt *time.Time `json:"updated_at,omitempty"`
    CreatedAt *time.Time `json:"created_at,omitempty"`
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;PerformanceProfile&lt;/code&gt; struct above defines what Meshery records for a load test: the target endpoints, the load generators to use, the service mesh under test, and the request rate (QPS), concurrency, and duration of the run.&lt;/p&gt;

&lt;h2&gt;The Meshery Architecture!&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--7TlKjWX9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0aftt9mv3t5i0dg85yqz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--7TlKjWX9--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/0aftt9mv3t5i0dg85yqz.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;For providers:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--m77htSR---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oswgir3v0plzt5mpcqq4.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--m77htSR---/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/oswgir3v0plzt5mpcqq4.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;For clients:&lt;br&gt;
&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--OAVxm8Ti--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueko5imuaxl98as56fsr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--OAVxm8Ti--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ueko5imuaxl98as56fsr.png" alt="image"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>cloudnative</category>
      <category>meshery</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
