<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Timon Vonk</title>
    <description>The latest articles on Forem by Timon Vonk (@timonv).</description>
    <link>https://forem.com/timonv</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1054790%2F52842b93-c2a4-480a-8a53-1039a0a738a1.jpeg</url>
      <title>Forem: Timon Vonk</title>
      <link>https://forem.com/timonv</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/timonv"/>
    <language>en</language>
    <item>
      <title>Announcing Swiftide 0.31</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Tue, 16 Sep 2025 15:39:36 +0000</pubDate>
      <link>https://forem.com/timonv/announcing-swiftide-031-16g8</link>
      <guid>https://forem.com/timonv/announcing-swiftide-031-16g8</guid>
      <description>&lt;p&gt;Just released &lt;strong&gt;Swiftide&lt;/strong&gt; 0.31 🚀 A Rust library for building LLM applications. From performing a simple prompt completion, to building fast, streaming indexing and querying pipelines, to building agents that can use tools and call other agents.&lt;/p&gt;

&lt;p&gt;The release is absolutely packed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Graph-like workflows with tasks&lt;/li&gt;
&lt;li&gt;Langfuse integration via tracing&lt;/li&gt;
&lt;li&gt;Groundwork for multi-modal pipelines&lt;/li&gt;
&lt;li&gt;Structured prompts with schemars&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;... and a lot more. A big shout-out to all our contributors and users for making it possible &amp;lt;3&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmt0y7w148wddpr1czoi5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmt0y7w148wddpr1czoi5.png" alt=" " width="800" height="483"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Even went wild with my drawing skills.  &lt;/p&gt;

&lt;p&gt;Full write-up on everything in this release at our &lt;a href="https://blog.bosun.ai/swiftide-0-31/" rel="noopener noreferrer"&gt;blog&lt;/a&gt; and on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>rust</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Swiftide 0.26 - Streaming agents</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Thu, 08 May 2025 09:54:20 +0000</pubDate>
      <link>https://forem.com/timonv/swiftide-026-streaming-agents-1j2h</link>
      <guid>https://forem.com/timonv/swiftide-026-streaming-agents-1j2h</guid>
      <description>&lt;p&gt;Funny how time flies and you forget to write a blog post every time there is a major release. We are now at 0.26, and a lot has happened since our last update (January, 0.16!). We have been working hard on building out the agent framework, fixing bugs, and adding features. Shout out to all the contributors who have helped us along the way, and to all the users who have provided feedback and suggestions.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Swiftide is a Rust library for building LLM applications. Index, query, run agents, and bring your experiments right to production.&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;em&gt;To get started with Swiftide, head over to &lt;a href="https://swiftide.rs" rel="noopener noreferrer"&gt;swiftide.rs&lt;/a&gt;, check us out on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;github&lt;/a&gt;, or hit us up on &lt;a href="https://discord.gg/3jjXYen9UY" rel="noopener noreferrer"&gt;discord&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Finally they stream!
&lt;/h2&gt;

&lt;p&gt;Better late than never: agents can now stream their output. Under the hood, the &lt;code&gt;ChatCompletion&lt;/code&gt; trait received an additional method, &lt;code&gt;complete_stream&lt;/code&gt;, which returns a stream of responses containing both the accumulated response and the delta. All OpenAI-like providers and Anthropic are supported. We decided to include the accumulated response for convenience; let us know if that is too many bytes for you, and we're happy to take a look.&lt;/p&gt;

&lt;p&gt;The kicker is that it can also be used with agents, like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;    &lt;span class="nn"&gt;agents&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="nf"&gt;.llm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;anthropic&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;.on_stream&lt;/span&gt;&lt;span class="p"&gt;(|&lt;/span&gt;&lt;span class="n"&gt;_agent&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="c1"&gt;// We print the message chunk if it exists. Streamed responses also include&lt;/span&gt;
            &lt;span class="c1"&gt;// the full response (without tool calls) in `message` and an `id` to map them to&lt;/span&gt;
            &lt;span class="c1"&gt;// previous chunks for convenience.&lt;/span&gt;
            &lt;span class="c1"&gt;//&lt;/span&gt;
            &lt;span class="c1"&gt;// The agent uses the full assembled response at the end of the stream.&lt;/span&gt;
            &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="nf"&gt;Some&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;delta&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="py"&gt;.delta&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nd"&gt;print!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                    &lt;span class="s"&gt;"{}"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                    &lt;span class="n"&gt;delta&lt;/span&gt;
                        &lt;span class="py"&gt;.message_chunk&lt;/span&gt;
                        &lt;span class="nf"&gt;.as_deref&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                        &lt;span class="nf"&gt;.map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;str&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;to_string&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                        &lt;span class="nf"&gt;.unwrap_or_default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
                &lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="p"&gt;};&lt;/span&gt;

            &lt;span class="nn"&gt;Box&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;pin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;move&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="c1"&gt;// Every message added by the agent will be printed to stdout&lt;/span&gt;
        &lt;span class="nf"&gt;.on_new_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;move&lt;/span&gt; &lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="nn"&gt;Box&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;pin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;move&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="nd"&gt;println!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;---&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;Final message:&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt; {msg}"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
                &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="nf"&gt;.limit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;5&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;
        &lt;span class="nf"&gt;.query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Why is the rust programming language so good?"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Streaming the response back to users makes for a much snappier user experience. We've also implemented it in &lt;a href="https://github.com/bosun-ai/kwaak" rel="noopener noreferrer"&gt;kwaak&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;The completion response (streaming and non-streaming) now also includes the token usage.&lt;/p&gt;

&lt;h2&gt;
  
  
  MCP Support
&lt;/h2&gt;

&lt;p&gt;Agents can now use tools provided by &lt;a href="https://modelcontextprotocol.io/introduction" rel="noopener noreferrer"&gt;MCP (Model Context Protocol)&lt;/a&gt;. The ecosystem is growing rapidly and there are quite a few cool tools available. Under the hood we're using the 'official' Rust implementation. We don't support creating MCP servers, as I think it's a bit out of scope.&lt;/p&gt;

&lt;p&gt;Here is an example of adding MCP tools to an agent:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;
    &lt;span class="c1"&gt;// First set up our client info to identify ourselves to the server&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;client_info&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ClientInfo&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;client_info&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;Implementation&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;name&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s"&gt;"swiftide-example"&lt;/span&gt;&lt;span class="nf"&gt;.into&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
            &lt;span class="n"&gt;version&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nd"&gt;env!&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"CARGO_PKG_VERSION"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.into&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
        &lt;span class="p"&gt;},&lt;/span&gt;
        &lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="nn"&gt;Default&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;};&lt;/span&gt;

    &lt;span class="c1"&gt;// Use `rmcp` to start the server&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;running_service&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client_info&lt;/span&gt;
        &lt;span class="nf"&gt;.serve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;TokioChildProcess&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="nn"&gt;tokio&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;process&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Command&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"npx"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="nf"&gt;.args&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="s"&gt;"-y"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"@modelcontextprotocol/server-everything"&lt;/span&gt;&lt;span class="p"&gt;]),&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Create a toolbox from the running server, and only use the `add` tool&lt;/span&gt;
    &lt;span class="c1"&gt;//&lt;/span&gt;
    &lt;span class="c1"&gt;// A toolbox reveals it's tools to the swiftide agent the first time it starts (if the state of&lt;/span&gt;
    &lt;span class="c1"&gt;// the agent was pending). You can add as many toolboxes as you want. MCP services are an&lt;/span&gt;
    &lt;span class="c1"&gt;// implementation of a toolbox. A list of tools is another.&lt;/span&gt;
    &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;everything_toolbox&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;McpToolbox&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_running_service&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;running_service&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;.with_whitelist&lt;/span&gt;&lt;span class="p"&gt;([&lt;/span&gt;&lt;span class="s"&gt;"add"&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
        &lt;span class="nf"&gt;.to_owned&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="nn"&gt;agents&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="nf"&gt;.llm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;// Add the toolbox to the agent&lt;/span&gt;
        &lt;span class="nf"&gt;.add_toolbox&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;everything_toolbox&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="c1"&gt;// Every message added by the agent will be printed to stdout&lt;/span&gt;
        &lt;span class="nf"&gt;.on_new_message&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;move&lt;/span&gt; &lt;span class="p"&gt;|&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;|&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;tx&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;tx&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
            &lt;span class="nn"&gt;Box&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;pin&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;move&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="n"&gt;tx&lt;/span&gt;&lt;span class="nf"&gt;.send&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;msg&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
                &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt;
            &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;
        &lt;span class="nf"&gt;.query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Use the add tool to add 1 and 2"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Resuming agents from history
&lt;/h2&gt;

&lt;p&gt;You can now resume agents from a pre-existing history. Technically this was already possible; we've now made it easier. By creating the agent context from an existing history, the agent will resume where it left off:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;  &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;first_agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;agents&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.llm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="n"&gt;first_agent&lt;/span&gt;&lt;span class="nf"&gt;.query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Say hello!"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="c1"&gt;// Let's store the messages in a database, retrieve them back, and start a new agent&lt;/span&gt;
  &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;stored_history&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;serde_json&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;to_string&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;first_agent&lt;/span&gt;&lt;span class="nf"&gt;.history&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;retrieved_history&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;Vec&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;_&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;serde_json&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;stored_history&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;restored_context&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;DefaultContext&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
      &lt;span class="nf"&gt;.with_message_history&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;retrieved_history&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="nf"&gt;.to_owned&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

  &lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="k"&gt;mut&lt;/span&gt; &lt;span class="n"&gt;second_agent&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;agents&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Agent&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
      &lt;span class="nf"&gt;.llm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="n"&gt;openai&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="nf"&gt;.context&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;restored_context&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
      &lt;span class="c1"&gt;// We'll use the one from the first agent, alternatively we could also pop it from the&lt;/span&gt;
      &lt;span class="c1"&gt;// previous history and add a new one here&lt;/span&gt;
      &lt;span class="nf"&gt;.no_system_prompt&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
      &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="n"&gt;second_agent&lt;/span&gt;&lt;span class="nf"&gt;.query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"What did you say?"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="nf"&gt;Ok&lt;/span&gt;&lt;span class="p"&gt;(())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  So much more
&lt;/h2&gt;

&lt;p&gt;Since January there have been so many improvements, new integrations, and a myriad of fixes that it's hard to keep track. Luckily, we keep a &lt;a href="https://github.com/bosun-ai/swiftide/blob/master/CHANGELOG.md" rel="noopener noreferrer"&gt;changelog&lt;/a&gt;.&lt;br&gt;
Some more highlights since then:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Broader adoption of &lt;code&gt;Backoff&lt;/code&gt; for more controlled LLM error handling&lt;/li&gt;
&lt;li&gt;Tool macros now support generics and nearly all types commonly used in tool calling&lt;/li&gt;
&lt;li&gt;Prompt templating is drastically simplified: just templates that can use Tera, and that's it&lt;/li&gt;
&lt;li&gt;DuckDB support&lt;/li&gt;
&lt;li&gt;Token estimation with tiktoken-rs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Feedback, suggestions, ideas, and contributions are super welcome. Swiftide aims to make LLM app development in Rust easier, while providing opinionated building blocks to get you started. It's your feedback that makes it worthwhile &amp;lt;3.&lt;/p&gt;




&lt;p&gt;To get started with Swiftide, head over to &lt;a href="https://swiftide.rs" rel="noopener noreferrer"&gt;swiftide.rs&lt;/a&gt; or check us out on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;github&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>opensource</category>
      <category>rust</category>
    </item>
    <item>
      <title>Kwaak, a different take on AI coding tools</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Tue, 25 Feb 2025 22:12:35 +0000</pubDate>
      <link>https://forem.com/timonv/kwaak-a-different-take-on-ai-coding-tools-7fi</link>
      <guid>https://forem.com/timonv/kwaak-a-different-take-on-ai-coding-tools-7fi</guid>
      <description>&lt;p&gt;Just did a major release of Kwaak (open source), adding a whole host of new features, model support (sonnet 3.7 and azure!) and other niceties.&lt;/p&gt;

&lt;p&gt;Kwaak is different from other AI coding tools in that it tries to get out of your way. You can throw your backlog at it and have it burn through it, so we as engineers can work on the cool stuff. Under the hood it uses Rust-based RAG shenanigans to index large codebases fast, and it can solve multiple tasks in parallel.&lt;/p&gt;

&lt;p&gt;Oh, and it's open source.&lt;/p&gt;

&lt;p&gt;You can find the project at &lt;a href="https://github.com/bosun-ai/kwaak" rel="noopener noreferrer"&gt;https://github.com/bosun-ai/kwaak&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir3tgpgw3ftrva51pqjx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fir3tgpgw3ftrva51pqjx.png" alt="Image description" width="200" height="200"&gt;&lt;/a&gt;&lt;/p&gt;

</description>
      <category>programming</category>
      <category>ai</category>
      <category>tooling</category>
    </item>
    <item>
      <title>Kwaak 0.8 - More LLMs and usability improvements</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Fri, 07 Feb 2025 15:59:20 +0000</pubDate>
      <link>https://forem.com/timonv/kwaak-08-more-llms-and-usability-improvements-35b7</link>
      <guid>https://forem.com/timonv/kwaak-08-more-llms-and-usability-improvements-35b7</guid>
      <description>&lt;p&gt;Hey everyone,&lt;/p&gt;

&lt;p&gt;A month ago we released &lt;a href="https://github.com/bosun-ai/kwaak" rel="noopener noreferrer"&gt;Kwaak&lt;/a&gt;, a terminal app that allows you to spawn many coding agents in parallel, on your own machine (open and free). The idea: Have AI burn through your tech debt and backlog, so we as engineers can work on the fun creative stuff.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4oo7neiut0qic3k9kx4q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4oo7neiut0qic3k9kx4q.png" alt="Image description" width="800" height="497"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Since we released it, we've received a lot of positive and constructive feedback &amp;lt;3 I'd like to share some highlights of what we've shipped since.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ollama + OpenRouter support&lt;/li&gt;
&lt;li&gt;A lot is now configurable; no more opinionated workflows&lt;/li&gt;
&lt;li&gt;Pull and show the agent's diff&lt;/li&gt;
&lt;li&gt;Interactive configurator&lt;/li&gt;
&lt;li&gt;Many, many usability fixes and improvements &lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>ai</category>
      <category>tui</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Swiftide 0.16 brings AI agents to Rust</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Tue, 07 Jan 2025 17:05:41 +0000</pubDate>
      <link>https://forem.com/timonv/swiftide-016-brings-ai-agents-to-rust-3o17</link>
      <guid>https://forem.com/timonv/swiftide-016-brings-ai-agents-to-rust-3o17</guid>
      <description>&lt;p&gt;A pretty huge release for Swiftide, 0.16 allows you to build AI agents in Rust. You know, that seemingly hot AI thing everybody talks about, with promises of doing it all for you.&lt;/p&gt;

&lt;p&gt;I'm a bit more optimistic: I think we just aren't there yet. And when doing agents, performance and reliability suddenly really start to matter. At the same time, the space is moving extremely, ridiculously fast, so we need building blocks to keep innovating at the same pace and not get stuck on infrastructure (or lifetime) issues.&lt;/p&gt;

&lt;p&gt;I think with this release we put a step forward in doing exactly that.&lt;/p&gt;

&lt;p&gt;Check out the full release post at &lt;a href="https://bosun.ai/posts/swiftide-0-16" rel="noopener noreferrer"&gt;bosun.ai&lt;/a&gt; or jump right in on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>rust</category>
      <category>ai</category>
      <category>rag</category>
    </item>
    <item>
      <title>Swiftide 0.12 - Hybrid Search, search filters, parquet loader, and a giant speed bump</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Fri, 13 Sep 2024 15:59:55 +0000</pubDate>
      <link>https://forem.com/timonv/swiftide-012-hybrid-search-search-filters-parquet-loader-and-a-giant-speed-bump-4m9c</link>
      <guid>https://forem.com/timonv/swiftide-012-hybrid-search-search-filters-parquet-loader-and-a-giant-speed-bump-4m9c</guid>
      <description>&lt;p&gt;Excited to announce Swiftide 0.12 🚀 A Rust library for building AI applications using retrieval augmented generation.&lt;/p&gt;

&lt;p&gt;Retrieving the most relevant information for a given query is the key challenge when building AI applications. Research and our own experience show that similarity search on vectors alone is not enough. The idea behind hybrid search is fairly simple:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;Retrieve n documents with similarity search&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Retrieve n documents with another kind of search (e.g. full-text, sparse vectors)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Rerank the documents for relevancy&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Take the top k documents&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;
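The merge-and-rerank in steps 3 and 4 can be sketched in plain Rust. This is an illustrative stand-in, not Swiftide's or Qdrant's actual API; it uses reciprocal rank fusion (RRF), one common way to combine the two result lists, which is not necessarily the strategy Swiftide uses:

```rust
use std::collections::HashMap;

// Illustrative sketch of steps 3 and 4: fuse two candidate lists and keep
// the top-k. Documents ranked well in both lists rise to the top.
fn fuse_top_k(dense: &[&str], sparse: &[&str], k: usize) -> Vec<String> {
    let mut scores: HashMap<&str, f64> = HashMap::new();
    for list in [dense, sparse] {
        for (rank, doc) in list.iter().copied().enumerate() {
            // RRF: each list contributes 1 / (60 + rank) to a document's
            // score; 60 is the conventional smoothing constant.
            *scores.entry(doc).or_insert(0.0) += 1.0 / (60.0 + rank as f64);
        }
    }
    let mut ranked: Vec<(&str, f64)> = scores.into_iter().collect();
    ranked.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap());
    ranked.into_iter().take(k).map(|(d, _)| d.to_string()).collect()
}

fn main() {
    let dense = ["doc_a", "doc_b", "doc_c"]; // similarity search results
    let sparse = ["doc_b", "doc_d"]; // e.g. full-text search results
    // doc_b appears in both lists, so it is ranked first
    println!("{:?}", fuse_top_k(&dense, &sparse, 2));
}
```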

&lt;p&gt;Swiftide can now do hybrid search with #Qdrant, supports filters in search, can load Parquet files for indexing (e.g. from #HuggingFace), and is a &lt;em&gt;lot&lt;/em&gt; faster!&lt;/p&gt;

&lt;p&gt;Trumpets and a big thanks to @ephraimkunz for his first contribution! Read the full release post &lt;a href="https://bosun.ai/posts/swiftide-0-12/" rel="noopener noreferrer"&gt;here&lt;/a&gt;, or check us out on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;github&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>rust</category>
      <category>rag</category>
      <category>opensource</category>
    </item>
    <item>
      <title>Swiftide 0.9, a Rust-native library for building LLM applications with RAG, brings Fluvio, LanceDB and Ragas support</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Mon, 02 Sep 2024 14:30:36 +0000</pubDate>
      <link>https://forem.com/timonv/swiftide-09-a-rust-native-library-for-building-llm-applications-with-rag-brings-fluvio-lancedb-and-ragas-support-3ai6</link>
      <guid>https://forem.com/timonv/swiftide-09-a-rust-native-library-for-building-llm-applications-with-rag-brings-fluvio-lancedb-and-ragas-support-3ai6</guid>
      <description>&lt;p&gt;Introducing Swiftide 0.9 with Fluvio as a starting point for indexing streams, lancedb for querying and indexing and RAGAS support&lt;/p&gt;

&lt;p&gt;&lt;em&gt;To get started with Swiftide, head over to &lt;a href="https://swiftide.rs" rel="noopener noreferrer"&gt;swiftide.rs&lt;/a&gt;, check us out on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;github&lt;/a&gt;, or hit us up on &lt;a href="https://discord.gg/3jjXYen9UY" rel="noopener noreferrer"&gt;discord&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Fluvio support
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://fluvio.io" rel="noopener noreferrer"&gt;Fluvio&lt;/a&gt; is a lightweight high-performance distributed data streaming system written in Rust and Web Assembly. In a production environment, data could come and go to many places at the same time. With the Fluvio loader, you can hook into a Fluvio topic and index right away.&lt;/p&gt;

&lt;p&gt;The integration is fully configurable with Fluvio's own configuration. Here is an example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;TOPIC_NAME&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"hello-rust"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="n"&gt;PARTITION_NUM&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;u32&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;loader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;Fluvio&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="nf"&gt;.consumer_config_ext&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nn"&gt;ConsumerConfigExt&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="nf"&gt;.topic&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TOPIC_NAME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.partition&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;PARTITION_NUM&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.offset_start&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;fluvio&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Offset&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_end&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
            &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nn"&gt;indexing&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Pipeline&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;loader&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;.then_in_batch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nn"&gt;Embed&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;FastEmbed&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;try_default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="nf"&gt;.unwrap&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
    &lt;span class="nf"&gt;.then_store_with&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="nn"&gt;Qdrant&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
            &lt;span class="nf"&gt;.batch_size&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.vector_size&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;384&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.collection_name&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"swiftide-examples"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;.run&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  LanceDB
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://lancedb.com" rel="noopener noreferrer"&gt;Lancedb&lt;/a&gt; is a popular vector and text database that separates storage from compute. It enables a whole new class of applications where the database is embedded into the application itself. Under the hood it uses Apache Arrow and Tantivy.&lt;/p&gt;

&lt;p&gt;Both indexing with different embedded fields and querying are supported. Note that LanceDB does not really need sparse vectors, as it provides full text search as well. Additionally, Swiftide does not index the stored data itself.&lt;/p&gt;

&lt;p&gt;An example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;lancedb&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;LanceDB&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;builder&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
  &lt;span class="nf"&gt;.uri&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;tempdir&lt;/span&gt;&lt;span class="nf"&gt;.child&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"lancedb"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.to_str&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="nf"&gt;.vector_size&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1536&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="nf"&gt;.with_vector&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;EmbeddedField&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="n"&gt;Combined&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="nf"&gt;.with_metadata&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;METADATA_QA_TEXT_NAME&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="nf"&gt;.table_name&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"swiftide_test"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
  &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// and in your indexing pipeline&lt;/span&gt;
&lt;span class="n"&gt;indexing_pipeline&lt;/span&gt;&lt;span class="nf"&gt;.then_store_with&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;lancedb&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;

&lt;span class="c1"&gt;// and in your query pipeline&lt;/span&gt;
&lt;span class="n"&gt;query_pipeline&lt;/span&gt;&lt;span class="nf"&gt;.then_retrieve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;lancedb&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  RAGAS support
&lt;/h3&gt;

&lt;p&gt;Evaluating RAG pipelines is a whole topic by itself. A query pipeline can now take an evaluator, and we have built one for &lt;a href="https://ragas.io" rel="noopener noreferrer"&gt;RAGAS&lt;/a&gt; to kick things off. The evaluator can export to JSON, which can be imported into a Python Hugging Face dataset and used with RAGAS.&lt;/p&gt;

&lt;p&gt;This is great, as it will allow you to evaluate the quality of your data, indexing and querying.&lt;/p&gt;

&lt;p&gt;A full guide is coming soon!&lt;/p&gt;

&lt;p&gt;An example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;ragas&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;evaluators&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;ragas&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Ragas&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_prepared_questions&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;questions&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;pipeline&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nn"&gt;query&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Pipeline&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;default&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="nf"&gt;.evaluate_with&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ragas&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
    &lt;span class="nf"&gt;.then_transform_query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;GenerateSubquestions&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="py"&gt;.openai&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
    &lt;span class="nf"&gt;.then_transform_query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;query_transformers&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;Embed&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="py"&gt;.openai&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt;
    &lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="nf"&gt;.then_retrieve&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="py"&gt;.qdrant&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
    &lt;span class="nf"&gt;.then_answer&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;Simple&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_client&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;context&lt;/span&gt;&lt;span class="py"&gt;.openai&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;()));&lt;/span&gt;

&lt;span class="n"&gt;pipeline&lt;/span&gt;&lt;span class="nf"&gt;.query_all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ragas&lt;/span&gt;&lt;span class="nf"&gt;.questions&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;let&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ragas&lt;/span&gt;&lt;span class="nf"&gt;.to_json&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="nn"&gt;std&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nn"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"output.json"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;json&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And then in a Python notebook:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;datasets&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;load_dataset&lt;/span&gt;
&lt;span class="n"&gt;dataset&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;load_dataset&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;json&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;data_file&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;output.json&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;ragas.metrics&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;answer_relevancy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;faithfulness&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;context_recall&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;context_precision&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;pandas&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;pd&lt;/span&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;ragas&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;evaluate&lt;/span&gt;

&lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;evaluate&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;dataset&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;metrics&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;answer_relevancy&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;faithfulness&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;context_recall&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="n"&gt;context_precision&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;to_pandas&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="c1"&gt;# ... And create some amazing plots!
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;From our own experimentation, Rust feature flags are great for quickly trying out many different options for getting data in.&lt;/p&gt;
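For instance, a hypothetical Cargo.toml toggling integrations per experiment (the feature names below are illustrative; check the crate documentation for the actual flags):

```toml
[dependencies]
# Enable only the integrations an experiment needs; flipping a flag
# swaps the loader or storage backend without touching pipeline code.
swiftide = { version = "0.9", features = ["qdrant", "fastembed"] }

# For a LanceDB + Fluvio experiment, change the flags instead:
# swiftide = { version = "0.9", features = ["lancedb", "fluvio"] }
```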

&lt;h3&gt;
  
  
  What's next?
&lt;/h3&gt;

&lt;p&gt;We are working hard on hybrid search in the query pipeline for both Qdrant and LanceDB, improving the query pipeline documentation, and many more improvements. If you have feedback, suggestions, or need help, feel free to reach out on Discord or via a GitHub issue.&lt;/p&gt;

&lt;h3&gt;
  
  
  Call for contributors
&lt;/h3&gt;

&lt;p&gt;There is a large list of desired features, and many more unlisted, over at our issues page, ranging from great starter issues to fun, complex challenges.&lt;/p&gt;




&lt;p&gt;You can find the full &lt;a href="https://github.com/bosun-ai/swiftide/blob/master/CHANGELOG.md" rel="noopener noreferrer"&gt;changelog here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;To get started with Swiftide, head over to &lt;a href="https://swiftide.rs" rel="noopener noreferrer"&gt;swiftide.rs&lt;/a&gt; or check us out on &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;github&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>rag</category>
      <category>llm</category>
      <category>rust</category>
      <category>data</category>
    </item>
    <item>
      <title>Announcing Swiftide, blazing fast data pipelines for RAG</title>
      <dc:creator>Timon Vonk</dc:creator>
      <pubDate>Wed, 26 Jun 2024 20:15:29 +0000</pubDate>
      <link>https://forem.com/timonv/announcing-swiftide-blazing-fast-data-pipelines-for-rag-4onb</link>
      <guid>https://forem.com/timonv/announcing-swiftide-blazing-fast-data-pipelines-for-rag-4onb</guid>
      <description>&lt;p&gt;While working with other Python-based tooling, frustrations arose around performance, stability, and ease of use.&lt;/p&gt;

&lt;p&gt;Excited to announce Swiftide, blazing fast data pipelines for Retrieval Augmented Generation, written in Rust. Python bindings soon!&lt;/p&gt;

&lt;p&gt;Check it out at &lt;a href="https://swiftide.rs" rel="noopener noreferrer"&gt;https://swiftide.rs&lt;/a&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight rust"&gt;&lt;code&gt;&lt;span class="nn"&gt;IngestionPipeline&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;from_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;FileLoader&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"."&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="nf"&gt;.with_extensions&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;&amp;amp;&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="s"&gt;"md"&lt;/span&gt;&lt;span class="p"&gt;]))&lt;/span&gt;
        &lt;span class="nf"&gt;.then_chunk&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;ChunkMarkdown&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;with_chunk_range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="o"&gt;..&lt;/span&gt;&lt;span class="mi"&gt;512&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="nf"&gt;.then&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nn"&gt;MetadataQACode&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;openai_client&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
        &lt;span class="nf"&gt;.then_in_batch&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;10&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nn"&gt;Embed&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;new&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;openai_client&lt;/span&gt;&lt;span class="nf"&gt;.clone&lt;/span&gt;&lt;span class="p"&gt;()))&lt;/span&gt;
        &lt;span class="nf"&gt;.then_store_with&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="nn"&gt;Qdrant&lt;/span&gt;&lt;span class="p"&gt;::&lt;/span&gt;&lt;span class="nf"&gt;try_from_url&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;qdrant_url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;
                &lt;span class="nf"&gt;.batch_size&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="nf"&gt;.vector_size&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1536&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
                &lt;span class="nf"&gt;.collection_name&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"swiftide-examples"&lt;/span&gt;&lt;span class="nf"&gt;.to_string&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
                &lt;span class="nf"&gt;.build&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;.run&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;.await&lt;/span&gt;&lt;span class="o"&gt;?&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Questions, feedback, complaints and great ideas are more than welcome in the comments &amp;lt;3&lt;/p&gt;


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/bosun-ai" rel="noopener noreferrer"&gt;
        bosun-ai
      &lt;/a&gt; / &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;
        swiftide
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      Fast, streaming indexing and query library for AI (RAG) applications, written in Rust
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;p&gt;&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/141bd1ac5e3a1c04a2ff181523fc22ccb693901855dc1a16bb68514002ed1796/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f616374696f6e732f776f726b666c6f772f7374617475732f626f73756e2d61692f73776966746964652f746573742e796d6c3f7374796c653d666c61742d737175617265"&gt;&lt;img src="https://camo.githubusercontent.com/141bd1ac5e3a1c04a2ff181523fc22ccb693901855dc1a16bb68514002ed1796/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f616374696f6e732f776f726b666c6f772f7374617475732f626f73756e2d61692f73776966746964652f746573742e796d6c3f7374796c653d666c61742d737175617265" alt="CI"&gt;&lt;/a&gt;
&lt;a rel="noopener noreferrer nofollow" href="https://camo.githubusercontent.com/5e830d8a61bfc304016f27b73881839840dcdadf18d24ae4a104e52357f971b8/68747470733a2f2f696d672e736869656c64732e696f2f636f766572616c6c73436f7665726167652f6769746875622f626f73756e2d61692f73776966746964653f7374796c653d666c61742d737175617265"&gt;&lt;img src="https://camo.githubusercontent.com/5e830d8a61bfc304016f27b73881839840dcdadf18d24ae4a104e52357f971b8/68747470733a2f2f696d672e736869656c64732e696f2f636f766572616c6c73436f7665726167652f6769746875622f626f73756e2d61692f73776966746964653f7374796c653d666c61742d737175617265" alt="Coverage Status"&gt;&lt;/a&gt;
&lt;a href="https://crates.io/crates/swiftide" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/fa4d8547dac53b2860525bf24162545e02b79c1293c10083671d57da116eee33/68747470733a2f2f696d672e736869656c64732e696f2f6372617465732f762f73776966746964653f6c6f676f3d72757374267374796c653d666c61742d737175617265266c6f676f436f6c6f723d45303544343426636f6c6f723d453035443434" alt="Crate Badge"&gt;&lt;/a&gt;
&lt;a href="https://docs.rs/swiftide" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/e25ebda5dfeeaa9246aed23b3c89a0b1c6509fd3fdffc6db7a859848df4f5bc6/68747470733a2f2f696d672e736869656c64732e696f2f646f637372732f73776966746964653f6c6f676f3d72757374267374796c653d666c61742d737175617265266c6f676f436f6c6f723d453035443434" alt="Docs Badge"&gt;&lt;/a&gt;
&lt;a href="https://github.com/bosun-ai/swiftide/graphs/contributors" rel="noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/87df631de423e74f41b98d8a024fee37b5bc743fe7cbdf05bb89602dcba0ce4b/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f636f6e7472696275746f72732f626f73756e2d61692f73776966746964652e7376673f7374796c653d666c61742d737175617265" alt="Contributors"&gt;&lt;/a&gt;
&lt;a href="https://github.com/bosun-ai/swiftide/stargazers" rel="noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/5579dd8ff58776c16e8363b3830467665a45695b41da818dc17e2483e0c0749f/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f73746172732f626f73756e2d61692f73776966746964652e7376673f7374796c653d666c61742d737175617265" alt="Stargazers"&gt;&lt;/a&gt;
&lt;a href="https://github.com/bosun-ai/swiftide/blob/master/LICENSE.txt" rel="noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/333bc174575e71382edf31f0aa431f26de7e7edddb867a39042760bdab037546/68747470733a2f2f696d672e736869656c64732e696f2f6769746875622f6c6963656e73652f626f73756e2d61692f73776966746964652e7376673f7374796c653d666c61742d737175617265" alt="MIT License"&gt;&lt;/a&gt;
&lt;a href="https://www.linkedin.com/company/bosun-ai" rel="nofollow noopener noreferrer"&gt;&lt;img src="https://camo.githubusercontent.com/4ce48cfece0b76d4e1ace7de4e9352ea9a51e61c46f327165ffa7c9977362905/68747470733a2f2f696d672e736869656c64732e696f2f62616467652f2d4c696e6b6564496e2d626c61636b2e7376673f7374796c653d666c61742d737175617265266c6f676f3d6c696e6b6564696e26636f6c6f72423d353535" alt="LinkedIn"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;br&gt;
&lt;div&gt;
  &lt;a href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;
    &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Fbosun-ai%2Fswiftide%2Fraw%2Fmaster%2Fimages%2Flogo.png" alt="Logo" width="250" height="250"&gt;
  &lt;/a&gt;
  &lt;div class="markdown-heading"&gt;
&lt;h3 class="heading-element"&gt;Swiftide&lt;/h3&gt;

&lt;/div&gt;


&lt;p&gt;&lt;br&gt;
Fast, streaming indexing and query library for AI applications, written in Rust&lt;br&gt;
    &lt;br&gt;&lt;br&gt;
    &lt;a href="https://swiftide.rs" rel="nofollow noopener noreferrer"&gt;&lt;strong&gt;Read more on swiftide.rs »&lt;/strong&gt;&lt;/a&gt;&lt;br&gt;
    &lt;br&gt;&lt;br&gt;
    &lt;br&gt;&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;&amp;lt;a href="https://docs.rs/swiftide/latest/swiftide/" rel="nofollow"&amp;gt;API Docs&amp;lt;/a&amp;gt;
·
&amp;lt;a href="https://github.com/bosun-ai/swiftide/issues/new?labels=bug&amp;amp;amp;template=bug_report.md"&amp;gt;Report Bug&amp;lt;/a&amp;gt;
·
&amp;lt;a href="https://github.com/bosun-ai/swiftide/issues/new?labels=enhancement&amp;amp;amp;template=feature_request.md"&amp;gt;Request Feature&amp;lt;/a&amp;gt;
·
&amp;lt;a href="https://discord.gg/3jjXYen9UY" rel="nofollow"&amp;gt;Discord&amp;lt;/a&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;br&gt;
&lt;/div&gt;

&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;About The Project&lt;/h2&gt;

&lt;/div&gt;

&lt;p&gt;Swiftide is a Rust native library for building LLM applications. Large language models are amazing, but need context
to solve real problems. Swiftide allows you to ingest, transform and index large amounts of data fast, and then query that data so it can be injected into prompts.
This process is called Retrieval Augmented Generation.&lt;/p&gt;
&lt;p&gt;With Swiftide, you can build your AI application from idea to production in a few lines of code.&lt;/p&gt;
&lt;div&gt;
    &lt;a rel="noopener noreferrer" href="https://github.com/bosun-ai/swiftide/blob/master/images/rag-dark.svg"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fgithub.com%2Fbosun-ai%2Fswiftide%2Fraw%2Fmaster%2Fimages%2Frag-dark.svg" alt="RAG" width="100%"&gt;&lt;/a&gt;
&lt;/div&gt;
&lt;p&gt;While working with other Python-based tooling, frustrations arose around performance, stability, and ease of use. Thus, Swiftide was born…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/bosun-ai/swiftide" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


</description>
      <category>llm</category>
      <category>rust</category>
      <category>ai</category>
      <category>data</category>
    </item>
  </channel>
</rss>
