<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Adheeban Manoharan</title>
    <description>The latest articles on Forem by Adheeban Manoharan (@iamadhee).</description>
    <link>https://forem.com/iamadhee</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F843434%2F51990104-120a-4a62-b378-a4682a2528ef.png</url>
      <title>Forem: Adheeban Manoharan</title>
      <link>https://forem.com/iamadhee</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/iamadhee"/>
    <language>en</language>
    <item>
      <title>🗃️ Lessons Learned from Migrating Huge BigQuery Datasets Across Regions</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Sat, 14 Sep 2024 16:11:00 +0000</pubDate>
      <link>https://forem.com/iamadhee/lessons-learned-from-migrating-huge-bigquery-datasets-across-regions-2kjp</link>
      <guid>https://forem.com/iamadhee/lessons-learned-from-migrating-huge-bigquery-datasets-across-regions-2kjp</guid>
      <description>&lt;p&gt;&lt;strong&gt;TL;DR&lt;/strong&gt; We were tasked with migrating a large dataset from the US to the EU due to the client's EU-based operations and GDPR compliance requirements. In this post, I’ll walk you through the challenges I faced while transferring a BigQuery dataset across regions. Since Google's documentation on this subject is quite limited, I believe this guide will be invaluable for anyone attempting a similar migration.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prelude
&lt;/h2&gt;

&lt;p&gt;When tasked with migrating datasets across the globe (for example, from the US to the EU region), we had to address three key questions before starting the process:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What limitations does Google impose on data exports?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Understanding the constraints of Google's infrastructure was crucial to avoid unforeseen issues.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;What volume of data are we dealing with?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The size and complexity of the dataset would significantly impact the migration strategy.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;How can we ensure zero downtime during the migration?&lt;/strong&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Minimizing disruption to services while the data was being transferred was a top priority.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Mind you, our task was to recreate the same dataset, with the same name and the same tables, in a different region. Now, let's go over these questions one by one.&lt;/p&gt;




&lt;h2&gt;
  
  
  Google's Limitations
&lt;/h2&gt;

&lt;p&gt;Google's BigQuery has, by nature, some understandable limitations (at the time of writing) due to its architecture. Some of them might sound pretty obvious, but I'm listing them all so that we have the full context when we go through how we solved the migration problem.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You cannot create a second dataset with the same name.&lt;/li&gt;
&lt;li&gt;You cannot create a multi-regional dataset in the Asia region; US and EU multi-regional locations are supported.&lt;/li&gt;
&lt;li&gt;You cannot export the contents of a BigQuery table that resides in the US region to a GCS bucket that resides in the EU region (no cross-regional exports).&lt;/li&gt;
&lt;li&gt;When you create a dataset, it usually takes about 20-120 seconds for it to become available to the APIs that consume it; until then you'd get 404s.&lt;/li&gt;
&lt;li&gt;Google doesn't provide support for migrating these datasets, so we have to do it ourselves.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Data Size
&lt;/h2&gt;

&lt;p&gt;Our application stores audit data in BigQuery, serving as our cold storage. Given the nature of audit logs, the datasets have grown to a colossal size. For example, some tables, even in their compressed form (Parquet format with Snappy compression), contained up to 4 terabytes of data. Since this is critical audit data, ensuring zero data loss was absolutely essential during the migration. &lt;/p&gt;

&lt;h2&gt;
  
  
  Zero Downtime
&lt;/h2&gt;

&lt;p&gt;As mentioned earlier, our audit system writes data continuously to these BigQuery datasets. This means that migrating a customer’s data without downtime or prior notification could result in data loss during the migration window. In our case, it was crucial that the migration proceeded seamlessly, with zero downtime, so as not to disrupt the customer's operations.&lt;/p&gt;




&lt;h2&gt;
  
  
  How it actually went down
&lt;/h2&gt;

&lt;p&gt;After gathering all the information we needed, below is the approach we finalised. For this illustration, let's assume we are migrating a US multi-regional dataset to an EU multi-regional location.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqs5hloxs8ailhw858d8j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqs5hloxs8ailhw858d8j.png" alt="BigQuery Migration"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Yes, I hear you. I know how confusing this image is. I suck at these engineering illustrations. So let me explain.&lt;/p&gt;

&lt;h3&gt;
  
  
  1) Export Table Data as Parquet:
&lt;/h3&gt;

&lt;p&gt;We initiated the migration by exporting the table data from the US dataset to a Google Cloud Storage (GCS) bucket in the US region. The data was saved in Parquet format, which is ideal for both storage and transfer due to its efficient compression.&lt;/p&gt;
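&lt;p&gt;As a rough sketch, the export for each table boils down to a &lt;code&gt;bq extract&lt;/code&gt; call with Parquet output and Snappy compression. The helper below (project, dataset, and bucket names are illustrative, not from our codebase) builds that invocation; the wildcard URI matters because BigQuery shards exports larger than 1 GB into multiple files:&lt;/p&gt;

```python
def build_extract_command(project, dataset, table, bucket):
    """Build the `bq extract` invocation for one table (illustrative names)."""
    # Wildcard URI: BigQuery splits exports over 1 GB into numbered shards.
    destination = f"gs://{bucket}/{dataset}/{table}/*.parquet"
    return [
        "bq", "extract",
        "--destination_format=PARQUET",
        "--compression=SNAPPY",
        f"{project}:{dataset}.{table}",
        destination,
    ]

cmd = build_extract_command("my-project", "audit_logs", "events", "us-export-bucket")
print(" ".join(cmd))
```

&lt;p&gt;In practice, one such job was launched per table; the Python client's &lt;code&gt;extract_table&lt;/code&gt; method is an equivalent route.&lt;/p&gt;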

&lt;h3&gt;
  
  
  2) Extract Table Schema and Partition Details:
&lt;/h3&gt;

&lt;p&gt;Along with the data, we extracted the table schema, partition and clustering details, saving them as JSON files. This ensured that the tables would be accurately recreated during the migration.&lt;/p&gt;
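&lt;p&gt;To make this concrete, the sketch below shows the kind of per-table JSON document we mean: one file bundling the schema with the partitioning and clustering details. The field layout mirrors what BigQuery's APIs expose, but the wrapper structure and names here are our own illustration:&lt;/p&gt;

```python
import json

def table_metadata_json(schema, partition_field=None, clustering_fields=None):
    """Bundle everything needed to recreate one table into a JSON document."""
    doc = {"schema": schema}
    if partition_field:
        doc["timePartitioning"] = {"type": "DAY", "field": partition_field}
    if clustering_fields:
        doc["clustering"] = {"fields": clustering_fields}
    return json.dumps(doc, indent=2)

doc = table_metadata_json(
    [{"name": "event_id", "type": "STRING", "mode": "REQUIRED"},
     {"name": "event_ts", "type": "TIMESTAMP", "mode": "REQUIRED"}],
    partition_field="event_ts",
    clustering_fields=["tenant_id"],
)
print(doc)
```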

&lt;h3&gt;
  
  
  3) Transfer Data to the EU bucket:
&lt;/h3&gt;

&lt;p&gt;Using the Google Cloud Storage Transfer Service, we moved the exported data from the US bucket to a bucket in the EU region. This step circumvented BigQuery’s restriction on direct cross-regional exports. We chose Storage Transfer because it was the most efficient way Google offers for moving data from one bucket to another.&lt;/p&gt;
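&lt;p&gt;For reference, a Storage Transfer Service job is just a small JSON body posted to the &lt;code&gt;transferJobs.create&lt;/code&gt; endpoint. The field names below follow the public API; the project and bucket names are placeholders:&lt;/p&gt;

```python
def transfer_job_body(project_id, src_bucket, dst_bucket):
    """Request body for a one-off GCS-to-GCS transfer (placeholder names)."""
    return {
        "projectId": project_id,
        "status": "ENABLED",
        "transferSpec": {
            "gcsDataSource": {"bucketName": src_bucket},
            "gcsDataSink": {"bucketName": dst_bucket},
            # Keep the source objects until the migration is verified.
            "transferOptions": {"deleteObjectsFromSourceAfterTransfer": False},
        },
    }

job = transfer_job_body("my-project", "us-export-bucket", "eu-import-bucket")
```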

&lt;h3&gt;
  
  
  4) Create a Temporary Dataset:
&lt;/h3&gt;

&lt;p&gt;A temporary dataset was created in the EU region to serve as an intermediary during the migration. This dataset mirrored the structure of the original tables from the US dataset, and the application would write its data to it during the migration window.&lt;/p&gt;

&lt;h3&gt;
  
  
  5) Enable the Migration Flag:
&lt;/h3&gt;

&lt;p&gt;At this stage, the migration flag was enabled in the application, ensuring that all new data writes occurred in the temporary EU dataset. This step allowed us to maintain zero downtime during the migration, as the application seamlessly redirected its writes without any interruption to the customer’s operations.&lt;/p&gt;
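&lt;p&gt;Conceptually, the flag is nothing more than a routing switch in the write path. A minimal sketch of the idea (the names are illustrative, not from our actual codebase):&lt;/p&gt;

```python
def target_dataset(dataset, migration_in_progress):
    """Route writes to the temporary EU dataset while the flag is on."""
    if migration_in_progress:
        return f"{dataset}_migration_tmp"
    return dataset

print(target_dataset("audit_logs", True))   # audit_logs_migration_tmp
print(target_dataset("audit_logs", False))  # audit_logs
```

&lt;p&gt;Because the switch lives in the application, flipping it requires no restart and no pause in ingestion, which is what makes the zero-downtime window possible.&lt;/p&gt;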

&lt;h3&gt;
  
  
  6) Delete the US Dataset:
&lt;/h3&gt;

&lt;p&gt;Once we verified that the application was writing to the EU region without any issues, the old US dataset was deleted to comply with GDPR and avoid any confusion or conflict with the migrated data.&lt;/p&gt;

&lt;h3&gt;
  
  
  7) Create a Permanent EU Dataset:
&lt;/h3&gt;

&lt;p&gt;A new permanent dataset was created in the EU region as the final destination for the migrated data. Using the schema, partitioning, and clustering details saved in step 2, the original tables were replicated exactly. After creating this dataset, the data in the EU bucket was loaded into the newly created tables.&lt;/p&gt;
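&lt;p&gt;Loading the Parquet files back is the mirror image of the export in step 1: a &lt;code&gt;bq load&lt;/code&gt; per table against the recreated schema. A sketch with illustrative names:&lt;/p&gt;

```python
def build_load_command(project, dataset, table, bucket):
    """Build the `bq load` invocation that restores one table's Parquet shards."""
    source = f"gs://{bucket}/{dataset}/{table}/*.parquet"
    return [
        "bq", "load",
        "--source_format=PARQUET",
        f"{project}:{dataset}.{table}",
        source,
    ]

cmd = build_load_command("my-project", "audit_logs", "events", "eu-import-bucket")
print(" ".join(cmd))
```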

&lt;h3&gt;
  
  
  8) Turn off the Migration Flag:
&lt;/h3&gt;

&lt;p&gt;Since a new EU dataset with the same name now exists, the migration flag that was enabled in step 5 can be safely disabled, making the application redirect its writes to the new EU dataset, as it rightfully should.&lt;/p&gt;

&lt;h3&gt;
  
  
  9) Export Data from the Temporary Dataset:
&lt;/h3&gt;

&lt;p&gt;During the migration window, the interim application data was written to the temporary dataset, so this data also had to be brought into the permanent EU dataset. As a first step, the data in the temporary dataset was exported to the EU bucket.&lt;/p&gt;

&lt;h3&gt;
  
  
  10) Import Data into the Permanent EU Dataset:
&lt;/h3&gt;

&lt;p&gt;The exported temporary data in the EU bucket was then imported into the permanent EU dataset in the same flow of operations. This marked the end of the migration process for the customer’s data.&lt;/p&gt;

&lt;h3&gt;
  
  
  11) Cleanup:
&lt;/h3&gt;

&lt;p&gt;After ensuring that the migration was successful, the contents of the two buckets (US and EU), along with the temporary dataset, were deleted. Until then, this data served as a fallback.&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Thanks for sticking with me till the very end! I know this might not be the most efficient approach out there, but it ensured zero data loss and zero downtime during the migration for us. There is very little documentation on this topic, and endlessly browsing for it is what pushed me to write this blog. I'd love to hear your opinions on this.&lt;/p&gt;

&lt;p&gt;Cheers, see you in the next one! 😉&lt;/p&gt;

</description>
      <category>googlecloud</category>
      <category>database</category>
      <category>analytics</category>
      <category>cloud</category>
    </item>
    <item>
      <title>The Battle of PR-Agents⚡</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Mon, 11 Dec 2023 13:25:45 +0000</pubDate>
      <link>https://forem.com/iamadhee/the-battle-of-pr-agents-n00</link>
      <guid>https://forem.com/iamadhee/the-battle-of-pr-agents-n00</guid>
      <description>&lt;p&gt;In the ever-evolving landscape of software development, the effective management of pull requests (PRs) stands as a linchpin for collaboration, code quality, and project success. This blog is entirely dedicated to my personal quest for a great AI driven PR-Agent, along with my findings. Among the myriad of tools available, &lt;strong&gt;&lt;a href="https://www.codium.ai/products/git-plugin/" rel="noopener noreferrer"&gt;CodiumAI's PR-Agent&lt;/a&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;a href="https://docs.github.com/en/copilot/github-copilot-enterprise/copilot-pull-request-summaries/about-copilot-pull-request-summaries" rel="noopener noreferrer"&gt;GitHub Copilot&lt;/a&gt;&lt;/strong&gt; emerge as contenders that promise to streamline the PR review process and elevate the developer experience. In this extensive exploration, we will delve into the depths of these tools, dissecting each feature to understand their nuances, strengths, and how they contribute to reducing time exceptionally – a crucial aspect in the fast-paced world of modern software development.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;1. Command Arsenal: A Multifaceted Symphony&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  CodiumAI PR-Agent:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Auto Description (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/DESCRIBE.md" rel="noopener noreferrer"&gt;/describe&lt;/a&gt;):&lt;/strong&gt; This feature stands out by automatically generating a comprehensive PR description that includes the title, type, summary, code walkthrough, and labels. It ensures that the reviewer gets a holistic view of the changes introduced.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Auto Review (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/REVIEW.md" rel="noopener noreferrer"&gt;/review&lt;/a&gt;):&lt;/strong&gt; Offering flexibility, PR-Agent's auto review provides adjustable feedback on various aspects such as the PR's main theme, type, relevant tests, security issues, score, and diverse content suggestions.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Question Answering (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/ASK.md" rel="noopener noreferrer"&gt;/ask ...&lt;/a&gt;):&lt;/strong&gt; Facilitating communication, this feature enables free-text interaction by answering questions related to the PR, promoting collaboration and clear understanding.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Code Suggestions (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/IMPROVE.md" rel="noopener noreferrer"&gt;/improve&lt;/a&gt;):&lt;/strong&gt; A powerful addition to the arsenal, this feature provides commit-worthy code suggestions, aiding developers in enhancing the quality of their code.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Update Changelog (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/UPDATE_CHANGELOG.md" rel="noopener noreferrer"&gt;/update_changelog&lt;/a&gt;):&lt;/strong&gt; Streamlining documentation, PR-Agent automatically updates the CHANGELOG.md file with the PR changes, ensuring that the documentation stays up-to-date.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Find Similar Issue (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/SIMILAR_ISSUE.md" rel="noopener noreferrer"&gt;/similar_issue&lt;/a&gt;):&lt;/strong&gt; Enhancing issue management, this feature retrieves and presents similar issues associated with the PR, aiding in thorough issue resolution.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add Documentation (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/ADD_DOCS.md" rel="noopener noreferrer"&gt;/add_docs&lt;/a&gt;):&lt;/strong&gt; Addressing completeness, PR-Agent automatically adds documentation to undocumented functions/classes in the PR, promoting better code documentation practices.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Generate Custom Labels (&lt;a href="https://github.com/Codium-ai/pr-agent/blob/main/docs/GENERATE_LABELS.md" rel="noopener noreferrer"&gt;/generate_labels&lt;/a&gt;):&lt;/strong&gt; Streamlining organization, this feature suggests custom labels based on PR code changes, ensuring a standardized labeling process.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  GitHub Copilot:
&lt;/h3&gt;

&lt;p&gt;GitHub Copilot primarily focuses on a single command, centering around code suggestions derived from natural language prompts. While proficient in generating code snippets, it lacks the multifaceted approach seen in PR-Agent's symphony of commands.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Analysis:&lt;/em&gt;&lt;br&gt;
PR-Agent's multifaceted command set orchestrates a harmonious blend, catering to the diverse needs of the PR review process. In contrast, Copilot's spotlight on code suggestions is a solo performance, lacking the richness found in PR-Agent's symphony of commands.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;2. Platform Universality: Bridging Git Divides&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  CodiumAI PR-Agent:
&lt;/h3&gt;

&lt;p&gt;Compatible with all Git platforms, PR-Agent ensures flexibility for teams using diverse Git repositories. This universality promotes collaboration across various platforms, breaking down silos.&lt;/p&gt;

&lt;h3&gt;
  
  
  GitHub Copilot:
&lt;/h3&gt;

&lt;p&gt;Tailored for GitHub, Copilot's integration capabilities are potentially limited when it comes to non-GitHub Git platforms. This may pose challenges for teams working in mixed Git environments.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Analysis:&lt;/em&gt;&lt;br&gt;
PR-Agent's universal compatibility allows for seamless collaboration across various Git repositories, surpassing the constraints observed in GitHub Copilot, which caters primarily to GitHub users.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;3. Open-Source Dynamics: A Community-Driven Ethos&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  CodiumAI PR-Agent:
&lt;/h3&gt;

&lt;p&gt;Flourishing as an open-source solution, PR-Agent embraces transparency and encourages community contributions. The collaborative ethos allows the community to contribute to the tool's continuous improvement.&lt;/p&gt;

&lt;h3&gt;
  
  
  GitHub Copilot:
&lt;/h3&gt;

&lt;p&gt;Enveloped in closed code, Copilot restricts visibility into underlying algorithms. While a robust tool, its closed-code nature limits the community-driven enhancements seen in open-source solutions.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Analysis:&lt;/em&gt;&lt;br&gt;
PR-Agent's open-source nature invites community-driven improvements, fostering transparency. In contrast, Copilot's closed-code approach may hinder collaborative contributions.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;4. Integration Landscape: A Seamless Experience&lt;/strong&gt;
&lt;/h2&gt;

&lt;h3&gt;
  
  
  CodiumAI PR-Agent:
&lt;/h3&gt;

&lt;p&gt;PR-Agent embeds itself seamlessly into both Git and popular Integrated Development Environments (IDEs). This ensures a harmonious experience for developers, regardless of their preferred environment.&lt;/p&gt;

&lt;h3&gt;
  
  
  GitHub Copilot:
&lt;/h3&gt;

&lt;p&gt;Primarily designed for Git, Copilot may have limited or no native integration into popular IDEs. This could pose challenges for developers who rely heavily on integrated development environments.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Analysis:&lt;/em&gt;&lt;br&gt;
PR-Agent's dual integration into both Git and IDE environments ensures a consistent developer experience, offering flexibility not necessarily found in Copilot's primarily Git-centric design.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  &lt;strong&gt;5. Time Efficiency: The Quintessence of PR Management&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;In the fast-paced world of software development, time is a precious commodity. Both CodiumAI's PR-Agent and GitHub Copilot recognize the urgency and need for efficiency in the PR management process. The multifaceted command set of PR-Agent, encompassing auto-description, review, question answering, and more, is crafted to expedite the review process while maintaining a high standard of quality. The ability to automate documentation updates, find similar issues, and suggest custom labels significantly reduces the manual overhead, allowing developers to allocate their time to more critical tasks.&lt;/p&gt;

&lt;p&gt;Similarly, GitHub Copilot's prowess in generating code snippets from natural language prompts streamlines the coding process, reducing the time required to write and review code. While Copilot's focus is more centered on code suggestions, its efficiency in this domain is undeniable, contributing to accelerated development cycles.&lt;/p&gt;

&lt;p&gt;In essence, both tools present a paradigm shift in how developers approach PR management, emphasizing time efficiency without compromising on quality. By automating repetitive tasks, offering intelligent suggestions, and promoting collaboration, these tools empower development teams to navigate the complex PR landscape with agility.&lt;/p&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;In Conclusion: CodiumAI's PR-Agent Takes the Lead&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;While both CodiumAI's PR-Agent and GitHub Copilot bring innovation to the PR management landscape, the versatility, open-source nature, and holistic approach of PR-Agent make it stand out. &lt;strong&gt;&lt;em&gt;&lt;a href="https://www.codium.ai/" rel="noopener noreferrer"&gt;CodiumAI&lt;/a&gt;'s commitment to reducing the manual overhead, fostering collaboration, and accommodating diverse Git platforms positions PR-Agent as a go-to tool for development teams striving for efficiency and excellence in their PR workflows.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As you navigate the intricacies of PR solutions, the key consideration remains:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If it aligns with your multifaceted needs, accelerates your development workflow, and provides a comprehensive solution, CodiumAI's PR-Agent is a tool worth exploring!&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  &lt;strong&gt;The Problem&lt;/strong&gt;
&lt;/h2&gt;

&lt;p&gt;Now let me explain why I did this. The reason I went into this PR-Agent exploration is that I had a problem at hand. I'm an independent developer, and for most of the projects I work on, I end up reviewing my own PRs, which leaves a lot of room for oversights. I wanted an intelligent system to serve as my co-developer, reviewing my PRs for me and providing valuable review comments. CodiumAI's PR-Agent did exactly what I needed. Attached here is the review done by CodiumAI for one of my PRs.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1jw04q9f68iw6h8fltf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv1jw04q9f68iw6h8fltf.png" alt="docker CLI tool for CodiumAI PR-Agent" width="800" height="425"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpvm8eybozt7ileym91ha.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpvm8eybozt7ileym91ha.png" alt="PR Review by CodiumAI" width="800" height="933"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that's about it. Share your insights and thoughts on this detailed comparison in the comments below. Hope this helps. Until the next one - cheers!&lt;/p&gt;

</description>
      <category>opensource</category>
      <category>productivity</category>
      <category>ai</category>
    </item>
    <item>
      <title>Merge Sort</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Tue, 24 Oct 2023 06:56:08 +0000</pubDate>
      <link>https://forem.com/iamadhee/merge-sort-56mp</link>
      <guid>https://forem.com/iamadhee/merge-sort-56mp</guid>
      <description>&lt;p&gt;Today, we're going to dive into the world of Merge Sort, a widely used sorting algorithm known for its efficiency, reliability, and stability.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is Merge Sort?
&lt;/h2&gt;

&lt;p&gt;Merge Sort is a sorting algorithm that falls under the "divide and conquer" category. It was developed in 1945 by John von Neumann and is often used for sorting arrays. The basic idea behind Merge Sort is to divide a large array into smaller, more manageable sub-arrays, sort them, and then merge these sub-arrays back together in a sorted manner.&lt;/p&gt;

&lt;p&gt;Merge Sort operates recursively and is known for its stable sorting properties, making it an excellent choice for sorting large data sets.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Merge Sort Works
&lt;/h2&gt;

&lt;p&gt;The key to understanding Merge Sort lies in its name. It "merges" and "sorts" to achieve its goal. Here's a high-level overview of the process:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Divide&lt;/strong&gt;: The unsorted array is divided into two equal halves until individual elements are reached.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Conquer&lt;/strong&gt;: Each of these smaller arrays is sorted recursively using Merge Sort.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Merge&lt;/strong&gt;: Finally, the sorted sub-arrays are merged back together, maintaining their order.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This process continues until the entire array is sorted.&lt;/p&gt;

&lt;h2&gt;
  
  
  Time and Space Complexity
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Time Complexity&lt;/strong&gt;: Merge Sort has a time complexity of O(n log n), which makes it efficient for sorting large data sets. The array is halved O(log n) times, and each level of merging does O(n) work.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Space Complexity&lt;/strong&gt;: The space complexity of Merge Sort is O(n), which means it requires additional memory for creating and merging sub-arrays. This can be a drawback for large arrays with limited memory.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Implementation with Python
&lt;/h2&gt;

&lt;p&gt;Let's take a look at a Python implementation of Merge Sort. This code demonstrates the divide-and-conquer approach to sorting an array.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;merge_sort&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;arr&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;arr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;arr&lt;/span&gt;

    &lt;span class="c1"&gt;# Divide the array into two halves
&lt;/span&gt;    &lt;span class="n"&gt;middle&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;arr&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;//&lt;/span&gt; &lt;span class="mi"&gt;2&lt;/span&gt;
    &lt;span class="n"&gt;left_half&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;arr&lt;/span&gt;&lt;span class="p"&gt;[:&lt;/span&gt;&lt;span class="n"&gt;middle&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="n"&gt;right_half&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;arr&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;middle&lt;/span&gt;&lt;span class="p"&gt;:]&lt;/span&gt;

    &lt;span class="c1"&gt;# Recursively sort both halves
&lt;/span&gt;    &lt;span class="n"&gt;left_half&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;merge_sort&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left_half&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;right_half&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;merge_sort&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;right_half&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="c1"&gt;# Merge the sorted halves
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;merge&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left_half&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;right_half&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;merge&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;right&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="n"&gt;left_index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;right_index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;

    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="n"&gt;left_index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;right_index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="nf"&gt;len&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;right&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;left&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;left_index&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="n"&gt;right&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;right_index&lt;/span&gt;&lt;span class="p"&gt;]:&lt;/span&gt;
            &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;left_index&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
            &lt;span class="n"&gt;left_index&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;right&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;right_index&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
            &lt;span class="n"&gt;right_index&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;extend&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;left&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;left_index&lt;/span&gt;&lt;span class="p"&gt;:])&lt;/span&gt;
    &lt;span class="n"&gt;result&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;extend&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;right&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;right_index&lt;/span&gt;&lt;span class="p"&gt;:])&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;result&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
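&lt;p&gt;For comparison, the same divide-and-conquer idea can be written more compactly with the standard library's &lt;code&gt;heapq.merge&lt;/code&gt;, which performs the (stable) merge step for us. This is a sketch of an alternative, not a replacement for the explicit version above:&lt;/p&gt;

```python
from heapq import merge as heap_merge

def merge_sort_compact(arr):
    # Divide until single elements, then let heapq.merge do the stable merge.
    if len(arr) > 1:
        mid = len(arr) // 2
        return list(heap_merge(merge_sort_compact(arr[:mid]),
                               merge_sort_compact(arr[mid:])))
    return arr

print(merge_sort_compact([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```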



&lt;p&gt;Now that's about merge sort. See you in the next one. Until then, Sayonara!&lt;/p&gt;

</description>
      <category>python</category>
      <category>beginners</category>
      <category>tutorial</category>
      <category>algorithms</category>
    </item>
    <item>
      <title>Doubly Linked Lists</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Mon, 23 Oct 2023 14:12:25 +0000</pubDate>
      <link>https://forem.com/iamadhee/doubly-linked-lists-4c8o</link>
      <guid>https://forem.com/iamadhee/doubly-linked-lists-4c8o</guid>
      <description>&lt;p&gt;In the last blog post we saw the implementation of Singly Linked List. Now in this post, we'll have detailed a look at his brother the Doubly Linked List.&lt;/p&gt;




&lt;h2&gt;
  
  
  Understanding Doubly Linked Lists
&lt;/h2&gt;

&lt;p&gt;Just like singly linked lists, &lt;strong&gt;a doubly linked list&lt;/strong&gt; is a data structure that consists of a collection of nodes. Each node, in this case, is a separate object, but what makes it unique is that each node is linked not only to the next but also to the previous node. This two-way linkage allows for more efficient traversal in both directions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F62ul32pe1fc9li9kfhna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F62ul32pe1fc9li9kfhna.png" alt="Doubly Linked List"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In contrast to the singly linked list, which only points to the next element, a doubly linked list has two pointers in each node, one pointing to the next element and one pointing to the previous element.&lt;/p&gt;

&lt;h3&gt;
  
  
  Adding a Node
&lt;/h3&gt;

&lt;p&gt;Adding a node in a doubly linked list is a bit more intricate than in a singly linked list. Suppose we have a node &lt;code&gt;cur&lt;/code&gt; that we want to insert between the &lt;code&gt;prev&lt;/code&gt; node and the &lt;code&gt;next&lt;/code&gt; node. To achieve this, we must perform the following steps:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set the reference field of the &lt;code&gt;prev&lt;/code&gt; node to point to the &lt;code&gt;cur&lt;/code&gt; node.&lt;/li&gt;
&lt;li&gt;Set the reference field of the &lt;code&gt;cur&lt;/code&gt; node to point to the &lt;code&gt;next&lt;/code&gt; node.&lt;/li&gt;
&lt;li&gt;Adjust the reference field of the &lt;code&gt;next&lt;/code&gt; node to also point back to the &lt;code&gt;cur&lt;/code&gt; node, completing the two-way linkage.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The insertion process in a doubly linked list still maintains its efficiency, and you can insert a new node in O(1) time complexity when you have references to the &lt;code&gt;prev&lt;/code&gt; and &lt;code&gt;next&lt;/code&gt; nodes.&lt;/p&gt;
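&lt;p&gt;The three steps above can be sketched in a few lines. Note that &lt;code&gt;insert_between&lt;/code&gt; and the bare &lt;code&gt;Node&lt;/code&gt; class here are just illustrative helpers for this post, not part of the full implementation further below:&lt;/p&gt;

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.prev = None
        self.next = None

def insert_between(prev, cur, nxt):
    prev.next = cur   # step 1: prev now points forward to cur
    cur.prev = prev
    cur.next = nxt    # step 2: cur points forward to nxt
    nxt.prev = cur    # step 3: nxt points back to cur, completing the linkage

# Build 1 <-> 3, then splice 2 in between in O(1).
a, c = Node(1), Node(3)
a.next, c.prev = c, a
b = Node(2)
insert_between(a, b, c)
print(a.next.val, c.prev.val)  # 2 2
```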

&lt;h3&gt;
  
  
  Deleting a Node
&lt;/h3&gt;

&lt;p&gt;Deleting a node in a doubly linked list, much like insertion, requires a bit more work due to the two-way linkage. Let's consider a node &lt;code&gt;cur&lt;/code&gt; that we want to delete:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Find the &lt;code&gt;cur&lt;/code&gt; node's previous node &lt;code&gt;prev&lt;/code&gt; and next node &lt;code&gt;next&lt;/code&gt; first.&lt;/li&gt;
&lt;li&gt;Adjust the reference field of the &lt;code&gt;prev&lt;/code&gt; node to point to the &lt;code&gt;next&lt;/code&gt; node, breaking the linkage from the &lt;code&gt;cur&lt;/code&gt; node.&lt;/li&gt;
&lt;li&gt;Adjust the reference field of the &lt;code&gt;next&lt;/code&gt; node to point back to the &lt;code&gt;prev&lt;/code&gt; node.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In a doubly linked list, finding the &lt;code&gt;next&lt;/code&gt; node using the reference field is simple. Unlike in a singly linked list, finding the previous node &lt;code&gt;prev&lt;/code&gt; is just as easy, since it is stored directly in the &lt;code&gt;cur&lt;/code&gt; node's &lt;code&gt;prev&lt;/code&gt; reference. The time complexity for deleting a node therefore remains O(1), given a reference to &lt;code&gt;cur&lt;/code&gt;.&lt;/p&gt;
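&lt;p&gt;As a minimal sketch (the &lt;code&gt;unlink&lt;/code&gt; helper and bare &lt;code&gt;Node&lt;/code&gt; class are illustrative names for this post, not part of the class below), deletion is just re-wiring the two neighbours:&lt;/p&gt;

```python
class Node:
    def __init__(self, val):
        self.val = val
        self.prev = None
        self.next = None

def unlink(cur):
    # Bypass cur in both directions; O(1) since cur.prev is one hop away.
    cur.prev.next = cur.next
    cur.next.prev = cur.prev

# Build 1 <-> 2 <-> 3, then remove the middle node.
a, b, c = Node(1), Node(2), Node(3)
a.next, b.prev = b, a
b.next, c.prev = c, b
unlink(b)
print(a.next.val, c.prev.val)  # 3 1
```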

&lt;h2&gt;
  
  
  Implementation with Python
&lt;/h2&gt;

&lt;p&gt;Now, let's dive into the implementation of a doubly linked list in Python:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;

&lt;span class="c1"&gt;# ListNode class represents a node in the doubly linked list.
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ListNode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;  &lt;span class="c1"&gt;# Value of the current node
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;  &lt;span class="c1"&gt;# Reference to the previous node
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;  &lt;span class="c1"&gt;# Reference to the next node
&lt;/span&gt;
&lt;span class="c1"&gt;# MyLinkedList class implements the doubly linked list with the specified methods.
&lt;/span&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;MyLinkedList&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;  &lt;span class="c1"&gt;# Reference to the head node (the first node)
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;  &lt;span class="c1"&gt;# Reference to the tail node (the last node)
&lt;/span&gt;        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;     &lt;span class="c1"&gt;# Number of nodes in the linked list
&lt;/span&gt;
    &lt;span class="c1"&gt;# Get the value of the node at the specified index in the linked list.
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;  &lt;span class="c1"&gt;# Return -1 for an invalid index
&lt;/span&gt;
        &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;

    &lt;span class="c1"&gt;# Add a new node with the given value to the head of the linked list.
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;addAtHead&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;new_node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ListNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;new_node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="c1"&gt;# Append a new node with the given value to the tail of the linked list.
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;addAtTail&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;new_node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ListNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;new_node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="c1"&gt;# Add a new node with the given value before the node at the specified index.
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;addAtIndex&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;  &lt;span class="c1"&gt;# Do nothing for an invalid index
&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;addAtHead&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;addAtTail&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;new_node&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ListNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

            &lt;span class="n"&gt;new_node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;
            &lt;span class="n"&gt;new_node&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new_node&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="c1"&gt;# Delete the node at the specified index in the linked list.
&lt;/span&gt;    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;deleteAtIndex&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;  &lt;span class="c1"&gt;# Do nothing for an invalid index
&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="k"&gt;elif&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;tail&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
            &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
                &lt;span class="n"&gt;current&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;current&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;prev&lt;/span&gt;

        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;size&lt;/span&gt; &lt;span class="o"&gt;-=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;That wraps up Doubly Linked Lists. See you in the next one. Until then, Sayonara!&lt;/p&gt;

</description>
      <category>algorithms</category>
      <category>beginners</category>
      <category>python</category>
      <category>career</category>
    </item>
    <item>
      <title>Tortoise and Hare Algorithm</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Sun, 22 Oct 2023 14:24:17 +0000</pubDate>
      <link>https://forem.com/iamadhee/tortoise-and-hare-algorithm-2a5f</link>
      <guid>https://forem.com/iamadhee/tortoise-and-hare-algorithm-2a5f</guid>
      <description>&lt;p&gt;We've all heard the story of the tortoise and the hare, right? Well, there's an algorithm inspired by it. &lt;/p&gt;

&lt;h2&gt;
  
  
  What is this Algo?
&lt;/h2&gt;

&lt;p&gt;Floyd's Cycle Detection Algorithm, often referred to as the "tortoise and hare" algorithm, is used to detect the presence of a cycle in a linked list. This algorithm is named after American computer scientist Robert W. Floyd, who described it in 1967. It's a simple yet efficient technique that works by using two pointers to traverse the linked list at different speeds.&lt;/p&gt;

&lt;p&gt;Yes, it works like the story. Two pointers run through the same linked list simultaneously, at different speeds. If the linked list has a cycle (i.e., the tail is connected back to another node), the two pointers will eventually meet. If not, they never will.&lt;/p&gt;

&lt;p&gt;First of all, what do I mean by a cycle in a linked list? Refer to the images below: the first shows a linked list without a cycle (the tail node is not connected to any other node), while the second shows a linked list with a cycle, because the tail node (-4) is connected back to another node (2) in the list. Floyd's algorithm detects such cycles efficiently.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqy95dpl6y3xouydlim2k.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fqy95dpl6y3xouydlim2k.png" alt="Linked list without cycle"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffnqrqi0ueq2pi4mazei3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffnqrqi0ueq2pi4mazei3.png" alt="Linked list with cycle"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Implementation with Python
&lt;/h2&gt;

&lt;p&gt;Below is the implementation of Floyd's algorithm in Python. The &lt;code&gt;detectCycle&lt;/code&gt; function takes the head (a &lt;code&gt;ListNode&lt;/code&gt; object), traverses the list, and checks whether it contains a cycle.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;

&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;ListNode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;detectCycle&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="c1"&gt;# Initialize two pointers, slow and fast, both initially at the head of the linked list.
&lt;/span&gt;    &lt;span class="n"&gt;slow&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;head&lt;/span&gt;
    &lt;span class="n"&gt;fast&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;head&lt;/span&gt;

    &lt;span class="c1"&gt;# Step 1: Detect the cycle (Floyd's algorithm)
&lt;/span&gt;    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="n"&gt;fast&lt;/span&gt; &lt;span class="ow"&gt;and&lt;/span&gt; &lt;span class="n"&gt;fast&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="c1"&gt;# Move the slow pointer one step at a time.
&lt;/span&gt;        &lt;span class="n"&gt;slow&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;slow&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
        &lt;span class="c1"&gt;# Move the fast pointer two steps at a time.
&lt;/span&gt;        &lt;span class="n"&gt;fast&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;fast&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

        &lt;span class="c1"&gt;# If the slow and fast pointers meet, it indicates the presence of a cycle.
&lt;/span&gt;        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;slow&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="n"&gt;fast&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;  &lt;span class="c1"&gt;# Cycle detected
&lt;/span&gt;
    &lt;span class="c1"&gt;# If the fast pointer reaches the end of the list (or becomes None), there is no cycle.
&lt;/span&gt;    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;  &lt;span class="c1"&gt;# No cycle found
&lt;/span&gt;

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
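&lt;p&gt;To see it in action, the snippet below builds the list from the images (3 → 2 → 0 → -4) and runs the check before and after closing the cycle. &lt;code&gt;ListNode&lt;/code&gt; and &lt;code&gt;detectCycle&lt;/code&gt; are repeated from above so the snippet runs on its own:&lt;/p&gt;

```python
class ListNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def detectCycle(head):
    slow = fast = head
    while fast and fast.next:
        slow = slow.next           # tortoise: one step
        fast = fast.next.next      # hare: two steps
        if slow == fast:
            return True            # they met, so there is a cycle
    return False                   # hare fell off the end: no cycle

# Build 3 -> 2 -> 0 -> -4, matching the second image above.
nodes = [ListNode(v) for v in (3, 2, 0, -4)]
for a, b in zip(nodes, nodes[1:]):
    a.next = b

print(detectCycle(nodes[0]))  # False: the tail ends the list
nodes[-1].next = nodes[1]     # tail (-4) now points back at node 2
print(detectCycle(nodes[0]))  # True
```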

&lt;p&gt;See you in the next one. Until then, Sayonara!&lt;/p&gt;

</description>
      <category>algorithms</category>
      <category>beginners</category>
      <category>learning</category>
      <category>python</category>
    </item>
    <item>
      <title>Singly Linked Lists</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Sat, 21 Oct 2023 13:55:54 +0000</pubDate>
      <link>https://forem.com/iamadhee/singly-linked-lists-518n</link>
      <guid>https://forem.com/iamadhee/singly-linked-lists-518n</guid>
      <description>&lt;p&gt;In this new blog series of mine, we'll be venturing into the realm of DSA (data structures and algorithms). The intent of this series is to explore the fundamentals of DSA while sticking to the KISS (keep it simple, stupid) principle. Hence, all the implementations throughout this series will be in Python.&lt;/p&gt;

&lt;p&gt;Now that you have the context, without further ado, let's dive in.&lt;br&gt;
To start with one of the most basic data structures, we'll be implementing linked lists in this post.&lt;/p&gt;


&lt;h2&gt;
  
  
  What are Singly Linked Lists?
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;A linked list&lt;/strong&gt; is a type of data structure in which each element is actually a separate object while all the objects are linked together by a reference field. If you try to envision it, it might look something like the below representation:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pno9x5u7y1ll6njlegf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9pno9x5u7y1ll6njlegf.png" alt="Singly Linked List" width="800" height="216"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are two types of linked lists: &lt;br&gt;
1) Singly Linked List&lt;br&gt;
2) Doubly Linked list&lt;br&gt;
In this post, we'll be exploring the singly linked list specifically.&lt;/p&gt;

&lt;p&gt;In a &lt;strong&gt;Singly linked list&lt;/strong&gt;, each object is a node that holds a reference to only one other node: the next one. &lt;/p&gt;
&lt;h3&gt;
  
  
  Adding a node
&lt;/h3&gt;

&lt;p&gt;To add a node to a singly linked list, let's call the node we want to add &lt;code&gt;cur&lt;/code&gt; and initialize it. We want to insert the &lt;code&gt;cur&lt;/code&gt; node between the &lt;code&gt;prev&lt;/code&gt; node and the &lt;code&gt;next&lt;/code&gt; node. To do this, we have to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect the reference field of &lt;code&gt;prev&lt;/code&gt; node to the &lt;code&gt;cur&lt;/code&gt; node&lt;/li&gt;
&lt;li&gt;Connect the reference field of &lt;code&gt;cur&lt;/code&gt; node to the &lt;code&gt;next&lt;/code&gt; node&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Unlike an array, we don’t have to move all elements past the inserted element. Therefore, you can insert a new node into a linked list in O(1) time complexity if you have a reference to &lt;code&gt;prev&lt;/code&gt;, which is very efficient. &lt;/p&gt;
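&lt;p&gt;The two steps can be sketched like this (&lt;code&gt;insert_after&lt;/code&gt; is an illustrative helper for this post, separate from the full class below):&lt;/p&gt;

```python
class LinkedNode:
    def __init__(self, val):
        self.val = val
        self.next = None

def insert_after(prev, cur):
    # Splice cur in right after prev; O(1), nothing needs shifting.
    cur.next = prev.next   # step 2: cur points to what used to follow prev
    prev.next = cur        # step 1: prev now points to cur

# Build 1 -> 3, then insert 2 between them.
head = LinkedNode(1)
head.next = LinkedNode(3)
insert_after(head, LinkedNode(2))
print(head.next.val, head.next.next.val)  # 2 3
```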
&lt;h3&gt;
  
  
  Deleting a node
&lt;/h3&gt;

&lt;p&gt;To delete a node, let's consider the same example with &lt;code&gt;cur&lt;/code&gt;, &lt;code&gt;prev&lt;/code&gt; and &lt;code&gt;next&lt;/code&gt;, where &lt;code&gt;cur&lt;/code&gt; is the node we want to delete.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;First, find &lt;code&gt;cur&lt;/code&gt;'s previous node &lt;code&gt;prev&lt;/code&gt; and next node &lt;code&gt;next&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;Point &lt;code&gt;prev&lt;/code&gt;'s reference field to the &lt;code&gt;next&lt;/code&gt; node, thereby unlinking &lt;code&gt;cur&lt;/code&gt; from the list&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Finding the &lt;code&gt;next&lt;/code&gt; node through the reference field is easy. But to find the previous node &lt;code&gt;prev&lt;/code&gt;, we have to traverse the linked list from the &lt;code&gt;head&lt;/code&gt; node. Hence the time complexity of deleting a node from a linked list is O(n), where n is the length of the list.&lt;/p&gt;
&lt;h2&gt;
  
  
  Implementation with Python
&lt;/h2&gt;

&lt;p&gt;Here's an implementation of the singly linked list in Python:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;LinkedNode&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;


&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SinglyLinkedList&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;__init__&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="bp"&gt;None&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;get_with_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Raise an error if the index is out of bounds
&lt;/span&gt;            &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;IndexError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Index out of bounds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="c1"&gt;# Traverse the list to the specified index
&lt;/span&gt;            &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_to_head&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;new&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;LinkedNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# If the list is empty, the new node becomes the head
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Otherwise, make the new node the new head and link it to the previous head
&lt;/span&gt;            &lt;span class="n"&gt;new&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_to_tail&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;new&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;LinkedNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# If the list is empty, the new node becomes the head
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
        &lt;span class="k"&gt;else&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
            &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
                &lt;span class="c1"&gt;# Traverse to the end of the list
&lt;/span&gt;                &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;add_at_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="n"&gt;new&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;LinkedNode&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;val&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Raise an error if the index is out of bounds
&lt;/span&gt;            &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;IndexError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Index out of bounds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# If index is 0, the new node becomes the head
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt;&lt;span class="o"&gt;+=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt;

        &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="c1"&gt;# Traverse to the node just before the specified index
&lt;/span&gt;            &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

        &lt;span class="n"&gt;new&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
        &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;new&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

    &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;delete_at_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;lt;&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt; &lt;span class="ow"&gt;or&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# Raise an error if the index is out of bounds
&lt;/span&gt;            &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;IndexError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Index out of bounds&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;==&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="c1"&gt;# If index is 0, remove the head node
&lt;/span&gt;            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
            &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;-=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;

        &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;head&lt;/span&gt;
        &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="nf"&gt;range&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;-&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
            &lt;span class="c1"&gt;# Traverse to the node just before the specified index
&lt;/span&gt;            &lt;span class="n"&gt;cur&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;

        &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;cur&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nb"&gt;next&lt;/span&gt;
        &lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;length&lt;/span&gt; &lt;span class="o"&gt;-=&lt;/span&gt; &lt;span class="mi"&gt;1&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
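&lt;p&gt;To see the structure in action, here's a condensed, self-contained version of the list above with a small &lt;code&gt;to_list&lt;/code&gt; helper (my addition, not part of the implementation above) for inspecting the contents:&lt;/p&gt;

```python
class LinkedNode:
    def __init__(self, val):
        self.val = val
        self.next = None


class SinglyLinkedList:
    def __init__(self):
        self.head = None

    def add_to_head(self, val):
        # New node points at the old head, then becomes the head
        node = LinkedNode(val)
        node.next = self.head
        self.head = node

    def add_to_tail(self, val):
        node = LinkedNode(val)
        if not self.head:
            self.head = node
            return
        cur = self.head
        while cur.next:          # walk to the last node
            cur = cur.next
        cur.next = node

    def delete_head(self):
        # Condensed form of delete_at_index(0)
        if self.head:
            self.head = self.head.next

    def to_list(self):
        # Helper for inspection: collect values front to back
        out, cur = [], self.head
        while cur:
            out.append(cur.val)
            cur = cur.next
        return out


lst = SinglyLinkedList()
lst.add_to_tail(2)
lst.add_to_tail(3)
lst.add_to_head(1)
print(lst.to_list())  # [1, 2, 3]
lst.delete_head()
print(lst.to_list())  # [2, 3]
```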



&lt;p&gt;That's it for singly linked lists. In the next blog post we'll have a look at doubly linked lists. Until then, Sayonara!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>learning</category>
      <category>python</category>
      <category>algorithms</category>
    </item>
    <item>
      <title>LlamaHub: The one stop data solution for your LLMs🦙🏠</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Mon, 26 Jun 2023 10:21:00 +0000</pubDate>
      <link>https://forem.com/iamadhee/llamahub-the-one-stop-data-solution-for-your-llms-4mpc</link>
      <guid>https://forem.com/iamadhee/llamahub-the-one-stop-data-solution-for-your-llms-4mpc</guid>
      <description>&lt;p&gt;In the 4th part of this blog series, we'll have a look at the most popular adapters available at &lt;a href="//llamahub.ai"&gt;llamahub.ai&lt;/a&gt;. LLaMahub is a library of data connectors that translate raw data from various sources into vector(s) aiding in ease of access and transformation by LLMs like ChatGPT and Google Bard.&lt;/p&gt;

&lt;p&gt;This is an open-source project, and the adapters/connectors created here gradually get integrated into projects like LangChain and Llama-Index. If you are building a custom document bot for your firm, or even for personal use, I think you'll find this super useful.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvv5x8ywy435iljyz5uux.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvv5x8ywy435iljyz5uux.png" alt="Llamahub" width="800" height="426"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, let's get started!&lt;/p&gt;




&lt;p&gt;For the sake of simplicity, in this walkthrough we will access LlamaHub's adapters through Llama-Index.&lt;/p&gt;

&lt;h2&gt;
  
  
  Adapters Assemble 🔨⚡
&lt;/h2&gt;

&lt;p&gt;First up, here's how you can load any loader with Llama-Index:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llama_index&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;download_loader&lt;/span&gt;

&lt;span class="n"&gt;Loader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;&amp;lt;NAME_OF_THE_LOADER&amp;gt;&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;download_loader&lt;/code&gt; helper method loads the specified loader along with all of its dependencies. To check the name of the loader you want to use, visit this &lt;a href="https://gpt-index.readthedocs.io/en/latest/reference/readers.html" rel="noopener noreferrer"&gt;documentation&lt;/a&gt;. With that out of the way, let's have a look at some of the most useful data loaders in Llama-Index.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;code&gt;SimpleDirectoryReader&lt;/code&gt;:
&lt;/h3&gt;

&lt;p&gt;This is a universal data connector available in Llama-Index that can handle pretty much any file type you throw at it. Under the hood it uses various third-party libraries, such as &lt;code&gt;pypdf2&lt;/code&gt; for PDFs and &lt;code&gt;python-pptx&lt;/code&gt; for PPTs. The idea behind this connector is to read an entire directory of mixed file types, or a single file, regardless of its kind.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;SimpleDirectoryReader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SimpleDirectoryReader&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SimpleDirectoryReader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_dir&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;foo/&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;code&gt;SimpleWebPageReader&lt;/code&gt;:
&lt;/h3&gt;

&lt;p&gt;This is the web equivalent of the &lt;code&gt;SimpleDirectoryReader&lt;/code&gt;: it can read any webpage using &lt;code&gt;BeautifulSoup4&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;SimpleWebPageReader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;SimpleWebPageReader&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SimpleWebPageReader&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;urls&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://foo.com/index&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  &lt;code&gt;YoutubeTranscriptReader&lt;/code&gt;:
&lt;/h3&gt;

&lt;p&gt;This loader will attempt to transcribe any YouTube video that you provide. It first searches for available captions using the &lt;code&gt;youtube-transcript-api&lt;/code&gt;, and if captions are not available, it falls back to OpenAI's &lt;a href="https://openai.com/research/whisper" rel="noopener noreferrer"&gt;Whisper API&lt;/a&gt; to transcribe the video.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;YoutubeTranscriptReader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YoutubeTranscriptReader&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;YoutubeTranscriptReader&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ytlinks&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;https://youtu.be/5MuIMqhT8DM&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;p&gt;The above are some of the loaders that I think have the most use cases, but there are many more; you can find the complete list &lt;a href="https://gpt-index.readthedocs.io/en/latest/reference/readers.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;. If you want to load the documents into LangChain's &lt;code&gt;Document&lt;/code&gt; format, use &lt;code&gt;&amp;lt;Loader&amp;gt;.load_langchain_documents()&lt;/code&gt; instead of &lt;code&gt;load_data&lt;/code&gt;; it will output the documents in the LangChain-supported format. &lt;/p&gt;

&lt;p&gt;Once you've loaded the documents, it's a cakewalk from there. You can visit my earlier blogs in this series to see how to use them as context for the ChatGPT API.&lt;/p&gt;

&lt;p&gt;And that's about LlamaHub. See ya in the next one 😉&lt;/p&gt;

</description>
      <category>python</category>
      <category>ai</category>
      <category>machinelearning</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>I built a tool that creates and posts AI content to social media automatically 🧌</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Mon, 19 Jun 2023 04:55:34 +0000</pubDate>
      <link>https://forem.com/iamadhee/i-built-a-tool-that-creates-and-posts-ai-content-in-social-media-1k2d</link>
      <guid>https://forem.com/iamadhee/i-built-a-tool-that-creates-and-posts-ai-content-in-social-media-1k2d</guid>
      <description>&lt;p&gt;I have always been intrigued by the idea of automating a social media handle. With the advent of LLMs, generating content has never been easier.&lt;/p&gt;

&lt;p&gt;I wanted to experiment a little in this space, so I created LLM Influencer, a bot powered by OpenAI's GPT-3 and DALL-E 2 generative AIs.&lt;/p&gt;




&lt;h2&gt;
  
  
  💡 How does it work?
&lt;/h2&gt;

&lt;p&gt;LLM Influencer uses AI to create thought-provoking daily content accompanied by visually appealing images, and posts it to Twitter every day. The types of content it generates are available as separate plug-and-play modules.&lt;/p&gt;




&lt;h2&gt;
  
  
  🚀 Features:
&lt;/h2&gt;

&lt;p&gt;✅ Generates daily motivational content for Twitter and other social media platforms.&lt;br&gt;
✅ Offers easy plug-and-play modules for different types of content generation.&lt;br&gt;
✅ Leverages ChatGPT and DALL-E 2 to provide creative and unique outputs.&lt;br&gt;
✅ Includes built-in scheduling for automatic posting at your desired time every day.&lt;br&gt;
✅ Provides email notifications to keep you informed in case of any job failures.&lt;/p&gt;
&lt;h3&gt;
  
  
  🧩 Currently Available Modules:
&lt;/h3&gt;

&lt;p&gt;1️⃣ &lt;strong&gt;Quoter&lt;/strong&gt;: Delivers motivational quotes paired with captivating images, designed to inspire and uplift you each day.&lt;br&gt;
2️⃣ &lt;strong&gt;Tweet Storm&lt;/strong&gt;: Explores various aspects of life through engaging tweet storms, offering valuable insights and perspectives.&lt;/p&gt;



&lt;p&gt;I have been posting the generated content to my own Twitter handle for a while now; I've included some of the posts here.&lt;/p&gt;

&lt;p&gt;Below are the links to the repository and the Twitter handle where I post the generated content.&lt;/p&gt;




&lt;blockquote&gt;
&lt;p&gt;Life is a journey between the bright marble of our dreams and the muddy realities of our days, but both are equally important. ✨🧱💭 &lt;a href="https://t.co/yjPHfahtAr" rel="noopener noreferrer"&gt;pic.twitter.com/yjPHfahtAr&lt;/a&gt;&lt;/p&gt;— Adheeban Manoharan (@iamadhee_) &lt;a href="https://twitter.com/iamadhee_/status/1669522732248334336?ref_src=twsrc%5Etfw" rel="noopener noreferrer"&gt;June 16, 2023&lt;/a&gt;
&lt;/blockquote&gt;  


&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/iamadhee" rel="noopener noreferrer"&gt;
        iamadhee
      &lt;/a&gt; / &lt;a href="https://github.com/iamadhee/llm-influencer" rel="noopener noreferrer"&gt;
        llm-influencer
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      🎙 AI-driven social media content generation
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;div&gt;
&lt;a rel="noopener noreferrer nofollow" href="https://raw.githubusercontent.com/iamadhee/llm-influencer/main/assets/influencer.png"&gt;&lt;img width="200px" src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fraw.githubusercontent.com%2Fiamadhee%2Fllm-influencer%2Fmain%2Fassets%2Finfluencer.png"&gt;&lt;/a&gt;
&lt;div class="markdown-heading"&gt;
&lt;h1 class="heading-element"&gt;LLM Influencer&lt;/h1&gt;
&lt;/div&gt;
&lt;/div&gt;
&lt;div&gt;
Your AI-powered content generator
&lt;/div&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;About&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;This project is an experiment in AI-driven content generation. The aim is to be able to generate various types of content, such as podcasts, articles, and more, using AI.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Installation&lt;/h2&gt;
&lt;/div&gt;
&lt;ol&gt;
&lt;li&gt;Clone the repository&lt;/li&gt;
&lt;li&gt;Install the required dependencies using &lt;code&gt;poetry install&lt;/code&gt;
&lt;ul&gt;
&lt;li&gt;This project uses &lt;code&gt;poetry&lt;/code&gt; for dependency management, so make sure you have it installed. If not, you can install it using &lt;code&gt;pip install poetry&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Usage&lt;/h2&gt;

&lt;/div&gt;
&lt;ol&gt;
&lt;li&gt;Create a &lt;code&gt;.env&lt;/code&gt; file in the root directory of the project, there is an example &lt;code&gt;.env.example&lt;/code&gt; file provided. Fill in the required environment variables.&lt;/li&gt;
&lt;li&gt;If you want, you can play around with the &lt;code&gt;config.toml&lt;/code&gt; file to change the configurations of the content. Each type of content has its own section in the config file.&lt;/li&gt;
&lt;li&gt;To create content of your choice, run the following command
&lt;div class="highlight highlight-source-shell notranslate position-relative overflow-auto js-code-highlight"&gt;
&lt;pre&gt;python generate.py &lt;span class="pl-k"&gt;&amp;lt;&lt;/span&gt;type&lt;span class="pl-k"&gt;&amp;gt;&lt;/span&gt; &lt;span class="pl-k"&gt;&amp;lt;&lt;/span&gt;description&lt;span class="pl-k"&gt;&amp;gt;&lt;/span&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;&amp;lt;type&amp;gt;&lt;/code&gt; is…&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/iamadhee/llm-influencer" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;


&lt;p&gt;Kindly have a look and share your valuable thoughts.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>llm</category>
      <category>chatgpt</category>
      <category>python</category>
    </item>
    <item>
      <title>Combine LangChain 🦜🔗 &amp; Llama-Index 🦙</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Mon, 22 May 2023 03:56:55 +0000</pubDate>
      <link>https://forem.com/iamadhee/combine-langchain-llama-index-1068</link>
      <guid>https://forem.com/iamadhee/combine-langchain-llama-index-1068</guid>
      <description>&lt;p&gt;In the last post of this blog series, we saw how we could use LangChain's memory classes for building a chatbot-like interface with OpenAI's GPT-3 API. In this blog, we are basically gonna use these memory classes to keep track of our previous conversations with the LLM all the while making it answer our queries regarding the contents of the document we feed to it.&lt;/p&gt;

&lt;p&gt;We'll keep the dependencies on Llama-Index to a bare minimum, because LangChain covers almost all the utilities we need to build this. We'll only use the adapters available in the Llama-Index library, because they perform better than LangChain's.&lt;/p&gt;

&lt;p&gt;Now, before we dive in, let me set the context for what we're trying to achieve. We're building an intelligent system that reads the contents of a document and can answer any query about that content. This system should also keep track of our past conversations, so that if a question refers back to one of the previous queries, it can answer that too.&lt;/p&gt;

&lt;p&gt;Remember when I told you in the last blog about completion tokens and context tokens? The maximum context window for GPT-3 is &lt;code&gt;4096&lt;/code&gt; tokens. We have to squeeze our context, memory, prompt and response all into that same window.&lt;/p&gt;

&lt;p&gt;Something like this -&amp;gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pqaycaj9t8gxhda7roz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6pqaycaj9t8gxhda7roz.png" alt="Context Window - GPT3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With that out of the way, let's hustle!&lt;/p&gt;




&lt;h2&gt;
  
  
  Memory meets Custom Knowledge 🧠📔
&lt;/h2&gt;

&lt;p&gt;Let's start right away with the code:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;

&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.chat_models&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatOpenAI&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.chains&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ConversationalRetrievalChain&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.memory&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ConversationBufferWindowMemory&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.vectorstores&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;FAISS&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.text_splitter&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;RecursiveCharacterTextSplitter&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.embeddings&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;OpenAIEmbeddings&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;llama_index&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;download_loader&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;

&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;OPENAI_API_KEY&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Your API Key Here&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

&lt;span class="n"&gt;file_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Enter the youtube link: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;YoutubeTranscriptReader&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;YoutubeTranscriptReader&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;YoutubeTranscriptReader&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;ytlinks&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;file_path&lt;/span&gt;&lt;span class="p"&gt;])&lt;/span&gt;

&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;doc&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;to_langchain_format&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;doc&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
&lt;span class="n"&gt;text_splitter&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;RecursiveCharacterTextSplitter&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;text_splitter&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split_documents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;embeddings&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;OpenAIEmbeddings&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="n"&gt;retriever&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;FAISS&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_documents&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;embeddings&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;as_retriever&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;4&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;ChatOpenAI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;temperature&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gpt-3.5-turbo&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_tokens&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;2048&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;memory&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ConversationBufferWindowMemory&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                                    &lt;span class="n"&gt;memory_key&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;chat_history&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                                    &lt;span class="n"&gt;return_messages&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                                    &lt;span class="n"&gt;k&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;6&lt;/span&gt;
                                  &lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;conversation&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;ConversationalRetrievalChain&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;from_llm&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="n"&gt;retriever&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;retriever&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="c1"&gt;# verbose=True, 
&lt;/span&gt;    &lt;span class="n"&gt;memory&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;memory&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="n"&gt;max_tokens_limit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;1536&lt;/span&gt;  
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;conversation&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;question&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;})[&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;answer&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;res&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;########################################&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;pt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ASK: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lower&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;end&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;break&lt;/span&gt;
        &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;----------------------------------------&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ChatGPT says: &lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;You know the drill: copy the code into a file, run it, and install any dependencies you're prompted for.&lt;/p&gt;

&lt;p&gt;Here are some important points that I thought I should highlight in the above code:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;code&gt;FAISS&lt;/code&gt; is a library built by &lt;a href="https://ai.facebook.com/" rel="noopener noreferrer"&gt;Meta AI&lt;/a&gt; for efficient similarity search, which helps us find the chunks of context most relevant to a query.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ConversationBufferWindowMemory&lt;/code&gt; is the memory class we've used; it keeps track of the last &lt;code&gt;k&lt;/code&gt; exchanges between the user and the LLM. I've set &lt;code&gt;k=6&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;RecursiveCharacterTextSplitter&lt;/code&gt; is a utility from LangChain that splits the context into chunks.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ConversationalRetrievalChain&lt;/code&gt; is a type of chain that provides a conversational chatbot-like interface while keeping the document context and memory intact.&lt;/p&gt;
&lt;/blockquote&gt;
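&lt;p&gt;To make the windowed memory concrete, here's a tiny pure-Python sketch of what a last-&lt;code&gt;k&lt;/code&gt; buffer does conceptually. The class and method names below are mine for illustration, not LangChain's actual internals:&lt;/p&gt;

```python
from collections import deque

class WindowMemory:
    """Keep only the last k (question, answer) exchanges,
    like ConversationBufferWindowMemory(k=...) does conceptually."""
    def __init__(self, k: int = 6):
        self.buffer = deque(maxlen=k)  # older exchanges fall off automatically

    def save(self, question: str, answer: str) -> None:
        self.buffer.append((question, answer))

    def history(self) -> str:
        # This joined history is what gets sent back as context on each turn.
        return "\n".join(f"Human: {q}\nAI: {a}" for q, a in self.buffer)

memory = WindowMemory(k=2)
for i in range(4):
    memory.save(f"question {i}", f"answer {i}")

print(memory.history())  # only the last 2 exchanges survive
```

With &lt;code&gt;k=2&lt;/code&gt;, only "question 2" and "question 3" remain in the buffer; the older exchanges are silently dropped, which is exactly the trade-off the windowed memory makes to save tokens.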

&lt;p&gt;That should give you an idea. If you have any more questions about the code, feel free to comment below.&lt;/p&gt;

&lt;p&gt;Now, for a change, I have used the &lt;code&gt;YoutubeTranscriptReader&lt;/code&gt; from LlamaIndex; it reads the transcript of the YouTube link you provide and builds the context for the LLM out of it. In the last blog we used &lt;code&gt;SimpleDirectoryReader&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Here's what I tried with Matt Walker's talk on &lt;a href="https://youtu.be/5MuIMqhT8DM" rel="noopener noreferrer"&gt;Sleep is your superpower&lt;/a&gt;:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;

Enter the youtube link: https://youtu.be/5MuIMqhT8DM


########################################

ASK: list down the key points in the speech

----------------------------------------

ChatGPT says: 

1. Regularity in sleep timing is important to anchor sleep and improve its quantity and quality.
2. Keeping the bedroom cool aids in initiating and maintaining sleep.
3. Sleep is a non-negotiable biological necessity for human beings' health and well-being.
4. Sleep loss is a silent epidemic in industrialized nations and a significant public health challenge.
5. Sleep debt cannot be accumulated and repaid at a later point in time.
6. Getting out of bed when unable to sleep for too long establishes the association between bed and sleep.
7. The importance of prioritizing sleep for overall health and well-being. 

########################################

ASK: now convert that into a song for a 5 year old

----------------------------------------

ChatGPT says: 

Sure, here's a simple song about the importance of sleep:

Verse 1:
We all need to rest our heads
And go to sleep in our cozy beds
Regular sleep is the key
To feeling good and healthy

Chorus:
Sleep, sleep, the magic elixir
It helps us grow and gives us vigor
Don't stay up too late at night
Or you won't feel so bright

Verse 2:
Keep your room nice and cool
And your sleep will be extra smooth
And if you can't fall asleep
Get out of bed and don't make a peep

Chorus:
Sleep, sleep, the magic elixir
It helps us grow and gives us vigor
Don't stay up too late at night
Or you won't feel so bright

Verse 3:
Sleep is important, it's true
For our brains and our bodies too
We won't feel so good without it
So let's all just go to bed

Chorus:
Sleep, sleep, the magic elixir
It helps us grow and gives us vigor
Don't stay up too late at night
Or you won't feel so bright

Outro:
So let's all get a good night's sleep
And tomorrow, we'll feel so sweet
Sleep tight and have nice dreams
And wake up feeling like a new team! 

########################################


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Notice how the LLM senses that I'm referring to the previous conversation in the second question and converts it into a song? Exactly what we were trying to achieve. Do try it out with your own YouTube video.&lt;/p&gt;




&lt;p&gt;Now, that's about LangChain and Llama-Index. In the next blog of this series, we'll have a look at all the awesome adapters that are available in the &lt;a href="https://llamahub.ai/" rel="noopener noreferrer"&gt;Llama-Hub&lt;/a&gt;. Cheers, see ya in the next one 😉&lt;/p&gt;

</description>
      <category>python</category>
      <category>tutorial</category>
      <category>ai</category>
      <category>programming</category>
    </item>
    <item>
      <title>Make ChatGPT keep track of your past conversations 💬</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Thu, 11 May 2023 02:43:51 +0000</pubDate>
      <link>https://forem.com/iamadhee/make-chatgpt-keep-track-of-your-past-conversations-jon</link>
      <guid>https://forem.com/iamadhee/make-chatgpt-keep-track-of-your-past-conversations-jon</guid>
      <description>&lt;p&gt;In our previous blog, we fed ChatGPT custom knowledge from documents and made it answer our queries using LlamaIndex. One problem with that implementation is that the chatbot couldn't answer questions referring back to previous questions/prompts. Here's what I'm talking about:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ASK: You are an AI assistant, specialized in programming.                                    

----------------------------------------

ChatGPT says: 

Hello! I'm here to assist you with anything related to
programming. What can I help you with today? 

########################################

ASK: what did I tell you earlier?                                  

----------------------------------------

ChatGPT says: 

I'm sorry, but as an AI language model, I don't have access to
any context or information about our previous interactions
unless you provide it to me. Could you please clarify what you
asked me earlier? I'll do my best to assist you.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;So, let's set the custom knowledge base aside for a while and focus on the memory problem in this blog. We'll use the LangChain framework, which offers various memory classes that help us build a chatbot that keeps track of the past conversations we've had with it.&lt;/p&gt;

&lt;p&gt;If you've ever visited &lt;a href="//chat.openai.com"&gt;chat.openai.com&lt;/a&gt;, you'll know that OpenAI keeps the history of each chat ChatGPT has with a user in a separate container called a session. Basically, we'll be trying to mimic these sessions using LangChain.&lt;/p&gt;

&lt;p&gt;Now, that should give you an idea. Before we start, here are some more things you should know.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;OpenAI's ChatGPT has a token limit of 4096 (1 token ≈ 4 characters)&lt;/li&gt;
&lt;li&gt;This limit covers both your prompt and the response (completion tokens) returned by ChatGPT&lt;/li&gt;
&lt;li&gt;Anything that would be generated past this token limit is cut off abruptly, without so much as a warning&lt;/li&gt;
&lt;li&gt;So, ideally, we should keep our prompt within about 2048 tokens (half the limit) and leave the rest for ChatGPT's completion. This split changes based on your use case.&lt;/li&gt;
&lt;/ul&gt;
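&lt;p&gt;To see how tight that budget is, here's a quick back-of-the-envelope check using the ~4 characters-per-token rule of thumb. These numbers are just the heuristic from above, not exact counts (OpenAI's &lt;code&gt;tiktoken&lt;/code&gt; library gives exact ones):&lt;/p&gt;

```python
# Rough token budgeting, assuming the ~4 characters-per-token heuristic.
CONTEXT_LIMIT = 4096      # total tokens shared by prompt and completion
COMPLETION_BUDGET = 2048  # half reserved for the model's response

def estimate_tokens(text: str) -> int:
    """Approximate token count: roughly 4 characters per token."""
    return max(1, len(text) // 4)

def fits_in_prompt_budget(prompt: str) -> bool:
    """True if the prompt leaves enough room for the completion."""
    return estimate_tokens(prompt) <= CONTEXT_LIMIT - COMPLETION_BUDGET

print(fits_in_prompt_budget("Hello, who are you?"))  # True
print(fits_in_prompt_budget("x" * 50_000))           # False, ~12500 tokens
```

A short question barely dents the budget, but a long chat history (or a pasted document) blows past it fast; that is why the memory strategies below matter.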

&lt;p&gt;Well, why am I blabbering all this now? I'll explain in a while. Now, without further ado, let's get started with LangChain.&lt;/p&gt;




&lt;h2&gt;
  
  
  Enter Langchain 🦜🔗
&lt;/h2&gt;

&lt;p&gt;So, what's our goal? We don't have to worry about feeding document context into the LLM for now. We just have to build a simple chatbot that remembers our past conversations. With LangChain, it's as easy as 1, 2, 3!&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.chat_models&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatOpenAI&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.chains&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ConversationChain&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.memory&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ConversationBufferMemory&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;

&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;OPENAI_API_KEY&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Your API Key Here&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;ChatOpenAI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;temperature&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gpt-3.5-turbo&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_tokens&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;512&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="n"&gt;conversation&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ConversationChain&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
    &lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; 
    &lt;span class="c1"&gt;# verbose=True, 
&lt;/span&gt;    &lt;span class="n"&gt;memory&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;ConversationBufferMemory&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;conversation&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;predict&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nb"&gt;input&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;res&lt;/span&gt;

&lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;__name__&lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;__main__&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;########################################&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="n"&gt;pt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ASK: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lower&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;end&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="k"&gt;break&lt;/span&gt;
        &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="s"&gt;----------------------------------------&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ChatGPT says: &lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="se"&gt;\n&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's all you gotta do. Now you know the drill: copy the above snippet, name the file whatever you want, and run it with &lt;code&gt;python3 &amp;lt;filename&amp;gt;.py&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Here's the conversation I had:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ASK: Hello, who are you?

----------------------------------------

ChatGPT says: 

Hello! I am an artificial intelligence program designed to
interact with humans and assist with various tasks. How can I
help you today? 

########################################

ASK: Who was martin luther king?

----------------------------------------

ChatGPT says: 

Martin Luther King Jr. was an American Baptist minister and
activist who became the most visible spokesperson and leader
in the civil rights movement from 1954 until his assassination
in 1968.

########################################

ASK: what was his profession?

----------------------------------------

ChatGPT says: 

Martin Luther King Jr. was a Baptist minister and activist. He
became a prominent leader in the civil rights movement and
advocated for nonviolent civil disobedience to advance civil
rights for African Americans. 

########################################

ASK: Okay, what was my first question to you?

----------------------------------------

ChatGPT says: 

Your first question to me was "Hello, who are you?" 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;As you can clearly see, the LLM now remembers our past conversations, even the first question I asked. Setting the &lt;code&gt;verbose&lt;/code&gt; argument to &lt;code&gt;True&lt;/code&gt; when initializing the &lt;code&gt;ConversationChain&lt;/code&gt; class is up to you; enabling it gives you a glimpse of what's happening in the backend as we sail through the conversation with the LLM.&lt;/p&gt;

&lt;p&gt;Our conversations are stored in LangChain's memory object. If you want to keep a conversation for later retrieval, you can pickle the chain's memory and save it to a file. Later, when you need to load your history, just overwrite your chain's memory with the unpickled object, like I have done below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="n"&gt;qa&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;ConversationChain&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
            &lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;memory&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;memory&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="nf"&gt;open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;sessions/&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;self&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;history_key&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;rb&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
     &lt;span class="n"&gt;user_ses&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;pickle&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;loads&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;f&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;read&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
     &lt;span class="n"&gt;qa&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;memory&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;user_ses&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
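&lt;p&gt;The snippet above references &lt;code&gt;self.history_key&lt;/code&gt; from my own codebase. Here's a self-contained sketch of the same save/restore idea with &lt;code&gt;pickle&lt;/code&gt;, using a plain dict as a stand-in for the chain's memory object (in the real chain you'd pickle &lt;code&gt;qa.memory&lt;/code&gt; instead):&lt;/p&gt;

```python
import os
import pickle
import tempfile

# Stand-in for the chain's memory object; in practice, pickle qa.memory.
memory = {"chat_history": [("Hello, who are you?", "I'm an AI assistant.")]}

# Save the session to disk...
path = os.path.join(tempfile.gettempdir(), "session.pkl")
with open(path, "wb") as f:
    f.write(pickle.dumps(memory))

# ...and later restore it into a fresh chain.
with open(path, "rb") as f:
    restored = pickle.loads(f.read())

print(restored == memory)  # True: the conversation history round-trips intact
```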






&lt;h2&gt;
  
  
  Enough with the code! Now, What's happening underneath?
&lt;/h2&gt;

&lt;p&gt;In the example code snippet above, I have used the &lt;code&gt;ConversationBufferMemory&lt;/code&gt; class, the simplest implementation of memory in LangChain. There are other memory classes too, each with its own perks. Before I get into those, here's what happens when you have a conversation with a memory-backed chain in LangChain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;When you send the first question or prompt to the LLM, it returns a response. Cool.&lt;/li&gt;
&lt;li&gt;When you send the second prompt, LangChain also sends along the previous conversation as context, so the LLM can answer questions that refer back to earlier ones.&lt;/li&gt;
&lt;li&gt;This happens on every turn: the previous conversation is attached as context with every hit to the LLM's API.&lt;/li&gt;
&lt;/ul&gt;
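&lt;p&gt;The bullets above can be sketched in a few lines of plain Python. The &lt;code&gt;fake_llm&lt;/code&gt; function below is a stand-in for the real API call, so you can see how the growing history is prepended to every prompt (the actual prompt template LangChain uses is more elaborate than this):&lt;/p&gt;

```python
def fake_llm(prompt: str) -> str:
    """Stand-in for the real API call; just acknowledges the last line."""
    return f"(response to {prompt.splitlines()[-1]})"

history = []  # what ConversationBufferMemory holds, conceptually

def chat(user_input: str) -> str:
    # Every call prepends the full prior conversation to the new input.
    context = "\n".join(f"Human: {q}\nAI: {a}" for q, a in history)
    prompt = (context + "\n" if context else "") + f"Human: {user_input}"
    answer = fake_llm(prompt)
    history.append((user_input, answer))
    return answer

chat("Who was Martin Luther King?")
chat("What was his profession?")
# By the second call, the prompt already contains the first exchange,
# which is how a real LLM could resolve "his".
print(len(history))  # 2
```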

&lt;p&gt;A very simple engineering problem, handled effortlessly with LangChain. Now, remember when I told you about the token limit in ChatGPT? It plays a crucial role here. Whatever conversation you previously had with the LLM is attached to your original prompt and sent to the API. As the conversation deepens, the share of the 4096-token limit consumed by the history grows rapidly, eventually leaving no room for the LLM's response to the current question.&lt;/p&gt;

&lt;p&gt;Langchain knows this problem very well; that's why its memory module offers different memory classes to suit your use case. I'll give a short note on an impressive few.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ConversationSummaryMemory&lt;/code&gt; - This type of memory maintains a running summary of the conversation, condensing the history so it consumes far fewer tokens of the 4096-token limit.&lt;/p&gt;

&lt;p&gt;&lt;code&gt;ConversationKGMemory&lt;/code&gt; - This type of memory uses a knowledge graph to reconstruct memory: it identifies the entities in the conversation and maps out the relations between them.&lt;/p&gt;
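&lt;p&gt;To see why a summary helps, here's a toy contrast between buffer-style and summary-style context. The real &lt;code&gt;ConversationSummaryMemory&lt;/code&gt; asks the LLM to write the summary; I'm faking it with each answer's first sentence, purely to show why the prompt grows much more slowly:&lt;/p&gt;

```python
# Toy contrast only: buffer memory replays everything verbatim, while
# a summary memory condenses it. The "summary" here is faked with the
# first sentence of each answer (the real class has the LLM write it).

def buffer_context(history):
    return "\n".join(f"Human: {q}\nAI: {a}" for q, a in history)

def summary_context(history):
    return " ".join(a.split(".")[0] + "." for _, a in history)

history = [
    ("Who wrote Hamlet?",
     "William Shakespeare. He was an English playwright, actor and poet."),
    ("When was he born?",
     "In 1564. He was baptised on 26 April of that year in Stratford."),
]

# the summary context is far shorter than the verbatim buffer
print(len(buffer_context(history)), len(summary_context(history)))
```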

&lt;p&gt;These memory classes use the LLM itself to identify entities, summarize the previous conversation and so on. Keep this in mind, as it means an extra hit to the API before every turn. There are many more memory classes in langchain, so be sure to give langchain's documentation a read &lt;a href="https://python.langchain.com/en/latest/modules/memory/how_to_guides.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;Now, that's about Langchain's memory classes. In the next blog of this series, we'll try to combine the vector store from the first blog with the memory classes in this blog. LlamaIndex + Langchain 🔥. See ya in the next one 😉&lt;/p&gt;

</description>
      <category>python</category>
      <category>tutorial</category>
      <category>ai</category>
      <category>programming</category>
    </item>
    <item>
      <title>Chat with your documents using ChatGPT 🦾</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Sat, 06 May 2023 09:04:23 +0000</pubDate>
      <link>https://forem.com/iamadhee/chat-with-your-documents-using-chatgpt-18je</link>
      <guid>https://forem.com/iamadhee/chat-with-your-documents-using-chatgpt-18je</guid>
      <description>&lt;p&gt;Ever since &lt;a href="https://openai.com/" rel="noopener noreferrer"&gt;OpenAI&lt;/a&gt; announced their language model &lt;a href="https://openai.com/blog/chatgpt" rel="noopener noreferrer"&gt;ChatGPT&lt;/a&gt;, it has been making headlines in the AI world on a daily basis. ChatGPT is being used as the foundation for countless new tools and applications, ranging from customer service chatbots to creative writing assistants. With its ability to generate high-quality, human-like responses to complex prompts, ChatGPT has quickly become a game-changing technology. &lt;/p&gt;

&lt;p&gt;The applications of LLMs like ChatGPT are virtually limitless; our imagination is the only barrier. In this blog series, though, we'll focus on how to make ChatGPT, or any LLM for that matter, answer our queries with the context of custom knowledge from the documents we feed it. We'll start off with a simpler implementation using &lt;a href="https://github.com/jerryjliu/llama_index" rel="noopener noreferrer"&gt;Llama-index&lt;/a&gt;, which reads almost all basic document formats and answers your queries based on them. As we progress, we'll use &lt;a href="https://python.langchain.com/en/latest/index.html" rel="noopener noreferrer"&gt;Langchain&lt;/a&gt;, a great framework for developing applications powered by language models, to build a full-fledged chatbot that reads the content from almost any link or document you give it and answers your queries accordingly. We'll be building a web app using these frameworks. Exciting times ahead!&lt;/p&gt;

&lt;h3&gt;
  
  
  Let me lay out the &lt;strong&gt;prerequisites&lt;/strong&gt; first:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;You will need a working OpenAI key, because we'll be using the GPT-3.5 model underneath. If you don't have one, here's &lt;a href="https://www.howtogeek.com/885918/how-to-get-an-openai-api-key/" rel="noopener noreferrer"&gt;how to get an OpenAI API Key&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;You need to have python&amp;gt;=3.6 installed on your machine&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That's about everything you'd need. The rest we'll take care of as we sail through. Now, without further ado, let's dive right in.&lt;/p&gt;




&lt;h2&gt;
  
  
  Behold, the power of Llama-Index 🦙
&lt;/h2&gt;

&lt;p&gt;As I said earlier, we're going to use llama_index for this tutorial. We're not building anything fancy yet, not even a UI; the sole purpose here is to understand how llama_index works underneath. Below is the implementation using llama_index and langchain (llama_index uses langchain's models under the hood).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;gpt_index&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;SimpleDirectoryReader&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;GPTSimpleVectorIndex&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;LLMPredictor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;PromptHelper&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;langchain.chat_models&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;ChatOpenAI&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;os&lt;/span&gt;

&lt;span class="n"&gt;os&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;environ&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;OPENAI_API_KEY&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Your API Key Here&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;

&lt;span class="n"&gt;file_path&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;Enter the path of the file/doc: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;build_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;file_path&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="n"&gt;max_input_size&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;4096&lt;/span&gt;
    &lt;span class="n"&gt;num_outputs&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;512&lt;/span&gt;
    &lt;span class="n"&gt;max_chunk_overlap&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;20&lt;/span&gt;
    &lt;span class="n"&gt;chunk_size_limit&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="mi"&gt;256&lt;/span&gt;

    &lt;span class="n"&gt;prompt_helper&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;PromptHelper&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;max_input_size&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;num_outputs&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_chunk_overlap&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;chunk_size_limit&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;chunk_size_limit&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

    &lt;span class="n"&gt;llm_predictor&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;LLMPredictor&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;llm&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="nc"&gt;ChatOpenAI&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;temperature&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mf"&gt;0.7&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;model_name&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;gpt-3.5-turbo&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;max_tokens&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;num_outputs&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;

    &lt;span class="nf"&gt;download_loader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;SimpleDirectoryReader&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;documents&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;SimpleDirectoryReader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;input_files&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="n"&gt;file_path&lt;/span&gt;&lt;span class="p"&gt;]).&lt;/span&gt;&lt;span class="nf"&gt;load_data&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nc"&gt;GPTSimpleVectorIndex&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;documents&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;llm_predictor&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;llm_predictor&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;prompt_helper&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;prompt_helper&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;


&lt;span class="n"&gt;index&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;build_index&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;file_path&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="n"&gt;file_path&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;):&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;index&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;query&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;response_mode&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;compact&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

&lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;########################################&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="n"&gt;pt&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;input&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ASK: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;lower&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;&lt;span class="o"&gt;==&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;end&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;break&lt;/span&gt;
    &lt;span class="n"&gt;response&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;chatbot&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;pt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;----------------------------------------&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="s"&gt;ChatGPT says: &lt;/span&gt;&lt;span class="sh"&gt;'&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="nf"&gt;print&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;response&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copy the above code in its entirety, paste it into a file and name it whatever you want; I'm naming it &lt;code&gt;main.py&lt;/code&gt;. Replace the API key placeholder in the code with your own OpenAI API key and you're done.&lt;/p&gt;

&lt;p&gt;First up, run the command below to install the libraries we'll need:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;pip3 install gpt_index==0.4.24 &amp;amp;&amp;amp; pip3 install langchain==0.0.142
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Run the code with &lt;code&gt;python3 main.py&lt;/code&gt; from the file's location; when prompted, give the program the path to your context file. I'm using the transcript PDF of Martin Luther King's "&lt;a href="https://www.btboces.org/Downloads/I%20Have%20a%20Dream%20by%20Martin%20Luther%20King%20Jr.pdf" rel="noopener noreferrer"&gt;I have a dream&lt;/a&gt;" speech.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;➜  python3 main.py
Enter the path of the file/doc: blogs/docbot/martin.pdf
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You can give it almost any file format. Depending on the file size and your machine's power, it'll take some time to read your file and convert its contents into vectors; in my case, it takes around 10 seconds to process a 30 MB document. &lt;/p&gt;

&lt;p&gt;Once it is done, you can start asking questions to your document using ChatGPT. Below are some of the example prompts and responses.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ASK: 

what is this document about?

----------------------------------------

ChatGPT says: 

The document is about a speech or written piece that discusses
the promise of equal rights for all Americans, particularly
black Americans, as outlined in the Constitution and Declaration of Independence.
It addresses the fact that this promise has not been fully
realized and that there is a shameful condition that needs to be dramatized.  
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ASK: 

summarize the speech

----------------------------------------

ChatGPT says: 

The speech urges America to remember the importance of freedom and justice, and emphasizes the urgent
need for action in the face of racial injustice. The speaker encourages the audience to rise above 
segregation and work towards brotherhood. The overall message is that now is the time to make real the
promises of democracy. 
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's it. As simple as that. 35 lines of code. Llama-Index and other tools like it have made it so easy and user friendly to leverage the full power of LLMs.&lt;/p&gt;




&lt;h2&gt;
  
  
  Okay that's cool, but what really happens underneath?
&lt;/h2&gt;

&lt;p&gt;Llama-index and langchain have made the whole process very seamless, which otherwise would've been a really cumbersome task. So, here's what happens:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;When you give llama_index the document, it picks the adapter that suits the file type from the collection of pre-built adapters on the &lt;a href="https://llamahub.ai/" rel="noopener noreferrer"&gt;Llama-Hub&lt;/a&gt; and parses the file's contents.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once the parsing is done, llama_index converts the whole content into chunks of vectors. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;When you put up a question to ChatGPT, llama_index takes in your question and retrieves the chunks of vectors from the parsed file that are relevant to your prompt using &lt;a href="https://www.pinecone.io/learn/what-is-similarity-search/" rel="noopener noreferrer"&gt;similarity search&lt;/a&gt;.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Once it retrieves the relevant chunks, llama_index augments your original prompt, adding what it retrieved as context for the model. &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;With the original question and the context it has just been given, ChatGPT is able to understand and answer your question.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
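&lt;p&gt;The retrieval step above can be sketched without any real embeddings. llama_index uses proper embedding vectors under the hood; this toy version scores chunks by word overlap, and the chunks and helper names are made up for illustration:&lt;/p&gt;

```python
# Toy retrieve-then-prompt loop. Real systems embed text into dense
# vectors; here a chunk's "vector" is just its set of words, and
# similarity is plain Jaccard overlap.

def embed(text):
    return set(text.lower().split())

def similarity(a, b):
    return len(a.intersection(b)) / max(len(a.union(b)), 1)

chunks = [
    "I have a dream that my four little children will one day live in a nation",
    "Five score years ago, a great American signed the Emancipation Proclamation",
]
question = "What happened five score years ago?"

# pick the chunk most similar to the question...
best = max(chunks, key=lambda c: similarity(embed(c), embed(question)))

# ...and prepend it to the original prompt as context for the model
augmented = f"Context: {best}\nQuestion: {question}"
print(augmented)
```

&lt;p&gt;The model then answers the question using only the retrieved context, which is how custom knowledge sneaks into a fixed, pre-trained LLM.&lt;/p&gt;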

&lt;p&gt;And voilà! You'll get a relevant response from ChatGPT. This is how llama_index makes LLMs understand custom knowledge. There's more to it, and new features are being added to llama_index every day, so make sure you explore more of what it can do.&lt;/p&gt;




&lt;p&gt;Now, that's about llama_index. In the next blog of this series, we'll see how to use langchain to build a more robust chatbot framework that keeps track of its previous conversations with the user. See ya in the next one 😉&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>programming</category>
      <category>python</category>
      <category>ai</category>
    </item>
    <item>
      <title>Know when to automate things (and when not to) 👾</title>
      <dc:creator>Adheeban Manoharan</dc:creator>
      <pubDate>Sat, 24 Dec 2022 11:52:21 +0000</pubDate>
      <link>https://forem.com/iamadhee/know-when-to-automate-things-and-when-not-to-26bb</link>
      <guid>https://forem.com/iamadhee/know-when-to-automate-things-and-when-not-to-26bb</guid>
      <description>&lt;p&gt;According to a 2020 study on "shift in number working hours from 1870 to 2020", It was found that an average worker in a rich country today really does work many fewer hours than the average worker 50 years ago. Although, this steady decline in the number of working hours could have been the direct cause of multiple events like the end of the great depression, major global wars etc.., one major and inevitable reason is the rapid technological advancement that we've been experiencing in almost all the fields in the last few decades.&lt;/p&gt;

&lt;p&gt;Every task in today's world is either already automated or someone is trying to automate it. A McKinsey report estimates that automating tasks can increase annual productivity growth by about 1.4% across industries. Yes, automation plays out really well in many of today's fields.&lt;/p&gt;

&lt;p&gt;Now, how about the tech industry? Well, this is where modern-day automation was born. As techies, we all have this constant urge to automate things. But don't you think some of us overdo it a bit? Well, at least I do. I try to automate almost everything I do in a day, whenever I see the opportunity, and with a versatile, powerful scripting language like Python at hand, it doesn't even take a second's thought. Add to that the thrill we get as developers from the challenges a task poses while we try to automate it.&lt;/p&gt;

&lt;p&gt;As cool as that sounds, I sometimes end up spending more time on the automation itself than the original task would have required, which technically defeats the whole purpose of automating in the first place. In short, I end up wasting time trying to find the easier route.&lt;/p&gt;

&lt;p&gt;So, such a task clearly isn't a suitable candidate for automation. Right, so when is automating a task worth your time?&lt;/p&gt;

&lt;h2&gt;
  
  
  Repetitive, manual tasks 🔁
&lt;/h2&gt;

&lt;p&gt;I heard this from a colleague whose work I admire a lot: '&lt;em&gt;If you are doing something more than twice, automate it!&lt;/em&gt;' Tasks that need to be done frequently or on a set schedule are perfect for automation because the benefits are immediate: you'll spend less time each day on repetitive tasks that can quickly become annoying or irritating. Even if it's a task you perform only once a year, go for it. The interval doesn't matter, as long as the task is repetitive and consumes a lot of your precious time and energy.&lt;/p&gt;

&lt;h2&gt;
  
  
  Boring and complex tasks 🥱
&lt;/h2&gt;

&lt;p&gt;If a task doesn't excite you and is quite complex to perform, automate it. It doesn't matter if the automation itself takes some time, because doing something you don't like drags your whole spirit down, which dials back your enthusiasm for other tasks and erodes your overall productivity. Also, a complex task done manually is by nature prone to human error; an automation framework for it goes a long way in preventing those errors, provided you get the framework right.&lt;/p&gt;

&lt;h2&gt;
  
  
  Distractive tasks 😶‍🌫️
&lt;/h2&gt;

&lt;p&gt;When you start your day, you might have a lot of tasks on your plate, and it's very tempting to start with the easiest one rather than the important ones. These easy tasks will sidetrack you from the important, impactful work, yet they can't be skipped, no matter how insignificant they are. If time allows, automate them so you can stay focused on your main tasks. &lt;/p&gt;

&lt;p&gt;If a task checks all the boxes listed above, it's already begging to be automated, so go for it. Don't spend your precious time on the insignificant ones.&lt;/p&gt;

&lt;p&gt;Here's a rule of thumb:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you hate it, automate it !&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Good luck on your automation journey! I'd love to hear your thoughts on this, so please do comment. Cheers! See you in the next post.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;References:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Charlie Giattino, Esteban Ortiz-Ospina and Max Roser (2020) - "Working Hours". Published online at OurWorldInData.org. Retrieved from: '&lt;a href="https://ourworldindata.org/working-hours" rel="noopener noreferrer"&gt;https://ourworldindata.org/working-hours&lt;/a&gt;'&lt;/li&gt;
&lt;li&gt;McKinsey Global Institute (2017) - &lt;a href="https://www.mckinsey.com/~/media/mckinsey/featured%20insights/Digital%20Disruption/Harnessing%20automation%20for%20a%20future%20that%20works/MGI-A-future-that-works-Executive-summary.ashx" rel="noopener noreferrer"&gt;A Future That Works: Automation, Employment, and Productivity&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>automation</category>
      <category>programming</category>
      <category>productivity</category>
      <category>writing</category>
    </item>
  </channel>
</rss>
