<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Lena Jeremiah</title>
    <description>The latest articles on Forem by Lena Jeremiah (@jeremiahjacinth13).</description>
    <link>https://forem.com/jeremiahjacinth13</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F647606%2F3ad2b58a-0306-4d72-8ac1-7be1003682a2.jpg</url>
      <title>Forem: Lena Jeremiah</title>
      <link>https://forem.com/jeremiahjacinth13</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/jeremiahjacinth13"/>
    <language>en</language>
    <item>
      <title>Picshaw - An image sharing app</title>
      <dc:creator>Lena Jeremiah</dc:creator>
      <pubDate>Mon, 14 Oct 2024 04:43:58 +0000</pubDate>
      <link>https://forem.com/jeremiahjacinth13/picshaw-an-image-sharing-app-3799</link>
      <guid>https://forem.com/jeremiahjacinth13/picshaw-an-image-sharing-app-3799</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/pinata"&gt;The Pinata Challenge &lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  What I Built
&lt;/h2&gt;

&lt;p&gt;Imagine attending an exciting event, like a developer conference, where you meet new people, take tons of photos and videos with them, and make lasting memories. But once the event wraps up, you realize you have no easy way to get all those great pictures.&lt;/p&gt;

&lt;p&gt;Or consider a wedding ceremony: you know your guests captured beautiful moments throughout the day, but gathering those photos means reaching out to each person individually. That is definitely not the way to go.&lt;/p&gt;

&lt;p&gt;Picshaw offers a simple solution. As an event organizer, you can create a dedicated event folder on the platform, generate a shareable link, and invite your guests to upload the pictures they’ve taken. This way, everyone gets to relive the event from the unique perspectives of all attendees. No more missing moments, just memories shared effortlessly.&lt;/p&gt;

&lt;h2&gt;
  
  
  Picshaw Features
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Create Public and Private Events.
Public events are showcased on the “Discover Events” page, allowing anyone to browse and explore them. Private events, on the other hand, remain accessible only via a shareable link provided by the event organizer.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe3rmn7j4vxjnwo72mqab.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe3rmn7j4vxjnwo72mqab.png" alt="Image description" width="800" height="935"&gt;&lt;/a&gt;&lt;/p&gt;
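
&lt;p&gt;As a quick illustration of how those shareable links can work, each event can be addressed by a unique URL slug derived from its name. The helper below is a hypothetical sketch (its name and random-suffix scheme are my own, for illustration, and not necessarily how Picshaw does it):&lt;/p&gt;

```typescript
// Illustrative sketch: derive a unique, URL-safe slug for a new event.
// The function name and the random-suffix scheme are hypothetical.
export function makeEventSlug(name: string): string {
  const base = name
    .toLowerCase()
    .trim()
    .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics into "-"
    .replace(/^-|-$/g, "");      // trim leading/trailing hyphens
  // A short random suffix keeps slugs unique without exposing database ids.
  const suffix = Math.random().toString(36).slice(2, 6);
  return base + "-" + suffix;
}
```

&lt;p&gt;The event page and upload endpoint can then be looked up by this slug, which is what the &lt;code&gt;params["event-slug"]&lt;/code&gt; reads later in this post refer to.&lt;/p&gt;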

&lt;ul&gt;
&lt;li&gt;Upload Files Effortlessly
Guests can easily upload their event photos to the designated event folder, making sure all memorable moments are gathered in one place.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmo7ew8e6ddoaurcs37a.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsmo7ew8e6ddoaurcs37a.png" alt="Image description" width="800" height="935"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Browse Uploaded Photos with Two Viewing Modes. &lt;br&gt;
Users can explore photos either in list mode, which arranges images into a scrollable feed similar to Instagram's, or in grid mode, which offers a gallery-style layout.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpbsr70ews33le99y1xy8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpbsr70ews33le99y1xy8.png" alt="Image description" width="800" height="935"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyoijaq9ost26q10b1i1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxyoijaq9ost26q10b1i1.png" alt="Image description" width="800" height="935"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Easily Share Event Links&lt;br&gt;
Organizers can generate and share a unique link to invite guests to upload their pictures, streamlining the process of gathering photos.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogpjtnkctzvd12dgx687.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fogpjtnkctzvd12dgx687.png" alt="Image description" width="800" height="928"&gt;&lt;/a&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Discover Public Events&lt;br&gt;
Explore and browse all public events from the “Discover Events” page, opening up new ways to experience moments shared by others.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsz0x9q7592dh379ikbhq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsz0x9q7592dh379ikbhq.png" alt="Image description" width="800" height="928"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Seamless Authentication
Users can sign in quickly and securely using Google sign-in or magic links, making the experience smooth and hassle-free.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhxokoj9gmk21xmb4b0zx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhxokoj9gmk21xmb4b0zx.png" alt="Image description" width="800" height="928"&gt;&lt;/a&gt;&lt;/p&gt;
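
&lt;p&gt;For context, here is a minimal sketch of what such an auth setup can look like with Auth.js (NextAuth v5). The provider choices and options below are assumptions for illustration; the magic-link provider in particular may differ from what Picshaw actually uses:&lt;/p&gt;

```typescript
// Hypothetical sketch of an Auth.js (NextAuth v5) setup with Google sign-in
// and email magic links. Provider choices and options are assumptions.
import NextAuth from "next-auth";
import Google from "next-auth/providers/google";
import Resend from "next-auth/providers/resend";

export const { handlers, auth, signIn, signOut } = NextAuth({
  providers: [
    Google, // reads AUTH_GOOGLE_ID / AUTH_GOOGLE_SECRET from the environment
    Resend({ from: "login@example.com" }), // sends magic-link emails
  ],
});
```

&lt;p&gt;The exported &lt;code&gt;auth()&lt;/code&gt; helper is what the API route calls to check the session before accepting uploads.&lt;/p&gt;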

&lt;ul&gt;
&lt;li&gt;Support for dark mode and light mode
&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fd0pvbtm500h6v1og7xz8.png" alt="Image description" width="800" height="928"&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Demo
&lt;/h2&gt;

&lt;p&gt;Try out Picshaw live: &lt;a href="https://picshaw.vercel.app" rel="noopener noreferrer"&gt;https://picshaw.vercel.app&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  My Code
&lt;/h2&gt;

&lt;p&gt;The full code is available on GitHub. Feel free to star the repo and follow me! 😊&lt;br&gt;
&lt;/p&gt;
&lt;div class="ltag-github-readme-tag"&gt;
  &lt;div class="readme-overview"&gt;
    &lt;h2&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fassets.dev.to%2Fassets%2Fgithub-logo-5a155e1f9a670af7944dd5e12375bc76ed542ea80224905ecaf878b9157cdefc.svg" alt="GitHub logo"&gt;
      &lt;a href="https://github.com/lenajeremy" rel="noopener noreferrer"&gt;
        lenajeremy
      &lt;/a&gt; / &lt;a href="https://github.com/lenajeremy/hack-photobomb" rel="noopener noreferrer"&gt;
        hack-photobomb
      &lt;/a&gt;
    &lt;/h2&gt;
    &lt;h3&gt;
      
    &lt;/h3&gt;
  &lt;/div&gt;
  &lt;div class="ltag-github-body"&gt;
    
&lt;div id="readme" class="md"&gt;
&lt;p&gt;This is a &lt;a href="https://nextjs.org" rel="nofollow noopener noreferrer"&gt;Next.js&lt;/a&gt; project bootstrapped with &lt;a href="https://nextjs.org/docs/app/api-reference/cli/create-next-app" rel="nofollow noopener noreferrer"&gt;&lt;code&gt;create-next-app&lt;/code&gt;&lt;/a&gt;.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Getting Started&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;First, run the development server:&lt;/p&gt;
&lt;div class="highlight highlight-source-shell notranslate position-relative overflow-auto js-code-highlight"&gt;
&lt;pre&gt;npm run dev
&lt;span class="pl-c"&gt;&lt;span class="pl-c"&gt;#&lt;/span&gt; or&lt;/span&gt;
yarn dev
&lt;span class="pl-c"&gt;&lt;span class="pl-c"&gt;#&lt;/span&gt; or&lt;/span&gt;
pnpm dev
&lt;span class="pl-c"&gt;&lt;span class="pl-c"&gt;#&lt;/span&gt; or&lt;/span&gt;
bun dev&lt;/pre&gt;

&lt;/div&gt;
&lt;p&gt;Open &lt;a href="http://localhost:3000" rel="nofollow noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt; with your browser to see the result.&lt;/p&gt;
&lt;p&gt;You can start editing the page by modifying &lt;code&gt;app/page.tsx&lt;/code&gt;. The page auto-updates as you edit the file.&lt;/p&gt;
&lt;p&gt;This project uses &lt;a href="https://nextjs.org/docs/app/building-your-application/optimizing/fonts" rel="nofollow noopener noreferrer"&gt;&lt;code&gt;next/font&lt;/code&gt;&lt;/a&gt; to automatically optimize and load &lt;a href="https://vercel.com/font" rel="nofollow noopener noreferrer"&gt;Geist&lt;/a&gt;, a new font family for Vercel.&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Learn More&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;To learn more about Next.js, take a look at the following resources:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;a href="https://nextjs.org/docs" rel="nofollow noopener noreferrer"&gt;Next.js Documentation&lt;/a&gt; - learn about Next.js features and API.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://nextjs.org/learn" rel="nofollow noopener noreferrer"&gt;Learn Next.js&lt;/a&gt; - an interactive Next.js tutorial.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;You can check out &lt;a href="https://github.com/vercel/next.js" rel="noopener noreferrer"&gt;the Next.js GitHub repository&lt;/a&gt; - your feedback and contributions are welcome!&lt;/p&gt;
&lt;div class="markdown-heading"&gt;
&lt;h2 class="heading-element"&gt;Deploy on Vercel&lt;/h2&gt;
&lt;/div&gt;
&lt;p&gt;The easiest way to deploy your Next.js app is to use the &lt;a href="https://vercel.com/new?utm_medium=default-template&amp;amp;filter=next.js&amp;amp;utm_source=create-next-app&amp;amp;utm_campaign=create-next-app-readme" rel="nofollow noopener noreferrer"&gt;Vercel Platform&lt;/a&gt; from the creators of Next.js.&lt;/p&gt;
&lt;p&gt;Check out our &lt;a href="https://nextjs.org/docs/app/building-your-application/deploying" rel="nofollow noopener noreferrer"&gt;Next.js deployment documentation&lt;/a&gt; for more…&lt;/p&gt;
&lt;/div&gt;
  &lt;/div&gt;
  &lt;div class="gh-btn-container"&gt;&lt;a class="gh-btn" href="https://github.com/lenajeremy/hack-photobomb" rel="noopener noreferrer"&gt;View on GitHub&lt;/a&gt;&lt;/div&gt;
&lt;/div&gt;





&lt;h2&gt;
  
  
  Using Pinata
&lt;/h2&gt;

&lt;p&gt;Integrating with Pinata was one of the smoothest parts of the project. Here’s a breakdown of how I implemented it:&lt;/p&gt;

&lt;h3&gt;
  
  
  1. Initialize the Pinata SDK
&lt;/h3&gt;

&lt;p&gt;I set up a dedicated file, &lt;code&gt;@/lib/pinata.ts&lt;/code&gt;, to manage the Pinata configuration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;PinataSDK&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;pinata&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;pinata&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;PinataSDK&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
    &lt;span class="na"&gt;pinataJwt&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;PINATA_JWT&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
    &lt;span class="na"&gt;pinataGateway&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;env&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;NEXT_PUBLIC_GATEWAY_URL&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h3&gt;
  
  
  2. In My API Route
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Ensure the user is logged in:
I verified that the user was authenticated before allowing any upload.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;session&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;auth&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;

&lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
   &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;respondError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;User not authenticated&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Failed to authenticate user&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;401&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Validate the request:
I checked that the event exists and that the number of uploaded files is within the allowed limits.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;form&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;request&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;formData&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;files&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nb"&gt;Array&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="k"&gt;from&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;form&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;values&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;eventSlug&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;params&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;event-slug&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;

&lt;span class="k"&gt;try&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;prisma&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;findUnique&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;where&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;slug&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;eventSlug&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="o"&gt;!&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;respondError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Event not found&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="kc"&gt;undefined&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;404&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;&amp;gt;&lt;/span&gt; &lt;span class="mi"&gt;50&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;respondError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Limit of 50 files per request&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Too many files&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;respondError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;Error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;At least one file is required&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;No files&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;400&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Upload the files to Pinata:
With validation done, the next step is uploading the files to Pinata in parallel.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;f&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="k"&gt;typeof&lt;/span&gt; &lt;span class="nx"&gt;f&lt;/span&gt; &lt;span class="o"&gt;!==&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;string&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;f&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;pinata&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;upload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;file&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;f&lt;/span&gt;&lt;span class="p"&gt;)))&lt;/span&gt;
        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;successfulUploads&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;filter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;status&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;fulfilled&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;

        &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;uploadsWithPublicURL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nb"&gt;Promise&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;all&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;successfulUploads&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;publicURL&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;pinata&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;gateways&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;createSignedURL&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt; &lt;span class="na"&gt;cid&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;cid&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="na"&gt;expires&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;60&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;24&lt;/span&gt; &lt;span class="o"&gt;*&lt;/span&gt; &lt;span class="mi"&gt;365&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="p"&gt;...&lt;/span&gt;&lt;span class="nx"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;value&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;publicURL&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;}))&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note: A better implementation would include checks for failed uploads. This way, users are notified about failed files and can retry uploading them.&lt;/p&gt;
&lt;/blockquote&gt;
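
&lt;p&gt;A minimal sketch of that improvement: partition the settled results so rejected uploads can be sent back to the client for a retry. The helper below is hypothetical and not part of the current implementation:&lt;/p&gt;

```typescript
// Hypothetical helper: given the array returned by Promise.allSettled, split
// successes from failures so the API can report failed files for a retry.
export function partitionUploads(results: any[]) {
  const fulfilled = results
    .filter((r) => r.status === "fulfilled")
    .map((r) => r.value);
  const rejected = results
    .filter((r) => r.status === "rejected")
    .map((r) => String(r.reason));
  return { fulfilled, rejected };
}
```

&lt;p&gt;The route could then save only the &lt;code&gt;fulfilled&lt;/code&gt; entries and include &lt;code&gt;rejected&lt;/code&gt; in the response body so the client knows which files to resubmit.&lt;/p&gt;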

&lt;ul&gt;
&lt;li&gt;Save Upload Data to the Database
Finally, I saved the upload information into the database using Prisma.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;dbUpload&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;prisma&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;upload&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;create&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;
            &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                &lt;span class="na"&gt;text&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; uploaded &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;files&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;length&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt; images for &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;name&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;`&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;ownerId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;eventId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                &lt;span class="na"&gt;files&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                    &lt;span class="na"&gt;createMany&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                        &lt;span class="na"&gt;data&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;uploadsWithPublicURL&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;map&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;file&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                            &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
                                &lt;span class="na"&gt;ownerId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;session&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;user&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                                &lt;span class="na"&gt;eventId&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;event&lt;/span&gt;&lt;span class="p"&gt;?.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt; &lt;span class="o"&gt;??&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                                &lt;span class="na"&gt;cid&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;id&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                                &lt;span class="na"&gt;publicURL&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;file&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;publicURL&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                            &lt;span class="p"&gt;}&lt;/span&gt;
                        &lt;span class="p"&gt;})&lt;/span&gt;
                    &lt;span class="p"&gt;}&lt;/span&gt;
                &lt;span class="p"&gt;}&lt;/span&gt;
            &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="p"&gt;})&lt;/span&gt;

        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;respondSuccess&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;dbUpload&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Uploaded successfully&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;201&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
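
&lt;p&gt;The &lt;code&gt;respondSuccess&lt;/code&gt; and &lt;code&gt;respondError&lt;/code&gt; helpers used throughout these snippets aren't shown above; here is a minimal sketch of the shape they could take (this is an assumption for illustration, not the exact implementation):&lt;/p&gt;

```typescript
// Hypothetical sketches of the response helpers used in the API route.
// The real implementations are not shown in the post; shapes are assumed.
export function respondSuccess(data: any, message: string, status: number) {
  return Response.json({ success: true, message, data }, { status });
}

export function respondError(error: Error, message: string | undefined, status: number) {
  return Response.json(
    { success: false, message: message ?? error.message },
    { status },
  );
}
```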






&lt;p&gt;This flow ensures the event photos are uploaded securely and efficiently to Pinata, with successful uploads tracked and stored in the database for easy access later.&lt;/p&gt;

&lt;h2&gt;
  
  
  Rendering the uploaded images in the browser
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. Updating &lt;code&gt;next.config.js&lt;/code&gt;
&lt;/h3&gt;

&lt;p&gt;I first had to update the &lt;code&gt;images&lt;/code&gt; field in the &lt;code&gt;next.config.js&lt;/code&gt; file to allow Next.js to optimize images served from remote URLs.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="cm"&gt;/** @type {import('next').NextConfig} */&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;nextConfig&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="na"&gt;images&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="na"&gt;remotePatterns&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;
      &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="na"&gt;hostname&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;sapphire-obliged-canidae-626.mypinata.cloud&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
        &lt;span class="na"&gt;protocol&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;https&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
      &lt;span class="p"&gt;},&lt;/span&gt;
    &lt;span class="p"&gt;],&lt;/span&gt;
  &lt;span class="p"&gt;},&lt;/span&gt;
&lt;span class="p"&gt;};&lt;/span&gt;

&lt;span class="k"&gt;export&lt;/span&gt; &lt;span class="k"&gt;default&lt;/span&gt; &lt;span class="nx"&gt;nextConfig&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  2. Fetching Event Data
&lt;/h3&gt;

&lt;p&gt;In my client, I use a custom hook &lt;code&gt;useFetch&lt;/code&gt; to fetch the details of the selected event.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt; const params = useParams();
  const eventSlug = params["event-slug"];
  const [viewMode, setViewMode] = React.useState&amp;lt;"grid" | "list"&amp;gt;("grid");
  const router = useRouter();

  const {
    loading,
    data,
    trigger: getEventDetails,
  } = useFetch&amp;lt;void, GetUploadsResponse&amp;gt;(`/api/e/${eventSlug}`, undefined, {
    fetchOnRender: true,
  });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  3. Rendering the Images
&lt;/h3&gt;

&lt;p&gt;I render the retrieved images inside a responsive grid.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;  &amp;lt;div className="grid grid-cols-3 md:grid-cols-4 gap-1 my-6"&amp;gt;
        {photos.map((file) =&amp;gt; (
          &amp;lt;Image
            key={file.id}
            src={file.publicURL}
            width={400}
            height={400}
            alt=""
            className="aspect-square object-cover"
          /&amp;gt;
      ))}
 &amp;lt;/div&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Ideas for improvement
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;To further improve the app, it would be necessary to add some content moderation features. This would ensure that users don't post NSFW content on public groups. I started integrating with Google Cloud Vision but I didn't have enough time to complete it before the deadline.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Integrating Pinata’s Files API into Picshaw has greatly enhanced how images are uploaded and managed. Pinata provided seamless performance, and its intuitive API made implementation straightforward, allowing me to focus on building the core features of the app while trusting Pinata to handle file storage and delivery efficiently.&lt;/p&gt;

&lt;p&gt;Overall, Pinata has been an essential tool in building a reliable and smooth image management system for Picshaw.&lt;/p&gt;

&lt;p&gt;Also, I really enjoyed building Picshaw.&lt;/p&gt;

&lt;p&gt;Follow me on X &lt;a href="https://x.com/jeremiahlena13" rel="noopener noreferrer"&gt;@jeremiahlena13&lt;/a&gt;&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>pinatachallenge</category>
      <category>webdev</category>
      <category>api</category>
    </item>
    <item>
      <title>Build AI Chat Apps in Minutes: Unleashing the Power of NeonChat Starter Kit</title>
      <dc:creator>Lena Jeremiah</dc:creator>
      <pubDate>Mon, 02 Sep 2024 03:00:22 +0000</pubDate>
      <link>https://forem.com/jeremiahjacinth13/build-ai-chat-apps-in-minutes-unleashing-the-power-of-neonchat-starter-kit-3ip9</link>
      <guid>https://forem.com/jeremiahjacinth13/build-ai-chat-apps-in-minutes-unleashing-the-power-of-neonchat-starter-kit-3ip9</guid>
      <description>&lt;p&gt;&lt;em&gt;This is a submission for the &lt;a href="https://dev.to/challenges/neon"&gt;Neon Open Source Starter Kit Challenge &lt;/a&gt;: Ultimate Starter Kit&lt;/em&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  My Kit
&lt;/h2&gt;

&lt;p&gt;Building AI chat apps can be a real challenge. You've got to juggle user logins, make sure AI responses appear instantly, save all those conversations, and tie everything to a solid database. It's a lot to handle, even for experienced developers. But here's the good news: I've cooked up something that's about to make your life a whole lot easier.&lt;/p&gt;

&lt;p&gt;Welcome to NeonChat, the ultimate open-source starter kit for building AI-powered chat applications! This kit combines the power of Next.js, Prisma, and Neon's serverless Postgres to create a flexible, scalable, and developer-friendly foundation for your next AI chat project.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgarbuj53ar9k6jpe4kx5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgarbuj53ar9k6jpe4kx5.png" alt="Image description" width="800" height="447"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The main features? Well, we've got:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A clean chat interface that doesn't look like it's from the 90s&lt;/li&gt;
&lt;li&gt;Login with email or GitHub &lt;/li&gt;
&lt;li&gt;Real-time chatting with AI&lt;/li&gt;
&lt;li&gt;Saving conversations (Prisma doing its thing with Neon's Postgres)&lt;/li&gt;
&lt;li&gt;A CLI tool which allows developers to easily select whichever AI provider they want to use and get started in minutes.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Link to Kit
&lt;/h2&gt;

&lt;p&gt;You can find the NeonChat starter kit on GitHub &lt;a href="https://github.com/lenajeremy/nextjs-ai-neon-starter" rel="noopener noreferrer"&gt;https://github.com/lenajeremy/nextjs-ai-neon-starter&lt;/a&gt;&lt;br&gt;
The CLI tool is at &lt;a href="https://github.com/lenajeremy/nextjs-ai-starter-script" rel="noopener noreferrer"&gt;https://github.com/lenajeremy/nextjs-ai-starter-script&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Or you can simply run this command to get started&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx neonchat
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  My Journey
&lt;/h2&gt;

&lt;p&gt;I chose this stack for several reasons:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Next.js because it's just awesome.&lt;/li&gt;
&lt;li&gt;Prisma makes database stuff really simple and intuitive.&lt;/li&gt;
&lt;li&gt;Neon's Postgres for its scalability and ease of use. Also, because I needed to 😂😂&lt;/li&gt;
&lt;li&gt;I used Vercel's AI SDK because I wanted people to be able to use whatever AI service they wanted to and not be restricted to the one (the one that was kind enough to give me free credits). And also because it was really easy to integrate.&lt;/li&gt;
&lt;li&gt;Shadcn because it's the best.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Things I learned:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Trying to make something flexible AND simple is like juggling flaming swords while balancing on a tightrope.&lt;/li&gt;
&lt;li&gt;Documentation is Key: Writing clear documentation is as important as writing good code. I put extra effort into creating a README that guides developers through every step of the process.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0g1zwb9snc6e09d3ufhg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0g1zwb9snc6e09d3ufhg.png" alt="Image description" width="800" height="417"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To help you get up and running quickly, here's a brief overview of how to use NeonChat:&lt;/p&gt;

&lt;h2&gt;
  
  
  Get Started
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Run &lt;code&gt;npx neonchat&lt;/code&gt; in your terminal.&lt;/li&gt;
&lt;li&gt;Follow the prompts to select your preferred AI provider and configure your project.&lt;/li&gt;
&lt;li&gt;The CLI tool will set up your project, customize the starter templates based on the selected AI provider, and install the required dependencies for you.&lt;/li&gt;
&lt;li&gt;Navigate to the created project directory.&lt;/li&gt;
&lt;li&gt;Start the development server with &lt;code&gt;npm run dev&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Visit &lt;a href="http://localhost:3000" rel="noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt; to see your NeonChat instance in action!&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmorw7yhsx79o9fyjm34g.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fmorw7yhsx79o9fyjm34g.png" alt="Image description" width="800" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrs89zyytjmhwixh8uc1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrs89zyytjmhwixh8uc1.png" alt="Image description" width="800" height="500"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;NeonChat aims to empower developers to build AI-powered chat applications quickly and efficiently. By combining a feature-rich frontend, a flexible backend, and an intuitive setup process, we hope to inspire a new wave of innovative AI projects.&lt;br&gt;
We're excited to see what the community builds with NeonChat. Happy coding!&lt;/p&gt;

</description>
      <category>devchallenge</category>
      <category>neonchallenge</category>
      <category>postgres</category>
      <category>database</category>
    </item>
    <item>
      <title>Practical Guide to Web-scraping in Python</title>
      <dc:creator>Lena Jeremiah</dc:creator>
      <pubDate>Fri, 17 Mar 2023 14:04:58 +0000</pubDate>
      <link>https://forem.com/jeremiahjacinth13/practical-guide-to-web-scraping-in-python-28ba</link>
      <guid>https://forem.com/jeremiahjacinth13/practical-guide-to-web-scraping-in-python-28ba</guid>
      <description>&lt;p&gt;Web scraping, in simple terms, is a technique used to collect useful data from a website. You'd agree that there's a ton of information on the internet, and most times, this information might not be structured exactly how we want. So, web scraping allows us to extract data from the internet, which can then be restructured or rearranged so that it is useful to us.&lt;/p&gt;

&lt;p&gt;In this article, I'll be taking you step-by-step through the web scraping process, providing you with useful advice on how to develop your skills, tips on what to do and what not to do when scraping websites (copyright infringement in particular), and much more. Additionally, we'll scrape two websites, &lt;a href="https://coinmarketcap.com/" rel="noopener noreferrer"&gt;CoinMarketCap&lt;/a&gt; and &lt;a href="https://news.bitcoin.com/" rel="noopener noreferrer"&gt;Bitcoin.com&lt;/a&gt;, to get the prices of cryptocurrencies and the latest news about Bitcoin.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Basic programming skill. Although we will be using Python in this article, you can still benefit from it even if you're more familiar with other programming languages.&lt;/li&gt;
&lt;li&gt;A fundamental understanding of HTML, HTML elements and attributes, etc.&lt;/li&gt;
&lt;li&gt;Knowledge of Chrome DevTools (not required)&lt;/li&gt;
&lt;li&gt;A computer or other tool for running Python code.&lt;/li&gt;
&lt;/ol&gt;

&lt;blockquote&gt;
&lt;p&gt;Let me quickly introduce you to two packages: requests and BeautifulSoup. They are Python libraries used for scraping websites, and we'll need both in this article.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h2&gt;
  
  
  The Web-scraping Process
&lt;/h2&gt;

&lt;p&gt;When you go to your browser and input the url of the website you want to visit, your browser makes a request to the server. The server then responds to your browser. The browser receives data from the server as text, HTML, JSON, or a multipart request (media file).&lt;/p&gt;

&lt;p&gt;When the browser receives HTML, it parses it, generates DOM nodes, and renders it to the page, displaying the lovely website on your screen.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3jrhnl27ldelulek9jm9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3jrhnl27ldelulek9jm9.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/3jrhnl27ldelulek9jm9.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When scraping websites, we need this HTML content, and that introduces us to the first package &lt;code&gt;requests&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://pypi.org/project/requests/" rel="noopener noreferrer"&gt;Requests&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;As the name suggests, it's used for sending requests to servers. If you have any experience in frontend development or Javascript, it's the equivalent of the &lt;code&gt;fetch&lt;/code&gt; API in Javascript.&lt;/p&gt;

&lt;p&gt;For us to be able to scrape website content, we need to ensure that the server endpoint we're calling returns HTML.&lt;/p&gt;

&lt;p&gt;After getting the HTML, the next course of action is to collect the data we need. And this brings us to the second package &lt;code&gt;BeautifulSoup&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;a href="https://www.crummy.com/software/BeautifulSoup/bs4/doc/" rel="noopener noreferrer"&gt;Beautiful Soup&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Beautiful Soup is a library used for pulling data out of HTML and XML files. It contains a lot of very useful utilities, which make web-scraping really easy.&lt;/p&gt;
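&lt;p&gt;To make this concrete, here's a small, self-contained sketch of what Beautiful Soup lets us do. The markup below is made up for illustration; &lt;code&gt;find&lt;/code&gt; and &lt;code&gt;findAll&lt;/code&gt; are the two methods we'll lean on later:&lt;/p&gt;

```python
from bs4 import BeautifulSoup

# a tiny, made-up HTML snippet standing in for a real page
html = """
<table>
  <tr><td><p class="coin-name">Bitcoin</p><p class="coin-price">$27,000</p></td></tr>
  <tr><td><p class="coin-name">Ethereum</p><p class="coin-price">$1,800</p></td></tr>
</table>
"""

soup = BeautifulSoup(html, 'html.parser')

# find() returns the first matching element; findAll() returns every match
first_name = soup.find('p', class_='coin-name').text
all_prices = [p.text for p in soup.findAll('p', class_='coin-price')]

print(first_name)   # Bitcoin
print(all_prices)   # ['$27,000', '$1,800']
```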

&lt;h2&gt;
  
  
  Building the Scrapers (Writing Code)
&lt;/h2&gt;

&lt;p&gt;After much talk, let's get into the crux of the matter, writing code and building the actual web-scrapers.&lt;/p&gt;

&lt;p&gt;I've created a Github &lt;a href="https://github.com/Jeremiahjacinth13/webscrapers-in-python" rel="noopener noreferrer"&gt;repo&lt;/a&gt; which contains the code for the scrapers. The complete code is on the &lt;code&gt;completed&lt;/code&gt; branch; check out the &lt;code&gt;get-started&lt;/code&gt; branch to follow along.&lt;/p&gt;

&lt;p&gt;I'll be creating the project from scratch in this article, for those who might not want to clone a Github repo.&lt;/p&gt;

&lt;h3&gt;
  
  
  PROJECT SETUP
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Create a new folder. Name it whatever you want.&lt;/li&gt;
&lt;li&gt;Set up a new virtual environment.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Run the following commands to set up a new virtual environment&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 shell
# if you've not already installed virtualenv
pip install virtualenv

# you can name your environment whatever you want
python -m venv &amp;lt;name_of_environment&amp;gt;


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After running the previous commands successfully, you should see a folder with the name of the virtual environment you just created.&lt;/p&gt;

&lt;p&gt;You also need to activate the virtual environment. To do that, run the following commands depending on your OS.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 shell
# macOS / linux
source ./&amp;lt;virtual_env_name&amp;gt;/bin/activate

# windows
&amp;lt;virtual_env_name&amp;gt;/Scripts/activate.bat   # In CMD
&amp;lt;virtual_env_name&amp;gt;/Scripts/Activate.ps1   # In Powershell


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="3"&gt;
&lt;li&gt;Install the required packages.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Having activated our virtual environment, let's go ahead and install the packages. To do so, run the following in your terminal:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 shell
pip install requests beautifulsoup4


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ol start="4"&gt;
&lt;li&gt;Generate a &lt;code&gt;requirements.txt&lt;/code&gt; file.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you'll be sharing your code with someone or a team, it's important that the other party installs the exact same package versions you used to build the project. There could be breaking changes to a package, so it's important that everybody uses the same version. A &lt;code&gt;requirements.txt&lt;/code&gt; file lets others know the exact versions of the packages you used on the project.&lt;/p&gt;

&lt;p&gt;To generate a &lt;code&gt;requirements.txt&lt;/code&gt; file, just run:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 shell
pip freeze &amp;gt; requirements.txt


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After running this, you should see a file containing the packages required to run the project.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;If you're working in a team and there's a requirements.txt file in the project, run &lt;code&gt;pip install -r requirements.txt&lt;/code&gt; to install the exact versions of the packages in the requirements.txt file.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;How many times did I say requirements.txt file?🤡&lt;/p&gt;

&lt;ol start="5"&gt;
&lt;li&gt;Add a .gitignore file.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This is used to indicate to git the files or folders to ignore when you run &lt;code&gt;git add .&lt;/code&gt;. Add &lt;code&gt;/&amp;lt;virtual_env_name&amp;gt;&lt;/code&gt; to your .gitignore file.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fifeeagl0b6fouy4m272h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fifeeagl0b6fouy4m272h.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ifeeagl0b6fouy4m272h.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Scraping &lt;a href="https://news.bitcoin.com/" rel="noopener noreferrer"&gt;Bitcoin.com&lt;/a&gt;.
&lt;/h3&gt;

&lt;p&gt;Now, we'll be scraping the first website. The first thing we need to do is to get the HTML with requests.&lt;/p&gt;

&lt;p&gt;So, create a file &lt;code&gt;bitcoinnews.py&lt;/code&gt; (you can name it whatever you want).&lt;/p&gt;

&lt;p&gt;Add the following code to the file&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests

def get_news():
    url = '&amp;lt;https://news.bitcoin.com&amp;gt;'

    response = requests.get(url) # making a request
    response_text = response.text # getting the html

    # here's where we should get the news content from the html, we're just printing the content to the terminal for now
    print(response_text)

get_news()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
&lt;h3&gt;
  
  
  A bit on HTML and frontend development
&lt;/h3&gt;

&lt;p&gt;When scraping websites, it's important that we understand basic HTML as this would help us decide the best way to extract our data.&lt;/p&gt;

&lt;p&gt;HTML (HyperText Markup Language) is the standard markup language for creating webpages. HTML is used to describe the structure of a web page. &lt;a href="https://www.w3schools.com/html/html_elements.asp" rel="noopener noreferrer"&gt;HTML elements&lt;/a&gt; are the building blocks of every webpage. HTML elements are related to one another in a way that resembles a tree.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faczrvybxcblzq2imp9ef.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Faczrvybxcblzq2imp9ef.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/aczrvybxcblzq2imp9ef.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the diagram above, we have the root element, the &lt;code&gt;&amp;lt;html&amp;gt;&lt;/code&gt; element, which has two children, the &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt; and &lt;code&gt;&amp;lt;body&amp;gt;&lt;/code&gt; elements. The &lt;code&gt;&amp;lt;head&amp;gt;&lt;/code&gt; tag has a single child, the &lt;code&gt;&amp;lt;title&amp;gt;&lt;/code&gt; tag, which has simple text as its child. The &lt;code&gt;&amp;lt;body&amp;gt;&lt;/code&gt; has three children (one heading tag and two paragraph tags).&lt;/p&gt;

&lt;p&gt;This tree-like structure facilitates the composition of elements and the definition of relationships between them. It makes it simple to write a paragraph and include links within it.&lt;/p&gt;
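&lt;p&gt;We can also explore this tree programmatically. Here's a quick sketch using Beautiful Soup on a made-up page shaped like the diagram above:&lt;/p&gt;

```python
from bs4 import BeautifulSoup

# a minimal page mirroring the tree in the diagram (made up for illustration)
html = """
<html>
  <head><title>My Page</title></head>
  <body>
    <h1>Heading</h1>
    <p>First paragraph</p>
    <p>Second paragraph</p>
  </body>
</html>
"""

soup = BeautifulSoup(html, 'html.parser')

# the <title> is a child of <head>, and its own child is plain text
title_text = soup.head.title.text

# <body> has three direct element children: one heading and two paragraphs
body_children = [child.name for child in soup.body.find_all(True, recursive=False)]

print(title_text)      # My Page
print(body_children)   # ['h1', 'p', 'p']
```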

&lt;p&gt;HTML elements, in addition to having children, have attributes, which add a lot more functionality to them. An 'image' element would require the source of the image it is supposed to display. When you click on a link (anchor tag), it needs to know where you want to go. Attributes can also be used to distinguish between a single element and a group of elements.&lt;/p&gt;

&lt;p&gt;Understanding HTML attributes is really important because when scraping a website, we need specific data and it's important we can target the elements that have the data we need.&lt;/p&gt;
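&lt;p&gt;As a quick illustration (the markup here is invented, not from a real site), this is how those attributes translate into Beautiful Soup lookups:&lt;/p&gt;

```python
from bs4 import BeautifulSoup

# made-up markup: a unique element (id), a group of similar elements (class),
# and an element we can only target through a custom attribute
html = """
<div id="summary">Market summary</div>
<span class="symbol">BTC</span>
<span class="symbol">ETH</span>
<a href="/about" data-section="footer">About</a>
"""

soup = BeautifulSoup(html, 'html.parser')

# an id identifies exactly one element on the page
summary = soup.find(id='summary').text

# a class usually identifies a group of similar elements
symbols = [s.text for s in soup.find_all('span', class_='symbol')]

# attrs= lets us match on any attribute, not just class or id
about_href = soup.find('a', attrs={'data-section': 'footer'})['href']

print(summary, symbols, about_href)
```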

&lt;p&gt;If you want to learn more about HTML, you can check out &lt;a href="https://www.w3schools.com/html/html_intro.asp" rel="noopener noreferrer"&gt;W3schools.com&lt;/a&gt;&lt;/p&gt;
&lt;h3&gt;
  
  
  Scraping top crypto prices from CoinMarketCap.
&lt;/h3&gt;

&lt;p&gt;Before we get back to our &lt;a href="https://news.bitcoin.com/" rel="noopener noreferrer"&gt;Bitcoin News&lt;/a&gt; scraper, let's scrape the prices of the top cryptocurrencies from CoinMarketCap, alongside the images.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;TIP: Before you start scraping a website, disable Javascript and open the page in that browser. This is significant because some websites display content on the webpage using client-side Javascript, which cannot be used by our scrapers. So it only makes sense to see the exact HTML that the server is returning.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;With Javascript running&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3k9s3d64ic4xsp6y7b2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fq3k9s3d64ic4xsp6y7b2.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/q3k9s3d64ic4xsp6y7b2.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Without Javascript running&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5tojecltd5d4ixzh58yi.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F5tojecltd5d4ixzh58yi.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/5tojecltd5d4ixzh58yi.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you open &lt;a href="https://coinmarketcap.com" rel="noopener noreferrer"&gt;coinmarketcap.com&lt;/a&gt; on a browser with Javascript disabled, you'd notice that you can only get the prices for the first ten cryptocurrencies. You'd also notice that on a browser where Javascript is enabled, you can scroll to get prices of other cryptocurrencies. This difference in the look of the page (HTML) can cause you a lot of trouble when scraping a website.&lt;/p&gt;
&lt;h3&gt;
  
  
  Back to the code editor
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Create a new file. You can call it whatever you want. I called mine &lt;code&gt;coinmarketcap.py&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Add the following code to it.&lt;/li&gt;
&lt;/ol&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests
from bs4 import BeautifulSoup

def get_crypto_prices():
    url = '&amp;lt;https://coinmarketcap.com&amp;gt;'
    response_text = requests.get(url).text

    soup = BeautifulSoup(response_text, 'html.parser') #added this line

    print(soup)

get_crypto_prices()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;This looks almost exactly like the previous file; I only changed the URL and function name, and added the line with BeautifulSoup.&lt;/p&gt;

&lt;p&gt;That line of code parses the HTML text we get from the endpoint and converts it to a &lt;code&gt;BeautifulSoup&lt;/code&gt; object which has methods we can use to extract information from the text.&lt;/p&gt;

&lt;p&gt;Now that we have access to the HTML, let's figure out how to scrape the data we need: in our case, prices of cryptocurrencies.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open up DevTools on your web browser.
If you're using Chrome on a Mac, you can press &lt;code&gt;Cmd + Option + I&lt;/code&gt; to open up DevTools.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4hr7493ljxdlefmagdnj.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F4hr7493ljxdlefmagdnj.gif" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/4hr7493ljxdlefmagdnj.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Or you can just right click on the target element and click inspect.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8w5b3wjlk16ggo6m92sa.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8w5b3wjlk16ggo6m92sa.gif" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8w5b3wjlk16ggo6m92sa.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look at the elements in the Inspect Element section of Chrome DevTools, you'd notice that there's a table element with children. Inside the table element, we have the &lt;code&gt;colgroup&lt;/code&gt;, &lt;code&gt;thead&lt;/code&gt;, and &lt;code&gt;tbody&lt;/code&gt; elements. The &lt;code&gt;tbody&lt;/code&gt; element contains all the content of the table. Inside the &lt;code&gt;tbody&lt;/code&gt; element, you'd notice &lt;code&gt;tr&lt;/code&gt; elements, each denoting a table row and containing multiple &lt;code&gt;td&lt;/code&gt; (table data) elements.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8j2odt2tkiv24q51wkbw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8j2odt2tkiv24q51wkbw.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/8j2odt2tkiv24q51wkbw.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, just knowing that the data we need is inside the table isn't enough. We have to dig deeper into the HTML tree to be able to extract the exact data we need. In this case, we need the name of the cryptocurrency, the shortened name (e.g. BTC for Bitcoin), the current price and the image of the cryptocurrency.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgm7g9lzv9v62zs7oaawu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgm7g9lzv9v62zs7oaawu.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/gm7g9lzv9v62zs7oaawu.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you look closely, you'd notice that the name of the cryptocurrency is in a paragraph tag with the class &lt;code&gt;sc-e225a64a-0 ePTNty&lt;/code&gt;. The shortened name of the cryptocurrency is also in a paragraph tag with a different class name &lt;code&gt;sc-e225a64a-0 dfeAJi coin-item-symbol&lt;/code&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;We use attributes like class and id to uniquely identify HTML elements or groups of familiar HTML elements.  When we have these distinct attributes, we can use them to target the elements and extract the values we need from them.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;h3&gt;
  
  
  What we've done so far
&lt;/h3&gt;

&lt;p&gt;From analysing &lt;a href="https://www.coinmarketcap.com/" rel="noopener noreferrer"&gt;Coinmarketcap's&lt;/a&gt; website, we've seen that each cryptocurrency's data is in a table row, and each row has children that contain the data we'd like to scrape.&lt;/p&gt;

&lt;p&gt;Let's get back to the code editor and update our &lt;code&gt;coinmarketcap.py&lt;/code&gt; file&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests
from bs4 import BeautifulSoup

def get_crypto_prices():
    url = '&amp;lt;https://coinmarketcap.com&amp;gt;'
    response_text = requests.get(url).text

    soup = BeautifulSoup(response_text, 'html.parser')

    # get all the table rows
    table_rows = soup.findAll('tr')

    # iterate through all the table rows and get the required data
    for table_row in table_rows:
        crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 ePTNty')
        shortened_crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 dfeAJi coin-item-symbol')
        coin_img = table_row.find('img', class_ = 'coin-logo')

        print(crypto_name, shortened_crypto_name)

    get_crypto_prices()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;blockquote&gt;
&lt;p&gt;Notice the difference between &lt;code&gt;findAll&lt;/code&gt;, which returns every matching element, and &lt;code&gt;find&lt;/code&gt;, which returns only the first match.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;If you run the above code, you'd get this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fei67om29rk8s6j5j2jm2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fei67om29rk8s6j5j2jm2.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/ei67om29rk8s6j5j2jm2.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can see that some of the data comes back as &lt;code&gt;None&lt;/code&gt;. This is because the remaining table rows are empty. What we can do here is check that a value exists before printing it.&lt;/p&gt;

&lt;p&gt;Updating our &lt;code&gt;for&lt;/code&gt; loop, we have this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
# iterate through all the table rows and get the required data

for table_row in table_rows:
    crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 ePTNty')
    shortened_crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 dfeAJi coin-item-symbol')
    coin_img = table_row.find('img', class_ = 'coin-logo')

    if crypto_name is None or shortened_crypto_name is None:
        continue
    else:
        crypto_name = crypto_name.text
        shortened_crypto_name = shortened_crypto_name.text
        coin_img = coin_img.attrs.get('src')

        print(crypto_name, shortened_crypto_name)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If both &lt;code&gt;crypto_name&lt;/code&gt; and &lt;code&gt;shortened_crypto_name&lt;/code&gt; have values, we get the text from the HTML elements and print it to the console. We also get the &lt;code&gt;src&lt;/code&gt; of the crypto image.&lt;/p&gt;

&lt;p&gt;Running the updated code, we should have this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9f7cqa9rlvbz72eciihf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9f7cqa9rlvbz72eciihf.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/9f7cqa9rlvbz72eciihf.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, let's get the prices for each cryptocurrency.&lt;/p&gt;

&lt;p&gt;Going back to our Chrome Devtools and right-clicking on the price text, we should see this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseu300hix30mgnsyqtf8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fseu300hix30mgnsyqtf8.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/seu300hix30mgnsyqtf8.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We see that the price of the currency is in a &lt;code&gt;span&lt;/code&gt; tag, which is wrapped in an anchor (&lt;code&gt;a&lt;/code&gt;) tag with a class of &lt;code&gt;cmc-link&lt;/code&gt;. However, using the anchor's class to scrape the price won't work, because &lt;code&gt;cmc-link&lt;/code&gt; does not uniquely identify the element we're trying to target.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 javascript
const bitcoinRow = document.querySelectorAll('tr')[1]
const cmcLinks = bitcoinRow.querySelectorAll('.cmc-link')

console.log(cmcLinks) // NodeList(4) [a.cmc-link, a.cmc-link, a.cmc-link, a.cmc-link]


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;If you run the JavaScript code above in the browser console, you'd see that there are four links with the class name &lt;code&gt;cmc-link&lt;/code&gt; in every row. This is definitely not the best way to get the price of the crypto in that row.&lt;/p&gt;

&lt;p&gt;Let's look at the parent: the &lt;code&gt;div&lt;/code&gt; with the class name &lt;code&gt;sc-8bda0120-0 dskdZn&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fof85gr8v2s06ip2bk3n1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fof85gr8v2s06ip2bk3n1.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/of85gr8v2s06ip2bk3n1.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice that when you hover over the element in the console, the price is highlighted on the web page. So this proves to be a better way to get the price of the cryptocurrency.&lt;/p&gt;

&lt;p&gt;Updating the code, we have:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests
from bs4 import BeautifulSoup

def get_crypto_prices():
    url = '&amp;lt;https://coinmarketcap.com&amp;gt;'
    response_text = requests.get(url).text

    soup = BeautifulSoup(response_text, 'html.parser')

    # get all the table rows
    table_rows = soup.findAll('tr')

    # iterate through all the table rows and get the required data
    for table_row in table_rows:
        crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 ePTNty')
        shortened_crypto_name = table_row.find('p', class_ = 'sc-e225a64a-0 dfeAJi coin-item-symbol')
        coin_img = table_row.find('img', class_ = 'coin-logo')
        crypto_price = table_row.find('div', class_ = 'sc-8bda0120-0 dskdZn')

            if crypto_name is None or shortened_crypto_name is None or crypto_price is None:
                continue
            else:
                crypto_name = crypto_name.text
                shortened_crypto_name = shortened_crypto_name.text
                coin_img = coin_img.attrs.get('src')
                crypto_price = crypto_price.text

                print(f"Name: {crypto_name} ({shortened_crypto_name}) \\nPrice: {crypto_price} \\nImage URL: {crypto_img_url}\\n")

get_crypto_prices()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Running the updated code, we should get this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxoj218ussxusoj69ylk.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkxoj218ussxusoj69ylk.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/kxoj218ussxusoj69ylk.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Whooopppsssssss... That was a lot to process. I hope you were able to follow through.&lt;br&gt;
Grab that cup of coffee, you deserve it👍&lt;/p&gt;

&lt;h3&gt;
  
  
  Going back to scraping &lt;a href="https://news.bitcoin.com/" rel="noopener noreferrer"&gt;Bitcoin News&lt;/a&gt;
&lt;/h3&gt;

&lt;p&gt;Now, with the knowledge we've gained from scraping &lt;a href="https://coinmarketcap.com/" rel="noopener noreferrer"&gt;Coinmarketcap&lt;/a&gt;, we can go ahead and complete our Bitcoin News scraper. We'll be using this scraper to get the latest news in the crypto space.&lt;/p&gt;

&lt;p&gt;Right-clicking on the first news item and opening the Inspect Elements section of Chrome Devtools reveals that news headlines have a &lt;code&gt;story&lt;/code&gt; class. However, if you right-click on more headlines, you'd discover that there are variations of the story markup. There's the medium story, small, huge, tiny, etc., each with a different class name to uniquely identify its type.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;A very important skill required for web-scraping is the ability to take a close look at how the HTML of a webpage is structured. As long as you understand the structure of a webpage, extracting useful content from the elements is a small task.&lt;/p&gt;
&lt;/blockquote&gt;
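Here's a minimal sketch of how one shared class can cover every headline variant, using hypothetical markup that imitates the structure described above:

```python
from bs4 import BeautifulSoup

# Hypothetical markup imitating the headline variants described above:
# every headline shares 'story__title' plus a size-specific class.
html = '''
<h6 class="story__title story--medium__title">Medium headline</h6>
<h6 class="story__title story--huge__title">Huge headline</h6>
<h6 class="story--huge__footer">Not a headline</h6>
'''
soup = BeautifulSoup(html, 'html.parser')

# class_ with a single class name matches any element that carries it,
# regardless of what other classes sit alongside it
headlines = soup.find_all('h6', class_='story__title')
print([h.text for h in headlines])  # ['Medium headline', 'Huge headline']
```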

&lt;h3&gt;
  
  
  Heading back to the code editor:
&lt;/h3&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests

def get_news():
    url = '&amp;lt;https://news.bitcoin.com&amp;gt;'

    response = requests.get(url) # making a request
    response_text = response.text # getting the html

    print(response_text)

    soup = BeautifulSoup(response_text, 'html.parser')
    all_articles = soup.findAll('div', class_ = 'story')

    for article in all_articles:
        print(article.text.strip())

get_news()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;When you run the above code, you might get this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F70dq8xz602ft7yykg6nz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F70dq8xz602ft7yykg6nz.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/70dq8xz602ft7yykg6nz.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To prevent malicious bots from accessing their website, some websites employ &lt;a href="https://www.cloudflare.com/products/bot-management/" rel="noopener noreferrer"&gt;Cloudflare Bot Management&lt;/a&gt;. Cloudflare maintains a list of known good bots that are granted access to the website, such as search engines, copyright bots, chat bots, site monitoring bots, and so on. Unfortunately for web-scraping enthusiasts like you and me, they also assume all non-whitelisted bot traffic is malicious.&lt;/p&gt;

&lt;p&gt;However, there are a number of ways this can be bypassed, and how easy it is depends on how much of a threat our bot poses to Cloudflare and which Bot Protection plan the website owner subscribed to.&lt;/p&gt;

&lt;p&gt;A list of ways to avoid Cloudflare can be found &lt;a href="https://www.zenrows.com/blog/bypass-cloudflare#how-cloudflare-detects-bots" rel="noopener noreferrer"&gt;here&lt;/a&gt;. Click &lt;a href="https://www.cloudflare.com/learning/bots/what-is-bot-management/" rel="noopener noreferrer"&gt;here&lt;/a&gt; to learn more about Cloudflare Bot Management.&lt;/p&gt;

&lt;p&gt;In this article, we'll look at the most basic: setting the &lt;code&gt;User-Agent&lt;/code&gt; header. By doing so, we make the server believe the request is coming from a regular web browser.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
def get_news():
    url = '&amp;lt;https://news.bitcoin.com&amp;gt;'
    headers = {
        'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
    }

    response = requests.get(url, headers=headers)

    # the rest of the code


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;After adding the headers and running the code again, we should get this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxxeqsu87meoru3d1dy0t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxxeqsu87meoru3d1dy0t.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/xxeqsu87meoru3d1dy0t.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now that we have the actual web content, let's go to our browser to inspect the webpage:&lt;/p&gt;

&lt;p&gt;Right-clicking on the first news there, we should see this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcioxtmhdcxi06cmcijac.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcioxtmhdcxi06cmcijac.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/cioxtmhdcxi06cmcijac.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Scraping news headline
&lt;/h3&gt;

&lt;p&gt;Notice that the &lt;code&gt;h6&lt;/code&gt; element has a class of &lt;code&gt;story__title story--medium__title&lt;/code&gt;. If you right-click on another news item, you might see something like &lt;code&gt;story__title story--huge__title&lt;/code&gt; or &lt;code&gt;story__title story--large__title&lt;/code&gt;. There's already a pattern: the headline of each news item always has the class &lt;code&gt;story__title&lt;/code&gt;. That seems like the best way to target the headlines.&lt;/p&gt;

&lt;h3&gt;
  
  
  Scraping news URLs
&lt;/h3&gt;

&lt;p&gt;If you look closely, you'd notice that the news title has a parent which is a link. This link contains the URL of the news in its &lt;code&gt;href&lt;/code&gt; attribute.&lt;/p&gt;
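To see how navigating from the title to its parent link works, here's a small sketch on hypothetical markup shaped like the structure just described:

```python
from bs4 import BeautifulSoup

# Hypothetical markup: the headline's parent is the <a> carrying the URL
html = '<a href="/bitcoin-hits-new-high"><h6 class="story__title">Bitcoin hits new high</h6></a>'
soup = BeautifulSoup(html, 'html.parser')

title = soup.select_one('.story__title')
print(title.text)                      # Bitcoin hits new high
print(title.parent.attrs.get('href'))  # /bitcoin-hits-new-high
```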

&lt;p&gt;Putting all of these together, we can write the code for the scraper.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests
from bs4 import BeautifulSoup

def get_news():
    url = '&amp;lt;https://news.bitcoin.com&amp;gt;'
    headers = {
    'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/109.0.0.0 Safari/537.36',
    } # headers to bypass Cloudflare Protection

    response = requests.get(url, headers=headers)
    response_text = response.text

    soup = BeautifulSoup(response_text, 'html.parser')
    all_articles = soup.findAll('div', class_ = 'story')

    for article in all_articles:
        news_title_element = article.select_one('.story__title')
        news_url = news_title_element.parent.attrs.get('href')
        news_title = news_title_element.text.strip()

        print(f"HEADLINE: {news_title} \\nURL: {news_url}\\n")

get_news()


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Running the above code, we get the news headlines and their URLs:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyhx42akegg49mq79dpo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flyhx42akegg49mq79dpo.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/lyhx42akegg49mq79dpo.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Utilising scraped data
&lt;/h2&gt;

&lt;p&gt;Now that we've successfully scraped the required data from the websites, let's put it to use. If you're working with a database, you could save it there right away; you could perform computations with it; or you could store it for future use. Whatever the case, data you've scraped is not very useful sitting in the console.&lt;/p&gt;

&lt;p&gt;Let's do something really simple: save the data in a JSON file.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
import requests
from bs4 import BeautifulSoup
import json

# some code
all_articles = soup.findAll('div', class_ = 'story')
scraped_articles = []

for article in all_articles:
    news_title_element = article.select_one('.story__title')
    news_url = news_title_element.parent.attrs.get('href')
    news_title = news_title_element.text.strip()

    scraped_articles.append({
        "headline": news_title,
        "url": news_url
    })

with open ('news.json', 'w') as file:
    news_as_json = json.dumps({
        'news': scraped_articles,
        'number_of_news': len(scraped_articles)
    }, indent = 3, sort_keys = True)

    file.write(news_as_json)


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Now we're saving the scraped news in a JSON file, which can be used for whatever you want.&lt;/p&gt;

&lt;p&gt;Your JSON file should look something like this.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F97jv9gomf5p4z4bw9v0h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F97jv9gomf5p4z4bw9v0h.png" alt="https://dev-to-uploads.s3.amazonaws.com/uploads/articles/97jv9gomf5p4z4bw9v0h.png"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Whooopppsssss.⚡️⚡️ That was a whole lot to take in.&lt;/p&gt;

&lt;h2&gt;
  
  
  What not to do when scraping websites
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Overloading the servers with too many requests at a time. The reason many websites frown on bots is that they are frequently abused. When you make a request, the server uses resources to process it. Making a ton of requests at once might make the server run out of resources, which isn't good.&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;
 python
"""
⛔️ DON'T DO THIS
"""
for i in range(10000):
    request.get('&amp;lt;https://reallyamazingwebsite.com&amp;gt;')


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Your IP address might get blacklisted.&lt;/p&gt;
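A kinder pattern is to pause between requests. Here's a minimal sketch of one way to do it; `fetch_politely` is a hypothetical helper, `fetch` stands in for `requests.get`, and the delay value is just an illustration:

```python
import time

def fetch_politely(urls, fetch, delay_seconds=1.0):
    """Fetch each URL with a fixed pause between requests,
    so we don't overload the server."""
    results = []
    for i, url in enumerate(urls):
        if i > 0:
            time.sleep(delay_seconds)  # give the server room to breathe
        results.append(fetch(url))
    return results

# e.g. fetch_politely(urls, requests.get, delay_seconds=1.0)
```

For heavier workloads you'd want proper rate limiting or backoff, but even a fixed delay is far gentler than a tight loop.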

&lt;ol start="2"&gt;
&lt;li&gt;Disrespecting copyright rules. Web scraping is completely legal if you scrape data that is publicly available on the internet. But some kinds of data are protected by international regulations, so be careful when scraping personal data, intellectual property, or confidential data. Some websites openly state that their content should not be redistributed by any other means; that should be respected.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You've finally made it to the end of the article, well-done champ.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In conclusion, web scraping in Python can be a useful tool for gathering information from websites for a variety of purposes. Python provides a user-friendly and efficient way to extract structured data from websites through the use of popular libraries such as BeautifulSoup. However, it is critical to remember that web scraping should always be done ethically and in accordance with the terms of service of the website. It is also critical to be aware of any legal constraints that may apply to the data that is being scraped. With these factors in mind, Python web scraping can be a useful skill for anyone looking to collect and analyse data from the web.&lt;/p&gt;

&lt;p&gt;If you enjoyed reading this article, you can help share it on your socials and follow me on social media.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://twitter.com/jeremiahlena13" rel="noopener noreferrer"&gt;Twitter&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.linkedin.com/in/jeremiah-lena/" rel="noopener noreferrer"&gt;LinkedIn&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://github.com/Jeremiahjacinth13" rel="noopener noreferrer"&gt;GitHub&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>python</category>
      <category>django</category>
      <category>programming</category>
      <category>webdev</category>
    </item>
    <item>
      <title>What is Debouncing?</title>
      <dc:creator>Lena Jeremiah</dc:creator>
      <pubDate>Sun, 13 Feb 2022 18:57:45 +0000</pubDate>
      <link>https://forem.com/jeremiahjacinth13/what-is-debouncing-1akk</link>
      <guid>https://forem.com/jeremiahjacinth13/what-is-debouncing-1akk</guid>
      <description>&lt;p&gt;Performance is one of the many things that are prioritized when building websites and software generally. As software engineers, it's imperative that we write code with performance in mind, as this would help a great deal to improve the overall user experience of our software.&lt;/p&gt;

&lt;p&gt;In this article, we'd be taking a look at Debouncing, a very useful technique for improving the performance of client-side applications. &lt;/p&gt;

&lt;p&gt;Before we look at what debouncing is, let's take a brief look at event listeners.&lt;/p&gt;

&lt;h3&gt;
  
  
  Event Listeners
&lt;/h3&gt;

&lt;p&gt;When building client-side applications, event listeners are things we can't do without. Every client-side application requires that the user interacts with it for it (the app) to be useful. These interactions could be clicking a button, scrolling to view more content, typing into an input field, submitting a form, and many more. Event listeners have callbacks that fire whenever the event they're listening for is triggered.&lt;/p&gt;

&lt;p&gt;On some occasions, these event listeners have performance-heavy callbacks, hence the need for us to control how and when these callbacks are called. And this is where debouncing comes into play.&lt;/p&gt;

&lt;p&gt;Let's assume that we have a search bar that makes a request to an API whenever a user makes a change to the input field. That means if a user wants to search for the term 'What is debouncing?', the browser would have to make a total of 19 requests to the API.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1642663917108%2FPwJY9-iJq.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1642663917108%2FPwJY9-iJq.gif" alt="ezgif.com-gif-maker.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's a code pen so you can test it out.&lt;/p&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/jeremiahjacinth13/embed/VwMOVpK?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;p&gt;Now, with this approach, our browser makes a request for every single keystroke the user makes on the keyboard, which leaves us with multiple useless requests.&lt;/p&gt;

&lt;p&gt;How about we find a way to prevent the request from being made until the user has finished typing? Would this solve the problem? Now, this is exactly what debouncing is.&lt;/p&gt;

&lt;h3&gt;
  
  
  Debouncing
&lt;/h3&gt;

&lt;p&gt;Debouncing is a method in which a function is prevented from running until a certain amount of time has elapsed without the function being called. In our case, the request won't be made until the user has stopped typing. &lt;/p&gt;

&lt;p&gt;Implementing debouncing, our event callback would look something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="c1"&gt;// other codes&lt;/span&gt;
&lt;span class="nx"&gt;inputField&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;addEventListener&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;input&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;clearTimeout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nx"&gt;timeout&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;setTimeout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;makeRequest&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;})&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the snippet above, whenever the user types, we clear the existing timeout (which doesn't exist the first time the callback runs). We then create another timeout using &lt;code&gt;setTimeout&lt;/code&gt;, which calls the &lt;code&gt;makeRequest&lt;/code&gt; function once the delay has elapsed. If the user types again before the delay elapses, we clear the previous timer and create another one. This way, only the last timeout ever runs, solving our problem of making multiple requests.&lt;/p&gt;

&lt;p&gt;This is what it looks like after implementing debouncing:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1642675924730%2FGJQAn42G1.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1642675924730%2FGJQAn42G1.gif" alt="gif2.gif"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Makes more sense, right?&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmedia.makeameme.org%2Fcreated%2Foh-yes-that.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fmedia.makeameme.org%2Fcreated%2Foh-yes-that.jpg" alt="makes sense meme"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here's a codepen if you want to take a close look at the implementation&lt;/p&gt;

&lt;p&gt;&lt;iframe height="600" src="https://codepen.io/jeremiahjacinth13/embed/BawevKv?height=600&amp;amp;default-tab=result&amp;amp;embed-version=2"&gt;
&lt;/iframe&gt;
&lt;/p&gt;

&lt;h2&gt;
  
  
  An extra something
&lt;/h2&gt;

&lt;p&gt;Instead of manually writing the debouncing logic every time we want this amazing functionality, we can create a utility function that takes a callback and a delay and returns a debounced version of the callback.&lt;/p&gt;

&lt;p&gt;Something like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="kd"&gt;function&lt;/span&gt; &lt;span class="nf"&gt;debounce&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;func&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timeINMS&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

  &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="nf"&gt;function &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="nf"&gt;clearTimeout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;timeout&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nx"&gt;timeout&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;setTimeout&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;func&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;timeINMS&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
  &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;debouncedHello&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;debounce&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;say hello&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nb"&gt;Date&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;now&lt;/span&gt;&lt;span class="p"&gt;()),&lt;/span&gt; &lt;span class="mi"&gt;800&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, the &lt;code&gt;debounce&lt;/code&gt; function takes two arguments, a function that logs &lt;code&gt;say hello&lt;/code&gt; and a number representing the time (in milliseconds) by which the function should be delayed, and returns a debounced version of that function.&lt;/p&gt;

&lt;p&gt;Debouncing is a very simple and intuitive technique, yet it can considerably improve performance.&lt;/p&gt;

&lt;p&gt;I hope you were able to follow through with the concept. In my next article, I'd be talking about another technique which is a little like debouncing: Throttling.&lt;/p&gt;

&lt;p&gt;Stay tuned and stay safe❤✌️&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>performance</category>
      <category>programming</category>
    </item>
    <item>
      <title>Memory Leaks, How to avoid them in a React App.</title>
      <dc:creator>Lena Jeremiah</dc:creator>
      <pubDate>Sat, 08 Jan 2022 09:58:39 +0000</pubDate>
      <link>https://forem.com/jeremiahjacinth13/memory-leaks-how-to-avoid-them-in-a-react-app-1g5e</link>
      <guid>https://forem.com/jeremiahjacinth13/memory-leaks-how-to-avoid-them-in-a-react-app-1g5e</guid>
      <description>&lt;h2&gt;
  
  
  What is a memory leak?
&lt;/h2&gt;

&lt;p&gt;According to &lt;a href="https://en.wikipedia.org/wiki/Memory_leak" rel="noopener noreferrer"&gt;Wikipedia&lt;/a&gt;, a memory leak is a type of resource leak that occurs when a computer program incorrectly manages memory allocations in a way that memory that is no longer needed is not released. A memory leak may also happen when an object is stored in memory but cannot be accessed by the running code.&lt;/p&gt;

&lt;p&gt;Simply put, a memory leak is said to occur whenever inaccessible or unreferenced data exists in memory. Nowadays, many modern programming languages have techniques for clearing out data that is no longer needed, &lt;a href="https://en.wikipedia.org/wiki/Garbage_collection_(computer_science)" rel="noopener noreferrer"&gt;garbage collection&lt;/a&gt;, but it turns out there are other not-so-popular errors which can expose your React app to memory leaks and, to a great extent, reduce the performance of your app.&lt;/p&gt;

&lt;p&gt;Let's look at some causes of memory leaks.&lt;/p&gt;

&lt;h2&gt;
  
  
  Causes of Memory Leaks in a React Application
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1641496893441%2FUnKO065qK.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fcdn.hashnode.com%2Fres%2Fhashnode%2Fimage%2Fupload%2Fv1641496893441%2FUnKO065qK.png" alt="memory leak.PNG"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Memory leaks in React applications are primarily a result of not cancelling subscriptions made when a component was mounted before the component gets unmounted. These subscriptions could be a DOM Event listener, a WebSocket subscription, or even a request to an API. &lt;/p&gt;

&lt;p&gt;The first two are not too much of a challenge, as we can easily remove an event listener or unsubscribe from the WebSocket before the component gets unmounted. But the last one might require a little bit of extra work.&lt;/p&gt;

&lt;h2&gt;
  
  
  A typical React workflow
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;useEffect&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Link&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-router-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;axios&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;MyCompany&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt; &lt;span class="nx"&gt;company&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setCompany&lt;/span&gt; &lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="nf"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
             &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                 &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;https://random-data-api.com/api/company/random_company&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;
             &lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="nf"&gt;setCompany&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;})();&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[]);&lt;/span&gt;

    &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;pre&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;company&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;pre&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Link&lt;/span&gt; &lt;span class="na"&gt;to&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'/anotherpage'&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Another Interesting Page&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nc"&gt;Link&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the code snippet above, we have a simple component, &lt;code&gt;MyCompany&lt;/code&gt;, which, when mounted, makes a request for a random company and sets the &lt;code&gt;company&lt;/code&gt; state to the value returned from the API. &lt;/p&gt;

&lt;h2&gt;
  
  
  The Problem
&lt;/h2&gt;

&lt;p&gt;Suppose our user has a very slow internet connection and decides to leave the current page for another interesting page. The request will already have been made, and the browser will be waiting for a response which, when received, leads us to call &lt;code&gt;setState&lt;/code&gt; on a component that's no longer mounted. &lt;/p&gt;

&lt;p&gt;Aside from setting state on an unmounted component, we would now be holding data in memory with no means of accessing it. This process repeats as the user moves around the app, filling up useful memory with useless, inaccessible data and leading to serious performance issues.&lt;/p&gt;

&lt;p&gt;Now that we've seen the problem, let's look at how to solve it.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Way Forward: AbortControllers
&lt;/h2&gt;

&lt;p&gt;Having understood the problem, what we'd do to salvage the situation is &lt;strong&gt;cancel the request&lt;/strong&gt; the moment our component unmounts, ensuring we don't get any data from the API.&lt;/p&gt;

&lt;h3&gt;
  
  
  So, how do we cancel requests? &lt;strong&gt;&lt;a href="https://developer.mozilla.org/en-US/docs/Web/API/AbortController" rel="noopener noreferrer"&gt;AbortControllers&lt;/a&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;According to &lt;a href="https://developer.mozilla.org/en-US/docs/Web/API/AbortController" rel="noopener noreferrer"&gt;MDN&lt;/a&gt;, the AbortController represents a controller object that allows you to abort one or more Web requests as and when desired. That's quite explanatory!!&lt;/p&gt;

&lt;p&gt;AbortControllers are created with the &lt;code&gt;new AbortController()&lt;/code&gt; syntax, initializing an instance of the AbortController class. Every AbortController object has a read-only &lt;code&gt;signal&lt;/code&gt; property which is passed into requests, and an &lt;code&gt;abort()&lt;/code&gt; method which is called whenever you want to cancel a request.&lt;/p&gt;

&lt;p&gt;Now using AbortControllers, our code should look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight jsx"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;useEffect&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;Link&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;react-router-dom&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;axios&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;MyCompany&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="kd"&gt;function&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt; &lt;span class="nx"&gt;company&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;setCompany&lt;/span&gt; &lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nf"&gt;useState&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="nf"&gt;useEffect&lt;/span&gt;&lt;span class="p"&gt;(()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
         &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;abortController&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
        &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
             &lt;span class="nx"&gt;abortController&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nc"&gt;AbortController&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
             &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;signal&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;abortController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;    

             &lt;span class="c1"&gt;// the signal is passed into the request(s) we want to abort using this controller&lt;/span&gt;
             &lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;axios&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
                 &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;https://random-data-api.com/api/company/random_company&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;
                 &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="na"&gt;signal&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nx"&gt;signal&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
             &lt;span class="p"&gt;);&lt;/span&gt;
             &lt;span class="nf"&gt;setCompany&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;})();&lt;/span&gt;

        &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="nx"&gt;abortController&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;abort&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="p"&gt;[]);&lt;/span&gt;

    &lt;span class="k"&gt;return &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nt"&gt;pre&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="nx"&gt;JSON&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;stringify&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;company&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kc"&gt;null&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;3&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nt"&gt;pre&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
            &lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="nc"&gt;Link&lt;/span&gt; &lt;span class="na"&gt;to&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;'/anotherpage'&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;Another Interesting Page&lt;span class="p"&gt;&amp;lt;/&lt;/span&gt;&lt;span class="nc"&gt;Link&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;&amp;lt;/&amp;gt;&lt;/span&gt;
    &lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now, when our user navigates to a new page, our AbortController cancels the request and we don't have to worry about memory leaks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;NOTE:&lt;/strong&gt; Calling &lt;code&gt;abortController.abort()&lt;/code&gt; after the request has completed doesn't throw any errors. The AbortController simply takes no action on an already completed request.&lt;/p&gt;
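One related detail: when the controller aborts a request that is still in flight, the awaited promise rejects (axios raises a &lt;code&gt;CanceledError&lt;/code&gt;, &lt;code&gt;fetch&lt;/code&gt; an &lt;code&gt;AbortError&lt;/code&gt;), so in practice you may want to catch and ignore cancellations. A sketch of that using &lt;code&gt;fetch&lt;/code&gt; against the same API as above:

```javascript
// Fetch-based variant of the component's request with cancellation
// handled explicitly: a cancelled request resolves to null instead
// of surfacing as an unhandled rejection.
async function loadCompany(signal) {
  try {
    const res = await fetch(
      'https://random-data-api.com/api/company/random_company',
      { signal }
    );
    return await res.json();
  } catch (err) {
    if (err.name === 'AbortError') return null; // cancelled, not a failure
    throw err;
  }
}
```

In the React snippet above, the equivalent would be wrapping the `await axios.get(...)` in a try/catch and ignoring the cancellation error.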

&lt;p&gt;Using AbortControllers in your web apps can help improve performance and prevent memory leaks, so they're well worth adopting.&lt;/p&gt;

&lt;p&gt;Thanks for reading❤❤&lt;/p&gt;

</description>
      <category>react</category>
      <category>javascript</category>
      <category>webdev</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
