<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Rutika Khaire</title>
    <description>The latest articles on Forem by Rutika Khaire (@rutikakhaire).</description>
    <link>https://forem.com/rutikakhaire</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F772594%2F54e61f84-00da-4979-80d1-6473fa4a2ac2.jpg</url>
      <title>Forem: Rutika Khaire</title>
      <link>https://forem.com/rutikakhaire</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/rutikakhaire"/>
    <language>en</language>
    <item>
      <title>My First Data Governance Project: From Confusion to Clarity (Lessons I Learned the Hard Way)</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Mon, 30 Mar 2026 15:52:14 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/my-first-data-governance-project-from-confusion-to-clarity-lessons-i-learned-the-hard-way-5a6e</link>
      <guid>https://forem.com/rutikakhaire/my-first-data-governance-project-from-confusion-to-clarity-lessons-i-learned-the-hard-way-5a6e</guid>
      <description>&lt;p&gt;When I was asked to work on Data Governance for the first time, I had no idea where to start.&lt;/p&gt;

&lt;p&gt;Honestly, I was completely blank.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ao12wzpk9yj6ykcu4hv.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7ao12wzpk9yj6ykcu4hv.gif" alt="No idea" width="220" height="126"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Like most beginners, I did what we all do:&lt;br&gt;
I watched YouTube videos, read blogs, and tried to understand frameworks. Everything sounded important and complicated.&lt;/p&gt;

&lt;p&gt;So I decided to create a Data Governance plan.&lt;/p&gt;

&lt;p&gt;With the help of AI tools, I quickly built something that looked impressive: structured, detailed, and “complete.”&lt;/p&gt;

&lt;p&gt;But during the review, I faced a simple problem:&lt;/p&gt;

&lt;p&gt;👉 I couldn’t explain what I had written.&lt;/p&gt;

&lt;p&gt;That moment changed everything.&lt;br&gt;
&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1rzl8dujibpoddpl6qbc.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1rzl8dujibpoddpl6qbc.gif" alt="Changed" width="480" height="480"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Lesson 1: If You Don’t Understand It, You Can’t Implement It&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcyx8h5xs97187eawym1l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fcyx8h5xs97187eawym1l.png" alt="Understand it" width="480" height="480"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Using AI gave me speed, but not understanding.&lt;/p&gt;

&lt;p&gt;I realized that Data Governance is not about creating fancy documents. It’s about clarity.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Lesson 2: Don’t Start with Complexity&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9ubpppxqqpdh1zioc6u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm9ubpppxqqpdh1zioc6u.png" alt="Make it simple" width="800" height="600"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;My initial mistake was trying to build a “perfect” governance model from day one.&lt;/p&gt;

&lt;p&gt;After a few reviews, I learned:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keep things simple&lt;/li&gt;
&lt;li&gt;Use clear language&lt;/li&gt;
&lt;li&gt;Focus on what actually matters&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;strong&gt;Lesson 3: Start Small&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3n1cyrim24flakwsxkua.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F3n1cyrim24flakwsxkua.png" alt="Start small" width="453" height="340"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Instead of complex frameworks, I shifted to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Simple governance plans&lt;/li&gt;
&lt;li&gt;Basic role definitions&lt;/li&gt;
&lt;li&gt;Minimal, clear processes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And suddenly, everything became easier to explain—and implement.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Lesson 4: Always Ask ‘Why?’&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwd2puzsbrctz090q4sws.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwd2puzsbrctz090q4sws.png" alt="Ask Why" width="300" height="375"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;One question helped me more than anything else:&lt;/p&gt;

&lt;p&gt;👉 Why do we need Data Governance?&lt;/p&gt;

&lt;p&gt;This helped me focus on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data quality&lt;/li&gt;
&lt;li&gt;Consistency&lt;/li&gt;
&lt;li&gt;Better decision-making&lt;/li&gt;
&lt;li&gt;Reducing confusion&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;If something didn’t answer “why,” I removed it.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Lesson 5: Data Governance is About People, Not Just Documents&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdyg0oa8eqcbmhxv275m1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdyg0oa8eqcbmhxv275m1.png" alt="Its about people" width="192" height="72"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;While preparing for stakeholder discussions, I realized:&lt;/p&gt;

&lt;p&gt;People don’t care about frameworks; they care about impact.&lt;/p&gt;

&lt;p&gt;So instead of explaining governance models, I focused on:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;What problems it solves&lt;/li&gt;
&lt;li&gt;How it helps the business&lt;/li&gt;
&lt;li&gt;Why it matters to them&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;strong&gt;Lesson 6: Tell a Story, Not Just a Plan&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwnb1vlabgpwdr8gctbw9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwnb1vlabgpwdr8gctbw9.png" alt="Tell a story" width="249" height="202"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;When I prepared my demo, I didn’t just present documents.&lt;/p&gt;

&lt;p&gt;I explained:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The current challenges&lt;/li&gt;
&lt;li&gt;What could go wrong without governance&lt;/li&gt;
&lt;li&gt;A simple way forward&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;That made all the difference.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Final Thought&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;My first Data Governance experience wasn’t perfect—but it taught me something important:&lt;/p&gt;

&lt;p&gt;👉 Start simple. Understand deeply. Then build.&lt;/p&gt;

</description>
      <category>datagovernance</category>
      <category>datamanagement</category>
      <category>learning</category>
      <category>analytics</category>
    </item>
    <item>
      <title>[Boost]</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Wed, 24 Dec 2025 06:25:02 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/-3pee</link>
      <guid>https://forem.com/rutikakhaire/-3pee</guid>
      <description>&lt;div class="ltag__link"&gt;
  &lt;a href="/rutikakhaire" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__pic"&gt;
      &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F772594%2F54e61f84-00da-4979-80d1-6473fa4a2ac2.jpg" alt="rutikakhaire"&gt;
    &lt;/div&gt;
  &lt;/a&gt;
  &lt;a href="https://dev.to/rutikakhaire/behind-the-scenes-of-data-ingestion-how-small-issues-cause-big-headaches-ddn" class="ltag__link__link"&gt;
    &lt;div class="ltag__link__content"&gt;
      &lt;h2&gt;Behind the Scenes of Data Ingestion: How Small Issues Cause Big Headaches&lt;/h2&gt;
      &lt;h3&gt;Rutika Khaire ・ Dec 24 '25&lt;/h3&gt;
      &lt;div class="ltag__link__taglist"&gt;
        &lt;span class="ltag__link__tag"&gt;#dataingestion&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#medallion&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#dataengineering&lt;/span&gt;
        &lt;span class="ltag__link__tag"&gt;#datafactory&lt;/span&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/a&gt;
&lt;/div&gt;


</description>
      <category>dataingestion</category>
      <category>medallion</category>
      <category>dataengineering</category>
      <category>datafactory</category>
    </item>
    <item>
      <title>Behind the Scenes of Data Ingestion: How Small Issues Cause Big Headaches</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Wed, 24 Dec 2025 06:24:25 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/behind-the-scenes-of-data-ingestion-how-small-issues-cause-big-headaches-ddn</link>
      <guid>https://forem.com/rutikakhaire/behind-the-scenes-of-data-ingestion-how-small-issues-cause-big-headaches-ddn</guid>
      <description>&lt;p&gt;Data ingestion is often treated as a solved problem, until it breaks. What looks like a simple pipeline moving data from source to destination can quietly introduce inconsistencies, missing records, or silent failures that ripple across analytics and reporting systems.&lt;/p&gt;

&lt;p&gt;In this blog, we’ll go behind the scenes of a real-world data ingestion architecture, explore common issues that arise, uncover their root causes, and share best practices to build more resilient pipelines.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Why Data Ingestion Matters&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fitupxv0lz3kr5gx41nbm.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fitupxv0lz3kr5gx41nbm.jpg" alt="Foundation" width="389" height="259"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Data ingestion is the foundation of every data platform. When ingestion goes wrong:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reports show incorrect numbers&lt;/li&gt;
&lt;li&gt;Business decisions are based on stale or incomplete data&lt;/li&gt;
&lt;li&gt;Engineers spend hours firefighting instead of building features&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The most dangerous problems aren’t always the obvious failures—they’re the subtle ones that go unnoticed for weeks.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Our Ingestion Architecture at a Glance&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Azure Data Factory (ADF) for orchestration&lt;/li&gt;
&lt;li&gt;&lt;p&gt;SQL Server Views as the source layer&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Medallion Architecture&lt;/strong&gt;&lt;br&gt;
  &lt;strong&gt;Bronze&lt;/strong&gt;: Raw ingested data&lt;br&gt;
  &lt;strong&gt;Silver&lt;/strong&gt;: Cleaned and transformed data&lt;br&gt;
  &lt;strong&gt;Gold&lt;/strong&gt;: Business-ready datasets&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbvxd78fsnzsf7fecadqq.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbvxd78fsnzsf7fecadqq.jpg" alt="Medallion architecture" width="800" height="361"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Watermark-based incremental loads to process only changed data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This architecture is scalable and efficient—but only when implemented carefully.&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;Issue #1: Missing Deletes – “Why Is This Customer Still Active?”&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fymuxwknp40j6sv38imjx.webp" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fymuxwknp40j6sv38imjx.webp" alt="Missing deletes" width="800" height="800"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;u&gt;Real-World Example&lt;/u&gt;&lt;/p&gt;

&lt;p&gt;A customer account is deleted in the CRM system due to GDPR requirements.&lt;br&gt;
However, the sales dashboard still shows the customer as active weeks later.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Impact&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Compliance risk&lt;/li&gt;
&lt;li&gt;Incorrect KPIs&lt;/li&gt;
&lt;li&gt;Loss of trust from business users&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;What Actually Happened&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Source System        Target Table
Customer Deleted  →  No Delete Captured

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Fix (Realistic Approach)&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Source (CDC Enabled)
   │
   ├── Insert
   ├── Update
   └── Delete ──► Target Table
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;ul&gt;
&lt;li&gt;Enable CDC or soft-delete flags&lt;/li&gt;
&lt;li&gt;Add delete-handling logic during MERGE (see the sketch below)&lt;/li&gt;
&lt;li&gt;Periodically reconcile record counts&lt;/li&gt;
&lt;/ul&gt;
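
&lt;p&gt;To make the MERGE idea concrete, here is a minimal T-SQL sketch of delete handling driven by a soft-delete flag. All table and column names (stg_Customer, dim_Customer, IsDeleted) are illustrative placeholders, not the actual pipeline objects:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: MERGE that also propagates deletes (names illustrative)
MERGE INTO dim_Customer AS tgt
USING stg_Customer AS src
    ON tgt.CustomerId = src.CustomerId
WHEN MATCHED AND src.IsDeleted = 1 THEN
    DELETE  -- or flip a soft-delete flag on the target instead
WHEN MATCHED THEN
    UPDATE SET tgt.Name = src.Name,
               tgt.Email = src.Email
WHEN NOT MATCHED BY TARGET AND src.IsDeleted = 0 THEN
    INSERT (CustomerId, Name, Email)
    VALUES (src.CustomerId, src.Name, src.Email);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;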



&lt;p&gt;&lt;strong&gt;Issue #2: Race Conditions – Incremental Loads Miss Late-Arriving Updates&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fazomem8s1769nyraro5p.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fazomem8s1769nyraro5p.gif" alt="Race condition" width="220" height="196"&gt;&lt;/a&gt;&lt;br&gt;
&lt;u&gt;Real-World Example&lt;/u&gt;&lt;br&gt;
Incremental loads often rely on a watermark column such as LastModifiedDate.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Example&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Last successful watermark = 10:00 AM&lt;/li&gt;
&lt;li&gt;A record is updated at 9:55 AM&lt;/li&gt;
&lt;li&gt;That update arrives late due to upstream delays&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Because the timestamp is older than the watermark, the incremental query skips it entirely.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why This Is Dangerous&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No pipeline failure&lt;/li&gt;
&lt;li&gt;No alert&lt;/li&gt;
&lt;li&gt;Data is permanently missed unless a full reload happens&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is a silent data loss scenario.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common Root Causes&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Eventual consistency in source systems&lt;/li&gt;
&lt;li&gt;Batch updates applied late&lt;/li&gt;
&lt;li&gt;Reliance on timestamps without buffering&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;How to fix it&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use a lookback window (e.g., watermark − 5 or 10 minutes); see the sketch below&lt;/li&gt;
&lt;li&gt;Prefer CDC or version-based sequencing&lt;/li&gt;
&lt;li&gt;Reprocess recent partitions regularly&lt;/li&gt;
&lt;/ul&gt;
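
&lt;p&gt;As a minimal sketch, the lookback window is just a buffer subtracted from the stored watermark before the incremental query runs. The watermark table, column names, and the 10-minute buffer below are illustrative:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: incremental extract with a 10-minute lookback buffer (names illustrative)
DECLARE @LastWatermark datetime2 =
    (SELECT WatermarkValue FROM etl.Watermark WHERE TableName = 'Customer');

SELECT *
FROM dbo.Customer
WHERE LastModifiedDate &amp;gt; DATEADD(MINUTE, -10, @LastWatermark);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;Because the buffer intentionally re-reads some rows, the downstream load must be idempotent (for example, a MERGE keyed on the primary key).&lt;/p&gt;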



&lt;p&gt;&lt;strong&gt;Issue #3: Over-Aggressive Filtering – “Where Did My Data Go?”&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjojw982bfz23q1nlli2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwjojw982bfz23q1nlli2.png" alt="Aggressive Filtering" width="224" height="225"&gt;&lt;/a&gt;&lt;br&gt;
&lt;u&gt;Real-World Example&lt;/u&gt;&lt;/p&gt;

&lt;p&gt;A filter is added to exclude test users:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;WHERE username NOT LIKE '%test%'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Suddenly, legitimate users like &lt;strong&gt;&lt;em&gt;contest_winner&lt;/em&gt;&lt;/strong&gt; disappear from reports.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Hidden Damage&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No pipeline failure&lt;/li&gt;
&lt;li&gt;No alert&lt;/li&gt;
&lt;li&gt;Business notices weeks later&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Better Filtering Strategy&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Incoming Data
     │
     ├── Valid Users ──► Continue
     └── Test Users  ──► Logged + Reviewed
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;How to fix it&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use exact match lists (e.g., TestUserList); see the sketch below&lt;/li&gt;
&lt;li&gt;Log filtered records&lt;/li&gt;
&lt;li&gt;Validate filters with production samples&lt;/li&gt;
&lt;/ul&gt;
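
&lt;p&gt;Here is a rough SQL sketch of that strategy, assuming a maintained TestUserList reference table (all object names are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: log exactly what gets filtered, then exclude only known test accounts
INSERT INTO audit.FilteredUsers (UserId, Username, FilteredAt)
SELECT u.UserId, u.Username, SYSUTCDATETIME()
FROM dbo.Users AS u
WHERE EXISTS (SELECT 1 FROM ref.TestUserList AS t WHERE t.Username = u.Username);

SELECT u.*
FROM dbo.Users AS u
WHERE NOT EXISTS (SELECT 1 FROM ref.TestUserList AS t WHERE t.Username = u.Username);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;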




&lt;h2&gt;
  
  
  Best Practices
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Resilient Ingestion Design&lt;/strong&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Detect Change
   │
Validate Data
   │
Apply Idempotent Load
   │
Monitor + Reconcile
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Operational Guardrails&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Row-count reconciliation (see the sketch below)&lt;/li&gt;
&lt;li&gt;Data freshness checks&lt;/li&gt;
&lt;li&gt;Alerting on anomalies—not just failures&lt;/li&gt;
&lt;/ul&gt;
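
&lt;p&gt;A row-count reconciliation can be as simple as comparing daily counts between the source and the Bronze layer. This is only a sketch; the tables and date columns are illustrative:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: any row returned here is an anomaly worth alerting on (names illustrative)
SELECT s.LoadDate, s.SourceCount, b.BronzeCount,
       s.SourceCount - b.BronzeCount AS Delta
FROM (SELECT CAST(LastModifiedDate AS date) AS LoadDate, COUNT(*) AS SourceCount
      FROM dbo.Customer
      GROUP BY CAST(LastModifiedDate AS date)) AS s
JOIN (SELECT CAST(IngestedAt AS date) AS LoadDate, COUNT(*) AS BronzeCount
      FROM bronze.Customer
      GROUP BY CAST(IngestedAt AS date)) AS b
  ON b.LoadDate = s.LoadDate
WHERE s.SourceCount &amp;lt;&amp;gt; b.BronzeCount;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;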

&lt;h2&gt;
  
  
  Final Thoughts
&lt;/h2&gt;

&lt;p&gt;Most data ingestion failures don’t crash pipelines—they quietly corrupt trust. Missed deletes, late-arriving updates, and poorly managed watermarks often go unnoticed until business users start asking uncomfortable questions.&lt;/p&gt;

&lt;p&gt;The takeaway is simple: build ingestion pipelines for real-world behavior, not ideal scenarios. Expect delays, partial failures, and messy data. Validate early, reconcile often, and treat control logic like watermarks as first-class citizens.&lt;/p&gt;

&lt;p&gt;In data engineering, it’s rarely the big failures that hurt the most—it’s the small ones you didn’t see coming.&lt;/p&gt;

</description>
      <category>dataingestion</category>
      <category>medallion</category>
      <category>dataengineering</category>
      <category>datafactory</category>
    </item>
    <item>
      <title>My First Open Source Contribution</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Mon, 06 Oct 2025 04:55:05 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/my-first-open-source-contribution-3695</link>
      <guid>https://forem.com/rutikakhaire/my-first-open-source-contribution-3695</guid>
      <description>&lt;p&gt;For the longest time, I believed contributing to open source was something only “seasoned developers” could do. I pictured huge, complicated codebases and intimidating review processes. But recently, I made my very first open source contribution—and it turned out to be one of the most rewarding learning experiences I’ve ever had.&lt;/p&gt;

&lt;p&gt;In this post, I’ll walk you through my journey: how I got started, what I contributed, what I learned, and why you should try it too.&lt;/p&gt;

&lt;h2&gt;
  
  
  How To Find the Right Project
&lt;/h2&gt;

&lt;p&gt;The first challenge was figuring out where to contribute. GitHub is full of amazing repositories, but it’s easy to feel overwhelmed. I started by looking for beginner-friendly issues using labels like “good first issue” and “help wanted”.&lt;/p&gt;

&lt;p&gt;Eventually, I came across a project called &lt;a href="https://github.com/firstcontributions/first-contributions" rel="noopener noreferrer"&gt;first-contributions&lt;/a&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Understanding the Workflow
&lt;/h2&gt;

&lt;p&gt;Before writing any code, carefully read the project’s README and CONTRIBUTING.md files. These documents are gold for beginners—they explain how to set up the project, the coding style, and the process for submitting changes.&lt;/p&gt;

&lt;h2&gt;
  
  
  Making My First Contribution
&lt;/h2&gt;

&lt;p&gt;My first contribution wasn’t something massive—it was improving a README section.&lt;/p&gt;

&lt;p&gt;Here’s the process I followed:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Forked the repo&lt;/li&gt;
&lt;li&gt;Cloned it to my local machine&lt;/li&gt;
&lt;li&gt;Created a new branch for my fix&lt;/li&gt;
&lt;li&gt;Made the changes&lt;/li&gt;
&lt;li&gt;Committed and pushed the code&lt;/li&gt;
&lt;li&gt;Opened a Pull Request (PR); the commands are sketched below&lt;/li&gt;
&lt;/ul&gt;
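
&lt;p&gt;On the command line, those steps look roughly like this (the fork URL and branch name are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Placeholders: &amp;lt;your-username&amp;gt; and the branch name are illustrative
git clone https://github.com/&amp;lt;your-username&amp;gt;/first-contributions.git
cd first-contributions
git checkout -b improve-readme      # new branch for the fix
# ...edit README.md...
git add README.md
git commit -m "Improve README section"
git push origin improve-readme      # then open the Pull Request on GitHub
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;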

&lt;h2&gt;
  
  
  What I Learned
&lt;/h2&gt;

&lt;p&gt;Here are my biggest takeaways from this first step into open source:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open source isn’t as scary as it looks—maintainers usually want you to succeed.&lt;/li&gt;
&lt;li&gt;Even small contributions make a difference. Don’t underestimate fixing a typo or improving documentation.&lt;/li&gt;
&lt;li&gt;Reading project guidelines before jumping in saves a lot of time.&lt;/li&gt;
&lt;li&gt;Collaboration and communication matter as much as writing code.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;If you’re thinking about contributing to open source, here’s my advice:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Start small&lt;/strong&gt;. Documentation fixes or simple bugs are great entry points.&lt;/li&gt;
&lt;li&gt;Look for “&lt;strong&gt;good first issue&lt;/strong&gt;” labels. They exist specifically to help beginners get started.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Don’t be afraid&lt;/strong&gt; of mistakes. PR reviews are part of the learning process.&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;If you’ve made your first contribution, I’d love to hear your story too!&lt;/p&gt;

</description>
      <category>codenewbie</category>
      <category>github</category>
      <category>opensource</category>
    </item>
    <item>
      <title>How to Scale Your Application to Handle Peak Loads and Increase Throughput</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Wed, 18 Dec 2024 12:22:05 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/how-to-scale-your-application-to-handle-peak-loads-and-increase-throughput-4d6l</link>
      <guid>https://forem.com/rutikakhaire/how-to-scale-your-application-to-handle-peak-loads-and-increase-throughput-4d6l</guid>
      <description>&lt;p&gt;&lt;strong&gt;Introduction: The Party Planner’s Dilemma&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Imagine you're hosting a party. You plan for 50 guests, but suddenly 200 people show up. You need more food, more chairs, and a way to keep everyone happy without chaos. This is exactly what systems face during peak loads. How do you keep everything running smoothly?&lt;/p&gt;




&lt;h2&gt;
  
  
  1. Scaling Up: Adding Chairs to the Party
&lt;/h2&gt;

&lt;p&gt;Scaling your system during peak loads is like finding extra chairs when more guests arrive. Scaling ensures that the application has enough resources to handle increased demand. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnoy2xz81rg0jwydocfaz.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnoy2xz81rg0jwydocfaz.gif" alt="Add chairs" width="220" height="338"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Here’s how to do it with Azure:&lt;br&gt;
&lt;strong&gt;&lt;u&gt;Vertical Scaling (Adding More Power):&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Adding resources to existing servers, like upgrading a chair to a sofa.&lt;br&gt;
Example: If your Azure App Service instance needs more resources, scale up to a higher pricing tier.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to your App Service.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Scale Up (App Service Plan)&lt;/strong&gt; and choose a higher tier (e.g., Standard to Premium).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Horizontal Scaling (Adding More Machines):&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Adding more servers, like borrowing chairs from your neighbors.&lt;br&gt;
Example: During a flash sale, e-commerce platforms add more servers to handle the load.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It?&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to your App Service.&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;Scale Out (App Service Plan)&lt;/strong&gt; and configure autoscaling rules based on metrics like CPU usage or request count.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;Azure-Specific Tip:&lt;/strong&gt; Use &lt;strong&gt;Azure Monitor Autoscale&lt;/strong&gt; to define rules like “Add one instance if CPU usage exceeds 70% for 5 minutes.” Refer to &lt;a href="https://learn.microsoft.com/en-us/azure/azure-monitor/autoscale/autoscale-get-started" rel="noopener noreferrer"&gt;this guide&lt;/a&gt; for more detailed information.&lt;/p&gt;
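
&lt;p&gt;As a rough sketch with the Azure CLI, a rule like that can be configured as shown below; the resource group, plan, and setting names are placeholders:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Sketch: autoscale an App Service plan on CPU (resource names are placeholders)
az monitor autoscale create \
  --resource-group myResourceGroup \
  --resource myAppServicePlan \
  --resource-type Microsoft.Web/serverfarms \
  --name myAutoscaleSetting \
  --min-count 1 --max-count 5 --count 1

# Add one instance when average CPU exceeds 70% over 5 minutes
az monitor autoscale rule create \
  --resource-group myResourceGroup \
  --autoscale-name myAutoscaleSetting \
  --condition "CpuPercentage &amp;gt; 70 avg 5m" \
  --scale out 1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;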




&lt;h2&gt;
  
  
  2. Thresholds: Knowing Your Party’s Limit
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkix4yuk7o3lu4jsq6cgz.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkix4yuk7o3lu4jsq6cgz.gif" alt="Thresholds" width="480" height="480"&gt;&lt;/a&gt;&lt;br&gt;
Every system has a breaking point—its threshold. It’s like knowing your living room can only hold 30 people before things get cramped. Knowing your thresholds helps you avoid disasters.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Identify Thresholds in Azure:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Your SQL Database might have a DTU (Database Transaction Unit) limit of 1000.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;strong&gt;Azure Monitor&lt;/strong&gt; to track resource metrics like DTU usage, CPU percentage, and memory utilization.&lt;/li&gt;
&lt;li&gt;Set up &lt;strong&gt;Alerts&lt;/strong&gt; in Azure Monitor for critical thresholds. For instance, send an email when database utilization exceeds 85%.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Set Alarms:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Monitor your Azure Function’s execution time or failures.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Navigate to Azure Monitor → Alerts → New Alert Rule.&lt;/li&gt;
&lt;li&gt;Configure conditions like “Trigger an alert if CPU utilization &amp;gt; 80% for 5 minutes.”&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  3. Handling Thresholds: Avoiding the Crash
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnarznwbrzlnryy1ow85.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Flnarznwbrzlnryy1ow85.gif" alt="Avoiding the Crash" width="200" height="166"&gt;&lt;/a&gt;&lt;br&gt;
What happens when you hit a threshold? Use these strategies with Azure:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Graceful Degradation:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Serve cached product pages if your backend API is overloaded.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;strong&gt;Azure Front Door&lt;/strong&gt; to cache and serve static content from edge locations.&lt;/li&gt;
&lt;li&gt;Configure &lt;strong&gt;Azure CDN&lt;/strong&gt; to offload traffic from your backend during peak times.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Queueing Systems:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Process orders in batches during peak traffic.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use &lt;strong&gt;Azure Service Bus&lt;/strong&gt; or &lt;strong&gt;Azure Queue Storage&lt;/strong&gt; to enqueue requests for asynchronous processing.&lt;/li&gt;
&lt;li&gt;Implement a &lt;strong&gt;Logic App&lt;/strong&gt; or &lt;strong&gt;Azure Function&lt;/strong&gt; to process messages from the queue.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  4. Throughput: Keeping the Line Moving
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpn3nntpyz51kt90xyhv.gif" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxpn3nntpyz51kt90xyhv.gif" alt="Throughput" width="480" height="320"&gt;&lt;/a&gt;&lt;br&gt;
Throughput is the speed at which you can serve your guests. In tech terms, it’s how many requests or tasks your system can handle per second.&lt;br&gt;
Here’s how to improve it with Azure:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Optimize Database Performance:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: If your Azure SQL Database is slow, optimize queries and scale up.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the Query Performance Insight feature in Azure SQL to identify slow queries.&lt;/li&gt;
&lt;li&gt;Scale your database using the DTU-based or vCore-based model.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Use Load Balancers:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Distribute traffic evenly across multiple virtual machines.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Set up Azure Load Balancer for Layer 4 (TCP/UDP) traffic.&lt;/li&gt;
&lt;li&gt;Use Application Gateway for Layer 7 (HTTP/HTTPS) traffic, and enable autoscaling.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Implement Caching:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Reduce repeated database calls for product details.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use Azure Cache for Redis to store frequently accessed data like product details or user sessions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Compress Responses:&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Example: Compress API responses to reduce latency.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;How to Do It in Azure:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In Azure App Service, enable compression under Configuration → General Settings.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  5. Put It All Together: The Ultimate Party Plan
&lt;/h2&gt;

&lt;p&gt;Example Scenario:&lt;br&gt;
You’re running a ticket-booking platform for a concert, and traffic surges at ticket release.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Scaling:&lt;/u&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use Azure App Service autoscaling to handle increased HTTP requests.&lt;/li&gt;
&lt;li&gt;Scale your Azure SQL Database to a higher tier (e.g., Standard to Premium).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Thresholds:&lt;/u&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Monitor DTU utilization for Azure SQL Database using Azure Monitor.&lt;/li&gt;
&lt;li&gt;Set alerts for API response times exceeding 500 ms.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Throughput:&lt;/u&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Cache event details using Azure Cache for Redis.&lt;/li&gt;
&lt;li&gt;Use Azure Front Door to route traffic globally and serve cached content faster.&lt;/li&gt;
&lt;li&gt;Implement message queues with Azure Service Bus to process ticket purchases asynchronously.&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Conclusion: Keep the Party Going!
&lt;/h2&gt;

&lt;p&gt;Azure provides an array of tools to manage peak loads, thresholds, and throughput efficiently. With features like autoscaling, Azure Monitor, Service Bus, and Front Door, you can ensure your system remains reliable and performant—even under heavy demand.&lt;/p&gt;

&lt;p&gt;By implementing these best practices, your applications will stay resilient, agile, and ready to handle any traffic surge.&lt;/p&gt;

&lt;p&gt;This blog is designed for those new to scaling, or those with intermediate skills who want to strengthen their understanding of handling peak loads, thresholds, and throughput. Using relatable analogies and real-world scenarios, I have tried to break down these concepts into easy-to-grasp sections.&lt;/p&gt;

&lt;p&gt;Reference Links&lt;br&gt;
&lt;a href="https://learn.microsoft.com/en-us/azure/architecture/best-practices/auto-scaling" rel="noopener noreferrer"&gt;Autoscaling&lt;/a&gt;&lt;/p&gt;

</description>
      <category>azure</category>
      <category>productivity</category>
      <category>howto</category>
      <category>serverless</category>
    </item>
    <item>
      <title>Boost Productivity and Data Accuracy: Essential Tactics for Software Integration</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Thu, 06 Jun 2024 13:49:22 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/boost-productivity-and-data-accuracy-essential-tactics-for-software-integration-3hjn</link>
      <guid>https://forem.com/rutikakhaire/boost-productivity-and-data-accuracy-essential-tactics-for-software-integration-3hjn</guid>
      <description>&lt;h2&gt;
  
  
  Understanding Software Integration
&lt;/h2&gt;

&lt;p&gt;Nowadays, integration has become a fundamental aspect of software development. Every organization strives to remain competitive in today's fast-paced world. Integration enables different systems to communicate seamlessly and share information efficiently.&lt;/p&gt;

&lt;p&gt;The most effective software integration strategies involve thorough planning, clear communication, and collaborative efforts among teams.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why do we need Software Integration&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;&lt;strong&gt;Enhanced Workflow Efficiency&lt;/strong&gt;&lt;/em&gt;: Automating the flow of information between systems speeds up business processes and minimizes delays.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;&lt;strong&gt;Unified Data Management&lt;/strong&gt;&lt;/em&gt;: Integrated systems ensure that data is consistent and up-to-date across all platforms, reducing the risk of errors caused by data discrepancies.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;&lt;strong&gt;Real-time Data Access&lt;/strong&gt;&lt;/em&gt;: Changes made in one system are instantly reflected across all integrated systems, ensuring all users have access to the latest information.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;&lt;strong&gt;Automation of Tasks&lt;/strong&gt;&lt;/em&gt;: By automating routine and complex processes, integration frees up employees to focus on higher-value activities.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;em&gt;&lt;strong&gt;Reduced Operational Costs&lt;/strong&gt;&lt;/em&gt;: Integration reduces the need for maintaining multiple systems and data silos, lowering IT and operational expenses.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Key Integration Tactics
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;What it is&lt;/strong&gt;:  Key Integration Tactics are strategies and approaches used to effectively connect different software applications and data sources. They ensure smooth information flow and collaboration between these systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why they matter&lt;/strong&gt;:  Without proper tactics, integration projects can become complex, error-prone, and ultimately fail to deliver the desired benefits.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Different methods of Integration&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;em&gt;API Integration&lt;/em&gt;&lt;/strong&gt;: This allows different software systems to communicate with each other via APIs. &lt;br&gt;
&lt;strong&gt;&lt;em&gt;Middleware&lt;/em&gt;&lt;/strong&gt;: There are various middleware that can be employed to facilitate communication between systems.&lt;br&gt;
&lt;strong&gt;&lt;em&gt;Webhooks&lt;/em&gt;&lt;/strong&gt;: This helps in real-time data exchange.&lt;br&gt;
&lt;strong&gt;&lt;em&gt;Microservices&lt;/em&gt;&lt;/strong&gt;: This breaks down applications into smaller, interconnected services.&lt;/p&gt;

&lt;h2&gt;
  
  
  Planning and Choosing the Right Tactics
&lt;/h2&gt;

&lt;p&gt;&lt;u&gt;&lt;strong&gt;Data mapping and standardization&lt;/strong&gt;&lt;/u&gt;&lt;br&gt;
This is one of the crucial tactics for ensuring smooth and accurate communication between different software applications. Its benefits include &lt;strong&gt;data accuracy&lt;/strong&gt;, &lt;strong&gt;efficiency&lt;/strong&gt;, &lt;strong&gt;maintainability&lt;/strong&gt;, and &lt;strong&gt;interoperability&lt;/strong&gt;.&lt;/p&gt;
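
&lt;p&gt;As a small illustration, a mapping and standardization step often boils down to a query like the sketch below, where every name and conversion is a placeholder rather than a prescribed schema:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: map source fields to one standardized target shape (names illustrative)
SELECT cust_id                             AS CustomerId,
       UPPER(LTRIM(RTRIM(country_code)))   AS CountryCode,   -- standardize casing/whitespace
       TRY_CONVERT(date, signup_dt, 103)   AS SignupDate,    -- dd/mm/yyyy to ISO date
       CAST(total_spend AS decimal(18, 2)) AS TotalSpend     -- one agreed numeric type
FROM source_system.Customers;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;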

&lt;p&gt;&lt;u&gt;&lt;strong&gt;Reconciliation fields&lt;/strong&gt;&lt;/u&gt;&lt;br&gt;
Though reconciliation fields are not a universally defined concept in software integration, they play a crucial role in ensuring data consistency between integrated systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What are reconciliation fields?&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Reconciliation fields are specific data points used to compare and identify discrepancies between data sets in integrated systems. They act as a common ground for both systems to verify if the information they hold aligns.&lt;/p&gt;

&lt;p&gt;A few advantages of using reconciliation fields include pinpointing the source of an error and taking corrective action by analyzing discrepancies in such fields. Reliable data from reconciled systems leads to better insights and informed decisions.&lt;/p&gt;
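
&lt;p&gt;To sketch how this works in practice, the query below compares two systems on their shared reconciliation fields and surfaces anything that disagrees (table and column names are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;-- Sketch: surface records that are missing or drifting between systems (names illustrative)
SELECT COALESCE(crm.OrderId, erp.OrderId) AS OrderId,
       crm.OrderTotal AS CrmTotal,
       erp.OrderTotal AS ErpTotal
FROM crm.Orders AS crm
FULL OUTER JOIN erp.Orders AS erp
    ON erp.OrderId = crm.OrderId
WHERE crm.OrderId IS NULL          -- present only in ERP
   OR erp.OrderId IS NULL          -- present only in CRM
   OR crm.OrderTotal &amp;lt;&amp;gt; erp.OrderTotal;  -- value drift
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;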

&lt;p&gt;&lt;u&gt;&lt;strong&gt;Error Handling and Security&lt;/strong&gt;&lt;/u&gt;&lt;br&gt;
Implementing robust error handling and security measures is another crucial tactic for ensuring smooth integration. Error handling is the process of anticipating, detecting, and recovering from unexpected issues that may arise during data exchange between integrated systems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;A few techniques for implementing it include:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Logging errors&lt;/strong&gt;: This is usually the first go-to option for tackling issues and understanding the root causes behind problems.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Retry logic&lt;/strong&gt;: You can implement logic that reattempts data transfers after a delay.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Alerts&lt;/strong&gt;: You can define alert systems that notify the respective teams about issues so they can take appropriate action.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;&lt;u&gt;Testing and Monitoring&lt;/u&gt;&lt;/strong&gt;&lt;br&gt;
Thorough testing and ongoing monitoring are essential for ensuring the smooth operation and long-term success of your software integration project.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Testing&lt;/strong&gt;, in simple words, means ensuring that the integration meets the requirements and that the data exchange functions accurately. The types of testing below all help ensure that systems meet expectations.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Unit testing&lt;/li&gt;
&lt;li&gt;Integration testing&lt;/li&gt;
&lt;li&gt;Regression testing&lt;/li&gt;
&lt;li&gt;Smoke testing&lt;/li&gt;
&lt;li&gt;Functional and Non-Functional testing&lt;/li&gt;
&lt;li&gt;System testing&lt;/li&gt;
&lt;li&gt;Load testing&lt;/li&gt;
&lt;li&gt;Negative testing&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;strong&gt;Monitoring&lt;/strong&gt;, on the other hand, involves continuously observing the performance and health of the integrated system after it is deployed. Different metrics are available that help track the exchange of data and identify failed transactions.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Future Trends in Software Integration&lt;/strong&gt;&lt;br&gt;
Software integration is constantly evolving, driven by advancements in technology and the growing need for seamless data exchange across an ever-expanding application ecosystem. Here's a glimpse into some key trends shaping the future of software integration:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The Internet of Things (IoT)&lt;/strong&gt;: Integrating with IoT devices will become crucial for managing and analyzing data collected from sensors and intelligent devices.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Artificial intelligence (AI) and Machine Learning (ML)&lt;/strong&gt;: AI and ML can be used to automate tasks within the integration process, like data mapping and error handling. Additionally, AI-powered integration platforms can learn and adapt over time, optimizing performance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Low-Code/No-Code Integration Tools&lt;/strong&gt;: There is a rise in tools that empower business users with limited or no coding experience to build basic integrations.&lt;/p&gt;

&lt;p&gt;These trends highlight a future where software integration becomes more accessible, automated, and secure. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Looking Ahead&lt;/strong&gt;&lt;br&gt;
By adopting the key tactics explored in this blog, you can approach software integration projects with confidence.  Remember, successful integration goes beyond just connecting systems. It's about establishing a well-defined strategy, utilizing the right tools and methods, and prioritizing ongoing maintenance and monitoring.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Building Real-time Notifications with Azure Web PubSub and Azure Functions</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Fri, 01 Dec 2023 04:31:18 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/building-real-time-notifications-with-azure-web-pubsub-and-azure-functions-5dmc</link>
      <guid>https://forem.com/rutikakhaire/building-real-time-notifications-with-azure-web-pubsub-and-azure-functions-5dmc</guid>
      <description>&lt;p&gt;Hello people!!! Today I am writing this blog to share my experience of using the Azure Web PubSub service within an Azure Function to send real-time messages to my React application.&lt;/p&gt;

&lt;p&gt;Many times, we have a requirement to do some processing asynchronously behind the scenes so that we don't block the user's activities on the site.&lt;/p&gt;

&lt;p&gt;I came across a similar situation while working on a chatbot application. This application has a feature that allows a user to upload a file and train it on a specific model; once the file is trained successfully, the user can query the trained data and get responses from it, very much like what we do in ChatGPT.&lt;/p&gt;

&lt;p&gt;So the requirement was as below:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;User uploads a file in the chat box&lt;/li&gt;
&lt;li&gt;Clicks the send icon&lt;/li&gt;
&lt;li&gt;File is uploaded to Azure storage&lt;/li&gt;
&lt;li&gt;User receives a success message saying - "File uploaded successfully"&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Now, while the file is getting trained asynchronously, the user should be able to continue their conversation with the rest of the available data.&lt;/p&gt;

&lt;p&gt;So when should the user query the data they just uploaded? The answer: whenever the file is processed successfully behind the scenes, the user should see some indication that the file is now trained and ready to serve requests.&lt;/p&gt;

&lt;p&gt;Here comes the concept of a PubSub notification. When the file is processed or trained, the PubSub service publishes a message; that is what Pub stands for. Some service or resource then subscribes to receive that published message; that is the Sub. So &lt;strong&gt;&lt;em&gt;PubSub&lt;/em&gt;&lt;/strong&gt; stands for &lt;strong&gt;&lt;em&gt;Publish&lt;/em&gt;&lt;/strong&gt; and &lt;strong&gt;&lt;em&gt;Subscribe&lt;/em&gt;&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Subscribe to a message
&lt;/h2&gt;

&lt;p&gt;You can use a &lt;strong&gt;&lt;em&gt;WebSocket&lt;/em&gt;&lt;/strong&gt; connection from your frontend application to stay connected to your publisher service so that you can receive real-time messages.&lt;/p&gt;

&lt;p&gt;In my case, the publisher is the Azure Function and the subscriber is the React application.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create an Azure Web PubSub service
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Go to Azure portal&lt;/li&gt;
&lt;li&gt;Type Web PubSub Service in the search bar&lt;/li&gt;
&lt;li&gt;Click on the Create button &lt;/li&gt;
&lt;li&gt;Select Web PubSub and then add the necessary information and create the service&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Below is how the service looks when created successfully:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F15ox68hypfv4el8w6ukr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F15ox68hypfv4el8w6ukr.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Go to the Keys section and note the connection strings available there. The primary connection string is used when establishing the connection, so keep a note of it.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhocngr4i26poc2yni4s9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhocngr4i26poc2yni4s9.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Azure Function
&lt;/h2&gt;

&lt;p&gt;An Azure Function is an event-driven, serverless compute service that lets you write less code and maintain less infrastructure, saving costs. Read more &lt;a href="https://learn.microsoft.com/en-us/azure/azure-functions/functions-triggers-bindings?tabs=isolated-process%2Cpython-v2&amp;amp;pivots=programming-language-typescript" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;So, now my Azure function is responsible for reading the file from Azure Storage and then passing the file to an API endpoint to train it.&lt;/p&gt;

&lt;p&gt;As the function is event-driven, I have to add two functions to my function app. One is the &lt;strong&gt;&lt;em&gt;Blob Storage Trigger&lt;/em&gt;&lt;/strong&gt;, which executes whenever a new file is uploaded to the specific storage. The other is the &lt;strong&gt;&lt;em&gt;HTTP Trigger&lt;/em&gt;&lt;/strong&gt;, which returns the connection details the React app uses to open its WebSocket connection.&lt;/p&gt;

&lt;p&gt;You have to create a function app in Azure and then add the functions to that function app.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Steps to create the function app:&lt;/strong&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Go to Azure portal&lt;/li&gt;
&lt;li&gt;Type Function App in the search bar&lt;/li&gt;
&lt;li&gt;Click Create&lt;/li&gt;
&lt;li&gt;Add the necessary and required information to create the function app&lt;/li&gt;
&lt;li&gt;After the app is created, go to Configuration and add a setting for the Web PubSub connection string as below. Here you will need the primary connection string from when the PubSub service was created.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgxh8gx1afykjtev9t3lo.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgxh8gx1afykjtev9t3lo.png" alt=" " width="800" height="378"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Below is my code structure for Azure function:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnatsllj3dlop3mjat62v.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fnatsllj3dlop3mjat62v.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;&lt;em&gt;AiTrain&lt;/em&gt;&lt;/strong&gt; function is a blob storage trigger function and &lt;strong&gt;&lt;em&gt;Negotiate&lt;/em&gt;&lt;/strong&gt; is the HTTP trigger function. I am using TypeScript.&lt;/p&gt;

&lt;h2&gt;
  
  
  Negotiate function
&lt;/h2&gt;

&lt;p&gt;This function is responsible for returning the connection object to the subscriber. So, the index.ts file in this function has the code below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = function (context, req, connection) {
    context.res = { body: connection };
    context.done();
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;And the function.json file has the settings below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": [
        "get",
        "post"
      ]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "webPubSubConnection",
      "name": "connection",
      "hub": "notification",
      "direction": "in"
    }
  ],
  "scriptFile": "../dist/Negotiate/index.js"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  AiTrain function
&lt;/h2&gt;

&lt;p&gt;This function is responsible for training the uploaded file and publishing a message on success.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import { AzureFunction, Context } from '@azure/functions';

const blobTrigger: AzureFunction = async function (
    context: Context,
    myBlob: string
): Promise&amp;lt;void&amp;gt; {
    try {
        context.log('TypeScript Blob trigger function');

        const blobURI = decodeURIComponent(context.bindingData.uri);

        // file processing logic


        //Azure web pubsub actions
        const actions = {
            actionName: 'sendToAll',
            data: `File processing complete!`,
            dataType: 'text',
        };

        context.bindings.actions = actions;

        // No other asynchronous operations, so we can return a resolved Promise
        return Promise.resolve();
    } catch (error: unknown) {
        console.error(error);
    }
};

export default blobTrigger;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The function.json file has the code below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "bindings": [
    {
      "name": "myBlob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "files/{name}",
      "connection": "dev_STORAGE"
    },
    {
      "type": "webPubSub",
      "name": "actions",
      "hub": "notification",
      "direction": "out"
    }
  ],
  "scriptFile": "../dist/AiTrain/index.js"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This completes the first part of publishing a message from Azure function.&lt;/p&gt;




&lt;h2&gt;
  
  
  Subscribe to published messages
&lt;/h2&gt;

&lt;p&gt;To subscribe to the messages from the React application, all you need to do is establish a WebSocket connection in your code.&lt;/p&gt;

&lt;p&gt;Below is my React code that establishes a WebSocket connection using the URL returned by the negotiate endpoint.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;useEffect(() =&amp;gt; {
        const connect = async () =&amp;gt; {
        const res = await fetch(`${API_URI}/api/negotiate`);
        const { url } = await res.json(); // the returned url already embeds the access token
        const ws = new WebSocket(url);
        ws.onopen = () =&amp;gt; console.log('connected');
        ws.onmessage = (event:any) =&amp;gt; {
            console.log(JSON.stringify(event.data));
            //Code Logic
        };

        };
        connect();
    }, []);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We have to call the negotiate endpoint to fetch the connection URL; the WebSocket then keeps the React app connected to the Web PubSub service.&lt;/p&gt;

&lt;p&gt;And that's all. Check the console messages to see the event object carrying the data sent from the Azure function. You can choose any component to display this data as a notification to the user.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;In conclusion, Azure Web PubSub, when coupled with Azure Functions and a React application, opens up exciting possibilities for real-time, scalable, and interactive web experiences. We've explored how seamlessly these components integrate to create a dynamic communication channel between the server and client, enabling instant updates and collaborative features.&lt;/p&gt;

&lt;p&gt;Keep experimenting, pushing the boundaries, and discovering new ways to leverage Azure Web PubSub. The possibilities are vast.&lt;/p&gt;

&lt;p&gt;I hope this blog has inspired you to dive deeper into the realm of real-time communication and discover the endless possibilities that await. Happy coding!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>pubsub</category>
      <category>messaging</category>
      <category>ai</category>
    </item>
    <item>
      <title>AWS vs. Azure: Comparison</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Fri, 29 Sep 2023 14:26:03 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/aws-vs-azure-comparison-36pf</link>
      <guid>https://forem.com/rutikakhaire/aws-vs-azure-comparison-36pf</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Amazon Web Services (AWS) and Microsoft Azure are the two giants of cloud computing, and choosing between them is one of the most critical infrastructure decisions a business makes.&lt;/p&gt;

&lt;p&gt;More than just cloud providers, AWS and Azure have come a long way in delivering technological services that revolutionize the way organizations operate and innovate.&lt;/p&gt;

&lt;p&gt;In this comparison, I will share my understanding of the vast ecosystems they offer, the pricing models that govern their services, their global infrastructure, and the security measures they employ to protect sensitive data.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ecosystems
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Below are the &lt;strong&gt;Compute Services&lt;/strong&gt; offered&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;AWS&lt;/strong&gt; offers AWS Elastic Beanstalk, Amazon EC2 (Elastic Compute Cloud), AWS Lambda, Amazon ECS (Elastic Container Service), and more.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure&lt;/strong&gt; offers Azure App Service, Azure Functions, Azure Kubernetes Service (AKS), Virtual Machines (VMs), and more.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Next, if we talk about storage, below are the &lt;strong&gt;Storage Services&lt;/strong&gt; offered&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;AWS&lt;/strong&gt; offers Amazon S3 (Simple Storage Service), Amazon EBS (Elastic Block Store), Amazon EFS (Elastic File System), and Amazon S3 Glacier.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure&lt;/strong&gt; offers Azure Blob Storage, Azure File Storage, Azure Table Storage, Azure Disk Storage, and Azure Data Lake Storage.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Let's see the &lt;strong&gt;Database Services&lt;/strong&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;strong&gt;AWS&lt;/strong&gt; offers Amazon RDS, Amazon Aurora, Amazon DynamoDB, Amazon Redshift, and Amazon Neptune.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Azure&lt;/strong&gt; offers Azure SQL Database, Azure Database for PostgreSQL, MySQL, and MariaDB, Azure Cosmos DB, and more.&lt;/p&gt;

&lt;h2&gt;
  
  
  Pricing
&lt;/h2&gt;

&lt;p&gt;Both AWS and Azure provide a free tier that you can use to try your hand at their service offerings. Both also offer pricing calculators that you can use to estimate costs as per your needs.&lt;/p&gt;

&lt;p&gt;There is also &lt;strong&gt;pay-as-you-go pricing&lt;/strong&gt;, which charges you based on the amount of time you use the resources, on a per-minute or per-hour basis.&lt;/p&gt;

&lt;p&gt;If you want to use the resources for a longer duration, say a year or more, both AWS and Azure offer special discounted rates, termed &lt;strong&gt;Reserved Instances&lt;/strong&gt;. A rough illustration of the difference is sketched below.&lt;/p&gt;
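&lt;p&gt;The numbers here are purely illustrative assumptions, not actual AWS or Azure rates:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Purely illustrative rates, not real cloud prices
const hourlyRate = 0.10;      // hypothetical on-demand $/hour
const hoursPerMonth = 730;

const onDemand = hourlyRate * hoursPerMonth;  // $73.00/month, pay as you go
const reserved = onDemand * (1 - 0.4);        // $43.80/month with a ~40% reserved discount

console.log({ onDemand, reserved });
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;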

&lt;h2&gt;
  
  
  Global Infrastructure
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Azure's&lt;/strong&gt; global infrastructure is made up of two key components: physical infrastructure and connective network components. The physical component comprises 200+ physical datacenters, arranged into regions and linked by one of the largest interconnected networks on the planet.&lt;/p&gt;

&lt;p&gt;With the connectivity of the global Azure network, each of the Azure datacenters provides high availability, low latency, scalability, and the latest advancements in cloud infrastructure—all running on the Azure platform.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;AWS Cloud&lt;/strong&gt; spans 102 Availability Zones within 32 geographic regions around the world, with announced plans for 12 more Availability Zones and 4 more AWS Regions.&lt;/p&gt;

&lt;h2&gt;
  
  
  Security Measures to protect sensitive data
&lt;/h2&gt;

&lt;p&gt;To manage access to resources, AWS offers &lt;strong&gt;IAM&lt;/strong&gt; roles, users, and groups, while in Azure you can use &lt;strong&gt;Azure Active Directory&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;If you want to secure configuration data such as credentials and API keys, &lt;strong&gt;AWS&lt;/strong&gt; offers &lt;strong&gt;Secrets Manager&lt;/strong&gt; and &lt;strong&gt;Azure&lt;/strong&gt; provides &lt;strong&gt;Azure Key Vault&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;This is a brief comparison based on my experience of working with Azure and AWS. There is still a lot out there in Azure and AWS that you can explore by visiting their official documentation.&lt;/p&gt;

&lt;p&gt;Thanks for reading!&lt;/p&gt;

</description>
      <category>azure</category>
      <category>aws</category>
      <category>cloudcomputing</category>
    </item>
    <item>
      <title>AI-Powered Coding: A Look at GitHub CoPilot's Capabilities</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Thu, 13 Apr 2023 12:05:42 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/ai-powered-coding-a-look-at-github-copilots-capabilities-3a55</link>
      <guid>https://forem.com/rutikakhaire/ai-powered-coding-a-look-at-github-copilots-capabilities-3a55</guid>
      <description>&lt;p&gt;Let me give you an insight of a powerful tool called the &lt;strong&gt;GitHub CoPilot&lt;/strong&gt; that has revolutionized the way we write code. And congratulations to all the developers as we have an AI powered assistant that will assist us in writing fast and efficient code. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhqtwuokxkg5e7nhfb06j.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhqtwuokxkg5e7nhfb06j.jpg" alt="Congratulations" width="512" height="333"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The term "CoPilot" generally refers to a person who assists the primary pilot in controlling an aircraft. Similarly, GitHub CoPilot is the term that conveys the idea of an AI powered assistant for a developer. Like a co-pilot who helps the pilot in controlling an aircraft, GitHub CoPilot helps developers in controlling the code they write.&lt;/p&gt;

&lt;h2&gt;
  
  
  GitHub CoPilot's Capabilities
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Increased Productivity
&lt;/h3&gt;

&lt;p&gt;Isn't the goal of a developer to maximize productivity? Why not utilize a potent tool to enhance your productivity?&lt;/p&gt;

&lt;p&gt;GitHub CoPilot provides suggestions and autocompletions thus automating repetitive tasks. While it won't write the entire code according to your exact specifications, it can still be a valuable time-saver, particularly when it comes to catching unintended syntactical errors. By preventing you from spending your productive hours fixing minor mistakes that you may have inadvertently made, it allows you to focus on more meaningful tasks.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5ug3nsw5b64rvddtrdk.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs5ug3nsw5b64rvddtrdk.jpg" alt="More Productive" width="287" height="176"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Understanding Natural Language
&lt;/h3&gt;

&lt;p&gt;One of the most impressive capabilities I found is that it can understand plain English and generate code for you.&lt;/p&gt;

&lt;p&gt;For example, if you want to write code to filter non-matching data from two JSON arrays, you can simply type the intent as a comment and it will generate the code.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Filter out non-matching personIDs from teamMembers and assignmentExists
const filteredTeamMembers = teamMembers.filter(
    (e: any) =&amp;gt; !assignmentExists.find(
        (n: any) =&amp;gt; n.dataValues.PersonID === e.dataValues.PersonID
    )
);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F15msfk466c2e9mtgtfiz.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F15msfk466c2e9mtgtfiz.png" alt="AI Programmer" width="800" height="450"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Helps in reducing errors
&lt;/h3&gt;

&lt;p&gt;It is able to recognize errors and correct them, saving a lot of time.&lt;/p&gt;

&lt;h3&gt;
  
  
  Better quality of code
&lt;/h3&gt;

&lt;p&gt;As it gives proper code suggestions, you can focus on improving code quality. In my case, it gave me async/await suggestions, which I feel read much better than promise-chain code. A before-and-after sketch follows below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvfrzlxi28x4prcooki2l.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvfrzlxi28x4prcooki2l.jpg" alt="Good Quality Code" width="275" height="183"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;So these are a few of the capabilities this tool offers. While it still has some limitations, it is already powerful enough to be worth using.&lt;/p&gt;

&lt;p&gt;You can explore the tool with a free trial for 2 months, after which it becomes a paid subscription. If you are using the Visual Studio Code editor, simply install the extension and start coding.&lt;/p&gt;

&lt;p&gt;For more info visit &lt;a href="https://github.com/features/copilot" rel="noopener noreferrer"&gt;link&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>github</category>
      <category>developers</category>
      <category>programming</category>
    </item>
    <item>
      <title>The easiest way to use HubSpot Search API</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Fri, 24 Mar 2023 13:47:24 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/the-easiest-way-to-use-hubspot-search-api-5ga9</link>
      <guid>https://forem.com/rutikakhaire/the-easiest-way-to-use-hubspot-search-api-5ga9</guid>
      <description>&lt;p&gt;Whenever working on integrations, we basically use the APIs to retrieve the information. Some APIs are pretty straightforward to understand, so just by looking at the endpoint we can make out how to use it but sometimes, you need to dig in a bit to understand the usage.&lt;/p&gt;

&lt;p&gt;I came across this situation in my application when I wanted to retrieve all the contacts from HubSpot based on a particular property value. For example, if there is a contact property named &lt;strong&gt;Company_Affiliation&lt;/strong&gt; and I want to retrieve all the contacts with a company affiliation of, say, &lt;strong&gt;Fictional Company&lt;/strong&gt;, how should I use the Search API?&lt;/p&gt;

&lt;h2&gt;
  
  
  The Endpoint
&lt;/h2&gt;

&lt;p&gt;&lt;code&gt;POST /crm/v3/objects/contacts/search&lt;/code&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  The Input
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const inputHubSpotSearchObject=
    {
        "objectName":"contacts",
        "limitValue":"100",
        "afterValue":"0",
        "filters":[
            {
                "value":req.body.company_name,
                "propertyName": "company_affiliation",
                "operator": "EQ"
            }

        ]
    }
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  The API call
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const response = await axios.post(`${process.env.API_URL}/api/hubspot/search`,inputHubSpotSearchObject,
                          {
                          headers: {
                              "Content-Type": "application/json",
                              Authorization: `Bearer ${accessToken}`,
                          }});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
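&lt;p&gt;Note that this call goes through my own backend proxy. Assuming that proxy forwards the request to HubSpot's documented CRM search endpoint, a minimal sketch of the server side could look like this (the mapping and variable names are my assumptions, not part of the original integration):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Hypothetical proxy handler: maps the input object onto HubSpot's
// CRM v3 search API (https://developers.hubspot.com/docs/api/crm/search)
const input = req.body;
const hubspotResponse = await axios.post(
    `https://api.hubapi.com/crm/v3/objects/${input.objectName}/search`,
    {
        filterGroups: [{ filters: input.filters }], // HubSpot expects filterGroups
        limit: Number(input.limitValue),
        after: input.afterValue,
    },
    { headers: { Authorization: `Bearer ${process.env.HUBSPOT_TOKEN}` } } // token env var is an assumption
);
res.send(hubspotResponse.data);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;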



&lt;h2&gt;
  
  
  The Response
&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96cgatfbyeswj7nmf9j9.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F96cgatfbyeswj7nmf9j9.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;I hope you found this article helpful. For more information, visit this &lt;a href="https://developers.hubspot.com/docs/api/crm/search" rel="noopener noreferrer"&gt;link&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Happy Coding!!!&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>programming</category>
      <category>typescript</category>
      <category>api</category>
    </item>
    <item>
      <title>Understanding the basics of Express Middleware</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Wed, 28 Dec 2022 09:23:24 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/understanding-the-basics-of-express-middleware-4ifp</link>
      <guid>https://forem.com/rutikakhaire/understanding-the-basics-of-express-middleware-4ifp</guid>
      <description>&lt;p&gt;In this article, we will understand what Express middleware is and how to use it in a NodeJS application.&lt;/p&gt;

&lt;p&gt;As the name suggests, middleware sits in the middle of an Express application. Typically, an application is accessed from a client system to retrieve some information from a server. So there is a &lt;strong&gt;Request&lt;/strong&gt; sent by the client to the server, and a &lt;strong&gt;Response&lt;/strong&gt; sent by the server back to the requesting client.&lt;/p&gt;

&lt;p&gt;So, &lt;strong&gt;Express Middleware&lt;/strong&gt; consists of functions that are executed after a &lt;strong&gt;Request&lt;/strong&gt; is sent by the client and before the &lt;strong&gt;Response&lt;/strong&gt; is sent by the server: the logic we want to execute in the middle of the &lt;strong&gt;Request&lt;/strong&gt; and the &lt;strong&gt;Response&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fyxdlagmzn8xf8b28c1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9fyxdlagmzn8xf8b28c1.png" alt=" " width="800" height="314"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We can have multiple middlewares, each of which can modify the request object and then pass it to the next function in the sequence, as in the sketch below.&lt;/p&gt;
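&lt;p&gt;A minimal sketch of this chaining (my own illustration, assuming an existing app object):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// First middleware: attach a property to the request object
app.use((req, res, next) =&amp;gt; {
  req.requestTime = Date.now()
  next() // pass control to the next middleware
})

// Second middleware: read the property added above
app.use((req, res, next) =&amp;gt; {
  console.log(`Request received at ${req.requestTime}`)
  next()
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;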




&lt;p&gt;The base of Express is the &lt;strong&gt;Application&lt;/strong&gt; object, typically named app. This app object exposes different methods, and here we will take a look at &lt;strong&gt;listen&lt;/strong&gt;, &lt;strong&gt;use&lt;/strong&gt;, and the &lt;strong&gt;HTTP methods (get, put, post, etc.)&lt;/strong&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Create a server with app object
&lt;/h2&gt;

&lt;p&gt;Whenever a website is created, it has a link / URL that we use to access it. Since the website lives on a server, a client comes into the picture that calls the website's URL. Now, as the website is being called, there must be someone listening for those calls, right? In this case, the server is the listener, and that is the purpose of the app's listen method: the server listens on a port number for incoming calls from clients.&lt;/p&gt;

&lt;p&gt;The syntax for creating a server using the app object is as below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const express = require('express')

const app = express()
const port = 3000

app.listen(port, () =&amp;gt; {
  console.log(`Example app listening on port ${port}`)
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above code brings the server up and running on port 3000. When we run the application, we can see the message from console.log as below.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1u2xea6e3yz83466h7n7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F1u2xea6e3yz83466h7n7.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Now we need to define some endpoints (path/URI) in the code that allow users to call different pages of the website. For example, a website can have a Home page, a Contact page, an About page, and so on. The URLs would look like &lt;a href="https://somedomain/Home" rel="noopener noreferrer"&gt;https://somedomain/Home&lt;/a&gt;, &lt;a href="https://somedomain/About" rel="noopener noreferrer"&gt;https://somedomain/About&lt;/a&gt;, &lt;a href="https://somedomain/Contact" rel="noopener noreferrer"&gt;https://somedomain/Contact&lt;/a&gt;, and so on. The part of the URL like /Home, /About, /Contact is what we term the endpoint, and we add it in our code using the app object. Here the HTTP methods (get, put, post, etc.) come into the picture. Whenever we access a website's page, we get some information displayed in the browser, so for the pages mentioned above we should use the get method of app.&lt;/p&gt;

&lt;p&gt;This is termed as &lt;strong&gt;Routing&lt;/strong&gt;. Below is the definition of Routing from express official documentation.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Routing refers to determining how an application responds to a client request to a particular endpoint, which is a URI (or path) and a specific HTTP request method (GET, POST, and so on).&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The basic structure of a route is as following:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;app.METHOD(PATH, HANDLER)&lt;/code&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;strong&gt;Where&lt;/strong&gt;:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;app&lt;/strong&gt; is an instance of express.&lt;br&gt;
&lt;strong&gt;METHOD&lt;/strong&gt; is an HTTP request method, in lowercase.&lt;br&gt;
&lt;strong&gt;PATH&lt;/strong&gt; is a path on the server.&lt;br&gt;
&lt;strong&gt;HANDLER&lt;/strong&gt; is the function executed when the route is matched.&lt;/p&gt;
&lt;/blockquote&gt;




&lt;h2&gt;
  
  
  Let's see how we can use the HTTP methods
&lt;/h2&gt;

&lt;p&gt;Following example shows the &lt;strong&gt;get&lt;/strong&gt; method:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app.get('/Home', (req, res) =&amp;gt; {
  res.send('Welcome to the Home Page!')
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The code snippet would be as below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hecl4q9thh6dozqqpwa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0hecl4q9thh6dozqqpwa.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;It can be accessed in the browser using URL &lt;a href="http://localhost:3000/home" rel="noopener noreferrer"&gt;http://localhost:3000/home&lt;/a&gt; and we can see the following message:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fajc5lgurvhskxe18c6xx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fajc5lgurvhskxe18c6xx.png" alt=" " width="409" height="157"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;An important point to note here is that if we try to access the link as &lt;a href="http://localhost:3000" rel="noopener noreferrer"&gt;http://localhost:3000&lt;/a&gt;, we will get an error that says &lt;em&gt;Cannot GET /&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;What does this error mean? It means that we have not defined a handler for the base path "/" in our code, yet we are trying to access it in the browser.&lt;/p&gt;

&lt;p&gt;Let us add the code for base path and see how it works. Following code snippet shows a new get method for base path.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app.get('/', (req, res) =&amp;gt; {
    res.send('This is the base call!')
  })
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The complete code till now looks like below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzhwflt3h2t8kxeenid96.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fzhwflt3h2t8kxeenid96.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In the browser now if we access the URL as &lt;a href="http://localhost:3000/" rel="noopener noreferrer"&gt;http://localhost:3000/&lt;/a&gt;, we see the below output.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu6qlq3n57u41wiqgq3tq.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fu6qlq3n57u41wiqgq3tq.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;The purpose of app.use method&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Now, let's understand what the app.use method is used for. From the official definition, app.use:&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;&lt;em&gt;Mounts the specified middleware function or functions at the specified path: the middleware function is executed when the base of the requested path matches path.&lt;/em&gt;&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;So, in simple language, if we want to execute some logic before a specific path is matched and its handler is executed, we can configure it using the app.use method.&lt;/p&gt;

&lt;p&gt;For example, app.use('/apple', ...) will match "/apple", "/apple/images", "/apple/images/news", and so on. So if we call "/apple/images", Express first executes the middleware mounted at "/apple" and then the handler for "/apple/images", as sketched below.&lt;/p&gt;
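&lt;p&gt;A small illustration of this prefix matching (my own sketch, assuming an existing app object):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Runs for /apple, /apple/images, /apple/images/news, ...
app.use('/apple', (req, res, next) =&amp;gt; {
  console.log('apple middleware')
  next()
})

// Runs after the '/apple' middleware above
app.get('/apple/images', (req, res) =&amp;gt; {
  res.send('Here are the apple images!')
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;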

&lt;p&gt;If we configure a middleware &lt;strong&gt;without a path&lt;/strong&gt;, it gets executed for &lt;strong&gt;every request&lt;/strong&gt; to the app. For example, the following function will be executed for every request.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;app.use(function (req, res, next) {
  console.log('Time: %d', Date.now())
  next()
})
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;&lt;em&gt;Middleware functions are executed sequentially, therefore the order of middleware inclusion is important.&lt;/em&gt;&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The sequential execution of functions is achieved using the &lt;strong&gt;next()&lt;/strong&gt; function. As used in the above code snippet, we are using the next() call to ensure that the next function in the sequence is executed.&lt;/p&gt;

&lt;p&gt;Following code illustrates this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femr95onrl6817m96lrgr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Femr95onrl6817m96lrgr.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;NOTE: You have to call the URL &lt;a href="http://localhost:3000/" rel="noopener noreferrer"&gt;http://localhost:3000/&lt;/a&gt; in the browser to see the output&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Let's see what happens if we remove the next() call from the function.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdi1r6dsby4m2ogb8jci.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fsdi1r6dsby4m2ogb8jci.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Notice that the log we were seeing earlier after the time was displayed ("This is the base call!") no longer appears. This means the control does not pass to the next function in the sequence. So it is important to decide the order in which your functions should execute and add them accordingly.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;I hope you have enjoyed this article and hope that it helped you in understanding the basics of Express. For more information you can definitely go through the official express documentation &lt;a href="https://expressjs.com/" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/em&gt;&lt;/p&gt;

</description>
    </item>
    <item>
      <title>Sequelize ORM with NodeJS</title>
      <dc:creator>Rutika Khaire</dc:creator>
      <pubDate>Fri, 30 Sep 2022 16:34:23 +0000</pubDate>
      <link>https://forem.com/rutikakhaire/sequelize-orm-with-nodejs-3ogf</link>
      <guid>https://forem.com/rutikakhaire/sequelize-orm-with-nodejs-3ogf</guid>
      <description>&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;Sequelize is a promise-based ORM (Object Relational Mapper) for NodeJS. We can use multiple databases with Sequelize, such as Oracle, Postgres, MySQL, MariaDB, SQLite, and SQL Server.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;What does ORM actually mean?&lt;/strong&gt;&lt;br&gt;
An Object Relational Mapper represents the database records as objects. It lets you create and manipulate data from a database using an object oriented paradigm.&lt;/p&gt;

&lt;p&gt;So, using Sequelize, you can perform DML operations like &lt;strong&gt;SELECT, INSERT, UPDATE, DELETE&lt;/strong&gt;, etc. using class methods. You can also define relationships between your database tables using class methods like &lt;strong&gt;hasOne()&lt;/strong&gt;, &lt;strong&gt;belongsTo()&lt;/strong&gt;, and &lt;strong&gt;hasMany()&lt;/strong&gt;, as sketched below.&lt;/p&gt;
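&lt;p&gt;A minimal sketch of such associations (the School model here is hypothetical, purely for illustration):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// One school has many students; each student belongs to one school.
// Sequelize adds the foreign key (schoolId) to the student table.
School.hasMany(Student);
Student.belongsTo(School);

// Later you can query through the association, e.g.:
// school.getStudents() returns all students of that school
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;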

&lt;p&gt;So, now let's get started.&lt;/p&gt;


&lt;h2&gt;
  
  
  Create a NodeJS application
&lt;/h2&gt;

&lt;p&gt;Create a new folder at your desired location and initialize it as your Node.js app using the below command&lt;/p&gt;

&lt;p&gt;&lt;code&gt;npm init&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Keep pressing the Enter key after adding the required information, and your Node.js app is ready.&lt;/p&gt;

&lt;p&gt;Now install all the required dependencies using the below command&lt;/p&gt;

&lt;p&gt;&lt;code&gt;npm install express sequelize mysql2 cors --save&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;The package.json file will look like below after you have all the dependencies successfully installed.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F38bxxdf1nh7qy05uj3rg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F38bxxdf1nh7qy05uj3rg.png" alt=" " width="800" height="431"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The next step is to create a new Express web server. Add a server.js file at the root of your folder and add the below code.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const cors = require("cors");
const express = require("express");

const app = express();

var corsOptions = {
  origin: "http://localhost:8081"
};

app.use(cors(corsOptions));

// parse requests of content-type - application/json
app.use(express.json());

// parse requests of content-type - application/x-www-form-urlencoded
app.use(express.urlencoded({ extended: true }));

// simple route
app.get("/", (req, res) =&amp;gt; {
  res.json({ message: "Welcome to NodeJs App!!!" });
});

// set port, listen for requests
const PORT = process.env.PORT || 8080;
app.listen(PORT, () =&amp;gt; {
  console.log(`Server is up and running on port ${PORT}.`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Use the following command to execute the server&lt;/p&gt;

&lt;p&gt;&lt;code&gt;node server.js&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You will get the following message &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrpuglex1t3c059zumnw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fvrpuglex1t3c059zumnw.png" alt=" " width="677" height="169"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now, if you go to the browser and open &lt;a href="http://localhost:8080/" rel="noopener noreferrer"&gt;http://localhost:8080/&lt;/a&gt;, you can see the application is up and running&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuuhadm9qzjzhnwe4cnn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fuuuhadm9qzjzhnwe4cnn.png" alt=" " width="800" height="400"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Create a database&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can go to your database server and create a new database. In my case, I created it on Microsoft Azure. The creation of tables can be done with the help of Sequelize.&lt;/p&gt;

&lt;p&gt;The next step is to put all the database configurations in a file. So, I have created a config.js file as below.&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = {
    HOST: "localhost",
    USER: "root",
    PASSWORD: "",
    DB: "student_db",
    dialect: "mysql",
    pool: {             // pool configuration
      max: 5,           // maximum number of connections in pool
      min: 0,           // minimum number of connections in pool
      acquire: 30000,   // maximum time (ms) the pool tries to get a connection before throwing an error
      idle: 10000       // maximum time (ms) a connection can be idle before being released
    }
  };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Initialize Sequelize&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Create a new folder called models in the root directory and add a new file called index.js. Add below code there.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const dbConfig = require(“../config/config.js:);

const Sequelize = require(“Sequelize”);
const Sequelize = new Sequelize(dbCofig.DB, dbConfig.USER,
dbConfig.PASSWORD, {
    host: dbConfig.HOST,
    dialect: dbConfig.dialect,
    operationsAliases: false,
    pool: {
    max: dbConfig.pool.max,
    min: dbConfig.pool.min,
    acquire: dbConfig.pool.acquire,
    idle: dbConfig.pool.idle
    }
};
const db = {};

db.Sequelize = Sequelize;
db.sequelize = sequelize;

db.student= require(“./student.js”) (sequelize, Sequelize);

module.exports = db;
The user should not forget to summon the sync() method in the server.js.
const app = express();
app.use(....);

const db = require(“./models”);
db.sequelize.sync();
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;When you need to drop the existing tables and resynchronize the database, pass force: true as below:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;db.sequelize.sync({force: true}).then(() =&amp;gt; {

console.log(“Drop and resync db.”);

});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We now need to create a new model file, models/student.js&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = (sequelize, Sequelize) =&amp;gt; {
    const Student = sequelize.define("student", {
      name: {
        type: Sequelize.STRING
      },
      admission:{
        type:Sequelize.INTEGER
      },
      class: {
        type: Sequelize.INTEGER
      },
      city: {
        type: Sequelize.STRING
      }
    });

    return Student;
  };
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Creating Controller&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Below is the code for the controller, saved as controllers/student.js (the path referenced from the routes file later).&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;const db = require(“../models”);
// models path depends on your structure
const Student= db.student;

exports.create = (req, res) =&amp;gt; {
// Validating the request
if (!req.body.title) {
res.status(400).send ({
message: “Content can be placed here!”
});
return;
}

// Creating a Student
const student = {
name: req.body.name,
admission: req.body.admission,
class: req.body.class,
city: req.body.city
};

// Saving the Student in the database
Student .create(student). then(data =&amp;gt; {
res.send(data);
}) .catch(err =&amp;gt; {
res.status(500).send ({
Message:
err.message || “Some errors will occur when creating a student”
});
});
};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;Retrieving Data&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can use below code to retrieve data.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;exports.findAll = (req, res) =&amp;gt; {

  const name = req.query.name;
  var condition = name ? { name: { [Op.like]: `%${name}%` } } : null;

  Student.findAll({ where: condition })
    .then(data =&amp;gt; {
      res.send(data);
    })
    .catch(err =&amp;gt; {
      res.status(500).send({
        message:
          err.message || "Some error occurred while retrieving data."
      });
    });

};
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now that we are done adding the controller and model, we need to have a route defined in our application that will execute the controller. Let's go ahead and create a route.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Defining Route&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;You can create a new folder called &lt;em&gt;routes&lt;/em&gt; and add a new routes.js file in it. Add the below code in that file.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module.exports = app =&amp;gt; {
    const students = require("../controllers/student.js");

    var router = require("express").Router();

    // add new student
    router.post("/", students.create);

    // view all students
    router.get("/", students.findAll);
 };

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Now include the routes in the server.js file using the below code&lt;/p&gt;

&lt;p&gt;&lt;code&gt;require("./routes/routes.js")(app);&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;You can test the API by calling the routes in Postman, or with the quick script sketched below.&lt;/p&gt;
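&lt;p&gt;A minimal smoke test (my own sketch; it assumes Node 18+ for the built-in fetch, an ES module context for top-level await, and the /api/students mount shown in the routes file above):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;// Create a student, then list all students
const created = await fetch("http://localhost:8080/api/students", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ name: "Asha", admission: 101, class: 7, city: "Pune" })
});
console.log(await created.json());

const all = await fetch("http://localhost:8080/api/students");
console.log(await all.json());
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;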

&lt;p&gt;Thank you for reading. &lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
