<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: PeterMilovcik</title>
    <description>The latest articles on Forem by PeterMilovcik (@petermilovcik).</description>
    <link>https://forem.com/petermilovcik</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F211647%2F9baca065-4f35-4fda-b33e-d2a7da1707c9.jpg</url>
      <title>Forem: PeterMilovcik</title>
      <link>https://forem.com/petermilovcik</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/petermilovcik"/>
    <language>en</language>
    <item>
      <title>Create a Domain Driven Design custom agent for GitHub Copilot</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Mon, 15 Dec 2025 08:03:44 +0000</pubDate>
      <link>https://forem.com/petermilovcik/create-a-domain-driven-design-custom-agent-for-github-copilot-32gi</link>
      <guid>https://forem.com/petermilovcik/create-a-domain-driven-design-custom-agent-for-github-copilot-32gi</guid>
      <description>&lt;p&gt;This tutorial shows how you create and use a reusable GitHub Copilot custom agent focused on Domain Driven Design for .NET.&lt;br&gt;
You end with one small agent file.&lt;br&gt;
You reuse the agent across repositories.&lt;br&gt;
You guide Copilot toward clean DDD decisions.&lt;/p&gt;


&lt;h2&gt;
  
  
  Why use a DDD custom agent
&lt;/h2&gt;

&lt;p&gt;Copilot follows context.&lt;br&gt;
Without guidance, design quality drifts.&lt;/p&gt;

&lt;p&gt;A custom agent gives Copilot a stable role:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Enforces DDD boundaries&lt;/li&gt;
&lt;li&gt;Protects domain purity&lt;/li&gt;
&lt;li&gt;Pushes behavior-driven models&lt;/li&gt;
&lt;li&gt;Promotes test-first thinking&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You stop repeating architectural rules.&lt;br&gt;
You focus on business problems.&lt;/p&gt;


&lt;h2&gt;
  
  
  Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Visual Studio Code 1.106 or newer&lt;/li&gt;
&lt;li&gt;GitHub Copilot enabled&lt;/li&gt;
&lt;li&gt;GitHub Copilot Chat enabled&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Custom agents ship with recent VS Code releases.&lt;/p&gt;


&lt;h2&gt;
  
  
  What this agent enforces
&lt;/h2&gt;

&lt;p&gt;The agent acts as a DDD expert for .NET projects.&lt;/p&gt;

&lt;p&gt;Core principles:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Clear bounded contexts&lt;/li&gt;
&lt;li&gt;Ubiquitous language aligned with code&lt;/li&gt;
&lt;li&gt;Pure Domain layer&lt;/li&gt;
&lt;li&gt;Explicit Application layer orchestration&lt;/li&gt;
&lt;li&gt;Infrastructure isolated behind ports&lt;/li&gt;
&lt;li&gt;Strong domain and integration tests&lt;/li&gt;
&lt;/ul&gt;
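&lt;p&gt;As a hypothetical C# sketch of these principles, a small value object keeps behavior and its invariant inside the Domain layer, with no framework dependencies (the &lt;code&gt;Money&lt;/code&gt; type and its members are illustrative, not part of the agent file):&lt;/p&gt;

```csharp
using System;

// Illustrative value object: immutable, compared by value, free of infrastructure dependencies.
public sealed record Money(decimal Amount, string Currency)
{
    public Money Add(Money other)
    {
        // The invariant lives inside the domain type, not in callers.
        if (Currency != other.Currency)
            throw new InvalidOperationException("Cannot add money in different currencies.");
        return new Money(Amount + other.Amount, Currency);
    }
}
```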


&lt;h2&gt;
  
  
  Step 1: Create a custom agent
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open VS Code&lt;/li&gt;
&lt;li&gt;Open GitHub Copilot Chat&lt;/li&gt;
&lt;li&gt;Open the agents dropdown&lt;/li&gt;
&lt;li&gt;Select Configure Custom Agents&lt;/li&gt;
&lt;li&gt;Select Create new custom agent&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You now choose a location.&lt;/p&gt;
&lt;h3&gt;
  
  
  Location options
&lt;/h3&gt;

&lt;p&gt;User profile&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Reuse across all repositories&lt;/li&gt;
&lt;li&gt;Best choice for personal workflows&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Workspace&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Stored inside repository&lt;/li&gt;
&lt;li&gt;Best choice for team sharing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For team usage, select workspace.&lt;/p&gt;


&lt;h2&gt;
  
  
  Step 2: Add the agent file to the repository
&lt;/h2&gt;

&lt;p&gt;Create this folder if missing:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;.github/agents
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ddd.agent.md
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  Step 3: Paste the DDD agent definition
&lt;/h2&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;name&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;DDD&lt;/span&gt;
&lt;span class="na"&gt;description&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;Domain-Driven Design (DDD) expert for .NET&lt;/span&gt;
&lt;span class="na"&gt;tools&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="pi"&gt;[&lt;/span&gt;&lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;edit'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;runNotebooks'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;search'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;new'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;runCommands'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;runTasks'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;microsoft.docs.mcp/*'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;usages'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;vscodeAPI'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;problems'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;changes'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;testFailure'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;openSimpleBrowser'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;fetch'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;githubRepo'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span 
class="s"&gt;extensions'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;todos'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;runSubagent'&lt;/span&gt;&lt;span class="pi"&gt;,&lt;/span&gt; &lt;span class="s1"&gt;'&lt;/span&gt;&lt;span class="s"&gt;runTests'&lt;/span&gt;&lt;span class="pi"&gt;]&lt;/span&gt;
&lt;span class="nn"&gt;---&lt;/span&gt;
You are a Domain-Driven Design expert for .NET (C#). Optimize for correctness and maintainability.

Non-negotiables
&lt;span class="p"&gt;-&lt;/span&gt; Identify bounded context and ubiquitous language from the repo. Ask if unclear.
&lt;span class="p"&gt;-&lt;/span&gt; Keep Domain pure. No infrastructure, UI, transport, persistence, logging, or framework dependencies in Domain code.
&lt;span class="p"&gt;-&lt;/span&gt; Model behavior first. Entities (identity), Value Objects (immutable), Aggregates (consistency boundary), Domain Events (facts).
&lt;span class="p"&gt;-&lt;/span&gt; Only the aggregate root is referenced from outside. Only mutate aggregate state via root methods which enforce invariants.
&lt;span class="p"&gt;-&lt;/span&gt; Define application use cases in an Application layer. Orchestrate workflows, authorization, transactions, and coordination there.
&lt;span class="p"&gt;-&lt;/span&gt; Define ports as interfaces near the Domain or Application boundary. Implement adapters in an Infrastructure layer.
&lt;span class="p"&gt;-&lt;/span&gt; Persistence concerns live outside Domain. Use mapping or configuration code in Infrastructure to keep Domain persistence-ignorant.
&lt;span class="p"&gt;-&lt;/span&gt; Testing: domain unit tests for invariants and behaviors, integration tests for adapters and persistence mappings, run tests after edits.

Workflow
1) Read relevant code and constraints.
2) Propose a short plan.
3) Implement with smallest safe change-set.
4) Add or update tests.
5) Run tests and fix failures.

Response format
&lt;span class="p"&gt;-&lt;/span&gt; Questions or Assumptions
&lt;span class="p"&gt;-&lt;/span&gt; Plan
&lt;span class="p"&gt;-&lt;/span&gt; Changed files
&lt;span class="p"&gt;-&lt;/span&gt; Tests to run
&lt;span class="p"&gt;-&lt;/span&gt; Notes or Risks
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Save the file.&lt;br&gt;
Commit the file if using a shared repository.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 4: Activate the agent
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Open GitHub Copilot Chat&lt;/li&gt;
&lt;li&gt;Open the agents dropdown&lt;/li&gt;
&lt;li&gt;Select &lt;strong&gt;DDD&lt;/strong&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Copilot now operates under strict DDD rules.&lt;/p&gt;




&lt;h2&gt;
  
  
  Step 5: Work with the agent
&lt;/h2&gt;

&lt;p&gt;Write requests in domain language.&lt;br&gt;
Avoid technical framing in prompts.&lt;/p&gt;

&lt;h3&gt;
  
  
  Example 1: Aggregate design
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Introduce an Order aggregate.
Move invariants into named methods.
Add domain unit tests.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Expected outcome:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Aggregate root creation&lt;/li&gt;
&lt;li&gt;Invariants enforced inside methods&lt;/li&gt;
&lt;li&gt;Tests validating business rules&lt;/li&gt;
&lt;/ul&gt;
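&lt;p&gt;A hypothetical sketch of such an aggregate (names and invariants are assumptions for illustration, not output from the agent):&lt;/p&gt;

```csharp
using System;

// Illustrative Order aggregate root: state changes only through methods that enforce invariants.
public sealed class Order
{
    public Guid Id { get; } = Guid.NewGuid();
    public bool Submitted { get; private set; }
    public int ItemCount { get; private set; }

    public void AddItem()
    {
        if (Submitted)
            throw new InvalidOperationException("Cannot add items to a submitted order.");
        ItemCount += 1;
    }

    public void Submit()
    {
        // A named method carries the business rule an unguarded setter would hide.
        if (ItemCount == 0)
            throw new InvalidOperationException("An empty order cannot be submitted.");
        Submitted = true;
    }
}
```

&lt;p&gt;A matching domain unit test then asserts that &lt;code&gt;Submit&lt;/code&gt; throws for an empty order.&lt;/p&gt;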




&lt;h3&gt;
  
  
  Example 2: Bounded context split
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Identify bounded contexts for Billing and Shipping.
Propose folder structure.
Hide cross context calls behind ports.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Expected outcome:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Context boundaries&lt;/li&gt;
&lt;li&gt;Clear namespaces&lt;/li&gt;
&lt;li&gt;Explicit interfaces between contexts&lt;/li&gt;
&lt;/ul&gt;
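&lt;p&gt;A hypothetical sketch of a port between the two contexts (all names are assumptions for illustration):&lt;/p&gt;

```csharp
using System;

// Illustrative port owned by the Billing context; Shipping stays behind an adapter.
namespace Billing.Ports
{
    public interface IShippingGateway
    {
        void RequestShipment(Guid orderId);
    }
}

// The adapter lives in Infrastructure and translates into the Shipping context's own API.
namespace Infrastructure.ShippingAdapters
{
    public sealed class ShippingGateway : Billing.Ports.IShippingGateway
    {
        public void RequestShipment(Guid orderId)
        {
            // Call into the Shipping context here; Billing never references it directly.
        }
    }
}
```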




&lt;h3&gt;
  
  
  Example 3: Domain events
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Add OrderSubmitted domain event.
Raise event from aggregate root.
Add application handler and tests.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Expected outcome:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Event raised inside domain logic&lt;/li&gt;
&lt;li&gt;Handler placed in application layer&lt;/li&gt;
&lt;li&gt;Tests covering behavior&lt;/li&gt;
&lt;/ul&gt;
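&lt;p&gt;A hypothetical sketch of the event and the raising aggregate (names and the dispatch mechanism are assumptions for illustration):&lt;/p&gt;

```csharp
using System;

// Illustrative domain event: an immutable fact recorded by the aggregate root.
public sealed record OrderSubmitted(Guid OrderId, DateTime OccurredAtUtc);

public sealed class Order
{
    public Guid Id { get; } = Guid.NewGuid();
    public bool Submitted { get; private set; }
    public OrderSubmitted PendingEvent { get; private set; }

    public void Submit()
    {
        if (Submitted)
            throw new InvalidOperationException("Order is already submitted.");
        Submitted = true;
        // The Application layer handler collects and dispatches the event after the transaction.
        PendingEvent = new OrderSubmitted(Id, DateTime.UtcNow);
    }
}
```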




&lt;h2&gt;
  
  
  What changes after adoption
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Fewer anemic models&lt;/li&gt;
&lt;li&gt;Fewer leaking dependencies&lt;/li&gt;
&lt;li&gt;Smaller pull requests&lt;/li&gt;
&lt;li&gt;Cleaner tests&lt;/li&gt;
&lt;li&gt;Faster design reviews&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Copilot stops guessing architecture.&lt;br&gt;
Copilot follows rules.&lt;/p&gt;




&lt;h2&gt;
  
  
  Recommended usage pattern
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Select DDD agent before architectural changes&lt;/li&gt;
&lt;li&gt;Switch back to default agent for trivial edits&lt;/li&gt;
&lt;li&gt;Keep agent file stable&lt;/li&gt;
&lt;li&gt;Review changes through DDD lens&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  Next improvements
&lt;/h2&gt;

&lt;p&gt;Optional follow-ups:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Separate agent for persistence adapters&lt;/li&gt;
&lt;li&gt;Separate agent for testing strategy&lt;/li&gt;
&lt;li&gt;Team-wide instruction file for baseline rules&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The DDD agent stays small.&lt;br&gt;
The impact stays large.&lt;/p&gt;

</description>
      <category>githubcopilot</category>
      <category>productivity</category>
      <category>tutorial</category>
      <category>programming</category>
    </item>
    <item>
      <title>Reusable GitHub Copilot Prompt for Implementation Plans</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Sat, 06 Dec 2025 07:47:41 +0000</pubDate>
      <link>https://forem.com/petermilovcik/reusable-github-copilot-prompt-for-implementation-plans-2h5m</link>
      <guid>https://forem.com/petermilovcik/reusable-github-copilot-prompt-for-implementation-plans-2h5m</guid>
      <description>&lt;p&gt;Large edits from GitHub Copilot Agent often feel risky.&lt;br&gt;
You ask for a change and the agent starts to rewrite files.&lt;br&gt;
Only later do you notice hidden side effects.&lt;/p&gt;

&lt;p&gt;A small reusable prompt file reduces this risk.&lt;br&gt;
You ask for a plan first.&lt;br&gt;
You approve the plan.&lt;br&gt;
Then the agent edits code.&lt;/p&gt;

&lt;p&gt;This article walks through one prompt file named &lt;code&gt;implementation_plan.prompt.md&lt;/code&gt; that supports this workflow.&lt;/p&gt;
&lt;h2&gt;
  
  
  The prompt file
&lt;/h2&gt;

&lt;p&gt;Create the folder &lt;code&gt;.github/prompts&lt;/code&gt; in your repository if it does not exist yet.&lt;/p&gt;

&lt;p&gt;Add file:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;.github/prompts/implementation_plan.prompt.md&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Use this content:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;---
agent: agent
---
Before making any changes, prepare a detailed step by step implementation plan with numbered actions, files to edit, reasoning for each change, identified risks, and example code snippets for key parts. Ask clarifying questions if anything in my request is unclear. Present this plan for review and wait for my approval before starting.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The front matter selects the agent that runs this prompt.&lt;br&gt;
The text below it describes the response format and behavior.&lt;/p&gt;
&lt;h2&gt;
  
  
  What this prompt enforces
&lt;/h2&gt;

&lt;p&gt;The instruction asks for several concrete elements.&lt;/p&gt;
&lt;h3&gt;
  
  
  Numbered actions
&lt;/h3&gt;

&lt;p&gt;You receive ordered steps.&lt;br&gt;
You see how the agent plans to approach the work.&lt;br&gt;
You choose where to stop or split the plan.&lt;/p&gt;
&lt;h3&gt;
  
  
  Files to edit
&lt;/h3&gt;

&lt;p&gt;The response lists target files.&lt;br&gt;
You see scope before any edit.&lt;br&gt;
You spot sensitive areas early.&lt;/p&gt;
&lt;h3&gt;
  
  
  Reasoning for each change
&lt;/h3&gt;

&lt;p&gt;The agent explains why each step exists.&lt;br&gt;
You compare reasoning with your own intent.&lt;br&gt;
You catch misunderstandings before they reach code.&lt;/p&gt;
&lt;h3&gt;
  
  
  Identified risks
&lt;/h3&gt;

&lt;p&gt;The agent points out risky parts of the plan.&lt;br&gt;
For example, shared components or public APIs.&lt;br&gt;
You add constraints or ask for a safer approach.&lt;/p&gt;
&lt;h3&gt;
  
  
  Example code snippets
&lt;/h3&gt;

&lt;p&gt;The response includes small examples for key parts.&lt;br&gt;
You review structure and style early.&lt;br&gt;
You align patterns with project standards.&lt;/p&gt;
&lt;h3&gt;
  
  
  Clarifying questions
&lt;/h3&gt;

&lt;p&gt;The prompt asks the agent to raise questions when something feels unclear.&lt;br&gt;
This steers the session toward a short dialogue instead of blind action.&lt;/p&gt;
&lt;h3&gt;
  
  
  Approval gate
&lt;/h3&gt;

&lt;p&gt;The last sentence requests a pause before edits.&lt;br&gt;
The agent proposes a plan.&lt;br&gt;
You respond with approval or changes.&lt;br&gt;
Only then does the next step start.&lt;/p&gt;
&lt;h2&gt;
  
  
  Using the prompt in GitHub Copilot Chat
&lt;/h2&gt;

&lt;p&gt;Work with this prompt in a simple way.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Open GitHub Copilot Chat.&lt;/li&gt;
&lt;li&gt;Write your request and list the changes you want, with any &lt;code&gt;#file:&lt;/code&gt; references you prefer.&lt;/li&gt;
&lt;li&gt;Add &lt;code&gt;/implementation_plan&lt;/code&gt; at the end of your message on a new line.&lt;/li&gt;
&lt;li&gt;Send the message.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;I want to extract discount logic from OrderService into a separate domain service.
Keep behavior stable for all callers.
Focus on smaller methods and clearer error handling.

/implementation_plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;GitHub Copilot Agent reads your description together with the reusable prompt.&lt;br&gt;
The response follows the structure you defined in &lt;code&gt;implementation_plan.prompt.md&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;You then review the plan.&lt;br&gt;
You ask for adjustments or extra focus where needed.&lt;br&gt;
Once you feel confident, you ask the agent to start making concrete edits in a follow-up message.&lt;/p&gt;
&lt;h2&gt;
  
  
  Refactoring example
&lt;/h2&gt;

&lt;p&gt;A short scenario from daily work.&lt;/p&gt;

&lt;p&gt;You see complex method logic in a service class.&lt;br&gt;
You open Copilot Chat.&lt;br&gt;
You send:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Refactor the main billing method in BillingService into smaller methods.
Keep business behavior unchanged.
Point out places where error handling needs improvement.

/implementation_plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The agent replies with a plan such as:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Steps for scanning existing logic&lt;/li&gt;
&lt;li&gt;Plan for extracting helper methods&lt;/li&gt;
&lt;li&gt;List of new private methods or new classes&lt;/li&gt;
&lt;li&gt;Risk notes around behavior changes and existing integrations&lt;/li&gt;
&lt;li&gt;Sample code for the new structure&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You respond with comments like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keep the public interface unchanged&lt;/li&gt;
&lt;li&gt;Avoid changes in logging behavior&lt;/li&gt;
&lt;li&gt;Add a step for updating related unit tests later&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The agent updates the plan and sends a new version.&lt;br&gt;
Only when you confirm does the session move on to actual edits.&lt;/p&gt;

&lt;h2&gt;
  
  
  Team benefits
&lt;/h2&gt;

&lt;p&gt;This pattern improves collaboration with GitHub Copilot Agent.&lt;/p&gt;

&lt;h3&gt;
  
  
  Shared expectations
&lt;/h3&gt;

&lt;p&gt;Each developer on the team uses the same prompt file.&lt;br&gt;
New team members learn a stable way to work with AI assistance.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lower review overhead
&lt;/h3&gt;

&lt;p&gt;Pull requests show changes which follow an agreed plan.&lt;br&gt;
Reviewers read the original plan in chat history when they need context.&lt;/p&gt;

&lt;h3&gt;
  
  
  Less accidental scope creep
&lt;/h3&gt;

&lt;p&gt;The plan defines scope clearly.&lt;br&gt;
You notice when the agent tries to do too much in one go.&lt;/p&gt;

&lt;h3&gt;
  
  
  Clear structure for larger changes
&lt;/h3&gt;

&lt;p&gt;Complex refactors break into smaller, named steps.&lt;br&gt;
You run these steps in order and keep control.&lt;/p&gt;

&lt;h2&gt;
  
  
  Variations
&lt;/h2&gt;

&lt;p&gt;Adapt this idea for different kinds of work.&lt;/p&gt;

&lt;p&gt;For destructive tasks such as file removal, tighten wording around confirmation.&lt;br&gt;
For experimental branches, relax constraints.&lt;br&gt;
For production critical services, add a step for rollback ideas.&lt;/p&gt;

&lt;p&gt;Each variation lives in its own prompt file under &lt;code&gt;.github/prompts&lt;/code&gt;, for example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;security_review.prompt.md&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;performance_review.prompt.md&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;migration_plan.prompt.md&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Every file follows the same pattern.&lt;br&gt;
Front matter at the top.&lt;br&gt;
Concise instruction in the body.&lt;/p&gt;

&lt;h2&gt;
  
  
  Start using the prompt
&lt;/h2&gt;

&lt;p&gt;Add &lt;code&gt;implementation_plan.prompt.md&lt;/code&gt; to your repository.&lt;br&gt;
Use this helper for the next feature or refactor.&lt;br&gt;
Observe how your sessions with GitHub Copilot Agent change.&lt;/p&gt;

&lt;p&gt;You spend more time judging plans and less time fixing surprises.&lt;/p&gt;

</description>
      <category>githubcopilot</category>
      <category>programming</category>
      <category>ai</category>
      <category>productivity</category>
    </item>
    <item>
      <title>Reusable GitHub Copilot Prompt for Refactoring Opportunities</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Tue, 02 Dec 2025 10:57:12 +0000</pubDate>
      <link>https://forem.com/petermilovcik/reusable-github-copilot-prompt-for-refactoring-opportunities-2cdi</link>
      <guid>https://forem.com/petermilovcik/reusable-github-copilot-prompt-for-refactoring-opportunities-2cdi</guid>
      <description>&lt;p&gt;Refactoring becomes easier when you receive fast feedback about improvement options in code.&lt;/p&gt;

&lt;p&gt;GitHub Copilot Chat already helps with this, but writing a long prompt each time feels wasteful.&lt;/p&gt;

&lt;p&gt;A small prompt instruction file in your repository removes friction.&lt;/p&gt;

&lt;p&gt;This article walks through one concrete example.&lt;/p&gt;

&lt;h2&gt;
  
  
  What is a prompt instruction file
&lt;/h2&gt;

&lt;p&gt;GitHub Copilot Chat supports reusable prompts stored under a &lt;code&gt;.github/prompts&lt;/code&gt; folder in your repository.&lt;/p&gt;

&lt;p&gt;Each file holds instructions for the agent.&lt;/p&gt;

&lt;p&gt;You reference such a file from chat with a slash command instead of copying the full text again.&lt;/p&gt;

&lt;p&gt;This section focuses on one refactoring helper prompt.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Create the prompt file
&lt;/h2&gt;

&lt;p&gt;Create folder &lt;code&gt;.github/prompts&lt;/code&gt; in your repository if it does not exist yet.&lt;/p&gt;

&lt;p&gt;Add a new markdown file named &lt;code&gt;refactoring_opportunities.prompt.md&lt;/code&gt; inside this folder.&lt;/p&gt;

&lt;p&gt;Use the following content.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight markdown"&gt;&lt;code&gt;&lt;span class="nn"&gt;---&lt;/span&gt;
&lt;span class="na"&gt;agent&lt;/span&gt;&lt;span class="pi"&gt;:&lt;/span&gt; &lt;span class="s"&gt;agent&lt;/span&gt;
&lt;span class="nn"&gt;---&lt;/span&gt;

Check for potential refactoring opportunities in the following code. If you find any, suggest specific improvements along with code examples. Focus on enhancing readability, maintainability, and performance. If the code is already optimal, simply state that no changes are necessary.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A short explanation:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The front matter at the top selects which agent runs this prompt.&lt;/li&gt;
&lt;li&gt;The text below tells Copilot Chat what to look for and how to respond.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 2: Use the prompt from chat
&lt;/h2&gt;

&lt;p&gt;Open GitHub Copilot Chat in your editor.&lt;/p&gt;

&lt;p&gt;Use this command:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/refactoring_opportunities #file:&amp;lt;yourfile&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace &lt;code&gt;&amp;lt;yourfile&amp;gt;&lt;/code&gt; with the file name you want to inspect, for example:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/refactoring_opportunities #file:MyClass.cs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Copilot receives code from this file together with prompt text from &lt;code&gt;refactoring_opportunities.prompt.md&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;The agent then reviews the code and responds with suggested refactorings or a short confirmation when nothing needs to change.&lt;/p&gt;

&lt;h2&gt;
  
  
  What this prompt asks for
&lt;/h2&gt;

&lt;p&gt;The instruction focuses on three aspects.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Readability&lt;br&gt;
Names, structure, and formatting should help new readers understand code quickly.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Maintainability&lt;br&gt;
Smaller functions, clear responsibilities, and fewer hidden dependencies reduce effort for future changes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Performance&lt;br&gt;
The agent points out unnecessary allocations, redundant work, and obvious algorithm problems.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
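&lt;p&gt;As a hypothetical example of the kind of readability suggestion this prompt surfaces (the classes are invented for illustration):&lt;/p&gt;

```csharp
// Minimal type used by both versions below.
public sealed class Order
{
    public bool Submitted { get; set; }
}

public static class OrderFormatter
{
    // Before: nested conditions hide the main path.
    public static string DescribeBefore(Order order)
    {
        if (order != null)
        {
            if (order.Submitted)
            {
                return "submitted";
            }
            return "open";
        }
        return "missing";
    }

    // After: a guard clause first, then a single expression.
    public static string DescribeAfter(Order order)
    {
        if (order == null) return "missing";
        return order.Submitted ? "submitted" : "open";
    }
}
```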

&lt;p&gt;The last sentence in the prompt keeps responses grounded.&lt;/p&gt;

&lt;p&gt;When no clear improvement appears, Copilot replies with a short message instead of forcing weak suggestions.&lt;/p&gt;

&lt;p&gt;Noise in your review process stays low.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why store the prompt in your repository
&lt;/h2&gt;

&lt;p&gt;Keeping prompt files inside version control brings a few benefits.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Shared language for the team&lt;br&gt;
Everyone calls the same prompt name and receives similar guidance.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;History&lt;br&gt;
You refine wording through pull requests and review changes together.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Portability&lt;br&gt;
Prompt files travel with the codebase, so new team members gain useful helpers from day one.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A prompt becomes part of your engineering practice, not a snippet lost in chat history.&lt;/p&gt;

&lt;h2&gt;
  
  
  Ideas for extensions
&lt;/h2&gt;

&lt;p&gt;Once &lt;code&gt;refactoring_opportunities.prompt.md&lt;/code&gt; works, expand with more prompt files in the same folder.&lt;/p&gt;

&lt;p&gt;Some examples:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;security_review.prompt.md&lt;/code&gt; for quick checks of obvious security problems.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;test_gaps.prompt.md&lt;/code&gt; for simple suggestions where tests might be missing.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;api_consistency.prompt.md&lt;/code&gt; for checks of naming and behavior across endpoints and clients.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Each file follows the same pattern.&lt;/p&gt;

&lt;p&gt;The front matter selects an agent; the body text explains the task.&lt;/p&gt;

&lt;h2&gt;
  
  
  Summary
&lt;/h2&gt;

&lt;p&gt;A small prompt instruction file turns GitHub Copilot Chat into a repeatable refactoring assistant.&lt;/p&gt;

&lt;p&gt;You spend less time writing prompts and more time judging suggestions.&lt;/p&gt;

&lt;p&gt;Start with &lt;code&gt;refactoring_opportunities.prompt.md&lt;/code&gt; under &lt;code&gt;.github/prompts&lt;/code&gt;, then call:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;/refactoring_opportunities #file:YourFile.cs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;whenever you want a fresh look at a piece of code.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>githubcopilot</category>
      <category>refactoring</category>
      <category>programming</category>
    </item>
    <item>
      <title>One GitHub Copilot Agent Prompt for Safer Changes</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Tue, 02 Dec 2025 07:51:05 +0000</pubDate>
      <link>https://forem.com/petermilovcik/one-github-copilot-agent-prompt-for-safer-changes-4f88</link>
      <guid>https://forem.com/petermilovcik/one-github-copilot-agent-prompt-for-safer-changes-4f88</guid>
      <description>&lt;p&gt;Developers like automation. GitHub Copilot Agent writes code, runs tests, edits files.&lt;/p&gt;

&lt;p&gt;This speed hides a risk. The agent starts changing code before you see a plan.&lt;/p&gt;

&lt;p&gt;You need a brake. A short sentence in your prompt gives you control.&lt;/p&gt;

&lt;h2&gt;
  
  
  The core idea
&lt;/h2&gt;

&lt;p&gt;Describe the changes you want.&lt;/p&gt;

&lt;p&gt;Then finish your prompt with this line:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Before you do any changes, show me your detailed step by step implementation plan for approval.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;You ask the agent for a plan first. No edits yet. Only a clear list of steps.&lt;/p&gt;

&lt;p&gt;Afterwards you review the plan. You confirm, refine, or reject.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why this helps
&lt;/h2&gt;

&lt;p&gt;A plan first pattern improves the flow.&lt;/p&gt;

&lt;p&gt;You gain:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Clear intent before edits touch your code&lt;/li&gt;
&lt;li&gt;Easier review, since you see a list of steps, not a block of diff&lt;/li&gt;
&lt;li&gt;Lower risk of half-finished refactors&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The agent shifts from "fire and forget" to "propose, then act".&lt;/p&gt;

&lt;p&gt;You stay in charge of scope and direction.&lt;/p&gt;

&lt;h2&gt;
  
  
  A small example
&lt;/h2&gt;

&lt;p&gt;Prompt:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;I want to extract logging concerns from this C# class into a separate service.

Steps:
1. Identify all logging calls.
2. Introduce an IAppLogger interface.
3. Inject IAppLogger into the constructor.
4. Replace direct logging calls with the new abstraction.

Before you do any changes, show me your detailed step by step implementation plan for approval.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;A good plan from the agent might look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Step 1: Scan the class and list all logging calls, grouped by log level.
Step 2: Propose the IAppLogger interface signature which covers the current usage.
Step 3: Add the new interface and a concrete implementation in a Logging folder.
Step 4: Update the target class constructor to accept IAppLogger through dependency injection.
Step 5: Replace direct logging calls with calls to IAppLogger.
Step 6: Update the composition root to register the concrete logger.
Step 7: Propose unit test updates for the new abstraction.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
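&lt;p&gt;Steps 2 to 4 of such a plan might sketch a shape like this (the interface members are assumptions; the real signature follows your current logging usage):&lt;/p&gt;

```csharp
using System;

// Illustrative target shape for the refactor; members are assumptions.
public interface IAppLogger
{
    void Info(string message);
    void Error(string message, Exception exception);
}

public sealed class BillingService
{
    private readonly IAppLogger _logger;

    public BillingService(IAppLogger logger)
    {
        _logger = logger;
    }

    public void Charge(decimal amount)
    {
        // Direct logging calls get replaced with the abstraction.
        _logger.Info($"Charging {amount}.");
    }
}
```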



&lt;p&gt;You read the plan. You notice missing points or risks.&lt;/p&gt;

&lt;p&gt;For example:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No mention of structured logging&lt;/li&gt;
&lt;li&gt;No mention of log context&lt;/li&gt;
&lt;li&gt;No mention of cross-cutting concerns in other classes&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You respond:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Good start.

Extend the plan with:
- Structured logging support
- Log context (correlation id, user id)
- Follow up steps for other classes using logging in this project

Then show me the updated plan. Do not change any code yet.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Only when the plan looks solid do you ask the agent to apply it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Prompt variants
&lt;/h2&gt;

&lt;p&gt;You tune the line for different work modes.&lt;/p&gt;

&lt;p&gt;For safety and review:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Before you do any changes, show me your detailed step by step implementation plan for approval. Wait for my confirmation before you start.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For pair programming:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Before you do any changes, propose a detailed step by step implementation plan. Ask me questions if anything in my request is unclear.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;For larger refactors:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Before you do any changes, outline a detailed step by step implementation plan with milestones. Mark risky steps explicitly.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice the common structure.&lt;/p&gt;

&lt;p&gt;You ask for:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A detailed plan&lt;/li&gt;
&lt;li&gt;Steps in order&lt;/li&gt;
&lt;li&gt;A pause before edits&lt;/li&gt;
&lt;li&gt;Space for questions&lt;/li&gt;
&lt;/ul&gt;
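&lt;p&gt;Combined into one reusable line, those four elements might read:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Before you do any changes, propose a detailed step by step implementation plan with numbered steps. Ask me questions if anything is unclear, and wait for my approval before you edit any code.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;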

&lt;h2&gt;
  
  
  Tips for daily use
&lt;/h2&gt;

&lt;p&gt;A few habits help you get value from this pattern.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Keep your change request short and focused&lt;/li&gt;
&lt;li&gt;Ask for numbered steps&lt;/li&gt;
&lt;li&gt;Ask for risk highlights for bigger changes&lt;/li&gt;
&lt;li&gt;Ask for test impact in the plan&lt;/li&gt;
&lt;li&gt;Store your favorite prompt variant in a snippet or note&lt;/li&gt;
&lt;li&gt;Use the same pattern across your team so prompts stay consistent&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  When this pattern shines
&lt;/h2&gt;

&lt;p&gt;This simple line in your prompt works best when:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You refactor legacy code&lt;/li&gt;
&lt;li&gt;You touch shared libraries&lt;/li&gt;
&lt;li&gt;You change public APIs&lt;/li&gt;
&lt;li&gt;You work in a codebase you do not know well&lt;/li&gt;
&lt;li&gt;You review plans with teammates in chat or pull requests&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In all those cases you gain a shared view of the work before any edit happens.&lt;/p&gt;

&lt;p&gt;You see scope. You see risk. You see test impact.&lt;/p&gt;

&lt;p&gt;Then you decide how far the agent should go.&lt;/p&gt;

&lt;p&gt;You guide the agent. Not the other way around.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>programming</category>
      <category>productivity</category>
      <category>githubcopilot</category>
    </item>
    <item>
      <title>Integrating OpenAI's Retrieval-Augmented Generation in .NET Applications</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Mon, 10 Mar 2025 10:55:01 +0000</pubDate>
      <link>https://forem.com/petermilovcik/building-a-net-console-app-for-document-search-rag-with-openai-embeddings-5ehh</link>
      <guid>https://forem.com/petermilovcik/building-a-net-console-app-for-document-search-rag-with-openai-embeddings-5ehh</guid>
      <description>&lt;p&gt;Integrating OpenAI's Retrieval-Augmented Generation (RAG) in a .NET application involves several steps, including setting up a local embedding vector database, processing PDF documents using PdfPig, and leveraging the OpenAI SDK along with Microsoft.Extensions.AI for unified AI abstractions. Here’s a step-by-step guide to achieve this integration:&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Setup Environment
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Prerequisites:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;.NET 6 or higher&lt;/strong&gt; installed.&lt;/li&gt;
&lt;li&gt;A &lt;strong&gt;C# development environment&lt;/strong&gt; (Visual Studio, VS Code, or .NET CLI).&lt;/li&gt;
&lt;li&gt;An &lt;strong&gt;OpenAI API Key&lt;/strong&gt; for accessing OpenAI services.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;PdfPig&lt;/strong&gt; for PDF text extraction.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Microsoft.Extensions.AI&lt;/strong&gt; for unified AI abstractions.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Install Required Packages:
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet add package OpenAI
dotnet add package PdfPig
dotnet add package Microsoft.Extensions.AI
dotnet add package Microsoft.Extensions.AI.OpenAI
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 2: Extract Text from PDFs Using PdfPig
&lt;/h2&gt;

&lt;p&gt;Extract text from PDFs to create chunks for embedding generation.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;PdfPig&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Load PDF&lt;/span&gt;
&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;var&lt;/span&gt; &lt;span class="n"&gt;pdfDocument&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;PdfDocument&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Open&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"path/to/your/document.pdf"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Extract text&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Join&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"\n"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;pdfDocument&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetPages&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;Select&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;p&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;p&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetText&lt;/span&gt;&lt;span class="p"&gt;()));&lt;/span&gt;

&lt;span class="c1"&gt;// Split text into chunks (e.g., paragraphs)&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;chunks&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;new&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="s"&gt;"\n\n"&lt;/span&gt; &lt;span class="p"&gt;},&lt;/span&gt; &lt;span class="n"&gt;StringSplitOptions&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;RemoveEmptyEntries&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 3: Generate Embeddings Using OpenAI SDK
&lt;/h2&gt;

&lt;p&gt;Use the OpenAI SDK to generate embeddings for each chunk.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;OpenAI&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Initialize OpenAI client with API key&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;openAiClient&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;OpenAIClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;

&lt;span class="c1"&gt;// Generate embeddings for each chunk&lt;/span&gt;
&lt;span class="k"&gt;foreach&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;chunks&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;embedding&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;openAiClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Embeddings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GenerateAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"text-embedding-ada-002"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="c1"&gt;// Store the embedding in a local database or in-memory structure&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 4: Implement Local Embedding Vector Database
&lt;/h2&gt;

&lt;p&gt;Store the generated embeddings in a simple local database or in-memory structure like a dictionary.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;System.Collections.Generic&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// In-memory database example&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;embeddingDatabase&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;Dictionary&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="c1"&gt;// Store embeddings&lt;/span&gt;
&lt;span class="k"&gt;foreach&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;chunk&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;chunks&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;embedding&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;openAiClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Embeddings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GenerateAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"text-embedding-ada-002"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="n"&gt;embeddingDatabase&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Add&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;embedding&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Vector&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToArray&lt;/span&gt;&lt;span class="p"&gt;());&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 5: Implement Chat Loop with Microsoft.Extensions.AI
&lt;/h2&gt;

&lt;p&gt;Use Microsoft.Extensions.AI to create a chat loop that queries the local embedding database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;Microsoft.Extensions.AI&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="c1"&gt;// Initialize chat client&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;chatClient&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;OpenAIClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;)).&lt;/span&gt;&lt;span class="nf"&gt;AsChatClient&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="c1"&gt;// Chat loop&lt;/span&gt;
&lt;span class="k"&gt;while&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Enter your question: "&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ReadLine&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt; &lt;span class="p"&gt;==&lt;/span&gt; &lt;span class="s"&gt;"exit"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="k"&gt;break&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="c1"&gt;// Generate query embedding&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;queryEmbedding&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;openAiClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Embeddings&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GenerateAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;query&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"text-embedding-ada-002"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// Perform vector similarity search&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;similarities&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;embeddingDatabase&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Select&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;kvp&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;kvp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Similarity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;CosineSimilarity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;queryEmbedding&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Vector&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToArray&lt;/span&gt;&lt;span class="p"&gt;(),&lt;/span&gt; &lt;span class="n"&gt;kvp&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Value&lt;/span&gt;&lt;span class="p"&gt;)));&lt;/span&gt;

    &lt;span class="c1"&gt;// Get top matches&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;topMatches&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;similarities&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;OrderByDescending&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;s&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;s&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Similarity&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;Take&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;3&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

    &lt;span class="c1"&gt;// Display matches&lt;/span&gt;
    &lt;span class="k"&gt;foreach&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;match&lt;/span&gt; &lt;span class="k"&gt;in&lt;/span&gt; &lt;span class="n"&gt;topMatches&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;$"Match: &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Key&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;, Similarity: &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;match&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Similarity&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="n"&gt;F2&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="c1"&gt;// Cosine similarity function&lt;/span&gt;
&lt;span class="kt"&gt;float&lt;/span&gt; &lt;span class="nf"&gt;CosineSimilarity&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;float&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="n"&gt;vec1&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="kt"&gt;float&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="n"&gt;vec2&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;dotProduct&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;vec1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Zip&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;vec2&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;a&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;a&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;b&lt;/span&gt;&lt;span class="p"&gt;).&lt;/span&gt;&lt;span class="nf"&gt;Sum&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;magnitude1&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;MathF&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sqrt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;vec1&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;magnitude2&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;MathF&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sqrt&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;vec2&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Sum&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;x&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;dotProduct&lt;/span&gt; &lt;span class="p"&gt;/&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;magnitude1&lt;/span&gt; &lt;span class="p"&gt;*&lt;/span&gt; &lt;span class="n"&gt;magnitude2&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;
  
  
  Step 6: Run the Application
&lt;/h2&gt;

&lt;p&gt;Run the console application and interact with it by asking questions about the PDF documents.&lt;/p&gt;
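&lt;p&gt;For example, from the project folder (the environment variable name matches the code above; the export syntax assumes a bash-style shell):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;export OPENAI_API_KEY="sk-..."   # your OpenAI API key
dotnet run

# Type a question at the prompt; type "exit" to quit the loop
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;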

&lt;p&gt;This setup integrates OpenAI RAG with a local embedding vector database and uses PdfPig for PDF processing, all within a .NET environment enhanced by Microsoft.Extensions.AI for unified AI abstractions.&lt;/p&gt;

&lt;p&gt;References:&lt;br&gt;
[1] &lt;a href="https://dev.to/petermilovcik/building-a-net-console-app-for-document-search-rag-with-openai-embeddings-5ehh"&gt;https://dev.to/petermilovcik/building-a-net-console-app-for-document-search-rag-with-openai-embeddings-5ehh&lt;/a&gt;&lt;br&gt;
[2] &lt;a href="https://juldhais.net/retrieval-augmented-generation-rag-using-net-and-openai-api-9814d4d5051f" rel="noopener noreferrer"&gt;https://juldhais.net/retrieval-augmented-generation-rag-using-net-and-openai-api-9814d4d5051f&lt;/a&gt;&lt;br&gt;
[3] &lt;a href="https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/" rel="noopener noreferrer"&gt;https://devblogs.microsoft.com/dotnet/introducing-microsoft-extensions-ai-preview/&lt;/a&gt;&lt;br&gt;
[4] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/ai-extensions" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/ai-extensions&lt;/a&gt;&lt;br&gt;
[5] &lt;a href="https://www.confident-ai.com/blog/how-to-build-a-pdf-qa-chatbot-using-openai-and-chromadb" rel="noopener noreferrer"&gt;https://www.confident-ai.com/blog/how-to-build-a-pdf-qa-chatbot-using-openai-and-chromadb&lt;/a&gt;&lt;br&gt;
[6] &lt;a href="https://blog.gopenai.com/chat-with-pdf-rag-using-openai-4o-and-pinecone-1e6feb451642" rel="noopener noreferrer"&gt;https://blog.gopenai.com/chat-with-pdf-rag-using-openai-4o-and-pinecone-1e6feb451642&lt;/a&gt;&lt;br&gt;
[7] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/conceptual/vector-databases" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/conceptual/vector-databases&lt;/a&gt;&lt;br&gt;
[8] &lt;a href="https://dev.to/petermilovcik/implementing-rag-with-azure-openai-in-net-c-12c2"&gt;https://dev.to/petermilovcik/implementing-rag-with-azure-openai-in-net-c-12c2&lt;/a&gt;&lt;br&gt;
[9] &lt;a href="https://uglytoad.github.io/PdfPig/" rel="noopener noreferrer"&gt;https://uglytoad.github.io/PdfPig/&lt;/a&gt;&lt;br&gt;
[10] &lt;a href="https://dev.to/eliotjones/reading-a-pdf-in-c-on-net-core-43ef"&gt;https://dev.to/eliotjones/reading-a-pdf-in-c-on-net-core-43ef&lt;/a&gt;&lt;br&gt;
[11] &lt;a href="https://github.com/openai/openai-dotnet" rel="noopener noreferrer"&gt;https://github.com/openai/openai-dotnet&lt;/a&gt;&lt;br&gt;
[12] &lt;a href="https://github.com/UglyToad/PdfPig" rel="noopener noreferrer"&gt;https://github.com/UglyToad/PdfPig&lt;/a&gt;&lt;br&gt;
[13] &lt;a href="https://www.reddit.com/r/dotnet/comments/1ciph74/pdf_chunking_for_vector_embeddings_options/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/dotnet/comments/1ciph74/pdf_chunking_for_vector_embeddings_options/&lt;/a&gt;&lt;br&gt;
[14] &lt;a href="https://github.com/edilma/RAG-App-HackTogether" rel="noopener noreferrer"&gt;https://github.com/edilma/RAG-App-HackTogether&lt;/a&gt;&lt;br&gt;
[15] &lt;a href="https://help.openai.com/en/articles/8550641-assistants-api-v2-faq" rel="noopener noreferrer"&gt;https://help.openai.com/en/articles/8550641-assistants-api-v2-faq&lt;/a&gt;&lt;br&gt;
[16] &lt;a href="https://learn.microsoft.com/en-us/samples/azure/azure-sdk-for-net/azureprojects-samples/" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/samples/azure/azure-sdk-for-net/azureprojects-samples/&lt;/a&gt;&lt;br&gt;
[17] &lt;a href="https://platform.openai.com/docs/libraries" rel="noopener noreferrer"&gt;https://platform.openai.com/docs/libraries&lt;/a&gt;&lt;br&gt;
[18] &lt;a href="https://learn.microsoft.com/en-us/azure/ai-services/openai/" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/ai-services/openai/&lt;/a&gt;&lt;br&gt;
[19] &lt;a href="https://learn.microsoft.com/en-us/samples/azure-samples/azure-sql-db-session-recommender-v2/azure-sql-db-session-recommender-v2/" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/samples/azure-samples/azure-sql-db-session-recommender-v2/azure-sql-db-session-recommender-v2/&lt;/a&gt;&lt;br&gt;
[20] &lt;a href="https://openai.com/index/new-tools-for-building-agents/" rel="noopener noreferrer"&gt;https://openai.com/index/new-tools-for-building-agents/&lt;/a&gt;&lt;br&gt;
[21] &lt;a href="https://www.youtube.com/watch?v=umzMPlaKLQo" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=umzMPlaKLQo&lt;/a&gt;&lt;br&gt;
[22] &lt;a href="https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/ai-services/openai/concepts/use-your-data&lt;/a&gt;&lt;br&gt;
[23] &lt;a href="https://learn.microsoft.com/en-us/dotnet/api/microsoft.extensions.ai?view=net-9.0-pp" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/api/microsoft.extensions.ai?view=net-9.0-pp&lt;/a&gt;&lt;br&gt;
[24] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/conceptual/evaluation-libraries" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/conceptual/evaluation-libraries&lt;/a&gt;&lt;br&gt;
[25] &lt;a href="https://github.com/dotnet/extensions/issues/5739" rel="noopener noreferrer"&gt;https://github.com/dotnet/extensions/issues/5739&lt;/a&gt;&lt;br&gt;
[26] &lt;a href="https://github.com/Azure-Samples/aisearch-openai-rag-audio" rel="noopener noreferrer"&gt;https://github.com/Azure-Samples/aisearch-openai-rag-audio&lt;/a&gt;&lt;br&gt;
[27] &lt;a href="https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIWebAPI/README.md" rel="noopener noreferrer"&gt;https://github.com/dotnet/ai-samples/blob/main/src/microsoft-extensions-ai/azure-openai/AzureOpenAIWebAPI/README.md&lt;/a&gt;&lt;br&gt;
[28] &lt;a href="https://learn.microsoft.com/de-de/dotnet/ai/ai-extensions" rel="noopener noreferrer"&gt;https://learn.microsoft.com/de-de/dotnet/ai/ai-extensions&lt;/a&gt;&lt;br&gt;
[29] &lt;a href="https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview&lt;/a&gt;&lt;br&gt;
[30] &lt;a href="https://learn.microsoft.com/en-us/dotnet/api/overview/azure/ai.openai-readme?view=azure-dotnet" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/api/overview/azure/ai.openai-readme?view=azure-dotnet&lt;/a&gt;&lt;br&gt;
[31] &lt;a href="https://www.nuget.org/profiles/Microsoft.Extensions.AI.Evaluation" rel="noopener noreferrer"&gt;https://www.nuget.org/profiles/Microsoft.Extensions.AI.Evaluation&lt;/a&gt;&lt;br&gt;
[32] &lt;a href="https://learn.microsoft.com/en-us/samples/azure-samples/azure-search-openai-demo/azure-search-openai-demo/" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/samples/azure-samples/azure-search-openai-demo/azure-search-openai-demo/&lt;/a&gt;&lt;br&gt;
[33] &lt;a href="https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI/9.3.0-preview.1.25161.3" rel="noopener noreferrer"&gt;https://www.nuget.org/packages/Microsoft.Extensions.AI.OpenAI/9.3.0-preview.1.25161.3&lt;/a&gt;&lt;br&gt;
[34] &lt;a href="https://dev.to/focused_dot_io/chat-with-your-pdfs-an-end-to-end-langchain-tutorial-for-building-a-custom-rag-with-openai-part-1-3oi3"&gt;https://dev.to/focused_dot_io/chat-with-your-pdfs-an-end-to-end-langchain-tutorial-for-building-a-custom-rag-with-openai-part-1-3oi3&lt;/a&gt;&lt;br&gt;
[35] &lt;a href="https://community.openai.com/t/what-is-the-current-rag-architecture-of-openai-for-pdf-uploads/878636" rel="noopener noreferrer"&gt;https://community.openai.com/t/what-is-the-current-rag-architecture-of-openai-for-pdf-uploads/878636&lt;/a&gt;&lt;br&gt;
[36] &lt;a href="https://pdf.ai" rel="noopener noreferrer"&gt;https://pdf.ai&lt;/a&gt;&lt;br&gt;
[37] &lt;a href="https://cookbook.openai.com/examples/file_search_responses" rel="noopener noreferrer"&gt;https://cookbook.openai.com/examples/file_search_responses&lt;/a&gt;&lt;br&gt;
[38] &lt;a href="https://chatdoc.com" rel="noopener noreferrer"&gt;https://chatdoc.com&lt;/a&gt;&lt;br&gt;
[39] &lt;a href="https://cookbook.openai.com/examples/parse_pdf_docs_for_rag" rel="noopener noreferrer"&gt;https://cookbook.openai.com/examples/parse_pdf_docs_for_rag&lt;/a&gt;&lt;br&gt;
[40] &lt;a href="https://monica.im/webapp/doc-chat" rel="noopener noreferrer"&gt;https://monica.im/webapp/doc-chat&lt;/a&gt;&lt;br&gt;
[41] &lt;a href="https://www.youtube.com/watch?v=kC-Dzy4nADI" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=kC-Dzy4nADI&lt;/a&gt;&lt;br&gt;
[42] &lt;a href="https://www.youtube.com/watch?v=hSQY4N1u3v0" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=hSQY4N1u3v0&lt;/a&gt;&lt;br&gt;
[43] &lt;a href="https://community.openai.com/t/using-large-pdfs-to-make-a-chatbot/372228" rel="noopener noreferrer"&gt;https://community.openai.com/t/using-large-pdfs-to-make-a-chatbot/372228&lt;/a&gt;&lt;br&gt;
[44] &lt;a href="https://smallpdf.com/chat-pdf" rel="noopener noreferrer"&gt;https://smallpdf.com/chat-pdf&lt;/a&gt;&lt;br&gt;
[45] &lt;a href="https://help.openai.com/en/articles/8868588-retrieval-augmented-generation-rag-and-semantic-search-for-gpts" rel="noopener noreferrer"&gt;https://help.openai.com/en/articles/8868588-retrieval-augmented-generation-rag-and-semantic-search-for-gpts&lt;/a&gt;&lt;br&gt;
[46] &lt;a href="https://www.reddit.com/r/vectordatabase/comments/1hzovpy/best_vector_database_for_rag/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/vectordatabase/comments/1hzovpy/best_vector_database_for_rag/&lt;/a&gt;&lt;br&gt;
[47] &lt;a href="https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-generate-embeddings" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/azure/architecture/ai-ml/guide/rag/rag-generate-embeddings&lt;/a&gt;&lt;br&gt;
[48] &lt;a href="https://www.heise.de/ratgeber/RAG-mit-deutschsprachigen-Embedding-Modellen-aufsetzen-10231709.html" rel="noopener noreferrer"&gt;https://www.heise.de/ratgeber/RAG-mit-deutschsprachigen-Embedding-Modellen-aufsetzen-10231709.html&lt;/a&gt;&lt;br&gt;
[49] &lt;a href="https://jasonhaley.com/2024/02/07/simple-rag-sql-openai/" rel="noopener noreferrer"&gt;https://jasonhaley.com/2024/02/07/simple-rag-sql-openai/&lt;/a&gt;&lt;br&gt;
[50] &lt;a href="https://www.timescale.com/blog/finding-the-best-open-source-embedding-model-for-rag" rel="noopener noreferrer"&gt;https://www.timescale.com/blog/finding-the-best-open-source-embedding-model-for-rag&lt;/a&gt;&lt;br&gt;
[51] &lt;a href="https://github.com/microsoft/generative-ai-for-beginners/blob/main/15-rag-and-vector-databases/README.md?WT.mc_id=academic-105485-koreyst" rel="noopener noreferrer"&gt;https://github.com/microsoft/generative-ai-for-beginners/blob/main/15-rag-and-vector-databases/README.md?WT.mc_id=academic-105485-koreyst&lt;/a&gt;&lt;br&gt;
[52] &lt;a href="https://github.com/mmr116/document-search-using-vector-embeddings-openai-rag" rel="noopener noreferrer"&gt;https://github.com/mmr116/document-search-using-vector-embeddings-openai-rag&lt;/a&gt;&lt;br&gt;
[53] &lt;a href="https://autoize.com/retrieval-augmented-generation-rag-with-local-embeddings/" rel="noopener noreferrer"&gt;https://autoize.com/retrieval-augmented-generation-rag-with-local-embeddings/&lt;/a&gt;&lt;br&gt;
[54] &lt;a href="https://www.reddit.com/r/LocalLLaMA/comments/18j39qt/what_embedding_models_are_you_using_for_rag/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/LocalLLaMA/comments/18j39qt/what_embedding_models_are_you_using_for_rag/&lt;/a&gt;&lt;br&gt;
[55] &lt;a href="https://community.openai.com/t/best-vector-database-to-use-with-rag/615350" rel="noopener noreferrer"&gt;https://community.openai.com/t/best-vector-database-to-use-with-rag/615350&lt;/a&gt;&lt;br&gt;
[56] &lt;a href="https://stackoverflow.com/questions/72880545/get-text-line-by-line-from-pdf-using-c-sharp" rel="noopener noreferrer"&gt;https://stackoverflow.com/questions/72880545/get-text-line-by-line-from-pdf-using-c-sharp&lt;/a&gt;&lt;br&gt;
[57] &lt;a href="https://stackoverflow.com/questions/79555503/how-can-i-extract-text-and-images-from-pdf-files-in-net-core" rel="noopener noreferrer"&gt;https://stackoverflow.com/questions/79555503/how-can-i-extract-text-and-images-from-pdf-files-in-net-core&lt;/a&gt;&lt;br&gt;
[58] &lt;a href="https://www.reddit.com/r/dotnet/comments/17svth5/how_to_get_all_text_from_pdf_fasterimage_with/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/dotnet/comments/17svth5/how_to_get_all_text_from_pdf_fasterimage_with/&lt;/a&gt;&lt;br&gt;
[59] &lt;a href="https://www.nuget.org/packages/PdfPig/0.1.4" rel="noopener noreferrer"&gt;https://www.nuget.org/packages/PdfPig/0.1.4&lt;/a&gt;&lt;br&gt;
[60] &lt;a href="https://news.ycombinator.com/item?id=21256814" rel="noopener noreferrer"&gt;https://news.ycombinator.com/item?id=21256814&lt;/a&gt;&lt;br&gt;
[61] &lt;a href="https://github.com/UglyToad/PdfPig/issues/319" rel="noopener noreferrer"&gt;https://github.com/UglyToad/PdfPig/issues/319&lt;/a&gt;&lt;br&gt;
[62] &lt;a href="https://stackoverflow.com/questions/77469097/how-can-i-process-a-pdf-using-openais-apis-gpts" rel="noopener noreferrer"&gt;https://stackoverflow.com/questions/77469097/how-can-i-process-a-pdf-using-openais-apis-gpts&lt;/a&gt;&lt;br&gt;
[63] &lt;a href="https://stackoverflow.com/questions/69798017/split-large-pdf-file-in-to-multiple-pdfs-in-c-sharp" rel="noopener noreferrer"&gt;https://stackoverflow.com/questions/69798017/split-large-pdf-file-in-to-multiple-pdfs-in-c-sharp&lt;/a&gt;&lt;br&gt;
[64] &lt;a href="https://www.reddit.com/r/csharp/comments/vlk1g1/extract_text_from_pdf_file_blazor/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/csharp/comments/vlk1g1/extract_text_from_pdf_file_blazor/&lt;/a&gt;&lt;br&gt;
[65] &lt;a href="https://github.com/UglyToad/PdfPig/discussions/374" rel="noopener noreferrer"&gt;https://github.com/UglyToad/PdfPig/discussions/374&lt;/a&gt;&lt;br&gt;
[66] &lt;a href="https://ironpdf.com/blog/compare-to-other-components/pdfpig-csharp-alternatives/" rel="noopener noreferrer"&gt;https://ironpdf.com/blog/compare-to-other-components/pdfpig-csharp-alternatives/&lt;/a&gt;&lt;br&gt;
[67] &lt;a href="https://www.nuget.org/packages/PdfPig/0.1.8-alpha-20230605-7fe5f" rel="noopener noreferrer"&gt;https://www.nuget.org/packages/PdfPig/0.1.8-alpha-20230605-7fe5f&lt;/a&gt;&lt;br&gt;
[68] &lt;a href="https://liblab.com/docs/tutorials/others/rag-with-sdk" rel="noopener noreferrer"&gt;https://liblab.com/docs/tutorials/others/rag-with-sdk&lt;/a&gt;&lt;br&gt;
[69] &lt;a href="https://developer.auth0.com/resources/labs/authorization/securing-a-rag-app-with-open-ai-and-fga-in-python" rel="noopener noreferrer"&gt;https://developer.auth0.com/resources/labs/authorization/securing-a-rag-app-with-open-ai-and-fga-in-python&lt;/a&gt;&lt;br&gt;
[70] &lt;a href="https://learn.microsoft.com/en-us/dotnet/core/extensions/artificial-intelligence" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/core/extensions/artificial-intelligence&lt;/a&gt;&lt;br&gt;
[71] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/quickstarts/quickstart-azure-openai-tool" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/quickstarts/quickstart-azure-openai-tool&lt;/a&gt;&lt;br&gt;
[72] &lt;a href="https://learn.microsoft.com/en-us/answers/questions/2136354/azure-openai-agentic-ai-semantic-kernel-rag-integr" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/answers/questions/2136354/azure-openai-agentic-ai-semantic-kernel-rag-integr&lt;/a&gt;&lt;br&gt;
[73] &lt;a href="https://learn.microsoft.com/de-de/dotnet/ai/quickstarts/quickstart-azure-openai-tool" rel="noopener noreferrer"&gt;https://learn.microsoft.com/de-de/dotnet/ai/quickstarts/quickstart-azure-openai-tool&lt;/a&gt;&lt;br&gt;
[74] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/quickstarts/build-chat-app" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/quickstarts/build-chat-app&lt;/a&gt;&lt;br&gt;
[75] &lt;a href="https://www.reddit.com/r/aipromptprogramming/comments/13oea8y/ways_to_integrate_pdf_file_content_into_my_own/" rel="noopener noreferrer"&gt;https://www.reddit.com/r/aipromptprogramming/comments/13oea8y/ways_to_integrate_pdf_file_content_into_my_own/&lt;/a&gt;&lt;br&gt;
[76] &lt;a href="https://www.textcontrol.com/blog/2024/02/23/ask-pdf-a-generative-ai-application-for-pdf-documents-using-tx-text-control-and-openai-functions-in-c-sharp/" rel="noopener noreferrer"&gt;https://www.textcontrol.com/blog/2024/02/23/ask-pdf-a-generative-ai-application-for-pdf-documents-using-tx-text-control-and-openai-functions-in-c-sharp/&lt;/a&gt;&lt;br&gt;
[77] &lt;a href="https://www.youtube.com/watch?v=jH2tuFSDZUg" rel="noopener noreferrer"&gt;https://www.youtube.com/watch?v=jH2tuFSDZUg&lt;/a&gt;&lt;br&gt;
[78] &lt;a href="https://www.chatpdf.com" rel="noopener noreferrer"&gt;https://www.chatpdf.com&lt;/a&gt;&lt;br&gt;
[79] &lt;a href="https://community.openai.com/t/which-is-the-best-approach-to-do-chat-with-pdf-application-rag-fine-tuning-open-ai-assistant/943686" rel="noopener noreferrer"&gt;https://community.openai.com/t/which-is-the-best-approach-to-do-chat-with-pdf-application-rag-fine-tuning-open-ai-assistant/943686&lt;/a&gt;&lt;br&gt;
[80] &lt;a href="https://www.matillion.com/blog/a-deep-dive-into-embedding-and-retrieval-augmented-generation-rag" rel="noopener noreferrer"&gt;https://www.matillion.com/blog/a-deep-dive-into-embedding-and-retrieval-augmented-generation-rag&lt;/a&gt;&lt;br&gt;
[81] &lt;a href="https://learn.microsoft.com/en-us/dotnet/ai/tutorials/tutorial-ai-vector-search" rel="noopener noreferrer"&gt;https://learn.microsoft.com/en-us/dotnet/ai/tutorials/tutorial-ai-vector-search&lt;/a&gt;&lt;br&gt;
[82] &lt;a href="https://thecodeman.net/posts/how-to-implement-rag-in-dotnet" rel="noopener noreferrer"&gt;https://thecodeman.net/posts/how-to-implement-rag-in-dotnet&lt;/a&gt;&lt;br&gt;
[83] &lt;a href="https://dev.to/aknox/local-langflow-a-vector-rag-application-running-locally-c52"&gt;https://dev.to/aknox/local-langflow-a-vector-rag-application-running-locally-c52&lt;/a&gt;&lt;br&gt;
[84] &lt;a href="https://wandb.ai/mostafaibrahim17/ml-articles/reports/Vector-Embeddings-in-RAG-Applications--Vmlldzo3OTk1NDA5" rel="noopener noreferrer"&gt;https://wandb.ai/mostafaibrahim17/ml-articles/reports/Vector-Embeddings-in-RAG-Applications--Vmlldzo3OTk1NDA5&lt;/a&gt;&lt;/p&gt;




&lt;p&gt;Answer from Perplexity: pplx.ai/share&lt;/p&gt;

</description>
      <category>openai</category>
      <category>rag</category>
      <category>tutorial</category>
      <category>dotnet</category>
    </item>
    <item>
      <title>Implementing RAG with Azure OpenAI in .NET (C#)</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Mon, 10 Mar 2025 10:20:59 +0000</pubDate>
      <link>https://forem.com/petermilovcik/implementing-rag-with-azure-openai-in-net-c-12c2</link>
      <guid>https://forem.com/petermilovcik/implementing-rag-with-azure-openai-in-net-c-12c2</guid>
      <description>&lt;blockquote&gt;
&lt;p&gt;This tutorial was created using OpenAI's Deep Research capability.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Retrieval-Augmented Generation (RAG) combines a &lt;strong&gt;document retrieval step&lt;/strong&gt; with an &lt;strong&gt;OpenAI LLM&lt;/strong&gt; to ground the model’s answers on your data. Below are best practices for a quick yet robust prototype in C#, focusing on storage, embeddings, vector search, and resources.&lt;/p&gt;

&lt;h2&gt;
  
  
  Document Storage: Azure Blob vs. Local
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use Azure Blob Storage for realistic scenarios&lt;/strong&gt; – Storing documents in Azure Blob Storage is the recommended approach, especially if you plan to integrate with Azure Cognitive Search. Azure’s RAG tutorials upload source files (e.g. PDFs) to a Blob container so an indexer can ingest and chunk them automatically (&lt;a href="https://learn.microsoft.com/en-us/azure/search/tutorial-rag-build-solution-pipeline#:~:text=,blob%20storage%20for%20automated%20indexing" rel="noopener noreferrer"&gt;RAG tutorial: Build an indexing pipeline - Azure AI Search | Microsoft Learn&lt;/a&gt;). Blob Storage is scalable and allows Azure Search or other services to pull content easily.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Local storage for quick prototyping&lt;/strong&gt; – For a simple prototype or local development, you can read files from the local filesystem. This avoids provisioning cloud storage initially. It’s acceptable to start with local files if you have just a few documents. Keep in mind you’ll likely transition to Blob Storage for production or if you want to use Azure Search indexers (which work natively with Azure Blob). In short, &lt;strong&gt;local storage is fine for an isolated proof-of-concept&lt;/strong&gt;, but Azure Blob Storage is preferred for anything beyond a toy example or when using Azure’s managed pipelines.&lt;/p&gt;
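
&lt;p&gt;As a minimal sketch of the local-first approach (the &lt;code&gt;*.txt&lt;/code&gt; input, the folder layout, and the fixed chunk size are illustrative assumptions; PDFs would need a text-extraction step first):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;
using System.Collections.Generic;
using System.IO;

// Enumerate local text files and split them into fixed-size chunks
// ready for embedding. Good enough for a proof-of-concept.
static IEnumerable&amp;lt;(string Name, string Chunk)&amp;gt; LoadChunks(string folder, int chunkSize = 1000)
{
    foreach (var path in Directory.EnumerateFiles(folder, "*.txt"))
    {
        var text = File.ReadAllText(path);
        for (var i = 0; i &amp;lt; text.Length; i += chunkSize)
        {
            yield return (Path.GetFileName(path),
                          text.Substring(i, Math.Min(chunkSize, text.Length - i)));
        }
    }
}
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;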

&lt;p&gt;&lt;strong&gt;Authentication:&lt;/strong&gt; In a prototype, using the storage account’s &lt;strong&gt;API key or connection string&lt;/strong&gt; is easiest for access. You can supply the Blob Storage key (or a SAS token) in your configuration to let your .NET code upload or download files. This is simpler than setting up Azure AD roles at this stage. (You can later move to managed identities for better security once the basic solution works.)&lt;/p&gt;

&lt;h2&gt;
  
  
  Embedding Model: Choosing the Best for Azure OpenAI
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Use OpenAI’s text-embedding-ada-002 model&lt;/strong&gt; – The &lt;em&gt;ADA v2&lt;/em&gt; text embedding model is currently the go-to for Azure OpenAI RAG scenarios. It produces 1536-dimensional embeddings and offers strong semantic representation of text. In fact, Azure’s AI Search integration expects an &lt;strong&gt;embedding model&lt;/strong&gt; like &lt;strong&gt;&lt;code&gt;text-embedding-ada-002&lt;/code&gt;&lt;/strong&gt; for vectorization (&lt;a href="https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-azure-openai-embedding#:~:text=,skillset%20is%20created%20using%20the" rel="noopener noreferrer"&gt;Azure OpenAI Embedding skill - Azure AI Search | Microsoft Learn&lt;/a&gt;). As of now, &lt;code&gt;text-embedding-ada-002&lt;/code&gt; is the primary available embedding model in Azure OpenAI (other newer embedding models like “text-embedding-3-large” are in preview or not yet generally available to all users (&lt;a href="https://learn.microsoft.com/en-us/answers/questions/1700212/embedding-model-for-rag-openai-studio#:~:text=As%20you%20rightly%20pointed%20it,002%20is%20only%20available" rel="noopener noreferrer"&gt;embedding model for RAG, OpenAI Studio - Microsoft Q&amp;amp;A&lt;/a&gt;)).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Why ADA-002?&lt;/strong&gt; It’s proven to perform well on a variety of semantic search tasks and is the recommended default. Using this model ensures your document chunks and user queries are mapped to the same vector space for meaningful similarity matching (&lt;a href="https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-azure-openai-embedding#:~:text=,skillset%20is%20created%20using%20the" rel="noopener noreferrer"&gt;Azure OpenAI Embedding skill - Azure AI Search | Microsoft Learn&lt;/a&gt;). When you create your Azure OpenAI resource, deploy the ada-002 embedding model (e.g., as an “embeddings” deployment) so your .NET app can call it. For example, you’ll call the Azure OpenAI &lt;em&gt;Embeddings&lt;/em&gt; API with this model to vectorize your document chunks and user questions.&lt;/p&gt;
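
&lt;p&gt;A hedged sketch of that call, assuming the &lt;code&gt;Azure.AI.OpenAI&lt;/code&gt; 2.x SDK (the endpoint, environment-variable name, and deployment name are placeholders; exact type names can differ between SDK versions):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Embeddings;

var client = new AzureOpenAIClient(
    new Uri("https://&amp;lt;your-resource&amp;gt;.openai.azure.com/"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!));

// Call the deployed ada-002 model to vectorize a chunk or a user question.
EmbeddingClient embeddings = client.GetEmbeddingClient("text-embedding-ada-002");
OpenAIEmbedding embedding = embeddings.GenerateEmbedding("What does our refund policy say?").Value;
ReadOnlyMemory&amp;lt;float&amp;gt; vector = embedding.ToFloats(); // 1536 dimensions for ada-002
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;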

&lt;h2&gt;
  
  
  Vector Database: Azure AI Search vs. pgvector vs. Others
&lt;/h2&gt;

&lt;p&gt;Once you have embeddings, you need a vector store to &lt;strong&gt;index and query&lt;/strong&gt; them for nearest matches:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Azure AI Search (Cognitive Search)&lt;/strong&gt; – This is often the &lt;strong&gt;simplest and most integrated choice&lt;/strong&gt; for an Azure-based RAG prototype. Azure Cognitive Search now supports vector search natively, and it’s a “proven solution” for RAG in enterprise scenarios (&lt;a href="https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview#:~:text=,language%20understanding%20models%20for%20retrieval" rel="noopener noreferrer"&gt;RAG and generative AI - Azure AI Search | Microsoft Learn&lt;/a&gt;). You can create a search index with a &lt;em&gt;vector field&lt;/em&gt; to store embeddings, and use the Search service’s API to perform cosine similarity searches. The benefit is that Azure Search also provides robust indexing (including built-in chunking and enrichment skills) and security controls. If your data doesn’t change frequently in real-time, and you’re okay managing an index, &lt;strong&gt;Azure AI Search is compelling&lt;/strong&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/vector-search#:~:text=When%20deciding%20whether%20to%20use,can%20be%20a%20compelling%20choice" rel="noopener noreferrer"&gt;Choose an Azure service for vector search - Azure Architecture Center | Microsoft Learn&lt;/a&gt;). It offloads a lot of the heavy lifting: for instance, you can set up an indexer that automatically chunks documents and calls the embedding model to vectorize content during ingestion (&lt;a href="https://learn.microsoft.com/en-us/azure/search/tutorial-rag-build-solution-pipeline#:~:text=In%20this%20article" rel="noopener noreferrer"&gt;RAG tutorial: Build an indexing pipeline - Azure AI Search | Microsoft Learn&lt;/a&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;PostgreSQL with pgvector&lt;/strong&gt; – Using a relational database can be convenient if you prefer a lightweight setup or already have a database in your stack. PostgreSQL’s &lt;code&gt;pgvector&lt;/code&gt; extension allows you to store embedding vectors and run similarity queries via SQL. This approach is straightforward: you insert your vectors into a table and use cosine (or inner product) distance functions provided by pgvector to find nearest neighbors. If your team is already comfortable with Postgres, leveraging it might be &lt;strong&gt;the easiest solution for your scenario&lt;/strong&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/vector-search#:~:text=If%20you%20choose%20a%20traditional,supports%20the%20functionality%20you%20require" rel="noopener noreferrer"&gt;Choose an Azure service for vector search - Azure Architecture Center | Microsoft Learn&lt;/a&gt;). It’s a good option for quick prototypes because you can avoid learning a new service – just add the extension to an Azure Database for PostgreSQL or local Postgres, and you have a basic vector database ready. Keep in mind performance might be lower than a specialized vector store for very large vector counts, but for moderate volumes it works well.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;FAISS or other libraries&lt;/strong&gt; – FAISS (Facebook AI Similarity Search) is a high-performance library for vector similarity search. It’s typically used in Python or C++ environments, though community .NET bindings exist, and dedicated vector stores such as Milvus or Pinecone are reachable via their APIs if you want to explore them. For a &lt;em&gt;quick C# prototype&lt;/em&gt;, introducing FAISS might add complexity unless you’re already familiar with it. However, if you need an on-premise or in-memory solution without external services, you could integrate a vector search library. Many RAG prototypes skip a database entirely and just use an in-memory list of vectors with brute-force cosine similarity when the data size is very small – this is not scalable, but it can jumpstart a proof-of-concept.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
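
&lt;p&gt;The last option above, brute-force search over an in-memory list, fits in a few lines of plain C# and needs no external services (the tuple shapes are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;
using System.Collections.Generic;
using System.Linq;

// Cosine similarity between two embedding vectors of equal length.
static double CosineSimilarity(float[] a, float[] b)
{
    double dot = 0, magA = 0, magB = 0;
    for (var i = 0; i &amp;lt; a.Length; i++)
    {
        dot += a[i] * b[i];
        magA += a[i] * a[i];
        magB += b[i] * b[i];
    }
    return dot / (Math.Sqrt(magA) * Math.Sqrt(magB));
}

// Rank every stored chunk against the query vector and keep the best k.
static IEnumerable&amp;lt;(string Chunk, double Score)&amp;gt; TopK(
    float[] query, IReadOnlyList&amp;lt;(string Chunk, float[] Vector)&amp;gt; index, int k = 3) =&amp;gt;
    index.Select(e =&amp;gt; (e.Chunk, Score: CosineSimilarity(query, e.Vector)))
         .OrderByDescending(e =&amp;gt; e.Score)
         .Take(k);
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;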

&lt;p&gt;&lt;strong&gt;Recommendation:&lt;/strong&gt; For an Azure-oriented prototype, &lt;strong&gt;Azure Cognitive Search&lt;/strong&gt; is often the best blend of simplicity and capability. It’s managed, requires minimal code for indexing/querying, and ties in nicely with Azure OpenAI (for example, Azure’s “OpenAI on your data” feature can use an Azure Search index as the knowledge source). If you prefer not to provision Azure Search or want everything local, &lt;strong&gt;Postgres/pgvector&lt;/strong&gt; is a solid fallback that keeps things simple while following best practices (vector indexing in a database). In contrast, FAISS or dedicated vector DBs (Pinecone, Weaviate, etc.) might be overkill for a quick demo unless you specifically want to evaluate them. The bottom line is that &lt;strong&gt;any vector store can work&lt;/strong&gt; – RAG isn’t limited to one technology (&lt;a href="https://learn.microsoft.com/en-us/azure/architecture/ai-ml/#:~:text=Retrieval%20Augmented%20Generation%20,use%20any%20data%20store%20technology" rel="noopener noreferrer"&gt;AI Architecture Design - Azure Architecture Center | Microsoft Learn&lt;/a&gt;) – so choose the one that lets you iterate fastest while meeting your needs.&lt;/p&gt;

&lt;h2&gt;
  
  
  Authentication with API Keys (Azure OpenAI &amp;amp; Storage)
&lt;/h2&gt;

&lt;p&gt;For a quick prototype, &lt;strong&gt;API key-based authentication&lt;/strong&gt; is the way to go. Both Azure OpenAI and Azure Storage support API keys that you can use in your .NET application, avoiding the overhead of Azure AD authentication setup. Using the keys is straightforward and gets you up and running quickly. Microsoft’s documentation notes that while you &lt;em&gt;can&lt;/em&gt; use Azure AD roles for tighter security, &lt;strong&gt;“keys are easier to start with”&lt;/strong&gt; for development (&lt;a href="https://learn.microsoft.com/en-us/azure/search/search-get-started-rag#:~:text=Configure%20access" rel="noopener noreferrer"&gt;Quickstart: Generative Search (RAG) - Azure AI Search | Microsoft Learn&lt;/a&gt;).&lt;/p&gt;

&lt;p&gt;In practice, this means:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Azure OpenAI&lt;/strong&gt; – Use the Key and Endpoint from your Azure OpenAI resource. For example, when calling the Azure OpenAI REST API (or SDK), set the &lt;code&gt;api-key&lt;/code&gt; header with your key. In C#, if using the OpenAI client libraries or REST calls, ensure you include the key. (Azure OpenAI also requires an API version in the endpoint URL.)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Azure Storage (Blobs)&lt;/strong&gt; – Use the storage account key or a SAS token to authenticate your Blob client in Azure SDK for .NET. For instance, you can use &lt;code&gt;new BlobServiceClient(&amp;lt;connection_string&amp;gt;)&lt;/code&gt; where the connection string contains the key. This avoids needing to configure managed identities or OAuth in a prototype.&lt;/li&gt;
&lt;/ul&gt;
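
&lt;p&gt;In code, key-based clients for both services take a line or two each (a sketch; the endpoint, environment-variable names, and container name are placeholders):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;
using Azure;
using Azure.AI.OpenAI;
using Azure.Storage.Blobs;

// Azure OpenAI: endpoint plus API key.
var openAi = new AzureOpenAIClient(
    new Uri("https://&amp;lt;your-resource&amp;gt;.openai.azure.com/"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!));

// Blob Storage: connection string containing the account key.
var blobs = new BlobServiceClient(Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING"));
var container = blobs.GetBlobContainerClient("documents");
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;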

&lt;p&gt;Using API keys for both services will let your prototype run with minimal friction. Just be sure to keep the keys safe (don’t hard-code them in public code; use something like user-secrets or environment variables in your .NET project). You can later swap to managed identities when moving toward production, but for now API keys are perfectly fine and &lt;strong&gt;conform to Azure’s recommended quickstart practices&lt;/strong&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/search-get-started-rag#:~:text=Configure%20access" rel="noopener noreferrer"&gt;Quickstart: Generative Search (RAG) - Azure AI Search | Microsoft Learn&lt;/a&gt;).&lt;/p&gt;
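
&lt;p&gt;For local development, the .NET user-secrets tool keeps those keys out of source control (the secret names below are illustrative):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet user-secrets init
dotnet user-secrets set "AzureOpenAI:Key" "your-api-key"
dotnet user-secrets set "Storage:ConnectionString" "your-connection-string"
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;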

&lt;h2&gt;
  
  
  Official Microsoft Tutorials &amp;amp; Resources
&lt;/h2&gt;

&lt;p&gt;Microsoft provides several tutorials and examples to guide you in building a RAG solution on Azure:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Azure Learn RAG Tutorial Series&lt;/strong&gt; – The Azure Cognitive Search documentation has an end-to-end RAG tutorial series. For example, there’s a &lt;em&gt;Quickstart: Generative search (RAG)&lt;/em&gt; guide that shows how to set up an Azure AI Search index and query it with Azure OpenAI. &lt;em&gt;“In this quickstart, you send queries to a chat completion model for a conversational search experience over your indexed content on Azure AI Search”&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/search-get-started-rag#:~:text=In%20this%20article" rel="noopener noreferrer"&gt;Quickstart: Generative Search (RAG) - Azure AI Search | Microsoft Learn&lt;/a&gt;). It walks through setting up the services (OpenAI, Search) and using them together (the sample code is in Python, but the concepts apply equally to .NET). There are also Azure Architecture Center articles on designing RAG solutions, and a tutorial on building an indexing pipeline that covers chunking, embedding, and indexing documents (&lt;a href="https://learn.microsoft.com/en-us/azure/search/tutorial-rag-build-solution-pipeline#:~:text=In%20this%20article" rel="noopener noreferrer"&gt;RAG tutorial: Build an indexing pipeline - Azure AI Search | Microsoft Learn&lt;/a&gt;).&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;“ChatGPT + Enterprise Data” C# Sample&lt;/strong&gt; – Microsoft has an official .NET sample on GitHub (Azure-Samples) that demonstrates a full RAG application using Azure OpenAI and Azure Cognitive Search. &lt;em&gt;“This sample… uses Azure OpenAI Service to access the ChatGPT model, and Azure AI Search for data indexing and retrieval.”&lt;/em&gt; (&lt;a href="https://github.com/Azure-Samples/azure-search-openai-demo-csharp#:~:text=This%20sample%20demonstrates%20a%20few,for%20data%20indexing%20and%20retrieval" rel="noopener noreferrer"&gt;GitHub - Azure-Samples/azure-search-openai-demo-csharp: A sample app for the Retrieval-Augmented Generation pattern running in Azure, using Azure Cognitive Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&amp;amp;A experiences.&lt;/a&gt;). It’s a Blazor web app that allows you to ask questions to an internal knowledge base. The repo comes with scripts to set up Azure Blob Storage (for documents), an Azure Search index, and Azure OpenAI deployments. This is a great resource to see best practices in action (chunking docs, storing embeddings in the index, constructing prompts with citations, etc.) in a C# codebase. You can deploy it to Azure or run locally once you configure the keys.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Microsoft Learn Modules&lt;/strong&gt; – Look for learning modules or workshops on &lt;em&gt;“Azure OpenAI and Cognitive Search”&lt;/em&gt;. For instance, Azure AI Search’s documentation includes an overview of RAG and suggests approaches in different languages. They even mention &lt;strong&gt;templates for .NET&lt;/strong&gt; that create an end-to-end solution (&lt;a href="https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview#:~:text=Microsoft%20has%20several%20built,Search%20in%20a%20RAG%20solution" rel="noopener noreferrer"&gt;RAG and generative AI - Azure AI Search | Microsoft Learn&lt;/a&gt;). Additionally, the AI Show or Azure webinars often showcase building a chatbot with your own data on Azure. Checking Microsoft Learn for “OpenAI on your data” or “knowledge mining with Azure OpenAI” can yield step-by-step content.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;By leveraging these resources, you can follow a proven path. Start by uploading a few documents to Blob (or using provided sample data), create a vector index (either via Azure Search or manually with pgvector), obtain embeddings with Ada-002, and then use the OpenAI GPT-35-Turbo or GPT-4 model to answer questions with the retrieved content. The tutorials will reinforce the best practices: &lt;strong&gt;store data securely, use efficient embedding models, query via vector similarity, and authenticate with API keys&lt;/strong&gt; for simplicity.&lt;/p&gt;
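
&lt;p&gt;The retrieve-then-generate step at the end of that flow can be sketched as follows, again assuming the &lt;code&gt;Azure.AI.OpenAI&lt;/code&gt; 2.x SDK (the deployment name and the &lt;code&gt;retrievedChunks&lt;/code&gt; text are placeholders for your own retrieval output):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;using System;
using Azure;
using Azure.AI.OpenAI;
using OpenAI.Chat;

var client = new AzureOpenAIClient(
    new Uri("https://&amp;lt;your-resource&amp;gt;.openai.azure.com/"),
    new AzureKeyCredential(Environment.GetEnvironmentVariable("AZURE_OPENAI_KEY")!));

ChatClient chat = client.GetChatClient("gpt-35-turbo"); // your chat model deployment
string retrievedChunks = "(top chunks returned by the vector search)";

// Ground the model: answers must come from the retrieved sources only.
ChatCompletion answer = chat.CompleteChat(
    new SystemChatMessage("Answer only from the sources below. " +
                          "If the answer is not there, say you do not know.\n\nSources:\n" + retrievedChunks),
    new UserChatMessage("What does the handbook say about remote work?")).Value;

Console.WriteLine(answer.Content[0].Text);
&lt;/code&gt;&lt;/pre&gt;
&lt;/div&gt;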

&lt;p&gt;&lt;strong&gt;Sources:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;Azure Cognitive Search RAG Tutorial – &lt;em&gt;“This exercise uploads PDF files into blob storage for automated indexing.”&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/tutorial-rag-build-solution-pipeline#:~:text=,blob%20storage%20for%20automated%20indexing" rel="noopener noreferrer"&gt;RAG tutorial: Build an indexing pipeline - Azure AI Search | Microsoft Learn&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure OpenAI Embeddings – &lt;em&gt;Use an embedding model such as &lt;code&gt;text-embedding-ada-002&lt;/code&gt;&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/cognitive-search-skill-azure-openai-embedding#:~:text=,skillset%20is%20created%20using%20the" rel="noopener noreferrer"&gt;Azure OpenAI Embedding skill - Azure AI Search | Microsoft Learn&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure AI Search for Vector Data – &lt;em&gt;Azure AI Search is a proven solution for information retrieval in a RAG architecture… providing indexing and query capabilities with Azure’s infrastructure.&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/retrieval-augmented-generation-overview#:~:text=,language%20understanding%20models%20for%20retrieval" rel="noopener noreferrer"&gt;RAG and generative AI - Azure AI Search | Microsoft Learn&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Choosing a Vector Store – &lt;em&gt;If you already use a specific type of database… using that same type might be the easiest solution… each service has unique capabilities and limitations for vector search.&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/architecture/guide/technology-choices/vector-search#:~:text=If%20you%20choose%20a%20traditional,supports%20the%20functionality%20you%20require" rel="noopener noreferrer"&gt;Choose an Azure service for vector search - Azure Architecture Center | Microsoft Learn&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure Search + OpenAI Sample (C#) – &lt;em&gt;This sample… uses Azure OpenAI Service (ChatGPT) and Azure AI Search for data indexing and retrieval.&lt;/em&gt; (&lt;a href="https://github.com/Azure-Samples/azure-search-openai-demo-csharp#:~:text=This%20sample%20demonstrates%20a%20few,for%20data%20indexing%20and%20retrieval" rel="noopener noreferrer"&gt;GitHub - Azure-Samples/azure-search-openai-demo-csharp: A sample app for the Retrieval-Augmented Generation pattern running in Azure, using Azure Cognitive Search for retrieval and Azure OpenAI large language models to power ChatGPT-style and Q&amp;amp;A experiences.&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Azure OpenAI Quickstart Note – &lt;em&gt;“You can use API keys or roles… Keys are easier to start with.”&lt;/em&gt; (&lt;a href="https://learn.microsoft.com/en-us/azure/search/search-get-started-rag#:~:text=Configure%20access" rel="noopener noreferrer"&gt;Quickstart: Generative Search (RAG) - Azure AI Search | Microsoft Learn&lt;/a&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;p&gt;&lt;a href="https://www.pexels.com/photo/abstract-digital-circuitry-with-glowing-effects-30547568/" rel="noopener noreferrer"&gt;Photo by Pachon in Motion&lt;/a&gt;&lt;/p&gt;

</description>
      <category>openai</category>
      <category>azure</category>
      <category>tutorial</category>
      <category>rag</category>
    </item>
    <item>
      <title>How to Integrate OpenAI for Text Generation, Text-to-Speech, and Speech-to-Text in .NET</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Thu, 07 Nov 2024 19:10:20 +0000</pubDate>
      <link>https://forem.com/petermilovcik/how-to-integrate-openai-for-text-generation-text-to-speech-and-speech-to-text-in-net-2il0</link>
      <guid>https://forem.com/petermilovcik/how-to-integrate-openai-for-text-generation-text-to-speech-and-speech-to-text-in-net-2il0</guid>
<description>&lt;p&gt;With the release of version 2.0.0 of OpenAI's official NuGet package, developers can easily integrate AI-driven text generation, text-to-speech (TTS), and speech-to-text (STT) functionality into their .NET applications. This guide walks through creating an OpenAI service in .NET that lets you generate text responses, convert text to audio, and transcribe audio files back to text.&lt;/p&gt;

&lt;p&gt;This implementation will use the minimum configuration necessary. For Windows, we’ll also leverage the &lt;a href="https://www.nuget.org/packages/NAudio/" rel="noopener noreferrer"&gt;NAudio&lt;/a&gt; package for handling audio playback, as it offers a straightforward solution for recording and playing audio files.&lt;/p&gt;




&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;Before you start integrating OpenAI’s capabilities into your .NET project, make sure you have the following set up:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Install the OpenAI NuGet Package (Version 2.0.0):&lt;/strong&gt; Add the latest version of the &lt;a href="https://www.nuget.org/packages/OpenAI/" rel="noopener noreferrer"&gt;OpenAI NuGet package&lt;/a&gt; to your .NET project:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet add package OpenAI &lt;span class="nt"&gt;--version&lt;/span&gt; 2.0.0
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="2"&gt;
&lt;li&gt;
&lt;strong&gt;Install NAudio (for Windows audio handling):&lt;/strong&gt; If you're working on a Windows machine and need to handle audio recording or playback, add the &lt;a href="https://www.nuget.org/packages/NAudio/" rel="noopener noreferrer"&gt;NAudio NuGet package&lt;/a&gt;:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet add package NAudio
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="3"&gt;
&lt;li&gt;
&lt;strong&gt;Set the OpenAI API Key:&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;For Windows users&lt;/strong&gt;, you can set the &lt;code&gt;OPENAI_API_KEY&lt;/code&gt; environment variable using the Command Prompt:
&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;setx OPENAI_API_KEY your_openai_api_key
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Note:&lt;/strong&gt; As written, &lt;code&gt;setx&lt;/code&gt; sets the variable for your user account only; to make it system-wide, add the &lt;code&gt;/M&lt;/code&gt; flag and run the command in a Command Prompt with administrative privileges.&lt;/li&gt;
&lt;li&gt;Restart any open Command Prompt or PowerShell windows after running this command to ensure the new variable is recognized.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;For other platforms (macOS, Linux)&lt;/strong&gt;, you can set the environment variable for the current shell session using the command below; add it to your shell profile (e.g. &lt;code&gt;~/.bashrc&lt;/code&gt; or &lt;code&gt;~/.zshrc&lt;/code&gt;) to make it permanent:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;export &lt;/span&gt;&lt;span class="nv"&gt;OPENAI_API_KEY&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;your_openai_api_key
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ol start="4"&gt;
&lt;li&gt;
&lt;strong&gt;Ensure .NET SDK is Installed:&lt;/strong&gt; Make sure you have the latest version of the .NET SDK installed. You can check your version using:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;With these prerequisites in place, you are ready to start building AI-enhanced features into your .NET applications!&lt;/p&gt;




&lt;h3&gt;
  
  
  Step 1: Generating Text Responses
&lt;/h3&gt;

&lt;p&gt;The following &lt;code&gt;OpenAiService&lt;/code&gt; class uses OpenAI’s text generation API, powered by the &lt;code&gt;gpt-4o-mini&lt;/code&gt; model, to generate responses based on a given prompt.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;OpenAiService&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;readonly&lt;/span&gt; &lt;span class="n"&gt;ChatClient&lt;/span&gt; &lt;span class="n"&gt;_chatClient&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="nf"&gt;OpenAiService&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="n"&gt;_chatClient&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;ChatClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"gpt-4o-mini"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;GenerateResponseAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;messages&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="n"&gt;ChatMessage&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;SystemChatMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"You are a knowledgeable assistant."&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt;
            &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;UserChatMessage&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;$"Generate a response based on the prompt:\n\n&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;completion&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;_chatClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;CompleteChatAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;messages&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;completion&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Content&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="m"&gt;0&lt;/span&gt;&lt;span class="p"&gt;].&lt;/span&gt;&lt;span class="n"&gt;Text&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this class:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;The &lt;code&gt;GenerateResponseAsync&lt;/code&gt; method takes a &lt;code&gt;prompt&lt;/code&gt; and generates a response.&lt;/li&gt;
&lt;li&gt;We initiate a conversation by sending a system message, setting the tone as a "knowledgeable assistant."&lt;/li&gt;
&lt;li&gt;Finally, we pass the prompt to the model and return the generated response.&lt;/li&gt;
&lt;/ul&gt;
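&lt;p&gt;To try the service out, you can call it from a console app. A minimal sketch (it assumes the &lt;code&gt;OPENAI_API_KEY&lt;/code&gt; variable from the setup above is set, and it makes a real API call, so network access is required):&lt;/p&gt;

```csharp
// Minimal usage sketch for OpenAiService (requires OPENAI_API_KEY and network access).
var service = new OpenAiService();
var answer = await service.GenerateResponseAsync("Explain async/await in one sentence.");
Console.WriteLine(answer);
```

Because `GenerateResponseAsync` is awaitable, it slots directly into top-level statements or any other async code path in your app.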

&lt;h3&gt;
  
  
  Step 2: Converting Text to Speech
&lt;/h3&gt;

&lt;p&gt;To convert text to speech, we’ll use OpenAI’s TTS functionality. This &lt;code&gt;TextToSpeechService&lt;/code&gt; class converts a given text to an audio file and plays it.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;TextToSpeechService&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;readonly&lt;/span&gt; &lt;span class="n"&gt;AudioClient&lt;/span&gt; &lt;span class="n"&gt;_audioClient&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="nf"&gt;TextToSpeechService&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="n"&gt;_audioClient&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;AudioClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"tts-1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;ConvertTextToSpeechAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;speech&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;_audioClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GenerateSpeechAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;GeneratedSpeechVoice&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Onyx&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;stream&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;File&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;OpenWrite&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"output.mp3"&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;speech&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ToStream&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;CopyTo&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;stream&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="nf"&gt;PlayAudio&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"output.mp3"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;void&lt;/span&gt; &lt;span class="nf"&gt;PlayAudio&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;audioFile&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;AudioFileReader&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;filePath&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
        &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;outputDevice&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;WaveOutEvent&lt;/span&gt;&lt;span class="p"&gt;())&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;outputDevice&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Init&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;audioFile&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
            &lt;span class="n"&gt;outputDevice&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Play&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Key points:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;ConvertTextToSpeechAsync&lt;/code&gt; accepts a text string, converts it into speech, and saves it as an MP3 file.&lt;/li&gt;
&lt;li&gt;The &lt;code&gt;PlayAudio&lt;/code&gt; method leverages &lt;code&gt;NAudio&lt;/code&gt; for playback. It reads the MP3 file and plays it back on your system.&lt;/li&gt;
&lt;/ul&gt;
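&lt;p&gt;Usage is a single call; the method writes &lt;code&gt;output.mp3&lt;/code&gt; to the working directory and then plays it. A sketch (it assumes &lt;code&gt;OPENAI_API_KEY&lt;/code&gt; is set and, for playback, that the NAudio package is installed on Windows):&lt;/p&gt;

```csharp
// Speak a short sentence through the system's default audio output.
var tts = new TextToSpeechService();
await tts.ConvertTextToSpeechAsync("Hello from the OpenAI text-to-speech API.");
```
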

&lt;h3&gt;
  
  
  Step 3: Transcribing Audio to Text
&lt;/h3&gt;

&lt;p&gt;The following &lt;code&gt;SpeechToTextService&lt;/code&gt; class uses OpenAI’s Whisper model to transcribe audio files into text. This can be incredibly useful for processing voice input.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;SpeechToTextService&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;readonly&lt;/span&gt; &lt;span class="n"&gt;AudioClient&lt;/span&gt; &lt;span class="n"&gt;_audioClient&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="nf"&gt;SpeechToTextService&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Environment&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetEnvironmentVariable&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"OPENAI_API_KEY"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="n"&gt;_audioClient&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;AudioClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"whisper-1"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;apiKey&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;

    &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt;&lt;span class="p"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;&amp;gt;&lt;/span&gt; &lt;span class="nf"&gt;TranscribeAudioAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;audioFilePath&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;transcription&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;_audioClient&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;TranscribeAudioAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;audioFilePath&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;transcription&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Text&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This class:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Accepts an audio file path and transcribes the audio content into text.&lt;/li&gt;
&lt;li&gt;The transcription result is returned as a plain text string.&lt;/li&gt;
&lt;/ul&gt;
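&lt;p&gt;A quick sketch of transcribing an existing file (Whisper accepts common formats such as WAV and MP3; this assumes &lt;code&gt;OPENAI_API_KEY&lt;/code&gt; is set and a &lt;code&gt;recording.wav&lt;/code&gt; file exists next to the executable):&lt;/p&gt;

```csharp
// Transcribe a pre-recorded audio file and print the resulting text.
var stt = new SpeechToTextService();
var text = await stt.TranscribeAudioAsync("recording.wav");
Console.WriteLine(text);
```
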




&lt;h3&gt;
  
  
  Step 4: Recording Audio with NAudio
&lt;/h3&gt;

&lt;p&gt;For applications that need to capture audio from the user, such as for speech-to-text input, you can use the &lt;code&gt;NAudio&lt;/code&gt; library to record audio and save it as a &lt;code&gt;.wav&lt;/code&gt; file. This is especially useful for Windows-based applications, where &lt;code&gt;NAudio&lt;/code&gt; provides a straightforward API for handling audio input.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;StartRecordingAsync&lt;/code&gt; method below demonstrates how to record audio from the default microphone, saving it to a specified output file path.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;StartRecordingAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;outputFilePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;CancellationToken&lt;/span&gt; &lt;span class="n"&gt;cancellationToken&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;waveFormat&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;WaveFormat&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="m"&gt;44100&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;16&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// 44.1 kHz, 16-bit, mono&lt;/span&gt;
    &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;waveIn&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;WaveInEvent&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="n"&gt;WaveFormat&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;waveFormat&lt;/span&gt; &lt;span class="p"&gt;})&lt;/span&gt;
    &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;writer&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;WaveFileWriter&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;outputFilePath&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;waveFormat&lt;/span&gt;&lt;span class="p"&gt;))&lt;/span&gt;
    &lt;span class="p"&gt;{&lt;/span&gt;
        &lt;span class="n"&gt;waveIn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;DataAvailable&lt;/span&gt; &lt;span class="p"&gt;+=&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;sender&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;writer&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Write&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Buffer&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;BytesRecorded&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
        &lt;span class="p"&gt;};&lt;/span&gt;

        &lt;span class="n"&gt;waveIn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StartRecording&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

        &lt;span class="k"&gt;try&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Delay&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;Timeout&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Infinite&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cancellationToken&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt; &lt;span class="c1"&gt;// Keeps recording until cancellation&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
        &lt;span class="k"&gt;catch&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;TaskCanceledException&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
        &lt;span class="p"&gt;{&lt;/span&gt;
            &lt;span class="n"&gt;waveIn&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StopRecording&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
        &lt;span class="p"&gt;}&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In this code:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Initialize Audio Format&lt;/strong&gt;: We set up the audio format to 44.1 kHz, 16-bit, mono. These settings provide good quality for most voice recordings.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Create Audio Input and Writer&lt;/strong&gt;: We use &lt;code&gt;WaveInEvent&lt;/code&gt; for capturing audio from the default microphone and &lt;code&gt;WaveFileWriter&lt;/code&gt; to write the audio data to a file.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Handle Data Available Event&lt;/strong&gt;: As audio data becomes available (captured in chunks), it is written to the file through the &lt;code&gt;writer&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Start and Stop Recording&lt;/strong&gt;: Recording starts with &lt;code&gt;StartRecording()&lt;/code&gt; and will continue until the provided &lt;code&gt;CancellationToken&lt;/code&gt; is canceled, at which point &lt;code&gt;StopRecording()&lt;/code&gt; is called to end the recording.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  Usage Example
&lt;/h3&gt;

&lt;p&gt;To start recording audio, you can call this method and provide a file path and cancellation token:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;recordingService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;AudioRecordingService&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;CancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Recording audio. Press any key to stop..."&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;recordingService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StartRecordingAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"recording.wav"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Token&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Wait for a key press to stop recording&lt;/span&gt;
&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ReadKey&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cancel&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This example will begin recording audio and save it to &lt;code&gt;recording.wav&lt;/code&gt; until a key is pressed, triggering the cancellation of the recording.&lt;/p&gt;

&lt;p&gt;With the addition of audio recording using &lt;code&gt;NAudio&lt;/code&gt;, you now have a full toolkit for handling text generation, text-to-speech, speech-to-text, and audio recording within your .NET application. This setup provides a complete pipeline for interactive and conversational applications in .NET, enabling voice-based input, audio output, and seamless integration with OpenAI’s powerful language models.&lt;/p&gt;

&lt;h3&gt;
  
  
  Putting It All Together
&lt;/h3&gt;

&lt;p&gt;With these services implemented, you have the foundation for a fully interactive .NET application that can generate text, convert text to speech, transcribe spoken input, and record audio. Here’s an example of how to use all four services in a cohesive application.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;openAiService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;OpenAiService&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;ttsService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;TextToSpeechService&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;sttService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;SpeechToTextService&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;recordingService&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;AudioRecordingService&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;CancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="c1"&gt;// Step 1: Generate a Text Response&lt;/span&gt;
&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;prompt&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"Tell me something interesting about AI."&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;generatedText&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;openAiService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GenerateResponseAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;prompt&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Generated Text: "&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt; &lt;span class="n"&gt;generatedText&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Step 2: Convert Generated Text to Speech&lt;/span&gt;
&lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;ttsService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ConvertTextToSpeechAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;generatedText&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="c1"&gt;// Step 3: Record Audio Input&lt;/span&gt;
&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Recording audio input. Press any key to stop recording..."&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="n"&gt;_&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;recordingService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;StartRecordingAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"user_recording.wav"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Token&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ReadKey&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="n"&gt;cancellationTokenSource&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Cancel&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;

&lt;span class="c1"&gt;// Step 4: Transcribe Recorded Audio&lt;/span&gt;
&lt;span class="kt"&gt;string&lt;/span&gt; &lt;span class="n"&gt;transcribedText&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;sttService&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;TranscribeAudioAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"user_recording.wav"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"Transcribed Text: "&lt;/span&gt; &lt;span class="p"&gt;+&lt;/span&gt; &lt;span class="n"&gt;transcribedText&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Using OpenAI’s .NET SDK alongside NAudio, you can bring powerful AI capabilities into your .NET applications. This integration covers:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Text Generation&lt;/strong&gt;: Generate contextually relevant responses.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Text-to-Speech&lt;/strong&gt;: Convert generated text to audio for a more interactive experience.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Speech-to-Text&lt;/strong&gt;: Capture and transcribe user input.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Audio Recording&lt;/strong&gt;: Enable seamless audio capture for user interactions.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This setup provides a complete, interactive pipeline that can power chatbots, virtual assistants, or any voice-enabled application. By following this guide, you’ll have a solid foundation for enhancing your .NET applications with AI-powered, voice-driven features.&lt;/p&gt;

&lt;h3&gt;
  
  
  Before You Go...
&lt;/h3&gt;

&lt;p&gt;Did this guide help you level up your .NET skills with OpenAI integration? If so, let’s spread the knowledge! Give it a like, share with fellow devs, or drop a comment below. Every interaction helps boost this content, bringing these tips to more developers. And hey—if it didn’t deliver, no hard feelings; your silence speaks louder than clicks! 😉&lt;/p&gt;

</description>
      <category>openai</category>
      <category>dotnet</category>
      <category>beginners</category>
      <category>ai</category>
    </item>
    <item>
      <title>Deesix: A Journey from MUDs to Modern AI-Powered RPGs</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Wed, 31 Jul 2024 11:33:14 +0000</pubDate>
      <link>https://forem.com/petermilovcik/deesix-a-journey-from-muds-to-modern-ai-powered-rpgs-38n4</link>
      <guid>https://forem.com/petermilovcik/deesix-a-journey-from-muds-to-modern-ai-powered-rpgs-38n4</guid>
      <description>&lt;h3&gt;
  
  
  A Nostalgic Journey
&lt;/h3&gt;

&lt;p&gt;Back in the day, I spent countless hours exploring the immersive worlds of MUDs, those classic text-based games that brought fantasy adventures to life with just words on a screen. There was something magical about using our imagination to fill in the gaps, picturing vast dungeons, mythical creatures, and epic battles. It was a time of pure creativity and wonder, where every command could lead to a new discovery or challenge. Those games weren't just pastimes; they were gateways to other worlds, and I cherished every moment spent in them. &lt;/p&gt;

&lt;p&gt;Now, as I look back, I find myself wanting to recreate that magic with a modern twist.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Evolution of Technology and Gaming
&lt;/h3&gt;

&lt;p&gt;With the rise of Generative AI, we've seen a huge leap in what technology can do. These advanced AI models can now create incredibly detailed descriptions, making it easier than ever to bring game worlds to life. Imagine walking into a room and instantly getting a vivid picture of its atmosphere, or encountering a character with a rich backstory—all generated on the fly. This technology has made it possible to craft immersive experiences that were once only possible in our imaginations.&lt;/p&gt;

&lt;p&gt;For game developers, this means less time spent on manual content creation and more time focusing on making the game fun and engaging. It's a game-changer, opening up new possibilities for storytelling and player interaction.&lt;/p&gt;

&lt;h3&gt;
  
  
  Introducing Deesix: A Modern Text-Based RPG
&lt;/h3&gt;

&lt;p&gt;Deesix is a modern take on the classic text-based RPGs that we all loved, blending old-school charm with new technology. Inspired by the traditional fantasy elements of games like Dungeons &amp;amp; Dragons, Deesix brings a unique twist with the help of Generative AI.&lt;/p&gt;

&lt;p&gt;One of the standout features is the &lt;strong&gt;Game Master AI&lt;/strong&gt;, which acts like a narrator and guide. It creates the game's world, describes locations, and offers a variety of actions for players to choose from. This AI isn't just static; it adapts to your choices, making the game feel alive and interactive. &lt;/p&gt;

&lt;p&gt;Another cool aspect is the &lt;strong&gt;procedural generation&lt;/strong&gt;. Every time you play, the world is freshly generated, so no two adventures are the same. This keeps things exciting and unpredictable, offering endless exploration possibilities. And for those who love a bit of randomness, the game mechanics revolve around using D6 dice, just like in tabletop RPGs. This simple yet engaging system adds an extra layer of fun and strategy to the gameplay.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Development Journey: Challenges and Milestones
&lt;/h3&gt;

&lt;p&gt;Creating Deesix is set to be an exciting adventure full of ups and downs. It will start with a simple idea: to bring the magic of old-school text-based RPGs into the modern era. I aim to capture that nostalgic feel while adding something fresh and new with the help of AI.&lt;/p&gt;

&lt;p&gt;Of course, like any project, there will likely be challenges along the way. Balancing the AI's ability to generate engaging content with the player's freedom of choice could be a big one. I might also face technical hurdles, like optimizing the game for different platforms and ensuring the AI's responses are quick and relevant. But each challenge will be a learning experience, pushing me to find creative solutions and improve the game.&lt;/p&gt;

&lt;p&gt;Despite these potential obstacles, I'm hopeful for great successes. Seeing the Game Master AI come to life and create dynamic, immersive worlds will be incredibly rewarding. Each milestone, from the first playable prototype to the refined gameplay mechanics, will bring me closer to making Deesix a reality. And I'm excited to share these experiences, both the successes and struggles, as I move forward with the game's development.&lt;/p&gt;

&lt;h3&gt;
  
  
  Future Updates and Community Engagement
&lt;/h3&gt;

&lt;p&gt;As I continue developing Deesix, I'm eager to share my journey with you. This blog post is just the beginning. I'll be posting regular updates, detailing everything from my development milestones to the challenges I face. Whether it's a new feature I'm excited about or a problem I'm tackling, I want to keep you in the loop.&lt;/p&gt;

&lt;p&gt;I also invite you to join the conversation and be a part of this journey. You can follow the &lt;code&gt;#Deesix&lt;/code&gt; tag for updates or follow me directly for all my future posts. Your feedback and support will be invaluable as I bring this game to life. Let's explore this adventure together!&lt;/p&gt;

&lt;h3&gt;
  
  
  Conclusion: An Invitation to Join the Journey
&lt;/h3&gt;

&lt;p&gt;I'm thrilled to embark on this journey of creating Deesix and bringing a fresh take on text-based RPGs to life. This project is more than just a game; it's a tribute to the nostalgic days of MUDs and a step into the future with Generative AI. I can't wait to share more about the development process, the ups and downs, and everything in between.&lt;/p&gt;

&lt;p&gt;It's important to note that Deesix is just a side project of mine. While I'm passionate about it, I won't be able to work on it full-time. So, please don't expect miracles overnight. Progress might be steady but slow. However, your interest and support mean the world to me, and I'm excited to share each step of the journey with you. Together, let's create something amazing! Follow the &lt;code&gt;#Deesix&lt;/code&gt; tag or my profile for the latest updates, insights, and stories from behind the scenes.&lt;/p&gt;




&lt;p&gt;&lt;a href="https://www.onceuponapicture.co.uk/portfolio_page/whos-there/" rel="noopener noreferrer"&gt;Image by Alejandro Burdisio/Burda&lt;/a&gt;&lt;/p&gt;

</description>
      <category>deesix</category>
      <category>devjournal</category>
      <category>ai</category>
      <category>gamedev</category>
    </item>
    <item>
      <title>How to start with InfluxDB OSS v2 and C# Client Library for Windows and Docker</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Wed, 10 Apr 2024 09:36:59 +0000</pubDate>
      <link>https://forem.com/petermilovcik/how-to-start-with-influxdb-oss-v2-and-c-client-library-for-windows-and-docker-78k</link>
      <guid>https://forem.com/petermilovcik/how-to-start-with-influxdb-oss-v2-and-c-client-library-for-windows-and-docker-78k</guid>
      <description>&lt;h2&gt;
  
  
  What is InfluxDB OSS v2?
&lt;/h2&gt;

&lt;p&gt;InfluxDB OSS (Open Source Software) v2 is the latest version of the InfluxDB time series database platform. It is designed to handle high write and query loads, making it a suitable tool for storing and analyzing large amounts of time-stamped data. This version combines capabilities from the previous versions of InfluxDB 1.x (like storage engine, query language) and other components like Chronograf (visualization) and Kapacitor (processing, alerting, and downsampling). It also introduces Flux, a new scripting and query language that supports analytics across measurements.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 1: Install Docker Desktop on Windows
&lt;/h2&gt;

&lt;p&gt;Before you begin, you need Docker Desktop installed on your Windows machine. Docker Desktop provides an easy-to-use interface and the Docker CLI (Command Line Interface) necessary for running containerized applications like InfluxDB.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Download Docker Desktop&lt;/strong&gt;: Go to the &lt;a href="https://www.docker.com/products/docker-desktop/" rel="noopener noreferrer"&gt;Docker Desktop: The #1 Containerization Tool for Developers | Docker&lt;/a&gt; and download the latest version of Docker Desktop for Windows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Installation&lt;/strong&gt;: Run the installer and follow the on-screen instructions. Make sure to enable the WSL 2 feature if prompted, as it's required for Docker Desktop to run efficiently on Windows.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Verification&lt;/strong&gt;: After installation, open a terminal (you can use PowerShell or the Windows Command Prompt) and type &lt;code&gt;docker --version&lt;/code&gt; to verify that Docker has been installed correctly. You should see the Docker version displayed.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 2: Pull and Run the InfluxDB Docker Image
&lt;/h2&gt;

&lt;p&gt;First, ensure you have Docker Desktop installed. Then, use the terminal to pull the latest version of InfluxDB, version 2.7 as of today (2024-04-10). Verify the latest version on &lt;a href="https://hub.docker.com/_/influxdb" rel="noopener noreferrer"&gt;Docker Hub&lt;/a&gt;.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Pull the InfluxDB Image&lt;/strong&gt;: Execute &lt;code&gt;docker pull influxdb:2.7&lt;/code&gt; in the terminal to download the InfluxDB 2.7 image.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Run the Container&lt;/strong&gt;: Use the command below to start your InfluxDB container:
&lt;/li&gt;
&lt;/ol&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;docker run &lt;span class="nt"&gt;--name&lt;/span&gt; influxdb2 &lt;span class="nt"&gt;-d&lt;/span&gt; &lt;span class="nt"&gt;-p&lt;/span&gt; 8086:8086 &lt;span class="nt"&gt;-v&lt;/span&gt; influxdb2-data:/var/lib/influxdb2 &lt;span class="nt"&gt;-v&lt;/span&gt; influxdb2-config:/etc/influxdb2 influxdb:2.7
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;--name influxdb2&lt;/code&gt;: Assigns the container a name (&lt;code&gt;influxdb2&lt;/code&gt;) for easy reference.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-d&lt;/code&gt;: Runs the container in detached mode, meaning it runs in the background.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-p 8086:8086&lt;/code&gt;: Maps port 8086 inside the container to port 8086 on your host, making the InfluxDB UI accessible at &lt;code&gt;http://localhost:8086&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;-v&lt;/code&gt;: Creates volumes for persistent data storage and configuration settings, ensuring your data is saved outside the container.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This command sets up InfluxDB version 2.7 in a container, making it ready for configuration and use.&lt;/p&gt;

&lt;h2&gt;
  
  
  Step 3: Configure InfluxDB
&lt;/h2&gt;

&lt;p&gt;After starting the container, you'll need to configure InfluxDB.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Access the InfluxDB UI&lt;/strong&gt;: Open your web browser and go to &lt;code&gt;http://localhost:8086&lt;/code&gt;. You should see the InfluxDB welcome screen.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9hxpzwhewyzgepaoycg.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw9hxpzwhewyzgepaoycg.png" alt="InfluxDB Welcome Page" width="800" height="479"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Initial Setup&lt;/strong&gt;: Follow the prompts to create an initial user, organization, and bucket. Remember the credentials you set here; you'll need them for accessing InfluxDB later.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Secure Your InfluxDB&lt;/strong&gt;: It's crucial to secure your InfluxDB instance. Make sure to note down the generated token during setup, as it's needed for API authentication.&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Step 4: Creating a Simple Console Application with .NET 8 Using the InfluxDB C# Client Library
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Create a New Console Application&lt;/strong&gt;: Open your terminal or command prompt, navigate to the desired directory, and run the following command to create a new .NET 8 console application.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;dotnet new console &lt;span class="nt"&gt;-n&lt;/span&gt; InfluxDB2Demo
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Install the InfluxDB Client Library&lt;/strong&gt;: Use the .NET CLI command to add the InfluxDB client package to your project.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;cd &lt;/span&gt;InfluxDB2Demo
dotnet add package InfluxDB.Client
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Open the console application&lt;/strong&gt; in your favorite IDE, e.g., Visual Studio Code:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;code .
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Open the &lt;code&gt;Program.cs&lt;/code&gt; file to continue.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Initialize the Client&lt;/strong&gt;: &lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;To initialize the InfluxDB client in a C# application:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Reference the InfluxDB Client Namespace&lt;/strong&gt;: Add &lt;code&gt;using InfluxDB.Client;&lt;/code&gt; at the top of your program file.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Create the Client Instance&lt;/strong&gt;: Instantiate the &lt;code&gt;InfluxDBClient&lt;/code&gt; by calling its constructor with two arguments: the URL to your InfluxDB instance and your InfluxDB token.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Write Data&lt;/strong&gt;:

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;By POCO&lt;/strong&gt;: Define a class with your data schema, instantiate it, and write it using &lt;code&gt;writeApi.WriteMeasurement()&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;strong&gt;Query Data&lt;/strong&gt;: Write a Flux query string to retrieve data from your bucket and use &lt;code&gt;influxDBClient.GetQueryApi().QueryAsync()&lt;/code&gt; to execute the query and process the results.
&lt;/li&gt;

&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight csharp"&gt;&lt;code&gt;&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;InfluxDB.Client&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;InfluxDB.Client.Api.Domain&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;InfluxDB.Client.Core&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="k"&gt;internal&lt;/span&gt; &lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Program&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="k"&gt;private&lt;/span&gt; &lt;span class="k"&gt;static&lt;/span&gt; &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="n"&gt;Task&lt;/span&gt; &lt;span class="nf"&gt;Main&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;[]&lt;/span&gt; &lt;span class="n"&gt;args&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"http://localhost:8086"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;token&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"Your_InfluxDB_Token"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;var&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nf"&gt;InfluxDBClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;token&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="c1"&gt;// Write data&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="k"&gt;using&lt;/span&gt; &lt;span class="nn"&gt;var&lt;/span&gt; &lt;span class="n"&gt;writeApi&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetWriteApi&lt;/span&gt;&lt;span class="p"&gt;();&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;temperature&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="n"&gt;Temperature&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="n"&gt;Location&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"south"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;Value&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;Random&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Shared&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;Next&lt;/span&gt;&lt;span class="p"&gt;(-&lt;/span&gt;&lt;span class="m"&gt;30&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="m"&gt;40&lt;/span&gt;&lt;span class="p"&gt;),&lt;/span&gt; &lt;span class="n"&gt;Time&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;DateTime&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;UtcNow&lt;/span&gt; &lt;span class="p"&gt;};&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;writeApi&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteMeasurement&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;temperature&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;WritePrecision&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Ns&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"bucket_name"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"org_id"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;

&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="c1"&gt;// Query data&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;flux&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="s"&gt;"from(bucket:\"bucket_name\") |&amp;gt; range(start: 0)"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;fluxTables&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetQueryApi&lt;/span&gt;&lt;span class="p"&gt;().&lt;/span&gt;&lt;span class="nf"&gt;QueryAsync&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;flux&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="s"&gt;"org_id"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;fluxTables&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ForEach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fluxTable&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="kt"&gt;var&lt;/span&gt; &lt;span class="n"&gt;fluxRecords&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;fluxTable&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;Records&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;fluxRecords&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;ForEach&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;fluxRecord&lt;/span&gt; &lt;span class="p"&gt;=&amp;gt;&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="n"&gt;Console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;WriteLine&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;$"&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;fluxRecord&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetTime&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s"&gt;: &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="n"&gt;fluxRecord&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;GetValue&lt;/span&gt;&lt;span class="p"&gt;()}&lt;/span&gt;&lt;span class="s"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Measurement&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"temperature"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt;
&lt;span class="k"&gt;class&lt;/span&gt; &lt;span class="nc"&gt;Temperature&lt;/span&gt;
&lt;span class="p"&gt;{&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Column&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"location"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;IsTag&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;string&lt;/span&gt;&lt;span class="p"&gt;?&lt;/span&gt; &lt;span class="n"&gt;Location&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Column&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s"&gt;"value"&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="kt"&gt;double&lt;/span&gt; &lt;span class="n"&gt;Value&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="err"&gt; &lt;/span&gt; &lt;span class="err"&gt; &lt;/span&gt; &lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nf"&gt;Column&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;IsTimestamp&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="k"&gt;true&lt;/span&gt;&lt;span class="p"&gt;)]&lt;/span&gt; &lt;span class="k"&gt;public&lt;/span&gt; &lt;span class="n"&gt;DateTime&lt;/span&gt; &lt;span class="n"&gt;Time&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="k"&gt;get&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="k"&gt;set&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;NOTE&lt;/strong&gt;: Make sure to replace &lt;code&gt;"Your_InfluxDB_Token"&lt;/code&gt; with the actual InfluxDB API token you created in &lt;strong&gt;Step 3&lt;/strong&gt;, and likewise replace &lt;code&gt;bucket_name&lt;/code&gt; and &lt;code&gt;org_id&lt;/code&gt; with your own values. &lt;/p&gt;

&lt;p&gt;This simplified guide helps you set up a basic application to interact with InfluxDB. &lt;br&gt;
For detailed documentation and examples, refer to the &lt;a href="https://github.com/influxdata/influxdb-client-csharp" rel="noopener noreferrer"&gt;influxdata/influxdb-client-csharp: InfluxDB 2.x C# Client (github.com)&lt;/a&gt;.&lt;/p&gt;




&lt;p&gt;You've taken a big step by learning how to set up and start using InfluxDB OSS v2 with Docker and C#. This guide is just the beginning. There's a lot more you can do:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;a href="https://www.influxdata.com/products/flux/" rel="noopener noreferrer"&gt;Learn More About Flux&lt;/a&gt;: Get better at using Flux for more complex data tasks.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.influxdata.com/influxdb/v2/reference/internals/data-retention/" rel="noopener noreferrer"&gt;Learn More About Data Retention&lt;/a&gt;: Learn about data retention and how to keep your database running smoothly as it grows.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.influxdata.com/influxdb/cloud/admin/buckets/" rel="noopener noreferrer"&gt;Learn More About Buckets in InfluxDB&lt;/a&gt;: Get better understanding of how and where time series data is stored.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.influxdata.com/community/" rel="noopener noreferrer"&gt;Join the Community&lt;/a&gt;: There's a big community out there. Join forums, share ideas, and get help.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;Remember, practice makes perfect. Keep exploring and experimenting with InfluxDB to get the most out of it. For more guides and support, check out the &lt;a href="https://www.influxdata.com/" rel="noopener noreferrer"&gt;InfluxDB official page&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>beginners</category>
      <category>tutorial</category>
      <category>influxdb</category>
      <category>csharp</category>
    </item>
    <item>
      <title>Understanding BehaviorSubject in Angular Through the Thermometer Analogy</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Thu, 28 Sep 2023 08:58:34 +0000</pubDate>
      <link>https://forem.com/petermilovcik/understanding-behaviorsubject-in-angular-through-the-thermometer-analogy-554i</link>
      <guid>https://forem.com/petermilovcik/understanding-behaviorsubject-in-angular-through-the-thermometer-analogy-554i</guid>
      <description>&lt;p&gt;When learning Angular and delving into the reactive programming paradigm, it's easy to stumble upon terms that may seem intimidating—&lt;code&gt;Observable&lt;/code&gt;, &lt;code&gt;Subject&lt;/code&gt;, &lt;code&gt;BehaviorSubject&lt;/code&gt;, and so on. Today, we're going to demystify one of these terms: &lt;code&gt;BehaviorSubject&lt;/code&gt;.&lt;/p&gt;

&lt;h2&gt;
  
  
  Table of Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Introduction&lt;/li&gt;
&lt;li&gt;What is BehaviorSubject?&lt;/li&gt;
&lt;li&gt;The Thermometer Analogy&lt;/li&gt;
&lt;li&gt;Using &lt;code&gt;next()&lt;/code&gt; Method&lt;/li&gt;
&lt;li&gt;BehaviorSubject vs Observable&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;

&lt;h2&gt;
  
  
  1. Introduction
&lt;/h2&gt;

&lt;p&gt;Angular relies heavily on reactive programming patterns, particularly those provided by the RxJS library. Reactive programming is all about dealing with asynchronous operations and data streams. This is where &lt;code&gt;BehaviorSubject&lt;/code&gt; comes in as a powerful construct for managing these streams.&lt;/p&gt;

&lt;h2&gt;
  
  
  2. What is BehaviorSubject?
&lt;/h2&gt;

&lt;p&gt;A &lt;code&gt;BehaviorSubject&lt;/code&gt; is a type of subject, a particular kind of observable in RxJS. What makes it unique is that it stores the "current" value. This means that when you subscribe to it, you immediately get the latest emitted value—or a default value if none has been emitted yet.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt; &lt;span class="nx"&gt;BehaviorSubject&lt;/span&gt; &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="s1"&gt;rxjs&lt;/span&gt;&lt;span class="dl"&gt;'&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;temperature$&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;new&lt;/span&gt; &lt;span class="nx"&gt;BehaviorSubject&lt;/span&gt;&lt;span class="o"&gt;&amp;lt;&lt;/span&gt;&lt;span class="kr"&gt;number&lt;/span&gt;&lt;span class="o"&gt;&amp;gt;&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;20&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;In the code above, &lt;code&gt;temperature$&lt;/code&gt; is a &lt;code&gt;BehaviorSubject&lt;/code&gt; that starts with an initial value of 20. The &lt;code&gt;$&lt;/code&gt; at the end is a naming convention often used for Observables.&lt;/p&gt;

&lt;h2&gt;
  
  
  3. The Thermometer Analogy
&lt;/h2&gt;

&lt;p&gt;Think of &lt;code&gt;BehaviorSubject&lt;/code&gt; as a digital thermometer in a room. Various departments in an organization—say, Maintenance, Operations, and HR—need to know the room's temperature.&lt;/p&gt;

&lt;h3&gt;
  
  
  Initialization
&lt;/h3&gt;

&lt;p&gt;When you set up the thermometer, you determine that the initial temperature is 20°C. This initial value is crucial for any department that decides to monitor the temperature.&lt;/p&gt;

&lt;h3&gt;
  
  
  Subscriptions
&lt;/h3&gt;

&lt;p&gt;Maintenance and Operations departments 'subscribe' to the thermometer. From that point on, they get updated readings whenever the temperature changes.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="nx"&gt;temperature$&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;subscribe&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;temp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Maintenance department: Current temperature &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;temp&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;°C`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;

&lt;span class="nx"&gt;temperature$&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;subscribe&lt;/span&gt;&lt;span class="p"&gt;((&lt;/span&gt;&lt;span class="nx"&gt;temp&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="s2"&gt;`Operations department: Current temperature &lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;temp&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;°C`&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  New Subscription
&lt;/h3&gt;

&lt;p&gt;Now, HR also decides to subscribe. The moment they do, they receive the current temperature, whatever it happens to be at that point, not necessarily the initial 20°C.&lt;/p&gt;
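&lt;p&gt;As a minimal sketch (assuming the &lt;code&gt;temperature$&lt;/code&gt; subject from above, and that the reading has since been updated to 22°C), a late subscriber receives the current value the instant it subscribes:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;// HR subscribes late, after the temperature changed to 22°C.
// BehaviorSubject replays its current value to every new subscriber,
// so this callback fires immediately with 22, not the initial 20.
temperature$.subscribe((temp) =&amp;gt; {
  console.log(`HR department: Current temperature ${temp}°C`);
});
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;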

&lt;h2&gt;
  
  
  4. Using &lt;code&gt;next()&lt;/code&gt; Method
&lt;/h2&gt;

&lt;p&gt;To change the thermometer's reading, you use the &lt;code&gt;next()&lt;/code&gt; method. This action updates the current value and informs all subscribers about this change.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;&lt;span class="nx"&gt;temperature$&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;next&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;22&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This will log:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Maintenance department: Current temperature 22°C
Operations department: Current temperature 22°C
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;next()&lt;/code&gt; method pushes the new value into the &lt;code&gt;BehaviorSubject&lt;/code&gt;, which stores it as the new current value and notifies all active subscribers.&lt;/p&gt;

&lt;h2&gt;
  
  
  5. BehaviorSubject vs Observable
&lt;/h2&gt;

&lt;p&gt;You might ask, "How is this different from an &lt;code&gt;Observable&lt;/code&gt;?" The crucial difference lies in the initial value and the immediate availability of the current value. With a standard &lt;code&gt;Observable&lt;/code&gt;, new subscribers must wait for the next emission; unlike with &lt;code&gt;BehaviorSubject&lt;/code&gt;, they don't receive the latest value at the moment they subscribe.&lt;/p&gt;
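&lt;p&gt;A minimal sketch of the contrast, using RxJS's plain &lt;code&gt;Subject&lt;/code&gt; (which has no initial value) as a stand-in for a source that only emits on new events:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight typescript"&gt;&lt;code&gt;import { Subject, BehaviorSubject } from 'rxjs';

const plain$ = new Subject();
const behavior$ = new BehaviorSubject(20);

// Nothing is logged yet: a plain Subject has no value to replay,
// so this subscriber waits for the next emission.
plain$.subscribe((v) =&amp;gt; console.log(`plain: ${v}`));

// Logs "behavior: 20" immediately: the current value is replayed
// to the new subscriber at subscription time.
behavior$.subscribe((v) =&amp;gt; console.log(`behavior: ${v}`));
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;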

&lt;h2&gt;
  
  
  6. Conclusion
&lt;/h2&gt;

&lt;p&gt;Understanding &lt;code&gt;BehaviorSubject&lt;/code&gt; can significantly impact how you manage state and data flow in Angular applications. The ability to emit, listen, and react to new data points in different parts of your application is not just powerful; it also aligns closely with Angular's reactive ethos.&lt;/p&gt;

&lt;p&gt;With this newfound knowledge, you're one step closer to mastering Angular's reactive landscape. Happy coding!&lt;/p&gt;

</description>
      <category>webdev</category>
      <category>angular</category>
      <category>beginners</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>10 Psychological Archetypes of Test Automation Engineers: Which One Are You?</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Sat, 19 Aug 2023 13:58:54 +0000</pubDate>
      <link>https://forem.com/petermilovcik/10-psychological-archetypes-of-test-automation-engineers-which-one-are-you-24o9</link>
      <guid>https://forem.com/petermilovcik/10-psychological-archetypes-of-test-automation-engineers-which-one-are-you-24o9</guid>
      <description>&lt;p&gt;In the vast universe of software development, test automation engineers play a pivotal role in ensuring software's reliability and robustness. Yet, beyond the technical aspects, there’s a fascinating human element at play. These engineers, driven by unique psychological traits, embody distinct archetypes that define their work approach. In diving deep into these archetypes, we might not only understand our colleagues better but also unearth our own latent traits.&lt;/p&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;The Perfectionist&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;For the Perfectionist, every script is a masterpiece. Their pursuit of flawlessness might sometimes be their Achilles' heel as they often risk getting entangled in the minutiae. &lt;/p&gt;

&lt;h3&gt;
  
  
  2. &lt;strong&gt;The Quick-Fixer&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Ever seen tests get up and running in the blink of an eye? Thank the Quick-Fixer. But, like every rose has its thorn, this speed sometimes comes at the cost of sustainability.&lt;/p&gt;

&lt;h3&gt;
  
  
  3. &lt;strong&gt;The Strategist&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;This archetype sees beyond the immediate. They're the architects, ensuring the foundations are robust. Their test frameworks don’t just serve the present but are scalable visions for the future.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. &lt;strong&gt;The Collaborator&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The belief driving the Collaborator is simple: unity is strength. They know the magic that unfolds when minds come together, resulting in automation solutions that are greater than the sum of their parts.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. &lt;strong&gt;The Innovator&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In a rapidly evolving tech landscape, the Innovator is our beacon. With an insatiable appetite for the novel, they’re our bridge to the latest in tools and methodologies.&lt;/p&gt;

&lt;h3&gt;
  
  
  6. &lt;strong&gt;The Taskmaster&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Every team needs its pulse. And the Taskmaster is just that. Their rigorous schedules and checklists ensure that the ship not only sails but also stays its course.&lt;/p&gt;

&lt;h3&gt;
  
  
  7. &lt;strong&gt;The Challenger&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Status quo? Not for the Challenger. With an instinct to question, they ensure that we don't just do things right, but we also do the right things.&lt;/p&gt;

&lt;h3&gt;
  
  
  8. &lt;strong&gt;The Learner&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In an age of information, the Learner thrives. Their continuous journey of upskilling is a testament to their belief in evolution – both personal and professional.&lt;/p&gt;

&lt;h3&gt;
  
  
  9. &lt;strong&gt;The Mentor&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The cycle of knowledge is complete when it’s passed on. The Mentor ensures this cycle never breaks. Their wisdom is not just their own but a lantern for others.&lt;/p&gt;

&lt;h3&gt;
  
  
  10. &lt;strong&gt;The Achiever&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;For some, the journey is the reward. For the Achiever, it's the milestones. Every test coverage percentage and implemented test case is a feather in their cap.&lt;/p&gt;

&lt;p&gt;In recognizing these archetypes, we don’t box individuals but rather celebrate the diversity of approaches. After all, it's this very diversity that enriches our teams and our solutions. So, the next time you’re in a meeting or a code review, take a moment. Look around. Which archetype do you see? More importantly, which one do you resonate with?&lt;/p&gt;

&lt;h3&gt;
  
  
  Join the Conversation
&lt;/h3&gt;

&lt;p&gt;So, which archetype resonates with you? Or do you see a bit of yourself in multiple roles? Discovering our archetype isn't just about introspection; it's a doorway to improved collaboration and team synergy. &lt;strong&gt;Share this post with your colleagues and ignite a fun, introspective discussion!&lt;/strong&gt; Who knows, understanding these profiles might just be the key to unlocking your team's next breakthrough. After all, in the intricate dance of software development, every step, every trait, and every individual matters. Dive deep, engage, and let's keep the conversation alive!&lt;/p&gt;

</description>
      <category>testing</category>
      <category>testautomation</category>
      <category>psychology</category>
      <category>engineering</category>
    </item>
    <item>
      <title>Modern Note-Taking: How Obsidian Stands Out in 2023</title>
      <dc:creator>PeterMilovcik</dc:creator>
      <pubDate>Sat, 19 Aug 2023 12:48:01 +0000</pubDate>
      <link>https://forem.com/petermilovcik/modern-note-taking-how-obsidian-stands-out-in-2023-42fh</link>
      <guid>https://forem.com/petermilovcik/modern-note-taking-how-obsidian-stands-out-in-2023-42fh</guid>
      <description>&lt;p&gt;In the vast galaxy of digital note-taking, stars emerge, flicker, and fade. Yet, as we barrel into 2023, one star blazes more brilliantly than the rest: Obsidian. As we wade through the deluge of our digital thoughts, a sanctuary like Obsidian becomes not just desirable, but downright essential.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Allure of the Local-First Paradigm
&lt;/h2&gt;

&lt;p&gt;In an age where cloud-based everything seems the norm, Obsidian's audacious move towards a local-first approach is both a breath of fresh air and a grounding reminder. There’s an oddly satisfying nostalgia in knowing your thoughts are safely stored on your own machine, reminiscent of scribbling in a physical journal under the cover of night.&lt;/p&gt;

&lt;h2&gt;
  
  
  Interlinking Thoughts: A Web of Brilliance
&lt;/h2&gt;

&lt;p&gt;Remember those detective shows with walls covered in pictures, connected by strings in a maze of theories and suspects? That’s Obsidian’s backlinking and graph view for you. But instead of solving crimes, you’re untangling the intricate mysteries of your own mind, making connections you might never have otherwise perceived.&lt;/p&gt;

&lt;h2&gt;
  
  
  Customization: A Playground for the Inquisitive
&lt;/h2&gt;

&lt;p&gt;Why accept a one-size-fits-all when you can tailor to perfection? Obsidian isn’t just a tool; it's a canvas. Whether you're a minimalist seeking an uncluttered space or a maximalist hungering for intricate toolsets, Obsidian bends to your will.&lt;/p&gt;

&lt;h2&gt;
  
  
  Knowledge, Not Notes
&lt;/h2&gt;

&lt;p&gt;While most apps let you take notes, Obsidian lets you craft a knowledge base. It’s a subtle distinction, but one that shifts the focus from mere information recording to genuine understanding. It nudges you gently towards epiphanies and a-ha moments, those delightful instances when the fog lifts and clarity reigns.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Timelessness of Markdown
&lt;/h2&gt;

&lt;p&gt;And, of course, the beating heart of Obsidian: Markdown. In 2023, as more apps become bloated with features, the purity and simplicity of Markdown stand unyielding. It's a nod to the digital romantics among us, who believe that words, in their most unadorned form, are magic enough.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Quiet Revolution of Obsidian
&lt;/h2&gt;

&lt;p&gt;So, here we stand, on the precipice of an ever-evolving digital age, our thoughts scattered like leaves in the wind. But with Obsidian, there's a sense of grounding, a sense of coming home. In the crowded arena of note-taking apps, Obsidian doesn’t just stand out; it gleams, a beacon for those seeking a deeper, more profound relationship with their thoughts.&lt;/p&gt;

&lt;p&gt;Let's embrace this tool not just for its features, but for the introspection and clarity it beckons us towards. After all, in the cacophony of modern life, a sanctuary for our thoughts is the most profound rebellion.&lt;/p&gt;

</description>
      <category>obsidian</category>
      <category>pkm</category>
    </item>
  </channel>
</rss>
