Deepak Sharma

Use OpenAI’s Node.js SDK with DeepSeek R1 Running Locally via Ollama

Introduction

DeepSeek R1 is an open-source LLM that offers powerful generative AI capabilities. If you're running it locally with Ollama, you might be wondering how to integrate it into your Node.js applications. Because Ollama exposes an OpenAI-compatible API, you can talk to the local model using the official OpenAI SDK, and this guide shows you how to set that up.

Step 1: Start DeepSeek R1 Locally with Ollama

Make sure Ollama is running and has the DeepSeek R1 model downloaded. If you haven't pulled the model yet, run:

ollama pull deepseek-r1:1.5b

Then, start a test session to verify it's working:

ollama run deepseek-r1:1.5b
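As a quick sanity check, you can also hit Ollama's OpenAI-compatible HTTP endpoint directly (assuming Ollama's default port, 11434); it should list the models you have pulled:

curl http://localhost:11434/v1/models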

Step 2: Install Dependencies (Node.js)

First, ensure you have Node.js installed, then install the OpenAI SDK:

npm install openai
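Note: the examples below use CommonJS (require). If your project uses ES modules ("type": "module" in package.json), import the SDK instead:

import OpenAI from "openai";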

Step 3: Configure OpenAI SDK to Use Ollama

const OpenAI = require("openai");

const openai = new OpenAI({
    baseURL: "http://localhost:11434/v1", // Pointing to Ollama's local API
    apiKey: "ollama", // Required by the OpenAI SDK, but Ollama doesn’t validate it
});

async function chatWithDeepSeek(prompt) {
    try {
        const response = await openai.chat.completions.create({
            model: "deepseek-r1:1.5b", // Must match a model you've pulled in Ollama
            messages: [{ role: "user", content: prompt }],
        });

        console.log(response.choices[0].message.content);
    } catch (error) {
        console.error("Error:", error.message);
    }
}

// Test the function
chatWithDeepSeek("Hello, how are you?");
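One quirk to be aware of: DeepSeek R1 is a reasoning model, and its replies wrap the chain-of-thought in <think>...</think> tags before the final answer. If you only want the answer, you can strip that block before printing (a minimal sketch; stripThinking is a helper name introduced here):

function stripThinking(text) {
    // Remove everything between <think> and </think>, including the tags
    return text.replace(/<think>[\s\S]*?<\/think>/g, "").trim();
}

console.log(stripThinking("<think>reasoning steps...</think>Hello! I'm doing well."));
// => Hello! I'm doing well.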

Step 4: Enable Streaming Responses

To see tokens as they are generated instead of waiting for the full reply, enable streaming.

Streaming Version of the Function

async function chatWithDeepSeekStream(prompt) {
    try {
        const stream = await openai.chat.completions.create({
            model: "deepseek-r1:1.5b",
            messages: [{ role: "user", content: prompt }],
            stream: true, // Enable streaming
        });

        for await (const chunk of stream) {
            process.stdout.write(chunk.choices[0]?.delta?.content || "");
        }
        console.log("\n");
    } catch (error) {
        console.error("Error:", error.message);
    }
}

chatWithDeepSeekStream("Tell me a fun fact about space.");
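If you need the complete reply after streaming finishes (for example, to return it to a caller), you can accumulate the chunks while printing them. A small variation on the function above (chatWithDeepSeekStreamCollect is a name introduced here):

async function chatWithDeepSeekStreamCollect(prompt) {
    const stream = await openai.chat.completions.create({
        model: "deepseek-r1:1.5b",
        messages: [{ role: "user", content: prompt }],
        stream: true,
    });

    let fullText = "";
    for await (const chunk of stream) {
        const token = chunk.choices[0]?.delta?.content || "";
        fullText += token; // Collect the reply as it streams in
        process.stdout.write(token);
    }
    console.log("\n");
    return fullText;
}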
