<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>Forem: Nikhilesh G</title>
    <description>The latest articles on Forem by Nikhilesh G (@nikhilesh002).</description>
    <link>https://forem.com/nikhilesh002</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2093088%2F2ce85c7a-eb6a-48aa-aa8c-d0a46dfe909d.png</url>
      <title>Forem: Nikhilesh G</title>
      <link>https://forem.com/nikhilesh002</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://forem.com/feed/nikhilesh002"/>
    <language>en</language>
    <item>
      <title>Install LLMs in a custom directory with Ollama on Linux</title>
      <dc:creator>Nikhilesh G</dc:creator>
      <pubDate>Thu, 30 Jan 2025 06:44:43 +0000</pubDate>
      <link>https://forem.com/nikhilesh002/install-deepseek-in-custom-directory-with-ollama-in-linux-4jfk</link>
      <guid>https://forem.com/nikhilesh002/install-deepseek-in-custom-directory-with-ollama-in-linux-4jfk</guid>
      <description>&lt;p&gt;Every PC's main drive is almost full (atleast mine). I wanted to install deepseek on my local machine. But my main drive on which OS(Ubuntu 24.04) is installed does not have enough space. So I wanted to install the model in another directory(disk).&lt;/p&gt;

&lt;p&gt;So I googled it and found the solution &lt;a href="https://github.com/ollama/ollama/issues/3045#:~:text=Like%20ejgutierrez74%2C%20I%20wanted%20to%20change%20the%20template%20directory%20and%20I%20followed%20the%20documentation%20on%20this%20page.%0ASo%20I%20edited%20the%20ollama.service%20file%20with%20the%20following%20command%3A" rel="noopener noreferrer"&gt;here&lt;/a&gt;. It still took me some time to get it working, so I wrote this post to save others' time.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Overriding the configuration will result in losing the models you have already downloaded. I have not tried it, but the models should still work if you move everything from &lt;code&gt;/usr/share/ollama/.ollama&lt;/code&gt; (the default directory) to the new directory after completing the steps below. Please let me know in the comments if it works.&lt;/li&gt;
&lt;/ul&gt;
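&lt;p&gt;If you want to try preserving already-downloaded models, a rough (untested, as noted above) sketch could look like the following; &lt;code&gt;/media/nikki/mymodels&lt;/code&gt; is the example target directory used later in this post:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Stop the service before touching the models directory
sudo systemctl stop ollama

# Move the existing models from the default location to the new directory
sudo mv /usr/share/ollama/.ollama/models /media/nikki/mymodels

# Start the service again once the override from the steps below is in place
sudo systemctl start ollama
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;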

&lt;h3&gt;Step 1: Open the override.conf file&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;You need to add an environment variable that overrides the default models directory, and also specify the user the service runs as.&lt;/li&gt;
&lt;li&gt;Run the command below to open the &lt;code&gt;/etc/systemd/system/ollama.service.d/override.conf&lt;/code&gt; file.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl edit ollama.service
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;The default file looks like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;### Editing /etc/systemd/system/ollama.service.d/override.conf&lt;/span&gt;
&lt;span class="c"&gt;### Anything between here and the comment below will become the contents of the&amp;gt;&lt;/span&gt;


&lt;span class="c"&gt;### Edits below this comment will be discarded&lt;/span&gt;


&lt;span class="c"&gt;### /etc/systemd/system/ollama.service&lt;/span&gt;
&lt;span class="c"&gt;# [Unit]&lt;/span&gt;
&lt;span class="c"&gt;# Description=Ollama Service&lt;/span&gt;
&lt;span class="c"&gt;# After=network-online.target&lt;/span&gt;
&lt;span class="c"&gt;# &lt;/span&gt;
&lt;span class="c"&gt;# [Service]&lt;/span&gt;
&lt;span class="c"&gt;# ExecStart=/usr/local/bin/ollama serve&lt;/span&gt;
&lt;span class="c"&gt;# User=ollama&lt;/span&gt;
&lt;span class="c"&gt;# Group=ollama&lt;/span&gt;
&lt;span class="c"&gt;# Restart=always&lt;/span&gt;
&lt;span class="c"&gt;# RestartSec=3&lt;/span&gt;
&lt;span class="c"&gt;# Environment="PATH=/home/nikki/.nvm/versions/node/v20.17.0/bin:/home/nikki/.nv&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;# &lt;/span&gt;
&lt;span class="c"&gt;# [Install]&lt;/span&gt;
&lt;span class="c"&gt;# WantedBy=default.target&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;The file states that anything written below the "Edits below this comment will be discarded" line will be ignored, so we need to add our environment variables above that comment.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Step 2: Add the environment variable&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;You need to add the environment variable under a &lt;code&gt;[Service]&lt;/code&gt; section.&lt;/li&gt;
&lt;li&gt;After adding the variables, the file looks like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;### Editing /etc/systemd/system/ollama.service.d/override.conf&lt;/span&gt;
&lt;span class="c"&gt;### Anything between here and the comment below will become the contents of the&amp;gt;&lt;/span&gt;

&lt;span class="o"&gt;[&lt;/span&gt;Service]
&lt;span class="nv"&gt;Environment&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="s2"&gt;"OLLAMA_MODELS=/media/nikki/mymodels"&lt;/span&gt;
&lt;span class="nv"&gt;User&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;nikki
&lt;span class="nv"&gt;Group&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;nikki

&lt;span class="c"&gt;### Edits below this comment will be discarded&lt;/span&gt;


&lt;span class="c"&gt;### /etc/systemd/system/ollama.service&lt;/span&gt;
&lt;span class="c"&gt;# [Unit]&lt;/span&gt;
&lt;span class="c"&gt;# Description=Ollama Service&lt;/span&gt;
&lt;span class="c"&gt;# After=network-online.target&lt;/span&gt;
&lt;span class="c"&gt;# &lt;/span&gt;
&lt;span class="c"&gt;# [Service]&lt;/span&gt;
&lt;span class="c"&gt;# ExecStart=/usr/local/bin/ollama serve&lt;/span&gt;
&lt;span class="c"&gt;# User=ollama&lt;/span&gt;
&lt;span class="c"&gt;# Group=ollama&lt;/span&gt;
&lt;span class="c"&gt;# Restart=always&lt;/span&gt;
&lt;span class="c"&gt;# RestartSec=3&lt;/span&gt;
&lt;span class="c"&gt;# Environment="PATH=/home/nikki/.nvm/versions/node/v20.17.0/bin:/home/nikki/.nv&amp;gt;&lt;/span&gt;
&lt;span class="c"&gt;# &lt;/span&gt;
&lt;span class="c"&gt;# [Install]&lt;/span&gt;
&lt;span class="c"&gt;# WantedBy=default.target&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;Step 3: Save the file&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;By default, &lt;code&gt;systemctl edit&lt;/code&gt; usually opens the &lt;code&gt;nano&lt;/code&gt; editor. Follow the steps below to save the file:&lt;/li&gt;
&lt;li&gt;Press Ctrl + X&lt;/li&gt;
&lt;li&gt;Press Y&lt;/li&gt;
&lt;li&gt;Press Enter&lt;/li&gt;
&lt;li&gt;If your default editor is vi/vim, then:&lt;/li&gt;
&lt;li&gt;Press Esc&lt;/li&gt;
&lt;li&gt;Type &lt;code&gt;:wq&lt;/code&gt; and press Enter to write the file and quit the editor&lt;/li&gt;
&lt;li&gt;Doing this saves the override.conf file&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Step 4: Restart the daemon&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;You need to restart the daemon to apply the changes.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl daemon-reload
&lt;span class="nb"&gt;sudo &lt;/span&gt;systemctl restart ollama
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
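&lt;p&gt;To confirm that systemd actually picked up the override, you can inspect the service's effective environment (the path shown assumes the example directory from Step 2):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Show the effective Environment= settings of the service
systemctl show ollama --property=Environment
# OLLAMA_MODELS=/media/nikki/mymodels should appear in the output
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;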



&lt;blockquote&gt;
&lt;p&gt;To learn more about daemons, refer to &lt;a href="https://man7.org/linux/man-pages/man7/daemon.7.html#:~:text=A%20daemon%20is%20a%20service%20process%20that%20runs%20in%20the%20background%20and%0A%20%20%20%20%20%20%20supervises%20the%20system%20or%20provides%20functionality%20to%20other%0A%20%20%20%20%20%20%20processes." rel="noopener noreferrer"&gt;this page&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;systemctl daemon-reload&lt;/code&gt; tells &lt;code&gt;systemd&lt;/code&gt; to reload its configurations.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;systemctl restart ollama&lt;/code&gt; restarts (stops and then immediately starts) the service named &lt;code&gt;ollama&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;/blockquote&gt;

&lt;h3&gt;Step 5: (Optional) Give permissions to the new directory&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;Give the service access to the new directory, if it does not already have it:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="nb"&gt;sudo chmod &lt;/span&gt;755 /media/nikki/mymodels
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
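&lt;p&gt;&lt;code&gt;chmod&lt;/code&gt; alone may not be enough if the directory is owned by &lt;code&gt;root&lt;/code&gt; (common for freshly created mount points). Since the override in Step 2 runs the service as the example user &lt;code&gt;nikki&lt;/code&gt;, you may also need to hand ownership to that user:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# Make the user/group from Step 2 the owner of the models directory
sudo chown -R nikki:nikki /media/nikki/mymodels
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;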



&lt;ul&gt;
&lt;li&gt;After running it, the permissions will look like this:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;drwxr-xr-x 1 nikki nikki 4096 Jan 30 03:57 /media/nikki/mymodels
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;r =&amp;gt; read&lt;/li&gt;
&lt;li&gt;w =&amp;gt; write&lt;/li&gt;
&lt;li&gt;x =&amp;gt; execute&lt;/li&gt;
&lt;li&gt;&lt;code&gt;d&lt;/code&gt;: It's a directory.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;rwx&lt;/code&gt;: The owner can read, write, and execute.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;r-x&lt;/code&gt;: The group can read and execute, but not write.&lt;/li&gt;
&lt;li&gt;&lt;code&gt;r-x&lt;/code&gt;: Others can read and execute, but not write.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;Step 6: Download and run the model&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;ollama&lt;/code&gt; is an amazing tool that lets even novice users run LLMs locally.&lt;/li&gt;
&lt;li&gt;Running the command below pulls the model's weights from the Ollama registry and starts it:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama run deepseek-r1:1.5b
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;Browse the &lt;a href="https://ollama.com/search" rel="noopener noreferrer"&gt;Ollama library&lt;/a&gt; to find a model that fits your requirements.&lt;/li&gt;
&lt;li&gt;To list all downloaded models, run the command below:
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;ollama list
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
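&lt;p&gt;As a final sanity check, you can verify that the download actually landed in the new directory (Ollama stores model weights in a &lt;code&gt;blobs&lt;/code&gt; subfolder; the path assumes the example directory from Step 2):&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;# The model blobs should now live under the custom directory
ls /media/nikki/mymodels/blobs

# List models currently loaded in memory
ollama ps
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;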



&lt;p&gt;If you liked this article, please share it and follow me. If you have any doubts, let me know in the comments.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>deepseek</category>
      <category>llm</category>
      <category>beginners</category>
    </item>
  </channel>
</rss>
