How to run DeepSeek AI locally to protect your privacy – 2 easy ways

ZDNET

DeepSeek is the latest buzzword in the world of AI. DeepSeek is a Chinese AI startup, founded in May 2023, that operates as an independent AI research lab and has drawn global attention for developing very powerful large language models (LLMs) at costs its US counterparts cannot match.

Also: What is sparsity? DeepSeek AI’s secret, revealed by Apple researchers

One reason for this lower cost is that DeepSeek's models are open source. The company has also claimed it has found a way to develop LLMs far more cheaply than US AI companies can. DeepSeek's models also perform as well as (if not better than) other models, and the company has released different models for different purposes (such as programming, general-purpose, and vision).

My experience with DeepSeek has been interesting so far. What I’ve found is that DeepSeek always seems to be having a conversation with itself while relaying information to the user. The responses tend to be long-winded and can send me down several different rabbit holes, each of which leads to me learning something new.

I do love learning new things.

Also: How I feed my files to a local AI for better, more relevant responses

If you’re interested in DeepSeek, you don’t have to rely on a third party to use it. That’s right — you can install DeepSeek locally and use it at your whim.

There are two easy ways to make this happen, and I’m going to show you both.

How to add DeepSeek to Msty

What you’ll need: For this, you’ll need both Ollama and Msty installed — and that’s it. You can use this on Linux, macOS, or Windows, and it won’t cost you a penny.

The first step is to open the Msty GUI. How you do this will depend on the OS you use.


From the left sidebar, click the icon that looks like a computer monitor with a lightning bolt, which will open the Local AI Models section.


The Msty sidebar with Local AI models highlighted.

Make sure Msty is updated by clicking the cloud icon.

Jack Wallen/ZDNET

In the Local AI Models section, you’ll see DeepSeek R1. Click the download button (a downward-pointing arrow) to add the DeepSeek model to Msty. Once the download completes, close the Local AI Models window.


The Msty Local AI downloader.

Make sure to select DeepSeek R1.


Back at the main window, click the model selection drop-down, click DeepSeek R1 (under Local AI), and type your query.


The Msty Models selection menu.

You can install as many local models as you need.


How to install DeepSeek locally from the Linux command line

Another option is to do a full install of DeepSeek on Linux. Before you do this, know that the system requirements for this are pretty steep. You’ll need a minimum of:

  • CPU: A powerful multi-core processor (12 or more cores recommended).
  • GPU: An NVIDIA GPU with CUDA support for accelerated performance. If Ollama doesn’t detect the presence of an NVIDIA GPU, it will configure itself to run in CPU-only mode.
  • RAM: A minimum of 16 GB, preferably 32 GB or more.
  • Storage: You’ll want NVMe storage for faster read/write operations.
  • Operating System: You’ll need Ubuntu or an Ubuntu-based distribution.
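Before installing anything, you can sanity-check your hardware from a terminal. This is a minimal sketch using standard Linux tools (nvidia-smi is only present when the NVIDIA driver is installed):

```shell
# Report core count, total RAM, and any NVIDIA GPU.
echo "CPU cores: $(nproc)"
echo "RAM (GB):  $(free -g | awk '/^Mem:/ {print $2}')"
# If nvidia-smi is missing or fails, Ollama will run in CPU-only mode.
nvidia-smi --query-gpu=name,memory.total --format=csv,noheader 2>/dev/null \
  || echo "No NVIDIA GPU detected -- Ollama will fall back to CPU-only mode"
```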

If your system meets those requirements, and you already have Ollama installed, you can run the DeepSeek R1 model with:

ollama run deepseek-r1:8b

If you haven’t already installed Ollama, you can do that with a single command:

curl -fsSL https://ollama.com/install.sh | sh

Also: I tried Sanctum’s local AI app, and it’s exactly what I needed to keep my data private

You’ll be prompted for your user password.

There are other versions of DeepSeek R1 you can run:

  • ollama run deepseek-r1 – The default 8B version
  • ollama run deepseek-r1:1.5b – The smallest model
  • ollama run deepseek-r1:7b – The 7B version
  • ollama run deepseek-r1:14b – The 14B version
  • ollama run deepseek-r1:32b – The 32B version
  • ollama run deepseek-r1:70b – The largest and smartest of the models

Once the command completes, you’ll find yourself at the Ollama prompt, where you can start using the model of your choice.
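You aren’t limited to the interactive prompt, either: Ollama also serves a local REST API on port 11434. Here’s a minimal Python sketch against its /api/generate endpoint (the model tag assumes you pulled deepseek-r1:8b as shown above):

```python
import json
import urllib.request

# Ollama's local REST endpoint (port 11434 is the default)
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> dict:
    # "stream": False asks Ollama for one complete JSON reply
    # instead of a stream of partial tokens
    return {"model": model, "prompt": prompt, "stream": False}

def ask(model: str, prompt: str) -> str:
    # POST the request and pull the generated text out of the JSON reply
    data = json.dumps(build_payload(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (needs the Ollama server running and the model pulled):
#   print(ask("deepseek-r1:8b", "Explain sparsity in one sentence."))
```

Because the request never leaves localhost, this fits the same privacy goal as the interactive prompt.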

Also: These nations are banning DeepSeek AI – here’s why

Whichever route you choose, you now have access to DeepSeek and can use it while keeping all of your queries and information safe on your local machine.




