2025-03-01 10:00:00 | Fast Company

Most of us are used to using internet chatbots like ChatGPT and DeepSeek in one of two ways: via a web browser or via their dedicated smartphone apps. There are two drawbacks to this. First, their use requires an internet connection. Second, everything you type into the chatbot is sent to the company's servers, where it is analyzed and retained. In other words: the more you use the chatbot, the more the company knows about you. This is a particular worry surrounding DeepSeek that American lawmakers have expressed.

But thanks to a few innovative and easy-to-use desktop apps, LM Studio and GPT4All, you can bypass both these drawbacks. With these apps, you can run various LLM models directly on your computer. I've spent the last week playing around with these apps, and thanks to each, I can now use DeepSeek without the privacy concerns. Here's how you can, too.

Run DeepSeek locally on your computer without an internet connection

To get started, simply download LM Studio or GPT4All on your Mac, Windows PC, or Linux machine. Once the app is installed, you'll download the LLM of your choice into it from an in-app menu. I chose to run DeepSeek's R1 model, but the apps support myriad open-source LLMs.

LM Studio can run DeepSeek's reasoning model privately on your computer.

Once you've done the above, you've essentially turned your personal computer into an AI server capable of running numerous open-source LLMs, including ones from DeepSeek and Meta. Next, simply open a new chat window and type away, just as you would when using an AI chatbot on the web.

The best thing about both these apps is that they are free for general consumer use; you can run several open-source LLMs in them (you get to choose which, and can swap between LLMs at will); and, if you already know how to use an AI chatbot in a web browser, you'll know how to use the chatbot in these apps. But there are additional benefits to running LLMs locally on your computer, too.
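Beyond the chat window, your computer really does behave like a small AI server: both apps can optionally expose the loaded model over a local, OpenAI-compatible API. As a minimal sketch (assuming LM Studio's optional local server is enabled on its default port, 1234, and that the model name shown here matches what appears in your app's model list), this is roughly what a request to it would look like; the helper function and prompt are illustrative:

```python
import json

# Assumption: LM Studio's local server is running on its default address.
# Nothing leaves your machine; this URL points at localhost only.
BASE_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt, model="deepseek-r1"):
    """Build the URL and JSON body for a local, OpenAI-style chat call.

    The model name is whatever identifier your app shows for the
    downloaded model (an assumption here, not a fixed value).
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete reply rather than a stream
    }
    return BASE_URL, json.dumps(payload)

url, body = build_chat_request("Can you teach me how to make a birthday cake?")
print(url)
print(body)
```

Sending that request (with `urllib.request`, `requests`, or any HTTP client) returns the familiar chat-completion response shape, served entirely from your own hardware.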
The benefits of using an LLM locally

I've been running DeepSeek's reasoning model on my MacBook for the past week without so much as a hiccup in both LM Studio and GPT4All. One of the coolest things about interacting with DeepSeek in this way is that no internet is required. Since the LLM is hosted directly on your computer, you don't need any kind of data connection to the outside world to use it.

Running LLMs like DeepSeek in apps like GPT4All can help keep your data secure.

Or as GPT4All's lead developer, Adam Treat, puts it, "You can use it on an airplane or at the top of Mount Everest." This is a major boon to business travelers stuck on long flights and those working in remote, rural areas.

But if Treat had to sum up the biggest benefit of running DeepSeek locally on your computer, he would do it in one word: privacy. Every online LLM is hosted by a company that has access to whatever you input into the LLM. "For personal, legal, and regulatory reasons this can be less than optimal or simply not possible," Treat explains.

While for individuals this can present privacy risks, those who upload business or legal documents into an LLM to summarize could be putting their company and its data in jeopardy. "Uploading that [kind of data] to an online server risks your data in a way that using it with an offline LLM will not," Treat notes.

The reason an offline LLM running locally on your own computer doesn't put your data at risk is because "your data simply never leaves your machine," says Treat. This means, for example, if you want to use DeepSeek to help you summarize a report you wrote, you can upload it into the DeepSeek model stored locally on your computer via GPT4All or LM Studio and rest assured the information in that report isn't being sent to the LLM maker's servers.

The drawbacks of using an LLM locally

However, there are drawbacks to running an LLM locally.
The first is that you're limited to using only the open-source models that are available, which may be less recent than the model available through the chatbot's official website. And because only open-source models can be installed, you can't use apps like GPT4All or LM Studio to run OpenAI's ChatGPT locally on your computer.

Another disadvantage is speed. "Because you are using your own hardware (your laptop or desktop) to power the AI, the speed of responses will be generally slower than an online server," Treat says. And since AI models rely heavily on RAM to perform their computations, the amount of RAM you have in your computer can limit which models you can install in apps like GPT4All and LM Studio. "As online servers are usually powered by very high-end hardware, they are generally going to be faster and have more memory, allowing for very fast responses by very large models," explains Treat.

Still, in my testing of both LM Studio and GPT4All over the past week, I don't think the reduced speed of DeepSeek's replies is a dealbreaker. When using DeepSeek's R1 reasoning model on the web, the DeepSeek hosted on servers in China took 32 seconds to return an answer to the prompt "Can you teach me how to make a birthday cake?" When asking the local DeepSeek R1 model stored in LM Studio and GPT4All, the response times were 84 seconds and 82 seconds, respectively.

I've found that the benefits of running DeepSeek locally on my device using LM Studio and GPT4All far outweigh the extra waiting time required to get a response. Without a doubt, being able to access a powerful AI model like DeepSeek's R1 locally on my computer anywhere, at any time, without an internet connection, and knowing the data I enter into it remains private, is a trade-off worth making.
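To put those informal timings in perspective, the local setups came in at roughly two-and-a-half times slower than the hosted model. A quick calculation:

```python
# Response times from the informal birthday-cake test (seconds).
web_server = 32   # DeepSeek's hosted R1 model
lm_studio = 84    # local R1 in LM Studio
gpt4all = 82      # local R1 in GPT4All

# How many times slower each local setup was than the hosted model.
lm_studio_factor = lm_studio / web_server
gpt4all_factor = gpt4all / web_server

print(f"LM Studio: {lm_studio_factor:.1f}x slower than the hosted model")
print(f"GPT4All:   {gpt4all_factor:.1f}x slower than the hosted model")
```

Note this was a single prompt on one laptop, not a benchmark; response times depend heavily on your hardware and the model size.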

