“Censorship is the child of fear and the father of ignorance.” – Laurie Halse Anderson
❓ What You’ll Learn
- Why should you use open-source AI?
- How to make $2,000/mo with a Telegram bot powered by open-source AI?
- What virtual companies are run completely by AI?
- How is a team of AI agents building software?
- How to build complex AI apps without code?
- What risks does local AI share with proprietary models?
- How will the US try to stop China from winning the AI race?
- How to improve local AI setup and onboarding?
- How can local AI models debug each other?
- How to offer a great user experience with local AI apps?
💎 Why It Matters
Censorship lowers leverage. Privacy limitations lower trust.
Local AI shifts control from OpenAI, Microsoft and Google to the people.
🔍 Problem
You pay for centralized AI tools that tell you what you can and cannot do, while they save your documents and innermost thoughts on their servers.
💡 Solution
Local AI gives you more control over your data and usage.
It becomes your advisor, not a supervisor.
🏁 Players
Local AI Models
- Llama 3 • Local AI model from Meta
- Gemma • Open-source AI models from Google
- Qwen2 • Open-source model family from Alibaba that leads many open-source benchmarks
- DeepSeek Coder • AI models that write, understand and complete code
- Stable Diffusion • Text-to-image model for generating photorealistic images
☁️ Opportunities
- Build privacy-first, client-side apps. Privacy is a strong selling point for sensitive use cases.
- WriteUp locked privacy behind a paid plan. It collects data from free users only.
- Pieces is a local-first coding assistant that protects your codebase. It uses your local resources to give code suggestions.
- Pieter Levels grew TherapistAI to $2,000/mo. He says local LLMs are perfect for sensitive use cases and plans to turn it into a client-side chatbot.
- Build a user-friendly interface to help non-technical users connect, train and use local AI. Great UI leads to great UX.
- LM Studio lets you build, run and chat with local LLMs.
- WebLLM is an in-browser AI engine for using local LLMs.
- TypingMind lets you self-host local LLMs on your own infrastructure.
- Make tutorials to help people build, run and use local AI models.
- Matthew Berman shows how to run any AI model with LM Studio.
- Zoltan C. Toth teaches The Local LLM Crash Course. He’s got 2,769 students.
- Sam Witteveen made a series of tutorials on running local AI models with Ollama.
- Eden Marco teaches how to build LLM apps with LangChain. He’s got 56,404 students.
- Sharath Raju teaches how to use LangChain with Llama 2 and HuggingFace. He’s got 10,657 students.
🔮 Predictions
- We’ll see virtual companies of AI agents that work together locally.
- Camel lets you use open-source AI models to build role-playing AI agents.
- OpenAGI lets you use local models to build collaborative AI teams. Here’s an example of an AI team that writes blogs.
- MetaGPT lets you build a collaborative entity for complex tasks. It works best with commercial models, but you can use open-source AI too.
- ChatDev uses several AI agents with different roles to build software. They collaborate by “attending” specialized seminars on design, coding, testing and more.
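The role-playing pattern behind tools like Camel and ChatDev boils down to agents with different roles passing messages back and forth. A minimal sketch with the model call stubbed out; in practice each turn would call a local LLM, and every name here is illustrative, not any framework's actual API:

```python
def stub_model(role: str, message: str) -> str:
    """Hypothetical stand-in for a local LLM call."""
    return f"[{role}] responding to: {message}"

def role_play(task: str, roles=("Designer", "Coder"), turns=4) -> list:
    """Alternate between roles, feeding each agent's reply to the next."""
    transcript = [task]
    message = task
    for turn in range(turns):
        role = roles[turn % len(roles)]
        message = stub_model(role, message)
        transcript.append(message)
    return transcript

transcript = role_play("Build a to-do app")
```

The key design choice is that each agent only sees the previous message, so roles stay specialized instead of one model doing everything.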
- We’ll be able to build AI apps visually, without code. This will let non-technical users build complex apps for their workflows.
- Flowise lets you build custom LLM flows and AI agents.
- Langflow offers a visual interface for building AI-powered apps.
- Obviously AI lets you build production-ready AI apps without code.
- Lack of censorship will become a stronger selling point. Users will prefer unfiltered creativity over censored tools that refuse controversial, yet legal, tasks.
- Venice is a privacy-first chatbot that stores chats in your browser.
- xAI is an AI lab led by Elon Musk. It released Grok-1, an open-source and uncensored alternative to OpenAI's models.
- Perplexity made uncensored AI models that outperformed GPT-3.5 and Llama 2. Paired with browser access, they went too far. Since they weren't open-source, they were taken down within 6 months.
🏔️ Risks
- Hardware Requirements • If you’re serious about running AI models locally, you may need to buy a new computer. Depending on your needs and preferences, this may cost a few thousand dollars.
- UX Issues • You may not be able to run multiple models simultaneously. You can open ChatGPT, Claude and Gemini in different tabs, but running more than one local AI model with billions of parameters can exhaust your hardware.
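The hardware cost above can be estimated with back-of-the-envelope arithmetic: model weights alone need roughly parameter count × bits per weight ÷ 8 bytes, ignoring activations and the KV cache. A rough sketch:

```python
def weight_memory_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate memory for model weights only (ignores KV cache etc.)."""
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9  # decimal gigabytes

# An 8B-parameter model in 16-bit precision vs. 4-bit quantization:
fp16 = weight_memory_gb(8, 16)  # ~16 GB of RAM/VRAM
q4 = weight_memory_gb(8, 4)     # ~4 GB, fits many consumer GPUs
```

This is why quantization (the 4-bit case) is what makes consumer hardware viable for local AI.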
🔑 Key Lessons
- The performance gap between local and cloud AI is closing.
- Local AI is self-sufficient. You can ask for help anytime, anywhere, as long as you have your device with you. No internet connection required.
🔥 Hot Takes
- Governments will regulate local AI on par with centralized models. Local models still pose risks similar to proprietary ones.
- China will beat the US in the AI race. Chinese open-source models already beat open-source models from the US. Eventually, Chinese proprietary models will catch up too. The US will try to limit public access to AI research. Such concerns have already been raised.
😠 Haters
“I need more expensive and powerful hardware to run local AI models.”
This is the main tradeoff for local AI at the moment. But local models are becoming more efficient to run. See how llama.cpp lets you run them on consumer devices and how Apple is doing this at scale. This may be an inflection point for hardware and local AI.
“Setup and onboarding is hard. I can’t just visit a URL.”
User experience with local AI is a solvable problem. We’re getting there with open-source tools that make setting up local AI easier. Ollama lets you set up Llama 3 in 10 minutes.
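Once a tool like Ollama is set up, a local model is one HTTP call away. A minimal sketch, assuming an Ollama server on its default port (11434) with the llama3 model already pulled; the endpoint and field names follow Ollama's documented /api/generate API:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default endpoint

def build_request(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local(prompt: str, model: str = "llama3") -> str:
    """Send a prompt to the local Ollama server and return its reply."""
    body = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires a running Ollama server):
# print(ask_local("Why run AI models locally?"))
```

No API key, no cloud account: the prompt never leaves your machine, which is the whole selling point.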
“Local AI models perform worse than AI models made by tech giants.”
Open-source AI models can be a little worse, but far more private and less censored. Depending on your use case, it can be wise to trade some quality for privacy.
“Providing support for models running locally sounds impossible.”
This is another tradeoff of local LLMs. Unless the model becomes unusable, users can use one AI model to debug another. This guy uses local AI models as copilots for coding copilots.
“Local AI models aren’t a panacea for AI-related data privacy issues.”
This is a risk of storing any data in digital form. "Private" local AI may not protect your data if your computer is compromised.
🔗 Links
- Open-Source AI Is Wild • The thread behind this report.
- Building a Report on Local AI • The tweet behind this report.
- Open LLM Leaderboard • 100+ open-source AI models with performance tests.
- Why I Use Open Weights LLMs Locally • The benefits of using locally hosted open LLMs.
📁 Related Reports
- Open-Source AI • Learn from and build on each other's work.
- Niche AI Models • Do specific tasks more accurately and efficiently.
- Data as a Service • Gain a competitive edge by fueling your decisions with the right data.
- AI Agents • Autonomous agents are the natural endpoint of automation in general.
- Prompt Engineering • Learn how to direct AI to get more accurate results.
🙏 Thanks
Thanks to Elia Zane and Matt Spear. We had a great time jamming on this report.
✏️ Emin researched and wrote this report. Dru researched and edited this report.
📈 What else?
Trends PRO #0150 — Local AI has more insights.
What you’ll get:
- 17 Local AI Models (240% More)
- 8 Opportunities (167% More)
- 7 Predictions (133% More)
- 4 Risks (100% More)
- 4 Key Lessons (100% More)
- 4 Hot Takes (100% More)
- 15 Links (275% More)
With Trends Pro you’ll learn:
- (📈 Pro) How to build a sustainable business around local AI?
- (📈 Pro) How to use open-source work to market your paid products?
- (📈 Pro) What tools will let us run local AI models on our phones?
- (📈 Pro) What will be the fastest way to build a performant AI model?
- (📈 Pro) What is the key barrier to popularizing local AI?
- (📈 Pro) Why will open-source and closed-source companies collaborate?
- (📈 Pro) What are you responsible for when using local AI?
- (📈 Pro) What is the go-to community for local AI enthusiasts?
- (📈 Pro) Why will users ditch “one-size-fits-all” AI assistants?
- (📈 Pro) What will make human-generated training data irrelevant?
- (📈 Pro) What are the 50 ways to run AI models locally?
- (📈 Pro) How to set up a powerful, private AI server?
- (📈 Pro) How to build a powerful computer for local AI models for under $3,000?
- (📈 Pro) How to understand the top AI benchmarks?
- (📈 Pro) How can local AI perform better than proprietary AI?
- (📈 Pro) How to verify the results of decentralized AI engines?
- (📈 Pro) How to assess AI risks by manipulating it?
- And much more…
Get Weekly Reports
Join 65,000+ founders and investors
📈 Unlock Pro Reports, 1:1 Intros and Masterminds
Become a Trends Pro Member and join 1,200+ founders enjoying…
🧠 Founder Mastermind Groups • Share goals and progress, and solve problems together. Each group is made up of 6 members who meet for 1 hour each Monday.
📈 100+ Trends Pro Reports • To make sense of new markets, ideas and business models, check out our research reports.
💬 1:1 Founder Intros • Make new friends, share lessons and find ways to help each other. Keep life interesting by meeting new founders each week.
🧍 Daily Standups • Stay productive and accountable with daily, async standups. Unlock access to 1:1 chats, masterminds and more by building standup streaks.
💲 100k+ Startup Discounts • Get access to $100k+ in startup discounts on AWS, Twilio, Webflow, ClickUp and more.