Update README.md
This commit is contained in: parent feb2e233ef · commit 4c1fcf3ec7
@@ -145,12 +145,15 @@ The 32B model needs a GPU with 24GB+ VRAM.
Deepseek R1 excels at reasoning and tool use for its size. We think it’s a solid fit for our needs—other models work fine, but Deepseek is our primary pick.
**Q: I get an error running `main.py`. What do I do?**
Ensure Ollama is running (`ollama serve`), your `config.ini` matches your provider, and dependencies are installed. If none of these work, feel free to raise an issue.
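As a quick reference, here is a minimal troubleshooting sketch. The `requirements.txt` file name and the `python3` entry point are assumptions about a typical setup, not documented project steps:

```sh
# Quick checklist before opening an issue
ollama serve                      # in a separate terminal: make sure Ollama is running
ollama list                       # confirm the model named in config.ini is pulled
pip install -r requirements.txt   # reinstall dependencies if imports fail
python3 main.py                   # retry
```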
**Q: Can it really run 100% locally?**
Yes. With the Ollama or Server providers, all speech-to-text, LLM, and text-to-speech models run locally. Non-local options (OpenAI, Deepseek API) are optional.
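A rough sketch of what a fully local run can look like, assuming Ollama is installed and `config.ini` points at a local provider; the model tag is illustrative (the 32B model needs a GPU with 24GB+ VRAM):

```sh
# Fully local setup sketch: everything runs on this machine
ollama pull deepseek-r1:32b       # fetch a local model once; nothing leaves the machine afterwards
ollama serve                      # serve it locally
python3 main.py                   # agenticSeek then talks only to the local server
```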
**Q: How come this project is older than Manus?**
We started this as a fun side project to build a fully local, Jarvis-like AI. With the rise of Manus and openManus, we saw an opportunity to reprioritize some tasks and offer yet another alternative.
**Q: How is it better than Manus or openManus?**