# AgenticSeek: Manus-like AI powered by Deepseek R1 Agents

**A fully local alternative to Manus AI**, a voice-enabled AI assistant that codes, explores your filesystem, browses the web, and corrects its mistakes, all without sending a byte of data to the cloud. Built with reasoning models like DeepSeek R1, this autonomous agent runs entirely on your hardware, keeping your data private.

[Project page](https://fosowl.github.io/agenticSeek.html)

> 🛠️ **Work in Progress** – Looking for contributors!

---

## Features:
- **100% Local**: No cloud, runs on your hardware. Your data stays yours.
- **Voice interaction**: Voice-enabled natural interaction.
- **Filesystem interaction**: Use bash to navigate and manipulate your files effortlessly.
- **Code what you ask**: Can write, debug, and run code in Python, C, Golang, and more languages on the way.
- **Autonomous**: If a command flops or code breaks, it retries and fixes it by itself.
- **Agent routing**: Automatically picks the right agent for the job.
- **Divide and Conquer**: For big tasks, spins up multiple agents to plan and execute.
- **Tool-Equipped**: From basic search to flight APIs and file exploration, every agent has its own tools.
- **Memory**: Remembers what's useful: your preferences and past session conversations.
- **Web Browsing**: Autonomous web navigation is underway (see the `browser` branch).

---
## **Installation**

### 1️⃣ **Clone the repository**

```sh
git clone https://github.com/Fosowl/agenticSeek.git
cd agenticSeek
```
### 2️⃣ **Create a virtual env**

```sh
python3 -m venv agentic_seek_env
source agentic_seek_env/bin/activate
# On Windows: agentic_seek_env\Scripts\activate
```
### 3️⃣ **Install package**

**Automatic Installation:**

```sh
./install.sh
```

**Manually:**

```sh
pip3 install -r requirements.txt
# or
python3 setup.py install
```

**Install chromedriver**

```sh
# linux
pip install selenium

# macos
brew install --cask chromedriver

# windows
# see https://sites.google.com/chromium.org/driver/getting-started
```

**Install pyAudio**

```sh
# linux
sudo apt-get update
sudo apt-get install portaudio19-dev python3-dev

# macos
brew install portaudio
```

## Run locally on your machine

**We recommend using at least Deepseek 14B; smaller models struggle with tool use and quickly forget the context.**

### 1️⃣ **Download Models**

Make sure you have [Ollama](https://ollama.com/) installed.
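If you are not sure whether Ollama is already set up, a quick check (assuming the `ollama` CLI is on your PATH):

```sh
ollama --version
```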
Download the `deepseek-r1:7b` model from [DeepSeek](https://deepseek.com/models)

```sh
ollama pull deepseek-r1:7b
```

### 2️⃣ **Run the Assistant (Ollama)**

Start the ollama server:

```sh
ollama serve
```

Change the `config.ini` file to set `provider_name` to `ollama` and `provider_model` to `deepseek-r1:7b`.

NOTE: `deepseek-r1:7b` is an example; use a bigger model if your hardware allows it.

```sh
[MAIN]
is_local = True
provider_name = ollama
provider_model = deepseek-r1:7b
```
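If your hardware allows it, you can pull a larger DeepSeek R1 variant and point `provider_model` at it instead. The `14b` tag below is an assumption; check the [Ollama](https://ollama.com/) model library for the tags actually available:

```sh
ollama pull deepseek-r1:14b
# then in config.ini:
# provider_model = deepseek-r1:14b
```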
Run the assistant:

```sh
python3 main.py
```
---

## **Run the LLM on your own server**

If you have a powerful computer or a server that you can use, but you want to use it from your laptop, you have the option to run the LLM on a remote server.
### 1️⃣ **Set up and start the server scripts**

On your "server" that will run the AI model, get the IP address:

```sh
ip a | grep "inet " | grep -v 127.0.0.1 | awk '{print $2}' | cut -d/ -f1
```

Note: For Windows or macOS, use `ipconfig` or `ifconfig` respectively to find the IP address.
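For reference, a rough equivalent on those platforms (a sketch; the exact output format differs, so read the local network address from the output):

```sh
# macOS
ifconfig | grep "inet " | grep -v 127.0.0.1

# Windows (cmd or PowerShell)
ipconfig
```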
Clone the repository and then run the script `stream_llm.py` in `server/`:

```sh
python3 server/stream_llm.py
```
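The matching client-side settings are not shown in this excerpt. As a hypothetical sketch only (the exact provider name, model field, and port may differ from what the repository's `config.ini` expects), the idea is to point `provider_server_address` at the server's IP on your personal machine:

```sh
[MAIN]
is_local = False
provider_name = server            # hypothetical name for the remote-server provider
provider_model = deepseek-r1:14b  # whatever model the server is streaming
provider_server_address = x.x.x.x:5000  # replace x.x.x.x with your server's IP
```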
Run the assistant:

```sh
python3 main.py
```
## **Run with an API**

Clone the repository.

Set the desired provider in the `config.ini`:

```sh
[MAIN]
is_local = False
provider_name = openai
provider_model = gpt4-o
provider_server_address = 127.0.0.1:5000 # can be set to anything, not used
```

Run the assistant:

```sh
python3 main.py
```

---
## Providers

The table below shows the available providers:

`provider_server_address`: can be set to anything if you are not using the server provider.
## FAQ

**Q: What hardware do I need?**

- 7B Model: GPU with 8GB VRAM.
- 14B Model: 12GB GPU (e.g., RTX 3060).
- 32B Model: 24GB+ VRAM.

**Q: Why Deepseek R1 over other models?**

Deepseek R1 excels at reasoning and tool use for its size. We think it's a solid fit for our needs; other models work fine, but Deepseek is our primary pick.

**Q: I get an error running `main.py`. What do I do?**

Ensure Ollama is running (`ollama serve`), your `config.ini` matches your provider, and dependencies are installed. If none of that works, feel free to raise an issue.
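As a quick sanity check before opening an issue, the relevant commands from this guide (run from the repository root, with your virtual env activated):

```sh
ollama serve                       # make sure the Ollama server is running
cat config.ini                     # confirm provider_name / provider_model match your setup
pip3 install -r requirements.txt   # reinstall any missing dependencies
python3 main.py                    # then retry the assistant
```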
**Q: How do I join the Discord?**

Ask in the Community section for an invite.

**Q: Can it really run 100% locally?**

Yes, with the Ollama or Server providers all speech-to-text, LLM, and text-to-speech models run locally. Non-local options (OpenAI or other APIs) are optional.

**Q: How come it is older than Manus?**

We started this as a fun side project to make a fully local, Jarvis-like AI. However, with the rise of Manus, we saw the opportunity to redirect some tasks to make yet another alternative.

**Q: How is it better than Manus?**

It's not, but we prioritize local execution and privacy over a cloud-based approach. It's a fun, accessible alternative!

## Contribute

We're looking for developers to improve AgenticSeek! Check out the open issues or discussions.

## Authors:

> [Fosowl](https://github.com/Fosowl) 🇫🇷

> [steveh8758](https://github.com/steveh8758) 🇹🇼