Mirror of https://github.com/tcsenpai/agenticSeek.git, synced 2025-06-06 11:05:26 +00:00

Commit b40322dc2c (parent 6e2954d446): readme update & fix provider not auto downloading model

Files changed: README.md (20 lines) and the Python module defining `class Provider`.
@@ -38,8 +38,7 @@
 - **Memory**: Remembers what's useful: your preferences and past session conversations.
-- **Web Browsing**: Autonomous web navigation is underway.
+- **Web Browsing**: Autonomous web navigation.
 
 ### Searching the web with agenticSeek:
@@ -278,22 +277,25 @@ python3 main.py
 
 ## Speech to Text
 
-The speech to text is disabled by default, you can enable it by setting listen to true in the config.ini:
+The speech-to-text functionality is disabled by default. To enable it, set the `listen` option to True in the config.ini file:
 
 ```
 listen = True
 ```
 
-The speech to text will await for a AI name as a trigger keyword before it start listening, you can change the AI name by changing the agent_name in the config.ini:
+When enabled, the speech-to-text feature listens for a trigger keyword, which is the agent's name, before it begins processing your input. You can customize the agent's name by updating the `agent_name` value in the config.ini file:
 
 ```
 agent_name = Friday
 ```
 
-It will work better if you use a common english name like John or Emma.
+For optimal recognition, we recommend using a common English name such as "John" or "Emma" as the agent name.
 
-After hearing it's name agenticSeek will listen until it hear one of the following keyword for confirmation:
+Once you see the transcript start to appear, say the agent's name aloud to wake it up (e.g., "Friday").
+
+Speak your query clearly.
+
+End your request with a confirmation phrase to signal the system to proceed. Examples of confirmation phrases include:
 
 ```
 "do it", "go ahead", "execute", "run", "start", "thanks", "would ya", "please", "okay?", "proceed", "continue", "go on", "do that", "go it", "do you understand?"
 ```
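As a hypothetical illustration of how such a wake-word-plus-confirmation flow can be wired up (the function and constant names below are invented for this sketch, not taken from the project's code), a transcript filter might look like:

```python
from typing import Optional

# Hypothetical sketch, not the project's actual code: check a transcript for
# the wake word (agent name) followed by a query and a confirmation phrase.
CONFIRMATION_PHRASES = [
    "do it", "go ahead", "execute", "run", "start", "thanks", "would ya",
    "please", "okay?", "proceed", "continue", "go on", "do that", "go it",
    "do you understand?",
]

def extract_query(transcript: str, agent_name: str) -> Optional[str]:
    """Return the query spoken between the wake word and a confirmation
    phrase, or None if the transcript is not a complete request yet."""
    text = transcript.lower()
    name = agent_name.lower()
    if name not in text:
        return None  # wake word not heard yet
    after_name = text.split(name, 1)[1]
    for phrase in CONFIRMATION_PHRASES:
        if after_name.rstrip().endswith(phrase):
            # drop the confirmation phrase, keep only the query itself
            return after_name.rstrip()[:-len(phrase)].strip(" ,.")
    return None  # still waiting for a confirmation phrase
```

With `agent_name = Friday`, saying "hey Friday open the browser please" would yield the query "open the browser", while a sentence without the name or without a confirmation phrase yields nothing.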
@@ -321,7 +323,7 @@ provider_server_address = 127.0.0.1:5000
 ```
 
 `is_local`: should be True for any locally running LLM, otherwise False.
 
-`provider_name`: Select the provider to use by its name, see the provider list above.
+`provider_name`: Select the provider to use by its name; see the provider list above.
 
 `provider_model`: Set the model the agent will use.
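Taken together, a local setup in config.ini might look like the following sketch (the model tag is a placeholder, not a recommendation; `provider_server_address` reuses the value shown in the hunk above):

```ini
is_local = True
provider_name = ollama
provider_model = deepseek-r1:14b
provider_server_address = 127.0.0.1:5000
```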
@@ -366,10 +368,6 @@ Deepseek R1 excels at reasoning and tool use for its size. We think it’s a sol
 
 Ensure Ollama is running (`ollama serve`), your `config.ini` matches your provider, and dependencies are installed. If none of these work, feel free to raise an issue.
 
-**Q: How to join the discord ?**
-
-Ask in the Community section for an invite.
-
 **Q: Can it really run 100% locally?**
 
 Yes. With the Ollama or Server providers, all speech-to-text, LLM, and text-to-speech models run locally. Non-local options (OpenAI or other APIs) are optional.
@@ -33,7 +33,7 @@ class Provider:
         if self.provider_name in self.unsafe_providers:
             pretty_print("Warning: you are using an API provider. Your data will be sent to the cloud.", color="warning")
             self.api_key = self.get_api_key(self.provider_name)
-        elif self.server != "":
+        elif self.server != "ollama":
             pretty_print(f"Provider: {provider_name} initialized at {self.server}", color="success")
             self.check_address_format(self.server)
             if not self.is_ip_online(self.server.split(':')[0]):
@@ -54,6 +54,7 @@ class Provider:
         Validate that the address is a valid IP.
         """
         try:
+            address = address.replace('http://', '')
             ip, port = address.rsplit(":", 1)
             if all(c.lower() in ".:abcdef0123456789" for c in ip):
                 ipaddress.ip_address(ip)
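The added `replace('http://', '')` line means a pasted URL such as `http://127.0.0.1:5000` now passes validation instead of failing on the scheme prefix. A self-contained sketch of the same normalize-then-validate logic (simplified from the method above; the standalone function name and boolean return are illustrative, not the project's exact signature):

```python
import ipaddress

def check_address_format(address: str) -> bool:
    """Sketch: True if `address` is a valid `ip:port` pair, tolerating a
    pasted 'http://' prefix (the real method's check, simplified)."""
    address = address.replace('http://', '')  # the fix: strip a pasted URL scheme
    try:
        ip, port = address.rsplit(":", 1)
        # only strings made of hex digits, dots, and colons are checked as IPs,
        # so plain hostnames fall through to the port check
        if all(c.lower() in ".:abcdef0123456789" for c in ip):
            ipaddress.ip_address(ip)  # raises ValueError for a malformed IP
        return 0 < int(port) < 65536
    except ValueError:
        return False
```

For example, `check_address_format("http://127.0.0.1:5000")` and `check_address_format("127.0.0.1:5000")` both pass, while a malformed IP or a missing port does not.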
@@ -143,6 +144,7 @@ class Provider:
             if e.status_code == 404:
                 animate_thinking(f"Downloading {self.model}...")
                 ollama.pull(self.model)
+                self.ollama_fn(history, verbose)
             if "refused" in str(e).lower():
                 raise Exception("Ollama connection failed. Is the server running?") from e
             raise e
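The added `self.ollama_fn(history, verbose)` call is the heart of the auto-download fix: after pulling the missing model, the request is retried instead of surfacing the 404. A minimal sketch of that pull-and-retry pattern, using an illustrative stand-in client rather than the project's real API (`ModelNotFoundError` and `FakeOllamaClient` are invented here to make the sketch runnable):

```python
# Illustrative stand-ins only: the real code catches ollama's 404 error inside
# Provider.ollama_fn and calls itself again after ollama.pull().
class ModelNotFoundError(Exception):
    """Plays the role of the 404 'model not found' error."""

class FakeOllamaClient:
    """Minimal fake: chat() fails until the model has been pulled."""
    def __init__(self):
        self.available = set()
        self.pulled = []

    def chat(self, model, history):
        if model not in self.available:
            raise ModelNotFoundError(model)
        return "ok"

    def pull(self, model):
        self.pulled.append(model)
        self.available.add(model)

def generate_with_autopull(client, model, history):
    """On a missing-model error, download the model once and retry."""
    try:
        return client.chat(model, history)
    except ModelNotFoundError:
        client.pull(model)  # download the missing model...
        return client.chat(model, history)  # ...then replay the request
```

With the fake client, the first `chat` call raises, the model gets pulled, and the retried call succeeds, which is exactly the behavior the commit adds.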