From 5bf1b4f659eba023679bac3d4f5a95da405ad40f Mon Sep 17 00:00:00 2001
From: Martin <49105846+Fosowl@users.noreply.github.com>
Date: Mon, 10 Mar 2025 14:32:14 +0100
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 72cb840..4845783 100644
--- a/README.md
+++ b/README.md
@@ -152,7 +152,7 @@ Ensure Ollama is running (`ollama serve`), your `config.ini` matches your provid
 Yes with Ollama or Server providers, all speech to text, LLM and text to speech model run locally. Non-local options (OpenAI, Deepseek API) are optional.
 
-**Q: How can it is older than manus ?**
+**Q: How come it is older than Manus?**
 
 we started this a fun side project to make a fully local, Jarvis-like AI. However, with the rise of Manus and openManus, we saw the opportunity to redirected some tasks priority to make yet another alternative.