From 7a489ec373599f7a435da542c7b113a2821f0b96 Mon Sep 17 00:00:00 2001
From: arkohut <39525455+arkohut@users.noreply.github.com>
Date: Fri, 25 Oct 2024 15:18:21 +0800
Subject: [PATCH] docs: update docs

---
 README.md    | 10 ++++------
 README_ZH.md | 10 ++++------
 2 files changed, 8 insertions(+), 12 deletions(-)

diff --git a/README.md b/README.md
index 3f16018..bef5e1c 100644
--- a/README.md
+++ b/README.md
@@ -122,13 +122,11 @@ Before deciding to enable the VLM feature, please note the following:
 - Enabling VLM will significantly increase system power consumption
 - Consider using other devices to provide OpenAI API compatible model services
 
-#### Enabling Steps
-
-1. **Install Ollama**
+#### 1. Install Ollama
 
 Visit the [Ollama official documentation](https://ollama.com) for detailed installation and configuration instructions.
 
-2. **Prepare the Multimodal Model**
+#### 2. Prepare the Multimodal Model
 
 Download and run the multimodal model `minicpm-v` using the following command:
 
@@ -138,7 +136,7 @@ ollama run minicpm-v "Describe what this service is"
 
 This command will download and run the minicpm-v model. If the running speed is too slow, it is not recommended to use this feature.
 
-3. **Configure Memos to Use Ollama**
+#### 3. Configure Memos to Use Ollama
 
 Open the `~/.memos/config.yaml` file with your preferred text editor and modify the `vlm` configuration:
 
@@ -163,7 +161,7 @@ default_plugins:
 
 This adds the `builtin_vlm` plugin to the default plugin list.
 
-4. **Restart Memos Service**
+#### 4. Restart Memos Service
 
 ```sh
 memos stop
diff --git a/README_ZH.md b/README_ZH.md
index ffc5298..8514c59 100644
--- a/README_ZH.md
+++ b/README_ZH.md
@@ -125,13 +125,11 @@ memos reindex --force
 - 启用 VLM 后会显著增加系统功耗
 - 可以考虑使用其他设备提供 OpenAI API 兼容的模型服务
 
-#### 启用步骤
-
-1. **安装 Ollama**
+#### 1. 安装 Ollama
 
 请访问 [Ollama 官方文档](https://ollama.com) 获取详细的安装和配置指南。
 
-2. **准备多模态模型**
+#### 2. 准备多模态模型
 
 使用以下命令下载并运行多模态模型 `minicpm-v`:
 
@@ -141,7 +139,7 @@ ollama run minicpm-v "描述一下这是什么服务"
 
 这条命令会下载并运行 minicpm-v 模型，如果发现运行速度太慢的话，不推荐使用这部分功能。
 
-3. **配置 Memos 使用 Ollama**
+#### 3. 配置 Memos 使用 Ollama
 
 使用你喜欢的文本编辑器打开 `~/.memos/config.yaml` 文件，并修改 `vlm` 配置：
 
@@ -166,7 +164,7 @@ default_plugins:
 
 这里就是将 `builtin_vlm` 插件添加到默认的插件列表中。
 
-4. **重启 Memos 服务**
+#### 4. 重启 Memos 服务
 
 ```sh
 memos stop
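
For orientation, the hunks above edit the prose around two parts of `~/.memos/config.yaml` that the patch itself never shows: the `vlm` block modified in step 3 and the `default_plugins` list visible in the last hunk headers. The sketch below illustrates the end state those steps describe; the key names `endpoint` and `modelname`, the `http://localhost:11434` address, and the `builtin_ocr` entry are assumptions for illustration only, and the authoritative schema is the one already printed in the unchanged README lines surrounding these hunks.

```yaml
# Sketch of ~/.memos/config.yaml after step 3 (key names are assumptions)
vlm:
  endpoint: http://localhost:11434  # assumed local Ollama endpoint
  modelname: minicpm-v              # the multimodal model pulled in step 2

default_plugins:
- builtin_ocr   # assumed pre-existing plugin in the default list
- builtin_vlm   # entry added in step 3 of the patched docs
```

Step 4 then restarts the Memos service so the updated plugin list takes effect.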