tcsenpai/ollama
Mirror of https://github.com/tcsenpai/ollama.git, synced 2025-06-06 19:25:21 +00:00
ollama / llm / ext_server
History

Latest commit: Jeffrey Morgan, d89454de80, 2024-07-05 12:32:47 -04:00
Use slot with cached prompt instead of least recently used (#5492)
* Use common prefix to select slot
* actually report `longest`
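The latest commit changes how the llama.cpp-based external server picks a slot for an incoming request: rather than always taking the least recently used slot, it prefers the slot whose cached prompt shares the longest common prefix with the new prompt, so more of the existing KV cache can be reused. Below is a minimal sketch of that selection heuristic; the server_slot struct, its field names, and the select_slot function are hypothetical stand-ins for illustration, not the actual definitions in server.cpp.

// Sketch of the slot-selection heuristic described in commit d89454de80.
// The types and field names below are hypothetical, not the real server.cpp code.
#include <cstdint>
#include <vector>

using llama_token = int32_t;

struct server_slot {
    std::vector<llama_token> cache_tokens; // prompt tokens already processed in this slot
    int64_t t_last_used = 0;               // timestamp of last use (for the LRU fallback)
    bool available = true;
};

// Length of the shared prefix between a slot's cached tokens and the new prompt.
static size_t common_prefix_len(const std::vector<llama_token>& a,
                                const std::vector<llama_token>& b) {
    size_t n = 0;
    while (n < a.size() && n < b.size() && a[n] == b[n]) n++;
    return n;
}

// Prefer the slot whose cache shares the longest prefix with the prompt;
// `longest` is the value the commit also reports. If nothing matches,
// fall back to the least recently used available slot.
server_slot* select_slot(std::vector<server_slot>& slots,
                         const std::vector<llama_token>& prompt) {
    server_slot* best = nullptr;
    size_t longest = 0;
    for (auto& slot : slots) {
        if (!slot.available) continue;
        size_t len = common_prefix_len(slot.cache_tokens, prompt);
        if (len > longest) {
            longest = len;
            best = &slot;
        }
    }
    if (best != nullptr) return best;

    // No cached prefix in common with the prompt: pick the least recently used slot.
    for (auto& slot : slots) {
        if (!slot.available) continue;
        if (best == nullptr || slot.t_last_used < best->t_last_used) best = &slot;
    }
    return best;
}

In this sketch the least-recently-used pass is kept only as a fallback, so the prior behavior still applies when no slot has any cached prefix in common with the incoming prompt.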
CMakeLists.txt   Switch back to subprocessing for llama.cpp                            2024-04-01 16:48:18 -07:00
httplib.h        Import server.cpp as of b2356                                         2024-03-12 13:58:06 -07:00
json.hpp         Import server.cpp as of b2356                                         2024-03-12 13:58:06 -07:00
server.cpp       Use slot with cached prompt instead of least recently used (#5492)    2024-07-05 12:32:47 -04:00
utils.hpp        log clean up                                                          2024-05-09 14:55:36 -07:00