Title: | LLM Farm 1.0.1 |
Operating system: |
|
License: | Freeware |
Date added: | 2024/06/14 |
Publisher: | Artem Savkin |
Changelog |
If you are getting strange prediction results in version 1.0.1 but everything was fine in version 0.9.0, try disabling the BOS option in the template.

## Changes:

* llama.cpp updated to b2135
* Added support for the multimodal models MobileVLM, Yi-VL, LLaVA, and Obsidian (tested on MobileVLM 3B)
* Added the ability to download models from the application menu
* Added the ability to specify a System Prompt, which is added to the text of the first message in the session. See https://github.com/guinmoon/LLMFarm/wiki/FAQ
* Added the ability to clone a chat (without its message history)
* Added a progress indicator for model loading
* Added the ability to hide the keyboard by tapping anywhere in the chat window
* Added the ability to temporarily disable chat autoscrolling by tapping anywhere in the chat window; autoscrolling is re-enabled automatically when a new message is sent
* Clearing the message history now also clears the model context
* Chats are sorted by last modification date
* The clear-chat-history button has been moved to the toolbox
* Both the {prompt} and {{prompt}} designations can now be used in templates
* Templates have been updated
* Fixed a disappearing-keyboard bug
* Fixed a bug that displayed already deleted chats and models
* Fixed a crash when switching models
* Fixed a bug that could cause a crash when starting fine-tuning
* Fixed some bugs that could cause the application to crash
* Fixed some other bugs
* Some UI improvements |
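The changelog notes that templates now accept both the `{prompt}` and `{{prompt}}` placeholder styles. As a minimal sketch of how such dual-style substitution can work (this is an illustration, not LLMFarm's actual code; the function name is hypothetical):

```python
def render_template(template: str, prompt: str) -> str:
    """Substitute the user's prompt into a chat template.

    Accepts both the {prompt} and {{prompt}} placeholder styles.
    Illustrative sketch only; not LLMFarm's real implementation.
    """
    # Replace the double-brace form first so the single-brace pass
    # does not leave stray braces behind.
    rendered = template.replace("{{prompt}}", prompt)
    rendered = rendered.replace("{prompt}", prompt)
    return rendered

# Both placeholder styles yield the same rendered text.
print(render_template("### User:\n{prompt}\n### Assistant:", "Hello"))
print(render_template("### User:\n{{prompt}}\n### Assistant:", "Hello"))
```

Replacing the double-brace form first matters: a naive single-brace replacement applied to `{{prompt}}` would leave an extra pair of braces around the inserted text.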