ChattyUI
Local browser-based LLM execution
Software News
No related news yet.
User Guide
What is ChattyUI?
Run open-source LLMs locally in the browser using WebGPU
How to use ChattyUI?
Visit the website, select a model, and interact through the chat interface.
ChattyUI's Core Features
Browser-based execution of open-source LLMs
Uses WebGPU for hardware-accelerated inference
No server-side processing, ensuring data privacy
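Because all inference runs client-side on WebGPU, a page can check for support before trying to load a model. A minimal sketch, assuming a browser environment; the `hasWebGPU` helper is illustrative and not part of ChattyUI's API:

```javascript
// Feature-detect WebGPU before attempting to load a model.
// `navigator.gpu` is the standard WebGPU entry point; in environments
// without it (older browsers, Node.js) this returns false.
function hasWebGPU() {
  return typeof navigator !== "undefined" && !!navigator.gpu;
}

if (hasWebGPU()) {
  console.log("WebGPU available: models can run locally in this browser.");
} else {
  console.log("WebGPU not available: try a recent Chromium-based browser.");
}
```

Browsers without WebGPU simply fail the check, so the page can show a fallback message instead of attempting a model download.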
ChattyUI's Use Cases
#1
Interact with Gemma, Mistral, or Llama 3 models directly in your browser
#2
Experiment with AI conversations without internet dependency once a model has been downloaded
FAQ from ChattyUI
Is my data secure when using ChattyUI?