
ChattyUI

Local browser-based LLM execution



User Guide

What is ChattyUI?

Run open-source LLMs locally in the browser using WebGPU.

How to use ChattyUI?

Visit the website, select a model, and interact through the chat interface.

ChattyUI's Core Features

Browser-based execution of open-source LLMs

Utilizes WebGPU for enhanced performance

No server-side processing, ensuring data privacy
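Since everything depends on the browser exposing WebGPU, a page like this typically checks for it before attempting to load a model. A minimal sketch (a hypothetical helper, not taken from ChattyUI's actual code) that tests whether a navigator-like object exposes the WebGPU entry point:

```javascript
// Hypothetical helper (not from ChattyUI's source): returns true when the
// given navigator-like object exposes the WebGPU entry point, navigator.gpu.
function supportsWebGPU(nav) {
  return nav !== null && typeof nav === "object" && "gpu" in nav;
}

// In a browser you would call supportsWebGPU(navigator); if it returns false,
// show a message asking the user to switch to a WebGPU-capable browser.
```

`navigator.gpu` is the standard WebGPU entry point; checking for its presence is the usual feature-detection step before requesting a GPU adapter.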

ChattyUI's Use Cases

#1

Interact with Gemma, Mistral, or Llama 3 models directly in your browser

#2

Hold AI conversations without depending on an internet connection once a model has been downloaded

FAQ about ChattyUI

Is my data secure when using ChattyUI?

Yes. All inference runs locally in your browser; because there is no server-side processing, your conversations are never sent to a server.
