Using Llama 3 in the browser with WebGPU and the vercel/ai lib
Jun 6, 2024

Hi, this is a simple change to the great vercel/ai-chatbot repo that lets you run LLMs with WebGPU (your Mac's GPU, for example) instead of OpenAI, which gives you:
- 100% data privacy
- zero cost
- an open-source model
- no internet connection needed
Under the hood, this uses WebAssembly via the web-llm and mlc-llm libraries :)
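To give a feel for the web-llm side before diving into the actual diff, here's a rough sketch of loading a Llama 3 model and running a chat completion entirely in the browser. The model id and progress-callback option follow the web-llm docs; treat the exact identifiers as assumptions, since available model builds change over time.

```typescript
// Sketch only: requires a browser with WebGPU support and the
// @mlc-ai/web-llm package installed.
import { CreateMLCEngine } from "@mlc-ai/web-llm";

async function main() {
  // Model id assumed from the web-llm prebuilt model list at the time.
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f32_1-MLC", {
    // Reports download/compile progress while the weights are fetched
    // and cached locally — everything stays on your machine.
    initProgressCallback: (p) => console.log(p.text),
  });

  // OpenAI-compatible chat completion API, served by your own GPU.
  const reply = await engine.chat.completions.create({
    messages: [{ role: "user", content: "Say hi in one sentence." }],
  });

  console.log(reply.choices[0]?.message.content);
}

main();
```

Because the API mirrors OpenAI's chat completions shape, swapping it into a chatbot built on the vercel/ai SDK mostly means replacing the provider call, which is what the change below does.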
Here’s the code: