Using Llama 3 in the browser with WebGPU and the vercel/ai lib

louis030195
Jun 6, 2024

--

Hi, this is a simple change to the great vercel/ai-chatbot repo that lets you run LLMs through WebGPU (on your Mac's GPU, for example) instead of calling OpenAI, which gives you:

  • 100% data privacy
  • zero cost
  • open-source models
  • no internet connection required
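Since not every browser ships WebGPU yet, it's worth probing for support before trying to load a model. A minimal sketch (the `hasWebGPU` helper name is mine, but `navigator.gpu` is the standard WebGPU entry point):

```typescript
// Minimal WebGPU availability probe for the browser.
// `navigator.gpu` is undefined in browsers and runtimes
// that do not support WebGPU.
export async function hasWebGPU(): Promise<boolean> {
  const gpu = (globalThis as any).navigator?.gpu;
  if (!gpu) return false;
  // An adapter can still be unavailable (e.g. blocklisted GPU drivers),
  // so ask for one rather than trusting the API's mere presence.
  const adapter = await gpu.requestAdapter();
  return adapter !== null;
}
```

If this returns `false`, you can fall back to a hosted model instead of failing silently.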

Under the hood, this uses WebAssembly via the web-llm and mlc-llm libraries :)

Here’s the code:

https://github.com/llm-edge/hal-051224
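To give a feel for the web-llm side, here is a rough sketch of chatting with a local Llama 3 model through its OpenAI-compatible API. The model id, option names, and helper functions below are my assumptions, not the exact code from the repo; check the web-llm docs for the current identifiers:

```typescript
// Build an OpenAI-style message list for the chat call (pure helper,
// introduced here for illustration).
export function buildMessages(prompt: string) {
  return [
    { role: "system" as const, content: "You are a helpful assistant." },
    { role: "user" as const, content: prompt },
  ];
}

// Runs only in a WebGPU-capable browser; the dynamic import keeps this
// module loadable in environments without @mlc-ai/web-llm installed.
export async function chatLocally(prompt: string): Promise<string> {
  // @ts-ignore -- package is resolved only in the browser bundle
  const { CreateMLCEngine } = await import("@mlc-ai/web-llm");
  // Downloads and caches the quantized weights on first run
  // (the model id is an assumption; see web-llm's prebuilt model list).
  const engine = await CreateMLCEngine("Llama-3-8B-Instruct-q4f16_1-MLC");
  const reply = await engine.chat.completions.create({
    messages: buildMessages(prompt),
  });
  return reply.choices[0]?.message?.content ?? "";
}
```

Because the API mirrors OpenAI's chat completions, swapping it into a chatbot UI like vercel/ai-chatbot is mostly a matter of replacing the provider call.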

