use-local-llm: React Hooks for AI That Actually Work Locally

Source: DEV Community
You've finally got your local LLM running. You pull a model, test it with curl, and it works beautifully. But the moment you try to integrate it into your React app, you hit a wall. The tools everyone uses assume you're calling OpenAI or Anthropic from a server. They don't expect you to talk to localhost:11434 directly from the browser. And if they do, they force you to build API routes, add a backend, and complicate your prototype.

I kept running into this frustration, so I built use-local-llm, a library with a single purpose: streaming AI responses from local models directly in the browser, with no backend, in 2.8 KB of code with zero dependencies.

Why Existing Tools Don't Fit

You'd think you could just use the Vercel AI SDK. It's the standard for React + AI: it ships adapters for multiple frameworks, maintains thorough API references, and handles production traffic at scale. But Vercel did not build it for direct browser-to-localhost communication. The Vercel AI SDK requires an API layer. Yo
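To make the "no backend" idea concrete, here is a minimal sketch of what browser-to-localhost streaming looks like without any library at all. It is not use-local-llm's actual API; it assumes a local Ollama server exposing its documented /api/generate endpoint, which streams newline-delimited JSON, and the model name "llama3" is just an example.

```typescript
// One NDJSON chunk from Ollama's /api/generate stream (simplified shape).
interface OllamaChunk {
  response: string;
  done: boolean;
}

// Split a (possibly partial) NDJSON buffer into parsed chunks plus the
// leftover partial line, which should be prepended to the next read.
function parseNDJSON(buffer: string): { chunks: OllamaChunk[]; rest: string } {
  const lines = buffer.split("\n");
  const rest = lines.pop() ?? "";
  const chunks = lines
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as OllamaChunk);
  return { chunks, rest };
}

// Stream tokens from a local Ollama server directly in the browser,
// invoking onToken for each piece of generated text as it arrives.
async function streamFromOllama(
  prompt: string,
  onToken: (token: string) => void,
): Promise<void> {
  const res = await fetch("http://localhost:11434/api/generate", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "llama3", prompt, stream: true }),
  });
  if (!res.body) throw new Error("No response body");

  const reader = res.body.getReader();
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { value, done } = await reader.read();
    if (done) break;
    buffer += decoder.decode(value, { stream: true });
    const parsed = parseNDJSON(buffer);
    buffer = parsed.rest;
    for (const chunk of parsed.chunks) onToken(chunk.response);
  }
}
```

In a React component you would call something like streamFromOllama(prompt, token => setText(t => t + token)) inside an event handler; the point is simply that fetch plus a ReadableStream reader is all the browser needs, with no server route in between.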