Chat UI | Open Source Codebase Powering The HuggingChat App

Find the docs at hf.co/docs/chat-ui.

A chat interface using open source models, e.g., OpenAssistant or Llama. It is a SvelteKit app, and it powers the HuggingChat app at hf.co/chat.

  1. Quickstart
  2. No Setup Deploy
  3. Setup
  4. Launch
  5. Web Search
  6. Text Embedding Models
  7. Extra parameters
  8. Common issues
  9. Deploying to a HF Space
  10. Building

Quickstart

Docker image

You can deploy a chat-ui instance with a single command using the Docker image. Get your Hugging Face token from hf.co/settings/tokens.

docker run -p 3000:3000 -e HF_TOKEN=hf_*** -v db:/data ghcr.io/huggingface/chat-ui-db:latest

Take a look at the .env file and the README to see all the environment variables you can set. We have endpoint support for all OpenAI-API-compatible local services as well as many other providers, like Anthropic, Cloudflare, Google Vertex AI, etc.
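
For example, pointing chat-ui at a local OpenAI-API-compatible server only takes a few lines of configuration. A minimal sketch, assuming a server exposing /v1 on port 8000 (the model name and port here are placeholders):

MODELS=`[
  {
    "name": "my-local-model",
    "endpoints": [{
      "type": "openai",
      "baseURL": "http://localhost:8000/v1"
    }]
  }
]`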

Local setup

You can quickly get a local chat-ui and LLM text-generation server running thanks to chat-ui's llama.cpp server support.

Step 1 (start llama.cpp server):

Install llama.cpp with brew (on macOS):

# install llama.cpp
brew install llama.cpp

or build directly from source for your target device:

git clone https://github.com/ggerganov/llama.cpp && cd llama.cpp && make

Next, start the server with the LLM of your choice:

# start llama.cpp server (using hf.co/microsoft/Phi-3-mini-4k-instruct-gguf as an example)
llama-server --hf-repo microsoft/Phi-3-mini-4k-instruct-gguf --hf-file Phi-3-mini-4k-instruct-q4.gguf -c 4096

A local llama.cpp HTTP server will start on http://localhost:8080. Read more here.
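
Before wiring up chat-ui, you can sanity-check the server via its health endpoint (a quick check, assuming the default port above):

# should return a small JSON status once the model has loaded
curl http://localhost:8080/health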

Step 2 (make sure you have MongoDB running locally):

docker run -d -p 27017:27017 --name mongo-chatui mongo:latest

Read more here.
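
To confirm the container is accepting connections, one option is to ping it with mongosh, which ships in the official mongo image (assuming the container name used above):

# should print { ok: 1 }
docker exec mongo-chatui mongosh --quiet --eval "db.runCommand({ ping: 1 })"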

Step 3 (clone chat-ui):

git clone https://github.com/huggingface/chat-ui
cd chat-ui

Step 4 (tell chat-ui to use local llama.cpp server):

Add the following to your .env.local:

MODELS=`[
  {
    "name": "microsoft/Phi-3-mini-4k-instruct",
    "endpoints": [{
      "type" : "llamacpp",
      "baseURL": "http://localhost:8080"
    }],
  },
]`

Read more here.
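
chat-ui also needs to know where to find the database. Assuming the MongoDB container from Step 2 is listening on its default port, add this line to the same .env.local:

MONGODB_URL=mongodb://localhost:27017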

Step 5 (start chat-ui):

npm install
npm run dev -- --open

Read more here.
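
If you want a production build rather than the dev server, the standard SvelteKit scripts should apply (a sketch; see the Building section of the docs):

# build the app, then serve the production bundle locally
npm run build
npm run preview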

GitHub: https://github.com/huggingface/chat-ui
