About this extension
Use your own locally running AI models to interact with pages while you browse, or as a web UI for a local AI model provider such as Ollama.

Repo: https://github.com/n4ze3m/page-assist

Current Features:
  • Sidebar for various tasks
  • Support for vision models
  • A minimal web UI for local AI models
  • Internet search
  • Chat with PDFs in the sidebar
  • Chat with documents (PDF, CSV, TXT, MD)

Supported Providers:
  • Ollama
  • [BETA] OpenAI-compatible API support (LM Studio, Llamafile, and many more providers).
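In practice, "OpenAI-compatible API support" means the extension can talk to any local provider that exposes the standard /v1/chat/completions route. Below is a minimal sketch of such a request; the base URL, port, and model name are assumptions for illustration and depend on how your local provider is configured.

  // Minimal sketch of a request to a local OpenAI-compatible endpoint.
  // BASE_URL and the model id are assumptions (LM Studio commonly serves
  // on http://localhost:1234/v1); adjust to your own provider's settings.
  const BASE_URL = "http://localhost:1234/v1";

  async function chat(prompt: string): Promise<string> {
    const res = await fetch(`${BASE_URL}/chat/completions`, {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model: "local-model", // placeholder model id
        messages: [{ role: "user", content: prompt }],
      }),
    });
    if (!res.ok) throw new Error(`Provider returned ${res.status}`);
    const data = await res.json();
    return data.choices[0].message.content;
  }

  chat("Summarize this page in one sentence.").then(console.log);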