Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
Review by Elliott
Rated 4 out of 5
by Elliott, 9 months ago: Very nice Ollama frontend. It automatically recognises an Ollama daemon already running on the PC. It provides a pleasant way to interact with local LLMs, complete with web-search integration which, having now seen it first-hand, really transforms the usefulness of the models: even models that seem unimpressive without web search can be good at summarising information, and they become genuinely helpful once they no longer have to rely solely on their built-in knowledge. This extension is probably the easiest way to get a graphical interface for Ollama running, particularly one with integrated web search.
It does have a few bugs, though. Sometimes, if you close the window too soon after generating an answer, it is not saved in your history and you have to generate it again (usually this happens if you close it before all of the statistics at the bottom appear). I have also seen clicking the regenerate button make existing answers suddenly disappear (I think after I switched models). Sometimes questions you asked disappear after a reload even though the answers remain. Another issue: attaching images and asking vision models about them just results in an error.
I also tried it on my Android phone in Firefox, connecting to Ollama on my laptop, which the extension recognises as running. However, on my phone it does not display the drop-down menu for selecting a model or prompt, so I cannot use it. It seems it does not see any models as installed on Android. Do they have to be installed locally on the phone to work?
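One possible explanation for the empty model list (an assumption, not confirmed by the review): by default Ollama binds only to localhost and rejects cross-origin requests, so a browser on another device may reach the machine yet get no models back. A minimal sketch of the server-side setup that is usually needed, using Ollama's documented `OLLAMA_HOST` and `OLLAMA_ORIGINS` environment variables (the LAN address below is a placeholder):

```shell
# On the laptop running Ollama (Linux/macOS shell).
# OLLAMA_HOST makes the server listen on all interfaces instead of
# only 127.0.0.1; OLLAMA_ORIGINS permits cross-origin requests such
# as those a browser extension on another device would send.
export OLLAMA_HOST=0.0.0.0:11434
export OLLAMA_ORIGINS="*"
ollama serve

# In Page Assist on the phone, set the Ollama URL to the laptop's
# LAN address, e.g. http://192.168.1.10:11434 (replace with yours).
```

Allowing all origins with `*` is convenient for a home network but is worth tightening on anything shared.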
Overall, it has flaws, but it is still a fantastic tool, enabling you to put local models to use conveniently instead of just messing around with them in Ollama.
59 reviews
- Rated 5 out of 5 by HumanistAtypik, 13 hours ago
- Rated 5 out of 5 by codefossa, 15 days ago: This easily replaces what Firefox's options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't send to a cloud service. I already had Ollama running locally, and it was automatically detected. It was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Vaz-Dev, 22 days ago: Absolutely perfect; the only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well.
 Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 2 months ago
- Rated 5 out of 5 by Firefox user 19244952, 3 months ago
- Rated 5 out of 5 by Vick, 4 months ago: This addon deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it JUST worked, in 10 seconds. That was it. It required nothing else! Thank you, Mr Nazeem. Awesome work! Now I need to find out how to add other models from the UI; or, of course, I can add them from the terminal.
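On the reviewer's last point, models can indeed be added from the terminal with Ollama's own CLI; a minimal sketch (the model name `llama3.2` is just an example, substitute any model from the Ollama library):

```shell
# Download a model into the local Ollama library; frontends such as
# Page Assist list newly pulled models in their model drop-down.
ollama pull llama3.2

# Show every model installed locally.
ollama list
```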
- Rated 5 out of 5 by Firefox user 19104715, 4 months ago
- Rated 5 out of 5 by Henrique, 4 months ago
- Rated 5 out of 5 by FFFire, 6 months ago
- Rated 5 out of 5 by Firefox user 18939203, 7 months ago
- Rated 5 out of 5 by sun-jiao, 7 months ago
- Rated 5 out of 5 by Jean Louis, 7 months ago: 🌟 My Exciting Review of Page Assist! 🌟
 Wow, Page Assist has completely transformed the way I work online! As someone who’s all about maximizing efficiency and keeping my data private, I couldn’t be happier with this extension. Let me gush over why it’s now my go-to browser buddy! ✨
 🖥️ Local AI Magic
 First off, the local OpenAI-compatible endpoint support is just brilliant! Running models like llama.cpp locally means I get lightning-fast responses without worrying about privacy. It’s like having a super-smart, personal assistant right at my fingertips! 🔒💨
 📚 Versatile Features Galore
 - Sidebar for All the Fun: The sidebar is like a Swiss Army knife for productivity. Whether I’m jotting down notes or brainstorming ideas, it’s always there, ready to help. 📝
 - Vision Model Wizardry: From analyzing images to extracting text with OCR, the vision models are nothing short of magical. It’s like having a mini-photographer and typist all in one! 📸✍️
 - Chat with My Docs: Interacting with PDFs and other documents directly in the sidebar is a dream come true. It’s like having a conversation with my files—how cool is that? 📄💬
 🎨 Minimal Web UI
 The web UI is sleek and user-friendly. It’s so intuitive that even my grandma could use it without breaking a sweat! 🧙♂️👵
 🌐 Internet Search & More
 And let’s not forget the internet search capability. Combining local AI power with the vastness of the web is like having the best of both worlds. 🌍🔍
 🎉 Conclusion
 In short, Page Assist is an absolute gem! It’s made my digital life so much more efficient and enjoyable. If you’re on the fence, just go for it—you won’t regret it! 🚀
 Jean Louis
- Rated 5 out of 5 by Iván Campaña, 7 months ago
- Rated 5 out of 5 by Firefox user 14643647, 7 months ago
- Rated 5 out of 5 by plter, 8 months ago
- Rated 5 out of 5 by Firefox user 18867716, 8 months ago
- Rated 5 out of 5 by paulcalebfarmer, 8 months ago
- Rated 5 out of 5 by Jujaga, 8 months ago: This addon is by far one of the best and easiest ways to dive into using local LLMs. It is a severely underrated web interface for Ollama and comes with many very useful features. It is a must-have in your toolkit if you work with local AI models.
- Rated 5 out of 5 by 酱油炒饭, 8 months ago
- Rated 5 out of 5 by aepex, 8 months ago
- Rated 5 out of 5 by 明一, 8 months ago
- Rated 5 out of 5 by Eduardo, 8 months ago
- Rated 5 out of 5 by 晓明, 9 months ago
- Rated 4 out of 5 by Firefox user 10396487, 9 months ago
