Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
59 reviews
- Rated 5 out of 5 by null, 9 months ago
- Rated 5 out of 5 by JuiceFruit, 9 months ago
- Rated 5 out of 5 by Firefox user 18824818, 9 months ago
- Rated 4 out of 5 by Bugloss, 9 months ago
- Rated 5 out of 5 by Firefox user 18821506, 10 months ago
- Rated 5 out of 5 by zyb, 10 months ago
- Rated 4 out of 5 by Elliott, 10 months ago
Very nice Ollama frontend. It automatically recognises the Ollama daemon already running on the PC if installed. It provides a nice way to interact with local LLMs, complete with web search integration which, having now seen it firsthand, really transforms the usefulness of the models; even those that appear stupid without web search can be good at summarising information, and become actually helpful when they don't have to rely only on their built-in knowledge. This extension is probably the easiest way to get any graphical interface for Ollama running, particularly with integrated web search.
It does have a few bugs though. Sometimes, if you close the window too soon after generating an answer, it won't be saved in your history and you will have to generate it again (usually if you do it before all of the statistics at the bottom become available). I have also seen clicking the regenerate button make existing answers suddenly disappear (I think after I switched models). Sometimes questions you asked disappear after a reload even if the answer remains. Another thing is that attaching images and asking vision models about them just results in an error.
I also tried it on my Android phone in Firefox, connecting to Ollama on my laptop, which the extension recognises as running. However, on my phone it does not display the drop-down menu for selecting a model or prompt, so I cannot use it. It seems that it does not see any models as installed on Android. Do they have to be installed locally on the phone to work?
Overall, it has flaws, but it is still a fantastic tool, enabling you to put local models to use conveniently instead of just messing around with them in Ollama.
- Rated 5 out of 5 by robouden, 10 months ago
Amazing plugin. The only thing I would love to see is an option in the plugin's settings to show it as a popup or a sidebar, and the ability to open the chat popup when right-clicking on text.
Regards
Rob Oudendijk
- Rated 5 out of 5 by redspectre, 10 months ago
- Rated 1 out of 5 by Breanna Johnson, 10 months ago
- Rated 5 out of 5 by Firefox user 18810652, 10 months ago
- Rated 1 out of 5 by malisipi, 10 months ago
- Rated 5 out of 5 by 赤黑, 10 months ago
- Rated 5 out of 5 by Alwaysliumx, 10 months ago
- Rated 5 out of 5 by AltB, 10 months ago
Excellent with Ollama! Better than most Ollama AI UI frontends. Could be an indie app.
- Rated 5 out of 5 by Bob Tao, 10 months ago
- Rated 4 out of 5 by nn, a year ago
Works very well in its current state (02/01/2025). It would be great to have the custom pilot prompts be more configurable (just one "custom" entry isn't a lot), and using the options window causes 100% CPU utilization on one core (web extension process).
Overall a great add-on, very helpful.
Developer reply
posted a year ago
Thank you for the suggestions and review. Sorry about the 100% CPU utilization issue; I will release a fix in the next update.
- Rated 1 out of 5 by Hous, a year ago
The extension takes 100% CPU load even when you are not actively using it.
Developer reply
posted a year ago
Apologies for the issue; we will release a fix in the next update.
- Rated 5 out of 5 by Sabryabdallah, a year ago
The best app for running AI models on your own device.
- Rated 5 out of 5 by 无能狂怒气死自己, a year ago
- Rated 5 out of 5 by lukp12, a year ago
Absolutely fantastic - exactly what I was looking for. You might want to add "LLM" or "Ollama" keywords to the name so it's easier to find.
- Rated 4 out of 5 by Firefox user 18669759, a year ago
Nice extension, but can you fix it to follow different prompt formats? For example, it doesn't respect or use <|end_of_text|> tokens for ChatML models and keeps outputting more and more text.
- Rated 5 out of 5 by JackMack, a year ago
I rarely leave reviews for things, but this is an extension that deserves to be better known. When you're just getting into local LLMs, it's exactly this sort of simple local Web UI that's needed. No need for hours of tinkering with CLI and GUI interfaces. Straight into the action with this.