Reviews for Page Assist - A Web UI for Local AI Models
Page Assist - A Web UI for Local AI Models by Muhammed Nazeem
73 reviews
- Rated 5 out of 5 by 飞舞的冰龙, 15 hours ago
Why can't this great add-on be updated? It is already at version 1.5.65 on GitHub.
Developer response
posted 12 hours ago
Hi, I’m uploading the latest version to the store, but the add-on has not been approved yet. It is still under review. It has been two months, and there is no timeline for when Mozilla will approve it. I’m sorry.
- Rated 5 out of 5 by Firefox user 14457244, 5 days ago
- Rated 5 out of 5 by MarsLife, 24 days ago
- Rated 5 out of 5 by TA, a month ago
Very good extension, but the defaults are a little bit conservative.
For example, by default the text of the page you are viewing is truncated to only 7 kilobytes (around 1,500 tokens) before being sent to the LLM, so for bigger pages the LLM doesn't even know what's on the page.
You can change this default in: "Settings -> Pipeline settings -> Maximum Content Size for Full Context Mode". It's in bytes.
- Rated 5 out of 5 by Firefox user 12490047, a month ago
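The 7 KB ≈ 1,500-token figure in the review above works out to roughly 4–5 bytes per token, a common rule of thumb for English text. A minimal sketch of what such a byte cap implies (the ratio and function names are illustrative assumptions, not Page Assist's actual code; only the 7,000-byte default comes from the review):

```python
# Sketch: how a byte cap on page content maps to an approximate token budget.
# BYTES_PER_TOKEN is a heuristic for English prose, not a real tokenizer.
DEFAULT_MAX_CONTENT_BYTES = 7000  # "Maximum Content Size for Full Context Mode"
BYTES_PER_TOKEN = 4.7             # assumed average; varies by language and model

def truncate_page_text(text: str, max_bytes: int = DEFAULT_MAX_CONTENT_BYTES) -> str:
    """Cut UTF-8 page text to a byte budget without splitting a character."""
    raw = text.encode("utf-8")[:max_bytes]
    return raw.decode("utf-8", errors="ignore")

def estimated_tokens(max_bytes: int = DEFAULT_MAX_CONTENT_BYTES) -> int:
    """Rough token count implied by a byte budget."""
    return round(max_bytes / BYTES_PER_TOKEN)

print(estimated_tokens())  # → 1489, close to the ~1,500 tokens the review cites
```

Raising the setting to, say, 70,000 bytes would allow roughly ten times as many tokens, so make sure the model's context window can actually hold it.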
- Rated 4 out of 5 by jonuno, 2 months ago
Great, but how do I copy the text? It's not selectable, which takes away a lot of the functionality I need.
- Rated 5 out of 5 by James, 3 months ago
Very good browser extension for AI model use. It is programmed in a very clever way: lightweight and resource-saving, but with full functionality. Compared to all other AI frontends, it is the best one I know. Congratulations! Greetings from Germany :-)
- Rated 5 out of 5 by Zunami, 4 months ago
- Rated 5 out of 5 by Robin Filer, 4 months ago
- Rated 5 out of 5 by Michal Mikoláš, 4 months ago
Compatible not only with Ollama but with OpenRouter as well. Very customizable and just works!
- Rated 5 out of 5 by Firefox user 19622186, 5 months ago
- Rated 4 out of 5 by Firefox user 14478686, 5 months ago
I'm a novice, but I'm finding Page Assist a useful tool for finding my way with Ollama; if I had to work with the command line only, I'd have given up. This is great.
- Rated 5 out of 5 by Nik, 5 months ago
Save yourself from all the open-webui headaches with this amazing add-on.
- Rated 5 out of 5 by HumanistAtypik, 6 months ago
- Rated 5 out of 5 by codefossa, 7 months ago
This easily replaces what Firefox's options provide for use with cloud-based LLMs. I can finally use the shortcuts to summarize and rephrase things for work that I can't have going to a cloud. I already had Ollama running locally, and the extension detected it automatically. It was ready to use out of the box. It seems perfect!
- Rated 5 out of 5 by Vaz-Dev, 7 months ago
Absolutely perfect; the only thing I think could improve is the extension icon, which doesn't match modern browser UIs very well.
Also, I did not understand whether the extension provides tools for Ollama-hosted LLMs to perform web searches, or whether it only uses the Ollama API.
- Rated 5 out of 5 by Firefox user 19258258, 9 months ago
- Rated 5 out of 5 by Firefox user 19244952, 9 months ago
- Rated 5 out of 5 by Vick, 10 months ago
This add-on deserves 180K positive reviews (not just 18)! After struggling in vain with Open WebUI, frontends, backends, and bash scripts to make my local LLM work on Linux Mint, I downloaded this and it JUST worked in 10 seconds. That is it. It required nothing else! Thank you, Mr. Nazeem. Awesome work! Now I need to find out how to add other models from the UI, or of course I can add them from the terminal.
- Rated 5 out of 5 by Firefox user 19104715, 10 months ago
- Rated 5 out of 5 by Henrique, a year ago
- Rated 5 out of 5 by FFFire, a year ago
- Rated 5 out of 5 by Firefox user 18939203, a year ago
- Rated 5 out of 5 by sun-jiao, a year ago