Ilya Rated 4 out of 5 stars
Can't find the option to save a cleaned-up CSS file. After scanning an online webpage I can't use the "view stored data" option. (Firefox 24)
The cleaning controls are in the view data dialog, but I don't know of any reason why you wouldn't be able to access that. What happens when you try?
Ain't working in FF22 Rated 3 out of 5 stars
All the functions are greyed out; I can't access preferences or scan any page. Maybe it needs updating for FF22+?
This review is for a previous version of the add-on (3.01).
Doesn't work Mac 10.7.5 FF 23.0 Rated 1 out of 5 stars
None of the functions work: I can't even get a dialog box, preferences won't save, etc.
This review is for a previous version of the add-on (3.01).
Rated 1 out of 5 stars
Doesn't work on Firefox 22.
This review is for a previous version of the add-on (3.01).
Yes, the notes above do say that. Changes in Firefox 22 have broken the extension, and I'll release a fix as soon as I get the chance.
Helpful, but not completely. Rated 3 out of 5 stars
This will get me in the neighborhood of what I want...
The XML sitemap business doesn't work, so if that's what you want, you should wait for the next version. That's what I need: the aggregate.
The selector bit is nice, but it would be cool to see whether actual rules go unapplied, as in cases where an element can adopt rules from multiple selectors. Dust-Me Selectors never claims to be able to do this, so I can't really fault it. But it does mean that if you really need this tool, you probably need more information to really clean up your CSS.
If your site is more than one page or has DOM changes, it's going to take more work to be confident that a selector is really unused.
It's good information, what little there is, and the View Saved Data interface is thoughtful. I look forward to the next version.
Thanks for the review.
Sitemap indexes are not yet supported, but ordinary sitemap XML files should be fine. Can you tell me what the problem was, or give me a URL of the sitemap you tried to parse?
Where an element adopts rules from multiple selectors, all of the selectors will be checked, including those which only apply through inheritance. But you're right when you say that this alone is not enough to be completely sure that a selector is unused -- you have to take it as the starting point to reduce manual checking. I welcome any thoughts you have on how this could be improved.
Rated 1 out of 5 stars
I'm sorry, but this add-on isn't "that" add-on. I was expecting it to be something like Firebug.
This review is for a previous version of the add-on (3.01).
Erm, well, Firebug is like Firebug; there'd be no point replicating its functionality.
What exactly did you want?
Rated 1 out of 5 stars
Oh finally figured out you have to right click BUT IT DOES NOTHING. You wasted 20 minutes of my time. BAH!
You LEFT click the icon in the add-on bar and it will scan the current page. Or you can RIGHT click the page and select "Spider this page", which will open the spider dialog and prefill its URL, then you press "Start" to begin spidering.
I will be publishing some proper documentation soon, but in the meantime, I hope that will get you started.
Must-haves to be useful Rated 1 out of 5 stars
Two things are musts: let us save the files, and use the custom user agent that the browser is using in requests. I am using "User Agent Switcher" so I can do dev work on a live site without others seeing it.
Also, it never seems to crawl all the links, so it's not working right or helpful.
Also, it doesn't seem to pick up hidden pages when you log in as a user, so it's not using the browser's sessions as is? This is important for e-commerce sites with carts and user areas. The stats it fed back said there were 4,000 unused rules and 4 pages crawled, yet 183 pages were ignored when they would have helped reduce the unused rules.
Without being able to download a CSS file of the used rules, or even pick up the other pages, this is not useful at all. ATM it's just a thing to do with no value to it. I hope it's fixed to be useful soon, because it's a great idea and I had high hopes for it.
Firstly, the extension does not crawl sites like a search-engine robot; it only follows links from the first page you crawl, so any pages that are not linked from that will not be tested. You need to get it to crawl a sitemap (HTML or XML).
Secondly, it does test pages as the same user (ie. as you), so it will pick up login items, but note that the spider's default behavior is to disable JS on the pages it tests, so it doesn't pick up dynamic content. You can change that in the "Spidering" tab in Preferences.
I'm not sure what you're referring to by custom user agents -- it won't make any difference to the extension what UA you have. Or is that what you're saying, that it should?
Save a cleaned file Rated 4 out of 5 stars
Would be sensational if it was possible to save a cleaned CSS file directly to disk.
This review is for a previous version of the add-on (3.01).
Not working on large or complex websites? Rated 4 out of 5 stars
I was excited to see this tool updated but it still doesn't seem to work on our website at http://agr.wa.gov/
I scanned the entire website. Under the section for "wsdaNewStyle3.css", the first item it shows is "#pageLastMod", but that id is used on many pages, one of which is http://agr.wa.gov/Inspection/FVinspection/ContactUs.aspx
The problem is that if I find even a single thing that a tool doesn't work properly on, then I can't trust anything produced by that tool. :( Is there perhaps something I can change in the configuration of the tool that will make it work properly on our website, or have I reached a limit in how complex the CSS can be for the tool to work?
Thanks for any assistance you can provide!
Thank you for your prompt reply! That is the one thing I needed to know: it doesn't "spider" the entire site, it only scans what it can reach from the current page you activate it from, kind of a "breadth-first" search instead of a "depth-first" search, limited to a certain number of levels (because it stands to reason that at some point a recursive scan would hit every page from the homepage; otherwise the unlinked pages would never have any traffic). If that was on the instructions webpage somewhere, then I just missed reading it. :(
I will see about generating a sitemap and go from there, thanks again!
One final question since I have your attention: is there any way to tell it not to scan domains other than the one I started the scan in? On my system, for example, it also scans Google because I use their mapping widget.
Recursive scan is not supported, no. I guess it's just semantics to describe what it does as "spidering the whole site", because as you've observed, what it actually does is spider a sitemap, and won't reach any pages that aren't linked to from that sitemap.
You can "program" the Spider by defining exclusions or inclusions based on link REL attribute values, ie. tell it to exclude all links that have or don't have a particular REL value. You can also tell it not to follow off-site links, and only follow links on the same domain. Both of these options are in "Preferences > Spidering"
Excellent work, one feature requested! Rated 5 out of 5 stars
I was about to write some JS code and I decided to see if my idea had been done first. It had been, and done very well! Thanks for making this a very usable extension. One request though: I want to be able to save off a copy of the .CSS file without the unused CSS in it. Thanks for Dust-Me, again!
This review is for a previous version of the add-on (3.01).
That feature is coming in the next update.
Not working with sitemap index Rated 4 out of 5 stars
It seems a great tool; unfortunately it does not work with sitemap indexes, or at least gzipped sitemap indexes.
This review is for a previous version of the add-on (3.01).
That feature is coming in the next update.
Great Add-on, few features missing Rated 4 out of 5 stars
- "Exceptions" feature - to exclude certain pages /sub-directories from scanning;
- Ability to scan local copy of the web site
Not sure yet whether local scanning will be possible, since it won't be able to use XHR to make requests, I'll have to see whether it can work with files through the directory structure instead.
As for exceptions, that's certainly something I'll look into. In the meantime, have you looked at the Spidering preferences? You can program exceptions there based on the value of "rel" attributes in HTML sitemap links.
Dust-me selector V3 Rated 5 out of 5 stars
New version available here: http://www.brothercake.com/site/portfolio/tools/dustmeselectors/
This review is for a previous version of the add-on (2.2).
updated to Firefox 13.5 Rated 4 out of 5 stars
I have updated the .rdf file so the add-on should work as before up to Firefox 13.5. You can download the revised version: https://docs.google.com/open?id=0B_CE2l4Osa3SdVlmSkxUWWNUMm1QRDJ2cDI5OFMtZw
This review is for a previous version of the add-on (2.2).
move to opera Rated 4 out of 5 stars
This was the only plugin holding me back from upgrading to the new Firefox 10 (by the end of 2012 we'll probably be on Firefox 300.00!). So to cut a long story short, I installed another browser, having found Dust-Me Selectors for Opera. I don't use Dust-Me Selectors that often, so I can switch between the two. So if you're waiting for an update before diving into Firefox 10, you may want to do what I did.
This review is for a previous version of the add-on (2.2).
Release date latest FF compatible version Rated 5 out of 5 stars
Hi brothercake, Christmas is closing in fast. Any news on the release date of the latest FF-compatible version?
This review is for a previous version of the add-on (2.2).
Fail Rated 1 out of 5 stars
Umm..., the free version of unused-css.com is so limited it's useless.
Dust-Me: Like many here, I've never been able to get it to scan an entire site without hanging. I still don't see why it's recommended so much - it does NOT work as advertised.
Rated 4 out of 5 stars
Thanks for this great extension. If you want to clean multiple pages at once and download the clean CSS files, you should try http://unused-css.com
This review is for a previous version of the add-on (2.2).
Rated 1 out of 5 stars
I have downloaded this extension at least 4 times over the past 2 years. Each time I have installed, adjusted the preferences, read, re-read all the documentation and have never gotten it to do anything besides animate the sweeping motion of the broom.
Every time I re-try I get excited by what it says it can do. But no matter how much time I spend on it nothing is ever generated. The now irritating little broom just keeps sweeping away and that is all it ever does for me.
I don't consider myself a stupid person. I build large high volume websites and make good money. However each time I attempt to get "Dust Me" to function I lose a couple hundred bucks in time and more in frustration.
I see other people praising it and also others like myself that do not. I will be the first to admit that the blame must still be with me but I feel compelled to wish ill will on all people responsible for this extension.
As I write this the GD pink broom is swishing back and forth mocking me and my efforts. I will uninstall yet again while thinking unhealthy thoughts.