Doesn't work Rated 1 out of 5 stars

Does nothing when I click on "view data".
I have the latest updates of Firefox and Dust-Me installed.
Complete waste of time.

Can you give me some more information to isolate what this problem might be -- the precise Firefox version number, which platform you're running it on, and whether there are any errors in the console?

What do JSON and CSV have to do with it? Rated 5 out of 5 stars

On the one hand, the plugin is exactly what's needed. You'd think you could find the unused styles on a site, copy only the ones that are actually used, and replace the styles with them. But no: for some reason it's set up so that the used and unused styles can only be saved as JSON and CSV. Why saving in CSS format isn't possible is a mystery.
You have to fiddle with converting the JSON format into CSS. It would be nice, of course, to be able to save straight to CSS.
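For anyone stuck on that JSON-to-CSS conversion step, here is a rough sketch of the idea in Node.js. It assumes the export is a flat JSON array of unused selector strings and that the stylesheet is simple and flat; the real Dust-Me export format may well differ, so treat the file names and the structure as placeholders.

    // Rough sketch only -- assumes unused.json is a flat array of selector
    // strings, e.g. [".old-banner", "#promo .cta"]; the real export format
    // may differ. Naive parsing: no @media blocks, no comments with braces.
    const fs = require('fs');

    const unused = new Set(JSON.parse(fs.readFileSync('unused.json', 'utf8')));
    const css = fs.readFileSync('styles.css', 'utf8');

    const cleaned = css
      .split('}')                                   // crude rule splitter
      .filter(chunk => {
        const selector = chunk.split('{')[0].trim();
        return selector && !unused.has(selector);   // drop rules listed as unused
      })
      .map(chunk => chunk.trim() + '\n}')
      .join('\n');

    fs.writeFileSync('styles.cleaned.css', cleaned);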

Try the "clean" buttons in the "View stored data" dialog, which produce a version of the stylesheet in which unused selectors are removed.

Rated 1 out of 5 stars

Definitely not reliable - I have selectors listed as unused that are actually used on various pages

This user has a previous review of this add-on.

If you know that a selector is actually used, then you can right-click it and select "mark as used".

But generally, the places where Dust-Me will fail to find selector use are where those selectors are only used by content generated by JS after page load, or by user interaction (e.g. a lightbox which only appears when the user clicks something). It's not possible for the spider to detect that kind of thing, so you have to scan such pages individually, using the Automation preferences and mutation events, to detect dynamic changes in content.
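To see what that means in practice, here's a small console sketch (not Dust-Me's own code) that uses a MutationObserver to report when selectors from a made-up candidate list only start matching after the page changes -- e.g. after a click opens a lightbox. The selector names are purely hypothetical.

    // Console sketch, not part of the extension. The candidate selectors are
    // hypothetical examples of rules that only match dynamically added content.
    const candidates = ['.lightbox.open', '#ajax-results li', '.tooltip'];

    const report = () => {
      for (const sel of candidates) {
        if (document.querySelector(sel)) {
          console.log('Selector now in use:', sel);
        }
      }
    };

    // Re-check whenever elements, attributes or classes change after page load.
    new MutationObserver(report).observe(document.body, {
      childList: true,
      subtree: true,
      attributes: true
    });
    report();   // baseline check at page load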

Unable to parse selectors data (JSON syntax error) Rated 4 out of 5 stars

Dust-Me Selectors works well for me, but on the web site I'm working on it doesn't work. It tells me "No Stylesheets" and "Unable to parse selectors data (JSON syntax error)". Can you help me please?
This is the URL:
http://goo.gl/B7oOym

What's the web site URL?

Rated 5 out of 5 stars

Very good, easy to use.

errors Rated 4 out of 5 stars

a) Installation went fine. In actual use, though... it deletes quite a few selectors that are actually needed... you have to run all of the JS on the form/page, or it leaves things behind that aren't needed. E.g. I gave it bootstrap.css: it cleaned it up quite a bit, but by hand I made it considerably smaller. b) When I re-enabled it, the data in the page popup was lost...

If you use the "Automation" preferences you can get the extension to scan automatically while you browse and interact with pages. Click on elements, hover over things -- whatever interactions you're scripting for -- and this will help it to find selectors that are only used by JavaScript.

Ilya Rated 4 out of 5 stars

Can't find the option to save a cleaned-up CSS file. After scanning an online webpage I can't use the "view stored data" option. (Firefox 24)

The cleaning controls are in the view data dialog, but I don't know of any reason why you wouldn't be able to access that. What happens when you try?

Ain't working in FF22 Rated 3 out of 5 stars

All the functions are greyed out, can't access preferences or scan any page. Maybe it needs updating for FF22+?

This review is for a previous version of the add-on (3.01). 

Doesn't work Mac 10.7.5 FF 23.0 Rated 1 out of 5 stars

None of the functions work; can't even get the dialog box, preferences won't save, etc.

This review is for a previous version of the add-on (3.01). 

Rated 1 out of 5 stars

Doesn't work on Firefox 22.

This review is for a previous version of the add-on (3.01). 

Yes, the notes above do say that. Changes in Firefox 22 have broken the extension, and I'll release a fix as soon as I get the chance.

Helpful, but not completely. Rated 3 out of 5 stars

This will get me in the neighborhood of what I want...

The XML sitemap business doesn't work, so if that's what you want, you should wait for the next version. That's what I need: the aggregate.

The selector bit is nice, but it would be cool to see if actual rules go un-applied, like in cases where an element can adopt rules from multiple selectors. Dust-Me Selectors never claims to be able to do this, so I can't really fault it. But it does mean that if you really need this tool, you probably need more information if you want to really clean up your CSS.

If your site is more than one page or has DOM changes, it's going to take more work to be confident that a selector is really unused.

It's good information, what little there is, and the View Saved Data interface is thoughtful. I look forward to the next version.

This review is for a previous version of the add-on (3.01). 

Thanks for the review.

Sitemap indexes are not yet supported, but ordinary sitemap XML files should be fine. Can you tell me what the problem was, or give me a URL of the sitemap you tried to parse?

Where an element adopts rules from multiple selectors, all of the selectors will be checked, including those which only apply through inheritance. But you're right when you say that this alone is not enough to be completely sure that a selector is unused -- you have to take it as the starting point to reduce manual checking. I welcome any thoughts you have on how this could be improved.
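For what it's worth, the general technique can be sketched in a few lines of console JavaScript. This is an illustration of the idea, not the extension's actual implementation, and it shares the same limitation: a selector like ".menu:hover li" may report as unmatched even though it applies during interaction.

    // Illustration only: list selectors that match nothing in the current
    // document. Cross-origin stylesheets can't be read and are skipped.
    const unmatched = [];
    for (const sheet of document.styleSheets) {
      let rules;
      try {
        rules = sheet.cssRules;
      } catch (e) {
        continue;                          // cross-origin sheet, can't inspect
      }
      for (const rule of rules) {
        if (!rule.selectorText) continue;  // skip @media, @font-face, etc.
        try {
          if (!document.querySelector(rule.selectorText)) {
            unmatched.push(rule.selectorText);
          }
        } catch (e) {
          // selector this engine won't parse (e.g. vendor-specific) -- ignore
        }
      }
    }
    console.log('Selectors with no match on this page:', unmatched);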

Rated 1 out of 5 stars

I'm sorry but this addon isn't "that" addon. I was expecting it to be something like Firebug.

This review is for a previous version of the add-on (3.01). 

Erm, well, Firebug is like Firebug; there'd be no point replicating its functionality.

What exactly did you want?

Rated 1 out of 5 stars

Oh finally figured out you have to right click BUT IT DOES NOTHING. You wasted 20 minutes of my time. BAH!

00000000 Stars

This review is for a previous version of the add-on (3.01).  This user has a previous review of this add-on.

You LEFT click the icon in the add-on bar and it will scan the current page. Or you can RIGHT click the page and select "Spider this page", which will open the spider dialog and prefill its URL, then you press "Start" to begin spidering.

I will be publishing some proper documentation soon, but in the meantime, I hope that will get you started.

Musts to be useful Rated 1 out of 5 stars

It's a must to do two things: one, let us save the files, and two, use the custom user agent that is being used in requests. I am using "User Agent Switcher" to let me do dev work on a live site without others seeing it.

Also it seems to never crawl all the links, so it's not working right or helpful.

Also, it doesn't seem to work if you log in as a user so that it will pick up hidden pages; is it not using the browser's sessions as-is? This is important for e-commerce sites with carts and user areas. The stats I got back were 4000 unused rules and 4 pages crawled, yet 183 pages were ignored when they would have helped reduce the unused rules.

With not being able to download a CSS file of the used rules, or even pick up the other pages, this is not useful at all... ATM this is just a to-do list with no value to it at all. I hope it's fixed to be useful soon, because it's a great idea and I had high hopes for it.

This review is for a previous version of the add-on (3.01). 

Firstly, the extension does not crawl sites like a search-engine robot; it only follows links from the first page you crawl, so any pages that are not linked from that will not be tested. You need to get it to crawl a sitemap (HTML or XML) -- there's a small sketch of that idea below this reply.

Secondly, it does test pages as the same user (i.e. as you), so it will pick up login items, but note that the spider's default behavior is to disable JS on the pages it tests, so it doesn't pick up dynamic content. You can change that in the "Spidering" tab in Preferences.

I'm not sure what you're referring to by custom user agents -- it won't make any difference to the extension what UA you have. Or is that what you're saying, that it should?
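As mentioned above, the "crawl a sitemap" step boils down to collecting the page URLs the sitemap lists. Here's a minimal sketch of that idea for a standard sitemap.xml -- an illustration only, not the extension's own code, and the sitemap URL is just an example.

    // Minimal sketch: fetch a standard sitemap.xml and collect its <loc> URLs.
    // The sitemap URL below is a placeholder.
    async function sitemapUrls(sitemapUrl) {
      const xml = await (await fetch(sitemapUrl)).text();
      const doc = new DOMParser().parseFromString(xml, 'application/xml');
      return [...doc.querySelectorAll('loc')].map(loc => loc.textContent.trim());
    }

    sitemapUrls('https://example.com/sitemap.xml')
      .then(urls => console.log(urls.length, 'pages listed for scanning'));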

Save a cleaned file Rated 4 out of 5 stars

Would be sensational if it was possible to save a cleaned CSS file directly to disk.

This review is for a previous version of the add-on (3.01). 

Not working on large or complex websites? Rated 4 out of 5 stars

I was excited to see this tool updated but it still doesn't seem to work on our website at http://agr.wa.gov/

Scan the entire website. Under the section for "wsdaNewStyle3.css", the first item it shows is "#pageLastMod", but that ID is used on many pages, only one of which is http://agr.wa.gov/Inspection/FVinspection/ContactUs.aspx

The problem is that if I find even a single thing that a tool doesn't work properly on, then I can't trust anything produced by that tool. :( Is there perhaps something I can change in the configuration of the tool that will make it work properly on our website, or have I reached a limit in how complex the CSS can be for the tool to work?

Thanks for any assistance you can provide!
-----------------------------------------------------------------------------------
Thank you for your prompt reply! And that is the one thing I needed to know: that it doesn't "spider" the entire site; it only scans what it can reach from the current page you activate it from, kind of a "breadth-first" search instead of a "depth-first" search, limited to a certain number of levels (because it stands to reason that at some point a recursive scan would hit every page from the homepage; otherwise the unlinked pages would never get any traffic). If that was on the instructions webpage somewhere then I just missed reading it. :(
I will see about generating a sitemap and go from there, thanks again!

One final question since I have your attention: is there any way to tell it to not scan a domain other than the one I started the scan in? On my system for example, it also scans Google because I use their mapping widget.

This review is for a previous version of the add-on (3.01). 

Recursive scan is not supported, no. I guess it's just semantics whether you describe what it does as "spidering the whole site", because as you've observed, what it actually does is spider a sitemap, and it won't reach any pages that aren't linked to from that sitemap.

You can "program" the Spider by defining exclusions or inclusions based on link REL attribute values, i.e. tell it to exclude all links that have or don't have a particular REL value. You can also tell it not to follow off-site links, and only follow links on the same domain. Both of these options are in "Preferences > Spidering".
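In case it helps, the rel-based filtering idea looks something like this in console terms. This is an illustration only; "nospider" is a made-up rel value for the example, not one the extension defines.

    // Illustration of rel-based exclusion plus a same-domain rule.
    // "nospider" is a hypothetical rel value used for the example.
    const toFollow = [...document.querySelectorAll('a[href]')]
      .filter(a => a.hostname === location.hostname)   // same-domain links only
      .filter(a => !a.relList.contains('nospider'))    // drop links carrying the excluded rel value
      .map(a => a.href);
    console.log(toFollow);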

Excellent work, one feature requested! Rated 5 out of 5 stars

I was about to write some JS code and I decided to see if my idea was done first. It had been, and done very well! Thanks for making this a very usable extension. One request though -- I feel I want to be able to save off a copy of the .CSS file without the unused CSS in it. Thanks for Dust-Me, again!

This review is for a previous version of the add-on (3.01). 

That feature is coming in the next update.

Not working with sitemap index Rated 4 out of 5 stars

It seems like a great tool; unfortunately it does not work with a sitemap index, or at least a gzipped sitemap index.

This review is for a previous version of the add-on (3.01). 

Coming Soon!

That feature is coming in the next update.

Great Add-on, few features missing Rated 4 out of 5 stars

Please add:
- "Exceptions" feature, to exclude certain pages / sub-directories from scanning;
- Ability to scan a local copy of the web site.

This review is for a previous version of the add-on (3.01). 

Not sure yet whether local scanning will be possible, since it won't be able to use XHR to make requests; I'll have to see whether it can work with files through the directory structure instead.

For exceptions, that's certainly something I'll look into. In the meantime, have you looked at the Spidering preferences? You can program exceptions there based on the value of "rel" attributes in HTML sitemap links.

Dust-me selector V3 Rated 5 out of 5 stars

New version available here : http://www.brothercake.com/site/portfolio/tools/dustmeselectors/

This review is for a previous version of the add-on (2.2).