- Rated 3 out of 5 by Firefox user 16217654, 3 months ago: Quick, but it could not delete all-but-one of each duplicate, even though that need makes sense when you think about it. Not everyone wants some kind of hesitant, cautious, multiple-locations approach; most people keep everything in one place, either downloaded from their cloud or self-managed and moved over manually at each upgrade.
And even if you wanted to filter by location or other attributes, to deliberately keep some but not all of the duplicates found, there is no custom, fine-grained filtering for that either.
There are a couple of useful side functions, like deleting empty subdirectories (second-level directories onward, not the top-level ones), but you then still needed to delete the first level yourself; you had to manually find their USERS/ location anyway to remove the highest level.
Hmm, program permissions, maybe? It never asks for administrator confirmation; perhaps the first level runs under carried-over user permissions while deeper levels do not and are more easily reversed. Something like that. Either way, it is not really a timesaver unless you are fine with the first-level drop-down list you see on mouse-hover still being as clogged as it was before you clicked delete empty directories. If the goal were only to reduce the number of empty directories, say to shorten how long Windows Search takes to catalog user directories for the Index, then OK, some might find that useful. But the main point of a duplicates manager should be to serve the variety of reasons and needs people have; different people need to do different things while cleaning up. I only needed to leave one of everything, and I couldn't even do that.
Essentially, the extras did not compensate for not being able to filter or sort, nor select all, nor, in my case, select all but one.
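The select-all-but-one pass these first reviews ask for is simple to express against the standard WebExtensions bookmarks API. The sketch below is purely illustrative and assumes the browser.* typings; it says nothing about how BookmarksCleanUp is actually implemented.

```ts
// Illustrative sketch only, assuming the WebExtensions bookmarks API:
// keep the first bookmark per URL, remove every later copy.
async function removeAllButOne(): Promise<void> {
  const seen = new Set<string>();
  const duplicateIds: string[] = [];

  // Depth-first walk of the bookmark tree.
  const walk = (node: browser.bookmarks.BookmarkTreeNode): void => {
    if (node.url) {
      if (seen.has(node.url)) {
        duplicateIds.push(node.id); // later copy: mark for removal
      } else {
        seen.add(node.url);         // first copy: keep it
      }
    }
    node.children?.forEach(walk);
  };

  (await browser.bookmarks.getTree()).forEach(walk);
  for (const id of duplicateIds) {
    await browser.bookmarks.remove(id);
  }
}
```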
- Rated 3 out of 5 by Firefox user 17792552, 10 months ago: I like what it says it can do, BUT I cannot figure out whether it deletes all of the duplicates or leaves one bookmark remaining and deletes the rest...
- Rated 3 out of 5 by Firefox user 13718743, 2 years ago: It finds broken links and duplicate bookmarks, but it doesn't seem to have a whitelist or an ignore list; that matters, because without one it keeps flagging the same bookmarks it thinks are wrong. Perhaps those features will be added in a future update, but for now I can't rate it higher because of what's missing. Otherwise, it's pretty good... :)
- Rated 3 out of 5 by agBAZE, 3 years ago: The "broken urls" and "remove duplicated folders" features are great and work really well.
But the other functions don't work at all. The "find duplicated" feature finds all the bookmarks it should, but the deletion doesn't work as expected.
- Rated 3 out of 5 by Nunya, 3 years ago: I like this app a lot, but I had to take off 2 points. First, I had to go through the whole list and manually select about 100 duplicates because they were in a different folder. Thanks for finding them, but you should tell people to merge folders before they bother with the duplicates. You should set it up as a single process: bing, bang, boom, just do the whole thing. That's 1 point.
The other thing is that your "Find Broken Urls" function isn't working. I don't know if it has to do with my JS settings or my VPN, but it's ignoring the timeout I provided: I set it to wait a full minute before adding a site to the list. Maybe it's on my end, or maybe you set the maximum to 1 second and didn't tell me, but it looks broken.
Whatever it is, I bet putting all the functions into a single, linear process fixes it, at least on my end. Let me know if you do that and I'll fix your rating. Keep up the good work.
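A timeout like the one this reviewer configured is easy to honor with a plain fetch and an AbortController. This is a minimal sketch under that assumption, not a claim about BookmarksCleanUp's internals; the 60-second default just mirrors the reviewer's setting.

```ts
// Hedged sketch of a broken-link check that honors a user-supplied timeout.
async function isReachable(url: string, timeoutMs = 60_000): Promise<boolean> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);
  try {
    // HEAD avoids downloading the body; some servers reject it,
    // in which case a GET fallback would be needed.
    const res = await fetch(url, { method: "HEAD", signal: controller.signal });
    return res.ok;
  } catch {
    return false; // timed out (aborted) or network error
  } finally {
    clearTimeout(timer);
  }
}
```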
- Rated 3 out of 5 by Firefox user 14306928, 3 years ago: It's a decent tool, but it has a bug (or several)! It does find the dupes all right, but it cannot distinguish between nuances of URLs; e.g. I saved a bunch of Google searches for later, and it sees them all as one dupe! And for the real, legitimate dupes, there are no commands or buttons to deal with them in one fell swoop; instead you have to sit there and manually click every dupe's checkbox to take action.
Find Broken URLs is a good feature, but many websites, like LinkedIn and Facebook, that do not allow web scrapers see it as a threat, causing BookmarksCleanUp to generate a false positive saying the bookmark is broken or dead. The average person using this add-on needs friendlier messages, such as "the target web URL does not allow remote checking!", rather than a raw HTTP 404.
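The friendlier wording the reviewer asks for amounts to mapping the raw HTTP status to a verdict before displaying it. A possible mapping follows; the status semantics are standard HTTP, but the message strings and the function itself are invented for illustration.

```ts
// Hedged sketch: turn a raw HTTP status into a user-facing verdict,
// so a scraper-blocking 403 is not reported as a dead link.
function describeCheck(status: number): string {
  if (status === 404 || status === 410) return "Link appears to be dead";
  if (status === 403 || status === 429) return "The target URL does not allow remote checking";
  if (status >= 500) return "Server error; try again later";
  return "Link looks fine";
}
```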
- Rated 3 out of 5 by endolith, 4 years ago: I wish it would delete only duplicates with the same names, tags, etc. It matches purely on URL, even when the other fields differ, and I don't want to lose any information.
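The stricter matching endolith describes would key duplicates on more than the URL. The sketch below illustrates such a composite key; the plain tags array is an assumption (Firefox does not expose bookmark tags this directly through the WebExtensions API), so treat the BookmarkInfo shape as hypothetical.

```ts
// Hedged sketch: two bookmarks count as duplicates only when
// URL, title, and tags all agree. BookmarkInfo is a hypothetical shape.
interface BookmarkInfo {
  url: string;
  title: string;
  tags: string[];
}

function dedupeKey(b: BookmarkInfo): string {
  // Sort tags so the key is insensitive to tag order.
  return JSON.stringify([b.url, b.title, [...b.tags].sort()]);
}
```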
- Rated 3 out of 5 by Wyndsayl, 4 years ago: Excellent job finding dead links of varying kinds.
However, it is really poor at finding duplicates. For some reason it shows folders at the same level as duplicates (:data) when they have nothing in common other than being folders. Follow through on that and you will delete your tree. Obviously this must be fixed.
- Rated 3 out of 5 by cniru, 4 years ago: Hello! I just installed this add-on, but how do I actually run it to clean up my bookmarks? I have seen the screenshots, but how do I get there? Sorry for the silly question.