Feature Request: partial library download #7
Comments
Thanks for the message! (1) Currently the tool can be run several times in the same folder and will make sure not to re-download an album, thanks to a simple cache file system (which you seem to have noticed). I think this should be enough for your case. (2) However, filters are indeed a good idea! I also think that being able to select (or avoid) specific artists, genres, or time periods would be really nice. I don't know if or when I will have time for this, but meanwhile PRs are welcome :).
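For anyone curious how a cache like this can work, here is a minimal sketch in Python, assuming the cache is a plain text file holding one album ID per line. The file name `.download-cache` and the `download_album` stub are made up for illustration; they are not the tool's actual internals.

```python
from pathlib import Path

CACHE_FILE = Path(".download-cache")  # hypothetical name; one album ID per line

def download_album(album_id: str) -> None:
    """Stub standing in for the tool's real download step."""
    print(f"downloading {album_id}")

def load_cache() -> set[str]:
    """Return the set of album IDs recorded by previous runs."""
    return set(CACHE_FILE.read_text().splitlines()) if CACHE_FILE.exists() else set()

def mark_downloaded(album_id: str) -> None:
    """Append an album ID to the cache so later runs skip it."""
    with CACHE_FILE.open("a") as f:
        f.write(album_id + "\n")

def download_collection(album_ids: list[str]) -> None:
    done = load_cache()
    for album_id in album_ids:
        if album_id in done:
            continue  # already fetched on a previous run
        download_album(album_id)
        mark_downloaded(album_id)
```

Because the cache is appended to after each successful download, an interrupted run can simply be restarted and will pick up where it left off.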
I similarly ran into this problem. I would love to be able to maintain an "ignore list" of some kind. I have some things in my collection, like sound sample libraries, that don't belong in the same area as music.
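An ignore list could be as simple as a text file of names to skip. A minimal sketch, assuming one artist or album name per line with case-insensitive matching; the file name `.ignore-list` and both function names are hypothetical, not part of the tool:

```python
from pathlib import Path

IGNORE_FILE = Path(".ignore-list")  # hypothetical: one artist or album name per line

def load_ignore_list() -> set[str]:
    """Read the ignore file, skipping blank lines; entries are lowercased."""
    if not IGNORE_FILE.exists():
        return set()
    return {line.strip().lower() for line in IGNORE_FILE.read_text().splitlines() if line.strip()}

def is_ignored(artist: str, title: str, ignored: set[str]) -> bool:
    """True if either the artist or the album title is on the list."""
    return artist.lower() in ignored or title.lower() in ignored
```

Filtering the collection with `is_ignored` before the download loop would keep sample libraries and similar non-music items out of the music folder.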
I'd appreciate a feature like this too. Several items in my collection are duplicated as a result of buying a digital release first and then a physical release, like a cassette or CD, later on. The content of both downloads is exactly the same, but sometimes the title of the physical release differs, resulting in a duplicate.
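Since these duplicates share identical content under different titles, one way to find them is to hash file contents rather than compare names. A rough sketch, not part of the tool, assuming each album lives in its own folder under a downloads root:

```python
import hashlib
from pathlib import Path

def file_digest(path: Path) -> str:
    """SHA-256 of a file's content, independent of its name."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicate_albums(root: Path) -> dict[str, list[Path]]:
    """Group album folders whose combined file content is identical."""
    groups: dict[str, list[Path]] = {}
    for album_dir in sorted(p for p in root.iterdir() if p.is_dir()):
        # Hash the sorted per-file digests so track names and order don't matter.
        digests = sorted(file_digest(f) for f in album_dir.rglob("*") if f.is_file())
        key = hashlib.sha256("".join(digests).encode()).hexdigest()
        groups.setdefault(key, []).append(album_dir)
    return {k: v for k, v in groups.items() if len(v) > 1}
```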
That works for my use case!
I have a library of ~500 purchased albums on Bandcamp.
I purchased about 50 albums today (a subscription to an artist artificially inflated that, but that's another story).
I've only just found this project, and it looks great. I have it set up and ready to go, and it wants to download everything, as it is designed to.
Having a regex and/or date argument (before/after/between) could prevent re-downloading of items...
Update: There is a cache file dropped on each run, so that should help long term, making this a pretty low-priority request... just running it now.
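To make the request concrete, here is a sketch of what such flags could look like, using Python's `argparse`. The flag names `--match`, `--before`, and `--after` are invented for illustration and are not actual options of this tool; combining `--after` and `--before` covers the "between" case.

```python
import argparse
import re
from datetime import date

def parse_args() -> argparse.Namespace:
    # Flag names here are illustrative, not actual options of the tool.
    p = argparse.ArgumentParser(description="Download a filtered subset of a collection")
    p.add_argument("--match", type=re.compile, help="regex applied to 'Artist - Title'")
    p.add_argument("--before", type=date.fromisoformat, help="only items purchased before this date (YYYY-MM-DD)")
    p.add_argument("--after", type=date.fromisoformat, help="only items purchased after this date (YYYY-MM-DD)")
    return p.parse_args()

def keep(item_name: str, purchased: date, args: argparse.Namespace) -> bool:
    """Apply the regex and date window; an unset flag passes everything."""
    if args.match and not args.match.search(item_name):
        return False
    if args.before and purchased >= args.before:
        return False
    if args.after and purchased <= args.after:
        return False
    return True
```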