Releases: FerrahWolfeh/imageboard-downloader-rs
Version 1.7.0
Download from custom websites
Currently, the custom servers config file will reside by default in one of:
- `$XDG_CONFIG_HOME/imageboard-downloader/servers.toml` (Linux/macOS)
- `%APPDATA%/FerrahWolfeh/imageboard-downloader/servers.toml` (Windows)
- `$IBDL_SERVER_CFG/servers.toml`
Once configured, downloading from a custom server is straightforward and similar to the old way:
cargo run --release -- search -i my_custom_server "tag1" "tag2"
To verify which servers are available for download, run the app as usual, but with the `--servers` flag:
cargo run --release -- post --servers 123456
The current format for the server config file is as follows; some fields may safely be omitted:
[servers]
# Currently supported server types: "danbooru", "e621", "gelbooru", "gelbooru beta 0.2" (alias "gelbooru_020"), "moebooru"
[servers.danbooru]
pretty_name = "Danbooru" # Required
server = "danbooru" # Required
base_url = "https://danbooru.donmai.us" # Required
post_url = "https://danbooru.donmai.us/posts/" # Optional
post_list_url = "https://danbooru.donmai.us/posts.json" # Optional
pool_idx_url = "https://danbooru.donmai.us/pools" # Optional
max_post_limit = 200 # Required
auth_url = "https://danbooru.donmai.us/profile.json" # Optional
[servers.gelbooru]
pretty_name = "Gelbooru"
server = "gelbooru"
base_url = "https://gelbooru.com"
post_url = "http://gelbooru.com/index.php?page=dapi&s=post&q=index&json=1"
post_list_url = "http://gelbooru.com/index.php?page=dapi&s=post&q=index&json=1"
max_post_limit = 100
[servers.yandere]
pretty_name = "Yande.re"
server = "moebooru"
base_url = "https://yande.re"
post_list_url = "https://yande.re/post.json"
max_post_limit = 100
[servers.xbooru]
pretty_name = "xbooru"
server = "gelbooru"
base_url = "https://xbooru.com/"
post_url = "https://xbooru.com/index.php?page=dapi&s=post&q=index&json=1"
post_list_url = "https://xbooru.com/index.php?page=dapi&s=post&q=index&json=1"
max_post_limit = 100
image_url = "https://img.xbooru.com/"
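Putting it together, a minimal sketch of a user-defined entry might look like the following. The section name `my_custom_server` and the hostname `boards.example.com` are hypothetical placeholders, and the optional URL fields are left out:

```toml
# Hypothetical entry: "my_custom_server" and boards.example.com are
# placeholders, not a preconfigured site.
[servers.my_custom_server]
pretty_name = "My Custom Server"                       # Required: display name
server = "moebooru"                                    # Required: one of the supported server types
base_url = "https://boards.example.com"                # Required
post_list_url = "https://boards.example.com/post.json" # Optional
max_post_limit = 100                                   # Required: max posts per API page
```

With an entry like that in `servers.toml`, the `-i my_custom_server` invocation shown earlier would target it.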
These changes also slightly alter how the global blacklist works; the new format is:
[blacklist]
[blacklist.global]
tags = [] # Place in this array all the tags that will be excluded from all imageboards
# The following sections hold tags that will be excluded only from specific imageboards
[blacklist.danbooru]
tags = []
[blacklist.e621]
tags = []
[blacklist.realbooru]
tags = []
[blacklist.rule34]
tags = []
[blacklist.gelbooru]
tags = []
[blacklist.konachan]
tags = []
[blacklist.custom_server]
tags = []
Known Bugs Fixed
- Downloading with `gelbooru` servers paradoxically crashes with `ExtractorError::PostMapFail` when a download session finishes successfully.
- The dialog message to confirm overwriting the output directory messes up the debug logs.
- Global Blacklist counting items removed by `--no-animated`.
- Realbooru server giving 404 errors when downloading posts.
Important API changes:
- `ibdl-common` no longer holds all info from the imageboards in the `ImageBoards` enum.
- Major overhaul of the `Extractor` trait. It now exposes `features()` and `config()` functions to better interface with the new workflow.
- Other stuff that I definitely forgot... (it's a lot)
Main PR
- Add custom imageboard servers functionality & major code cleanup and patching in #7
Full Changelog: 1.6.2...1.7.0
Version 1.5.5
Changelog
CLI
- Fixed a small naming inconsistency when using `-O`
Version 1.5.4
Changelog
CLI
- Added some cli flag constraints when downloading pools
- Added inverse pool download order with `--latest`
- Fixed incorrect download of pools #5
Lib
- Only added some additional args to `PoolDownload`
Version 1.5.1
Changelog
CLI
- Small fix to e621 pool downloading function
Version 1.5.0
Changelog
CLI
- Added capability to search for more than 2 tags on danbooru without gold account
- Added function to download complete and partial pools from Danbooru and e621 with `--pool`
Lib
- Separated most modules from Queue into their own files
- Added pool download functionality with the `PoolDownload` trait
- Generic bug fixes here and there
Version 1.4.0
Changelog
CLI
- Now the async download method is the default one.
  - Please refer to branch `pre-1.4.0` if you still want to use the old method.
- Dropped the `--update` function because it is too unreliable.
- Fixed the bug that made `--annotate` not write any files
- Fixed progress bar download counters.
Lib
- Removed normal `Queue`
- Removed `SummaryWriter` and all its functionality
- Implemented more robust counters.
Version 1.3.0
Changelog
CLI
- Added `--annotate` arg to help download images and captions for Stable Diffusion finetuning
- Added new `--async` function to fetch posts while downloading (speeds up the download process quite a bit)
Lib
- Overhaul of the tagging system. Now most tags are contained in a `Tag` struct and classified by their type.
- Added new workflow for asynchronous downloading.
Version 1.1.1
Changelog
CLI
- Added new `--update` flag. Using it will download only the latest posts for the tags you chose after a complete download session.
Lib
- Refactored the Global Blacklist to read the fs only once after the extractor has been initialized
- [e621] Don't serialize the `invalid` tags since they're not commonly used while searching
Version 1.0.0
Changelog
CLI
- Fixed progress bar inconsistent width and styling
- Limited the number of concurrent downloads to 20
- Rename already downloaded files to their ID if the MD5-named file is present and vice-versa
Lib
- Post downloading is now multithreaded
- Extractors now export their `Client`; removed the post count.
- Finalized user-facing API
- Removed all printing functions from lib
- New custom error types for the extractors
- Unified blacklist with all extractors
- Deprecated extractor-specific Safe Mode
- Added global Safe Mode
- Increased debug logging
- Sped up the `cbz` download path by making everything `async`
- Extreme speedup on MD5 checking of preexisting files
- Rename file to id or MD5 if it's already downloaded in the folder
- More documentation coverage
Version 0.27.3
- Added a limit to how many posts can be downloaded
- Added `--start-page` flag to specify which page to start searching posts from