diff --git a/docs/changelog.md b/docs/changelog.md
index 57315d2ce..dc75f1225 100644
--- a/docs/changelog.md
+++ b/docs/changelog.md
@@ -7,6 +7,39 @@ title: Changelog
!!! note
This is the new changelog, only the most recent builds. For all versions, see the [old changelog](old_changelog.html).
+## [Version 575](https://github.com/hydrusnetwork/hydrus/releases/tag/v575)
+
+### misc
+
+* the new 'children' tab now sorts its results by count, and it only shows the top n (default 40) results. you can edit the n under _options->tags_. let me know how this works IRL, as this new count-sorting needs a bit of extra CPU
+* when you ask subscriptions to 'check now', either in the 'edit subscription' or 'edit subscriptions' dialogs, if there is a mix of DEAD and ALIVE subs, it now pops up a quick question dialog asking whether you want to check now for all/alive/dead
+* fixed the (do not) 'alphabetise GET query parameters' URL Class checkbox, which I broke in v569. sorry for the trouble--the new URL encoding handling was accidentally alphabetising all URLs on ingestion. a new unit test will catch this in future, so it shouldn't happen again (issue #1551)
+* thanks to a user, I think we have fixed ICC profile processing when your system ICC Profile is non-sRGB
+* fixed a logical test that was disallowing thumbnail regen on files with no resolution (certain svg, for instance). all un-resolutioned files will now (re)render a thumb to the max bounding thumbnail resolution setting. fingers crossed we'll be able to figure out a ratio solution in future
+* added a _debug->help->gui actions->reload current stylesheet_ menu action. it unloads and reloads the current QSS
+* added a _debug->help->gui actions->reload current gui session_ menu action. it saves the current session and reloads it
+* fixed the rendering of some 16-bit pngs that seem to be getting a slightly different image mode on the new version of PIL
+* the debug 'gui report mode' now reports extensive info about virtual taglist heights. if I have been working with you on taglists, mostly on the manage tags dialog, that spawn without a scrollbar even though they should, please run this mode and then try to capture the error. hit me up and we'll see if the numbers explain what's going on. I may have also simply fixed the bug
+* I think I sped up adding tags to a local tag service that has a lot of siblings/parents
+* updated the default danbooru parsers to get the original and/or translated artist notes. I don't know if a user did this or I did, but my dev machine somehow already had the tech while the defaults did not--if you did this, thanks!
+* added more tweet URL Classes for the default downloader. you should now be able to drag and drop a vxtwitter or fxtwitter URL on the client and it'll work
+
+### auto-duplicate resolution
+
+* I have nothing real to show today, but I have a skeleton of code and a good plan on how to get the client resolving easy duplicate pairs by itself. so far, it looks easier than I feared, but, as always, there will be a lot to do. I will keep chipping away at this and will release features in tentative waves for advanced users to play with
+* with this system, I will be launching the very first version of the 'Metadata Conditional' object I have been talking about for a few years. fingers crossed, we'll be able to spam it to all sorts of other places to do 'if the file has x property, then do y' in a standardised way
+
+### boring stuff
+
+* refactored the new tag children autocomplete tab to its own class so it can handle its new predicate gubbins and sorted/culled search separately. it is also now aware of the current file location context to give file-domain-sensitive suggestions (falling back to 'all known files' for fast search if things are complicated)
+* fixed a layout issue on the file import options panel when a sister page caused it to be taller than it wanted; the help button ended up being the expanding widget jej
+* non-menubar menus and submenus across the program now remove a hanging final separator item, making the logic of forming menu groups a little easier in future
+* the core 'Load image in PIL' method has some better error reporting, and many calls now explicitly tell it a human-readable source description so we can avoid repeats of `DamagedOrUnusualFileException: Could not load the image at "<_io.BytesIO object at 0x000001F60CE45620>"--it was likely malformed!`
+* cleaned up some dict instantiations in `ClientOptions`
+* moved `ClientDuplicates` up to a new `duplicates` module and migrated some duplicate enums over to it from `ClientConstants`
+* removed an old method-wrapper hack that applied the 'load images with PIL' option. I just moved to a global that I set on init and update on options change
+* cleaned some duplicate checking code
+
## [Version 574](https://github.com/hydrusnetwork/hydrus/releases/tag/v574)
### local hashes cache
@@ -381,46 +414,3 @@ title: Changelog
* just a small thing, but the under-documented `/manage_database/get_client_options` call now says the four types of default tag sort. I left the old key, `default_tag_sort`, in so as not to break stuff, but it is just a copy of the `search_page` variant in the new `default_tag_sort_xxx` foursome
* client api version is now 62
-
-## [Version 564](https://github.com/hydrusnetwork/hydrus/releases/tag/v564)
-
-### more macOS work
-
-* thanks to a user, we have more macOS features:
-* macOS users get a new shortcut action, default Space, that uses Quick Look to preview a thumbnail like you can in Finder. **all existing users will get the new shortcut!**
-* the hydrus .app now has the version number in Get Info
-* **macOS users who run from source should rebuild their venvs this week!** if you don't, then trying this new Quick Look feature will just give you an error notification
-
-### new fuzzy operator math in system predicates
-
-* the system predicates for width, height, num_notes, num_words, num_urls, num_frames, duration, and framerate now support two different kinds of approximate equals, ≈: absolute (±x), and percentage (±x%). previously, the ≈ secretly just did ±15% in all cases (issue #1468)
-* all `system:framerate=x` searches are now converted to `±5%`, which is what they were behind the scenes. `!=` framerate stuff is no longer supported, so if you happened to use it, it is now converted to `<` just as a valid fallback
-* `system:duration` gets the same thing, `±5%`. it wasn't doing this behind the scenes before, but it should have been!
-* `system:duration` also now allows hours and minutes input, if you need longer!
-* for now, the parsing system is not updated to specify the % or absolute ± values. it will remain the same as the old system, with ±15% as the default for a `~=` input
-* there's still a little borked logic in these combined types. if you search `< 3 URLs`, that will return files with 0 URLs, and same for `num_notes`, but if you search `< 200px width` or any of the others I changed this week, that won't return a PDF that has no width (although it will return a damaged file that reports 0 width specifically). I am going to think about this, since there isn't an easy one-size-fits-all-solution to marry what is technically correct with what is actually convenient. I'll probably add a checkbox that says whether to include 'Null' values or not and default that True/False depending on the situation; let me know what you think!
-
-### misc
-
-* I have taken out Space as the default for archive/delete filter 'keep' and duplicate filter 'this is better, delete other'. Space is now exclusively, by default, media pause/play. **I am going to set this to existing users too, deleting/overwriting what Space does for you, if you are still set to the defaults**
-* integer percentages are now rendered without the trailing `.0`. `15%`, not `15.0%`
-* when you 'open externally', 'open in web browser', or 'open path' from a thumbnail, the preview viewer now pauses rather than clears completely
-* fixed the edit shortcut panel ALWAYS showing the new (home/end/left/right/to focus) dropdown for thumbnail dropdown, arrgh
-* I fixed a stupid typo that was breaking file repository file deletes
-* `help->about` now shows the Qt platformName
-* added a note about bad Wayland support to the Linux 'installing' help document
-* the guy who wrote the `Fixing_Hydrus_Random_Crashes_Under_Linux` document has updated it with new information, particularly related to running hydrus fast using virtual memory on small, underpowered computers
-
-### client api
-
-* thanks to a user, the undocumented API call that returns info on importer pages now includes the sha256 file hash in each import object Object
-* although it is a tiny change, let's nonetheless update the Client API version to 61
-
-### boring predicate overhaul work
-
-* updated the `NumberTest` object to hold specific percentage and absolute ± values
-* updated the `NumberTest` object to render itself to any number format, for instance pixels vs kilobytes vs a time delta
-* updated the `Predicate` object for system preds width, height, num_notes, num_words, num_urls, num_frames, duration, and framerate to store their operator and value as a `NumberTest`, and updated predicate string rendering, parsing, editing, database-level predicate handling
-* wrote new widgets to edit `NumberTest`s of various sorts and spammed them to these (operator, value) system predicate UI panels. we are finally clearing out some 8+-year-old jank here
-* rewrote the `num_notes` database search logic to use `NumberTest`s
-* the system preds for height, width, and framerate now say 'has x' and 'no x' when set to `>0` or `=0`, although what these really mean is not perfectly defined
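The count-sorted, culled 'children' tab from the v575 misc notes boils down to a sort-and-truncate over suggestion counts. A minimal sketch; `get_top_children` and its inputs are illustrative stand-ins, not hydrus's actual API:

```python
from collections import Counter

def get_top_children(child_counts: Counter, n: int = 40) -> list:
    """Sort child tag suggestions by count, descending, and keep only the
    top n. child_counts maps child tag -> count; n mirrors the editable
    option under options->tags (default 40)."""
    # most_common sorts by count descending, which is the new CPU cost
    return [tag for (tag, count) in child_counts.most_common(n)]
```

For example, `get_top_children(Counter({'a': 5, 'b': 9, 'c': 1}), n=2)` returns `['b', 'a']`.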
diff --git a/docs/getting_started_installing.md b/docs/getting_started_installing.md
index b6cb48397..d157bb9c0 100644
--- a/docs/getting_started_installing.md
+++ b/docs/getting_started_installing.md
@@ -83,7 +83,7 @@ By default, hydrus stores all its data—options, files, subscriptions, _everyth
!!! danger "Bad Locations"
**Do not install to a network location!** (i.e. on a different computer's hard drive) The SQLite database is sensitive to interruption and requires good file locking, which network interfaces often fake. There are [ways of splitting your client up](database_migration.md) so the database is on a local SSD but the files are on a network--this is fine--but you really should not put the database on a remote machine unless you know what you are doing and have a backup in case things go wrong.
- **Do not install to a location with filesystem-level compression enabled!** It may work ok to start, but when the SQLite database grows to large size, this can cause extreme access latency and I/O errors and corruption.
+    **Do not install to a location with filesystem-level compression enabled (e.g. BTRFS)!** It may work ok to start, but when the SQLite database grows to a large size, this can cause extreme access latency, I/O errors, and corruption.
!!! info "For macOS users"
The Hydrus App is **non-portable** and puts your database in `~/Library/Hydrus` (i.e. `/Users/[You]/Library/Hydrus`). You can update simply by replacing the old App with the new, but if you wish to backup, you should be looking at `~/Library/Hydrus`, not the App itself.
diff --git a/docs/old_changelog.html b/docs/old_changelog.html
index ac53cbedf..b2c33e6f9 100644
--- a/docs/old_changelog.html
+++ b/docs/old_changelog.html
@@ -34,6 +34,36 @@
+ -
+
+
+ misc
+ - the new 'children' tab now sorts its results by count, and it only shows the top n (default 40) results. you can edit the n under _options->tags_. let me know how this works IRL, as this new count-sorting needs a bit of extra CPU
+ - when you ask subscriptions to 'check now', either in the 'edit subscription' or 'edit subscriptions' dialogs, if there is a mix of DEAD and ALIVE subs, it now pops up a quick question dialog asking whether you want to check now for all/alive/dead
+ - fixed the (do not) 'alphabetise GET query parameters' URL Class checkbox, which I broke in v569. sorry for the trouble--the new URL encoding handling was accidentally alphabetising all URLs on ingestion. a new unit test will catch this in future, so it shouldn't happen again (issue #1551)
+ - thanks to a user, I think we have fixed ICC profile processing when your system ICC Profile is non-sRGB
+ - fixed a logical test that was disallowing thumbnail regen on files with no resolution (certain svg, for instance). all un-resolutioned files will now (re)render a thumb to the max bounding thumbnail resolution setting. fingers crossed we'll be able to figure out a ratio solution in future
+ - added a _debug->help->gui actions->reload current stylesheet_ menu action. it unloads and reloads the current QSS
+ - added a _debug->help->gui actions->reload current gui session_ menu action. it saves the current session and reloads it
+ - fixed the rendering of some 16-bit pngs that seem to be getting a slightly different image mode on the new version of PIL
+ - the debug 'gui report mode' now reports extensive info about virtual taglist heights. if I have been working with you on taglists, mostly on the manage tags dialog, that spawn without a scrollbar even though they should, please run this mode and then try to capture the error. hit me up and we'll see if the numbers explain what's going on. I may have also simply fixed the bug
+ - I think I sped up adding tags to a local tag service that has a lot of siblings/parents
+            - updated the default danbooru parsers to get the original and/or translated artist notes. I don't know if a user did this or I did, but my dev machine somehow already had the tech while the defaults did not--if you did this, thanks!
+ - added more tweet URL Classes for the default downloader. you should now be able to drag and drop a vxtwitter or fxtwitter URL on the client and it'll work
+ auto-duplicate resolution
+ - I have nothing real to show today, but I have a skeleton of code and a good plan on how to get the client resolving easy duplicate pairs by itself. so far, it looks easier than I feared, but, as always, there will be a lot to do. I will keep chipping away at this and will release features in tentative waves for advanced users to play with
+ - with this system, I will be launching the very first version of the 'Metadata Conditional' object I have been talking about for a few years. fingers crossed, we'll be able to spam it to all sorts of other places to do 'if the file has x property, then do y' in a standardised way
+ boring stuff
+ - refactored the new tag children autocomplete tab to its own class so it can handle its new predicate gubbins and sorted/culled search separately. it is also now aware of the current file location context to give file-domain-sensitive suggestions (falling back to 'all known files' for fast search if things are complicated)
+            - fixed a layout issue on the file import options panel when a sister page caused it to be taller than it wanted; the help button ended up being the expanding widget jej
+ - non-menubar menus and submenus across the program now remove a hanging final separator item, making the logic of forming menu groups a little easier in future
+ - the core 'Load image in PIL' method has some better error reporting, and many calls now explicitly tell it a human-readable source description so we can avoid repeats of `DamagedOrUnusualFileException: Could not load the image at "<_io.BytesIO object at 0x000001F60CE45620>"--it was likely malformed!`
+ - cleaned up some dict instantiations in `ClientOptions`
+ - moved `ClientDuplicates` up to a new `duplicates` module and migrated some duplicate enums over to it from `ClientConstants`
+ - removed an old method-wrapper hack that applied the 'load images with PIL' option. I just moved to a global that I set on init and update on options change
+ - cleaned some duplicate checking code
+
+
-
@@ -53,7 +83,7 @@
- when setting up an import folder, the dialog will now refuse to OK if you set a path that is 1) above the install dir or db dir or 2) above or below any of your file storage locations. shouldn't be possible to set up an import from your own file storage folder by accident any more
- added a new 'apply image ICC Profile colour adjustments' checkbox to _options->media_. this simply turns off ICC profile loading and application, for debug purposes
boring cleanup
- - the default SQLite page size is now 4096 bytes on Linux, the SQLite default. it was 1024 previously, but SQLite now recommend 4096 for all platforms. the next time Linux users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this
+                    - the default SQLite page size is now 4096 bytes on Linux and macOS, the SQLite default. it was 1024 previously, but SQLite now recommends 4096 for all platforms. the next time Linux and macOS users vacuum any of their databases, they will get fixed. I do not think this is a big deal, so don't rush to force this
- fixed the last couple dozen missing layout flags across the program, which were ancient artifacts from the wx->Qt conversion
- fixed the WTFPL licence to be my copyright, lol
- deleted the local booru service management/UI code
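The hanging-final-separator cleanup from the 'boring stuff' notes amounts to stripping separators off the end of a menu before showing it, so callers can append a separator after every logical group. A sketch; modelling a menu as a list where `None` is a separator is my illustration, not the Qt code:

```python
def remove_trailing_separators(menu_items: list) -> list:
    """Drop any separators (None) hanging off the end of a menu item list."""
    items = list(menu_items)
    # pop separators until the menu ends on a real item (or is empty)
    while items and items[-1] is None:
        items.pop()
    return items
```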
diff --git a/hydrus/client/ClientConstants.py b/hydrus/client/ClientConstants.py
index c41f866bc..3ede72482 100644
--- a/hydrus/client/ClientConstants.py
+++ b/hydrus/client/ClientConstants.py
@@ -65,10 +65,6 @@
DIRECTION_DOWN : 'bottom'
}
-DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
-DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1
-DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES = 2
-
FIELD_VERIFICATION_RECAPTCHA = 0
FIELD_COMMENT = 1
FIELD_TEXT = 2
@@ -146,16 +142,6 @@
HAMMING_SPECULATIVE : 'speculative'
}
-SIMILAR_FILES_PIXEL_DUPES_REQUIRED = 0
-SIMILAR_FILES_PIXEL_DUPES_ALLOWED = 1
-SIMILAR_FILES_PIXEL_DUPES_EXCLUDED = 2
-
-similar_files_pixel_dupes_string_lookup = {
- SIMILAR_FILES_PIXEL_DUPES_REQUIRED : 'must be pixel dupes',
- SIMILAR_FILES_PIXEL_DUPES_ALLOWED : 'can be pixel dupes',
- SIMILAR_FILES_PIXEL_DUPES_EXCLUDED : 'must not be pixel dupes'
-}
-
IDLE_NOT_ON_SHUTDOWN = 0
IDLE_ON_SHUTDOWN = 1
IDLE_ON_SHUTDOWN_ASK_FIRST = 2
diff --git a/hydrus/client/ClientController.py b/hydrus/client/ClientController.py
index 456efecda..f86921297 100644
--- a/hydrus/client/ClientController.py
+++ b/hydrus/client/ClientController.py
@@ -1023,6 +1023,8 @@ def ReinitGlobalSettings( self ):
HydrusImageHandling.SetEnableLoadTruncatedImages( self.new_options.GetBoolean( 'enable_truncated_images_pil' ) )
HydrusImageNormalisation.SetDoICCProfileNormalisation( self.new_options.GetBoolean( 'do_icc_profile_normalisation' ) )
+ HydrusImageHandling.FORCE_PIL_ALWAYS = self.new_options.GetBoolean( 'load_images_with_pil' )
+
def InitModel( self ):
@@ -1490,7 +1492,7 @@ def MaintainDB( self, maintenance_mode = HC.MAINTENANCE_IDLE, stop_time = None )
if work_done:
- from hydrus.client import ClientDuplicates
+ from hydrus.client.duplicates import ClientDuplicates
ClientDuplicates.DuplicatesManager.instance().RefreshMaintenanceNumbers()
diff --git a/hydrus/client/ClientDownloading.py b/hydrus/client/ClientDownloading.py
index be713119f..e4be03eee 100644
--- a/hydrus/client/ClientDownloading.py
+++ b/hydrus/client/ClientDownloading.py
@@ -334,7 +334,7 @@ def MainLoop( self ):
file_import_options.SetPreImportURLCheckLooksForNeighbours( preimport_url_check_looks_for_neighbours )
file_import_options.SetPostImportOptions( automatic_archive, associate_primary_urls, associate_source_urls )
- file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+ file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = f'Downloaded File - {hash.hex()}' )
file_import_job.DoWork()
diff --git a/hydrus/client/ClientFiles.py b/hydrus/client/ClientFiles.py
index 8ff12445c..27489a0d7 100644
--- a/hydrus/client/ClientFiles.py
+++ b/hydrus/client/ClientFiles.py
@@ -1792,7 +1792,7 @@ def RegenerateThumbnailIfWrongSize( self, media ):
thumbnail_mime = HydrusFileHandling.GetThumbnailMime( path )
- numpy_image = ClientImageHandling.GenerateNumPyImage( path, thumbnail_mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( path, thumbnail_mime )
( current_width, current_height ) = HydrusImageHandling.GetResolutionNumPy( numpy_image )
@@ -1863,7 +1863,7 @@ def shutdown( self ):
-def HasHumanReadableEmbeddedMetadata( path, mime ):
+def HasHumanReadableEmbeddedMetadata( path, mime, human_file_description = None ):
if mime not in HC.FILES_THAT_CAN_HAVE_HUMAN_READABLE_EMBEDDED_METADATA:
@@ -1878,7 +1878,7 @@ def HasHumanReadableEmbeddedMetadata( path, mime ):
try:
- pil_image = HydrusImageOpening.RawOpenPILImage( path )
+ pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
except:
@@ -2051,7 +2051,7 @@ def _CanRegenThumbForMediaResult( self, media_result ):
( width, height ) = media_result.GetResolution()
- if width is None or height is None:
+        if mime in HC.MIMES_THAT_ALWAYS_HAVE_GOOD_RESOLUTION and ( width is None or height is None ):
# this guy is probably pending a metadata regen but the user forced thumbnail regen now
# we'll wait for metadata regen to notice the new dimensions and schedule this job again
@@ -2420,7 +2420,7 @@ def _HasICCProfile( self, media_result ):
path = self._controller.client_files_manager.GetFilePath( hash, mime )
if mime == HC.APPLICATION_PSD:
-
+
try:
has_icc_profile = HydrusPSDHandling.PSDHasICCProfile( path )
@@ -2672,7 +2672,7 @@ def _RegenBlurhash( self, media ):
thumbnail_mime = HydrusFileHandling.GetThumbnailMime( thumbnail_path )
- numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
return HydrusBlurhash.GetBlurhashFromNumPy( numpy_image )
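The changed test in `_CanRegenThumbForMediaResult` now only defers thumbnail regen when a filetype that is supposed to have a resolution is missing its dimensions; resolution-less types like some svg fall through and get a thumbnail. A sketch of that predicate, with `good_resolution_mimes` standing in for `HC.MIMES_THAT_ALWAYS_HAVE_GOOD_RESOLUTION`:

```python
def can_regen_thumb(mime: str, width, height, good_resolution_mimes: set) -> bool:
    """Return False only when a known-good-resolution file is missing its
    dimensions (it is probably pending a metadata regen); other types may
    regen a thumb at the max bounding thumbnail resolution anyway."""
    if mime in good_resolution_mimes and (width is None or height is None):
        return False
    return True
```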
diff --git a/hydrus/client/ClientFilesPhysical.py b/hydrus/client/ClientFilesPhysical.py
index 11bc834df..f636bfb42 100644
--- a/hydrus/client/ClientFilesPhysical.py
+++ b/hydrus/client/ClientFilesPhysical.py
@@ -64,6 +64,10 @@ def GetMissingPrefixes( merge_target: str, prefixes: typing.Collection[ str ], m
return missing_prefixes
+# TODO: A 'FilePath' or 'FileLocation' or similar that holds the path or IO stream, and/or temp_path to use for import calcs, and hash once known, and the human description like 'this came from blah URL'
+# then we spam that all over the import pipeline and when we need a nice error, we ask that guy to describe himself
+# search up 'human_file_description' to see what we'd be replacing
+
class FilesStorageBaseLocation( object ):
def __init__( self, path: str, ideal_weight: int, max_num_bytes = None ):
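The TODO above describes one object carrying the path or stream, the hash once known, and the human description through the import pipeline, replacing the scattered `human_file_description` parameters. A speculative guess at its shape; the class name, fields, and `describe` method are all my invention, not hydrus code:

```python
import dataclasses
from typing import Optional

@dataclasses.dataclass
class FileLocation:
    """Speculative sketch of the TODO's 'FilePath'/'FileLocation' object."""
    path: Optional[str] = None
    temp_path: Optional[str] = None
    hash_hex: Optional[str] = None
    human_description: Optional[str] = None  # e.g. 'this came from blah URL'

    def describe(self) -> str:
        # the 'nice error' hook: prefer the human description over raw paths
        if self.human_description is not None:
            return self.human_description
        if self.path is not None:
            return self.path
        return 'unknown file'
```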
diff --git a/hydrus/client/ClientImageHandling.py b/hydrus/client/ClientImageHandling.py
index cb222bcbe..7893f423b 100644
--- a/hydrus/client/ClientImageHandling.py
+++ b/hydrus/client/ClientImageHandling.py
@@ -28,13 +28,6 @@ def DiscardBlankPerceptualHashes( perceptual_hashes ):
return perceptual_hashes
-def GenerateNumPyImage( path, mime ):
-
- force_pil = CG.client_controller.new_options.GetBoolean( 'load_images_with_pil' )
-
- return HydrusImageHandling.GenerateNumPyImage( path, mime, force_pil = force_pil )
-
-
def GenerateShapePerceptualHashes( path, mime ):
if HG.phash_generation_report_mode:
@@ -44,7 +37,7 @@ def GenerateShapePerceptualHashes( path, mime ):
try:
- numpy_image = GenerateNumPyImage( path, mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )
return GenerateShapePerceptualHashesNumPy( numpy_image )
diff --git a/hydrus/client/ClientOptions.py b/hydrus/client/ClientOptions.py
index 2fa11f19b..7ac911875 100644
--- a/hydrus/client/ClientOptions.py
+++ b/hydrus/client/ClientOptions.py
@@ -11,8 +11,8 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing.options import FileImportOptions
class ClientOptions( HydrusSerialisable.SerialisableBase ):
@@ -122,206 +122,134 @@ def _GetSerialisableInfo( self ):
def _InitialiseDefaults( self ):
- self._dictionary[ 'booleans' ] = {}
-
- self._dictionary[ 'booleans' ][ 'advanced_mode' ] = False
-
- self._dictionary[ 'booleans' ][ 'remove_filtered_files_even_when_skipped' ] = False
-
- self._dictionary[ 'booleans' ][ 'filter_inbox_and_archive_predicates' ] = False
-
- self._dictionary[ 'booleans' ][ 'discord_dnd_fix' ] = False
- self._dictionary[ 'booleans' ][ 'secret_discord_dnd_fix' ] = False
-
- self._dictionary[ 'booleans' ][ 'show_unmatched_urls_in_media_viewer' ] = False
-
- self._dictionary[ 'booleans' ][ 'set_search_focus_on_page_change' ] = False
-
- self._dictionary[ 'booleans' ][ 'allow_remove_on_manage_tags_input' ] = True
- self._dictionary[ 'booleans' ][ 'yes_no_on_remove_on_manage_tags' ] = True
-
- self._dictionary[ 'booleans' ][ 'activate_window_on_tag_search_page_activation' ] = False
-
- self._dictionary[ 'booleans' ][ 'show_related_tags' ] = True
- self._dictionary[ 'booleans' ][ 'show_file_lookup_script_tags' ] = False
-
- self._dictionary[ 'booleans' ][ 'use_native_menubar' ] = HC.PLATFORM_MACOS
-
- self._dictionary[ 'booleans' ][ 'shortcuts_merge_non_number_numpad' ] = True
-
- self._dictionary[ 'booleans' ][ 'disable_get_safe_position_test' ] = False
-
- self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_mouse_on_other_monitor' ] = False
- self._dictionary[ 'booleans' ][ 'freeze_message_manager_when_main_gui_minimised' ] = False
-
- self._dictionary[ 'booleans' ][ 'load_images_with_pil' ] = True
-
- self._dictionary[ 'booleans' ][ 'only_show_delete_from_all_local_domains_when_filtering' ] = False
-
- self._dictionary[ 'booleans' ][ 'use_system_ffmpeg' ] = False
-
- self._dictionary[ 'booleans' ][ 'elide_page_tab_names' ] = True
-
- self._dictionary[ 'booleans' ][ 'maintain_similar_files_duplicate_pairs_during_idle' ] = False
-
- self._dictionary[ 'booleans' ][ 'show_namespaces' ] = True
- self._dictionary[ 'booleans' ][ 'show_number_namespaces' ] = True
- self._dictionary[ 'booleans' ][ 'show_subtag_number_namespaces' ] = True
- self._dictionary[ 'booleans' ][ 'replace_tag_underscores_with_spaces' ] = False
- self._dictionary[ 'booleans' ][ 'replace_tag_emojis_with_boxes' ] = False
-
- self._dictionary[ 'booleans' ][ 'verify_regular_https' ] = True
-
- self._dictionary[ 'booleans' ][ 'page_drop_chase_normally' ] = True
- self._dictionary[ 'booleans' ][ 'page_drop_chase_with_shift' ] = False
- self._dictionary[ 'booleans' ][ 'page_drag_change_tab_normally' ] = True
- self._dictionary[ 'booleans' ][ 'page_drag_change_tab_with_shift' ] = True
- self._dictionary[ 'booleans' ][ 'wheel_scrolls_tab_bar' ] = False
-
- self._dictionary[ 'booleans' ][ 'remove_local_domain_moved_files' ] = False
-
- self._dictionary[ 'booleans' ][ 'anchor_and_hide_canvas_drags' ] = HC.PLATFORM_WINDOWS
- self._dictionary[ 'booleans' ][ 'touchscreen_canvas_drags_unanchor' ] = False
-
- self._dictionary[ 'booleans' ][ 'import_page_progress_display' ] = True
-
- self._dictionary[ 'booleans' ][ 'process_subs_in_random_order' ] = True
-
- self._dictionary[ 'booleans' ][ 'ac_select_first_with_count' ] = False
-
- self._dictionary[ 'booleans' ][ 'saving_sash_positions_on_exit' ] = True
-
- self._dictionary[ 'booleans' ][ 'database_deferred_delete_maintenance_during_idle' ] = True
- self._dictionary[ 'booleans' ][ 'database_deferred_delete_maintenance_during_active' ] = True
-
- self._dictionary[ 'booleans' ][ 'file_maintenance_during_idle' ] = True
- self._dictionary[ 'booleans' ][ 'file_maintenance_during_active' ] = True
-
- self._dictionary[ 'booleans' ][ 'tag_display_maintenance_during_idle' ] = True
- self._dictionary[ 'booleans' ][ 'tag_display_maintenance_during_active' ] = True
-
- self._dictionary[ 'booleans' ][ 'save_page_sort_on_change' ] = False
- self._dictionary[ 'booleans' ][ 'disable_page_tab_dnd' ] = False
- self._dictionary[ 'booleans' ][ 'force_hide_page_signal_on_new_page' ] = False
-
- self._dictionary[ 'booleans' ][ 'pause_export_folders_sync' ] = False
- self._dictionary[ 'booleans' ][ 'pause_import_folders_sync' ] = False
- self._dictionary[ 'booleans' ][ 'pause_repo_sync' ] = False
- self._dictionary[ 'booleans' ][ 'pause_subs_sync' ] = False
-
- self._dictionary[ 'booleans' ][ 'pause_all_new_network_traffic' ] = False
- self._dictionary[ 'booleans' ][ 'boot_with_network_traffic_paused' ] = False
- self._dictionary[ 'booleans' ][ 'pause_all_file_queues' ] = False
- self._dictionary[ 'booleans' ][ 'pause_all_watcher_checkers' ] = False
- self._dictionary[ 'booleans' ][ 'pause_all_gallery_searches' ] = False
-
- self._dictionary[ 'booleans' ][ 'popup_message_force_min_width' ] = False
-
- self._dictionary[ 'booleans' ][ 'always_show_iso_time' ] = False
-
- self._dictionary[ 'booleans' ][ 'confirm_multiple_local_file_services_move' ] = True
- self._dictionary[ 'booleans' ][ 'confirm_multiple_local_file_services_copy' ] = True
-
- self._dictionary[ 'booleans' ][ 'use_advanced_file_deletion_dialog' ] = False
-
- self._dictionary[ 'booleans' ][ 'show_new_on_file_seed_short_summary' ] = False
- self._dictionary[ 'booleans' ][ 'show_deleted_on_file_seed_short_summary' ] = False
-
- self._dictionary[ 'booleans' ][ 'only_save_last_session_during_idle' ] = False
-
- self._dictionary[ 'booleans' ][ 'do_human_sort_on_hdd_file_import_paths' ] = True
-
- self._dictionary[ 'booleans' ][ 'highlight_new_watcher' ] = True
- self._dictionary[ 'booleans' ][ 'highlight_new_query' ] = True
-
- self._dictionary[ 'booleans' ][ 'delete_files_after_export' ] = False
-
- self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active' ] = True
- self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active_on_archive_delete_filter' ] = True
- self._dictionary[ 'booleans' ][ 'file_viewing_statistics_active_on_dupe_filter' ] = False
-
- self._dictionary[ 'booleans' ][ 'prefix_hash_when_copying' ] = False
- self._dictionary[ 'booleans' ][ 'file_system_waits_on_wakeup' ] = False
-
- self._dictionary[ 'booleans' ][ 'always_show_system_everything' ] = False
-
- self._dictionary[ 'booleans' ][ 'watch_clipboard_for_watcher_urls' ] = False
- self._dictionary[ 'booleans' ][ 'watch_clipboard_for_other_recognised_urls' ] = False
-
- self._dictionary[ 'booleans' ][ 'default_search_synchronised' ] = True
- self._dictionary[ 'booleans' ][ 'autocomplete_float_main_gui' ] = True
-
- self._dictionary[ 'booleans' ][ 'global_audio_mute' ] = False
- self._dictionary[ 'booleans' ][ 'media_viewer_audio_mute' ] = False
- self._dictionary[ 'booleans' ][ 'media_viewer_uses_its_own_audio_volume' ] = False
- self._dictionary[ 'booleans' ][ 'preview_audio_mute' ] = False
- self._dictionary[ 'booleans' ][ 'preview_uses_its_own_audio_volume' ] = True
-
- self._dictionary[ 'booleans' ][ 'always_loop_gifs' ] = True
-
- self._dictionary[ 'booleans' ][ 'always_show_system_tray_icon' ] = False
- self._dictionary[ 'booleans' ][ 'minimise_client_to_system_tray' ] = False
- self._dictionary[ 'booleans' ][ 'close_client_to_system_tray' ] = False
- self._dictionary[ 'booleans' ][ 'start_client_in_system_tray' ] = False
-
- self._dictionary[ 'booleans' ][ 'use_qt_file_dialogs' ] = False
-
- self._dictionary[ 'booleans' ][ 'notify_client_api_cookies' ] = False
-
- self._dictionary[ 'booleans' ][ 'expand_parents_on_storage_taglists' ] = True
- self._dictionary[ 'booleans' ][ 'expand_parents_on_storage_autocomplete_taglists' ] = True
-
- self._dictionary[ 'booleans' ][ 'show_parent_decorators_on_storage_taglists' ] = True
- self._dictionary[ 'booleans' ][ 'show_parent_decorators_on_storage_autocomplete_taglists' ] = True
-
- self._dictionary[ 'booleans' ][ 'show_sibling_decorators_on_storage_taglists' ] = True
- self._dictionary[ 'booleans' ][ 'show_sibling_decorators_on_storage_autocomplete_taglists' ] = True
-
- self._dictionary[ 'booleans' ][ 'show_session_size_warnings' ] = True
-
- self._dictionary[ 'booleans' ][ 'delete_lock_for_archived_files' ] = False
-
- self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_reason' ] = True
- self._dictionary[ 'booleans' ][ 'remember_last_advanced_file_deletion_special_action' ] = False
-
- self._dictionary[ 'booleans' ][ 'do_macos_debug_dialog_menus' ] = False
-
- self._dictionary[ 'booleans' ][ 'save_default_tag_service_tab_on_change' ] = True
-
- self._dictionary[ 'booleans' ][ 'force_animation_scanbar_show' ] = False
-
- self._dictionary[ 'booleans' ][ 'call_mouse_buttons_primary_secondary' ] = False
-
- self._dictionary[ 'booleans' ][ 'start_note_editing_at_end' ] = True
-
- self._dictionary[ 'booleans' ][ 'draw_transparency_checkerboard_media_canvas' ] = False
- self._dictionary[ 'booleans' ][ 'draw_transparency_checkerboard_media_canvas_duplicates' ] = True
-
- self._dictionary[ 'booleans' ][ 'menu_choice_buttons_can_mouse_scroll' ] = True
-
- self._dictionary[ 'booleans' ][ 'focus_preview_on_ctrl_click' ] = False
- self._dictionary[ 'booleans' ][ 'focus_preview_on_ctrl_click_only_static' ] = False
- self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click' ] = False
- self._dictionary[ 'booleans' ][ 'focus_preview_on_shift_click_only_static' ] = False
-
- self._dictionary[ 'booleans' ][ 'fade_sibling_connector' ] = True
- self._dictionary[ 'booleans' ][ 'use_custom_sibling_connector_colour' ] = False
-
- self._dictionary[ 'booleans' ][ 'hide_uninteresting_local_import_time' ] = True
- self._dictionary[ 'booleans' ][ 'hide_uninteresting_modified_time' ] = True
-
- self._dictionary[ 'booleans' ][ 'allow_blurhash_fallback' ] = True
-
- self._dictionary[ 'booleans' ][ 'fade_thumbnails' ] = True
-
- self._dictionary[ 'booleans' ][ 'slideshow_always_play_duration_media_once_through' ] = False
-
- self._dictionary[ 'booleans' ][ 'enable_truncated_images_pil' ] = True
- self._dictionary[ 'booleans' ][ 'do_icc_profile_normalisation' ] = True
-
from hydrus.client.gui.canvas import ClientGUIMPV
- self._dictionary[ 'booleans' ][ 'mpv_available_at_start' ] = ClientGUIMPV.MPV_IS_AVAILABLE
+ self._dictionary[ 'booleans' ] = {
+ 'advanced_mode' : False,
+ 'remove_filtered_files_even_when_skipped' : False,
+ 'filter_inbox_and_archive_predicates' : False,
+ 'discord_dnd_fix' : False,
+ 'secret_discord_dnd_fix' : False,
+ 'show_unmatched_urls_in_media_viewer' : False,
+ 'set_search_focus_on_page_change' : False,
+ 'allow_remove_on_manage_tags_input' : True,
+ 'yes_no_on_remove_on_manage_tags' : True,
+ 'activate_window_on_tag_search_page_activation' : False,
+ 'show_related_tags' : True,
+ 'show_file_lookup_script_tags' : False,
+ 'use_native_menubar' : HC.PLATFORM_MACOS,
+ 'shortcuts_merge_non_number_numpad' : True,
+ 'disable_get_safe_position_test' : False,
+ 'freeze_message_manager_when_mouse_on_other_monitor' : False,
+ 'freeze_message_manager_when_main_gui_minimised' : False,
+ 'load_images_with_pil' : True,
+ 'only_show_delete_from_all_local_domains_when_filtering' : False,
+ 'use_system_ffmpeg' : False,
+ 'elide_page_tab_names' : True,
+ 'maintain_similar_files_duplicate_pairs_during_idle' : False,
+ 'show_namespaces' : True,
+ 'show_number_namespaces' : True,
+ 'show_subtag_number_namespaces' : True,
+ 'replace_tag_underscores_with_spaces' : False,
+ 'replace_tag_emojis_with_boxes' : False,
+ 'verify_regular_https' : True,
+ 'page_drop_chase_normally' : True,
+ 'page_drop_chase_with_shift' : False,
+ 'page_drag_change_tab_normally' : True,
+ 'page_drag_change_tab_with_shift' : True,
+ 'wheel_scrolls_tab_bar' : False,
+ 'remove_local_domain_moved_files' : False,
+ 'anchor_and_hide_canvas_drags' : HC.PLATFORM_WINDOWS,
+ 'touchscreen_canvas_drags_unanchor' : False,
+ 'import_page_progress_display' : True,
+ 'process_subs_in_random_order' : True,
+ 'ac_select_first_with_count' : False,
+ 'saving_sash_positions_on_exit' : True,
+ 'database_deferred_delete_maintenance_during_idle' : True,
+ 'database_deferred_delete_maintenance_during_active' : True,
+ 'file_maintenance_during_idle' : True,
+ 'file_maintenance_during_active' : True,
+ 'tag_display_maintenance_during_idle' : True,
+ 'tag_display_maintenance_during_active' : True,
+ 'save_page_sort_on_change' : False,
+ 'disable_page_tab_dnd' : False,
+ 'force_hide_page_signal_on_new_page' : False,
+ 'pause_export_folders_sync' : False,
+ 'pause_import_folders_sync' : False,
+ 'pause_repo_sync' : False,
+ 'pause_subs_sync' : False,
+ 'pause_all_new_network_traffic' : False,
+ 'boot_with_network_traffic_paused' : False,
+ 'pause_all_file_queues' : False,
+ 'pause_all_watcher_checkers' : False,
+ 'pause_all_gallery_searches' : False,
+ 'popup_message_force_min_width' : False,
+ 'always_show_iso_time' : False,
+ 'confirm_multiple_local_file_services_move' : True,
+ 'confirm_multiple_local_file_services_copy' : True,
+ 'use_advanced_file_deletion_dialog' : False,
+ 'show_new_on_file_seed_short_summary' : False,
+ 'show_deleted_on_file_seed_short_summary' : False,
+ 'only_save_last_session_during_idle' : False,
+ 'do_human_sort_on_hdd_file_import_paths' : True,
+ 'highlight_new_watcher' : True,
+ 'highlight_new_query' : True,
+ 'delete_files_after_export' : False,
+ 'file_viewing_statistics_active' : True,
+ 'file_viewing_statistics_active_on_archive_delete_filter' : True,
+ 'file_viewing_statistics_active_on_dupe_filter' : False,
+ 'prefix_hash_when_copying' : False,
+ 'file_system_waits_on_wakeup' : False,
+ 'always_show_system_everything' : False,
+ 'watch_clipboard_for_watcher_urls' : False,
+ 'watch_clipboard_for_other_recognised_urls' : False,
+ 'default_search_synchronised' : True,
+ 'autocomplete_float_main_gui' : True,
+ 'global_audio_mute' : False,
+ 'media_viewer_audio_mute' : False,
+ 'media_viewer_uses_its_own_audio_volume' : False,
+ 'preview_audio_mute' : False,
+ 'preview_uses_its_own_audio_volume' : True,
+ 'always_loop_gifs' : True,
+ 'always_show_system_tray_icon' : False,
+ 'minimise_client_to_system_tray' : False,
+ 'close_client_to_system_tray' : False,
+ 'start_client_in_system_tray' : False,
+ 'use_qt_file_dialogs' : False,
+ 'notify_client_api_cookies' : False,
+ 'expand_parents_on_storage_taglists' : True,
+ 'expand_parents_on_storage_autocomplete_taglists' : True,
+ 'show_parent_decorators_on_storage_taglists' : True,
+ 'show_parent_decorators_on_storage_autocomplete_taglists' : True,
+ 'show_sibling_decorators_on_storage_taglists' : True,
+ 'show_sibling_decorators_on_storage_autocomplete_taglists' : True,
+ 'show_session_size_warnings' : True,
+ 'delete_lock_for_archived_files' : False,
+ 'remember_last_advanced_file_deletion_reason' : True,
+ 'remember_last_advanced_file_deletion_special_action' : False,
+ 'do_macos_debug_dialog_menus' : False,
+ 'save_default_tag_service_tab_on_change' : True,
+ 'force_animation_scanbar_show' : False,
+ 'call_mouse_buttons_primary_secondary' : False,
+ 'start_note_editing_at_end' : True,
+ 'draw_transparency_checkerboard_media_canvas' : False,
+ 'draw_transparency_checkerboard_media_canvas_duplicates' : True,
+ 'menu_choice_buttons_can_mouse_scroll' : True,
+ 'focus_preview_on_ctrl_click' : False,
+ 'focus_preview_on_ctrl_click_only_static' : False,
+ 'focus_preview_on_shift_click' : False,
+ 'focus_preview_on_shift_click_only_static' : False,
+ 'fade_sibling_connector' : True,
+ 'use_custom_sibling_connector_colour' : False,
+ 'hide_uninteresting_local_import_time' : True,
+ 'hide_uninteresting_modified_time' : True,
+ 'allow_blurhash_fallback' : True,
+ 'fade_thumbnails' : True,
+ 'slideshow_always_play_duration_media_once_through' : False,
+ 'enable_truncated_images_pil' : True,
+ 'do_icc_profile_normalisation' : True,
+ 'mpv_available_at_start' : ClientGUIMPV.MPV_IS_AVAILABLE
+ }
#
@@ -415,207 +343,144 @@ def _InitialiseDefaults( self ):
#
- self._dictionary[ 'integers' ] = {}
-
- self._dictionary[ 'integers' ][ 'notebook_tab_alignment' ] = CC.DIRECTION_UP
-
- self._dictionary[ 'integers' ][ 'video_buffer_size' ] = 96 * 1024 * 1024
-
- self._dictionary[ 'integers' ][ 'related_tags_search_1_duration_ms' ] = 250
- self._dictionary[ 'integers' ][ 'related_tags_search_2_duration_ms' ] = 2000
- self._dictionary[ 'integers' ][ 'related_tags_search_3_duration_ms' ] = 6000
- self._dictionary[ 'integers' ][ 'related_tags_concurrence_threshold_percent' ] = 6
-
- self._dictionary[ 'integers' ][ 'suggested_tags_width' ] = 300
-
- self._dictionary[ 'integers' ][ 'similar_files_duplicate_pairs_search_distance' ] = 0
-
- self._dictionary[ 'integers' ][ 'default_new_page_goes' ] = CC.NEW_PAGE_GOES_FAR_RIGHT
-
- self._dictionary[ 'integers' ][ 'num_recent_petition_reasons' ] = 5
-
- self._dictionary[ 'integers' ][ 'max_page_name_chars' ] = 20
- self._dictionary[ 'integers' ][ 'page_file_count_display' ] = CC.PAGE_FILE_COUNT_DISPLAY_ALL
-
- self._dictionary[ 'integers' ][ 'network_timeout' ] = 10
- self._dictionary[ 'integers' ][ 'connection_error_wait_time' ] = 15
- self._dictionary[ 'integers' ][ 'serverside_bandwidth_wait_time' ] = 60
-
- self._dictionary[ 'integers' ][ 'thumbnail_visibility_scroll_percent' ] = 75
- self._dictionary[ 'integers' ][ 'ideal_tile_dimension' ] = 768
-
- self._dictionary[ 'integers' ][ 'wake_delay_period' ] = 15
-
from hydrus.client.gui.canvas import ClientGUICanvasMedia
-
- self._dictionary[ 'integers' ][ 'media_viewer_zoom_center' ] = ClientGUICanvasMedia.ZOOM_CENTERPOINT_MOUSE
-
- self._dictionary[ 'integers' ][ 'last_session_save_period_minutes' ] = 5
-
- self._dictionary[ 'integers' ][ 'shutdown_work_period' ] = 86400
-
- self._dictionary[ 'integers' ][ 'max_network_jobs' ] = 15
- self._dictionary[ 'integers' ][ 'max_network_jobs_per_domain' ] = 3
-
- self._dictionary[ 'integers' ][ 'max_connection_attempts_allowed' ] = 5
- self._dictionary[ 'integers' ][ 'max_request_attempts_allowed_get' ] = 5
-
from hydrus.core.files.images import HydrusImageHandling
- self._dictionary[ 'integers' ][ 'thumbnail_scale_type' ] = HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY
-
- self._dictionary[ 'integers' ][ 'max_simultaneous_subscriptions' ] = 1
-
- self._dictionary[ 'integers' ][ 'gallery_page_wait_period_pages' ] = 15
- self._dictionary[ 'integers' ][ 'gallery_page_wait_period_subscriptions' ] = 5
- self._dictionary[ 'integers' ][ 'watcher_page_wait_period' ] = 5
-
- self._dictionary[ 'integers' ][ 'popup_message_character_width' ] = 56
-
- self._dictionary[ 'integers' ][ 'duplicate_filter_max_batch_size' ] = 250
-
- self._dictionary[ 'integers' ][ 'video_thumbnail_percentage_in' ] = 35
-
- self._dictionary[ 'integers' ][ 'global_audio_volume' ] = 70
- self._dictionary[ 'integers' ][ 'media_viewer_audio_volume' ] = 70
- self._dictionary[ 'integers' ][ 'preview_audio_volume' ] = 70
-
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_jpeg_quality' ] = 10
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_jpeg_quality' ] = 20
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_filesize' ] = 10
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_filesize' ] = 20
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_higher_resolution' ] = 20
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_much_higher_resolution' ] = 50
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_more_tags' ] = 8
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_older' ] = 4
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_nicer_ratio' ] = 10
- self._dictionary[ 'integers' ][ 'duplicate_comparison_score_has_audio' ] = 20
-
- self._dictionary[ 'integers' ][ 'thumbnail_cache_size' ] = 1024 * 1024 * 32
- self._dictionary[ 'integers' ][ 'image_cache_size' ] = 1024 * 1024 * 384
- self._dictionary[ 'integers' ][ 'image_tile_cache_size' ] = 1024 * 1024 * 256
-
- self._dictionary[ 'integers' ][ 'thumbnail_cache_timeout' ] = 86400
- self._dictionary[ 'integers' ][ 'image_cache_timeout' ] = 600
- self._dictionary[ 'integers' ][ 'image_tile_cache_timeout' ] = 300
-
- self._dictionary[ 'integers' ][ 'image_cache_storage_limit_percentage' ] = 25
- self._dictionary[ 'integers' ][ 'image_cache_prefetch_limit_percentage' ] = 10
-
- self._dictionary[ 'integers' ][ 'media_viewer_prefetch_delay_base_ms' ] = 100
- self._dictionary[ 'integers' ][ 'media_viewer_prefetch_num_previous' ] = 2
- self._dictionary[ 'integers' ][ 'media_viewer_prefetch_num_next' ] = 3
-
- self._dictionary[ 'integers' ][ 'thumbnail_border' ] = 1
- self._dictionary[ 'integers' ][ 'thumbnail_margin' ] = 2
-
- self._dictionary[ 'integers' ][ 'thumbnail_dpr_percent' ] = 100
-
- self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_files' ] = 1
- self._dictionary[ 'integers' ][ 'file_maintenance_idle_throttle_time_delta' ] = 2
-
- self._dictionary[ 'integers' ][ 'file_maintenance_active_throttle_files' ] = 1
- self._dictionary[ 'integers' ][ 'file_maintenance_active_throttle_time_delta' ] = 20
-
- self._dictionary[ 'integers' ][ 'subscription_network_error_delay' ] = 12 * 3600
- self._dictionary[ 'integers' ][ 'subscription_other_error_delay' ] = 36 * 3600
- self._dictionary[ 'integers' ][ 'downloader_network_error_delay' ] = 90 * 60
-
- self._dictionary[ 'integers' ][ 'file_viewing_stats_menu_display' ] = CC.FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_AND_PREVIEW_IN_SUBMENU
-
- self._dictionary[ 'integers' ][ 'number_of_gui_session_backups' ] = 10
-
- self._dictionary[ 'integers' ][ 'animated_scanbar_height' ] = 20
- self._dictionary[ 'integers' ][ 'animated_scanbar_nub_width' ] = 10
-
- self._dictionary[ 'integers' ][ 'domain_network_infrastructure_error_number' ] = 3
- self._dictionary[ 'integers' ][ 'domain_network_infrastructure_error_time_delta' ] = 600
-
- self._dictionary[ 'integers' ][ 'ac_read_list_height_num_chars' ] = 21
- self._dictionary[ 'integers' ][ 'ac_write_list_height_num_chars' ] = 11
-
- self._dictionary[ 'integers' ][ 'system_busy_cpu_percent' ] = 50
-
- self._dictionary[ 'integers' ][ 'human_bytes_sig_figs' ] = 3
-
- self._dictionary[ 'integers' ][ 'ms_to_wait_between_physical_file_deletes' ] = 250
-
- self._dictionary[ 'integers' ][ 'potential_duplicates_search_work_time_ms' ] = 500
- self._dictionary[ 'integers' ][ 'potential_duplicates_search_rest_percentage' ] = 100
-
- self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_very_idle' ] = 30000
- self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_very_idle' ] = 3
-
- self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_idle' ] = 10000
- self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_idle' ] = 5
-
- self._dictionary[ 'integers' ][ 'repository_processing_work_time_ms_normal' ] = 500
- self._dictionary[ 'integers' ][ 'repository_processing_rest_percentage_normal' ] = 10
-
- self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_idle' ] = 15000
- self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_idle' ] = 3
-
- self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_normal' ] = 100
- self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_normal' ] = 9900
-
- self._dictionary[ 'integers' ][ 'tag_display_processing_work_time_ms_work_hard' ] = 5000
- self._dictionary[ 'integers' ][ 'tag_display_processing_rest_percentage_work_hard' ] = 5
-
- self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_idle' ] = 20000
- self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_idle' ] = 10
-
- self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_normal' ] = 250
- self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_normal' ] = 1000
-
- self._dictionary[ 'integers' ][ 'deferred_table_delete_work_time_ms_work_hard' ] = 5000
- self._dictionary[ 'integers' ][ 'deferred_table_delete_rest_percentage_work_hard' ] = 10
+ self._dictionary[ 'integers' ] = {
+ 'notebook_tab_alignment' : CC.DIRECTION_UP,
+ 'video_buffer_size' : 96 * 1024 * 1024,
+ 'related_tags_search_1_duration_ms' : 250,
+ 'related_tags_search_2_duration_ms' : 2000,
+ 'related_tags_search_3_duration_ms' : 6000,
+ 'related_tags_concurrence_threshold_percent' : 6,
+ 'suggested_tags_width' : 300,
+ 'similar_files_duplicate_pairs_search_distance' : 0,
+ 'default_new_page_goes' : CC.NEW_PAGE_GOES_FAR_RIGHT,
+ 'num_recent_petition_reasons' : 5,
+ 'max_page_name_chars' : 20,
+ 'page_file_count_display' : CC.PAGE_FILE_COUNT_DISPLAY_ALL,
+ 'network_timeout' : 10,
+ 'connection_error_wait_time' : 15,
+ 'serverside_bandwidth_wait_time' : 60,
+ 'thumbnail_visibility_scroll_percent' : 75,
+ 'ideal_tile_dimension' : 768,
+ 'wake_delay_period' : 15,
+ 'media_viewer_zoom_center' : ClientGUICanvasMedia.ZOOM_CENTERPOINT_MOUSE,
+ 'last_session_save_period_minutes' : 5,
+ 'shutdown_work_period' : 86400,
+ 'max_network_jobs' : 15,
+ 'max_network_jobs_per_domain' : 3,
+ 'max_connection_attempts_allowed' : 5,
+ 'max_request_attempts_allowed_get' : 5,
+ 'thumbnail_scale_type' : HydrusImageHandling.THUMBNAIL_SCALE_DOWN_ONLY,
+ 'max_simultaneous_subscriptions' : 1,
+ 'gallery_page_wait_period_pages' : 15,
+ 'gallery_page_wait_period_subscriptions' : 5,
+ 'watcher_page_wait_period' : 5,
+ 'popup_message_character_width' : 56,
+ 'duplicate_filter_max_batch_size' : 250,
+ 'video_thumbnail_percentage_in' : 35,
+ 'global_audio_volume' : 70,
+ 'media_viewer_audio_volume' : 70,
+ 'preview_audio_volume' : 70,
+ 'duplicate_comparison_score_higher_jpeg_quality' : 10,
+ 'duplicate_comparison_score_much_higher_jpeg_quality' : 20,
+ 'duplicate_comparison_score_higher_filesize' : 10,
+ 'duplicate_comparison_score_much_higher_filesize' : 20,
+ 'duplicate_comparison_score_higher_resolution' : 20,
+ 'duplicate_comparison_score_much_higher_resolution' : 50,
+ 'duplicate_comparison_score_more_tags' : 8,
+ 'duplicate_comparison_score_older' : 4,
+ 'duplicate_comparison_score_nicer_ratio' : 10,
+ 'duplicate_comparison_score_has_audio' : 20,
+ 'thumbnail_cache_size' : 1024 * 1024 * 32,
+ 'image_cache_size' : 1024 * 1024 * 384,
+ 'image_tile_cache_size' : 1024 * 1024 * 256,
+ 'thumbnail_cache_timeout' : 86400,
+ 'image_cache_timeout' : 600,
+ 'image_tile_cache_timeout' : 300,
+ 'image_cache_storage_limit_percentage' : 25,
+ 'image_cache_prefetch_limit_percentage' : 10,
+ 'media_viewer_prefetch_delay_base_ms' : 100,
+ 'media_viewer_prefetch_num_previous' : 2,
+ 'media_viewer_prefetch_num_next' : 3,
+ 'thumbnail_border' : 1,
+ 'thumbnail_margin' : 2,
+ 'thumbnail_dpr_percent' : 100,
+ 'file_maintenance_idle_throttle_files' : 1,
+ 'file_maintenance_idle_throttle_time_delta' : 2,
+ 'file_maintenance_active_throttle_files' : 1,
+ 'file_maintenance_active_throttle_time_delta' : 20,
+ 'subscription_network_error_delay' : 12 * 3600,
+ 'subscription_other_error_delay' : 36 * 3600,
+ 'downloader_network_error_delay' : 90 * 60,
+ 'file_viewing_stats_menu_display' : CC.FILE_VIEWING_STATS_MENU_DISPLAY_MEDIA_AND_PREVIEW_IN_SUBMENU,
+ 'number_of_gui_session_backups' : 10,
+ 'animated_scanbar_height' : 20,
+ 'animated_scanbar_nub_width' : 10,
+ 'domain_network_infrastructure_error_number' : 3,
+ 'domain_network_infrastructure_error_time_delta' : 600,
+ 'ac_read_list_height_num_chars' : 21,
+ 'ac_write_list_height_num_chars' : 11,
+ 'system_busy_cpu_percent' : 50,
+ 'human_bytes_sig_figs' : 3,
+ 'ms_to_wait_between_physical_file_deletes' : 250,
+ 'potential_duplicates_search_work_time_ms' : 500,
+ 'potential_duplicates_search_rest_percentage' : 100,
+ 'repository_processing_work_time_ms_very_idle' : 30000,
+ 'repository_processing_rest_percentage_very_idle' : 3,
+ 'repository_processing_work_time_ms_idle' : 10000,
+ 'repository_processing_rest_percentage_idle' : 5,
+ 'repository_processing_work_time_ms_normal' : 500,
+ 'repository_processing_rest_percentage_normal' : 10,
+ 'tag_display_processing_work_time_ms_idle' : 15000,
+ 'tag_display_processing_rest_percentage_idle' : 3,
+ 'tag_display_processing_work_time_ms_normal' : 100,
+ 'tag_display_processing_rest_percentage_normal' : 9900,
+ 'tag_display_processing_work_time_ms_work_hard' : 5000,
+ 'tag_display_processing_rest_percentage_work_hard' : 5,
+ 'deferred_table_delete_work_time_ms_idle' : 20000,
+ 'deferred_table_delete_rest_percentage_idle' : 10,
+ 'deferred_table_delete_work_time_ms_normal' : 250,
+ 'deferred_table_delete_rest_percentage_normal' : 1000,
+ 'deferred_table_delete_work_time_ms_work_hard' : 5000,
+ 'deferred_table_delete_rest_percentage_work_hard' : 10
+ }
#
- self._dictionary[ 'keys' ] = {}
-
- self._dictionary[ 'keys' ][ 'default_tag_service_tab' ] = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex()
- self._dictionary[ 'keys' ][ 'default_tag_service_search_page' ] = CC.COMBINED_TAG_SERVICE_KEY.hex()
- self._dictionary[ 'keys' ][ 'default_gug_key' ] = HydrusData.GenerateKey().hex()
+ self._dictionary[ 'keys' ] = {
+ 'default_tag_service_tab' : CC.DEFAULT_LOCAL_TAG_SERVICE_KEY.hex(),
+ 'default_tag_service_search_page' : CC.COMBINED_TAG_SERVICE_KEY.hex(),
+ 'default_gug_key' : HydrusData.GenerateKey().hex()
+ }
self._dictionary[ 'key_list' ] = {}
#
- self._dictionary[ 'noneable_integers' ] = {}
-
- self._dictionary[ 'noneable_integers' ][ 'forced_search_limit' ] = None
-
- self._dictionary[ 'noneable_integers' ][ 'num_recent_tags' ] = 20
-
- self._dictionary[ 'noneable_integers' ][ 'duplicate_background_switch_intensity_a' ] = 0
- self._dictionary[ 'noneable_integers' ][ 'duplicate_background_switch_intensity_b' ] = 3
-
- self._dictionary[ 'noneable_integers' ][ 'last_review_bandwidth_search_distance' ] = 7 * 86400
-
- self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_media_min_time' ] = 2
- self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_media_max_time' ] = 600
- self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_preview_min_time' ] = 5
- self._dictionary[ 'noneable_integers' ][ 'file_viewing_statistics_preview_max_time' ] = 60
-
- self._dictionary[ 'noneable_integers' ][ 'subscription_file_error_cancel_threshold' ] = 5
-
- self._dictionary[ 'noneable_integers' ][ 'media_viewer_cursor_autohide_time_ms' ] = 700
-
- self._dictionary[ 'noneable_integers' ][ 'idle_mode_client_api_timeout' ] = None
-
- self._dictionary[ 'noneable_integers' ][ 'system_busy_cpu_count' ] = 1
-
- self._dictionary[ 'noneable_integers' ][ 'animated_scanbar_hide_height' ] = 5
-
- self._dictionary[ 'noneable_integers' ][ 'last_backup_time' ] = None
-
- self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_loop_percentage' ] = 20
- self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_loop_seconds' ] = 10
-
- self._dictionary[ 'noneable_integers' ][ 'slideshow_short_duration_cutoff_percentage' ] = 75
-
- self._dictionary[ 'noneable_integers' ][ 'slideshow_long_duration_overspill_percentage' ] = 50
+ self._dictionary[ 'noneable_integers' ] = {
+ 'forced_search_limit' : None,
+ 'num_recent_tags' : 20,
+ 'duplicate_background_switch_intensity_a' : 0,
+ 'duplicate_background_switch_intensity_b' : 3,
+ 'last_review_bandwidth_search_distance' : 7 * 86400,
+ 'file_viewing_statistics_media_min_time' : 2,
+ 'file_viewing_statistics_media_max_time' : 600,
+ 'file_viewing_statistics_preview_min_time' : 5,
+ 'file_viewing_statistics_preview_max_time' : 60,
+ 'subscription_file_error_cancel_threshold' : 5,
+ 'media_viewer_cursor_autohide_time_ms' : 700,
+ 'idle_mode_client_api_timeout' : None,
+ 'system_busy_cpu_count' : 1,
+ 'animated_scanbar_hide_height' : 5,
+ 'last_backup_time' : None,
+ 'slideshow_short_duration_loop_percentage' : 20,
+ 'slideshow_short_duration_loop_seconds' : 10,
+ 'slideshow_short_duration_cutoff_percentage' : 75,
+ 'slideshow_long_duration_overspill_percentage' : 50,
+ 'num_to_show_in_ac_dropdown_children_tab' : 40
+ }
#
@@ -623,50 +488,50 @@ def _InitialiseDefaults( self ):
#
- self._dictionary[ 'noneable_strings' ] = {}
-
- self._dictionary[ 'noneable_strings' ][ 'favourite_file_lookup_script' ] = 'gelbooru md5'
- self._dictionary[ 'noneable_strings' ][ 'suggested_tags_layout' ] = 'notebook'
- self._dictionary[ 'noneable_strings' ][ 'backup_path' ] = None
- self._dictionary[ 'noneable_strings' ][ 'web_browser_path' ] = None
- self._dictionary[ 'noneable_strings' ][ 'last_png_export_dir' ] = None
- self._dictionary[ 'noneable_strings' ][ 'media_background_bmp_path' ] = None
- self._dictionary[ 'noneable_strings' ][ 'http_proxy' ] = None
- self._dictionary[ 'noneable_strings' ][ 'https_proxy' ] = None
- self._dictionary[ 'noneable_strings' ][ 'no_proxy' ] = '127.0.0.1'
- self._dictionary[ 'noneable_strings' ][ 'qt_style_name' ] = None
- self._dictionary[ 'noneable_strings' ][ 'qt_stylesheet_name' ] = None
- self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_reason' ] = None
- self._dictionary[ 'noneable_strings' ][ 'last_advanced_file_deletion_special_action' ] = None
- self._dictionary[ 'noneable_strings' ][ 'sibling_connector_custom_namespace_colour' ] = 'system'
- self._dictionary[ 'noneable_strings' ][ 'or_connector_custom_namespace_colour' ] = 'system'
-
- self._dictionary[ 'strings' ] = {}
-
- self._dictionary[ 'strings' ][ 'app_display_name' ] = 'hydrus client'
- self._dictionary[ 'strings' ][ 'namespace_connector' ] = ':'
- self._dictionary[ 'strings' ][ 'sibling_connector' ] = ' \u2192 '
- self._dictionary[ 'strings' ][ 'or_connector' ] = ' OR '
- self._dictionary[ 'strings' ][ 'export_phrase' ] = '{hash}'
- self._dictionary[ 'strings' ][ 'current_colourset' ] = 'default'
- self._dictionary[ 'strings' ][ 'favourite_simple_downloader_formula' ] = 'all files linked by images in page'
- self._dictionary[ 'strings' ][ 'thumbnail_scroll_rate' ] = '1.0'
- self._dictionary[ 'strings' ][ 'pause_character' ] = '\u23F8'
- self._dictionary[ 'strings' ][ 'stop_character' ] = '\u23F9'
- self._dictionary[ 'strings' ][ 'default_gug_name' ] = 'safebooru tag search'
- self._dictionary[ 'strings' ][ 'has_audio_label' ] = '\U0001F50A'
- self._dictionary[ 'strings' ][ 'has_duration_label' ] = ' \u23F5 '
- self._dictionary[ 'strings' ][ 'discord_dnd_filename_pattern' ] = '{hash}'
- self._dictionary[ 'strings' ][ 'default_suggested_tags_notebook_page' ] = 'related'
- self._dictionary[ 'strings' ][ 'last_incremental_tagging_namespace' ] = 'page'
- self._dictionary[ 'strings' ][ 'last_incremental_tagging_prefix' ] = ''
- self._dictionary[ 'strings' ][ 'last_incremental_tagging_suffix' ] = ''
-
- self._dictionary[ 'string_list' ] = {}
-
- self._dictionary[ 'string_list' ][ 'default_media_viewer_custom_shortcuts' ] = []
- self._dictionary[ 'string_list' ][ 'favourite_tags' ] = []
- self._dictionary[ 'string_list' ][ 'advanced_file_deletion_reasons' ] = [ 'I do not like it.', 'It is bad quality.', 'It is not appropriate for this client.', 'Temporary delete--I want to bring it back later.' ]
+ self._dictionary[ 'noneable_strings' ] = {
+ 'favourite_file_lookup_script' : 'gelbooru md5',
+ 'suggested_tags_layout' : 'notebook',
+ 'backup_path' : None,
+ 'web_browser_path' : None,
+ 'last_png_export_dir' : None,
+ 'media_background_bmp_path' : None,
+ 'http_proxy' : None,
+ 'https_proxy' : None,
+ 'no_proxy' : '127.0.0.1',
+ 'qt_style_name' : None,
+ 'qt_stylesheet_name' : None,
+ 'last_advanced_file_deletion_reason' : None,
+ 'last_advanced_file_deletion_special_action' : None,
+ 'sibling_connector_custom_namespace_colour' : 'system',
+ 'or_connector_custom_namespace_colour' : 'system'
+ }
+
+ self._dictionary[ 'strings' ] = {
+ 'app_display_name' : 'hydrus client',
+ 'namespace_connector' : ':',
+ 'sibling_connector' : ' \u2192 ',
+ 'or_connector' : ' OR ',
+ 'export_phrase' : '{hash}',
+ 'current_colourset' : 'default',
+ 'favourite_simple_downloader_formula' : 'all files linked by images in page',
+ 'thumbnail_scroll_rate' : '1.0',
+ 'pause_character' : '\u23F8',
+ 'stop_character' : '\u23F9',
+ 'default_gug_name' : 'safebooru tag search',
+ 'has_audio_label' : '\U0001F50A',
+ 'has_duration_label' : ' \u23F5 ',
+ 'discord_dnd_filename_pattern' : '{hash}',
+ 'default_suggested_tags_notebook_page' : 'related',
+ 'last_incremental_tagging_namespace' : 'page',
+ 'last_incremental_tagging_prefix' : '',
+ 'last_incremental_tagging_suffix' : ''
+ }
+
+ self._dictionary[ 'string_list' ] = {
+ 'default_media_viewer_custom_shortcuts' : [],
+ 'favourite_tags' : [],
+ 'advanced_file_deletion_reasons' : [ 'I do not like it.', 'It is bad quality.', 'It is not appropriate for this client.', 'Temporary delete--I want to bring it back later.' ]
+ }
#
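The options hunks above all perform the same mechanical refactor: long runs of repeated keyed assignments are collapsed into single dict literals, which are shorter and diff more cleanly when a default is added or changed. A minimal sketch of the before/after pattern (function and key names here are hypothetical, not from the hydrus codebase):

```python
# Sketch of the refactor pattern used in the diff above.
# Names are illustrative; only the shape matters.

def build_defaults_old():
    # before: one assignment statement per option
    d = {}
    d[ 'booleans' ] = {}
    d[ 'booleans' ][ 'advanced_mode' ] = False
    d[ 'booleans' ][ 'show_related_tags' ] = True
    return d

def build_defaults_new():
    # after: a single dict literal, one option per line
    d = {}
    d[ 'booleans' ] = {
        'advanced_mode' : False,
        'show_related_tags' : True
    }
    return d

# both forms produce identical data, so the refactor is behavior-preserving
assert build_defaults_old() == build_defaults_new()
```

The literal form also makes omissions easier to spot in review, since every key sits on its own aligned line within one expression.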
diff --git a/hydrus/client/ClientRendering.py b/hydrus/client/ClientRendering.py
index e7747852f..0642d8403 100644
--- a/hydrus/client/ClientRendering.py
+++ b/hydrus/client/ClientRendering.py
@@ -45,12 +45,14 @@ def FrameIndexOutOfRange( index, range_start, range_end ):
return False
+
def GenerateHydrusBitmap( path, mime, compressed = True ):
- numpy_image = ClientImageHandling.GenerateNumPyImage( path, mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( path, mime )
return GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = compressed )
+
def GenerateHydrusBitmapFromNumPyImage( numpy_image, compressed = True ):
( y, x, depth ) = numpy_image.shape
@@ -247,7 +249,7 @@ def _Initialise( self ):
try:
- self._numpy_image = ClientImageHandling.GenerateNumPyImage( self._path, self._mime )
+ self._numpy_image = HydrusImageHandling.GenerateNumPyImage( self._path, self._mime )
except Exception as e:
diff --git a/hydrus/client/caches/ClientCaches.py b/hydrus/client/caches/ClientCaches.py
index 3c7e9f293..9a85f215b 100644
--- a/hydrus/client/caches/ClientCaches.py
+++ b/hydrus/client/caches/ClientCaches.py
@@ -374,7 +374,7 @@ def _GetThumbnailHydrusBitmap( self, display_media ):
thumbnail_mime = HydrusFileHandling.GetThumbnailMime( thumbnail_path )
- numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
except Exception as e:
@@ -394,7 +394,7 @@ def _GetThumbnailHydrusBitmap( self, display_media ):
try:
- numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, thumbnail_mime )
except Exception as e:
@@ -645,7 +645,7 @@ def Clear( self ):
for ( mime, thumbnail_path ) in HydrusFileHandling.mimes_to_default_thumbnail_paths.items():
- numpy_image = ClientImageHandling.GenerateNumPyImage( thumbnail_path, HC.IMAGE_PNG )
+ numpy_image = HydrusImageHandling.GenerateNumPyImage( thumbnail_path, HC.IMAGE_PNG )
numpy_image_resolution = HydrusImageHandling.GetResolutionNumPy( numpy_image )
diff --git a/hydrus/client/db/ClientDB.py b/hydrus/client/db/ClientDB.py
index 7b90e8f82..16ae59afc 100644
--- a/hydrus/client/db/ClientDB.py
+++ b/hydrus/client/db/ClientDB.py
@@ -69,6 +69,7 @@
from hydrus.client.db import ClientDBTagSiblings
from hydrus.client.db import ClientDBTagSuggestions
from hydrus.client.db import ClientDBURLMap
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportFiles
from hydrus.client.interfaces import ClientControllerInterface
from hydrus.client.media import ClientMediaManagers
@@ -1829,7 +1830,7 @@ def _DuplicatesGetRandomPotentialDuplicateHashes(
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
query_hash_ids_1 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
query_hash_ids_2 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
@@ -1850,7 +1851,7 @@ def _DuplicatesGetRandomPotentialDuplicateHashes(
query_hash_ids = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
chosen_allowed_hash_ids = query_hash_ids
comparison_allowed_hash_ids = query_hash_ids
@@ -1967,7 +1968,7 @@ def _DuplicatesGetPotentialDuplicatePairsForFiltering( self, file_search_context
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
query_hash_ids_1 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
query_hash_ids_2 = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 ) )
@@ -1988,7 +1989,7 @@ def _DuplicatesGetPotentialDuplicatePairsForFiltering( self, file_search_context
query_hash_ids = set( self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 ) )
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
# both chosen and comparison must be in the search, no king selection nonsense allowed
chosen_allowed_hash_ids = query_hash_ids
@@ -2155,7 +2156,7 @@ def _DuplicatesGetPotentialDuplicatesCount( self, file_search_context_1, file_se
with self._MakeTemporaryIntegerTable( [], 'hash_id' ) as temp_table_name_2:
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )
self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_2, temp_table_name_2 )
@@ -2172,7 +2173,7 @@ def _DuplicatesGetPotentialDuplicatesCount( self, file_search_context_1, file_se
self.modules_files_query.PopulateSearchIntoTempTable( file_search_context_1, temp_table_name_1 )
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH:
table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResultsBothFiles( temp_table_name_1, pixel_dupes_preference, max_hamming_distance )
@@ -2791,7 +2792,7 @@ def _GetBonedStatsFromTable(
return boned_stats
# TODO: fix this, it takes ages sometimes IRL
- table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, current_files_table_name, CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, max_hamming_distance = 8 )
+ table_join = self.modules_files_duplicates.GetPotentialDuplicatePairsTableJoinOnSearchResults( db_location_context, current_files_table_name, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, max_hamming_distance = 8 )
( total_potential_pairs, ) = self._Execute( f'SELECT COUNT( * ) FROM ( SELECT DISTINCT smaller_media_id, larger_media_id FROM {table_join} );' ).fetchone()
@@ -6807,6 +6808,7 @@ def _Read( self, action, *args, **kwargs ):
elif action == 'tag_display_application': result = self.modules_tag_display.GetApplication( *args, **kwargs )
elif action == 'tag_display_maintenance_status': result = self._CacheTagDisplayGetApplicationStatusNumbers( *args, **kwargs )
elif action == 'tag_parents': result = self.modules_tag_parents.GetTagParents( *args, **kwargs )
+ elif action == 'tag_predicates': result = self.modules_tag_search.GetTagPredicates( *args, **kwargs )
elif action == 'tag_siblings': result = self.modules_tag_siblings.GetTagSiblings( *args, **kwargs )
elif action == 'tag_siblings_all_ideals': result = self.modules_tag_siblings.GetTagSiblingsIdeals( *args, **kwargs )
elif action == 'tag_display_decorators': result = self.modules_tag_display.GetUIDecorators( *args, **kwargs )
@@ -10321,6 +10323,55 @@ def ask_what_to_do_zip_docx_scan():
+ if version == 574:
+
+ try:
+
+ domain_manager = self.modules_serialisable.GetJSONDump( HydrusSerialisable.SERIALISABLE_TYPE_NETWORK_DOMAIN_MANAGER )
+
+ domain_manager.Initialise()
+
+ domain_manager.OverwriteDefaultParsers( [
+ 'danbooru file page parser - get webm ugoira',
+ 'danbooru file page parser'
+ ] )
+
+ parsers = domain_manager.GetParsers()
+
+ parser_names = { parser.GetName() for parser in parsers }
+
+ # checking for floog's downloader
+ if 'fxtwitter api status parser' not in parser_names and 'vxtwitter api status parser' not in parser_names:
+
+ domain_manager.OverwriteDefaultURLClasses( [
+ 'vxtwitter tweet',
+ 'vxtwitter api status',
+ 'vxtwitter api status (with username)',
+ 'fixvx tweet',
+ 'fixupx tweet',
+ 'fxtwitter tweet',
+ 'x post'
+ ] )
+
+
+ #
+
+ domain_manager.TryToLinkURLClassesAndParsers()
+
+ #
+
+ self.modules_serialisable.SetJSONDump( domain_manager )
+
+ except Exception as e:
+
+ HydrusData.PrintException( e )
+
+ message = 'Trying to update some downloaders failed! Please let hydrus dev know!'
+
+ self.pub_initial_message( message )
+
+
+
self._controller.frame_splash_status.SetTitleText( 'updated db to v{}'.format( HydrusData.ToHumanInt( version + 1 ) ) )
self._Execute( 'UPDATE version SET version = ?;', ( version + 1, ) )
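The v574 update step above guards the vx/fx URL class overwrite behind a parser-name check, so a user running a third-party twitter downloader does not get their setup clobbered. A minimal sketch of that guard, with the helper name as a stand-in (the parser-name strings mirror the diff; the real domain manager API is not reproduced here):

```python
def should_overwrite_twitter_url_classes(parser_names):
    """Only push the default vx/fx URL classes if the user has not already
    installed a third-party twitter downloader, detected here by the
    presence of its parser names (per the 'checking for floog's downloader'
    comment in the update step)."""
    third_party_markers = {
        'fxtwitter api status parser',
        'vxtwitter api status parser',
    }
    # disjoint means none of the third-party parsers are present
    return third_party_markers.isdisjoint(parser_names)
```

In the update step, this decision gates `OverwriteDefaultURLClasses`, while the parser overwrite itself runs unconditionally.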
diff --git a/hydrus/client/db/ClientDBFilesDuplicates.py b/hydrus/client/db/ClientDBFilesDuplicates.py
index edf916c94..79e314067 100644
--- a/hydrus/client/db/ClientDBFilesDuplicates.py
+++ b/hydrus/client/db/ClientDBFilesDuplicates.py
@@ -12,6 +12,7 @@
from hydrus.client.db import ClientDBFilesStorage
from hydrus.client.db import ClientDBModule
from hydrus.client.db import ClientDBSimilarFiles
+from hydrus.client.duplicates import ClientDuplicates
class ClientDBFilesDuplicates( ClientDBModule.ClientDBModule ):
@@ -1013,16 +1014,16 @@ def GetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( self, pixel_dup
join_predicates = [ 'smaller_media_id = duplicate_files_smaller.media_id AND larger_media_id = duplicate_files_larger.media_id' ]
- if pixel_dupes_preference != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
+ if pixel_dupes_preference != ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
join_predicates.append( 'distance <= {}'.format( max_hamming_distance ) )
- if pixel_dupes_preference in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
+ if pixel_dupes_preference in ( ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
join_predicate_pixel_dupes = 'duplicate_files_smaller.king_hash_id = pixel_hash_map_smaller.hash_id AND duplicate_files_larger.king_hash_id = pixel_hash_map_larger.hash_id AND pixel_hash_map_smaller.pixel_hash_id = pixel_hash_map_larger.pixel_hash_id'
- if pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
+ if pixel_dupes_preference == ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED:
tables.extend( [
'pixel_hash_map AS pixel_hash_map_smaller',
@@ -1031,7 +1032,7 @@ def GetPotentialDuplicatePairsTableJoinGetInitialTablesAndPreds( self, pixel_dup
join_predicates.append( join_predicate_pixel_dupes )
- elif pixel_dupes_preference == CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:
+ elif pixel_dupes_preference == ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED:
# can't do "AND NOT {}", or the join will just give you the million rows where it isn't true. we want 'AND NEVER {}', and quick
diff --git a/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py b/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
index d6da1d80c..34ebabbc8 100644
--- a/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
+++ b/hydrus/client/db/ClientDBMappingsCacheSpecificDisplay.py
@@ -693,6 +693,9 @@ def RegeneratePending( self, file_service_id, tag_service_id, status_hook = None
def RescindPendingMappings( self, file_service_id, tag_service_id, storage_tag_id, hash_ids ):
+ # other things imply this tag on display, so we need to check storage to see what else has it
+ statuses_to_table_names = self.modules_mappings_storage.GetFastestStorageMappingTableNames( file_service_id, tag_service_id )
+
( cache_display_current_mappings_table_name, cache_display_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificDisplayMappingsCacheTableNames( file_service_id, tag_service_id )
implies_tag_ids = self.modules_tag_display.GetImplies( ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, tag_service_id, storage_tag_id )
@@ -718,9 +721,6 @@ def RescindPendingMappings( self, file_service_id, tag_service_id, storage_tag_i
else:
- # other things imply this tag on display, so we need to check storage to see what else has it
- statuses_to_table_names = self.modules_mappings_storage.GetFastestStorageMappingTableNames( file_service_id, tag_service_id )
-
mappings_table_name = statuses_to_table_names[ HC.CONTENT_STATUS_PENDING ]
with self._MakeTemporaryIntegerTable( other_implied_by_tag_ids, 'tag_id' ) as temp_table_name:
diff --git a/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py b/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
index 83ba6be20..94cdfb22e 100644
--- a/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
+++ b/hydrus/client/db/ClientDBMappingsCacheSpecificStorage.py
@@ -263,16 +263,25 @@ def AddFiles( self, file_service_id, tag_service_id, hash_ids, hash_ids_table_na
def AddMappings( self, tag_service_id, tag_id, hash_ids, filtered_hashes_generator: FilteredHashesGenerator ):
+ is_local = self.modules_services.GetServiceType( tag_service_id ) == HC.LOCAL_TAG
+
for ( file_service_id, filtered_hash_ids ) in filtered_hashes_generator.IterateHashes( hash_ids ):
( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
- # we have to interleave this into the iterator so that if two siblings with the same ideal are pend->currented at once, we remain logic consistent for soletag lookups!
- self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )
-
- self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
-
- num_pending_rescinded = self._GetRowCount()
+ if is_local:
+
+ num_pending_rescinded = 0
+
+ else:
+
+ # we have to interleave this into the iterator so that if two siblings with the same ideal are pend->currented at once, we remain logic consistent for soletag lookups!
+ self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )
+
+ self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
+
+ num_pending_rescinded = self._GetRowCount()
+
#
@@ -595,8 +604,6 @@ def RescindPendingMappings( self, tag_service_id, tag_id, hash_ids, filtered_has
( cache_current_mappings_table_name, cache_deleted_mappings_table_name, cache_pending_mappings_table_name ) = ClientDBMappingsStorage.GenerateSpecificMappingsCacheTableNames( file_service_id, tag_service_id )
- ac_counts = collections.Counter()
-
self.modules_mappings_cache_specific_display.RescindPendingMappings( file_service_id, tag_service_id, tag_id, filtered_hash_ids )
self._ExecuteMany( 'DELETE FROM ' + cache_pending_mappings_table_name + ' WHERE hash_id = ? AND tag_id = ?;', ( ( hash_id, tag_id ) for hash_id in filtered_hash_ids ) )
diff --git a/hydrus/client/db/ClientDBTagSearch.py b/hydrus/client/db/ClientDBTagSearch.py
index 0eadbc556..1c6a8b6be 100644
--- a/hydrus/client/db/ClientDBTagSearch.py
+++ b/hydrus/client/db/ClientDBTagSearch.py
@@ -1295,6 +1295,84 @@ def GetTagIdsFromSubtagIdsTable( self, file_service_id: int, tag_service_id: int
return final_result_tag_ids
+ def GetTagIdPredicates(
+ self,
+ tag_display_type: int,
+ file_search_context: ClientSearch.FileSearchContext,
+ tag_ids: typing.Collection[ int ],
+ inclusive = True,
+ zero_count_ok = False,
+ job_status = None
+ ):
+
+ all_predicates = []
+
+ tag_context = file_search_context.GetTagContext()
+
+ display_tag_service_id = self.modules_services.GetServiceId( tag_context.display_service_key )
+
+ include_current = tag_context.include_current_tags
+ include_pending = tag_context.include_pending_tags
+
+ file_search_context_branch = self.modules_services.GetFileSearchContextBranch( file_search_context )
+
+ for leaf in file_search_context_branch.IterateLeaves():
+
+ domain_is_cross_referenced = leaf.file_service_id != self.modules_services.combined_deleted_file_service_id
+
+ for group_of_tag_ids in HydrusData.SplitIteratorIntoChunks( tag_ids, 1000 ):
+
+ if job_status is not None and job_status.IsCancelled():
+
+ return []
+
+
+ ids_to_count = self.modules_mappings_counts.GetCounts( tag_display_type, leaf.tag_service_id, leaf.file_service_id, group_of_tag_ids, include_current, include_pending, domain_is_cross_referenced = domain_is_cross_referenced, zero_count_ok = zero_count_ok, job_status = job_status )
+
+ if len( ids_to_count ) == 0:
+
+ continue
+
+
+ #
+
+ predicates = self.modules_tag_display.GeneratePredicatesFromTagIdsAndCounts( tag_display_type, display_tag_service_id, ids_to_count, inclusive, job_status = job_status )
+
+ all_predicates.extend( predicates )
+
+
+ if job_status is not None and job_status.IsCancelled():
+
+ return []
+
+
+
+ predicates = ClientSearch.MergePredicates( all_predicates )
+
+ return predicates
+
+
+ def GetTagPredicates(
+ self,
+ tag_display_type: int,
+ file_search_context: ClientSearch.FileSearchContext,
+ tags: typing.Collection[ str ],
+ inclusive = True,
+ zero_count_ok = False,
+ job_status = None
+ ):
+
+ tag_ids = set( self.modules_tags.GetTagIdsToTags( tags = tags ).keys() )
+
+ return self.GetTagIdPredicates(
+ tag_display_type,
+ file_search_context,
+ tag_ids,
+ inclusive = inclusive,
+ zero_count_ok = zero_count_ok,
+ job_status = job_status )
+
+
def GetTagsTableName( self, file_service_id, tag_service_id ):
if file_service_id == self.modules_services.combined_file_service_id:
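`GetTagIdPredicates` above batches its count lookups via `HydrusData.SplitIteratorIntoChunks( tag_ids, 1000 )`, keeping each `GetCounts` call bounded and the job cancellable between chunks. An approximation of that chunking helper (the real hydrus utility may differ in signature and edge-case handling):

```python
def split_iterator_into_chunks(iterable, chunk_size):
    """Yield successive lists of up to chunk_size items from any iterable.
    The final chunk may be shorter; an empty iterable yields nothing."""
    chunk = []
    for item in iterable:
        chunk.append(item)
        if len(chunk) == chunk_size:
            yield chunk
            chunk = []
    if chunk:
        yield chunk
```

Chunking at 1000 keeps the temp-table and `IN (...)` work per query small, and the `job_status.IsCancelled()` check between chunks is what lets a slow predicate fetch bail out early.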
diff --git a/hydrus/client/duplicates/ClientAutoDuplicates.py b/hydrus/client/duplicates/ClientAutoDuplicates.py
new file mode 100644
index 000000000..3cb219347
--- /dev/null
+++ b/hydrus/client/duplicates/ClientAutoDuplicates.py
@@ -0,0 +1,199 @@
+import random
+import threading
+
+from hydrus.core import HydrusSerialisable
+
+from hydrus.client import ClientConstants as CC
+from hydrus.client.duplicates import ClientDuplicates
+
+DUPLICATE_STATUS_DOES_NOT_MATCH_SEARCH = 0
+DUPLICATE_STATUS_MATCHES_SEARCH_BUT_NOT_TESTED = 1
+DUPLICATE_STATUS_MATCHES_SEARCH_FAILED_TEST = 2
+DUPLICATE_STATUS_MATCHES_SEARCH_PASSED_TEST = 3 # presumably this will not be needed much since we'll delete the duplicate pair soon after, but we may as well be careful
+
+class PairComparatorRule( HydrusSerialisable.SerialisableBase ):
+
+ def Test( self, media_result_better, media_result_worse ):
+
+ raise NotImplementedError()
+
+
+
+LOOKING_AT_BETTER_CANDIDATE = 0
+LOOKING_AT_WORSE_CANDIDATE = 1
+
+class PairComparatorRuleOneFile( PairComparatorRule ):
+
+ SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE
+ SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator Rule - One File'
+ SERIALISABLE_VERSION = 1
+
+ def __init__( self ):
+
+ PairComparatorRule.__init__( self )
+
+ self._looking_at = LOOKING_AT_BETTER_CANDIDATE
+
+ # ok bro time to get metadata conditional working. first draft will be filetype test for jpeg/png. no need for UI yet
+ self._metadata_conditional = None
+ # what are we testing?
+ # this would be a great place to insert MetadataConditional
+ # mime is jpeg
+ # has icc profile
+ # maybe stuff like filesize > 200KB
+
+
+ # serialisable gubbins
+ # get/set
+
+ def Test( self, media_result_better, media_result_worse ):
+
+ if self._looking_at == LOOKING_AT_BETTER_CANDIDATE:
+
+ return self._metadata_conditional.Test( media_result_better )
+
+ else:
+
+ return self._metadata_conditional.Test( media_result_worse )
+
+
+
+
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE ] = PairComparatorRuleOneFile
+
+class PairComparatorRuleTwoFiles( PairComparatorRule ):
+
+ SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES
+ SERIALISABLE_NAME = 'Auto-Duplicates Pair Comparator Rule - Two Files'
+ SERIALISABLE_VERSION = 1
+
+ def __init__( self ):
+
+ PairComparatorRule.__init__( self )
+
+ # if I am feeling big brain, isn't this just a dynamic one-file metadata conditional?
+ # if we want 4x size, then we just pull the size of A and ask if B is <0.25x that or whatever. we don't need a clever two-file MetadataConditional test
+ # so, this guy should yeah just store two or three simple enums to handle type, operator, and quantity
+
+ # property
+ # width
+ # filesize
+ # age
+ # etc..
+ # operator
+ # is more than 4x larger
+ # is at least x absolute value larger?
+
+
+ # serialisable gubbins
+ # get/set
+
+ def Test( self, media_result_better, media_result_worse ):
+
+ pass
+
+
+
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES ] = PairComparatorRuleTwoFiles
+
+class PairSelectorAndComparator( HydrusSerialisable.SerialisableBase ):
+
+ SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR
+ SERIALISABLE_NAME = 'Auto-Duplicates Pair Selector and Comparator'
+ SERIALISABLE_VERSION = 1
+
+ def __init__( self ):
+
+ HydrusSerialisable.SerialisableBase.__init__( self )
+
+ self._rules = HydrusSerialisable.SerialisableList()
+
+
+ # serialisable gubbins
+ # get/set
+
+ def GetMatchingMedia( self, media_result_1, media_result_2 ):
+
+ pair = [ media_result_1, media_result_2 ]
+
+ # just in case both match
+ random.shuffle( pair )
+
+ ( media_result_1, media_result_2 ) = pair
+
+ if False not in ( rule.Test( media_result_1, media_result_2 ) for rule in self._rules ):
+
+ return media_result_1
+
+ elif False not in ( rule.Test( media_result_2, media_result_1 ) for rule in self._rules ):
+
+ return media_result_2
+
+ else:
+
+ return None
+
+
+
+
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR ] = PairSelectorAndComparator
+
+class AutoDuplicatesRule( HydrusSerialisable.SerialisableBaseNamed ):
+
+ SERIALISABLE_TYPE = HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_RULE
+ SERIALISABLE_NAME = 'Auto-Duplicates Rule'
+ SERIALISABLE_VERSION = 1
+
+ def __init__( self, name ):
+
+ HydrusSerialisable.SerialisableBaseNamed.__init__( self, name )
+
+ self._id = -1
+
+ # maybe make this search part into its own object? in ClientDuplicates
+ # could wangle duplicate pages and client api dupe stuff to work in the same guy, great idea
+ self._file_search_context_1 = None
+ self._file_search_context_2 = None
+ self._dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ self._pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ self._max_hamming_distance = 4
+
+ self._selector_and_comparator = None
+
+ # action info
+ # set as better
+ # delete the other one
+ # optional custom merge options
+
+
+ # serialisable gubbins
+ # get/set
+ # 'here's a pair of media results, pass/fail?'
+
+
+HydrusSerialisable.SERIALISABLE_TYPES_TO_OBJECT_TYPES[ HydrusSerialisable.SERIALISABLE_AUTO_DUPLICATES_RULE ] = AutoDuplicatesRule
+
+class AutoDuplicatesManager( object ):
+
+ my_instance = None
+
+ def __init__( self ):
+
+ AutoDuplicatesManager.my_instance = self
+
+ # my rules, start with empty and then load from db or whatever on controller init
+
+ self._lock = threading.Lock()
+
+
+ @staticmethod
+ def instance() -> 'AutoDuplicatesManager':
+
+ if AutoDuplicatesManager.my_instance is None:
+
+ AutoDuplicatesManager()
+
+
+ return AutoDuplicatesManager.my_instance
+
+
diff --git a/hydrus/client/ClientDuplicates.py b/hydrus/client/duplicates/ClientDuplicates.py
similarity index 98%
rename from hydrus/client/ClientDuplicates.py
rename to hydrus/client/duplicates/ClientDuplicates.py
index 0a82c4531..a1beb9bbf 100644
--- a/hydrus/client/ClientDuplicates.py
+++ b/hydrus/client/duplicates/ClientDuplicates.py
@@ -23,6 +23,24 @@
from hydrus.client.metadata import ClientContentUpdates
from hydrus.client.metadata import ClientTags
+DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH = 0
+DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH = 1
+DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES = 2
+
+SIMILAR_FILES_PIXEL_DUPES_REQUIRED = 0
+SIMILAR_FILES_PIXEL_DUPES_ALLOWED = 1
+SIMILAR_FILES_PIXEL_DUPES_EXCLUDED = 2
+
+similar_files_pixel_dupes_string_lookup = {
+ SIMILAR_FILES_PIXEL_DUPES_REQUIRED : 'must be pixel dupes',
+ SIMILAR_FILES_PIXEL_DUPES_ALLOWED : 'can be pixel dupes',
+ SIMILAR_FILES_PIXEL_DUPES_EXCLUDED : 'must not be pixel dupes'
+}
+
+SYNC_ARCHIVE_NONE = 0
+SYNC_ARCHIVE_IF_ONE_DO_BOTH = 1
+SYNC_ARCHIVE_DO_BOTH_REGARDLESS = 2
+
hashes_to_jpeg_quality = {}
def GetDuplicateComparisonScore( shown_media, comparison_media ):
@@ -423,41 +441,26 @@ def GetDuplicateComparisonStatements( shown_media, comparison_media ):
global hashes_to_jpeg_quality
- if s_hash not in hashes_to_jpeg_quality:
-
- path = CG.client_controller.client_files_manager.GetFilePath( s_hash, s_mime )
+ for jpeg_hash in ( s_hash, c_hash ):
- try:
-
- raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
-
- result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
-
- except:
+ if jpeg_hash not in hashes_to_jpeg_quality:
- result = ( 'unknown', None )
+ path = CG.client_controller.client_files_manager.GetFilePath( jpeg_hash, HC.IMAGE_JPEG )
-
- hashes_to_jpeg_quality[ s_hash ] = result
-
-
- if c_hash not in hashes_to_jpeg_quality:
-
- path = CG.client_controller.client_files_manager.GetFilePath( c_hash, c_mime )
-
- try:
-
- raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
-
- result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
-
- except:
+ try:
+
+ raw_pil_image = HydrusImageOpening.RawOpenPILImage( path )
+
+ result = HydrusImageMetadata.GetJPEGQuantizationQualityEstimate( raw_pil_image )
+
+ except:
+
+ result = ( 'unknown', None )
+
- result = ( 'unknown', None )
+ hashes_to_jpeg_quality[ jpeg_hash ] = result
- hashes_to_jpeg_quality[ c_hash ] = result
-
( s_label, s_jpeg_quality ) = hashes_to_jpeg_quality[ s_hash ]
( c_label, c_jpeg_quality ) = hashes_to_jpeg_quality[ c_hash ]
@@ -785,10 +788,6 @@ def THREADSearchPotentials( self ):
-SYNC_ARCHIVE_NONE = 0
-SYNC_ARCHIVE_IF_ONE_DO_BOTH = 1
-SYNC_ARCHIVE_DO_BOTH_REGARDLESS = 2
-
def get_updated_domain_modified_timestamp_datas( destination_media: ClientMedia.MediaSingleton, source_media: ClientMedia.MediaSingleton, urls: typing.Collection[ str ] ):
from hydrus.client.networking import ClientNetworkingFunctions
diff --git a/hydrus/client/duplicates/__init__.py b/hydrus/client/duplicates/__init__.py
new file mode 100644
index 000000000..e69de29bb
diff --git a/hydrus/client/gui/ClientGUI.py b/hydrus/client/gui/ClientGUI.py
index e33a44442..3e1278db4 100644
--- a/hydrus/client/gui/ClientGUI.py
+++ b/hydrus/client/gui/ClientGUI.py
@@ -3459,6 +3459,8 @@ def flip_macos_antiflicker():
ClientGUIMenus.AppendMenuItem( gui_actions, 'make some popups', 'Throw some varied popups at the message manager, just to check it is working.', self._DebugMakeSomePopups )
ClientGUIMenus.AppendMenuItem( gui_actions, 'publish some sub files in five seconds', 'Publish some files like a subscription would.', self._controller.CallLater, 5, lambda: CG.client_controller.pub( 'imported_files_to_page', [ HydrusData.GenerateKey() for i in range( 5 ) ], 'example sub files' ) )
ClientGUIMenus.AppendMenuItem( gui_actions, 'refresh pages menu in five seconds', 'Delayed refresh the pages menu, giving you time to minimise or otherwise alter the client before it arrives.', self._controller.CallLater, 5, self._menu_updater_pages.update )
+ ClientGUIMenus.AppendMenuItem( gui_actions, 'reload current gui session', 'Save the current GUI session and reload it.', self._ReloadCurrentGUISession )
+ ClientGUIMenus.AppendMenuItem( gui_actions, 'reload current stylesheet', 'Reload the current QSS stylesheet.', ClientGUIStyle.ReloadStyleSheet )
ClientGUIMenus.AppendMenuItem( gui_actions, 'reset multi-column list settings to default', 'Reset all multi-column list widths and other display settings to default.', self._DebugResetColumnListManager )
ClientGUIMenus.AppendMenuItem( gui_actions, 'save \'last session\' gui session', 'Make an immediate save of the \'last session\' gui session. Mostly for testing crashes, where last session is not saved correctly.', self.ProposeSaveGUISession, CC.LAST_SESSION_SESSION_NAME )
@@ -5437,6 +5439,42 @@ def _RegenerateTagSiblingsLookupCache( self ):
+ def _ReloadCurrentGUISession( self ):
+
+ name = 'temp_session_slot_for_reload_if_you_see_this_you_can_delete_it'
+ only_changed_page_data = True
+ about_to_save = True
+
+ session = self._notebook.GetCurrentGUISession( name, only_changed_page_data, about_to_save )
+
+ self._FleshOutSessionWithCleanDataIfNeeded( self._notebook, name, session )
+
+ def qt_load():
+
+ while self._notebook.count() > 0:
+
+ self._notebook.CloseCurrentPage( polite = False )
+
+
+ self._notebook.LoadGUISession( name )
+
+ self._controller.Write( 'delete_serialisable_named', HydrusSerialisable.SERIALISABLE_TYPE_GUI_SESSION_CONTAINER, name )
+
+ self._controller.pub( 'notify_new_sessions' )
+
+
+
+ def do_save():
+
+ CG.client_controller.SaveGUISession( session )
+
+ CG.client_controller.CallBlockingToQt( self, qt_load )
+
+
+ self._controller.CallToThread( do_save )
+
+
+
def _RepairInvalidTags( self ):
message = 'This will scan all your tags and repair any that are invalid. This might mean taking out unrenderable characters or cleaning up improper whitespace. If there is a tag collision once cleaned, it may add a (1)-style number on the end.'
diff --git a/hydrus/client/gui/ClientGUICore.py b/hydrus/client/gui/ClientGUICore.py
index e9ed98ca9..9dec16da8 100644
--- a/hydrus/client/gui/ClientGUICore.py
+++ b/hydrus/client/gui/ClientGUICore.py
@@ -53,6 +53,8 @@ def MenuIsOpen( self ):
def PopupMenu( self, widget: QW.QWidget, menu: QW.QMenu ):
+ ClientGUIMenus.RemoveFinalSeparator( menu )
+
if HC.PLATFORM_MACOS and widget.window().isModal():
# Ok, seems like Big Sur can't do menus at the moment lmao. it shows the menu but the mouse can't interact with it
diff --git a/hydrus/client/gui/ClientGUIMenus.py b/hydrus/client/gui/ClientGUIMenus.py
index 962b30eff..9144414a2 100644
--- a/hydrus/client/gui/ClientGUIMenus.py
+++ b/hydrus/client/gui/ClientGUIMenus.py
@@ -17,6 +17,8 @@
def AppendMenu( menu, submenu, label ):
+ RemoveFinalSeparator( submenu )
+
label = SanitiseLabel( label )
submenu.setTitle( label )
@@ -234,6 +236,26 @@ def event_callable( checked_state ):
return event_callable
+
+def RemoveFinalSeparator( menu: QW.QMenu ):
+
+ num_items = len( menu.actions() )
+
+ if num_items > 0:
+
+ last_item = menu.actions()[-1]
+
+ # got this once, who knows what happened, so we test for QAction now
+ # 'PySide2.QtGui.QStandardItem' object has no attribute 'isSeparator'
+ if isinstance( last_item, QW.QAction ):
+
+ if last_item.isSeparator():
+
+ menu.removeAction( last_item )
+
+
+
+
def SanitiseLabel( label: str ) -> str:
if label == '':
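`RemoveFinalSeparator` above trims a trailing separator so menus assembled from optional sections do not end with a dangling divider; it is now called from both `AppendMenu` and `PopupMenu`. The trimming logic, modelled without a Qt dependency (actions as hypothetical `(label, is_separator)` tuples rather than `QAction`s):

```python
def remove_final_separator(actions):
    """Drop a trailing separator from a menu's action list in place,
    mirroring RemoveFinalSeparator. Each action is modelled as a
    (label, is_separator) tuple for this Qt-free sketch."""
    if actions and actions[-1][1]:
        actions.pop()
    return actions
```

The real function also type-checks the last item before calling `isSeparator()`, because a non-`QAction` object was observed there at least once, per the comment in the diff.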
diff --git a/hydrus/client/gui/ClientGUIScrolledPanelsEdit.py b/hydrus/client/gui/ClientGUIScrolledPanelsEdit.py
index 62a3b4b34..8908528b5 100644
--- a/hydrus/client/gui/ClientGUIScrolledPanelsEdit.py
+++ b/hydrus/client/gui/ClientGUIScrolledPanelsEdit.py
@@ -19,9 +19,9 @@
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsMessage
from hydrus.client.gui import ClientGUIDialogsQuick
diff --git a/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py b/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
index 15cb07ac4..d40aa2e71 100644
--- a/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
+++ b/hydrus/client/gui/ClientGUIScrolledPanelsManagement.py
@@ -4119,6 +4119,15 @@ def __init__( self, parent, new_options ):
#
+ children_panel = ClientGUICommon.StaticBox( self, 'children tags' )
+
+ self._num_to_show_in_ac_dropdown_children_tab = ClientGUICommon.NoneableSpinCtrl( children_panel, none_phrase = 'show all', min = 1 )
+ tt = 'The "children" tab will show children of the current tag context (usually the list of tags above the autocomplete), ordered by file count. This can quickly get spammy, so I recommend you cull it to a reasonable size.'
+ self._num_to_show_in_ac_dropdown_children_tab.setToolTip( tt )
+ self._num_to_show_in_ac_dropdown_children_tab.SetValue( 20 ) # init default
+
+ #
+
self._expand_parents_on_storage_taglists.setChecked( self._new_options.GetBoolean( 'expand_parents_on_storage_taglists' ) )
self._expand_parents_on_storage_taglists.setToolTip( ClientGUIFunctions.WrapToolTip( 'This affects taglists in places like the manage tags dialog, where you edit tags as they actually are, and implied parents hang below tags.' ) )
@@ -4147,7 +4156,9 @@ def __init__( self, parent, new_options ):
#
- vbox = QP.VBoxLayout()
+ self._num_to_show_in_ac_dropdown_children_tab.SetValue( self._new_options.GetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab' ) )
+
+ #
rows = []
@@ -4164,18 +4175,30 @@ def __init__( self, parent, new_options ):
general_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
- QP.AddToLayout( vbox, general_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
-
#
favourites_panel.Add( favourites_st, CC.FLAGS_EXPAND_PERPENDICULAR )
favourites_panel.Add( self._favourites, CC.FLAGS_EXPAND_BOTH_WAYS )
favourites_panel.Add( self._favourites_input )
- QP.AddToLayout( vbox, favourites_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+ #
+
+ rows = []
+
+ rows.append( ( 'How many tags to show in the children tab: ', self._num_to_show_in_ac_dropdown_children_tab ) )
+
+ gridbox = ClientGUICommon.WrapInGrid( children_panel, rows )
+
+ children_panel.Add( gridbox, CC.FLAGS_EXPAND_SIZER_PERPENDICULAR )
#
+ vbox = QP.VBoxLayout()
+
+ QP.AddToLayout( vbox, general_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+ QP.AddToLayout( vbox, favourites_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+ QP.AddToLayout( vbox, children_panel, CC.FLAGS_EXPAND_PERPENDICULAR )
+
self.setLayout( vbox )
#
@@ -4212,6 +4235,10 @@ def UpdateOptions( self ):
self._new_options.SetStringList( 'favourite_tags', list( self._favourites.GetTags() ) )
+ #
+
+ self._new_options.SetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab', self._num_to_show_in_ac_dropdown_children_tab.GetValue() )
+
class _TagPresentationPanel( QW.QWidget ):
diff --git a/hydrus/client/gui/ClientGUIStyle.py b/hydrus/client/gui/ClientGUIStyle.py
index d93c5f053..7049d7b8b 100644
--- a/hydrus/client/gui/ClientGUIStyle.py
+++ b/hydrus/client/gui/ClientGUIStyle.py
@@ -15,6 +15,7 @@
CURRENT_STYLE_NAME = None
ORIGINAL_STYLESHEET = None
CURRENT_STYLESHEET = None
+CURRENT_STYLESHEET_FILENAME = None
def ClearStylesheet():
@@ -88,6 +89,18 @@ def InitialiseDefaults():
CURRENT_STYLESHEET = ORIGINAL_STYLESHEET
+def ReloadStyleSheet():
+
+ ClearStylesheet()
+
+ if CURRENT_STYLESHEET_FILENAME is not None:
+
+ SetStylesheetFromPath( CURRENT_STYLESHEET_FILENAME )
+
+
+
def SetStyleFromName( name: str ):
if QtInit.WE_ARE_QT5:
@@ -116,7 +129,6 @@ def SetStyleFromName( name: str ):
except Exception as e:
raise HydrusExceptions.DataMissing( 'Style "{}" could not be generated/applied. If this is the default, perhaps a third-party custom style, you may have to restart the client to re-set it. Extra error info: {}'.format( name, e ) )
-
@@ -143,6 +155,10 @@ def SetStyleSheet( stylesheet, prepend_hydrus = True ):
def SetStylesheetFromPath( filename ):
+ global CURRENT_STYLESHEET_FILENAME
+
+ CURRENT_STYLESHEET_FILENAME = filename
+
path = os.path.join( STYLESHEET_DIR, filename )
if not os.path.exists( path ):
diff --git a/hydrus/client/gui/ClientGUISubscriptions.py b/hydrus/client/gui/ClientGUISubscriptions.py
index 549be06f0..9836238ff 100644
--- a/hydrus/client/gui/ClientGUISubscriptions.py
+++ b/hydrus/client/gui/ClientGUISubscriptions.py
@@ -1,4 +1,5 @@
import collections
+import itertools
import os
import threading
import time
@@ -39,6 +40,36 @@
from hydrus.client.importing import ClientImportSubscriptionQuery
from hydrus.client.importing import ClientImportSubscriptionLegacy # keep this here so the serialisable stuff is registered, it has to be imported somewhere
+def DoAliveOrDeadCheck( win: QW.QWidget, query_headers: typing.Collection[ ClientImportSubscriptionQuery.SubscriptionQueryHeader ] ):
+
+ do_alive = True
+ do_dead = True
+
+ num_dead = sum( ( 1 for query_header in query_headers if query_header.IsDead() ) )
+
+ if 0 < num_dead < len( query_headers ):
+
+ message = f'Of the {HydrusData.ToHumanInt(len(query_headers))} selected queries, {HydrusData.ToHumanInt(num_dead)} are DEAD. Which queries do you want to check?'
+
+ choice_tuples = [
+ ( 'all of them', ( True, True ), 'Resuscitate the DEAD queries and check everything.' ),
+ ( f'the {HydrusData.ToHumanInt(len(query_headers)-num_dead)} ALIVE', ( True, False ), 'Check the ALIVE queries.' ),
+ ( f'the {HydrusData.ToHumanInt(num_dead)} DEAD', ( False, True ), 'Resuscitate the DEAD queries and check them.' )
+ ]
+
+ ( do_alive, do_dead ) = ClientGUIDialogsQuick.SelectFromListButtons( win, 'Check which?', choice_tuples, message = message )
+
+
+
+ return ( do_alive, do_dead )
+
+
def GetQueryHeadersQualityInfo( query_headers: typing.Iterable[ ClientImportSubscriptionQuery.SubscriptionQueryHeader ] ):
data = []
@@ -403,14 +434,38 @@ def _CheckerOptionsUpdated( self, checker_options ):
def _CheckNow( self ):
- selected_queries = self._query_headers.GetData( only_selected = True )
+ selected_query_headers = self._query_headers.GetData( only_selected = True )
+
+ try:
+
+ ( do_alive, do_dead ) = DoAliveOrDeadCheck( self, selected_query_headers )
+
+ except HydrusExceptions.CancelledException:
+
+ return
+
- for query_header in selected_queries:
+ for query_header in selected_query_headers:
+
+ if query_header.IsDead():
+
+ if not do_dead:
+
+ continue
+
+
+ else:
+
+ if not do_alive:
+
+ continue
+
+
query_header.CheckNow()
- self._query_headers.UpdateDatas( selected_queries )
+ self._query_headers.UpdateDatas( selected_query_headers )
self._query_headers.Sort()
@@ -2060,9 +2115,49 @@ def CheckNow( self ):
subscriptions = self._subscriptions.GetData( only_selected = True )
+ query_headers = HydrusData.MassExtend( ( subscription.GetQueryHeaders() for subscription in subscriptions ) )
+
+ try:
+
+ ( do_alive, do_dead ) = DoAliveOrDeadCheck( self, query_headers )
+
+ except HydrusExceptions.CancelledException:
+
+ return
+
+
for subscription in subscriptions:
- subscription.CheckNow()
+ we_did_some = False
+
+ query_headers = subscription.GetQueryHeaders()
+
+ for query_header in query_headers:
+
+ if query_header.IsDead():
+
+ if not do_dead:
+
+ continue
+
+
+ else:
+
+ if not do_alive:
+
+ continue
+
+
+
+ query_header.CheckNow()
+
+ we_did_some = True
+
+
+ if we_did_some:
+
+ subscription.ScrubDelay()
+
self._subscriptions.UpdateDatas( subscriptions )
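`DoAliveOrDeadCheck` above only pops the question when the selection mixes DEAD and ALIVE queries; otherwise there is nothing to choose and both flags stay true. A minimal sketch of that decision, using a hypothetical stand-in for the query header class rather than hydrus's `SubscriptionQueryHeader`:

```python
from dataclasses import dataclass

@dataclass
class QueryStub:

    is_dead: bool  # stand-in for SubscriptionQueryHeader.IsDead()

def partition_by_dead( query_headers ):

    # mirror the dialog logic: a question is only needed when the selection
    # mixes ALIVE and DEAD queries
    dead = [ q for q in query_headers if q.is_dead ]
    alive = [ q for q in query_headers if not q.is_dead ]

    needs_question = 0 < len( dead ) < len( query_headers )

    return ( alive, dead, needs_question )
```

The same ALIVE/DEAD test then drives the per-query `continue` filtering in both `_CheckNow` and the 'edit subscriptions' `CheckNow`.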
diff --git a/hydrus/client/gui/canvas/ClientGUICanvas.py b/hydrus/client/gui/canvas/ClientGUICanvas.py
index b32f4b576..fa049f2cd 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvas.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvas.py
@@ -14,9 +14,9 @@
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIDialogs
from hydrus.client.gui import ClientGUIDialogsManage
diff --git a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
index 999302d38..5532a309a 100644
--- a/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
+++ b/hydrus/client/gui/canvas/ClientGUICanvasHoverFrames.py
@@ -12,8 +12,8 @@
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientData
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.gui import ClientGUIDragDrop
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIFunctions
diff --git a/hydrus/client/gui/importing/ClientGUIImportOptions.py b/hydrus/client/gui/importing/ClientGUIImportOptions.py
index 84164da24..59bfdf83c 100644
--- a/hydrus/client/gui/importing/ClientGUIImportOptions.py
+++ b/hydrus/client/gui/importing/ClientGUIImportOptions.py
@@ -312,6 +312,8 @@ def __init__( self, parent: QW.QWidget, file_import_options: FileImportOptions.F
QP.AddToLayout( vbox, self._load_default_options, CC.FLAGS_EXPAND_PERPENDICULAR )
QP.AddToLayout( vbox, self._specific_options_panel, CC.FLAGS_EXPAND_BOTH_WAYS )
+ vbox.addStretch( 1 )
+
self.widget().setLayout( vbox )
self._destination_location_context.locationChanged.connect( self._UpdateLocationText )
diff --git a/hydrus/client/gui/lists/ClientGUIListBoxes.py b/hydrus/client/gui/lists/ClientGUIListBoxes.py
index 2cad749de..4e6ca6a14 100644
--- a/hydrus/client/gui/lists/ClientGUIListBoxes.py
+++ b/hydrus/client/gui/lists/ClientGUIListBoxes.py
@@ -1989,6 +1989,8 @@ def _SelectionChanged( self ):
def _SetVirtualSize( self ):
+ # this triggers an update of the scrollbars, maybe important if this is the first time the thing is shown, let's see if it helps our missing scrollbar issue
+ # I think this is needed here for PySide2 and a/c dropdowns
self.setWidgetResizable( True )
my_size = self.widget().size()
@@ -1997,10 +1999,18 @@ def _SetVirtualSize( self ):
ideal_virtual_size = QC.QSize( my_size.width(), text_height * self._total_positional_rows )
+ if HG.gui_report_mode:
+
+ HydrusData.ShowText( f'Setting a virtual size on {self}. Num terms: {len( self._ordered_terms)}, Text height: {text_height}, Total Positional Rows: {self._total_positional_rows}, My Height: {my_size.height()}, Ideal Height: {ideal_virtual_size.height()}' )
+
+
if ideal_virtual_size != my_size:
self.widget().setMinimumSize( ideal_virtual_size )
+ # this triggers an update of the scrollbars, maybe important if this is the first time the thing is shown, let's see if it helps our missing scrollbar issue
+ self.setWidgetResizable( True )
+
def _Sort( self ):
diff --git a/hydrus/client/gui/pages/ClientGUIManagementController.py b/hydrus/client/gui/pages/ClientGUIManagementController.py
index 66d4837d7..efd2fc468 100644
--- a/hydrus/client/gui/pages/ClientGUIManagementController.py
+++ b/hydrus/client/gui/pages/ClientGUIManagementController.py
@@ -9,6 +9,7 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportGallery
from hydrus.client.importing import ClientImportLocal
from hydrus.client.importing import ClientImportSimpleURLs
@@ -74,8 +75,8 @@ def CreateManagementControllerDuplicateFilter(
management_controller.SetVariable( 'file_search_context_1', file_search_context )
management_controller.SetVariable( 'file_search_context_2', file_search_context.Duplicate() )
- management_controller.SetVariable( 'dupe_search_type', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
- management_controller.SetVariable( 'pixel_dupes_preference', CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+ management_controller.SetVariable( 'dupe_search_type', ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+ management_controller.SetVariable( 'pixel_dupes_preference', ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
management_controller.SetVariable( 'max_hamming_distance', 4 )
return management_controller
@@ -506,13 +507,13 @@ def _UpdateSerialisableInfo( self, version, old_serialisable_info ):
if management_type == MANAGEMENT_TYPE_DUPLICATE_FILTER:
- value = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ value = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
if 'both_files_match' in variables:
if variables[ 'both_files_match' ]:
- value = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ value = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
del variables[ 'both_files_match' ]
diff --git a/hydrus/client/gui/pages/ClientGUIManagementPanels.py b/hydrus/client/gui/pages/ClientGUIManagementPanels.py
index 09aff2df6..898f949c4 100644
--- a/hydrus/client/gui/pages/ClientGUIManagementPanels.py
+++ b/hydrus/client/gui/pages/ClientGUIManagementPanels.py
@@ -17,7 +17,6 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
from hydrus.client import ClientGlobals as CG
from hydrus.client import ClientLocation
from hydrus.client import ClientParsing
@@ -25,6 +24,7 @@
from hydrus.client import ClientServices
from hydrus.client import ClientThreading
from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.gui import ClientGUIAsync
from hydrus.client.gui import ClientGUICore as CGC
from hydrus.client.gui import ClientGUIDialogs
@@ -549,15 +549,15 @@ def __init__( self, parent, page, controller, management_controller: ClientGUIMa
self._dupe_search_type = ClientGUICommon.BetterChoice( self._filtering_panel )
- self._dupe_search_type.addItem( 'at least one file matches the search', CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
- self._dupe_search_type.addItem( 'both files match the search', CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH )
- self._dupe_search_type.addItem( 'both files match different searches', CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
+ self._dupe_search_type.addItem( 'at least one file matches the search', ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+ self._dupe_search_type.addItem( 'both files match the search', ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH )
+ self._dupe_search_type.addItem( 'both files match different searches', ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
self._pixel_dupes_preference = ClientGUICommon.BetterChoice( self._filtering_panel )
- for p in ( CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
+ for p in ( ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED, ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED ):
- self._pixel_dupes_preference.addItem( CC.similar_files_pixel_dupes_string_lookup[ p ], p )
+ self._pixel_dupes_preference.addItem( ClientDuplicates.similar_files_pixel_dupes_string_lookup[ p ], p )
self._max_hamming_distance_for_filter = ClientGUICommon.BetterSpinBox( self._filtering_panel, min = 0, max = 64 )
@@ -592,7 +592,7 @@ def __init__( self, parent, page, controller, management_controller: ClientGUIMa
if not management_controller.HasVariable( 'pixel_dupes_preference' ):
- management_controller.SetVariable( 'pixel_dupes_preference', CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+ management_controller.SetVariable( 'pixel_dupes_preference', ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
self._pixel_dupes_preference.SetValue( management_controller.GetVariable( 'pixel_dupes_preference' ) )
@@ -759,11 +759,11 @@ def _GetDuplicateFileSearchData( self, optimise_for_search = True ) -> typing.Tu
if optimise_for_search:
- if dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH and ( file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates() ):
+ if dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH and ( file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates() ):
- dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
- elif dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
+ elif dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES:
if file_search_context_1.IsJustSystemEverything() or file_search_context_1.HasNoPredicates():
@@ -771,11 +771,11 @@ def _GetDuplicateFileSearchData( self, optimise_for_search = True ) -> typing.Tu
file_search_context_1 = file_search_context_2
file_search_context_2 = f
- dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
elif file_search_context_2.IsJustSystemEverything() or file_search_context_2.HasNoPredicates():
- dupe_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
@@ -1027,9 +1027,9 @@ def _UpdateFilterSearchControls( self ):
( file_search_context_1, file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance ) = self._GetDuplicateFileSearchData( optimise_for_search = False )
- self._tag_autocomplete_2.setVisible( dupe_search_type == CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
+ self._tag_autocomplete_2.setVisible( dupe_search_type == ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES )
- self._max_hamming_distance_for_filter.setEnabled( self._pixel_dupes_preference.GetValue() != CC.SIMILAR_FILES_PIXEL_DUPES_REQUIRED )
+ self._max_hamming_distance_for_filter.setEnabled( self._pixel_dupes_preference.GetValue() != ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_REQUIRED )
def FilterDupeSearchTypeChanged( self ):
diff --git a/hydrus/client/gui/pages/ClientGUIPages.py b/hydrus/client/gui/pages/ClientGUIPages.py
index b030f8a59..150181bae 100644
--- a/hydrus/client/gui/pages/ClientGUIPages.py
+++ b/hydrus/client/gui/pages/ClientGUIPages.py
@@ -1973,9 +1973,13 @@ def _ShowMenu( self, screen_position ):
ClientGUIMenus.AppendMenuItem( menu, 'send pages to the right to a new page of pages', 'Make a new page of pages and put all the pages to the right into it.', self._SendRightPagesToNewNotebook, tab_index )
- if click_over_page_of_pages and page.count() > 0:
+ ClientGUIMenus.AppendSeparator( menu )
+
+ if not click_over_page_of_pages:
- ClientGUIMenus.AppendSeparator( menu )
+ ClientGUIMenus.AppendMenuItem( menu, 'refresh this page', 'Command this page to refresh.', page.RefreshQuery )
+
+ elif click_over_page_of_pages and page.count() > 0:
ClientGUIMenus.AppendMenuItem( menu, 'refresh all this page\'s pages', 'Command every page below this one to refresh.', page.RefreshAllPages )
diff --git a/hydrus/client/gui/pages/ClientGUIResults.py b/hydrus/client/gui/pages/ClientGUIResults.py
index de4de2ec6..8471e43d1 100644
--- a/hydrus/client/gui/pages/ClientGUIResults.py
+++ b/hydrus/client/gui/pages/ClientGUIResults.py
@@ -3286,7 +3286,7 @@ def _UpdateBackgroundColour( self ):
def _UpdateScrollBars( self ):
-
+
# The following call is officially a no-op since this property is already true, but it also triggers an update
# of the scroll area's scrollbars which we need.
# We need this since we are intercepting & doing work in resize events which causes
diff --git a/hydrus/client/gui/search/ClientGUIACDropdown.py b/hydrus/client/gui/search/ClientGUIACDropdown.py
index 7dcf11f95..7bfd9c3d9 100644
--- a/hydrus/client/gui/search/ClientGUIACDropdown.py
+++ b/hydrus/client/gui/search/ClientGUIACDropdown.py
@@ -599,13 +599,13 @@ def WriteFetch(
class ListBoxTagsPredicatesAC( ClientGUIListBoxes.ListBoxTagsPredicates ):
- def __init__( self, parent, callable, service_key, float_mode, **kwargs ):
+ def __init__( self, parent, callable, float_mode, service_key, **kwargs ):
ClientGUIListBoxes.ListBoxTagsPredicates.__init__( self, parent, **kwargs )
self._callable = callable
- self._service_key = service_key
self._float_mode = float_mode
+ self._service_key = service_key
self._predicates = {}
@@ -676,7 +676,6 @@ def SetPredicates( self, predicates, preserve_single_selection = False ):
if not they_are_the_same:
previously_selected_predicate = None
- previously_selected_predicate_had_count = False
if len( self._selected_terms ) == 1:
@@ -1534,6 +1533,155 @@ def SetForceDropdownHide( self, value ):
self._DropdownHideShow()
+
+class ChildrenTab( ListBoxTagsPredicatesAC ):
+
+ def __init__( self, parent: QW.QWidget, broadcast_call, float_mode: bool, location_context: ClientLocation.LocationContext, tag_service_key: bytes, tag_display_type: int = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars: int = 4 ):
+
+ self._location_context = location_context
+ self._tags_to_child_predicates_cache = dict()
+ self._children_need_updating = True
+
+ ListBoxTagsPredicatesAC.__init__( self, parent, broadcast_call, float_mode, tag_service_key, tag_display_type = tag_display_type, height_num_chars = height_num_chars )
+
+
+ def NotifyNeedsUpdating( self ):
+
+ self._children_need_updating = True
+
+
+ def SetLocationContext( self, location_context: ClientLocation.LocationContext ):
+
+ self._location_context = location_context
+
+
+ def SetTagServiceKey( self, service_key: bytes ):
+
+ ListBoxTagsPredicatesAC.SetTagServiceKey( self, service_key )
+
+ self._tags_to_child_predicates_cache = dict()
+
+
+ def UpdateChildrenIfNeeded( self, context_tags: typing.Collection[ str ] ):
+
+ if self._children_need_updating:
+
+ context_tags = set( context_tags )
+
+ tag_display_type = self._tag_display_type
+ location_context = self._location_context
+ tag_service_key = self._service_key
+ tags_to_child_predicates_cache = dict( self._tags_to_child_predicates_cache )
+
+ if location_context.IsOneDomain():
+
+ search_location_context = location_context
+
+ else:
+
+ # let's not blat the db on some crazy multi-domain just for this un-numbered list
+ search_location_context = ClientLocation.LocationContext.STATICCreateSimple( CC.COMBINED_TAG_SERVICE_KEY )
+
+
+ tag_context = ClientSearch.TagContext( service_key = tag_service_key )
+
+ file_search_context = ClientSearch.FileSearchContext(
+ location_context = search_location_context,
+ tag_context = tag_context
+ )
+
+ def work_callable():
+
+ uncached_context_tags = { tag for tag in context_tags if tag not in tags_to_child_predicates_cache }
+
+ if len( uncached_context_tags ) > 0:
+
+ new_tags_to_child_tags = CG.client_controller.Read( 'tag_descendants_lookup', tag_service_key, uncached_context_tags )
+
+ new_child_tags = HydrusData.MassUnion( new_tags_to_child_tags.values() )
+
+ child_predicates = CG.client_controller.Read(
+ 'tag_predicates',
+ tag_display_type,
+ file_search_context,
+ new_child_tags,
+ zero_count_ok = True
+ )
+
+ child_tags_to_child_predicates = { predicate.GetValue() : predicate for predicate in child_predicates }
+
+ new_tags_to_child_predicates = { tag : { child_tags_to_child_predicates[ child_tag ] for child_tag in child_tags if child_tag in child_tags_to_child_predicates } for ( tag, child_tags ) in new_tags_to_child_tags.items() }
+
+ else:
+
+ new_tags_to_child_predicates = dict()
+
+
+ child_predicates = set()
+
+ for tag in context_tags:
+
+ if tag in tags_to_child_predicates_cache:
+
+ child_predicates.update( tags_to_child_predicates_cache[ tag ] )
+
+ elif tag in new_tags_to_child_predicates:
+
+ child_predicates.update( new_tags_to_child_predicates[ tag ] )
+
+
+
+ child_predicates = [ predicate for predicate in child_predicates if predicate.GetValue() not in context_tags ]
+
+ ClientSearch.SortPredicates( child_predicates )
+
+ child_predicates = [ predicate.GetCountlessCopy() for predicate in child_predicates ]
+
+ num_to_show_in_ac_dropdown_children_tab = CG.client_controller.new_options.GetNoneableInteger( 'num_to_show_in_ac_dropdown_children_tab' )
+
+ if num_to_show_in_ac_dropdown_children_tab is not None:
+
+ child_predicates = child_predicates[ : num_to_show_in_ac_dropdown_children_tab ]
+
+
+ return ( location_context, tag_service_key, child_predicates, new_tags_to_child_predicates )
+
+
+ def publish_callable( result ):
+
+ ( job_location_context, job_tag_service_key, child_predicates, new_tags_to_children ) = result
+
+ if job_location_context != self._location_context or job_tag_service_key != self._service_key:
+
+ self.SetPredicates( [] )
+
+ return
+
+
+ self._tags_to_child_predicates_cache.update( new_tags_to_children )
+
+ self.SetPredicates( child_predicates, preserve_single_selection = True )
+
+ self._children_need_updating = False
+
+
+ def errback_callable( etype, value, tb ):
+
+ self.SetPredicates( [] )
+
+ self._children_need_updating = False
+
+ HydrusData.ShowText( 'Trying to load some child tags failed, please send this to hydev:' )
+ HydrusData.ShowExceptionTuple( etype, value, tb, do_wait = False )
+
+
+ job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable, errback_callable = errback_callable )
+
+ job.start()
+
+
+
+
class AutoCompleteDropdownTags( AutoCompleteDropdown ):
locationChanged = QC.Signal( ClientLocation.LocationContext )
@@ -1569,9 +1717,7 @@ def __init__( self, parent, location_context: ClientLocation.LocationContext, ta
self._dropdown_notebook.addTab( self._favourites_list, 'favourites' )
- self._children_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = 4 )
- self._tags_to_children_cache = dict()
- self._children_list_needs_updating = True
+ self._children_list = ChildrenTab( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._location_context_button.GetValue(), self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = 4 )
self._dropdown_notebook.addTab( self._children_list, 'children' )
@@ -1628,6 +1774,10 @@ def _LocationContextJustChanged( self, location_context: ClientLocation.Location
self._SetTagService( top_local_tag_service_key )
+ self._children_list.SetLocationContext( location_context )
+
+ self._NotifyChildrenListNeedsUpdating()
+
self.locationChanged.emit( location_context )
self._SetListDirty()
@@ -1635,7 +1785,7 @@ def _LocationContextJustChanged( self, location_context: ClientLocation.Location
def _NotifyChildrenListNeedsUpdating( self ):
- self._children_list_needs_updating = True
+ self._children_list.NotifyNeedsUpdating()
self._UpdateChildrenListIfNeeded()
@@ -1737,7 +1887,6 @@ def _TagContextJustChanged( self, tag_context: ClientSearch.TagContext ):
self._favourites_list.SetTagServiceKey( self._tag_service_key )
self._children_list.SetTagServiceKey( self._tag_service_key )
- self._tags_to_children_cache = dict()
self._NotifyChildrenListNeedsUpdating()
self.tagServiceChanged.emit( self._tag_service_key )
@@ -1754,83 +1903,9 @@ def _TakeResponsibilityForEnter( self, shift_down ):
def _UpdateChildrenListIfNeeded( self ):
- if self._children_list_needs_updating and self._dropdown_notebook.currentWidget() == self._children_list:
-
- tag_service_key = self._tag_service_key
- context_tags = set( self._current_context_tags )
- tags_to_children_cache = dict( self._tags_to_children_cache )
-
- def work_callable():
-
- uncached_context_tags = { tag for tag in context_tags if tag not in tags_to_children_cache }
-
- if len( uncached_context_tags ) > 0:
-
- new_tags_to_children = CG.client_controller.Read( 'tag_descendants_lookup', tag_service_key, uncached_context_tags )
-
- else:
-
- new_tags_to_children = dict()
-
-
- child_tags = set()
-
- for tag in context_tags:
-
- if tag in tags_to_children_cache:
-
- child_tags.update( tags_to_children_cache[ tag ] )
-
- elif tag in new_tags_to_children:
-
- child_tags.update( new_tags_to_children[ tag ] )
-
-
-
- child_tags.difference_update( context_tags )
-
- return ( tag_service_key, child_tags, new_tags_to_children )
-
-
- def publish_callable( result ):
-
- ( job_tag_service_key, child_tags, new_tags_to_children ) = result
-
- if job_tag_service_key != self._tag_service_key:
-
- self._children_list.SetPredicates( [] )
-
- return
-
-
- child_tags = list( child_tags )
-
- self._tags_to_children_cache.update( new_tags_to_children )
-
- tag_sort = ClientTagSorting.TagSort( sort_type = ClientTagSorting.SORT_BY_HUMAN_TAG, sort_order = CC.SORT_ASC )
-
- ClientTagSorting.SortTags( tag_sort, child_tags )
-
- predicates = [ ClientSearch.Predicate( ClientSearch.PREDICATE_TYPE_TAG, value = tag ) for tag in child_tags ]
-
- self._children_list.SetPredicates( predicates, preserve_single_selection = True )
-
- self._children_list_needs_updating = False
-
-
- def errback_callable( etype, value, tb ):
-
- self._children_list.SetPredicates( [] )
-
- self._children_list_needs_updating = False
-
- HydrusData.ShowText( 'Trying to load some child tags failed, please send this to hydev:' )
- HydrusData.ShowExceptionTuple( etype, value, tb, do_wait = False )
-
-
- job = ClientGUIAsync.AsyncQtJob( self, work_callable, publish_callable, errback_callable = errback_callable )
+ if self._dropdown_notebook.currentWidget() == self._children_list:
- job.start()
+ self._children_list.UpdateChildrenIfNeeded( set( self._current_context_tags ) )
@@ -2292,7 +2367,7 @@ def _InitSearchResultsList( self ):
height_num_chars = self._fixed_results_list_height
- return ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._tag_service_key, self._float_mode, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = height_num_chars )
+ return ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_DISPLAY_ACTUAL, height_num_chars = height_num_chars )
def _LocationContextJustChanged( self, location_context: ClientLocation.LocationContext ):
@@ -3061,7 +3136,7 @@ def _InitSearchResultsList( self ):
height_num_chars = CG.client_controller.new_options.GetInteger( 'ac_write_list_height_num_chars' )
- preds_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._display_tag_service_key, self._float_mode, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE, height_num_chars = height_num_chars )
+ preds_list = ListBoxTagsPredicatesAC( self._dropdown_notebook, self.BroadcastChoices, self._float_mode, self._display_tag_service_key, tag_display_type = ClientTags.TAG_DISPLAY_STORAGE, height_num_chars = height_num_chars )
preds_list.SetExtraParentRowsAllowed( CG.client_controller.new_options.GetBoolean( 'expand_parents_on_storage_autocomplete_taglists' ) )
preds_list.SetParentDecoratorsAllowed( CG.client_controller.new_options.GetBoolean( 'show_parent_decorators_on_storage_autocomplete_taglists' ) )
diff --git a/hydrus/client/importing/ClientImportFileSeeds.py b/hydrus/client/importing/ClientImportFileSeeds.py
index 62b904907..15a101cc4 100644
--- a/hydrus/client/importing/ClientImportFileSeeds.py
+++ b/hydrus/client/importing/ClientImportFileSeeds.py
@@ -1023,7 +1023,7 @@ def Import( self, temp_path: str, file_import_options: FileImportOptions.FileImp
file_import_options = FileImportOptions.GetRealFileImportOptions( file_import_options, FileImportOptions.IMPORT_TYPE_LOUD )
- file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+ file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = self.file_seed_data )
file_import_status = file_import_job.DoWork( status_hook = status_hook )
diff --git a/hydrus/client/importing/ClientImportFiles.py b/hydrus/client/importing/ClientImportFiles.py
index 7d7c7c3b5..326db1418 100644
--- a/hydrus/client/importing/ClientImportFiles.py
+++ b/hydrus/client/importing/ClientImportFiles.py
@@ -106,7 +106,7 @@ def CheckFileImportStatus( file_import_status: FileImportStatus ) -> FileImportS
class FileImportJob( object ):
- def __init__( self, temp_path: str, file_import_options: FileImportOptions.FileImportOptions ):
+ def __init__( self, temp_path: str, file_import_options: FileImportOptions.FileImportOptions, human_file_description = None ):
if HG.file_import_report_mode:
@@ -120,6 +120,7 @@ def __init__( self, temp_path: str, file_import_options: FileImportOptions.FileI
self._temp_path = temp_path
self._file_import_options = file_import_options
+ self._human_file_description = human_file_description
self._pre_import_file_status = FileImportStatus.STATICGetUnknownStatus()
self._post_import_file_status = FileImportStatus.STATICGetUnknownStatus()
@@ -422,7 +423,7 @@ def GenerateInfo( self, status_hook = None ):
if raw_pil_image is None:
- raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+ raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path, human_file_description = self._human_file_description )
has_exif = HydrusImageMetadata.HasEXIF( raw_pil_image )
@@ -451,7 +452,7 @@ def GenerateInfo( self, status_hook = None ):
if raw_pil_image is None:
- raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path )
+ raw_pil_image = HydrusImageOpening.RawOpenPILImage( self._temp_path, human_file_description = self._human_file_description )
has_icc_profile = HydrusImageMetadata.HasICCProfile( raw_pil_image )
diff --git a/hydrus/client/networking/ClientLocalServerResources.py b/hydrus/client/networking/ClientLocalServerResources.py
index dad33b8b3..9655d5e52 100644
--- a/hydrus/client/networking/ClientLocalServerResources.py
+++ b/hydrus/client/networking/ClientLocalServerResources.py
@@ -45,6 +45,7 @@
from hydrus.client import ClientTime
from hydrus.client import ClientRendering
from hydrus.client import ClientImageHandling
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportFiles
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.media import ClientMedia
@@ -625,8 +626,8 @@ def ParseDuplicateSearch( request: HydrusServerRequest.HydrusRequest ):
file_search_context_1 = ClientSearch.FileSearchContext( location_context = location_context, tag_context = tag_context_1, predicates = predicates_1 )
file_search_context_2 = ClientSearch.FileSearchContext( location_context = location_context, tag_context = tag_context_2, predicates = predicates_2 )
- dupe_search_type = request.parsed_request_args.GetValue( 'potentials_search_type', int, default_value = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
- pixel_dupes_preference = request.parsed_request_args.GetValue( 'pixel_duplicates', int, default_value = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
+ dupe_search_type = request.parsed_request_args.GetValue( 'potentials_search_type', int, default_value = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH )
+ pixel_dupes_preference = request.parsed_request_args.GetValue( 'pixel_duplicates', int, default_value = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED )
max_hamming_distance = request.parsed_request_args.GetValue( 'max_hamming_distance', int, default_value = 4 )
return (
@@ -1434,7 +1435,7 @@ def _threadDoPOSTJob( self, request: HydrusServerRequest.HydrusRequest ):
file_import_options = CG.client_controller.new_options.GetDefaultFileImportOptions( FileImportOptions.IMPORT_TYPE_QUIET )
- file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options )
+ file_import_job = ClientImportFiles.FileImportJob( temp_path, file_import_options, human_file_description = 'API POSTed File' )
body_dict = {}
diff --git a/hydrus/client/networking/ClientNetworkingFunctions.py b/hydrus/client/networking/ClientNetworkingFunctions.py
index 1a1c7d0ac..5f9671960 100644
--- a/hydrus/client/networking/ClientNetworkingFunctions.py
+++ b/hydrus/client/networking/ClientNetworkingFunctions.py
@@ -519,7 +519,7 @@ def EnsureURLIsEncoded( url: str, keep_fragment = True ) -> str:
single_value_parameters = [ ensure_param_component_is_encoded( single_value_parameter ) for single_value_parameter in single_value_parameters ]
path = '/' + '/'.join( path_components )
- query = ConvertQueryDictToText( query_dict, single_value_parameters )
+ query = ConvertQueryDictToText( query_dict, single_value_parameters, param_order = param_order )
if not keep_fragment:
diff --git a/hydrus/core/HydrusConstants.py b/hydrus/core/HydrusConstants.py
index 4ba94a129..880b69939 100644
--- a/hydrus/core/HydrusConstants.py
+++ b/hydrus/core/HydrusConstants.py
@@ -105,7 +105,7 @@
# Misc
NETWORK_VERSION = 20
-SOFTWARE_VERSION = 574
+SOFTWARE_VERSION = 575
CLIENT_API_VERSION = 64
SERVER_THUMBNAIL_DIMENSIONS = ( 200, 200 )
@@ -992,6 +992,9 @@
MIMES_WITH_THUMBNAILS = set( IMAGES ).union( ANIMATIONS ).union( VIDEO ).union( APPLICATIONS_WITH_THUMBNAILS )
+# basically a flash or a clip or an svg or whatever can normally just have some janked-out resolution, so when testing such for thumbnail gen etc., we'll ignore applications
+MIMES_THAT_ALWAYS_HAVE_GOOD_RESOLUTION = set( IMAGES ).union( ANIMATIONS ).union( VIDEO )
+
FILES_THAT_CAN_HAVE_ICC_PROFILE = { IMAGE_BMP, IMAGE_JPEG, IMAGE_PNG, IMAGE_GIF, IMAGE_TIFF, APPLICATION_PSD }.union( PIL_HEIF_MIMES )
FILES_THAT_CAN_HAVE_EXIF = { IMAGE_JPEG, IMAGE_TIFF, IMAGE_PNG, IMAGE_WEBP }.union( PIL_HEIF_MIMES )
diff --git a/hydrus/core/HydrusData.py b/hydrus/core/HydrusData.py
index c32a7e1e9..e6bc98e29 100644
--- a/hydrus/core/HydrusData.py
+++ b/hydrus/core/HydrusData.py
@@ -845,10 +845,16 @@ def LastShutdownWasBad( db_path, instance ):
return False
+def MassExtend( iterables ):
+
+ return [ item for item in itertools.chain.from_iterable( iterables ) ]
+
+
def MassUnion( iterables ):
return { item for item in itertools.chain.from_iterable( iterables ) }
+
def MedianPop( population ):
# assume it has at least one and comes sorted
diff --git a/hydrus/core/HydrusSerialisable.py b/hydrus/core/HydrusSerialisable.py
index 5a1785728..d58703bb1 100644
--- a/hydrus/core/HydrusSerialisable.py
+++ b/hydrus/core/HydrusSerialisable.py
@@ -142,6 +142,10 @@
SERIALISABLE_TYPE_STRING_JOINER = 125
SERIALISABLE_TYPE_FILE_FILTER = 126
SERIALISABLE_TYPE_URL_CLASS_PARAMETER_FIXED_NAME = 127
+SERIALISABLE_AUTO_DUPLICATES_RULE = 128
+SERIALISABLE_AUTO_DUPLICATES_PAIR_SELECTOR_AND_COMPARATOR = 129
+SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_ONE_FILE = 130
+SERIALISABLE_AUTO_DUPLICATES_PAIR_COMPARATOR_RULE_TWO_FILES = 131
SERIALISABLE_TYPES_TO_OBJECT_TYPES = {}
diff --git a/hydrus/core/files/HydrusAnimationHandling.py b/hydrus/core/files/HydrusAnimationHandling.py
index 31ec30d35..edabdb57f 100644
--- a/hydrus/core/files/HydrusAnimationHandling.py
+++ b/hydrus/core/files/HydrusAnimationHandling.py
@@ -199,9 +199,9 @@ def GetAPNGDurationAndNumFrames( path ):
return ( duration_in_ms, num_frames )
-def GetFrameDurationsPILAnimation( path ):
+def GetFrameDurationsPILAnimation( path, human_file_description = None ):
- pil_image = HydrusImageOpening.RawOpenPILImage( path )
+ pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
times_to_play = GetTimesToPlayPILAnimationFromPIL( pil_image )
@@ -301,11 +301,11 @@ def GetTimesToPlayAPNG( path: str ) -> int:
return num_plays
-def GetTimesToPlayPILAnimation( path ) -> int:
+def GetTimesToPlayPILAnimation( path, human_file_description = None ) -> int:
try:
- pil_image = HydrusImageOpening.RawOpenPILImage( path )
+ pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
except HydrusExceptions.UnsupportedFileException:
diff --git a/hydrus/core/files/HydrusOfficeOpenXMLHandling.py b/hydrus/core/files/HydrusOfficeOpenXMLHandling.py
index f3ca35946..fcdda511a 100644
--- a/hydrus/core/files/HydrusOfficeOpenXMLHandling.py
+++ b/hydrus/core/files/HydrusOfficeOpenXMLHandling.py
@@ -93,7 +93,7 @@ def PowerPointResolution( path: str ):
file = GetZipAsPath( path, 'ppt/presentation.xml' ).open( 'rb' )
root = ET.parse( file )
-
+
sldSz = root.find('./p:sldSz', {'p': 'http://schemas.openxmlformats.org/presentationml/2006/main'})
x_emu = int(sldSz.get('cx'))
diff --git a/hydrus/core/files/images/HydrusImageHandling.py b/hydrus/core/files/images/HydrusImageHandling.py
index 1f02eef14..379d80c74 100644
--- a/hydrus/core/files/images/HydrusImageHandling.py
+++ b/hydrus/core/files/images/HydrusImageHandling.py
@@ -142,7 +142,11 @@ def ClipPILImage( pil_image: PILImage.Image, clip_rect ):
return pil_image.crop( box = ( x, y, x + clip_width, y + clip_height ) )
-def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
+FORCE_PIL_ALWAYS = True
+
+def GenerateNumPyImage( path, mime, force_pil = False, human_file_description = None ) -> numpy.array:
+
+ force_pil = force_pil or FORCE_PIL_ALWAYS
if HG.media_load_report_mode:
@@ -180,7 +184,7 @@ def GenerateNumPyImage( path, mime, force_pil = False ) -> numpy.array:
if not force_pil:
- pil_image = HydrusImageOpening.RawOpenPILImage( path )
+ pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
if pil_image.mode == 'LAB':
@@ -277,9 +281,9 @@ def GenerateNumPyImageFromPILImage( pil_image: PILImage.Image, strip_useless_alp
return numpy_image
-def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = True ) -> PILImage.Image:
+def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = True, human_file_description = None ) -> PILImage.Image:
- pil_image = HydrusImageOpening.RawOpenPILImage( path )
+ pil_image = HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
try:
@@ -287,7 +291,7 @@ def GeneratePILImage( path: typing.Union[ str, typing.BinaryIO ], dequantize = T
if dequantize:
- if pil_image.mode in ( 'I', 'F' ):
+ if pil_image.mode in ( 'I', 'I;16', 'I;16L', 'I;16B', 'I;16N', 'F' ):
# 'I' = greyscale, uint16
# 'F' = float, np.float32
@@ -622,7 +626,7 @@ def GetThumbnailResolution( image_resolution: typing.Tuple[ int, int ], bounding
return ( thumbnail_width, thumbnail_height )
-def IsDecompressionBomb( path ) -> bool:
+def IsDecompressionBomb( path, human_file_description = None ) -> bool:
# there are two errors here, the 'Warning' and the 'Error', which atm is just a test vs a test x 2 for number of pixels
# 256MB bmp by default, ( 1024 ** 3 ) // 4 // 3
@@ -634,7 +638,7 @@ def IsDecompressionBomb( path ) -> bool:
try:
- HydrusImageOpening.RawOpenPILImage( path )
+ HydrusImageOpening.RawOpenPILImage( path, human_file_description = human_file_description )
except ( PILImage.DecompressionBombError ):
diff --git a/hydrus/core/files/images/HydrusImageNormalisation.py b/hydrus/core/files/images/HydrusImageNormalisation.py
index a12a44538..17d7324a6 100644
--- a/hydrus/core/files/images/HydrusImageNormalisation.py
+++ b/hydrus/core/files/images/HydrusImageNormalisation.py
@@ -155,7 +155,7 @@ def NormaliseICCProfilePILImageToSRGB( pil_image: PILImage.Image ) -> PILImage.I
src_profile = PILImageCms.ImageCmsProfile( f )
- if pil_image.mode in ( 'I', 'F', 'L', 'LA', 'P' ):
+ if pil_image.mode in ( 'I', 'I;16', 'I;16L', 'I;16B', 'I;16N', 'F', 'L', 'LA', 'P' ):
# had a bunch of LA pngs that turned pure white on RGBA ICC conversion
# but seem to work fine if keep colourspace the same for now
diff --git a/hydrus/core/files/images/HydrusImageOpening.py b/hydrus/core/files/images/HydrusImageOpening.py
index 00cb589c9..e56b1bc38 100644
--- a/hydrus/core/files/images/HydrusImageOpening.py
+++ b/hydrus/core/files/images/HydrusImageOpening.py
@@ -3,20 +3,33 @@
from hydrus.core import HydrusExceptions
-def RawOpenPILImage( path: typing.Union[ str, typing.BinaryIO ] ) -> PILImage.Image:
+def RawOpenPILImage( path: typing.Union[ str, typing.BinaryIO ], human_file_description = None ) -> PILImage.Image:
try:
pil_image = PILImage.open( path )
- except Exception as e:
+ if pil_image is None:
+
+ raise Exception( 'PIL returned None.' )
+
- raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' ) from e
+ except Exception as e:
-
- if pil_image is None:
+ if human_file_description is not None:
+
+ message = f'Could not load the image at "{human_file_description}"--it was likely malformed!'
+
+ elif isinstance( path, str ):
+
+ message = f'Could not load the image at "{path}"--it was likely malformed!'
+
+ else:
+
+ message = 'Could not load the image, which had no path (so was probably from inside another file?)--it was likely malformed!'
+
- raise HydrusExceptions.DamagedOrUnusualFileException( f'Could not load the image at "{path}"--it was likely malformed!' )
+ raise HydrusExceptions.DamagedOrUnusualFileException( message ) from e
return pil_image
diff --git a/hydrus/test/TestClientAPI.py b/hydrus/test/TestClientAPI.py
index ec48b66a2..b2fabf5f1 100644
--- a/hydrus/test/TestClientAPI.py
+++ b/hydrus/test/TestClientAPI.py
@@ -26,6 +26,7 @@
from hydrus.client import ClientLocation
from hydrus.client import ClientServices
from hydrus.client import ClientTime
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportFiles
from hydrus.client.media import ClientMediaManagers
from hydrus.client.media import ClientMediaResult
@@ -4172,8 +4173,8 @@ def _test_manage_duplicates( self, connection, set_up_permissions ):
default_file_search_context = ClientSearch.FileSearchContext( location_context = default_location_context, tag_context = tag_context, predicates = predicates )
- default_potentials_search_type = CC.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
- default_pixel_duplicates = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ default_potentials_search_type = ClientDuplicates.DUPE_SEARCH_ONE_FILE_MATCHES_ONE_SEARCH
+ default_pixel_duplicates = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
default_max_hamming_distance = 4
test_tag_service_key_1 = CC.DEFAULT_LOCAL_TAG_SERVICE_KEY
@@ -4192,8 +4193,8 @@ def _test_manage_duplicates( self, connection, set_up_permissions ):
test_file_search_context_2 = ClientSearch.FileSearchContext( location_context = default_location_context, tag_context = test_tag_context_2, predicates = test_predicates_2 )
- test_potentials_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES
- test_pixel_duplicates = CC.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED
+ test_potentials_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_DIFFERENT_SEARCHES
+ test_pixel_duplicates = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_EXCLUDED
test_max_hamming_distance = 8
# get count
diff --git a/hydrus/test/TestClientDBDuplicates.py b/hydrus/test/TestClientDBDuplicates.py
index 8e8e117a8..d237709e6 100644
--- a/hydrus/test/TestClientDBDuplicates.py
+++ b/hydrus/test/TestClientDBDuplicates.py
@@ -10,6 +10,7 @@
from hydrus.client import ClientConstants as CC
from hydrus.client import ClientLocation
from hydrus.client.db import ClientDB
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.importing import ClientImportFiles
from hydrus.client.importing.options import FileImportOptions
from hydrus.client.metadata import ClientContentUpdates
@@ -122,9 +123,9 @@ def _import_and_find_dupes( self ):
def _test_initial_state( self ):
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -175,9 +176,9 @@ def _test_initial_better_worse( self ):
self._our_main_dupe_group_hashes.add( self._dupe_hashes[2] )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -263,9 +264,9 @@ def _test_initial_king_usurp( self ):
self._our_main_dupe_group_hashes.add( self._king_hash )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -330,9 +331,9 @@ def _test_initial_same_quality( self ):
self._our_main_dupe_group_hashes.add( self._dupe_hashes[5] )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -508,9 +509,9 @@ def _test_poach_same( self ):
self._write( 'duplicate_pair_status', [ row ] )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -592,9 +593,9 @@ def _test_group_merge( self ):
self._write( 'duplicate_pair_status', rows )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -636,9 +637,9 @@ def _test_establish_false_positive_group( self ):
self._write( 'duplicate_pair_status', rows )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -656,9 +657,9 @@ def _test_false_positive( self ):
self._write( 'duplicate_pair_status', [ row ] )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -712,9 +713,9 @@ def _test_establish_alt_group( self ):
self._write( 'duplicate_pair_status', rows )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -732,9 +733,9 @@ def _test_alt( self ):
self._write( 'duplicate_pair_status', [ row ] )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -794,9 +795,9 @@ def _test_expand_false_positive( self ):
self._write( 'duplicate_pair_status', rows )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
@@ -857,9 +858,9 @@ def _test_expand_alt( self ):
self._write( 'duplicate_pair_status', rows )
- pixel_dupes_preference = CC.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
+ pixel_dupes_preference = ClientDuplicates.SIMILAR_FILES_PIXEL_DUPES_ALLOWED
max_hamming_distance = 4
- dupe_search_type = CC.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
+ dupe_search_type = ClientDuplicates.DUPE_SEARCH_BOTH_FILES_MATCH_ONE_SEARCH
num_potentials = self._read( 'potential_duplicates_count', self._file_search_context_1, self._file_search_context_2, dupe_search_type, pixel_dupes_preference, max_hamming_distance )
diff --git a/hydrus/test/TestClientNetworking.py b/hydrus/test/TestClientNetworking.py
index d2774d960..05b8b4ecd 100644
--- a/hydrus/test/TestClientNetworking.py
+++ b/hydrus/test/TestClientNetworking.py
@@ -357,6 +357,13 @@ def test_encoding( self ):
self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( human_url_with_mix ), encoded_url_with_mix )
self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( encoded_url_with_mix ), encoded_url_with_mix )
+ # double-check we don't auto-alphabetise params in this early stage! we screwed this up before and broke that option
+ human_url_with_mix = 'https://grunky.site/post?b=5 5&a=1 1'
+ encoded_url_with_mix = 'https://grunky.site/post?b=5%205&a=1%201'
+
+ self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( human_url_with_mix ), encoded_url_with_mix )
+ self.assertEqual( ClientNetworkingFunctions.EnsureURLIsEncoded( encoded_url_with_mix ), encoded_url_with_mix )
+
def test_defaults( self ):
diff --git a/hydrus/test/TestHydrusSerialisable.py b/hydrus/test/TestHydrusSerialisable.py
index e88cf0533..34660c9e4 100644
--- a/hydrus/test/TestHydrusSerialisable.py
+++ b/hydrus/test/TestHydrusSerialisable.py
@@ -8,9 +8,8 @@
from hydrus.client import ClientApplicationCommand as CAC
from hydrus.client import ClientConstants as CC
-from hydrus.client import ClientData
from hydrus.client import ClientDefaults
-from hydrus.client import ClientDuplicates
+from hydrus.client.duplicates import ClientDuplicates
from hydrus.client.gui import ClientGUIShortcuts
from hydrus.client.importing import ClientImportSubscriptions
from hydrus.client.importing import ClientImportSubscriptionQuery
diff --git a/static/default/parsers/danbooru file page parser - get webm ugoira.png b/static/default/parsers/danbooru file page parser - get webm ugoira.png
index bac3157c6..31bd0de3a 100644
Binary files a/static/default/parsers/danbooru file page parser - get webm ugoira.png and b/static/default/parsers/danbooru file page parser - get webm ugoira.png differ
diff --git a/static/default/parsers/danbooru file page parser.png b/static/default/parsers/danbooru file page parser.png
index 78dc40499..2d055c668 100644
Binary files a/static/default/parsers/danbooru file page parser.png and b/static/default/parsers/danbooru file page parser.png differ
diff --git a/static/default/url_classes/fixupx tweet.png b/static/default/url_classes/fixupx tweet.png
new file mode 100644
index 000000000..bcf2b68c7
Binary files /dev/null and b/static/default/url_classes/fixupx tweet.png differ
diff --git a/static/default/url_classes/fixvx tweet.png b/static/default/url_classes/fixvx tweet.png
new file mode 100644
index 000000000..d007a1a34
Binary files /dev/null and b/static/default/url_classes/fixvx tweet.png differ
diff --git a/static/default/url_classes/fxtwitter tweet.png b/static/default/url_classes/fxtwitter tweet.png
new file mode 100644
index 000000000..6cedefba8
Binary files /dev/null and b/static/default/url_classes/fxtwitter tweet.png differ
diff --git a/static/default/url_classes/vxtwitter api status (with username).png b/static/default/url_classes/vxtwitter api status (with username).png
new file mode 100644
index 000000000..4dbfa9f32
Binary files /dev/null and b/static/default/url_classes/vxtwitter api status (with username).png differ
diff --git a/static/default/url_classes/vxtwitter api status.png b/static/default/url_classes/vxtwitter api status.png
new file mode 100644
index 000000000..7bfdcdb69
Binary files /dev/null and b/static/default/url_classes/vxtwitter api status.png differ
diff --git a/static/default/url_classes/vxtwitter tweet.png b/static/default/url_classes/vxtwitter tweet.png
new file mode 100644
index 000000000..9407f3c6f
Binary files /dev/null and b/static/default/url_classes/vxtwitter tweet.png differ
diff --git a/static/default/url_classes/x post.png b/static/default/url_classes/x post.png
index 5475b40c9..7f0ae94d3 100644
Binary files a/static/default/url_classes/x post.png and b/static/default/url_classes/x post.png differ