Compare commits

...

160 Commits

Author SHA1 Message Date
Alex Ling eca47e3d32 Update README config example 2022-02-11 14:28:05 +00:00
Alex Ling ab3386546d Remove db_optimization from README 2022-02-06 06:39:59 +00:00
Alex Ling 857c11be85 Enable metadata cache by default 2022-02-06 06:39:46 +00:00
Alex Ling b3ea3c6154 Remove unnecessary type restrictions in config 2022-02-06 06:28:39 +00:00
Alex Ling 84168b4f53 Update config example in README 2022-02-06 06:28:10 +00:00
Alex Ling 59528de44d Remove mangadex entry from config 2022-02-06 06:18:09 +00:00
Alex Ling a29d6754e8 Expand paths in config (closes #277) 2022-02-06 06:17:42 +00:00
Alex Ling 167e207fad Bump version to 0.25.0 2022-01-26 12:12:01 +00:00
Alex Ling 3b52d72ebf Merge branch 'master' into rc/0.25.0 2022-01-26 12:11:06 +00:00
Alex Ling dc5edc0c1b Merge pull request #272 from hkalexling/all-contributors/add-nduja
docs: add nduja as a contributor for code
2022-01-26 20:09:32 +08:00
allcontributors[bot] 7fa8ffa0bd docs: update .all-contributorsrc [skip ci] 2022-01-26 11:52:12 +00:00
allcontributors[bot] 85b57672e6 docs: update README.md [skip ci] 2022-01-26 11:52:11 +00:00
Alex Ling 9b111b0ee8 Ignore thumbnail progress in cache (fixes #270) 2022-01-26 09:10:47 +00:00
Alex Ling 8b1c301950 Merge pull request #269 from hkalexling/all-contributors/add-BradleyDS2
docs: add BradleyDS2 as a contributor for doc
2022-01-25 14:01:02 +08:00
allcontributors[bot] 3df4675dd7 docs: update .all-contributorsrc [skip ci] 2022-01-25 05:14:09 +00:00
allcontributors[bot] 312de0e7b5 docs: update README.md [skip ci] 2022-01-25 05:14:08 +00:00
Alex Ling d57ccc8f81 Merge pull request #264 from BradleyDS2/patch-1
Update README.md
2022-01-25 13:13:59 +08:00
Alex Ling fea6c04c4f Fix actions on download manager (fixes #266) 2022-01-24 14:24:35 +00:00
Alex Ling 77df418390 Compare with DB when loading library cache (fixes #256) 2022-01-24 14:18:52 +00:00
Alex Ling 750fbbb8fe Delete cache when dir mismatch (fixes #265) 2022-01-24 13:25:55 +00:00
BradleyDS2 cfe46b435d Update README.md
Fix typo: 'thrid' to 'third'
2022-01-24 00:08:18 +11:00
Alex Ling b2329a79b4 Gracefully handle nullish fields 2022-01-18 15:02:16 +00:00
Alex Ling 2007f13ed6 Merge pull request #259 from Leeingnyo/feature/custom-sort-title-and-sorting-titles
Implement custom sort title and sorting titles
2022-01-15 20:56:16 +08:00
Alex Ling f70be435f9 Merge branch 'dev' into feature/custom-sort-title-and-sorting-titles 2022-01-15 20:30:56 +08:00
Alex Ling 1b32dc3de9 Add sort title to API response 2022-01-15 12:09:03 +00:00
Alex Ling b83ccf1ccc Fix down SQL 2022-01-15 12:08:23 +00:00
Alex Ling a68783aa21 Merge pull request #261 from nduja/feature/right-to-left
Feature/right to left
2022-01-06 16:54:47 +08:00
Robbo 86beed0c5f Cast RightToLeft value to boolean when retrieving from local storage. 2022-01-05 19:04:40 +11:00
Robbo b6c8386caf Merge branch 'dev' into feature/right-to-left 2022-01-04 21:01:56 +11:00
Robbo 27cc669012 Fix Right to Left for keyboard input 2022-01-04 20:43:55 +11:00
Robbo 4b302af2a1 Add Right to Left option to Paged viewing mode 2022-01-04 00:20:52 +11:00
Alex Ling ab29a9eb80 Fix down SQL for removing columns 2021-12-31 14:33:49 +00:00
Leeingnyo e7538bb7f2 Use val(), Remove callbacks after modal hidden 2021-12-26 05:56:45 +09:00
Leeingnyo ecaec307d6 Fix title sort bug, invalidate titles of the Library
Refactor remove cache
2021-12-26 05:15:21 +09:00
Leeingnyo b711072492 Fix lint 2021-12-26 04:10:03 +09:00
Leeingnyo 0f94288bab Avoid N+1 queries problem 2021-12-26 03:29:41 +09:00
Leeingnyo bd2ed1b338 Allow restoring the default display name with an empty input
Add a placeholder for the default display name
Remove some console.log() calls
2021-12-26 03:29:23 +09:00
Leeingnyo 1cd777d27d Cache sorted titles 2021-12-26 02:56:57 +09:00
Leeingnyo 1ec8dcbfda Use sort_title, sort_titles in title page 2021-12-26 02:55:52 +09:00
Leeingnyo 8fea35fa51 Use sorted_titles 2021-12-26 00:27:00 +09:00
Leeingnyo 234b29bbdd Fix save 2021-12-25 23:05:12 +09:00
Leeingnyo edfef80e5c Invalidate sort result cache after changing sort_title 2021-12-25 23:05:12 +09:00
Leeingnyo 45ffa3d428 Implement UI to edit sort title 2021-12-25 23:05:12 +09:00
Leeingnyo 162318cf4a Add Api call 2021-12-25 23:05:12 +09:00
Leeingnyo d4b58e91d1 Implement sort title api 2021-12-25 22:43:35 +09:00
Leeingnyo 546bd0138c Use sort_title instead of title 2021-12-25 22:43:35 +09:00
Leeingnyo ab799af866 Implement sort_title getter, setter 2021-12-25 22:43:35 +09:00
Leeingnyo 3a932d7b0a Add column 'sort_title' to titles, ids table 2021-12-25 21:49:30 +09:00
Leeingnyo 57683d1cfb Add sort option "Name" for title 2021-12-24 16:44:07 +09:00
Alex Ling d7afd0969a Merge pull request #258 from Leeingnyo/fix/fix-bug-on-scan
Fix bug on scanning
2021-12-22 20:33:12 +08:00
Alex Ling 4eda55552b Linter fix 2021-12-22 12:16:44 +00:00
Leeingnyo f9254c49a1 Fix lint error 2021-12-19 17:14:47 +09:00
Leeingnyo 6d834e9164 Fix formatting 2021-12-19 17:03:39 +09:00
Leeingnyo 70259d8e50 Do the same with an entry 2021-12-19 17:03:10 +09:00
Leeingnyo 0fa2bfa744 Fix bug on examine 2021-12-19 16:40:38 +09:00
Leeingnyo cc33fa6595 Fix bug: remove titles not in root library anymore 2021-12-19 16:31:43 +09:00
Alex Ling 921628ba6d Limit max length in download table (fixes #244) 2021-11-17 13:10:44 +00:00
Alex Ling 1199eb7a03 Use mobile menu at @m (fixes #246) 2021-11-16 13:37:19 +00:00
Alex Ling f075511847 Merge pull request #245 from hkalexling/feature/login-api
Add endpoint `/api/login`
2021-10-11 13:20:34 +08:00
Alex Ling 80344c3bf0 Add endpoint /api/login 2021-10-08 10:07:40 +00:00
Alex Ling 8a732804ae Merge pull request #232 from hkalexling/rc/0.24.0
v0.24.0
2021-09-25 13:39:48 +08:00
Alex Ling 9df372f784 Merge branch 'dev' into rc/0.24.0 2021-09-23 07:52:58 +00:00
Alex Ling cf7431b8b6 Merge pull request #236 from hkalexling/fix/shallow-api-fix
Use `depth` instead of `shallow` in API
2021-09-23 15:52:14 +08:00
Alex Ling 974b6cfe9b Use depth instead of shallow in API 2021-09-22 09:17:14 +00:00
Alex Ling 4fbe5b471c Update README 2021-09-20 01:36:31 +00:00
Alex Ling 33e7e31fbc Bump version to 0.24.0 2021-09-20 01:11:26 +00:00
Alex Ling 72fae7f5ed Fix typo cbz -> gz 2021-09-18 12:41:25 +00:00
Alex Ling f50a7e3b3e Merge branch 'dev' into rc/0.24.0 2021-09-18 12:40:56 +00:00
Alex Ling 66c4037f2b Merge pull request #233 from Leeingnyo/feature/avoid-unnecessary-sort
Avoid unnecessary sorting
2021-09-18 20:40:25 +08:00
Leeingnyo 2c022a07e7 Avoid unnecessary sorts when getting deep percentage
This makes page loading faster
2021-09-18 18:49:29 +09:00
Alex Ling 91362dfc7d Merge branch 'master' into rc/0.24.0 2021-09-18 09:20:59 +00:00
Alex Ling 97168b65d8 Make library cache path configurable 2021-09-18 08:40:08 +00:00
Alex Ling 6e04e249e7 Merge pull request #229 from Leeingnyo/feature/preserve-scanned-titles
Reuse scanned titles on boot and scanning
2021-09-18 16:14:31 +08:00
Alex Ling 16397050dd Update comments 2021-09-18 02:24:50 +00:00
Alex Ling 3f73591dd4 Update comments 2021-09-18 02:14:22 +00:00
Alex Ling ec25109fa5 Merge branch 'feature/preserve-scanned-titles' of https://github.com/Leeingnyo/Mango into feature/preserve-scanned-titles 2021-09-18 02:04:02 +00:00
Alex Ling 96f1ef3dde Improve comments on examine 2021-09-18 02:00:10 +00:00
Leeingnyo b56e16e1e1 Remove counter, yield every time 2021-09-18 10:59:43 +09:00
Leeingnyo 9769e760a0 Pass a counter to recursive calls, Ignore negative threshold 2021-09-16 07:49:12 +09:00
Leeingnyo 70ab198a33 Add config 'forcely_yield_count'
The default value of 1000 makes a fiber yield roughly every 4 ms on an SSD

Apply the yield counter in Dir.contents_signature
Use the contents_signature cache in Title.new
2021-09-16 00:16:26 +09:00
Alex Ling 44a6f822cd Simplify Title.new 2021-09-15 09:00:30 +00:00
Alex Ling 2c241a96bb Merge branch 'dev' into feature/preserve-scanned-titles 2021-09-15 08:58:24 +00:00
Alex Ling 219d4446d1 Merge pull request #231 from hkalexling/feature/api-improvements
API Improvements
2021-09-15 16:54:41 +08:00
Alex Ling d330db131e Simplify mark_unavailable 2021-09-15 08:46:30 +00:00
Leeingnyo de193906a2 Refactor mark_unavailable 2021-09-15 16:54:55 +09:00
Leeingnyo d13cfc045f Add a comment 2021-09-15 01:27:05 +09:00
Leeingnyo a3b2cdd372 Lint 2021-09-15 01:17:44 +09:00
Leeingnyo f4d7128b59 Mark unavailable only in candidates 2021-09-14 23:30:03 +09:00
Leeingnyo 663c0c0b38 Remove nested title including self 2021-09-14 23:28:28 +09:00
Leeingnyo 57b2f7c625 Get nested ids when title removed 2021-09-14 23:08:07 +09:00
Leeingnyo 9489d6abfd Use reference instead of primitive 2021-09-14 23:07:47 +09:00
Leeingnyo 670cf54957 Apply yield forcibly 2021-09-14 22:51:37 +09:00
Leeingnyo 2e09efbd62 Collect deleted ids 2021-09-14 22:51:25 +09:00
Leeingnyo 523195d649 Define ExamineContext, apply it when scanning 2021-09-14 22:37:30 +09:00
Leeingnyo be47f309b0 Use cache when calculating contents_signature 2021-09-14 18:11:08 +09:00
Alex Ling 03e044a1aa Improve logging 2021-09-14 07:16:14 +00:00
Alex Ling 4eaf271fa4 Simplify #json_build 2021-09-14 02:30:57 +00:00
Alex Ling 4b464ed361 Allow sorting in /api/book endpoint 2021-09-13 10:58:07 +00:00
Alex Ling a9520d6f26 Add shallow option to library API endpoints 2021-09-13 10:18:07 +00:00
Leeingnyo a151ec486d Fix file extension of gzip file 2021-09-12 18:04:41 +09:00
Leeingnyo 8f1383a818 Use Gzip instead of Zip 2021-09-12 18:01:16 +09:00
Leeingnyo f5933a48d9 Register mime_type scan, thumbnails when loading instance 2021-09-12 17:40:40 +09:00
Leeingnyo 7734dae138 Remove unnecessary sort 2021-09-12 14:36:17 +09:00
Leeingnyo 8c90b46114 Remove removed titles from title_hash 2021-09-12 13:39:28 +09:00
Leeingnyo cd48b45f11 Add 'require "yaml"' 2021-09-12 12:45:24 +09:00
Leeingnyo bdbdf9c94b Fix to pass 'make check', fix comments 2021-09-12 11:09:48 +09:00
Leeingnyo 7e36c91ea7 Remove debug print 2021-09-12 10:47:15 +09:00
Leeingnyo 9309f51df6 Memoization on dir contents_signature 2021-09-12 02:19:49 +09:00
Leeingnyo a8f729f5c1 Sort entries and titles only when needed 2021-09-12 02:19:49 +09:00
Leeingnyo 4e8b561f70 Apply contents signature of directories 2021-09-12 02:19:49 +09:00
Leeingnyo e6214ddc5d Rescan only if instance loaded 2021-09-12 02:19:49 +09:00
Leeingnyo 80e13abc4a Spawn scan job 2021-09-12 02:14:58 +09:00
Leeingnyo fb43abb950 Enhance the examine method 2021-09-12 02:14:58 +09:00
Leeingnyo eb3e37b950 Examine titles and recycle them 2021-09-12 02:14:58 +09:00
Leeingnyo 0a90e3b333 Ignore caches 2021-09-12 02:14:58 +09:00
Leeingnyo 4409ed8f45 Implement save_instance, load_instance 2021-09-12 02:14:58 +09:00
Leeingnyo 291a340cdd Add yaml serializer to Library, Title, Entry 2021-09-12 02:14:58 +09:00
Leeingnyo 0667f01471 Measure scan only 2021-09-12 02:14:58 +09:00
Alex Ling d5847bb105 Merge pull request #224 from hkalexling/fix/sanitize-download-filename
Stricter sanitization rules for download filenames
2021-09-09 08:47:41 +08:00
Alex Ling 3d295e961e Merge branch 'dev' into fix/sanitize-download-filename 2021-09-09 00:31:24 +00:00
Alex Ling e408398523 Merge pull request #227 from hkalexling/all-contributors/add-lincolnthedev
docs: add lincolnthedev as a contributor for infra
2021-09-09 08:18:22 +08:00
Alex Ling 566cebfcdd Remove all leading dots and spaces 2021-09-09 00:13:58 +00:00
Alex Ling a190ae3ed6 Merge pull request #226 from lincolnthedev/master
Better .dockerignore
2021-09-08 20:42:12 +08:00
allcontributors[bot] 17d7cefa12 docs: update .all-contributorsrc [skip ci] 2021-09-08 12:42:10 +00:00
allcontributors[bot] eaef0556fa docs: update README.md [skip ci] 2021-09-08 12:42:09 +00:00
i use arch btw 53226eab61 Forgot .github 2021-09-08 07:58:58 -04:00
Alex Ling ccf558eaa7 Improve filename sanitization rules 2021-09-08 10:03:05 +00:00
Alex Ling 0305433e46 Merge pull request #225 from hkalexling/feature/support-all-image-types
Support additional image formats (resolves #192)
2021-09-08 11:03:59 +08:00
i use arch btw d2cad6c496 Update .dockerignore 2021-09-07 21:12:51 -04:00
Alex Ling 371796cce9 Support additional image formats:
- APNG
- AVIF
- GIF
- SVG
2021-09-07 11:04:05 +00:00
Alex Ling d9adb49c27 Revert "Support all image types (resolves #192)"
This reverts commit f67e4e6cb9.
2021-09-07 10:45:59 +00:00
Alex Ling f67e4e6cb9 Support all image types (resolves #192) 2021-09-06 13:32:10 +00:00
Alex Ling 60a126024c Stricter sanitization rules for download filenames
Fixes #212
2021-09-06 13:01:05 +00:00
Alex Ling da8a485087 Merge pull request #222 from Leeingnyo/feature/enhance-loading-library
Improve loading pages (library, titles)
2021-09-06 16:49:19 +08:00
Alex Ling d809c21ee1 Document CacheEntry 2021-09-06 08:23:54 +00:00
Alex Ling ca1e221b10 Rename ids2entries -> ids_to_entries 2021-09-06 08:23:31 +00:00
Alex Ling 44d9c51ff9 Fix logging 2021-09-06 08:10:42 +00:00
Alex Ling 15a54f4f23 Add :sorted_entries suffix to gen_key 2021-09-06 08:10:13 +00:00
Alex Ling 51806f18db Rename config fields and improve logging 2021-09-06 03:35:46 +00:00
Alex Ling 79ef7bcd1c Remove unused variable 2021-09-06 03:01:21 +00:00
Leeingnyo 5cb85ea857 Set cached data when changed 2021-09-06 09:41:46 +09:00
Leeingnyo 9807db6ac0 Fix bug on entry_cover_url_cache 2021-09-06 02:32:13 +09:00
Leeingnyo 565a535d22 Remove caching verbosely, add cached_cover_url 2021-09-06 02:32:13 +09:00
Alex Ling c5b6a8b5b9 Improve instance_size for Tuple 2021-09-05 13:57:20 +00:00
Leeingnyo c75c71709f make check 2021-09-05 11:21:53 +09:00
Leeingnyo 11976b15f9 Make LRUCache togglable 2021-09-05 03:02:20 +09:00
Leeingnyo 847f516a65 Cache TitleInfo using LRUCache 2021-09-05 02:35:44 +09:00
Leeingnyo de410f42b8 Replace InfoCache with LRUCache 2021-09-05 02:08:11 +09:00
Leeingnyo 0fd7caef4b Rename 2021-09-05 00:02:05 +09:00
Leeingnyo 5e919d3e19 Make entry generic 2021-09-04 23:56:17 +09:00
Leeingnyo 9e90aa17b9 Move entry specific method 2021-09-04 14:37:05 +09:00
Leeingnyo 0a8fd993e5 Use bytesize and add comments 2021-09-03 11:11:28 +09:00
Leeingnyo 365f71cd1d Change kbs to mbs 2021-08-30 23:10:08 +09:00
Leeingnyo 601346b209 Set cache if enabled 2021-08-30 23:07:59 +09:00
Leeingnyo e988a8c121 Add config for sorted entries cache
optional
2021-08-30 22:59:23 +09:00
Leeingnyo bf81a4e48b Implement sorted entries cache
sorted_entries cached
2021-08-30 22:58:40 +09:00
Leeingnyo 4a09aee177 Implement library caching TitleInfo
* Cache sum of entry progress
* Cache cover_url
* Cache display_name
* Cache sort_opt
2021-08-30 11:31:45 +09:00
Leeingnyo 00c9cc1fcd Prevent saving a sort opt unnecessarily 2021-08-30 11:31:45 +09:00
Leeingnyo 51a47b5ddd Cache display_name 2021-08-30 11:31:45 +09:00
Leeingnyo 244f97a68e Cache entries' cover_url 2021-08-30 08:24:40 +09:00
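Several commits above ("Cache TitleInfo using LRUCache", "Make LRUCache togglable", "Use bytesize and add comments", "Change kbs to mbs") introduce a size-bounded LRU cache for library metadata. A minimal sketch of that idea, assuming an LRU cache keyed by string with a rough byte budget — the class name, API, and sizing are illustrative, not Mango's actual Crystal implementation:

```javascript
// LRU cache with a byte budget: a Map preserves insertion order, so the
// first entry is always the least recently used one.
class LRUCache {
  constructor(maxBytes) {
    this.maxBytes = maxBytes;
    this.bytes = 0;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const entry = this.map.get(key);
    // re-insert to mark the entry as most recently used
    this.map.delete(key);
    this.map.set(key, entry);
    return entry.value;
  }
  set(key, value, size) {
    if (this.map.has(key)) {
      this.bytes -= this.map.get(key).size;
      this.map.delete(key);
    }
    this.map.set(key, { value, size });
    this.bytes += size;
    // evict least recently used entries until back under budget
    while (this.bytes > this.maxBytes) {
      const [oldKey, oldEntry] = this.map.entries().next().value;
      this.map.delete(oldKey);
      this.bytes -= oldEntry.size;
    }
  }
}
```

Tracking an approximate byte size per entry (rather than an entry count) is what makes a config knob like `cache_size_mbs` meaningful.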
33 changed files with 1303 additions and 241 deletions
+27
@@ -104,6 +104,33 @@
       "contributions": [
         "infra"
       ]
+    },
+    {
+      "login": "lincolnthedev",
+      "name": "i use arch btw",
+      "avatar_url": "https://avatars.githubusercontent.com/u/41193328?v=4",
+      "profile": "https://lncn.dev",
+      "contributions": [
+        "infra"
+      ]
+    },
+    {
+      "login": "BradleyDS2",
+      "name": "BradleyDS2",
+      "avatar_url": "https://avatars.githubusercontent.com/u/2174921?v=4",
+      "profile": "https://github.com/BradleyDS2",
+      "contributions": [
+        "doc"
+      ]
+    },
+    {
+      "login": "nduja",
+      "name": "Robbo",
+      "avatar_url": "https://avatars.githubusercontent.com/u/69299134?v=4",
+      "profile": "https://github.com/nduja",
+      "contributions": [
+        "code"
+      ]
     }
   ],
   "contributorsPerLine": 7,
+7
@@ -1,2 +1,9 @@
 node_modules
 lib
+Dockerfile
+Dockerfile.arm32v7
+Dockerfile.arm64v8
+README.md
+.all-contributorsrc
+env.example
+.github/
+12 -12
@@ -13,7 +13,7 @@ Mango is a self-hosted manga server and reader. Its features include
 - Supports nested folders in library
 - Automatically stores reading progress
 - Thumbnail generation
-- Supports [plugins](https://github.com/hkalexling/mango-plugins) to download from thrid-party sites
+- Supports [plugins](https://github.com/hkalexling/mango-plugins) to download from third-party sites
 - The web reader is responsive and works well on mobile, so there is no need for a mobile app
 - All the static files are embedded in the binary, so the deployment process is easy and painless
@@ -51,7 +51,7 @@ The official docker images are available on [Dockerhub](https://hub.docker.com/r
 ### CLI
 ```
-Mango - Manga Server and Web Reader. Version 0.23.0
+Mango - Manga Server and Web Reader. Version 0.25.0
 Usage:
@@ -80,29 +80,26 @@ base_url: /
 session_secret: mango-session-secret
 library_path: ~/mango/library
 db_path: ~/mango/mango.db
+queue_db_path: ~/mango/queue.db
 scan_interval_minutes: 5
 thumbnail_generation_interval_hours: 24
 log_level: info
 upload_path: ~/mango/uploads
 plugin_path: ~/mango/plugins
 download_timeout_seconds: 30
+library_cache_path: ~/mango/library.yml.gz
+cache_enabled: true
+cache_size_mbs: 50
+cache_log_enabled: true
 disable_login: false
 default_username: ""
 auth_proxy_header_name: ""
-mangadex:
-  base_url: https://mangadex.org
-  api_url: https://api.mangadex.org/v2
-  download_wait_seconds: 5
-  download_retries: 4
-  download_queue_db_path: ~/mango/queue.db
-  chapter_rename_rule: '[Vol.{volume} ][Ch.{chapter} ]{title|id}'
-  manga_rename_rule: '{title}'
-  subscription_update_interval_hours: 24
 ```
-- `scan_interval_minutes`, `thumbnail_generation_interval_hours` and `db_optimization_interval_hours` can be any non-negative integer. Setting them to `0` disables the periodic tasks
+- `scan_interval_minutes`, `thumbnail_generation_interval_hours` can be any non-negative integer. Setting them to `0` disables the periodic tasks
 - `log_level` can be `debug`, `info`, `warn`, `error`, `fatal` or `off`. Setting it to `off` disables the logging
 - You can disable authentication by setting `disable_login` to true. Note that `default_username` must be set to an existing username for this to work.
+- By setting `cache_enabled` to `true`, you can enable an experimental feature where Mango caches library metadata to improve page load time. You can further fine-tune the feature with `cache_size_mbs` and `cache_log_enabled`.
 ### Library Structure
@@ -174,6 +171,9 @@ Please check the [development guideline](https://github.com/hkalexling/Mango/wik
 <td align="center"><a href="https://github.com/Leeingnyo"><img src="https://avatars0.githubusercontent.com/u/6760150?v=4?s=100" width="100px;" alt=""/><br /><sub><b>이인용</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=Leeingnyo" title="Code">💻</a></td>
 <td align="center"><a href="http://h45h74x.eu.org"><img src="https://avatars1.githubusercontent.com/u/27204033?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Simon</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=h45h74x" title="Code">💻</a></td>
 <td align="center"><a href="https://github.com/davidkna"><img src="https://avatars.githubusercontent.com/u/835177?v=4?s=100" width="100px;" alt=""/><br /><sub><b>David Knaack</b></sub></a><br /><a href="#infra-davidkna" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
+<td align="center"><a href="https://lncn.dev"><img src="https://avatars.githubusercontent.com/u/41193328?v=4?s=100" width="100px;" alt=""/><br /><sub><b>i use arch btw</b></sub></a><br /><a href="#infra-lincolnthedev" title="Infrastructure (Hosting, Build-Tools, etc)">🚇</a></td>
+<td align="center"><a href="https://github.com/BradleyDS2"><img src="https://avatars.githubusercontent.com/u/2174921?v=4?s=100" width="100px;" alt=""/><br /><sub><b>BradleyDS2</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=BradleyDS2" title="Documentation">📖</a></td>
+<td align="center"><a href="https://github.com/nduja"><img src="https://avatars.githubusercontent.com/u/69299134?v=4?s=100" width="100px;" alt=""/><br /><sub><b>Robbo</b></sub></a><br /><a href="https://github.com/hkalexling/Mango/commits?author=nduja" title="Code">💻</a></td>
 </tr>
 </table>
+94
@@ -0,0 +1,94 @@
class SortTitle < MG::Base
  def up : String
    <<-SQL
    -- add sort_title column to ids and titles
    ALTER TABLE ids ADD COLUMN sort_title TEXT;
    ALTER TABLE titles ADD COLUMN sort_title TEXT;
    SQL
  end

  def down : String
    <<-SQL
    -- remove sort_title column from ids
    ALTER TABLE ids RENAME TO tmp;
    CREATE TABLE ids (
      path TEXT NOT NULL,
      id TEXT NOT NULL,
      signature TEXT,
      unavailable INTEGER NOT NULL DEFAULT 0
    );
    INSERT INTO ids
    SELECT path, id, signature, unavailable
    FROM tmp;
    DROP TABLE tmp;

    -- recreate the indices
    CREATE UNIQUE INDEX path_idx ON ids (path);
    CREATE UNIQUE INDEX id_idx ON ids (id);

    -- recreate the foreign key constraint on thumbnails
    ALTER TABLE thumbnails RENAME TO tmp;
    CREATE TABLE thumbnails (
      id TEXT NOT NULL,
      data BLOB NOT NULL,
      filename TEXT NOT NULL,
      mime TEXT NOT NULL,
      size INTEGER NOT NULL,
      FOREIGN KEY (id) REFERENCES ids (id)
        ON UPDATE CASCADE
        ON DELETE CASCADE
    );
    INSERT INTO thumbnails
    SELECT * FROM tmp;
    DROP TABLE tmp;
    CREATE UNIQUE INDEX tn_index ON thumbnails (id);

    -- remove sort_title column from titles
    ALTER TABLE titles RENAME TO tmp;
    CREATE TABLE titles (
      id TEXT NOT NULL,
      path TEXT NOT NULL,
      signature TEXT,
      unavailable INTEGER NOT NULL DEFAULT 0
    );
    INSERT INTO titles
    SELECT id, path, signature, unavailable
    FROM tmp;
    DROP TABLE tmp;

    -- recreate the indices
    CREATE UNIQUE INDEX titles_id_idx on titles (id);
    CREATE UNIQUE INDEX titles_path_idx on titles (path);

    -- recreate the foreign key constraint on tags
    ALTER TABLE tags RENAME TO tmp;
    CREATE TABLE tags (
      id TEXT NOT NULL,
      tag TEXT NOT NULL,
      UNIQUE (id, tag),
      FOREIGN KEY (id) REFERENCES titles (id)
        ON UPDATE CASCADE
        ON DELETE CASCADE
    );
    INSERT INTO tags
    SELECT * FROM tmp;
    DROP TABLE tmp;
    CREATE INDEX tags_id_idx ON tags (id);
    CREATE INDEX tags_tag_idx ON tags (tag);
    SQL
  end
end
+1 -1
@@ -55,7 +55,7 @@ const component = () => {
   jobAction(action, event) {
     let url = `${base_url}api/admin/mangadex/queue/${action}`;
     if (event) {
-      const id = event.currentTarget.closest('tr').id.split('-')[1];
+      const id = event.currentTarget.closest('tr').id.split('-').slice(1).join('-');
       url = `${url}?${$.param({
         id: id
       })}`;
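The one-line change above matters when an ID itself contains hyphens: `split('-')[1]` keeps only the first segment after the prefix, while `.slice(1).join('-')` re-assembles the full ID. A standalone sketch (the `job-<id>` row-ID format is an assumption for illustration):

```javascript
// Extract everything after the first hyphen-delimited prefix of a row ID.
function extractId(rowId) {
  return rowId.split('-').slice(1).join('-');
}
```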
+6 -1
@@ -68,7 +68,12 @@ const buildTable = (chapters) => {
   $('table').append(thead);
   const rows = chapters.map(ch => {
-    const tds = Object.values(ch).map(v => `<td>${v}</td>`).join('');
+    const tds = Object.values(ch).map(v => {
+      const maxLength = 40;
+      const shouldShrink = v && v.length > maxLength;
+      const content = shouldShrink ? `<span title="${v}">${v.substring(0, maxLength)}...</span><div uk-dropdown><span>${v}</span></div>` : v;
+      return `<td>${content}</td>`
+    }).join('');
     return `<tr data-id="${ch.id}" data-title="${ch.title}">${tds}</tr>`;
   });
   const tbody = `<tbody id="selectable">${rows}</tbody>`;
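A minimal sketch of the truncation rule the hunk above adds to the download table: cells longer than 40 characters are shortened, with the full value preserved (in Mango, via a `title` attribute plus a UIkit dropdown; this sketch keeps only the `title` attribute):

```javascript
// Truncate long cell values; keep the full text in the title attribute.
const maxLength = 40;
function cellContent(v) {
  const shouldShrink = v && v.length > maxLength;
  return shouldShrink
    ? `<span title="${v}">${v.substring(0, maxLength)}...</span>`
    : v;
}
```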
+15 -3
@@ -13,6 +13,7 @@ const readerComponent = () => {
   selectedIndex: 0, // 0: not selected; 1: the first page
   margin: 30,
   preloadLookahead: 3,
+  enableRightToLeft: false,

   /**
    * Initialize the component by fetching the page dimensions
@@ -64,6 +65,13 @@ const readerComponent = () => {
       const savedFlipAnimation = localStorage.getItem('enableFlipAnimation');
       this.enableFlipAnimation = savedFlipAnimation === null || savedFlipAnimation === 'true';
+
+      const savedRightToLeft = localStorage.getItem('enableRightToLeft');
+      if (savedRightToLeft === null) {
+        this.enableRightToLeft = false;
+      } else {
+        this.enableRightToLeft = (savedRightToLeft === 'true');
+      }
     })
     .catch(e => {
       const errMsg = `Failed to get the page dimensions. ${e}`;
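The explicit `savedRightToLeft === 'true'` comparison above is needed because Web Storage only stores strings: a saved boolean round-trips as `"true"`/`"false"`, and the string `"false"` is truthy. A `Map` stands in for `localStorage` in this sketch:

```javascript
// Demonstrate why a string-stored boolean must be compared, not coerced.
const storage = new Map(); // stand-in for localStorage
storage.set('enableRightToLeft', String(false));

const raw = storage.get('enableRightToLeft'); // "false" (a string)
const naive = Boolean(raw);                   // true: any non-empty string is truthy
const correct = raw === 'true';               // false: the value that was saved
```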
@@ -114,9 +122,9 @@ const readerComponent = () => {
     if (this.mode === 'continuous') return;
     if (event.key === 'ArrowLeft' || event.key === 'k')
-      this.flipPage(false);
+      this.flipPage(false ^ this.enableRightToLeft);
     if (event.key === 'ArrowRight' || event.key === 'j')
-      this.flipPage(true);
+      this.flipPage(true ^ this.enableRightToLeft);
   },

   /**
    * Flips to the next or the previous page
@@ -136,7 +144,7 @@ const readerComponent = () => {
     this.toPage(newIdx);
     if (this.enableFlipAnimation) {
-      if (isNext)
+      if (isNext ^ this.enableRightToLeft)
         this.flipAnimation = 'right';
       else
         this.flipAnimation = 'left';
@@ -320,5 +328,9 @@ const readerComponent = () => {
   enableFlipAnimationChanged() {
     localStorage.setItem('enableFlipAnimation', this.enableFlipAnimation);
   },
+  enableRightToLeftChanged() {
+    localStorage.setItem('enableRightToLeft', this.enableRightToLeft);
+  },
 };
}
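The `isNext ^ this.enableRightToLeft` pattern in the reader changes above is a boolean XOR: in right-to-left mode the navigation direction is inverted. Note that `^` coerces its boolean operands to `0`/`1` and returns a number, which the reader only uses for truthiness. A standalone sketch:

```javascript
// XOR of two flags: invert the page-flip direction when in RTL mode.
function resolveDirection(isNext, rightToLeft) {
  return Boolean(isNext ^ rightToLeft);
}
```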
+62 -6
@@ -60,6 +60,11 @@ function showModal(encodedPath, pages, percentage, encodedeTitle, encodedEntryTi
   UIkit.modal($('#modal')).show();
 }

+UIkit.util.on(document, 'hidden', '#modal', () => {
+  $('#read-btn').off('click');
+  $('#unread-btn').off('click');
+});
+
 const updateProgress = (tid, eid, page) => {
   let url = `${base_url}api/progress/${tid}/${page}`
   const query = $.param({
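The `.off('click')` calls added above prevent handler stacking: each time the modal opens, fresh click handlers are attached, so without cleanup a button would fire once per previous open. A minimal stand-in using a plain handler list instead of jQuery:

```javascript
// Simulate attaching a click handler on each modal open and clearing
// them when the modal hides (analogous to $(btn).off('click')).
const readBtnHandlers = [];
const attach = () => readBtnHandlers.push(() => 'mark as read'); // on open
const cleanup = () => { readBtnHandlers.length = 0; };           // on hidden

attach();                                  // first open, no cleanup
attach();                                  // second open: handlers stack
const stacked = readBtnHandlers.length;    // two handlers would fire
cleanup();
attach();                                  // open after cleanup
const afterCleanup = readBtnHandlers.length;
```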
@@ -90,8 +95,6 @@ const renameSubmit = (name, eid) => {
   const upload = $('.upload-field');
   const titleId = upload.attr('data-title-id');
-  console.log(name);
   if (name.length === 0) {
     alert('danger', 'The display name should not be empty');
     return;
@@ -122,15 +125,47 @@ const renameSubmit = (name, eid) => {
   });
 };

+const renameSortNameSubmit = (name, eid) => {
+  const upload = $('.upload-field');
+  const titleId = upload.attr('data-title-id');
+
+  const params = {};
+  if (eid) params.eid = eid;
+  if (name) params.name = name;
+  const query = $.param(params);
+  let url = `${base_url}api/admin/sort_title/${titleId}?${query}`;
+
+  $.ajax({
+    type: 'PUT',
+    url,
+    contentType: 'application/json',
+    dataType: 'json'
+  })
+    .done(data => {
+      if (data.error) {
+        alert('danger', `Failed to update sort title. Error: ${data.error}`);
+        return;
+      }
+      location.reload();
+    })
+    .fail((jqXHR, status) => {
+      alert('danger', `Failed to update sort title. Error: [${jqXHR.status}] ${jqXHR.statusText}`);
+    });
+};
+
 const edit = (eid) => {
   const cover = $('#edit-modal #cover');
   let url = cover.attr('data-title-cover');
   let displayName = $('h2.uk-title > span').text();
+  let fileTitle = $('h2.uk-title').attr('data-file-title');
+  let sortTitle = $('h2.uk-title').attr('data-sort-title');
   if (eid) {
     const item = $(`#${eid}`);
     url = item.find('img').attr('data-src');
     displayName = item.find('.uk-card-title').attr('data-title');
+    fileTitle = item.find('.uk-card-title').attr('data-file-title');
+    sortTitle = item.find('.uk-card-title').attr('data-sort-title');
     $('#title-progress-control').attr('hidden', '');
   } else {
     $('#title-progress-control').removeAttr('hidden');
@@ -140,14 +175,26 @@ const edit = (eid) => {
   const displayNameField = $('#display-name-field');
   displayNameField.attr('value', displayName);
-  console.log(displayNameField);
+  displayNameField.attr('placeholder', fileTitle);
   displayNameField.keyup(event => {
     if (event.keyCode === 13) {
-      renameSubmit(displayNameField.val(), eid);
+      renameSubmit(displayNameField.val() || fileTitle, eid);
     }
   });
   displayNameField.siblings('a.uk-form-icon').click(() => {
-    renameSubmit(displayNameField.val(), eid);
+    renameSubmit(displayNameField.val() || fileTitle, eid);
+  });
+
+  const sortTitleField = $('#sort-title-field');
+  sortTitleField.val(sortTitle);
+  sortTitleField.attr('placeholder', fileTitle);
+  sortTitleField.keyup(event => {
+    if (event.keyCode === 13) {
+      renameSortNameSubmit(sortTitleField.val(), eid);
+    }
+  });
+  sortTitleField.siblings('a.uk-form-icon').click(() => {
+    renameSortNameSubmit(sortTitleField.val(), eid);
   });

   setupUpload(eid);
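The `displayNameField.val() || fileTitle` fallback introduced above is what lets an empty input restore the default display name (commit "Implement to restore a display name with an empty input"): the empty string is falsy, so the title parsed from the file name is submitted instead of triggering the "should not be empty" error. As a tiny sketch:

```javascript
// Empty input falls back to the default title derived from the file name.
function effectiveName(input, fileTitle) {
  return input || fileTitle;
}
```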
@@ -155,6 +202,16 @@ const edit = (eid) => {
   UIkit.modal($('#edit-modal')).show();
 };

+UIkit.util.on(document, 'hidden', '#edit-modal', () => {
+  const displayNameField = $('#display-name-field');
+  displayNameField.off('keyup');
+  displayNameField.off('click');
+  const sortTitleField = $('#sort-title-field');
+  sortTitleField.off('keyup');
+  sortTitleField.off('click');
+});
+
 const setupUpload = (eid) => {
   const upload = $('.upload-field');
   const bar = $('#upload-progress').get(0);
@@ -166,7 +223,6 @@ const setupUpload = (eid) => {
     queryObj['eid'] = eid;
   const query = $.param(queryObj);
   const url = `${base_url}api/admin/upload/cover?${query}`;
-  console.log(url);
   UIkit.upload('.upload-field', {
     url: url,
     name: 'file',
+1 -1
@@ -1,5 +1,5 @@
name: mango name: mango
version: 0.23.0 version: 0.25.0
authors: authors:
- Alex Ling <hkalexling@gmail.com> - Alex Ling <hkalexling@gmail.com>
+10
@@ -61,3 +61,13 @@ describe "chapter_sort" do
     end.should eq ary
   end
 end
describe "sanitize_filename" do
it "returns a random string for empty sanitized string" do
sanitize_filename("..").should_not eq sanitize_filename("..")
end
it "sanitizes correctly" do
sanitize_filename(".. \n\v.\rマンゴー/|*()<[1/2] 3.14 hello world ")
.should eq "マンゴー_()[1_2] 3.14 hello world"
end
end
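The new specs pin down two behaviors: an input that sanitizes to nothing gets a random name, and unsafe characters are stripped or replaced. Below is one Python rule set that satisfies exactly these two cases; it is an illustrative guess, and the actual Crystal `sanitize_filename` may apply different rules.

```python
import re
import secrets

def sanitize_filename(name: str) -> str:
    s = re.sub(r"[\x00-\x1f]", "", name)     # drop control characters
    s = re.sub(r'[\\|*<>:?"]', "", s)        # drop unsafe characters
    s = s.replace("/", "_")                  # path separators become '_'
    s = s.lstrip(" .").rstrip(" ")           # no leading dots, no edge spaces
    return s if s else secrets.token_hex(8)  # empty result -> random name
```

With these rules, `".."` reduces to the empty string and therefore gets a fresh random name on every call, matching the first spec.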
+19 -53
@@ -4,38 +4,27 @@ class Config
   include YAML::Serializable

   @[YAML::Field(ignore: true)]
-  property path : String = ""
-  property host : String = "0.0.0.0"
+  property path = ""
+  property host = "0.0.0.0"
   property port : Int32 = 9000
-  property base_url : String = "/"
-  property session_secret : String = "mango-session-secret"
-  property library_path : String = File.expand_path "~/mango/library",
-    home: true
-  property db_path : String = File.expand_path "~/mango/mango.db", home: true
+  property base_url = "/"
+  property session_secret = "mango-session-secret"
+  property library_path = "~/mango/library"
+  property library_cache_path = "~/mango/library.yml.gz"
+  property db_path = "~/mango/mango.db"
+  property queue_db_path = "~/mango/queue.db"
   property scan_interval_minutes : Int32 = 5
   property thumbnail_generation_interval_hours : Int32 = 24
-  property log_level : String = "info"
-  property upload_path : String = File.expand_path "~/mango/uploads",
-    home: true
-  property plugin_path : String = File.expand_path "~/mango/plugins",
-    home: true
+  property log_level = "info"
+  property upload_path = "~/mango/uploads"
+  property plugin_path = "~/mango/plugins"
   property download_timeout_seconds : Int32 = 30
+  property cache_enabled = true
+  property cache_size_mbs = 50
+  property cache_log_enabled = true
   property disable_login = false
   property default_username = ""
   property auth_proxy_header_name = ""
-  property mangadex = Hash(String, String | Int32).new
-
-  @[YAML::Field(ignore: true)]
-  @mangadex_defaults = {
-    "base_url"               => "https://mangadex.org",
-    "api_url"                => "https://api.mangadex.org/v2",
-    "download_wait_seconds"  => 5,
-    "download_retries"       => 4,
-    "download_queue_db_path" => File.expand_path("~/mango/queue.db",
-      home: true),
-    "chapter_rename_rule"    => "[Vol.{volume} ][Ch.{chapter} ]{title|id}",
-    "manga_rename_rule"      => "{title}",
-  }

   @@singlet : Config?
@@ -53,7 +42,7 @@ class Config
     if File.exists? cfg_path
       config = self.from_yaml File.read cfg_path
       config.path = path
-      config.fill_defaults
+      config.expand_paths
       config.preprocess
       return config
     end
@@ -61,7 +50,7 @@ class Config
           "Dumping the default config there."
     default = self.allocate
     default.path = path
-    default.fill_defaults
+    default.expand_paths
     cfg_dir = File.dirname cfg_path
     unless Dir.exists? cfg_dir
       Dir.mkdir_p cfg_dir
@@ -71,13 +60,9 @@ class Config
     default
   end

-  def fill_defaults
-    {% for hash_name in ["mangadex"] %}
-      @{{hash_name.id}}_defaults.map do |k, v|
-        if @{{hash_name.id}}[k]?.nil?
-          @{{hash_name.id}}[k] = v
-        end
-      end
-    {% end %}
-  end
+  def expand_paths
+    {% for p in %w(library library_cache db queue_db upload plugin) %}
+      @{{p.id}}_path = File.expand_path @{{p.id}}_path, home: true
+    {% end %}
+  end
@@ -92,24 +77,5 @@ class Config
       raise "Login is disabled, but default username is not set. " \
            "Please set a default username"
     end
-
-    # `Logger.default` is not available yet
-    Log.setup :debug
-    unless mangadex["api_url"] =~ /\/v2/
-      Log.warn { "It looks like you are using the deprecated MangaDex API " \
-                 "v1 in your config file. Please update it to " \
-                 "https://api.mangadex.org/v2 to suppress this warning." }
-      mangadex["api_url"] = "https://api.mangadex.org/v2"
-    end
-    if mangadex["api_url"] =~ /\/api\/v2/
-      Log.warn { "It looks like you are using the outdated MangaDex API " \
-                 "url (mangadex.org/api/v2) in your config file. Please " \
-                 "update it to https://api.mangadex.org/v2 to suppress this " \
-                 "warning." }
-      mangadex["api_url"] = "https://api.mangadex.org/v2"
-    end
-    mangadex["api_url"] = mangadex["api_url"].to_s.rstrip "/"
-    mangadex["base_url"] = mangadex["base_url"].to_s.rstrip "/"
   end
 end
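The new `expand_paths` defers tilde expansion until after the YAML file is parsed, so the config can store compact `~/mango/...` defaults. A rough Python model of that post-processing step (the real implementation uses Crystal's `File.expand_path` with `home: true` over the six `*_path` properties):

```python
import os

def expand_paths(config: dict) -> dict:
    """Expand '~' in every *_path option after the config file is parsed."""
    return {key: os.path.expanduser(value) if key.endswith("_path") else value
            for key, value in config.items()}

cfg = expand_paths({"library_path": "~/mango/library", "port": 9000})
```

Non-path options such as `port` pass through untouched.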
+3 -2
@@ -54,8 +54,9 @@ class AuthHandler < Kemal::Handler
   end

   def call(env)
-    # Skip all authentication if requesting /login, /logout, or a static file
-    if request_path_startswith(env, ["/login", "/logout"]) ||
+    # Skip all authentication if requesting /login, /logout, /api/login,
+    # or a static file
+    if request_path_startswith(env, ["/login", "/logout", "/api/login"]) ||
        requesting_static_file env
      return call_next(env)
    end
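The change above adds `/api/login` to the list of paths that bypass authentication. A minimal Python sketch of the check, assuming `request_path_startswith` is the simple prefix match its name suggests:

```python
SKIP_AUTH_PREFIXES = ["/login", "/logout", "/api/login"]

def request_path_startswith(path: str, prefixes: list) -> bool:
    """True when the request path begins with any whitelisted prefix."""
    return any(path.startswith(prefix) for prefix in prefixes)
```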
+218
@@ -0,0 +1,218 @@
require "digest"
require "./entry"
require "./title"
require "./types"
# Base class for an entry in the LRU cache.
# There are two ways to use it:
# 1. Use it as it is by instantiating with the appropriate `SaveT` and
# `ReturnT`. Note that in this case, `SaveT` and `ReturnT` must be the
# same type. That is, the input value will be stored as it is without
# any transformation.
# 2. You can also subclass it and provide custom implementations for
# `to_save_t` and `to_return_t`. This allows you to transform and store
# the input value to a different type. See `SortedEntriesCacheEntry` as
# an example.
private class CacheEntry(SaveT, ReturnT)
getter key : String, atime : Time
@value : SaveT
def initialize(@key : String, value : ReturnT)
@atime = @ctime = Time.utc
@value = self.class.to_save_t value
end
def value
@atime = Time.utc
self.class.to_return_t @value
end
def self.to_save_t(value : ReturnT)
value
end
def self.to_return_t(value : SaveT)
value
end
def instance_size
instance_sizeof(CacheEntry(SaveT, ReturnT)) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.instance_size
end
end
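The comment above describes the two usage modes: identity storage, or a subclass that converts between a compact `SaveT` and a rich `ReturnT`. The same idea in a Python sketch (names mirror the Crystal class; this is a model, not the actual implementation):

```python
import time

class CacheEntry:
    """Stores a value in a compact save form; rehydrates it on access."""

    def __init__(self, key, value):
        self.key = key
        self.atime = time.monotonic()
        self._value = self.to_save_t(value)

    def get(self):
        self.atime = time.monotonic()  # every read refreshes recency
        return self.to_return_t(self._value)

    # Identity by default (mode 1); subclasses override both (mode 2),
    # e.g. saving only IDs and resolving them back to objects on read.
    def to_save_t(self, value):
        return value

    def to_return_t(self, value):
        return value
```

A subclass like `SortedEntriesCacheEntry` below overrides the two hooks so only entry IDs are kept in memory.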
class SortedEntriesCacheEntry < CacheEntry(Array(String), Array(Entry))
def self.to_save_t(value : Array(Entry))
value.map &.id
end
def self.to_return_t(value : Array(String))
ids_to_entries value
end
private def self.ids_to_entries(ids : Array(String))
e_map = Library.default.deep_entries.to_h { |entry| {entry.id, entry} }
entries = [] of Entry
begin
ids.each do |id|
entries << e_map[id]
end
return entries if ids.size == entries.size
rescue
end
end
def instance_size
instance_sizeof(SortedEntriesCacheEntry) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.size * (instance_sizeof(String) + sizeof(String)) +
@value.sum(&.bytesize) # elements in Array(String)
end
def self.gen_key(book_id : String, username : String,
entries : Array(Entry), opt : SortOptions?)
entries_sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
user_context = opt && opt.method == SortMethod::Progress ? username : ""
sig = Digest::SHA1.hexdigest (book_id + entries_sig + user_context +
(opt ? opt.to_tuple.to_s : "nil"))
"#{sig}:sorted_entries"
end
end
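`gen_key` folds the username into the signature only for progress-based sorting, since that is the only ordering that differs per user; every other sort yields one cache entry shared by all users. A hypothetical Python rendering of that keying rule:

```python
import hashlib

def gen_key(book_id, username, entry_ids, sort_method, ascend):
    """Cache key for a sorted entry list (sketch of the SHA1 scheme above)."""
    # Only progress ordering is user-specific; other sorts share one key.
    user_context = username if sort_method == "progress" else ""
    entries_sig = hashlib.sha1(str(entry_ids).encode()).hexdigest()
    sig = hashlib.sha1((book_id + entries_sig + user_context +
                        str((sort_method, ascend))).encode()).hexdigest()
    return f"{sig}:sorted_entries"
```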
class SortedTitlesCacheEntry < CacheEntry(Array(String), Array(Title))
def self.to_save_t(value : Array(Title))
value.map &.id
end
def self.to_return_t(value : Array(String))
value.map { |title_id| Library.default.title_hash[title_id].not_nil! }
end
def instance_size
instance_sizeof(SortedTitlesCacheEntry) + # sizeof itself
instance_sizeof(String) + @key.bytesize + # allocated memory for @key
@value.size * (instance_sizeof(String) + sizeof(String)) +
@value.sum(&.bytesize) # elements in Array(String)
end
def self.gen_key(username : String, titles : Array(Title), opt : SortOptions?)
titles_sig = Digest::SHA1.hexdigest (titles.map &.id).to_s
user_context = opt && opt.method == SortMethod::Progress ? username : ""
sig = Digest::SHA1.hexdigest (titles_sig + user_context +
(opt ? opt.to_tuple.to_s : "nil"))
"#{sig}:sorted_titles"
end
end
class String
def instance_size
instance_sizeof(String) + bytesize
end
end
struct Tuple(*T)
def instance_size
sizeof(T) + # total size of non-reference types
self.sum do |e|
next 0 unless e.is_a? Reference
if e.responds_to? :instance_size
e.instance_size
else
instance_sizeof(typeof(e))
end
end
end
end
alias CacheableType = Array(Entry) | Array(Title) | String |
Tuple(String, Int32)
alias CacheEntryType = SortedEntriesCacheEntry |
SortedTitlesCacheEntry |
CacheEntry(String, String) |
CacheEntry(Tuple(String, Int32), Tuple(String, Int32))
def generate_cache_entry(key : String, value : CacheableType)
if value.is_a? Array(Entry)
SortedEntriesCacheEntry.new key, value
elsif value.is_a? Array(Title)
SortedTitlesCacheEntry.new key, value
else
CacheEntry(typeof(value), typeof(value)).new key, value
end
end
# LRU Cache
class LRUCache
@@limit : Int128 = Int128.new 0
@@should_log = true
# key => entry
@@cache = {} of String => CacheEntryType
def self.enabled
Config.current.cache_enabled
end
def self.init
cache_size = Config.current.cache_size_mbs
@@limit = Int128.new cache_size * 1024 * 1024 if enabled
@@should_log = Config.current.cache_log_enabled
end
def self.get(key : String)
return unless enabled
entry = @@cache[key]?
if @@should_log
Logger.debug "LRUCache #{entry.nil? ? "miss" : "hit"} #{key}"
end
return entry.value unless entry.nil?
end
def self.set(cache_entry : CacheEntryType)
return unless enabled
key = cache_entry.key
@@cache[key] = cache_entry
Logger.debug "LRUCache cached #{key}" if @@should_log
remove_least_recent_access
end
def self.invalidate(key : String)
return unless enabled
@@cache.delete key
end
def self.print
return unless @@should_log
sum = @@cache.sum { |_, entry| entry.instance_size }
Logger.debug "---- LRU Cache ----"
Logger.debug "Size: #{sum} Bytes"
Logger.debug "List:"
@@cache.each do |k, v|
Logger.debug "#{k} | #{v.atime} | #{v.instance_size}"
end
Logger.debug "-------------------"
end
private def self.is_cache_full
sum = @@cache.sum { |_, entry| entry.instance_size }
sum > @@limit
end
private def self.remove_least_recent_access
if @@should_log && is_cache_full
Logger.debug "Removing entries from LRUCache"
end
while is_cache_full && @@cache.size > 0
min_tuple = @@cache.min_by { |_, entry| entry.atime }
min_key = min_tuple[0]
min_entry = min_tuple[1]
Logger.debug " \
Target: #{min_key}, \
Last Access Time: #{min_entry.atime}" if @@should_log
invalidate min_key
end
end
end
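`remove_least_recent_access` repeatedly drops the entry with the oldest access time until the estimated total size fits under the configured limit. The loop can be sketched as follows (a simplified model; sizes here are supplied directly rather than estimated via `instance_size`):

```python
def evict_lru(cache: dict, limit: int) -> dict:
    """cache maps key -> {'atime': float, 'size': int}."""
    def total_size():
        return sum(entry["size"] for entry in cache.values())

    while cache and total_size() > limit:
        # Victim = least recently accessed entry.
        victim = min(cache, key=lambda k: cache[k]["atime"])
        del cache[victim]
    return cache
```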
+55 -13
@@ -1,10 +1,16 @@
 require "image_size"
+require "yaml"

 class Entry
+  include YAML::Serializable
+
   getter zip_path : String, book : Title, title : String,
     size : String, pages : Int32, id : String, encoded_path : String,
     encoded_title : String, mtime : Time, err_msg : String?

+  @[YAML::Field(ignore: true)]
+  @sort_title : String?
+
   def initialize(@zip_path, @book)
     storage = Storage.default
     @encoded_path = URI.encode @zip_path
@@ -46,29 +52,51 @@ class Entry
     file.close
   end

-  def to_slim_json : String
+  def build_json(*, slim = false)
     JSON.build do |json|
       json.object do
         {% for str in ["zip_path", "title", "size", "id"] %}
           json.field {{str}}, @{{str.id}}
         {% end %}
         json.field "title_id", @book.id
+        json.field "sort_title", sort_title
         json.field "pages" { json.number @pages }
+        unless slim
+          json.field "display_name", @book.display_name @title
+          json.field "cover_url", cover_url
+          json.field "mtime" { json.number @mtime.to_unix }
+        end
       end
     end
   end

-  def to_json(json : JSON::Builder)
-    json.object do
-      {% for str in ["zip_path", "title", "size", "id"] %}
-        json.field {{str}}, @{{str.id}}
-      {% end %}
-      json.field "title_id", @book.id
-      json.field "display_name", @book.display_name @title
-      json.field "cover_url", cover_url
-      json.field "pages" { json.number @pages }
-      json.field "mtime" { json.number @mtime.to_unix }
-    end
-  end
+  def sort_title
+    sort_title_cached = @sort_title
+    return sort_title_cached if sort_title_cached
+    sort_title = @book.entry_sort_title_db id
+    if sort_title
+      @sort_title = sort_title
+      return sort_title
+    end
+    @sort_title = @title
+    @title
+  end
+
+  def set_sort_title(sort_title : String | Nil, username : String)
+    Storage.default.set_entry_sort_title id, sort_title
+    if sort_title == "" || sort_title.nil?
+      @sort_title = nil
+    else
+      @sort_title = sort_title
+    end
+    @book.entry_sort_title_cache = nil
+    @book.remove_sorted_entries_cache [SortMethod::Auto, SortMethod::Title],
+      username
+  end
+
+  def sort_title_db
+    @book.entry_sort_title_db @id
+  end

   def display_name
@@ -81,9 +109,17 @@ class Entry
   def cover_url
     return "#{Config.current.base_url}img/icon.png" if @err_msg

-    url = "#{Config.current.base_url}api/cover/#{@book.id}/#{@id}"
-    TitleInfo.new @book.dir do |info|
-      info_url = info.entry_cover_url[@title]?
+    unless @book.entry_cover_url_cache
+      TitleInfo.new @book.dir do |info|
+        @book.entry_cover_url_cache = info.entry_cover_url
+      end
+    end
+    entry_cover_url = @book.entry_cover_url_cache
+
+    url = "#{Config.current.base_url}api/cover/#{@book.id}/#{@id}"
+    if entry_cover_url
+      info_url = entry_cover_url[@title]?
       unless info_url.nil? || info_url.empty?
         url = File.join Config.current.base_url, info_url
       end
@@ -170,6 +206,12 @@ class Entry
   # For backward compatibility with v0.1.0, we save entry titles
   # instead of IDs in info.json
   def save_progress(username, page)
+    LRUCache.invalidate "#{@book.id}:#{username}:progress_sum"
+    @book.parents.each do |parent|
+      LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
+    end
+    @book.remove_sorted_caches [SortMethod::Progress], username
+
     TitleInfo.new @book.dir do |info|
       if info.progress[username]?.nil?
         info.progress[username] = {@title => page}
+131 -40
@@ -1,20 +1,94 @@
 class Library
class Library class Library
struct ThumbnailContext
property current : Int32, total : Int32
def initialize
@current = 0
@total = 0
end
def progress
if total == 0
0
else
current / total
end
end
def reset
@current = 0
@total = 0
end
def increment
@current += 1
end
end
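`ThumbnailContext` replaces the old `@entries_count`/`@thumbnails_count` pair and guards the divide-by-zero before a generation run starts. The same invariant in a few lines of Python:

```python
class ThumbnailContext:
    """Progress tracker for a thumbnail generation run."""

    def __init__(self):
        self.current = 0
        self.total = 0

    @property
    def progress(self):
        # Return 0 rather than dividing by zero before the run starts.
        return 0 if self.total == 0 else self.current / self.total
```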
+  include YAML::Serializable
+
   getter dir : String, title_ids : Array(String),
     title_hash : Hash(String, Title)

+  @[YAML::Field(ignore: true)]
+  getter thumbnail_ctx = ThumbnailContext.new

   use_default

-  def initialize
-    register_mime_types
+  def save_instance
+    path = Config.current.library_cache_path
Logger.debug "Caching library to #{path}"
writer = Compress::Gzip::Writer.new path,
Compress::Gzip::BEST_COMPRESSION
writer.write self.to_yaml.to_slice
writer.close
end
def self.load_instance
path = Config.current.library_cache_path
return unless File.exists? path
Logger.debug "Loading cached library from #{path}"
begin
Compress::Gzip::Reader.open path do |content|
loaded = Library.from_yaml content
# We will have to do a full restart in these cases. Otherwise having
# two instances of the library will cause some weirdness.
if loaded.dir != Config.current.library_path
Logger.fatal "Cached library dir #{loaded.dir} does not match " \
"current library dir #{Config.current.library_path}. " \
"Deleting cache"
delete_cache_and_exit path
end
if loaded.title_ids.size > 0 &&
Storage.default.count_titles == 0
Logger.fatal "The library cache is inconsistent with the DB. " \
"Deleting cache"
delete_cache_and_exit path
end
@@default = loaded
Logger.debug "Library cache loaded"
end
Library.default.register_jobs
rescue e
Logger.error e
end
end
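`save_instance`/`load_instance` persist the scanned library as gzipped YAML and refuse to reuse a cache built for a different library directory. A rough Python model of the round trip (JSON stands in for YAML to stay stdlib-only; the real code also checks consistency against the DB):

```python
import gzip
import json
import os

def save_instance(library: dict, path: str) -> None:
    with gzip.open(path, "wt") as f:
        json.dump(library, f)

def load_instance(path: str, current_dir: str):
    """Return the cached library, or None when absent or stale."""
    if not os.path.exists(path):
        return None
    with gzip.open(path, "rt") as f:
        loaded = json.load(f)
    if loaded.get("dir") != current_dir:
        os.remove(path)  # a cache for another dir cannot be trusted
        return None
    return loaded
```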
+  def initialize
     @dir = Config.current.library_path
     # explicitly initialize @titles to bypass the compiler check. it will
     # be filled with actual Titles in the `scan` call below
     @title_ids = [] of String
     @title_hash = {} of String => Title
-    @entries_count = 0
-    @thumbnails_count = 0
+    register_jobs
+  end
+
+  protected def register_jobs
+    register_mime_types
     scan_interval = Config.current.scan_interval_minutes
     if scan_interval < 1
@@ -25,7 +99,7 @@ class Library
         start = Time.local
         scan
         ms = (Time.local - start).total_milliseconds
-        Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
+        Logger.debug "Library initialized in #{ms}ms"
         sleep scan_interval.minutes
       end
     end
@@ -51,11 +125,6 @@ class Library
   def sorted_titles(username, opt : SortOptions? = nil)
     if opt.nil?
       opt = SortOptions.from_info_json @dir, username
-    else
-      TitleInfo.new @dir do |info|
-        info.sort_by[username] = opt.to_tuple
-        info.save
-      end
     end

     # Helper function from src/util/util.cr
@@ -66,14 +135,18 @@ class Library
     titles + titles.flat_map &.deep_titles
   end

-  def to_slim_json : String
+  def deep_entries
+    titles.flat_map &.deep_entries
+  end
+
+  def build_json(*, slim = false, depth = -1)
     JSON.build do |json|
       json.object do
         json.field "dir", @dir
         json.field "titles" do
           json.array do
             self.titles.each do |title|
-              json.raw title.to_slim_json
+              json.raw title.build_json(slim: slim, depth: depth)
             end
           end
         end
@@ -81,15 +154,6 @@ class Library
     end
   end

-  def to_json(json : JSON::Builder)
-    json.object do
-      json.field "dir", @dir
-      json.field "titles" do
-        json.raw self.titles.to_json
-      end
-    end
-  end

   def get_title(tid)
     @title_hash[tid]?
   end
@@ -99,6 +163,7 @@ class Library
   end

   def scan
+    start = Time.local
     unless Dir.exists? @dir
       Logger.info "The library directory #{@dir} does not exist. " \
                   "Attempting to create it"
@@ -107,14 +172,38 @@ class Library
     storage = Storage.new auto_close: false

-    (Dir.entries @dir)
+    examine_context : ExamineContext = {
+      cached_contents_signature: {} of String => String,
+      deleted_title_ids:         [] of String,
+      deleted_entry_ids:         [] of String,
+    }
+
+    library_paths = (Dir.entries @dir)
       .select { |fn| !fn.starts_with? "." }
       .map { |fn| File.join @dir, fn }
+
+    @title_ids.select! do |title_id|
+      title = @title_hash[title_id]
+      next false unless library_paths.includes? title.dir
+      existence = title.examine examine_context
+      unless existence
+        examine_context["deleted_title_ids"].concat [title_id] +
+          title.deep_titles.map &.id
+        examine_context["deleted_entry_ids"].concat title.deep_entries.map &.id
+      end
+      existence
+    end
+
+    remained_title_dirs = @title_ids.map { |id| title_hash[id].dir }
+    examine_context["deleted_title_ids"].each do |title_id|
+      @title_hash.delete title_id
+    end
+
+    cache = examine_context["cached_contents_signature"]
+    library_paths
+      .select { |path| !(remained_title_dirs.includes? path) }
       .select { |path| File.directory? path }
-      .map { |path| Title.new path, "" }
+      .map { |path| Title.new path, "", cache }
       .select { |title| !(title.entries.empty? && title.titles.empty?) }
-      .sort! { |a, b| a.title <=> b.title }
-      .tap { |_| @title_ids.clear }
+      .sort! { |a, b| a.sort_title <=> b.sort_title }
      .each do |title|
        @title_hash[title.id] = title
        @title_ids << title.id
@@ -123,8 +212,15 @@ class Library
     storage.bulk_insert_ids
     storage.close

-    Logger.debug "Scan completed"
-    Storage.default.mark_unavailable
+    ms = (Time.local - start).total_milliseconds
+    Logger.info "Scanned #{@title_ids.size} titles in #{ms}ms"
+    Storage.default.mark_unavailable examine_context["deleted_entry_ids"],
+      examine_context["deleted_title_ids"]
+
+    spawn do
+      save_instance
+    end
   end

   def get_continue_reading_entries(username)
@@ -208,34 +304,29 @@ class Library
       .shuffle!
   end

-  def thumbnail_generation_progress
-    return 0 if @entries_count == 0
-    @thumbnails_count / @entries_count
-  end
-
   def generate_thumbnails
-    if @thumbnails_count > 0
+    if thumbnail_ctx.current > 0
       Logger.debug "Thumbnail generation in progress"
       return
     end

     Logger.info "Starting thumbnail generation"
     entries = deep_titles.flat_map(&.deep_entries).reject &.err_msg
-    @entries_count = entries.size
-    @thumbnails_count = 0
+    thumbnail_ctx.total = entries.size
+    thumbnail_ctx.current = 0

     # Report generation progress regularly
     spawn do
       loop do
-        unless @thumbnails_count == 0
+        unless thumbnail_ctx.current == 0
           Logger.debug "Thumbnail generation progress: " \
-                       "#{(thumbnail_generation_progress * 100).round 1}%"
+                       "#{(thumbnail_ctx.progress * 100).round 1}%"
         end
         # Generation is completed. We reset the count to 0 to allow subsequent
         # calls to the function, and break from the loop to stop the progress
         # report fiber
-        if thumbnail_generation_progress.to_i == 1
-          @thumbnails_count = 0
+        if thumbnail_ctx.progress.to_i == 1
+          thumbnail_ctx.reset
           break
         end
         sleep 10.seconds
@@ -249,7 +340,7 @@ class Library
       # and CPU
       sleep 1.seconds
     end
-      @thumbnails_count += 1
+      thumbnail_ctx.increment
     end
     Logger.info "Thumbnail generation finished"
   end
+284 -54
@@ -1,13 +1,30 @@
+require "digest"
 require "../archive"

 class Title
+  include YAML::Serializable
+
   getter dir : String, parent_id : String, title_ids : Array(String),
     entries : Array(Entry), title : String, id : String,
-    encoded_title : String, mtime : Time, signature : UInt64
+    encoded_title : String, mtime : Time, signature : UInt64,
+    entry_cover_url_cache : Hash(String, String)?
+  setter entry_cover_url_cache : Hash(String, String)?,
+    entry_sort_title_cache : Hash(String, String | Nil)?
+
+  @[YAML::Field(ignore: true)]
+  @sort_title : String?
+
+  @[YAML::Field(ignore: true)]
+  @entry_sort_title_cache : Hash(String, String | Nil)?
+
+  @[YAML::Field(ignore: true)]
   @entry_display_name_cache : Hash(String, String)?
+
+  @[YAML::Field(ignore: true)]
+  @entry_cover_url_cache : Hash(String, String)?
+
+  @[YAML::Field(ignore: true)]
+  @cached_display_name : String?
+
+  @[YAML::Field(ignore: true)]
+  @cached_cover_url : String?

-  def initialize(@dir : String, @parent_id)
+  def initialize(@dir : String, @parent_id, cache = {} of String => String)
     storage = Storage.default
     @signature = Dir.signature dir
     id = storage.get_title_id dir, signature
@@ -20,6 +37,7 @@ class Title
       })
     end
     @id = id
+    @contents_signature = Dir.contents_signature dir, cache
     @title = File.basename dir
     @encoded_title = URI.encode @title
     @title_ids = [] of String
@@ -30,7 +48,7 @@ class Title
       next if fn.starts_with? "."
       path = File.join dir, fn
       if File.directory? path
-        title = Title.new path, @id
+        title = Title.new path, @id, cache
         next if title.entries.size == 0 && title.titles.size == 0
         Library.default.title_hash[title.id] = title
         @title_ids << title.id
@@ -53,59 +71,172 @@ class Title
     end

     sorter = ChapterSorter.new @entries.map &.title
     @entries.sort! do |a, b|
-      sorter.compare a.title, b.title
+      sorter.compare a.sort_title, b.sort_title
    end
  end
-  def to_slim_json : String
+
+  # Utility method used in library rescanning.
# - When the title does not exist on the file system anymore, return false
# and let it be deleted from the library instance
# - When the title exists, but its contents signature is now different from
# the cache, it means some of its content (nested titles or entries)
# has been added, deleted, or renamed. In this case we update its
# contents signature and instance variables
# - When the title exists and its contents signature is still the same, we
# return true so it can be reused without rescanning
def examine(context : ExamineContext) : Bool
return false unless Dir.exists? @dir
contents_signature = Dir.contents_signature @dir,
context["cached_contents_signature"]
return true if @contents_signature == contents_signature
@contents_signature = contents_signature
@signature = Dir.signature @dir
storage = Storage.default
id = storage.get_title_id dir, signature
if id.nil?
id = random_str
storage.insert_title_id({
path: dir,
id: id,
signature: signature.to_s,
})
end
@id = id
@mtime = File.info(@dir).modification_time
previous_titles_size = @title_ids.size
@title_ids.select! do |title_id|
title = Library.default.get_title title_id
unless title # in case data consistency is broken
context["deleted_title_ids"].concat [title_id]
next false
end
existence = title.examine context
unless existence
context["deleted_title_ids"].concat [title_id] +
title.deep_titles.map &.id
context["deleted_entry_ids"].concat title.deep_entries.map &.id
end
existence
end
remained_title_dirs = @title_ids.map do |title_id|
title = Library.default.get_title! title_id
title.dir
end
previous_entries_size = @entries.size
@entries.select! do |entry|
existence = File.exists? entry.zip_path
Fiber.yield
context["deleted_entry_ids"] << entry.id unless existence
existence
end
remained_entry_zip_paths = @entries.map &.zip_path
is_titles_added = false
is_entries_added = false
Dir.entries(dir).each do |fn|
next if fn.starts_with? "."
path = File.join dir, fn
if File.directory? path
next if remained_title_dirs.includes? path
title = Title.new path, @id, context["cached_contents_signature"]
next if title.entries.size == 0 && title.titles.size == 0
Library.default.title_hash[title.id] = title
@title_ids << title.id
is_titles_added = true
# We think they are removed, but they are here!
# Cancel reserved jobs
revival_title_ids = [title.id] + title.deep_titles.map &.id
context["deleted_title_ids"].select! do |deleted_title_id|
!(revival_title_ids.includes? deleted_title_id)
end
revival_entry_ids = title.deep_entries.map &.id
context["deleted_entry_ids"].select! do |deleted_entry_id|
!(revival_entry_ids.includes? deleted_entry_id)
end
next
end
if is_supported_file path
next if remained_entry_zip_paths.includes? path
entry = Entry.new path, self
if entry.pages > 0 || entry.err_msg
@entries << entry
is_entries_added = true
context["deleted_entry_ids"].select! do |deleted_entry_id|
entry.id != deleted_entry_id
end
end
end
end
mtimes = [@mtime]
mtimes += @title_ids.map { |e| Library.default.title_hash[e].mtime }
mtimes += @entries.map &.mtime
@mtime = mtimes.max
if is_titles_added || previous_titles_size != @title_ids.size
@title_ids.sort! do |a, b|
compare_numerically Library.default.title_hash[a].title,
Library.default.title_hash[b].title
end
end
if is_entries_added || previous_entries_size != @entries.size
sorter = ChapterSorter.new @entries.map &.sort_title
@entries.sort! do |a, b|
sorter.compare a.sort_title, b.sort_title
end
end
if @title_ids.size > 0 || @entries.size > 0
true
else
context["deleted_title_ids"].concat [@id]
false
end
end
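The `examine` logic above hinges on `Dir.contents_signature`: when a title's signature is unchanged, the whole subtree is reused without rescanning. A simplified Python model of that decision, where hashing sorted child names stands in for the real signature and the shared `cache` dict plays the role of `cached_contents_signature`:

```python
import hashlib
import os

def contents_signature(path: str, cache: dict) -> str:
    """Signature of a directory's child names, memoized per scan."""
    if path not in cache:
        names = "\n".join(sorted(os.listdir(path)))
        cache[path] = hashlib.sha1(names.encode()).hexdigest()
    return cache[path]

def examine(title: dict, cache: dict) -> str:
    if not os.path.isdir(title["dir"]):
        return "deleted"        # drop the title from the library
    sig = contents_signature(title["dir"], cache)
    if sig == title.get("contents_signature"):
        return "unchanged"      # reuse without rescanning
    title["contents_signature"] = sig
    return "rescan"             # contents added, removed, or renamed
```

The memoized `cache` is shared across the whole scan, so a directory reachable from several titles is only listed once.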
+  alias SortContext = NamedTuple(username: String, opt: SortOptions)
+
+  def build_json(*, slim = false, depth = -1,
+                 sort_context : SortContext? = nil)
     JSON.build do |json|
       json.object do
         {% for str in ["dir", "title", "id"] %}
           json.field {{str}}, @{{str.id}}
         {% end %}
         json.field "signature" { json.number @signature }
-        json.field "titles" do
-          json.array do
-            self.titles.each do |title|
-              json.raw title.to_slim_json
-            end
-          end
-        end
-        json.field "entries" do
-          json.array do
-            @entries.each do |entry|
-              json.raw entry.to_slim_json
-            end
-          end
-        end
-        json.field "parents" do
-          json.array do
-            self.parents.each do |title|
-              json.object do
-                json.field "title", title.title
-                json.field "id", title.id
-              end
-            end
-          end
-        end
-      end
-    end
-  end
-
-  def to_json(json : JSON::Builder)
-    json.object do
-      {% for str in ["dir", "title", "id"] %}
-        json.field {{str}}, @{{str.id}}
-      {% end %}
-      json.field "signature" { json.number @signature }
-      json.field "display_name", display_name
-      json.field "cover_url", cover_url
-      json.field "mtime" { json.number @mtime.to_unix }
+        json.field "sort_title", sort_title
+        unless slim
+          json.field "display_name", display_name
+          json.field "cover_url", cover_url
+          json.field "mtime" { json.number @mtime.to_unix }
+        end
+        unless depth == 0
           json.field "titles" do
-            json.raw self.titles.to_json
+            json.array do
+              self.titles.each do |title|
+                json.raw title.build_json(slim: slim,
+                  depth: depth > 0 ? depth - 1 : depth)
+              end
+            end
           end
           json.field "entries" do
-            json.raw @entries.to_json
+            json.array do
+              _entries = if sort_context
+                           sorted_entries sort_context[:username],
+                             sort_context[:opt]
+                         else
+                           @entries
+                         end
+              _entries.each do |entry|
+                json.raw entry.build_json(slim: slim)
+              end
+            end
+          end
+        end
         json.field "parents" do
           json.array do
@@ -119,11 +250,21 @@
           end
         end
       end
+    end
   end

   def titles
     @title_ids.map { |tid| Library.default.get_title! tid }
   end
def sorted_titles(username, opt : SortOptions? = nil)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
end
# Helper function from src/util/util.cr
sort_titles titles, opt.not_nil!, username
end
   # Get all entries, including entries in nested titles
   def deep_entries
     return @entries if title_ids.empty?
@@ -160,6 +301,48 @@ class Title
ary.join " and "
end
def sort_title
sort_title_cached = @sort_title
return sort_title_cached if sort_title_cached
sort_title = Storage.default.get_title_sort_title id
if sort_title
@sort_title = sort_title
return sort_title
end
@sort_title = @title
@title
end
def set_sort_title(sort_title : String | Nil, username : String)
Storage.default.set_title_sort_title id, sort_title
if sort_title == "" || sort_title.nil?
@sort_title = nil
else
@sort_title = sort_title
end
if parents.size > 0
target = parents[-1].titles
else
target = Library.default.titles
end
remove_sorted_titles_cache target,
[SortMethod::Auto, SortMethod::Title], username
end
def sort_title_db
Storage.default.get_title_sort_title id
end
def entry_sort_title_db(entry_id)
unless @entry_sort_title_cache
@entry_sort_title_cache =
Storage.default.get_entries_sort_title @entries.map &.id
end
@entry_sort_title_cache.not_nil![entry_id]?
end
def tags
Storage.default.get_title_tags @id
end
@@ -177,11 +360,15 @@ class Title
end
def display_name
cached_display_name = @cached_display_name
return cached_display_name unless cached_display_name.nil?
dn = @title
TitleInfo.new @dir do |info|
info_dn = info.display_name
dn = info_dn unless info_dn.empty?
end
@cached_display_name = dn
dn
end
@@ -205,6 +392,7 @@ class Title
end
def set_display_name(dn)
@cached_display_name = dn
TitleInfo.new @dir do |info|
info.display_name = dn
info.save
@@ -214,11 +402,15 @@ class Title
def set_display_name(entry_name : String, dn)
TitleInfo.new @dir do |info|
info.entry_display_name[entry_name] = dn
@entry_display_name_cache = info.entry_display_name
info.save
end
end
def cover_url
cached_cover_url = @cached_cover_url
return cached_cover_url unless cached_cover_url.nil?
url = "#{Config.current.base_url}img/icon.png"
readable_entries = @entries.select &.err_msg.nil?
if readable_entries.size > 0
@@ -230,10 +422,12 @@ class Title
url = File.join Config.current.base_url, info_url
end
end
@cached_cover_url = url
url
end
def set_cover_url(url : String)
@cached_cover_url = url
TitleInfo.new @dir do |info|
info.cover_url = url
info.save
@@ -243,6 +437,7 @@ class Title
def set_cover_url(entry_name : String, url : String)
TitleInfo.new @dir do |info|
info.entry_cover_url[entry_name] = url
@entry_cover_url_cache = info.entry_cover_url
info.save
end
end
@@ -262,8 +457,15 @@ class Title
end
def deep_read_page_count(username) : Int32
key = "#{@id}:#{username}:progress_sum"
sig = Digest::SHA1.hexdigest (entries.map &.id).to_s
cached_sum = LRUCache.get key
return cached_sum[1] if cached_sum.is_a? Tuple(String, Int32) &&
cached_sum[0] == sig
sum = load_progress_for_all_entries(username, nil, true).sum +
titles.flat_map(&.deep_read_page_count username).sum
LRUCache.set generate_cache_entry key, {sig, sum}
sum
end
def deep_total_page_count : Int32
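The new `deep_read_page_count` above caches each per-user progress sum in the LRU cache together with a SHA-1 signature of the current entry IDs, so a cached sum is silently discarded once the title's entries change. The same signature-validated-cache idea in a small Ruby sketch (`SigCache` and all the names below are illustrative, not Mango's API):

```ruby
require "digest"

# Signature-validated cache: a stored value only counts as a hit when the
# signature recorded at store time matches the caller's current signature.
class SigCache
  def initialize
    @store = {}
  end

  def fetch(key, signature)
    entry = @store[key]
    entry && entry[0] == signature ? entry[1] : nil
  end

  def store(key, signature, value)
    @store[key] = [signature, value]
  end
end

cache = SigCache.new
entry_ids = %w(e1 e2 e3)
sig = Digest::SHA1.hexdigest(entry_ids.to_s)

cache.store("title:alice:progress_sum", sig, 42)
cache.fetch("title:alice:progress_sum", sig)      # hit: 42

entry_ids << "e4"                                 # the title gained an entry
new_sig = Digest::SHA1.hexdigest(entry_ids.to_s)
cache.fetch("title:alice:progress_sum", new_sig)  # miss: nil, so recompute
```

Keying validity on the entry-ID list means the cache needs no explicit invalidation when entries appear or disappear; only progress updates (handled by `bulk_progress` above) must invalidate explicitly.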
@@ -317,44 +519,46 @@ class Title
# use the default (auto, ascending)
# When `opt` is not nil, it saves the options to info.json
def sorted_entries(username, opt : SortOptions? = nil)
cache_key = SortedEntriesCacheEntry.gen_key @id, username, @entries, opt
cached_entries = LRUCache.get cache_key
return cached_entries if cached_entries.is_a? Array(Entry)
if opt.nil?
opt = SortOptions.from_info_json @dir, username
else
TitleInfo.new @dir do |info|
info.sort_by[username] = opt.to_tuple
info.save
end
end
case opt.not_nil!.method
when .title?
ary = @entries.sort do |a, b|
compare_numerically a.sort_title, b.sort_title
end
when .time_modified?
ary = @entries.sort { |a, b| (a.mtime <=> b.mtime).or \
compare_numerically a.sort_title, b.sort_title }
when .time_added?
ary = @entries.sort { |a, b| (a.date_added <=> b.date_added).or \
compare_numerically a.sort_title, b.sort_title }
when .progress?
percentage_ary = load_percentage_for_all_entries username, opt, true
ary = @entries.zip(percentage_ary)
.sort { |a_tp, b_tp| (a_tp[1] <=> b_tp[1]).or \
compare_numerically a_tp[0].sort_title, b_tp[0].sort_title }
.map &.[0]
else
unless opt.method.auto?
Logger.warn "Unknown sorting method #{opt.not_nil!.method}. Using " \
"Auto instead"
end
sorter = ChapterSorter.new @entries.map &.sort_title
ary = @entries.sort do |a, b|
sorter.compare(a.sort_title, b.sort_title).or \
compare_numerically a.sort_title, b.sort_title
end
end
ary.reverse! unless opt.not_nil!.ascend
LRUCache.set generate_cache_entry cache_key, ary
ary
end
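Every sort path above falls back to `compare_numerically` on the new `sort_title`, a natural-order comparison that keeps `ch.2` before `ch.10`. A rough Ruby equivalent of such a comparator (a sketch, not Mango's actual helper):

```ruby
# Natural-order comparison: digit runs compare as integers and everything
# else as case-insensitive text, so "ch.2" sorts before "ch.10".
def compare_numerically(a, b)
  key = lambda do |s|
    s.downcase.scan(/\d+|\D+/).map do |part|
      part =~ /\A\d/ ? [1, part.to_i] : [0, part]
    end
  end
  key.call(a) <=> key.call(b)
end

["ch.10", "ch.2", "ch.1"].sort { |a, b| compare_numerically(a, b) }
# => ["ch.1", "ch.2", "ch.10"]
```

Tagging each run with a type marker (`0` for text, `1` for digits) keeps the element-wise array comparison well defined even when a digit run lines up against a text run.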
@@ -415,7 +619,33 @@ class Title
zip + titles.flat_map &.deep_entries_with_date_added
end
def remove_sorted_entries_cache(sort_methods : Array(SortMethod),
username : String)
[false, true].each do |ascend|
sort_methods.each do |sort_method|
sorted_entries_cache_key =
SortedEntriesCacheEntry.gen_key @id, username, @entries,
SortOptions.new(sort_method, ascend)
LRUCache.invalidate sorted_entries_cache_key
end
end
end
def remove_sorted_caches(sort_methods : Array(SortMethod), username : String)
remove_sorted_entries_cache sort_methods, username
parents.each do |parent|
remove_sorted_titles_cache parent.titles, sort_methods, username
end
remove_sorted_titles_cache Library.default.titles, sort_methods, username
end
def bulk_progress(action, ids : Array(String), username) def bulk_progress(action, ids : Array(String), username)
LRUCache.invalidate "#{@id}:#{username}:progress_sum"
parents.each do |parent|
LRUCache.invalidate "#{parent.id}:#{username}:progress_sum"
end
remove_sorted_caches [SortMethod::Progress], username
selected_entries = ids
.map { |id|
@entries.find &.id.==(id)
+29 -1
@@ -1,4 +1,12 @@
SUPPORTED_IMG_TYPES = %w(
image/jpeg
image/png
image/webp
image/apng
image/avif
image/gif
image/svg+xml
)
enum SortMethod
Auto
@@ -88,6 +96,18 @@ class TitleInfo
@@mutex_hash = {} of String => Mutex
def self.new(dir, &)
key = "#{dir}:info.json"
info = LRUCache.get key
if info.is_a? String
begin
instance = TitleInfo.from_json info
instance.dir = dir
yield instance
return
rescue
end
end
if @@mutex_hash[dir]?
mutex = @@mutex_hash[dir]
else
@@ -101,6 +121,7 @@ class TitleInfo
instance = TitleInfo.from_json File.read json_path
end
instance.dir = dir
LRUCache.set generate_cache_entry key, instance.to_json
yield instance
end
end
@@ -108,5 +129,12 @@ class TitleInfo
def save
json_path = File.join @dir, "info.json"
File.write json_path, self.to_pretty_json
key = "#{@dir}:info.json"
LRUCache.set generate_cache_entry key, self.to_json
end
end
alias ExamineContext = NamedTuple(
cached_contents_signature: Hash(String, String),
deleted_title_ids: Array(String),
deleted_entry_ids: Array(String))
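With the changes above, `TitleInfo.new` becomes a read-through cache: it serves the parsed `info.json` from the LRU cache when a copy exists, falls back to disk otherwise, and `save` writes through to both the file and the cache so later reads never see a stale copy. The pattern in a Ruby sketch, with a plain `Hash` standing in for the LRU cache (all names illustrative):

```ruby
require "json"
require "tmpdir"

CACHE = {}  # stand-in for the LRU cache

# Read-through load: try the cache first, fall back to the file on disk,
# then put the freshly read JSON back into the cache.
def load_info(dir)
  key = "#{dir}:info.json"
  cached = CACHE[key]
  return JSON.parse(cached) if cached.is_a?(String)

  path = File.join(dir, "info.json")
  raw = File.exist?(path) ? File.read(path) : "{}"
  CACHE[key] = raw
  JSON.parse(raw)
end

# Write-through save: update the file and the cache together.
def save_info(dir, info)
  raw = JSON.generate(info)
  File.write(File.join(dir, "info.json"), raw)
  CACHE["#{dir}:info.json"] = raw
end

Dir.mktmpdir do |dir|
  save_info(dir, {"display_name" => "My Title"})
  load_info(dir)["display_name"]  # answered from the cache, no disk read
end
```

Caching the serialized JSON string (rather than the live object) mirrors the Crystal code above, which stores `instance.to_json` and re-parses on a hit.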
+3 -1
@@ -7,7 +7,7 @@ require "option_parser"
require "clim"
require "tallboy"
MANGO_VERSION = "0.25.0"
# From http://www.network-science.de/ascii/
BANNER = %{
@@ -55,8 +55,10 @@ class CLI < Clim
Config.load(opts.config).set_current
# Initialize main components
LRUCache.init
Storage.default
Queue.default
Library.load_instance
Library.default
Plugin::Downloader.default
+3 -8
@@ -23,11 +23,6 @@ class Plugin
job
end
private def process_filename(str)
return "_" if str == ".."
str.gsub "/", "_"
end
private def download(job : Queue::Job)
@downloading = true
@queue.set_status Queue::JobStatus::Downloading, job
@@ -42,8 +37,8 @@ class Plugin
pages = info["pages"].as_i
manga_title = sanitize_filename job.manga_title
chapter_title = sanitize_filename info["title"].as_s
@queue.set_pages pages, job
lib_dir = @library_path
@@ -68,7 +63,7 @@ class Plugin
while page = plugin.next_page
break unless @queue.exists? job
fn = sanitize_filename page["filename"].as_s
url = page["url"].as_s
headers = HTTP::Headers.new
+1 -1
@@ -112,7 +112,7 @@ class Queue
use_default
def initialize(db_path : String? = nil)
@path = db_path || Config.current.queue_db_path.to_s
dir = File.dirname @path
unless Dir.exists? dir
Logger.info "The queue DB directory #{dir} does not exist. " \
-1
@@ -66,7 +66,6 @@ struct AdminRouter
end
get "/admin/downloads" do |env|
mangadex_base_url = Config.current.mangadex["base_url"]
layout "download-manager"
end
+88 -14
@@ -23,7 +23,7 @@ struct APIRouter
# Authentication
All endpoints except `/api/login` require authentication. After logging in, your session ID will be stored as a cookie named `mango-sessid-#{Config.current.port}`, which can be used to authenticate API access. Note that all admin API endpoints (`/api/admin/...`) require the logged-in user to have admin access.
# Terminologies
@@ -56,6 +56,29 @@ struct APIRouter
"error" => String?,
}
Koa.describe "Authenticates a user", <<-MD
After successful login, the cookie `mango-sessid-#{Config.current.port}` will contain a valid session ID that can be used for subsequent requests
MD
Koa.body schema: {
"username" => String,
"password" => String,
}
Koa.tag "users"
post "/api/login" do |env|
begin
username = env.params.json["username"].as String
password = env.params.json["password"].as String
token = Storage.default.verify_user(username, password).not_nil!
env.session.string "token", token
"Authenticated"
rescue e
Logger.error e
env.response.status_code = 403
e.message
end
end
Koa.describe "Returns a page in a manga entry"
Koa.path "tid", desc: "Title ID"
Koa.path "eid", desc: "Entry ID"
@@ -133,24 +156,38 @@ struct APIRouter
end
Koa.describe "Returns the book with title `tid`", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
  - When `depth` is 1, returns the requested title and the sub-titles/entries one level in it
  - When `depth` is 0, returns the requested title without its sub-titles/entries
  - When `depth` is N, returns the requested title and the sub-titles/entries N levels in it
  - When `depth` is negative, returns the requested title and all sub-titles/entries in it
MD
Koa.path "tid", desc: "Title ID"
Koa.query "slim"
Koa.query "depth"
Koa.query "sort", desc: "Sorting option for entries. Can be one of 'auto', 'title', 'progress', 'time_added' and 'time_modified'"
Koa.query "ascend", desc: "Sorting direction for entries. Set to 0 for the descending order. Doesn't work without specifying 'sort'"
Koa.response 200, schema: "title"
Koa.response 404, "Title not found"
Koa.tag "library"
get "/api/book/:tid" do |env|
begin
username = get_username env
sort_opt = SortOptions.new
get_sort_opt
tid = env.params.url["tid"]
title = Library.default.get_title tid
raise "Title ID `#{tid}` not found" if title.nil?
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, title.build_json(slim: slim, depth: depth,
sort_context: {username: username,
opt: sort_opt})
rescue e
Logger.error e
env.response.status_code = 404
@@ -159,20 +196,25 @@ struct APIRouter
end
Koa.describe "Returns the entire library with all titles and entries", <<-MD
- Supply the `slim` query parameter to strip away "display_name", "cover_url", and "mtime" from the returned object to speed up the loading time
- Supply the `depth` query parameter to control the depth of nested titles to return.
  - When `depth` is 1, returns the top-level titles and the sub-titles/entries one level in them
  - When `depth` is 0, returns the top-level titles without their sub-titles/entries
  - When `depth` is N, returns the top-level titles and the sub-titles/entries N levels in them
  - When `depth` is negative, returns the entire library
MD
Koa.query "slim"
Koa.query "depth"
Koa.response 200, schema: {
"dir" => String,
"titles" => ["title"],
}
Koa.tag "library"
get "/api/library" do |env|
slim = !env.params.query["slim"]?.nil?
depth = env.params.query["depth"]?.try(&.to_i?) || -1
send_json env, Library.default.build_json(slim: slim, depth: depth)
end
Koa.describe "Triggers a library scan"
@@ -198,7 +240,7 @@ struct APIRouter
}
get "/api/admin/thumbnail_progress" do |env|
send_json env, {
"progress" => Library.default.thumbnail_ctx.progress,
}.to_json
end
@@ -329,6 +371,38 @@ struct APIRouter
end
end
Koa.describe "Sets the sort title of a title or an entry", <<-MD
When `eid` is provided, apply the sort title to the entry. Otherwise, apply the sort title to the title identified by `tid`.
MD
Koa.tags ["admin", "library"]
Koa.path "tid", desc: "Title ID"
Koa.query "eid", desc: "Entry ID", required: false
Koa.query "name", desc: "The new sort title"
Koa.response 200, schema: "result"
put "/api/admin/sort_title/:tid" do |env|
username = get_username env
begin
title = (Library.default.get_title env.params.url["tid"])
.not_nil!
name = env.params.query["name"]?
entry = env.params.query["eid"]?
if entry.nil?
title.set_sort_title name, username
else
eobj = title.get_entry entry
eobj.set_sort_title name, username unless eobj.nil?
end
rescue e
Logger.error e
send_json env, {
"success" => false,
"error" => e.message,
}.to_json
else
send_json env, {"success" => true}.to_json
end
end
ws "/api/admin/mangadex/queue" do |socket, env|
interval_raw = env.params.query["interval"]?
interval = (interval_raw.to_i? if interval_raw) || 5
+9 -3
@@ -41,7 +41,7 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json Library.default.dir, username
get_and_save_sort_opt Library.default.dir
titles = Library.default.sorted_titles username, sort_opt
percentage = titles.map &.load_percentage username
@@ -59,12 +59,18 @@ struct MainRouter
username = get_username env
sort_opt = SortOptions.from_info_json title.dir, username
get_and_save_sort_opt title.dir
sorted_titles = title.sorted_titles username, sort_opt
entries = title.sorted_entries username, sort_opt
percentage = title.load_percentage_for_all_entries username, sort_opt
title_percentage = title.titles.map &.load_percentage username
title_percentage_map = {} of String => Float64
title_percentage.each_with_index do |tp, i|
t = title.titles[i]
title_percentage_map[t.id] = tp
end
layout "title"
rescue e
Logger.error e
+91 -3
@@ -342,6 +342,67 @@ class Storage
end
end
def get_title_sort_title(title_id : String)
sort_title = nil
MainFiber.run do
get_db do |db|
sort_title =
db.query_one? "select sort_title from titles where id = (?)",
title_id, as: String | Nil
end
end
sort_title
end
def set_title_sort_title(title_id : String, sort_title : String | Nil)
sort_title = nil if sort_title == ""
MainFiber.run do
get_db do |db|
db.exec "update titles set sort_title = (?) where id = (?)",
sort_title, title_id
end
end
end
def get_entry_sort_title(entry_id : String)
sort_title = nil
MainFiber.run do
get_db do |db|
sort_title =
db.query_one? "select sort_title from ids where id = (?)",
entry_id, as: String | Nil
end
end
sort_title
end
def get_entries_sort_title(ids : Array(String))
results = Hash(String, String | Nil).new
MainFiber.run do
get_db do |db|
db.query "select id, sort_title from ids where id in " \
"(#{ids.join "," { |id| "'#{id}'" }})" do |rs|
rs.each do
id = rs.read String
sort_title = rs.read String | Nil
results[id] = sort_title
end
end
end
end
results
end
def set_entry_sort_title(entry_id : String, sort_title : String | Nil)
sort_title = nil if sort_title == ""
MainFiber.run do
get_db do |db|
db.exec "update ids set sort_title = (?) where id = (?)",
sort_title, entry_id
end
end
end
def save_thumbnail(id : String, img : Image)
MainFiber.run do
get_db do |db|
@@ -428,12 +489,21 @@ class Storage
end
end
# Mark titles and entries that no longer exist on the file system as
# unavailable. By supplying `ids_candidates` and `titles_candidates`, it
# only checks the existence of the candidate titles/entries to speed up
# the process.
def mark_unavailable(ids_candidates : Array(String)?,
titles_candidates : Array(String)?)
MainFiber.run do
get_db do |db|
# Detect dangling entry IDs
trash_ids = [] of String
query = "select path, id from ids where unavailable = 0"
unless ids_candidates.nil?
query += " and id in (#{ids_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
fullpath = Path.new(path).expand(Config.current.library_path).to_s
@@ -449,7 +519,11 @@ class Storage
# Detect dangling title IDs
trash_titles = [] of String
query = "select path, id from titles where unavailable = 0"
unless titles_candidates.nil?
query += " and id in (#{titles_candidates.join "," { |i| "'#{i}'" }})"
end
db.query query do |rs|
rs.each do
path = rs.read String
fullpath = Path.new(path).expand(Config.current.library_path).to_s
@@ -545,6 +619,20 @@ class Storage
{token, expires}
end
def count_titles : Int32
count = 0
MainFiber.run do
get_db do |db|
db.query "select count(*) from titles" do |rs|
rs.each do
count = rs.read Int32
end
end
end
end
count
end
def close
MainFiber.run do
unless @db.nil?
+28
@@ -48,4 +48,32 @@ class Dir
end
Digest::CRC32.checksum(signatures.sort.join).to_u64
end
# Returns the contents signature of the directory at `dirname`, used to
# decide whether a rescan is needed.
# Rescan conditions:
# - A file is added, moved, removed, or renamed (including inside
#   nested directories)
def self.contents_signature(dirname, cache = {} of String => String) : String
return cache[dirname] if cache[dirname]?
Fiber.yield
signatures = [] of String
self.open dirname do |dir|
dir.entries.sort.each do |fn|
next if fn.starts_with? "."
path = File.join dirname, fn
if File.directory? path
signatures << Dir.contents_signature path, cache
else
# Only add its signature value to `signatures` when it is a
# supported file
signatures << fn if is_supported_file fn
end
Fiber.yield
end
end
hash = Digest::SHA1.hexdigest(signatures.join)
cache[dirname] = hash
hash
end
end
+58 -6
@@ -35,6 +35,11 @@ def register_mime_types
# FontAwesome fonts
".woff" => "font/woff",
".woff2" => "font/woff2",
# Supported image formats. JPG, PNG, GIF, WebP, and SVG are already
# defined by Crystal in `MIME::DEFAULT_TYPES`
".apng" => "image/apng",
".avif" => "image/avif",
}.each do |k, v|
MIME.register k, v
end
@@ -82,30 +87,49 @@ def env_is_true?(key : String) : Bool
end
def sort_titles(titles : Array(Title), opt : SortOptions, username : String)
cache_key = SortedTitlesCacheEntry.gen_key username, titles, opt
cached_titles = LRUCache.get cache_key
return cached_titles if cached_titles.is_a? Array(Title)
case opt.method
when .time_modified?
ary = titles.sort { |a, b| (a.mtime <=> b.mtime).or \
compare_numerically a.sort_title, b.sort_title }
when .progress?
ary = titles.sort do |a, b|
(a.load_percentage(username) <=> b.load_percentage(username)).or \
compare_numerically a.sort_title, b.sort_title
end
when .title?
ary = titles.sort do |a, b|
compare_numerically a.sort_title, b.sort_title
end end
else
unless opt.method.auto?
Logger.warn "Unknown sorting method #{opt.not_nil!.method}. Using " \
"Auto instead"
end
ary = titles.sort { |a, b| compare_numerically a.sort_title, b.sort_title }
end
ary.reverse! unless opt.not_nil!.ascend
LRUCache.set generate_cache_entry cache_key, ary
ary
end
def remove_sorted_titles_cache(titles : Array(Title),
sort_methods : Array(SortMethod),
username : String)
[false, true].each do |ascend|
sort_methods.each do |sort_method|
sorted_titles_cache_key = SortedTitlesCacheEntry.gen_key username,
titles, SortOptions.new(sort_method, ascend)
LRUCache.invalidate sorted_titles_cache_key
end
end
end
class String
# Returns the similarity (in [0, 1]) of two paths.
# For the two paths, separate them into arrays of components, count the
@@ -120,3 +144,31 @@ class String
match / s.size
end
end
# Does the following:
# - turns space-like characters into the normal whitespaces ( )
# - strips and collapses spaces
# - removes ASCII control characters
# - replaces slashes (/) with underscores (_)
# - removes leading dots (.)
# - removes the following special characters: \:*?"<>|
#
# If the sanitized string is empty, returns a random string instead.
def sanitize_filename(str : String) : String
sanitized = str
.gsub(/\s+/, " ")
.strip
.gsub(/\//, "_")
.gsub(/^[\.\s]+/, "")
.gsub(/[\177\000-\031\\:\*\?\"<>\|]/, "")
sanitized.size > 0 ? sanitized : random_str
end
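`sanitize_filename` above replaces the plugin downloader's old `process_filename` and normalizes whitespace, strips control and special characters, and falls back to a random name rather than return an empty string. An equivalent Ruby sketch (the `SecureRandom` fallback stands in for Mango's `random_str`):

```ruby
require "securerandom"

# Mirror of the sanitization pipeline: collapse whitespace, replace slashes,
# drop leading dots, strip control characters and \ : * ? " < > |.
def sanitize_filename(str)
  sanitized = str
    .gsub(/\s+/, " ")
    .strip
    .gsub("/", "_")
    .sub(/\A[.\s]+/, "")
    .gsub(/[\x00-\x1f\x7f\\:*?"<>|]/, "")
  sanitized.empty? ? SecureRandom.hex(8) : sanitized
end

sanitize_filename("  ch. 1 / part\t2  ")  # => "ch. 1 _ part 2"
sanitize_filename("..")                   # => a random hex string, never ""
```

Turning `..` and the empty string into random names closes the path-traversal hole that the old `process_filename` special-cased by hand.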
def delete_cache_and_exit(path : String)
File.delete path
Logger.fatal "Invalid library cache deleted. Mango needs to " \
"perform a full reset to recover from this. " \
"Please restart Mango. This is NOT a bug."
Logger.fatal "Exiting"
exit 1
end
+20
@@ -107,6 +107,26 @@ macro get_sort_opt
end
end
macro get_and_save_sort_opt(dir)
sort_method = env.params.query["sort"]?
if sort_method
is_ascending = true
ascend = env.params.query["ascend"]?
if ascend && ascend.to_i? == 0
is_ascending = false
end
sort_opt = SortOptions.new sort_method, is_ascending
TitleInfo.new {{dir}} do |info|
info.sort_by[username] = sort_opt.to_tuple
info.save
end
end
end
module HTTP
class Client
private def self.exec(uri : URI, tls : TLSContext = nil)
+3 -1
@@ -61,7 +61,9 @@
<% if page == "home" && item.is_a? Entry %>
<%= "uk-margin-remove-bottom" %>
<% end %>
" data-title="<%= HTML.escape(item.display_name) %>"
data-file-title="<%= HTML.escape(item.title || "") %>"
data-sort-title="<%= HTML.escape(item.sort_title_db || "") %>"><%= HTML.escape(item.display_name) %>
</h3>
<% if page == "home" && item.is_a? Entry %>
<a class="uk-card-title break-word uk-margin-remove-top uk-text-meta uk-display-inline-block no-modal" data-title="<%= HTML.escape(item.book.display_name) %>" href="<%= base_url %>book/<%= item.book.id %>"><%= HTML.escape(item.book.display_name) %></a>
-6
@@ -24,16 +24,10 @@
<template x-if="job.plugin_id">
<td x-text="job.title"></td>
</template>
<template x-if="!job.plugin_id">
<td><a :href="`<%= mangadex_base_url %>/chapter/${job.id}`" x-text="job.title"></td>
</template>
<template x-if="job.plugin_id">
<td x-text="job.manga_title"></td>
</template>
<template x-if="!job.plugin_id">
<td><a :href="`<%= mangadex_base_url %>/manga/${job.manga_id}`" x-text="job.manga_title"></td>
</template>
<td x-text="`${job.success_count}/${job.pages}`"></td>
<td x-text="`${moment(job.time).fromNow()}`"></td>
+3 -3
@@ -32,10 +32,10 @@
</div>
<div class="uk-position-top">
<div class="uk-navbar-container uk-navbar-transparent" uk-navbar="uk-navbar">
<div class="uk-navbar-left uk-hidden@m">
<div class="uk-navbar-toggle" uk-navbar-toggle-icon="uk-navbar-toggle-icon" uk-toggle="target: #mobile-nav"></div>
</div>
<div class="uk-navbar-left uk-visible@m">
<a class="uk-navbar-item uk-logo" href="<%= base_url %>"><img src="<%= base_url %>img/icon.png" style="width:90px;height:90px;"></a>
<ul class="uk-navbar-nav">
<li><a href="<%= base_url %>">Home</a></li>
@@ -57,7 +57,7 @@
<% end %>
</ul>
</div>
<div class="uk-navbar-right uk-visible@m">
<ul class="uk-navbar-nav">
<li><a onclick="toggleTheme()"><i class="fas fa-adjust"></i></a></li>
<li><a href="<%= base_url %>logout">Logout</a></li>
+1
@@ -10,6 +10,7 @@
<div class="uk-margin-bottom uk-width-1-4@s">
<% hash = {
"auto" => "Auto",
"title" => "Name",
"time_modified" => "Date Modified", "time_modified" => "Date Modified",
"progress" => "Progress" "progress" => "Progress"
} %> } %>
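The hash above maps sort-mode keys to the labels shown in the sort dropdown, with this commit adding a "Name" mode. As an illustration only (Mango sorts server-side in Crystal; the comparator and field names below are assumptions, not the project's actual code), a dispatch keyed on the same mode strings might look like:

```javascript
// Illustrative only: comparators keyed by the same sort-mode strings as the
// dropdown hash above. The entry fields (title, timeModified, progress) are
// hypothetical, not Mango's actual data model.
const comparators = {
  title: (a, b) => a.title.localeCompare(b.title),
  time_modified: (a, b) => b.timeModified - a.timeModified, // newest first
  progress: (a, b) => b.progress - a.progress,              // most read first
};

function sortEntries(entries, mode) {
  // Fall back to name order when the mode is unknown (e.g. "auto").
  const cmp = comparators[mode] || comparators.title;
  return [...entries].sort(cmp);
}
```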
+8 -2
@@ -55,8 +55,8 @@
 object-fit: contain;
 `" />
-<div style="position:absolute;z-index:1; top:0;left:0; width:30%;height:100%;" @click="flipPage(false)"></div>
-<div style="position:absolute;z-index:1; top:0;right:0; width:30%;height:100%;" @click="flipPage(true)"></div>
+<div style="position:absolute;z-index:1; top:0;left:0; width:30%;height:100%;" @click="flipPage(false ^ enableRightToLeft)"></div>
+<div style="position:absolute;z-index:1; top:0;right:0; width:30%;height:100%;" @click="flipPage(true ^ enableRightToLeft)"></div>
 </div>
 </div>
@@ -114,6 +114,12 @@
 </div>
 </div>
+<div class="uk-margin uk-form-horizontal" x-show="mode !== 'continuous'">
+<label class="uk-form-label" for="enable-right-to-left">Right to Left</label>
+<div class="uk-form-controls">
+<input id="enable-right-to-left" class="uk-checkbox" type="checkbox" x-model="enableRightToLeft" @change="enableRightToLeftChanged()">
+</div>
+</div>
 <hr class="uk-divider-icon">
 <div class="uk-margin">
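The `flipPage(false ^ enableRightToLeft)` change in the reader template uses JavaScript's bitwise XOR to invert the click zones in right-to-left mode: `^` coerces the booleans to 0/1, so when `enableRightToLeft` is set the left zone advances and the right zone goes back. A minimal sketch of the same truth table (`resolveFlip` is a hypothetical helper for illustration, not part of the reader's code):

```javascript
// Hypothetical helper mirroring the XOR trick in the reader template.
// clickedRight: true when the right 30% click zone was tapped.
// rtl: the enableRightToLeft setting.
// Returns true when the tap should flip forward (next page).
function resolveFlip(clickedRight, rtl) {
  // `^` on booleans yields the number 0 or 1; Boolean() restores true/false.
  return Boolean(clickedRight ^ rtl);
}
```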
+11 -3
@@ -18,7 +18,8 @@
 </div>
 </div>
 </div>
-<h2 class=uk-title><span><%= title.display_name %></span>
+<h2 class=uk-title data-file-title="<%= HTML.escape(title.title) %>" data-sort-title="<%= HTML.escape(title.sort_title_db || "") %>">
+<span><%= title.display_name %></span>
 &nbsp;
 <% if is_admin %>
 <a onclick="edit()" class="uk-icon-button" uk-icon="icon:pencil"></a>
@@ -59,8 +60,8 @@
 </div>
 <div class="uk-child-width-1-4@m uk-child-width-1-2" uk-grid>
-<% title.titles.each_with_index do |item, i| %>
-<% progress = title_percentage[i] %>
+<% sorted_titles.each do |item| %>
+<% progress = title_percentage_map[item.id] %>
 <%= render_component "card" %>
 <% end %>
 </div>
@@ -89,6 +90,13 @@
 <input class="uk-input" type="text" name="display-name" id="display-name-field">
 </div>
 </div>
+<div class="uk-margin">
+<label class="uk-form-label" for="sort-title">Sort Title</label>
+<div class="uk-inline">
+<a class="uk-form-icon uk-form-icon-flip" uk-icon="icon:check"></a>
+<input class="uk-input" type="text" name="sort-title" id="sort-title-field">
+</div>
+</div>
 <div class="uk-margin">
 <label class="uk-form-label">Cover Image</label>
 <div class="uk-grid">
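The edit form above gains a Sort Title field, and the title page's `<h2>` now exposes both the file title and the custom sort title as data attributes, with `title.sort_title_db || ""` yielding an empty string when no custom value is set. A sketch of the fallback a consumer of those attributes might apply (the entry field names here are assumptions for illustration):

```javascript
// Hypothetical: choose the key a sorter would use for a title entry.
// sortTitle may be "" when the user has not set one (mirroring
// `title.sort_title_db || ""` in the template); fall back to fileTitle.
function sortKey(entry) {
  return entry.sortTitle || entry.fileTitle;
}

function sortByTitle(entries) {
  return [...entries].sort((a, b) => sortKey(a).localeCompare(sortKey(b)));
}
```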