Compare commits

...

178 Commits

Author SHA1 Message Date
Simon ee5d73917c
Add channel art fallback, #build
Changed:
- Added additional channel art fallback extractor
2024-05-17 14:43:43 +02:00
Simon 478d05997c
channel art fallback injection, #701 2024-05-16 21:55:27 +02:00
Simon 0e305f60f4
Channel playlist index fix, #build
Changed:
- Fixed channel not indexing correctly for standalone playlist
2024-05-16 19:54:08 +02:00
Simon 3c441c3a31
fix missing channel 2024-05-16 17:44:21 +02:00
Simon 228677ddfb
add beta-testing info 2024-05-15 21:15:09 +02:00
Simon dc0c82c814
New scheduler store and playlist improvements, #build
Changed:
- Changed to persistent celery beat schedule store
- Changed playlist handling and refreshing
- Changed internal queue storage to use Redis
- Changed a lot of other things...
2024-05-15 20:42:08 +02:00
Simon 34ad0ca07d
add next steps 2024-05-15 20:40:29 +02:00
Simon c82d9e2140
bump base image 2024-05-15 19:54:16 +02:00
Simon bbf59eaca9
set first in queue with score 1 2024-05-15 19:54:05 +02:00
Simon e42737ad9b
bump req 2024-05-15 16:32:04 +02:00
Simon 1d9c274390
better reindex_type handler, fix off by one 2024-05-15 13:18:28 +02:00
Simon 2a35b42d88
add redisqueue in use docs 2024-05-15 11:56:09 +02:00
Simon 11ab314649
refact CommentList to use RedisQueue 2024-05-15 00:37:01 +02:00
Simon d58d133baf
refactor decouple DownloadPostProcess 2024-05-15 00:18:43 +02:00
Simon d6f3fd883c
skip empty add 2024-05-15 00:11:25 +02:00
Simon 8f1a5c8557
use dynamic max_score for reindex queue 2024-05-15 00:11:05 +02:00
Simon fdc427977e
fix types 2024-05-14 23:39:43 +02:00
Simon db080e97bb
better queue total management with score 2024-05-14 23:25:33 +02:00
Simon 5235af3d91
refactor download post processing to redis queue 2024-05-14 19:17:13 +02:00
Simon 6ab70c7602
better add to queue, track score 2024-05-14 18:06:27 +02:00
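The queue commits above describe moving to a Redis sorted set where every entry carries a score: the first entry gets score 1, new entries append after the current maximum score, and the queue total is read from the set's cardinality. A minimal sketch of that pattern — the key layout, class shape, and method names are illustrative, not Tube Archivist's actual RedisQueue:

import redis


class RedisQueue:
    """sketch: sorted-set backed queue with explicit scores"""

    def __init__(self, queue_name: str):
        self.key = f"queue:{queue_name}"  # hypothetical key layout
        self.conn = redis.Redis(decode_responses=True)

    def add(self, item: str) -> None:
        """append after the current tail; first entry gets score 1"""
        last = self.conn.zrange(self.key, -1, -1, withscores=True)
        score = last[0][1] + 1 if last else 1
        self.conn.zadd(self.key, {item: score})

    def next(self):
        """pop the entry with the lowest score, or None when empty"""
        popped = self.conn.zpopmin(self.key)
        return popped[0][0] if popped else None

    def total(self) -> int:
        """queue length via set cardinality"""
        return self.conn.zcard(self.key)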
Simon 86d157699a
better 404 handling in views 2024-05-14 17:05:47 +02:00
Simon c176405b32
untangle channel_overwrites in video index and queue 2024-05-13 20:54:39 +02:00
Simon 6e7cb74366
clean up, remove unused 2024-05-13 19:51:25 +02:00
Simon 05cfeb9d99
fix stop iterator, better channel overwrite unpack 2024-05-13 19:37:46 +02:00
Simon 4edb5adead
thumb clean up notification, call in task 2024-05-13 18:26:17 +02:00
Simon c17627a911
clean up thumbs from cache 2024-05-12 23:00:58 +02:00
Simon 36a738d5d7
move get_channel_overwrites to helper 2024-05-12 21:11:21 +02:00
Simon d6c4a6ea46
remove redundant playlistcheck in _process_entry 2024-05-12 21:09:48 +02:00
Simon 77ee9cfc13
use video class for channel_id extraction 2024-05-12 21:08:37 +02:00
Simon c413811e17
fix _parse_playlist, add existing playlist to queue, #634 2024-05-12 01:02:01 +02:00
Simon 2a9769d154
simplify channel overwrite handling in postprocessing 2024-05-12 00:34:03 +02:00
Simon d9ce9641e2
rewrite playlist postprocess, ensure refresh, #686 2024-05-11 23:59:36 +02:00
Simon f874d402b1
implement skip_on_empty in update_playlist 2024-05-11 23:09:03 +02:00
Simon 97b6d7d606
simplify playlist reindex, use update 2024-05-11 22:34:44 +02:00
Simon 4a38636ef3
implement remove_vids_from_playlist 2024-05-11 22:34:21 +02:00
Simon 97bc03f855
split channel rescan checking with existing index, #500 2024-05-11 21:47:40 +02:00
Simon 770990c568
split playlist parsing in find_missing in playlist 2024-05-11 21:23:27 +02:00
Simon ddc4685811
decouple playlist video id matching to individual 2024-05-11 19:16:35 +02:00
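These playlist commits split the matching of remote playlist entries against the local index into per-video checks. A rough sketch of what per-entry matching can look like — the function name and entry shape are assumptions, not the project's actual code:

def match_playlist_entries(entries: list[dict], indexed_ids: set[str]) -> list[dict]:
    """sketch: mark each remote playlist entry as downloaded or missing"""
    matched = []
    for idx, entry in enumerate(entries):
        youtube_id = entry["id"]
        matched.append(
            {
                "youtube_id": youtube_id,
                "idx": idx,
                "downloaded": youtube_id in indexed_ids,
            }
        )

    return matched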
Simon 56220a94e0
workaround channel_subs, use first entry, #701 2024-05-11 11:52:50 +02:00
Simon fd039de53d
add to queue notification, #714 2024-05-11 11:10:26 +02:00
Simon 320ead0bd2
clean up startup migrations 2024-05-09 20:52:42 +02:00
Simon e33341d30d
fix playlist subscribe template bool logic, #684 2024-05-09 16:00:02 +02:00
Simon 21b79e7c8f Merge branch 'testing' into feat-delete-ignore 2024-05-09 15:27:19 +02:00
Simon 9366b8eab9
Feature beat model (#713)
* add django-celery-beat

* implement schedule migration

* fix version_check migration

* remove old schedule init

* better schedule migration

* fix task_config migration

* show task config on settings page

* fix notify url builder

* refactor celery initiation

* fix get task

* fix scheduler mig

* fix linter

* better task_config store on periodic task

* save new schedules

* fix task_config extraction from custom model

* implement auto schedule

* implement schedule delete

* refactor notifications to ES config storage

* downgrade redis

* better notification migration to ES

* add notification url handling

* fix worker start

* fix docs spelling

* don't resend form data on notification refresh

* fix type hints

* move TASK_CONFIG to separate module

* fix partial task config imports

* fix yt_obs typing

* delete schedule

* remove outdated instructions

* create initial schedules

* fix reindex days config key

* fix doc string

* unregister BeatModels
2024-05-09 20:22:36 +07:00
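The PR above replaces the homegrown schedule init with django-celery-beat, where schedules live as database rows instead of Redis keys and survive restarts. A rough illustration of the library's API — the schedule values are examples, and the task path is an assumption based on the `home.tasks` imports shown later in this diff:

from django_celery_beat.models import CrontabSchedule, PeriodicTask

# a crontab row is created once and can be shared between tasks
schedule, _ = CrontabSchedule.objects.get_or_create(
    minute="0", hour="4", day_of_week="*", day_of_month="*", month_of_year="*"
)

# the periodic task row persists in the database, unlike the old Redis store
PeriodicTask.objects.update_or_create(
    name="check_reindex",  # hypothetical example schedule
    defaults={"crontab": schedule, "task": "home.tasks.check_reindex"},
)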
Simon 011073617d
More experimental comment extraction, #build
Changed:
- Fix for like_count gt 1000
2024-05-07 09:28:09 +02:00
Simon 784f90b16d
more experimental comment extractions 2024-05-07 09:27:55 +02:00
Simon c1cd9bc8eb
add requirements-dev 2024-05-06 19:56:14 +02:00
Simon e0f1828d9c
Experimental Comment fix, #build
Changed:
- Changed to use custom build to fix comment extraction
- Changed ffmpeg installer using python build script
2024-04-26 09:10:08 +02:00
Simon f5a2e624d8
add unstable tag 2024-04-26 09:07:29 +02:00
Simon dc08c83da5
custom yt-dlp build 2024-04-26 09:07:07 +02:00
Simon 33ecd73137
ffmpeg download script, separate build step 2024-04-22 17:49:26 +02:00
Simon cb6476fa8c
add jellyfin and plex plugin links 2024-04-17 22:39:24 +02:00
Simon ec64a88d1e
update roadmap 2024-04-10 21:12:58 +02:00
Simon 0c487e6339
bump TA_VERSION 2024-04-10 18:40:55 +02:00
Simon f7ad1000c7
bump es version 2024-04-10 18:40:40 +02:00
Simon aecd189d04
bump requirements 2024-04-10 18:40:30 +02:00
Simon b735a770e3
Notification improvements, #build
Changed:
- Changed more robust channel title building
- Ensure 100 progress message is sent
- Remove mig_path at startup
- Fix comment extraction fail due to redirect
- Fix duplicate notification message threads
2024-04-05 15:11:24 +02:00
Simon 5c84a2cbf8
fix getMessages getting called multiple times in parallel 2024-04-05 15:03:44 +02:00
Simon a4d062fa52
fix comment extraction player_client for redirect workaround 2024-04-05 14:40:22 +02:00
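The redirect workaround above touches yt-dlp's player_client selection for comment extraction. For reference, this is how a client is selected through yt-dlp's extractor arguments — the client list here is only an example, not the exact workaround used:

from yt_dlp import YoutubeDL

opts = {
    "skip_download": True,
    "getcomments": True,  # fetch the comment thread during extraction
    # pick which YouTube player clients yt-dlp impersonates
    "extractor_args": {"youtube": {"player_client": ["ios", "web"]}},
}
with YoutubeDL(opts) as ydl:
    info = ydl.extract_info(
        "https://www.youtube.com/watch?v=<video-id>", download=False
    )

comments = info.get("comments") or []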
Simon 9c34bb01d9
fix spelling 2024-04-03 21:04:26 +02:00
Simon 8c38a2eb69
clarify feature requests and contributing details 2024-04-03 21:02:33 +02:00
Simon 852abf254d
implement delete and ignore for video, #286 2024-04-03 19:39:20 +02:00
Simon 25edff28e7 Merge branch 'master' into testing 2024-04-03 16:52:24 +02:00
lamusmaser 731f4b6111
Add additional user scripts. (#682)
* Add additional user scripts.

* Add new repo.

* clarify license

---------

Co-authored-by: Simon <simobilleter@gmail.com>
2024-04-03 21:52:06 +07:00
Simon e512329599
remove migpath call at startup, #687 2024-04-03 16:40:16 +02:00
Simon e26b039899
bump requirements 2024-04-03 16:39:24 +02:00
Simon 8bf7f71351
ensure 100 download progress is sent 2024-04-03 16:31:36 +02:00
Simon a72be27982
more robust channel title building 2024-03-15 12:01:09 +01:00
Simon b2c1b417e5
add unstable tag 2024-03-11 21:57:03 +01:00
Simon a348b4a810
Custom user created Playlists, #build
Changed:
- Added playlist create form
- Removed autoplay
- Disable progress less than 10s
- Better cookie jar error logs
- bump yt-dlp
2024-03-11 21:30:18 +01:00
Simon bb8db53f7d
bump yt-dlp 2024-03-11 18:18:59 +01:00
Simon 2711537a4d
clarify cookie import choices, #672 2024-03-11 18:12:56 +01:00
dot-mike 45f455070d
Fix rare edge case where comment author is None. (#676)
This happens mostly for older YT profiles that have not set up an @-handle.
2024-03-11 23:56:18 +07:00
Simon 6dcef70b8e
skip empty subtitle, #663 2024-03-11 17:50:50 +01:00
Simon c993a5de5c
bump dependencies 2024-03-10 17:38:23 +01:00
Greg 090d88c336
Feature 590 custom playlist (#620)
* add remove custom playlist

* custom playlist page, move video controls

* align to existing code patterns

* cleanup

* resolve merge conflict

* cleanup

* cleanup

* polish

* polish

* some fixes for lint

* resolve merge conflict

* bugfix on delete video/playlist/channel - preserve custom playlist but
delete corresponding videos in custom playlist

* cleanup

* ./deploy.sh validate isort fix - validate runs clean now

* sync to latest master branch

* sync to master

* updates per admin guidance. sync to master

* attempt to resolve merge conflict

* attempt to resolve merge conflict - reintroduce changes to file.

* validate playlist_type

* validate playlist custom action

* move custom id creation to view

* stricter custom playlist matching

* revert unreachable playlist delete check

* undo unneeded playlist matching

---------

Co-authored-by: Simon <simobilleter@gmail.com>
2024-03-10 22:57:59 +07:00
Nick 0e967d721f
log cookiejar.LoadError (#669) 2024-03-10 22:35:15 +07:00
Simon c32dbf8bc8 Merge branch 'master' into testing 2024-03-10 16:34:19 +01:00
dot-mike df08a6d591
Add in user script (#680)
* remove autoplay, disable video progress less than 10s

* Update readme.md. Add in user script. Format user scripts as bulleted list

* Revert "remove autoplay, disable video progress less than 10s"

This reverts commit 8778546577.

---------

Co-authored-by: Simon <simobilleter@gmail.com>
2024-03-10 22:32:40 +07:00
DarkFighterLuke 9339b9227e
Add base URL setup in README.md User Scripts section (#664) 2024-03-10 22:11:10 +07:00
Simon 8778546577
remove autoplay, disable video progress less than 10s 2024-02-05 21:55:05 +01:00
Simon 0ff27ebfb9
fix black linting 2024-01-27 10:26:23 +07:00
Simon 0d863ef557
bump TA_VERSION 2024-01-27 10:18:47 +07:00
Simon 56ca49d0e2
bump ES 2024-01-27 10:18:27 +07:00
Simon 27b6efcab7
Redirect and celery memory usage workaround, #build
Changed:
- Limit life span of worker to avoid building up memory usage
- Validate video ID at index, raise error on redirect
- Clean up subtitles on channel delete
2024-01-15 12:06:43 +07:00
Simon 18ba808664
bump TA_VERSION unstable 2024-01-15 12:06:03 +07:00
Simon 65738ef52c
validate expected video ID with remote ID to avoid redirect 2024-01-15 11:34:11 +07:00
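The idea behind this commit: YouTube sometimes redirects a removed video ID to a different video, so the remotely extracted ID is compared with the requested one before indexing. A minimal sketch of that check using plain yt-dlp — not the project's actual indexer:

from yt_dlp import YoutubeDL

def validate_remote_id(youtube_id: str) -> dict:
    """sketch: raise instead of indexing a redirected video"""
    url = f"https://www.youtube.com/watch?v={youtube_id}"
    with YoutubeDL({"skip_download": True}) as ydl:
        info = ydl.extract_info(url, download=False)

    if info["id"] != youtube_id:
        raise ValueError(f"expected {youtube_id}, got redirected to {info['id']}")

    return info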
Simon 4049a2a3c1
bump requirements 2024-01-15 09:23:37 +07:00
PhuriousGeorge 49659322a1
Limit worker lifespan - RAM usage mitigation (#644)
Limit worker lifespan to save our precious RAM as discussed on [Discord](https://discord.com/channels/920056098122248193/1179480913701241002/1180026088802496512)

Mitigates #500, though RAM usage can still ramp rather high before the worker is culled
2024-01-15 09:12:44 +07:00
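The mitigation is Celery's built-in max-tasks-per-child setting: each worker process is recycled after a fixed number of tasks, releasing whatever memory it accumulated. The same knob appears later in this diff as the `--max-tasks-per-child 10` flag in run.sh; configured in code it would look like this (app name assumed):

from celery import Celery

app = Celery("home")
# replace each worker process after 10 tasks to cap memory growth
app.conf.worker_max_tasks_per_child = 10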
Simon 4078eb307f Merge branch 'master' into testing 2024-01-15 09:04:01 +07:00
Daniel Jue 7f056b38f4
Update README.md (#647)
* Update README.md

Added a link to a simple helper script for prioritizing downloads

* Update README.md

@Krafting Fixed a mistake I made

Co-authored-by: Krafting <36538123+Krafting@users.noreply.github.com>

---------

Co-authored-by: Krafting <36538123+Krafting@users.noreply.github.com>
2024-01-15 09:00:33 +07:00
Simon 86fe31d258
cleanup subtitles after deleting channels 2023-12-25 11:40:09 +07:00
Simon 5b26433599 Merge branch 'master' into testing 2023-12-25 11:32:56 +07:00
Simon 4d2fc5423e
fix video reindex exist check 2023-12-22 12:31:55 +07:00
Simon 94295cdbd4
add type hints to ReleaseVersion 2023-12-22 10:41:10 +07:00
Simon b84bf78974
hotfix: clear faulty version check 2023-12-22 09:57:05 +07:00
Simon 14e23a4371
handle reindex item delete during task run 2023-12-21 13:22:05 +07:00
Simon fe8f4faa10
update TA_VERSION 2023-12-21 10:32:53 +07:00
Simon ddc0b7a481
Various yt-dlp fixes, #build
Changed:
- Fix for channel about page parsing
- Fix for like_count extraction
- Fix for comment author extraction
- Refactor RedisQueue use sorted set
- Fix chrome scaling issue
- Fix autodelete
2023-12-17 11:34:46 +07:00
Simon 7eec3ece49
lock yt-dlp at commit 2023-12-17 11:28:36 +07:00
Simon 789c35e2b5
refactor RedisQueue to use sorted set 2023-12-17 11:22:26 +07:00
Simon 8870782a6e
refactor, use decode_responses in base class 2023-12-16 17:39:09 +07:00
Simon e75ffb603c
fix auto delete lte datatype, #622 2023-12-16 13:13:48 +07:00
Simon feabc87c9f
fix chrome scaling issue, #616 2023-12-16 12:44:19 +07:00
Simon 6f1a45ffb1
downgrade redis 2023-12-16 10:18:40 +07:00
Simon 098db97cba
revert channel about parsing, #614 2023-12-12 14:00:56 +07:00
Simon 597da56975
fix comment_author_is_uploader extraction 2023-12-12 13:55:41 +07:00
Simon 325bdf5cba
add unstable tag, #build 2023-12-03 15:19:18 +07:00
Simon db2f249979
Channel about parser workaround, #build
Changed:
- Use /featured endpoint for channel parsing
- Improved version check cleanup process
- Handle version check random channel building errors
2023-12-03 14:50:50 +07:00
Simon b61b8635b8
bump celery 2023-12-03 14:49:59 +07:00
Simon 5aafc21bda
use featured path to extract channel metadata, #614 2023-12-03 14:48:56 +07:00
lamusmaser 099c70a13b
Add check to determine if `sub_value` is `rand-d`, always. (#612) 2023-12-01 09:23:58 +07:00
Simon 43708ee2a3
refac _has_update parser, use tpl comparison 2023-11-22 12:46:09 +07:00
Simon cfb15c1a78
handle version check comparison over any diff 2023-11-22 10:49:04 +07:00
Simon e9a95d7ada
bump TA_VERSION 2023-11-21 12:50:53 +07:00
Simon a21a111221
rename to Videos 2023-11-20 14:56:22 +07:00
Simon 18e504faf2
fix add missing playlist_entries mappings, #605 2023-11-20 14:02:49 +07:00
Simon 9ffe2098a5
add unstable version 2023-11-19 22:10:07 +07:00
Simon 1315e836a4
Improved dashboard, reindex fix, #build
Changed:
- Added additional sort by fields
- [API] Changed primary stats endpoints
- [API] Added separate video stats endpoints
- Added fallback for some manual import values
- Fix comment extraction for members videos
- Fix reindex outdated query
2023-11-19 22:02:23 +07:00
Simon 2e4289e75c
bump requirements 2023-11-19 21:37:08 +07:00
Simon 96e73a3a53
handle empty tile response 2023-11-19 21:32:11 +07:00
Simon a369be0f4a
split active videos tile, add duration 2023-11-19 21:20:42 +07:00
Simon d5676e5173
[API] remove primary endpoint, in favor of dedicated stats 2023-11-19 20:30:50 +07:00
Simon 44c4cf93e2
refactor dashboard tile building 2023-11-19 20:27:18 +07:00
Simon 02ac590caa
[API] add download stats 2023-11-19 14:42:16 +07:00
Simon a466c02304
[API] add playlist stats 2023-11-19 14:00:27 +07:00
Simon e74c26fe36
[API] add channel aggs 2023-11-19 13:48:24 +07:00
Simon b1267cba83
standard json style 2023-11-19 13:06:47 +07:00
Simon 91bb0ed9c0
[API] add video aggregation 2023-11-19 13:01:27 +07:00
Simon 4a145ee7cb
paginate to get total active docs count 2023-11-18 17:44:16 +07:00
Simon 463019ce5a
fix outdated reindex now_lte datatype 2023-11-18 17:30:31 +07:00
Simon 9a9d35cac4
explicitly define player mapping, #592 2023-11-17 09:44:10 +07:00
Simon f41ecd24c5
fix missing config for comments extraction, #596 2023-11-17 09:26:31 +07:00
crocs eced8200c1
Update settings_scheduling.html (#601)
I found more!
2023-11-17 09:23:19 +07:00
Simon 669bc6a620
fallback for view_count, refac, #581 2023-11-17 09:22:11 +07:00
lamusmaser 37df9b65c7
Add `allowed_null_keys` and its dictionary for manual imports. (#595)
* Add `allowed_null_keys` and its dictionary for manual imports.

* Fix linting for `allowed_null_keys` list.

* Add missing trailing comma for linting.

* Add missing newline that wasn't in earlier linting responses.

* Clear empty text in newlines.

* Remove newline that the linter requested because the linter now doesn't want it. ¯\_(ツ)_/¯

* Change default application from manual import to the video processing.

* Fix missing space.
2023-11-17 09:16:09 +07:00
lamusmaser 6721d01fa6
Fix `textarea` type from `shell` to `Shell`. (#594) 2023-11-17 09:12:02 +07:00
crocs 2b49af9620
Update settings.html (#599)
This was really bugging me lol
2023-11-15 11:45:08 +07:00
Derek Slenk 2f62898a10
Add new css item for web footer (#598) 2023-11-15 11:44:43 +07:00
spechter 832259ce48
Expanded sorting functionality (#589)
* - Added duration and filesize as options in sorting menu on Home and ChannelId views
- Added keys 'duration' and 'filesize' as valid parameters to sort by
- Mapped 'duration' and 'filesize' to their corresponding es keys

* Fixed spelling

* Changed formatting to comply to maximum line length.

* Locally running "deploy.sh validate" before committing

---------

Co-authored-by: spechter <spechter@spechter.net>
2023-11-15 11:06:51 +07:00
Simon b8ccce250a
bump TA_VERSION 2023-11-10 10:34:04 +07:00
Simon aa04ecff4f
bump es 2023-11-10 10:33:53 +07:00
Simon dcf97d3d24
tweak color matrix color filter 2023-11-10 09:57:18 +07:00
crocs 879ad52b32
updated icons (#588)
* icon updates

* Update icon-star-half.svg
2023-11-10 09:40:17 +07:00
Simon 0bedc3ee93
fix empty watchDetail building 2023-11-09 11:55:28 +07:00
Simon 1657c55cbe
Aggregation daily stats improvements, #build
Changed:
- [API] make daily stats TZ aware
- [API] add daily download media size
2023-11-09 10:42:02 +07:00
Simon 8b1324139d
pass time_zone to daily aggs 2023-11-09 10:34:08 +07:00
Simon 04124e3dad
add daily size download 2023-11-09 10:22:43 +07:00
Simon 9c26357f76
User conf endpoints, fix channel parser, #build
Changed:
- [API] Added endpoints to CRUD user conf vars
- [API] Added backup endpoints
- Fix channel about page parsing
- Add custom CSS files
- Remember player volume
2023-11-09 09:40:54 +07:00
extome 7133d6b441
Better CSS support (#583)
* Remove banner hardcoding

* Refactor "colors" to "stylesheet"

* Remove logo hardcoding

* Remove stylesheet hardcoding

* Add very basic static CSS scanning and a new style

* Respect environment settings

* Check if selected stylesheet still exists

* New theme and title formatting

* Revert migration change

* Code linting

* More outlines for Matrix style

* Change wording in settings

* Forgot this wording

* Add suggested changes
2023-11-09 09:33:03 +07:00
Simon 6bc0111d0a
set and get playerVolume from localStorage 2023-11-09 09:31:19 +07:00
Simon 1188e66f37
fix channel about page parsing, #587 2023-11-08 23:20:13 +07:00
Simon ef6d3e868d
bump requirements 2023-11-08 23:09:55 +07:00
Simon d677f9579e
replace old process view, use user conf api 2023-11-01 22:49:33 +07:00
Simon 0b920e87ae
[API] add user config endpoints 2023-11-01 19:07:22 +07:00
Simon 4d5aa4ad2f
validate user config values 2023-11-01 17:25:22 +07:00
Simon 4b63c2f536
simplify return message 2023-11-01 14:33:30 +07:00
Simon 31ad9424f5
remove unused db_restore 2023-11-01 14:10:45 +07:00
Simon 45f4ccfd93
fix off by one in filesystem rescan progress 2023-11-01 14:07:56 +07:00
Simon 285e2042ae
[API] add backup endpoints 2023-11-01 14:05:11 +07:00
Simon e4b7f8ce38
update roadmap 2023-11-01 11:04:21 +07:00
Simon 6892cbbc19
Read only user roles, refac env var builder, #build
Changed:
- Added view only user role
- Fixed media download URL builder
- Changed environment settings builder away from redis
- Improved dashboard
2023-11-01 09:24:21 +07:00
Simon 58ea256b44
add unstable tag 2023-11-01 09:19:18 +07:00
Merlin aa475c58aa
Refac settings dashboard (#577)
* Add padding to duration str text

* Add singular and plural to video in dailyStat

* Add code spacing for readability

* Refac Main overview in dashboard to be spaced evenly and use tables

* Refac simplify number padding

* Refac skip adding spacing rows on mobile

* Refac reorder watch progress to be in order of interest

* Fix that there can be 0 videos added a day

* Refac capitalize content keys
2023-11-01 08:40:41 +07:00
Simon 8247314d01
refactor admin permission classes 2023-10-31 15:50:33 +07:00
Simon 2826ca4a43
move ES_SNAPSHOT_DIR to EnvironmentSettings 2023-10-28 15:25:57 +07:00
Simon 64ffc18da7
add debug methods for EnvironmentSettings 2023-10-28 15:16:22 +07:00
Simon 21fde5e068
remove old migrations 2023-10-28 15:03:16 +07:00
Simon ea9ed6c238
fix linter 2023-10-28 10:30:21 +07:00
Simon 8eaed07cff
remove unused renamer 2023-10-28 10:29:10 +07:00
Clark 4d111aff82
Move the startup application settings to a new class (#571)
* Move the startup application settings to a new class

* Replace settings methods with static fields

* Move Redis and ES configuration to the settings class

* Fix environment python imports

* Update envcheck to use the new settings
2023-10-28 10:27:03 +07:00
Simon 7236bea29a
add error setting rlimit to common errors 2023-10-20 15:36:59 +07:00
Simon 5165c3e34a
bump requirements 2023-10-16 16:12:28 +07:00
Simon 572b23169c
finetune limited permission user 2023-10-15 14:56:54 +07:00
Steve Ovens e1fce06f97
View only user (#539)
* Remove repo docs in favor of hosted docs (#537)

* updated base, channel, video htmls to hide elements based on if user is staff or in the group 'admin'

* added the load auth_extras

* updated auth_extras

* updated views.py to block api calls from deleting files from unprivileged users; The Templates needed to be updated to support the various group checks related to removing buttons an unprivileged user should not see

* bumped the channel templates to remove conflict

* fix linting issues

* more linting

---------

Co-authored-by: Merlin <4706504+MerlinScheurer@users.noreply.github.com>
2023-10-15 13:58:06 +07:00
Simon 446d5b7949 Merge branch 'master' into testing 2023-10-15 12:03:08 +07:00
Simon 17c0310220
bump docker compose version, #569 2023-10-13 08:32:40 +07:00
Omar Laham 1b0be84972
Remove /media/ prefix from Download File URL in video.html (#567) 2023-10-13 08:20:42 +07:00
105 changed files with 4141 additions and 2620 deletions

View File

@ -38,6 +38,6 @@ body:
attributes:
label: Relevant log output
description: Please copy and paste any relevant Docker logs. This will be automatically formatted into code, so no need for backticks.
render: shell
render: Shell
validations:
required: true

View File

@ -1,8 +1,10 @@
## Contributing to Tube Archivist
# Contributing to Tube Archivist
Welcome, and thanks for showing interest in improving Tube Archivist!
## Table of Content
- [Next Steps](#next-steps)
- [Beta Testing](#beta-testing)
- [How to open an issue](#how-to-open-an-issue)
- [Bug Report](#bug-report)
- [Feature Request](#feature-request)
@ -14,8 +16,31 @@ Welcome, and thanks for showing interest in improving Tube Archivist!
- [Development Environment](#development-environment)
---
## Next Steps
Going forward, this project will focus on developing a new modern frontend.
- For the time being, don't open any new PRs that are not towards the new frontend.
- New feature requests likely won't get accepted during this process.
- Depending on the severity, bug reports may or may not get fixed during this time.
- When in doubt, reach out.
Join us on [Discord](https://tubearchivist.com/discord) if you want to help with that process.
## Beta Testing
Be the first to help test new features and improvements and provide feedback! There are regular `:unstable` builds for easy access. That's for the tinkerers and the brave. Ideally use a testing environment first; before a release, be the first to install it on your main system.
There is always something that can get missed during development. Look at the commit messages tagged with `#build`: these are the unstable builds and give a quick overview of what has changed.
- Test the features mentioned, play around, try to break it.
- Test the update path by installing the `:latest` release first, then upgrade to `:unstable` to check for any errors.
- Test the unstable build on a fresh install.
Then provide feedback, whether you ran into a problem or not. Reach out on [Discord](https://tubearchivist.com/discord) in the `#beta-testing` channel with your findings.
This will help with a smooth update for the regular release. Plus you get to test things out early!
## How to open an issue
Please read this carefully before opening any [issue](https://github.com/tubearchivist/tubearchivist/issues) on GitHub.
Please read this carefully before opening any [issue](https://github.com/tubearchivist/tubearchivist/issues) on GitHub. Make sure you read [Next Steps](#next-steps) above.
**Do**:
- Do provide details and context, this matters a lot and makes it easier for people to help.
@ -37,12 +62,12 @@ Please keep in mind:
- A bug that can't be reproduced, is difficult or sometimes even impossible to fix. Provide very clear steps *how to reproduce*.
### Feature Request
This project needs your help to grow further. There is no shortage of ideas, see the open [issues on GH](https://github.com/tubearchivist/tubearchivist/issues?q=is%3Aopen+is%3Aissue+label%3Aenhancement) and the [roadmap](https://github.com/tubearchivist/tubearchivist#roadmap), what this project lacks is contributors to implement these ideas.
This project needs your help to grow further. There is no shortage of ideas; see the open [issues on GH](https://github.com/tubearchivist/tubearchivist/issues?q=is%3Aopen+is%3Aissue+label%3Aenhancement) and the [roadmap](https://github.com/tubearchivist/tubearchivist#roadmap). What this project lacks is contributors interested in helping with overall improvements of the application. Focus is *not* on adding new features, but on improving existing ones.
Existing ideas are easily *multiple years* worth of development effort, at least at current speed. Best and fastest way to implement your feature is to do it yourself, that's why this project is open source after all. This project is *very* selective with accepting new feature requests at this point.
Existing ideas are easily *multiple years* worth of development effort, at least at current speed. This project is *very* selective with accepting new feature requests at this point.
Good feature requests usually fall into one or more of these categories:
- You want to work on your own idea within the next few days or weeks.
- You want to work on your own small scoped idea within the next few days or weeks.
- Your idea is beneficial for a wide range of users, not just for you.
- Your idea extends the current project by building on and improving existing functionality.
- Your idea is quick and easy to implement, for an experienced as well as for a first time contributor.
@ -66,7 +91,11 @@ IMPORTANT: When receiving help, contribute back to the community by improving th
## How to make a Pull Request
Thank you for contributing and helping improve this project. This is a quick checklist to help streamline the process:
Make sure you read [Next Steps](#next-steps) above.
Thank you for contributing and helping improve this project. Focus for the foreseeable future is on improving and building on existing functionality, *not* on adding and expanding the application.
This is a quick checklist to help streamline the process:
- For **code changes**, make your PR against the [testing branch](https://github.com/tubearchivist/tubearchivist/tree/testing). That's where all active development happens. This simplifies the later merging into *master*, minimizes any conflicts and usually allows for easy and convenient *fast-forward* merging.
- For **documentation changes**, make your PR directly against the *master* branch.

View File

@ -1,9 +1,9 @@
# multi stage to build tube archivist
# first stage to build python wheel, copy into final image
# build python wheel, download and extract ffmpeg, copy into final image
# First stage to build python wheel
FROM python:3.11.3-slim-bullseye AS builder
FROM python:3.11.8-slim-bookworm AS builder
ARG TARGETPLATFORM
RUN apt-get update && apt-get install -y --no-install-recommends \
@ -13,8 +13,13 @@ RUN apt-get update && apt-get install -y --no-install-recommends \
COPY ./tubearchivist/requirements.txt /requirements.txt
RUN pip install --user -r requirements.txt
# build ffmpeg
FROM python:3.11.8-slim-bookworm as ffmpeg-builder
COPY docker_assets/ffmpeg_download.py ffmpeg_download.py
RUN python ffmpeg_download.py $TARGETPLATFORM
# build final image
FROM python:3.11.3-slim-bullseye as tubearchivist
FROM python:3.11.8-slim-bookworm as tubearchivist
ARG TARGETPLATFORM
ARG INSTALL_DEBUG
@ -25,30 +30,15 @@ ENV PYTHONUNBUFFERED 1
COPY --from=builder /root/.local /root/.local
ENV PATH=/root/.local/bin:$PATH
# copy ffmpeg
COPY --from=ffmpeg-builder ./ffmpeg/ffmpeg /usr/bin/ffmpeg
COPY --from=ffmpeg-builder ./ffprobe/ffprobe /usr/bin/ffprobe
# install distro packages needed
RUN apt-get clean && apt-get -y update && apt-get -y install --no-install-recommends \
nginx \
atomicparsley \
curl \
xz-utils && rm -rf /var/lib/apt/lists/*
# install patched ffmpeg build, default to linux64
RUN if [ "$TARGETPLATFORM" = "linux/arm64" ] ; then \
curl -s https://api.github.com/repos/yt-dlp/FFmpeg-Builds/releases/latest \
| grep browser_download_url \
| grep ".*master.*linuxarm64.*tar.xz" \
| cut -d '"' -f 4 \
| xargs curl -L --output ffmpeg.tar.xz ; \
else \
curl -s https://api.github.com/repos/yt-dlp/FFmpeg-Builds/releases/latest \
| grep browser_download_url \
| grep ".*master.*linux64.*tar.xz" \
| cut -d '"' -f 4 \
| xargs curl -L --output ffmpeg.tar.xz ; \
fi && \
tar -xf ffmpeg.tar.xz --strip-components=2 --no-anchored -C /usr/bin/ "ffmpeg" && \
tar -xf ffmpeg.tar.xz --strip-components=2 --no-anchored -C /usr/bin/ "ffprobe" && \
rm ffmpeg.tar.xz
curl && rm -rf /var/lib/apt/lists/*
# install debug tools for testing environment
RUN if [ "$INSTALL_DEBUG" ] ; then \

View File

@ -34,8 +34,8 @@ Once your YouTube video collection grows, it becomes hard to search and find a s
- [Discord](https://www.tubearchivist.com/discord): Connect with us on our Discord server.
- [r/TubeArchivist](https://www.reddit.com/r/TubeArchivist/): Join our Subreddit.
- [Browser Extension](https://github.com/tubearchivist/browser-extension) Tube Archivist Companion, for [Firefox](https://addons.mozilla.org/addon/tubearchivist-companion/) and [Chrome](https://chrome.google.com/webstore/detail/tubearchivist-companion/jjnkmicfnfojkkgobdfeieblocadmcie)
- [Jellyfin Integration](https://github.com/tubearchivist/tubearchivist-jf): Add your videos to Jellyfin.
- [Tube Archivist Metrics](https://github.com/tubearchivist/tubearchivist-metrics) to create statistics in Prometheus/OpenMetrics format.
- [Jellyfin Plugin](https://github.com/tubearchivist/tubearchivist-jf-plugin): Add your videos to Jellyfin
- [Plex Plugin](https://github.com/tubearchivist/tubearchivist-plex): Add your videos to Plex
## Installing
For minimal system requirements, the Tube Archivist stack needs around 2GB of available memory for a small testing setup and around 4GB of available memory for a mid to large sized installation. As a minimum, a dual core CPU with 4 threads; a quad core or better is recommended.
@ -135,6 +135,11 @@ The Elasticsearch index will turn to ***read only*** if the disk usage of the co
Similar to that, TubeArchivist will become all sorts of messed up when running out of disk space. There are some error messages in the logs when that happens, but it's best to make sure to have enough disk space before starting to download.
## `error setting rlimit`
If you are seeing errors like `failed to create shim: OCI runtime create failed` and `error during container init: error setting rlimits`, this means Docker can't set these limits, usually because they are already set elsewhere or are otherwise incompatible. The solution is to remove the `ulimits` key from the ES container in your docker compose and start again.
This can happen if you have nested virtualization, e.g. LXC running Docker in Proxmox.
## Known limitations
- Video files created by Tube Archivist need to be playable in your browser of choice. Not every codec is compatible with every browser and might require some testing with format selection.
- Every limitation of **yt-dlp** will also be present in Tube Archivist. If **yt-dlp** can't download or extract a video for any reason, Tube Archivist won't be able to either.
@ -144,8 +149,9 @@ Similar to that, TubeArchivist will become all sorts of messed up when running o
We have come far, nonetheless we are not short of ideas on how to improve and extend this project. Issues waiting for you to be tackled in no particular order:
- [ ] User roles
- [ ] Audio download
- [ ] Podcast mode to serve channel as mp3
- [ ] User created playlists, random and repeat controls ([#108](https://github.com/tubearchivist/tubearchivist/issues/108), [#220](https://github.com/tubearchivist/tubearchivist/issues/220))
- [ ] Random and repeat controls ([#108](https://github.com/tubearchivist/tubearchivist/issues/108), [#220](https://github.com/tubearchivist/tubearchivist/issues/220))
- [ ] Auto play or play next link ([#226](https://github.com/tubearchivist/tubearchivist/issues/226))
- [ ] Multi language support
- [ ] Show total videos downloaded vs. total videos available in channel
@ -153,8 +159,10 @@ We have come far, nonetheless we are not short of ideas on how to improve and ex
- [ ] Custom searchable notes to videos, channels, playlists ([#144](https://github.com/tubearchivist/tubearchivist/issues/144))
- [ ] Search comments
- [ ] Search download queue
- [ ] Configure shorts, streams and video sizes per channel
Implemented:
- [X] User created playlists [2024-04-10]
- [X] Add statistics of index [2023-09-03]
- [X] Implement [Apprise](https://github.com/caronc/apprise) for notifications [2023-08-05]
- [X] Download video comments [2022-11-30]
@ -181,9 +189,17 @@ Implemented:
- [X] Scan your file system to index already downloaded videos [2021-09-14]
## User Scripts
This is a list of useful user scripts, generously created from folks like you to extend this project and its functionality. This is your time to shine, [read this](https://github.com/tubearchivist/tubearchivist/blob/master/CONTRIBUTING.md#user-scripts) then open a PR to add your script here.
This is a list of useful user scripts, generously created from folks like you to extend this project and its functionality. Make sure to check the respective repository links for detailed license information.
- Example 1
This is your time to shine, [read this](https://github.com/tubearchivist/tubearchivist/blob/master/CONTRIBUTING.md#user-scripts) then open a PR to add your script here.
- [danieljue/ta_dl_page_script](https://github.com/danieljue/ta_dl_page_script): Helper browser script to prioritize a channel's videos in the download queue.
- [dot-mike/ta-scripts](https://github.com/dot-mike/ta-scripts): A collection of personal scripts for managing TubeArchivist.
- [DarkFighterLuke/ta_base_url_nginx](https://gist.github.com/DarkFighterLuke/4561b6bfbf83720493dc59171c58ac36): Set base URL with Nginx when you can't use subdomains.
- [lamusmaser/ta_migration_helper](https://github.com/lamusmaser/ta_migration_helper): Advanced helper script for migration issues to TubeArchivist v0.4.4 or later.
- [lamusmaser/create_info_json](https://gist.github.com/lamusmaser/837fb58f73ea0cad784a33497932e0dd): Script to generate `.info.json` files using `ffmpeg` collecting information from downloaded videos.
- [lamusmaser/ta_fix_for_video_redirection](https://github.com/lamusmaser/ta_fix_for_video_redirection): Script to fix videos that were incorrectly indexed by YouTube's "Video is Unavailable" response.
- [RoninTech/ta-helper](https://github.com/RoninTech/ta-helper): Helper script to provide a symlink association to reference TubeArchivist videos with their original titles.
## Donate
The best donation to **Tube Archivist** is your time, take a look at the [contribution page](CONTRIBUTING.md) to get started.

View File

@ -1,4 +1,4 @@
version: '3.3'
version: '3.5'
services:
tubearchivist:
@ -40,7 +40,7 @@ services:
depends_on:
- archivist-es
archivist-es:
image: bbilly1/tubearchivist-es # only for amd64, or use official es 8.9.0
image: bbilly1/tubearchivist-es # only for amd64, or use official es 8.13.2
container_name: archivist-es
restart: unless-stopped
environment:

View File

@ -0,0 +1,71 @@
"""
ffmpeg link builder
copied as into build step in Dockerfile
"""
import json
import os
import sys
import tarfile
import urllib.request
from enum import Enum
API_URL = "https://api.github.com/repos/yt-dlp/FFmpeg-Builds/releases/latest"
BINARIES = ["ffmpeg", "ffprobe"]
class PlatformFilter(Enum):
"""options"""
ARM64 = "linuxarm64"
AMD64 = "linux64"
def get_assets():
"""get all available assets from latest build"""
with urllib.request.urlopen(API_URL) as f:
all_links = json.loads(f.read().decode("utf-8"))
return all_links
def pick_url(all_links, platform):
"""pick url for platform"""
filter_by = PlatformFilter[platform.split("/")[1].upper()].value
options = [i for i in all_links["assets"] if filter_by in i["name"]]
if not options:
raise ValueError(f"no valid asset found for filter {filter_by}")
url_pick = options[0]["browser_download_url"]
return url_pick
def download_extract(url):
"""download and extract binaries"""
print("download file")
filename, _ = urllib.request.urlretrieve(url)
print("extract file")
with tarfile.open(filename, "r:xz") as tar:
for member in tar.getmembers():
member.name = os.path.basename(member.name)
if member.name in BINARIES:
print(f"extract {member.name}")
tar.extract(member, member.name)
def main():
"""entry point"""
args = sys.argv
if len(args) == 1:
platform = "linux/amd64"
else:
platform = args[1]
all_links = get_assets()
url = pick_url(all_links, platform)
download_extract(url)
if __name__ == "__main__":
main()

View File

@ -14,11 +14,10 @@ fi
python manage.py ta_envcheck
python manage.py ta_connection
python manage.py ta_startup
python manage.py ta_migpath
# start all tasks
nginx &
celery -A home.tasks worker --loglevel=INFO &
celery -A home.celery worker --loglevel=INFO --max-tasks-per-child 10 &
celery -A home beat --loglevel=INFO \
-s "${BEAT_SCHEDULE_PATH:-${cachedir}/celerybeat-schedule}" &
--scheduler django_celery_beat.schedulers:DatabaseScheduler &
uwsgi --ini uwsgi.ini

View File

@ -2,6 +2,7 @@
from home.src.es.connect import ElasticWrap
from home.src.ta.helper import get_duration_str
from home.src.ta.settings import EnvironmentSettings
class AggBase:
@ -23,62 +24,159 @@ class AggBase:
raise NotImplementedError
class Primary(AggBase):
"""primary aggregation for total documents indexed"""
class Video(AggBase):
"""get video stats"""
name = "primary"
path = "ta_video,ta_channel,ta_playlist,ta_subtitle,ta_download/_search"
name = "video_stats"
path = "ta_video/_search"
data = {
"size": 0,
"aggs": {
"video_type": {
"filter": {"exists": {"field": "active"}},
"aggs": {"filtered": {"terms": {"field": "vid_type"}}},
"terms": {"field": "vid_type"},
"aggs": {
"media_size": {"sum": {"field": "media_size"}},
"duration": {"sum": {"field": "player.duration"}},
},
},
"channel_total": {"value_count": {"field": "channel_active"}},
"channel_sub": {"terms": {"field": "channel_subscribed"}},
"playlist_total": {"value_count": {"field": "playlist_active"}},
"playlist_sub": {"terms": {"field": "playlist_subscribed"}},
"download": {"terms": {"field": "status"}},
"video_active": {
"terms": {"field": "active"},
"aggs": {
"media_size": {"sum": {"field": "media_size"}},
"duration": {"sum": {"field": "player.duration"}},
},
},
"video_media_size": {"sum": {"field": "media_size"}},
"video_count": {"value_count": {"field": "youtube_id"}},
"duration": {"sum": {"field": "player.duration"}},
},
}
def process(self):
"""make the call"""
"""process aggregation"""
aggregations = self.get()
videos = {"total": aggregations["video_type"].get("doc_count")}
videos.update(
{
i.get("key"): i.get("doc_count")
for i in aggregations["video_type"]["filtered"]["buckets"]
}
)
channels = {"total": aggregations["channel_total"].get("value")}
channels.update(
{
"sub_" + i.get("key_as_string"): i.get("doc_count")
for i in aggregations["channel_sub"]["buckets"]
}
)
playlists = {"total": aggregations["playlist_total"].get("value")}
playlists.update(
{
"sub_" + i.get("key_as_string"): i.get("doc_count")
for i in aggregations["playlist_sub"]["buckets"]
}
)
downloads = {
i.get("key"): i.get("doc_count")
for i in aggregations["download"]["buckets"]
duration = int(aggregations["duration"]["value"])
response = {
"doc_count": aggregations["video_count"]["value"],
"media_size": int(aggregations["video_media_size"]["value"]),
"duration": duration,
"duration_str": get_duration_str(duration),
}
for bucket in aggregations["video_type"]["buckets"]:
duration = int(bucket["duration"].get("value"))
response.update(
{
f"type_{bucket['key']}": {
"doc_count": bucket.get("doc_count"),
"media_size": int(bucket["media_size"].get("value")),
"duration": duration,
"duration_str": get_duration_str(duration),
}
}
)
for bucket in aggregations["video_active"]["buckets"]:
duration = int(bucket["duration"].get("value"))
response.update(
{
f"active_{bucket['key_as_string']}": {
"doc_count": bucket.get("doc_count"),
"media_size": int(bucket["media_size"].get("value")),
"duration": duration,
"duration_str": get_duration_str(duration),
}
}
)
return response
class Channel(AggBase):
"""get channel stats"""
name = "channel_stats"
path = "ta_channel/_search"
data = {
"size": 0,
"aggs": {
"channel_count": {"value_count": {"field": "channel_id"}},
"channel_active": {"terms": {"field": "channel_active"}},
"channel_subscribed": {"terms": {"field": "channel_subscribed"}},
},
}
def process(self):
"""process aggregation"""
aggregations = self.get()
response = {
"videos": videos,
"channels": channels,
"playlists": playlists,
"downloads": downloads,
"doc_count": aggregations["channel_count"].get("value"),
}
for bucket in aggregations["channel_active"]["buckets"]:
key = f"active_{bucket['key_as_string']}"
response.update({key: bucket.get("doc_count")})
for bucket in aggregations["channel_subscribed"]["buckets"]:
key = f"subscribed_{bucket['key_as_string']}"
response.update({key: bucket.get("doc_count")})
return response
class Playlist(AggBase):
"""get playlist stats"""
name = "playlist_stats"
path = "ta_playlist/_search"
data = {
"size": 0,
"aggs": {
"playlist_count": {"value_count": {"field": "playlist_id"}},
"playlist_active": {"terms": {"field": "playlist_active"}},
"playlist_subscribed": {"terms": {"field": "playlist_subscribed"}},
},
}
def process(self):
"""process aggregation"""
aggregations = self.get()
response = {"doc_count": aggregations["playlist_count"].get("value")}
for bucket in aggregations["playlist_active"]["buckets"]:
key = f"active_{bucket['key_as_string']}"
response.update({key: bucket.get("doc_count")})
for bucket in aggregations["playlist_subscribed"]["buckets"]:
key = f"subscribed_{bucket['key_as_string']}"
response.update({key: bucket.get("doc_count")})
return response
class Download(AggBase):
"""get downloads queue stats"""
name = "download_queue_stats"
path = "ta_download/_search"
data = {
"size": 0,
"aggs": {
"status": {"terms": {"field": "status"}},
"video_type": {
"filter": {"term": {"status": "pending"}},
"aggs": {"type_pending": {"terms": {"field": "vid_type"}}},
},
},
}
def process(self):
"""process aggregation"""
aggregations = self.get()
response = {}
for bucket in aggregations["status"]["buckets"]:
response.update({bucket["key"]: bucket.get("doc_count")})
for bucket in aggregations["video_type"]["type_pending"]["buckets"]:
key = f"pending_{bucket['key']}"
response.update({key: bucket.get("doc_count")})
return response
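Each of these aggregation classes follows the same AggBase contract: the class-level `path` and `data` define the Elasticsearch request, `get()` runs it (presumably via `ElasticWrap`, per the import above), and `process()` flattens the buckets into a flat dict. A hypothetical caller, as a sketch only:

from api.src.aggs import Channel, Download, Playlist, Video

for agg in (Video(), Channel(), Playlist(), Download()):
    stats = agg.process()  # runs the ES aggregation and flattens the buckets
    print(agg.name, stats)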
@ -117,7 +215,7 @@ class WatchProgress(AggBase):
all_duration = int(aggregations["total_duration"].get("value"))
response.update(
{
"all": {
"total": {
"duration": all_duration,
"duration_str": get_duration_str(all_duration),
"items": aggregations["total_vids"].get("value"),
@ -168,13 +266,22 @@ class DownloadHist(AggBase):
"calendar_interval": "day",
"format": "yyyy-MM-dd",
"order": {"_key": "desc"},
"time_zone": EnvironmentSettings.TZ,
},
"aggs": {
"total_videos": {"value_count": {"field": "youtube_id"}}
"total_videos": {"value_count": {"field": "youtube_id"}},
"media_size": {"sum": {"field": "media_size"}},
},
}
},
"query": {"range": {"date_downloaded": {"gte": "now-7d/d"}}},
"query": {
"range": {
"date_downloaded": {
"gte": "now-7d/d",
"time_zone": EnvironmentSettings.TZ,
}
}
},
}
def process(self):
@ -186,6 +293,7 @@ class DownloadHist(AggBase):
{
"date": i.get("key_as_string"),
"count": i.get("doc_count"),
"media_size": i["media_size"].get("value"),
}
for i in buckets
]

View File

@ -7,15 +7,14 @@ Functionality:
import urllib.parse
from home.src.download.thumbnails import ThumbManager
from home.src.ta.config import AppConfig
from home.src.ta.helper import date_praser, get_duration_str
from home.src.ta.settings import EnvironmentSettings
class SearchProcess:
"""process search results"""
CONFIG = AppConfig().config
CACHE_DIR = CONFIG["application"]["cache_dir"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
def __init__(self, response):
self.response = response

View File

@ -96,6 +96,16 @@ urlpatterns = [
views.SnapshotApiView.as_view(),
name="api-snapshot",
),
path(
"backup/",
views.BackupApiListView.as_view(),
name="api-backup-list",
),
path(
"backup/<str:filename>/",
views.BackupApiView.as_view(),
name="api-backup",
),
path(
"task-name/",
views.TaskListView.as_view(),
@ -111,6 +121,21 @@ urlpatterns = [
views.TaskIDView.as_view(),
name="api-task-id",
),
path(
"schedule/",
views.ScheduleView.as_view(),
name="api-schedule",
),
path(
"schedule/notification/",
views.ScheduleNotification.as_view(),
name="api-schedule-notification",
),
path(
"config/user/",
views.UserConfigView.as_view(),
name="api-config-user",
),
path(
"cookie/",
views.CookieView.as_view(),
@ -137,9 +162,24 @@ urlpatterns = [
name="api-notification",
),
path(
"stats/primary/",
views.StatPrimaryView.as_view(),
name="api-stats-primary",
"stats/video/",
views.StatVideoView.as_view(),
name="api-stats-video",
),
path(
"stats/channel/",
views.StatChannelView.as_view(),
name="api-stats-channel",
),
path(
"stats/playlist/",
views.StatPlaylistView.as_view(),
name="api-stats-playlist",
),
path(
"stats/download/",
views.StatDownloadView.as_view(),
name="api-stats-download",
),
path(
"stats/watch/",

View File

@ -1,13 +1,23 @@
"""all API views"""
from api.src.aggs import BiggestChannel, DownloadHist, Primary, WatchProgress
from api.src.aggs import (
BiggestChannel,
Channel,
Download,
DownloadHist,
Playlist,
Video,
WatchProgress,
)
from api.src.search_processor import SearchProcess
from home.models import CustomPeriodicTask
from home.src.download.queue import PendingInteract
from home.src.download.subscriptions import (
ChannelSubscription,
PlaylistSubscription,
)
from home.src.download.yt_dlp_base import CookieHandler
from home.src.es.backup import ElasticBackup
from home.src.es.connect import ElasticWrap
from home.src.es.snapshot import ElasticSnapshot
from home.src.frontend.searching import SearchForm
@ -18,38 +28,70 @@ from home.src.index.playlist import YoutubePlaylist
from home.src.index.reindex import ReindexProgress
from home.src.index.video import SponsorBlock, YoutubeVideo
from home.src.ta.config import AppConfig, ReleaseVersion
from home.src.ta.notify import Notifications, get_all_notifications
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisArchivist
from home.src.ta.task_config import TASK_CONFIG
from home.src.ta.task_manager import TaskCommand, TaskManager
from home.src.ta.urlparser import Parser
from home.src.ta.users import UserConfig
from home.tasks import (
BaseTask,
check_reindex,
download_pending,
extrac_dl,
run_restore_backup,
subscribe_to,
)
from rest_framework import permissions, status
from rest_framework.authentication import (
SessionAuthentication,
TokenAuthentication,
)
from rest_framework.authtoken.models import Token
from rest_framework.authtoken.views import ObtainAuthToken
from rest_framework.permissions import IsAuthenticated
from rest_framework.response import Response
from rest_framework.views import APIView
def check_admin(user):
"""check for admin permission for restricted views"""
return user.is_staff or user.groups.filter(name="admin").exists()
class AdminOnly(permissions.BasePermission):
"""allow only admin"""
def has_permission(self, request, view):
return check_admin(request.user)
class AdminWriteOnly(permissions.BasePermission):
"""allow only admin writes"""
def has_permission(self, request, view):
if request.method in permissions.SAFE_METHODS:
return permissions.IsAuthenticated().has_permission(request, view)
return check_admin(request.user)
class ApiBaseView(APIView):
"""base view to inherit from"""
authentication_classes = [SessionAuthentication, TokenAuthentication]
permission_classes = [IsAuthenticated]
permission_classes = [permissions.IsAuthenticated]
search_base = ""
data = ""
def __init__(self):
super().__init__()
self.response = {"data": False, "config": AppConfig().config}
self.response = {
"data": False,
"config": {
"enable_cast": EnvironmentSettings.ENABLE_CAST,
"downloads": AppConfig().config["downloads"],
},
}
self.data = {"query": {"match_all": {}}}
self.status_code = False
self.context = False
@ -102,6 +144,7 @@ class VideoApiView(ApiBaseView):
"""
search_base = "ta_video/_doc/"
permission_classes = [AdminWriteOnly]
def get(self, request, video_id):
# pylint: disable=unused-argument
@ -165,7 +208,6 @@ class VideoProgressView(ApiBaseView):
message = {"position": position, "youtube_id": video_id}
RedisArchivist().set_message(key, message)
self.response = request.data
return Response(self.response)
def delete(self, request, video_id):
@ -276,6 +318,7 @@ class ChannelApiView(ApiBaseView):
"""
search_base = "ta_channel/_doc/"
permission_classes = [AdminWriteOnly]
def get(self, request, channel_id):
# pylint: disable=unused-argument
@ -306,6 +349,7 @@ class ChannelApiListView(ApiBaseView):
search_base = "ta_channel/_search/"
valid_filter = ["subscribed"]
permission_classes = [AdminWriteOnly]
def get(self, request):
"""get request"""
@ -419,12 +463,27 @@ class PlaylistApiListView(ApiBaseView):
"""
search_base = "ta_playlist/_search/"
permission_classes = [AdminWriteOnly]
valid_playlist_type = ["regular", "custom"]
def get(self, request):
"""handle get request"""
self.data.update(
{"sort": [{"playlist_name.keyword": {"order": "asc"}}]}
)
playlist_type = request.GET.get("playlist_type", None)
query = {"sort": [{"playlist_name.keyword": {"order": "asc"}}]}
if playlist_type is not None:
if playlist_type not in self.valid_playlist_type:
message = f"invalid playlist_type {playlist_type}"
return Response({"message": message}, status=400)
query.update(
{
"query": {
"term": {"playlist_type": {"value": playlist_type}}
},
}
)
self.data.update(query)
self.get_document_list(request)
return Response(self.response)
@ -467,6 +526,8 @@ class PlaylistApiView(ApiBaseView):
"""
search_base = "ta_playlist/_doc/"
permission_classes = [AdminWriteOnly]
valid_custom_actions = ["create", "remove", "up", "down", "top", "bottom"]
def get(self, request, playlist_id):
# pylint: disable=unused-argument
@ -474,6 +535,27 @@ class PlaylistApiView(ApiBaseView):
self.get_document(playlist_id)
return Response(self.response, status=self.status_code)
def post(self, request, playlist_id):
"""post to custom playlist to add a video to list"""
playlist = YoutubePlaylist(playlist_id)
if not playlist.is_custom_playlist():
message = f"playlist with ID {playlist_id} is not custom"
return Response({"message": message}, status=400)
action = request.data.get("action")
if action not in self.valid_custom_actions:
message = f"invalid action: {action}"
return Response({"message": message}, status=400)
video_id = request.data.get("video_id")
if action == "create":
playlist.add_video_to_playlist(video_id)
else:
hide = UserConfig(request.user.id).get_value("hide_watched")
playlist.move_video(video_id, action, hide_watched=hide)
return Response({"success": True}, status=status.HTTP_201_CREATED)
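For illustration, here is a custom playlist action as a client would send it — host, IDs, and token are placeholders; the 201 status matches the view above:

import requests

resp = requests.post(
    "http://localhost:8000/api/playlist/<playlist-id>/",  # placeholder host and ID
    json={"action": "up", "video_id": "<video-id>"},
    headers={"Authorization": "Token <api-token>"},
)
assert resp.status_code == 201  # HTTP_201_CREATED on success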
def delete(self, request, playlist_id):
"""delete playlist"""
print(f"{playlist_id}: delete playlist")
@ -512,7 +594,8 @@ class DownloadApiView(ApiBaseView):
"""
search_base = "ta_download/_doc/"
valid_status = ["pending", "ignore", "priority"]
valid_status = ["pending", "ignore", "ignore-force", "priority"]
permission_classes = [AdminOnly]
def get(self, request, video_id):
# pylint: disable=unused-argument
@ -528,6 +611,11 @@ class DownloadApiView(ApiBaseView):
print(message)
return Response({"message": message}, status=400)
if item_status == "ignore-force":
extrac_dl.delay(video_id, status="ignore")
message = f"{video_id}: set status to ignore"
return Response(request.data)
_, status_code = PendingInteract(video_id).get_item()
if status_code == 404:
message = f"{video_id}: item not found {status_code}"
@ -559,6 +647,7 @@ class DownloadApiListView(ApiBaseView):
search_base = "ta_download/_search/"
valid_filter = ["pending", "ignore"]
permission_classes = [AdminOnly]
def get(self, request):
"""get request"""
@ -667,6 +756,8 @@ class SnapshotApiListView(ApiBaseView):
POST: take snapshot now
"""
permission_classes = [AdminOnly]
@staticmethod
def get(request):
"""handle get request"""
@ -691,6 +782,8 @@ class SnapshotApiView(ApiBaseView):
DELETE: delete a snapshot
"""
permission_classes = [AdminOnly]
@staticmethod
def get(request, snapshot_id):
"""handle get request"""
@ -725,11 +818,87 @@ class SnapshotApiView(ApiBaseView):
return Response(response)
class BackupApiListView(ApiBaseView):
"""resolves to /api/backup/
GET: returns list of available zip backups
POST: take zip backup now
"""
permission_classes = [AdminOnly]
task_name = "run_backup"
@staticmethod
def get(request):
"""handle get request"""
# pylint: disable=unused-argument
backup_files = ElasticBackup().get_all_backup_files()
return Response(backup_files)
def post(self, request):
"""handle post request"""
# pylint: disable=unused-argument
response = TaskCommand().start(self.task_name)
message = {
"message": "backup task started",
"task_id": response["task_id"],
}
return Response(message)
class BackupApiView(ApiBaseView):
"""resolves to /api/backup/<filename>/
GET: return a single backup
POST: restore backup
DELETE: delete backup
"""
permission_classes = [AdminOnly]
task_name = "restore_backup"
@staticmethod
def get(request, filename):
"""get single backup"""
# pylint: disable=unused-argument
backup_file = ElasticBackup().build_backup_file_data(filename)
if not backup_file:
message = {"message": "file not found"}
return Response(message, status=404)
return Response(backup_file)
def post(self, request, filename):
"""restore backup file"""
# pylint: disable=unused-argument
task = run_restore_backup.delay(filename)
message = {
"message": "backup restore task started",
"filename": filename,
"task_id": task.id,
}
return Response(message)
@staticmethod
def delete(request, filename):
"""delete backup file"""
# pylint: disable=unused-argument
backup_file = ElasticBackup().delete_file(filename)
if not backup_file:
message = {"message": "file not found"}
return Response(message, status=404)
message = {"message": f"file {filename} deleted"}
return Response(message)
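A sketch of how these backup endpoints could be driven from a client — host and token are placeholders, and the `filename` field is an assumption based on `build_backup_file_data` above:

import requests

BASE = "http://localhost:8000/api"  # placeholder instance
HEADERS = {"Authorization": "Token <api-token>"}

# start a new zip backup task
print(requests.post(f"{BASE}/backup/", headers=HEADERS).json())

# list available backup files, then restore one of them
files = requests.get(f"{BASE}/backup/", headers=HEADERS).json()
if files:
    filename = files[0]["filename"]  # field name assumed
    print(requests.post(f"{BASE}/backup/{filename}/", headers=HEADERS).json())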
class TaskListView(ApiBaseView):
"""resolves to /api/task-name/
GET: return a list of all stored task results
"""
permission_classes = [AdminOnly]
def get(self, request):
"""handle get request"""
# pylint: disable=unused-argument
@ -744,10 +913,12 @@ class TaskNameListView(ApiBaseView):
POST: start new background process
"""
permission_classes = [AdminOnly]
def get(self, request, task_name):
"""handle get request"""
# pylint: disable=unused-argument
if task_name not in BaseTask.TASK_CONFIG:
if task_name not in TASK_CONFIG:
message = {"message": "invalid task name"}
return Response(message, status=404)
@ -762,12 +933,12 @@ class TaskNameListView(ApiBaseView):
400 if task can't be started here without argument
"""
# pylint: disable=unused-argument
task_config = BaseTask.TASK_CONFIG.get(task_name)
task_config = TASK_CONFIG.get(task_name)
if not task_config:
message = {"message": "invalid task name"}
return Response(message, status=404)
if not task_config.get("api-start"):
if not task_config.get("api_start"):
message = {"message": "can not start task through this endpoint"}
return Response(message, status=400)
@ -782,6 +953,7 @@ class TaskIDView(ApiBaseView):
"""
valid_commands = ["stop", "kill"]
permission_classes = [AdminOnly]
def get(self, request, task_id):
"""handle get request"""
@ -805,16 +977,16 @@ class TaskIDView(ApiBaseView):
message = {"message": "task id not found"}
return Response(message, status=404)
task_conf = BaseTask.TASK_CONFIG.get(task_result.get("name"))
task_conf = TASK_CONFIG.get(task_result.get("name"))
if command == "stop":
if not task_conf.get("api-stop"):
if not task_conf.get("api_stop"):
message = {"message": "task can not be stopped"}
return Response(message, status=400)
message_key = self._build_message_key(task_conf, task_id)
TaskCommand().stop(task_id, message_key)
if command == "kill":
if not task_conf.get("api-stop"):
if not task_conf.get("api_stop"):
message = {"message": "task can not be killed"}
return Response(message, status=400)
@ -827,12 +999,64 @@ class TaskIDView(ApiBaseView):
return f"message:{task_conf.get('group')}:{task_id.split('-')[0]}"
class ScheduleView(ApiBaseView):
"""resolves to /api/schedule/
DELETE: delete schedule for task
"""
permission_classes = [AdminOnly]
def delete(self, request):
"""delete schedule by task_name query"""
task_name = request.data.get("task_name")
try:
task = CustomPeriodicTask.objects.get(name=task_name)
except CustomPeriodicTask.DoesNotExist:
message = {"message": "task_name not found"}
return Response(message, status=404)
_ = task.delete()
return Response({"success": True})
class ScheduleNotification(ApiBaseView):
"""resolves to /api/schedule/notification/
GET: get all schedule notifications
DELETE: delete notification
"""
def get(self, request):
"""handle get request"""
return Response(get_all_notifications())
def delete(self, request):
"""handle delete"""
task_name = request.data.get("task_name")
url = request.data.get("url")
if not TASK_CONFIG.get(task_name):
message = {"message": "task_name not found"}
return Response(message, status=404)
if url:
response, status_code = Notifications(task_name).remove_url(url)
else:
response, status_code = Notifications(task_name).remove_task()
return Response({"response": response, "status_code": status_code})
class RefreshView(ApiBaseView):
"""resolves to /api/refresh/
GET: get refresh progress
POST: start a manual refresh task
"""
permission_classes = [AdminOnly]
def get(self, request):
"""handle get request"""
request_type = request.GET.get("type")
@ -859,6 +1083,42 @@ class RefreshView(ApiBaseView):
return Response(data)
class UserConfigView(ApiBaseView):
"""resolves to /api/config/user/
GET: return current user config
POST: update user config
"""
def get(self, request):
"""get config"""
user_id = request.user.id
response = UserConfig(user_id).get_config()
response.update({"user_id": user_id})
return Response(response)
def post(self, request):
"""update config"""
user_id = request.user.id
data = request.data
user_conf = UserConfig(user_id)
for key, value in data.items():
try:
user_conf.set_value(key, value)
except ValueError as err:
message = {
"status": "Bad Request",
"message": f"failed updating {key} to '{value}', {err}",
}
return Response(message, status=400)
response = user_conf.get_config()
response.update({"user_id": user_id})
return Response(response)
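A hedged example of the update round trip; the two keys shown appear elsewhere in this changeset, and any key UserConfig.set_value rejects comes back as a 400:

import requests

resp = requests.post(
    "http://localhost:8000/api/config/user/",      # assumed host
    json={"page_size": 24, "hide_watched": True},  # example keys
    headers={"Authorization": "Token abc123"},     # assumed token auth
    timeout=10,
)
print(resp.json()["user_id"])  # full config is echoed back on success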
class CookieView(ApiBaseView):
"""resolves to /api/cookie/
GET: check if cookie is enabled
@ -866,6 +1126,8 @@ class CookieView(ApiBaseView):
PUT: import cookie
"""
permission_classes = [AdminOnly]
@staticmethod
def get(request):
"""handle get request"""
@ -953,6 +1215,8 @@ class TokenView(ApiBaseView):
DELETE: revoke the token
"""
permission_classes = [AdminOnly]
@staticmethod
def delete(request):
print("revoke API token")
@ -978,16 +1242,52 @@ class NotificationView(ApiBaseView):
return Response(RedisArchivist().list_items(query))
class StatPrimaryView(ApiBaseView):
"""resolves to /api/stats/primary/
GET: return document count
class StatVideoView(ApiBaseView):
"""resolves to /api/stats/video/
GET: return video stats
"""
def get(self, request):
"""get stats"""
# pylint: disable=unused-argument
return Response(Primary().process())
return Response(Video().process())
class StatChannelView(ApiBaseView):
"""resolves to /api/stats/channel/
GET: return channel stats
"""
def get(self, request):
"""get stats"""
# pylint: disable=unused-argument
return Response(Channel().process())
class StatPlaylistView(ApiBaseView):
"""resolves to /api/stats/playlist/
GET: return playlist stats
"""
def get(self, request):
"""get stats"""
# pylint: disable=unused-argument
return Response(Playlist().process())
class StatDownloadView(ApiBaseView):
"""resolves to /api/stats/download/
GET: return download stats
"""
def get(self, request):
"""get stats"""
# pylint: disable=unused-argument
return Response(Download().process())
class StatWatchProgress(ApiBaseView):

View File

@ -3,12 +3,12 @@ Functionality:
- check that all connections are working
"""
from os import environ
from time import sleep
import requests
from django.core.management.base import BaseCommand, CommandError
from home.src.es.connect import ElasticWrap
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisArchivist
TOPIC = """
@ -156,10 +156,8 @@ class Command(BaseCommand):
" 🗙 path.repo env var not found. "
+ "set the following env var to the ES container:\n"
+ " path.repo="
+ environ.get(
"ES_SNAPSHOT_DIR", "/usr/share/elasticsearch/data/snapshot"
),
+ EnvironmentSettings.ES_SNAPSHOT_DIR
)
self.stdout.write(self.style.ERROR(f"{message}"))
self.stdout.write(self.style.ERROR(message))
sleep(60)
raise CommandError(message)

View File

@ -11,6 +11,7 @@ import re
from django.core.management.base import BaseCommand, CommandError
from home.models import Account
from home.src.ta.settings import EnvironmentSettings
LOGO = """
@ -96,18 +97,14 @@ class Command(BaseCommand):
def _elastic_user_overwrite(self):
"""check for ELASTIC_USER overwrite"""
self.stdout.write("[2] set default ES user")
if not os.environ.get("ELASTIC_USER"):
os.environ.setdefault("ELASTIC_USER", "elastic")
env = os.environ.get("ELASTIC_USER")
self.stdout.write("[2] check ES user overwrite")
env = EnvironmentSettings.ES_USER
self.stdout.write(self.style.SUCCESS(f" ✓ ES user is set to {env}"))
def _ta_port_overwrite(self):
"""set TA_PORT overwrite for nginx"""
self.stdout.write("[3] check TA_PORT overwrite")
overwrite = os.environ.get("TA_PORT")
overwrite = EnvironmentSettings.TA_PORT
if not overwrite:
self.stdout.write(self.style.SUCCESS(" TA_PORT is not set"))
return
@ -125,7 +122,7 @@ class Command(BaseCommand):
def _ta_uwsgi_overwrite(self):
"""set TA_UWSGI_PORT overwrite"""
self.stdout.write("[4] check TA_UWSGI_PORT overwrite")
overwrite = os.environ.get("TA_UWSGI_PORT")
overwrite = EnvironmentSettings.TA_UWSGI_PORT
if not overwrite:
message = " TA_UWSGI_PORT is not set"
self.stdout.write(self.style.SUCCESS(message))
@ -151,7 +148,7 @@ class Command(BaseCommand):
def _enable_cast_overwrite(self):
"""cast workaround, remove auth for static files in nginx"""
self.stdout.write("[5] check ENABLE_CAST overwrite")
overwrite = os.environ.get("ENABLE_CAST")
overwrite = EnvironmentSettings.ENABLE_CAST
if not overwrite:
self.stdout.write(self.style.SUCCESS(" ENABLE_CAST is not set"))
return
@ -174,8 +171,8 @@ class Command(BaseCommand):
self.stdout.write(self.style.SUCCESS(message))
return
name = os.environ.get("TA_USERNAME")
password = os.environ.get("TA_PASSWORD")
name = EnvironmentSettings.TA_USERNAME
password = EnvironmentSettings.TA_PASSWORD
Account.objects.create_superuser(name, password)
message = f" ✓ new superuser with name {name} created"
self.stdout.write(self.style.SUCCESS(message))

View File

@ -1,4 +1,8 @@
"""filepath migration from v0.3.6 to v0.3.7"""
"""
filepath migration from v0.3.6 to v0.3.7
no longer called at startup; run manually if needed:
python manage.py ta_migpath
"""
import json
import os
@ -6,8 +10,8 @@ import shutil
from django.core.management.base import BaseCommand
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.ta.config import AppConfig
from home.src.ta.helper import ignore_filelist
from home.src.ta.settings import EnvironmentSettings
TOPIC = """
@ -58,8 +62,7 @@ class FolderMigration:
"""migrate video archive folder"""
def __init__(self):
self.config = AppConfig().config
self.videos = self.config["application"]["videos"]
self.videos = EnvironmentSettings.MEDIA_DIR
self.bulk_list = []
def get_to_migrate(self):
@ -84,8 +87,8 @@ class FolderMigration:
def create_folders(self, to_migrate):
"""create required channel folders"""
host_uid = self.config["application"]["HOST_UID"]
host_gid = self.config["application"]["HOST_GID"]
host_uid = EnvironmentSettings.HOST_UID
host_gid = EnvironmentSettings.HOST_GID
all_channel_ids = {i["channel"]["channel_id"] for i in to_migrate}
for channel_id in all_channel_ids:

View File

@ -5,18 +5,24 @@ Functionality:
"""
import os
from random import randint
from time import sleep
from django.conf import settings
from django.core.management.base import BaseCommand, CommandError
from home.src.es.connect import ElasticWrap, IndexPaginate
from django_celery_beat.models import CrontabSchedule
from home.models import CustomPeriodicTask
from home.src.es.connect import ElasticWrap
from home.src.es.index_setup import ElasitIndexWrap
from home.src.es.snapshot import ElasticSnapshot
from home.src.index.video_streams import MediaStreamExtractor
from home.src.ta.config import AppConfig, ReleaseVersion
from home.src.ta.config_schedule import ScheduleBuilder
from home.src.ta.helper import clear_dl_cache
from home.src.ta.notify import Notifications
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisArchivist
from home.src.ta.task_config import TASK_CONFIG
from home.src.ta.task_manager import TaskManager
from home.src.ta.users import UserConfig
TOPIC = """
@ -37,15 +43,15 @@ class Command(BaseCommand):
self.stdout.write(TOPIC)
self._sync_redis_state()
self._make_folders()
self._release_locks()
self._clear_redis_keys()
self._clear_tasks()
self._clear_dl_cache()
self._version_check()
self._mig_index_setup()
self._mig_snapshot_check()
self._mig_set_streams()
self._mig_set_autostart()
self._mig_move_users_to_es()
self._mig_schedule_store()
self._mig_custom_playlist()
self._create_default_schedules()
def _sync_redis_state(self):
"""make sure redis gets new config.json values"""
@ -69,17 +75,17 @@ class Command(BaseCommand):
"playlists",
"videos",
]
cache_dir = AppConfig().config["application"]["cache_dir"]
cache_dir = EnvironmentSettings.CACHE_DIR
for folder in folders:
folder_path = os.path.join(cache_dir, folder)
os.makedirs(folder_path, exist_ok=True)
self.stdout.write(self.style.SUCCESS(" ✓ expected folders created"))
def _release_locks(self):
"""make sure there are no leftover locks set in redis"""
self.stdout.write("[3] clear leftover locks in redis")
all_locks = [
def _clear_redis_keys(self):
"""make sure there are no leftover locks or keys set in redis"""
self.stdout.write("[3] clear leftover keys in redis")
all_keys = [
"dl_queue_id",
"dl_queue",
"downloading",
@ -88,19 +94,22 @@ class Command(BaseCommand):
"rescan",
"run_backup",
"startup_check",
"reindex:ta_video",
"reindex:ta_channel",
"reindex:ta_playlist",
]
redis_con = RedisArchivist()
has_changed = False
for lock in all_locks:
if redis_con.del_message(lock):
for key in all_keys:
if redis_con.del_message(key):
self.stdout.write(
self.style.SUCCESS(f" ✓ cleared lock {lock}")
self.style.SUCCESS(f" ✓ cleared key {key}")
)
has_changed = True
if not has_changed:
self.stdout.write(self.style.SUCCESS(" no locks found"))
self.stdout.write(self.style.SUCCESS(" no keys found"))
def _clear_tasks(self):
"""clear tasks and messages"""
@ -119,8 +128,7 @@ class Command(BaseCommand):
def _clear_dl_cache(self):
"""clear leftover files from dl cache"""
self.stdout.write("[5] clear leftover files from dl cache")
config = AppConfig().config
leftover_files = clear_dl_cache(config)
leftover_files = clear_dl_cache(EnvironmentSettings.CACHE_DIR)
if leftover_files:
self.stdout.write(
self.style.SUCCESS(f" ✓ cleared {leftover_files} files")
@ -149,165 +157,212 @@ class Command(BaseCommand):
self.stdout.write("[MIGRATION] setup snapshots")
ElasticSnapshot().setup()
def _mig_set_streams(self):
"""migration: update from 0.3.5 to 0.3.6, set streams and media_size"""
self.stdout.write("[MIGRATION] index streams and media size")
videos = AppConfig().config["application"]["videos"]
data = {
"query": {
"bool": {"must_not": [{"exists": {"field": "streams"}}]}
},
"_source": ["media_url", "youtube_id"],
}
all_missing = IndexPaginate("ta_video", data).get_results()
if not all_missing:
self.stdout.write(" no videos need updating")
def _mig_schedule_store(self):
"""
update from 0.4.7 to 0.4.8
migrate schedule task store to CustomPeriodicTask
"""
self.stdout.write("[MIGRATION] migrate schedule store")
config = AppConfig().config
current_schedules = config.get("scheduler")
if not current_schedules:
self.stdout.write(
self.style.SUCCESS(" no schedules to migrate")
)
return
total = len(all_missing)
for idx, missing in enumerate(all_missing):
media_url = missing["media_url"]
youtube_id = missing["youtube_id"]
media_path = os.path.join(videos, media_url)
if not os.path.exists(media_path):
self.stdout.write(f" file not found: {media_path}")
self.stdout.write(" run file system rescan to fix")
continue
self._mig_update_subscribed(current_schedules)
self._mig_download_pending(current_schedules)
self._mig_check_reindex(current_schedules)
self._mig_thumbnail_check(current_schedules)
self._mig_run_backup(current_schedules)
self._mig_version_check()
media = MediaStreamExtractor(media_path)
vid_data = {
"doc": {
"streams": media.extract_metadata(),
"media_size": media.get_file_size(),
}
}
path = f"ta_video/_update/{youtube_id}"
response, status_code = ElasticWrap(path).post(data=vid_data)
if not status_code == 200:
self.stdout.errors(
f" update failed: {path}, {response}, {status_code}"
)
del config["scheduler"]
RedisArchivist().set_message("config", config, save=True)
if idx % 100 == 0:
self.stdout.write(f" progress {idx}/{total}")
def _mig_update_subscribed(self, current_schedules):
"""create update_subscribed schedule"""
task_name = "update_subscribed"
update_subscribed_schedule = current_schedules.get(task_name)
if update_subscribed_schedule:
self._create_task(task_name, update_subscribed_schedule)
def _mig_set_autostart(self):
"""migration: update from 0.3.5 to 0.3.6 set auto_start to false"""
self.stdout.write("[MIGRATION] set default download auto_start")
self._create_notifications(task_name, current_schedules)
def _mig_download_pending(self, current_schedules):
"""create download_pending schedule"""
task_name = "download_pending"
download_pending_schedule = current_schedules.get(task_name)
if download_pending_schedule:
self._create_task(task_name, download_pending_schedule)
self._create_notifications(task_name, current_schedules)
def _mig_check_reindex(self, current_schedules):
"""create check_reindex schedule"""
task_name = "check_reindex"
check_reindex_schedule = current_schedules.get(task_name)
if check_reindex_schedule:
task_config = {}
days = current_schedules.get("check_reindex_days")
if days:
task_config.update({"days": days})
self._create_task(
task_name,
check_reindex_schedule,
task_config=task_config,
)
self._create_notifications(task_name, current_schedules)
def _mig_thumbnail_check(self, current_schedules):
"""create thumbnail_check schedule"""
thumbnail_check_schedule = current_schedules.get("thumbnail_check")
if thumbnail_check_schedule:
self._create_task("thumbnail_check", thumbnail_check_schedule)
def _mig_run_backup(self, current_schedules):
"""create run_backup schedule"""
run_backup_schedule = current_schedules.get("run_backup")
if run_backup_schedule:
task_config = False
rotate = current_schedules.get("run_backup_rotate")
if rotate:
task_config = {"rotate": rotate}
self._create_task(
"run_backup", run_backup_schedule, task_config=task_config
)
def _mig_version_check(self):
"""create version_check schedule"""
version_check_schedule = {
"minute": randint(0, 59),
"hour": randint(0, 23),
"day_of_week": "*",
}
self._create_task("version_check", version_check_schedule)
def _create_task(self, task_name, schedule, task_config=False):
"""create task"""
description = TASK_CONFIG[task_name].get("title")
schedule, _ = CrontabSchedule.objects.get_or_create(**schedule)
schedule.timezone = settings.TIME_ZONE
schedule.save()
task, _ = CustomPeriodicTask.objects.get_or_create(
crontab=schedule,
name=task_name,
description=description,
task=task_name,
)
if task_config:
task.task_config = task_config
task.save()
self.stdout.write(
self.style.SUCCESS(f" ✓ new task created: '{task}'")
)
def _create_notifications(self, task_name, current_schedules):
"""migrate notifications of task"""
notifications = current_schedules.get(f"{task_name}_notify")
if not notifications:
return
urls = [i.strip() for i in notifications.split()]
if not urls:
return
self.stdout.write(
self.style.SUCCESS(f" ✓ migrate notifications: '{urls}'")
)
handler = Notifications(task_name)
for url in urls:
handler.add_url(url)
def _mig_custom_playlist(self):
"""add playlist_type for migration from v0.4.6 to v0.4.7"""
self.stdout.write("[MIGRATION] custom playlist")
data = {
"query": {
"bool": {"must_not": [{"exists": {"field": "auto_start"}}]}
"bool": {"must_not": [{"exists": {"field": "playlist_type"}}]}
},
"script": {"source": "ctx._source['auto_start'] = false"},
"script": {"source": "ctx._source['playlist_type'] = 'regular'"},
}
path = "ta_download/_update_by_query"
path = "ta_playlist/_update_by_query"
response, status_code = ElasticWrap(path).post(data=data)
if status_code == 200:
updated = response.get("updated", 0)
if updated:
self.stdout.write(
self.style.SUCCESS(
f"{updated} videos updated in ta_download"
f"{updated} playlist_type updated in ta_playlist"
)
)
else:
self.stdout.write(
" no videos needed updating in ta_download"
self.style.SUCCESS(
" no playlist_type needed updating in ta_playlist"
)
)
return
message = " 🗙 ta_download auto_start update failed"
message = " 🗙 ta_playlist playlist_type update failed"
self.stdout.write(self.style.ERROR(message))
self.stdout.write(response)
sleep(60)
raise CommandError(message)
def _mig_move_users_to_es(self): # noqa: C901
"""migration: update from 0.4.1 to 0.4.2 move user config to ES"""
self.stdout.write("[MIGRATION] move user configuration to ES")
redis = RedisArchivist()
def _create_default_schedules(self) -> None:
"""
create default schedules for new installations
needs to be called after _mig_schedule_store
"""
self.stdout.write("[7] create initial schedules")
init_has_run = CustomPeriodicTask.objects.filter(
name="version_check"
).exists()
# 1: Find all users in Redis
users = {i.split(":")[0] for i in redis.list_keys("[0-9]*:")}
if not users:
self.stdout.write(" no users needed migrating to ES")
return
# 2: Write all Redis user settings to ES
# 3: Remove user settings from Redis
try:
for user in users:
new_conf = UserConfig(user)
colors_key = f"{user}:colors"
colors = redis.get_message(colors_key).get("status")
if colors:
new_conf.set_value("colors", colors)
redis.del_message(colors_key)
sort_by_key = f"{user}:sort_by"
sort_by = redis.get_message(sort_by_key).get("status")
if sort_by:
new_conf.set_value("sort_by", sort_by)
redis.del_message(sort_by_key)
page_size_key = f"{user}:page_size"
page_size = redis.get_message(page_size_key).get("status")
if page_size:
new_conf.set_value("page_size", page_size)
redis.del_message(page_size_key)
sort_order_key = f"{user}:sort_order"
sort_order = redis.get_message(sort_order_key).get("status")
if sort_order:
new_conf.set_value("sort_order", sort_order)
redis.del_message(sort_order_key)
grid_items_key = f"{user}:grid_items"
grid_items = redis.get_message(grid_items_key).get("status")
if grid_items:
new_conf.set_value("grid_items", grid_items)
redis.del_message(grid_items_key)
hide_watch_key = f"{user}:hide_watched"
hide_watch = redis.get_message(hide_watch_key).get("status")
if hide_watch:
new_conf.set_value("hide_watched", hide_watch)
redis.del_message(hide_watch_key)
ignore_only_key = f"{user}:show_ignored_only"
ignore_only = redis.get_message(ignore_only_key).get("status")
if ignore_only:
new_conf.set_value("show_ignored_only", ignore_only)
redis.del_message(ignore_only_key)
subed_only_key = f"{user}:show_subed_only"
subed_only = redis.get_message(subed_only_key).get("status")
if subed_only:
new_conf.set_value("show_subed_only", subed_only)
redis.del_message(subed_only_key)
for view in ["channel", "playlist", "home", "downloads"]:
view_key = f"{user}:view:{view}"
view_style = redis.get_message(view_key).get("status")
if view_style:
new_conf.set_value(f"view_style_{view}", view_style)
redis.del_message(view_key)
self.stdout.write(
self.style.SUCCESS(
f" ✓ Settings for user '{user}' migrated to ES"
)
)
except Exception as e:
message = " 🗙 user migration to ES failed"
self.stdout.write(self.style.ERROR(message))
self.stdout.write(self.style.ERROR(e))
sleep(60)
raise CommandError(message)
else:
if init_has_run:
self.stdout.write(
self.style.SUCCESS(
" ✓ Settings for all users migrated to ES"
" schedule init already done, skipping..."
)
)
return
builder = ScheduleBuilder()
check_reindex = builder.get_set_task(
"check_reindex", schedule=builder.SCHEDULES["check_reindex"]
)
check_reindex.task_config.update({"days": 90})
check_reindex.save()
self.stdout.write(
self.style.SUCCESS(
f" ✓ created new default schedule: {check_reindex}"
)
)
thumbnail_check = builder.get_set_task(
"thumbnail_check", schedule=builder.SCHEDULES["thumbnail_check"]
)
self.stdout.write(
self.style.SUCCESS(
f" ✓ created new default schedule: {thumbnail_check}"
)
)
daily_random = f"{randint(0, 59)} {randint(0, 23)} *"
version_check = builder.get_set_task(
"version_check", schedule=daily_random
)
self.stdout.write(
self.style.SUCCESS(
f" ✓ created new default schedule: {version_check}"
)
)
self.stdout.write(
self.style.SUCCESS(" ✓ all default schedules created")
)

View File

@ -17,8 +17,8 @@ from pathlib import Path
import ldap
from corsheaders.defaults import default_headers
from django_auth_ldap.config import LDAPSearch
from home.src.ta.config import AppConfig
from home.src.ta.helper import ta_host_parser
from home.src.ta.settings import EnvironmentSettings
# Build paths inside the project like this: BASE_DIR / 'subdir'.
BASE_DIR = Path(__file__).resolve().parent.parent
@ -27,7 +27,7 @@ BASE_DIR = Path(__file__).resolve().parent.parent
# Quick-start development settings - unsuitable for production
# See https://docs.djangoproject.com/en/3.2/howto/deployment/checklist/
PW_HASH = hashlib.sha256(environ["TA_PASSWORD"].encode())
PW_HASH = hashlib.sha256(EnvironmentSettings.TA_PASSWORD.encode())
SECRET_KEY = PW_HASH.hexdigest()
# SECURITY WARNING: don't run with debug turned on in production!
@ -38,6 +38,7 @@ ALLOWED_HOSTS, CSRF_TRUSTED_ORIGINS = ta_host_parser(environ["TA_HOST"])
# Application definition
INSTALLED_APPS = [
"django_celery_beat",
"home.apps.HomeConfig",
"django.contrib.admin",
"django.contrib.auth",
@ -180,7 +181,7 @@ if bool(environ.get("TA_LDAP")):
# Database
# https://docs.djangoproject.com/en/3.2/ref/settings/#databases
CACHE_DIR = AppConfig().config["application"]["cache_dir"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
DB_PATH = path.join(CACHE_DIR, "db.sqlite3")
DATABASES = {
"default": {
@ -228,7 +229,7 @@ if bool(environ.get("TA_ENABLE_AUTH_PROXY")):
# https://docs.djangoproject.com/en/3.2/topics/i18n/
LANGUAGE_CODE = "en-us"
TIME_ZONE = environ.get("TZ") or "UTC"
TIME_ZONE = EnvironmentSettings.TZ
USE_I18N = True
USE_L10N = True
USE_TZ = True
@ -269,4 +270,4 @@ CORS_ALLOW_HEADERS = list(default_headers) + [
# TA application settings
TA_UPSTREAM = "https://github.com/tubearchivist/tubearchivist"
TA_VERSION = "v0.4.2"
TA_VERSION = "v0.4.8-unstable"

View File

@ -13,6 +13,7 @@ Including another URLconf
1. Import the include() function: from django.urls import include, path
2. Add a URL to urlpatterns: path('blog/', include('blog.urls'))
"""
from django.contrib import admin
from django.urls import include, path

View File

@ -1,5 +1,7 @@
""" handle celery startup """
"""start celery app"""
from .tasks import app as celery_app
from __future__ import absolute_import, unicode_literals
from home.celery import app as celery_app
__all__ = ("celery_app",)

View File

@ -2,6 +2,7 @@
from django.contrib import admin
from django.contrib.auth.admin import UserAdmin as BaseUserAdmin
from django_celery_beat import models as BeatModels
from .models import Account
@ -34,3 +35,12 @@ class HomeAdmin(BaseUserAdmin):
admin.site.register(Account, HomeAdmin)
admin.site.unregister(
[
BeatModels.ClockedSchedule,
BeatModels.CrontabSchedule,
BeatModels.IntervalSchedule,
BeatModels.PeriodicTask,
BeatModels.SolarSchedule,
]
)

View File

@ -0,0 +1,24 @@
"""initiate celery"""
import os
from celery import Celery
from home.src.ta.config import AppConfig
from home.src.ta.settings import EnvironmentSettings
CONFIG = AppConfig().config
REDIS_HOST = EnvironmentSettings.REDIS_HOST
REDIS_PORT = EnvironmentSettings.REDIS_PORT
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")
app = Celery(
"tasks",
broker=f"redis://{REDIS_HOST}:{REDIS_PORT}",
backend=f"redis://{REDIS_HOST}:{REDIS_PORT}",
result_extended=True,
)
app.config_from_object(
"django.conf:settings", namespace=EnvironmentSettings.REDIS_NAME_SPACE
)
app.autodiscover_tasks()
app.conf.timezone = EnvironmentSettings.TZ

View File

@ -25,23 +25,6 @@
"integrate_sponsorblock": false
},
"application": {
"app_root": "/app",
"cache_dir": "/cache",
"videos": "/youtube",
"enable_cast": false,
"enable_snapshot": true
},
"scheduler": {
"update_subscribed": false,
"update_subscribed_notify": false,
"download_pending": false,
"download_pending_notify": false,
"check_reindex": {"minute": "0", "hour": "12", "day_of_week": "*"},
"check_reindex_notify": false,
"check_reindex_days": 90,
"thumbnail_check": {"minute": "0", "hour": "17", "day_of_week": "*"},
"run_backup": false,
"run_backup_rotate": 5,
"version_check": "rand-d"
}
}

View File

@ -0,0 +1,23 @@
# Generated by Django 4.2.7 on 2023-12-05 13:47
from django.db import migrations, models
import django.db.models.deletion
class Migration(migrations.Migration):
dependencies = [
('django_celery_beat', '0018_improve_crontab_helptext'),
('home', '0001_initial'),
]
operations = [
migrations.CreateModel(
name='CustomPeriodicTask',
fields=[
('periodictask_ptr', models.OneToOneField(auto_created=True, on_delete=django.db.models.deletion.CASCADE, parent_link=True, primary_key=True, serialize=False, to='django_celery_beat.periodictask')),
('task_config', models.JSONField(default=dict)),
],
bases=('django_celery_beat.periodictask',),
),
]

View File

@ -1,10 +1,12 @@
"""custom models"""
from django.contrib.auth.models import (
AbstractBaseUser,
BaseUserManager,
PermissionsMixin,
)
from django.db import models
from django_celery_beat.models import PeriodicTask
class AccountManager(BaseUserManager):
@ -51,3 +53,9 @@ class Account(AbstractBaseUser, PermissionsMixin):
USERNAME_FIELD = "name"
REQUIRED_FIELDS = ["password"]
class CustomPeriodicTask(PeriodicTask):
"""add custom metadata to to task"""
task_config = models.JSONField(default=dict)
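A short usage sketch for the new subclass, mirroring the _create_task migration elsewhere in this changeset; schedule and config values are illustrative:

from django_celery_beat.models import CrontabSchedule

crontab, _ = CrontabSchedule.objects.get_or_create(
    minute="0", hour="12", day_of_week="*"
)
task, _ = CustomPeriodicTask.objects.get_or_create(
    crontab=crontab,
    name="check_reindex",
    task="check_reindex",
)
task.task_config = {"days": 90}  # extra metadata the stock model lacks
task.save()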

View File

@ -7,10 +7,7 @@ Functionality:
import json
from datetime import datetime
from home.src.download.subscriptions import (
ChannelSubscription,
PlaylistSubscription,
)
from home.src.download.subscriptions import ChannelSubscription
from home.src.download.thumbnails import ThumbManager
from home.src.download.yt_dlp_base import YtWrap
from home.src.es.connect import ElasticWrap, IndexPaginate
@ -196,7 +193,6 @@ class PendingList(PendingIndex):
self._parse_channel(entry["url"], vid_type)
elif entry["type"] == "playlist":
self._parse_playlist(entry["url"])
PlaylistSubscription().process_url_str([entry], subscribed=False)
else:
raise ValueError(f"invalid url_type: {entry}")
@ -227,15 +223,18 @@ class PendingList(PendingIndex):
def _parse_playlist(self, url):
"""add all videos of playlist to list"""
playlist = YoutubePlaylist(url)
playlist.build_json()
if not playlist.json_data:
is_active = playlist.update_playlist()
if not is_active:
message = f"{playlist.youtube_id}: failed to extract metadata"
print(message)
raise ValueError(message)
video_results = playlist.json_data.get("playlist_entries")
youtube_ids = [i["youtube_id"] for i in video_results]
for video_id in youtube_ids:
entries = playlist.json_data["playlist_entries"]
to_add = [i["youtube_id"] for i in entries if not i["downloaded"]]
if not to_add:
return
for video_id in to_add:
# match vid_type later
self._add_video(video_id, VideoTypeEnum.UNKNOWN)
@ -245,6 +244,7 @@ class PendingList(PendingIndex):
bulk_list = []
total = len(self.missing_videos)
videos_added = []
for idx, (youtube_id, vid_type) in enumerate(self.missing_videos):
if self.task and self.task.is_stopped():
break
@ -268,6 +268,7 @@ class PendingList(PendingIndex):
url = video_details["vid_thumb_url"]
ThumbManager(youtube_id).download_video_thumb(url)
videos_added.append(youtube_id)
if len(bulk_list) >= 20:
self._ingest_bulk(bulk_list)
@ -275,6 +276,8 @@ class PendingList(PendingIndex):
self._ingest_bulk(bulk_list)
return videos_added
def _ingest_bulk(self, bulk_list):
"""add items to queue in bulk"""
if not bulk_list:

View File

@ -4,14 +4,15 @@ Functionality:
- handle playlist subscriptions
"""
from home.src.download import queue # partial import
from home.src.download.thumbnails import ThumbManager
from home.src.download.yt_dlp_base import YtWrap
from home.src.es.connect import IndexPaginate
from home.src.index.channel import YoutubeChannel
from home.src.index.playlist import YoutubePlaylist
from home.src.index.video import YoutubeVideo
from home.src.index.video_constants import VideoTypeEnum
from home.src.ta.config import AppConfig
from home.src.ta.helper import is_missing
from home.src.ta.urlparser import Parser
@ -105,10 +106,6 @@ class ChannelSubscription:
if not all_channels:
return False
pending = queue.PendingList()
pending.get_download()
pending.get_indexed()
missing_videos = []
total = len(all_channels)
@ -118,22 +115,22 @@ class ChannelSubscription:
last_videos = self.get_last_youtube_videos(channel_id)
if last_videos:
ids_to_add = is_missing([i[0] for i in last_videos])
for video_id, _, vid_type in last_videos:
if video_id not in pending.to_skip:
if video_id in ids_to_add:
missing_videos.append((video_id, vid_type))
if not self.task:
continue
if self.task:
if self.task.is_stopped():
self.task.send_progress(["Received Stop signal."])
break
if self.task.is_stopped():
self.task.send_progress(["Received Stop signal."])
break
self.task.send_progress(
message_lines=[f"Scanning Channel {idx + 1}/{total}"],
progress=(idx + 1) / total,
)
self.task.send_progress(
message_lines=[f"Scanning Channel {idx + 1}/{total}"],
progress=(idx + 1) / total,
)
return missing_videos
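The is_missing helper is new in this changeset; judging only from the call sites here and in the thumbnail cleanup, its contract looks roughly like the following, with index and key arguments assumed to default to video lookups:

# return the subset of ids with no matching document in the index
to_download = is_missing(["dQw4w9WgXcQ", "jNQXAC9IVRw"])
orphaned = is_missing(["UC1234567890abcdefghijkl"], "ta_channel", "channel_id")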
@ -174,10 +171,6 @@ class PlaylistSubscription:
def process_url_str(self, new_playlists, subscribed=True):
"""process playlist subscribe form url_str"""
data = {"query": {"match_all": {}}, "_source": ["youtube_id"]}
all_indexed = IndexPaginate("ta_video", data).get_results()
all_youtube_ids = [i["youtube_id"] for i in all_indexed]
for idx, playlist in enumerate(new_playlists):
playlist_id = playlist["url"]
if not playlist["type"] == "playlist":
@ -185,7 +178,6 @@ class PlaylistSubscription:
continue
playlist_h = YoutubePlaylist(playlist_id)
playlist_h.all_youtube_ids = all_youtube_ids
playlist_h.build_json()
if not playlist_h.json_data:
message = f"{playlist_h.youtube_id}: failed to extract data"
@ -223,27 +215,15 @@ class PlaylistSubscription:
playlist.json_data["playlist_subscribed"] = subscribe_status
playlist.upload_to_es()
@staticmethod
def get_to_ignore():
"""get all youtube_ids already downloaded or ignored"""
pending = queue.PendingList()
pending.get_download()
pending.get_indexed()
return pending.to_skip
def find_missing(self):
"""find videos in subscribed playlists not downloaded yet"""
all_playlists = [i["playlist_id"] for i in self.get_playlists()]
if not all_playlists:
return False
to_ignore = self.get_to_ignore()
missing_videos = []
total = len(all_playlists)
for idx, playlist_id in enumerate(all_playlists):
size_limit = self.config["subscriptions"]["channel_size"]
playlist = YoutubePlaylist(playlist_id)
is_active = playlist.update_playlist()
if not is_active:
@ -251,27 +231,29 @@ class PlaylistSubscription:
continue
playlist_entries = playlist.json_data["playlist_entries"]
size_limit = self.config["subscriptions"]["channel_size"]
if size_limit:
del playlist_entries[size_limit:]
all_missing = [i for i in playlist_entries if not i["downloaded"]]
for video in all_missing:
youtube_id = video["youtube_id"]
if youtube_id not in to_ignore:
missing_videos.append(youtube_id)
to_check = [
i["youtube_id"]
for i in playlist_entries
if i["downloaded"] is False
]
needs_downloading = is_missing(to_check)
missing_videos.extend(needs_downloading)
if not self.task:
continue
if self.task:
self.task.send_progress(
message_lines=[f"Scanning Playlists {idx + 1}/{total}"],
progress=(idx + 1) / total,
)
if self.task.is_stopped():
self.task.send_progress(["Received Stop signal."])
break
if self.task.is_stopped():
self.task.send_progress(["Received Stop signal."])
break
self.task.send_progress(
message_lines=[f"Scanning Playlists {idx + 1}/{total}"],
progress=(idx + 1) / total,
)
return missing_videos
@ -359,8 +341,10 @@ class SubscriptionHandler:
if item["type"] == "video":
# extract channel id from video
vid = queue.PendingList().get_youtube_details(item["url"])
channel_id = vid["channel_id"]
video = YoutubeVideo(item["url"])
video.get_from_youtube()
video.process_youtube_meta()
channel_id = video.channel_id
elif item["type"] == "channel":
channel_id = item["url"]
else:

View File

@ -11,7 +11,8 @@ from time import sleep
import requests
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.ta.config import AppConfig
from home.src.ta.helper import is_missing
from home.src.ta.settings import EnvironmentSettings
from mutagen.mp4 import MP4, MP4Cover
from PIL import Image, ImageFile, ImageFilter, UnidentifiedImageError
@ -21,8 +22,7 @@ ImageFile.LOAD_TRUNCATED_IMAGES = True
class ThumbManagerBase:
"""base class for thumbnail management"""
CONFIG = AppConfig().config
CACHE_DIR = CONFIG["application"]["cache_dir"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
VIDEO_DIR = os.path.join(CACHE_DIR, "videos")
CHANNEL_DIR = os.path.join(CACHE_DIR, "channels")
PLAYLIST_DIR = os.path.join(CACHE_DIR, "playlists")
@ -70,13 +70,13 @@ class ThumbManagerBase:
img_raw = Image.open(self.fallback)
return img_raw
app_root = self.CONFIG["application"]["app_root"]
app_root = EnvironmentSettings.APP_DIR
default_map = {
"video": os.path.join(
app_root, "static/img/default-video-thumb.jpg"
),
"playlist": os.path.join(
app_root, "static/img/default-video-thumb.jpg"
app_root, "static/img/default-playlist-thumb.jpg"
),
"icon": os.path.join(
app_root, "static/img/default-channel-icon.jpg"
@ -203,7 +203,18 @@ class ThumbManager(ThumbManagerBase):
if skip_existing and os.path.exists(thumb_path):
return
img_raw = self.download_raw(url)
img_raw = (
self.download_raw(url)
if not isinstance(url, str) or url.startswith("http")
else Image.open(os.path.join(self.CACHE_DIR, url))
)
width, height = img_raw.size
if not width / height == 16 / 9:
new_height = width / 16 * 9
offset = (height - new_height) / 2
img_raw = img_raw.crop((0, offset, width, height - offset))
img_raw = img_raw.resize((336, 189))
img_raw.convert("RGB").save(thumb_path)
def delete_video_thumb(self):
@ -316,7 +327,7 @@ class ThumbValidator:
},
]
def __init__(self, task):
def __init__(self, task=False):
self.task = task
def validate(self):
@ -336,6 +347,89 @@ class ThumbValidator:
)
_ = paginate.get_results()
def clean_up(self):
"""clean up all thumbs"""
self._clean_up_vids()
self._clean_up_channels()
self._clean_up_playlists()
def _clean_up_vids(self):
"""clean unneeded vid thumbs"""
video_dir = os.path.join(EnvironmentSettings.CACHE_DIR, "videos")
video_folders = os.listdir(video_dir)
for video_folder in video_folders:
folder_path = os.path.join(video_dir, video_folder)
thumbs_is = {i.split(".")[0] for i in os.listdir(folder_path)}
thumbs_should = self._get_vid_thumbs_should(video_folder)
to_delete = thumbs_is - thumbs_should
for thumb in to_delete:
delete_path = os.path.join(folder_path, f"{thumb}.jpg")
os.remove(delete_path)
if to_delete:
message = (
f"[thumbs][video][{video_folder}] "
+ f"delete {len(to_delete)} unused thumbnails"
)
print(message)
if self.task:
self.task.send_progress([message])
@staticmethod
def _get_vid_thumbs_should(video_folder: str) -> set[str]:
"""get indexed"""
should_list = [
{"prefix": {"youtube_id": {"value": video_folder.lower()}}},
{"prefix": {"youtube_id": {"value": video_folder.upper()}}},
]
data = {
"query": {"bool": {"should": should_list}},
"_source": ["youtube_id"],
}
result = IndexPaginate("ta_video,ta_download", data).get_results()
thumbs_should = {i["youtube_id"] for i in result}
return thumbs_should
def _clean_up_channels(self):
"""clean unneeded channel thumbs"""
channel_dir = os.path.join(EnvironmentSettings.CACHE_DIR, "channels")
channel_art = os.listdir(channel_dir)
thumbs_is = {"_".join(i.split("_")[:-1]) for i in channel_art}
to_delete = is_missing(list(thumbs_is), "ta_channel", "channel_id")
for channel_thumb in channel_art:
if channel_thumb[:24] in to_delete:
delete_path = os.path.join(channel_dir, channel_thumb)
os.remove(delete_path)
if to_delete:
message = (
"[thumbs][channel] "
+ f"delete {len(to_delete)} unused channel art"
)
print(message)
if self.task:
self.task.send_progress([message])
def _clean_up_playlists(self):
"""clean up unneeded playlist thumbs"""
playlist_dir = os.path.join(EnvironmentSettings.CACHE_DIR, "playlists")
playlist_art = os.listdir(playlist_dir)
thumbs_is = {i.split(".")[0] for i in playlist_art}
to_delete = is_missing(list(thumbs_is), "ta_playlist", "playlist_id")
for playlist_id in to_delete:
delete_path = os.path.join(playlist_dir, f"{playlist_id}.jpg")
os.remove(delete_path)
if to_delete:
message = (
"[thumbs][playlist] "
+ f"delete {len(to_delete)} unused playlist art"
)
print(message)
if self.task:
self.task.send_progress([message])
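# usage sketch (assumption): the periodic thumbnail_check task would call
# ThumbValidator(task).validate() first, then clean_up() to drop cached
# art that no longer matches any indexed video, channel or playlist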
@staticmethod
def _get_total(index_name):
"""get total documents in index"""
@ -380,9 +474,8 @@ class ThumbFilesystem:
class EmbedCallback:
"""callback class to embed thumbnails"""
CONFIG = AppConfig().config
CACHE_DIR = CONFIG["application"]["cache_dir"]
MEDIA_DIR = CONFIG["application"]["videos"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
MEDIA_DIR = EnvironmentSettings.MEDIA_DIR
FORMAT = MP4Cover.FORMAT_JPEG
def __init__(self, source, index_name, counter=0):

View File

@ -10,6 +10,7 @@ from http import cookiejar
from io import StringIO
import yt_dlp
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisArchivist
@ -61,8 +62,8 @@ class YtWrap:
"""make extract request"""
try:
response = yt_dlp.YoutubeDL(self.obs).extract_info(url)
except cookiejar.LoadError:
print("cookie file is invalid")
except cookiejar.LoadError as err:
print(f"cookie file is invalid: {err}")
return False
except yt_dlp.utils.ExtractorError as err:
print(f"{url}: failed to extract with message: {err}, continue...")
@ -86,6 +87,7 @@ class CookieHandler:
def __init__(self, config):
self.cookie_io = False
self.config = config
self.cache_dir = EnvironmentSettings.CACHE_DIR
def get(self):
"""get cookie io stream"""
@ -95,8 +97,9 @@ class CookieHandler:
def import_cookie(self):
"""import cookie from file"""
cache_path = self.config["application"]["cache_dir"]
import_path = os.path.join(cache_path, "import", "cookies.google.txt")
import_path = os.path.join(
self.cache_dir, "import", "cookies.google.txt"
)
try:
with open(import_path, encoding="utf-8") as cookie_file:

View File

@ -20,188 +20,80 @@ from home.src.index.playlist import YoutubePlaylist
from home.src.index.video import YoutubeVideo, index_new_video
from home.src.index.video_constants import VideoTypeEnum
from home.src.ta.config import AppConfig
from home.src.ta.helper import ignore_filelist
from home.src.ta.helper import get_channel_overwrites, ignore_filelist
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisQueue
class DownloadPostProcess:
"""handle task to run after download queue finishes"""
class DownloaderBase:
"""base class for shared config"""
def __init__(self, download):
self.download = download
self.now = int(datetime.now().timestamp())
self.pending = False
CACHE_DIR = EnvironmentSettings.CACHE_DIR
MEDIA_DIR = EnvironmentSettings.MEDIA_DIR
CHANNEL_QUEUE = "download:channel"
PLAYLIST_QUEUE = "download:playlist:full"
PLAYLIST_QUICK = "download:playlist:quick"
VIDEO_QUEUE = "download:video"
def run(self):
"""run all functions"""
self.pending = PendingList()
self.pending.get_download()
self.pending.get_channels()
self.pending.get_indexed()
self.auto_delete_all()
self.auto_delete_overwrites()
self.validate_playlists()
self.get_comments()
def auto_delete_all(self):
"""handle auto delete"""
autodelete_days = self.download.config["downloads"]["autodelete_days"]
if not autodelete_days:
return
print(f"auto delete older than {autodelete_days} days")
now_lte = self.now - autodelete_days * 24 * 60 * 60
data = {
"query": {"range": {"player.watched_date": {"lte": now_lte}}},
"sort": [{"player.watched_date": {"order": "asc"}}],
}
self._auto_delete_watched(data)
def auto_delete_overwrites(self):
"""handle per channel auto delete from overwrites"""
for channel_id, value in self.pending.channel_overwrites.items():
if "autodelete_days" in value:
autodelete_days = value.get("autodelete_days")
print(f"{channel_id}: delete older than {autodelete_days}d")
now_lte = self.now - autodelete_days * 24 * 60 * 60
must_list = [
{"range": {"player.watched_date": {"lte": now_lte}}},
{"term": {"channel.channel_id": {"value": channel_id}}},
]
data = {
"query": {"bool": {"must": must_list}},
"sort": [{"player.watched_date": {"order": "desc"}}],
}
self._auto_delete_watched(data)
@staticmethod
def _auto_delete_watched(data):
"""delete watched videos after x days"""
to_delete = IndexPaginate("ta_video", data).get_results()
if not to_delete:
return
for video in to_delete:
youtube_id = video["youtube_id"]
print(f"{youtube_id}: auto delete video")
YoutubeVideo(youtube_id).delete_media_file()
print("add deleted to ignore list")
vids = [{"type": "video", "url": i["youtube_id"]} for i in to_delete]
pending = PendingList(youtube_ids=vids)
pending.parse_url_list()
pending.add_to_pending(status="ignore")
def validate_playlists(self):
"""look for playlist needing to update"""
for id_c, channel_id in enumerate(self.download.channels):
channel = YoutubeChannel(channel_id, task=self.download.task)
overwrites = self.pending.channel_overwrites.get(channel_id, False)
if overwrites and overwrites.get("index_playlists"):
# validate from remote
channel.index_channel_playlists()
continue
# validate from local
playlists = channel.get_indexed_playlists(active_only=True)
all_channel_playlist = [i["playlist_id"] for i in playlists]
self._validate_channel_playlist(all_channel_playlist, id_c)
def _validate_channel_playlist(self, all_channel_playlist, id_c):
"""scan channel for playlist needing update"""
all_youtube_ids = [i["youtube_id"] for i in self.pending.all_videos]
for id_p, playlist_id in enumerate(all_channel_playlist):
playlist = YoutubePlaylist(playlist_id)
playlist.all_youtube_ids = all_youtube_ids
playlist.build_json(scrape=True)
if not playlist.json_data:
playlist.deactivate()
continue
playlist.add_vids_to_playlist()
playlist.upload_to_es()
self._notify_playlist_progress(all_channel_playlist, id_c, id_p)
def _notify_playlist_progress(self, all_channel_playlist, id_c, id_p):
"""notify to UI"""
if not self.download.task:
return
total_channel = len(self.download.channels)
total_playlist = len(all_channel_playlist)
message = [
f"Post Processing Channels: {id_c}/{total_channel}",
f"Validate Playlists {id_p + 1}/{total_playlist}",
]
progress = (id_c + 1) / total_channel
self.download.task.send_progress(message, progress=progress)
def get_comments(self):
"""get comments from youtube"""
CommentList(self.download.videos, task=self.download.task).index()
class VideoDownloader:
"""
handle the video download functionality
if not initiated with list, take from queue
"""
def __init__(self, youtube_id_list=False, task=False):
self.obs = False
self.video_overwrites = False
self.youtube_id_list = youtube_id_list
def __init__(self, task):
self.task = task
self.config = AppConfig().config
self._build_obs()
self.channels = set()
self.videos = set()
self.channel_overwrites = get_channel_overwrites()
self.now = int(datetime.now().timestamp())
def run_queue(self, auto_only=False):
class VideoDownloader(DownloaderBase):
"""handle the video download functionality"""
def __init__(self, task=False):
super().__init__(task)
self.obs = False
self._build_obs()
def run_queue(self, auto_only=False) -> int:
"""setup download queue in redis loop until no more items"""
self._get_overwrites()
downloaded = 0
while True:
video_data = self._get_next(auto_only)
if self.task.is_stopped() or not video_data:
self._reset_auto()
break
youtube_id = video_data.get("youtube_id")
youtube_id = video_data["youtube_id"]
channel_id = video_data["channel_id"]
print(f"{youtube_id}: Downloading video")
self._notify(video_data, "Validate download format")
success = self._dl_single_vid(youtube_id)
success = self._dl_single_vid(youtube_id, channel_id)
if not success:
continue
self._notify(video_data, "Add video metadata to index")
vid_dict = index_new_video(
youtube_id,
video_overwrites=self.video_overwrites,
video_type=VideoTypeEnum(video_data["vid_type"]),
)
self.channels.add(vid_dict["channel"]["channel_id"])
self.videos.add(vid_dict["youtube_id"])
self._notify(video_data, "Add video metadata to index", progress=1)
video_type = VideoTypeEnum(video_data["vid_type"])
vid_dict = index_new_video(youtube_id, video_type=video_type)
RedisQueue(self.CHANNEL_QUEUE).add(channel_id)
RedisQueue(self.VIDEO_QUEUE).add(youtube_id)
self._notify(video_data, "Move downloaded file to archive")
self.move_to_archive(vid_dict)
self._delete_from_pending(youtube_id)
downloaded += 1
# post processing
self._add_subscribed_channels()
DownloadPostProcess(self).run()
DownloadPostProcess(self.task).run()
return self.videos
return downloaded
def _notify(self, video_data, message):
def _notify(self, video_data, message, progress=False):
"""send progress notification to task"""
if not self.task:
return
typ = VideoTypeEnum(video_data["vid_type"]).value.rstrip("s").title()
title = video_data.get("title")
self.task.send_progress([f"Processing {typ}: {title}", message])
self.task.send_progress(
[f"Processing {typ}: {title}", message], progress=progress
)
def _get_next(self, auto_only):
"""get next item in queue"""
@ -225,13 +117,6 @@ class VideoDownloader:
return response["hits"]["hits"][0]["_source"]
def _get_overwrites(self):
"""get channel overwrites"""
pending = PendingList()
pending.get_download()
pending.get_channels()
self.video_overwrites = pending.video_overwrites
def _progress_hook(self, response):
"""process the progress_hooks from yt_dlp"""
progress = False
@ -262,10 +147,7 @@ class VideoDownloader:
"""initial obs"""
self.obs = {
"merge_output_format": "mp4",
"outtmpl": (
self.config["application"]["cache_dir"]
+ "/download/%(id)s.mp4"
),
"outtmpl": (self.CACHE_DIR + "/download/%(id)s.mp4"),
"progress_hooks": [self._progress_hook],
"noprogress": True,
"continuedl": True,
@ -325,22 +207,17 @@ class VideoDownloader:
self.obs["postprocessors"] = postprocessors
def get_format_overwrites(self, youtube_id):
"""get overwrites from single video"""
overwrites = self.video_overwrites.get(youtube_id, False)
if overwrites:
return overwrites.get("download_format", False)
def _set_overwrites(self, obs: dict, channel_id: str) -> None:
"""add overwrites to obs"""
overwrites = self.channel_overwrites.get(channel_id)
if overwrites and overwrites.get("download_format"):
obs["format"] = overwrites.get("download_format")
return False
def _dl_single_vid(self, youtube_id):
def _dl_single_vid(self, youtube_id: str, channel_id: str) -> bool:
"""download single video"""
obs = self.obs.copy()
format_overwrite = self.get_format_overwrites(youtube_id)
if format_overwrite:
obs["format"] = format_overwrite
dl_cache = self.config["application"]["cache_dir"] + "/download/"
self._set_overwrites(obs, channel_id)
dl_cache = os.path.join(self.CACHE_DIR, "download")
# check if already in cache to continue from there
all_cached = ignore_filelist(os.listdir(dl_cache))
@ -370,20 +247,20 @@ class VideoDownloader:
def move_to_archive(self, vid_dict):
"""move downloaded video from cache to archive"""
videos = self.config["application"]["videos"]
host_uid = self.config["application"]["HOST_UID"]
host_gid = self.config["application"]["HOST_GID"]
host_uid = EnvironmentSettings.HOST_UID
host_gid = EnvironmentSettings.HOST_GID
# make folder
folder = os.path.join(videos, vid_dict["channel"]["channel_id"])
folder = os.path.join(
self.MEDIA_DIR, vid_dict["channel"]["channel_id"]
)
if not os.path.exists(folder):
os.makedirs(folder)
if host_uid and host_gid:
os.chown(folder, host_uid, host_gid)
# move media file
media_file = vid_dict["youtube_id"] + ".mp4"
cache_dir = self.config["application"]["cache_dir"]
old_path = os.path.join(cache_dir, "download", media_file)
new_path = os.path.join(videos, vid_dict["media_url"])
old_path = os.path.join(self.CACHE_DIR, "download", media_file)
new_path = os.path.join(self.MEDIA_DIR, vid_dict["media_url"])
# move media file and fix permission
shutil.move(old_path, new_path, copy_function=shutil.copyfile)
if host_uid and host_gid:
@ -395,18 +272,6 @@ class VideoDownloader:
path = f"ta_download/_doc/{youtube_id}?refresh=true"
_, _ = ElasticWrap(path).delete()
def _add_subscribed_channels(self):
"""add all channels subscribed to refresh"""
all_subscribed = PlaylistSubscription().get_playlists()
if not all_subscribed:
return
channel_ids = [i["playlist_channel_id"] for i in all_subscribed]
for channel_id in channel_ids:
self.channels.add(channel_id)
return
def _reset_auto(self):
"""reset autostart to defaults after queue stop"""
path = "ta_download/_update_by_query"
@ -421,3 +286,169 @@ class VideoDownloader:
updated = response.get("updated")
if updated:
print(f"[download] reset auto start on {updated} videos.")
class DownloadPostProcess(DownloaderBase):
"""handle task to run after download queue finishes"""
def run(self):
"""run all functions"""
self.auto_delete_all()
self.auto_delete_overwrites()
self.refresh_playlist()
self.match_videos()
self.get_comments()
def auto_delete_all(self):
"""handle auto delete"""
autodelete_days = self.config["downloads"]["autodelete_days"]
if not autodelete_days:
return
print(f"auto delete older than {autodelete_days} days")
now_lte = str(self.now - autodelete_days * 24 * 60 * 60)
data = {
"query": {"range": {"player.watched_date": {"lte": now_lte}}},
"sort": [{"player.watched_date": {"order": "asc"}}],
}
self._auto_delete_watched(data)
def auto_delete_overwrites(self):
"""handle per channel auto delete from overwrites"""
for channel_id, value in self.channel_overwrites.items():
if "autodelete_days" in value:
autodelete_days = value.get("autodelete_days")
print(f"{channel_id}: delete older than {autodelete_days}d")
now_lte = str(self.now - autodelete_days * 24 * 60 * 60)
must_list = [
{"range": {"player.watched_date": {"lte": now_lte}}},
{"term": {"channel.channel_id": {"value": channel_id}}},
]
data = {
"query": {"bool": {"must": must_list}},
"sort": [{"player.watched_date": {"order": "desc"}}],
}
self._auto_delete_watched(data)
@staticmethod
def _auto_delete_watched(data):
"""delete watched videos after x days"""
to_delete = IndexPaginate("ta_video", data).get_results()
if not to_delete:
return
for video in to_delete:
youtube_id = video["youtube_id"]
print(f"{youtube_id}: auto delete video")
YoutubeVideo(youtube_id).delete_media_file()
print("add deleted to ignore list")
vids = [{"type": "video", "url": i["youtube_id"]} for i in to_delete]
pending = PendingList(youtube_ids=vids)
pending.parse_url_list()
_ = pending.add_to_pending(status="ignore")
def refresh_playlist(self) -> None:
"""match videos with playlists"""
self.add_playlists_to_refresh()
queue = RedisQueue(self.PLAYLIST_QUEUE)
while True:
total = queue.max_score()
playlist_id, idx = queue.get_next()
if not playlist_id or not idx or not total:
break
playlist = YoutubePlaylist(playlist_id)
playlist.update_playlist(skip_on_empty=True)
if not self.task:
continue
channel_name = playlist.json_data["playlist_channel"]
playlist_title = playlist.json_data["playlist_name"]
message = [
f"Post Processing Playlists for: {channel_name}",
f"{playlist_title} [{idx}/{total}]",
]
progress = idx / total
self.task.send_progress(message, progress=progress)
def add_playlists_to_refresh(self) -> None:
"""add playlists to refresh"""
if self.task:
message = ["Post Processing Playlists", "Scanning for Playlists"]
self.task.send_progress(message)
self._add_playlist_sub()
self._add_channel_playlists()
self._add_video_playlists()
def _add_playlist_sub(self):
"""add subscribed playlists to refresh"""
subs = PlaylistSubscription().get_playlists()
to_add = [i["playlist_id"] for i in subs]
RedisQueue(self.PLAYLIST_QUEUE).add_list(to_add)
def _add_channel_playlists(self):
"""add playlists from channels to refresh"""
queue = RedisQueue(self.CHANNEL_QUEUE)
while True:
channel_id, _ = queue.get_next()
if not channel_id:
break
channel = YoutubeChannel(channel_id)
channel.get_from_es()
overwrites = channel.get_overwrites()
if "index_playlists" in overwrites:
channel.get_all_playlists()
to_add = [i[0] for i in channel.all_playlists]
RedisQueue(self.PLAYLIST_QUEUE).add_list(to_add)
def _add_video_playlists(self):
"""add other playlists for quick sync"""
all_playlists = RedisQueue(self.PLAYLIST_QUEUE).get_all()
must_not = [{"terms": {"playlist_id": all_playlists}}]
video_ids = RedisQueue(self.VIDEO_QUEUE).get_all()
must = [{"terms": {"playlist_entries.youtube_id": video_ids}}]
data = {
"query": {"bool": {"must_not": must_not, "must": must}},
"_source": ["playlist_id"],
}
playlists = IndexPaginate("ta_playlist", data).get_results()
to_add = [i["playlist_id"] for i in playlists]
RedisQueue(self.PLAYLIST_QUICK).add_list(to_add)
def match_videos(self) -> None:
"""scan rest of indexed playlists to match videos"""
queue = RedisQueue(self.PLAYLIST_QUICK)
while True:
total = queue.max_score()
playlist_id, idx = queue.get_next()
if not playlist_id or not idx or not total:
break
playlist = YoutubePlaylist(playlist_id)
playlist.get_from_es()
playlist.add_vids_to_playlist()
playlist.remove_vids_from_playlist()
if not self.task:
continue
message = [
"Post Processing Playlists.",
f"Validate Playlists: - {idx}/{total}",
]
progress = idx / total
self.task.send_progress(message, progress=progress)
def get_comments(self):
"""get comments from youtube"""
video_queue = RedisQueue(self.VIDEO_QUEUE)
comment_list = CommentList(task=self.task)
comment_list.add(video_ids=video_queue.get_all())
video_queue.clear()
comment_list.index()
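The post processing above leans on the reworked RedisQueue; a hedged sketch of its contract as used in this file, with queue name and ids illustrative:

queue = RedisQueue("download:playlist:full")
queue.add_list(["PL0001", "PL0002"])     # enqueue with incrementing scores
total = queue.max_score()                # highest score, used as progress total
while True:
    playlist_id, idx = queue.get_next()  # pop lowest score -> (item, score)
    if not playlist_id:
        break
    print(f"refreshing {playlist_id} [{idx}/{total}]")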

View File

@ -10,19 +10,22 @@ import os
import zipfile
from datetime import datetime
from home.models import CustomPeriodicTask
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.ta.config import AppConfig
from home.src.ta.helper import get_mapping, ignore_filelist
from home.src.ta.settings import EnvironmentSettings
class ElasticBackup:
"""dump index to nd-json files for later bulk import"""
INDEX_SPLIT = ["comment"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
BACKUP_DIR = os.path.join(CACHE_DIR, "backup")
def __init__(self, reason=False, task=False):
self.config = AppConfig().config
self.cache_dir = self.config["application"]["cache_dir"]
self.timestamp = datetime.now().strftime("%Y%m%d")
self.index_config = get_mapping()
self.reason = reason
@ -78,14 +81,13 @@ class ElasticBackup:
def zip_it(self):
"""pack it up into single zip file"""
file_name = f"ta_backup-{self.timestamp}-{self.reason}.zip"
folder = os.path.join(self.cache_dir, "backup")
to_backup = []
for file in os.listdir(folder):
for file in os.listdir(self.BACKUP_DIR):
if file.endswith(".json"):
to_backup.append(os.path.join(folder, file))
to_backup.append(os.path.join(self.BACKUP_DIR, file))
backup_file = os.path.join(folder, file_name)
backup_file = os.path.join(self.BACKUP_DIR, file_name)
comp = zipfile.ZIP_DEFLATED
with zipfile.ZipFile(backup_file, "w", compression=comp) as zip_f:
@ -98,7 +100,7 @@ class ElasticBackup:
def post_bulk_restore(self, file_name):
"""send bulk to es"""
file_path = os.path.join(self.cache_dir, file_name)
file_path = os.path.join(self.CACHE_DIR, file_name)
with open(file_path, "r", encoding="utf-8") as f:
data = f.read()
@ -109,9 +111,7 @@ class ElasticBackup:
def get_all_backup_files(self):
"""build all available backup files for view"""
backup_dir = os.path.join(self.cache_dir, "backup")
backup_files = os.listdir(backup_dir)
all_backup_files = ignore_filelist(backup_files)
all_backup_files = ignore_filelist(os.listdir(self.BACKUP_DIR))
all_available_backups = [
i
for i in all_backup_files
@ -120,24 +120,36 @@ class ElasticBackup:
all_available_backups.sort(reverse=True)
backup_dicts = []
for backup_file in all_available_backups:
file_split = backup_file.split("-")
if len(file_split) == 2:
timestamp = file_split[1].strip(".zip")
reason = False
elif len(file_split) == 3:
timestamp = file_split[1]
reason = file_split[2].strip(".zip")
to_add = {
"filename": backup_file,
"timestamp": timestamp,
"reason": reason,
}
backup_dicts.append(to_add)
for filename in all_available_backups:
data = self.build_backup_file_data(filename)
backup_dicts.append(data)
return backup_dicts
def build_backup_file_data(self, filename):
"""build metadata of single backup file"""
file_path = os.path.join(self.BACKUP_DIR, filename)
if not os.path.exists(file_path):
return False
file_split = filename.split("-")
if len(file_split) == 2:
timestamp = file_split[1].strip(".zip")
reason = False
elif len(file_split) == 3:
timestamp = file_split[1]
reason = file_split[2].strip(".zip")
data = {
"filename": filename,
"file_path": file_path,
"file_size": os.path.getsize(file_path),
"timestamp": timestamp,
"reason": reason,
}
return data
def restore(self, filename):
"""
restore from backup zip file
@ -148,22 +160,19 @@ class ElasticBackup:
def _unpack_zip_backup(self, filename):
"""extract backup zip and return filelist"""
backup_dir = os.path.join(self.cache_dir, "backup")
file_path = os.path.join(backup_dir, filename)
file_path = os.path.join(self.BACKUP_DIR, filename)
with zipfile.ZipFile(file_path, "r") as z:
zip_content = z.namelist()
z.extractall(backup_dir)
z.extractall(self.BACKUP_DIR)
return zip_content
def _restore_json_files(self, zip_content):
"""go through the unpacked files and restore"""
backup_dir = os.path.join(self.cache_dir, "backup")
for idx, json_f in enumerate(zip_content):
self._notify_restore(idx, json_f, len(zip_content))
file_name = os.path.join(backup_dir, json_f)
file_name = os.path.join(self.BACKUP_DIR, json_f)
if not json_f.startswith("es_") or not json_f.endswith(".json"):
os.remove(file_name)
@ -189,7 +198,12 @@ class ElasticBackup:
def rotate_backup(self):
"""delete old backups if needed"""
rotate = self.config["scheduler"]["run_backup_rotate"]
try:
task = CustomPeriodicTask.objects.get(name="run_backup")
except CustomPeriodicTask.DoesNotExist:
return
rotate = task.task_config.get("rotate")
if not rotate:
return
@ -200,13 +214,21 @@ class ElasticBackup:
print("no backup files to rotate")
return
backup_dir = os.path.join(self.cache_dir, "backup")
all_to_delete = auto[rotate:]
for to_delete in all_to_delete:
file_path = os.path.join(backup_dir, to_delete["filename"])
print(f"remove old backup file: {file_path}")
os.remove(file_path)
self.delete_file(to_delete["filename"])
def delete_file(self, filename):
"""delete backup file"""
file_path = os.path.join(self.BACKUP_DIR, filename)
if not os.path.exists(file_path):
print(f"backup file not found: {filename}")
return False
print(f"remove old backup file: {file_path}")
os.remove(file_path)
return file_path
class BackupCallback:
@ -217,6 +239,7 @@ class BackupCallback:
self.index_name = index_name
self.counter = counter
self.timestamp = datetime.now().strftime("%Y%m%d")
self.cache_dir = EnvironmentSettings.CACHE_DIR
def run(self):
"""run the junk task"""
@ -243,9 +266,8 @@ class BackupCallback:
def _write_es_json(self, file_content):
"""write nd-json file for es _bulk API to disk"""
cache_dir = AppConfig().config["application"]["cache_dir"]
index = self.index_name.lstrip("ta_")
file_name = f"es_{index}-{self.timestamp}-{self.counter}.json"
file_path = os.path.join(cache_dir, "backup", file_name)
file_path = os.path.join(self.cache_dir, "backup", file_name)
with open(file_path, "a+", encoding="utf-8") as f:
f.write(file_content)
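
The backup filename doubles as its metadata: ta_backup-{timestamp}-{reason}.zip, with the reason segment optional. A standalone sketch of the split logic that build_backup_file_data applies, using hypothetical filenames:

    def parse_backup_name(filename: str):
        """split ta_backup-{timestamp}[-{reason}].zip into its parts"""
        file_split = filename.split("-")
        if len(file_split) == 2:
            return file_split[1].strip(".zip"), False
        if len(file_split) == 3:
            return file_split[1], file_split[2].strip(".zip")
        return False, False

    print(parse_backup_name("ta_backup-20240515-auto.zip"))  # ('20240515', 'auto')
    print(parse_backup_name("ta_backup-20240515.zip"))       # ('20240515', False)

Note that str.strip(".zip") removes characters from both ends rather than a suffix; it works for the values used here because timestamps and reasons do not start or end with those characters.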

View File

@ -3,14 +3,15 @@ functionality:
- wrapper around requests to call elastic search
- reusable search_after to extract total index
"""
# pylint: disable=missing-timeout
import json
import os
from typing import Any
import requests
import urllib3
from home.src.ta.settings import EnvironmentSettings
class ElasticWrap:
@ -18,16 +19,14 @@ class ElasticWrap:
returns response json and status code tuple
"""
ES_URL: str = str(os.environ.get("ES_URL"))
ES_PASS: str = str(os.environ.get("ELASTIC_PASSWORD"))
ES_USER: str = str(os.environ.get("ELASTIC_USER") or "elastic")
ES_DISABLE_VERIFY_SSL: bool = bool(os.environ.get("ES_DISABLE_VERIFY_SSL"))
def __init__(self, path: str):
self.url: str = f"{self.ES_URL}/{path}"
self.auth: tuple[str, str] = (self.ES_USER, self.ES_PASS)
self.url: str = f"{EnvironmentSettings.ES_URL}/{path}"
self.auth: tuple[str, str] = (
EnvironmentSettings.ES_USER,
EnvironmentSettings.ES_PASS,
)
if self.ES_DISABLE_VERIFY_SSL:
if EnvironmentSettings.ES_DISABLE_VERIFY_SSL:
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)
def get(
@ -43,7 +42,7 @@ class ElasticWrap:
"timeout": timeout,
}
if self.ES_DISABLE_VERIFY_SSL:
if EnvironmentSettings.ES_DISABLE_VERIFY_SSL:
kwargs["verify"] = False
if data:
@ -78,7 +77,7 @@ class ElasticWrap:
}
)
if self.ES_DISABLE_VERIFY_SSL:
if EnvironmentSettings.ES_DISABLE_VERIFY_SSL:
kwargs["verify"] = False
response = requests.post(self.url, **kwargs)
@ -103,7 +102,7 @@ class ElasticWrap:
"auth": self.auth,
}
if self.ES_DISABLE_VERIFY_SSL:
if EnvironmentSettings.ES_DISABLE_VERIFY_SSL:
kwargs["verify"] = False
response = requests.put(self.url, **kwargs)
@ -130,7 +129,7 @@ class ElasticWrap:
if data:
kwargs["json"] = data
if self.ES_DISABLE_VERIFY_SSL:
if EnvironmentSettings.ES_DISABLE_VERIFY_SSL:
kwargs["verify"] = False
response = requests.delete(self.url, **kwargs)
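
Every verb in the wrapper applies the same guard: when ES_DISABLE_VERIFY_SSL is set, verify=False is merged into the requests kwargs, and urllib3's InsecureRequestWarning is silenced once up front. A self-contained sketch of the pattern, with placeholder URL and credentials:

    import os

    import requests
    import urllib3

    DISABLE_VERIFY = bool(os.environ.get("ES_DISABLE_VERIFY_SSL"))
    if DISABLE_VERIFY:
        urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

    def es_get(path: str):
        """GET against ES, skipping cert verification only when configured"""
        kwargs = {"auth": ("elastic", "changeme"), "timeout": 10}  # placeholders
        if DISABLE_VERIFY:
            kwargs["verify"] = False
        response = requests.get(f"https://localhost:9200/{path}", **kwargs)
        return response.json(), response.status_code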

View File

@ -74,7 +74,7 @@
"type": "boolean"
},
"integrate_sponsorblock": {
"type" : "boolean"
"type": "boolean"
}
}
}
@ -168,7 +168,7 @@
"type": "boolean"
},
"integrate_sponsorblock": {
"type" : "boolean"
"type": "boolean"
}
}
}
@ -236,19 +236,37 @@
"comment_count": {
"type": "long"
},
"stats" : {
"properties" : {
"average_rating" : {
"type" : "float"
"stats": {
"properties": {
"average_rating": {
"type": "float"
},
"dislike_count" : {
"type" : "long"
"dislike_count": {
"type": "long"
},
"like_count" : {
"type" : "long"
"like_count": {
"type": "long"
},
"view_count" : {
"type" : "long"
"view_count": {
"type": "long"
}
}
},
"player": {
"properties": {
"duration": {
"type": "long"
},
"duration_str": {
"type": "keyword",
"index": false
},
"watched": {
"type": "boolean"
},
"watched_date": {
"type": "date",
"format": "epoch_second"
}
}
},
@ -314,28 +332,28 @@
"is_enabled": {
"type": "boolean"
},
"segments" : {
"properties" : {
"UUID" : {
"segments": {
"properties": {
"UUID": {
"type": "keyword"
},
"actionType" : {
"actionType": {
"type": "keyword"
},
"category" : {
"category": {
"type": "keyword"
},
"locked" : {
"type" : "short"
"locked": {
"type": "short"
},
"segment" : {
"type" : "float"
"segment": {
"type": "float"
},
"videoDuration" : {
"type" : "float"
"videoDuration": {
"type": "float"
},
"votes" : {
"type" : "long"
"votes": {
"type": "long"
}
}
}
@ -459,6 +477,41 @@
"playlist_last_refresh": {
"type": "date",
"format": "epoch_second"
},
"playlist_entries": {
"properties": {
"downloaded": {
"type": "boolean"
},
"idx": {
"type": "long"
},
"title": {
"type": "text",
"analyzer": "english",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256,
"normalizer": "to_lower"
}
}
},
"uploader": {
"type": "text",
"analyzer": "english",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256,
"normalizer": "to_lower"
}
}
},
"youtube_id": {
"type": "keyword"
}
}
}
},
"expected_set": {
@ -516,7 +569,7 @@
"format": "epoch_second"
},
"subtitle_index": {
"type" : "long"
"type": "long"
},
"subtitle_lang": {
"type": "keyword"
@ -525,7 +578,7 @@
"type": "keyword"
},
"subtitle_line": {
"type" : "text",
"type": "text",
"analyzer": "english"
}
},
@ -560,14 +613,14 @@
"type": "keyword"
},
"comment_text": {
"type" : "text"
"type": "text"
},
"comment_timestamp": {
"type": "date",
"format": "epoch_second"
},
"comment_time_text": {
"type" : "text"
"type": "text"
},
"comment_likecount": {
"type": "long"
@ -613,4 +666,4 @@
}
}
]
}
}

View File

@ -4,12 +4,12 @@ functionality:
"""
from datetime import datetime
from os import environ
from time import sleep
from zoneinfo import ZoneInfo
from home.src.es.connect import ElasticWrap
from home.src.ta.helper import get_mapping
from home.src.ta.settings import EnvironmentSettings
class ElasticSnapshot:
@ -19,9 +19,7 @@ class ElasticSnapshot:
REPO_SETTINGS = {
"compress": "true",
"chunk_size": "1g",
"location": environ.get(
"ES_SNAPSHOT_DIR", "/usr/share/elasticsearch/data/snapshot"
),
"location": EnvironmentSettings.ES_SNAPSHOT_DIR,
}
POLICY = "ta_daily"
@ -256,7 +254,7 @@ class ElasticSnapshot:
expected_format = "%Y-%m-%dT%H:%M:%S.%fZ"
date = datetime.strptime(date_utc, expected_format)
local_datetime = date.replace(tzinfo=ZoneInfo("localtime"))
converted = local_datetime.astimezone(ZoneInfo(environ.get("TZ")))
converted = local_datetime.astimezone(ZoneInfo(EnvironmentSettings.TZ))
converted_str = converted.strftime("%Y-%m-%d %H:%M")
return converted_str
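
Snapshot timestamps come back from Elasticsearch as UTC strings with a trailing Z, so display conversion means parsing with the expected format, attaching a timezone, and shifting into the configured zone. A minimal sketch, assuming the source value is UTC and hardcoding a zone in place of EnvironmentSettings.TZ:

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    date_utc = "2024-05-15T02:00:00.000Z"  # hypothetical snapshot start time
    parsed = datetime.strptime(date_utc, "%Y-%m-%dT%H:%M:%S.%fZ")
    aware = parsed.replace(tzinfo=timezone.utc)
    local = aware.astimezone(ZoneInfo("Europe/Berlin"))  # stand-in for TZ
    print(local.strftime("%Y-%m-%d %H:%M"))  # 2024-05-15 04:00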

View File

@ -1,92 +0,0 @@
"""
Functionality:
- collection of functions and tasks from frontend
- called via user input
"""
from home.src.ta.users import UserConfig
from home.tasks import run_restore_backup
class PostData:
"""
map frontend http post values to backend funcs
handover long running tasks to celery
"""
def __init__(self, post_dict, current_user):
self.post_dict = post_dict
self.to_exec, self.exec_val = list(post_dict.items())[0]
self.current_user = current_user
def run_task(self):
"""execute and return task result"""
to_exec = self.exec_map()
task_result = to_exec()
return task_result
def exec_map(self):
"""map dict key and return function to execute"""
exec_map = {
"change_view": self._change_view,
"change_grid": self._change_grid,
"sort_order": self._sort_order,
"hide_watched": self._hide_watched,
"show_subed_only": self._show_subed_only,
"show_ignored_only": self._show_ignored_only,
"db-restore": self._db_restore,
}
return exec_map[self.to_exec]
def _change_view(self):
"""process view changes in home, channel, and downloads"""
view, setting = self.exec_val.split(":")
UserConfig(self.current_user).set_value(f"view_style_{view}", setting)
return {"success": True}
def _change_grid(self):
"""process change items in grid"""
grid_items = int(self.exec_val)
grid_items = max(grid_items, 3)
grid_items = min(grid_items, 7)
UserConfig(self.current_user).set_value("grid_items", grid_items)
return {"success": True}
def _sort_order(self):
"""change the sort between published to downloaded"""
if self.exec_val in ["asc", "desc"]:
UserConfig(self.current_user).set_value(
"sort_order", self.exec_val
)
else:
UserConfig(self.current_user).set_value("sort_by", self.exec_val)
return {"success": True}
def _hide_watched(self):
"""toggle if to show watched vids or not"""
UserConfig(self.current_user).set_value(
"hide_watched", bool(int(self.exec_val))
)
return {"success": True}
def _show_subed_only(self):
"""show or hide subscribed channels only on channels page"""
UserConfig(self.current_user).set_value(
"show_subed_only", bool(int(self.exec_val))
)
return {"success": True}
def _show_ignored_only(self):
"""switch view on /downloads/ to show ignored only"""
UserConfig(self.current_user).set_value(
"show_ignored_only", bool(int(self.exec_val))
)
return {"success": True}
def _db_restore(self):
"""restore es zip from settings page"""
print("restoring index from backup zip")
filename = self.exec_val
run_restore_backup.delay(filename)
return {"success": True}

View File

@ -2,9 +2,12 @@
- hold all form classes used in the views
"""
import os
from django import forms
from django.contrib.auth.forms import AuthenticationForm
from django.forms.widgets import PasswordInput, TextInput
from home.src.ta.helper import get_stylesheets
class CustomAuthForm(AuthenticationForm):
@ -29,14 +32,16 @@ class CustomAuthForm(AuthenticationForm):
class UserSettingsForm(forms.Form):
"""user configurations values"""
CHOICES = [
("", "-- change color scheme --"),
("dark", "Dark"),
("light", "Light"),
]
STYLESHEET_CHOICES = [("", "-- change stylesheet --")]
STYLESHEET_CHOICES.extend(
[
(stylesheet, os.path.splitext(stylesheet)[0].title())
for stylesheet in get_stylesheets()
]
)
colors = forms.ChoiceField(
widget=forms.Select, choices=CHOICES, required=False
stylesheet = forms.ChoiceField(
widget=forms.Select, choices=STYLESHEET_CHOICES, required=False
)
page_size = forms.IntegerField(required=False)
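
The stylesheet dropdown is built at import time: get_stylesheets() lists the available CSS files and the comprehension turns each into a (value, label) pair, stripping the extension and title-casing the name for display. A quick sketch of the result, with hypothetical filenames standing in for get_stylesheets():

    import os

    stylesheets = ["dark.css", "matrix.css", "midnight.css"]  # hypothetical
    choices = [("", "-- change stylesheet --")]
    choices.extend(
        (sheet, os.path.splitext(sheet)[0].title()) for sheet in stylesheets
    )
    print(choices)
    # [('', '-- change stylesheet --'), ('dark.css', 'Dark'),
    #  ('matrix.css', 'Matrix'), ('midnight.css', 'Midnight')]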
@ -100,8 +105,8 @@ class ApplicationSettingsForm(forms.Form):
COOKIE_IMPORT_CHOICES = [
("", "-- change cookie settings"),
("0", "disable cookie"),
("1", "enable cookie"),
("0", "remove cookie"),
("1", "import cookie"),
]
subscriptions_channel_size = forms.IntegerField(
@ -154,50 +159,6 @@ class ApplicationSettingsForm(forms.Form):
)
class SchedulerSettingsForm(forms.Form):
"""handle scheduler settings"""
HELP_TEXT = "Add Apprise notification URLs, one per line"
update_subscribed = forms.CharField(required=False)
update_subscribed_notify = forms.CharField(
label=False,
widget=forms.Textarea(
attrs={
"rows": 2,
"placeholder": HELP_TEXT,
}
),
required=False,
)
download_pending = forms.CharField(required=False)
download_pending_notify = forms.CharField(
label=False,
widget=forms.Textarea(
attrs={
"rows": 2,
"placeholder": HELP_TEXT,
}
),
required=False,
)
check_reindex = forms.CharField(required=False)
check_reindex_notify = forms.CharField(
label=False,
widget=forms.Textarea(
attrs={
"rows": 2,
"placeholder": HELP_TEXT,
}
),
required=False,
)
check_reindex_days = forms.IntegerField(required=False)
thumbnail_check = forms.CharField(required=False)
run_backup = forms.CharField(required=False)
run_backup_rotate = forms.IntegerField(required=False)
class MultiSearchForm(forms.Form):
"""multi search form for /search/"""
@ -260,6 +221,20 @@ class SubscribeToPlaylistForm(forms.Form):
)
class CreatePlaylistForm(forms.Form):
"""text area form to create a single custom playlist"""
create = forms.CharField(
label="Or create custom playlist",
widget=forms.Textarea(
attrs={
"rows": 1,
"placeholder": "Input playlist name",
}
),
)
class ChannelOverwriteForm(forms.Form):
"""custom overwrites for channel settings"""

View File

@ -0,0 +1,101 @@
"""
Functionality:
- handle schedule forms
- implement form validation
"""
from celery.schedules import crontab
from django import forms
from home.src.ta.task_config import TASK_CONFIG
class CrontabValidator:
"""validate crontab"""
@staticmethod
def validate_fields(cron_fields):
"""expect 3 cron fields"""
if not len(cron_fields) == 3:
raise forms.ValidationError("expected three cron schedule fields")
@staticmethod
def validate_minute(minute_field):
"""expect minute int"""
try:
minute_value = int(minute_field)
if not 0 <= minute_value <= 59:
raise forms.ValidationError(
"Invalid value for minutes. Must be between 0 and 59."
)
except ValueError as err:
raise forms.ValidationError(
"Invalid value for minutes. Must be an integer."
) from err
@staticmethod
def validate_cron_tab(minute, hour, day_of_week):
"""check if crontab can be created"""
try:
crontab(minute=minute, hour=hour, day_of_week=day_of_week)
except ValueError as err:
raise forms.ValidationError(f"invalid crontab: {err}") from err
def validate(self, cron_expression):
"""create crontab schedule"""
if cron_expression == "auto":
return
cron_fields = cron_expression.split()
self.validate_fields(cron_fields)
minute, hour, day_of_week = cron_fields
self.validate_minute(minute)
self.validate_cron_tab(minute, hour, day_of_week)
def validate_cron(cron_expression):
"""callable for field"""
CrontabValidator().validate(cron_expression)
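
The validator accepts either the literal "auto" or a three-field cron expression of minute, hour and day_of_week; the minute must be a plain integer between 0 and 59, and the full triple must be accepted by celery's crontab(). A usage sketch against the validate_cron callable defined above:

    from django import forms

    validate_cron("auto")    # passes, returns before any field splitting
    validate_cron("30 8 *")  # passes: minute 30, hour 8, any weekday
    try:
        validate_cron("*/15 8 *")  # step syntax fails the integer minute check
    except forms.ValidationError as err:
        print(err.messages)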
class SchedulerSettingsForm(forms.Form):
"""handle scheduler settings"""
update_subscribed = forms.CharField(
required=False, validators=[validate_cron]
)
download_pending = forms.CharField(
required=False, validators=[validate_cron]
)
check_reindex = forms.CharField(required=False, validators=[validate_cron])
check_reindex_days = forms.IntegerField(required=False)
thumbnail_check = forms.CharField(
required=False, validators=[validate_cron]
)
run_backup = forms.CharField(required=False, validators=[validate_cron])
run_backup_rotate = forms.IntegerField(required=False)
class NotificationSettingsForm(forms.Form):
"""add notification URL"""
SUPPORTED_TASKS = [
"update_subscribed",
"extract_download",
"download_pending",
"check_reindex",
]
TASK_LIST = [(i, TASK_CONFIG[i]["title"]) for i in SUPPORTED_TASKS]
TASK_CHOICES = [("", "-- select task --")]
TASK_CHOICES.extend(TASK_LIST)
PLACEHOLDER = "Apprise notification URL"
task = forms.ChoiceField(
widget=forms.Select, choices=TASK_CHOICES, required=False
)
notification_url = forms.CharField(
required=False,
widget=forms.TextInput(attrs={"placeholder": PLACEHOLDER}),
)

View File

@ -6,7 +6,6 @@ Functionality:
- calculate pagination values
"""
from api.src.search_processor import SearchProcess
from home.src.es.connect import ElasticWrap

View File

@ -6,14 +6,44 @@ functionality:
import json
import os
import re
from datetime import datetime
from home.src.download import queue # partial import
import requests
from home.src.download.thumbnails import ThumbManager
from home.src.download.yt_dlp_base import YtWrap
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.index.generic import YouTubeItem
from home.src.index.playlist import YoutubePlaylist
from home.src.ta.helper import requests_headers
from home.src.ta.settings import EnvironmentSettings
def banner_extractor(channel_id: str) -> dict[str, str] | None:
"""workaround for new channel renderer, upstream #9893"""
url = f"https://www.youtube.com/channel/{channel_id}?hl=en"
cookies = {"SOCS": "CAI"}
response = requests.get(
url, cookies=cookies, headers=requests_headers(), timeout=30
)
if not response.ok:
return None
matched_urls = re.findall(
r'"(https://yt3.googleusercontent.com/[^"]+=w(\d{3,4})-fcrop64[^"]*)"',
response.text,
)
if not matched_urls:
return None
sorted_urls = sorted(matched_urls, key=lambda x: int(x[1]), reverse=True)
banner = sorted_urls[0][0]
channel_art_fallback = {
"channel_banner_url": banner,
"channel_tvart_url": banner.split("-fcrop64")[0],
}
return channel_art_fallback
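
The fallback scrapes the channel page directly: the regex captures every googleusercontent artwork URL along with the pixel width encoded in its =w{width}-fcrop64 sizing parameter, and sorting by that width picks the largest rendition. A small sketch of the selection step on hypothetical matches:

    matched_urls = [
        ("https://yt3.googleusercontent.com/abc=w1060-fcrop64=1,00005a57", "1060"),
        ("https://yt3.googleusercontent.com/abc=w2120-fcrop64=1,00005a57", "2120"),
    ]
    sorted_urls = sorted(matched_urls, key=lambda x: int(x[1]), reverse=True)
    banner = sorted_urls[0][0]           # widest banner rendition wins
    tvart = banner.split("-fcrop64")[0]  # same art without the crop directive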
class YoutubeChannel(YouTubeItem):
@ -23,8 +53,8 @@ class YoutubeChannel(YouTubeItem):
index_name = "ta_channel"
yt_base = "https://www.youtube.com/channel/"
yt_obs = {
"extract_flat": True,
"allow_playlist_files": True,
"playlist_items": "1,0",
"skip_download": True,
}
def __init__(self, youtube_id, task=False):
@ -32,10 +62,6 @@ class YoutubeChannel(YouTubeItem):
self.all_playlists = False
self.task = task
def build_yt_url(self):
"""overwrite base to use channel about page"""
return f"{self.yt_base}{self.youtube_id}/about"
def build_json(self, upload=False, fallback=False):
"""get from es or from youtube"""
self.get_from_es()
@ -55,21 +81,48 @@ class YoutubeChannel(YouTubeItem):
def process_youtube_meta(self):
"""extract relevant fields"""
self.youtube_meta["thumbnails"].reverse()
channel_subs = self.youtube_meta.get("channel_follower_count") or 0
self.json_data = {
"channel_active": True,
"channel_description": self.youtube_meta.get("description", False),
"channel_id": self.youtube_id,
"channel_last_refresh": int(datetime.now().timestamp()),
"channel_name": self.youtube_meta["uploader"],
"channel_subs": channel_subs,
"channel_subs": self._extract_follower_count(),
"channel_subscribed": False,
"channel_tags": self._parse_tags(self.youtube_meta.get("tags")),
"channel_banner_url": self._get_banner_art(),
"channel_thumb_url": self._get_thumb_art(),
"channel_tvart_url": self._get_tv_art(),
"channel_views": self.youtube_meta.get("view_count", 0),
"channel_views": self.youtube_meta.get("view_count") or 0,
}
self._inject_fallback()
def _inject_fallback(self):
"""fallback channel art work, workaround for upstream #9893"""
if self.json_data["channel_banner_url"]:
return
print(f"{self.youtube_id}: attempt art fallback extraction")
fallback = banner_extractor(self.youtube_id)
if fallback:
print(f"{self.youtube_id}: fallback succeeded: {fallback}")
self.json_data.update(fallback)
def _extract_follower_count(self) -> int:
"""workaround for upstream #9893, extract subs from first video"""
subs = self.youtube_meta.get("channel_follower_count")
if subs is not None:
return subs
entries = self.youtube_meta.get("entries", [])
if entries:
first_entry = entries[0]
if isinstance(first_entry, dict):
subs_entry = first_entry.get("channel_follower_count")
if subs_entry is not None:
return subs_entry
return 0
def _parse_tags(self, tags):
"""parse channel tags"""
@ -134,7 +187,7 @@ class YoutubeChannel(YouTubeItem):
def _info_json_fallback(self):
"""read channel info.json for additional metadata"""
info_json = os.path.join(
self.config["application"]["cache_dir"],
EnvironmentSettings.CACHE_DIR,
"import",
f"{self.youtube_id}.info.json",
)
@ -178,7 +231,7 @@ class YoutubeChannel(YouTubeItem):
def get_folder_path(self):
"""get folder where media files get stored"""
folder_path = os.path.join(
self.app_conf["videos"],
EnvironmentSettings.MEDIA_DIR,
self.json_data["channel_id"],
)
return folder_path
@ -201,12 +254,20 @@ class YoutubeChannel(YouTubeItem):
}
_, _ = ElasticWrap("ta_comment/_delete_by_query").post(data)
def delete_es_subtitles(self):
"""delete all subtitles from this channel"""
data = {
"query": {
"term": {"subtitle_channel_id": {"value": self.youtube_id}}
}
}
_, _ = ElasticWrap("ta_subtitle/_delete_by_query").post(data)
def delete_playlists(self):
"""delete all indexed playlist from es"""
all_playlists = self.get_indexed_playlists()
for playlist in all_playlists:
playlist_id = playlist["playlist_id"]
YoutubePlaylist(playlist_id).delete_metadata()
YoutubePlaylist(playlist["playlist_id"]).delete_metadata()
def delete_channel(self):
"""delete channel and all videos"""
@ -231,6 +292,7 @@ class YoutubeChannel(YouTubeItem):
print(f"{self.youtube_id}: delete indexed videos")
self.delete_es_videos()
self.delete_es_comments()
self.delete_es_subtitles()
self.del_in_es()
def index_channel_playlists(self):
@ -244,13 +306,12 @@ class YoutubeChannel(YouTubeItem):
print(f"{self.youtube_id}: no playlists found.")
return
all_youtube_ids = self.get_all_video_ids()
total = len(self.all_playlists)
for idx, playlist in enumerate(self.all_playlists):
if self.task:
self._notify_single_playlist(idx, total)
self._index_single_playlist(playlist, all_youtube_ids)
self._index_single_playlist(playlist)
print("add playlist: " + playlist[1])
def _notify_single_playlist(self, idx, total):
@ -263,32 +324,10 @@ class YoutubeChannel(YouTubeItem):
self.task.send_progress(message, progress=(idx + 1) / total)
@staticmethod
def _index_single_playlist(playlist, all_youtube_ids):
def _index_single_playlist(playlist):
"""add single playlist if needed"""
playlist = YoutubePlaylist(playlist[0])
playlist.all_youtube_ids = all_youtube_ids
playlist.build_json()
if not playlist.json_data:
return
entries = playlist.json_data["playlist_entries"]
downloaded = [i for i in entries if i["downloaded"]]
if not downloaded:
return
playlist.upload_to_es()
playlist.add_vids_to_playlist()
playlist.get_playlist_art()
@staticmethod
def get_all_video_ids():
"""match all playlists with videos"""
handler = queue.PendingList()
handler.get_download()
handler.get_indexed()
all_youtube_ids = [i["youtube_id"] for i in handler.all_videos]
return all_youtube_ids
playlist.update_playlist(skip_on_empty=True)
def get_channel_videos(self):
"""get all videos from channel"""
@ -325,9 +364,9 @@ class YoutubeChannel(YouTubeItem):
all_playlists = IndexPaginate("ta_playlist", data).get_results()
return all_playlists
def get_overwrites(self):
def get_overwrites(self) -> dict:
"""get all per channel overwrites"""
return self.json_data.get("channel_overwrites", False)
return self.json_data.get("channel_overwrites", {})
def set_overwrites(self, overwrites):
"""set per channel overwrites"""

View File

@ -10,6 +10,7 @@ from datetime import datetime
from home.src.download.yt_dlp_base import YtWrap
from home.src.es.connect import ElasticWrap
from home.src.ta.config import AppConfig
from home.src.ta.ta_redis import RedisQueue
class Comments:
@ -68,6 +69,7 @@ class Comments:
"youtube": {
"max_comments": max_comments_list,
"comment_sort": [comment_sort],
"player_client": ["ios", "web"], # workaround yt-dlp #9554
}
},
}
@ -77,7 +79,7 @@ class Comments:
def get_yt_comments(self):
"""get comments from youtube"""
yt_obs = self.build_yt_obs()
info_json = YtWrap(yt_obs).extract(self.youtube_id)
info_json = YtWrap(yt_obs, config=self.config).extract(self.youtube_id)
if not info_json:
return False, False
@ -115,6 +117,9 @@ class Comments:
time_text = time_text_datetime.strftime(format_string)
if not comment.get("author"):
comment["author"] = comment.get("author_id", "Unknown")
cleaned_comment = {
"comment_id": comment["id"],
"comment_text": comment["text"].replace("\xa0", ""),
@ -126,7 +131,7 @@ class Comments:
"comment_author_id": comment["author_id"],
"comment_author_thumbnail": comment["author_thumbnail"],
"comment_author_is_uploader": comment.get(
"comment_author_is_uploader", False
"author_is_uploader", False
),
"comment_parent": comment["parent"],
}
@ -185,20 +190,30 @@ class Comments:
class CommentList:
"""interact with comments in group"""
def __init__(self, video_ids, task=False):
self.video_ids = video_ids
COMMENT_QUEUE = "index:comment"
def __init__(self, task=False):
self.task = task
self.config = AppConfig().config
def index(self):
"""index comments for list, init with task object to notify"""
def add(self, video_ids: list[str]) -> None:
"""add list of videos to get comments, if enabled in config"""
if not self.config["downloads"].get("comment_max"):
return
total_videos = len(self.video_ids)
for idx, youtube_id in enumerate(self.video_ids):
RedisQueue(self.COMMENT_QUEUE).add_list(video_ids)
def index(self):
"""run comment index"""
queue = RedisQueue(self.COMMENT_QUEUE)
while True:
total = queue.max_score()
youtube_id, idx = queue.get_next()
if not youtube_id or not idx or not total:
break
if self.task:
self.notify(idx, total_videos)
self.notify(idx, total)
comment = Comments(youtube_id, config=self.config)
comment.build_json()
@ -207,6 +222,6 @@ class CommentList:
def notify(self, idx, total_videos):
"""send notification on task"""
message = [f"Add comments for new videos {idx + 1}/{total_videos}"]
progress = (idx + 1) / total_videos
message = [f"Add comments for new videos {idx}/{total_videos}"]
progress = idx / total_videos
self.task.send_progress(message, progress=progress)
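
Comment extraction rides on yt-dlp's extractor_args for the youtube extractor: max_comments caps the totals, comment_sort selects top or new, and the ios/web player_client pair is the workaround referenced above. A minimal standalone sketch with assumed limit values, built on yt-dlp's documented options:

    from yt_dlp import YoutubeDL

    ydl_opts = {
        "skip_download": True,
        "getcomments": True,
        "extractor_args": {
            "youtube": {
                "max_comments": ["all,100,all,30"],  # hypothetical limits
                "comment_sort": ["top"],
                "player_client": ["ios", "web"],  # workaround yt-dlp #9554
            }
        },
    }
    with YoutubeDL(ydl_opts) as ydl:
        info = ydl.extract_info("dQw4w9WgXcQ", download=False)  # example ID
    comments = info.get("comments") or []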

View File

@ -8,14 +8,14 @@ import os
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.index.comments import CommentList
from home.src.index.video import YoutubeVideo, index_new_video
from home.src.ta.config import AppConfig
from home.src.ta.helper import ignore_filelist
from home.src.ta.settings import EnvironmentSettings
class Scanner:
"""scan index and filesystem"""
VIDEOS: str = AppConfig().config["application"]["videos"]
VIDEOS: str = EnvironmentSettings.MEDIA_DIR
def __init__(self, task=False) -> None:
self.task = task
@ -83,13 +83,15 @@ class Scanner:
if self.task:
self.task.send_progress(
message_lines=[
f"Index missing video {youtube_id}, {idx}/{total}"
f"Index missing video {youtube_id}, {idx + 1}/{total}"
],
progress=(idx + 1) / total,
)
index_new_video(youtube_id)
CommentList(self.to_index, task=self.task).index()
comment_list = CommentList(task=self.task)
comment_list.add(video_ids=list(self.to_index))
comment_list.index()
def url_fix(self) -> None:
"""

View File

@ -17,7 +17,7 @@ class YouTubeItem:
es_path = False
index_name = ""
yt_base = ""
yt_obs = {
yt_obs: dict[str, bool | str] = {
"skip_download": True,
"noplaylist": True,
}
@ -26,7 +26,6 @@ class YouTubeItem:
self.youtube_id = youtube_id
self.es_path = f"{self.index_name}/_doc/{youtube_id}"
self.config = AppConfig().config
self.app_conf = self.config["application"]
self.youtube_meta = False
self.json_data = False

View File

@ -16,6 +16,7 @@ from home.src.index.comments import CommentList
from home.src.index.video import YoutubeVideo
from home.src.ta.config import AppConfig
from home.src.ta.helper import ignore_filelist
from home.src.ta.settings import EnvironmentSettings
from PIL import Image
from yt_dlp.utils import ISO639Utils
@ -28,7 +29,7 @@ class ImportFolderScanner:
"""
CONFIG = AppConfig().config
CACHE_DIR = CONFIG["application"]["cache_dir"]
CACHE_DIR = EnvironmentSettings.CACHE_DIR
IMPORT_DIR = os.path.join(CACHE_DIR, "import")
"""All extensions should be in lowercase until better handling is in place.
@ -146,7 +147,9 @@ class ImportFolderScanner:
ManualImport(current_video, self.CONFIG).run()
video_ids = [i["video_id"] for i in self.to_import]
CommentList(video_ids, task=self.task).index()
comment_list = CommentList(task=self.task)
comment_list.add(video_ids=video_ids)
comment_list.index()
def _notify(self, idx, current_video):
"""send notification back to task"""
@ -433,9 +436,9 @@ class ManualImport:
def _move_to_archive(self, json_data):
"""move identified media file to archive"""
videos = self.config["application"]["videos"]
host_uid = self.config["application"]["HOST_UID"]
host_gid = self.config["application"]["HOST_GID"]
videos = EnvironmentSettings.MEDIA_DIR
host_uid = EnvironmentSettings.HOST_UID
host_gid = EnvironmentSettings.HOST_GID
channel, file = os.path.split(json_data["media_url"])
channel_folder = os.path.join(videos, channel)
@ -472,7 +475,7 @@ class ManualImport:
os.remove(subtitle_file)
channel_info = os.path.join(
self.config["application"]["cache_dir"],
EnvironmentSettings.CACHE_DIR,
"import",
f"{json_data['channel']['channel_id']}.info.json",
)

View File

@ -8,7 +8,8 @@ import json
from datetime import datetime
from home.src.download.thumbnails import ThumbManager
from home.src.es.connect import ElasticWrap
from home.src.es.connect import ElasticWrap, IndexPaginate
from home.src.index import channel
from home.src.index.generic import YouTubeItem
from home.src.index.video import YoutubeVideo
@ -28,7 +29,6 @@ class YoutubePlaylist(YouTubeItem):
super().__init__(youtube_id)
self.all_members = False
self.nav = False
self.all_youtube_ids = []
def build_json(self, scrape=False):
"""collection to create json_data"""
@ -45,7 +45,9 @@ class YoutubePlaylist(YouTubeItem):
return
self.process_youtube_meta()
self.get_entries()
self._ensure_channel()
ids_found = self.get_local_vids()
self.get_entries(ids_found)
self.json_data["playlist_entries"] = self.all_members
self.json_data["playlist_subscribed"] = subscribed
@ -66,27 +68,40 @@ class YoutubePlaylist(YouTubeItem):
"playlist_thumbnail": playlist_thumbnail,
"playlist_description": self.youtube_meta["description"] or False,
"playlist_last_refresh": int(datetime.now().timestamp()),
"playlist_type": "regular",
}
def get_entries(self, playlistend=False):
"""get all videos in playlist"""
if playlistend:
# implement playlist end
print(playlistend)
def _ensure_channel(self):
"""make sure channel is indexed"""
channel_id = self.json_data["playlist_channel_id"]
channel_handler = channel.YoutubeChannel(channel_id)
channel_handler.build_json(upload=True)
def get_local_vids(self) -> list[str]:
"""get local video ids from youtube entries"""
entries = self.youtube_meta["entries"]
data = {
"query": {"terms": {"youtube_id": [i["id"] for i in entries]}},
"_source": ["youtube_id"],
}
indexed_vids = IndexPaginate("ta_video", data).get_results()
ids_found = [i["youtube_id"] for i in indexed_vids]
return ids_found
def get_entries(self, ids_found) -> None:
"""get all videos in playlist, match downloaded with ids_found"""
all_members = []
for idx, entry in enumerate(self.youtube_meta["entries"]):
if self.all_youtube_ids:
downloaded = entry["id"] in self.all_youtube_ids
else:
downloaded = False
if not entry["channel"]:
continue
to_append = {
"youtube_id": entry["id"],
"title": entry["title"],
"uploader": entry["channel"],
"idx": idx,
"downloaded": downloaded,
"downloaded": entry["id"] in ids_found,
}
all_members.append(to_append)
@ -127,17 +142,50 @@ class YoutubePlaylist(YouTubeItem):
ElasticWrap("_bulk").post(query_str, ndjson=True)
def update_playlist(self):
def remove_vids_from_playlist(self):
"""remove playlist ids from videos if needed"""
needed = [i["youtube_id"] for i in self.json_data["playlist_entries"]]
data = {
"query": {"match": {"playlist": self.youtube_id}},
"_source": ["youtube_id"],
}
result = IndexPaginate("ta_video", data).get_results()
to_remove = [
i["youtube_id"] for i in result if i["youtube_id"] not in needed
]
s = "ctx._source.playlist.removeAll(Collections.singleton(params.rm))"
for video_id in to_remove:
query = {
"script": {
"source": s,
"lang": "painless",
"params": {"rm": self.youtube_id},
},
"query": {"match": {"youtube_id": video_id}},
}
path = "ta_video/_update_by_query"
_, status_code = ElasticWrap(path).post(query)
if status_code == 200:
print(f"{self.youtube_id}: removed {video_id} from playlist")
def update_playlist(self, skip_on_empty=False):
"""update metadata for playlist with data from YouTube"""
self.get_from_es()
subscribed = self.json_data["playlist_subscribed"]
self.get_from_youtube()
self.build_json(scrape=True)
if not self.json_data:
# return false to deactivate
return False
self.json_data["playlist_subscribed"] = subscribed
if skip_on_empty:
has_item_downloaded = any(
i["downloaded"] for i in self.json_data["playlist_entries"]
)
if not has_item_downloaded:
return True
self.upload_to_es()
self.add_vids_to_playlist()
self.remove_vids_from_playlist()
self.get_playlist_art()
return True
def build_nav(self, youtube_id):
@ -178,6 +226,7 @@ class YoutubePlaylist(YouTubeItem):
def delete_metadata(self):
"""delete metadata for playlist"""
self.delete_videos_metadata()
script = (
"ctx._source.playlist.removeAll("
+ "Collections.singleton(params.playlist)) "
@ -195,6 +244,30 @@ class YoutubePlaylist(YouTubeItem):
_, _ = ElasticWrap("ta_video/_update_by_query").post(data)
self.del_in_es()
def is_custom_playlist(self):
self.get_from_es()
return self.json_data["playlist_type"] == "custom"
def delete_videos_metadata(self, channel_id=None):
"""delete video metadata for a specific channel"""
self.get_from_es()
playlist = self.json_data["playlist_entries"]
i = 0
while i < len(playlist):
video_id = playlist[i]["youtube_id"]
video = YoutubeVideo(video_id)
video.get_from_es()
if (
channel_id is None
or video.json_data["channel"]["channel_id"] == channel_id
):
playlist.pop(i)
self.remove_playlist_from_video(video_id)
i -= 1
i += 1
self.set_playlist_thumbnail()
self.upload_to_es()
def delete_videos_playlist(self):
"""delete playlist with all videos"""
print(f"{self.youtube_id}: delete playlist")
@ -208,3 +281,159 @@ class YoutubePlaylist(YouTubeItem):
YoutubeVideo(youtube_id).delete_media_file()
self.delete_metadata()
def create(self, name):
self.json_data = {
"playlist_id": self.youtube_id,
"playlist_active": False,
"playlist_name": name,
"playlist_last_refresh": int(datetime.now().timestamp()),
"playlist_entries": [],
"playlist_type": "custom",
"playlist_channel": None,
"playlist_channel_id": None,
"playlist_description": False,
"playlist_thumbnail": False,
"playlist_subscribed": False,
}
self.upload_to_es()
self.get_playlist_art()
return True
def add_video_to_playlist(self, video_id):
self.get_from_es()
video_metadata = self.get_video_metadata(video_id)
video_metadata["idx"] = len(self.json_data["playlist_entries"])
if not self.playlist_entries_contains(video_id):
self.json_data["playlist_entries"].append(video_metadata)
self.json_data["playlist_last_refresh"] = int(
datetime.now().timestamp()
)
self.set_playlist_thumbnail()
self.upload_to_es()
video = YoutubeVideo(video_id)
video.get_from_es()
if "playlist" not in video.json_data:
video.json_data["playlist"] = []
video.json_data["playlist"].append(self.youtube_id)
video.upload_to_es()
return True
def remove_playlist_from_video(self, video_id):
video = YoutubeVideo(video_id)
video.get_from_es()
if video.json_data is not None and "playlist" in video.json_data:
video.json_data["playlist"].remove(self.youtube_id)
video.upload_to_es()
def move_video(self, video_id, action, hide_watched=False):
self.get_from_es()
video_index = self.get_video_index(video_id)
playlist = self.json_data["playlist_entries"]
item = playlist[video_index]
playlist.pop(video_index)
if action == "remove":
self.remove_playlist_from_video(item["youtube_id"])
else:
if action == "up":
while True:
video_index = max(0, video_index - 1)
if (
not hide_watched
or video_index == 0
or (
not self.get_video_is_watched(
playlist[video_index]["youtube_id"]
)
)
):
break
elif action == "down":
while True:
video_index = min(len(playlist), video_index + 1)
if (
not hide_watched
or video_index == len(playlist)
or (
not self.get_video_is_watched(
playlist[video_index - 1]["youtube_id"]
)
)
):
break
elif action == "top":
video_index = 0
else:
video_index = len(playlist)
playlist.insert(video_index, item)
self.json_data["playlist_last_refresh"] = int(
datetime.now().timestamp()
)
for i, item in enumerate(playlist):
item["idx"] = i
self.set_playlist_thumbnail()
self.upload_to_es()
return True
def del_video(self, video_id):
playlist = self.json_data["playlist_entries"]
i = 0
while i < len(playlist):
if video_id == playlist[i]["youtube_id"]:
playlist.pop(i)
self.set_playlist_thumbnail()
i -= 1
i += 1
def get_video_index(self, video_id):
for i, child in enumerate(self.json_data["playlist_entries"]):
if child["youtube_id"] == video_id:
return i
return -1
def playlist_entries_contains(self, video_id):
return (
len(
list(
filter(
lambda x: x["youtube_id"] == video_id,
self.json_data["playlist_entries"],
)
)
)
> 0
)
def get_video_is_watched(self, video_id):
video = YoutubeVideo(video_id)
video.get_from_es()
return video.json_data["player"]["watched"]
def set_playlist_thumbnail(self):
playlist = self.json_data["playlist_entries"]
self.json_data["playlist_thumbnail"] = False
for video in playlist:
url = ThumbManager(video["youtube_id"]).vid_thumb_path()
if url is not None:
self.json_data["playlist_thumbnail"] = url
break
self.get_playlist_art()
def get_video_metadata(self, video_id):
video = YoutubeVideo(video_id)
video.get_from_es()
video_json_data = {
"youtube_id": video.json_data["youtube_id"],
"title": video.json_data["title"],
"uploader": video.json_data["channel"]["channel_name"],
"idx": 0,
"downloaded": "date_downloaded" in video.json_data
and video.json_data["date_downloaded"] > 0,
}
return video_json_data
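
With skip_on_empty=True, update_playlist only persists playlists that contain at least one downloaded entry, which is how index_channel_playlists avoids filling the index with playlists the archive holds nothing from. A usage sketch mirroring the reindex path:

    from home.src.index.playlist import YoutubePlaylist

    playlist = YoutubePlaylist("PLxxxxxxxx")  # hypothetical playlist ID
    is_active = playlist.update_playlist(skip_on_empty=True)
    if not is_active:
        playlist.deactivate()  # scrape failed, mark inactive as the reindexer does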

View File

@ -8,8 +8,9 @@ import json
import os
from datetime import datetime
from time import sleep
from typing import Callable, TypedDict
from home.src.download.queue import PendingList
from home.models import CustomPeriodicTask
from home.src.download.subscriptions import ChannelSubscription
from home.src.download.thumbnails import ThumbManager
from home.src.download.yt_dlp_base import CookieHandler
@ -19,13 +20,23 @@ from home.src.index.comments import Comments
from home.src.index.playlist import YoutubePlaylist
from home.src.index.video import YoutubeVideo
from home.src.ta.config import AppConfig
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisQueue
class ReindexConfigType(TypedDict):
"""represents config type"""
index_name: str
queue_name: str
active_key: str
refresh_key: str
class ReindexBase:
"""base config class for reindex task"""
REINDEX_CONFIG = {
REINDEX_CONFIG: dict[str, ReindexConfigType] = {
"video": {
"index_name": "ta_video",
"queue_name": "reindex:ta_video",
@ -52,25 +63,36 @@ class ReindexBase:
def __init__(self):
self.config = AppConfig().config
self.now = int(datetime.now().timestamp())
self.total = None
def populate(self, all_ids, reindex_config):
def populate(self, all_ids, reindex_config: ReindexConfigType):
"""add all to reindex ids to redis queue"""
if not all_ids:
return
RedisQueue(queue_name=reindex_config["queue_name"]).add_list(all_ids)
self.total = None
class ReindexPopulate(ReindexBase):
"""add outdated and recent documents to reindex queue"""
INTERVAL_DEFAULT: int = 90
def __init__(self):
super().__init__()
self.interval = self.config["scheduler"]["check_reindex_days"]
self.interval = self.INTERVAL_DEFAULT
def add_recent(self):
def get_interval(self) -> None:
"""get reindex days interval from task"""
try:
task = CustomPeriodicTask.objects.get(name="check_reindex")
except CustomPeriodicTask.DoesNotExist:
return
task_config = task.task_config
if task_config.get("days"):
self.interval = task_config.get("days")
def add_recent(self) -> None:
"""add recent videos to refresh"""
gte = datetime.fromtimestamp(self.now - self.DAYS3).date().isoformat()
must_list = [
@ -88,10 +110,10 @@ class ReindexPopulate(ReindexBase):
return
all_ids = [i["_source"]["youtube_id"] for i in hits]
reindex_config = self.REINDEX_CONFIG.get("video")
reindex_config: ReindexConfigType = self.REINDEX_CONFIG["video"]
self.populate(all_ids, reindex_config)
def add_outdated(self):
def add_outdated(self) -> None:
"""add outdated documents"""
for reindex_config in self.REINDEX_CONFIG.values():
total_hits = self._get_total_hits(reindex_config)
@ -100,17 +122,19 @@ class ReindexPopulate(ReindexBase):
self.populate(all_ids, reindex_config)
@staticmethod
def _get_total_hits(reindex_config):
def _get_total_hits(reindex_config: ReindexConfigType) -> int:
"""get total hits from index"""
index_name = reindex_config["index_name"]
active_key = reindex_config["active_key"]
path = f"{index_name}/_search?filter_path=hits.total"
data = {"query": {"match": {active_key: True}}}
response, _ = ElasticWrap(path).post(data=data)
total_hits = response["hits"]["total"]["value"]
return total_hits
data = {
"query": {"term": {active_key: {"value": True}}},
"_source": False,
}
total = IndexPaginate(index_name, data, keep_source=True).get_results()
def _get_daily_should(self, total_hits):
return len(total)
def _get_daily_should(self, total_hits: int) -> int:
"""calc how many should reindex daily"""
daily_should = int((total_hits // self.interval + 1) * self.MULTIPLY)
if daily_should >= 10000:
@ -118,11 +142,13 @@ class ReindexPopulate(ReindexBase):
return daily_should
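
_get_daily_should spreads a full refresh across the interval: the active document count divided by the interval gives the per-day share, plus one to avoid a zero quota, scaled by the MULTIPLY headroom factor and capped below the 10000 result-window limit. A worked sketch, assuming MULTIPLY is a small factor such as 1.2 (its value is not shown in this diff):

    MULTIPLY = 1.2  # assumed headroom factor
    interval = 90   # refresh interval in days
    total_hits = 3000  # hypothetical count of active documents

    daily_should = int((total_hits // interval + 1) * MULTIPLY)
    print(daily_should)  # int((33 + 1) * 1.2) -> 40 documents per day
    daily_should = min(daily_should, 9999)  # stay under the ES window cap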
def _get_outdated_ids(self, reindex_config, daily_should):
def _get_outdated_ids(
self, reindex_config: ReindexConfigType, daily_should: int
) -> list[str]:
"""get outdated from index_name"""
index_name = reindex_config["index_name"]
refresh_key = reindex_config["refresh_key"]
now_lte = self.now - self.interval * 24 * 60 * 60
now_lte = str(self.now - self.interval * 24 * 60 * 60)
must_list = [
{"match": {reindex_config["active_key"]: True}},
{"range": {refresh_key: {"lte": now_lte}}},
@ -155,7 +181,7 @@ class ReindexManual(ReindexBase):
self.extract_videos = extract_videos
self.data = False
def extract_data(self, data):
def extract_data(self, data) -> None:
"""process data"""
self.data = data
for key, values in self.data.items():
@ -166,7 +192,9 @@ class ReindexManual(ReindexBase):
self.process_index(reindex_config, values)
def process_index(self, index_config, values):
def process_index(
self, index_config: ReindexConfigType, values: list[str]
) -> None:
"""process values per index"""
index_name = index_config["index_name"]
if index_name == "ta_video":
@ -176,32 +204,35 @@ class ReindexManual(ReindexBase):
elif index_name == "ta_playlist":
self._add_playlists(values)
def _add_videos(self, values):
def _add_videos(self, values: list[str]) -> None:
"""add list of videos to reindex queue"""
if not values:
return
RedisQueue("reindex:ta_video").add_list(values)
queue_name = self.REINDEX_CONFIG["video"]["queue_name"]
RedisQueue(queue_name).add_list(values)
def _add_channels(self, values):
def _add_channels(self, values: list[str]) -> None:
"""add list of channels to reindex queue"""
RedisQueue("reindex:ta_channel").add_list(values)
queue_name = self.REINDEX_CONFIG["channel"]["queue_name"]
RedisQueue(queue_name).add_list(values)
if self.extract_videos:
for channel_id in values:
all_videos = self._get_channel_videos(channel_id)
self._add_videos(all_videos)
def _add_playlists(self, values):
def _add_playlists(self, values: list[str]) -> None:
"""add list of playlists to reindex queue"""
RedisQueue("reindex:ta_playlist").add_list(values)
queue_name = self.REINDEX_CONFIG["playlist"]["queue_name"]
RedisQueue(queue_name).add_list(values)
if self.extract_videos:
for playlist_id in values:
all_videos = self._get_playlist_videos(playlist_id)
self._add_videos(all_videos)
def _get_channel_videos(self, channel_id):
def _get_channel_videos(self, channel_id: str) -> list[str]:
"""get all videos from channel"""
data = {
"query": {"term": {"channel.channel_id": {"value": channel_id}}},
@ -210,7 +241,7 @@ class ReindexManual(ReindexBase):
all_results = IndexPaginate("ta_video", data).get_results()
return [i["youtube_id"] for i in all_results]
def _get_playlist_videos(self, playlist_id):
def _get_playlist_videos(self, playlist_id: str) -> list[str]:
"""get all videos from playlist"""
data = {
"query": {"term": {"playlist.keyword": {"value": playlist_id}}},
@ -226,43 +257,42 @@ class Reindex(ReindexBase):
def __init__(self, task=False):
super().__init__()
self.task = task
self.all_indexed_ids = False
self.processed = {
"videos": 0,
"channels": 0,
"playlists": 0,
}
def reindex_all(self):
def reindex_all(self) -> None:
"""reindex all in queue"""
if not self.cookie_is_valid():
print("[reindex] cookie invalid, exiting...")
return
for name, index_config in self.REINDEX_CONFIG.items():
if not RedisQueue(index_config["queue_name"]).has_item():
if not RedisQueue(index_config["queue_name"]).length():
continue
self.total = RedisQueue(index_config["queue_name"]).length()
while True:
has_next = self.reindex_index(name, index_config)
if not has_next:
break
self.reindex_type(name, index_config)
def reindex_index(self, name, index_config):
def reindex_type(self, name: str, index_config: ReindexConfigType) -> None:
"""reindex all of a single index"""
reindex = self.get_reindex_map(index_config["index_name"])
youtube_id = RedisQueue(index_config["queue_name"]).get_next()
if youtube_id:
reindex = self._get_reindex_map(index_config["index_name"])
queue = RedisQueue(index_config["queue_name"])
while True:
total = queue.max_score()
youtube_id, idx = queue.get_next()
if not youtube_id or not idx or not total:
break
if self.task:
self._notify(name, index_config)
self._notify(name, total, idx)
reindex(youtube_id)
sleep_interval = self.config["downloads"].get("sleep_interval", 0)
sleep(sleep_interval)
return bool(youtube_id)
def get_reindex_map(self, index_name):
def _get_reindex_map(self, index_name: str) -> Callable:
"""return def to run for index"""
def_map = {
"ta_video": self._reindex_single_video,
@ -270,30 +300,28 @@ class Reindex(ReindexBase):
"ta_playlist": self._reindex_single_playlist,
}
return def_map.get(index_name)
return def_map[index_name]
def _notify(self, name, index_config):
def _notify(self, name: str, total: int, idx: int) -> None:
"""send notification back to task"""
if self.total is None:
self.total = RedisQueue(index_config["queue_name"]).length()
remaining = RedisQueue(index_config["queue_name"]).length()
idx = self.total - remaining
message = [f"Reindexing {name.title()}s {idx}/{self.total}"]
progress = idx / self.total
message = [f"Reindexing {name.title()}s {idx}/{total}"]
progress = idx / total
self.task.send_progress(message, progress=progress)
def _reindex_single_video(self, youtube_id):
def _reindex_single_video(self, youtube_id: str) -> None:
"""refresh data for single video"""
video = YoutubeVideo(youtube_id)
# read current state
video.get_from_es()
if not video.json_data:
return
es_meta = video.json_data.copy()
# get new
media_url = os.path.join(
self.config["application"]["videos"], es_meta["media_url"]
EnvironmentSettings.MEDIA_DIR, es_meta["media_url"]
)
video.build_json(media_path=media_url)
if not video.youtube_meta:
@ -311,10 +339,6 @@ class Reindex(ReindexBase):
video.json_data["playlist"] = es_meta.get("playlist")
video.upload_to_es()
if es_meta.get("media_url") != video.json_data["media_url"]:
self._rename_media_file(
es_meta.get("media_url"), video.json_data["media_url"]
)
thumb_handler = ThumbManager(youtube_id)
thumb_handler.delete_video_thumb()
@ -323,21 +347,14 @@ class Reindex(ReindexBase):
Comments(youtube_id, config=self.config).reindex_comments()
self.processed["videos"] += 1
return
def _rename_media_file(self, media_url_is, media_url_should):
"""handle title change"""
print(f"[reindex] fix media_url {media_url_is} to {media_url_should}")
videos = self.config["application"]["videos"]
old_path = os.path.join(videos, media_url_is)
new_path = os.path.join(videos, media_url_should)
os.rename(old_path, new_path)
def _reindex_single_channel(self, channel_id):
def _reindex_single_channel(self, channel_id: str) -> None:
"""refresh channel data and sync to videos"""
# read current state
channel = YoutubeChannel(channel_id)
channel.get_from_es()
if not channel.json_data:
return
es_meta = channel.json_data.copy()
# get new
@ -361,34 +378,24 @@ class Reindex(ReindexBase):
ChannelFullScan(channel_id).scan()
self.processed["channels"] += 1
def _reindex_single_playlist(self, playlist_id):
def _reindex_single_playlist(self, playlist_id: str) -> None:
"""refresh playlist data"""
self._get_all_videos()
playlist = YoutubePlaylist(playlist_id)
playlist.get_from_es()
subscribed = playlist.json_data["playlist_subscribed"]
playlist.all_youtube_ids = self.all_indexed_ids
playlist.build_json(scrape=True)
if not playlist.json_data:
if (
not playlist.json_data
or playlist.json_data["playlist_type"] == "custom"
):
return
is_active = playlist.update_playlist()
if not is_active:
playlist.deactivate()
return
playlist.json_data["playlist_subscribed"] = subscribed
playlist.upload_to_es()
self.processed["playlists"] += 1
return
def _get_all_videos(self):
"""add all videos for playlist index validation"""
if self.all_indexed_ids:
return
handler = PendingList()
handler.get_download()
handler.get_indexed()
self.all_indexed_ids = [i["youtube_id"] for i in handler.all_videos]
def cookie_is_valid(self):
def cookie_is_valid(self) -> bool:
"""return true if cookie is enabled and valid"""
if not self.config["downloads"]["cookie_import"]:
# is not activated, continue reindex
@ -397,7 +404,7 @@ class Reindex(ReindexBase):
valid = CookieHandler(self.config).validate()
return valid
def build_message(self):
def build_message(self) -> str:
"""build progress message"""
message = ""
for key, value in self.processed.items():
@ -427,7 +434,7 @@ class ReindexProgress(ReindexBase):
self.request_type = request_type
self.request_id = request_id
def get_progress(self):
def get_progress(self) -> dict:
"""get progress from task"""
queue_name, request_type = self._get_queue_name()
total = self._get_total_in_queue(queue_name)

View File

@ -12,6 +12,7 @@ from datetime import datetime
import requests
from home.src.es.connect import ElasticWrap
from home.src.ta.helper import requests_headers
from home.src.ta.settings import EnvironmentSettings
class YoutubeSubtitle:
@ -113,7 +114,7 @@ class YoutubeSubtitle:
def download_subtitles(self, relevant_subtitles):
"""download subtitle files to archive"""
videos_base = self.video.config["application"]["videos"]
videos_base = EnvironmentSettings.MEDIA_DIR
indexed = []
for subtitle in relevant_subtitles:
dest_path = os.path.join(videos_base, subtitle["media_url"])
@ -127,6 +128,10 @@ class YoutubeSubtitle:
print(response.text)
continue
if not response.text:
print(f"{self.video.youtube_id}: skip empty subtitle")
continue
parser = SubtitleParser(response.text, lang, source)
parser.process()
if not parser.all_cues:
@ -149,8 +154,8 @@ class YoutubeSubtitle:
with open(dest_path, "w", encoding="utf-8") as subfile:
subfile.write(subtitle_str)
host_uid = self.video.config["application"]["HOST_UID"]
host_gid = self.video.config["application"]["HOST_GID"]
host_uid = EnvironmentSettings.HOST_UID
host_gid = EnvironmentSettings.HOST_GID
if host_uid and host_gid:
os.chown(dest_path, host_uid, host_gid)
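
Subtitle files are written by the container process, so ownership is handed back to the host user when both IDs are configured. A minimal sketch, with hypothetical IDs and path in place of the EnvironmentSettings values:

    import os

    host_uid, host_gid = 1000, 1000  # stand-ins for HOST_UID / HOST_GID
    dest_path = "/youtube/channel/video.en.vtt"  # hypothetical subtitle path
    with open(dest_path, "w", encoding="utf-8") as subfile:
        subfile.write("WEBVTT\n")
    if host_uid and host_gid:
        os.chown(dest_path, host_uid, host_gid)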
@ -162,7 +167,7 @@ class YoutubeSubtitle:
def delete(self, subtitles=False):
"""delete subtitles from index and filesystem"""
youtube_id = self.video.youtube_id
videos_base = self.video.config["application"]["videos"]
videos_base = EnvironmentSettings.MEDIA_DIR
# delete files
if subtitles:
files = [i["media_url"] for i in subtitles]

View File

@ -18,6 +18,7 @@ from home.src.index.subtitle import YoutubeSubtitle
from home.src.index.video_constants import VideoTypeEnum
from home.src.index.video_streams import MediaStreamExtractor
from home.src.ta.helper import get_duration_sec, get_duration_str, randomizor
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.users import UserConfig
from ryd_client import ryd_client
@ -124,15 +125,9 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
index_name = "ta_video"
yt_base = "https://www.youtube.com/watch?v="
def __init__(
self,
youtube_id,
video_overwrites=False,
video_type=VideoTypeEnum.VIDEOS,
):
def __init__(self, youtube_id, video_type=VideoTypeEnum.VIDEOS):
super().__init__(youtube_id)
self.channel_id = False
self.video_overwrites = video_overwrites
self.video_type = video_type
self.offline_import = False
@ -146,7 +141,7 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
self.youtube_meta = youtube_meta_overwrite
self.offline_import = True
self._process_youtube_meta()
self.process_youtube_meta()
self._add_channel()
self._add_stats()
self.add_file_path()
@ -164,18 +159,18 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
"""check if need to run sponsor block"""
integrate = self.config["downloads"]["integrate_sponsorblock"]
if self.video_overwrites:
single_overwrite = self.video_overwrites.get(self.youtube_id)
if not single_overwrite:
if overwrite := self.json_data["channel"].get("channel_overwrites"):
if not overwrite:
return integrate
if "integrate_sponsorblock" in single_overwrite:
return single_overwrite.get("integrate_sponsorblock")
if "integrate_sponsorblock" in overwrite:
return overwrite.get("integrate_sponsorblock")
return integrate
def _process_youtube_meta(self):
def process_youtube_meta(self):
"""extract relevant fields from youtube"""
self._validate_id()
# extract
self.channel_id = self.youtube_meta["channel_id"]
upload_date = self.youtube_meta["upload_date"]
@ -187,11 +182,11 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
# build json_data basics
self.json_data = {
"title": self.youtube_meta["title"],
"description": self.youtube_meta["description"],
"category": self.youtube_meta["categories"],
"description": self.youtube_meta.get("description", ""),
"category": self.youtube_meta.get("categories", []),
"vid_thumb_url": self.youtube_meta["thumbnail"],
"vid_thumb_base64": base64_blur,
"tags": self.youtube_meta["tags"],
"tags": self.youtube_meta.get("tags", []),
"published": published,
"vid_last_refresh": last_refresh,
"date_downloaded": last_refresh,
@ -201,6 +196,19 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
"active": True,
}
def _validate_id(self):
"""validate expected video ID, raise value error on mismatch"""
remote_id = self.youtube_meta["id"]
if not self.youtube_id == remote_id:
# unexpected redirect
message = (
f"[reindex][{self.youtube_id}] got an unexpected redirect "
+ f"to {remote_id}, you are probably getting blocked by YT. "
"See FAQ for more details."
)
raise ValueError(message)
def _add_channel(self):
"""add channel dict to video json_data"""
channel = ta_channel.YoutubeChannel(self.channel_id)
@ -209,31 +217,24 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
def _add_stats(self):
"""add stats dicst to json_data"""
# likes
like_count = self.youtube_meta.get("like_count", 0)
dislike_count = self.youtube_meta.get("dislike_count", 0)
average_rating = self.youtube_meta.get("average_rating", 0)
self.json_data.update(
{
"stats": {
"view_count": self.youtube_meta["view_count"],
"like_count": like_count,
"dislike_count": dislike_count,
"average_rating": average_rating,
}
}
)
stats = {
"view_count": self.youtube_meta.get("view_count", 0),
"like_count": self.youtube_meta.get("like_count", 0),
"dislike_count": self.youtube_meta.get("dislike_count", 0),
"average_rating": self.youtube_meta.get("average_rating", 0),
}
self.json_data.update({"stats": stats})
def build_dl_cache_path(self):
"""find video path in dl cache"""
cache_dir = self.app_conf["cache_dir"]
cache_dir = EnvironmentSettings.CACHE_DIR
video_id = self.json_data["youtube_id"]
cache_path = f"{cache_dir}/download/{video_id}.mp4"
if os.path.exists(cache_path):
return cache_path
channel_path = os.path.join(
self.app_conf["videos"],
EnvironmentSettings.MEDIA_DIR,
self.json_data["channel"]["channel_id"],
f"{video_id}.mp4",
)
@ -282,7 +283,7 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
if not self.json_data:
raise FileNotFoundError
video_base = self.app_conf["videos"]
video_base = EnvironmentSettings.MEDIA_DIR
media_url = self.json_data.get("media_url")
file_path = os.path.join(video_base, media_url)
try:
@ -311,6 +312,8 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
playlist.json_data["playlist_entries"][idx].update(
{"downloaded": False}
)
if playlist.json_data["playlist_type"] == "custom":
playlist.del_video(self.youtube_id)
playlist.upload_to_es()
def delete_subtitles(self, subtitles=False):
@ -389,13 +392,9 @@ class YoutubeVideo(YouTubeItem, YoutubeSubtitle):
_, _ = ElasticWrap(path).post(data=data)
def index_new_video(
youtube_id, video_overwrites=False, video_type=VideoTypeEnum.VIDEOS
):
def index_new_video(youtube_id, video_type=VideoTypeEnum.VIDEOS):
"""combined classes to create new video in index"""
video = YoutubeVideo(
youtube_id, video_overwrites=video_overwrites, video_type=video_type
)
video = YoutubeVideo(youtube_id, video_type=video_type)
video.build_json()
if not video.json_data:
raise ValueError("failed to get metadata for " + youtube_id)

View File

@ -5,13 +5,10 @@ Functionality:
"""
import json
import os
import re
from random import randint
from time import sleep
import requests
from celery.schedules import crontab
from django.conf import settings
from home.src.ta.ta_redis import RedisArchivist
@ -28,7 +25,6 @@ class AppConfig:
if not config:
config = self.get_config_file()
config["application"].update(self.get_config_env())
return config
def get_config_file(self):
@ -36,25 +32,8 @@ class AppConfig:
with open("home/config.json", "r", encoding="utf-8") as f:
config_file = json.load(f)
config_file["application"].update(self.get_config_env())
return config_file
@staticmethod
def get_config_env():
"""read environment application variables.
Connection to ES is managed in ElasticWrap and the
connection to Redis is managed in RedisArchivist."""
application = {
"HOST_UID": int(os.environ.get("HOST_UID", False)),
"HOST_GID": int(os.environ.get("HOST_GID", False)),
"enable_cast": bool(os.environ.get("ENABLE_CAST")),
}
return application
@staticmethod
def get_config_redis():
"""read config json set from redis to overwrite defaults"""
@ -93,15 +72,6 @@ class AppConfig:
RedisArchivist().set_message("config", self.config, save=True)
return updated
@staticmethod
def _build_rand_daily():
"""build random daily schedule per installation"""
return {
"minute": randint(0, 59),
"hour": randint(0, 23),
"day_of_week": "*",
}
def load_new_defaults(self):
"""check config.json for missing defaults"""
default_config = self.get_config_file()
@ -110,7 +80,6 @@ class AppConfig:
# check for customizations
if not redis_config:
config = self.get_config()
config["scheduler"]["version_check"] = self._build_rand_daily()
RedisArchivist().set_message("config", config)
return False
@ -126,9 +95,6 @@ class AppConfig:
# missing nested values
for sub_key, sub_value in value.items():
if sub_key not in redis_config[key].keys():
if sub_value == "rand-d":
sub_value = self._build_rand_daily()
redis_config[key].update({sub_key: sub_value})
needs_update = True
@ -138,220 +104,80 @@ class AppConfig:
return needs_update
class ScheduleBuilder:
"""build schedule dicts for beat"""
SCHEDULES = {
"update_subscribed": "0 8 *",
"download_pending": "0 16 *",
"check_reindex": "0 12 *",
"thumbnail_check": "0 17 *",
"run_backup": "0 18 0",
"version_check": "0 11 *",
}
CONFIG = ["check_reindex_days", "run_backup_rotate"]
NOTIFY = [
"update_subscribed_notify",
"download_pending_notify",
"check_reindex_notify",
]
MSG = "message:setting"
def __init__(self):
self.config = AppConfig().config
def update_schedule_conf(self, form_post):
"""process form post"""
print("processing form, restart container for changes to take effect")
redis_config = self.config
for key, value in form_post.items():
if key in self.SCHEDULES and value:
try:
to_write = self.value_builder(key, value)
except ValueError:
print(f"failed: {key} {value}")
mess_dict = {
"group": "setting:schedule",
"level": "error",
"title": "Scheduler update failed.",
"messages": ["Invalid schedule input"],
"id": "0000",
}
RedisArchivist().set_message(
self.MSG, mess_dict, expire=True
)
return
redis_config["scheduler"][key] = to_write
if key in self.CONFIG and value:
redis_config["scheduler"][key] = int(value)
if key in self.NOTIFY and value:
if value == "0":
to_write = False
else:
to_write = value
redis_config["scheduler"][key] = to_write
RedisArchivist().set_message("config", redis_config, save=True)
mess_dict = {
"group": "setting:schedule",
"level": "info",
"title": "Scheduler changed.",
"messages": ["Restart container for changes to take effect"],
"id": "0000",
}
RedisArchivist().set_message(self.MSG, mess_dict, expire=True)
def value_builder(self, key, value):
"""validate single cron form entry and return cron dict"""
print(f"change schedule for {key} to {value}")
if value == "0":
# deactivate this schedule
return False
if re.search(r"[\d]{1,2}\/[\d]{1,2}", value):
# number/number cron format will fail in celery
print("number/number schedule formatting not supported")
raise ValueError
keys = ["minute", "hour", "day_of_week"]
if value == "auto":
# set to sensible default
values = self.SCHEDULES[key].split()
else:
values = value.split()
if len(keys) != len(values):
print(f"failed to parse {value} for {key}")
raise ValueError("invalid input")
to_write = dict(zip(keys, values))
self._validate_cron(to_write)
return to_write
@staticmethod
def _validate_cron(to_write):
"""validate all fields, raise value error for impossible schedule"""
all_hours = list(re.split(r"\D+", to_write["hour"]))
for hour in all_hours:
if hour.isdigit() and int(hour) > 23:
print("hour can not be greater than 23")
raise ValueError("invalid input")
all_days = list(re.split(r"\D+", to_write["day_of_week"]))
for day in all_days:
if day.isdigit() and int(day) > 6:
print("day can not be greater than 6")
raise ValueError("invalid input")
if not to_write["minute"].isdigit():
print("too frequent: only number in minutes are supported")
raise ValueError("invalid input")
if int(to_write["minute"]) > 59:
print("minutes can not be greater than 59")
raise ValueError("invalid input")
def build_schedule(self):
"""build schedule dict as expected by app.conf.beat_schedule"""
AppConfig().load_new_defaults()
self.config = AppConfig().config
schedule_dict = {}
for schedule_item in self.SCHEDULES:
item_conf = self.config["scheduler"][schedule_item]
if not item_conf:
continue
schedule_dict.update(
{
f"schedule_{schedule_item}": {
"task": schedule_item,
"schedule": crontab(
minute=item_conf["minute"],
hour=item_conf["hour"],
day_of_week=item_conf["day_of_week"],
),
}
}
)
return schedule_dict
class ReleaseVersion:
"""compare local version with remote version"""
REMOTE_URL = "https://www.tubearchivist.com/api/release/latest/"
NEW_KEY = "versioncheck:new"
def __init__(self):
self.local_version = self._parse_version(settings.TA_VERSION)
self.is_unstable = settings.TA_VERSION.endswith("-unstable")
self.remote_version = False
self.is_breaking = False
self.response = False
def __init__(self) -> None:
self.local_version: str = settings.TA_VERSION
self.is_unstable: bool = settings.TA_VERSION.endswith("-unstable")
self.remote_version: str = ""
self.is_breaking: bool = False
def check(self):
def check(self) -> None:
"""check version"""
print(f"[{self.local_version}]: look for updates")
self.get_remote_version()
new_version, is_breaking = self._has_update()
new_version = self._has_update()
if new_version:
message = {
"status": True,
"version": new_version,
"is_breaking": is_breaking,
"is_breaking": self.is_breaking,
}
RedisArchivist().set_message(self.NEW_KEY, message)
print(f"[{self.local_version}]: found new version {new_version}")
def get_local_version(self):
def get_local_version(self) -> str:
"""read version from local"""
return self.local_version
def get_remote_version(self):
def get_remote_version(self) -> None:
"""read version from remote"""
sleep(randint(0, 60))
self.response = requests.get(self.REMOTE_URL, timeout=20).json()
remote_version_str = self.response["release_version"]
self.remote_version = self._parse_version(remote_version_str)
self.is_breaking = self.response["breaking_changes"]
response = requests.get(self.REMOTE_URL, timeout=20).json()
self.remote_version = response["release_version"]
self.is_breaking = response["breaking_changes"]
def _has_update(self):
def _has_update(self) -> str | bool:
"""check if there is an update"""
for idx, number in enumerate(self.local_version):
is_newer = self.remote_version[idx] > number
if is_newer:
return self.response["release_version"], self.is_breaking
remote_parsed = self._parse_version(self.remote_version)
local_parsed = self._parse_version(self.local_version)
if remote_parsed > local_parsed:
return self.remote_version
if self.is_unstable and self.local_version == self.remote_version:
return self.response["release_version"], self.is_breaking
if self.is_unstable and local_parsed == remote_parsed:
return self.remote_version
return False, False
return False
@staticmethod
def _parse_version(version):
def _parse_version(version) -> tuple[int, ...]:
"""return version parts"""
clean = version.rstrip("-unstable").lstrip("v")
return tuple((int(i) for i in clean.split(".")))
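For illustration, a minimal standalone sketch of how the parsed tuples compare, mirroring _parse_version (not the production code):
def parse_version(version: str) -> tuple[int, ...]:
    """illustrative copy of _parse_version"""
    clean = version.rstrip("-unstable").lstrip("v")
    return tuple(int(i) for i in clean.split("."))

# Python compares tuples element by element, left to right
assert parse_version("v0.4.8") > parse_version("v0.4.7")
assert parse_version("v0.4.7-unstable") == parse_version("v0.4.7")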
def is_updated(self):
def is_updated(self) -> str | bool:
"""check if update happened in the mean time"""
message = self.get_update()
if not message:
return False
if self._parse_version(message.get("version")) == self.local_version:
local_parsed = self._parse_version(self.local_version)
message_parsed = self._parse_version(message.get("version"))
if local_parsed >= message_parsed:
RedisArchivist().del_message(self.NEW_KEY)
return settings.TA_VERSION
return False
def get_update(self):
def get_update(self) -> dict:
"""return new version dict if available"""
message = RedisArchivist().get_message(self.NEW_KEY)
if not message.get("status"):
return False
return {}
return message

View File

@ -0,0 +1,89 @@
"""
Functionality:
- Handle scheduler config update
"""
from django_celery_beat.models import CrontabSchedule
from home.models import CustomPeriodicTask
from home.src.ta.config import AppConfig
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.task_config import TASK_CONFIG
class ScheduleBuilder:
"""build schedule dicts for beat"""
SCHEDULES = {
"update_subscribed": "0 8 *",
"download_pending": "0 16 *",
"check_reindex": "0 12 *",
"thumbnail_check": "0 17 *",
"run_backup": "0 18 0",
"version_check": "0 11 *",
}
CONFIG = {
"check_reindex_days": "check_reindex",
"run_backup_rotate": "run_backup",
"update_subscribed_notify": "update_subscribed",
"download_pending_notify": "download_pending",
"check_reindex_notify": "check_reindex",
}
MSG = "message:setting"
def __init__(self):
self.config = AppConfig().config
def update_schedule_conf(self, form_post):
"""process form post, schedules need to be validated before"""
for key, value in form_post.items():
if not value:
continue
if key in self.SCHEDULES:
if value == "auto":
value = self.SCHEDULES.get(key)
_ = self.get_set_task(key, value)
continue
if key in self.CONFIG:
self.set_config(key, value)
def get_set_task(self, task_name, schedule=False):
"""get task"""
try:
task = CustomPeriodicTask.objects.get(name=task_name)
except CustomPeriodicTask.DoesNotExist:
description = TASK_CONFIG[task_name].get("title")
task = CustomPeriodicTask(
name=task_name,
task=task_name,
description=description,
)
if schedule:
task_crontab = self.get_set_cron_tab(schedule)
task.crontab = task_crontab
task.save()
return task
@staticmethod
def get_set_cron_tab(schedule):
"""needs to be validated before"""
kwargs = dict(zip(["minute", "hour", "day_of_week"], schedule.split()))
kwargs.update({"timezone": EnvironmentSettings.TZ})
crontab, _ = CrontabSchedule.objects.get_or_create(**kwargs)
return crontab
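A worked example of what this builds from a pre-validated schedule string (the timezone value is assumed to be UTC here):
schedule = "0 8 *"  # minute hour day_of_week
kwargs = dict(zip(["minute", "hour", "day_of_week"], schedule.split()))
kwargs.update({"timezone": "UTC"})  # EnvironmentSettings.TZ in the app
# kwargs == {"minute": "0", "hour": "8", "day_of_week": "*", "timezone": "UTC"}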
def set_config(self, key, value):
"""set task_config"""
task_name = self.CONFIG.get(key)
if not task_name:
raise ValueError("invalid config key")
task = CustomPeriodicTask.objects.get(name=task_name)
config_key = key.split(f"{task_name}_")[-1]
task.task_config.update({config_key: value})
task.save()

View File

@ -9,9 +9,12 @@ import random
import string
import subprocess
from datetime import datetime
from typing import Any
from urllib.parse import urlparse
import requests
from home.src.es.connect import IndexPaginate
from home.src.ta.settings import EnvironmentSettings
def ignore_filelist(filelist: list[str]) -> list[str]:
@ -112,13 +115,13 @@ def time_parser(timestamp: str) -> float:
return int(hours) * 60 * 60 + int(minutes) * 60 + float(seconds)
def clear_dl_cache(config: dict) -> int:
def clear_dl_cache(cache_dir: str) -> int:
"""clear leftover files from dl cache"""
print("clear download cache")
cache_dir = os.path.join(config["application"]["cache_dir"], "download")
leftover_files = ignore_filelist(os.listdir(cache_dir))
download_cache_dir = os.path.join(cache_dir, "download")
leftover_files = ignore_filelist(os.listdir(download_cache_dir))
for cached in leftover_files:
to_delete = os.path.join(cache_dir, cached)
to_delete = os.path.join(download_cache_dir, cached)
os.remove(to_delete)
return len(leftover_files)
@ -178,7 +181,7 @@ def get_duration_str(seconds: int) -> str:
for unit_label, unit_seconds in units:
if seconds >= unit_seconds:
unit_count, seconds = divmod(seconds, unit_seconds)
duration_parts.append(f"{unit_count}{unit_label}")
duration_parts.append(f"{unit_count:02}{unit_label}")
return " ".join(duration_parts)
@ -203,3 +206,55 @@ def ta_host_parser(ta_host: str) -> tuple[list[str], list[str]]:
csrf_trusted_origins.append(f"{parsed.scheme}://{parsed.hostname}")
return allowed_hosts, csrf_trusted_origins
def get_stylesheets():
"""Get all valid stylesheets from /static/css"""
app_root = EnvironmentSettings.APP_DIR
stylesheets = os.listdir(os.path.join(app_root, "static/css"))
stylesheets.remove("style.css")
stylesheets.sort()
stylesheets = list(filter(lambda x: x.endswith(".css"), stylesheets))
return stylesheets
def check_stylesheet(stylesheet: str):
"""Check if a stylesheet exists. Return dark.css as a fallback"""
if stylesheet in get_stylesheets():
return stylesheet
return "dark.css"
def is_missing(
to_check: str | list[str],
index_name: str = "ta_video,ta_download",
on_key: str = "youtube_id",
) -> list[str]:
"""id or list of ids that are missing from index_name"""
if isinstance(to_check, str):
to_check = [to_check]
data = {
"query": {"terms": {on_key: to_check}},
"_source": [on_key],
}
result = IndexPaginate(index_name, data=data).get_results()
existing_ids = [i[on_key] for i in result]
dl = [i for i in to_check if i not in existing_ids]
return dl
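Usage stays simple; the ids below are hypothetical:
# only ids absent from ta_video and ta_download come back
missing = is_missing(["indexed-id", "new-id"])
# -> ["new-id"] if the first id already exists in either index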
def get_channel_overwrites() -> dict[str, dict[str, Any]]:
"""get overwrites indexed my channel_id"""
data = {
"query": {
"bool": {"must": [{"exists": {"field": "channel_overwrites"}}]}
},
"_source": ["channel_id", "channel_overwrites"],
}
result = IndexPaginate("ta_channel", data).get_results()
overwrites = {i["channel_id"]: i["channel_overwrites"] for i in result}
return overwrites
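The returned mapping is keyed by channel id; a shape sketch with illustrative values:
overwrites = get_channel_overwrites()
# {
#     "UCxxxxxxxxxxxxxxxxxxxxxx": {
#         "download_format": "bestvideo+bestaudio",
#         "integrate_sponsorblock": True,
#     },
# }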

View File

@ -1,55 +1,141 @@
"""send notifications using apprise"""
import apprise
from home.src.ta.config import AppConfig
from home.src.es.connect import ElasticWrap
from home.src.ta.task_config import TASK_CONFIG
from home.src.ta.task_manager import TaskManager
class Notifications:
"""notification handler"""
"""store notifications in ES"""
def __init__(self, name: str, task_id: str, task_title: str):
self.name: str = name
self.task_id: str = task_id
self.task_title: str = task_title
GET_PATH = "ta_config/_doc/notify"
UPDATE_PATH = "ta_config/_update/notify/"
def send(self) -> None:
def __init__(self, task_name: str):
self.task_name = task_name
def send(self, task_id: str, task_title: str) -> None:
"""send notifications"""
apobj = apprise.Apprise()
hooks: str | None = self.get_url()
if not hooks:
urls: list[str] = self.get_urls()
if not urls:
return
hook_list: list[str] = self.parse_hooks(hooks=hooks)
title, body = self.build_message()
title, body = self._build_message(task_id, task_title)
if not body:
return
for hook in hook_list:
apobj.add(hook)
for url in urls:
apobj.add(url)
apobj.notify(body=body, title=title)
def get_url(self) -> str | None:
"""get apprise urls for task"""
config = AppConfig().config
hooks: str = config["scheduler"].get(f"{self.name}_notify")
return hooks
def parse_hooks(self, hooks: str) -> list[str]:
"""create list of hooks"""
hook_list: list[str] = [i.strip() for i in hooks.split()]
return hook_list
def build_message(self) -> tuple[str, str | None]:
def _build_message(
self, task_id: str, task_title: str
) -> tuple[str, str | None]:
"""build message to send notification"""
task = TaskManager().get_task(self.task_id)
task = TaskManager().get_task(task_id)
status = task.get("status")
title: str = f"[TA] {self.task_title} process ended with {status}"
title: str = f"[TA] {task_title} process ended with {status}"
body: str | None = task.get("result")
return title, body
def get_urls(self) -> list[str]:
"""get stored urls for task"""
response, code = ElasticWrap(self.GET_PATH).get(print_error=False)
if code != 200:
return []
urls = response["_source"].get(self.task_name, [])
return urls
def add_url(self, url: str) -> None:
"""add url to task notification"""
source = (
"if (!ctx._source.containsKey(params.task_name)) "
+ "{ctx._source[params.task_name] = [params.url]} "
+ "else if (!ctx._source[params.task_name].contains(params.url)) "
+ "{ctx._source[params.task_name].add(params.url)} "
+ "else {ctx.op = 'none'}"
)
data = {
"script": {
"source": source,
"lang": "painless",
"params": {"url": url, "task_name": self.task_name},
},
"upsert": {self.task_name: [url]},
}
_, _ = ElasticWrap(self.UPDATE_PATH).post(data)
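After such an upsert the notify document ends up shaped roughly like this (task names and apprise urls are examples only):
# _source of ta_config/_doc/notify
{
    "download_pending": ["discord://webhook_id/webhook_token"],
    "check_reindex": ["mailto://user:password@example.com"],
}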
def remove_url(self, url: str) -> tuple[dict, int]:
"""remove url from task"""
source = (
"if (ctx._source.containsKey(params.task_name) "
+ "&& ctx._source[params.task_name].contains(params.url)) "
+ "{ctx._source[params.task_name]."
+ "remove(ctx._source[params.task_name].indexOf(params.url))}"
)
data = {
"script": {
"source": source,
"lang": "painless",
"params": {"url": url, "task_name": self.task_name},
}
}
response, status_code = ElasticWrap(self.UPDATE_PATH).post(data)
if not self.get_urls():
_, _ = self.remove_task()
return response, status_code
def remove_task(self) -> tuple[dict, int]:
"""remove all notifications from task"""
source = (
"if (ctx._source.containsKey(params.task_name)) "
+ "{ctx._source.remove(params.task_name)}"
)
data = {
"script": {
"source": source,
"lang": "painless",
"params": {"task_name": self.task_name},
}
}
response, status_code = ElasticWrap(self.UPDATE_PATH).post(data)
return response, status_code
def get_all_notifications() -> dict[str, list[str]]:
"""get all notifications stored"""
path = "ta_config/_doc/notify"
response, status_code = ElasticWrap(path).get(print_error=False)
if status_code != 200:
return {}
notifications: dict = {}
source = response.get("_source")
if not source:
return notifications
for task_id, urls in source.items():
notifications.update(
{
task_id: {
"urls": urls,
"title": TASK_CONFIG[task_id]["title"],
}
}
)
return notifications

View File

@ -0,0 +1,95 @@
"""
Functionality:
- read and write application config backed by ES
- encapsulate persistence of application properties
"""
from os import environ
class EnvironmentSettings:
"""
Handle settings for the application that are driven from the environment.
These will not change when the user is using the application.
These settings are provided only on startup.
"""
HOST_UID: int = int(environ.get("HOST_UID", False))
HOST_GID: int = int(environ.get("HOST_GID", False))
ENABLE_CAST: bool = bool(environ.get("ENABLE_CAST"))
TZ: str = str(environ.get("TZ", "UTC"))
TA_PORT: int = int(environ.get("TA_PORT", False))
TA_UWSGI_PORT: int = int(environ.get("TA_UWSGI_PORT", False))
TA_USERNAME: str = str(environ.get("TA_USERNAME"))
TA_PASSWORD: str = str(environ.get("TA_PASSWORD"))
# Application Paths
MEDIA_DIR: str = str(environ.get("TA_MEDIA_DIR", "/youtube"))
APP_DIR: str = str(environ.get("TA_APP_DIR", "/app"))
CACHE_DIR: str = str(environ.get("TA_CACHE_DIR", "/cache"))
# Redis
REDIS_HOST: str = str(environ.get("REDIS_HOST"))
REDIS_PORT: int = int(environ.get("REDIS_PORT", 6379))
REDIS_NAME_SPACE: str = str(environ.get("REDIS_NAME_SPACE", "ta:"))
# ElasticSearch
ES_URL: str = str(environ.get("ES_URL"))
ES_PASS: str = str(environ.get("ELASTIC_PASSWORD"))
ES_USER: str = str(environ.get("ELASTIC_USER", "elastic"))
ES_SNAPSHOT_DIR: str = str(
environ.get(
"ES_SNAPSHOT_DIR", "/usr/share/elasticsearch/data/snapshot"
)
)
ES_DISABLE_VERIFY_SSL: bool = bool(environ.get("ES_DISABLE_VERIFY_SSL"))
def print_generic(self):
"""print generic env vars"""
print(
f"""
HOST_UID: {self.HOST_UID}
HOST_GID: {self.HOST_GID}
TZ: {self.TZ}
ENABLE_CAST: {self.ENABLE_CAST}
TA_PORT: {self.TA_PORT}
TA_UWSGI_PORT: {self.TA_UWSGI_PORT}
TA_USERNAME: {self.TA_USERNAME}
TA_PASSWORD: *****"""
)
def print_paths(self):
"""debug paths set"""
print(
f"""
MEDIA_DIR: {self.MEDIA_DIR}
APP_DIR: {self.APP_DIR}
CACHE_DIR: {self.CACHE_DIR}"""
)
def print_redis_conf(self):
"""debug redis conf paths"""
print(
f"""
REDIS_HOST: {self.REDIS_HOST}
REDIS_PORT: {self.REDIS_PORT}
REDIS_NAME_SPACE: {self.REDIS_NAME_SPACE}"""
)
def print_es_paths(self):
"""debug es conf"""
print(
f"""
ES_URL: {self.ES_URL}
ES_PASS: *****
ES_USER: {self.ES_USER}
ES_SNAPSHOT_DIR: {self.ES_SNAPSHOT_DIR}
ES_DISABLE_VERIFY_SSL: {self.ES_DISABLE_VERIFY_SSL}"""
)
def print_all(self):
"""print all"""
self.print_generic()
self.print_paths()
self.print_redis_conf()
self.print_es_paths()

View File

@ -6,20 +6,22 @@ functionality:
"""
import json
import os
import redis
from home.src.ta.settings import EnvironmentSettings
class RedisBase:
"""connection base for redis"""
REDIS_HOST: str = str(os.environ.get("REDIS_HOST"))
REDIS_PORT: int = int(os.environ.get("REDIS_PORT") or 6379)
NAME_SPACE: str = "ta:"
NAME_SPACE: str = EnvironmentSettings.REDIS_NAME_SPACE
def __init__(self):
self.conn = redis.Redis(host=self.REDIS_HOST, port=self.REDIS_PORT)
self.conn = redis.Redis(
host=EnvironmentSettings.REDIS_HOST,
port=EnvironmentSettings.REDIS_PORT,
decode_responses=True,
)
class RedisArchivist(RedisBase):
@ -81,7 +83,7 @@ class RedisArchivist(RedisBase):
if not reply:
return []
return [i.decode().lstrip(self.NAME_SPACE) for i in reply]
return [i.lstrip(self.NAME_SPACE) for i in reply]
def list_items(self, query: str) -> list:
"""list all matches"""
@ -98,65 +100,90 @@ class RedisArchivist(RedisBase):
class RedisQueue(RedisBase):
"""dynamically interact with queues in redis"""
"""
dynamically interact with queues in redis using sorted set
- low score number is first in queue
- add new items with high score number
queue names in use:
download:channel channels during download
download:playlist:full playlists during dl for full refresh
download:playlist:quick playlists during dl for quick refresh
download:video videos during downloads
index:comment videos needing comment indexing
reindex:ta_video reindex videos
reindex:ta_channel reindex channels
reindex:ta_playlist reindex playlists
"""
def __init__(self, queue_name: str):
super().__init__()
self.key = f"{self.NAME_SPACE}{queue_name}"
def get_all(self):
def get_all(self) -> list[str]:
"""return all elements in list"""
result = self.conn.execute_command("LRANGE", self.key, 0, -1)
all_elements = [i.decode() for i in result]
return all_elements
result = self.conn.zrange(self.key, 0, -1)
return result
def length(self) -> int:
"""return total elements in list"""
return self.conn.execute_command("LLEN", self.key)
return self.conn.zcard(self.key)
def in_queue(self, element) -> str | bool:
"""check if element is in list"""
result = self.conn.execute_command("LPOS", self.key, element)
result = self.conn.zrank(self.key, element)
if result is not None:
return "in_queue"
return False
def add_list(self, to_add):
def add(self, to_add: str) -> None:
"""add single item to queue"""
if not to_add:
return
next_score = self._get_next_score()
self.conn.zadd(self.key, {to_add: next_score})
def add_list(self, to_add: list) -> None:
"""add list to queue"""
self.conn.execute_command("RPUSH", self.key, *to_add)
if not to_add:
return
def add_priority(self, to_add: str) -> None:
"""add single video to front of queue"""
item: str = json.dumps(to_add)
self.clear_item(item)
self.conn.execute_command("LPUSH", self.key, item)
next_score = self._get_next_score()
mapping = {i[1]: next_score + i[0] for i in enumerate(to_add)}
self.conn.zadd(self.key, mapping)
def get_next(self) -> str | bool:
"""return next element in the queue, False if none"""
result = self.conn.execute_command("LPOP", self.key)
def max_score(self) -> int | None:
"""get max score"""
last = self.conn.zrange(self.key, -1, -1, withscores=True)
if not last:
return None
return int(last[0][1])
def _get_next_score(self) -> float:
"""get next score in queue to append"""
last = self.conn.zrange(self.key, -1, -1, withscores=True)
if not last:
return 1.0
return last[0][1] + 1
def get_next(self) -> tuple[str | None, int | None]:
"""return next element in the queue, if available"""
result = self.conn.zpopmin(self.key)
if not result:
return False
return None, None
next_element = result.decode()
return next_element
item, idx = result[0][0], int(result[0][1])
return item, idx
def clear(self) -> None:
"""delete list from redis"""
self.conn.execute_command("DEL", self.key)
def clear_item(self, to_clear: str) -> None:
"""remove single item from list if it's there"""
self.conn.execute_command("LREM", self.key, 0, to_clear)
def trim(self, size: int) -> None:
"""trim the queue based on settings amount"""
self.conn.execute_command("LTRIM", self.key, 0, size)
def has_item(self) -> bool:
"""check if queue as at least one pending item"""
result = self.conn.execute_command("LRANGE", self.key, 0, 0)
return bool(result)
self.conn.delete(self.key)
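To make the sorted-set semantics concrete, here is a minimal in-memory model of the same scoring logic (a toy sketch, not the Redis-backed class):
class MiniQueue:
    """toy model of the score-based FIFO behind RedisQueue"""

    def __init__(self):
        self.items: dict[str, float] = {}

    def _get_next_score(self) -> float:
        return max(self.items.values()) + 1 if self.items else 1.0

    def add_list(self, to_add: list[str]) -> None:
        next_score = self._get_next_score()
        for offset, item in enumerate(to_add):
            self.items[item] = next_score + offset

    def get_next(self):
        if not self.items:
            return None, None
        item = min(self.items, key=self.items.get)
        return item, int(self.items.pop(item))

queue = MiniQueue()
queue.add_list(["vid1", "vid2"])
queue.add_list(["vid3"])
print(queue.get_next())  # ('vid1', 1) - lowest score leaves first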
class TaskRedis(RedisBase):
@ -169,7 +196,7 @@ class TaskRedis(RedisBase):
def get_all(self) -> list:
"""return all tasks"""
all_keys = self.conn.execute_command("KEYS", f"{self.BASE}*")
return [i.decode().replace(self.BASE, "") for i in all_keys]
return [i.replace(self.BASE, "") for i in all_keys]
def get_single(self, task_id: str) -> dict:
"""return content of single task"""
@ -177,7 +204,7 @@ class TaskRedis(RedisBase):
if not result:
return {}
return json.loads(result.decode())
return json.loads(result)
def set_key(
self, task_id: str, message: dict, expire: bool | int = False

View File

@ -0,0 +1,125 @@
"""
Functionality:
- Static Task config values
- Type definitions
- separate to avoid circular imports
"""
from typing import TypedDict
class TaskItemConfig(TypedDict):
"""describes a task item config"""
title: str
group: str
api_start: bool
api_stop: bool
UPDATE_SUBSCRIBED: TaskItemConfig = {
"title": "Rescan your Subscriptions",
"group": "download:scan",
"api_start": True,
"api_stop": True,
}
DOWNLOAD_PENDING: TaskItemConfig = {
"title": "Downloading",
"group": "download:run",
"api_start": True,
"api_stop": True,
}
EXTRACT_DOWNLOAD: TaskItemConfig = {
"title": "Add to download queue",
"group": "download:add",
"api_start": False,
"api_stop": True,
}
CHECK_REINDEX: TaskItemConfig = {
"title": "Reindex Documents",
"group": "reindex:run",
"api_start": False,
"api_stop": False,
}
MANUAL_IMPORT: TaskItemConfig = {
"title": "Manual video import",
"group": "setting:import",
"api_start": True,
"api_stop": False,
}
RUN_BACKUP: TaskItemConfig = {
"title": "Index Backup",
"group": "setting:backup",
"api_start": True,
"api_stop": False,
}
RESTORE_BACKUP: TaskItemConfig = {
"title": "Restore Backup",
"group": "setting:restore",
"api_start": False,
"api_stop": False,
}
RESCAN_FILESYSTEM: TaskItemConfig = {
"title": "Rescan your Filesystem",
"group": "setting:filesystemscan",
"api_start": True,
"api_stop": False,
}
THUMBNAIL_CHECK: TaskItemConfig = {
"title": "Check your Thumbnails",
"group": "setting:thumbnailcheck",
"api_start": True,
"api_stop": False,
}
RESYNC_THUMBS: TaskItemConfig = {
"title": "Sync Thumbnails to Media Files",
"group": "setting:thumbnailsync",
"api_start": True,
"api_stop": False,
}
INDEX_PLAYLISTS: TaskItemConfig = {
"title": "Index Channel Playlist",
"group": "channel:indexplaylist",
"api_start": False,
"api_stop": False,
}
SUBSCRIBE_TO: TaskItemConfig = {
"title": "Add Subscription",
"group": "subscription:add",
"api_start": False,
"api_stop": False,
}
VERSION_CHECK: TaskItemConfig = {
"title": "Look for new Version",
"group": "",
"api_start": False,
"api_stop": False,
}
TASK_CONFIG: dict[str, TaskItemConfig] = {
"update_subscribed": UPDATE_SUBSCRIBED,
"download_pending": DOWNLOAD_PENDING,
"extract_download": EXTRACT_DOWNLOAD,
"check_reindex": CHECK_REINDEX,
"manual_import": MANUAL_IMPORT,
"run_backup": RUN_BACKUP,
"restore_backup": RESTORE_BACKUP,
"rescan_filesystem": RESCAN_FILESYSTEM,
"thumbnail_check": THUMBNAIL_CHECK,
"resync_thumbs": RESYNC_THUMBS,
"index_playlists": INDEX_PLAYLISTS,
"subscribe_to": SUBSCRIBE_TO,
"version_check": VERSION_CHECK,
}
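Lookups then stay plain dict access, e.g.:
from home.src.ta.task_config import TASK_CONFIG

title = TASK_CONFIG["download_pending"]["title"]  # "Downloading"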

View File

@ -4,8 +4,9 @@ functionality:
- handle threads and locks
"""
from home import tasks as ta_tasks
from home.celery import app as celery_app
from home.src.ta.ta_redis import RedisArchivist, TaskRedis
from home.src.ta.task_config import TASK_CONFIG
class TaskManager:
@ -86,7 +87,7 @@ class TaskCommand:
def start(self, task_name):
"""start task by task_name, only pass task that don't take args"""
task = ta_tasks.app.tasks.get(task_name).delay()
task = celery_app.tasks.get(task_name).delay()
message = {
"task_id": task.id,
"status": task.status,
@ -104,7 +105,7 @@ class TaskCommand:
handler = TaskRedis()
task = handler.get_single(task_id)
if not task["name"] in ta_tasks.BaseTask.TASK_CONFIG:
if not task["name"] in TASK_CONFIG:
raise ValueError
handler.set_command(task_id, "STOP")
@ -113,4 +114,4 @@ class TaskCommand:
def kill(self, task_id):
"""send kill signal to task_id"""
print(f"[task][{task_id}]: received KILL signal.")
ta_tasks.app.control.revoke(task_id, terminate=True)
celery_app.control.revoke(task_id, terminate=True)

View File

@ -92,7 +92,7 @@ class Parser:
item_type = "video"
elif len_id_str == 24:
item_type = "channel"
elif len_id_str in (34, 26, 18):
elif len_id_str in (34, 26, 18) or id_str.startswith("TA_playlist_"):
item_type = "playlist"
else:
raise ValueError(f"not a valid id_str: {id_str}")
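For reference, the heuristic classifies ids like these (example values):
len("dQw4w9WgXcQ")               # 11 -> video
len("UCBa659QWEk1AI4Tg--mrJ2A")  # 24 -> channel
"TA_playlist_abc123".startswith("TA_playlist_")  # True -> custom playlist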

View File

@ -7,12 +7,13 @@ Functionality:
from typing import TypedDict
from home.src.es.connect import ElasticWrap
from home.src.ta.helper import get_stylesheets
class UserConfigType(TypedDict, total=False):
"""describes the user configuration"""
colors: str
stylesheet: str
page_size: int
sort_by: str
sort_order: str
@ -28,15 +29,10 @@ class UserConfigType(TypedDict, total=False):
class UserConfig:
"""Handle settings for an individual user
Create getters and setters for usage in the application.
Although tedious it helps prevent everything caring about how properties
are persisted. Plus it allows us to save anytime any value is set.
"""
"""Handle settings for an individual user"""
_DEFAULT_USER_SETTINGS = UserConfigType(
colors="dark",
stylesheet="dark.css",
page_size=12,
sort_by="published",
sort_order="desc",
@ -51,9 +47,22 @@ class UserConfig:
sponsorblock_id=None,
)
VALID_STYLESHEETS = get_stylesheets()
VALID_VIEW_STYLE = ["grid", "list"]
VALID_SORT_ORDER = ["asc", "desc"]
VALID_SORT_BY = [
"published",
"downloaded",
"views",
"likes",
"duration",
"filesize",
]
VALID_GRID_ITEMS = range(3, 8)
def __init__(self, user_id: str):
self._user_id: str = user_id
self._config: UserConfigType = self._get_config()
self._config: UserConfigType = self.get_config()
def get_value(self, key: str):
"""Get the given key from the users configuration
@ -65,15 +74,8 @@ class UserConfig:
return self._config.get(key) or self._DEFAULT_USER_SETTINGS.get(key)
def set_value(self, key: str, value: str | bool | int):
"""Set or replace a configuration value for the user
Throws a KeyError if the requested Key is not a permitted value"""
if not self._user_id:
raise ValueError("Unable to persist config for null user_id")
if key not in self._DEFAULT_USER_SETTINGS:
raise KeyError(f"Unable to persist config for unknown key '{key}'")
"""Set or replace a configuration value for the user"""
self._validate(key, value)
old = self.get_value(key)
self._config[key] = value
@ -84,9 +86,45 @@ class UserConfig:
if status < 200 or status > 299:
raise ValueError(f"Failed storing user value {status}: {response}")
print(f"User {self._user_id} value '{key}' change: {old} > {value}")
print(f"User {self._user_id} value '{key}' change: {old} -> {value}")
def _get_config(self) -> UserConfigType:
def _validate(self, key, value):
"""validate key and value"""
if not self._user_id:
raise ValueError("Unable to persist config for null user_id")
if key not in self._DEFAULT_USER_SETTINGS:
raise KeyError(
f"Unable to persist config for an unknown key '{key}'"
)
valid_values = {
"stylesheet": self.VALID_STYLESHEETS,
"sort_by": self.VALID_SORT_BY,
"sort_order": self.VALID_SORT_ORDER,
"view_style_home": self.VALID_VIEW_STYLE,
"view_style_channel": self.VALID_VIEW_STYLE,
"view_style_download": self.VALID_VIEW_STYLE,
"view_style_playlist": self.VALID_VIEW_STYLE,
"grid_items": self.VALID_GRID_ITEMS,
"page_size": int,
"hide_watched": bool,
"show_ignored_only": bool,
"show_subed_only": bool,
}
validation_value = valid_values.get(key)
if isinstance(validation_value, (list, range)):
if value not in validation_value:
raise ValueError(f"Invalid value for {key}: {value}")
elif validation_value == int:
if not isinstance(value, int):
raise ValueError(f"Invalid value for {key}: {value}")
elif validation_value == bool:
if not isinstance(value, bool):
raise ValueError(f"Invalid value for {key}: {value}")
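A hedged usage sketch of the validation path (the user id is hypothetical):
config = UserConfig("user-1")
config.set_value("sort_order", "desc")  # valid, persisted
config.set_value("sort_order", "up")    # raises ValueError
config.set_value("bogus_key", True)     # raises KeyError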
def get_config(self) -> UserConfigType:
"""get config from ES or load from the application defaults"""
if not self._user_id:
# this is for a non logged-in user so use all the defaults

View File

@ -1,14 +1,12 @@
"""
Functionality:
- initiate celery app
- collect tasks
- user config changes won't get applied here
because tasks are initiated at application start
- handle task callbacks
- handle task notifications
- handle task locking
"""
import os
from celery import Celery, Task, shared_task
from celery import Task, shared_task
from home.src.download.queue import PendingList
from home.src.download.subscriptions import (
SubscriptionHandler,
@ -22,94 +20,19 @@ from home.src.index.channel import YoutubeChannel
from home.src.index.filesystem import Scanner
from home.src.index.manual import ImportFolderScanner
from home.src.index.reindex import Reindex, ReindexManual, ReindexPopulate
from home.src.ta.config import AppConfig, ReleaseVersion, ScheduleBuilder
from home.src.ta.config import ReleaseVersion
from home.src.ta.notify import Notifications
from home.src.ta.ta_redis import RedisArchivist
from home.src.ta.task_config import TASK_CONFIG
from home.src.ta.task_manager import TaskManager
from home.src.ta.urlparser import Parser
CONFIG = AppConfig().config
REDIS_HOST = os.environ.get("REDIS_HOST")
REDIS_PORT = os.environ.get("REDIS_PORT") or 6379
os.environ.setdefault("DJANGO_SETTINGS_MODULE", "config.settings")
app = Celery(
"tasks",
broker=f"redis://{REDIS_HOST}:{REDIS_PORT}",
backend=f"redis://{REDIS_HOST}:{REDIS_PORT}",
result_extended=True,
)
app.config_from_object("django.conf:settings", namespace="ta:")
app.autodiscover_tasks()
app.conf.timezone = os.environ.get("TZ") or "UTC"
class BaseTask(Task):
"""base class to inherit each class from"""
# pylint: disable=abstract-method
TASK_CONFIG = {
"update_subscribed": {
"title": "Rescan your Subscriptions",
"group": "download:scan",
"api-start": True,
"api-stop": True,
},
"download_pending": {
"title": "Downloading",
"group": "download:run",
"api-start": True,
"api-stop": True,
},
"extract_download": {
"title": "Add to download queue",
"group": "download:add",
"api-stop": True,
},
"check_reindex": {
"title": "Reindex Documents",
"group": "reindex:run",
},
"manual_import": {
"title": "Manual video import",
"group": "setting:import",
"api-start": True,
},
"run_backup": {
"title": "Index Backup",
"group": "setting:backup",
"api-start": True,
},
"restore_backup": {
"title": "Restore Backup",
"group": "setting:restore",
},
"rescan_filesystem": {
"title": "Rescan your Filesystem",
"group": "setting:filesystemscan",
"api-start": True,
},
"thumbnail_check": {
"title": "Check your Thumbnails",
"group": "setting:thumbnailcheck",
"api-start": True,
},
"resync_thumbs": {
"title": "Sync Thumbnails to Media Files",
"group": "setting:thumbnailsync",
"api-start": True,
},
"index_playlists": {
"title": "Index Channel Playlist",
"group": "channel:indexplaylist",
},
"subscribe_to": {
"title": "Add Subscription",
"group": "subscription:add",
},
}
def on_failure(self, exc, task_id, args, kwargs, einfo):
"""callback for task failure"""
print(f"{task_id} Failed callback")
@ -134,8 +57,8 @@ class BaseTask(Task):
def after_return(self, status, retval, task_id, args, kwargs, einfo):
"""callback after task returns"""
print(f"{task_id} return callback")
task_title = self.TASK_CONFIG.get(self.name).get("title")
Notifications(self.name, task_id, task_title).send()
task_title = TASK_CONFIG.get(self.name).get("title")
Notifications(self.name).send(task_id, task_title)
def send_progress(self, message_lines, progress=False, title=False):
"""send progress message"""
@ -154,7 +77,7 @@ class BaseTask(Task):
def _build_message(self, level="info"):
"""build message dict"""
task_id = self.request.id
message = self.TASK_CONFIG.get(self.name).copy()
message = TASK_CONFIG.get(self.name).copy()
message.update({"level": level, "id": task_id})
task_result = TaskManager().get_task(task_id)
if task_result:
@ -205,13 +128,13 @@ def download_pending(self, auto_only=False):
videos_downloaded = downloader.run_queue(auto_only=auto_only)
if videos_downloaded:
return f"downloaded {len(videos_downloaded)} videos."
return f"downloaded {videos_downloaded} video(s)."
return None
@shared_task(name="extract_download", bind=True, base=BaseTask)
def extrac_dl(self, youtube_ids, auto_start=False):
def extrac_dl(self, youtube_ids, auto_start=False, status="pending"):
"""parse list passed and add to pending"""
TaskManager().init(self)
if isinstance(youtube_ids, str):
@ -221,11 +144,18 @@ def extrac_dl(self, youtube_ids, auto_start=False):
pending_handler = PendingList(youtube_ids=to_add, task=self)
pending_handler.parse_url_list()
pending_handler.add_to_pending(auto_start=auto_start)
videos_added = pending_handler.add_to_pending(
status=status, auto_start=auto_start
)
if auto_start:
download_pending.delay(auto_only=True)
if videos_added:
return f"added {len(videos_added)} Videos to Queue"
return None
@shared_task(bind=True, name="check_reindex", base=BaseTask)
def check_reindex(self, data=False, extract_videos=False):
@ -248,6 +178,7 @@ def check_reindex(self, data=False, extract_videos=False):
populate = ReindexPopulate()
print(f"[task][{self.name}] reindex outdated documents")
self.send_progress("Add recent documents to the reindex Queue.")
populate.get_interval()
populate.add_recent()
self.send_progress("Add outdated documents to the reindex Queue.")
populate.add_outdated()
@ -291,7 +222,7 @@ def run_restore_backup(self, filename):
if manager.is_pending(self):
print(f"[task][{self.name}] restore is already running")
self.send_progress("Restore is already running.")
return
return None
manager.init(self)
self.send_progress(["Reset your Index"])
@ -299,6 +230,8 @@ def run_restore_backup(self, filename):
ElasticBackup(task=self).restore(filename)
print("index restore finished")
return f"backup restore completed: {filename}"
@shared_task(bind=True, name="rescan_filesystem", base=BaseTask)
def rescan_filesystem(self):
@ -326,7 +259,9 @@ def thumbnail_check(self):
return
manager.init(self)
ThumbValidator(task=self).validate()
thumnail = ThumbValidator(task=self)
thumnail.validate()
thumnail.clean_up()
@shared_task(bind=True, name="resync_thumbs", base=BaseTask)
@ -362,7 +297,3 @@ def index_channel_playlists(self, channel_id):
def version_check():
"""check for new updates"""
ReleaseVersion().check()
# start schedule here
app.conf.beat_schedule = ScheduleBuilder().build_schedule()

View File

@ -1,4 +1,5 @@
{% load static %}
{% load auth_extras %}
<!DOCTYPE html>
<html lang="en">
<head>
@ -22,11 +23,7 @@
{% else %}
<title>TubeArchivist</title>
{% endif %}
{% if colors == "dark" %}
<link rel="stylesheet" href="{% static 'css/dark.css' %}">
{% else %}
<link rel="stylesheet" href="{% static 'css/light.css' %}">
{% endif %}
<link rel="stylesheet" href="{% static 'css/' %}{{ stylesheet }}">
<script type="text/javascript" src="{% static 'script.js' %}"></script>
{% if cast %}
<script type="text/javascript" src="https://www.gstatic.com/cv/js/sender/v1/cast_sender.js?loadCastFramework=1"></script>
@ -36,16 +33,9 @@
<body>
<div class="main-content">
<div class="boxed-content">
<div class="top-banner">
<a href="{% url 'home' %}">
{% if colors == 'dark' %}
<img src="{% static 'img/banner-tube-archivist-dark.png' %}" alt="tube-archivist-banner">
{% endif %}
{% if colors == 'light' %}
<img src="{% static 'img/banner-tube-archivist-light.png' %}" alt="tube-archivist-banner">
{% endif %}
</a>
</div>
<a href="{% url 'home' %}">
<div class="top-banner"></div>
</a>
<div class="top-nav">
<div class="nav-items">
<a href="{% url 'home' %}">
@ -57,9 +47,11 @@
<a href="{% url 'playlist' %}">
<div class="nav-item">playlists</div>
</a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<a href="{% url 'downloads' %}">
<div class="nav-item">downloads</div>
</a>
{% endif %}
</div>
<div class="nav-icons">
<a href="{% url 'search' %}">

View File

@ -1,14 +1,17 @@
{# Base file for all of the settings pages to ensure a common menu #}
{% extends "home/base.html" %}
{% load static %}
{% load auth_extras %}
{% block content %}
<div class="boxed-content">
<div class="info-box-item child-page-nav">
<a href="{% url 'settings' %}"><h3>Dashboard</h3></a>
<a href="{% url 'settings_user' %}"><h3>User</h3></a>
<a href="{% url 'settings_application' %}"><h3>Application</h3></a>
<a href="{% url 'settings_scheduling' %}"><h3>Scheduling</h3></a>
<a href="{% url 'settings_actions' %}"><h3>Actions</h3></a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<a href="{% url 'settings_application' %}"><h3>Application</h3></a>
<a href="{% url 'settings_scheduling' %}"><h3>Scheduling</h3></a>
<a href="{% url 'settings_actions' %}"><h3>Actions</h3></a>
{% endif %}
</div>
<div id="notifications" data=""></div>
{% block settings_content %}{% endblock %}

View File

@ -2,11 +2,13 @@
{% load static %}
{% load humanize %}
{% block content %}
{% load auth_extras %}
<div class="boxed-content">
<div class="title-split">
<div class="title-bar">
<h1>Channels</h1>
</div>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<div class="title-split-form">
<img id="animate-icon" onclick="showForm()" src="{% static 'img/icon-add.svg' %}" alt="add-icon" title="Subscribe to Channels">
<div class="show-form">
@ -17,6 +19,7 @@
</form>
</div>
</div>
{% endif %}
</div>
<div id="notifications" data="subscription"></div>
<div class="view-controls">

View File

@ -2,6 +2,8 @@
{% block content %}
{% load static %}
{% load humanize %}
{% load auth_extras %}
<div class="boxed-content">
<div class="channel-banner">
<a href="/channel/{{ channel_info.channel_id }}/"><img src="/cache/channels/{{ channel_info.channel_id }}_banner.jpg" alt="channel_banner"></a>
@ -19,7 +21,9 @@
{% endif %}
<a href="{% url 'channel_id_about' channel_info.channel_id %}"><h3>About</h3></a>
{% if has_pending %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% endif %}
{% endif %}
</div>
<div id="notifications" data="channel reindex"></div>
@ -38,7 +42,9 @@
<p>Subscribers: {{ channel_info.channel_subs|intcomma }}</p>
{% endif %}
{% if channel_info.channel_subscribed %}
{% if request.user|has_group:"admin" or request.user.is_staff %}
<button class="unsubscribe" type="button" data-type="channel" data-subscribe="" data-id="{{ channel_info.channel_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ channel_info.channel_name }}">Unsubscribe</button>
{% endif %}
{% else %}
<button type="button" data-type="channel" data-subscribe="true" data-id="{{ channel_info.channel_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ channel_info.channel_name }}">Subscribe</button>
{% endif %}
@ -71,13 +77,15 @@
<div class="sort">
<div id="hidden-form">
<span>Sort by:</span>
<select name="sort" id="sort" onchange="sortChange(this.value)">
<select name="sort_by" id="sort" onchange="sortChange(this)">
<option value="published" {% if sort_by == "published" %}selected{% endif %}>date published</option>
<option value="downloaded" {% if sort_by == "downloaded" %}selected{% endif %}>date downloaded</option>
<option value="views" {% if sort_by == "views" %}selected{% endif %}>views</option>
<option value="likes" {% if sort_by == "likes" %}selected{% endif %}>likes</option>
<option value="duration" {% if sort_by == "duration" %}selected{% endif %}>duration</option>
<option value="filesize" {% if sort_by == "filesize" %}selected{% endif %}>file size</option>
</select>
<select name="sord-order" id="sort-order" onchange="sortChange(this.value)">
<select name="sort_order" id="sort-order" onchange="sortChange(this)">
<option value="asc" {% if sort_order == "asc" %}selected{% endif %}>asc</option>
<option value="desc" {% if sort_order == "desc" %}selected{% endif %}>desc</option>
</select>

View File

@ -2,6 +2,7 @@
{% block content %}
{% load static %}
{% load humanize %}
{% load auth_extras %}
<div class="boxed-content">
<div class="channel-banner">
<a href="{% url 'channel_id' channel_info.channel_id %}"><img src="{{ channel_info.channel_banner_url }}" alt="channel_banner"></a>
@ -19,7 +20,9 @@
{% endif %}
<a href="{% url 'channel_id_about' channel_info.channel_id %}"><h3>About</h3></a>
{% if has_pending %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% endif %}
{% endif %}
</div>
<div id="notifications" data="channel reindex"></div>
@ -56,19 +59,21 @@
{% elif channel_info.channel_views > 0 %}
<p>Channel views: {{ channel_info.channel_views|intcomma }}</p>
{% endif %}
<div class="button-box">
<button onclick="deleteConfirm()" id="delete-item">Delete Channel</button>
<div class="delete-confirm" id="delete-button">
<span>Delete {{ channel_info.channel_name }} including all videos? </span><button class="danger-button" onclick="deleteChannel(this)" data-id="{{ channel_info.channel_id }}">Delete</button> <button onclick="cancelDelete()">Cancel</button>
</div>
</div>
{% if reindex %}
<p>Reindex scheduled</p>
{% else %}
<div id="reindex-button" class="button-box">
<button data-id="{{ channel_info.channel_id }}" data-type="channel" onclick="reindex(this)" title="Reindex Channel {{ channel_info.channel_name }}">Reindex</button>
<button data-id="{{ channel_info.channel_id }}" data-type="channel" data-extract-videos="true" onclick="reindex(this)" title="Reindex Videos of {{ channel_info.channel_name }}">Reindex Videos</button>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<div class="button-box">
<button onclick="deleteConfirm()" id="delete-item">Delete Channel</button>
<div class="delete-confirm" id="delete-button">
<span>Delete {{ channel_info.channel_name }} including all videos? </span><button class="danger-button" onclick="deleteChannel(this)" data-id="{{ channel_info.channel_id }}">Delete</button> <button onclick="cancelDelete()">Cancel</button>
</div>
</div>
{% if reindex %}
<p>Reindex scheduled</p>
{% else %}
<div id="reindex-button" class="button-box">
<button data-id="{{ channel_info.channel_id }}" data-type="channel" onclick="reindex(this)" title="Reindex Channel {{ channel_info.channel_name }}">Reindex</button>
<button data-id="{{ channel_info.channel_id }}" data-type="channel" data-extract-videos="true" onclick="reindex(this)" title="Reindex Videos of {{ channel_info.channel_name }}">Reindex Videos</button>
</div>
{% endif %}
{% endif %}
</div>
</div>
@ -90,53 +95,55 @@
</div>
</div>
{% endif %}
<div id="overwrite-form" class="info-box">
<div class="info-box-item">
<h2>Customize {{ channel_info.channel_name }}</h2>
<form class="overwrite-form" action="/channel/{{ channel_info.channel_id }}/about/" method="POST">
{% csrf_token %}
<div class="overwrite-form-item">
<p>Download format: <span class="settings-current">
{% if channel_info.channel_overwrites.download_format %}
{{ channel_info.channel_overwrites.download_format }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.download_format }}<br>
</div>
<div class="overwrite-form-item">
<p>Auto delete watched videos after x days: <span class="settings-current">
{% if channel_info.channel_overwrites.autodelete_days %}
{{ channel_info.channel_overwrites.autodelete_days }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.autodelete_days }}<br>
</div>
<div class="overwrite-form-item">
<p>Index playlists: <span class="settings-current">
{% if channel_info.channel_overwrites.index_playlists %}
{{ channel_info.channel_overwrites.index_playlists }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.index_playlists }}<br>
</div>
<div class="overwrite-form-item">
<p>Enable <a href="https://sponsor.ajay.app/" target="_blank">SponsorBlock</a>: <span class="settings-current">
{% if channel_info.channel_overwrites.integrate_sponsorblock %}
{{ channel_info.channel_overwrites.integrate_sponsorblock }}
{% elif channel_info.channel_overwrites.integrate_sponsorblock == False %}
Disabled
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.integrate_sponsorblock }}<br>
</div>
<button type="submit">Save Channel Overwrites</button>
</form>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<div id="overwrite-form" class="info-box">
<div class="info-box-item">
<h2>Customize {{ channel_info.channel_name }}</h2>
<form class="overwrite-form" action="/channel/{{ channel_info.channel_id }}/about/" method="POST">
{% csrf_token %}
<div class="overwrite-form-item">
<p>Download format: <span class="settings-current">
{% if channel_info.channel_overwrites.download_format %}
{{ channel_info.channel_overwrites.download_format }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.download_format }}<br>
</div>
<div class="overwrite-form-item">
<p>Auto delete watched videos after x days: <span class="settings-current">
{% if channel_info.channel_overwrites.autodelete_days %}
{{ channel_info.channel_overwrites.autodelete_days }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.autodelete_days }}<br>
</div>
<div class="overwrite-form-item">
<p>Index playlists: <span class="settings-current">
{% if channel_info.channel_overwrites.index_playlists %}
{{ channel_info.channel_overwrites.index_playlists }}
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.index_playlists }}<br>
</div>
<div class="overwrite-form-item">
<p>Enable <a href="https://sponsor.ajay.app/" target="_blank">SponsorBlock</a>: <span class="settings-current">
{% if channel_info.channel_overwrites.integrate_sponsorblock %}
{{ channel_info.channel_overwrites.integrate_sponsorblock }}
{% elif channel_info.channel_overwrites.integrate_sponsorblock == False %}
Disabled
{% else %}
False
{% endif %}</span></p>
{{ channel_overwrite_form.integrate_sponsorblock }}<br>
</div>
<button type="submit">Save Channel Overwrites</button>
</form>
</div>
</div>
</div>
{% endif %}
</div>
<script type="text/javascript" src="{% static 'progress.js' %}"></script>
{% endblock content %}

View File

@ -2,6 +2,7 @@
{% block content %}
{% load static %}
{% load humanize %}
{% load auth_extras %}
<div class="boxed-content">
<div class="channel-banner">
<a href="{% url 'channel_id' channel_info.channel_id %}"><img src="{{ channel_info.channel_banner_url }}" alt="channel_banner"></a>
@ -19,7 +20,9 @@
{% endif %}
<a href="{% url 'channel_id_about' channel_info.channel_id %}"><h3>About</h3></a>
{% if has_pending %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<a href="{% url 'downloads' %}?channel={{ channel_info.channel_id }}"><h3>Downloads</h3></a>
{% endif %}
{% endif %}
</div>
<div id="notifications" data="channel reindex"></div>
@ -50,13 +53,14 @@
</a>
</div>
<div class="playlist-desc {{ view_style }}">
<a href="{% url 'channel_id' playlist.playlist_channel_id %}"><h3>{{ playlist.playlist_channel }}</h3></a>
<a href="{% url 'playlist_id' playlist.playlist_id %}"><h2>{{ playlist.playlist_name }}</h2></a>
<p>Last refreshed: {{ playlist.playlist_last_refresh }}</p>
{% if playlist.playlist_subscribed %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist.playlist_name }}">Unsubscribe</button>
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist.playlist_name }}">Subscribe</button>
{% if request.user|has_group:"admin" or request.user.is_staff %}
{% if playlist.playlist_subscribed %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist.playlist_name }}">Unsubscribe</button>
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist.playlist_name }}">Subscribe</button>
{% endif %}
{% endif %}
</div>
</div>

View File

@ -60,13 +60,15 @@
<div class="sort">
<div id="hidden-form">
<span>Sort by:</span>
<select name="sort" id="sort" onchange="sortChange(this.value)">
<select name="sort_by" id="sort" onchange="sortChange(this)">
<option value="published" {% if sort_by == "published" %}selected{% endif %}>date published</option>
<option value="downloaded" {% if sort_by == "downloaded" %}selected{% endif %}>date downloaded</option>
<option value="views" {% if sort_by == "views" %}selected{% endif %}>views</option>
<option value="likes" {% if sort_by == "likes" %}selected{% endif %}>likes</option>
<option value="duration" {% if sort_by == "duration" %}selected{% endif %}>duration</option>
<option value="filesize" {% if sort_by == "filesize" %}selected{% endif %}>file size</option>
</select>
<select name="sord-order" id="sort-order" onchange="sortChange(this.value)">
<select name="sort_order" id="sort-order" onchange="sortChange(this)">
<option value="asc" {% if sort_order == "asc" %}selected{% endif %}>asc</option>
<option value="desc" {% if sort_order == "desc" %}selected{% endif %}>desc</option>
</select>

View File

@ -18,20 +18,11 @@
<meta name="msapplication-TileColor" content="#01202e">
<meta name="msapplication-config" content="{% static 'favicon/browserconfig.xml' %}">
<meta name="theme-color" content="#01202e">
{% if colors == "dark" %}
<link rel="stylesheet" href="{% static 'css/dark.css' %}">
{% else %}
<link rel="stylesheet" href="{% static 'css/light.css' %}">
{% endif %}
<link rel="stylesheet" href="{% static 'css/' %}{{ stylesheet }}">
</head>
<body>
<div class="boxed-content login-page">
{% if colors == 'dark' %}
<img src="{% static 'img/logo-tube-archivist-dark.png' %}" alt="tube-archivist-logo">
{% endif %}
{% if colors == 'light' %}
<img src="{% static 'img/logo-tube-archivist-light.png' %}" alt="tube-archivist-banner">
{% endif %}
<img alt="tube-archivist-logo">
<h1>Tube Archivist</h1>
<h2>Your Self Hosted YouTube Media Server</h2>
{% if form_error %}

View File

@ -1,21 +1,32 @@
{% extends "home/base.html" %}
{% load static %}
{% block content %}
{% load auth_extras %}
<div class="boxed-content">
<div class="title-split">
<div class="title-bar">
<h1>Playlists</h1>
</div>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<div class="title-split-form">
<img id="animate-icon" onclick="showForm()" src="{% static 'img/icon-add.svg' %}" alt="add-icon" title="Subscribe to Playlists">
<img id="animate-icon" onclick="showForm();showForm('hidden-form2')" src="{% static 'img/icon-add.svg' %}" alt="add-icon" title="Subscribe to Playlists">
<div class="show-form">
<form id="hidden-form" action="/playlist/" method="post">
{% csrf_token %}
{{ subscribe_form }}
<button type="submit">Subscribe</button>
</form>
<form id="hidden-form2" action="/playlist/" method="post">
{% csrf_token %}
{{ create_form }}
<button type="submit">Create</button>
</form>
</div>
</div>
{% endif %}
</div>
<div id="notifications" data="subscription"></div>
<div class="view-controls">
@ -45,14 +56,18 @@
</a>
</div>
<div class="playlist-desc {{ view_style }}">
<a href="{% url 'channel_id' playlist.playlist_channel_id %}"><h3>{{ playlist.playlist_channel }}</h3></a>
{% if playlist.playlist_type != "custom" %}
<a href="{% url 'channel_id' playlist.playlist_channel_id %}"><h3>{{ playlist.playlist_channel }}</h3></a>
{% endif %}
<a href="{% url 'playlist_id' playlist.playlist_id %}"><h2>{{ playlist.playlist_name }}</h2></a>
<p>Last refreshed: {{ playlist.playlist_last_refresh }}</p>
{% if playlist.playlist_subscribed %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist.playlist_name }}">Unsubscribe</button>
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist.playlist_name }}">Subscribe</button>
{% endif %}
{% if playlist.playlist_type != "custom" %}
{% if playlist.playlist_subscribed %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist.playlist_name }}">Unsubscribe</button>
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist.playlist_name }}">Subscribe</button>
{% endif %}
{% endif %}
</div>
</div>
{% endfor %}
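The new create form above posts a "create" field alongside the existing subscribe flow; the matching branch in PlaylistView.post (in the views.py diff further down) mints an internal ID for custom playlists. A minimal sketch of that branch, assuming the project's YoutubePlaylist indexer class:

    # Hedged sketch mirroring PlaylistView.post below; helper name is illustrative.
    import uuid

    def create_custom_playlist(name):
        """custom playlists get a synthetic TA_playlist_<uuid4> id"""
        playlist_id = f"TA_playlist_{uuid.uuid4()}"
        YoutubePlaylist(playlist_id).create(name)  # project's indexer class
        return playlist_id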

View File

@ -2,40 +2,49 @@
{% load static %}
{% load humanize %}
{% block content %}
{% load auth_extras %}
<div class="boxed-content">
<div class="title-bar">
<h1>{{ playlist_info.playlist_name }}</h1>
</div>
<div class="info-box info-box-3">
<div class="info-box-item">
<div class="round-img">
<a href="{% url 'channel_id' channel_info.channel_id %}">
<img src="/cache/channels/{{ channel_info.channel_id }}_thumb.jpg" alt="channel-thumb">
</a>
</div>
<div>
<h3><a href="{% url 'channel_id' channel_info.channel_id %}">{{ channel_info.channel_name }}</a></h3>
{% if channel_info.channel_subs >= 1000000 %}
<span>Subscribers: {{ channel_info.channel_subs|intword }}</span>
{% else %}
<span>Subscribers: {{ channel_info.channel_subs|intcomma }}</span>
{% endif %}
</div>
</div>
{% if playlist_info.playlist_type != "custom" %}
<div class="info-box-item">
<div class="round-img">
<a href="{% url 'channel_id' channel_info.channel_id %}">
<img src="/cache/channels/{{ channel_info.channel_id }}_thumb.jpg" alt="channel-thumb">
</a>
</div>
<div>
<h3><a href="{% url 'channel_id' channel_info.channel_id %}">{{ channel_info.channel_name }}</a></h3>
{% if channel_info.channel_subs >= 1000000 %}
<span>Subscribers: {{ channel_info.channel_subs|intword }}</span>
{% else %}
<span>Subscribers: {{ channel_info.channel_subs|intcomma }}</span>
{% endif %}
</div>
</div>
{% endif %}
<div class="info-box-item">
<div>
<p>Last refreshed: {{ playlist_info.playlist_last_refresh }}</p>
<p>Playlist:
{% if playlist_info.playlist_subscribed %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist_info.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist_info.playlist_name }}">Unsubscribe</button>
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist_info.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist_info.playlist_name }}">Subscribe</button>
{% endif %}
</p>
{% if playlist_info.playlist_active %}
<p>Youtube: <a href="https://www.youtube.com/playlist?list={{ playlist_info.playlist_id }}" target="_blank">Active</a></p>
{% else %}
<p>Youtube: Deactivated</p>
{% if playlist_info.playlist_type != "custom" %}
<p>Playlist:
{% if playlist_info.playlist_subscribed %}
{% if request.user|has_group:"admin" or request.user.is_staff %}
<button class="unsubscribe" type="button" data-type="playlist" data-subscribe="" data-id="{{ playlist_info.playlist_id }}" onclick="subscribeStatus(this)" title="Unsubscribe from {{ playlist_info.playlist_name }}">Unsubscribe</button>
{% endif %}
{% else %}
<button type="button" data-type="playlist" data-subscribe="true" data-id="{{ playlist_info.playlist_id }}" onclick="subscribeStatus(this)" title="Subscribe to {{ playlist_info.playlist_name }}">Subscribe</button>
{% endif %}
</p>
{% if playlist_info.playlist_active %}
<p>Youtube: <a href="https://www.youtube.com/playlist?list={{ playlist_info.playlist_id }}" target="_blank">Active</a></p>
{% else %}
<p>Youtube: Deactivated</p>
{% endif %}
{% endif %}
<button onclick="deleteConfirm()" id="delete-item">Delete Playlist</button>
<div class="delete-confirm" id="delete-button">
@ -59,7 +68,9 @@
<p>Reindex scheduled</p>
{% else %}
<div id="reindex-button" class="button-box">
<button data-id="{{ playlist_info.playlist_id }}" data-type="playlist" onclick="reindex(this)" title="Reindex Playlist {{ playlist_info.playlist_name }}">Reindex</button>
{% if playlist_info.playlist_type != "custom" %}
<button data-id="{{ playlist_info.playlist_id }}" data-type="playlist" onclick="reindex(this)" title="Reindex Playlist {{ playlist_info.playlist_name }}">Reindex</button>
{% endif %}
<button data-id="{{ playlist_info.playlist_id }}" data-type="playlist" data-extract-videos="true" onclick="reindex(this)" title="Reindex Videos of {{ playlist_info.playlist_name }}">Reindex Videos</button>
</div>
{% endif %}
@ -134,15 +145,35 @@
{% endif %}
<span>{{ video.published }} | {{ video.player.duration_str }}</span>
</div>
<div>
<a class="video-more" href="{% url 'video' video.youtube_id %}"><h2>{{ video.title }}</h2></a>
<div class="video-desc-details">
<div>
{% if playlist_info.playlist_type == "custom" %}
<a href="{% url 'channel_id' video.channel.channel_id %}"><h3>{{ video.channel.channel_name }}</h3></a>
{% endif %}
<a class="video-more" href="{% url 'video' video.youtube_id %}"><h2>{{ video.title }}</h2></a>
</div>
{% if playlist_info.playlist_type == "custom" %}
{% if pagination %}
{% if pagination.last_page > 0 %}
<img id="{{ video.youtube_id }}-button" src="{% static 'img/icon-dot-menu.svg' %}" alt="dot-menu-icon" data-id="{{ video.youtube_id }}" data-context="video" onclick="showCustomPlaylistMenu(this,'{{playlist_info.playlist_id}}',{{pagination.current_page}},{{pagination.last_page}})" class="dot-button" title="More actions">
{% else %}
<img id="{{ video.youtube_id }}-button" src="{% static 'img/icon-dot-menu.svg' %}" alt="dot-menu-icon" data-id="{{ video.youtube_id }}" data-context="video" onclick="showCustomPlaylistMenu(this,'{{playlist_info.playlist_id}}',{{pagination.current_page}},{{pagination.current_page}})" class="dot-button" title="More actions">
{% endif %}
{% else %}
<img id="{{ video.youtube_id }}-button" src="{% static 'img/icon-dot-menu.svg' %}" alt="dot-menu-icon" data-id="{{ video.youtube_id }}" data-context="video" onclick="showCustomPlaylistMenu(this,'{{playlist_info.playlist_id}}',0,0)" class="dot-button" title="More actions">
{% endif %}
{% endif %}
</div>
</div>
</div>
{% endfor %}
{% else %}
<h2>No videos found...</h2>
<p>Try going to the <a href="{% url 'downloads' %}">downloads page</a> to start the scan and download tasks.</p>
{% if playlist_info.playlist_type == "custom" %}
<p>Try going to the <a href="{% url 'home' %}">home page</a> to add videos to this playlist.</p>
{% else %}
<p>Try going to the <a href="{% url 'downloads' %}">downloads page</a> to start the scan and download tasks.</p>
{% endif %}
{% endif %}
</div>
</div>

View File

@ -5,14 +5,26 @@
<h1>Your Archive</h1>
</div>
<div class="settings-item">
<h2>Main overview</h2>
<div id="primaryBox" class="info-box info-box-4">
<h2>Overview</h2>
<div id="activeBox" class="info-box info-box-3">
<p id="loading">Loading...</p>
</div>
</div>
<div class="settings-item">
<h2>Video Type</h2>
<div id="videoTypeBox" class="info-box info-box-3">
<p id="loading">Loading...</p>
</div>
</div>
<div class="settings-item">
<h2>Application</h2>
<div id="secondaryBox" class="info-box info-box-3">
<p id="loading">Loading...</p>
</div>
</div>
<div class="settings-item">
<h2>Watch Progress</h2>
<div id="watchBox" class="info-box info-box-3">
<div id="watchBox" class="info-box info-box-2">
<p id="loading">Loading...</p>
</div>
</div>

View File

@ -10,11 +10,9 @@
<li><span class="settings-current">0 15 *</span>: Run task every day at 15:00 in the afternoon.</li>
<li><span class="settings-current">30 8 */2</span>: Run task every second day of the week (Sun, Tue, Thu, Sat) at 08:30 in the morning.</li>
<li><span class="settings-current">auto</span>: Sensible default.</li>
<li><span class="settings-current">0</span>: (zero), deactivate that task.</li>
</ul>
<p>Note:</p>
<ul>
<li>Changes in the scheduler settings require a container restart to take effect.</li>
<li>Avoid an unnecessarily frequent schedule to avoid getting blocked by YouTube. For that reason, the scheduler doesn't support schedules that trigger more than once per hour.</li>
</ul>
</div>
@ -24,68 +22,47 @@
<div class="settings-group">
<h2>Rescan Subscriptions</h2>
<div class="settings-item">
<p>Become a sponsor and join <a href="https://members.tubearchivist.com/" target="_blank">members.tubearchivist.com</a> to get access to <span class="settings-current">real time</span> notifications for new videos uploaded by your favorite channels.</p>
<p>Current rescan schedule: <span class="settings-current">
{% if config.scheduler.update_subscribed %}
{% for key, value in config.scheduler.update_subscribed.items %}
{{ value }}
{% endfor %}
{% if update_subscribed %}
{{ update_subscribed.crontab.minute }} {{ update_subscribed.crontab.hour }} {{ update_subscribed.crontab.day_of_week }}
<button data-schedule="update_subscribed" onclick="deleteSchedule(this)" class="danger-button">Delete</button>
{% else %}
False
{% endif %}
</span></p>
<p>Become a sponsor and join <a href="https://members.tubearchivist.com/" target="_blank">members.tubearchivist.com</a> to get access to <span class="settings-current">real time</span> notifications for new videos uploaded by your favorite channels.</p>
<p>Periodically rescan your subscriptions:</p>
{% for error in scheduler_form.update_subscribed.errors %}
<p class="danger-zone">{{ error }}</p>
{% endfor %}
{{ scheduler_form.update_subscribed }}
</div>
<div class="settings-item">
<p>Send notification on task completed:</p>
{% if config.scheduler.update_subscribed_notify %}
<p><button type="button" onclick="textReveal(this)" id="text-reveal-button">Show</button> stored notification links</p>
<div id="text-reveal" class="description-text">
<p>{{ config.scheduler.update_subscribed_notify|linebreaks }}</p>
</div>
{% else %}
<p>Current notification urls: <span class="settings-current">{{ config.scheduler.update_subscribed_notify }}</span></p>
{% endif %}
{{ scheduler_form.update_subscribed_notify }}
</div>
</div>
<div class="settings-group">
<h2>Start download</h2>
<h2>Start Download</h2>
<div class="settings-item">
<p>Current Download schedule: <span class="settings-current">
{% if config.scheduler.download_pending %}
{% for key, value in config.scheduler.download_pending.items %}
{{ value }}
{% endfor %}
{% if download_pending %}
{{ download_pending.crontab.minute }} {{ download_pending.crontab.hour }} {{ download_pending.crontab.day_of_week }}
<button data-schedule="download_pending" onclick="deleteSchedule(this)" class="danger-button">Delete</button>
{% else %}
False
{% endif %}
</span></p>
<p>Automatic video download schedule:</p>
{% for error in scheduler_form.download_pending.errors %}
<p class="danger-zone">{{ error }}</p>
{% endfor %}
{{ scheduler_form.download_pending }}
</div>
<div class="settings-item">
<p>Send notification on task completed:</p>
{% if config.scheduler.download_pending_notify %}
<p><button type="button" onclick="textReveal(this)" id="text-reveal-button">Show</button> stored notification links</p>
<div id="text-reveal" class="description-text">
<p>{{ config.scheduler.download_pending_notify|linebreaks }}</p>
</div>
{% else %}
<p>Current notification urls: <span class="settings-current">{{ config.scheduler.download_pending_notify }}</span></p>
{% endif %}
{{ scheduler_form.download_pending_notify }}
</div>
</div>
<div class="settings-group">
<h2>Refresh Metadata</h2>
<div class="settings-item">
<p>Current Metadata refresh schedule: <span class="settings-current">
{% if config.scheduler.check_reindex %}
{% for key, value in config.scheduler.check_reindex.items %}
{{ value }}
{% endfor %}
{% if check_reindex %}
{{ check_reindex.crontab.minute }} {{ check_reindex.crontab.hour }} {{ check_reindex.crontab.day_of_week }}
<button data-schedule="check_reindex" onclick="deleteSchedule(this)" class="danger-button">Delete</button>
{% else %}
False
{% endif %}
@ -94,36 +71,29 @@
{{ scheduler_form.check_reindex }}
</div>
<div class="settings-item">
<p>Current refresh for metadata older than x days: <span class="settings-current">{{ config.scheduler.check_reindex_days }}</span></p>
<p>Current refresh for metadata older than x days: <span class="settings-current">{{ check_reindex.task_config.days }}</span></p>
<p>Refresh older than x days, recommended 90:</p>
{% for error in scheduler_form.check_reindex.errors %}
<p class="danger-zone">{{ error }}</p>
{% endfor %}
{{ scheduler_form.check_reindex_days }}
</div>
<div class="settings-item">
<p>Send notification on task completed:</p>
{% if config.scheduler.check_reindex_notify %}
<p><button type="button" onclick="textReveal(this)" id="text-reveal-button">Show</button> stored notification links</p>
<div id="text-reveal" class="description-text">
<p>{{ config.scheduler.check_reindex_notify|linebreaks }}</p>
</div>
{% else %}
<p>Current notification urls: <span class="settings-current">{{ config.scheduler.check_reindex_notify }}</span></p>
{% endif %}
{{ scheduler_form.check_reindex_notify }}
</div>
</div>
<div class="settings-group">
<h2>Thumbnail check</h2>
<h2>Thumbnail Check</h2>
<div class="settings-item">
<p>Current thumbnail check schedule: <span class="settings-current">
{% if config.scheduler.thumbnail_check %}
{% for key, value in config.scheduler.thumbnail_check.items %}
{{ value }}
{% endfor %}
{% if thumbnail_check %}
{{ thumbnail_check.crontab.minute }} {{ thumbnail_check.crontab.hour }} {{ thumbnail_check.crontab.day_of_week }}
<button data-schedule="thumbnail_check" onclick="deleteSchedule(this)" class="danger-button">Delete</button>
{% else %}
False
{% endif %}
</span></p>
<p>Periodically check and clean up thumbnails:</p>
{% for error in scheduler_form.thumbnail_check.errors %}
<p class="danger-zone">{{ error }}</p>
{% endfor %}
{{ scheduler_form.thumbnail_check }}
</div>
</div>
@ -132,23 +102,51 @@
<div class="settings-item">
<p><i>Zip file backups are very slow for large archives and consistency is not guaranteed; use snapshots instead. Make sure no other tasks are running when creating a Zip file backup.</i></p>
<p>Current index backup schedule: <span class="settings-current">
{% if config.scheduler.run_backup %}
{% for key, value in config.scheduler.run_backup.items %}
{{ value }}
{% endfor %}
{% if run_backup %}
{{ run_backup.crontab.minute }} {{ run_backup.crontab.hour }} {{ run_backup.crontab.day_of_week }}
<button data-schedule="run_backup" onclick="deleteSchedule(this)" class="danger-button">Delete</button>
{% else %}
False
{% endif %}
</span></p>
<p>Automatically backup metadata to a zip file:</p>
{% for error in scheduler_form.run_backup.errors %}
<p class="danger-zone">{{ error }}</p>
{% endfor %}
{{ scheduler_form.run_backup }}
</div>
<div class="settings-item">
<p>Current backup files to keep: <span class="settings-current">{{ config.scheduler.run_backup_rotate }}</span></p>
<p>Current backup files to keep: <span class="settings-current">{{ run_backup.task_config.rotate }}</span></p>
<p>Max auto backups to keep:</p>
{{ scheduler_form.run_backup_rotate }}
</div>
</div>
<div class="settings-group">
<h2>Add Notification URL</h2>
<div class="settings-item">
{% if notifications %}
<p><button type="button" onclick="textReveal(this)" id="text-reveal-button">Show</button> stored notification links</p>
<div id="text-reveal" class="description-text">
{% for task, notifications in notifications.items %}
<h3>{{ notifications.title }}</h3>
{% for url in notifications.urls %}
<p>
<button type="button" class="danger-button" data-url="{{ url }}" data-task="{{ task }}" onclick="deleteNotificationUrl(this)"> Delete</button>
<span> {{ url }}</span>
</p>
{% endfor %}
{% endfor %}
</div>
{% else %}
<p>No notifications stored</p>
{% endif %}
</div>
<div class="settings-item">
<p><i>Send notification on completed tasks with the help of the <a href="https://github.com/caronc/apprise" target="_blank">Apprise</a> library.</i></p>
{{ notification_form.task }}
{{ notification_form.notification_url }}
</div>
</div>
<button type="submit" name="scheduler-settings">Update Scheduler Settings</button>
</form>
{% endblock settings_content %}
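The schedule readouts above now come straight from CustomPeriodicTask rows, interpolating crontab.minute, crontab.hour and crontab.day_of_week, which matches the django-celery-beat dependency added in requirements.txt below. A minimal sketch of how a "0 15 *" schedule could map onto the stock django-celery-beat models (used here as a stand-in for the project's CustomPeriodicTask subclass):

    # Hedged sketch with stock models; the task name/path is illustrative.
    from django_celery_beat.models import CrontabSchedule, PeriodicTask

    crontab, _ = CrontabSchedule.objects.get_or_create(
        minute="0", hour="15", day_of_week="*"
    )
    task, _ = PeriodicTask.objects.get_or_create(
        name="check_reindex",
        defaults={"task": "check_reindex", "crontab": crontab},
    )
    # the template interpolates exactly these three fields:
    print(task.crontab.minute, task.crontab.hour, task.crontab.day_of_week)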

View File

@ -7,11 +7,11 @@
<form action="{% url 'settings_user' %}" method="POST" name="user-update">
{% csrf_token %}
<div class="settings-group">
<h2>Color scheme</h2>
<h2>Stylesheet</h2>
<div class="settings-item">
<p>Current color scheme: <span class="settings-current">{{ colors }}</span></p>
<i>Select your preferred color scheme between dark and light mode.</i><br>
{{ user_form.colors }}
<p>Current stylesheet: <span class="settings-current">{{ stylesheet }}</span></p>
<i>Select your preferred stylesheet.</i><br>
{{ user_form.stylesheet }}
</div>
</div>
<div class="settings-group">

View File

@ -2,6 +2,7 @@
{% block content %}
{% load static %}
{% load humanize %}
{% load auth_extras %}
<div id="player" class="player-wrapper">
<div class="video-main">
<div class="video-modal"><span class="video-modal-text"></span></div>
@ -81,15 +82,23 @@
{% if reindex %}
<p>Reindex scheduled</p>
{% else %}
{% if request.user|has_group:"admin" or request.user.is_staff %}
<div id="reindex-button" class="button-box">
<button data-id="{{ video.youtube_id }}" data-type="video" onclick="reindex(this)" title="Reindex {{ video.title }}">Reindex</button>
</div>
{% endif %}
{% endif %}
<a download="" href="/media/{{ video.media_url }}"><button id="download-item">Download File</button></a>
<a download="" href="{{ video.media_url }}"><button id="download-item">Download File</button></a>
{% if request.user|has_group:"admin" or request.user.is_staff %}
<button onclick="deleteConfirm()" id="delete-item">Delete Video</button>
<div class="delete-confirm" id="delete-button">
<span>Are you sure? </span><button class="danger-button" onclick="deleteVideo(this)" data-id="{{ video.youtube_id }}" data-redirect = "{{ video.channel.channel_id }}">Delete</button> <button onclick="cancelDelete()">Cancel</button>
<span>Are you sure? </span>
<button class="danger-button" onclick="deleteVideo(this)" data-id="{{ video.youtube_id }}" data-redirect = "{{ video.channel.channel_id }}">Delete</button>
<button class="danger-button" onclick="deleteVideo(this)" data-id="{{ video.youtube_id }}" data-ignore data-redirect = "{{ video.channel.channel_id }}">Delete and ignore</button>
<button onclick="cancelDelete()">Cancel</button>
</div>
{% endif %}
<button id="{{ video.youtube_id }}-button" data-id="{{ video.youtube_id }}" data-context="video" onclick="showAddToPlaylistMenu(this)">Add To Playlist</button>
</div>
</div>
<div class="info-box-item">

View File

@ -0,0 +1,8 @@
from django import template

register = template.Library()


@register.filter(name="has_group")
def has_group(user, group_name):
    return user.groups.filter(name=group_name).exists()
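This filter backs the {% if request.user|has_group:"admin" or request.user.is_staff %} gates added throughout the templates above; the {% load auth_extras %} lines imply the module is named auth_extras inside an app's templatetags package. A quick test-style sketch of the filter, assuming Django's stock auth models:

    # Hedged usage sketch, not part of the change set; has_group is the
    # filter function defined above, user names are hypothetical fixtures.
    from django.contrib.auth.models import Group, User

    user = User.objects.create_user("demo-user")
    admin_group, _ = Group.objects.get_or_create(name="admin")
    user.groups.add(admin_group)

    assert has_group(user, "admin") is True    # what the template gate checks
    assert has_group(user, "editor") is False  # unknown groups fail closed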

View File

@ -58,7 +58,6 @@ urlpatterns = [
login_required(views.SettingsActionsView.as_view()),
name="settings_actions",
),
path("process/", login_required(views.process), name="process"),
path(
"channel/",
login_required(views.ChannelView.as_view()),

View File

@ -3,42 +3,53 @@ Functionality:
- all views for home app
- holds base classes to inherit from
"""
import enum
import json
import urllib.parse
import uuid
from time import sleep
from api.src.search_processor import SearchProcess, process_aggs
from api.views import check_admin
from django.conf import settings
from django.contrib.auth import login
from django.contrib.auth.decorators import user_passes_test
from django.contrib.auth.forms import AuthenticationForm
from django.http import Http404, JsonResponse
from django.http import Http404
from django.shortcuts import redirect, render
from django.utils.decorators import method_decorator
from django.views import View
from home.models import CustomPeriodicTask
from home.src.download.queue import PendingInteract
from home.src.download.yt_dlp_base import CookieHandler
from home.src.es.backup import ElasticBackup
from home.src.es.connect import ElasticWrap
from home.src.es.snapshot import ElasticSnapshot
from home.src.frontend.api_calls import PostData
from home.src.frontend.forms import (
AddToQueueForm,
ApplicationSettingsForm,
ChannelOverwriteForm,
CreatePlaylistForm,
CustomAuthForm,
MultiSearchForm,
SchedulerSettingsForm,
SubscribeToChannelForm,
SubscribeToPlaylistForm,
UserSettingsForm,
)
from home.src.frontend.forms_schedule import (
NotificationSettingsForm,
SchedulerSettingsForm,
)
from home.src.index.channel import channel_overwrites
from home.src.index.generic import Pagination
from home.src.index.playlist import YoutubePlaylist
from home.src.index.reindex import ReindexProgress
from home.src.index.video_constants import VideoTypeEnum
from home.src.ta.config import AppConfig, ReleaseVersion, ScheduleBuilder
from home.src.ta.helper import time_parser
from home.src.ta.config import AppConfig, ReleaseVersion
from home.src.ta.config_schedule import ScheduleBuilder
from home.src.ta.helper import check_stylesheet, time_parser
from home.src.ta.notify import Notifications, get_all_notifications
from home.src.ta.settings import EnvironmentSettings
from home.src.ta.ta_redis import RedisArchivist
from home.src.ta.users import UserConfig
from home.tasks import index_channel_playlists, subscribe_to
@ -53,7 +64,6 @@ class ArchivistViewConfig(View):
self.view_origin = view_origin
self.user_id = False
self.user_conf: UserConfig = False
self.default_conf = False
self.context = False
def get_all_view_styles(self):
@ -70,11 +80,12 @@ class ArchivistViewConfig(View):
"""build default context for every view"""
self.user_id = user_id
self.user_conf = UserConfig(self.user_id)
self.default_conf = AppConfig().config
self.context = {
"colors": self.user_conf.get_value("colors"),
"cast": self.default_conf["application"]["enable_cast"],
"stylesheet": check_stylesheet(
self.user_conf.get_value("stylesheet")
),
"cast": EnvironmentSettings.ENABLE_CAST,
"sort_by": self.user_conf.get_value("sort_by"),
"sort_order": self.user_conf.get_value("sort_order"),
"view_style": self.user_conf.get_value(
@ -109,6 +120,8 @@ class ArchivistResultsView(ArchivistViewConfig):
"likes": "stats.like_count",
"downloaded": "date_downloaded",
"published": "published",
"duration": "player.duration",
"filesize": "media_size",
}
sort_by = sort_by_map[self.context["sort_by"]]
@ -220,7 +233,9 @@ class MinView(View):
def get_min_context(request):
"""build minimal vars for context"""
return {
"colors": UserConfig(request.user.id).get_value("colors"),
"stylesheet": check_stylesheet(
UserConfig(request.user.id).get_value("stylesheet")
),
"version": settings.TA_VERSION,
"ta_update": ReleaseVersion().get_update(),
}
@ -317,6 +332,7 @@ class AboutView(MinView):
return render(request, "home/about.html", context)
@method_decorator(user_passes_test(check_admin), name="dispatch")
class DownloadView(ArchivistResultsView):
"""resolves to /download/
handle the download queue
@ -415,6 +431,8 @@ class ChannelIdBaseView(ArchivistResultsView):
path = f"ta_channel/_doc/{channel_id}"
response, _ = ElasticWrap(path).get()
channel_info = SearchProcess(response).process()
if not channel_info:
raise Http404
return channel_info
@ -514,7 +532,7 @@ class ChannelIdView(ChannelIdBaseView):
self.context.update(
{
"title": "Channel: " + channel_name,
"title": f"Channel: {channel_name}",
"channel_info": channel_info,
}
)
@ -597,6 +615,7 @@ class ChannelIdAboutView(ChannelIdBaseView):
return render(request, "home/channel_id_about.html", self.context)
@method_decorator(user_passes_test(check_admin), name="dispatch")
@staticmethod
def post(request, channel_id):
"""handle post request"""
@ -681,6 +700,7 @@ class ChannelView(ArchivistResultsView):
"term": {"channel_subscribed": {"value": True}}
}
@method_decorator(user_passes_test(check_admin), name="dispatch")
@staticmethod
def post(request):
"""handle http post requests"""
@ -706,6 +726,9 @@ class PlaylistIdView(ArchivistResultsView):
"""handle get request"""
self.initiate_vars(request)
playlist_info, channel_info = self._get_info(playlist_id)
if not playlist_info:
raise Http404
playlist_name = playlist_info["playlist_name"]
self._update_view_data(playlist_id, playlist_info)
self.find_results()
@ -730,12 +753,12 @@ class PlaylistIdView(ArchivistResultsView):
# playlist details
es_path = f"ta_playlist/_doc/{playlist_id}"
playlist_info = self.single_lookup(es_path)
# channel details
channel_id = playlist_info["playlist_channel_id"]
es_path = f"ta_channel/_doc/{channel_id}"
channel_info = self.single_lookup(es_path)
channel_info = None
if playlist_info["playlist_type"] != "custom":
# channel details
channel_id = playlist_info["playlist_channel_id"]
es_path = f"ta_channel/_doc/{channel_id}"
channel_info = self.single_lookup(es_path)
return playlist_info, channel_info
def _update_view_data(self, playlist_id, playlist_info):
@ -793,6 +816,7 @@ class PlaylistView(ArchivistResultsView):
{
"title": "Playlists",
"subscribe_form": SubscribeToPlaylistForm(),
"create_form": CreatePlaylistForm(),
}
)
@ -824,14 +848,22 @@ class PlaylistView(ArchivistResultsView):
}
}
@method_decorator(user_passes_test(check_admin), name="dispatch")
@staticmethod
def post(request):
"""handle post from search form"""
subscribe_form = SubscribeToPlaylistForm(data=request.POST)
if subscribe_form.is_valid():
url_str = request.POST.get("subscribe")
print(url_str)
subscribe_to.delay(url_str, expected_type="playlist")
"""handle post from subscribe or create form"""
if request.POST.get("create") is not None:
create_form = CreatePlaylistForm(data=request.POST)
if create_form.is_valid():
name = request.POST.get("create")
playlist_id = f"TA_playlist_{uuid.uuid4()}"
YoutubePlaylist(playlist_id).create(name)
else:
subscribe_form = SubscribeToPlaylistForm(data=request.POST)
if subscribe_form.is_valid():
url_str = request.POST.get("subscribe")
print(url_str)
subscribe_to.delay(url_str, expected_type="playlist")
sleep(1)
return redirect("playlist")
@ -847,6 +879,8 @@ class VideoView(MinView):
config_handler = AppConfig()
response, _ = ElasticWrap(f"ta_video/_doc/{video_id}").get()
video_data = SearchProcess(response).process()
if not video_data:
raise Http404
try:
rating = video_data["stats"]["average_rating"]
@ -870,7 +904,7 @@ class VideoView(MinView):
"video": video_data,
"playlist_nav": playlist_nav,
"title": video_data.get("title"),
"cast": config_handler.config["application"]["enable_cast"],
"cast": EnvironmentSettings.ENABLE_CAST,
"config": config_handler.config,
"position": time_parser(request.GET.get("t")),
"reindex": reindex.get("state"),
@ -973,9 +1007,9 @@ class SettingsUserView(MinView):
config_handler = UserConfig(request.user.id)
if user_form.is_valid():
user_form_post = user_form.cleaned_data
if user_form_post.get("colors"):
if user_form_post.get("stylesheet"):
config_handler.set_value(
"colors", user_form_post.get("colors")
"stylesheet", user_form_post.get("stylesheet")
)
if user_form_post.get("page_size"):
config_handler.set_value(
@ -986,6 +1020,7 @@ class SettingsUserView(MinView):
return redirect("settings_user", permanent=True)
@method_decorator(user_passes_test(check_admin), name="dispatch")
class SettingsApplicationView(MinView):
"""resolves to /settings/application/
handle the settings sub-page for application configuration,
@ -1075,6 +1110,7 @@ class SettingsApplicationView(MinView):
RedisArchivist().set_message(key, message=message, expire=True)
@method_decorator(user_passes_test(check_admin), name="dispatch")
class SettingsSchedulingView(MinView):
"""resolves to /settings/scheduling/
handle the settings sub-page for scheduling settings,
@ -1084,30 +1120,67 @@ class SettingsSchedulingView(MinView):
def get(self, request):
"""read and display current settings"""
context = self.get_min_context(request)
context.update(
{
"title": "Scheduling Settings",
"config": AppConfig().config,
"scheduler_form": SchedulerSettingsForm(),
}
)
context = self.get_context(request, SchedulerSettingsForm())
return render(request, "home/settings_scheduling.html", context)
def post(self, request):
"""handle form post to update settings"""
scheduler_form = SchedulerSettingsForm(request.POST)
notification_form = NotificationSettingsForm(request.POST)
if notification_form.is_valid():
notification_form_post = notification_form.cleaned_data
print(notification_form_post)
if any(notification_form_post.values()):
task_name = notification_form_post.get("task")
url = notification_form_post.get("notification_url")
Notifications(task_name).add_url(url)
if scheduler_form.is_valid():
scheduler_form_post = scheduler_form.cleaned_data
if any(scheduler_form_post.values()):
print(scheduler_form_post)
ScheduleBuilder().update_schedule_conf(scheduler_form_post)
else:
self.fail_message()
context = self.get_context(request, scheduler_form)
return render(request, "home/settings_scheduling.html", context)
sleep(1)
return redirect("settings_scheduling", permanent=True)
def get_context(self, request, scheduler_form):
"""get context"""
context = self.get_min_context(request)
all_tasks = CustomPeriodicTask.objects.all()
context.update(
{
"title": "Scheduling Settings",
"scheduler_form": scheduler_form,
"notification_form": NotificationSettingsForm(),
"notifications": get_all_notifications(),
}
)
for task in all_tasks:
context.update({task.name: task})
return context
@staticmethod
def fail_message():
"""send failure message"""
mess_dict = {
"group": "setting:schedule",
"level": "error",
"title": "Scheduler update failed.",
"messages": ["Invalid schedule input"],
"id": "0000",
}
RedisArchivist().set_message("message:setting", mess_dict, expire=True)
@method_decorator(user_passes_test(check_admin), name="dispatch")
class SettingsActionsView(MinView):
"""resolves to /settings/actions/
handle the settings actions sub-page
@ -1124,16 +1197,3 @@ class SettingsActionsView(MinView):
)
return render(request, "home/settings_actions.html", context)
def process(request):
"""handle all the buttons calls via POST ajax"""
if request.method == "POST":
current_user = request.user.id
post_dict = json.loads(request.body.decode())
post_handler = PostData(post_dict, current_user)
if post_handler.to_exec:
task_result = post_handler.run_task()
return JsonResponse(task_result)
return JsonResponse({"success": False})

View File

@ -0,0 +1,8 @@
-r requirements.txt
black
codespell
flake8
isort
pylint
pylint-django
types-requests

View File

@ -1,13 +1,14 @@
apprise==1.5.0
celery==5.3.4
Django==4.2.6
django-auth-ldap==4.6.0
django-cors-headers==4.2.0
djangorestframework==3.14.0
Pillow==10.0.1
redis==5.0.1
apprise==1.8.0
celery==5.4.0
Django==5.0.6
django-auth-ldap==4.8.0
django-celery-beat==2.6.0
django-cors-headers==4.3.1
djangorestframework==3.15.1
Pillow==10.3.0
redis==5.0.4
requests==2.31.0
ryd-client==0.0.6
uWSGI==2.0.22
whitenoise==6.5.0
yt-dlp==2023.10.7
uWSGI==2.0.25.1
whitenoise==6.6.0
yt-dlp @ git+https://github.com/bbilly1/yt-dlp@54b823be28f396608349cca69d52eb4c4b72b8b0

View File

@ -9,4 +9,6 @@
--accent-font-light: #97d4c8;
--img-filter: invert(50%) sepia(9%) saturate(2940%) hue-rotate(122deg) brightness(94%) contrast(90%);
--img-filter-error: invert(16%) sepia(60%) saturate(3717%) hue-rotate(349deg) brightness(86%) contrast(120%);
--banner: url("../img/banner-tube-archivist-dark.png");
--logo: url("../img/logo-tube-archivist-dark.png");
}

View File

@ -9,4 +9,6 @@
--accent-font-light: #35b399;
--img-filter: invert(50%) sepia(9%) saturate(2940%) hue-rotate(122deg) brightness(94%) contrast(90%);
--img-filter-error: invert(16%) sepia(60%) saturate(3717%) hue-rotate(349deg) brightness(86%) contrast(120%);
--banner: url("../img/banner-tube-archivist-light.png");
--logo: url("../img/logo-tube-archivist-light.png");
}

View File

@ -0,0 +1,67 @@
:root {
--main-bg: #000000;
--highlight-bg: #080808;
--highlight-error: #880000;
--highlight-error-light: #aa0000;
--highlight-bg-transparent: #0c0c0caf;
--main-font: #00aa00;
--accent-font-dark: #007700;
--accent-font-light: #00aa00;
--img-filter: brightness(0) saturate(100%) invert(45%) sepia(100%) saturate(3710%) hue-rotate(96deg) brightness(100%) contrast(102%);
--img-filter-error: invert(16%) sepia(60%) saturate(3717%) hue-rotate(349deg) brightness(86%) contrast(120%);
--banner: url("../img/banner-tube-archivist-dark.png");
--logo: url("../img/logo-tube-archivist-dark.png");
--outline: 1px solid green;
--filter: hue-rotate(310deg);
}
.settings-group {
outline: var(--outline);
}
.info-box-item {
outline: var(--outline);
}
.footer {
outline: var(--outline);
}
.top-banner img {
filter: var(--filter);
}
.icon-text {
outline: var(--outline);
}
.video-item {
outline: var(--outline);
}
.channel-banner {
outline: var(--outline);
}
.description-box {
outline: var(--outline);
}
.video-player {
outline: var(--outline);
}
#notification {
outline: var(--outline);
}
textarea {
background-color: var(--highlight-bg);
outline: var(--outline);
color: var(--main-font);
}
input {
background-color: var(--highlight-bg);
color: var(--main-font);
}

View File

@ -0,0 +1,14 @@
:root {
--main-bg: #000000;
--highlight-bg: #0c0c0c;
--highlight-error: #220000;
--highlight-error-light: #330000;
--highlight-bg-transparent: #0c0c0caf;
--main-font: #888888;
--accent-font-dark: #555555;
--accent-font-light: #999999;
--img-filter: invert(50%) sepia(9%) saturate(2940%) hue-rotate(122deg) brightness(94%) contrast(90%);
--img-filter-error: invert(16%) sepia(60%) saturate(3717%) hue-rotate(349deg) brightness(86%) contrast(120%);
--banner: url("../img/banner-tube-archivist-dark.png");
--logo: url("../img/logo-tube-archivist-dark.png");
}

View File

@ -159,18 +159,19 @@ button:hover {
}
.top-banner {
text-align: center;
}
.top-banner img {
width: 100%;
max-width: 700px;
background-image: var(--banner);
background-repeat: no-repeat;
background-size: contain;
height: 10vh;
min-height: 80px;
max-height: 120px;
background-position: center center;
}
.footer {
margin: 0;
padding: 20px 0;
background-color: var(--accent-font-dark);
background-color: var(--highlight-bg);
grid-row-start: 2;
grid-row-end: 3;
}
@ -179,6 +180,10 @@ button:hover {
text-decoration: underline;
}
.footer .boxed-content {
text-align: center;
}
/* toggle on-off */
.toggle {
display: flex;
@ -366,11 +371,23 @@ button:hover {
filter: var(--img-filter);
}
.video-popup-menu {
border-top: 2px solid;
border-color: var(--accent-font-dark);
margin: 5px 0;
padding-top: 10px;
}
#hidden-form {
display: none;
}
#hidden-form button {
#hidden-form2 {
display: none;
margin-top: 10px;
}
#hidden-form button, #hidden-form2 button {
margin-right: 1rem;
}
@ -559,6 +576,12 @@ video:-webkit-full-screen {
margin-right: 10px;
}
.video-popup-menu img.move-video-button {
width: 24px;
cursor: pointer;
filter: var(--img-filter);
}
.video-desc a {
text-decoration: none;
text-align: left;
@ -587,7 +610,13 @@ video:-webkit-full-screen {
align-items: center;
}
.video-desc-details {
display: flex;
justify-content: space-between;
}
.watch-button,
.dot-button,
.close-button {
cursor: pointer;
filter: var(--img-filter);
@ -677,6 +706,19 @@ video:-webkit-full-screen {
width: 100%;
}
.video-popup-menu img {
width: 12px;
cursor: pointer;
filter: var(--img-filter);
}
.video-popup-menu-close-button {
cursor: pointer;
filter: var(--img-filter);
float:right;
}
.description-text {
width: 100%;
}
@ -725,6 +767,7 @@ video:-webkit-full-screen {
max-width: 200px;
max-height: 200px;
margin-bottom: 40px;
content: var(--logo);
}
.login-page form {

Binary file not shown (new image, 24 KiB).

View File

@ -1,71 +1,13 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icon_add.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="0.85859018"
inkscape:cx="-97.380081"
inkscape:cy="261.09215"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1"
showguides="true"
inkscape:guide-bbox="true">
<sodipodi:guide
position="-221.87586,143.2945"
orientation="1,0"
id="guide1072"
inkscape:locked="false" />
</sodipodi:namedview>
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="m 58.600542,170.62113 c -1.62283,1.61686 -2.626579,3.8573 -2.631447,6.33943 l -0.08037,43.5977 -43.597706,-0.0803 c -4.9648228,-0.009 -8.9691711,3.98046 -8.9784054,8.94536 l -0.00482,2.62846 c -0.00925,4.9649 3.9804459,8.96933 8.9452674,8.97832 l 43.597694,0.0805 -0.08027,43.59778 c -0.0093,4.96488 3.980368,8.96922 8.945263,8.9783 l 2.628471,0.005 c 4.964897,0.009 8.969245,-3.98054 8.978406,-8.94536 l 0.08035,-43.59771 43.597715,0.0803 c 4.96484,0.009 8.96917,-3.98046 8.9784,-8.94534 l 0.005,-2.62847 c 0.009,-4.96489 -3.98037,-8.96923 -8.94525,-8.97831 l -43.597784,-0.0805 0.08034,-43.59771 c 0.0093,-4.96481 -3.980379,-8.96923 -8.945267,-8.97831 l -2.628469,-0.005 c -2.482483,-0.005 -4.724106,0.98906 -6.346936,2.60592 z"
id="rect1073"
inkscape:connector-curvature="0" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<sodipodi:namedview bordercolor="#666666" borderopacity="1.0" id="base" inkscape:current-layer="layer1" inkscape:cx="-97.380081" inkscape:cy="261.09215" inkscape:document-units="mm" inkscape:guide-bbox="true" inkscape:pageopacity="0.0" inkscape:pageshadow="2" inkscape:window-height="1017" inkscape:window-maximized="1" inkscape:window-width="1920" inkscape:window-x="-8" inkscape:window-y="-8" inkscape:zoom="0.85859018" pagecolor="#ffffff" showgrid="false" showguides="true" units="px">
<sodipodi:guide id="guide1072" inkscape:locked="false" orientation="1,0" position="-221.87586,143.2945"></sodipodi:guide>
</sodipodi:namedview>
<path d="M431.9,194.4H305.6V68.1c0-30.6-25-55.6-55.6-55.6h0c-30.6,0-55.6,25-55.6,55.6v126.3H68.1c-30.6,0-55.6,25-55.6,55.6v0
c0,30.6,25,55.6,55.6,55.6h126.3v126.3c0,30.6,25,55.6,55.6,55.6h0c30.6,0,55.6-25,55.6-55.6V305.6h126.3c30.6,0,55.6-25,55.6-55.6
v0C487.5,219.4,462.5,194.4,431.9,194.4z"/>
</svg>

Size: 2.9 KiB → 1.6 KiB

View File

@ -0,0 +1,14 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M231.3,382c4.8,5.3,11.6,8.3,18.7,8.3c7.1,0,13.9-3,18.7-8.3l112.1-124.2c6.7-7.4,8.4-18,4.3-27.1
c-4-9.1-13.1-14.9-23-14.9h-52.9V64.9c0-13.9-11.3-25.2-25.2-25.2h-68c-13.9,0-25.2,11.3-25.2,25.2v150.9h-52.9
c-10,0-19,5.9-23,14.9c-4,9.1-2.3,19.7,4.3,27.1L231.3,382z"/>
<path d="M436.1,408.8H63.9c-13.6,0-24.7,11.1-24.7,24.7c0,13.6,11.1,24.7,24.7,24.7h372.2c13.6,0,24.7-11.1,24.7-24.7
C460.8,419.9,449.7,408.8,436.1,408.8z"/>
</g>
</svg>

New file, 1.1 KiB

View File

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<svg viewBox="0 0 500 500" style="enable-background:new 0 0 500 500;" xmlns="http://www.w3.org/2000/svg">
<g style="" transform="matrix(0.9999999999999999, 0, 0, -0.9999999999999999, 0, 497.9000392526395)">
<path d="M 231.3 48 C 236.1 42.7 242.9 39.7 250 39.7 C 257.1 39.7 263.9 42.7 268.7 48 L 380.8 172.2 C 387.5 179.6 389.2 190.2 385.1 199.3 C 381.1 208.4 372 214.2 362.1 214.2 L 309.2 214.2 L 309.2 365.1 C 309.2 379 297.9 390.3 284 390.3 L 216 390.3 C 202.1 390.3 190.8 379 190.8 365.1 L 190.8 214.2 L 137.9 214.2 C 127.9 214.2 118.9 208.3 114.9 199.3 C 110.9 190.2 112.6 179.6 119.2 172.2 L 231.3 48 Z" style=""/>
</g>
</svg>

New file, 678 B

View File

@ -0,0 +1,7 @@
<?xml version="1.0" encoding="utf-8"?>
<svg viewBox="0 0 500 500" style="enable-background:new 0 0 500 500;" xmlns="http://www.w3.org/2000/svg">
<g style="" transform="matrix(0.9999999999999999, 0, 0, -0.9999999999999999, 0, 497.9000392526395)">
<path d="M231.3,382c4.8,5.3,11.6,8.3,18.7,8.3c7.1,0,13.9-3,18.7-8.3l112.1-124.2c6.7-7.4,8.4-18,4.3-27.1 c-4-9.1-13.1-14.9-23-14.9h-52.9V64.9c0-13.9-11.3-25.2-25.2-25.2h-68c-13.9,0-25.2,11.3-25.2,25.2v150.9h-52.9 c-10,0-19,5.9-23,14.9c-4,9.1-2.3,19.7,4.3,27.1L231.3,382z"/>
<path d="M436.1,408.8H63.9c-13.6,0-24.7,11.1-24.7,24.7c0,13.6,11.1,24.7,24.7,24.7h372.2c13.6,0,24.7-11.1,24.7-24.7 C460.8,419.9,449.7,408.8,436.1,408.8z"/>
</g>
</svg>

New file, 698 B

View File

@ -0,0 +1,6 @@
<?xml version="1.0" encoding="utf-8"?>
<svg viewBox="0 0 500 500" style="enable-background:new 0 0 500 500;" xmlns="http://www.w3.org/2000/svg">
<g style="" transform="matrix(0.9999999999999999, 0, 0, -0.9999999999999999, 0, 497.9000392526395)">
<path d="M231.3,382c4.8,5.3,11.6,8.3,18.7,8.3c7.1,0,13.9-3,18.7-8.3l112.1-124.2c6.7-7.4,8.4-18,4.3-27.1 c-4-9.1-13.1-14.9-23-14.9h-52.9V64.9c0-13.9-11.3-25.2-25.2-25.2h-68c-13.9,0-25.2,11.3-25.2,25.2v150.9h-52.9 c-10,0-19,5.9-23,14.9c-4,9.1-2.3,19.7,4.3,27.1L231.3,382z"/>
</g>
</svg>

New file, 538 B

View File

@ -1,63 +1,10 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_close.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.4291625"
inkscape:cx="122.66624"
inkscape:cy="202.38142"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="m 115.5244,167.09017 c -3.04348,0 -6.08905,1.16742 -8.42111,3.49893 L 66.146003,211.54641 25.18874,170.5891 c -4.664143,-4.66413 -12.173961,-4.66413 -16.8382429,0 l -2.4692132,2.46935 c -4.664282,4.66413 -4.664282,12.17408 0,16.83807 l 40.9572491,40.9573 -40.9572491,40.95745 c -4.664282,4.66412 -4.664282,12.17393 0,16.83806 l 2.4692132,2.46935 c 4.6642819,4.66413 12.1740999,4.66413 16.8382429,0 l 40.957263,-40.9573 40.957287,40.9573 c 4.66413,4.66413 12.17393,4.66413 16.8382,0 l 2.46921,-2.46935 c 4.66427,-4.66413 4.66427,-12.17394 0,-16.83806 L 85.453463,230.85382 126.4107,189.89652 c 4.66427,-4.66399 4.66427,-12.17394 0,-16.83807 l -2.46921,-2.46935 c -2.33221,-2.33206 -5.37361,-3.49893 -8.41709,-3.49893 z"
id="rect1073"
inkscape:connector-curvature="0" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<path d="M473.5,392.9L329.1,250.5l144.3-145.6c21.6-21.6,21.6-57,0-78.6l0,0c-21.6-21.6-57-21.6-78.6,0L250.5,171.9L106.3,26.2
c-21.6-21.6-57-21.6-78.6,0l0,0C6,47.9,6,83.2,27.6,104.9l144.2,145.6L26.4,393.9c-21.6,21.6-21.6,57,0,78.6l0,0
c21.6,21.6,57,21.6,78.6,0l145.4-143.4l144.3,142.4c21.6,21.6,57,21.6,78.6,0l0,0C495.1,449.9,495.1,414.6,473.5,392.9z"/>
</svg>

Size: 2.6 KiB → 1.0 KiB

View File

@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?><!-- Uploaded to: SVG Repo, www.svgrepo.com, Generator: SVG Repo Mixer Tools -->
<svg width="800px" height="800px" viewBox="0 0 16 16" xmlns="http://www.w3.org/2000/svg" fill="#000000" class="bi bi-three-dots-vertical">
<path d="M9.5 13a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0zm0-5a1.5 1.5 0 1 1-3 0 1.5 1.5 0 0 1 3 0z"/>
</svg>

New file, 405 B

View File

@ -1,63 +1,14 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Download.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="0.71458126"
inkscape:cx="-376.84122"
inkscape:cy="260.65826"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="M 249.89062 39.6875 L 249.88477 39.693359 C 229.11581 39.693359 212.39453 59.135896 212.39453 83.289062 L 212.39453 289.66016 L 159.20312 236.46875 L 127.63867 268.05469 L 218.74805 359.14258 L 218.73242 359.1582 L 249.93555 390.33789 L 281.52148 358.77344 L 372.36133 267.91211 L 341.1582 236.73047 L 287.37891 290.50977 L 287.37891 83.283203 C 287.37891 59.130339 270.65971 39.6875 249.89062 39.6875 z M 50.566406 407.88867 C 44.284822 407.88867 39.228516 412.94498 39.228516 419.22656 L 39.228516 444.62695 C 39.228516 450.90854 44.284822 455.96484 50.566406 455.96484 L 449.43359 455.96484 C 455.71518 455.96484 460.77148 450.90854 460.77148 444.62695 L 460.77148 419.22656 C 460.77148 412.94498 455.71518 407.88867 449.43359 407.88867 L 50.566406 407.88867 z "
transform="matrix(0.26458394,0,0,0.26458394,0,164.70749)"
id="rect1208" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M231.3,382c4.8,5.3,11.6,8.3,18.7,8.3c7.1,0,13.9-3,18.7-8.3l112.1-124.2c6.7-7.4,8.4-18,4.3-27.1
c-4-9.1-13.1-14.9-23-14.9h-52.9V64.9c0-13.9-11.3-25.2-25.2-25.2h-68c-13.9,0-25.2,11.3-25.2,25.2v150.9h-52.9
c-10,0-19,5.9-23,14.9c-4,9.1-2.3,19.7,4.3,27.1L231.3,382z"/>
<path d="M436.1,408.8H63.9c-13.6,0-24.7,11.1-24.7,24.7c0,13.6,11.1,24.7,24.7,24.7h372.2c13.6,0,24.7-11.1,24.7-24.7
C460.8,419.9,449.7,408.8,436.1,408.8z"/>
</g>
</svg>

Size: 2.7 KiB → 1.1 KiB

View File

@ -1,75 +1,19 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="210mm"
height="210mm"
viewBox="0 0 210 210"
version="1.1"
id="svg1566"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_exit 05.svg">
<defs
id="defs1560" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="0.35355339"
inkscape:cx="963.7258"
inkscape:cy="291.01609"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
inkscape:window-width="1920"
inkscape:window-height="1009"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1563">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-87)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2.35654187;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="M 106.49932,87.901069 C 49.504302,87.900974 3.3006913,134.10459 3.3007713,191.0996 c 0,0.30098 0.003,0.60131 0.005,0.90167 v 0 c -0.003,0.29952 -0.006,0.59901 -0.006,0.89912 -8e-5,56.99502 46.2035307,103.19865 103.1985287,103.19854 23.01714,-0.0773 45.34783,-7.84709 63.44155,-22.07425 0,0 9.01874,-8.71006 2.40579,-16.41737 -6.61297,-7.70731 -19.11222,0.3185 -19.11222,0.3185 -13.60985,9.81394 -29.95596,15.11012 -46.73512,15.14236 -44.275428,0 -80.167758,-35.89234 -80.167758,-80.16778 0,-0.30097 0.003,-0.60148 0.006,-0.90166 h -5.2e-4 c -0.003,-0.29934 -0.006,-0.59901 -0.006,-0.89913 0,-44.27545 35.89234,-80.16777 80.167778,-80.16777 16.77916,0.0322 33.12527,5.32843 46.73512,15.14236 0,0 12.49925,8.02581 19.11222,0.3185 6.61295,-7.70732 -2.4058,-16.41739 -2.4058,-16.41739 C 151.84561,95.74815 129.51494,87.97828 106.4978,87.901069 Z m 54.30959,56.450221 -12.13663,11.69622 20.15864,20.93332 -93.932488,-1.4899 c -9.22763,-0.17349 -16.77655,6.07423 -16.92587,14.00904 l 0.002,0.002 c -0.0149,1.82673 -0.0235,3.40102 0,4.99598 l -0.002,0.002 c 0.14932,7.93483 7.69824,14.18254 16.92587,14.00905 l 93.932488,-1.48991 -20.15864,20.93333 12.13663,11.69622 34.0585,-35.35536 11.82982,-12.29208 h 0.003 l -9.9e-4,-0.002 9.9e-4,-9.9e-4 h -0.003 l -11.82982,-12.29208 z"
id="path1405"
inkscape:connector-curvature="0"
sodipodi:nodetypes="cccccccsccsccsccscccccccccccccccccccccc" />
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2.39729571;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="m 506.57967,92.503023 c -57.98068,-1e-4 -104.98336,47.002567 -104.98326,104.983257 1.9e-4,57.98049 47.00276,104.98284 104.98326,104.98273 23.42489,-0.0758 46.15146,-7.98387 57.83458,-18.08923 11.68313,-10.10537 12.15613,-18.62993 7.38675,-23.04107 v -0.002 c -4.7711,-4.41269 -12.38099,-1.9587 -17.69245,2.25103 -13.83538,9.99805 -30.45915,15.40285 -47.52888,15.4528 -45.04116,0 -81.55421,-36.51305 -81.5542,-81.55419 0,-45.04114 36.51307,-81.5542 81.5542,-81.5542 17.06933,0.0328 33.21884,5.19482 43.16812,12.86758 9.94929,7.67275 17.33418,9.17607 22.1053,4.76338 v -0.002 c 4.77116,-4.41278 5.55882,-12.9887 -0.73482,-18.60197 -18.40654,-14.47308 -41.1234,-22.377337 -64.5386,-22.455877 z m 55.24881,57.426467 -12.34652,11.8985 20.50728,21.29534 -95.55697,-1.51567 c -9.38721,-0.17649 -17.06669,6.17929 -17.21858,14.25133 l 0.003,0.002 c -0.15192,8.07203 7.28245,14.71295 16.66978,14.88953 l 95.22519,1.50947 -21.06332,20.28455 12.04579,12.49846 36.06808,-34.74464 0.005,0.005 12.34654,-11.89954 -12.03701,-12.50724 z m 35.17874,98.71801 0.69918,0.67386 c 0.13539,-0.22412 0.26991,-0.44874 0.4036,-0.67386 z"
id="path1405-6"
inkscape:connector-curvature="0"
sodipodi:nodetypes="ccccccccsczccccccccccccccccccccccc" />
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2.39729571;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="m 740.89945,94.730897 c -57.98068,-9.6e-5 -104.98334,47.002563 -104.98325,104.983253 1.9e-4,57.98049 47.00276,104.98284 104.98325,104.98274 23.42488,-0.0758 46.15145,-7.98387 64.5635,-22.46581 l -17.03461,-16.41553 c -13.83537,9.99805 -30.45916,15.40285 -47.52889,15.4528 -45.04113,0 -81.55419,-36.51306 -81.55419,-81.5542 0,-45.04114 36.51306,-81.55419 81.55419,-81.55419 17.06934,0.0328 33.69814,5.42058 47.54336,15.40423 l 16.99534,-16.3773 c -18.40663,-14.4732 -41.12349,-22.377447 -64.5387,-22.455993 z m 55.24882,57.426473 -12.34653,11.8985 20.50728,21.29534 -95.55696,-1.51567 c -9.38721,-0.17649 -17.06668,6.17928 -17.21858,14.25132 l 0.002,0.002 c -0.1519,8.07203 7.28245,14.71295 16.66978,14.88953 l 95.22519,1.50947 -21.06332,20.28455 12.04578,12.49846 36.06808,-34.74465 0.005,0.005 12.34653,-11.89953 -12.03699,-12.50725 z m 35.17873,98.718 0.69919,0.67386 c 0.13538,-0.22412 0.26991,-0.44874 0.40359,-0.67386 z"
id="path1405-9"
inkscape:connector-curvature="0"
sodipodi:nodetypes="ccccccsccccccccccccccccccccccc" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1566" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 595.3 595.3"
style="enable-background:new 0 0 595.3 595.3;" xml:space="preserve">
<g>
<path d="M99,295.1c0,105.5,75.4,194.1,177.1,209.1c68.9,10.1,128.8-9.3,179.1-57.3c14.4-13.7,30.2-19,49-11.3
c27.8,11.3,35.3,47.2,14.3,68.5c-43.9,44.6-96.7,72.7-158.4,83.6c-65,11.4-127.4,2.9-185.7-27.8c-85.2-44.9-138.5-115-155.9-210.1
C2.5,261.9,23,182,78.9,112.4C129,50.1,194.6,13.3,274.5,4.6c94.2-10.3,175.3,18.6,243,84.9c13.3,13,16.6,31.3,9.4,47.7
c-7.1,16-23.4,26.2-40.9,25.4c-11.6-0.6-21.3-5.5-29.6-13.7c-28.7-28.3-62.6-47.1-102-56.1c-117-26.9-234.8,53.1-252.9,171.9
C99.9,275.4,98.7,286.2,99,295.1z"/>
<path d="M434.1,339.9c-2.6,0-5.2,0-7.8,0c-69,0-138.1,0-207.1,0c-19.1,0-34.3-10.6-40.8-28.2c-6-16.2-1.2-34.6,12.4-46.4
c8-7,17.5-10.2,28.3-10.2c69,0.1,138.1,0,207.1,0c2.6,0,5.2,0,8.7,0c-2.2-4.7-5.9-7-8.7-10.1c-11.1-12.6-15.3-26.9-9.9-43.1
c5.3-15.8,16.6-25.5,32.9-28.6c14.2-2.7,27.1,1.3,37.3,11.4c28,27.6,55.9,55.3,83.4,83.5c16.1,16.5,16.1,42.2,0.1,58.7
c-27.6,28.3-55.6,56.3-83.9,83.8c-12.7,12.3-28.4,15.4-44.9,8.8c-16.7-6.7-25.9-19.8-27.1-37.7c-0.8-12,3.7-22.7,12.1-31.5
c2.8-3,6.1-5.5,9.2-8.3C435,341.3,434.5,340.6,434.1,339.9z"/>
</g>
</svg>

Before: 6.0 KiB | After: 1.7 KiB

@@ -1,17 +1,15 @@
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 26.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:svg="http://www.w3.org/2000/svg" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" inkscape:version="0.92.4 (5da689c313, 2019-01-14)" sodipodi:docname="Icons_seen.svg"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<sodipodi:namedview bordercolor="#666666" borderopacity="1.0" id="base" inkscape:current-layer="layer1" inkscape:cx="84.208758" inkscape:cy="136.94831" inkscape:document-units="mm" inkscape:pageopacity="0.0" inkscape:pageshadow="2" inkscape:window-height="1017" inkscape:window-maximized="1" inkscape:window-width="1920" inkscape:window-x="-8" inkscape:window-y="-8" inkscape:zoom="1.0105705" pagecolor="#ffffff" showgrid="false" units="px">
</sodipodi:namedview>
<g id="layer1" transform="translate(0,-164.70764)" inkscape:groupmode="layer" inkscape:label="Ebene 1">
<path id="path1091" inkscape:connector-curvature="0" d="M250,259.4c-105.2,0.1-201.5,60-250,155.4C48.5,510.1,144.9,569.9,250,570
c105.2-0.1,201.5-60,250-155.4C451.5,319.3,355.1,259.5,250,259.4z M230.7,287.8c-18.7,2.8-36.6,9.8-52.4,20.3
c2.8-0.5,5.7-0.7,8.6-0.7c27.8,0,50.4,22.6,50.4,50.4c0,0,0,0,0,0c0,27.8-22.6,50.4-50.4,50.4c0,0,0,0,0,0
c-27.8,0-50.4-22.6-50.4-50.4c0,0,0,0,0,0c0-1.2,0.1-2.4,0.2-3.6c-10,18.6-15.2,39.4-15.2,60.5c0,62.6,45,116,106.7,126.7
c-78.3-7.6-147.6-55.2-183.9-126.6C81,342.6,151.3,294.7,230.7,287.8L230.7,287.8z M271.8,288c78.3,7.6,147.6,55.2,183.9,126.6
c-36.7,72.1-106.9,120-186.2,127c62.6-9.7,108.9-63.5,108.9-126.9C378.5,352.1,333.5,298.7,271.8,288L271.8,288z"/>
<g>
<path d="M249.8,406.5C142.2,401.9,59.2,355.3,3.6,261.9c-4.7-8-4.8-16.2-0.1-24.2c46.6-79.1,115-127.3,205.9-141.1
c113.5-17.1,224.7,36.5,283.5,134.6c1,1.7,1.9,3.5,3,5.2c5.6,8.8,5.4,17.5,0.1,26.5c-29,50.1-69.2,88.2-121,113.9
c-33.4,16.6-68.7,26.2-106,28.5C262.7,405.7,256.4,406.2,249.8,406.5z M139.6,249.7c0,61.4,49.1,110.5,110.6,110.5
c61.2,0,110.3-49.1,110.4-110.2c0.1-61.5-48.9-110.7-110.4-110.7C188.7,139.2,139.6,188.3,139.6,249.7z"/>
<path d="M174.9,249.8c0-42.1,33.2-75.3,75.4-75.2c41.7,0,75,33.3,75,75.2c0,42.1-33.3,75.3-75.5,75.3
C208.1,324.9,174.9,291.6,174.9,249.8z"/>
</g>
</svg>

Before: 2.0 KiB | After: 1.3 KiB

@@ -1,75 +1,13 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_play.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="3.4343607"
inkscape:cx="51.709313"
inkscape:cy="230.85077"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
sodipodi:type="star"
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
id="path1206"
sodipodi:sides="3"
sodipodi:cx="-142.60593"
sodipodi:cy="272.75879"
sodipodi:r1="30.817924"
sodipodi:r2="15.408962"
sodipodi:arg1="0.52359878"
sodipodi:arg2="1.5707963"
inkscape:flatsided="false"
inkscape:rounded="-3.46945e-18"
inkscape:randomized="0"
d="m -115.91682,288.16775 -26.68911,0 -26.6891,0 13.34455,-23.11344 13.34455,-23.11345 13.34456,23.11345 z"
transform="matrix(0,1.7014617,-1.7014617,0,530.04498,473.492)"
inkscape:transform-center-x="-18.792756" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M175.5,421.6c-4,0-8,0-12.1,0c-13.9-3.4-21-13-24.1-26.3c-1-4.5-1.5-9-1.5-13.6c0-87.7,0-175.4,0-263.1
c0-5.5,0.6-10.9,2.2-16.2c3.7-12.5,11-21.5,24.5-23.6c9-1.4,17.6,0.7,25.5,5c10.9,5.9,21.6,12.1,32.4,18.2
c66.6,38.2,133.2,76.4,199.8,114.6c7.1,4.1,13.7,8.7,18.8,15.4c8.7,11.5,8.7,25,0,36.5c-5,6.6-11.7,11.3-18.8,15.4
c-74,42.5-148.1,84.9-222.1,127.4C192.2,415.7,184.4,419.8,175.5,421.6z"/>
</g>
</svg>

Before

Width:  |  Height:  |  Size: 2.5 KiB

After

Width:  |  Height:  |  Size: 1.1 KiB

View File

@@ -0,0 +1,4 @@
<?xml version="1.0" encoding="utf-8"?>
<svg viewBox="0 0 500 500" style="enable-background:new 0 0 500 500;" xmlns="http://www.w3.org/2000/svg">
<path d="M 408.514 358.563 L 303.835 255.003 L 408.441 149.115 C 424.1 133.406 424.1 107.662 408.441 91.953 C 392.783 76.245 367.12 76.245 351.463 91.953 L 246.857 197.841 L 142.323 91.881 C 126.665 76.172 101.002 76.172 85.344 91.881 C 69.613 107.662 69.613 133.334 85.272 149.115 L 189.805 255.003 L 84.401 359.291 C 68.743 375 68.743 400.744 84.401 416.453 C 100.06 432.161 125.722 432.161 141.381 416.453 L 246.784 312.165 L 351.39 415.726 C 367.048 431.434 392.711 431.434 408.369 415.726 C 424.172 400.017 424.172 374.345 408.514 358.563 Z" style="" transform="matrix(0.9999999999999999, 0, 0, 0.9999999999999999, 0, 0)"/>
</svg>

After: 782 B (new file)

@@ -1,63 +1,23 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Rescan Subscriptions.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="0.85859018"
inkscape:cx="125.63373"
inkscape:cy="156.7016"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="M 240.35352 18.810547 L 240.35352 29.246094 A 221.29301 221.29301 0 0 0 64.912109 129.45898 L 95.701172 150.14453 A 184.02261 184.02261 0 0 1 240.35352 66.369141 L 240.35352 87.705078 L 270.18359 70.480469 L 300.01562 53.257812 L 270.18359 36.035156 L 240.35352 18.810547 z M 312.28516 37.798828 L 301.08398 73.353516 A 184.02261 184.02261 0 0 1 422.6875 187.33594 L 406.34375 192.03125 L 431.13281 215.94922 L 455.92188 239.86719 L 464.24219 206.44141 L 472.56055 173.01367 L 458.55078 177.03711 A 221.29301 221.29301 0 0 0 312.28516 37.798828 z M 73.375 144.88281 L 48.585938 168.80078 L 23.796875 192.71875 L 35.617188 196.11328 A 221.29301 221.29301 0 0 0 28.707031 250 A 221.29301 221.29301 0 0 0 77.992188 388.9375 L 107.99805 366.83789 A 184.02261 184.02261 0 0 1 65.976562 250 A 184.02261 184.02261 0 0 1 71.474609 206.41211 L 90.013672 211.73633 L 81.693359 178.30859 L 73.375 144.88281 z M 433.95312 248.4375 A 184.02261 184.02261 0 0 1 434.02344 250 A 184.02261 184.02261 0 0 1 367.40625 391.49219 L 358.92773 380.14844 L 345.34375 411.80469 L 331.75977 443.45898 L 365.9668 439.39648 L 400.17188 435.33203 L 389.71484 421.33984 A 221.29301 221.29301 0 0 0 471.29297 250 A 221.29301 221.29301 0 0 0 471.23047 248.77539 L 433.95312 248.4375 z M 94.9375 385.80273 L 108.97852 417.25781 L 123.02148 448.71289 L 131.60156 436.87891 A 221.29301 221.29301 0 0 0 250 471.29297 A 221.29301 221.29301 0 0 0 328.64648 456.57227 L 315.99023 421.74805 A 184.02261 184.02261 0 0 1 250 434.02344 A 184.02261 184.02261 0 0 1 153.63867 406.48047 L 163.45898 392.93555 L 129.19727 389.36914 L 94.9375 385.80273 z "
transform="matrix(0.26458394,0,0,0.26458394,0,164.70749)"
id="path826" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<sodipodi:namedview bordercolor="#666666" borderopacity="1.0" id="base" inkscape:current-layer="layer1" inkscape:cx="125.63373" inkscape:cy="156.7016" inkscape:document-units="mm" inkscape:pageopacity="0.0" inkscape:pageshadow="2" inkscape:window-height="1017" inkscape:window-maximized="1" inkscape:window-width="1920" inkscape:window-x="-8" inkscape:window-y="-8" inkscape:zoom="0.85859018" pagecolor="#ffffff" showgrid="false" units="px">
</sodipodi:namedview>
<g>
<path d="M269,28.7c10.8,1.6,21.6,2.8,32.3,5.3c38.3,9.2,71.9,27.2,100.7,54.1c3.4,3.2,4.7,2.4,6.8-1.1c3.3-5.5,7.1-10.9,10.7-16.2
c4.8-7,11.3-9.7,18.5-7.6c6.8,1.9,11.5,8.2,11.5,16.5c-0.1,26.3,0.3,52.7-0.5,79c-0.4,12.5-0.1,25-0.2,37.6
c-0.1,14-11.1,21.3-23.9,15.6c-35.2-15.7-70.3-31.6-105.5-47.4c-7.7-3.5-11.8-10.1-10.8-16.9c1.1-7.8,6.9-13.2,15.6-14.4
c4-0.6,8-1.2,11.9-1.8c0.9-0.1,2.3,0.2,2.6-1.1c0.3-1.2-1-1.6-1.8-2.2c-12.2-8.9-25.5-15.7-39.8-20.3
c-72.7-23.2-148.9,9.9-181.6,78.9c-4.1,8.7-8.3,17-17.1,22.1c-13.7,8-30.1,6.6-42.2-3.8c-11.3-9.6-15.6-26.6-9.6-40.5
c15.7-36.9,39.4-67.7,71.6-91.8c29.8-22.4,63.2-36.3,100-41.9c4.8-0.7,9.7-1.3,14.5-2C244.8,28.7,256.9,28.7,269,28.7z"/>
<path d="M52.1,361.3c0.6-20,0.8-39,0.8-58c0-14,11.1-21.2,23.9-15.5c25.2,11.2,50.4,22.6,75.6,33.9c10.1,4.5,20.2,9,30.3,13.7
c6.4,2.9,10.5,7.8,10.5,15c0,6.4-3.2,11.2-9,14.2c-3.8,2-8.1,2-12.1,2.8c-2.1,0.4-4.3,0.6-6.4,1c-0.9,0.1-2.3-0.2-2.6,1.1
c-0.3,1.2,1,1.6,1.8,2.2c12.2,8.9,25.5,15.7,39.8,20.3c72.7,23.2,148.9-9.9,181.6-78.9c4.2-8.8,8.5-17.3,17.4-22.4
c13.6-7.7,29.8-6.3,41.8,4c11.3,9.6,15.6,26.6,9.7,40.5c-15.7,36.9-39.4,67.7-71.6,91.8c-29.7,22.3-62.9,36.5-99.6,41.8
c-57.8,8.3-110.9-4-159.1-36.9c-9.3-6.3-17.9-13.5-26.1-21.2c-2.1-2-3.3-2.1-4.9,0.5c-3.4,5.5-7.1,10.8-10.7,16.2
c-5.7,8.5-12.2,11.4-19.9,8.9c-7.3-2.3-11.1-8.6-11.1-18.5C52.1,398.6,52.1,379.4,52.1,361.3z"/>
</g>
</svg>

Before: 3.5 KiB | After: 2.5 KiB

@@ -1,64 +1,15 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_Lupe.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.0105705"
inkscape:cx="-23.955221"
inkscape:cy="207.58555"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
style="opacity:1;fill:#002130;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="m 15.593586,179.85791 c 18.094683,-18.09394 47.431291,-18.09355 65.52552,7.7e-4 16.496472,16.53809 18.102694,42.7696 3.747207,61.19678 17.821687,15.21024 24.490877,18.93088 42.171347,34.20969 4.41254,4.71717 4.29213,12.08299 -0.27223,16.65347 -4.57089,4.56126 -11.93421,4.68001 -16.65007,0.26874 l -8.4e-4,9.1e-4 -0.0173,-0.0152 C 94.335453,273.50898 90.211408,265.24542 76.792795,249.15281 58.360539,263.5048 32.12693,261.89037 15.593512,245.38662 c -18.0947127,-18.09455 -18.0947127,-47.43175 0,-65.52636 z m 7.908493,7.90846 c -13.7275962,13.72683 -13.7279739,35.98299 -8.23e-4,49.71019 13.72719,13.72714 35.983444,13.72678 49.710173,-8.2e-4 13.726834,-13.72692 13.726834,-35.98245 0,-49.70937 -13.726863,-13.72676 -35.982478,-13.72676 -49.70935,0 z"
id="rect826"
inkscape:connector-curvature="0"
sodipodi:nodetypes="ccccccccccccccccc" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M492.4,462.2c-1,2.5-1.8,5.1-3,7.6C478,495,448.2,501.9,427.2,484c-1.8-1.5-3.5-3.2-5.2-4.9
c-34.1-34.8-68.3-69.5-102.3-104.4c-3.3-3.4-5.6-3.6-9.5-1c-30.5,20.3-64.1,31.1-100.6,32c-60.1,1.5-110.4-20.8-151.1-65.6
c-28.2-31-45.1-67.8-49.5-109.8C1.6,161.2,23.6,102.8,74.3,56.1c28.7-26.4,62.7-42.4,101-47.9c52.7-7.6,101,4.5,143.9,36.5
c43.7,32.6,70,76.6,78.4,131c7.1,45.8-1.1,89.2-23.7,129.6c-3.7,6.6-3.8,6.6,1.5,11.9c33.3,34,66.6,68,100.1,101.9
c7.6,7.7,14.2,15.9,16.9,26.7C494.1,451.5,493.6,456.9,492.4,462.2z M335.2,205.9c0.5-72.8-58.3-134.2-131.9-134
C132.9,72,72,129.3,72,206c0,74.8,58.1,134.2,131.7,134.2C278.8,340.2,335.8,278.1,335.2,205.9z"/>
</g>
</svg>

Before: 2.7 KiB | After: 1.3 KiB

@@ -1,67 +1,21 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_sort.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="0.6482696"
inkscape:cx="214.8721"
inkscape:cy="136.02434"
inkscape:document-units="mm"
inkscape:current-layer="g855"
showgrid="false"
units="px"
inkscape:window-width="957"
inkscape:window-height="893"
inkscape:window-x="941"
inkscape:window-y="13"
inkscape:window-maximized="0" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title></dc:title>
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<g
id="g855"
transform="matrix(1.9016362,0,0,1.9016362,-197.93838,-58.9418)">
<path
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:2.55118108;stroke-linecap:round;stroke-linejoin:round;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
d="M 341.125 82.275391 L 315.50977 106.92773 L 241.84375 177.89258 L 266.21289 203.17969 L 309.82617 161.17969 L 306.72266 379.375 C 306.36114 398.60059 319.37807 414.32761 335.91016 414.63867 L 335.91406 414.63477 C 352.44586 414.94597 366.04647 399.71987 366.4082 380.49414 L 369.5 162.97852 L 411.04297 206.11719 L 436.64062 181.44727 L 365.48242 107.57812 L 365.49609 107.5625 L 341.125 82.275391 z M 175.93359 82.277344 L 175.92969 82.28125 C 159.39782 81.970041 145.79728 97.196144 145.43555 116.42188 L 142.34375 333.9375 L 100.80078 290.79883 L 75.203125 315.46875 L 146.36133 389.33789 L 146.3457 389.35156 L 170.7168 414.63867 L 196.33203 389.98633 L 270 319.02344 L 245.63086 293.73633 L 202.01758 335.73633 L 205.12109 117.54102 C 205.48261 98.315428 192.46568 82.588409 175.93359 82.277344 z "
transform="matrix(0.13913489,0,0,0.13913489,104.08846,117.60887)"
id="path814" />
</g>
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M147.7,416.2c-8.7-3.4-11.9-8.1-11.9-17.5c0-53.5,0-107,0-160.5c0-9.3-4.2-15-12.6-16.7c-1.7-0.4-3.4-0.4-5.2-0.4
c-8.3,0-16.6,0-25,0c-7.3,0-12.9-3-16-9.5c-3.1-6.6-2.4-13,2.3-18.8c21.1-26.4,42.1-52.9,63.2-79.4c5.7-7.2,11.4-14.4,17.1-21.6
c3.2-4.1,7-7.4,12.5-7.9c6.6-0.6,11.4,2.3,15.3,7.3c5.7,7.4,11.5,14.9,17.3,22.3c20.5,26.4,40.9,52.8,61.5,79
c7.2,9.2,4.3,19.8-2,24.9c-3.2,2.6-6.8,3.6-10.8,3.6c-8.6,0-17.3,0-25.9,0c-10.6,0.1-16.2,5.6-16.2,16.2
c0,53.1-0.1,106.1,0.1,159.2c0,9.6-3.4,16.3-12.6,19.7C181.9,416.2,164.8,416.2,147.7,416.2z"/>
</g>
<g>
<path d="M369.5,83.8c8.7,3.4,11.9,8.1,11.9,17.5c0,53.5,0,106.9,0,160.4c0,9.3,4.2,15,12.6,16.7c1.7,0.3,3.4,0.4,5.2,0.4
c8.3,0,16.6,0,25,0c7.3,0,12.9,3,16,9.5c3.1,6.6,2.4,13-2.3,18.8c-21.1,26.4-42.1,52.9-63.2,79.3c-5.7,7.2-11.4,14.4-17.1,21.6
c-3.2,4.1-6.9,7.4-12.5,7.9c-6.6,0.6-11.4-2.3-15.3-7.3c-5.7-7.4-11.5-14.8-17.3-22.3c-20.5-26.4-40.8-52.8-61.5-79
c-7.2-9.2-4.3-19.8,2-24.9c3.2-2.6,6.8-3.6,10.8-3.6c8.6,0,17.3,0,25.9,0c10.6-0.1,16.2-5.6,16.2-16.2c0-53,0.1-106.1-0.1-159.1
c0-9.6,3.4-16.3,12.6-19.7C335.4,83.8,352.5,83.8,369.5,83.8z"/>
</g>
</svg>

Before: 2.9 KiB | After: 1.7 KiB

@@ -1,65 +1,25 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="1.1.1 (3bf5ae0d25, 2021-09-20)"
sodipodi:docname="icon-star-empty.svg"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="2.4005076"
inkscape:cx="40.40812"
inkscape:cy="211.20533"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="3840"
inkscape:window-height="2112"
inkscape:window-x="0"
inkscape:window-y="48"
inkscape:window-maximized="1"
inkscape:pagecheckerboard="0" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
id="path1255"
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
inkscape:transform-center-y="-6.4703827"
d="M 250 21.158203 L 189.78906 194.37891 L 6.4375 198.11523 L 152.57617 308.9082 L 99.470703 484.43945 L 250 379.69141 L 400.5293 484.43945 L 347.42383 308.9082 L 493.5625 198.11523 L 310.21094 194.37891 L 250 21.158203 z M 250 96.408203 L 291.69141 216.34766 L 418.64453 218.93555 L 317.45703 295.65039 L 354.22852 417.18945 L 250 344.66211 L 145.77148 417.18945 L 182.54297 295.65039 L 81.355469 218.93555 L 208.30859 216.34766 L 250 96.408203 z "
transform="matrix(0.26458394,0,0,0.26458394,0,164.70749)" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M492.6,214.8c-2.7,7.3-7.9,12.7-13.8,17.7c-32.6,28.4-65.1,57-97.7,85.5c-2.4,2.1-2.4,4.1-1.8,6.7
c9.9,43.5,19.8,87.1,29.7,130.6c3.8,16.8-4.6,30.8-20.6,34.1c-7.2,1.5-13.7-0.8-19.8-4.4c-37.7-22.5-75.4-45-113-67.7
c-4.2-2.6-7.2-2.5-11.4,0.1c-37.7,22.8-75.6,45.3-113.4,67.9c-17.6,10.5-37.4,1.6-40.9-18.2c-0.7-3.8,0-7.5,0.8-11.2
c9.8-43.4,19.7-86.8,29.6-130.2c0.8-3.7,0.4-5.9-2.6-8.5c-33.2-28.9-66.3-58.1-99.5-87c-4.9-4.3-8.5-9.3-10.9-15.2
c-1-4.4-1.4-8.8,0-13.3c4.9-13.7,14.9-19.5,29.3-20.4c22.5-1.4,44.9-4,67.4-6.1c19-1.7,38-3.6,57-5c6.3-0.5,10.3-2.2,13-8.8
c16.4-39.3,33.3-78.3,50-117.4c2.6-6.2,5.7-12,11.5-15.8c13.6-8.9,31.4-4.1,38.3,11.1c8.8,19.4,17,39.1,25.3,58.7
c9.4,22.1,18.9,44.1,28.2,66.2c1.4,3.3,3.3,4.9,6.9,5.2c15.9,1.3,31.7,2.8,47.6,4.3c23.1,2.1,46.2,4.2,69.2,6.4
c4.2,0.4,8.5,1.1,12.7,1.1c14.3,0.2,23.9,6.8,28.8,20.3C494,206,493.8,210.4,492.6,214.8z M50.8,211.9c-1.9,0.3-2.5,2.7-1.1,3.9
c0.1,0.1,0.2,0.2,0.3,0.3c16.9,14.8,33.7,29.6,50.6,44.4c14.2,12.5,28.5,25,42.7,37.5c8.8,7.7,11.7,17.1,9,28.6
c-8.4,36.3-16.6,72.6-24.8,108.9c-0.9,4.1-2.1,8.1-2.8,12.2c-0.3,2.1,2.1,3.6,3.8,2.4c0.4-0.3,0.8-0.5,1.1-0.8
c34.9-20.8,69.7-41.6,104.5-62.4c3.7-2.2,7.4-4,11.8-4.7c8.6-1.4,15.6,2.2,22.7,6.4c34.1,20.5,68.3,40.9,102.5,61.3
c1.2,0.7,2.6,2.6,4.1,1.3c1.1-1,0-2.7-0.3-4c-9-39.7-17.7-79.5-27-119.1c-3-12.8,0.1-22.8,9.9-31.3c24.3-21.1,48.5-42.4,72.7-63.6
c6.7-5.8,13.3-11.7,20.2-17.8c1.4-1.2,0.7-3.6-1.2-3.8c0,0,0,0,0,0c-17.9-1.6-35.8-3.3-53.7-4.8c-22.6-2-45.2-4.2-67.9-5.9
c-14.2-1.1-22.9-8-28.4-21.2c-15.4-37.2-31.5-74.1-47.3-111.1c-0.5-1.2-0.6-3.2-2.3-3.2s-1.8,2-2.3,3.2c-7.7,18-15.4,36-23.1,54
c-8.6,20.2-17.2,40.4-25.9,60.6c-4,9.2-10.6,15.6-20.7,16.7c-20.2,2.3-40.5,4-60.8,5.8c-19,1.7-38,3.2-57,5.1
C57.2,211,54.2,211.3,50.8,211.9z"/>
</g>
</svg>

Before: 2.5 KiB | After: 2.4 KiB

@@ -1,74 +1,19 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
xmlns:dc="http://purl.org/dc/elements/1.1/"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns="http://www.w3.org/2000/svg"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="0.92.4 (5da689c313, 2019-01-14)"
sodipodi:docname="Icons_star.svg">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.0105705"
inkscape:cx="61.891881"
inkscape:cy="148.25167"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="1920"
inkscape:window-height="1017"
inkscape:window-x="-8"
inkscape:window-y="-8"
inkscape:window-maximized="1" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
sodipodi:type="star"
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
id="path1255"
sodipodi:sides="5"
sodipodi:cx="66.145981"
sodipodi:cy="237.32387"
sodipodi:r1="67.758904"
sodipodi:r2="27.103561"
sodipodi:arg1="-1.5707963"
sodipodi:arg2="-0.9424778"
inkscape:flatsided="false"
inkscape:rounded="-3.46945e-18"
inkscape:randomized="0"
d="m 66.145983,169.56496 15.931071,45.83167 48.511476,0.98859 L 91.923,245.69933 105.97366,292.14197 66.145981,264.42743 26.318295,292.14197 40.368962,245.69933 1.7034347,216.38521 50.214907,215.39663 Z"
inkscape:transform-center-y="-6.4703827" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:cc="http://creativecommons.org/ns#" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:svg="http://www.w3.org/2000/svg" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<sodipodi:namedview bordercolor="#666666" borderopacity="1.0" id="base" inkscape:current-layer="layer1" inkscape:cx="61.891881" inkscape:cy="148.25167" inkscape:document-units="mm" inkscape:pageopacity="0.0" inkscape:pageshadow="2" inkscape:window-height="1017" inkscape:window-maximized="1" inkscape:window-width="1920" inkscape:window-x="-8" inkscape:window-y="-8" inkscape:zoom="1.0105705" pagecolor="#ffffff" showgrid="false" units="px">
</sodipodi:namedview>
<g>
<path d="M493.2,207.8c-2.7,7.3-7.9,12.7-13.8,17.7c-32.6,28.4-65.1,57-97.7,85.5c-2.4,2.1-2.4,4.1-1.8,6.7
c9.9,43.5,19.8,87.1,29.7,130.6c3.8,16.8-4.6,30.8-20.6,34.1c-7.2,1.5-13.7-0.8-19.8-4.4c-37.7-22.5-75.4-45-113-67.7
c-4.2-2.6-7.2-2.5-11.4,0.1c-37.7,22.8-75.6,45.3-113.4,67.9c-17.6,10.5-37.4,1.6-40.9-18.2c-0.7-3.8,0-7.5,0.8-11.2
c9.8-43.4,19.7-86.8,29.6-130.2c0.8-3.7,0.4-5.9-2.6-8.5C85.1,281.1,52,252,18.8,223c-4.9-4.3-8.5-9.3-10.9-15.2
c-1-4.4-1.4-8.8,0-13.3c4.9-13.7,14.9-19.5,29.3-20.4c22.5-1.4,44.9-4,67.4-6.1c19-1.7,38-3.6,57-5c6.3-0.5,10.3-2.2,13-8.8
c16.3-39.2,33.2-78.2,49.8-117.2c2.6-6.2,5.7-12,11.5-15.8c13.6-8.9,31.4-4.1,38.3,11.1c8.8,19.4,17,39.1,25.3,58.7
c9.4,22.1,18.9,44.1,28.2,66.2c1.4,3.3,3.3,4.9,6.9,5.2c15.9,1.3,31.7,2.8,47.6,4.3c23.1,2.1,46.2,4.2,69.2,6.4
c4.2,0.4,8.5,1.1,12.7,1.1c14.3,0.2,23.9,6.8,28.8,20.3C494.5,199,494.3,203.4,493.2,207.8z"/>
</g>
</svg>

Before: 2.5 KiB | After: 2.0 KiB

@@ -1,66 +1,22 @@
<?xml version="1.0" encoding="UTF-8" standalone="no"?>
<!-- Created with Inkscape (http://www.inkscape.org/) -->
<svg
width="500"
height="500"
viewBox="0 0 132.29197 132.29167"
version="1.1"
id="svg1303"
inkscape:version="1.1.1 (3bf5ae0d25, 2021-09-20)"
sodipodi:docname="icon-star-half.svg"
xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape"
xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd"
xmlns="http://www.w3.org/2000/svg"
xmlns:svg="http://www.w3.org/2000/svg"
xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
xmlns:cc="http://creativecommons.org/ns#"
xmlns:dc="http://purl.org/dc/elements/1.1/">
<defs
id="defs1297" />
<sodipodi:namedview
id="base"
pagecolor="#ffffff"
bordercolor="#666666"
borderopacity="1.0"
inkscape:pageopacity="0.0"
inkscape:pageshadow="2"
inkscape:zoom="1.0105705"
inkscape:cx="44.034533"
inkscape:cy="168.22181"
inkscape:document-units="mm"
inkscape:current-layer="layer1"
showgrid="false"
units="px"
inkscape:window-width="3840"
inkscape:window-height="2078"
inkscape:window-x="0"
inkscape:window-y="82"
inkscape:window-maximized="1"
inkscape:pagecheckerboard="0" />
<metadata
id="metadata1300">
<rdf:RDF>
<cc:Work
rdf:about="">
<dc:format>image/svg+xml</dc:format>
<dc:type
rdf:resource="http://purl.org/dc/dcmitype/StillImage" />
<dc:title />
</cc:Work>
</rdf:RDF>
</metadata>
<g
inkscape:label="Ebene 1"
inkscape:groupmode="layer"
id="layer1"
transform="translate(0,-164.70764)">
<path
id="path1255"
style="opacity:1;fill:#000000;fill-opacity:1;stroke:none;stroke-width:0;stroke-linecap:round;stroke-linejoin:bevel;stroke-miterlimit:4;stroke-dasharray:none;stroke-dashoffset:0;stroke-opacity:1;paint-order:markers fill stroke"
inkscape:transform-center-x="1.1220981e-06"
inkscape:transform-center-y="-6.4703827"
d="M 250 21.158203 L 189.78906 194.37891 L 6.4375 198.11523 L 152.57617 308.9082 L 99.470703 484.43945 L 250 379.69141 L 400.5293 484.43945 L 347.42383 308.9082 L 493.5625 198.11523 L 310.21094 194.37891 L 250 21.158203 z M 250 96.408203 L 291.69141 216.34766 L 418.64453 218.93555 L 317.45703 295.65039 L 354.22852 417.18945 L 250 344.66211 L 250 96.408203 z "
transform="matrix(0.26458394,0,0,0.26458394,0,164.70749)" />
</g>
<?xml version="1.0" encoding="utf-8"?>
<!-- Generator: Adobe Illustrator 27.5.0, SVG Export Plug-In . SVG Version: 6.00 Build 0) -->
<svg version="1.1"
id="svg1303" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:sodipodi="http://sodipodi.sourceforge.net/DTD/sodipodi-0.dtd" xmlns:svg="http://www.w3.org/2000/svg" xmlns:inkscape="http://www.inkscape.org/namespaces/inkscape" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:cc="http://creativecommons.org/ns#"
xmlns="http://www.w3.org/2000/svg" xmlns:xlink="http://www.w3.org/1999/xlink" x="0px" y="0px" viewBox="0 0 500 500"
style="enable-background:new 0 0 500 500;" xml:space="preserve">
<g>
<path d="M7.5,207.8c2.8,7.3,8,12.6,13.8,17.7C54,253.9,86.4,282.6,119,311c2.4,2.1,2.4,4.1,1.8,6.7c-9.9,43.5-19.8,87.1-29.7,130.6
c-3.9,16.9,4.6,30.9,20.6,34.1c7.2,1.5,13.7-0.7,19.8-4.4c37.7-22.6,75.4-45,112.9-68c1.9-1.3,3.7-1.9,5.4-1.9v0.3
c1.9,0,3.7,0.6,5.8,1.9c37.7,22.8,75.5,45.4,113.4,67.9c17.6,10.5,37.4,1.6,40.9-18.2c0.7-3.8,0-7.5-0.8-11.2
c-9.8-43.4-19.6-86.8-29.6-130.2c-0.9-3.6-0.4-5.9,2.6-8.5c33.3-28.9,66.3-58,99.5-87c4.9-4.3,8.5-9.3,10.9-15.2
c1-4.5,1.4-8.9,0-13.3c-5-13.7-14.9-19.5-29.3-20.4c-22.5-1.4-44.9-4.1-67.4-6.1c-19-1.8-38-3.6-57-5c-6.3-0.5-10.2-2.2-13-8.8
c-16.2-39.2-33.2-78.1-49.8-117.2c-2.6-6.2-5.7-12-11.5-15.8c-4.6-3-9.5-4.5-14.5-4.5v-0.1c-9.8,0-19.2,5.5-23.8,15.6
c-8.8,19.4-16.9,39.1-25.3,58.7c-9.5,22-18.9,44.1-28.2,66.2c-1.4,3.3-3.3,4.9-6.9,5.2c-15.9,1.3-31.7,2.8-47.6,4.3
c-23,2.2-46.1,4.3-69.2,6.4c-4.2,0.4-8.5,1-12.7,1.1C22,174.4,12.4,181,7.5,194.5C6.1,199,6.3,203.4,7.5,207.8z M250,374.9V58.2
c1.6,0.1,1.7,2,2.2,3.2c7.7,18,15.4,36,23.1,54c8.7,20.2,17.3,40.4,25.9,60.6c3.9,9.2,10.6,15.6,20.7,16.7
c20.3,2.3,40.5,4,60.8,5.8c19,1.7,38,3.1,57,5.1l6,0.6c3.3,0.3,4.6,4.3,2.1,6.5c-15.6,13.8-33.6,29.7-48.4,42.7
c-14.2,12.5-28.5,25-42.7,37.5c-8.7,7.7-11.6,17.2-9,28.6c8.4,36.2,16.6,72.6,24.8,108.9c0.9,4,2.1,8,2.8,12.1
c0.3,2.2-2.1,3.7-3.9,2.5c-0.4-0.3-0.7-0.5-1.1-0.7c-34.8-20.8-69.7-41.6-104.5-62.4c-3.7-2.2-7.5-4-11.8-4.7
C252.8,375,251.6,374.9,250,374.9L250,374.9z"/>
</g>
</svg>

Before: 2.4 KiB | After: 2.1 KiB

Some files were not shown because too many files have changed in this diff.