diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md
index f2538fb..2a67c68 100644
--- a/CONTRIBUTING.md
+++ b/CONTRIBUTING.md
@@ -19,7 +19,7 @@ I have learned the hard way, that working on a dockerized application outside of
 This is my setup I have landed on, YMMV:
 
 - Clone the repo, work on it with your favorite code editor in your local filesystem. *testing* branch is the where all the changes are happening, might be unstable and is WIP.
-- Then I have a VM on KVM hypervisor running standard Ubuntu Server LTS with docker installed. The VM keeps my projects separate and offers convenient snapshot functionality. The VM also offers ways to simulate lowend environments by limiting CPU cores and memory. But you could also just run docker on your host system.
+- Then I have a VM on KVM hypervisor running standard Ubuntu Server LTS with docker installed. The VM keeps my projects separate and offers convenient snapshot functionality. The VM also offers ways to simulate lowend environments by limiting CPU cores and memory. You can use this [Ansible Docker Ubuntu](https://github.com/bbilly1/ansible-playbooks) playbook to get started quickly. But you could also just run docker on your host system.
 - The `Dockerfile` is structured in a way that the actual application code is in the last layer so rebuilding the image with only code changes utilizes the build cache for everything else and will just take a few seconds.
 - Take a look at the `deploy.sh` file. I have my local DNS resolve `tubearchivist.local` to the IP of the VM for convenience. To deploy the latest changes and rebuild the application to the testing VM run:
 ```bash
@@ -32,11 +32,11 @@ This is my setup I have landed on, YMMV:
 
 ## Working with Elasticsearch
 Additionally to the required services as listed in the example docker-compose file, the **Dev Tools** of [Kibana](https://www.elastic.co/guide/en/kibana/current/docker.html) are invaluable for running and testing Elasticsearch queries.
 
-If you want to run queries in on the Elasticsearch container directly from your host with for example `curl` or something like *postman*, you might want to **publish** the port 9200 instead of just **exposing** it.
+If you want to run queries on the Elasticsearch container directly from your host with for example `curl` or something like *postman*, you might want to **publish** the port 9200 instead of just **exposing** it.
 
 ## Implementing a new feature
 
-Do you see anything on the roadmap that you would like to take a closer look at but you are not sure, what's the best way to tackle that? Or anything not on there yet you'd like to implement but are not sure how? Open up an issue and we try to find a solution together.
+Do you see anything on the roadmap that you would like to take a closer look at but you are not sure, what's the best way to tackle that? Or anything not on there yet you'd like to implement but are not sure how? Reach out on Discord and we'll look into it together.
 
 ## Making changes
@@ -53,10 +53,10 @@ If you want to see what's in your container, checkout the matching release tag.
 
 ## Code formatting and linting
 
-To keep things clean and consistent for everybody, there is a github action setup to lint and check the changes. You can test your code locally first if you want. For example if you made changes in the **download** module, run
+To keep things clean and consistent for everybody, there is a github action setup to lint and check the changes. You can test your code locally first if you want. For example if you made changes in the **video** module, run
 
 ```shell
-./deploy.sh validate tubearchivist/home/src/download.py
+./deploy.sh validate tubearchivist/home/src/index/video.py
 ```
 
-to validate your changes. If you omit the path, all the project files will get checked. This is subject to change as the codebase improves.
\ No newline at end of file
+to validate your changes. If you omit the path, all the project files will get checked. This is subject to change as the codebase improves.
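A quick illustration of the `curl` workflow mentioned in the CONTRIBUTING.md hunk above, once port 9200 is **published** to the host instead of only exposed. This is only a sketch: the `elastic` password and the `ta_video` index name are assumptions taken from a typical setup and may differ in yours.

```bash
# assumes the compose file maps "9200:9200" and sets ELASTIC_PASSWORD=verysecret
curl -u elastic:verysecret "http://localhost:9200/_cat/indices?v"

# run a search against the (assumed) ta_video index
curl -u elastic:verysecret -H "Content-Type: application/json" \
  "http://localhost:9200/ta_video/_search?pretty" \
  -d '{"query": {"match_all": {}}, "size": 1}'
```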
diff --git a/README.md b/README.md
index 16a6112..fcb1a86 100644
--- a/README.md
+++ b/README.md
@@ -5,11 +5,12 @@ Tube Archivist has a new home: https://github.com/tubearchivist/tubearchivist
 
 ## Table of contents:
 
-* [Wiki](https://github.com/tubearchivist/tubearchivist/wiki) for a detailed documentation, with [FAQ](https://github.com/tubearchivist/tubearchivist/wiki/FAQ)
+* [Wiki](https://github.com/tubearchivist/tubearchivist/wiki) with [FAQ](https://github.com/tubearchivist/tubearchivist/wiki/FAQ)
 * [Core functionality](#core-functionality)
 * [Screenshots](#screenshots)
 * [Problem Tube Archivist tries to solve](#problem-tube-archivist-tries-to-solve)
 * [Connect](#connect)
+* [Extended Universe](#extended-universe)
 * [Installing and updating](#installing-and-updating)
 * [Getting Started](#getting-started)
 * [Potential pitfalls](#potential-pitfalls)
@@ -52,6 +53,10 @@ Once your YouTube video collection grows, it becomes hard to search and find a s
 
 - [Discord](https://discord.gg/AFwz8nE7BK): Connect with us on our Discord server.
 - [r/TubeArchivist](https://www.reddit.com/r/TubeArchivist/): Join our Subreddit.
 
+## Extended Universe
+- [Browser Extension](https://github.com/tubearchivist/browser-extension) Tube Archivist Companion, for [Firefox](https://addons.mozilla.org/addon/tubearchivist-companion/) and [Chrome](https://chrome.google.com/webstore/detail/tubearchivist-companion/jjnkmicfnfojkkgobdfeieblocadmcie)
+- [Tube Archivist Metrics](https://github.com/tubearchivist/tubearchivist-metrics) to create statistics in Prometheus/OpenMetrics format.
+
 ## Installing and updating
 Take a look at the example `docker-compose.yml` file provided. Use the *latest* or the named semantic version tag. The *unstable* tag is for intermediate testing and as the name implies, is **unstable** and not be used on your main installation but in a [testing environment](CONTRIBUTING.md).
@@ -155,7 +160,6 @@ We have come far, nonetheless we are not short of ideas on how to improve and ex
 - [ ] Podcast mode to serve channel as mp3
 - [ ] Implement [PyFilesystem](https://github.com/PyFilesystem/pyfilesystem2) for flexible video storage
 - [ ] Implement [Apprise](https://github.com/caronc/apprise) for notifications ([#97](https://github.com/tubearchivist/tubearchivist/issues/97))
-- [ ] Add passing browser cookies to yt-dlp ([#199](https://github.com/tubearchivist/tubearchivist/issues/199))
 - [ ] User created playlists, random and repeat controls ([#108](https://github.com/tubearchivist/tubearchivist/issues/108), [#220](https://github.com/tubearchivist/tubearchivist/issues/220))
 - [ ] Auto play or play next link ([#226](https://github.com/tubearchivist/tubearchivist/issues/226))
 - [ ] Show similar videos on video page
@@ -170,6 +174,7 @@ We have come far, nonetheless we are not short of ideas on how to improve and ex
 - [ ] Download video comments
 
 Implemented:
+- [X] Add passing browser cookies to yt-dlp [2022-05-08]
 - [X] Add [SponsorBlock](https://sponsor.ajay.app/) integration [2022-04-16]
 - [X] Implement per channel settings [2022-03-26]
 - [X] Subtitle download & indexing [2022-02-13]
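As a companion to the *Installing and updating* section touched above, a minimal update sketch, assuming the example `docker-compose.yml` is used as-is with the *latest* tag and default service names:

```bash
# pull the image for the tag referenced in docker-compose.yml (e.g. latest)
docker compose pull

# recreate the containers with the freshly pulled image
docker compose up -d
```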
diff --git a/deploy.sh b/deploy.sh
index 92ca8de..08874eb 100755
--- a/deploy.sh
+++ b/deploy.sh
@@ -33,7 +33,7 @@ function sync_blackhole {
         . -e ssh "$host":tubearchivist
 
     echo "$PASS" | ssh "$host" 'sudo -S docker buildx build --platform linux/amd64 -t bbilly1/tubearchivist:latest tubearchivist --load 2>/dev/null'
-    echo "$PASS" | ssh "$host" 'sudo -S docker-compose up -d 2>/dev/null'
+    echo "$PASS" | ssh "$host" 'sudo -S docker compose up -d 2>/dev/null'
 
 }
 
@@ -69,7 +69,7 @@ function sync_test {
     fi
 
     ssh "$host" "docker buildx build --build-arg INSTALL_DEBUG=1 --platform $platform -t bbilly1/tubearchivist:latest tubearchivist --load"
-    ssh "$host" 'docker-compose -f docker/docker-compose.yml up -d'
+    ssh "$host" 'docker compose -f docker/docker-compose.yml up -d'
 
 }
 
diff --git a/docker-compose.yml b/docker-compose.yml
index 89257f1..e5966d9 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -33,7 +33,7 @@ services:
     depends_on:
       - archivist-es
   archivist-es:
-    image: bbilly1/tubearchivist-es # only for amd64, or use official es 7.17.2
+    image: bbilly1/tubearchivist-es # only for amd64, or use official es 7.17.3
     container_name: archivist-es
     restart: always
     environment:
diff --git a/tubearchivist/config/settings.py b/tubearchivist/config/settings.py
index 9a44b06..8787adb 100644
--- a/tubearchivist/config/settings.py
+++ b/tubearchivist/config/settings.py
@@ -163,4 +163,4 @@ CORS_ALLOW_HEADERS = list(default_headers) + [
 
 # TA application settings
 TA_UPSTREAM = "https://github.com/tubearchivist/tubearchivist"
-TA_VERSION = "v0.1.4"
+TA_VERSION = "v0.1.5"
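The `deploy.sh` hunks above swap the standalone `docker-compose` binary for the Compose V2 plugin. A hedged way to check which of the two a target host actually provides before running the script:

```bash
# Compose V2 ships as a docker CLI plugin; this prints a version if it is installed
docker compose version

# legacy standalone binary, still common on older hosts
docker-compose version
```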
"\n".join(bulk_list) _, _ = ElasticWrap("_bulk").post(query_str, ndjson=True) + self.close_config() + def _notify_add(self, idx): """send notification for adding videos to download queue""" progress = f"{idx + 1}/{len(self.missing_videos)}" diff --git a/tubearchivist/home/src/download/yt_cookie.py b/tubearchivist/home/src/download/yt_cookie.py index 3b16ca5..3a7244e 100644 --- a/tubearchivist/home/src/download/yt_cookie.py +++ b/tubearchivist/home/src/download/yt_cookie.py @@ -40,6 +40,9 @@ class CookieHandler: print("no cookie imported") raise FileNotFoundError + if os.path.exists(self.COOKIE_PATH): + return self.COOKIE_PATH + with open(self.COOKIE_PATH, "w", encoding="utf-8") as cookie_file: cookie_file.write(cookie) diff --git a/tubearchivist/home/src/download/yt_dlp_handler.py b/tubearchivist/home/src/download/yt_dlp_handler.py index fd91432..2c24cb0 100644 --- a/tubearchivist/home/src/download/yt_dlp_handler.py +++ b/tubearchivist/home/src/download/yt_dlp_handler.py @@ -41,7 +41,7 @@ class DownloadPostProcess: self.auto_delete_all() self.auto_delete_overwrites() self.validate_playlists() - self.clear_cookie() + self.pending.close_config() def auto_delete_all(self): """handle auto delete""" @@ -141,11 +141,6 @@ class DownloadPostProcess: else: RedisArchivist().set_message("message:download", mess_dict) - def clear_cookie(self): - """hide cookie file""" - if self.download.config["downloads"]["cookie_import"]: - CookieHandler().hide() - class VideoDownloader: """ diff --git a/tubearchivist/home/templates/home/settings.html b/tubearchivist/home/templates/home/settings.html index c529443..d4cd7f5 100644 --- a/tubearchivist/home/templates/home/settings.html +++ b/tubearchivist/home/templates/home/settings.html @@ -118,7 +118,7 @@

Cookie

Import YouTube cookie: {{ config.downloads.cookie_import }}

- Place your cookie file named cookies.google.txt in cache/import before enabling.
+ Place your cookie file named cookies.google.txt in cache/import before enabling. Instructions in the Wiki.
{{ app_form.downloads_cookie_import }}
{% if config.downloads.cookie_import %}
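The settings template hunk above tells users to drop their exported cookie into the cache folder before enabling the import. A minimal sketch of that step, assuming the cache volume from the example `docker-compose.yml` is bind-mounted at `./cache` on the host and the browser export landed in `~/Downloads/cookies.txt`:

```bash
# copy the exported cookie file to the expected name and location
mkdir -p ./cache/import
cp ~/Downloads/cookies.txt ./cache/import/cookies.google.txt
```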