Initial commit
Checks: the "Build and Publish Docker Images" workflow did not complete; the setup job and all four build matrix jobs were cancelled on this push.

Ben Martin committed on 2025-05-10 13:10:49 +01:00
commit 68f6e85c78, signed by ben (GPG key ID: 859A655FCD290E4A)
17 changed files with 1286 additions and 0 deletions

@@ -0,0 +1,54 @@
name: Build and Publish Docker Images

on:
  push:
    branches:
      - main
  pull_request:
    branches:
      - main

jobs:
  setup:
    runs-on: ubuntu-latest
    outputs:
      branch: ${{ steps.get-git-context.outputs.branch }}
      commit: ${{ steps.get-git-context.outputs.commit }}
    steps:
      - name: Checkout code
        uses: actions/checkout@v4
      - name: Set up QEMU
        uses: docker/setup-qemu-action@v3
      - name: Set up Docker Buildx
        uses: docker/setup-buildx-action@v3
      - name: Get Git Context
        id: get-git-context
        uses: actions/github-script@v6
        with:
          script: |
            const branch = github.ref.split('/').pop();
            const commit = github.sha;
            core.setOutput('branch', branch);
            core.setOutput('commit', commit);

  build:
    needs: setup
    runs-on: ubuntu-latest
    strategy:
      matrix:
        dockerfile: ["worker.Dockerfile", "server.Dockerfile"]
        image-name: ["${{ vars.IMAGE_NAME_WORKER }}", "${{ vars.IMAGE_NAME_SERVER }}"]
    steps:
      - name: Build and Push Image
        uses: docker/build-push-action@v6
        with:
          context: .
          file: ${{ matrix.dockerfile }}
          platforms: linux/amd64,linux/arm64
          push: true
          tags: |
            git.brmartin.co.uk/${{ matrix.image-name }}:${{ needs.setup.outputs.branch }}
            git.brmartin.co.uk/${{ matrix.image-name }}:${{ needs.setup.outputs.commit }}

.gitignore (vendored, new file)
@@ -0,0 +1,4 @@
/.venv/
/dist/
/.vscode/
__pycache__/

LICENCE (new file)
@@ -0,0 +1,7 @@
Copyright 2025 Ben Martin

Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the “Software”), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED “AS IS”, WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

README.md (new file)
@@ -0,0 +1,193 @@
# auto-transcoder
**Auto-transcoder** is a Python-based media transcoding application that automatically processes media files when they are added to a specified directory. It leverages Redis for tracking media state and Celery for handling transcoding tasks asynchronously. The application is containerised using Docker for easy deployment and management.
---
## 📦 Features
- **Automatic Transcoding**: Detects new media files in a watched directory and triggers transcoding tasks.
- **Redis Integration**: Tracks media state (e.g., whether a file has been transcoded) using Redis.
- **Celery Worker Support**: Uses Celery to handle transcoding tasks in the background.
- **Docker-Ready**: Comes with Dockerfiles for both the server and worker components.
- **Cross-Platform Compatibility**: Runs on Linux (ARM64 and AMD64 architectures).
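In practice the flow is: the server scans the watched directory, records each file against its inode in Redis, and hands the inode to a Celery task. A simplified sketch of that loop, using the classes from `src/auto_transcoder/` (this is illustrative rather than the exact server code, which also groups hard links and honours the recycle bin):
```python
import asyncio
from pathlib import Path

from auto_transcoder import tasks
from auto_transcoder.model import MediaDAO, MediaDTO, RedisManager


async def enqueue_new_media(directory: Path, redis_url: str) -> None:
    async with RedisManager(redis_url) as manager:
        dao = MediaDAO(manager)
        for path in (p for p in directory.rglob("*") if p.is_file()):
            inode = path.stat().st_ino
            if await dao.is_transcoded(inode):
                continue  # already processed on a previous run
            await dao.set_media(MediaDTO(inode=inode, paths=[path]))
            # A Celery worker picks this up from the Redis broker.
            tasks.transcode_media_task.delay(inode, None)


asyncio.run(enqueue_new_media(Path("/data"), "redis://localhost:6379/0"))
```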
---
## 📦 Requirements
- **Redis Server**: Required for media state tracking.
- **Celery Worker**: Needed to process transcoding tasks.
- **Docker**: For containerisation and deployment.
---
## 📁 Project Structure
Here are a few key files in the project structure:
```
auto-transcoder/
├── server.Dockerfile
├── worker.Dockerfile
├── pyproject.toml
├── src/
│   └── auto_transcoder/
│       ├── model.py
│       ├── tasks.py
│       └── server.py
```
---
## 🚀 Getting Started
### 1. **Build Docker Images**
Run the following commands to build the server and worker Docker images:
```bash
docker build -t auto-transcoder-server -f server.Dockerfile .
docker build -t auto-transcoder-worker -f worker.Dockerfile .
```
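The CI workflow builds multi-architecture images (linux/amd64 and linux/arm64) with Docker Buildx. If you want to reproduce that locally, something along these lines should work; the registry name is a placeholder, and multi-arch images generally need to be pushed to a registry rather than loaded into the local daemon:
```bash
docker buildx create --use   # one-off: create a builder that can target multiple platforms
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -f server.Dockerfile \
  -t registry.example.com/auto-transcoder-server:latest \
  --push .
```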
### 2. **Run Redis Server**
Ensure a Redis server is running. You can use Docker:
```bash
docker run --name redis -d -p 6379:6379 redis
```
### 3. **Run the Server and Worker**
To run the server and worker, use Docker Compose or individual commands. Here's an example using Docker Compose:
```yaml
# docker-compose.yml
services:
  redis:
    image: redis
    ports:
      - "6379:6379"
  server:
    image: auto-transcoder-server
    command:
      - /data
      - --recycle-bin
      - /data/recycle_bin
    ports:
      - "5000:5000"
    volumes:
      - /path/to/media:/data
    depends_on:
      - redis
    environment:
      - REDIS_URL=redis://redis:6379
  worker:
    image: auto-transcoder-worker
    volumes:
      - /path/to/media:/data
    depends_on:
      - redis
    environment:
      - REDIS_URL=redis://redis:6379
```
Start services with:
```bash
docker compose up -d
```
> 🔍 Note: Replace `/path/to/media` with the actual path to the directory you want to watch.
If you need help getting started, a standalone example is available in [docker-compose.example.yml](docker-compose.example.yml).
---
## 📌 Configuration
### Command Line Arguments
- `directory_to_watch`: Path to the directory where media files are stored. This is a required argument.
- `--recycle-bin`: Path to a directory where original media files will be moved after transcoding. If not provided, the original files will be deleted.
### Environment Variables
- `REDIS_URL`: URL to your Redis instance (e.g., `redis://redis:6379`).
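For example, a local (non-Docker) run combining both settings might look like this (the paths are placeholders, and Redis is assumed to be listening on localhost):
```bash
REDIS_URL=redis://localhost:6379/0 \
  poetry run python src/auto_transcoder/server.py /path/to/media --recycle-bin /path/to/media/recycle_bin
```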
---
## 🧪 Usage
1. Place media files in the watched directory.
2. The server will automatically detect new files and add them to the queue.
3. A Celery worker will process the transcoding tasks in the background.
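While this is running, the Flask app on port 5000 can be used to check on things, for example (assuming the Compose setup above, so the server is reachable on localhost):
```bash
curl http://localhost:5000/api/workers   # stats for connected Celery workers
curl http://localhost:5000/api/inodes    # inodes currently tracked in Redis
```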
---
## 📦 Dependencies
This project uses the following dependencies (managed via Poetry):
- `celery[redis]`
- `flask`
- `python-magic`
- `redis`
- `asyncio`
Install them with:
```bash
poetry install
```
---
## 🧱 Development
To set up the development environment:
1. Install [Poetry](https://python-poetry.org/docs/).
2. Run `poetry install` to install dependencies.
3. Run the server with `poetry run python src/auto_transcoder/server.py <directory_to_watch>` (ensure Redis is running).
4. Run the worker with `poetry run celery -A auto_transcoder.tasks worker --loglevel=info`.
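Flower is included as a development dependency, so you can optionally monitor the Celery queue while developing; a typical invocation would be:
```bash
REDIS_URL=redis://localhost:6379/0 poetry run celery -A auto_transcoder.tasks flower
```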
---
## 📝 Notes
- Ensure that both the worker and server instances have access to the media files at the same path.
- If you're using Docker, this can be achieved by mounting the media directory as a volume in your Docker Compose setup.
- The `pyproject.toml` defines package structure and build settings. It currently uses `poetry-core` for building the package.
- The `MediaDTO` class encapsulates media file metadata, and the `RedisManager` class is used to manage Redis connections.
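As a rough illustration of how those two classes fit together (the Redis URL here is just an example):
```python
import asyncio

from auto_transcoder.model import MediaDAO, RedisManager


async def list_tracked_inodes() -> list[int]:
    # RedisManager owns the connection pool; MediaDAO reads and writes
    # MediaDTO records as RedisJSON documents keyed by inode.
    async with RedisManager("redis://localhost:6379/0") as manager:
        return await MediaDAO(manager).get_all_inodes()


print(asyncio.run(list_tracked_inodes()))
```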
---
## 📄 License
This project is licensed under the **MIT License**. See the [LICENCE](LICENCE) file for details.
---
## 📈 Contributing
Contributions are welcome! To contribute:
1. Fork the repository.
2. Create a feature branch.
3. Make your changes.
4. Submit a pull request.
This repository uses British English spelling. Please ensure that your contributions adhere to this standard.
---
## 📚 References
- [Poetry Documentation](https://python-poetry.org/docs/)
- [Docker Documentation](https://docs.docker.com/)
- [Redis Documentation](https://redis.io/docs/)
- [Celery Documentation](https://docs.celeryproject.org/)

docker-compose.example.yml (new file)
@@ -0,0 +1,43 @@
services:
  redis:
    image: redis/redis-stack-server:latest
    ports:
      - "6379:6379"
    volumes:
      - redis-data:/data
    networks:
      - app-network

  worker:
    build:
      context: .
      dockerfile: worker.Dockerfile
    environment:
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - app-network

  server:
    build:
      context: .
      dockerfile: server.Dockerfile
    ports:
      - "5000:5000"
    command:
      - /path/to/media
    environment:
      - REDIS_URL=redis://redis:6379/0
    depends_on:
      - redis
    networks:
      - app-network

volumes:
  redis-data:
    driver: local

networks:
  app-network:
    driver: bridge

poetry.lock (generated, new file)
@@ -0,0 +1,583 @@
# This file is automatically @generated by Poetry 2.1.3 and should not be changed by hand.
[[package]]
name = "amqp"
version = "5.3.1"
description = "Low-level AMQP client for Python (fork of amqplib)."
optional = false
python-versions = ">=3.6"
groups = ["main", "dev"]
files = [
{file = "amqp-5.3.1-py3-none-any.whl", hash = "sha256:43b3319e1b4e7d1251833a93d672b4af1e40f3d632d479b98661a95f117880a2"},
{file = "amqp-5.3.1.tar.gz", hash = "sha256:cddc00c725449522023bad949f70fff7b48f0b1ade74d170a6f10ab044739432"},
]
[package.dependencies]
vine = ">=5.0.0,<6.0.0"
[[package]]
name = "asyncio"
version = "3.4.3"
description = "reference implementation of PEP 3156"
optional = false
python-versions = "*"
groups = ["main"]
files = [
{file = "asyncio-3.4.3-cp33-none-win32.whl", hash = "sha256:b62c9157d36187eca799c378e572c969f0da87cd5fc42ca372d92cdb06e7e1de"},
{file = "asyncio-3.4.3-cp33-none-win_amd64.whl", hash = "sha256:c46a87b48213d7464f22d9a497b9eef8c1928b68320a2fa94240f969f6fec08c"},
{file = "asyncio-3.4.3-py3-none-any.whl", hash = "sha256:c4d18b22701821de07bd6aea8b53d21449ec0ec5680645e5317062ea21817d2d"},
{file = "asyncio-3.4.3.tar.gz", hash = "sha256:83360ff8bc97980e4ff25c964c7bd3923d333d177aa4f7fb736b019f26c7cb41"},
]
[[package]]
name = "billiard"
version = "4.2.1"
description = "Python multiprocessing fork with improvements and bugfixes"
optional = false
python-versions = ">=3.7"
groups = ["main", "dev"]
files = [
{file = "billiard-4.2.1-py3-none-any.whl", hash = "sha256:40b59a4ac8806ba2c2369ea98d876bc6108b051c227baffd928c644d15d8f3cb"},
{file = "billiard-4.2.1.tar.gz", hash = "sha256:12b641b0c539073fc8d3f5b8b7be998956665c4233c7c1fcd66a7e677c4fb36f"},
]
[[package]]
name = "blinker"
version = "1.9.0"
description = "Fast, simple object-to-object and broadcast signaling"
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "blinker-1.9.0-py3-none-any.whl", hash = "sha256:ba0efaa9080b619ff2f3459d1d500c57bddea4a6b424b60a91141db6fd2f08bc"},
{file = "blinker-1.9.0.tar.gz", hash = "sha256:b4ce2265a7abece45e7cc896e98dbebe6cead56bcf805a3d23136d145f5445bf"},
]
[[package]]
name = "celery"
version = "5.5.2"
description = "Distributed Task Queue."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
{file = "celery-5.5.2-py3-none-any.whl", hash = "sha256:54425a067afdc88b57cd8d94ed4af2ffaf13ab8c7680041ac2c4ac44357bdf4c"},
{file = "celery-5.5.2.tar.gz", hash = "sha256:4d6930f354f9d29295425d7a37261245c74a32807c45d764bedc286afd0e724e"},
]
[package.dependencies]
billiard = ">=4.2.1,<5.0"
click = ">=8.1.2,<9.0"
click-didyoumean = ">=0.3.0"
click-plugins = ">=1.1.1"
click-repl = ">=0.2.0"
kombu = ">=5.5.2,<5.6"
python-dateutil = ">=2.8.2"
redis = {version = ">=4.5.2,<4.5.5 || >4.5.5,<6.0.0", optional = true, markers = "extra == \"redis\""}
vine = ">=5.1.0,<6.0"
[package.extras]
arangodb = ["pyArango (>=2.0.2)"]
auth = ["cryptography (==44.0.2)"]
azureblockblob = ["azure-identity (>=1.19.0)", "azure-storage-blob (>=12.15.0)"]
brotli = ["brotli (>=1.0.0) ; platform_python_implementation == \"CPython\"", "brotlipy (>=0.7.0) ; platform_python_implementation == \"PyPy\""]
cassandra = ["cassandra-driver (>=3.25.0,<4)"]
consul = ["python-consul2 (==0.1.5)"]
cosmosdbsql = ["pydocumentdb (==2.3.5)"]
couchbase = ["couchbase (>=3.0.0) ; platform_python_implementation != \"PyPy\" and (platform_system != \"Windows\" or python_version < \"3.10\")"]
couchdb = ["pycouchdb (==1.16.0)"]
django = ["Django (>=2.2.28)"]
dynamodb = ["boto3 (>=1.26.143)"]
elasticsearch = ["elastic-transport (<=8.17.1)", "elasticsearch (<=8.17.2)"]
eventlet = ["eventlet (>=0.32.0) ; python_version < \"3.10\""]
gcs = ["google-cloud-firestore (==2.20.1)", "google-cloud-storage (>=2.10.0)", "grpcio (==1.67.0)"]
gevent = ["gevent (>=1.5.0)"]
librabbitmq = ["librabbitmq (>=2.0.0) ; python_version < \"3.11\""]
memcache = ["pylibmc (==1.6.3) ; platform_system != \"Windows\""]
mongodb = ["pymongo (==4.10.1)"]
msgpack = ["msgpack (==1.1.0)"]
pydantic = ["pydantic (>=2.4)"]
pymemcache = ["python-memcached (>=1.61)"]
pyro = ["pyro4 (==4.82) ; python_version < \"3.11\""]
pytest = ["pytest-celery[all] (>=1.2.0,<1.3.0)"]
redis = ["redis (>=4.5.2,!=4.5.5,<6.0.0)"]
s3 = ["boto3 (>=1.26.143)"]
slmq = ["softlayer_messaging (>=1.0.3)"]
solar = ["ephem (==4.2) ; platform_python_implementation != \"PyPy\""]
sqlalchemy = ["sqlalchemy (>=1.4.48,<2.1)"]
sqs = ["boto3 (>=1.26.143)", "kombu[sqs] (>=5.3.4)", "urllib3 (>=1.26.16)"]
tblib = ["tblib (>=1.3.0) ; python_version < \"3.8.0\"", "tblib (>=1.5.0) ; python_version >= \"3.8.0\""]
yaml = ["PyYAML (>=3.10)"]
zookeeper = ["kazoo (>=1.3.1)"]
zstd = ["zstandard (==0.23.0)"]
[[package]]
name = "click"
version = "8.1.8"
description = "Composable command line interface toolkit"
optional = false
python-versions = ">=3.7"
groups = ["main", "dev"]
files = [
{file = "click-8.1.8-py3-none-any.whl", hash = "sha256:63c132bbbed01578a06712a2d1f497bb62d9c1c0d329b7903a866228027263b2"},
{file = "click-8.1.8.tar.gz", hash = "sha256:ed53c9d8990d83c2a27deae68e4ee337473f6330c040a31d4225c9574d16096a"},
]
[package.dependencies]
colorama = {version = "*", markers = "platform_system == \"Windows\""}
[[package]]
name = "click-didyoumean"
version = "0.3.1"
description = "Enables git-like *did-you-mean* feature in click"
optional = false
python-versions = ">=3.6.2"
groups = ["main", "dev"]
files = [
{file = "click_didyoumean-0.3.1-py3-none-any.whl", hash = "sha256:5c4bb6007cfea5f2fd6583a2fb6701a22a41eb98957e63d0fac41c10e7c3117c"},
{file = "click_didyoumean-0.3.1.tar.gz", hash = "sha256:4f82fdff0dbe64ef8ab2279bd6aa3f6a99c3b28c05aa09cbfc07c9d7fbb5a463"},
]
[package.dependencies]
click = ">=7"
[[package]]
name = "click-plugins"
version = "1.1.1"
description = "An extension module for click to enable registering CLI commands via setuptools entry-points."
optional = false
python-versions = "*"
groups = ["main", "dev"]
files = [
{file = "click-plugins-1.1.1.tar.gz", hash = "sha256:46ab999744a9d831159c3411bb0c79346d94a444df9a3a3742e9ed63645f264b"},
{file = "click_plugins-1.1.1-py2.py3-none-any.whl", hash = "sha256:5d262006d3222f5057fd81e1623d4443e41dcda5dc815c06b442aa3c02889fc8"},
]
[package.dependencies]
click = ">=4.0"
[package.extras]
dev = ["coveralls", "pytest (>=3.6)", "pytest-cov", "wheel"]
[[package]]
name = "click-repl"
version = "0.3.0"
description = "REPL plugin for Click"
optional = false
python-versions = ">=3.6"
groups = ["main", "dev"]
files = [
{file = "click-repl-0.3.0.tar.gz", hash = "sha256:17849c23dba3d667247dc4defe1757fff98694e90fe37474f3feebb69ced26a9"},
{file = "click_repl-0.3.0-py3-none-any.whl", hash = "sha256:fb7e06deb8da8de86180a33a9da97ac316751c094c6899382da7feeeeb51b812"},
]
[package.dependencies]
click = ">=7.0"
prompt-toolkit = ">=3.0.36"
[package.extras]
testing = ["pytest (>=7.2.1)", "pytest-cov (>=4.0.0)", "tox (>=4.4.3)"]
[[package]]
name = "colorama"
version = "0.4.6"
description = "Cross-platform colored terminal text."
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,!=3.3.*,!=3.4.*,!=3.5.*,!=3.6.*,>=2.7"
groups = ["main", "dev"]
markers = "platform_system == \"Windows\""
files = [
{file = "colorama-0.4.6-py2.py3-none-any.whl", hash = "sha256:4f1d9991f5acc0ca119f9d443620b77f9d6b33703e51011c16baf57afb285fc6"},
{file = "colorama-0.4.6.tar.gz", hash = "sha256:08695f5cb7ed6e0531a20572697297273c47b8cae5a63ffc6d6ed5c201be6e44"},
]
[[package]]
name = "flask"
version = "3.1.0"
description = "A simple framework for building complex web applications."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "flask-3.1.0-py3-none-any.whl", hash = "sha256:d667207822eb83f1c4b50949b1623c8fc8d51f2341d65f72e1a1815397551136"},
{file = "flask-3.1.0.tar.gz", hash = "sha256:5f873c5184c897c8d9d1b05df1e3d01b14910ce69607a117bd3277098a5836ac"},
]
[package.dependencies]
blinker = ">=1.9"
click = ">=8.1.3"
itsdangerous = ">=2.2"
Jinja2 = ">=3.1.2"
Werkzeug = ">=3.1"
[package.extras]
async = ["asgiref (>=3.2)"]
dotenv = ["python-dotenv"]
[[package]]
name = "flower"
version = "2.0.1"
description = "Celery Flower"
optional = false
python-versions = ">=3.7"
groups = ["dev"]
files = [
{file = "flower-2.0.1-py2.py3-none-any.whl", hash = "sha256:9db2c621eeefbc844c8dd88be64aef61e84e2deb29b271e02ab2b5b9f01068e2"},
{file = "flower-2.0.1.tar.gz", hash = "sha256:5ab717b979530770c16afb48b50d2a98d23c3e9fe39851dcf6bc4d01845a02a0"},
]
[package.dependencies]
celery = ">=5.0.5"
humanize = "*"
prometheus-client = ">=0.8.0"
pytz = "*"
tornado = ">=5.0.0,<7.0.0"
[[package]]
name = "humanize"
version = "4.12.3"
description = "Python humanize utilities"
optional = false
python-versions = ">=3.9"
groups = ["dev"]
files = [
{file = "humanize-4.12.3-py3-none-any.whl", hash = "sha256:2cbf6370af06568fa6d2da77c86edb7886f3160ecd19ee1ffef07979efc597f6"},
{file = "humanize-4.12.3.tar.gz", hash = "sha256:8430be3a615106fdfceb0b2c1b41c4c98c6b0fc5cc59663a5539b111dd325fb0"},
]
[package.extras]
tests = ["freezegun", "pytest", "pytest-cov"]
[[package]]
name = "itsdangerous"
version = "2.2.0"
description = "Safely pass data to untrusted environments and back."
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "itsdangerous-2.2.0-py3-none-any.whl", hash = "sha256:c6242fc49e35958c8b15141343aa660db5fc54d4f13a1db01a3f5891b98700ef"},
{file = "itsdangerous-2.2.0.tar.gz", hash = "sha256:e0050c0b7da1eea53ffaf149c0cfbb5c6e2e2b69c4bef22c81fa6eb73e5f6173"},
]
[[package]]
name = "jinja2"
version = "3.1.6"
description = "A very fast and expressive template engine."
optional = false
python-versions = ">=3.7"
groups = ["main"]
files = [
{file = "jinja2-3.1.6-py3-none-any.whl", hash = "sha256:85ece4451f492d0c13c5dd7c13a64681a86afae63a5f347908daf103ce6d2f67"},
{file = "jinja2-3.1.6.tar.gz", hash = "sha256:0137fb05990d35f1275a587e9aee6d56da821fc83491a0fb838183be43f66d6d"},
]
[package.dependencies]
MarkupSafe = ">=2.0"
[package.extras]
i18n = ["Babel (>=2.7)"]
[[package]]
name = "kombu"
version = "5.5.3"
description = "Messaging library for Python."
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
{file = "kombu-5.5.3-py3-none-any.whl", hash = "sha256:5b0dbceb4edee50aa464f59469d34b97864be09111338cfb224a10b6a163909b"},
{file = "kombu-5.5.3.tar.gz", hash = "sha256:021a0e11fcfcd9b0260ef1fb64088c0e92beb976eb59c1dfca7ddd4ad4562ea2"},
]
[package.dependencies]
amqp = ">=5.1.1,<6.0.0"
tzdata = {version = ">=2025.2", markers = "python_version >= \"3.9\""}
vine = "5.1.0"
[package.extras]
azureservicebus = ["azure-servicebus (>=7.10.0)"]
azurestoragequeues = ["azure-identity (>=1.12.0)", "azure-storage-queue (>=12.6.0)"]
confluentkafka = ["confluent-kafka (>=2.2.0)"]
consul = ["python-consul2 (==0.1.5)"]
gcpubsub = ["google-cloud-monitoring (>=2.16.0)", "google-cloud-pubsub (>=2.18.4)", "grpcio (==1.67.0)", "protobuf (==4.25.5)"]
librabbitmq = ["librabbitmq (>=2.0.0) ; python_version < \"3.11\""]
mongodb = ["pymongo (>=4.1.1)"]
msgpack = ["msgpack (==1.1.0)"]
pyro = ["pyro4 (==4.82)"]
qpid = ["qpid-python (>=0.26)", "qpid-tools (>=0.26)"]
redis = ["redis (>=4.5.2,!=4.5.5,!=5.0.2,<=5.2.1)"]
slmq = ["softlayer_messaging (>=1.0.3)"]
sqlalchemy = ["sqlalchemy (>=1.4.48,<2.1)"]
sqs = ["boto3 (>=1.26.143)", "urllib3 (>=1.26.16)"]
yaml = ["PyYAML (>=3.10)"]
zookeeper = ["kazoo (>=2.8.0)"]
[[package]]
name = "markupsafe"
version = "3.0.2"
description = "Safely add untrusted strings to HTML/XML markup."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "MarkupSafe-3.0.2-cp310-cp310-macosx_10_9_universal2.whl", hash = "sha256:7e94c425039cde14257288fd61dcfb01963e658efbc0ff54f5306b06054700f8"},
{file = "MarkupSafe-3.0.2-cp310-cp310-macosx_11_0_arm64.whl", hash = "sha256:9e2d922824181480953426608b81967de705c3cef4d1af983af849d7bd619158"},
{file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:38a9ef736c01fccdd6600705b09dc574584b89bea478200c5fbf112a6b0d5579"},
{file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bbcb445fa71794da8f178f0f6d66789a28d7319071af7a496d4d507ed566270d"},
{file = "MarkupSafe-3.0.2-cp310-cp310-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:57cb5a3cf367aeb1d316576250f65edec5bb3be939e9247ae594b4bcbc317dfb"},
{file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_aarch64.whl", hash = "sha256:3809ede931876f5b2ec92eef964286840ed3540dadf803dd570c3b7e13141a3b"},
{file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_i686.whl", hash = "sha256:e07c3764494e3776c602c1e78e298937c3315ccc9043ead7e685b7f2b8d47b3c"},
{file = "MarkupSafe-3.0.2-cp310-cp310-musllinux_1_2_x86_64.whl", hash = "sha256:b424c77b206d63d500bcb69fa55ed8d0e6a3774056bdc4839fc9298a7edca171"},
{file = "MarkupSafe-3.0.2-cp310-cp310-win32.whl", hash = "sha256:fcabf5ff6eea076f859677f5f0b6b5c1a51e70a376b0579e0eadef8db48c6b50"},
{file = "MarkupSafe-3.0.2-cp310-cp310-win_amd64.whl", hash = "sha256:6af100e168aa82a50e186c82875a5893c5597a0c1ccdb0d8b40240b1f28b969a"},
{file = "MarkupSafe-3.0.2-cp311-cp311-macosx_10_9_universal2.whl", hash = "sha256:9025b4018f3a1314059769c7bf15441064b2207cb3f065e6ea1e7359cb46db9d"},
{file = "MarkupSafe-3.0.2-cp311-cp311-macosx_11_0_arm64.whl", hash = "sha256:93335ca3812df2f366e80509ae119189886b0f3c2b81325d39efdb84a1e2ae93"},
{file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:2cb8438c3cbb25e220c2ab33bb226559e7afb3baec11c4f218ffa7308603c832"},
{file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:a123e330ef0853c6e822384873bef7507557d8e4a082961e1defa947aa59ba84"},
{file = "MarkupSafe-3.0.2-cp311-cp311-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:1e084f686b92e5b83186b07e8a17fc09e38fff551f3602b249881fec658d3eca"},
{file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_aarch64.whl", hash = "sha256:d8213e09c917a951de9d09ecee036d5c7d36cb6cb7dbaece4c71a60d79fb9798"},
{file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_i686.whl", hash = "sha256:5b02fb34468b6aaa40dfc198d813a641e3a63b98c2b05a16b9f80b7ec314185e"},
{file = "MarkupSafe-3.0.2-cp311-cp311-musllinux_1_2_x86_64.whl", hash = "sha256:0bff5e0ae4ef2e1ae4fdf2dfd5b76c75e5c2fa4132d05fc1b0dabcd20c7e28c4"},
{file = "MarkupSafe-3.0.2-cp311-cp311-win32.whl", hash = "sha256:6c89876f41da747c8d3677a2b540fb32ef5715f97b66eeb0c6b66f5e3ef6f59d"},
{file = "MarkupSafe-3.0.2-cp311-cp311-win_amd64.whl", hash = "sha256:70a87b411535ccad5ef2f1df5136506a10775d267e197e4cf531ced10537bd6b"},
{file = "MarkupSafe-3.0.2-cp312-cp312-macosx_10_13_universal2.whl", hash = "sha256:9778bd8ab0a994ebf6f84c2b949e65736d5575320a17ae8984a77fab08db94cf"},
{file = "MarkupSafe-3.0.2-cp312-cp312-macosx_11_0_arm64.whl", hash = "sha256:846ade7b71e3536c4e56b386c2a47adf5741d2d8b94ec9dc3e92e5e1ee1e2225"},
{file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1c99d261bd2d5f6b59325c92c73df481e05e57f19837bdca8413b9eac4bd8028"},
{file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:e17c96c14e19278594aa4841ec148115f9c7615a47382ecb6b82bd8fea3ab0c8"},
{file = "MarkupSafe-3.0.2-cp312-cp312-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:88416bd1e65dcea10bc7569faacb2c20ce071dd1f87539ca2ab364bf6231393c"},
{file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_aarch64.whl", hash = "sha256:2181e67807fc2fa785d0592dc2d6206c019b9502410671cc905d132a92866557"},
{file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_i686.whl", hash = "sha256:52305740fe773d09cffb16f8ed0427942901f00adedac82ec8b67752f58a1b22"},
{file = "MarkupSafe-3.0.2-cp312-cp312-musllinux_1_2_x86_64.whl", hash = "sha256:ad10d3ded218f1039f11a75f8091880239651b52e9bb592ca27de44eed242a48"},
{file = "MarkupSafe-3.0.2-cp312-cp312-win32.whl", hash = "sha256:0f4ca02bea9a23221c0182836703cbf8930c5e9454bacce27e767509fa286a30"},
{file = "MarkupSafe-3.0.2-cp312-cp312-win_amd64.whl", hash = "sha256:8e06879fc22a25ca47312fbe7c8264eb0b662f6db27cb2d3bbbc74b1df4b9b87"},
{file = "MarkupSafe-3.0.2-cp313-cp313-macosx_10_13_universal2.whl", hash = "sha256:ba9527cdd4c926ed0760bc301f6728ef34d841f405abf9d4f959c478421e4efd"},
{file = "MarkupSafe-3.0.2-cp313-cp313-macosx_11_0_arm64.whl", hash = "sha256:f8b3d067f2e40fe93e1ccdd6b2e1d16c43140e76f02fb1319a05cf2b79d99430"},
{file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:569511d3b58c8791ab4c2e1285575265991e6d8f8700c7be0e88f86cb0672094"},
{file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:15ab75ef81add55874e7ab7055e9c397312385bd9ced94920f2802310c930396"},
{file = "MarkupSafe-3.0.2-cp313-cp313-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:f3818cb119498c0678015754eba762e0d61e5b52d34c8b13d770f0719f7b1d79"},
{file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_aarch64.whl", hash = "sha256:cdb82a876c47801bb54a690c5ae105a46b392ac6099881cdfb9f6e95e4014c6a"},
{file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_i686.whl", hash = "sha256:cabc348d87e913db6ab4aa100f01b08f481097838bdddf7c7a84b7575b7309ca"},
{file = "MarkupSafe-3.0.2-cp313-cp313-musllinux_1_2_x86_64.whl", hash = "sha256:444dcda765c8a838eaae23112db52f1efaf750daddb2d9ca300bcae1039adc5c"},
{file = "MarkupSafe-3.0.2-cp313-cp313-win32.whl", hash = "sha256:bcf3e58998965654fdaff38e58584d8937aa3096ab5354d493c77d1fdd66d7a1"},
{file = "MarkupSafe-3.0.2-cp313-cp313-win_amd64.whl", hash = "sha256:e6a2a455bd412959b57a172ce6328d2dd1f01cb2135efda2e4576e8a23fa3b0f"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_10_13_universal2.whl", hash = "sha256:b5a6b3ada725cea8a5e634536b1b01c30bcdcd7f9c6fff4151548d5bf6b3a36c"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-macosx_11_0_arm64.whl", hash = "sha256:a904af0a6162c73e3edcb969eeeb53a63ceeb5d8cf642fade7d39e7963a22ddb"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:4aa4e5faecf353ed117801a068ebab7b7e09ffb6e1d5e412dc852e0da018126c"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:c0ef13eaeee5b615fb07c9a7dadb38eac06a0608b41570d8ade51c56539e509d"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:d16a81a06776313e817c951135cf7340a3e91e8c1ff2fac444cfd75fffa04afe"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_aarch64.whl", hash = "sha256:6381026f158fdb7c72a168278597a5e3a5222e83ea18f543112b2662a9b699c5"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_i686.whl", hash = "sha256:3d79d162e7be8f996986c064d1c7c817f6df3a77fe3d6859f6f9e7be4b8c213a"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-musllinux_1_2_x86_64.whl", hash = "sha256:131a3c7689c85f5ad20f9f6fb1b866f402c445b220c19fe4308c0b147ccd2ad9"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-win32.whl", hash = "sha256:ba8062ed2cf21c07a9e295d5b8a2a5ce678b913b45fdf68c32d95d6c1291e0b6"},
{file = "MarkupSafe-3.0.2-cp313-cp313t-win_amd64.whl", hash = "sha256:e444a31f8db13eb18ada366ab3cf45fd4b31e4db1236a4448f68778c1d1a5a2f"},
{file = "MarkupSafe-3.0.2-cp39-cp39-macosx_10_9_universal2.whl", hash = "sha256:eaa0a10b7f72326f1372a713e73c3f739b524b3af41feb43e4921cb529f5929a"},
{file = "MarkupSafe-3.0.2-cp39-cp39-macosx_11_0_arm64.whl", hash = "sha256:48032821bbdf20f5799ff537c7ac3d1fba0ba032cfc06194faffa8cda8b560ff"},
{file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a9d3f5f0901fdec14d8d2f66ef7d035f2157240a433441719ac9a3fba440b13"},
{file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:88b49a3b9ff31e19998750c38e030fc7bb937398b1f78cfa599aaef92d693144"},
{file = "MarkupSafe-3.0.2-cp39-cp39-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:cfad01eed2c2e0c01fd0ecd2ef42c492f7f93902e39a42fc9ee1692961443a29"},
{file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_aarch64.whl", hash = "sha256:1225beacc926f536dc82e45f8a4d68502949dc67eea90eab715dea3a21c1b5f0"},
{file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_i686.whl", hash = "sha256:3169b1eefae027567d1ce6ee7cae382c57fe26e82775f460f0b2778beaad66c0"},
{file = "MarkupSafe-3.0.2-cp39-cp39-musllinux_1_2_x86_64.whl", hash = "sha256:eb7972a85c54febfb25b5c4b4f3af4dcc731994c7da0d8a0b4a6eb0640e1d178"},
{file = "MarkupSafe-3.0.2-cp39-cp39-win32.whl", hash = "sha256:8c4e8c3ce11e1f92f6536ff07154f9d49677ebaaafc32db9db4620bc11ed480f"},
{file = "MarkupSafe-3.0.2-cp39-cp39-win_amd64.whl", hash = "sha256:6e296a513ca3d94054c2c881cc913116e90fd030ad1c656b3869762b754f5f8a"},
{file = "markupsafe-3.0.2.tar.gz", hash = "sha256:ee55d3edf80167e48ea11a923c7386f4669df67d7994554387f84e7d8b0a2bf0"},
]
[[package]]
name = "prometheus-client"
version = "0.21.1"
description = "Python client for the Prometheus monitoring system."
optional = false
python-versions = ">=3.8"
groups = ["dev"]
files = [
{file = "prometheus_client-0.21.1-py3-none-any.whl", hash = "sha256:594b45c410d6f4f8888940fe80b5cc2521b305a1fafe1c58609ef715a001f301"},
{file = "prometheus_client-0.21.1.tar.gz", hash = "sha256:252505a722ac04b0456be05c05f75f45d760c2911ffc45f2a06bcaed9f3ae3fb"},
]
[package.extras]
twisted = ["twisted"]
[[package]]
name = "prompt-toolkit"
version = "3.0.51"
description = "Library for building powerful interactive command lines in Python"
optional = false
python-versions = ">=3.8"
groups = ["main", "dev"]
files = [
{file = "prompt_toolkit-3.0.51-py3-none-any.whl", hash = "sha256:52742911fde84e2d423e2f9a4cf1de7d7ac4e51958f648d9540e0fb8db077b07"},
{file = "prompt_toolkit-3.0.51.tar.gz", hash = "sha256:931a162e3b27fc90c86f1b48bb1fb2c528c2761475e57c9c06de13311c7b54ed"},
]
[package.dependencies]
wcwidth = "*"
[[package]]
name = "pyjwt"
version = "2.9.0"
description = "JSON Web Token implementation in Python"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "PyJWT-2.9.0-py3-none-any.whl", hash = "sha256:3b02fb0f44517787776cf48f2ae25d8e14f300e6d7545a4315cee571a415e850"},
{file = "pyjwt-2.9.0.tar.gz", hash = "sha256:7e1e5b56cc735432a7369cbfa0efe50fa113ebecdc04ae6922deba8b84582d0c"},
]
[package.extras]
crypto = ["cryptography (>=3.4.0)"]
dev = ["coverage[toml] (==5.0.4)", "cryptography (>=3.4.0)", "pre-commit", "pytest (>=6.0.0,<7.0.0)", "sphinx", "sphinx-rtd-theme", "zope.interface"]
docs = ["sphinx", "sphinx-rtd-theme", "zope.interface"]
tests = ["coverage[toml] (==5.0.4)", "pytest (>=6.0.0,<7.0.0)"]
[[package]]
name = "python-dateutil"
version = "2.9.0.post0"
description = "Extensions to the standard Python datetime module"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
groups = ["main", "dev"]
files = [
{file = "python-dateutil-2.9.0.post0.tar.gz", hash = "sha256:37dd54208da7e1cd875388217d5e00ebd4179249f90fb72437e91a35459a0ad3"},
{file = "python_dateutil-2.9.0.post0-py2.py3-none-any.whl", hash = "sha256:a8b2bc7bffae282281c8140a97d3aa9c14da0b136dfe83f850eea9a5f7470427"},
]
[package.dependencies]
six = ">=1.5"
[[package]]
name = "python-magic"
version = "0.4.27"
description = "File type identification using libmagic"
optional = false
python-versions = ">=2.7, !=3.0.*, !=3.1.*, !=3.2.*, !=3.3.*, !=3.4.*"
groups = ["main"]
files = [
{file = "python-magic-0.4.27.tar.gz", hash = "sha256:c1ba14b08e4a5f5c31a302b7721239695b2f0f058d125bd5ce1ee36b9d9d3c3b"},
{file = "python_magic-0.4.27-py2.py3-none-any.whl", hash = "sha256:c212960ad306f700aa0d01e5d7a325d20548ff97eb9920dcd29513174f0294d3"},
]
[[package]]
name = "pytz"
version = "2025.2"
description = "World timezone definitions, modern and historical"
optional = false
python-versions = "*"
groups = ["dev"]
files = [
{file = "pytz-2025.2-py2.py3-none-any.whl", hash = "sha256:5ddf76296dd8c44c26eb8f4b6f35488f3ccbf6fbbd7adee0b7262d43f0ec2f00"},
{file = "pytz-2025.2.tar.gz", hash = "sha256:360b9e3dbb49a209c21ad61809c7fb453643e048b38924c765813546746e81c3"},
]
[[package]]
name = "redis"
version = "5.3.0"
description = "Python client for Redis database and key-value store"
optional = false
python-versions = ">=3.8"
groups = ["main"]
files = [
{file = "redis-5.3.0-py3-none-any.whl", hash = "sha256:f1deeca1ea2ef25c1e4e46b07f4ea1275140526b1feea4c6459c0ec27a10ef83"},
{file = "redis-5.3.0.tar.gz", hash = "sha256:8d69d2dde11a12dc85d0dbf5c45577a5af048e2456f7077d87ad35c1c81c310e"},
]
[package.dependencies]
PyJWT = ">=2.9.0,<2.10.0"
[package.extras]
hiredis = ["hiredis (>=3.0.0)"]
ocsp = ["cryptography (>=36.0.1)", "pyopenssl (==23.2.1)", "requests (>=2.31.0)"]
[[package]]
name = "six"
version = "1.17.0"
description = "Python 2 and 3 compatibility utilities"
optional = false
python-versions = "!=3.0.*,!=3.1.*,!=3.2.*,>=2.7"
groups = ["main", "dev"]
files = [
{file = "six-1.17.0-py2.py3-none-any.whl", hash = "sha256:4721f391ed90541fddacab5acf947aa0d3dc7d27b2e1e8eda2be8970586c3274"},
{file = "six-1.17.0.tar.gz", hash = "sha256:ff70335d468e7eb6ec65b95b99d3a2836546063f63acc5171de367e834932a81"},
]
[[package]]
name = "tornado"
version = "6.4.2"
description = "Tornado is a Python web framework and asynchronous networking library, originally developed at FriendFeed."
optional = false
python-versions = ">=3.8"
groups = ["dev"]
files = [
{file = "tornado-6.4.2-cp38-abi3-macosx_10_9_universal2.whl", hash = "sha256:e828cce1123e9e44ae2a50a9de3055497ab1d0aeb440c5ac23064d9e44880da1"},
{file = "tornado-6.4.2-cp38-abi3-macosx_10_9_x86_64.whl", hash = "sha256:072ce12ada169c5b00b7d92a99ba089447ccc993ea2143c9ede887e0937aa803"},
{file = "tornado-6.4.2-cp38-abi3-manylinux_2_17_aarch64.manylinux2014_aarch64.whl", hash = "sha256:1a017d239bd1bb0919f72af256a970624241f070496635784d9bf0db640d3fec"},
{file = "tornado-6.4.2-cp38-abi3-manylinux_2_5_i686.manylinux1_i686.manylinux_2_17_i686.manylinux2014_i686.whl", hash = "sha256:c36e62ce8f63409301537222faffcef7dfc5284f27eec227389f2ad11b09d946"},
{file = "tornado-6.4.2-cp38-abi3-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_17_x86_64.manylinux2014_x86_64.whl", hash = "sha256:bca9eb02196e789c9cb5c3c7c0f04fb447dc2adffd95265b2c7223a8a615ccbf"},
{file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_aarch64.whl", hash = "sha256:304463bd0772442ff4d0f5149c6f1c2135a1fae045adf070821c6cdc76980634"},
{file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_i686.whl", hash = "sha256:c82c46813ba483a385ab2a99caeaedf92585a1f90defb5693351fa7e4ea0bf73"},
{file = "tornado-6.4.2-cp38-abi3-musllinux_1_2_x86_64.whl", hash = "sha256:932d195ca9015956fa502c6b56af9eb06106140d844a335590c1ec7f5277d10c"},
{file = "tornado-6.4.2-cp38-abi3-win32.whl", hash = "sha256:2876cef82e6c5978fde1e0d5b1f919d756968d5b4282418f3146b79b58556482"},
{file = "tornado-6.4.2-cp38-abi3-win_amd64.whl", hash = "sha256:908b71bf3ff37d81073356a5fadcc660eb10c1476ee6e2725588626ce7e5ca38"},
{file = "tornado-6.4.2.tar.gz", hash = "sha256:92bad5b4746e9879fd7bf1eb21dce4e3fc5128d71601f80005afa39237ad620b"},
]
[[package]]
name = "tzdata"
version = "2025.2"
description = "Provider of IANA time zone data"
optional = false
python-versions = ">=2"
groups = ["main", "dev"]
files = [
{file = "tzdata-2025.2-py2.py3-none-any.whl", hash = "sha256:1a403fada01ff9221ca8044d701868fa132215d84beb92242d9acd2147f667a8"},
{file = "tzdata-2025.2.tar.gz", hash = "sha256:b60a638fcc0daffadf82fe0f57e53d06bdec2f36c4df66280ae79bce6bd6f2b9"},
]
[[package]]
name = "vine"
version = "5.1.0"
description = "Python promises."
optional = false
python-versions = ">=3.6"
groups = ["main", "dev"]
files = [
{file = "vine-5.1.0-py3-none-any.whl", hash = "sha256:40fdf3c48b2cfe1c38a49e9ae2da6fda88e4794c810050a728bd7413811fb1dc"},
{file = "vine-5.1.0.tar.gz", hash = "sha256:8b62e981d35c41049211cf62a0a1242d8c1ee9bd15bb196ce38aefd6799e61e0"},
]
[[package]]
name = "wcwidth"
version = "0.2.13"
description = "Measures the displayed width of unicode strings in a terminal"
optional = false
python-versions = "*"
groups = ["main", "dev"]
files = [
{file = "wcwidth-0.2.13-py2.py3-none-any.whl", hash = "sha256:3da69048e4540d84af32131829ff948f1e022c1c6bdb8d6102117aac784f6859"},
{file = "wcwidth-0.2.13.tar.gz", hash = "sha256:72ea0c06399eb286d978fdedb6923a9eb47e1c486ce63e9b4e64fc18303972b5"},
]
[[package]]
name = "werkzeug"
version = "3.1.3"
description = "The comprehensive WSGI web application library."
optional = false
python-versions = ">=3.9"
groups = ["main"]
files = [
{file = "werkzeug-3.1.3-py3-none-any.whl", hash = "sha256:54b78bf3716d19a65be4fceccc0d1d7b89e608834989dfae50ea87564639213e"},
{file = "werkzeug-3.1.3.tar.gz", hash = "sha256:60723ce945c19328679790e3282cc758aa4a6040e4bb330f53d30fa546d44746"},
]
[package.dependencies]
MarkupSafe = ">=2.1.1"
[package.extras]
watchdog = ["watchdog (>=2.3)"]
[metadata]
lock-version = "2.1"
python-versions = ">=3.13"
content-hash = "5395597c4bf838c1973ccd26643cae4286a5e57fb0e8e9097fd0ba34800ffa8a"

pyproject.toml (new file)
@@ -0,0 +1,28 @@
[project]
name = "auto-transcoder"
version = "0.1.0"
description = ""
authors = [
    {name = "Ben Martin", email = "ben@brmartin.co.uk"}
]
readme = "README.md"
requires-python = ">=3.13"
dependencies = [
    "celery[redis] (>=5.5.2,<6.0.0)",
    "flask (>=3.1.0,<4.0.0)",
    "python-magic (>=0.4.27,<0.5.0)",
    "redis (>=5,<6)",
    "asyncio (>=3.4.3,<4.0.0)"
]

[tool.poetry]
package-mode = false
packages = [{include = "auto_transcoder", from = "src"}]

[tool.poetry.group.dev.dependencies]
flower = "^2.0.1"

[build-system]
requires = ["poetry-core>=2.0.0,<3.0.0"]
build-backend = "poetry.core.masonry.api"

server.Dockerfile (new file)
@@ -0,0 +1,29 @@
FROM python:3.13-alpine AS builder

RUN apk add --no-cache curl && \
    curl -sSL https://install.python-poetry.org | python3 -

ENV PATH="/root/.local/bin:$PATH" \
    POETRY_NO_INTERACTION=1 \
    POETRY_VIRTUALENVS_IN_PROJECT=1 \
    POETRY_VIRTUALENVS_CREATE=1 \
    POETRY_CACHE_DIR=/var/cache/pypoetry

WORKDIR /app

COPY pyproject.toml poetry.lock ./

RUN poetry install --only main && \
    rm -rf ${POETRY_CACHE_DIR}

FROM python:3.13-alpine AS runtime

RUN apk add --no-cache libmagic

ENV REDIS_URL=redis://redis:6379/0

COPY --from=builder /app/.venv /app/.venv
COPY ./src/auto_transcoder /app/auto_transcoder

WORKDIR /app

EXPOSE 5000

ENTRYPOINT ["/app/.venv/bin/python", "-m", "auto_transcoder.server"]

src/auto_transcoder/celeryconfig.py (new file)
@@ -0,0 +1,7 @@
import os

worker_prefetch_multiplier = 1
worker_concurrency = 1
broker_url = os.environ.get("REDIS_URL", "redis://localhost:6379/0")
result_backend = os.environ.get("REDIS_URL", "redis://localhost:6379/0")

src/auto_transcoder/model.py (new file)
@@ -0,0 +1,93 @@
from collections.abc import Awaitable, Callable
from dataclasses import asdict, dataclass
from json import JSONDecoder, JSONEncoder
from pathlib import Path
from typing import Any, Dict, Generator, List, cast
import asyncio

from magic import Magic
from redis.asyncio import Redis
from redis.asyncio.connection import ConnectionPool


class RedisManager:
    """Owns a Redis connection pool and hands out clients bound to it."""

    def __init__(self, connection_url: str):
        self.connection_pool = ConnectionPool.from_url(connection_url)

    async def __aenter__(self):
        return self

    async def __aexit__(self, exc_type, exc_val, exc_tb):
        await self.close()

    async def close(self):
        await cast(Awaitable, self.connection_pool.disconnect())

    def get_client(self):
        return Redis(connection_pool=self.connection_pool)


@dataclass
class MediaDTO:
    inode: int
    paths: list[Path]
    is_transcoded: bool = False

    def open(self):
        return open(self.paths[0], "rb")

    def size(self) -> int:
        return self.paths[0].stat().st_size


class MediaDAO:
    """Persists MediaDTO records as RedisJSON documents keyed by inode."""

    def __init__(self, redis_manager: RedisManager):
        self.redis_client = redis_manager.get_client()

    async def get_media_by_inode(self, inode: int) -> MediaDTO:
        result = await cast(Awaitable[MediaDTO], self.redis_client.json(encoder=json_encoder, decoder=json_decoder).get(f"media_info:{inode}"))
        if not result:
            raise ValueError(f"Media with inode {inode} not found")
        return result

    async def get_all_inodes(self) -> list[int]:
        keys = await self.redis_client.keys("media_info:*")
        # Keys look like b"media_info:<inode>", so the inode is the final segment.
        return [int(key.decode().split(":")[-1]) for key in keys]

    async def set_media(self, media: MediaDTO) -> None:
        await cast(Awaitable[int], self.redis_client.json(encoder=json_encoder, decoder=json_decoder).set(f"media_info:{media.inode}", "$", asdict(media)))

    async def batch_set_media(self, media_list: list[MediaDTO]) -> None:
        async with self.redis_client.pipeline() as pipe:
            for media in media_list:
                await cast(Awaitable[int], pipe.json(encoder=json_encoder, decoder=json_decoder).set(f"media_info:{media.inode}", "$", asdict(media)))
            await pipe.execute()

    async def delete_media(self, media: MediaDTO) -> None:
        await cast(Awaitable[int], self.redis_client.delete(f"media_info:{media.inode}"))

    async def mark_as_transcoded(self, media: MediaDTO) -> None:
        await cast(Awaitable[int], self.redis_client.json(encoder=json_encoder, decoder=json_decoder).set(f"media_info:{media.inode}", "$.is_transcoded", True))

    async def is_transcoded(self, inode: int) -> bool:
        return await cast(Awaitable[List[bool]], self.redis_client.json(encoder=json_encoder, decoder=json_decoder).get(f"media_info:{inode}", "$.is_transcoded")) == [True]


class JSONEncoderImpl(JSONEncoder):
    def default(self, obj):
        if isinstance(obj, Path):
            return obj.as_uri()
        return super().default(obj)


def object_hook(dict: Dict[str, Any]):
    return MediaDTO(
        paths=[Path.from_uri(v) for v in dict["paths"]],
        inode=int(dict["inode"]),
        is_transcoded=dict["is_transcoded"],
    )


json_encoder = JSONEncoderImpl()
json_decoder = JSONDecoder(object_hook=object_hook)


def is_media_file(path: Path) -> bool:
    return mime_detector.from_file(path).startswith("video/")


mime_detector = Magic(mime=True)

src/auto_transcoder/server.py (new file)
@@ -0,0 +1,54 @@
from argparse import ArgumentParser
from pathlib import Path
from typing import Generator
import asyncio
import os

from auto_transcoder import tasks
from auto_transcoder.model import MediaDAO, MediaDTO, RedisManager
from auto_transcoder.web import run as run_web_app


def walk_directory(path: Path, bin: Path | None) -> Generator[Path, None, None]:
    for root, _, files in path.walk():
        # Skip anything already moved to the recycle bin.
        if bin and root.is_relative_to(bin):
            continue
        for file_name in files:
            yield Path(root, file_name)


async def main(directory_to_watch: Path, redis_connection_url: str, recycle_bin: Path | None = None):
    print(f"Watching {directory_to_watch} for media files.")
    print(f"Original files will be moved to recycle bin: {recycle_bin}.")

    if not directory_to_watch.exists():
        print(f"Directory {directory_to_watch} does not exist")
        exit(1)

    # Group paths by inode so hard-linked copies are treated as one media item.
    media_files_by_inode: dict[int, list[Path]] = {}
    for path in walk_directory(directory_to_watch, recycle_bin):
        inode = path.stat().st_ino
        if inode not in media_files_by_inode:
            media_files_by_inode[inode] = []
        media_files_by_inode[inode].append(path)

    async with RedisManager(redis_connection_url) as redis_manager:
        media_dao = MediaDAO(redis_manager)
        media_files_by_inode = {k: v for k, v in media_files_by_inode.items() if not await media_dao.is_transcoded(k)}
        await media_dao.batch_set_media([MediaDTO(inode=i, paths=p, is_transcoded=False) for i, p in media_files_by_inode.items()])
        for inode in media_files_by_inode.keys():
            print(f"Sent transcode task for inode {inode}")
            tasks.transcode_media_task.delay(inode, recycle_bin.as_uri() if recycle_bin else None)
        run_web_app(redis_manager=redis_manager, host='0.0.0.0', port=5000, use_reloader=False)


if __name__ == '__main__':
    parser = ArgumentParser(description='Auto Transcoder Server')
    parser.add_argument('directory_to_watch', type=str, help='Directory to watch for media files')
    parser.add_argument('--recycle-bin', type=str, default=None, help='Recycle bin directory for original files')
    args = parser.parse_args()
    recycle_bin = Path(args.recycle_bin) if args.recycle_bin else None
    # The Redis connection URL comes from the REDIS_URL environment variable (see README).
    redis_connection_url = os.environ.get("REDIS_URL", "redis://localhost:6379/0")
    asyncio.run(main(Path(args.directory_to_watch), redis_connection_url, recycle_bin))

src/auto_transcoder/services.py (new file)
@@ -0,0 +1,58 @@
import os
from pathlib import Path
import shutil
import subprocess
import tempfile
from typing import List

from celery.utils.log import get_task_logger

from auto_transcoder.model import MediaDTO

logger = get_task_logger(__name__)


async def transcode_media(media: MediaDTO, recycle_bin: Path | None = None) -> List[MediaDTO] | None:
    file_path = media.paths[0]
    temp_path = Path(tempfile.mkstemp()[1])

    # Read the source from stdin and write an AV1/Opus/WebVTT WebM to the temp file.
    full_command = ['ffmpeg', '-i', "-",
                    '-vf', 'scale=\'min(1920,iw)\':-1:flags=lanczos',
                    '-c:v', 'libsvtav1', '-crf', '30', '-preset', '6', '-g', '240', '-pix_fmt', 'yuv420p10le',
                    '-c:a', 'libopus', '-b:a', '128k', '-ac', '2',
                    '-c:s', 'webvtt',
                    '-map_chapters', '-1', '-map_metadata', '-1',
                    '-f', 'webm',
                    '-y',
                    temp_path.resolve()]

    try:
        subprocess.run(full_command, check=True, stdin=media.open())
    except BaseException as e:
        temp_path.unlink(missing_ok=True)
        raise e

    if temp_path.stat().st_size > media.size():
        temp_path.unlink()
        logger.warning(f"Transcoding did not reduce file size for {file_path}, keeping original")
        return

    paths_by_inode: dict[int, List[Path]] = {}
    for file_path in media.paths:
        media_directory = file_path.parent
        bin(file_path, recycle_bin)
        new_media = shutil.move(temp_path, media_directory.joinpath(os.path.splitext(file_path.name)[0] + ".webm"))
        inode = new_media.stat().st_ino
        if inode in paths_by_inode:
            paths_by_inode[inode].append(new_media)
        else:
            paths_by_inode[inode] = [new_media]

    logger.info(f"Transcoded {media.paths} to {paths_by_inode.values()}")
    return [MediaDTO(inode, paths, True) for inode, paths in paths_by_inode.items()]


def bin(file: Path, recycle_bin: Path | None):
    if recycle_bin:
        recycle_bin.mkdir(parents=True, exist_ok=True)
        shutil.move(file, recycle_bin.joinpath(file.name))
    else:
        file.unlink()

src/auto_transcoder/tasks.py (new file)
@@ -0,0 +1,49 @@
import os
from pathlib import Path
import asyncio

from celery import Celery
from celery.signals import worker_init, worker_shutdown
from celery.utils.log import get_task_logger

from auto_transcoder.model import MediaDAO, RedisManager
from auto_transcoder.services import transcode_media

redis_manager: RedisManager | None = None
logger = get_task_logger(__name__)

celery_app = Celery('auto_transcoder', config_source='auto_transcoder.celeryconfig')


@worker_init.connect
def setup_worker(**kwargs):
    global redis_manager
    redis_manager = RedisManager(os.environ.get("REDIS_URL", "redis://localhost:6379/0"))


@worker_shutdown.connect
def teardown_worker(**kwargs):
    asyncio.run(__teardown_worker())


async def __teardown_worker():
    global redis_manager
    if redis_manager:
        await redis_manager.close()
        redis_manager = None


@celery_app.task(ignore_result=True)
def transcode_media_task(inode: int, recycle_bin_path: str | None = None):
    async def process(inode: int, recycle_bin_path: str | None = None):
        global redis_manager
        if not redis_manager:
            raise RuntimeError("RedisManager is not initialized")
        async with redis_manager as manager:
            media_dao = MediaDAO(manager)
            media_dto = await media_dao.get_media_by_inode(inode)
            if not media_dto.is_transcoded:
                logger.info(f"Transcoding media with inode {inode}")
                new_medias = await transcode_media(media_dto, Path.from_uri(recycle_bin_path) if recycle_bin_path else None)
                if new_medias:
                    await asyncio.gather(
                        media_dao.batch_set_media(new_medias),
                        media_dao.delete_media(media_dto),
                    )

    asyncio.run(process(inode, recycle_bin_path))

src/auto_transcoder/templates/index.html (new file)
@@ -0,0 +1,28 @@
<!DOCTYPE html>
<html lang="en">
<head>
    <meta charset="UTF-8">
    <meta name="viewport" content="width=device-width, initial-scale=1.0">
    <title>Auto Transcoder</title>
    <script defer src="https://cdn.jsdelivr.net/npm/alpinejs@3.x.x/dist/cdn.min.js"></script>
    <script>
        async function fetchWorkers() {
            const response = await fetch('/api/workers');
            const data = await response.json();
            return data;
        }
    </script>
</head>
<body>
    <h1>Welcome to Auto Transcoder</h1>
    <h2>Workers</h2>
    <ul x-data="{ workers: {} }" x-init="workers = await fetchWorkers()">
        <template x-for="id in Object.keys(workers)" :key="id">
            <li x-text="id"></li>
        </template>
    </ul>
</body>
</html>

src/auto_transcoder/web.py (new file)
@@ -0,0 +1,27 @@
from flask import Flask, jsonify, render_template
from celery.app.control import Inspect

from auto_transcoder.model import MediaDAO, RedisManager
from auto_transcoder.tasks import celery_app as celery_app


def run(redis_manager: RedisManager, *args, **kwargs):
    app = Flask(__name__)

    @app.route('/')
    def index():
        return render_template('index.html')

    @app.route('/api/workers')
    def queues():
        i = inspect()
        workers = i.stats()
        return jsonify(workers)

    @app.route('/api/inodes')
    async def inodes():
        return await MediaDAO(redis_manager=redis_manager).get_all_inodes()

    app.run(*args, **kwargs)


def inspect() -> Inspect:
    return celery_app.control.inspect()

worker.Dockerfile (new file)
@@ -0,0 +1,29 @@
FROM python:3.13-alpine AS builder

RUN apk add --no-cache curl && \
    curl -sSL https://install.python-poetry.org | python3 -

ENV PATH="/root/.local/bin:$PATH" \
    POETRY_NO_INTERACTION=1 \
    POETRY_VIRTUALENVS_IN_PROJECT=1 \
    POETRY_VIRTUALENVS_CREATE=1 \
    POETRY_CACHE_DIR=/var/cache/pypoetry

WORKDIR /app

COPY pyproject.toml poetry.lock ./

RUN poetry install --only main && \
    rm -rf ${POETRY_CACHE_DIR}

FROM python:3.13-alpine AS runtime

RUN apk add --no-cache ffmpeg libmagic opus svt-av1

ENV REDIS_URL=redis://redis:6379/0

COPY --from=builder /app/.venv /app/.venv
COPY ./src/auto_transcoder /app/auto_transcoder

WORKDIR /app

EXPOSE 5000

ENTRYPOINT ["/app/.venv/bin/celery", "-A", "auto_transcoder.tasks", "worker", "--loglevel=info"]