mirror of
https://github.com/revanced/revanced-releases-api.git
synced 2025-05-04 16:04:24 +02:00
* Implements client generation and management
* fix announcements endpoints
* change announcements model
* bump deps
* sync with main
* refactor: adopt some functional standards in Releases.py
* feat: add new workflows
* chore: remove unused files
* refactor: update build badge
* refactor: move files around and delete unused ones
* feat: add authentication endpoints
* refactor: clean up code on Clients.py controller
* fix: fix the client secret update endpoint
* refactor: clean up authentication code
* feat: add authentication to client endpoints
* chore: bump deps
* feat: add admin user generation
This commit is contained in:
parent
94e7ac64c0
commit
9dbef92fd1
1 .gitignore vendored
@@ -153,3 +153,4 @@ cython_debug/
# PROJECT SPECIFIC
setup_env.sh
admin_info.json
3 .vscode/settings.json vendored Normal file
@@ -0,0 +1,3 @@
{
    "python.analysis.typeCheckingMode": "off"
}
12 Dockerfile
@@ -3,14 +3,14 @@ FROM python:3.10-slim
ARG GITHUB_TOKEN
ENV GITHUB_TOKEN $GITHUB_TOKEN

ARG UVICORN_HOST
ENV UVICORN_HOST $UVICORN_HOST
ARG HYPERCORN_HOST
ENV HYPERCORN_HOST $HYPERCORN_HOST

ARG UVICORN_PORT
ENV UVICORN_PORT $UVICORN_PORT
ARG HYPERCORN_PORT
ENV HYPERCORN_PORT $HYPERCORN_PORT

ARG UVICORN_LOG_LEVEL
ENV UVICORN_LOG_LEVEL $UVICORN_LOG_LEVEL
ARG HYPERCORN_LOG_LEVEL
ENV HYPERCORN_LOG_LEVEL $HYPERCORN_LOG_LEVEL

WORKDIR /usr/src/app
38 README.md
@@ -8,7 +8,7 @@ This is a simple API that returns the latest ReVanced releases, patches and cont

## Usage

The API is available at [https://revanced-releases-api.afterst0rm.xyz/](https://revanced-releases-api.afterst0rm.xyz/).
The API is available at [https://releases.rvcd.win/](https://releases.rvcd.win/).

You can deploy your own instance by cloning this repository, editing the `docker-compose.yml` file to include your GitHub token, and running `docker-compose up`, or `docker-compose up --build` if you want to build the image locally instead of pulling it from GHCR. Optionally, you can run the application without Docker by running `poetry install` and `poetry run ./run.sh`. In this case, you'll also need a Redis server and to set the following environment variables on your system.

@@ -28,9 +28,39 @@ If you don't have a Sentry instance, we recommend using [GlitchTip](https://glit

### API Endpoints

* [tools](https://revanced-releases-api.afterst0rm.xyz/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
* [patches](https://revanced-releases-api.afterst0rm.xyz/patches) - Returns the latest version of all ReVanced patches
* [contributors](https://revanced-releases-api.afterst0rm.xyz/contributors) - Returns contributors for all ReVanced projects
* [tools](https://releases.rvcd.win/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
* [patches](https://releases.rvcd.win/patches) - Returns the latest version of all ReVanced patches
* [contributors](https://releases.rvcd.win/contributors) - Returns contributors for all ReVanced projects
* [announcement](https://releases.rvcd.win/announcement) - Returns the latest announcement for the ReVanced projects

## Clients

The API has no concept of users. It is meant to be used by clients, such as the [ReVanced Manager](https://github.com/revanced/revanced-manager).

When the API is deployed for the first time, it'll create a new client with admin permissions. The credentials can be found in the log console or in the file `admin_info.json` in the root directory of the project. Only admin clients can create, edit, and delete other clients. If you're going to use any of the authenticated endpoints, you'll need to create a client and use its credentials. Please follow the API documentation for more information.

## Authentication

The API uses [PASETO](https://paseto.io/) tokens for authorization. To authenticate, you need to send a POST request to `/auth` with the following JSON body:

```json
{
    "id": "your_client_id",
    "secret": "your_client_secret"
}
```

The API will answer with a PASETO token and a refresh token that you can use to authorize your requests. You can use the token in the `Authorization` header of your requests, like this:

```
Authorization: Bearer <token>
```

That token will be valid for 24 hours. After that, you'll need to refresh it by sending a POST request to `/auth/refresh` with your `refresh_token` in the `Authorization` header.

Refresh tokens are valid for 30 days. After that, you'll need to authenticate again and get new tokens.

Some endpoints might require fresh tokens, forcing you to authenticate again rather than refresh.

## Contributing
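The authentication flow the README describes (POST `/auth` with client credentials, then send the returned PASETO token as a Bearer header) can be sketched as the request pieces a client would build. This is an illustrative helper only: the function names are hypothetical, and actually sending the requests (e.g. with `httpx`) is left out.

```python
# Minimal sketch of the README's auth flow. Only builds the request
# pieces; the helper names are illustrative, not from the repository.

def build_auth_body(client_id: str, client_secret: str) -> dict:
    """JSON body for POST /auth, matching the README example."""
    return {"id": client_id, "secret": client_secret}

def bearer_header(token: str) -> dict:
    """Authorization header for authenticated requests and /auth/refresh."""
    return {"Authorization": f"Bearer {token}"}

# Example usage with placeholder credentials and a placeholder token:
body = build_auth_body("your_client_id", "your_client_secret")
header = bearer_header("v4.public.example-token")
```

The same `bearer_header` shape is reused for `/auth/refresh`, just with the refresh token instead of the access token.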
42 config.toml
@@ -8,42 +8,38 @@ Changelogs are not included but can be found on the [ReVanced Repositories](http

The team also has a [Discord Server](https://revanced.app/discord) if you need help.

## API Endpoints

* [tools](/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
* [patches](/patches) - Returns the latest version of all ReVanced patches
* [contributors](/contributors) - Returns contributors for all ReVanced projects

## Additional Information
## Important Information

* Rate Limiting - 60 requests per minute
* Cache - 5 minutes
* Token duration - 1 hour
* Token refresh - 30 days

## Important Notes
## Additional Notes

1. Although we will try to avoid breaking changes, we can't guarantee that it won't happen.
2. Okay, the API is now cached and rate limited (per endpoint). But please don't abuse it; we don't want to have to block you.
3. Make sure to implement a cache system on your end to avoid unnecessary requests.
2. Make sure to implement a cache system on your end to avoid unnecessary requests.
3. API abuse will result in IP blocks.

Godspeed 💀

"""
version = "0.10 beta"
version = "0.8 RC"

[license]

name = "AGPL-3.0"
url = "https://www.gnu.org/licenses/agpl-3.0.en.html"

[uvicorn]

host = "0.0.0.0"
port = 8000

[slowapi]

limit = "60/minute"

[logging]

level = "INFO"
json_logs = false

[cache]

expire = 120
database = 0

@@ -52,12 +48,16 @@ database = 0

expire = 300
database = 1

[clients]
database = 2

[tokens]
database = 3

[announcements]
database = 4

[app]

repositories = ["TeamVanced/VancedMicroG", "revanced/revanced-cli", "revanced/revanced-patcher", "revanced/revanced-patches", "revanced/revanced-integrations", "revanced/revanced-manager"]

[logging]

level = "INFO"
json_logs = false
redis_database = 2
@@ -1,14 +1,13 @@
version: "3.8"

volumes:
  redis-data:
    driver: local
services:
  redis:
    container_name: revanced-releases-api-redis
    image: redis:latest
    image: redis-stack-server:latest
    environment:
      - REDIS_ARGS=--save 60 1 --appendonly yes
    volumes:
      - redis-data:/data
      - /data/redis/revanced-releases-api:/data
    networks:
      - infra
    restart: always
@@ -19,9 +18,9 @@ services:
      - GITHUB_TOKEN=YOUR_GITHUB_TOKEN
      - REDIS_URL=revanced-releases-api-redis
      - REDIS_PORT=6379
      - UVICORN_HOST=0.0.0.0
      - UVICORN_PORT=8000
      - UVICORN_LOG_LEVEL=debug
      - HYPERCORN_HOST=0.0.0.0
      - HYPERCORN_PORT=8000
      - HYPERCORN_LOG_LEVEL=debug
      - SENTRY_DSN=YOUR_SENTRY_DSN
    ports:
      - 127.0.0.1:7934:8000
@@ -1,16 +1,14 @@
---
version: "3.8"

volumes:
  redis-data:
    driver: local

services:
  redis:
    container_name: revanced-releases-api-redis
    image: redis:latest
    image: redis-stack-server:latest
    environment:
      - REDIS_ARGS=--save 60 1 --appendonly yes
    volumes:
      - redis-data:/data
      - /data/redis/revanced-releases-api:/data
    networks:
      - infra
    restart: always
@@ -21,9 +19,9 @@ services:
      - GITHUB_TOKEN=YOUR_GITHUB_TOKEN
      - REDIS_URL=revanced-releases-api-redis
      - REDIS_PORT=6379
      - UVICORN_HOST=0.0.0.0
      - UVICORN_PORT=8000
      - UVICORN_LOG_LEVEL=debug
      - HYPERCORN_HOST=0.0.0.0
      - HYPERCORN_PORT=8000
      - HYPERCORN_LOG_LEVEL=debug
      - SENTRY_DSN=YOUR_SENTRY_DSN
    ports:
      - 127.0.0.1:7934:8000
11 env.sh
@@ -1,11 +0,0 @@
#!/bin/bash

# This script is used to setup the environment variables

export GITHUB_TOKEN=your_token
export UVICORN_HOST=0.0.0.0
export UVICORN_PORT=8000
export UVICORN_LOG_LEVEL=debug
export REDIS_URL=127.0.0.1
export REDIS_PORT=6379
export SENTRY_DSN=your_sentry_dsn
406 main.py
@@ -1,13 +1,15 @@
#!/usr/bin/env python3

import binascii
import os
from typing import Coroutine
import toml
import uvicorn
import aioredis
import sentry_sdk
import asyncio
import uvloop

from fastapi import FastAPI, Request, Response
from fastapi.responses import RedirectResponse
from fastapi import FastAPI, Request, Response, status, HTTPException, Depends
from fastapi.responses import RedirectResponse, JSONResponse, UJSONResponse

from slowapi.util import get_remote_address
from slowapi import Limiter, _rate_limit_exceeded_handler
@@ -16,15 +18,29 @@ from fastapi_cache import FastAPICache
from fastapi_cache.decorator import cache
from slowapi.errors import RateLimitExceeded
from fastapi_cache.backends.redis import RedisBackend
from fastapi.exceptions import RequestValidationError

from fastapi_paseto_auth import AuthPASETO
from fastapi_paseto_auth.exceptions import AuthPASETOException

from sentry_sdk.integrations.redis import RedisIntegration
from sentry_sdk.integrations.httpx import HttpxIntegration
from sentry_sdk.integrations.gnu_backtrace import GnuBacktraceIntegration

from modules.Releases import Releases
import modules.models.ResponseModels as ResponseModels
import src.controllers.Auth as Auth
from src.controllers.Releases import Releases
from src.controllers.Clients import Clients
from src.controllers.Announcements import Announcements

import modules.utils.Logger as Logger
from src.utils.Generators import Generators
from src.utils.RedisConnector import RedisConnector

import src.models.ClientModels as ClientModels
import src.models.GeneralErrors as GeneralErrors
import src.models.ResponseModels as ResponseModels
import src.models.AnnouncementModels as AnnouncementModels

import src.utils.Logger as Logger

# Enable sentry logging

@@ -40,18 +56,23 @@ sentry_sdk.init(os.environ['SENTRY_DSN'], traces_sample_rate=1.0, integrations=[

config: dict = toml.load("config.toml")

# Redis connection parameters
# Class instances

redis_config: dict[ str, str | int ] = {
    "url": f"redis://{os.environ['REDIS_URL']}",
    "port": os.environ['REDIS_PORT'],
    "database": config['cache']['database'],
}

# Create releases instance
generators = Generators()

releases = Releases()

clients = Clients()

announcements = Announcements()

# Setup admin client
uvloop.install()

loop: asyncio.AbstractEventLoop = asyncio.get_event_loop()
coroutine: Coroutine = clients.setup_admin()
loop.run_until_complete(coroutine)

# Create FastAPI instance

app = FastAPI(title=config['docs']['title'],
@@ -59,7 +80,9 @@ app = FastAPI(title=config['docs']['title'],
              version=config['docs']['version'],
              license_info={"name": config['license']['name'],
                            "url": config['license']['url']
                            })
              },
              default_response_class=UJSONResponse
              )

# Slowapi limiter

@@ -73,9 +96,35 @@ app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)
async def get_cache() -> int:
    return 1

# Setup PASETO

@AuthPASETO.load_config
def get_config():
    return Auth.PasetoSettings()

# Setup custom error handlers

@app.exception_handler(AuthPASETOException)
async def authpaseto_exception_handler(request: Request, exc: AuthPASETOException):
    return JSONResponse(status_code=exc.status_code, content={"detail": exc.message})

@app.exception_handler(AttributeError)
async def validation_exception_handler(request, exc):
    return JSONResponse(status_code=status.HTTP_422_UNPROCESSABLE_ENTITY, content={
        "error": "Unprocessable Entity"
    })

@app.exception_handler(binascii.Error)
async def invalid_token_exception_handler(request, exc):
    return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content={
        "error": GeneralErrors.Unauthorized().error,
        "message": GeneralErrors.Unauthorized().message
    })

# Routes

@app.get("/", response_class=RedirectResponse, status_code=301)
@app.get("/", response_class=RedirectResponse,
         status_code=status.HTTP_301_MOVED_PERMANENTLY, tags=['Root'])
@limiter.limit(config['slowapi']['limit'])
async def root(request: Request, response: Response) -> RedirectResponse:
    """Brings up API documentation
@@ -85,7 +134,7 @@ async def root(request: Request, response: Response) -> RedirectResponse:
    """
    return RedirectResponse(url="/docs")

@app.get('/tools', response_model=ResponseModels.ToolsResponseModel)
@app.get('/tools', response_model=ResponseModels.ToolsResponseModel, tags=['ReVanced Tools'])
@limiter.limit(config['slowapi']['limit'])
@cache(config['cache']['expire'])
async def tools(request: Request, response: Response) -> dict:
@@ -96,7 +145,7 @@ async def tools(request: Request, response: Response) -> dict:
    """
    return await releases.get_latest_releases(config['app']['repositories'])

@app.get('/patches', response_model=ResponseModels.PatchesResponseModel)
@app.get('/patches', response_model=ResponseModels.PatchesResponseModel, tags=['ReVanced Tools'])
@limiter.limit(config['slowapi']['limit'])
@cache(config['cache']['expire'])
async def patches(request: Request, response: Response) -> dict:
@@ -108,7 +157,7 @@ async def patches(request: Request, response: Response) -> dict:

    return await releases.get_patches_json()

@app.get('/contributors', response_model=ResponseModels.ContributorsResponseModel)
@app.get('/contributors', response_model=ResponseModels.ContributorsResponseModel, tags=['ReVanced Tools'])
@limiter.limit(config['slowapi']['limit'])
@cache(config['cache']['expire'])
async def contributors(request: Request, response: Response) -> dict:
@@ -119,28 +168,319 @@ async def contributors(request: Request, response: Response) -> dict:
    """
    return await releases.get_contributors(config['app']['repositories'])

@app.head('/ping', status_code=204)
@app.post('/client', response_model=ClientModels.ClientModel, status_code=status.HTTP_201_CREATED, tags=['Clients'])
@limiter.limit(config['slowapi']['limit'])
async def ping(request: Request, response: Response) -> None:
    """Check if the API is running.
async def create_client(request: Request, response: Response, admin: bool | None = False, Authorize: AuthPASETO = Depends()) -> ClientModels.ClientModel:
    """Create a new API client.

    Returns:
        None
        json: client information
    """
    return None

    Authorize.paseto_required()

    admin_claim: dict[str, bool] = {"admin": False}

    current_user: str | int | None = Authorize.get_subject()

    if 'admin' in Authorize.get_token_payload():
        admin_claim = {"admin": Authorize.get_token_payload()['admin']}

    if (await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
            admin_claim['admin'] == True):

        client: ClientModels.ClientModel = await clients.generate(admin=admin)
        await clients.store(client)

        return client
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.delete('/client/{client_id}', response_model=ResponseModels.ClientDeletedResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
@limiter.limit(config['slowapi']['limit'])
async def delete_client(request: Request, response: Response, client_id: str, Authorize: AuthPASETO = Depends()) -> dict:
    """Delete an API client.

    Returns:
        json: deletion status
    """

    Authorize.paseto_required()

    admin_claim: dict[str, bool] = {"admin": False}

    current_user: str | int | None = Authorize.get_subject()

    if 'admin' in Authorize.get_token_payload():
        admin_claim = {"admin": Authorize.get_token_payload()['admin']}

    if (await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
            (admin_claim['admin'] == True or
             current_user == client_id)):

        if await clients.exists(client_id):
            return {"id": client_id, "deleted": await clients.delete(client_id)}
        else:
            raise HTTPException(status_code=404, detail={
                "error": GeneralErrors.ClientNotFound().error,
                "message": GeneralErrors.ClientNotFound().message
            })
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.patch('/client/{client_id}/secret', response_model=ResponseModels.ClientSecretUpdatedResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
@limiter.limit(config['slowapi']['limit'])
async def update_client(request: Request, response: Response, client_id: str, Authorize: AuthPASETO = Depends()) -> dict:
    """Update an API client's secret.

    Returns:
        json: client ID and secret
    """

    Authorize.paseto_required()

    admin_claim: dict[str, bool] = {"admin": False}

    current_user: str | int | None = Authorize.get_subject()

    if 'admin' in Authorize.get_token_payload():
        admin_claim = {"admin": Authorize.get_token_payload()['admin']}

    if (await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
            (admin_claim['admin'] == True or
             current_user == client_id)):

        if await clients.exists(client_id):
            new_secret: str = await generators.generate_secret()

            if await clients.update_secret(client_id, new_secret):
                return {"id": client_id, "secret": new_secret}
            else:
                raise HTTPException(status_code=500, detail={
                    "error": GeneralErrors.InternalServerError().error,
                    "message": GeneralErrors.InternalServerError().message
                })
        else:
            raise HTTPException(status_code=404, detail={
                "error": GeneralErrors.ClientNotFound().error,
                "message": GeneralErrors.ClientNotFound().message
            })
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.patch('/client/{client_id}/status', response_model=ResponseModels.ClientStatusResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
async def client_status(request: Request, response: Response, client_id: str, active: bool, Authorize: AuthPASETO = Depends()) -> dict:
    """Activate or deactivate a client

    Returns:
        json: json response containing client ID and activation status
    """

    Authorize.paseto_required()

    admin_claim: dict[str, bool] = {"admin": False}

    current_user: str | int | None = Authorize.get_subject()

    if 'admin' in Authorize.get_token_payload():
        admin_claim = {"admin": Authorize.get_token_payload()['admin']}

    if (await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
            (admin_claim['admin'] == True or
             current_user == client_id)):

        if await clients.exists(client_id):
            if await clients.status(client_id, active):
                return {"id": client_id, "active": active}
            else:
                raise HTTPException(status_code=500, detail={
                    "error": GeneralErrors.InternalServerError().error,
                    "message": GeneralErrors.InternalServerError().message
                })
        else:
            raise HTTPException(status_code=404, detail={
                "error": GeneralErrors.ClientNotFound().error,
                "message": GeneralErrors.ClientNotFound().message
            })
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.post('/announcement', response_model=AnnouncementModels.AnnouncementCreatedResponse,
          status_code=status.HTTP_201_CREATED, tags=['Announcements'])
@limiter.limit(config['slowapi']['limit'])
async def create_announcement(request: Request, response: Response,
                              announcement: AnnouncementModels.AnnouncementCreateModel,
                              Authorize: AuthPASETO = Depends()) -> dict:
    """Create a new announcement.

    Returns:
        json: announcement information
    """
    Authorize.paseto_required()

    if await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()):
        announcement_created: bool = await announcements.store(announcement=announcement,
                                                               author=Authorize.get_subject())

        if announcement_created:
            return {"created": announcement_created}
        else:
            raise HTTPException(status_code=500, detail={
                "error": GeneralErrors.InternalServerError().error,
                "message": GeneralErrors.InternalServerError().message
            })
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.get('/announcement', response_model=AnnouncementModels.AnnouncementModel, tags=['Announcements'])
@limiter.limit(config['slowapi']['limit'])
async def get_announcement(request: Request, response: Response) -> dict:
    """Get an announcement.

    Returns:
        json: announcement information
    """
    if await announcements.exists():
        return await announcements.get()
    else:
        raise HTTPException(status_code=404, detail={
            "error": GeneralErrors.AnnouncementNotFound().error,
            "message": GeneralErrors.AnnouncementNotFound().message
        })

@app.delete('/announcement',
            response_model=AnnouncementModels.AnnouncementDeleted,
            status_code=status.HTTP_200_OK, tags=['Announcements'])
@limiter.limit(config['slowapi']['limit'])
async def delete_announcement(request: Request, response: Response,
                              Authorize: AuthPASETO = Depends()) -> dict:
    """Delete an announcement.

    Returns:
        json: deletion status
    """

    Authorize.paseto_required()

    if await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()):
        if await announcements.exists():
            return {"deleted": await announcements.delete()}
        else:
            raise HTTPException(status_code=404, detail={
                "error": GeneralErrors.AnnouncementNotFound().error,
                "message": GeneralErrors.AnnouncementNotFound().message
            })
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.post('/auth', response_model=ResponseModels.ClientAuthTokenResponse, status_code=status.HTTP_200_OK, tags=['Authentication'])
@limiter.limit(config['slowapi']['limit'])
async def auth(request: Request, response: Response, client: ClientModels.ClientAuthModel, Authorize: AuthPASETO = Depends()) -> dict:
    """Authenticate a client and get an auth token.

    Returns:
        access_token: auth token
        refresh_token: refresh token
    """

    admin_claim: dict[str, bool]

    if await clients.exists(client.id):
        authenticated: bool = await clients.authenticate(client.id, client.secret)

        if not authenticated:
            raise HTTPException(status_code=401, detail={
                "error": GeneralErrors.Unauthorized().error,
                "message": GeneralErrors.Unauthorized().message
            })
        else:
            if await clients.is_admin(client.id):
                admin_claim = {"admin": True}
            else:
                admin_claim = {"admin": False}

            access_token = Authorize.create_access_token(subject=client.id,
                                                         user_claims=admin_claim,
                                                         fresh=True)

            refresh_token = Authorize.create_refresh_token(subject=client.id,
                                                           user_claims=admin_claim)

            return {"access_token": access_token, "refresh_token": refresh_token}
    else:
        raise HTTPException(status_code=401, detail={
            "error": GeneralErrors.Unauthorized().error,
            "message": GeneralErrors.Unauthorized().message
        })

@app.post('/auth/refresh', response_model=ResponseModels.ClientTokenRefreshResponse,
          status_code=status.HTTP_200_OK, tags=['Authentication'])
@limiter.limit(config['slowapi']['limit'])
async def refresh(request: Request, response: Response,
                  Authorize: AuthPASETO = Depends()) -> dict:
    """Refresh an auth token.

    Returns:
        access_token: auth token
    """

    Authorize.paseto_required(refresh_token=True)

    admin_claim: dict[str, bool] = {"admin": False}

    current_user: str | int | None = Authorize.get_subject()

    if 'admin' in Authorize.get_token_payload():
        admin_claim = {"admin": Authorize.get_token_payload()['admin']}

    return {"access_token": Authorize.create_access_token(subject=current_user,
                                                          user_claims=admin_claim,
                                                          fresh=False)}

@app.on_event("startup")
async def startup() -> None:
    redis_url = f"{redis_config['url']}:{redis_config['port']}/{redis_config['database']}"
    redis = aioredis.from_url(redis_url, encoding="utf8", decode_responses=True)
    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
    FastAPICache.init(RedisBackend(RedisConnector.connect(config['cache']['database'])),
                      prefix="fastapi-cache")

    return None

# setup right before running to make sure no other library overwrites it

Logger.setup_logging(LOG_LEVEL=config["logging"]["level"], JSON_LOGS=config["logging"]["json_logs"])

# Run app
if __name__ == '__main__':
    uvicorn.run(app, host=config['uvicorn']['host'], port=config['uvicorn']['port'])
Logger.setup_logging(LOG_LEVEL=config["logging"]["level"],
                     JSON_LOGS=config["logging"]["json_logs"])
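The client endpoints in the main.py diff above repeat one authorization pattern: default the admin claim to false, trust the token payload's `admin` claim when present, then allow the action for admins or for a client acting on itself. A standalone distillation of that logic (the helper names are hypothetical, not part of this commit):

```python
# Distillation of the admin-claim pattern repeated in create_client,
# delete_client, update_client, and client_status above.

def resolve_admin_claim(token_payload: dict) -> dict:
    """Default to non-admin; honor the token's 'admin' claim when present."""
    admin_claim = {"admin": False}
    if 'admin' in token_payload:
        admin_claim = {"admin": token_payload['admin']}
    return admin_claim

def is_authorized(admin_claim: dict, current_user, client_id: str) -> bool:
    """Admins may act on any client; others only on themselves."""
    return bool(admin_claim["admin"]) or current_user == client_id
```

Factoring the pattern out like this would also make the 401/404 branches in each endpoint easier to test in isolation.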
@@ -1,41 +0,0 @@
from pydantic import BaseModel
import modules.models.ResponseFields as ResponseFields

"""Implements pydantic models and model generator for the API's responses."""

class ToolsResponseModel(BaseModel):
    """Implements the JSON response model for the /tools endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    tools: list[ ResponseFields.ToolsResponseFields ]

class PatchesResponseModel(BaseModel):
    """Implements the JSON response model for the /patches endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    __root__: list[ ResponseFields.PatchesResponseFields ]

class ContributorsResponseModel(BaseModel):
    """Implements the JSON response model for the /contributors endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    repositories: list[ ResponseFields.ContributorsResponseFields ]

class PingResponseModel(BaseModel):
    """Implements the JSON response model for the /heartbeat endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    status: int
    detail: str
12 mypy.ini
@@ -56,3 +56,15 @@ ignore_missing_imports = True

[mypy-redis.*]
# No stubs available
ignore_missing_imports = True

[mypy-toolz.*]
# No stubs available
ignore_missing_imports = True

[mypy-fastapi_paseto_auth.*]
# No stubs available
ignore_missing_imports = True

[mypy-aiofiles.*]
# No stubs available
ignore_missing_imports = True
852 poetry.lock generated
File diff suppressed because it is too large
@@ -8,35 +8,27 @@ license = "AGPLv3"

[tool.poetry.dependencies]
python = "^3.10"
fastapi = ">=0.85.0"
uvicorn = {version = ">=0.18.3", extras = ["standard"]}
httpx = {version = ">=0.23.0", extras = ["http2"]}
httpx-cache = ">=0.6.0"
toml = ">=0.10.2"
slowapi = ">=0.1.6"
orjson = ">=3.8.0"
fastapi-cache2 = ">=0.1.9"
aioredis = {version = ">=2.0.1", extras = ["hiredis"]}
redis = ">=4.3.4"
msgpack = ">=1.0.4"
loguru = ">=0.6.0"
sentry-sdk = ">=1.9.8"
argon2-cffi = ">=21.3.0"
hypercorn = {extras = ["uvloop"], version = ">=0.14.3"}
cytoolz = ">=0.12.0"
fastapi-paseto-auth = "^0.6.0"
ujson = ">=5.5.0"
hiredis = ">=2.0.0"
aiofiles = ">=22.1.0"

[tool.poetry.dev-dependencies]
fastapi = ">=0.85.0"
uvicorn = {version = ">=0.18.3", extras = ["standard"]}
httpx = {version = ">=0.23.0", extras = ["http2"]}
httpx-cache = ">=0.6.0"
toml = ">=0.10.2"
slowapi = ">=0.1.6"
orjson = ">=3.8.0"
fastapi-cache2 = ">=0.1.9"
aioredis = {version = ">=2.0.1", extras = ["hiredis"]}
redis = ">=4.3.4"
msgpack = ">=1.0.4"
mypy = ">=0.971"
types-toml = ">=0.10.8"
loguru = ">=0.6.0"
sentry-sdk = ">=1.9.8"
types-redis = ">=4.3.21.1"

[build-system]
requires = ["poetry-core>=1.0.0"]
@@ -1,50 +1,61 @@
aioredis==2.0.1; python_version >= "3.6"
aiofiles==22.1.0; python_version >= "3.7" and python_version < "4.0"
aiorwlock==1.3.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.7.0"
anyio==3.6.1; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
argon2-cffi-bindings==21.2.0; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
argon2-cffi==21.3.0; python_version >= "3.6"
async-timeout==4.0.2; python_version >= "3.6"
attrs==21.4.0; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
certifi==2022.9.24; python_version >= "3.7" and python_version < "4.0"
cffi==1.15.1; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
click==8.1.3; python_version >= "3.7" and python_version < "4.0"
colorama==0.4.5; python_version >= "3.7" and python_full_version < "3.0.0" and sys_platform == "win32" and python_version < "4.0" and platform_system == "Windows" or sys_platform == "win32" and python_version >= "3.7" and python_full_version >= "3.5.0" and python_version < "4.0" and platform_system == "Windows"
cryptography==37.0.4; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
cytoolz==0.12.0; python_version >= "3.5"
deprecated==1.2.13; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
fastapi-cache2==0.1.9; python_version >= "3.7" and python_version < "4.0"
fastapi-paseto-auth==0.6.0; python_version >= "3.10"
fastapi==0.85.0; python_version >= "3.7"
fasteners==0.17.3; python_version >= "3.7" and python_version < "4.0"
h11==0.12.0; python_version >= "3.7" and python_version < "4.0"
h11==0.12.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.7.0"
h2==4.1.0; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
hiredis==2.0.0; implementation_name == "cpython" and python_version >= "3.6"
hpack==4.0.0; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
hiredis==2.0.0; python_version >= "3.6"
hpack==4.0.0; python_version >= "3.7" and python_full_version >= "3.6.1"
httpcore==0.15.0; python_version >= "3.7" and python_version < "4.0"
httptools==0.5.0; python_version >= "3.7" and python_full_version >= "3.5.0" and python_version < "4.0"
httpx-cache==0.6.0; python_version >= "3.7" and python_version < "4.0"
httpx==0.23.0; python_version >= "3.7"
hyperframe==6.0.1; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
idna==3.4
hypercorn==0.14.3; python_version >= "3.7"
hyperframe==6.0.1; python_version >= "3.7" and python_full_version >= "3.6.1"
idna==3.4; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
iso8601==1.1.0; python_full_version >= "3.6.2" and python_version < "4.0" and python_full_version < "4.0.0" and python_version >= "3.10"
limits==1.6; python_version >= "3.7" and python_version < "4.0"
loguru==0.6.0; python_version >= "3.5"
msgpack==1.0.4
msgpack==1.0.4; python_version >= "3.7" and python_version < "4.0"
orjson==3.8.0; python_version >= "3.7"
packaging==21.3; python_version >= "3.6"
passlib==1.7.4; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
pendulum==2.1.2; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
pydantic==1.10.2; python_version >= "3.7" and python_version < "4.0"
priority==2.0.0; python_full_version >= "3.6.1" and python_version >= "3.7"
pycparser==2.21; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
pycryptodomex==3.15.0; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
pydantic==1.10.2; python_version >= "3.10" and python_version < "4.0"
pyparsing==3.0.9; python_full_version >= "3.6.8" and python_version >= "3.6"
pyseto==1.6.10; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
python-dateutil==2.8.2; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
python-dotenv==0.21.0; python_version >= "3.7" and python_version < "4.0"
pytzdata==2020.1; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
pyyaml==6.0; python_version >= "3.7" and python_version < "4.0"
redis==4.3.4; python_version >= "3.6"
rfc3986==1.5.0; python_version >= "3.7" and python_version < "4.0"
sentry-sdk==1.9.9
sentry-sdk==1.9.10
six==1.16.0; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
slowapi==0.1.6; python_version >= "3.7" and python_version < "4.0"
sniffio==1.3.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
starlette==0.20.4; python_version >= "3.7" and python_version < "4.0"
starlette==0.20.4; python_version >= "3.10" and python_version < "4.0"
toml==0.10.2; (python_version >= "2.6" and python_full_version < "3.0.0") or (python_full_version >= "3.3.0")
typing-extensions==4.3.0; python_version >= "3.7" and python_version < "4.0"
toolz==0.12.0; python_version >= "3.5"
typing-extensions==4.4.0; python_version >= "3.10"
ujson==5.5.0; python_version >= "3.7"
urllib3==1.26.12; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version < "4" and python_version >= "3.6"
uvicorn==0.18.3; python_version >= "3.7"
uvloop==0.17.0; sys_platform != "win32" and sys_platform != "cygwin" and platform_python_implementation != "PyPy" and python_version >= "3.7" and python_version < "4.0"
watchfiles==0.17.0; python_version >= "3.7" and python_version < "4.0"
websockets==10.3; python_version >= "3.7" and python_version < "4.0"
uvicorn==0.18.3; python_version >= "3.7" and python_version < "4.0"
uvloop==0.17.0; platform_system != "Windows" and python_version >= "3.7"
win32-setctime==1.1.0; sys_platform == "win32" and python_version >= "3.5"
wrapt==1.14.1; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6"
wsproto==1.2.0; python_full_version >= "3.7.0" and python_version >= "3.7"
run.sh (6 lines changed)
@@ -7,6 +7,6 @@
CORES=$(grep -c ^processor /proc/cpuinfo)

# Start the application
uvicorn main:app --host="$UVICORN_HOST" --port="$UVICORN_PORT" \
    --workers="$CORES" --log-level="$UVICORN_LOG_LEVEL" --server-header \
    --proxy-headers --forwarded-allow-ips="*"
hypercorn main:app --bind="${HYPERCORN_HOST}:${HYPERCORN_PORT}" \
    --workers="$CORES" --log-level="$HYPERCORN_LOG_LEVEL" \
    --worker-class uvloop
src/controllers/Announcements.py (new file, 102 lines)
@@ -0,0 +1,102 @@
import toml
from redis import asyncio as aioredis

import src.utils.Logger as Logger
from src.utils.Generators import Generators
from src.models.AnnouncementModels import AnnouncementCreateModel
from src.utils.RedisConnector import RedisConnector

config: dict = toml.load("config.toml")

class Announcements:
    """Implements the announcements class for the ReVanced API"""

    redis = RedisConnector.connect(config['announcements']['database'])

    AnnouncementsLogger = Logger.AnnouncementsLogger()

    generators = Generators()

    async def store(self, announcement: AnnouncementCreateModel, author: str) -> bool:
        """Store an announcement in the database

        Args:
            announcement (AnnouncementCreateModel): Pydantic model of the announcement
            author (str): Author of the announcement

        Returns:
            bool: True if the announcement was stored successfully
        """

        announcement_id: str = "announcement"

        timestamp = await self.generators.generate_timestamp()

        announcement_payload: dict[str, str | int] = {}

        announcement_payload['created_at'] = timestamp
        announcement_payload['author'] = author
        announcement_payload['type'] = announcement.type
        announcement_payload['title'] = announcement.title
        announcement_payload['content'] = announcement.content

        try:
            await self.redis.json().set(announcement_id, '$', announcement_payload)
            await self.AnnouncementsLogger.log("SET", None, announcement_id)
        except aioredis.RedisError as e:
            await self.AnnouncementsLogger.log("SET", e)
            raise e

        return True

    async def exists(self) -> bool:
        """Check if an announcement exists in the database

        Returns:
            bool: True if the announcement exists, False otherwise
        """
        try:
            if await self.redis.exists("announcement"):
                await self.AnnouncementsLogger.log("EXISTS", None, "announcement")
                return True
            else:
                await self.AnnouncementsLogger.log("EXISTS", None, "announcement")
                return False
        except aioredis.RedisError as e:
            await self.AnnouncementsLogger.log("EXISTS", e)
            raise e

    async def get(self) -> dict:
        """Get an announcement from the database

        Returns:
            dict: Dict of the announcement or an empty dict if the announcement doesn't exist
        """

        if await self.exists():
            try:
                announcement: dict[str, str | int] = await self.redis.json().get("announcement")
                await self.AnnouncementsLogger.log("GET", None, "announcement")
            except aioredis.RedisError as e:
                await self.AnnouncementsLogger.log("GET", e)
                return {}
            return announcement
        else:
            return {}

    async def delete(self) -> bool:
        """Delete an announcement from the database

        Returns:
            bool: True if the announcement was deleted successfully, False otherwise
        """

        if await self.exists():
            try:
                await self.redis.delete("announcement")
                await self.AnnouncementsLogger.log("DELETE", None, "announcement")
            except aioredis.RedisError as e:
                await self.AnnouncementsLogger.log("DELETE", e)
                return False
            return True
        else:
            return False
src/controllers/Auth.py (new file, 7 lines)
@@ -0,0 +1,7 @@
import os
from pydantic import BaseModel

class PasetoSettings(BaseModel):
    authpaseto_secret_key: str = os.environ['SECRET_KEY']
    authpaseto_access_token_expires: int = 86400
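PasetoSettings resolves the signing key from the environment at import time, so a missing `SECRET_KEY` fails fast. A minimal stdlib sketch of the same pattern (the function name and `TOKEN_TTL` fallback are hypothetical, not part of the API):

```python
# Hypothetical stand-in for PasetoSettings: pull the signing key from an
# environment mapping and default the token lifetime to 24 hours (86400 s),
# mirroring the pydantic model's field defaults.
def load_paseto_settings(env: dict) -> dict:
    return {
        "authpaseto_secret_key": env["SECRET_KEY"],  # raises KeyError if unset
        "authpaseto_access_token_expires": int(env.get("TOKEN_TTL", "86400")),
    }

settings = load_paseto_settings({"SECRET_KEY": "dev-only-key"})
```

Passing a plain dict instead of `os.environ` keeps the sketch testable without touching the process environment.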
src/controllers/Clients.py (new file, 351 lines)
@@ -0,0 +1,351 @@
from time import sleep
import toml
import orjson
from typing import Optional
import argon2
from redis import asyncio as aioredis
import aiofiles
import uvloop

import src.utils.Logger as Logger
from src.utils.Generators import Generators
from src.models.ClientModels import ClientModel
from src.utils.RedisConnector import RedisConnector

config: dict = toml.load("config.toml")

class Clients:

    """Implements a client for ReVanced Releases API."""

    uvloop.install()

    redis = RedisConnector.connect(config['clients']['database'])
    redis_tokens = RedisConnector.connect(config['tokens']['database'])

    UserLogger = Logger.UserLogger()

    generators = Generators()

    async def generate(self, admin: Optional[bool] = False) -> ClientModel:
        """Generate a new client

        Args:
            admin (Optional[bool], optional): Defines if the client should have admin access. Defaults to False.

        Returns:
            ClientModel: Pydantic model of the client
        """

        client_id: str = await self.generators.generate_id()
        client_secret: str = await self.generators.generate_secret()

        client = ClientModel(id=client_id, secret=client_secret, admin=admin, active=True)

        return client

    async def store(self, client: ClientModel) -> bool:
        """Store a client in the database

        Args:
            client (ClientModel): Pydantic model of the client

        Returns:
            bool: True if the client was stored successfully, False otherwise
        """

        client_payload: dict[str, str | bool] = {}
        ph: argon2.PasswordHasher = argon2.PasswordHasher()

        client_payload['secret'] = ph.hash(client.secret)
        client_payload['admin'] = client.admin
        client_payload['active'] = client.active

        try:
            await self.redis.json().set(client.id, '$', client_payload)
            await self.UserLogger.log("SET", None, client.id)
        except aioredis.RedisError as e:
            await self.UserLogger.log("SET", e)
            raise e

        return True

    async def exists(self, client_id: str) -> bool:
        """Check if a client exists in the database

        Args:
            client_id (str): UUID of the client

        Returns:
            bool: True if the client exists, False otherwise
        """
        try:
            if await self.redis.exists(client_id):
                await self.UserLogger.log("EXISTS", None, client_id)
                return True
            else:
                await self.UserLogger.log("EXISTS", None, client_id)
                return False
        except aioredis.RedisError as e:
            await self.UserLogger.log("EXISTS", e)
            raise e

    async def get(self, client_id: str) -> ClientModel | bool:
        """Get a client from the database

        Args:
            client_id (str): UUID of the client

        Returns:
            ClientModel | bool: Pydantic model of the client or False if the client doesn't exist
        """

        if await self.exists(client_id):
            try:
                client_payload: dict[str, str | bool] = await self.redis.json().get(client_id)
                client = ClientModel(id=client_id, secret=client_payload['secret'], admin=client_payload['admin'], active=True)
                await self.UserLogger.log("GET", None, client_id)
            except aioredis.RedisError as e:
                await self.UserLogger.log("GET", e)
                raise e
            return client
        else:
            return False

    async def delete(self, client_id: str) -> bool:
        """Delete a client from the database

        Args:
            client_id (str): UUID of the client

        Returns:
            bool: True if the client was deleted successfully, False otherwise
        """

        if await self.exists(client_id):
            try:
                await self.redis.delete(client_id)
                await self.UserLogger.log("DELETE", None, client_id)
            except aioredis.RedisError as e:
                await self.UserLogger.log("DELETE", e)
                raise e
            return True
        else:
            return False

    async def update_secret(self, client_id: str, new_secret: str) -> bool:
        """Update the secret of a client

        Args:
            client_id (str): UUID of the client
            new_secret (str): New secret of the client

        Returns:
            bool: True if the secret was updated successfully, False otherwise
        """

        ph: argon2.PasswordHasher = argon2.PasswordHasher()

        updated: bool = False

        try:
            await self.redis.json().set(client_id, '.secret', ph.hash(new_secret))
            await self.UserLogger.log("UPDATE_SECRET", None, client_id)
            updated = True
        except aioredis.RedisError as e:
            await self.UserLogger.log("UPDATE_SECRET", e)
            raise e

        return updated

    async def authenticate(self, client_id: str, secret: str) -> bool:
        """Check if the secret of a client is correct

        Args:
            client_id (str): UUID of the client
            secret (str): Secret of the client

        Returns:
            bool: True if the secret is correct, False otherwise
        """

        ph: argon2.PasswordHasher = argon2.PasswordHasher()
        authenticated: bool = False
        client_secret: str = await self.redis.json().get(client_id, '.secret')

        try:
            if ph.verify(client_secret, secret):
                await self.UserLogger.log("CHECK_SECRET", None, client_id)

                if ph.check_needs_rehash(client_secret):
                    await self.redis.json().set(client_id, '.secret', ph.hash(secret))
                    await self.UserLogger.log("REHASH SECRET", None, client_id)
                authenticated = True
        except argon2.exceptions.VerifyMismatchError as e:
            await self.UserLogger.log("CHECK_SECRET", e)
            return authenticated

        return authenticated

    async def is_admin(self, client_id: str) -> bool:
        """Check if a client has admin access

        Args:
            client_id (str): UUID of the client

        Returns:
            bool: True if the client has admin access, False otherwise
        """

        client_admin: bool = False

        try:
            client_admin = await self.redis.json().get(client_id, '.admin')
            await self.UserLogger.log("CHECK_ADMIN", None, client_id)
        except aioredis.RedisError as e:
            await self.UserLogger.log("CHECK_ADMIN", e)
            raise e

        return client_admin

    async def is_active(self, client_id: str) -> bool:
        """Check if a client is active

        Args:
            client_id (str): UUID of the client

        Returns:
            bool: True if the client is active, False otherwise
        """

        client_active: bool = False

        try:
            client_active = await self.redis.json().get(client_id, '.active')
            await self.UserLogger.log("CHECK_ACTIVE", None, client_id)
        except aioredis.RedisError as e:
            await self.UserLogger.log("CHECK_ACTIVE", e)
            raise e

        return client_active

    async def status(self, client_id: str, active: bool) -> bool:
        """Change a client's active status

        Args:
            client_id (str): UUID of the client
            active (bool): True to activate the client, False to deactivate it

        Returns:
            bool: True if the client status was changed successfully, False otherwise
        """

        changed: bool = False

        try:
            await self.redis.json().set(client_id, '.active', active)
            await self.UserLogger.log("ACTIVATE", None, client_id)
            changed = True
        except aioredis.RedisError as e:
            await self.UserLogger.log("ACTIVATE", e)
            raise e

        return changed

    async def ban_token(self, token: str) -> bool:
        """Ban a token

        Args:
            token (str): Token to ban

        Returns:
            bool: True if the token was banned successfully, False otherwise
        """

        banned: bool = False

        try:
            await self.redis_tokens.set(token, '')
            await self.UserLogger.log("BAN_TOKEN", None, token)
            banned = True
        except aioredis.RedisError as e:
            await self.UserLogger.log("BAN_TOKEN", e)
            raise e

        return banned

    async def is_token_banned(self, token: str) -> bool:
        """Check if a token is banned

        Args:
            token (str): Token to check

        Returns:
            bool: True if the token is banned, False otherwise
        """

        banned: bool = True

        try:
            banned = await self.redis_tokens.exists(token)
            await self.UserLogger.log("CHECK_TOKEN", None, token)
        except aioredis.RedisError as e:
            await self.UserLogger.log("CHECK_TOKEN", e)
            raise e

        return banned

    async def auth_checks(self, client_id: str, token: str) -> bool:
        """Check if a client exists, is active and the token isn't banned

        Args:
            client_id (str): UUID of the client
            token (str): Token of the client

        Returns:
            bool: True if the client exists, is active
            and the token isn't banned, False otherwise
        """

        if await self.exists(client_id):
            if await self.is_active(client_id):
                if not await self.is_token_banned(token):
                    return True
                else:
                    return False
            else:
                if not await self.is_token_banned(token):
                    await self.ban_token(token)
                return False
        else:
            await self.ban_token(token)
            return False

        return False

    async def setup_admin(self) -> bool:
        """Create the admin user if it doesn't exist

        Returns:
            bool: True if the admin user was created successfully, False otherwise
        """
        created: bool = False

        if not await self.exists('admin'):
            admin_info: ClientModel = await self.generate()
            admin_info.id = 'admin'
            admin_info.admin = True
            try:
                await self.store(admin_info)
                await self.UserLogger.log("CREATE_ADMIN | ID |", None, admin_info.id)
                await self.UserLogger.log("CREATE_ADMIN | SECRET |", None, admin_info.secret)
                async with aiofiles.open("admin_info.json", "wb") as file:
                    await file.write(orjson.dumps(vars(admin_info)))
                await self.UserLogger.log("CREATE_ADMIN | TO FILE", None, "admin_info.json")
                created = True
            except aioredis.RedisError as e:
                await self.UserLogger.log("CREATE_ADMIN", e)
                raise e

        return created
@@ -1,33 +1,23 @@
import os
from toolz.dicttoolz import keyfilter
import asyncio
import uvloop
import orjson
import httpx_cache
from base64 import b64decode
from modules.utils.InternalCache import InternalCache
import modules.utils.Logger as Logger
from src.utils.HTTPXClient import HTTPXClient
from src.utils.InternalCache import InternalCache


class Releases:

    """Implements the methods required to get the latest releases and patches from revanced repositories."""

    headers = {'Accept': "application/vnd.github+json",
               'Authorization': "token " + os.environ['GITHUB_TOKEN']
               }
    uvloop.install()

    httpx_logger = Logger.HTTPXLogger()

    httpx_client = httpx_cache.AsyncClient(
        headers=headers,
        http2=True,
        event_hooks={
            'request': [httpx_logger.log_request],
            'response': [httpx_logger.log_response]
        }
    )
    httpx_client = HTTPXClient.create()

    InternalCache = InternalCache()

    async def _get_release(self, repository: str) -> list:
    async def __get_release(self, repository: str) -> list:
        # Get assets from latest release in a given repository.
        #
        # Args:
@@ -83,16 +73,17 @@ class Releases:
        releases = {}
        releases['tools'] = []

        for repository in repositories:
            files = await self._get_release(repository)
            if files:
                for file in files:
                    releases['tools'].append(file)
        results: list = await asyncio.gather(*[self.__get_release(repository) for repository in repositories])

        for result in results:
            for asset in result:
                releases['tools'].append(asset)

        await self.InternalCache.store('releases', releases)

        return releases

    async def _get_patches_json(self) -> dict:
    async def __get_patches_json(self) -> dict:
        # Get revanced-patches repository's README.md.
        #
        # Returns:
@@ -113,12 +104,12 @@ class Releases:
        if await self.InternalCache.exists('patches'):
            patches = await self.InternalCache.get('patches')
        else:
            patches = await self._get_patches_json()
            patches = await self.__get_patches_json()
            await self.InternalCache.store('patches', patches)

        return patches

    async def _get_contributors(self, repository: str) -> list:
    async def __get_contributors(self, repository: str) -> list:
        # Get contributors from a given repository.
        #
        # Args:
@@ -127,9 +118,14 @@ class Releases:
        # Returns:
        #     list: a list of dictionaries containing the repository's contributors

        keep: set = {'login', 'avatar_url', 'html_url'}

        response = await self.httpx_client.get(f"https://api.github.com/repos/{repository}/contributors")

        return response.json()
        contributors = [keyfilter(lambda k: k in keep, contributor) for contributor in response.json()]

        return contributors

    async def get_contributors(self, repositories: list) -> dict:
        """Runs get_contributors() asynchronously for each repository.
@@ -148,11 +144,15 @@ class Releases:
        else:
            contributors = {}
            contributors['repositories'] = []
            for repository in repositories:
                if 'revanced' in repository:
                    repo_contributors = await self._get_contributors(repository)
                    data = { 'name': repository, 'contributors': repo_contributors }
                    contributors['repositories'].append(data)

            revanced_repositories = [repository for repository in repositories if 'revanced' in repository]

            results: list[dict] = await asyncio.gather(*[self.__get_contributors(repository) for repository in revanced_repositories])

            for key, value in zip(revanced_repositories, results):
                data = { 'name': key, 'contributors': value }
                contributors['repositories'].append(data)

            await self.InternalCache.store('contributors', contributors)

        return contributors
src/models/AnnouncementModels.py (new file, 46 lines)
@@ -0,0 +1,46 @@
from pydantic import BaseModel
from typing import Literal

AnnouncementType = Literal["info", "warning", "error"]

class AnnouncementModel(BaseModel):
    """Implements the fields for the announcements.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    created_at: int
    author: str
    type: AnnouncementType
    title: str
    content: str

class AnnouncementCreateModel(BaseModel):
    """Implements the fields for creating an announcement.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    type: AnnouncementType
    title: str
    content: str

class AnnouncementCreatedResponse(BaseModel):
    """Implements the response fields for created announcements.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    created: bool

class AnnouncementDeleted(BaseModel):
    """Implements the response fields for deleted announcements.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    deleted: bool
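`AnnouncementType` constrains the `type` field to three literal strings, so pydantic rejects anything else at validation time. A stdlib sketch of the same membership check using `typing.get_args` (independent of pydantic; the helper name is hypothetical):

```python
from typing import Literal, get_args

# Same literal type as AnnouncementModels.py.
AnnouncementType = Literal["info", "warning", "error"]

# Hypothetical helper: the membership test pydantic effectively performs
# when validating the `type` field of AnnouncementCreateModel.
def is_valid_type(value: str) -> bool:
    return value in get_args(AnnouncementType)
```

Because the allowed values live in one `Literal` alias, both models stay in sync automatically when a new announcement type is added.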
src/models/ClientModels.py (new file, 24 lines)
@@ -0,0 +1,24 @@
from pydantic import BaseModel

class ClientModel(BaseModel):
    """Implements the fields for the clients.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    id: str
    secret: str
    admin: bool
    active: bool

class ClientAuthModel(BaseModel):
    """Implements the fields for client authentication.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    id: str
    secret: str
src/models/GeneralErrors.py (new file, 51 lines)
@@ -0,0 +1,51 @@
from pydantic import BaseModel

class InternalServerError(BaseModel):
    """Implements the response fields for when an internal server error occurs.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    error: str = "Internal Server Error"
    message: str = "An internal server error occurred. Please try again later."

class AnnouncementNotFound(BaseModel):
    """Implements the response fields for when an item is not found.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    error: str = "Not Found"
    message: str = "No announcement was found."

class ClientNotFound(BaseModel):
    """Implements the response fields for when a client is not found.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    error: str = "Not Found"
    message: str = "No client matches the given ID"

class IdNotProvided(BaseModel):
    """Implements the response fields for when the id is not provided.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    error: str = "Bad Request"
    message: str = "Missing client id"

class Unauthorized(BaseModel):
    """Implements the response fields for when the client is unauthorized.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    error: str = "Unauthorized"
    message: str = "The client is unauthorized to access this resource"
@@ -52,24 +52,8 @@ class ContributorFields(BaseModel):
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """
    login: str
    id: str
    node_id: str
    avatar_url: str
    gravatar_id: str
    url: str
    html_url: str
    followers_url: str
    following_url: str
    gists_url: str
    starred_url: str
    subscriptions_url: str
    organizations_url: str
    repos_url: str
    events_url: str
    received_events_url: str
    type: str
    site_admin: str
    contributions: int

class ContributorsResponseFields(BaseModel):
    """Implements the fields for each repository in the /contributors endpoint
src/models/ResponseModels.py (new file, 90 lines)
@@ -0,0 +1,90 @@
from pydantic import BaseModel
import src.models.ResponseFields as ResponseFields

"""Implements pydantic models and model generator for the API's responses."""


class ToolsResponseModel(BaseModel):
    """Implements the JSON response model for the /tools endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    tools: list[ResponseFields.ToolsResponseFields]


class PatchesResponseModel(BaseModel):
    """Implements the JSON response model for the /patches endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    __root__: list[ResponseFields.PatchesResponseFields]


class ContributorsResponseModel(BaseModel):
    """Implements the JSON response model for the /contributors endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    repositories: list[ResponseFields.ContributorsResponseFields]


class PingResponseModel(BaseModel):
    """Implements the JSON response model for the /heartbeat endpoint.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    status: int
    detail: str


class ClientDeletedResponse(BaseModel):
    """Implements the response fields for deleted clients.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    id: str
    deleted: bool


class ClientSecretUpdatedResponse(BaseModel):
    """Implements the response fields for updated client secrets.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    id: str
    secret: str


class ClientAuthTokenResponse(BaseModel):
    """Implements the response fields for client auth tokens.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    access_token: str
    refresh_token: str


class ClientTokenRefreshResponse(BaseModel):
    """Implements the response fields for client token refresh.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    access_token: str


class ClientStatusResponse(BaseModel):
    """Implements the response fields for client status.

    Args:
        BaseModel (pydantic.BaseModel): BaseModel from pydantic
    """

    id: str
    active: bool
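Taken together, these models describe the JSON bodies the API returns. As a minimal illustration (assuming pydantic is installed; two model bodies are copied inline so the sketch runs without the project's package layout, and the `id` value is purely hypothetical):

```python
from pydantic import BaseModel

# Inline stand-ins mirroring two of the response models above
class PingResponseModel(BaseModel):
    status: int
    detail: str

class ClientStatusResponse(BaseModel):
    id: str
    active: bool

# pydantic validates and types the plain values handed to it
ping = PingResponseModel(status=200, detail="API is healthy")
client = ClientStatusResponse(id="example-client-id", active=True)

assert ping.status == 200
assert client.active is True
```

FastAPI can take such models as `response_model=` arguments, which is how they end up documenting and validating each endpoint's output.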
src/utils/Generators.py  (new file, 30 lines)
@@ -0,0 +1,30 @@
import time
import uuid
import secrets


class Generators:
    """Generates UUIDs and secrets"""

    async def generate_secret(self) -> str:
        """Generate a random secret

        Returns:
            str: A random secret
        """
        return secrets.token_urlsafe(32)

    async def generate_id(self) -> str:
        """Generate a random UUID

        Returns:
            str: A random UUID (str instead of UUID object)
        """
        return str(uuid.uuid4())

    async def generate_timestamp(self) -> int:
        """Generate a timestamp

        Returns:
            int: A timestamp
        """
        return int(time.time())
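Since these helpers are coroutines, they must be awaited; a quick usage sketch (the class body is reproduced inline so the snippet runs standalone):

```python
import asyncio
import secrets
import time
import uuid

class Generators:
    """Generates UUIDs and secrets (inlined from the file above)."""

    async def generate_secret(self) -> str:
        return secrets.token_urlsafe(32)

    async def generate_id(self) -> str:
        return str(uuid.uuid4())

    async def generate_timestamp(self) -> int:
        return int(time.time())

async def main() -> tuple[str, str, int]:
    gen = Generators()
    return (await gen.generate_secret(),
            await gen.generate_id(),
            await gen.generate_timestamp())

secret, client_id, ts = asyncio.run(main())

# token_urlsafe(32) encodes 32 random bytes as a 43-character URL-safe string;
# a stringified uuid4 is always 36 characters
assert len(secret) == 43
assert len(client_id) == 36
assert ts > 0
```

The methods are `async` even though nothing inside them awaits, presumably so they slot uniformly into the API's async call chain.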
src/utils/HTTPXClient.py  (new file, 32 lines)
@@ -0,0 +1,32 @@
import os
import httpx_cache
import src.utils.Logger as Logger


class HTTPXClient:
    """Implements the methods required to get the latest releases and patches from revanced repositories."""

    @staticmethod
    def create() -> httpx_cache.AsyncClient:
        """Create HTTPX client with cache

        Returns:
            httpx_cache.AsyncClient: HTTPX client with cache
        """

        headers = {
            'Accept': "application/vnd.github+json",
            'Authorization': "token " + os.environ['GITHUB_TOKEN'],
        }

        httpx_logger = Logger.HTTPXLogger()

        httpx_client = httpx_cache.AsyncClient(
            headers=headers,
            http2=True,
            event_hooks={
                'request': [httpx_logger.log_request],
                'response': [httpx_logger.log_response],
            }
        )

        return httpx_client
@@ -1,55 +1,57 @@
 import os
 import toml
 import orjson
 import msgpack
-import aioredis
+from typing import Any
+from redis import asyncio as aioredis

-import modules.utils.Logger as Logger
-
-# Load config
+import src.utils.Logger as Logger
+from src.utils.RedisConnector import RedisConnector

 config: dict = toml.load("config.toml")

-# Redis connection parameters
-
-redis_config: dict[str, str | int] = {
-    "url": f"redis://{os.environ['REDIS_URL']}",
-    "port": os.environ['REDIS_PORT'],
-    "database": config['internal-cache']['database'],
-}

 class InternalCache:
     """Implements an internal cache for ReVanced Releases API."""

-    redis_url = f"{redis_config['url']}:{redis_config['port']}/{redis_config['database']}"
-    redis = aioredis.from_url(redis_url, encoding="utf-8", decode_responses=True)
+    redis = RedisConnector.connect(config['internal-cache']['database'])

     InternalCacheLogger = Logger.InternalCacheLogger()

     async def store(self, key: str, value: dict) -> None:
         """Stores a key-value pair in the cache.

         Args:
             key (str): the key to store
             value (dict): the JSON value to store
         """
         try:
-            await self.redis.set(key, orjson.dumps(value), ex=config['internal-cache']['expire'])
+            await self.redis.json().set(key, '$', value)
+            await self.redis.expire(key, config['internal-cache']['expire'])
             await self.InternalCacheLogger.log("SET", None, key)
         except aioredis.RedisError as e:
             await self.InternalCacheLogger.log("SET", e)

     async def delete(self, key: str) -> None:
         """Removes a key-value pair from the cache.

         Args:
             key (str): the key to delete
         """
         try:
             await self.redis.delete(key)
             await self.InternalCacheLogger.log("DEL", None, key)
         except aioredis.RedisError as e:
             await self.InternalCacheLogger.log("DEL", e)

     async def update(self, key: str, value: dict) -> None:
         try:
             await self.redis.set(key, orjson.dumps(value), ex=config['internal-cache']['expire'])
             await self.InternalCacheLogger.log("SET", None, key)
         except aioredis.RedisError as e:
             await self.InternalCacheLogger.log("SET", e)

     async def get(self, key: str) -> dict:
         """Gets a key-value pair from the cache.

         Args:
             key (str): the key to retrieve

         Returns:
             dict: the JSON value stored in the cache or an empty dict if key doesn't exist or an error occurred
         """
         try:
-            payload = orjson.loads(await self.redis.get(key))
+            payload: dict[Any, Any] = await self.redis.json().get(key)
             await self.InternalCacheLogger.log("GET", None, key)
             return payload
         except aioredis.RedisError as e:
@@ -57,6 +59,14 @@ class InternalCache:
             return {}

+    async def exists(self, key: str) -> bool:
+        """Checks if a key exists in the cache.
+
+        Args:
+            key (str): key to check
+
+        Returns:
+            bool: True if key exists, False if key doesn't exist or an error occurred
+        """
+        try:
+            if await self.redis.exists(key):
+                await self.InternalCacheLogger.log("EXISTS", None, key)
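The cache contract above is simple: store a JSON value under a key with a TTL, report an empty dict on a miss, and expose an existence check. That contract can be mimicked with a plain in-memory dict, which is handy for unit tests; this is an illustrative stand-in, not the project's code:

```python
import time
from typing import Any

class InMemoryCache:
    """Dict-backed stand-in for InternalCache: JSON value plus a TTL per key."""

    def __init__(self) -> None:
        self._data: dict[str, tuple[Any, float]] = {}

    def store(self, key: str, value: dict, expire: int = 60) -> None:
        self._data[key] = (value, time.monotonic() + expire)

    def exists(self, key: str) -> bool:
        entry = self._data.get(key)
        if entry is None or time.monotonic() >= entry[1]:
            self._data.pop(key, None)  # drop expired entries lazily
            return False
        return True

    def get(self, key: str) -> dict:
        # Mirror InternalCache.get: empty dict when missing or expired
        return self._data[key][0] if self.exists(key) else {}

cache = InMemoryCache()
cache.store("tools", {"tools": []}, expire=60)
assert cache.exists("tools")
assert cache.get("tools") == {"tools": []}
assert cache.get("missing") == {}
```

The real class differs mainly in being async, delegating storage to RedisJSON, and logging every operation.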
@@ -4,6 +4,7 @@ from loguru import logger
 from typing import Optional
 from types import FrameType
 from redis import RedisError
+from argon2.exceptions import VerifyMismatchError

 class InterceptHandler(logging.Handler):
     """Sets up a logging handler for uvicorn and FastAPI.
@@ -24,15 +25,22 @@ class InterceptHandler(logging.Handler):
         depth: int

         # Get corresponding Loguru level if it exists
         # If not, use default level
         try:
             level = logger.level(record.levelname).name
         except ValueError:
             level = record.levelno

-        # Find caller from where originated the logged message
-        # Set depth to 2 to avoid logging of loguru internal calls
         frame = logging.currentframe()
         depth = 2

+        # Find caller from where originated the logged message
+        # The logging module uses a stack frame to keep track of where logging messages originate
+        # This stack frame is used to find the correct place in the code where the logging message was generated
+        # The mypy error is ignored because the logging module is not properly typed
         while frame.f_code.co_filename == logging.__file__:
             frame = frame.f_back
             depth += 1
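The frame walk above is what lets a bridging handler report the real call site instead of a line inside the logging module. A self-contained sketch of the same loop, using a plain capturing handler instead of loguru (the `CaptureHandler` class is illustrative, not part of the project):

```python
import inspect
import logging

captured: dict[str, object] = {}

class CaptureHandler(logging.Handler):
    def emit(self, record: logging.LogRecord) -> None:
        # Start one frame up (inside the logging machinery) and walk outward
        # until leaving logging's own file, like the InterceptHandler loop above
        frame = inspect.currentframe().f_back
        depth = 2
        while frame is not None and frame.f_code.co_filename == logging.__file__:
            frame = frame.f_back
            depth += 1
        captured["filename"] = frame.f_code.co_filename if frame else None
        captured["depth"] = depth

log = logging.getLogger("frame-demo")
log.addHandler(CaptureHandler())
log.warning("where did this come from?")

# The discovered frame belongs to the caller's module, not logging's internals
assert captured["filename"] != logging.__file__
assert captured["depth"] > 2
```

Each iteration increments `depth` so loguru's `logger.opt(depth=depth)` can attribute the record to the original caller.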
@@ -74,6 +82,33 @@ class InternalCacheLogger:
         else:
             logger.info(f"[InternalCache] REDIS {operation} {key} - OK")

+class UserLogger:
+    async def log(self, operation: str, result: RedisError | VerifyMismatchError | None = None,
+                  key: str = "") -> None:
+        """Logs user operations
+
+        Args:
+            operation (str): Operation name
+            key (str): Key used in the operation
+        """
+        if type(result) is RedisError:
+            logger.error(f"[User] REDIS {operation} - Failed with error: {result}")
+        else:
+            logger.info(f"[User] REDIS {operation} {key} - OK")
+
+class AnnouncementsLogger:
+    async def log(self, operation: str, result: RedisError | None = None, key: str = "") -> None:
+        """Logs announcement operations
+
+        Args:
+            operation (str): Operation name
+            key (str): Key used in the operation
+        """
+        if type(result) is RedisError:
+            logger.error(f"[Announcements] REDIS {operation} - Failed with error: {result}")
+        else:
+            logger.info(f"[Announcements] REDIS {operation} {key} - OK")

 def setup_logging(LOG_LEVEL: str, JSON_LOGS: bool) -> None:

     """Setup logging for uvicorn and FastAPI."""
src/utils/RedisConnector.py  (new file, 23 lines)
@@ -0,0 +1,23 @@
import os
import toml
from redis import asyncio as aioredis

# Load config

config: dict = toml.load("config.toml")

# Redis connection parameters

redis_config: dict[str, str | int] = {
    "url": f"redis://{os.environ['REDIS_URL']}",
    "port": os.environ['REDIS_PORT'],
}


class RedisConnector:
    """Implements the RedisConnector class for the ReVanced API"""

    @staticmethod
    def connect(database: str) -> aioredis.Redis:
        """Connect to Redis"""
        redis_url = f"{redis_config['url']}:{redis_config['port']}/{database}"
        return aioredis.from_url(redis_url, encoding="utf-8", decode_responses=True)
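The URL the connector builds is plain string composition, so it can be checked without a Redis server. A sketch of the same logic (the helper name and the environment values are hypothetical, for illustration only):

```python
import os

def build_redis_url(database: str) -> str:
    """Compose redis://<host>:<port>/<database> the way RedisConnector.connect does."""
    host = os.environ['REDIS_URL']
    port = os.environ['REDIS_PORT']
    return f"redis://{host}:{port}/{database}"

# Hypothetical environment, mirroring the variables the connector reads
os.environ['REDIS_URL'] = "localhost"
os.environ['REDIS_PORT'] = "6379"

assert build_redis_url("0") == "redis://localhost:6379/0"
```

Keeping the host and port in one place lets InternalCache and the client/announcement stores each pick their own database number while sharing the same Redis instance.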
src/utils/__init__.py  (new file, empty)