feat: implement clients, authentication, announcement and changelogs (#11)

* feat: resolves #5, resolves #4 (#7)

* Implements client generation and management

* fix announcements endpoints

* change announcements model

* bump deps

* sync with main

* refactor: adopt some functional standards in Releases.py

* feat: add new workflows

* chore: remove unused files

* refactor: update build badge

* refactor: move files around and delete unused ones

* feat: add authentication endpoints

* refactor: clean up code on Clients.py controller

* fix: fix the client secret update endpoint

* refactor: clean up authentication code

* feat: add authentication to client endpoints

* chore: bump deps

* feat: add admin user generation

* feat: add /changelogs endpoint (#10)
Authored by Alexandre Teles on 2022-10-09 16:56:24 -03:00, committed by GitHub
parent 94e7ac64c0
commit ca49a3b31a
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
33 changed files with 2058 additions and 503 deletions

.gitignore (vendored): 1 addition

@@ -153,3 +153,4 @@ cython_debug/
 # PROJECT SPECIFIC
 setup_env.sh
+admin_info.json

.vscode/settings.json (vendored, new file): 3 additions

@@ -0,0 +1,3 @@
+{
+    "python.analysis.typeCheckingMode": "off"
+}


@@ -3,14 +3,14 @@ FROM python:3.10-slim
 ARG GITHUB_TOKEN
 ENV GITHUB_TOKEN $GITHUB_TOKEN
-ARG UVICORN_HOST
-ENV UVICORN_HOST $UVICORN_HOST
-ARG UVICORN_PORT
-ENV UVICORN_PORT $UVICORN_PORT
-ARG UVICORN_LOG_LEVEL
-ENV UVICORN_LOG_LEVEL $UVICORN_LOG_LEVEL
+ARG HYPERCORN_HOST
+ENV HYPERCORN_HOST $HYPERCORN_HOST
+ARG HYPERCORN_PORT
+ENV HYPERCORN_PORT $HYPERCORN_PORT
+ARG HYPERCORN_LOG_LEVEL
+ENV HYPERCORN_LOG_LEVEL $HYPERCORN_LOG_LEVEL
 WORKDIR /usr/src/app


@@ -8,7 +8,7 @@ This is a simple API that returns the latest ReVanced releases, patches and cont
 ## Usage
-The API is available at [https://revanced-releases-api.afterst0rm.xyz/](https://revanced-releases-api.afterst0rm.xyz/).
+The API is available at [https://releases.rvcd.win/](https://releases.rvcd.win/).
 You can deploy your own instance by cloning this repository, editing the `docker-compose.yml` file to include your GitHub token and running `docker-compose up` or `docker-compose up --build` if you want to build the image locally instead of pulling from GHCR. Optionally you can run the application without Docker by running `poetry install` and `poetry run ./run.sh`. In this case, you'll also need a redis server and setup the following environment variables on your system.
@@ -28,9 +28,39 @@ If you don't have a Sentry instance, we recommend using [GlitchTip](https://glit
 ### API Endpoints
-* [tools](https://revanced-releases-api.afterst0rm.xyz/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
-* [patches](https://revanced-releases-api.afterst0rm.xyz/patches) - Returns the latest version of all ReVanced patches
-* [contributors](https://revanced-releases-api.afterst0rm.xyz/contributors) - Returns contributors for all ReVanced projects
+* [tools](https://releases.rvcd.win/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
+* [patches](https://releases.rvcd.win/patches) - Returns the latest version of all ReVanced patches
+* [contributors](https://releases.rvcd.win/contributors) - Returns contributors for all ReVanced projects
+* [announcement](https://releases.rvcd.win/announcement) - Returns the latest announcement for the ReVanced projects
+
+## Clients
+
+The API has no concept of users. It is meant to be used by clients, such as the [ReVanced Manager](https://github.com/revanced/revanced-manager).
+When the API is deployed for the first time it'll create a new client with admin permissions. The credentials can be found at the log console or in the file `admin_info.json` in the root directory of the project. Only admin clients can create, edit and delete other clients. If you're going to use any of the authenticated endpoints, you'll need to create a client and use its credentials. Please follow the API documentation for more information.
+
+## Authentication
+
+The API uses [PASETO](https://paseto.io/) tokens for authorization. To authenticate, you need to send a POST request to `/auth` with the following JSON body:
+
+```json
+{
+    "id": "your_client_id",
+    "secret": "your_client_secret"
+}
+```
+
+The API will answer with a PASETO token and a refresh token that you can use to authorize your requests. You can use the token in the `Authorization` header of your requests, like this:
+
+```
+Authorization: Bearer <token>
+```
+
+That token will be valid for 24 hours. After that, you'll need to refresh it by sending a POST request to `/auth/refresh` with your `refresh_token` in the `Authorization` header.
+Refresh tokens are valid for 30 days. After that, you'll need to authenticate again and get new tokens.
+Some endpoints might require fresh tokens, forcing you to authenticate.
 ## Contributing
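The client-management endpoints this change introduces (`POST /client`, `PATCH /client/{id}/secret`, `DELETE /client/{id}`) can be sketched as plain request builders. The base URL is the public instance from the README; the helper names and the requests-as-dicts shape are illustrative only, not part of the codebase:

```python
# Sketch of the client-management calls added in this PR.
# Helper names are hypothetical; endpoints come from the diff.
API = "https://releases.rvcd.win"

def _authed(token: str) -> dict:
    # PASETO access token goes in the Authorization header
    return {"Authorization": f"Bearer {token}"}

def create_client_request(token: str, admin: bool = False) -> dict:
    # POST /client?admin=true requires a token carrying the admin claim
    return {"method": "POST", "url": f"{API}/client",
            "params": {"admin": admin}, "headers": _authed(token)}

def rotate_secret_request(token: str, client_id: str) -> dict:
    # PATCH /client/{id}/secret: allowed for admins or the client itself
    return {"method": "PATCH", "url": f"{API}/client/{client_id}/secret",
            "headers": _authed(token)}

def delete_client_request(token: str, client_id: str) -> dict:
    # DELETE /client/{id}: allowed for admins or the client itself
    return {"method": "DELETE", "url": f"{API}/client/{client_id}",
            "headers": _authed(token)}
```

Any HTTP client (the project itself uses `httpx`) can then send these dicts as keyword arguments.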


@@ -8,42 +8,38 @@ Changelogs are not included but can be found on the [ReVanced Repositories](http
 The team also have a [Discord Server](https://revanced.app/discord) if you need help.
-## API Endpoints
-* [tools](/tools) - Returns the latest version of all ReVanced tools and Vanced MicroG
-* [patches](/patches) - Returns the latest version of all ReVanced patches
-* [contributors](/contributors) - Returns contributors for all ReVanced projects
-## Additional Information
+## Important Information
 * Rate Limiting - 60 requests per minute
 * Cache - 5 minutes
+* Token duration - 1 hour
+* Token refresh - 30 days
-## Important Notes
+## Additional Notes
 1. Although we will try to avoid breaking changes, we can't guarantee that it won't happen.
-2. Okay, the api is now cached and rate limited (per endpoint). But please don't abuse it, we don't want to have to block you.
-3. Make sure to implement a cache system on your end to avoid unnecessary requests.
+2. Make sure to implement a cache system on your end to avoid unnecessary requests.
+3. API abuse will result in IP blocks.
 Godspeed 💀
 """
-version = "0.10 beta"
+version = "0.8 RC"
 [license]
 name = "AGPL-3.0"
 url = "https://www.gnu.org/licenses/agpl-3.0.en.html"
-[uvicorn]
-host = "0.0.0.0"
-port = 8000
 [slowapi]
 limit = "60/minute"
-[logging]
-level = "INFO"
-json_logs = false
 [cache]
 expire = 120
 database = 0
@@ -52,12 +48,16 @@ database = 0
 expire = 300
 database = 1
+[clients]
+database = 2
+[tokens]
+database = 3
+[announcements]
+database = 4
 [app]
 repositories = ["TeamVanced/VancedMicroG", "revanced/revanced-cli", "revanced/revanced-patcher", "revanced/revanced-patches", "revanced/revanced-integrations", "revanced/revanced-manager"]
+[logging]
+level = "INFO"
+json_logs = false
+redis_database = 2


@@ -1,14 +1,13 @@
 version: "3.8"
-volumes:
-  redis-data:
-    driver: local
 services:
   redis:
     container_name: revanced-releases-api-redis
-    image: redis:latest
+    image: redis-stack-server:latest
+    environment:
+      - REDIS_ARGS=--save 60 1 --appendonly yes
     volumes:
-      - redis-data:/data
+      - /data/redis/revanced-releases-api:/data
    networks:
      - infra
    restart: always
@@ -19,9 +18,9 @@ services:
      - GITHUB_TOKEN=YOUR_GITHUB_TOKEN
      - REDIS_URL=revanced-releases-api-redis
      - REDIS_PORT=6379
-      - UVICORN_HOST=0.0.0.0
-      - UVICORN_PORT=8000
-      - UVICORN_LOG_LEVEL=debug
+      - HYPERCORN_HOST=0.0.0.0
+      - HYPERCORN_PORT=8000
+      - HYPERCORN_LOG_LEVEL=debug
      - SENTRY_DSN=YOUR_SENTRY_DSN
    ports:
      - 127.0.0.1:7934:8000


@@ -1,16 +1,14 @@
 ---
 version: "3.8"
-volumes:
-  redis-data:
-    driver: local
 services:
   redis:
     container_name: revanced-releases-api-redis
-    image: redis:latest
+    image: redis-stack-server:latest
+    environment:
+      - REDIS_ARGS=--save 60 1 --appendonly yes
     volumes:
-      - redis-data:/data
+      - /data/redis/revanced-releases-api:/data
    networks:
      - infra
    restart: always
@@ -21,9 +19,9 @@ services:
      - GITHUB_TOKEN=YOUR_GITHUB_TOKEN
      - REDIS_URL=revanced-releases-api-redis
      - REDIS_PORT=6379
-      - UVICORN_HOST=0.0.0.0
-      - UVICORN_PORT=8000
-      - UVICORN_LOG_LEVEL=debug
+      - HYPERCORN_HOST=0.0.0.0
+      - HYPERCORN_PORT=8000
+      - HYPERCORN_LOG_LEVEL=debug
      - SENTRY_DSN=YOUR_SENTRY_DSN
    ports:
      - 127.0.0.1:7934:8000

env.sh (deleted): 11 deletions

@@ -1,11 +0,0 @@
-#!/bin/bash
-
-# This script is used to setup the environment variables
-
-export GITHUB_TOKEN=your_token
-export UVICORN_HOST=0.0.0.0
-export UVICORN_PORT=8000
-export UVICORN_LOG_LEVEL=debug
-export REDIS_URL=127.0.0.1
-export REDIS_PORT=6379
-export SENTRY_DSN=your_sentry_dsn

main.py: 421 changes

@@ -1,13 +1,15 @@
 #!/usr/bin/env python3
+import binascii
 import os
+from typing import Coroutine
 import toml
-import uvicorn
-import aioredis
 import sentry_sdk
+import asyncio
+import uvloop
-from fastapi import FastAPI, Request, Response
+from fastapi import FastAPI, Request, Response, status, HTTPException, Depends
-from fastapi.responses import RedirectResponse
+from fastapi.responses import RedirectResponse, JSONResponse, UJSONResponse
 from slowapi.util import get_remote_address
 from slowapi import Limiter, _rate_limit_exceeded_handler
@@ -16,15 +18,29 @@ from fastapi_cache import FastAPICache
 from fastapi_cache.decorator import cache
 from slowapi.errors import RateLimitExceeded
 from fastapi_cache.backends.redis import RedisBackend
+from fastapi.exceptions import RequestValidationError
+from fastapi_paseto_auth import AuthPASETO
+from fastapi_paseto_auth.exceptions import AuthPASETOException
 from sentry_sdk.integrations.redis import RedisIntegration
 from sentry_sdk.integrations.httpx import HttpxIntegration
 from sentry_sdk.integrations.gnu_backtrace import GnuBacktraceIntegration
-from modules.Releases import Releases
-import modules.models.ResponseModels as ResponseModels
-import modules.utils.Logger as Logger
+import src.controllers.Auth as Auth
+from src.controllers.Releases import Releases
+from src.controllers.Clients import Clients
+from src.controllers.Announcements import Announcements
+from src.utils.Generators import Generators
+from src.utils.RedisConnector import RedisConnector
+import src.models.ClientModels as ClientModels
+import src.models.GeneralErrors as GeneralErrors
+import src.models.ResponseModels as ResponseModels
+import src.models.AnnouncementModels as AnnouncementModels
+import src.utils.Logger as Logger
 # Enable sentry logging
@@ -40,18 +56,23 @@ sentry_sdk.init(os.environ['SENTRY_DSN'], traces_sample_rate=1.0, integrations=[
 config: dict = toml.load("config.toml")
-# Redis connection parameters
-redis_config: dict[ str, str | int ] = {
-    "url": f"redis://{os.environ['REDIS_URL']}",
-    "port": os.environ['REDIS_PORT'],
-    "database": config['cache']['database'],
-}
-# Create releases instance
+# Class instances
+generators = Generators()
 releases = Releases()
+clients = Clients()
+announcements = Announcements()
+
+# Setup admin client
+uvloop.install()
+loop: asyncio.AbstractEventLoop = asyncio.get_event_loop()
+coroutine: Coroutine = clients.setup_admin()
+loop.run_until_complete(coroutine)
 # Create FastAPI instance
 app = FastAPI(title=config['docs']['title'],
@@ -59,7 +80,9 @@ app = FastAPI(title=config['docs']['title'],
               version=config['docs']['version'],
               license_info={"name": config['license']['name'],
                             "url": config['license']['url']
-                            })
+                            },
+              default_response_class=UJSONResponse
+              )
 # Slowapi limiter
@@ -73,9 +96,35 @@ app.add_exception_handler(RateLimitExceeded, _rate_limit_exceeded_handler)
 async def get_cache() -> int:
     return 1
+
+# Setup PASETO
+@AuthPASETO.load_config
+def get_config():
+    return Auth.PasetoSettings()
+
+# Setup custom error handlers
+@app.exception_handler(AuthPASETOException)
+async def authpaseto_exception_handler(request: Request, exc: AuthPASETOException):
+    return JSONResponse(status_code=exc.status_code, content={"detail": exc.message})
+
+@app.exception_handler(AttributeError)
+async def validation_exception_handler(request, exc):
+    return JSONResponse(status_code=status.HTTP_422_UNPROCESSABLE_ENTITY, content={
+        "error": "Unprocessable Entity"
+    })
+
+@app.exception_handler(binascii.Error)
+async def invalid_token_exception_handler(request, exc):
+    return JSONResponse(status_code=status.HTTP_401_UNAUTHORIZED, content={
+        "error": GeneralErrors.Unauthorized().error,
+        "message": GeneralErrors.Unauthorized().message
+    })
+
 # Routes
-@app.get("/", response_class=RedirectResponse, status_code=301)
+@app.get("/", response_class=RedirectResponse,
+         status_code=status.HTTP_301_MOVED_PERMANENTLY, tags=['Root'])
 @limiter.limit(config['slowapi']['limit'])
 async def root(request: Request, response: Response) -> RedirectResponse:
     """Brings up API documentation
@@ -85,7 +134,7 @@ async def root(request: Request, response: Response) -> RedirectResponse:
     """
     return RedirectResponse(url="/docs")
-@app.get('/tools', response_model=ResponseModels.ToolsResponseModel)
+@app.get('/tools', response_model=ResponseModels.ToolsResponseModel, tags=['ReVanced Tools'])
 @limiter.limit(config['slowapi']['limit'])
 @cache(config['cache']['expire'])
 async def tools(request: Request, response: Response) -> dict:
@@ -96,7 +145,7 @@ async def tools(request: Request, response: Response) -> dict:
     """
     return await releases.get_latest_releases(config['app']['repositories'])
-@app.get('/patches', response_model=ResponseModels.PatchesResponseModel)
+@app.get('/patches', response_model=ResponseModels.PatchesResponseModel, tags=['ReVanced Tools'])
 @limiter.limit(config['slowapi']['limit'])
 @cache(config['cache']['expire'])
 async def patches(request: Request, response: Response) -> dict:
@@ -108,7 +157,7 @@ async def patches(request: Request, response: Response) -> dict:
     return await releases.get_patches_json()
-@app.get('/contributors', response_model=ResponseModels.ContributorsResponseModel)
+@app.get('/contributors', response_model=ResponseModels.ContributorsResponseModel, tags=['ReVanced Tools'])
 @limiter.limit(config['slowapi']['limit'])
 @cache(config['cache']['expire'])
 async def contributors(request: Request, response: Response) -> dict:
@@ -119,28 +168,334 @@ async def contributors(request: Request, response: Response) -> dict:
     """
     return await releases.get_contributors(config['app']['repositories'])
-@app.head('/ping', status_code=204)
-@limiter.limit(config['slowapi']['limit'])
-async def ping(request: Request, response: Response) -> None:
-    """Check if the API is running.
-    Returns:
-        None
-    """
-    return None
+@app.get('/changelogs/{org}/{repo}', response_model=ResponseModels.ChangelogsResponseModel, tags=['ReVanced Tools'])
+@limiter.limit(config['slowapi']['limit'])
+@cache(config['cache']['expire'])
+async def changelogs(request: Request, response: Response, org: str, repo: str, path: str) -> dict:
+    """Get the latest changes from a repository.
+    Returns:
+        json: list of commits
+    """
+    return await releases.get_commits(
+        org=org,
+        repository=repo,
+        path=path
+    )
+
+@app.post('/client', response_model=ClientModels.ClientModel, status_code=status.HTTP_201_CREATED, tags=['Clients'])
+@limiter.limit(config['slowapi']['limit'])
+async def create_client(request: Request, response: Response, admin: bool | None = False, Authorize: AuthPASETO = Depends()) -> ClientModels.ClientModel:
+    """Create a new API client.
+    Returns:
+        json: client information
+    """
+    Authorize.paseto_required()
+    admin_claim: dict[str, bool] = {"admin": False}
+    current_user: str | int | None = Authorize.get_subject()
+    if 'admin' in Authorize.get_token_payload():
+        admin_claim = {"admin": Authorize.get_token_payload()['admin']}
+    if ( await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
+         admin_claim['admin'] == True):
+        client: ClientModels.ClientModel = await clients.generate(admin=admin)
+        await clients.store(client)
+        return client
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.delete('/client/{client_id}', response_model=ResponseModels.ClientDeletedResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
+@limiter.limit(config['slowapi']['limit'])
+async def delete_client(request: Request, response: Response, client_id: str, Authorize: AuthPASETO = Depends()) -> dict:
+    """Delete an API client.
+    Returns:
+        json: deletion status
+    """
+    Authorize.paseto_required()
+    admin_claim: dict[str, bool] = {"admin": False}
+    current_user: str | int | None = Authorize.get_subject()
+    if 'admin' in Authorize.get_token_payload():
+        admin_claim = {"admin": Authorize.get_token_payload()['admin']}
+    if ( await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
+         ( admin_claim['admin'] == True or
+           current_user == client_id ) ):
+        if await clients.exists(client_id):
+            return {"id": client_id, "deleted": await clients.delete(client_id)}
+        else:
+            raise HTTPException(status_code=404, detail={
+                "error": GeneralErrors.ClientNotFound().error,
+                "message": GeneralErrors.ClientNotFound().message
+                }
+            )
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.patch('/client/{client_id}/secret', response_model=ResponseModels.ClientSecretUpdatedResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
+@limiter.limit(config['slowapi']['limit'])
+async def update_client(request: Request, response: Response, client_id: str, Authorize: AuthPASETO = Depends()) -> dict:
+    """Update an API client's secret.
+    Returns:
+        json: client ID and secret
+    """
+    Authorize.paseto_required()
+    admin_claim: dict[str, bool] = {"admin": False}
+    current_user: str | int | None = Authorize.get_subject()
+    if 'admin' in Authorize.get_token_payload():
+        admin_claim = {"admin": Authorize.get_token_payload()['admin']}
+    if ( await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
+         ( admin_claim['admin'] == True or
+           current_user == client_id ) ):
+        if await clients.exists(client_id):
+            new_secret: str = await generators.generate_secret()
+            if await clients.update_secret(client_id, new_secret):
+                return {"id": client_id, "secret": new_secret}
+            else:
+                raise HTTPException(status_code=500, detail={
+                    "error": GeneralErrors.InternalServerError().error,
+                    "message": GeneralErrors.InternalServerError().message
+                    }
+                )
+        else:
+            raise HTTPException(status_code=404, detail={
+                "error": GeneralErrors.ClientNotFound().error,
+                "message": GeneralErrors.ClientNotFound().message
+                }
+            )
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.patch('/client/{client_id}/status', response_model=ResponseModels.ClientStatusResponse, status_code=status.HTTP_200_OK, tags=['Clients'])
+async def client_status(request: Request, response: Response, client_id: str, active: bool, Authorize: AuthPASETO = Depends()) -> dict:
+    """Activate or deactivate a client
+    Returns:
+        json: json response containing client ID and activation status
+    """
+    Authorize.paseto_required()
+    admin_claim: dict[str, bool] = {"admin": False}
+    current_user: str | int | None = Authorize.get_subject()
+    if 'admin' in Authorize.get_token_payload():
+        admin_claim = {"admin": Authorize.get_token_payload()['admin']}
+    if ( await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()) and
+         ( admin_claim['admin'] == True or
+           current_user == client_id ) ):
+        if await clients.exists(client_id):
+            if await clients.status(client_id, active):
+                return {"id": client_id, "active": active}
+            else:
+                raise HTTPException(status_code=500, detail={
+                    "error": GeneralErrors.InternalServerError().error,
+                    "message": GeneralErrors.InternalServerError().message
+                    }
+                )
+        else:
+            raise HTTPException(status_code=404, detail={
+                "error": GeneralErrors.ClientNotFound().error,
+                "message": GeneralErrors.ClientNotFound().message
+                }
+            )
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.post('/announcement', response_model=AnnouncementModels.AnnouncementCreatedResponse,
+          status_code=status.HTTP_201_CREATED, tags=['Announcements'])
+@limiter.limit(config['slowapi']['limit'])
+async def create_announcement(request: Request, response: Response,
+                              announcement: AnnouncementModels.AnnouncementCreateModel,
+                              Authorize: AuthPASETO = Depends()) -> dict:
+    """Create a new announcement.
+    Returns:
+        json: announcement information
+    """
+    Authorize.paseto_required()
+    if await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()):
+        announcement_created: bool = await announcements.store(announcement=announcement,
+                                                               author=Authorize.get_subject())
+        if announcement_created:
+            return {"created": announcement_created}
+        else:
+            raise HTTPException(status_code=500, detail={
+                "error": GeneralErrors.InternalServerError().error,
+                "message": GeneralErrors.InternalServerError().message
+                }
+            )
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.get('/announcement', response_model=AnnouncementModels.AnnouncementModel, tags=['Announcements'])
+@limiter.limit(config['slowapi']['limit'])
+async def get_announcement(request: Request, response: Response) -> dict:
+    """Get an announcement.
+    Returns:
+        json: announcement information
+    """
+    if await announcements.exists():
+        return await announcements.get()
+    else:
+        raise HTTPException(status_code=404, detail={
+            "error": GeneralErrors.AnnouncementNotFound().error,
+            "message": GeneralErrors.AnnouncementNotFound().message
+            }
+        )
+
+@app.delete('/announcement',
+            response_model=AnnouncementModels.AnnouncementDeleted,
+            status_code=status.HTTP_200_OK, tags=['Announcements'])
+@limiter.limit(config['slowapi']['limit'])
+async def delete_announcement(request: Request, response: Response,
+                              Authorize: AuthPASETO = Depends()) -> dict:
+    """Delete an announcement.
+    Returns:
+        json: deletion status
+    """
+    Authorize.paseto_required()
+    if await clients.auth_checks(Authorize.get_subject(), Authorize.get_jti()):
+        if await announcements.exists():
+            return {"deleted": await announcements.delete()}
+        else:
+            raise HTTPException(status_code=404, detail={
+                "error": GeneralErrors.AnnouncementNotFound().error,
+                "message": GeneralErrors.AnnouncementNotFound().message
+                }
+            )
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.post('/auth', response_model=ResponseModels.ClientAuthTokenResponse, status_code=status.HTTP_200_OK, tags=['Authentication'])
+@limiter.limit(config['slowapi']['limit'])
+async def auth(request: Request, response: Response, client: ClientModels.ClientAuthModel, Authorize: AuthPASETO = Depends()) -> dict:
+    """Authenticate a client and get an auth token.
+    Returns:
+        access_token: auth token
+        refresh_token: refresh token
+    """
+    admin_claim: dict[str, bool]
+    if await clients.exists(client.id):
+        authenticated: bool = await clients.authenticate(client.id, client.secret)
+        if not authenticated:
+            raise HTTPException(status_code=401, detail={
+                "error": GeneralErrors.Unauthorized().error,
+                "message": GeneralErrors.Unauthorized().message
+                }
+            )
+        else:
+            if await clients.is_admin(client.id):
+                admin_claim = {"admin": True}
+            else:
+                admin_claim = {"admin": False}
+            access_token = Authorize.create_access_token(subject=client.id,
+                                                         user_claims=admin_claim,
+                                                         fresh=True)
+            refresh_token = Authorize.create_refresh_token(subject=client.id,
+                                                           user_claims=admin_claim)
+            return {"access_token": access_token, "refresh_token": refresh_token}
+    else:
+        raise HTTPException(status_code=401, detail={
+            "error": GeneralErrors.Unauthorized().error,
+            "message": GeneralErrors.Unauthorized().message
+            }
+        )
+
+@app.post('/auth/refresh', response_model=ResponseModels.ClientTokenRefreshResponse,
+          status_code=status.HTTP_200_OK, tags=['Authentication'])
+@limiter.limit(config['slowapi']['limit'])
+async def refresh(request: Request, response: Response,
+                  Authorize: AuthPASETO = Depends()) -> dict:
+    """Refresh an auth token.
+    Returns:
+        access_token: auth token
+    """
+    Authorize.paseto_required(refresh_token=True)
+    admin_claim: dict[str, bool] = {"admin": False}
+    current_user: str | int | None = Authorize.get_subject()
+    if 'admin' in Authorize.get_token_payload():
+        admin_claim = {"admin": Authorize.get_token_payload()['admin']}
+    return {"access_token": Authorize.create_access_token(subject=current_user,
+                                                          user_claims=admin_claim,
+                                                          fresh=False)}
+
 @app.on_event("startup")
 async def startup() -> None:
-    redis_url = f"{redis_config['url']}:{redis_config['port']}/{redis_config['database']}"
-    redis = aioredis.from_url(redis_url, encoding="utf8", decode_responses=True)
-    FastAPICache.init(RedisBackend(redis), prefix="fastapi-cache")
+    FastAPICache.init(RedisBackend(RedisConnector.connect(config['cache']['database'])),
+                      prefix="fastapi-cache")
     return None

 # setup right before running to make sure no other library overwrites it
-Logger.setup_logging(LOG_LEVEL=config["logging"]["level"], JSON_LOGS=config["logging"]["json_logs"])
+Logger.setup_logging(LOG_LEVEL=config["logging"]["level"],
+                     JSON_LOGS=config["logging"]["json_logs"])
-
-# Run app
-if __name__ == '__main__':
-    uvicorn.run(app, host=config['uvicorn']['host'], port=config['uvicorn']['port'])
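Each authenticated endpoint in the new main.py repeats the same admin-claim extraction from the PASETO payload. A possible follow-up refactor is to hoist that pattern into a helper; a minimal sketch, with a hypothetical helper name (only the payload shape comes from the diff):

```python
def admin_claim_from(payload: dict) -> dict[str, bool]:
    # Mirrors the repeated pattern in main.py: default to non-admin
    # unless the token payload carries an explicit admin claim.
    return {"admin": bool(payload.get("admin", False))}
```

Each handler could then call `admin_claim_from(Authorize.get_token_payload())` instead of repeating the `if 'admin' in ...` block.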


@@ -1,41 +0,0 @@
-from pydantic import BaseModel
-
-import modules.models.ResponseFields as ResponseFields
-
-"""Implements pydantic models and model generator for the API's responses."""
-
-class ToolsResponseModel(BaseModel):
-    """Implements the JSON response model for the /tools endpoint.
-
-    Args:
-        BaseModel (pydantic.BaseModel): BaseModel from pydantic
-    """
-
-    tools: list[ ResponseFields.ToolsResponseFields ]
-
-class PatchesResponseModel(BaseModel):
-    """Implements the JSON response model for the /patches endpoint.
-
-    Args:
-        BaseModel (pydantic.BaseModel): BaseModel from pydantic
-    """
-
-    __root__: list[ ResponseFields.PatchesResponseFields ]
-
-class ContributorsResponseModel(BaseModel):
-    """Implements the JSON response model for the /contributors endpoint.
-
-    Args:
-        BaseModel (pydantic.BaseModel): BaseModel from pydantic
-    """
-
-    repositories: list[ ResponseFields.ContributorsResponseFields ]
-
-class PingResponseModel(BaseModel):
-    """Implements the JSON response model for the /heartbeat endpoint.
-
-    Args:
-        BaseModel (pydantic.BaseModel): BaseModel from pydantic
-    """
-
-    status: int
-    detail: str


@@ -56,3 +56,15 @@ ignore_missing_imports = True
 [mypy-redis.*]
 # No stubs available
 ignore_missing_imports = True
+
+[mypy-toolz.*]
+# No stubs available
+ignore_missing_imports = True
+
+[mypy-fastapi_paseto_auth.*]
+# No stubs available
+ignore_missing_imports = True
+
+[mypy-aiofiles.*]
+# No stubs available
+ignore_missing_imports = True

poetry.lock (generated): 852 changes (file diff suppressed because it is too large)


@@ -8,35 +8,27 @@ license = "AGPLv3"
 [tool.poetry.dependencies]
 python = "^3.10"
 fastapi = ">=0.85.0"
-uvicorn = {version = ">=0.18.3", extras = ["standard"]}
 httpx = {version = ">=0.23.0", extras = ["http2"]}
 httpx-cache = ">=0.6.0"
 toml = ">=0.10.2"
 slowapi = ">=0.1.6"
 orjson = ">=3.8.0"
 fastapi-cache2 = ">=0.1.9"
-aioredis = {version = ">=2.0.1", extras = ["hiredis"]}
 redis = ">=4.3.4"
-msgpack = ">=1.0.4"
 loguru = ">=0.6.0"
 sentry-sdk = ">=1.9.8"
+argon2-cffi = ">=21.3.0"
+hypercorn = {extras = ["uvloop"], version = ">=0.14.3"}
+cytoolz = ">=0.12.0"
+fastapi-paseto-auth = "^0.6.0"
+ujson = ">=5.5.0"
+hiredis = ">=2.0.0"
+aiofiles = ">=22.1.0"

 [tool.poetry.dev-dependencies]
-fastapi = ">=0.85.0"
-uvicorn = {version = ">=0.18.3", extras = ["standard"]}
-httpx = {version = ">=0.23.0", extras = ["http2"]}
-httpx-cache = ">=0.6.0"
-toml = ">=0.10.2"
-slowapi = ">=0.1.6"
-orjson = ">=3.8.0"
-fastapi-cache2 = ">=0.1.9"
-aioredis = {version = ">=2.0.1", extras = ["hiredis"]}
-redis = ">=4.3.4"
-msgpack = ">=1.0.4"
 mypy = ">=0.971"
 types-toml = ">=0.10.8"
-loguru = ">=0.6.0"
-sentry-sdk = ">=1.9.8"
+types-redis = ">=4.3.21.1"

 [build-system]
 requires = ["poetry-core>=1.0.0"]


@ -1,50 +1,61 @@
-aioredis==2.0.1; python_version >= "3.6"
+aiofiles==22.1.0; python_version >= "3.7" and python_version < "4.0"
 aiorwlock==1.3.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.7.0"
 anyio==3.6.1; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
+argon2-cffi-bindings==21.2.0; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
+argon2-cffi==21.3.0; python_version >= "3.6"
 async-timeout==4.0.2; python_version >= "3.6"
 attrs==21.4.0; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
 certifi==2022.9.24; python_version >= "3.7" and python_version < "4.0"
+cffi==1.15.1; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
 click==8.1.3; python_version >= "3.7" and python_version < "4.0"
 colorama==0.4.5; python_version >= "3.7" and python_full_version < "3.0.0" and sys_platform == "win32" and python_version < "4.0" and platform_system == "Windows" or sys_platform == "win32" and python_version >= "3.7" and python_full_version >= "3.5.0" and python_version < "4.0" and platform_system == "Windows"
+cryptography==37.0.4; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
+cytoolz==0.12.0; python_version >= "3.5"
 deprecated==1.2.13; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
 fastapi-cache2==0.1.9; python_version >= "3.7" and python_version < "4.0"
+fastapi-paseto-auth==0.6.0; python_version >= "3.10"
 fastapi==0.85.0; python_version >= "3.7"
 fasteners==0.17.3; python_version >= "3.7" and python_version < "4.0"
-h11==0.12.0; python_version >= "3.7" and python_version < "4.0"
+h11==0.12.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.7.0"
 h2==4.1.0; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
-hiredis==2.0.0; implementation_name == "cpython" and python_version >= "3.6"
+hiredis==2.0.0; python_version >= "3.6"
-hpack==4.0.0; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
+hpack==4.0.0; python_version >= "3.7" and python_full_version >= "3.6.1"
 httpcore==0.15.0; python_version >= "3.7" and python_version < "4.0"
-httptools==0.5.0; python_version >= "3.7" and python_full_version >= "3.5.0" and python_version < "4.0"
 httpx-cache==0.6.0; python_version >= "3.7" and python_version < "4.0"
 httpx==0.23.0; python_version >= "3.7"
+hypercorn==0.14.3; python_version >= "3.7"
-hyperframe==6.0.1; python_version >= "3.7" and python_full_version >= "3.6.1" and python_version < "4.0"
+hyperframe==6.0.1; python_version >= "3.7" and python_full_version >= "3.6.1"
-idna==3.4
+idna==3.4; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
+iso8601==1.1.0; python_full_version >= "3.6.2" and python_version < "4.0" and python_full_version < "4.0.0" and python_version >= "3.10"
 limits==1.6; python_version >= "3.7" and python_version < "4.0"
 loguru==0.6.0; python_version >= "3.5"
-msgpack==1.0.4
+msgpack==1.0.4; python_version >= "3.7" and python_version < "4.0"
 orjson==3.8.0; python_version >= "3.7"
 packaging==21.3; python_version >= "3.6"
+passlib==1.7.4; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
 pendulum==2.1.2; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
+priority==2.0.0; python_full_version >= "3.6.1" and python_version >= "3.7"
+pycparser==2.21; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.4.0" and python_version >= "3.6"
+pycryptodomex==3.15.0; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
-pydantic==1.10.2; python_version >= "3.7" and python_version < "4.0"
+pydantic==1.10.2; python_version >= "3.10" and python_version < "4.0"
 pyparsing==3.0.9; python_full_version >= "3.6.8" and python_version >= "3.6"
+pyseto==1.6.10; python_full_version >= "3.6.2" and python_full_version < "4.0.0" and python_version >= "3.10"
 python-dateutil==2.8.2; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
-python-dotenv==0.21.0; python_version >= "3.7" and python_version < "4.0"
 pytzdata==2020.1; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
-pyyaml==6.0; python_version >= "3.7" and python_version < "4.0"
 redis==4.3.4; python_version >= "3.6"
 rfc3986==1.5.0; python_version >= "3.7" and python_version < "4.0"
-sentry-sdk==1.9.9
+sentry-sdk==1.9.10
 six==1.16.0; python_version >= "3.7" and python_full_version < "3.0.0" and python_version < "4.0" or python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.5.0"
 slowapi==0.1.6; python_version >= "3.7" and python_version < "4.0"
 sniffio==1.3.0; python_version >= "3.7" and python_version < "4.0" and python_full_version >= "3.6.2"
-starlette==0.20.4; python_version >= "3.7" and python_version < "4.0"
+starlette==0.20.4; python_version >= "3.10" and python_version < "4.0"
 toml==0.10.2; (python_version >= "2.6" and python_full_version < "3.0.0") or (python_full_version >= "3.3.0")
+toolz==0.12.0; python_version >= "3.5"
-typing-extensions==4.3.0; python_version >= "3.7" and python_version < "4.0"
+typing-extensions==4.4.0; python_version >= "3.10"
+ujson==5.5.0; python_version >= "3.7"
 urllib3==1.26.12; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.6.0" and python_version < "4" and python_version >= "3.6"
-uvicorn==0.18.3; python_version >= "3.7"
+uvicorn==0.18.3; python_version >= "3.7" and python_version < "4.0"
-uvloop==0.17.0; sys_platform != "win32" and sys_platform != "cygwin" and platform_python_implementation != "PyPy" and python_version >= "3.7" and python_version < "4.0"
+uvloop==0.17.0; platform_system != "Windows" and python_version >= "3.7"
-watchfiles==0.17.0; python_version >= "3.7" and python_version < "4.0"
-websockets==10.3; python_version >= "3.7" and python_version < "4.0"
 win32-setctime==1.1.0; sys_platform == "win32" and python_version >= "3.5"
 wrapt==1.14.1; python_version >= "3.6" and python_full_version < "3.0.0" or python_full_version >= "3.5.0" and python_version >= "3.6"
+wsproto==1.2.0; python_full_version >= "3.7.0" and python_version >= "3.7"

run.sh
@@ -7,6 +7,6 @@
 CORES=$(grep -c ^processor /proc/cpuinfo)
 # Start the application
-uvicorn main:app --host="$UVICORN_HOST" --port="$UVICORN_PORT" \
-    --workers="$CORES" --log-level="$UVICORN_LOG_LEVEL" --server-header \
-    --proxy-headers --forwarded-allow-ips="*"
+hypercorn main:app --bind="${HYPERCORN_HOST}:${HYPERCORN_PORT}" \
+    --workers="$CORES" --log-level="$HYPERCORN_LOG_LEVEL" \
+    --worker-class uvloop

src/controllers/Announcements.py
@@ -0,0 +1,102 @@
import toml
from redis import asyncio as aioredis
import src.utils.Logger as Logger
from src.utils.Generators import Generators
from src.models.AnnouncementModels import AnnouncementCreateModel
from src.utils.RedisConnector import RedisConnector
config: dict = toml.load("config.toml")
class Announcements:
"""Implements the announcements class for the ReVanced API"""
redis = RedisConnector.connect(config['announcements']['database'])
AnnouncementsLogger = Logger.AnnouncementsLogger()
generators = Generators()
async def store(self, announcement: AnnouncementCreateModel, author: str) -> bool:
"""Store an announcement in the database
Args:
announcement (AnnouncementCreateModel): Pydantic model of the announcement
Returns:
bool: True if the announcement was stored successfully, False otherwise
"""
announcement_id: str = "announcement"
timestamp = await self.generators.generate_timestamp()
announcement_payload: dict[str, str | int] = {}
announcement_payload['created_at'] = timestamp
announcement_payload['author'] = author
announcement_payload['type'] = announcement.type
announcement_payload['title'] = announcement.title
announcement_payload['content'] = announcement.content
try:
await self.redis.json().set(announcement_id, '$', announcement_payload)
await self.AnnouncementsLogger.log("SET", None, announcement_id)
except aioredis.RedisError as e:
await self.AnnouncementsLogger.log("SET", e)
raise e
return True
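store() always writes under the fixed key "announcement", so the API holds at most one announcement at a time and storing a new one overwrites the old. The payload shape it persists can be sketched as a plain function (the function name is illustrative, not part of the commit):

```python
def build_announcement_payload(author: str, type_: str, title: str,
                               content: str, created_at: int) -> dict:
    # Same fields the controller writes to Redis JSON under the single
    # "announcement" key; a second store() simply overwrites the first.
    return {
        "created_at": created_at,
        "author": author,
        "type": type_,
        "title": title,
        "content": content,
    }
```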
async def exists(self) -> bool:
"""Check if an announcement exists in the database
Returns:
bool: True if the announcement exists, False otherwise
"""
try:
if await self.redis.exists("announcement"):
await self.AnnouncementsLogger.log("EXISTS", None, "announcement")
return True
else:
await self.AnnouncementsLogger.log("EXISTS", None, "announcement")
return False
except aioredis.RedisError as e:
await self.AnnouncementsLogger.log("EXISTS", e)
raise e
async def get(self) -> dict:
"""Get a announcement from the database
Returns:
dict: Dict of the announcement or an empty dict if the announcement doesn't exist
"""
if await self.exists():
try:
announcement: dict[str, str | int] = await self.redis.json().get("announcement")
await self.AnnouncementsLogger.log("GET", None, "announcement")
except aioredis.RedisError as e:
await self.AnnouncementsLogger.log("GET", e)
return {}
return announcement
else:
return {}
async def delete(self) -> bool:
"""Delete an announcement from the database
Returns:
bool: True if the announcement was deleted successfully, False otherwise
"""
if await self.exists():
try:
await self.redis.delete("announcement")
await self.AnnouncementsLogger.log("DELETE", None, "announcement")
except aioredis.RedisError as e:
await self.AnnouncementsLogger.log("DELETE", e)
return False
return True
else:
return False

src/controllers/Auth.py
@@ -0,0 +1,7 @@
import os
from pydantic import BaseModel
class PasetoSettings(BaseModel):
authpaseto_secret_key: str = os.environ['SECRET_KEY']
authpaseto_access_token_expires: int = 86400
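PasetoSettings reads SECRET_KEY directly from os.environ at import time, so a missing variable aborts startup with a KeyError. A small sketch of that behavior with a clearer error (the helper name and message are illustrative, not part of the commit):

```python
import os

def load_secret_key(env=os.environ) -> str:
    # Mirrors PasetoSettings: the key is mandatory, so fail loudly
    # instead of falling back to an insecure default.
    try:
        return env["SECRET_KEY"]
    except KeyError:
        raise RuntimeError("SECRET_KEY must be set before starting the API") from None
```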

src/controllers/Clients.py
@@ -0,0 +1,351 @@
from time import sleep
import toml
import orjson
from typing import Optional
import argon2
from redis import asyncio as aioredis
import aiofiles
import uvloop
import src.utils.Logger as Logger
from src.utils.Generators import Generators
from src.models.ClientModels import ClientModel
from src.utils.RedisConnector import RedisConnector
config: dict = toml.load("config.toml")
class Clients:
"""Implements a client for ReVanced Releases API."""
uvloop.install()
redis = RedisConnector.connect(config['clients']['database'])
redis_tokens = RedisConnector.connect(config['tokens']['database'])
UserLogger = Logger.UserLogger()
generators = Generators()
async def generate(self, admin: Optional[bool] = False) -> ClientModel:
"""Generate a new client
Args:
admin (Optional[bool], optional): Defines if the client should have admin access. Defaults to False.
Returns:
ClientModel: Pydantic model of the client
"""
client_id: str = await self.generators.generate_id()
client_secret: str = await self.generators.generate_secret()
client = ClientModel(id=client_id, secret=client_secret, admin=admin, active=True)
return client
async def store(self, client: ClientModel) -> bool:
"""Store a client in the database
Args:
client (ClientModel): Pydantic model of the client
Returns:
bool: True if the client was stored successfully, False otherwise
"""
client_payload: dict[str, str | bool] = {}
ph: argon2.PasswordHasher = argon2.PasswordHasher()
client_payload['secret'] = ph.hash(client.secret)
client_payload['admin'] = client.admin
client_payload['active'] = client.active
try:
await self.redis.json().set(client.id, '$', client_payload)
await self.UserLogger.log("SET", None, client.id)
except aioredis.RedisError as e:
await self.UserLogger.log("SET", e)
raise e
return True
async def exists(self, client_id: str) -> bool:
"""Check if a client exists in the database
Args:
client_id (str): UUID of the client
Returns:
bool: True if the client exists, False otherwise
"""
try:
if await self.redis.exists(client_id):
await self.UserLogger.log("EXISTS", None, client_id)
return True
else:
await self.UserLogger.log("EXISTS", None, client_id)
return False
except aioredis.RedisError as e:
await self.UserLogger.log("EXISTS", e)
raise e
async def get(self, client_id: str) -> ClientModel | bool:
"""Get a client from the database
Args:
client_id (str): UUID of the client
Returns:
ClientModel | bool: Pydantic model of the client or False if the client doesn't exist
"""
if await self.exists(client_id):
try:
client_payload: dict[str, str | bool] = await self.redis.json().get(client_id)
client = ClientModel(id=client_id, secret=client_payload['secret'], admin=client_payload['admin'], active=True)
await self.UserLogger.log("GET", None, client_id)
except aioredis.RedisError as e:
await self.UserLogger.log("GET", e)
raise e
return client
else:
return False
async def delete(self, client_id: str) -> bool:
"""Delete a client from the database
Args:
client_id (str): UUID of the client
Returns:
bool: True if the client was deleted successfully, False otherwise
"""
if await self.exists(client_id):
try:
await self.redis.delete(client_id)
await self.UserLogger.log("DELETE", None, client_id)
except aioredis.RedisError as e:
await self.UserLogger.log("DELETE", e)
raise e
return True
else:
return False
async def update_secret(self, client_id: str, new_secret: str) -> bool:
"""Update the secret of a client
Args:
client_id (str): UUID of the client
new_secret (str): New secret of the client
Returns:
bool: True if the secret was updated successfully, False otherwise
"""
ph: argon2.PasswordHasher = argon2.PasswordHasher()
updated: bool = False
try:
await self.redis.json().set(client_id, '.secret', ph.hash(new_secret))
await self.UserLogger.log("UPDATE_SECRET", None, client_id)
updated = True
except aioredis.RedisError as e:
await self.UserLogger.log("UPDATE_SECRET", e)
raise e
return updated
async def authenticate(self, client_id: str, secret: str) -> bool:
"""Check if the secret of a client is correct
Args:
client_id (str): UUID of the client
secret (str): Secret of the client
Returns:
bool: True if the secret is correct, False otherwise
"""
ph: argon2.PasswordHasher = argon2.PasswordHasher()
authenticated: bool = False
client_secret: str = await self.redis.json().get(client_id, '.secret')
try:
if ph.verify(client_secret, secret):
await self.UserLogger.log("CHECK_SECRET", None, client_id)
if ph.check_needs_rehash(client_secret):
await self.redis.json().set(client_id, '.secret', ph.hash(secret))
await self.UserLogger.log("REHASH SECRET", None, client_id)
authenticated = True
except argon2.exceptions.VerifyMismatchError as e:
await self.UserLogger.log("CHECK_SECRET", e)
return authenticated
return authenticated
async def is_admin(self, client_id: str) -> bool:
"""Check if a client has admin access
Args:
client_id (str): UUID of the client
Returns:
bool: True if the client has admin access, False otherwise
"""
client_admin: bool = False
try:
client_admin = await self.redis.json().get(client_id, '.admin')
await self.UserLogger.log("CHECK_ADMIN", None, client_id)
except aioredis.RedisError as e:
await self.UserLogger.log("CHECK_ADMIN", e)
raise e
return client_admin
async def is_active(self, client_id: str) -> bool:
"""Check if a client is active
Args:
client_id (str): UUID of the client
Returns:
bool: True if the client is active, False otherwise
"""
client_active: bool = False
try:
client_active = await self.redis.json().get(client_id, '.active')
await self.UserLogger.log("CHECK_ACTIVE", None, client_id)
except aioredis.RedisError as e:
await self.UserLogger.log("CHECK_ACTIVE", e)
raise e
return client_active
async def status(self, client_id: str, active: bool) -> bool:
"""Activate a client
Args:
client_id (str): UUID of the client
active (bool): True to activate the client, False to deactivate it
Returns:
bool: True if the client status was changed successfully, False otherwise
"""
changed: bool = False
try:
await self.redis.json().set(client_id, '.active', active)
await self.UserLogger.log("ACTIVATE", None, client_id)
changed = True
except aioredis.RedisError as e:
await self.UserLogger.log("ACTIVATE", e)
raise e
return changed
async def ban_token(self, token: str) -> bool:
"""Ban a token
Args:
token (str): Token to ban
Returns:
bool: True if the token was banned successfully, False otherwise
"""
banned: bool = False
try:
await self.redis_tokens.set(token, '')
await self.UserLogger.log("BAN_TOKEN", None, token)
banned = True
except aioredis.RedisError as e:
await self.UserLogger.log("BAN_TOKEN", e)
raise e
return banned
async def is_token_banned(self, token: str) -> bool:
"""Check if a token is banned
Args:
token (str): Token to check
Returns:
bool: True if the token is banned, False otherwise
"""
banned: bool = True
try:
banned = await self.redis_tokens.exists(token)
await self.UserLogger.log("CHECK_TOKEN", None, token)
except aioredis.RedisError as e:
await self.UserLogger.log("CHECK_TOKEN", e)
raise e
return banned
async def auth_checks(self, client_id: str, token: str) -> bool:
"""Check if a client exists, is active and the token isn't banned
Args:
client_id (str): UUID of the client
token (str): Token presented by the client
Returns:
bool: True if the client exists, is active
and the token isn't banned, False otherwise
"""
if await self.exists(client_id):
if await self.is_active(client_id):
if not await self.is_token_banned(token):
return True
else:
return False
else:
if not await self.is_token_banned(token):
await self.ban_token(token)
return False
else:
await self.ban_token(token)
return False
return False
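The branching in auth_checks reduces to a small decision table over three lookups. A synchronous sketch of that logic (the function and parameter names are illustrative, not part of the commit):

```python
def auth_decision(client_exists: bool, client_active: bool,
                  token_banned: bool) -> tuple[bool, bool]:
    """Return (authorized, ban_token) as auth_checks decides it.

    Only an existing, active client presenting an unbanned token is
    authorized; a token tied to a missing or inactive client ends up
    banned either way.
    """
    if client_exists and client_active:
        return (not token_banned, False)
    # Missing or inactive client: ban the token if not already banned.
    return (False, not token_banned)
```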
async def setup_admin(self) -> bool:
"""Create the admin user if it doesn't exist
Returns:
bool: True if the admin user was created successfully, False otherwise
"""
created: bool = False
if not await self.exists('admin'):
admin_info: ClientModel = await self.generate()
admin_info.id = 'admin'
admin_info.admin = True
try:
await self.store(admin_info)
await self.UserLogger.log("CREATE_ADMIN | ID |", None, admin_info.id)
await self.UserLogger.log("CREATE_ADMIN | SECRET |", None, admin_info.secret)
async with aiofiles.open("admin_info.json", "wb") as file:
await file.write(orjson.dumps(vars(admin_info)))
await self.UserLogger.log("CREATE_ADMIN | TO FILE", None, "admin_info.json")
created = True
except aioredis.RedisError as e:
await self.UserLogger.log("CREATE_ADMIN", e)
raise e
return created

src/controllers/Releases.py
@@ -1,33 +1,23 @@
-import os
+from toolz.dicttoolz import keyfilter
+import asyncio
+import uvloop
 import orjson
-import httpx_cache
 from base64 import b64decode
-from modules.utils.InternalCache import InternalCache
+from src.utils.HTTPXClient import HTTPXClient
-import modules.utils.Logger as Logger
+from src.utils.InternalCache import InternalCache

 class Releases:
     """Implements the methods required to get the latest releases and patches from revanced repositories."""

-    headers = {'Accept': "application/vnd.github+json",
-               'Authorization': "token " + os.environ['GITHUB_TOKEN']
-               }
+    uvloop.install()

-    httpx_logger = Logger.HTTPXLogger()
-    httpx_client = httpx_cache.AsyncClient(
-        headers=headers,
-        http2=True,
-        event_hooks={
-            'request': [httpx_logger.log_request],
-            'response': [httpx_logger.log_response]
-        }
-    )
+    httpx_client = HTTPXClient.create()

     InternalCache = InternalCache()

-    async def _get_release(self, repository: str) -> list:
+    async def __get_release(self, repository: str) -> list:
         # Get assets from latest release in a given repository.
         #
         # Args:

@@ -83,16 +73,17 @@ class Releases:
         releases = {}
         releases['tools'] = []

-        for repository in repositories:
-            files = await self._get_release(repository)
-            if files:
-                for file in files:
-                    releases['tools'].append(file)
+        results: list = await asyncio.gather(*[self.__get_release(repository) for repository in repositories])
+
+        for result in results:
+            for asset in result:
+                releases['tools'].append(asset)

         await self.InternalCache.store('releases', releases)

         return releases

-    async def _get_patches_json(self) -> dict:
+    async def __get_patches_json(self) -> dict:
         # Get revanced-patches repository's README.md.
         #
         # Returns:

@@ -113,12 +104,12 @@ class Releases:
         if await self.InternalCache.exists('patches'):
             patches = await self.InternalCache.get('patches')
         else:
-            patches = await self._get_patches_json()
+            patches = await self.__get_patches_json()
             await self.InternalCache.store('patches', patches)

         return patches

-    async def _get_contributors(self, repository: str) -> list:
+    async def __get_contributors(self, repository: str) -> list:
         # Get contributors from a given repository.
         #
         # Args:

@@ -127,9 +118,14 @@ class Releases:
         # Returns:
         #   list: a list of dictionaries containing the repository's contributors

+        keep: set = {'login', 'avatar_url', 'html_url'}
         response = await self.httpx_client.get(f"https://api.github.com/repos/{repository}/contributors")
-        return response.json()
+        contributors: list = [keyfilter(lambda k: k in keep, contributor) for contributor in response.json()]
+        return contributors

     async def get_contributors(self, repositories: list) -> dict:
         """Runs get_contributors() asynchronously for each repository.

@@ -148,11 +144,72 @@ class Releases:
         else:
             contributors = {}
             contributors['repositories'] = []
-            for repository in repositories:
-                if 'revanced' in repository:
-                    repo_contributors = await self._get_contributors(repository)
-                    data = { 'name': repository, 'contributors': repo_contributors }
-                    contributors['repositories'].append(data)
+
+            revanced_repositories = [repository for repository in repositories if 'revanced' in repository]
+
+            results: list[dict] = await asyncio.gather(*[self.__get_contributors(repository) for repository in revanced_repositories])
+
+            for key, value in zip(revanced_repositories, results):
+                data = { 'name': key, 'contributors': value }
+                contributors['repositories'].append(data)

             await self.InternalCache.store('contributors', contributors)

         return contributors
async def get_commits(self, org: str, repository: str, path: str) -> dict:
"""Get commit history from a given repository.
Args:
org (str): Username of the organization | valid values: revanced or vancedapp
repository (str): Repository name
path (str): Path to the file
Raises:
Exception: Raise a generic exception if the organization is not revanced or vancedapp
Returns:
dict: a dictionary containing the repository's latest commits
"""
payload: dict = {}
payload["repository"] = f"{org}/{repository}"
payload["path"] = path
payload["commits"] = []
if org == 'revanced' or org == 'vancedapp':
key: str = f"{org}/{repository}/{path}"
if await self.InternalCache.exists(key):
return await self.InternalCache.get(key)
else:
_releases = await self.httpx_client.get(
f"https://api.github.com/repos/{org}/{repository}/releases?per_page=2"
)
releases = _releases.json()
since = releases[1]['created_at']
until = releases[0]['created_at']
_response = await self.httpx_client.get(
f"https://api.github.com/repos/{org}/{repository}/commits?path={path}&since={since}&until={until}"
)
response = _response.json()
for commit in response:
data: dict[str, str] = {}
data["sha"] = commit["sha"]
data["author"] = commit["commit"]["author"]["name"]
data["message"] = commit["commit"]["message"]
data["html_url"] = commit["html_url"]
payload['commits'].append(data)
await self.InternalCache.store(key, payload)
return payload
else:
raise Exception("Invalid organization.")
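get_commits flattens each GitHub commit object into four fields before caching. That mapping as a standalone function (the function name is illustrative, not part of the commit):

```python
def extract_commit(commit: dict) -> dict:
    # Shape of each entry appended to payload["commits"] in get_commits();
    # matches the ChangelogsResponseFields model.
    return {
        "sha": commit["sha"],
        "author": commit["commit"]["author"]["name"],
        "message": commit["commit"]["message"],
        "html_url": commit["html_url"],
    }
```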

src/models/AnnouncementModels.py
@@ -0,0 +1,46 @@
from pydantic import BaseModel
from typing import Literal
AnnouncementType = Literal["info", "warning", "error"]
class AnnouncementModel(BaseModel):
"""Implements the fields for the announcements.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
created_at: int
author: str
type: AnnouncementType
title: str
content: str
class AnnouncementCreateModel(BaseModel):
"""Implements the fields for creating an announcement.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
type: AnnouncementType
title: str
content: str
class AnnouncementCreatedResponse(BaseModel):
"""Implements the response fields for created announcements.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
created: bool
class AnnouncementDeleted(BaseModel):
"""Implements the response fields for deleted announcements.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
deleted: bool

src/models/ClientModels.py
@@ -0,0 +1,24 @@
from pydantic import BaseModel
class ClientModel(BaseModel):
"""Implements the fields for the clients.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
id: str
secret: str
admin: bool
active: bool
class ClientAuthModel(BaseModel):
"""Implements the fields for client authentication.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
id: str
secret: str

@@ -0,0 +1,51 @@
from pydantic import BaseModel
class InternalServerError(BaseModel):
"""Implements the response fields for when an internal server error occurs.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
error: str = "Internal Server Error"
message: str = "An internal server error occurred. Please try again later."
class AnnouncementNotFound(BaseModel):
"""Implements the response fields for when an item is not found.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
error: str = "Not Found"
message: str = "No announcement was found."
class ClientNotFound(BaseModel):
"""Implements the response fields for when a client is not found.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
error: str = "Not Found"
message: str = "No client matches the given ID"
class IdNotProvided(BaseModel):
"""Implements the response fields for when the id is not provided.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
error: str = "Bad Request"
message: str = "Missing client id"
class Unauthorized(BaseModel):
"""Implements the response fields for when the client is unauthorized.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
error: str = "Unauthorized"
message: str = "The client is unauthorized to access this resource"

src/models/ResponseFields.py
@@ -52,24 +52,8 @@ class ContributorFields(BaseModel):
     BaseModel (pydantic.BaseModel): BaseModel from pydantic
     """
     login: str
-    id: str
-    node_id: str
     avatar_url: str
-    gravatar_id: str
-    url: str
     html_url: str
-    followers_url: str
-    following_url: str
-    gists_url: str
-    starred_url: str
-    subscriptions_url: str
-    organizations_url: str
-    repos_url: str
-    events_url: str
-    received_events_url: str
-    type: str
-    site_admin: str
-    contributions: int

 class ContributorsResponseFields(BaseModel):
     """Implements the fields for each repository in the /contributors endpoint

@@ -80,3 +64,14 @@ class ContributorsResponseFields(BaseModel):
     name: str
     contributors: list[ ContributorFields ]
class ChangelogsResponseFields(BaseModel):
"""Implements the fields for the /changelogs endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
sha: str
author: str
message: str
html_url: str

src/models/ResponseModels.py
@@ -0,0 +1,101 @@
from pydantic import BaseModel
import src.models.ResponseFields as ResponseFields
"""Implements pydantic models and model generator for the API's responses."""
class ToolsResponseModel(BaseModel):
"""Implements the JSON response model for the /tools endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
tools: list[ ResponseFields.ToolsResponseFields ]
class PatchesResponseModel(BaseModel):
"""Implements the JSON response model for the /patches endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
__root__: list[ ResponseFields.PatchesResponseFields ]
class ContributorsResponseModel(BaseModel):
"""Implements the JSON response model for the /contributors endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
repositories: list[ ResponseFields.ContributorsResponseFields ]
class PingResponseModel(BaseModel):
"""Implements the JSON response model for the /heartbeat endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
status: int
detail: str
class ClientDeletedResponse(BaseModel):
"""Implements the response fields for deleted clients.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
id: str
deleted: bool
class ClientSecretUpdatedResponse(BaseModel):
"""Implements the response fields for updated client secrets.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
id: str
secret: str
class ClientAuthTokenResponse(BaseModel):
"""Implements the response fields for client auth tokens.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
access_token: str
refresh_token: str
class ClientTokenRefreshResponse(BaseModel):
"""Implements the response fields for client token refresh.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
access_token: str
class ClientStatusResponse(BaseModel):
"""Implements the response fields for client status.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
id: str
active: bool
class ChangelogsResponseModel(BaseModel):
"""Implements the JSON response model for the /changelogs endpoint.
Args:
BaseModel (pydantic.BaseModel): BaseModel from pydantic
"""
repository: str
path: str
commits: list[ ResponseFields.ChangelogsResponseFields ]
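For readers without the rest of the codebase, the shape of the /changelogs payload these models describe can be sketched with stdlib dataclasses standing in for the pydantic models (the field names come from the diff above; the sample commit values are invented for illustration):

```python
from dataclasses import dataclass, field, asdict

# Stand-in for ResponseFields.ChangelogsResponseFields (pydantic in the real code)
@dataclass
class ChangelogsResponseFields:
    sha: str
    author: str
    message: str
    html_url: str

# Stand-in for ChangelogsResponseModel
@dataclass
class ChangelogsResponseModel:
    repository: str
    path: str
    commits: list[ChangelogsResponseFields] = field(default_factory=list)

# Hypothetical sample data, not taken from the live API
model = ChangelogsResponseModel(
    repository="revanced/revanced-patches",
    path="src",
    commits=[ChangelogsResponseFields(
        sha="abc123",
        author="octocat",
        message="fix: example",
        html_url="https://github.com/example",
    )],
)

# asdict() recurses into the nested dataclass, producing the JSON-ready dict
payload = asdict(model)
```

In the real code pydantic additionally validates field types on construction; the dataclass stand-in only captures the structure.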

src/utils/Generators.py (new file, 30 additions)
@@ -0,0 +1,30 @@
import time
import uuid
import secrets

class Generators:
    """Generates UUIDs and secrets"""

    async def generate_secret(self) -> str:
        """Generate a random secret

        Returns:
            str: A random secret
        """
        return secrets.token_urlsafe(32)

    async def generate_id(self) -> str:
        """Generate a random UUID

        Returns:
            str: A random UUID (str instead of UUID object)
        """
        return str(uuid.uuid4())

    async def generate_timestamp(self) -> int:
        """Generate a timestamp

        Returns:
            int: A timestamp
        """
        return int(time.time())
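The methods are declared async even though the underlying calls are synchronous, so callers need an event loop. A minimal, self-contained usage sketch (the class body is restated here so the snippet runs on its own):

```python
import asyncio
import secrets
import time
import uuid

# Restatement of src/utils/Generators.py for a self-contained demo
class Generators:
    """Generates UUIDs and secrets"""

    async def generate_secret(self) -> str:
        return secrets.token_urlsafe(32)

    async def generate_id(self) -> str:
        return str(uuid.uuid4())

    async def generate_timestamp(self) -> int:
        return int(time.time())

async def main() -> tuple[str, str, int]:
    gen = Generators()
    # Awaiting each coroutine yields the plain str/int values
    return (await gen.generate_secret(),
            await gen.generate_id(),
            await gen.generate_timestamp())

secret, client_id, ts = asyncio.run(main())
```

`token_urlsafe(32)` draws 32 random bytes and base64url-encodes them, so the secret string is ~43 characters, and the id round-trips through `uuid.UUID` as a valid UUID4.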

src/utils/HTTPXClient.py (new file, 32 additions)
@@ -0,0 +1,32 @@
import os
import httpx_cache
import src.utils.Logger as Logger

class HTTPXClient:
    """Implements the methods required to get the latest releases and patches from revanced repositories."""

    @staticmethod
    def create() -> httpx_cache.AsyncClient:
        """Create HTTPX client with cache

        Returns:
            httpx_cache.AsyncClient: HTTPX client with cache
        """
        headers = {'Accept': "application/vnd.github+json",
                   'Authorization': "token " + os.environ['GITHUB_TOKEN']
                   }
        httpx_logger = Logger.HTTPXLogger()
        httpx_client = httpx_cache.AsyncClient(
            headers=headers,
            http2=True,
            event_hooks={
                'request': [httpx_logger.log_request],
                'response': [httpx_logger.log_response]
            }
        )
        return httpx_client
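The event-hook wiring is the interesting part: each hook is an async callable invoked around the request and response. A sketch of that pattern with a fake client (no network, no httpx_cache dependency; the GITHUB_TOKEN fallback and all names here are stand-ins for illustration):

```python
import asyncio
import os

# The GitHub headers the real client is configured with; a placeholder
# token is used when GITHUB_TOKEN is not set in this sketch
headers = {
    "Accept": "application/vnd.github+json",
    "Authorization": "token " + os.environ.get("GITHUB_TOKEN", "<placeholder>"),
}

log_lines: list[str] = []

async def log_request(url: str) -> None:
    log_lines.append(f"request {url}")

async def log_response(body: str) -> None:
    log_lines.append("response received")

# Stand-in for httpx_cache.AsyncClient: hooks registered under "request"
# and "response" run before and after each call
class FakeAsyncClient:
    def __init__(self, headers: dict, event_hooks: dict):
        self.headers = headers
        self.event_hooks = event_hooks

    async def get(self, url: str) -> str:
        for hook in self.event_hooks["request"]:
            await hook(url)
        body = f"body for {url}"  # no real network I/O in this sketch
        for hook in self.event_hooks["response"]:
            await hook(body)
        return body

client = FakeAsyncClient(headers, {"request": [log_request], "response": [log_response]})
body = asyncio.run(client.get("https://api.github.com/repos/revanced"))
```

This mirrors how httpx-style clients let the HTTPXLogger observe traffic without the caller doing any explicit logging.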


@@ -1,55 +1,57 @@
 import os
 import toml
-import orjson
-import msgpack
-import aioredis
-import modules.utils.Logger as Logger
+from typing import Any
+from redis import asyncio as aioredis
+import src.utils.Logger as Logger
+from src.utils.RedisConnector import RedisConnector
 
+# Load config
 config: dict = toml.load("config.toml")
 
-# Redis connection parameters
-redis_config: dict[ str, str | int ] = {
-    "url": f"redis://{os.environ['REDIS_URL']}",
-    "port": os.environ['REDIS_PORT'],
-    "database": config['internal-cache']['database'],
-}
 
 class InternalCache:
     """Implements an internal cache for ReVanced Releases API."""
-    redis_url = f"{redis_config['url']}:{redis_config['port']}/{redis_config['database']}"
-    redis = aioredis.from_url(redis_url, encoding="utf-8", decode_responses=True)
+    redis = RedisConnector.connect(config['internal-cache']['database'])
     InternalCacheLogger = Logger.InternalCacheLogger()
 
     async def store(self, key: str, value: dict) -> None:
+        """Stores a key-value pair in the cache.
+
+        Args:
+            key (str): the key to store
+            value (dict): the JSON value to store
+        """
         try:
-            await self.redis.set(key, orjson.dumps(value), ex=config['internal-cache']['expire'])
+            await self.redis.json().set(key, '$', value)
+            await self.redis.expire(key, config['internal-cache']['expire'])
             await self.InternalCacheLogger.log("SET", None, key)
         except aioredis.RedisError as e:
             await self.InternalCacheLogger.log("SET", e)
 
     async def delete(self, key: str) -> None:
+        """Removes a key-value pair from the cache.
+
+        Args:
+            key (str): the key to delete
+        """
         try:
             await self.redis.delete(key)
             await self.InternalCacheLogger.log("DEL", None, key)
         except aioredis.RedisError as e:
             await self.InternalCacheLogger.log("DEL", e)
 
-    async def update(self, key: str, value: dict) -> None:
-        try:
-            await self.redis.set(key, orjson.dumps(value), ex=config['internal-cache']['expire'])
-            await self.InternalCacheLogger.log("SET", None, key)
-        except aioredis.RedisError as e:
-            await self.InternalCacheLogger.log("SET", e)
-
     async def get(self, key: str) -> dict:
+        """Gets a key-value pair from the cache.
+
+        Args:
+            key (str): the key to retrieve
+
+        Returns:
+            dict: the JSON value stored in the cache or an empty dict if key doesn't exist or an error occurred
+        """
         try:
-            payload = orjson.loads(await self.redis.get(key))
+            payload: dict[Any, Any] = await self.redis.json().get(key)
             await self.InternalCacheLogger.log("GET", None, key)
             return payload
         except aioredis.RedisError as e:
@@ -57,6 +59,14 @@ class InternalCache:
             return {}
 
     async def exists(self, key: str) -> bool:
+        """Checks if a key exists in the cache.
+
+        Args:
+            key (str): key to check
+
+        Returns:
+            bool: True if key exists, False if key doesn't exist or an error occurred
+        """
         try:
             if await self.redis.exists(key):
                 await self.InternalCacheLogger.log("EXISTS", None, key)


@@ -4,6 +4,7 @@ from loguru import logger
 from typing import Optional
 from types import FrameType
 from redis import RedisError
+from argon2.exceptions import VerifyMismatchError
 
 class InterceptHandler(logging.Handler):
     """Setups a loging handler for uvicorn and FastAPI.
@@ -24,15 +25,22 @@ class InterceptHandler(logging.Handler):
         depth: int
 
         # Get corresponding Loguru level if it exists
+        # If not, use default level
         try:
             level = logger.level(record.levelname).name
         except ValueError:
             level = record.levelno
 
+        # Find caller from where originated the logged message
+        # Set depth to 2 to avoid logging of loguru internal calls
         frame = logging.currentframe()
         depth = 2
 
         # Find caller from where originated the logged message
+        # The logging module uses a stack frame to keep track of where logging messages originate
+        # This stack frame is used to find the correct place in the code where the logging message was generated
+        # The mypy error is ignored because the logging module is not properly typed
         while frame.f_code.co_filename == logging.__file__:
             frame = frame.f_back
             depth += 1
@@ -74,6 +82,33 @@ class InternalCacheLogger:
         else:
             logger.info(f"[InternalCache] REDIS {operation} {key} - OK")
 
+class UserLogger:
+
+    async def log(self, operation: str, result: RedisError | VerifyMismatchError | None = None,
+                  key: str = "",) -> None:
+        """Logs internal cache operations
+
+        Args:
+            operation (str): Operation name
+            key (str): Key used in the operation
+        """
+        if type(result) is RedisError:
+            logger.error(f"[User] REDIS {operation} - Failed with error: {result}")
+        else:
+            logger.info(f"[User] REDIS {operation} {key} - OK")
+
+class AnnouncementsLogger:
+
+    async def log(self, operation: str, result: RedisError | None = None, key: str = "") -> None:
+        """Logs internal cache operations
+
+        Args:
+            operation (str): Operation name
+            key (str): Key used in the operation
+        """
+        if type(result) is RedisError:
+            logger.error(f"[User] REDIS {operation} - Failed with error: {result}")
+        else:
+            logger.info(f"[User] REDIS {operation} {key} - OK")
 
 def setup_logging(LOG_LEVEL: str, JSON_LOGS: bool) -> None:
     """Setup logging for uvicorn and FastAPI."""


@@ -0,0 +1,23 @@
import os
import toml
from redis import asyncio as aioredis

# Load config
config: dict = toml.load("config.toml")

# Redis connection parameters
redis_config: dict[ str, str | int ] = {
    "url": f"redis://{os.environ['REDIS_URL']}",
    "port": os.environ['REDIS_PORT'],
}

class RedisConnector:
    """Implements the RedisConnector class for the ReVanced API"""

    @staticmethod
    def connect(database: str) -> aioredis.Redis:
        """Connect to Redis"""
        redis_url = f"{redis_config['url']}:{redis_config['port']}/{database}"
        return aioredis.from_url(redis_url, encoding="utf-8", decode_responses=True)
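The URL construction is the only logic here and is worth seeing in isolation; this sketch builds the same `redis://host:port/db` string without connecting, with local fallbacks for the environment variables (the fallback values are assumptions for the demo, not defaults from the project):

```python
import os

# Same shape as RedisConnector's redis_config, with demo fallbacks
# for when REDIS_URL / REDIS_PORT are not set in the environment
redis_config = {
    "url": f"redis://{os.environ.get('REDIS_URL', 'localhost')}",
    "port": os.environ.get("REDIS_PORT", "6379"),
}

def redis_url_for(database: str) -> str:
    # Each logical database (internal cache, clients, announcements, ...)
    # gets its own numbered Redis database at the end of the URL
    return f"{redis_config['url']}:{redis_config['port']}/{database}"

url = redis_url_for("0")
```

Centralizing this in one connector lets InternalCache and the other stores share connection parameters while selecting different database numbers.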

src/utils/__init__.py (new empty file)