Compare commits

...

64 Commits

Author SHA1 Message Date
dfb0588d1c fix: fixed pipeline 2024-11-24 18:27:45 +01:00
3251afdb90 fix: fixed pipeline 2024-11-24 18:25:59 +01:00
85fe263da4 fix: pipeline fix 2024-11-24 18:21:43 +01:00
0be70deb00 fix: fixed pipeline 2024-11-24 18:18:13 +01:00
0418c75e19 feat: added all install option for dependencies 2024-11-24 18:16:03 +01:00
2444269486 feat: added auth0 common module 2024-11-24 18:13:58 +01:00
creyD
33bdeb12a0 Adjusted files for isort & autopep 2024-11-24 17:03:03 +00:00
5efed5399b Update README.md 2024-11-24 18:02:00 +01:00
7dbce117c8 feat: added common database helper 2024-11-24 18:01:45 +01:00
481bfcfffd feat: unified configs for pg sessions 2024-11-24 17:57:53 +01:00
90c9d2dc09 breaking: default version no longer uses postgres 2024-11-24 17:57:49 +01:00
renovate[bot]
8b037fbeb5 chore: Configure Renovate (#20)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
Co-authored-by: Conrad <grosserconrad@gmail.com>
2024-11-24 16:23:52 +01:00
b86b58f3e4 Merge pull request #19 from creyD/dev 2024-11-22 13:20:27 +01:00
creyD
17f96c920d Adjusted files for isort & autopep 2024-11-22 11:58:05 +00:00
vikynoah
523241ac4b feat: N-271 async db (#18) 2024-11-22 12:56:45 +01:00
creyD
6f09c2ef4c Adjusted files for isort & autopep 2024-11-15 11:39:59 +00:00
vikynoah
9bba5b0a4e fix: N 271 async db (#17) 2024-11-15 12:39:30 +01:00
creyD
50031556f9 Adjusted files for isort & autopep 2024-11-12 08:54:34 +00:00
vikynoah
2940ddbdcd feat: Introduce ASYNC DB as Plug and Play (#16)
Co-authored-by: vikbhas <waraa.vignesh@gmail.com>
2024-11-12 09:54:04 +01:00
807af12fa1 Merge pull request #13 from creyD/dev 2024-11-05 11:54:46 +01:00
creyD
dce897c247 Adjusted files for isort & autopep 2024-11-05 10:29:40 +00:00
vikynoah
89997372ef fix: Changes to accomodate pagination flag in Params (#14)
Co-authored-by: vikbhas <waraa.vignesh@gmail.com>
2024-11-05 11:29:06 +01:00
c8c5977978 fix: removed non-working backsync 2024-10-29 16:20:01 +01:00
974bc591d6 Merge pull request #11 from creyD/dev 2024-10-29 15:49:00 +01:00
eb895398ab fix: trying again to fix the pipeline 2024-10-29 15:46:52 +01:00
867abd7054 fix: fixed workflow again 2024-10-29 15:33:54 +01:00
26e18f6b31 Merge pull request #10 from creyD/dev 2024-10-29 15:32:28 +01:00
8a3a60dbb0 fix: fixed workflow again 2024-10-29 15:30:42 +01:00
e52a5f421b Merge pull request #9 from creyD/dev 2024-10-29 15:23:41 +01:00
a6ded91185 fix: syncing back tags to dev 2024-10-29 15:18:57 +01:00
eb64874c47 fix: minor adjustment to the pipeline 2024-10-29 15:13:36 +01:00
b7200852a4 Merge branch 'dev' of https://github.com/creyD/creyPY into dev 2024-10-29 15:13:30 +01:00
3d18205205 fix: fixed github pipeline 2024-10-29 15:12:41 +01:00
99c84b676c Merge pull request #8 from creyD/dev 2024-10-29 15:09:34 +01:00
6806de23b3 fix: fixed github pipeline 2024-10-29 15:06:31 +01:00
6a93ab05a3 Merge pull request #6 from creyD/dev 2024-10-29 14:56:14 +01:00
vikynoah
c5b2ab9932 fix: Add condition for total greater than zero (#7) 2024-10-28 15:37:14 +01:00
5a32a5908b Removed debug statement 2024-10-25 15:34:16 +02:00
b7df0bfdcd fix: added escape for variable names 2024-10-25 15:27:50 +02:00
378d1d60f1 fix: adjusting pipeline for prod as well 2024-10-25 15:22:35 +02:00
e381992f8e fix: fixing dev versioning 2024-10-25 15:09:15 +02:00
6d5411a8ae fix: debugging pipeline 2024-10-25 14:46:35 +02:00
89351d714b fix: debugging pipeline 2024-10-25 14:44:18 +02:00
c24f8933fb fix: attempt on fixing the versioning issue 2024-10-25 14:38:58 +02:00
0bed0e0da4 fix: attempt on fixing the versioning issue 2024-10-25 14:34:56 +02:00
8463eef907 fix: attempt on fixing the versioning issue 2024-10-25 14:25:48 +02:00
5903de2aad fix: fixed semantic versioning format selector 2024-10-25 14:19:11 +02:00
0bf89fe14d fix: switched to semantic versioning action 2024-10-25 14:12:04 +02:00
d54146e05b fix: fixed naming of pre-release commits 2024-10-24 12:41:04 +02:00
d6f79c3ed8 fix: fixed naming of pre-release commits 2024-10-24 12:35:13 +02:00
3f4a0ee00d fix: fixed naming of pre-release commits 2024-10-24 12:25:30 +02:00
714178d68f fix: fixed naming of pre-release commits 2024-10-24 12:22:45 +02:00
c7e205f14b fix: fixed naming of pre-release commits 2024-10-24 12:18:50 +02:00
39ae74becb fix: minor changelog adjustment 2024-10-24 12:15:23 +02:00
5f39966223 fix: Fixed tag pushing and changelog 2024-10-24 12:10:16 +02:00
c91e684f08 fix: fix attempt for the github pipeline 2024-10-24 12:10:16 +02:00
f11b8b8864 fix: alternative attempt on the fix 2024-10-24 12:10:16 +02:00
983553e97a fix: locked tag and publish to master and dev 2024-10-24 12:10:16 +02:00
8740eafce2 fix: fixed pipeline tagging 2024-10-24 12:10:16 +02:00
aa44b9ebe9 fix: fixed pipeline tagging 2024-10-24 12:10:16 +02:00
851573d964 fix: fixed pipeline tagging 2024-10-24 12:10:16 +02:00
cfa1da08d3 fix: pipeline now pushes pre-release versions 2024-10-24 12:10:16 +02:00
4a5a777ef5 breaking: Fixed #3 2024-10-24 12:10:16 +02:00
c9a9b1bc0a breaking: Fixed #1 2024-10-24 12:10:16 +02:00
23 changed files with 726 additions and 75 deletions

View File

@@ -6,14 +6,12 @@ on:
- master
- dev
paths-ignore:
- "**/.github/**"
- "**/.gitignore"
- "**/.vscode/**"
- "**/README.md"
- "**/CHANGELOG.md"
pull_request:
branches:
- master
- dev
workflow_dispatch:
@@ -42,12 +40,15 @@ jobs:
with:
python-version: '3.12'
- run: python -m pip install --upgrade pip
- run: python -m pip install -r requirements.txt
- run: |
python -m pip install -r requirements.txt
python -m pip install -r requirements.pg.txt
python -m pip install -r requirements.auth0.txt
- run: python test.py
tag_and_publish:
if: github.ref == 'refs/heads/master'
runs-on: ubuntu-latest
if: (github.ref_name == 'master' || github.ref_name == 'dev') && github.event_name == 'push'
needs: test
permissions:
id-token: write # IMPORTANT: this permission is mandatory for trusted publishing
@@ -57,7 +58,7 @@ jobs:
- uses: actions/checkout@v4
with:
fetch-tags: true
ref: ${{ github.ref }}
ref: ${{ github.ref_name }}
fetch-depth: 0
- name: setup git
@@ -65,18 +66,29 @@ jobs:
git config --local user.email "15138480+creyD@users.noreply.github.com"
git config --local user.name "creyD"
- name: set version format
id: version_format
run: |
if [[ ${{ github.ref_name }} == 'master' ]]; then
echo "version_format=\${major}.\${minor}.\${patch}" >> $GITHUB_OUTPUT
else
echo "version_format=\${major}.\${minor}.\${patch}rc\${increment}" >> $GITHUB_OUTPUT
fi
- name: Git Version
uses: codacy/git-version@2.8.0
uses: PaulHatch/semantic-version@v5.4.0
id: git_version
with:
minor-identifier: "feat:"
major-identifier: "breaking:"
tag_prefix: ""
major_pattern: "breaking:"
minor_pattern: "feat:"
enable_prerelease_mode: false
version_format: ${{ steps.version_format.outputs.version_format }}
- name: Create Tag
run: git tag ${{ steps.git_version.outputs.version }}
- name: Push tag
run: git push origin ${{ steps.git_version.outputs.version }}
- name: Create & Push Tag
run: |
git tag ${{ steps.git_version.outputs.version }}
git push origin ${{ steps.git_version.outputs.version }}
- name: Set up Python
uses: actions/setup-python@v5
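The net effect of the version_format step: pushes to master are tagged with plain semver, while dev pushes get rc pre-release tags. A small sketch of the two formats with assumed version components (the real values come from the semantic-version action):

```python
# Hypothetical illustration only; major/minor/patch/increment values are assumed.
def render(fmt: str, major: int = 1, minor: int = 4, patch: int = 0, increment: int = 3) -> str:
    return (fmt.replace("${major}", str(major))
               .replace("${minor}", str(minor))
               .replace("${patch}", str(patch))
               .replace("${increment}", str(increment)))

assert render("${major}.${minor}.${patch}") == "1.4.0"                   # master release
assert render("${major}.${minor}.${patch}rc${increment}") == "1.4.0rc3"  # dev pre-release
```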

View File

@@ -2,6 +2,25 @@
All notable changes to this project will be documented in this file.
## 2.0.0
- Fixed #1 Rename misspelled additonal_data to additional_data on create_obj_from_data
- Fixed #3 Inverse partial flag: bool = False because it was wrong on update_obj_from_data
Notes:
You will need to change calls to `create_obj_from_data` according to #1 (rename additonal_data to additional_data)
You will need to change calls to `update_obj_from_data` according to #3 (if you supplied `partial`, you will need to reverse it: `true` -> `false` and `false` -> `true`)
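For illustration, a hypothetical call site before and after the 2.0.0 upgrade (model, payload, and session names are assumed):

```python
# 1.x
obj = create_obj_from_data(payload, Item, db, additonal_data={"owner_id": uid})
obj = update_obj_from_data(payload, Item, item_id, db, partial=True)

# 2.0.0: keyword renamed, partial flag inverted
obj = create_obj_from_data(payload, Item, db, additional_data={"owner_id": uid})
obj = update_obj_from_data(payload, Item, item_id, db, partial=False)
```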
## 1.3.0
- Addition of pagination proxy and pagination=off functionality (Thanks to @vikbhas)
## 1.2.5
- Bumped dependencies
## 1.2.4
- Enabled newer versions for all dependencies

View File

@@ -55,8 +55,3 @@ from creyPY.const import LanguageEnum
print(LanguageEnum.EN) # Output: LanguageEnum.EN
print(LanguageEnum.EN.value) # Output: English
```
## TODO
- Add async support for database connection
- Add version without postgresql dependency

View File

@@ -1,63 +1,214 @@
from typing import Type, TypeVar
from typing import Type, TypeVar, overload
from uuid import UUID
from fastapi import HTTPException
from pydantic import BaseModel
from sqlalchemy.orm import Session
from sqlalchemy.ext.asyncio import AsyncSession
from sqlalchemy.future import select
import asyncio
from .models.base import Base
T = TypeVar("T", bound=Base)
@overload
async def get_object_or_404(
db_class: Type[T],
id: UUID | str,
db: AsyncSession,
expunge: bool = False,
lookup_column: str = "id",
) -> T:
pass
@overload
def get_object_or_404(
db_class: Type[T], id: UUID | str, db: Session, expunge: bool = False, lookup_column: str = "id"
) -> T:
obj = db.query(db_class).filter(getattr(db_class, lookup_column) == id).one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.")
if expunge:
db.expunge(obj)
return obj
pass
# TODO: Add testing
def create_obj_from_data(
data: BaseModel, model: Type[T], db: Session, additonal_data={}, exclude={}
def get_object_or_404(
db_class: Type[T],
id: UUID | str,
db: Session | AsyncSession,
expunge: bool = False,
lookup_column: str = "id",
) -> T:
obj = model(**data.model_dump(exclude=exclude) | additonal_data)
db.add(obj)
db.commit()
db.refresh(obj)
return obj
async def _get_async_object() -> T:
query = select(db_class).filter(getattr(db_class, lookup_column) == id)
result = await db.execute(query)
obj = result.scalar_one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.") # type: ignore
if expunge:
db.expunge(obj)  # expunge() is synchronous, even on AsyncSession
return obj
def _get_sync_object() -> T:
obj = db.query(db_class).filter(getattr(db_class, lookup_column) == id).one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.") # type: ignore
if expunge:
db.expunge(obj)
return obj
if isinstance(db, AsyncSession):
return asyncio.ensure_future(_get_async_object()) # type: ignore
elif isinstance(db, Session):
return _get_sync_object()
else:
raise HTTPException(status_code=404, detail="Invalid session type. Expected Session or AsyncSession.") # type: ignore
# TODO: Add testing
@overload
async def create_obj_from_data(
data: BaseModel,
model: Type[T],
db: AsyncSession,
additional_data: dict = {},
exclude: dict = {},
) -> T:
pass
@overload
def create_obj_from_data(
data: BaseModel, model: Type[T], db: Session, additional_data: dict = {}, exclude: dict = {}
) -> T:
pass
def create_obj_from_data(
data: BaseModel, model: Type[T], db: Session | AsyncSession, additional_data={}, exclude={}
) -> T:
obj_data = data.model_dump(exclude=exclude) | additional_data
obj = model(**obj_data)
async def _create_async_obj():
db.add(obj)
await db.commit()
await db.refresh(obj)
return obj
def _create_sync_obj():
db.add(obj)
db.commit()
db.refresh(obj)
return obj
if isinstance(db, AsyncSession):
return asyncio.ensure_future(_create_async_obj()) # type: ignore
elif isinstance(db, Session):
return _create_sync_obj()
else:
raise HTTPException(status_code=404, detail="Invalid session type. Expected Session or AsyncSession.") # type: ignore
# TODO: Add testing
@overload
async def update_obj_from_data(
data: BaseModel,
model: Type[T],
id: UUID | str,
db: AsyncSession,
partial: bool = True,
ignore_fields: list = [],
additional_data: dict = {},
exclude: dict = {},
) -> T:
pass
@overload
def update_obj_from_data(
data: BaseModel,
model: Type[T],
id: UUID | str,
db: Session,
partial: bool = False, # TODO: inverse, because it is currently the wrong way around
partial: bool = True,
ignore_fields: list = [],
additional_data: dict = {},
exclude: dict = {},
) -> T:
pass
def update_obj_from_data(
data: BaseModel,
model: Type[T],
id: UUID | str,
db: Session | AsyncSession,
partial: bool = True,
ignore_fields=[],
additional_data={},
exclude={},
) -> T:
obj = get_object_or_404(model, id, db)
data_dict = data.model_dump(exclude_unset=not partial, exclude=exclude)
data_dict.update(additional_data) # merge additional_data into data_dict
for field in data_dict:
if field not in ignore_fields:
setattr(obj, field, data_dict[field])
db.commit()
db.refresh(obj)
return obj
def _update_fields(obj: T):
data_dict = data.model_dump(exclude_unset=partial, exclude=exclude)
data_dict.update(additional_data)
for field in data_dict:
if field not in ignore_fields:
setattr(obj, field, data_dict[field])
async def _update_async_obj() -> T:
obj = await get_object_or_404(model, id, db)
_update_fields(obj)
await db.commit()
await db.refresh(obj)
return obj
def _update_sync_obj() -> T:
obj = get_object_or_404(model, id, db)
_update_fields(obj)
db.commit()
db.refresh(obj)
return obj
if isinstance(db, AsyncSession):
return asyncio.ensure_future(_update_async_obj()) # type: ignore
elif isinstance(db, Session):
return _update_sync_obj()
else:
raise HTTPException(status_code=404, detail="Invalid session type. Expected Session or AsyncSession.") # type: ignore
# TODO: Add testing
@overload
async def delete_object(db_class: Type[T], id: UUID | str, db: AsyncSession) -> None:
pass
@overload
def delete_object(db_class: Type[T], id: UUID | str, db: Session) -> None:
obj = db.query(db_class).filter(db_class.id == id).one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.")
db.delete(obj)
db.commit()
pass
def delete_object(db_class: Type[T], id: UUID | str, db: Session | AsyncSession) -> None:
async def _delete_async_obj() -> None:
query = select(db_class).filter(db_class.id == id)
result = await db.execute(query)
obj = result.scalar_one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.")
await db.delete(obj)
await db.commit()
def _delete_sync_obj() -> None:
obj = db.query(db_class).filter(db_class.id == id).one_or_none()
if obj is None:
raise HTTPException(status_code=404, detail="The object does not exist.")
db.delete(obj)
db.commit()
if isinstance(db, AsyncSession):
return asyncio.ensure_future(_delete_async_obj()) # type: ignore
elif isinstance(db, Session):
return _delete_sync_obj()
else:
raise HTTPException(status_code=404, detail="Invalid session type. Expected Session or AsyncSession.") # type: ignore
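Taken together, each helper dispatches on the session type: Session callers get the result directly, while AsyncSession callers receive an awaitable. A hypothetical pair of call sites (model, schema, and session names are assumed):

```python
# Sync: returns the ORM object or raises HTTPException(404)
item = get_object_or_404(Item, item_id, db)

# Async: the same call returns a scheduled future, so it is awaited
item = await get_object_or_404(Item, item_id, async_db)
created = await create_obj_from_data(ItemCreate(title="demo"), Item, async_db)
```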

View File

@@ -1 +1,3 @@
from .async_session import * # noqa
from .helpers import * # noqa
from .session import * # noqa

View File

@@ -0,0 +1,21 @@
from typing import AsyncGenerator
from sqlalchemy.ext.asyncio import AsyncSession, create_async_engine
from sqlalchemy.orm import sessionmaker
from .common import SQLALCHEMY_DATABASE_URL, name
async_engine = create_async_engine(SQLALCHEMY_DATABASE_URL + name, pool_pre_ping=True)
AsyncSessionLocal = sessionmaker(
bind=async_engine,
class_=AsyncSession,
expire_on_commit=False,
autoflush=False,
autocommit=False,
)
async def get_async_db() -> AsyncGenerator[AsyncSession, None]:
async with AsyncSessionLocal() as db:
yield db
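A minimal FastAPI sketch wiring up the dependency above (route path and query are assumed):

```python
from fastapi import Depends, FastAPI
from sqlalchemy import text

app = FastAPI()

@app.get("/healthz")
async def healthz(db: AsyncSession = Depends(get_async_db)):
    await db.execute(text("SELECT 1"))  # round-trip to verify the connection pool
    return {"status": "ok"}
```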

View File

@@ -0,0 +1,13 @@
import os
from dotenv import load_dotenv
load_dotenv()
host = os.getenv("POSTGRES_HOST", "localhost")
user = os.getenv("POSTGRES_USER", "postgres")
password = os.getenv("POSTGRES_PASSWORD", "root")
port = os.getenv("POSTGRES_PORT", "5432")
name = os.getenv("POSTGRES_DB", "fastapi")
SQLALCHEMY_DATABASE_URL = f"postgresql+psycopg://{user}:{password}@{host}:{port}/"

View File

@@ -0,0 +1,8 @@
from sqlalchemy_utils import create_database, database_exists
def create_if_not_exists(db_name: str):
from .common import SQLALCHEMY_DATABASE_URL
if not database_exists(SQLALCHEMY_DATABASE_URL + db_name):
create_database(SQLALCHEMY_DATABASE_URL + db_name)
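A hypothetical bootstrap call, e.g. before running migrations (the creyPY.db import path is assumed from the package layout):

```python
from creyPY.db import create_if_not_exists  # re-exported via the package __init__

create_if_not_exists("fastapi")  # no-op when the database already exists
```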

View File

@@ -1,21 +1,10 @@
import os
from typing import Generator
from dotenv import load_dotenv
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker
from sqlalchemy.orm.session import Session
load_dotenv()
host = os.getenv("POSTGRES_HOST", "localhost")
user = os.getenv("POSTGRES_USER", "postgres")
password = os.getenv("POSTGRES_PASSWORD", "root")
port = os.getenv("POSTGRES_PORT", "5432")
name = os.getenv("POSTGRES_DB", "fastapi")
SQLALCHEMY_DATABASE_URL = f"postgresql+psycopg://{user}:{password}@{host}:{port}/"
from .common import SQLALCHEMY_DATABASE_URL, name
engine = create_engine(SQLALCHEMY_DATABASE_URL + name, pool_pre_ping=True)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)

View File

@@ -1,6 +1,7 @@
from math import ceil
from typing import Any, Generic, Optional, Self, Sequence, TypeVar, Union
from typing import Any, Generic, Optional, Self, Sequence, TypeVar, Union, overload
from contextlib import suppress
from pydantic import BaseModel
from fastapi_pagination import Params
from fastapi_pagination.bases import AbstractPage, AbstractParams
from fastapi_pagination.types import (
@@ -8,18 +9,36 @@ from fastapi_pagination.types import (
GreaterEqualZero,
AdditionalData,
SyncItemsTransformer,
AsyncItemsTransformer,
ItemsTransformer,
)
from fastapi_pagination.api import create_page, apply_items_transformer
from fastapi_pagination.utils import verify_params
from fastapi_pagination.ext.sqlalchemy import create_paginate_query
from fastapi_pagination.bases import AbstractParams, RawParams
from pydantic.json_schema import SkipJsonSchema
from sqlalchemy.sql.selectable import Select
from sqlalchemy.orm.session import Session
from sqlalchemy import select, func
from sqlalchemy.ext.asyncio import AsyncSession, async_scoped_session
from fastapi import Query
from sqlalchemy.util import await_only, greenlet_spawn
T = TypeVar("T")
class PaginationParams(BaseModel, AbstractParams):
page: int = Query(1, ge=1, description="Page number")
size: int = Query(50, ge=1, le=100, description="Page size")
pagination: bool = Query(True, description="Toggle pagination")
def to_raw_params(self) -> RawParams:
if not self.pagination:
return RawParams(limit=None, offset=None)
return RawParams(limit=self.size, offset=(self.page - 1) * self.size)
# TODO: Add complete fastapi-pagination proxy here
# TODO: Add pagination off functionality
# SkipJsonSchema is used to avoid generating invalid JSON schema in FastAPI
@@ -32,7 +51,7 @@ class Page(AbstractPage[T], Generic[T]):
has_next: bool | SkipJsonSchema[None] = None
has_prev: bool | SkipJsonSchema[None] = None
__params_type__ = Params
__params_type__ = PaginationParams
@classmethod
def create(
@@ -94,28 +113,72 @@ def unwrap_scalars(
return [item[0] if force_unwrap else item for item in items]
def _get_sync_conn_from_async(conn: Any) -> Session: # pragma: no cover
if isinstance(conn, async_scoped_session):
conn = conn()
with suppress(AttributeError):
return conn.sync_session # type: ignore
with suppress(AttributeError):
return conn.sync_connection # type: ignore
raise TypeError("conn must be an AsyncConnection or AsyncSession")
@overload
def paginate(
connection: Session,
query: Select,
paginationFlag: bool = True,
params: Optional[AbstractParams] = None,
transformer: Optional[SyncItemsTransformer] = None,
additional_data: Optional[AdditionalData] = None,
) -> Any:
pass
@overload
async def paginate(
connection: AsyncSession,
query: Select,
params: Optional[AbstractParams] = None,
transformer: Optional[AsyncItemsTransformer] = None,
additional_data: Optional[AdditionalData] = None,
) -> Any:
pass
def _paginate(
connection: Session,
query: Select,
params: Optional[AbstractParams] = None,
transformer: Optional[ItemsTransformer] = None,
additional_data: Optional[AdditionalData] = None,
async_: bool = False,
):
params, raw_params = verify_params(params, "limit-offset", "cursor")
if async_:
def _apply_items_transformer(*args: Any, **kwargs: Any) -> Any:
return await_only(apply_items_transformer(*args, **kwargs, async_=True))
else:
_apply_items_transformer = apply_items_transformer
params, raw_params = verify_params(params, "limit-offset", "cursor")
count_query = create_count_query(query)
total = connection.scalar(count_query)
if paginationFlag is False:
if params.pagination is False and total > 0:
params = Params(page=1, size=total)
else:
params = Params(page=params.page, size=params.size)
query = create_paginate_query(query, params)
items = connection.execute(query).all()
items = unwrap_scalars(items)
t_items = apply_items_transformer(items, transformer)
t_items = _apply_items_transformer(items, transformer)
return create_page(
t_items,
@@ -123,3 +186,19 @@ def paginate(
total=total,
**(additional_data or {}),
)
def paginate(
connection: Session,
query: Select,
params: Optional[AbstractParams] = None,
transformer: Optional[ItemsTransformer] = None,
additional_data: Optional[AdditionalData] = None,
):
if isinstance(connection, AsyncSession):
connection = _get_sync_conn_from_async(connection)
return greenlet_spawn(
_paginate, connection, query, params, transformer, additional_data, async_=True
)
return _paginate(connection, query, params, transformer, additional_data, async_=False)
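A hypothetical route using the proxy (import paths, the Item model, ItemOut schema, and get_db dependency are assumed):

```python
from fastapi import APIRouter, Depends
from sqlalchemy import select
from sqlalchemy.orm import Session

router = APIRouter()

@router.get("/items/", response_model=Page[ItemOut])
def list_items(db: Session = Depends(get_db), params: PaginationParams = Depends()):
    # ?pagination=false returns every row in a single page (limit/offset unset)
    return paginate(db, select(Item), params=params)
```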

View File

@@ -0,0 +1,139 @@
import json
from httpx import ASGITransport, AsyncClient
class AsyncGenericClient:
def __init__(self, app):
self.c = AsyncClient(transport=ASGITransport(app=app), base_url="http://testserver", follow_redirects=True)  # the app= shortcut is deprecated in recent httpx
self.default_headers = {}
async def get(self, url: str, r_code: int = 200, parse_json=True):
re = await self.c.get(url, headers=self.default_headers)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if parse_json else re.content
async def delete(self, url: str, r_code: int = 204):
re = await self.c.delete(url, headers=self.default_headers)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if r_code != 204 else None
async def post(
self, url: str, obj: dict | str = {}, r_code: int = 201, raw_response=False, *args, **kwargs
):
re = await self.c.post(
url,
content=json.dumps(obj) if isinstance(obj, dict) else obj,  # httpx: raw bodies go in content=, not data=
headers=self.default_headers | {"Content-Type": "application/json"},
*args,
**kwargs,
)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if not raw_response else re
async def post_file(
self, url: str, file, r_code: int = 201, raw_response=False, *args, **kwargs
):
re = await self.c.post(
url,
files={"file": file},
headers=self.default_headers,  # no JSON content type here: httpx must set the multipart boundary
*args,
**kwargs,
)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if not raw_response else re
async def patch(
self, url: str, obj: dict | str = {}, r_code: int = 200, raw_response=False, *args, **kwargs
):
re = await self.c.patch(
url,
content=json.dumps(obj) if isinstance(obj, dict) else obj,
headers=self.default_headers | {"Content-Type": "application/json"},
*args,
**kwargs,
)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if not raw_response else re
async def put(
self, url: str, obj: dict | str = {}, r_code: int = 200, raw_response=False, *args, **kwargs
):
re = await self.c.put(
url,
content=json.dumps(obj) if isinstance(obj, dict) else obj,
headers=self.default_headers
| {
"Content-Type": "application/json",
"accept": "application/json",
},
*args,
**kwargs,
)
if re.status_code != r_code:
print(re.content)
assert r_code == re.status_code
return re.json() if not raw_response else re
async def obj_lifecycle(
self,
input_obj: dict,
url: str,
pagination: bool = True,
id_field: str = "id",
created_at_check: bool = True,
):
# GET LIST
re = await self.get(url)
if pagination:
assert re["total"] == 0
assert len(re["results"]) == 0
else:
assert len(re) == 0
# CREATE
re = await self.post(url, obj=input_obj)
assert id_field in re
assert re[id_field] is not None
if created_at_check:
assert "created_at" in re
assert re["created_at"] is not None
obj_id = str(re[id_field])
# GET
re = await self.get(f"{url}{obj_id}/")
assert re[id_field] == obj_id
# GET LIST
re = await self.get(url)
if pagination:
assert re["total"] == 1
assert len(re["results"]) == 1
else:
assert len(re) == 1
# DELETE
await self.delete(f"{url}{obj_id}")
# GET LIST
re = await self.get(url)
if pagination:
assert re["total"] == 0
assert len(re["results"]) == 0
else:
assert len(re) == 0
# GET
await self.get(f"{url}{obj_id}", parse_json=False, r_code=404)

creyPY/helpers.py (new file, 16 lines)
View File

@@ -0,0 +1,16 @@
import random
import string
def create_random_password(length: int = 12) -> str:
all_characters = string.ascii_letters + string.digits + string.punctuation
password = [
random.choice(string.ascii_lowercase),
random.choice(string.ascii_uppercase),
random.choice(string.digits),
random.choice(string.punctuation),
]
password += random.choices(all_characters, k=length - 4)
random.shuffle(password)
return "".join(password)

View File

@@ -0,0 +1,3 @@
from .exceptions import * # noqa
from .manage import * # noqa
from .utils import * # noqa

View File

@@ -0,0 +1,13 @@
import os
from dotenv import load_dotenv
load_dotenv()
AUTH0_DOMAIN = os.getenv("AUTH0_DOMAIN")
AUTH0_CLIENT_ID = os.getenv("AUTH0_CLIENT_ID")
AUTH0_CLIENT_SECRET = os.getenv("AUTH0_CLIENT_SECRET")
AUTH0_ALGORITHM = os.getenv("AUTH0_ALGORITHM", "RS256")
AUTH0_AUDIENCE = os.getenv("AUTH0_AUDIENCE")
AUTH0_ISSUER = os.getenv("AUTH0_ISSUER")

View File

@@ -0,0 +1,12 @@
from fastapi import HTTPException, status
class UnauthorizedException(HTTPException):
def __init__(self, detail: str, **kwargs):
"""Returns HTTP 403"""
super().__init__(status.HTTP_403_FORBIDDEN, detail=detail)
class UnauthenticatedException(HTTPException):
def __init__(self):
super().__init__(status_code=status.HTTP_401_UNAUTHORIZED, detail="Requires authentication")

View File

@@ -0,0 +1,20 @@
import requests
from cachetools import TTLCache, cached
from .common import AUTH0_CLIENT_ID, AUTH0_CLIENT_SECRET, AUTH0_DOMAIN
cache = TTLCache(maxsize=100, ttl=600)
@cached(cache)
def get_management_token() -> str:
re = requests.post(
f"https://{AUTH0_DOMAIN}/oauth/token",
json={
"client_id": AUTH0_CLIENT_ID,
"client_secret": AUTH0_CLIENT_SECRET,
"audience": f"https://{AUTH0_DOMAIN}/api/v2/", # This should be the management audience
"grant_type": "client_credentials",
},
).json()
return re["access_token"]

View File

@@ -0,0 +1,131 @@
from typing import Optional
import jwt
import requests
from fastapi import HTTPException, Request, Security
from fastapi.security import HTTPAuthorizationCredentials, HTTPBearer
from creyPY.helpers import create_random_password
from .common import (
AUTH0_ALGORITHM,
AUTH0_AUDIENCE,
AUTH0_CLIENT_ID,
AUTH0_DOMAIN,
AUTH0_ISSUER,
)
from .exceptions import UnauthenticatedException, UnauthorizedException
from .manage import get_management_token
JWKS_CLIENT = jwt.PyJWKClient(f"https://{AUTH0_DOMAIN}/.well-known/jwks.json")
async def verify(
request: Request,
token: Optional[HTTPAuthorizationCredentials] = Security(HTTPBearer(auto_error=False)),
) -> str:
if token is None:
raise UnauthenticatedException
# This gets the 'kid' from the passed token
try:
signing_key = JWKS_CLIENT.get_signing_key_from_jwt(token.credentials).key
except jwt.exceptions.PyJWKClientError as error:
raise UnauthorizedException(str(error))
except jwt.exceptions.DecodeError as error:
raise UnauthorizedException(str(error))
try:
payload = jwt.decode(
token.credentials,
signing_key,
algorithms=[AUTH0_ALGORITHM],
audience=AUTH0_AUDIENCE,
issuer=AUTH0_ISSUER,
)
except Exception as error:
raise UnauthorizedException(str(error))
return payload["sub"]
### GENERIC AUTH0 CALLS ###
def get_user(sub) -> dict:
re = requests.get(
f"https://{AUTH0_DOMAIN}/api/v2/users/{sub}",
headers={"Authorization": f"Bearer {get_management_token()}"},
)
if re.status_code != 200:
raise HTTPException(re.status_code, re.json())
return re.json()
def patch_user(input_obj: dict, sub) -> dict:
re = requests.patch(
f"https://{AUTH0_DOMAIN}/api/v2/users/{sub}",
headers={"Authorization": f"Bearer {get_management_token()}"},
json=input_obj,
)
if re.status_code != 200:
raise HTTPException(re.status_code, re.json())
return re.json()
### USER METADATA CALLS ###
def get_user_metadata(sub) -> dict:
try:
return get_user(sub).get("user_metadata", {})
except Exception:  # treat any lookup failure as "no metadata"
return {}
def patch_user_metadata(input_obj: dict, sub) -> dict:
return patch_user({"user_metadata": input_obj}, sub)
def clear_user_metadata(sub) -> dict:
return patch_user({"user_metadata": {}}, sub)
def request_verification_mail(sub: str) -> dict:
re = requests.post(
f"https://{AUTH0_DOMAIN}/api/v2/jobs/verification-email",
headers={"Authorization": f"Bearer {get_management_token()}"},
json={"user_id": sub},
)
if re.status_code != 201:
raise HTTPException(re.status_code, re.json())
return re.json()
def create_user_invite(email: str) -> dict:
re = requests.post(
f"https://{AUTH0_DOMAIN}/api/v2/users",
headers={"Authorization": f"Bearer {get_management_token()}"},
json={
"email": email,
"connection": "Username-Password-Authentication",
"password": create_random_password(),
"verify_email": False,
"app_metadata": {"invitedToMyApp": True},
},
)
if re.status_code != 201:
raise HTTPException(re.status_code, re.json())
return re.json()
def password_change_mail(email: str) -> bool:
re = requests.post(
f"https://{AUTH0_DOMAIN}/dbconnections/change_password",
headers={"Authorization": f"Bearer {get_management_token()}"},
json={
"client_id": AUTH0_CLIENT_ID,
"email": email,
"connection": "Username-Password-Authentication",
},
)
if re.status_code != 200:
raise HTTPException(re.status_code, re.json())
return True
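A hypothetical protected route built on verify (route path and response shape are assumed):

```python
from fastapi import Depends, FastAPI

app = FastAPI()

@app.get("/me")
async def me(sub: str = Depends(verify)):
    # sub is the Auth0 subject claim from the validated JWT
    return get_user_metadata(sub)
```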

renovate.json (new file, 7 lines)
View File

@@ -0,0 +1,7 @@
{
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"extends": [
"config:recommended",
":semanticCommitTypeAll(feat)"
]
}

requirements.auth0.txt (new file, 7 lines)
View File

@@ -0,0 +1,7 @@
cachetools==5.5.0 # for caching
charset-normalizer==3.4.0 # Auth0 API interactions
requests==2.32.3 # Auth0 API interactions
pyjwt==2.10.0 # Auth0 API interactions
cffi==1.17.1 # Auth0 API interactions
cryptography==43.0.3 # Auth0 API interactions
pycparser==2.22 # Auth0 API interactions

View File

@@ -23,5 +23,3 @@ twine>=5.0.0
urllib3>=2.2.1
wheel>=0.43.0
zipp>=3.18.1
-r requirements.txt

requirements.pg.txt (new file, 5 lines)
View File

@@ -0,0 +1,5 @@
psycopg>=3.2.1 # PostgreSQL
psycopg-binary>=3.2.1 # PostgreSQL
psycopg-pool>=3.2.2 # PostgreSQL
asyncpg>=0.30.0 # SQLAlchemy
greenlet>=3.1.1 # Async

View File

@@ -11,13 +11,10 @@ starlette>=0.37.2 # FastAPI
fastapi-pagination>=0.12.26 # Pagination
sqlalchemy>=2.0.31 # SQLAlchemy
sqlalchemy-utils==0.41.2 # For managing databases
python-dotenv>=1.0.1 # Environment variables
psycopg>=3.2.1 # PostgreSQL
psycopg-binary>=3.2.1 # PostgreSQL
psycopg-pool>=3.2.2 # PostgreSQL
h11>=0.14.0 # Testing
httpcore>=1.0.5 # Testing
httpx>=0.27.0 # Testing

View File

@@ -5,6 +5,15 @@ from setuptools import find_packages, setup
with open("requirements.txt") as f:
requirements = f.read().splitlines()
with open("requirements.build.txt") as f:
build_requirements = f.read().splitlines()
with open("requirements.pg.txt") as f:
pg_requirements = f.read().splitlines()
with open("requirements.auth0.txt") as f:
auth0_requirements = f.read().splitlines()
def get_latest_git_tag() -> str:
try:
@@ -33,6 +42,12 @@ setup(
license="MIT",
python_requires=">=3.12",
install_requires=requirements,
extras_require={
"build": build_requirements,
"postgres": pg_requirements,
"auth0": auth0_requirements,
"all": build_requirements + pg_requirements + auth0_requirements,
},
keywords=[
"creyPY",
"Python",
@@ -40,7 +55,6 @@ setup(
"shortcuts",
"snippets",
"utils",
"personal library",
],
platforms="any",
)
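With the extras above, consumers install only the dependency sets they need (assuming the distribution is published under the creyPY name):

```
pip install creyPY              # core only
pip install "creyPY[postgres]"  # + PostgreSQL stack
pip install "creyPY[auth0]"     # + Auth0 helpers
pip install "creyPY[all]"       # everything, matching the CI job above
```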