Mirror of https://github.com/kevin-DL/full-stack-fastapi-postgresql.git (synced 2026-01-14 11:04:41 +00:00)
Compare commits
21 Commits
| Author | SHA1 | Date |
|---|---|---|
|  | 44d8a4358b |  |
|  | 9b4108fdae |  |
|  | b4fa418e65 |  |
|  | a612765b83 |  |
|  | de7140f1e7 |  |
|  | 546dc8bdcb |  |
|  | eae33cda72 |  |
|  | 1d30172e7a |  |
|  | 6fc9a37eb5 |  |
|  | 170231783a |  |
|  | 5216fcfd77 |  |
|  | 8bf3607d2b |  |
|  | 8ce745b7ef |  |
|  | 6bbd58c76f |  |
|  | 1aeb3208bf |  |
|  | 45317e54c7 |  |
|  | 47e0fe56e3 |  |
|  | 42ee0fe0ba |  |
|  | 92b757fc96 |  |
|  | bece399368 |  |
|  | 5dd83c6350 |  |
24 README.md
@@ -148,6 +148,28 @@ After using this generator, your new project (the directory created) will contai

### Next release

+### 0.4.0
+
+* Fix security on resetting a password. Receive token as body, not query. PR [#34](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/34).
+
+* Fix security on resetting a password. Receive it as body, not query. PR [#33](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/33) by [@dmontagu](https://github.com/dmontagu).
+
+* Fix SQLAlchemy class lookup on initialization. PR [#29](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/29) by [@ebreton](https://github.com/ebreton).
+
+* Fix SQLAlchemy operation errors on database restart. PR [#32](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/32) by [@ebreton](https://github.com/ebreton).
+
+* Fix locations of scripts in generated README. PR [#19](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/19) by [@ebreton](https://github.com/ebreton).
+
+* Forward arguments from script to `pytest` inside container. PR [#17](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/17) by [@ebreton](https://github.com/ebreton).
+
+* Update development scripts.
+
+* Read Alembic configs from env vars. PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/9" target="_blank">#9</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.
+
+* Create DB Item objects from all Pydantic model's fields.
+
+* Update Jupyter Lab installation and util script/environment variable for local development.
+
### 0.3.0

* PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/14" target="_blank">#14</a>:

@@ -161,7 +183,7 @@ After using this generator, your new project (the directory created) will contai
* Update migrations to include new Items.
* Update project README.md with tips about how to start with backend.

-* Upgrade Python to 3.7 as Celery is now compatible too. <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/10" target="_blank">PR #10</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.
+* Upgrade Python to 3.7 as Celery is now compatible too. PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/10" target="_blank">#10</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.

### 0.2.2

@@ -1,4 +1,5 @@
rm -rf \{\{cookiecutter.project_slug\}\}/.git
rm -rf \{\{cookiecutter.project_slug\}\}/backend/app/Pipfile.lock
rm -rf \{\{cookiecutter.project_slug\}\}/frontend/node_modules
rm -rf \{\{cookiecutter.project_slug\}\}/frontend/dist
git checkout \{\{cookiecutter.project_slug\}\}/README.md
@@ -6,8 +7,8 @@ git checkout \{\{cookiecutter.project_slug\}\}/.gitlab-ci.yml
git checkout \{\{cookiecutter.project_slug\}\}/cookiecutter-config-file.yml
git checkout \{\{cookiecutter.project_slug\}\}/docker-compose.deploy.networks.yml
git checkout \{\{cookiecutter.project_slug\}\}/env-backend.env
git checkout \{\{cookiecutter.project_slug\}\}/env-couchbase.env
git checkout \{\{cookiecutter.project_slug\}\}/env-flower.env
git checkout \{\{cookiecutter.project_slug\}\}/.env
git checkout \{\{cookiecutter.project_slug\}\}/frontend/.env
git checkout \{\{cookiecutter.project_slug\}\}/env-sync-gateway.env
git checkout \{\{cookiecutter.project_slug\}\}/env-pgadmin.env
git checkout \{\{cookiecutter.project_slug\}\}/env-postgres.env

@@ -123,10 +123,10 @@ Nevertheless, if it doesn't detect a change but a syntax error, it will just sto
To test the backend run:

```bash
-DOMAIN=backend sh ./script-test.sh
+DOMAIN=backend sh ./scripts/test.sh
```

-The file `./script-test.sh` has the commands to generate a testing `docker-stack.yml` file from the needed Docker Compose files, start the stack and test it.
+The file `./scripts/test.sh` has the commands to generate a testing `docker-stack.yml` file from the needed Docker Compose files, start the stack and test it.

The tests run with Pytest, modify and add tests to `./backend/app/app/tests/`.

@@ -134,6 +134,22 @@ If you need to install any additional package for the tests, add it to the file

If you use GitLab CI the tests will run automatically.

+#### Test running stack
+
+If your stack is already up and you just want to run the tests, you can use:
+
+```bash
+docker-compose exec backend-tests /tests-start.sh
+```
+
+That `/tests-start.sh` script inside the `backend-tests` container calls `pytest`. If you need to pass extra arguments to `pytest`, you can pass them to that command and they will be forwarded.
+
+For example, to stop on first error:
+
+```bash
+docker-compose exec backend-tests /tests-start.sh -x
+```
+
### Live development with Python Jupyter Notebooks

If you know about Python [Jupyter Notebooks](http://jupyter.org/), you can take advantage of them during local development.

@@ -384,7 +400,7 @@ Then you need to have those constraints in your deployment Docker Compose file f
To be able to use different environments, like `prod` and `stag`, you should pass the name of the stack as an environment variable. Like:

```bash
-STACK_NAME={{cookiecutter.docker_swarm_stack_name_staging}} sh ./script-deploy.sh
+STACK_NAME={{cookiecutter.docker_swarm_stack_name_staging}} sh ./scripts/deploy.sh
```

To use and expand that environment variable inside the `docker-compose.deploy.volumes-placement.yml` files you can add the constraints to the services like:

@@ -401,7 +417,7 @@ services:
      - node.labels.${STACK_NAME}.app-db-data == true
```

-note the `${STACK_NAME}`. In the script `./script-deploy.sh`, that `docker-compose.deploy.volumes-placement.yml` would be converted, and saved to a file `docker-stack.yml` containing:
+note the `${STACK_NAME}`. In the script `./scripts/deploy.sh`, that `docker-compose.deploy.volumes-placement.yml` would be converted, and saved to a file `docker-stack.yml` containing:

```yaml
version: '3'

@@ -490,10 +506,10 @@ Here are the steps in detail:
* Set these environment variables, prepended to the next command:
* `TAG=prod`
* `FRONTEND_ENV=production`
-* Use the provided `script-build.sh` file with those environment variables:
+* Use the provided `scripts/build.sh` file with those environment variables:

```bash
-TAG=prod FRONTEND_ENV=production bash ./script-build.sh
+TAG=prod FRONTEND_ENV=production bash ./scripts/build.sh
```

2. **Optionally, push your images to a Docker Registry**

@@ -505,10 +521,10 @@ If you are using a registry and pushing your images, you can omit running the pr
* Set these environment variables:
* `TAG=prod`
* `FRONTEND_ENV=production`
-* Use the provided `script-build-push.sh` file with those environment variables:
+* Use the provided `scripts/build-push.sh` file with those environment variables:

```bash
-TAG=prod FRONTEND_ENV=production bash ./script-build.sh
+TAG=prod FRONTEND_ENV=production bash ./scripts/build-push.sh
```

3. **Deploy your stack**

@@ -518,14 +534,14 @@ TAG=prod FRONTEND_ENV=production bash ./script-build.sh
* `TRAEFIK_TAG={{cookiecutter.traefik_constraint_tag}}`
* `STACK_NAME={{cookiecutter.docker_swarm_stack_name_main}}`
* `TAG=prod`
-* Use the provided `script-deploy.sh` file with those environment variables:
+* Use the provided `scripts/deploy.sh` file with those environment variables:

```bash
DOMAIN={{cookiecutter.domain_main}} \
TRAEFIK_TAG={{cookiecutter.traefik_constraint_tag}} \
STACK_NAME={{cookiecutter.docker_swarm_stack_name_main}} \
TAG=prod \
-bash ./script-deploy.sh
+bash ./scripts/deploy.sh
```

---

@@ -35,9 +35,6 @@ script_location = alembic
# are written from script.py.mako
# output_encoding = utf-8

-sqlalchemy.url = postgresql://postgres:{{cookiecutter.postgres_password}}@db/app
-
-
# Logging configuration
[loggers]
keys = root,sqlalchemy,alembic

@@ -1,4 +1,7 @@
from __future__ import with_statement
+
+import os
+
from alembic import context
from sqlalchemy import engine_from_config, pool
from logging.config import fileConfig

@@ -27,6 +30,14 @@ target_metadata = Base.metadata
# ... etc.


+def get_url():
+    user = os.getenv("POSTGRES_USER", "postgres")
+    password = os.getenv("POSTGRES_PASSWORD", "")
+    server = os.getenv("POSTGRES_SERVER", "db")
+    db = os.getenv("POSTGRES_DB", "app")
+    return f"postgresql://{user}:{password}@{server}/{db}"
+
+
def run_migrations_offline():
    """Run migrations in 'offline' mode.

@@ -39,7 +50,7 @@ def run_migrations_offline():
    script output.

    """
-    url = config.get_main_option("sqlalchemy.url")
+    url = get_url()
    context.configure(
        url=url, target_metadata=target_metadata, literal_binds=True, compare_type=True
    )

@@ -55,8 +66,10 @@ def run_migrations_online():
    and associate a connection with the context.

    """
+    configuration = config.get_section(config.config_ini_section)
+    configuration['sqlalchemy.url'] = get_url()
    connectable = engine_from_config(
-        config.get_section(config.config_ini_section),
+        configuration,
        prefix="sqlalchemy.",
        poolclass=pool.NullPool,
    )

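As context for the Alembic hunks above (replacing the hard-coded `sqlalchemy.url` with environment-driven configuration), here is a minimal standalone sketch of the same idea. The variable values are hypothetical stand-ins for what docker-compose would inject; they are not part of the diff.

```python
import os

from sqlalchemy import engine_from_config, pool

# Hypothetical values, standing in for what the container environment provides.
os.environ.setdefault("POSTGRES_USER", "postgres")
os.environ.setdefault("POSTGRES_PASSWORD", "changethis")
os.environ.setdefault("POSTGRES_SERVER", "db")
os.environ.setdefault("POSTGRES_DB", "app")


def get_url():
    # Same composition as the get_url() added in the diff: missing variables
    # fall back to defaults, so a local run still gets a usable URL.
    user = os.getenv("POSTGRES_USER", "postgres")
    password = os.getenv("POSTGRES_PASSWORD", "")
    server = os.getenv("POSTGRES_SERVER", "db")
    db = os.getenv("POSTGRES_DB", "app")
    return f"postgresql://{user}:{password}@{server}/{db}"


# engine_from_config() reads keys prefixed with "sqlalchemy." from a plain dict,
# so overriding configuration["sqlalchemy.url"] is enough to point Alembic at the
# environment-derived URL without touching alembic.ini.
configuration = {"sqlalchemy.url": get_url()}
engine = engine_from_config(configuration, prefix="sqlalchemy.", poolclass=pool.NullPool)
print(engine.url)  # the URL Alembic would use; no connection is opened yet
```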
@@ -1,6 +1,6 @@
from datetime import timedelta

-from fastapi import APIRouter, Depends, HTTPException
+from fastapi import APIRouter, Body, Depends, HTTPException
from fastapi.security import OAuth2PasswordRequestForm
from sqlalchemy.orm import Session

@@ -74,7 +74,7 @@ def recover_password(email: str, db: Session = Depends(get_db)):


@router.post("/reset-password/", tags=["login"], response_model=Msg)
-def reset_password(token: str, new_password: str, db: Session = Depends(get_db)):
+def reset_password(token: str = Body(...), new_password: str = Body(...), db: Session = Depends(get_db)):
    """
    Reset password
    """

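A rough illustration of what the `Body(...)` change above means for clients: both values are now read from the JSON body, so the reset token no longer appears in the query string, server access logs, or browser history. The URL prefix and payload values below are assumptions for the sketch, not taken from the diff.

```python
import requests

# Hypothetical endpoint URL; the generated project mounts the login routes under
# an API prefix that is not shown in this diff.
url = "https://example.com/api/v1/reset-password/"

# With token and new_password declared as Body(...) parameters, FastAPI expects
# them as keys of the JSON body instead of query parameters.
response = requests.post(
    url,
    json={"token": "<reset-token>", "new_password": "<new-password>"},
)
print(response.status_code, response.json())
```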
@@ -28,7 +28,8 @@ def get_multi_by_owner(


def create(db_session: Session, *, item_in: ItemCreate, owner_id: int) -> Item:
-    item = Item(title=item_in.title, description=item_in.description, owner_id=owner_id)
+    item_in_data = jsonable_encoder(item_in)
+    item = Item(**item_in_data, owner_id=owner_id)
    db_session.add(item)
    db_session.commit()
    db_session.refresh(item)

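The hunk above is what the changelog entry "Create DB Item objects from all Pydantic model's fields" refers to: instead of copying fields one by one, the Pydantic model is converted to a plain dict and unpacked into the SQLAlchemy model. A minimal sketch, using a simplified, hypothetical `ItemCreate` rather than the project's real schema:

```python
from typing import Optional

from fastapi.encoders import jsonable_encoder
from pydantic import BaseModel


class ItemCreate(BaseModel):
    # Simplified stand-in for the project's ItemCreate schema.
    title: str
    description: Optional[str] = None


item_in = ItemCreate(title="Desk lamp", description="Warm white")
item_in_data = jsonable_encoder(item_in)
print(item_in_data)  # {'title': 'Desk lamp', 'description': 'Warm white'}

# Item(**item_in_data, owner_id=owner_id) then receives every field declared on
# the schema, so adding a field to ItemCreate no longer requires editing the CRUD code.
```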
@@ -2,6 +2,11 @@ from app import crud
from app.core import config
from app.models.user import UserCreate

+# make sure all SQL Alchemy models are imported before initializing DB
+# otherwise, SQL Alchemy might fail to initialize properly relationships
+# for more details: https://github.com/tiangolo/full-stack-fastapi-postgresql/issues/28
+from app.db import base
+

def init_db(db_session):
    # Tables should be created with Alembic migrations

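The added `from app.db import base` above addresses issue #28: SQLAlchemy resolves string-based `relationship()` targets only at mapper-configuration time, so every model class must already be registered on the declarative `Base` by then. A self-contained sketch of why import order matters; the class and table names here are illustrative, not copied from the project, and the import path follows the SQLAlchemy 1.3-era API used by this generator:

```python
from sqlalchemy import Column, ForeignKey, Integer
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import configure_mappers, relationship

Base = declarative_base()


class User(Base):
    __tablename__ = "user"
    id = Column(Integer, primary_key=True)
    # String target: looked up in Base's registry at mapper-configuration time,
    # so the Item class below must already be defined/imported by then.
    items = relationship("Item", back_populates="owner")


class Item(Base):
    __tablename__ = "item"
    id = Column(Integer, primary_key=True)
    owner_id = Column(Integer, ForeignKey("user.id"))
    owner = relationship("User", back_populates="items")


# configure_mappers() is what the first query would trigger implicitly; if the
# module defining Item had not been imported yet, this is where the "Item"
# lookup would fail. Importing a single aggregator module (app.db.base in the
# diff) before init guarantees everything is registered.
configure_mappers()
print("mappers configured")
```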
@@ -3,7 +3,7 @@ from sqlalchemy.orm import scoped_session, sessionmaker

from app.core import config

-engine = create_engine(config.SQLALCHEMY_DATABASE_URI)
+engine = create_engine(config.SQLALCHEMY_DATABASE_URI, pool_pre_ping=True)
db_session = scoped_session(
    sessionmaker(autocommit=False, autoflush=False, bind=engine)
)

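`pool_pre_ping=True` above is the fix behind "Fix SQLAlchemy operation errors on database restart" (PR #32): before handing out a pooled connection, SQLAlchemy issues a lightweight test query and transparently replaces the connection if the database went away, instead of surfacing a stale-connection error to the request. A minimal sketch; the connection URL is a placeholder and a real database would be needed for the `connect()` call to succeed:

```python
from sqlalchemy import create_engine, text

# Placeholder URL - in the project this comes from config.SQLALCHEMY_DATABASE_URI.
engine = create_engine(
    "postgresql://postgres:changethis@db/app",
    pool_pre_ping=True,  # ping (and if needed recycle) pooled connections before use
)

# Even if the database was restarted between requests, the stale pooled
# connection is detected by the pre-ping and re-established here rather than
# raising an OperationalError mid-request.
with engine.connect() as connection:
    connection.execute(text("SELECT 1"))
```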
@@ -3,4 +3,4 @@ set -e

python /app/app/tests_pre_start.py

-pytest /app/app/tests/
+pytest $* /app/app/tests/

@@ -4,9 +4,9 @@ RUN pip install celery~=4.3 passlib[bcrypt] tenacity requests emails "fastapi>=0

# For development, Jupyter remote kernel, Hydrogen
# Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
EXPOSE 8888

COPY ./app /app

@@ -4,9 +4,9 @@ RUN pip install raven celery~=4.3 passlib[bcrypt] tenacity requests "fastapi>=0.

# For development, Jupyter remote kernel, Hydrogen
# Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
EXPOSE 8888

ENV C_FORCE_ROOT=1

@@ -4,9 +4,9 @@ RUN pip install requests pytest tenacity passlib[bcrypt] "fastapi>=0.16.0" psyco

# For development, Jupyter remote kernel, Hydrogen
# Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
EXPOSE 8888

COPY ./app /app

@@ -2,13 +2,13 @@ version: '3.3'
services:
  backend:
    environment:
-      - 'JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root'
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
      - SERVER_HOST=http://${DOMAIN}
  celeryworker:
    environment:
      - RUN=celery worker -A app.worker -l info -Q main-queue -c 1
-      - JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
      - SERVER_HOST=http://${DOMAIN}
  backend-tests:
    environment:
-      - JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888