21 Commits
0.3.0 ... 0.4.0

Author SHA1 Message Date
Sebastián Ramírez
44d8a4358b 🔖 Release version 0.4.0 2019-05-29 09:49:17 +04:00
Sebastián Ramírez
9b4108fdae 📝 Update release notes 2019-05-29 09:48:28 +04:00
Sebastián Ramírez
b4fa418e65 🔒 Receive token as body in reset password (#34) 2019-05-29 09:47:59 +04:00
Sebastián Ramírez
a612765b83 📝 Update release notes, clarify text 2019-05-29 09:35:13 +04:00
Sebastián Ramírez
de7140f1e7 📝 Update release notes 2019-05-29 09:27:04 +04:00
dmontagu
546dc8bdcb 🔒 Update login.py to receive password as body (#33)
Change `new_password` from a query parameter to a body parameter for security.

(Why this is problematic is discussed in the top answer to https://stackoverflow.com/questions/2629222/are-querystring-parameters-secure-in-https-http-ssl)
2019-05-29 09:24:09 +04:00
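A side note on the linked discussion (not part of the commit): even over HTTPS, the query string is part of the URL, so it is routinely written to server access logs and proxy logs, while the request body is not. A minimal stdlib sketch of why moving `new_password` out of the query string matters (the endpoint path here is illustrative, not the project's exact route):

```python
import json
from urllib.parse import urlencode

secret = "s3cret-pw"

# Query-parameter style: the password is embedded in the URL itself,
# which servers and proxies routinely write to access logs.
query_url = "/api/v1/reset-password/?" + urlencode({"new_password": secret})

# Body style: the URL stays clean; the password travels in the payload,
# which standard access logs do not record.
body_url = "/api/v1/reset-password/"
payload = json.dumps({"new_password": secret})

print(secret in query_url)  # True  -> would leak into logs
print(secret in body_url)   # False -> only inside the (unlogged) body
```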
Sebastián Ramírez
eae33cda72 📝 Update release notes 2019-05-22 15:30:51 +04:00
Manu
1d30172e7a 🗃️ Fix SQLAlchemy class lookup (#29) 2019-05-22 15:29:24 +04:00
Sebastián Ramírez
6fc9a37eb5 📝 Update release notes 2019-05-22 15:21:02 +04:00
Manu
170231783a 🗃️ Fix SQLAlchemy operation error after database restarts (#32) 2019-05-22 15:18:59 +04:00
Sebastián Ramírez
5216fcfd77 📝 Update release notes 2019-05-04 00:04:30 +04:00
Manu
8bf3607d2b 📝 Fix the paths of the scripts in README (#19)
* removed postgres_password from alembic.ini, read it from env var instead

* ♻️ use f-strings for PostgreSQL URL

* fix path to scripts
2019-05-03 23:57:12 +04:00
Sebastián Ramírez
8ce745b7ef 📝 Update release notes 2019-05-03 23:52:42 +04:00
Sebastián Ramírez
6bbd58c76f 📝 Update docs for running tests live 2019-05-03 23:52:23 +04:00
Manu
1aeb3208bf Use extra pytest arguments forwarded from shell (#17)
* removed postgres_password from alembic.ini, read it from env var instead

* ♻️ use f-strings for PostgreSQL URL

* passes given args
2019-05-03 23:44:18 +04:00
Sebastián Ramírez
45317e54c7 📝 Update release notes 2019-04-24 22:46:09 +04:00
Sebastián Ramírez
47e0fe56e3 ⬆️ Upgrade Jupyter to use Lab, update util/env var for local development 2019-04-24 22:45:20 +04:00
Sebastián Ramírez
42ee0fe0ba 📝 Update release notes 2019-04-20 19:59:47 +04:00
Sebastián Ramírez
92b757fc96 ♻️ Create Item from all fields in Pydantic model 2019-04-20 19:57:34 +04:00
Manu
bece399368 ♻️ removed postgres_password from alembic.ini (#9)
2019-04-20 19:56:50 +04:00
Sebastián Ramírez
5dd83c6350 🔧 Update development scripts 2019-04-20 19:24:57 +04:00
14 changed files with 87 additions and 32 deletions

View File

@@ -148,6 +148,28 @@ After using this generator, your new project (the directory created) will contai
 ### Next release
+
+### 0.4.0
+
+* Fix security on resetting a password. Receive token as body, not query. PR [#34](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/34).
+* Fix security on resetting a password. Receive it as body, not query. PR [#33](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/33) by [@dmontagu](https://github.com/dmontagu).
+* Fix SQLAlchemy class lookup on initialization. PR [#29](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/29) by [@ebreton](https://github.com/ebreton).
+* Fix SQLAlchemy operation errors on database restart. PR [#32](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/32) by [@ebreton](https://github.com/ebreton).
+* Fix locations of scripts in generated README. PR [#19](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/19) by [@ebreton](https://github.com/ebreton).
+* Forward arguments from script to `pytest` inside container. PR [#17](https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/17) by [@ebreton](https://github.com/ebreton).
+* Update development scripts.
+* Read Alembic configs from env vars. PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/9" target="_blank">#9</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.
+* Create DB Item objects from all Pydantic model's fields.
+* Update Jupyter Lab installation and util script/environment variable for local development.
+
 ### 0.3.0
 * PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/14" target="_blank">#14</a>:
@@ -161,7 +183,7 @@ After using this generator, your new project (the directory created) will contai
 * Update migrations to include new Items.
 * Update project README.md with tips about how to start with backend.
-* Upgrade Python to 3.7 as Celery is now compatible too. <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/10" target="_blank">PR #10</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.
+* Upgrade Python to 3.7 as Celery is now compatible too. PR <a href="https://github.com/tiangolo/full-stack-fastapi-postgresql/pull/10" target="_blank">#10</a> by <a href="https://github.com/ebreton" target="_blank">@ebreton</a>.
 ### 0.2.2

View File

@@ -1,4 +1,5 @@
 rm -rf \{\{cookiecutter.project_slug\}\}/.git
+rm -rf \{\{cookiecutter.project_slug\}\}/backend/app/Pipfile.lock
 rm -rf \{\{cookiecutter.project_slug\}\}/frontend/node_modules
 rm -rf \{\{cookiecutter.project_slug\}\}/frontend/dist
 git checkout \{\{cookiecutter.project_slug\}\}/README.md
@@ -6,8 +7,8 @@ git checkout \{\{cookiecutter.project_slug\}\}/.gitlab-ci.yml
 git checkout \{\{cookiecutter.project_slug\}\}/cookiecutter-config-file.yml
 git checkout \{\{cookiecutter.project_slug\}\}/docker-compose.deploy.networks.yml
 git checkout \{\{cookiecutter.project_slug\}\}/env-backend.env
-git checkout \{\{cookiecutter.project_slug\}\}/env-couchbase.env
 git checkout \{\{cookiecutter.project_slug\}\}/env-flower.env
 git checkout \{\{cookiecutter.project_slug\}\}/.env
 git checkout \{\{cookiecutter.project_slug\}\}/frontend/.env
-git checkout \{\{cookiecutter.project_slug\}\}/env-sync-gateway.env
+git checkout \{\{cookiecutter.project_slug\}\}/env-pgadmin.env
+git checkout \{\{cookiecutter.project_slug\}\}/env-postgres.env

View File

@@ -123,10 +123,10 @@ Nevertheless, if it doesn't detect a change but a syntax error, it will just sto
 To test the backend run:
 
 ```bash
-DOMAIN=backend sh ./script-test.sh
+DOMAIN=backend sh ./scripts/test.sh
 ```
 
-The file `./script-test.sh` has the commands to generate a testing `docker-stack.yml` file from the needed Docker Compose files, start the stack and test it.
+The file `./scripts/test.sh` has the commands to generate a testing `docker-stack.yml` file from the needed Docker Compose files, start the stack and test it.
 
 The tests run with Pytest, modify and add tests to `./backend/app/app/tests/`.
@@ -134,6 +134,22 @@ If you need to install any additional package for the tests, add it to the file
 If you use GitLab CI the tests will run automatically.
 
+#### Test running stack
+
+If your stack is already up and you just want to run the tests, you can use:
+
+```bash
+docker-compose exec backend-tests /tests-start.sh
+```
+
+That `/tests-start.sh` script inside the `backend-tests` container calls `pytest`. If you need to pass extra arguments to `pytest`, you can pass them to that command and they will be forwarded.
+
+For example, to stop on first error:
+
+```bash
+docker-compose exec backend-tests /tests-start.sh -x
+```
+
 ### Live development with Python Jupyter Notebooks
 
 If you know about Python [Jupyter Notebooks](http://jupyter.org/), you can take advantage of them during local development.
@@ -384,7 +400,7 @@ Then you need to have those constraints in your deployment Docker Compose file f
 To be able to use different environments, like `prod` and `stag`, you should pass the name of the stack as an environment variable. Like:
 
 ```bash
-STACK_NAME={{cookiecutter.docker_swarm_stack_name_staging}} sh ./script-deploy.sh
+STACK_NAME={{cookiecutter.docker_swarm_stack_name_staging}} sh ./scripts/deploy.sh
 ```
 
 To use and expand that environment variable inside the `docker-compose.deploy.volumes-placement.yml` files you can add the constraints to the services like:
@@ -401,7 +417,7 @@ services:
           - node.labels.${STACK_NAME}.app-db-data == true
 ```
 
-note the `${STACK_NAME}`. In the script `./script-deploy.sh`, that `docker-compose.deploy.volumes-placement.yml` would be converted, and saved to a file `docker-stack.yml` containing:
+note the `${STACK_NAME}`. In the script `./scripts/deploy.sh`, that `docker-compose.deploy.volumes-placement.yml` would be converted, and saved to a file `docker-stack.yml` containing:
 
 ```yaml
 version: '3'
@@ -490,10 +506,10 @@ Here are the steps in detail:
 * Set these environment variables, prepended to the next command:
   * `TAG=prod`
   * `FRONTEND_ENV=production`
-* Use the provided `script-build.sh` file with those environment variables:
+* Use the provided `scripts/build.sh` file with those environment variables:
 
 ```bash
-TAG=prod FRONTEND_ENV=production bash ./script-build.sh
+TAG=prod FRONTEND_ENV=production bash ./scripts/build.sh
 ```
 
 2. **Optionally, push your images to a Docker Registry**
@@ -505,10 +521,10 @@ If you are using a registry and pushing your images, you can omit running the pr
 * Set these environment variables:
   * `TAG=prod`
   * `FRONTEND_ENV=production`
-* Use the provided `script-build-push.sh` file with those environment variables:
+* Use the provided `scripts/build-push.sh` file with those environment variables:
 
 ```bash
-TAG=prod FRONTEND_ENV=production bash ./script-build.sh
+TAG=prod FRONTEND_ENV=production bash ./scripts/build-push.sh
 ```
 
 3. **Deploy your stack**
@@ -518,14 +534,14 @@ TAG=prod FRONTEND_ENV=production bash ./script-build.sh
   * `TRAEFIK_TAG={{cookiecutter.traefik_constraint_tag}}`
   * `STACK_NAME={{cookiecutter.docker_swarm_stack_name_main}}`
   * `TAG=prod`
-* Use the provided `script-deploy.sh` file with those environment variables:
+* Use the provided `scripts/deploy.sh` file with those environment variables:
 
 ```bash
 DOMAIN={{cookiecutter.domain_main}} \
 TRAEFIK_TAG={{cookiecutter.traefik_constraint_tag}} \
 STACK_NAME={{cookiecutter.docker_swarm_stack_name_main}} \
 TAG=prod \
-bash ./script-deploy.sh
+bash ./scripts/deploy.sh
 ```
 
 ---
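An aside on the conversion step above (an illustration, not part of the diff): turning `docker-compose.deploy.volumes-placement.yml` into `docker-stack.yml` is essentially environment-variable substitution. A rough stand-in for what happens to the `${STACK_NAME}` placeholder, using plain shell tools on the label line from the example:

```shell
STACK_NAME=stag
line='- node.labels.${STACK_NAME}.app-db-data == true'

# Substitute the placeholder the same way the expanded stack file ends up:
expanded=$(printf '%s\n' "$line" | sed "s/\${STACK_NAME}/$STACK_NAME/")
printf '%s\n' "$expanded"   # - node.labels.stag.app-db-data == true
```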

View File

@@ -35,9 +35,6 @@ script_location = alembic
 # are written from script.py.mako
 # output_encoding = utf-8
-
-sqlalchemy.url = postgresql://postgres:{{cookiecutter.postgres_password}}@db/app
-
 # Logging configuration
 
 [loggers]
 keys = root,sqlalchemy,alembic

View File

@@ -1,4 +1,7 @@
 from __future__ import with_statement
+
+import os
+
 from alembic import context
 from sqlalchemy import engine_from_config, pool
 from logging.config import fileConfig
@@ -27,6 +30,14 @@ target_metadata = Base.metadata
 # ... etc.
 
+
+def get_url():
+    user = os.getenv("POSTGRES_USER", "postgres")
+    password = os.getenv("POSTGRES_PASSWORD", "")
+    server = os.getenv("POSTGRES_SERVER", "db")
+    db = os.getenv("POSTGRES_DB", "app")
+    return f"postgresql://{user}:{password}@{server}/{db}"
+
+
 def run_migrations_offline():
     """Run migrations in 'offline' mode.
@@ -39,7 +50,7 @@ def run_migrations_offline():
     script output.
 
     """
-    url = config.get_main_option("sqlalchemy.url")
+    url = get_url()
     context.configure(
         url=url, target_metadata=target_metadata, literal_binds=True, compare_type=True
     )
@@ -55,8 +66,10 @@ def run_migrations_online():
     and associate a connection with the context.
 
     """
+    configuration = config.get_section(config.config_ini_section)
+    configuration['sqlalchemy.url'] = get_url()
     connectable = engine_from_config(
-        config.get_section(config.config_ini_section),
+        configuration,
         prefix="sqlalchemy.",
         poolclass=pool.NullPool,
     )
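The `get_url()` helper introduced in that hunk builds the connection URL from environment variables, with defaults matching the stack's service names. A runnable sketch of how the defaults and overrides interact (same logic as the diff, outside Alembic):

```python
import os

def get_url():
    # Env vars win; otherwise fall back to the stack's defaults.
    user = os.getenv("POSTGRES_USER", "postgres")
    password = os.getenv("POSTGRES_PASSWORD", "")
    server = os.getenv("POSTGRES_SERVER", "db")
    db = os.getenv("POSTGRES_DB", "app")
    return f"postgresql://{user}:{password}@{server}/{db}"

# With no relevant environment set, the defaults apply:
for key in ("POSTGRES_USER", "POSTGRES_PASSWORD", "POSTGRES_SERVER", "POSTGRES_DB"):
    os.environ.pop(key, None)
print(get_url())  # postgresql://postgres:@db/app

# One env var overrides its component without touching the others:
os.environ["POSTGRES_SERVER"] = "localhost"
print(get_url())  # postgresql://postgres:@localhost/app
```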

View File

@@ -1,6 +1,6 @@
 from datetime import timedelta
 
-from fastapi import APIRouter, Depends, HTTPException
+from fastapi import APIRouter, Body, Depends, HTTPException
 from fastapi.security import OAuth2PasswordRequestForm
 from sqlalchemy.orm import Session
@@ -74,7 +74,7 @@ def recover_password(email: str, db: Session = Depends(get_db)):
 @router.post("/reset-password/", tags=["login"], response_model=Msg)
-def reset_password(token: str, new_password: str, db: Session = Depends(get_db)):
+def reset_password(token: str = Body(...), new_password: str = Body(...), db: Session = Depends(get_db)):
     """
     Reset password
     """

View File

@@ -28,7 +28,8 @@ def get_multi_by_owner(
 def create(db_session: Session, *, item_in: ItemCreate, owner_id: int) -> Item:
-    item = Item(title=item_in.title, description=item_in.description, owner_id=owner_id)
+    item_in_data = jsonable_encoder(item_in)
+    item = Item(**item_in_data, owner_id=owner_id)
     db_session.add(item)
     db_session.commit()
     db_session.refresh(item)
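The change above builds the `Item` from every field of the input model instead of naming each field by hand. The underlying idiom is plain dict unpacking; a dependency-free sketch with stdlib dataclasses standing in for the Pydantic and SQLAlchemy models:

```python
from dataclasses import asdict, dataclass

@dataclass
class ItemCreate:  # stands in for the Pydantic input model
    title: str
    description: str

@dataclass
class Item:  # stands in for the SQLAlchemy DB model
    title: str
    description: str
    owner_id: int

def create(item_in: ItemCreate, owner_id: int) -> Item:
    # Every field of item_in is forwarded, so adding a field to both
    # models requires no change here -- the point of the `**` unpacking.
    item_in_data = asdict(item_in)
    return Item(**item_in_data, owner_id=owner_id)

item = create(ItemCreate(title="t", description="d"), owner_id=1)
print(item)  # Item(title='t', description='d', owner_id=1)
```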

View File

@@ -2,6 +2,11 @@ from app import crud
 from app.core import config
 from app.models.user import UserCreate
 
+# make sure all SQL Alchemy models are imported before initializing DB
+# otherwise, SQL Alchemy might fail to initialize properly relationships
+# for more details: https://github.com/tiangolo/full-stack-fastapi-postgresql/issues/28
+from app.db import base
+
 
 def init_db(db_session):
     # Tables should be created with Alembic migrations

View File

@@ -3,7 +3,7 @@ from sqlalchemy.orm import scoped_session, sessionmaker
 from app.core import config
 
-engine = create_engine(config.SQLALCHEMY_DATABASE_URI)
+engine = create_engine(config.SQLALCHEMY_DATABASE_URI, pool_pre_ping=True)
 db_session = scoped_session(
     sessionmaker(autocommit=False, autoflush=False, bind=engine)
 )
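With `pool_pre_ping=True`, SQLAlchemy issues a cheap probe on each pooled connection before handing it out, and transparently reconnects if the probe fails, which is what fixes the errors after a database restart. A rough stdlib analogy of that behavior, using `sqlite3` in place of a real pool (the `checkout` helper is illustrative, not SQLAlchemy API):

```python
import sqlite3

def checkout(conn):
    """Mimic pool_pre_ping: probe the connection before reuse, and
    replace it if the probe fails (e.g. after a database restart)."""
    try:
        conn.execute("SELECT 1")  # the cheap "ping"
        return conn
    except sqlite3.ProgrammingError:  # connection was closed/invalidated
        return sqlite3.connect(":memory:")

conn = sqlite3.connect(":memory:")
conn.close()           # simulate the database going away
conn = checkout(conn)  # the pre-ping notices and hands back a fresh connection
print(conn.execute("SELECT 1").fetchone())  # (1,)
```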

View File

@@ -3,4 +3,4 @@ set -e
 python /app/app/tests_pre_start.py
-pytest /app/app/tests/
+pytest $* /app/app/tests/
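A behavioral footnote on that `$*` (an observation, not part of the commit): unquoted `$*` lets the shell re-split forwarded arguments on whitespace, so a quoted argument such as `-k "not slow"` arrives at `pytest` as three words, whereas `"$@"` would preserve it as two. A self-contained demonstration (the `-k` expression is just an illustrative pytest argument):

```shell
# Print each received argument in brackets, one per line.
print_args() { for a in "$@"; do printf '[%s]\n' "$a"; done; }

with_star() { print_args $*; }    # re-splits: "not slow" breaks into two words
with_at()   { print_args "$@"; }  # preserves each argument as passed

with_star -k "not slow"   # [-k] [not] [slow]  -> 3 arguments
echo ---
with_at -k "not slow"     # [-k] [not slow]    -> 2 arguments
```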

View File

@@ -4,9 +4,9 @@ RUN pip install celery~=4.3 passlib[bcrypt] tenacity requests emails "fastapi>=0
 # For development, Jupyter remote kernel, Hydrogen
 # Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
 ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
 EXPOSE 8888
 
 COPY ./app /app

View File

@@ -4,9 +4,9 @@ RUN pip install raven celery~=4.3 passlib[bcrypt] tenacity requests "fastapi>=0.
 # For development, Jupyter remote kernel, Hydrogen
 # Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
 ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
 EXPOSE 8888
 
 ENV C_FORCE_ROOT=1

View File

@@ -4,9 +4,9 @@ RUN pip install requests pytest tenacity passlib[bcrypt] "fastapi>=0.16.0" psyco
 # For development, Jupyter remote kernel, Hydrogen
 # Using inside the container:
-# jupyter notebook --ip=0.0.0.0 --allow-root
+# jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
 ARG env=prod
-RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyter ; fi"
+RUN bash -c "if [ $env == 'dev' ] ; then pip install jupyterlab ; fi"
 EXPOSE 8888
 
 COPY ./app /app

View File

@@ -2,13 +2,13 @@ version: '3.3'
 services:
   backend:
     environment:
-      - 'JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root'
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
       - SERVER_HOST=http://${DOMAIN}
   celeryworker:
     environment:
       - RUN=celery worker -A app.worker -l info -Q main-queue -c 1
-      - JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888
       - SERVER_HOST=http://${DOMAIN}
   backend-tests:
     environment:
-      - JUPYTER=jupyter notebook --ip=0.0.0.0 --allow-root
+      - JUPYTER=jupyter lab --ip=0.0.0.0 --allow-root --NotebookApp.custom_display_url=http://127.0.0.1:8888