tuxput: update to tuxput 0.0.5

We need to update to tuxput 0.0.5 to address a change in the upstream AWS API's
response format for S3 SelectFromObject queries.

Also discovered that zappa is essentially orphaned.  Add a way to run
zappa under pipenv inside a Python 3.7 docker container interactively to
manage deployment until we can replace zappa for lambda deployment.

Signed-off-by: Kelley Spoon <kelley.spoon@linaro.org>
Change-Id: I6bc06b25cf571b7ddb33a2a7c8968e425c617f8c
tree: 53f6d141fceebd0df3fd6cb9a42a8c7ae07d0222
  1. do-docker.sh
  2. Pipfile
  3. Pipfile.lock
  4. README.md
  5. sample-tuxput.json
  6. zappa_init.py
  7. zappa_settings.yml
README.md

This repository uses Zappa, TuxPut, and pipenv to deploy an upload server to S3!

Installing

This will create Pipfile and Pipfile.lock.

$ pipenv install zappa
$ pipenv install -e git+ssh://git@gitlab.com/Linaro/tuxput.git@master#egg=tuxput

Updating

This will update Pipfile.lock based on the most current version of the things listed in Pipfile.

$ pipenv update

Once the app has been deployed, changes (such as tuxput updates) are rolled out using 'zappa update'.

$ pipenv run zappa update dev
$ # test/validate dev
$ pipenv run zappa update prod
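
The test/validate step is deliberately left open-ended. As a rough sketch, assuming the dev stage URL reported by zappa status is reachable, something like this can confirm the new code responds before prod is touched:

$ pipenv run zappa status dev
$ # <dev stage URL> is whatever API Gateway URL zappa status reports for dev
$ curl -sf https://<dev stage URL>/ > /dev/null && echo "dev responds"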

Running Locally

$ CONF_BUCKET=testing-tuxpub-auth S3_REGION=us-east-1 FLASK_APP=tuxput pipenv run flask run
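
The flask dev server listens on 127.0.0.1:5000 by default, so a quick smoke test from another terminal might look like this (the root path is just an example):

$ curl -si http://127.0.0.1:5000/ | head -n 1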

Old Method for Deploying (see Deploying with Docker for the current method)

For the time being, there is only one environment: production. The deploy and update verbs require an environment argument.

$ pipenv run zappa deploy prod

Deploying with Docker

After 18 months, we have come to discover that zappa has not aged well as a deployment framework. The current version of Python is 3.10, but zappa only supports Python 3.6 through 3.9.

Rather than trying to hack together workarounds, let's just run the original code in a Python 3.7 docker container.

The do-docker.sh script will run a python:3.7 container in interactive mode and give you a shell from which to run commands. It bind mounts the current directory as '/app', and also bind mounts your $HOME/.ssh under /user/.ssh so that your ssh key is available for pulling from gitlab.
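
For reference, a minimal sketch of what do-docker.sh is expected to boil down to; the script in the tree is authoritative, and the image tag and mount points here simply mirror the description above:

#!/bin/sh
# Start an interactive python:3.7 container and remove it on exit.
# Bind mount the repo at /app and the host ssh keys at /user/.ssh.
docker run --rm -it \
    -v "$(pwd)":/app \
    -v "$HOME/.ssh":/user/.ssh \
    -w /app \
    python:3.7 /bin/bash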

The first step is to ensure your AWS credentials can be accessed from the docker environment. You can export them to a temporary file using the aws2-wrap python module (pip install aws2-wrap).

aws2-wrap --export --profile=<your AWS profile name from ~/.aws/config> > ./tf.sh
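
The generated tf.sh should just be a handful of export statements along these lines (the values below are placeholders, not real credentials):

export AWS_ACCESS_KEY_ID=AKIAEXAMPLEKEY
export AWS_SECRET_ACCESS_KEY=example-secret-access-key
export AWS_SESSION_TOKEN=example-session-token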

Next, run the do-docker.sh script. The python image will be pulled automatically if it is not already present:

./do-docker.sh

You should now be logged in as root in a running docker container.

Find out the uid that owns the bind-mounted files and create a user with that same id:

# UID is read-only in bash, so use another name; -n gives the numeric owner id
owner_uid=$(ls -ldn . | awk '{print $3}')
adduser --uid "$owner_uid" --disabled-password --gecos '' <your username>
su - <your username>
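
A quick sanity check once you are in the new user's shell is to confirm the uid lines up with the bind-mounted files, for example:

id -u          # should match the numeric owner shown by the next command
ls -ldn /app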

Now we need to set our AWS credentials and install pipenv:

source tf.sh
pip install pipenv
pipenv install zappa
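
If later commands fail with authentication errors, it is worth confirming that the credentials from tf.sh actually made it into the environment, e.g.:

env | grep '^AWS_'   # access key, secret key, and session token should all be present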

And at this point we should have a working zappa install inside of pipenv inside of docker.

pipenv run zappa status
pipenv run zappa update prod

Once you exit the docker container shell, the container should shut down and delete itself.

Certifying

To associate a zappa deployment with the domain name defined in zappa_settings.yml, run zappa certify. We use ACM certificates to terminate https; they were created manually and are renewed automatically.

$ pipenv run zappa certify prod
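
Once certify has run and DNS has propagated, the custom domain can be checked end to end; the hostname below is whatever domain is configured in zappa_settings.yml:

$ curl -sI https://<your domain>/ | head -n 1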

API Gateway

You will also need to set up the AWS API Gateway and Route 53 entry for your site. The API Gateway should be tied to the project_name value in zappa_settings.yml.
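
Zappa normally names the REST API after the project_name and stage. Assuming that convention, something like the following (using the AWS CLI) can locate the API id that the Route 53 record and custom domain mapping should point at; <project_name> is a placeholder:

$ aws apigateway get-rest-apis --query "items[?starts_with(name, '<project_name>')].[name,id]" --output text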