Mirroring offline convo to here.
Airflow will need a way to connect to other servers and run commands on them. To do so we will need to:
- set up SSH keys on the airflow server
- have a way to inject those keys into new servers
- have a way to update the existing keys on existing servers (both delete and add new)
To achieve this I think we should do a few things:
- we want to generate a new SSH key pair with each Airflow server, and we should do that using Terraform so that the keys live and die with the Airflow server. We can use the hashicorp/tls provider to generate a private key, register the public key as an SSH key in DigitalOcean so it can be consumed by other deployments easily, then use the `templatefile` builtin to render a user-data config file that injects the new public/private key pair into the Airflow server.
- for new servers we will follow much the same practice as part one, except we will use a data lookup to fetch the public SSH key from our DigitalOcean account and inject that key into the new droplet through a similar user-data config file.
- updating existing servers will be more difficult. We will need a way to fetch all the droplets in our team, then someone with SSH access to every server will need to run a script that SSHes into all of them and uses sed/awk to replace the old key with the new one, or simply appends the new key to the `authorized_keys` file. Hopefully this stays a rare break-glass scenario.
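The first step could be sketched in Terraform roughly as below. Resource names, the droplet settings, and the user-data template path are illustrative assumptions, not the actual config:

```hcl
# Generate a fresh key pair that lives and dies with the Airflow server.
resource "tls_private_key" "airflow" {
  algorithm = "ED25519"
}

# Register the public half in DigitalOcean so other deployments can look it up.
resource "digitalocean_ssh_key" "airflow" {
  name       = "airflow-server"
  public_key = tls_private_key.airflow.public_key_openssh
}

# Render user-data that drops both halves onto the Airflow droplet.
resource "digitalocean_droplet" "airflow" {
  name   = "airflow"
  image  = "ubuntu-22-04-x64"
  region = "nyc3"
  size   = "s-1vcpu-1gb"

  user_data = templatefile("${path.module}/airflow-user-data.tftpl", {
    public_key  = tls_private_key.airflow.public_key_openssh
    private_key = tls_private_key.airflow.private_key_openssh
  })
}
```

Because the key pair is a Terraform resource in the same state as the droplet, destroying the Airflow server also destroys its keys.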
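For the second step, a new droplet might consume the key via a data lookup, something like this sketch (again, names and the template path are hypothetical):

```hcl
# Look up the Airflow public key already registered in DigitalOcean.
data "digitalocean_ssh_key" "airflow" {
  name = "airflow-server"
}

# Inject it into the new droplet's user-data so the Airflow server can SSH in.
resource "digitalocean_droplet" "worker" {
  name   = "worker-1"
  image  = "ubuntu-22-04-x64"
  region = "nyc3"
  size   = "s-1vcpu-1gb"

  user_data = templatefile("${path.module}/worker-user-data.tftpl", {
    airflow_public_key = data.digitalocean_ssh_key.airflow.public_key
  })
}
```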
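The break-glass rotation in the third step could look roughly like the sketch below. The `rotate_key` function is hypothetical and is demonstrated here against a local temp file; in practice its body would run on each droplet over SSH:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Replace an old Airflow public key with a new one in an authorized_keys file,
# or append the new key if the old one is not present.
rotate_key() {
  local auth_file="$1" old_key="$2" new_key="$3"
  if grep -qF "$old_key" "$auth_file"; then
    # Drop the old key line.
    grep -vF "$old_key" "$auth_file" > "${auth_file}.tmp" || true
    mv "${auth_file}.tmp" "$auth_file"
  fi
  # Append the new key only if it is not already there.
  grep -qF "$new_key" "$auth_file" || echo "$new_key" >> "$auth_file"
}

# Local demonstration against a temp file (keys are fake placeholders).
demo=$(mktemp)
echo "ssh-ed25519 AAAAOLDKEY airflow@old" > "$demo"
rotate_key "$demo" \
  "ssh-ed25519 AAAAOLDKEY airflow@old" \
  "ssh-ed25519 AAAANEWKEY airflow@new"
cat "$demo"
```

In the real scenario the driver loop would fetch every droplet in the team (e.g. via the DigitalOcean API) and run this function on each host over SSH.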