bcgov/eagle-api

Lifecycle: Stable

API for acting as a central authenticated data service for all EPIC front-ends

Documentation

All documentation has been consolidated in the Eagle Documentation Wiki.

Related projects

Eagle is the revision name of the EAO EPIC application suite.

Several related repositories together comprise EAO EPIC; see the wiki for the full list.

Pre-requisites

Note: The following commands work in macOS bash (not zsh, which is the default since Catalina). The scripts do not currently work fully on Windows or Linux, so you may need to read the script sources and apply the commands manually in the right order.

Run the following two scripts to create your environment:

#!/bin/bash
./install_prerequisites.sh

#!/bin/bash
./setup_project.sh

Fork, Build and Run

Start the server by running npm start.

For development, you can use npm run start-watch to restart the server on code changes.

Check the swagger-ui at http://localhost:3000/api/docs/

To exercise an authenticated endpoint:

  1. POST http://localhost:3000/api/login/token with the following body:

     {
       "username": #{username},
       "password": #{password}
     }

     and take the token that you get in the response.

  2. GET http://localhost:3000/api/application, this time with the header Authorization: Bearer _TOKEN_, replacing _TOKEN_ with the value you got from the first request.
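
For example, the same flow with curl; the username and password here are placeholders for a real EPIC account:

# 1. Request a token (placeholder credentials)
curl -X POST http://localhost:3000/api/login/token \
  -H "Content-Type: application/json" \
  -d '{"username": "myuser", "password": "mypassword"}'

# 2. Call an authenticated endpoint with the returned token
curl http://localhost:3000/api/application \
  -H "Authorization: Bearer _TOKEN_"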

Deployment

Automated Deployments

The application uses GitHub Actions for CI/CD with S2I builds and OpenShift image tagging:

Development (6cdc9e-dev)

  • Trigger: Automatic on push to the develop branch
  • Process: S2I build creates the image in the tools namespace and tags it as dev (see the Deployment Flow Example below)

Test (6cdc9e-test)

  • Trigger: Manual via GitHub Actions UI
  • Workflow: .github/workflows/deploy-to-test.yaml
  • Process:
    1. Go to Actions → "Deploy to Test" → "Run workflow"
    2. Enter image tag (default: dev) or specific commit SHA
    3. Workflow tags image as test in OpenShift
  • URL: https://eagle-test.apps.silver.devops.gov.bc.ca/api

Production (6cdc9e-prod)

  • Trigger: Manual via GitHub Actions UI
  • Workflow: .github/workflows/deploy-to-prod.yaml
  • Process:
    1. Go to Actions → "Deploy to Prod" → "Run workflow"
    2. Enter image tag (default: test) or specific commit SHA
    3. Workflow tags image as prod in OpenShift
  • URL: https://eagle.gov.bc.ca/api

Deployment Flow Example

# 1. Push to develop → auto-builds and tags as dev with SHA abc1234
git push origin develop

# 2. Manually promote to test (via GitHub UI)
#    - Select "Deploy to Test" workflow
#    - Input: abc1234 (or leave default "dev")
#    - Click "Run workflow"

# 3. Manually promote to production (via GitHub UI)
#    - Select "Deploy to Prod" workflow
#    - Input: abc1234 (or leave default "test")
#    - Click "Run workflow"

Image Tagging in OpenShift

The eagle-api uses OpenShift image tagging for deployments:

# Dev build creates image in tools namespace
# Tag for test deployment
oc tag 6cdc9e-tools/eagle-api:dev 6cdc9e-tools/eagle-api:test

# Tag for prod deployment
oc tag 6cdc9e-tools/eagle-api:test 6cdc9e-tools/eagle-api:prod

The DeploymentConfigs in dev/test/prod namespaces reference images in the tools namespace.
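
As an illustration only (not taken from this repo's manifests), an image-change trigger wiring a dev DeploymentConfig to the tools-namespace image stream could look like the following; the DeploymentConfig and container names are assumptions:

# Assumed names: dc/eagle-api and container eagle-api in 6cdc9e-dev
oc set triggers dc/eagle-api -n 6cdc9e-dev \
  --from-image=6cdc9e-tools/eagle-api:dev --containers=eagle-api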

Database

One can run the EPIC applications on two kinds of data: generated and backed-up-from-live.

Generated data will typically be cleaner, as it is generated against the latest mongoose models, and it does not require transferring personal information (PI) to dev machines. Live production dumps should only be used when a particular bug cannot be replicated locally; after replicating it, update the data generators and unit tests to cover that edge case.

Generate data

Data generation is described in the generate README.

Restoring from a live backup

Acquire a dump of the database from one of the live environments.

To make sure you don't have an existing old copy (careful, this is destructive):

#!/bin/bash
mongo
use epic
db.dropDatabase()

Load database dump
  1. Download and unzip archived dump file.
  2. Restore the dump into your local mongo:
#!/bin/bash
mongorestore -d epic epic/
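
For reference, here is a non-interactive sketch of the same drop-and-restore sequence, assuming the unzipped dump directory is named epic/:

#!/bin/bash
# Drop any stale local copy (destructive), then restore the dump
mongo epic --eval 'db.dropDatabase()'
mongorestore -d epic epic/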

Database Conversions

In the course of developing this application, we have accumulated database conversion scripts that must be run to update the db model so that the newest codebase works properly. There are currently two methods of doing a database conversion, depending on how long-lived and memory-intensive the conversion is.

Method 1: db-migrate

Method 2: node scripts named migration* in the root folder

Method 1

See https://www.npmjs.com/package/db-migrate for documentation on running the db-migrate command. General use for local development, from the root folder:

./node_modules/db-migrate/bin/db-migrate up

For dev/test/prod environments, you will need to change the database.json file in the root folder accordingly and run with the --env param. See https://www.npmjs.com/package/db-migrate for more information.
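
For example, assuming database.json defines an entry named test, running against that environment might look like this:

# Run pending migrations against the "test" entry in database.json
./node_modules/db-migrate/bin/db-migrate up --env test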

Method 2

In the root folder there are files named migrateDocuments*.js. These are large, long-running, memory-intensive scripts that operate on the vast majority of EPIC documents. db-migrate proved slow and unreliable for them given the nature of the connection to our database, so these node scripts use the mongodb driver directly and can take a more complicated, robust approach to the conversion. They can be run from your local machine as long as there is an oc port-forward tunnel from your machine to the OpenShift mongodb database. Change the user/pass/port/host/authenticationDatabase params and the script will execute against the mongodb pod directly.
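
A rough sketch of that workflow; the pod name is an example (take the real one from oc get pods), and migrateDocuments.js stands in for whichever migrateDocuments*.js script you are running:

# Terminal 1: tunnel local port 5555 to the mongodb pod
oc port-forward eagle-api-mongodb-5-tj22g 5555:27017

# Terminal 2: run the script after editing its
# user/pass/port/host/authenticationDatabase params to point at localhost:5555
node migrateDocuments.js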

Developing

See Code Reuse Strategy

Configuring Environment Variables

To get all your settings for this project set up automatically, run:

#!/bin/bash
./install_prerequisites.sh

...or follow the manual process below if you require custom settings:

Recall the environment variables we need for local dev:

  1. KEYCLOAK_ENABLED=true
  2. MONGODB_DATABASE='epic'
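
If you are setting these by hand instead, a minimal sketch is to export them in the shell that will run the server:

#!/bin/bash
export KEYCLOAK_ENABLED=true
export MONGODB_DATABASE='epic'
npm start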

To get the actual values used in the deployed environments, examine the OpenShift environment you wish to target:

#!/bin/bash
oc project [projectname]
oc get routes
oc get secrets

You will not be able to see a secret's value if you examine it directly; you will only see the encoded values. Ask a team member with admin access to the OpenShift project for the access key and secret key values for the secret name you got from the command above. Make sure to ask for the correct environment (dev, test, prod).
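
If your account does have access, one way to read a value is to pull the base64-encoded field and decode it; the secret and key names here are hypothetical, so substitute the names returned by oc get secrets:

# Hypothetical secret/key names; substitute the real ones
oc get secret eagle-api -o jsonpath='{.data.MONGODB_ADMIN_PASSWORD}' | base64 --decode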

Enable MET Comment Periods for Project

  1. Connect to OpenShift by copying the login command
  2. Choose the project and list the pods: oc get pods
  3. Port-forward: oc port-forward eagle-api-mongodb-5-tj22g 5555:27017
  4. Connect to the db with the mongo shell (note the forwarded port): mongo "mongodb://admin:pw@localhost:5555/epic?authSource=admin"
  5. Query for the project, e.g. db.epic.find({_id : ObjectId("65c661a8399db00022d48849")})
  6. Set hasMetCommentPeriods to true for the project, e.g. db.epic.updateOne( { _id: ObjectId("65c661a8399db00022d48849") }, { $set: { "legislation_2018.hasMetCommentPeriods": true } } ). A one-shot version of steps 3 to 6 is sketched below.
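
A one-shot sketch of steps 3 to 6, assuming the pod name, admin password, and project ObjectId above are replaced with real values:

#!/bin/bash
# Placeholders: pod name, admin password, and project ObjectId
oc port-forward eagle-api-mongodb-5-tj22g 5555:27017 &
sleep 2
mongo "mongodb://admin:pw@localhost:5555/epic?authSource=admin" --eval '
db.epic.updateOne(
  { _id: ObjectId("65c661a8399db00022d48849") },
  { $set: { "legislation_2018.hasMetCommentPeriods": true } }
)'
# Stop the port-forward when done
kill $!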