Context
We need to create Postgres tables based on the SQLAlchemy models and populate them with a subset (1,000 rows) of the GeoJSON data from the SFData API endpoints.
Definition of Done
Table Address is populated and all required fields are mapped accurately to the model.
Table Soft Story is populated and all required fields are mapped accurately to the model.
Table Seismic Hazard is populated and all required fields are mapped accurately to the model.
Table Tsunami is populated and all required fields are mapped accurately to the model.
Table Landslide is populated and all required fields are mapped accurately to the model.
Table Liquefaction Hazard is populated and all required fields are mapped accurately to the model.
Engineering Details
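The issue doesn't spell out the models or the loader, so here is a minimal sketch of what one of these loads might look like, assuming the SFData endpoints are Socrata-style data.sfgov.org resources that accept a $limit query parameter. The SoftStory model, the xxxx-xxxx dataset id, the property names, and the DATABASE_URL default are placeholders, not the project's actual values.

```python
# Minimal sketch, not the project's actual models or endpoints: create a table from a
# SQLAlchemy model and load a capped subset of GeoJSON features into it.
import os

import requests
from sqlalchemy import Column, Float, Integer, String, create_engine
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()


class SoftStory(Base):
    """Hypothetical model; the real one defines the required fields for the dataset."""

    __tablename__ = "soft_story"

    id = Column(Integer, primary_key=True)
    address = Column(String)
    status = Column(String)
    # The real models would likely use a PostGIS geometry column (GeoAlchemy2);
    # plain floats keep this sketch dependency-free and assume point geometries.
    longitude = Column(Float)
    latitude = Column(Float)


def load_soft_story(engine, limit: int = 1000) -> int:
    """Fetch up to `limit` GeoJSON features and insert them into the soft_story table."""
    # "xxxx-xxxx" is a placeholder dataset id on the Socrata instance.
    url = "https://data.sfgov.org/resource/xxxx-xxxx.geojson"
    response = requests.get(url, params={"$limit": limit}, timeout=30)
    response.raise_for_status()
    features = response.json().get("features", [])

    # Create the tables defined on the models if they don't exist yet.
    Base.metadata.create_all(engine)
    with Session(engine) as session:
        for feature in features:
            props = feature.get("properties", {})
            geometry = feature.get("geometry") or {}
            coords = geometry.get("coordinates") or [None, None]
            session.add(
                SoftStory(
                    address=props.get("address"),
                    status=props.get("status"),
                    longitude=coords[0],
                    latitude=coords[1],
                )
            )
        session.commit()
    return len(features)


if __name__ == "__main__":
    engine = create_engine(os.environ.get("DATABASE_URL", "postgresql://localhost/quake"))
    print(f"loaded {load_soft_story(engine, limit=1000)} rows")
```

The same pattern (fetch up to 1,000 features, map the feature properties onto the model's columns, commit) would repeat for the other five tables; swapping the longitude/latitude floats for a GeoAlchemy2 geometry column is the likely production choice but is left out to keep the sketch self-contained.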
I'm assuming this is really the meat of the ETL service, correct? If so, let's maybe just make sure we put some kind of limit on how much we're loading locally (people probably don't want to load all the datasets onto their computer haha).