Improving API to collect Elevator Demand and compile it as CSV file #75
This PR implements the NextLevel-Elevator API with a simple, basic implementation focused on supplying what is necessary to store users' demands.
This API has 4 endpoints (a minimal route skeleton is sketched after this overview):

- `POST /api/v1/elevator` — creates a new elevator
- `PUT /api/v1/elevator/{elevator_id}` — initiates a new demand
- `POST /api/v1/elevator/{elevator_id}/state` — reacts to the elevator state when it reaches a new level
- `GET /api/v1/elevator/dataset.csv` — retrieves the dataset that will be used to train the prediction model

Since the main goal is to provide data for predicting the best level for the elevator to rest at, we basically need data that lets us predict where the next demand will be. Let's work on the demand itself.
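As a reference for those four routes, here is a minimal skeleton (FastAPI is assumed purely for illustration; handler names and parameters are hypothetical, not the actual implementation):

```python
# Hypothetical route skeleton; FastAPI assumed, handler bodies omitted.
from fastapi import FastAPI

app = FastAPI()

@app.post("/api/v1/elevator")
def create_elevator():
    """Register a new elevator and return its id."""
    ...

@app.put("/api/v1/elevator/{elevator_id}")
def create_demand(elevator_id: int, level: int):
    """Record a user's demand for the given elevator and level."""
    ...

@app.post("/api/v1/elevator/{elevator_id}/state")
def update_state(elevator_id: int, level: int):
    """React to the elevator reaching a new level."""
    ...

@app.get("/api/v1/elevator/dataset.csv")
def dataset_csv():
    """Return the training dataset as CSV."""
    ...
```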
I'm considering that the elevator will be controlled by its own control system, and this API only reacts to elevator events, such as a demand or a state update.

And here is how it's going to work.
Whenever a user calls the elevator to a certain level, the elevator system will query this API to store the user demand, and that demand needs to be unique until the elevator attends it. For that requirement, let's rely on a SQL DB unique constraint and do it atomically, waiting for and reacting to the DB integrity error, to make sure there will never be a race condition on that.
So this is the basic model of an Elevator Demand, where we store the elevator_id, the timestamp and the level, and each demand is unique on the (elevator_id, level) columns.
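A sketch of what that could look like (SQLAlchemy is assumed here; the session wiring and exact names are illustrative, not the final implementation):

```python
# Sketch assuming SQLAlchemy; table/column names follow the description above.
from datetime import datetime, timezone

from sqlalchemy import Column, DateTime, Integer, UniqueConstraint
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class ElevatorDemand(Base):
    __tablename__ = "elevator_demand"
    id = Column(Integer, primary_key=True)
    elevator_id = Column(Integer, nullable=False)
    level = Column(Integer, nullable=False)
    timestamp = Column(DateTime, nullable=False,
                       default=lambda: datetime.now(timezone.utc))

    # Only one open demand per (elevator_id, level) at any time.
    __table_args__ = (
        UniqueConstraint("elevator_id", "level", name="uq_demand_elevator_level"),
    )

def create_demand(session: Session, elevator_id: int, level: int) -> bool:
    """Insert a demand, relying on the DB unique constraint to avoid races.

    Returns False if an open demand already exists for this elevator/level.
    """
    session.add(ElevatorDemand(elevator_id=elevator_id, level=level))
    try:
        session.commit()
        return True
    except IntegrityError:
        session.rollback()
        return False
```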
The next important rule is how to react to the elevator state. Whenever the elevator stops at any level, it must call the API to store the new state. At this point the backend will query for any open demand for that level; if one is found, it will create a new entry in a new table called elevator_demand_history, with the data needed by the training system, and it will DELETE the entry it found, opening a new slot for a new demand at that level. All of that happens in the same transaction to achieve consistency.
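A sketch of that reaction, under the same SQLAlchemy assumption (the `ElevatorDemandHistory` model is sketched after the next paragraph):

```python
def handle_elevator_state(session: Session, elevator_id: int, level: int) -> None:
    """React to the elevator stopping at `level`.

    If an open demand exists for that level, copy it to the history table
    and delete it, all inside a single transaction.
    """
    demand = (
        session.query(ElevatorDemand)
        .filter_by(elevator_id=elevator_id, level=level)
        .one_or_none()
    )
    if demand is None:
        return

    ts = demand.timestamp
    session.add(
        ElevatorDemandHistory(
            elevator_id=demand.elevator_id,
            level=demand.level,
            week_day=ts.weekday(),
            hour=ts.hour,
            minute=ts.minute,
            second=ts.second,
        )
    )
    session.delete(demand)
    # Insert and delete commit together, keeping the open-demand slot consistent.
    session.commit()
```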
The history stores the demand that was completely attended by the elevator, splitting its timestamp into week_day, hour, minute and second, so it is easier to group demands by any time heuristic with second precision. This model could be enriched with more data if needed, e.g. seasonal information such as whether it is a holiday, but I decided to keep it more straightforward.
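The history model could then look roughly like this (same SQLAlchemy assumptions as above; seasonal columns are deliberately left out):

```python
class ElevatorDemandHistory(Base):
    __tablename__ = "elevator_demand_history"
    id = Column(Integer, primary_key=True)
    elevator_id = Column(Integer, nullable=False)
    # Timestamp split into parts so demands can be grouped by any
    # time heuristic with second precision.
    week_day = Column(Integer, nullable=False)  # 0 = Monday ... 6 = Sunday
    hour = Column(Integer, nullable=False)
    minute = Column(Integer, nullable=False)
    second = Column(Integer, nullable=False)
    level = Column(Integer, nullable=False)
```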
The end user can now query the endpoint to retrieve the dataset in CSV format with the following columns:

`elevator_id,week_day,hour,minute,second,level`

It can use elevator_id, week_day, hour, minute and second as the input of the ML model, with whatever time precision the user wants (grouping by quarter of a minute, half a minute, or anything else), and use level as the output of the ML model.
Once trained, the model can be used by the elevator control system: any time the elevator is resting, it can run inference to discover where the next demand is likely to be.
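For illustration only, a consumer of the dataset endpoint might train such a model along these lines (pandas and scikit-learn assumed; the URL, bucket size, and classifier choice are placeholders, not part of this PR):

```python
# Illustrative consumer-side sketch (pandas + scikit-learn assumed).
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

# Hypothetical local deployment URL for the dataset endpoint.
df = pd.read_csv("http://localhost:8000/api/v1/elevator/dataset.csv")

# Example of grouping time into quarter-minute buckets instead of raw seconds.
df["quarter"] = df["second"] // 15

features = df[["elevator_id", "week_day", "hour", "minute", "quarter"]]
target = df["level"]  # the level of the next demand is what we want to predict

model = RandomForestClassifier()
model.fit(features, target)

# While the elevator is resting, the control system can ask where the
# next demand is likely to come from and move there in advance.
next_stop = model.predict(
    pd.DataFrame([[1, 2, 8, 30, 1]], columns=features.columns)
)[0]
```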