8 changes: 8 additions & 0 deletions Dockerfile.api
@@ -0,0 +1,8 @@
FROM python:3.11-slim

WORKDIR /app

COPY requirements.api.txt .
RUN pip install --no-cache-dir -r requirements.api.txt

COPY ./app ./app
8 changes: 8 additions & 0 deletions Dockerfile.sim
@@ -0,0 +1,8 @@
FROM python:3.11-slim

WORKDIR /sim

COPY requirements.sim.txt .
RUN pip install --no-cache-dir -r requirements.sim.txt

COPY ./simulation_runner ./
29 changes: 29 additions & 0 deletions README.md
@@ -0,0 +1,29 @@
# Elevator Simulation

## Overview
This system generates high-quality synthetic data and stores it for use in an ML ingestion pipeline.
The design comprises three main components:
- Simulation
- API
- Database

### Simulation
A discrete-event simulation built with simpy models the elevator scenario; simpy is a great fit for logistics problems and phenomena that follow Poisson processes.
This lets us recreate an environment where the elevator behaves realistically and add whatever logic we need.
For this case a simple simulation was created: a single elevator in a building with n floors serves requests in FIFO order.
A bit of business logic was added: since the first floor is usually at street level and much busier, demand for floor one is boosted, and the elevator rests at the first floor when idle.
The generated data is posted to the API at runtime.
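As a rough illustration of the request-generation logic described above, the Poisson arrival process with a demand spike on floor one can be sketched in plain Python (no simpy, to keep the snippet self-contained; the function and parameter names here are hypothetical, not the actual simulation code):

```python
import random

def generate_requests(expo_lambda, duration, floor_min, floor_max,
                      base_floor=1, base_floor_weight=3.0, seed=42):
    """Sketch of the arrival process: exponential inter-arrival times
    (i.e. a Poisson process) plus a weighted floor choice that favors
    the street-level base floor."""
    rng = random.Random(seed)
    floors = list(range(floor_min, floor_max + 1))
    # Boost the base floor's weight to model street-level demand
    weights = [base_floor_weight if f == base_floor else 1.0 for f in floors]
    t, requests = 0.0, []
    while True:
        t += rng.expovariate(expo_lambda)  # time until next arrival
        if t >= duration:
            break
        requests.append((t, rng.choices(floors, weights=weights)[0]))
    return requests

reqs = generate_requests(expo_lambda=0.5, duration=600, floor_min=1, floor_max=10)
```

In the real simulation these arrivals drive simpy processes instead of being collected in a list, but the statistical shape of the demand is the same.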

### API
A simple FastAPI application was developed, with endpoints to create and read the generated data (see routes.py).
These allow the simulation to store data in the database, and the future ML pipeline to retrieve it for training.
Tests were also added to check the endpoints' functionality.
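From the simulation side, posting a run's metadata to the API looks roughly like this (a sketch with made-up parameter values; the field names follow schemas.py and the `requests` package is already listed in requirements.sim.txt):

```python
import os

API_BASE_URL = os.getenv("API_BASE_URL", "http://localhost:8000")

def simulation_payload():
    """Builds a request body matching the SimulationCreate schema."""
    return {
        "wait_time": 5.0,
        "elevator_speed": 1.0,
        "expo_lambda": 0.5,
        "start_datetime": "2024-01-01T00:00:00",
        "duration": 600,
        "base_floor": 1,
        "base_floor_weight": 3.0,
        "floor_min": 1,
        "floor_max": 10,
        "random_seed": 42,
    }

def post_simulation(payload):
    """Posts to the running API and returns the new simulation id."""
    import requests  # third-party; only needed when actually posting
    resp = requests.post(f"{API_BASE_URL}/simulation", json=payload)
    resp.raise_for_status()
    return resp.json()["id"]
```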

### Database
A SQLAlchemy data model connected to a PostgreSQL schema is proposed (see models.py) to store simulation metadata and labeled demand data.
We store a snapshot of the simulation state whenever a relevant demand event occurs (features), together with the next requested floor observed after that scenario (label).
A simple test was also added to check the database connection through the API.
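The calculated indicators stored with each snapshot (`requests_entropy`, `mean_requested_floor`, `distance_to_center_of_mass` in models.py) can all be derived from the demand histogram. A stdlib sketch of plausible formulas follows; the actual simulation code may compute them differently:

```python
import math

def demand_features(histogram, current_floor, floor_min=1):
    """Derives indicator columns from a floor-demand histogram,
    where histogram[i] counts requests for floor (floor_min + i)."""
    total = sum(histogram)
    floors = range(floor_min, floor_min + len(histogram))
    # Shannon entropy (in nats) of the empirical request distribution
    entropy = -sum((c / total) * math.log(c / total) for c in histogram if c > 0)
    # Demand-weighted mean floor: the "center of mass" of recent requests
    mean_floor = sum(f * c for f, c in zip(floors, histogram)) / total
    return {
        "requests_entropy": entropy,
        "mean_requested_floor": mean_floor,
        "distance_to_center_of_mass": abs(current_floor - mean_floor),
    }

feats = demand_features([1, 2, 0, 3, 1], current_floor=1)
```

High entropy means demand is spread across floors (hard to predict); a small distance to the center of mass means the idle elevator is already well positioned.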

#### Note
The system was designed in a containerized fashion so it can be deployed easily to a production environment (see docker-compose.yml).
The simulation and the app run as separate services, since simulations can be very resource-heavy and we don't want to overload the backend.
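Assuming the layout in docker-compose.yml, a typical local run would look something like this (standard `docker compose` commands, shown as a usage sketch):

```shell
# Start the database and the API in the background
docker compose up --build -d db api

# Run one simulation batch; the container exits when the run finishes
docker compose run --rm simulator

# Inspect the stored simulations through the API
curl http://localhost:8000/simulations
```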
16 changes: 16 additions & 0 deletions app/db.py
@@ -0,0 +1,16 @@
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

import os

DATABASE_URL = os.getenv("DATABASE_URL")

engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(bind=engine, autocommit=False, autoflush=False)

def get_db():
    db = SessionLocal()
    try:
        yield db
    finally:
        db.close()
11 changes: 11 additions & 0 deletions app/main.py
@@ -0,0 +1,11 @@
from fastapi import FastAPI
# Import via the "app" package: the compose command runs `uvicorn app.main:app`
# from /app, so a bare `from routes import ...` would not resolve
from app.routes import router

app = FastAPI(
    title="elevator-sim API",
    description="Stores simulation metadata and elevator requests for model training",
    version="0.1.0"
)

# Include the endpoints
app.include_router(router)
60 changes: 60 additions & 0 deletions app/models.py
@@ -0,0 +1,60 @@
from sqlalchemy import Column, Integer, Float, ForeignKey, DateTime
from sqlalchemy.dialects.postgresql import ARRAY
from sqlalchemy.orm import declarative_base, relationship

Base = declarative_base()

class SimulationMetadata(Base):
    """
    Stores data describing the simulation; each simulation has many requests.
    Parameters can be used to reproduce a simulation or provide extra features.
    """
    __tablename__ = "simulations"

    id = Column(Integer, primary_key=True, index=True)

    # Simulation parameters
    wait_time = Column(Float, nullable=False)          # seconds
    elevator_speed = Column(Float, nullable=False)     # floors/sec
    expo_lambda = Column(Float, nullable=False)        # req/sec
    start_datetime = Column(DateTime, nullable=False)  # timestamp
    duration = Column(Integer, nullable=False)         # seconds
    base_floor = Column(Integer, nullable=True)
    base_floor_weight = Column(Float, nullable=True)   # chance multiplier
    floor_min = Column(Integer, nullable=False)
    floor_max = Column(Integer, nullable=False)
    random_seed = Column(Integer, nullable=False)      # for reproducibility

    # 1-N relationship with requests
    requests = relationship("ElevatorRequest", back_populates="simulation")


class ElevatorRequest(Base):
    """
    Snapshot of the simulation when the elevator was idle, waiting for the next requested floor.
    Used as features to train a model; next_floor_requested can be used as the label.
    Each record belongs to a single simulation.
    """
    __tablename__ = "elevator_requests"

    id = Column(Integer, primary_key=True, index=True)

    # State features
    current_floor = Column(Integer, nullable=False)
    last_floor = Column(Integer, nullable=False)
    time_idle = Column(Float, nullable=False)
    timestamp = Column(DateTime, nullable=False)

    # Calculated indicators
    floor_demand_histogram = Column(ARRAY(Integer), nullable=False)  # e.g. [1, 2, 0, 3, 1]
    hot_floor_last_30s = Column(Integer, nullable=True)
    requests_entropy = Column(Float, nullable=True)
    mean_requested_floor = Column(Float, nullable=True)
    distance_to_center_of_mass = Column(Float, nullable=True)

    # Label
    next_floor_requested = Column(Integer, nullable=True)

    # N-1 relationship with simulation
    simulation_id = Column(Integer, ForeignKey("simulations.id"), nullable=False)
    simulation = relationship("SimulationMetadata", back_populates="requests")
63 changes: 63 additions & 0 deletions app/routes.py
@@ -0,0 +1,63 @@
from fastapi import APIRouter, Depends, HTTPException
from sqlalchemy.orm import Session
from typing import List

# Package-qualified imports so the module resolves under `uvicorn app.main:app`
from app.models import SimulationMetadata, ElevatorRequest
from app.schemas import SimulationCreate, SimulationOut, ElevatorRequestCreate, ElevatorRequestOut
from app.db import get_db

router = APIRouter()

# Simulation endpoints ---

@router.post("/simulation", response_model=SimulationOut)
def create_simulation(sim_data: SimulationCreate, db: Session = Depends(get_db)):
    """
    Create a single simulation object
    """
    sim = SimulationMetadata(**sim_data.dict())
    db.add(sim)
    db.commit()
    db.refresh(sim)
    return sim


@router.get("/simulations", response_model=List[SimulationOut])
def get_simulations(db: Session = Depends(get_db)):
    """
    Read all available simulations
    """
    return db.query(SimulationMetadata).all()


@router.get("/simulation/{id}", response_model=SimulationOut)
def get_simulation(id: int, db: Session = Depends(get_db)):
    """
    Read a specific simulation
    """
    sim = db.query(SimulationMetadata).filter(SimulationMetadata.id == id).first()
    if not sim:
        raise HTTPException(status_code=404, detail="Simulation not found")
    return sim


# Requests endpoints ---

@router.post("/elevator_request", response_model=ElevatorRequestOut)
def create_elevator_request(req_data: ElevatorRequestCreate, db: Session = Depends(get_db)):
    """
    Create a single request
    """
    req = ElevatorRequest(**req_data.dict())
    db.add(req)
    db.commit()
    db.refresh(req)
    return req


@router.get("/elevator_request/{sim_id}", response_model=List[ElevatorRequestOut])
def get_requests_for_simulation(sim_id: int, db: Session = Depends(get_db)):
    """
    Read all requests that correspond to a specific simulation
    """
    return db.query(ElevatorRequest).filter(ElevatorRequest.simulation_id == sim_id).all()
51 changes: 51 additions & 0 deletions app/schemas.py
@@ -0,0 +1,51 @@
from datetime import datetime
from typing import List, Optional
from pydantic import BaseModel


# Simulation schema ---

class SimulationBase(BaseModel):
    wait_time: float
    elevator_speed: float
    expo_lambda: float
    start_datetime: datetime
    duration: int
    base_floor: Optional[int] = None
    base_floor_weight: Optional[float] = None
    floor_min: int
    floor_max: int
    random_seed: int

class SimulationCreate(SimulationBase):
    pass

class SimulationOut(SimulationBase):
    id: int

    class Config:
        orm_mode = True


# Request schema ---

class ElevatorRequestBase(BaseModel):
    current_floor: int
    last_floor: int
    time_idle: float
    timestamp: datetime
    floor_demand_histogram: List[int]
    hot_floor_last_30s: Optional[int] = None
    requests_entropy: Optional[float] = None
    mean_requested_floor: Optional[float] = None
    distance_to_center_of_mass: Optional[float] = None
    next_floor_requested: Optional[int] = None

class ElevatorRequestCreate(ElevatorRequestBase):
    simulation_id: int

class ElevatorRequestOut(ElevatorRequestBase):
    id: int
    simulation_id: int

    class Config:
        orm_mode = True
10 changes: 0 additions & 10 deletions chatgpt/app_tests.py

This file was deleted.

12 changes: 0 additions & 12 deletions chatgpt/db.sql

This file was deleted.

43 changes: 0 additions & 43 deletions chatgpt/main.py

This file was deleted.

4 changes: 0 additions & 4 deletions chatgpt/requirements.txt

This file was deleted.

52 changes: 52 additions & 0 deletions docker-compose.yml
@@ -0,0 +1,52 @@
# yaml file to orchestrate the services of the system:
# DB - API - Simulation

version: "3.9"

services:

  db:
    image: postgres:14
    container_name: elevator_db
    restart: always
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: elevator_sim
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data

  api:
    build:
      context: .
      dockerfile: Dockerfile.api
    container_name: elevator_api
    restart: always
    depends_on:
      - db
    environment:
      # SQLAlchemy 1.4+ requires the "postgresql://" scheme ("postgres://" is rejected)
      DATABASE_URL: postgresql://postgres:postgres@db:5432/elevator_sim
    ports:
      - "8000:8000"
    command: uvicorn app.main:app --host 0.0.0.0 --port 8000 --reload
    volumes:
      - .:/app

  simulator:
    build:
      context: .
      dockerfile: Dockerfile.sim
    container_name: elevator_simulator
    restart: "no"  # does not run continuously
    depends_on:
      - api
    environment:
      API_BASE_URL: http://api:8000
    command: ["python", "runner.py"]
    volumes:
      - ./simulation_runner:/sim

volumes:
  postgres_data:
File renamed without changes.
8 changes: 8 additions & 0 deletions requirements.api.txt
@@ -0,0 +1,8 @@
fastapi
uvicorn
sqlalchemy
psycopg2-binary
pydantic<2  # schemas use Pydantic v1 style (orm_mode, .dict())
python-dotenv
pytest
httpx
4 changes: 4 additions & 0 deletions requirements.sim.txt
@@ -0,0 +1,4 @@
simpy
requests
numpy
python-dotenv