You can use CloudOps Automation Runbooks, built on Jupyter Notebooks, to quickly create SRE runbooks for runbook automation and cloud infrastructure management!
> [!IMPORTANT]
> Mission: Our mission is to simplify CloudOps Automation for DevOps and SRE teams by providing an extensive, community-driven repository of actions and runbooks that streamline day-to-day operations.
> [!NOTE]
> Vision: Our vision is to be the One-Stop Multi-Cloud Platform Engineering & Best Practices Solution for all CloudOps Automation needs, allowing DevOps and SRE teams to automate their workflows with ease, improve efficiency, and minimize toil.
- Lightning-Fast Toolchain: powered by `uv` (next-generation Python dependency and build management), `ruff` (linting and formatting at blazing speed), and `pytest` (robust testing framework with coverage reports).
- Effortless CI/CD Pipelines: Taskfile automation (say goodbye to repetitive manual SDLC tasks), containerized workflows (security-first practices and multi-stage Wolfi-based Docker builds for scalable, production-ready environments), and auto-publishing to PyPI and the GitHub Container Registry (GHCR) with GitHub Actions.
- CloudOps Automation and FinOps Toolkit: pre-configured hybrid-cloud workflows and seamless integrations (jupyterlab, mkdocs, boto3, moto) for managing cloud infrastructure (see the testing sketch after this list).
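For example, the moto integration lets unit tests exercise boto3 calls against an in-memory AWS backend instead of a real account. A minimal sketch, not taken from this repository; the test and bucket names are illustrative:

```python
import boto3
from moto import mock_aws  # moto >= 5.x; earlier releases expose per-service decorators such as mock_s3


@mock_aws
def test_create_bucket_in_memory():
    """Run a boto3 call against moto's in-memory S3 backend (no real AWS credentials needed)."""
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="example-bucket")

    names = [bucket["Name"] for bucket in s3.list_buckets()["Buckets"]]
    assert names == ["example-bucket"]
```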
| Feature | Toolchain | Purpose |
|---|---|---|
| Configuration | `pyproject.toml` | Centralized configuration for dependencies, testing, and linting. |
| Task Automation | Taskfile | Automates repetitive tasks like linting, testing, and publishing. |
| Python Dependencies | `uv` | Lightning-fast dependency resolution, caching, and builds. |
| Linting & Formatting | `ruff` | Enforces code quality standards, auto-formatting, and import sorting. |
| Testing Framework | `pytest` | Comprehensive unit and integration tests with coverage reporting. |
| Docker Integration | Dockerfile + DevContainer | Optimized Wolfi-based multi-stage builds for CI/CD and local development environments. |
| CI/CD Pipelines | GitHub Actions | Automated builds, tests, and deployments to PyPI and GHCR. |
| Security Compliance | chainguard/wolfi-base + SBOM + Attestations | Ensures compliance, vulnerability scanning, and security transparency. |
- Automatic documentation generation.
- CLI Tools: Typer simplifies automation for AWS resources (see the CLI sketch after this list).
- Logging: Loguru ensures structured logs for debugging.
- CI/CD-optimized Docker image, built and pushed to the GitHub Container Registry when a new release is created.
- GitHub Actions workflows:
  - Auto-publish to PyPI on push to `main`.
  - Auto-create a new tag on push to `main` and sync versions.
  - Run `tests` and `lint` on `dev` and `main` when a PR is opened.
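As referenced in the CLI Tools item above, Typer and Loguru pair naturally for small automation commands. A minimal sketch, assuming a hypothetical command and module layout; this is not the repository's actual CLI entry point:

```python
import boto3
import typer
from loguru import logger

app = typer.Typer(help="Example CloudOps automation commands (illustrative only).")


@app.command()
def list_buckets(profile: str = "default") -> None:
    """List S3 bucket names for the given AWS profile."""
    logger.info("Listing S3 buckets with profile '{}'", profile)
    session = boto3.Session(profile_name=profile)
    for bucket in session.client("s3").list_buckets()["Buckets"]:
        typer.echo(bucket["Name"])


if __name__ == "__main__":
    app()
```

Saved as a hypothetical `cli.py`, running `python cli.py list-buckets --profile default` would print the bucket names while Loguru emits a structured log line for the call.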
```bash
git clone https://github.com/nnthanh101/runbooks.git
cd runbooks

echo "Install Python dependencies using UV"
task install

echo "Run CI/CD pipeline tasks: clean, lint, format, test, and build"
task ci

echo "Publish the project package to PyPI"
task publish
```
- Open the project in VSCode.
- Install the Remote - Containers extension.
- Reopen in Container: open the Command Palette (`Ctrl+Shift+P`) and select "Reopen in Container".
End-to-end, production-grade project structure for successful CloudOps Automation and Visual Analytics FinOps projects:
```
cloudops-automation/
├── .devcontainer/              ## Dev Container configurations
│   └── Dockerfile              ## Container image build file
├── .github/                    ## CI/CD workflows
│   ├── workflows/              ## GitHub Actions workflows
│   └── templates/              ## Workflow templates
├── .vscode/                    ## IDE-specific configurations
├── config/                     ## Configuration files (YAML, JSON)
├── data/                       ## Raw and processed data files
│   ├── external/               ## Data from third-party sources
│   ├── interim/                ## Intermediate data that has been transformed
│   ├── processed/              ## The final, canonical data sets for modeling
│   └── raw/                    ## The original, unprocessed, immutable data dump
├── docs/                       ## Default mkdocs project; see mkdocs.org for details
│   ├── api/                    ## API documentation
│   ├── architecture/           ## Architecture diagrams
│   ├── tutorials/              ## Tutorials and guides
│   ├── getting-started.md      ## Quickstart guide
│   └── index.md                ## Overview documentation
├── logs/                       ## Log files for debugging
├── models/                     ## Trained and serialized models for easy access and versioning
├── notebooks/                  ## Jupyter notebooks for experiments and visualization
│   ├── data_exploration.ipynb
│   ├── data_preprocessing.ipynb
│   ├── model_training.ipynb
│   └── model_evaluation.ipynb
├── pyproject.toml              ## Project configuration with package metadata for runbooks
│                               ## and settings for tools like black
├── src/                        ## Source code for use in this project
│   ├── runbooks/               ## Main module for CloudOps Runbooks automation
│   │   ├── __init__.py         ## Package initializer
│   │   ├── calculator.py       ## [Python101] Calculator
│   │   ├── config.py
│   │   ├── exceptions.py
│   │   ├── utils.py            ## Utility scripts (logging, configs)
│   │   └── cleanup.py          ## Cleanup automation runbook
│   └── main.py
├── test/                       ## Unit and integration tests
│   ├── conftest.py
│   ├── __init__.py
│   ├── test_calculator.py      ## [Python101] Test cases for calculator
│   ├── test_utils.py           ## Test cases for utils
│   └── test_exceptions.py
├── templates/                  ## Terraform and CloudFormation templates
├── tools/                      ## Developer tools and scripts
├── .dockerignore               ## Docker ignore file
├── .env                        ## Environment variables
├── .gitignore                  ## Git ignore file
├── .python-version             ## Python version management
├── mkdocs.yml                  ## Documentation generator configuration
├── README.md                   ## Explains the project and its structure for better collaboration
├── references/                 ## Data dictionaries, manuals, and other explanatory materials
├── reports/                    ## Generated analysis (reports, charts, plots) as HTML, PDF, LaTeX
│   └── figures/                ## Generated graphics and figures used in reporting
├── requirements.txt            ## Requirements file for reproducing the analysis environment
└── Taskfile                    ## Convenience commands like `task data` or `task train`
```
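The `src/` and `test/` layout pairs each module with a test file that pytest discovers automatically. A minimal sketch of what `test/test_calculator.py` could look like, assuming a hypothetical `add` function in `src/runbooks/calculator.py` (the real module's API may differ):

```python
# test/test_calculator.py -- illustrative only; `add` is an assumed function name.
import pytest

from runbooks.calculator import add


def test_add_returns_sum():
    assert add(2, 3) == 5


@pytest.mark.parametrize("a, b, expected", [(0, 0, 0), (-1, 1, 0), (2.5, 2.5, 5.0)])
def test_add_handles_various_inputs(a, b, expected):
    assert add(a, b) == expected
```

`task ci` would run these tests together with the lint, format, and coverage steps described above.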
- Check if the image exists in GitHub Packages:

  ```bash
  docker pull ghcr.io/nnthanh101/runbooks:latest
  ```

- If the image doesn't exist, build and push it:

  ```bash
  docker build -t ghcr.io/nnthanh101/runbooks:latest .
  echo "$GH_TOKEN" | docker login ghcr.io -u nnthanh --password-stdin
  docker push ghcr.io/nnthanh101/runbooks:latest
  ```

- Inspect the image:

  ```bash
  docker inspect ghcr.io/nnthanh101/runbooks:latest
  ```