# Contributing to ML/MLOps Portfolio
Thank you for your interest in contributing to this portfolio! This project demonstrates end-to-end MLOps practices across three deployed ML services.
## Getting Started

### Prerequisites
- Python 3.11+ (3.12 also supported)
- Docker & Docker Compose
- `make` utility
- `pytest` for running tests
### Recommended Reading

Before contributing, please review:

- Architecture Documentation: understand the system design
- Deployment Guide: learn deployment and monitoring procedures
- Deployment Evidence: multi-cloud deployment verification
### Setup

1. **Clone the repository.**
2. **Install dependencies.** We use `make` to manage dependencies across all projects.
3. **Verify the environment.** Run the test suite to ensure everything is working.
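The three setup steps can be sketched as follows. The clone URL, the checkout directory name, and the `make install` target are assumptions (`make test` is the target required later under Pull Request Process); substitute the actual values for this repository:

```shell
# 1. Clone the repository (replace REPO_URL with the actual clone URL)
git clone REPO_URL
cd ml-mlops-portfolio   # directory name is an assumption

# 2. Install dependencies across all projects ("install" target is an assumption)
make install

# 3. Verify the environment by running the test suite
make test
```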
## Development Workflow

### 1. Project Structure
The portfolio consists of three main microservices:
- `BankChurn-Predictor/` (FastAPI + Scikit-learn)
- `NLPInsight-Analyzer/` (FastAPI + FinBERT / TF-IDF)
- `ChicagoTaxi-Demand-Pipeline/` (FastAPI + PySpark + Dask)
### 2. Dependency Management

We use `requirements.in` for direct dependencies and `pip-compile` for locking.
To add a package:

1. Edit `requirements.in` in the specific project folder.
2. Regenerate the lock file and install it:

   ```shell
   pip-compile requirements.in
   pip install -r requirements.txt
   ```
### 3. Running Locally
Use the unified Docker stack for integration testing:
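A minimal sketch, assuming the demo Compose file referenced in the Testing section (`docker-compose.demo.yml`) is the unified stack:

```shell
# Bring up the full demo stack in the background
docker compose -f docker-compose.demo.yml up -d

# List the services and their health status
docker compose -f docker-compose.demo.yml ps
```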
## Code Standards

### Python Style

- Formatter: `black`
- Linter: `flake8`
- Type checking: `mypy`
- Imports: `isort`
Run the linting suite before committing:
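The four tools named above can be run directly from a project root; a combined `make lint` target may also exist, but that is an assumption. A sketch:

```shell
# Format code and sort imports in place
black .
isort .

# Lint and type-check
flake8 .
mypy .
```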
### Testing

- **Unit tests**: required for all new logic (`tests/test_*.py`).
- **Integration tests**: required for API endpoints.
  - Use `tests/integration/test_demo.py` for cross-project validation.
  - Ensure all services pass health checks and prediction tests.
- **Coverage**: must remain above 85% (actual: 90–98%, 395+ tests, Codecov verified).
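The coverage floor can be checked locally with `pytest-cov` (assuming it is installed alongside `pytest`; `--cov` and `--cov-fail-under` are pytest-cov flags):

```shell
# Fail the run if total coverage drops below the 85% floor
pytest --cov --cov-fail-under=85
```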
Run integration tests:
```shell
# Start demo stack
docker compose -f docker-compose.demo.yml up -d

# Run tests
pytest tests/integration/test_demo.py -v

# Tear down
docker compose -f docker-compose.demo.yml down
```
## Commit Messages

We follow Conventional Commits:

- `feat:` new feature
- `fix:` bug fix
- `docs:` documentation
- `chore:` maintenance (deps, build)
- `refactor:` code restructuring
Example:
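A hypothetical commit message following the convention (the description is invented for illustration):

```
feat: add request logging to the prediction endpoint
```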
## Pull Request Process

1. Create a branch: `feat/my-new-feature`.
2. Commit changes, ensuring `make test` passes.
3. Open a PR targeting `main`.
4. Ensure the CI pipeline (Tests, Lint, Docker Build) is green.
5. Request review.
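The branch-to-PR flow can be sketched with standard git commands (the branch name is the example from step 1; the commit message is a hypothetical following Conventional Commits):

```shell
# Create the feature branch
git checkout -b feat/my-new-feature

# Verify the full test suite before committing
make test

# Commit using the Conventional Commits format and push
git commit -am "feat: describe the change here"
git push -u origin feat/my-new-feature
```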
## Development Process & AI Transparency
This portfolio was developed using AI-assisted tools (Cursor / Cascade) for code generation and boilerplate acceleration. All architectural decisions, project selection, MLOps pipeline design, infrastructure choices, and system integration were made by the author.
What AI tools were used for:

- Boilerplate code generation (FastAPI endpoints, test scaffolding, Dockerfile templates)
- Documentation drafting and formatting
- Code refactoring suggestions and performance optimization patterns
What the author owns and maintains independently:

- All architectural and design decisions
- MLOps pipeline design (CI/CD workflow, security scanning strategy, coverage enforcement)
- Infrastructure choices (Terraform modules, Kubernetes manifests, monitoring stack)
- Model selection, experiment design, and hyperparameter tuning strategy
- System integration, debugging, and production operations
- Docker optimization and multi-stage build design
This transparency reflects the industry-standard practice of leveraging AI tools as productivity accelerators while retaining full engineering ownership.
## License
By contributing, you agree that your contributions will be licensed under the MIT License.
Last Updated: March 2026 — v3.5.3