Lab Instructions: Building & Deploying AI Applications with CI/CD
AML3304 - Software Tools for AI/ML
Version 2.1 | March 2025
Lab Objectives
- Implement an end-to-end ML pipeline using Google Colab & Hugging Face
- Configure GitHub Actions CI/CD for model updates
- Manage projects via GitHub Issues + Project Boards
- Deploy production-grade models to Hugging Face Spaces
Technical Stack
Google Colab, Hugging Face Transformers & Spaces, GitHub Actions, GitHub Issues & Project Boards, MLflow, Gradio
Step-by-Step Implementation
Phase 1: Environment Setup (1 Hour)
1.1 Create Lab Repository
Create a new repository named aml3304-lab under your GitHub account, then clone it locally:
bash
git clone https://github.com/[YOUR_USERNAME]/aml3304-lab.git
cd aml3304-lab
1.2 Colab Starter Notebook
Create AML3304_Lab.ipynb with:
python
# Pin library versions for reproducible runs
!pip install -q transformers==4.30 datasets==2.12 scikit-learn==1.2 mlflow==2.5
import mlflow
# Use a SQLite-backed store; a database backend is required for the model registry used in 2.2
mlflow.set_tracking_uri("sqlite:///mlruns.db")
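Colab storage is ephemeral, so mlruns.db disappears when the session ends. If you want tracking data to persist, one option (a sketch, assuming you are running in Colab) is to keep the database on Google Drive and point the mlflow ui command in the Monitoring section at the same path:
python
from google.colab import drive
drive.mount('/content/drive')

import mlflow
# Absolute SQLite path on Drive so runs survive session restarts
mlflow.set_tracking_uri("sqlite:////content/drive/MyDrive/mlruns.db")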
Phase 2: Model Development (3 Hours)
2.1 Text Classification Pipeline
python
from transformers import pipeline
# Simple sentiment analysis
classifier = pipeline("text-classification", model="distilbert-base-uncased-finetuned-sst-2-english")
# Test prediction: returns a list with one dict per input, e.g. [{'label': 'POSITIVE', 'score': ...}]
sample_text = "Google Colab provides excellent GPU resources for AI education"
print(classifier(sample_text))
2.2 MLflow Tracking
python
with mlflow.start_run():
    mlflow.log_param("model", "distilbert-base-uncased")
    mlflow.log_metric("accuracy", 0.92)
    mlflow.transformers.log_model(
        transformers_model=classifier,
        artifact_path="sentiment_model",
        registered_model_name="lab-sentiment-v1"
    )
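To confirm that the run and the registered model were recorded, the model can be loaded back from the local registry. A minimal sketch, assuming this is the first registered version:
python
import mlflow

mlflow.set_tracking_uri("sqlite:///mlruns.db")
# Reload version 1 of the registered model as a transformers pipeline and sanity-check it
reloaded = mlflow.transformers.load_model("models:/lab-sentiment-v1/1")
print(reloaded("Reloading the registered model works"))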
Phase 3: Deployment (2 Hours)
3.1 Hugging Face Spaces Setup
Create a new Space under your Hugging Face account with the Gradio SDK; the app.py below and the CI/CD workflow in Phase 4 will deploy to this Space.
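The Space can be created through the web UI (huggingface.co/new-space) or programmatically. A minimal sketch using huggingface_hub, with a placeholder username and Space name:
python
from huggingface_hub import HfApi

# Assumes you are logged in (huggingface-cli login) or have HF_TOKEN set in the environment
api = HfApi()
# Create an empty Space configured for the Gradio SDK
api.create_repo(
    repo_id="[YOUR_HF_USERNAME]/aml3304-sentiment-demo",
    repo_type="space",
    space_sdk="gradio",
)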
3.2 Deployment Script
Create app.py:
python
import gradio as gr
from transformers import pipeline
model = pipeline("text-classification", "distilbert-base-uncased-finetuned-sst-2-english")
def analyze(text):
    result = model(text)[0]
    # gr.Label expects a dict mapping class names to confidence scores
    return {result["label"]: result["score"]}

gr.Interface(
    fn=analyze,
    inputs=gr.Textbox(lines=2, placeholder="Enter text..."),
    outputs="label"
).launch()
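Spaces built with the Gradio SDK install Python packages listed in a requirements.txt at the root of the Space repo (gradio itself is provided by the SDK). A minimal version for this app:
text
transformers
torch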
Phase 4: CI/CD Implementation (2 Hours)
4.1 GitHub Actions Workflow
Create .github/workflows/deploy.yml. The deploy step follows the git-push pattern from the Hugging Face Spaces documentation; replace the bracketed placeholders with your own Hugging Face username and Space name:
yaml
name: Deploy to Hugging Face
on:
  push:
    branches: [ main ]
jobs:
  build-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0
      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.10'
      - name: Install dependencies
        run: |
          pip install transformers gradio
      - name: Deploy to Spaces
        env:
          HF_TOKEN: ${{ secrets.HF_TOKEN }}
        run: |
          git config user.name "AML3304 Student"
          git config user.email "student@lambtoncollege.ca"
          git push https://[YOUR_HF_USERNAME]:$HF_TOKEN@huggingface.co/spaces/[YOUR_HF_USERNAME]/[YOUR_SPACE_NAME] main
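The workflow expects a Hugging Face access token with write permission stored as a repository secret named HF_TOKEN. One way to add it, assuming the GitHub CLI (gh) is installed and authenticated:
bash
gh secret set HF_TOKEN
The command prompts for the token value; the secret can also be added in the repository web UI under Settings → Secrets and variables → Actions.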
4.2 Project Management Setup
- Enable GitHub Projects in repo settings
- Create a project board and at least five issues covering the lab phases (see the sketch below)
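Issues can be created in the web UI or with the GitHub CLI. A minimal sketch, assuming gh is installed and authenticated (titles and bodies are illustrative):
bash
gh issue create --title "Phase 1: Environment setup" --body "Clone repo, create Colab notebook, configure MLflow tracking"
gh issue create --title "Phase 3: Deploy Gradio app to Spaces" --body "Create Space, add app.py and requirements.txt"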
Testing & Validation
Sample Test Suite
Create tests/test_pipeline.py:
python
def test_data_loading():
    from datasets import load_dataset
    data = load_dataset("imdb", split="test[:1%]")
    assert len(data) > 0

def test_model_inference():
    from transformers import pipeline
    classifier = pipeline("text-classification", model="distilbert-base-uncased-finetuned-sst-2-english")
    result = classifier("This lab works perfectly!")
    assert result[0]['label'] in ['POSITIVE', 'NEGATIVE']
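The suite can be run locally with pytest (and added as an extra step in the deploy workflow if desired). A minimal sketch:
bash
pip install pytest transformers datasets torch
pytest tests/ -v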
Monitoring & Maintenance
- GitHub Actions Dashboard: Monitor build statuses
- MLflow UI: Review logged runs and metrics locally
bash
mlflow ui --backend-store-uri sqlite:///mlruns.db
- Hugging Face Metrics: Track API usage & performance
Ethical Implementation Checklist
- Bias evaluation using the datasets library
- Privacy filter for user inputs
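The checklist items are left for you to implement; as one possible starting point for the privacy filter, a minimal sketch that redacts email addresses before text reaches the model (the regex and function name are illustrative):
python
import re

EMAIL_RE = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")

def scrub(text: str) -> str:
    # Replace anything that looks like an email address before inference
    return EMAIL_RE.sub("[REDACTED]", text)

print(scrub("Contact me at student@lambtoncollege.ca"))  # Contact me at [REDACTED]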
Submission Requirements
- Functional Colab notebook
- Active project board with ≥5 issues
- Live Hugging Face Space demo
Recommended Resources
This lab aligns with Vocational Learning Outcomes 3, 6, and 9 by implementing end-to-end MLOps workflows with industry-standard tools.