# RetroBrain: Reconstructing Modern Knowledge Using Outdated Models

## 🔥 Pitch (what you tell the judges)

A machine learning system that attempts to "relearn" modern world knowledge using only retro-era algorithms, then measures how far those outdated models fall short of modern ML. The result is a retro-to-modern ML reconstruction engine. It's ML, it's retro, it's unique, and it's guaranteed to drop jaws.

## 🧠 Concept

You take a modern dataset (text, images, or tabular; you choose). Then you build a pipeline with three tiers of ML models, each representing a "technological era":

### 1980s ML
- Logistic regression
- Naive Bayes
- Simple decision trees
- K-means

→ cheap, fast, easy

### 2000s ML
- Random forests
- SVM
- PCA / feature engineering

→ slightly more advanced

### 2020s ML
- Small LLM or small vision model using GradientAI / Gemini

→ modern baseline

Then your system shows how the different eras of ML interpret the same problem, producing a visual "retro-to-modern evolution of intelligence." No one else will think of this.
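Here is a minimal sketch of the tiered training loop, assuming scikit-learn and its built-in digits data as a stand-in until you pick a dataset option below. K-means is left out of the loop because, as a clustering method, it needs a cluster-to-label mapping before it can be scored the same way; the 2020s tier is handled separately via the Gemini API.

```python
# Sketch: train each "era" of models on the same data and compare accuracy.
# The dataset and tier groupings here are placeholders; swap in your real data.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

X, y = load_digits(return_X_y=True)  # stand-in dataset
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

tiers = {
    "1980s": [LogisticRegression(max_iter=1000), GaussianNB(), DecisionTreeClassifier(max_depth=3)],
    "2000s": [RandomForestClassifier(n_estimators=100), SVC()],
    # "2020s": queried through the Gemini API instead of trained locally
}

for era, models in tiers.items():
    for model in models:
        model.fit(X_train, y_train)
        acc = accuracy_score(y_test, model.predict(X_test))
        print(f"{era} | {type(model).__name__:>24}: {acc:.3f}")
```

Every model in the loop trains in seconds on a laptop, which is what keeps the whole pipeline hackathon-safe.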

## 🧩 Why This Wins "Best Machine Learning Model"

Judges LOVE:
- comparisons
- creative ML framing
- retro themes
- clean visualizations
- "aha!" explanations
- a strong narrative

Your idea becomes:
- scientifically meaningful
- visually striking
- educational
- funny
- technically impressive
- 100% novel

And each model trains in seconds, so it's safe for a 24-hour hackathon.

## ⚙️ Tech Stack (solo-friendly)

### Backend / ML
- Python
- scikit-learn
- XGBoost (optional tier)
- Gemini API (for the modern baseline)
- DigitalOcean GradientAI (optional: deploy the modern model)

### Frontend
- Simple HTML/CSS/JS or Streamlit
- Retro CRT aesthetic (big win)
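If you take the Streamlit route, the CRT aesthetic is mostly a page of injected CSS plus the charts you already have. A rough sketch, with made-up colors, scores, and class names that you would tune to taste:

```python
# Rough Streamlit sketch of a retro CRT-style front end.
# Assumes `pip install streamlit`; run with `streamlit run app.py`.
# The CSS, colors, and placeholder scores are illustrative only.
import streamlit as st

st.set_page_config(page_title="RetroBrain", layout="wide")

# Fake a CRT feel: black background, green phosphor text, monospace font.
st.markdown(
    """
    <style>
    .stApp { background-color: #000000; color: #33ff33; font-family: "Courier New", monospace; }
    h1, h2, h3 { color: #33ff33; text-shadow: 0 0 6px #33ff33; }
    </style>
    """,
    unsafe_allow_html=True,
)

st.title("RETROBRAIN // RETRO RANKING BOARD")

era = st.select_slider("ML era", options=["1980s", "2000s", "2020s"])
scores = {"1980s": 0.61, "2000s": 0.78, "2020s": 0.95}  # placeholder numbers
st.metric(label=f"{era} accuracy", value=f"{scores[era]:.0%}")
```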

## Dataset Options (easy to use)

Choose ONE of these depending on your comfort level:

### Option A: Text sentiment
- Use movie reviews / news headlines
- Fast pipeline, easy to visualize

### Option B: Image classification
- Use CIFAR-10 (super small)
- Your "retro models" will look hilariously bad → judges love it
- The modern model will look amazing → good contrast

### Option C: Tabular anomaly detection
- Easiest
- Fastest
- Super visual
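For Option A, one quick way to get a text pipeline running is scikit-learn's built-in 20 Newsgroups corpus as a stand-in until the real reviews/headlines data is wired up. The two categories below are arbitrary placeholders:

```python
# Option A sketch: TF-IDF text features + a "1980s" Naive Bayes classifier.
# 20 Newsgroups is only a stand-in corpus; replace with your real dataset.
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.metrics import accuracy_score

cats = ["rec.autos", "sci.space"]  # placeholder binary task
train = fetch_20newsgroups(subset="train", categories=cats)
test = fetch_20newsgroups(subset="test", categories=cats)

vec = TfidfVectorizer(max_features=5000)
X_train = vec.fit_transform(train.data)
X_test = vec.transform(test.data)

clf = MultinomialNB().fit(X_train, train.target)  # the "1980s" tier
print("Naive Bayes accuracy:", accuracy_score(test.target, clf.predict(X_test)))
```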

## 🛠️ What You Build in 24 Hours

### MVP (6–8 hours)
- Load the dataset
- Train retro models (logistic regression, Naive Bayes, k-means)
- Train middle-era models (random forest, SVM)
- Query Gemini as a proxy for modern ML predictions
- Build a "retro ranking board" (sketched below) showing:
  - accuracy
  - confusion matrix
  - how each model misinterprets the data
- Apply retro color palettes + animations
- Deploy a simple frontend
- Add an explanation mode:
  - "This is how a 1980s model sees the world"
  - "This is how 2000s ML sees the world"
  - "This is how modern AI sees the world"

This alone can win.
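A minimal sketch of the ranking board data, assuming the fitted models and held-out split from the earlier tier sketch. The `ranking_board` helper and the model names passed to it are placeholders; the returned dict is what the frontend renders per era:

```python
# Sketch: collect accuracy + confusion matrix per model for the ranking board.
from sklearn.metrics import accuracy_score, confusion_matrix

def ranking_board(fitted_models, X_test, y_test):
    """Return {model name: {"accuracy": float, "confusion": 2D list}}."""
    board = {}
    for name, model in fitted_models.items():
        preds = model.predict(X_test)
        board[name] = {
            "accuracy": accuracy_score(y_test, preds),
            "confusion": confusion_matrix(y_test, preds).tolist(),
        }
    return board

# Example usage with models from the earlier tier sketch:
# board = ranking_board({"LogReg (1980s)": logreg, "SVM (2000s)": svm}, X_test, y_test)
# print(sorted(board.items(), key=lambda kv: kv[1]["accuracy"], reverse=True))
```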

## ⭐ STRETCH GOALS (if time allows)

These raise your win probability to >90%:

1. **Retro ML Inspector.** Generate synthetic examples showing how each era's model "imagines" the data.
2. **ML Time Machine Slider.** A UI slider that gradually morphs predictions from 1980 → 2000 → 2025.
3. **Gemini Explanation Layer.** Ask Gemini: "Why does an SVM confuse images of cats and dogs?" "Why does Naive Bayes fail on modern slang?" This creates "AI explaining AI," which judges love.
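A rough sketch of the explanation layer using the `google-generativeai` client. The model name, prompt wording, and the `explain_failure` helper are assumptions to illustrate the idea; check the current Gemini API docs before the demo:

```python
# Sketch of the "AI explaining AI" layer via the Gemini API.
# Assumes `pip install google-generativeai` and a GEMINI_API_KEY env var;
# the model name and prompt are placeholders.
import os
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])
model = genai.GenerativeModel("gemini-1.5-flash")

def explain_failure(era_model: str, failure: str) -> str:
    """Ask Gemini to explain, in retro-terminal style, why an old model fails."""
    prompt = (
        f"In two short sentences, explain why a {era_model} "
        f"would struggle with this: {failure}. "
        "Write it like a 1980s computer terminal message."
    )
    return model.generate_content(prompt).text

print(explain_failure("Naive Bayes text classifier", "modern internet slang"))
```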

## About

HackRPI repository for the 11/15-16 project.
