Interactive data science practice system covering pandas, scikit-learn, and deep learning. Generates coding exercises, concept quizzes, and mini-project scaffolds with auto-grading and progress tracking. Use this skill whenever the user wants to practice data science, machine learning, pandas, sklearn, PyTorch, deep learning, or says things like "daily drill", "DS practice", "ML exercise", "quiz me on pandas", "practice sklearn", "DL quiz", "data science training", or "let's study". Also triggers on "show my progress", "what should I review", or "continue my practice".
An interactive, progressive practice system for pandas, scikit-learn, and deep learning.
When the user invokes this skill:

1. Run `python <skill-dir>/scripts/progress.py --summary` to check existing progress. If `~/.datascience-skill/` has not been initialized, run `python <skill-dir>/scripts/init.py` to bootstrap it.
2. Run `python <skill-dir>/scripts/menu.py --show-options` to see the available choices, and ask the user to pick a topic, format, and difficulty.
3. Run `python <skill-dir>/scripts/menu.py --topic <topic> --format <format> --difficulty <difficulty>` to confirm the selection. Important: do NOT run `menu.py` without arguments; it uses interactive `input()`, which does not work in Claude Code.
4. Based on the selection, read the corresponding reference file and generate content:
| Topic | Reference to read |
|---|---|
| pandas | <skill-dir>/references/pandas-guide.md |
| scikit-learn | <skill-dir>/references/sklearn-guide.md |
| deep learning | <skill-dir>/references/deeplearning-guide.md |
| Format | What to do |
|---|---|
| Coding exercise | Generate a coding problem with test cases. User writes code. |
| Concept quiz | Generate 5 multiple-choice or fill-in-the-blank questions. |
| Mini-project | Scaffold a project directory with instructions + starter code. |
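For the coding-exercise format, a generated problem might look like the sketch below. This is hypothetical content, not an item from the skill's question bank, and the exact shape the grader expects is an assumption:

```python
# Hypothetical example of a generated coding exercise (illustrative
# only; not taken from the skill's question bank).
import pandas as pd

# Task: add a `total` column (price * quantity), keep rows where
# total > 100, and return them sorted by total, descending.
def solve(df: pd.DataFrame) -> pd.DataFrame:
    # YOUR CODE HERE  (a reference solution is shown here)
    out = df.assign(total=df["price"] * df["quantity"])
    return out[out["total"] > 100].sort_values("total", ascending=False)

# A test case the grader could check the user's output against:
df = pd.DataFrame({"price": [10, 50, 3], "quantity": [5, 4, 100]})
result = solve(df)
assert list(result["total"]) == [300, 200]
```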
Difficulty mapping:
Adaptive difficulty: Check progress. If the user's avg_score > 0.85 on the selected topic at the selected difficulty, suggest leveling up. If < 0.6, suggest stepping back.
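The leveling rule above can be sketched as follows. The level names and the shape of the progress data are assumptions for illustration; the skill's actual difficulty labels may differ:

```python
# Sketch of the adaptive-difficulty rule. Level names are an
# assumption, not the skill's actual difficulty labels.
LEVELS = ["beginner", "intermediate", "advanced"]

def suggest_level(current: str, avg_score: float) -> str:
    i = LEVELS.index(current)
    if avg_score > 0.85 and i < len(LEVELS) - 1:
        return LEVELS[i + 1]   # doing well: suggest leveling up
    if avg_score < 0.6 and i > 0:
        return LEVELS[i - 1]   # struggling: suggest stepping back
    return current             # otherwise stay at the current level

print(suggest_level("intermediate", 0.9))  # advanced
```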
For coding exercises:

- Present the problem with `# YOUR CODE HERE` placeholders.
- Run `python <skill-dir>/scripts/grader.py` with the user's code and expected output to auto-grade.

For concept quizzes:

For mini-projects:

- Run `python <skill-dir>/scripts/scaffold_project.py --topic <topic> --difficulty <diff>` to generate the project skeleton.

After each session:

- Run `python <skill-dir>/scripts/progress.py --update --topic <topic> --format <format> --difficulty <diff> --score <score> --max-score <max> --weak "<area1>,<area2>"` to persist progress.
- Run `python <skill-dir>/scripts/progress.py --review-queue` to get items due for review, and prioritize these in the next session's exercise selection.
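One way the review queue could be derived from stored sessions is sketched below. The record fields (`topic`, `score`, `max_score`, `weak`, `when`) are assumptions for illustration, not the actual `progress.json` schema:

```python
from datetime import datetime, timedelta

# Illustrative sketch: surface weak areas from past sessions that are
# old enough to be due and scored below a threshold.
def review_queue(sessions, now=None, min_gap_days=2, threshold=0.6):
    now = now or datetime.now()
    due = []
    for s in sessions:
        old_enough = now - s["when"] >= timedelta(days=min_gap_days)
        struggled = s["score"] / s["max_score"] < threshold
        if old_enough and struggled:
            # fall back to the whole topic if no weak areas were recorded
            due.extend(s["weak"] or [s["topic"]])
    return due

sessions = [
    {"topic": "pandas", "score": 4, "max_score": 10,
     "weak": ["groupby"], "when": datetime(2024, 1, 1)},
    {"topic": "sklearn", "score": 9, "max_score": 10,
     "weak": [], "when": datetime(2024, 1, 1)},
]
print(review_queue(sessions))  # ['groupby']
```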
The skill includes a question bank at `<skill-dir>/references/question-bank.json` containing curated exercises. When generating exercises:

- Run `python <skill-dir>/scripts/question_bank.py --topic <topic> --difficulty <diff> --n <count>` to pull candidate questions.
- Track used `question_ids` in the session for deduplication.

The user has existing 500-question notebooks for pandas and scikit-learn. To import them:

```
python <skill-dir>/scripts/import_notebook.py --input pandas_500Q.ipynb --topic pandas --append-to <skill-dir>/references/question-bank.json
python <skill-dir>/scripts/import_notebook.py --input sklearn_500Q.ipynb --topic sklearn --append-to <skill-dir>/references/question-bank.json
```
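The deduplication step could be implemented along these lines; the `id`/`topic`/`difficulty` fields are assumptions about the `question-bank.json` schema, not its confirmed layout:

```python
import random

# Sketch of question sampling with in-session deduplication.
def sample_questions(bank, topic, difficulty, n, seen_ids):
    pool = [q for q in bank
            if q["topic"] == topic
            and q["difficulty"] == difficulty
            and q["id"] not in seen_ids]
    picked = random.sample(pool, min(n, len(pool)))
    seen_ids.update(q["id"] for q in picked)  # remember served questions
    return picked

bank = [{"id": i, "topic": "pandas", "difficulty": "easy"} for i in range(10)]
seen = {0, 1, 2}                      # already served this session
qs = sample_questions(bank, "pandas", "easy", 5, seen)
assert all(q["id"] > 2 for q in qs)   # never repeats a served id
```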
Progress is stored in `~/.datascience-skill/progress.json`.