Analyze, clean up, and organize Apple Photos libraries. Find and report junk photos (screenshots, low-quality, burst leftovers, duplicates), analyze storage usage, generate photo timeline recaps, plan smart exports, analyze Live Photos, check iCloud sync, audit shared libraries, detect similar photos, curate seasonal highlights, and score face quality. All analysis operations are READ-ONLY on the database (safe). macOS only. Requires Python 3.9+ (stdlib only) and access to the Apple Photos SQLite database. Trigger on: Photos cleanup, photo storage, duplicate photos, junk photos, screenshot cleanup, Photos analysis, photo timeline, photo export, Photos library stats, burst cleanup, storage hogs, photo organization, Live Photos, iCloud sync, shared library, similar photos, seasonal highlights, face quality, portraits.
Comprehensive toolkit for analyzing and cleaning up Apple Photos libraries. Goes beyond what Photos.app offers: intelligent junk detection, detailed storage analysis, duplicate finding with quality scoring, timeline recaps for storytelling, and smart export planning.
Apple Photos is great at organizing and syncing photos, but not so great at cleanup. This skill fills that gap.
Safety: All operations are READ-ONLY database queries. No photos are modified or deleted without explicit user action.
Use when users mention any of the trigger phrases listed above. All scripts work standalone. The Photos database is automatically located at:
~/Pictures/Photos Library.photoslibrary/database/Photos.sqlite
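All analysis scripts only ever need read access. With the stdlib, a connection can be forced read-only via a SQLite URI. A minimal sketch (the open_readonly helper is illustrative, not a function the scripts necessarily expose):

```python
import sqlite3
from pathlib import Path

# Default Photos database location (may vary per system/library)
DB_PATH = Path.home() / "Pictures" / "Photos Library.photoslibrary" / "database" / "Photos.sqlite"

def open_readonly(path: Path) -> sqlite3.Connection:
    """Open a SQLite file in read-only mode via a URI; any write attempt raises."""
    return sqlite3.connect(f"file:{path}?mode=ro", uri=True)
```

Opening in mode=ro is what makes the "READ-ONLY (safe)" guarantee enforceable: even a buggy query cannot modify the library.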
Basic workflow:
1. library_analysis.py to get an overview
2. junk_finder.py to identify cleanup candidates
3. duplicate_finder.py to find duplicates

Get comprehensive library statistics: counts, storage, date ranges, people, quality scores.
python3 scripts/library_analysis.py [--human] [--output FILE]
Options:
--human — Human-readable summary instead of JSON
--output FILE — Write JSON to file
--db-path PATH — Custom database path
--library PATH — Custom Photos library path

Example Output:
📊 APPLE PHOTOS LIBRARY ANALYSIS
==================================================
Total Assets: 12,453
Total Storage: 48.3 GB
Average Size: 4.1 MB
Date Range: 2020-01-15 to 2025-03-03
By Type:
Photo: 11,234
Video: 891
Screenshots: 328
Favorites: 456
Bursts: 1,234
By Year:
2025: 1,203 items, 5.2 GB
2024: 3,456 items, 15.1 GB
2023: 2,987 items, 12.4 GB
...
Top People:
Jonah: 3,456 photos
Silas: 3,234 photos
...
Usage in Conversation:
User: "How many photos do I have?"
AI: Runs library_analysis.py with --human flag, reports summary
User: "Show me my Photos storage breakdown"
AI: Runs library_analysis.py, highlights key stats
Identify cleanup candidates: screenshots, low-quality photos, burst leftovers, duplicates.
python3 scripts/junk_finder.py [--screenshot-age DAYS] [--quality-threshold N] [--human]
Options:
--screenshot-age DAYS — Consider screenshots older than N days as junk (default: 30)
--quality-threshold N — Quality score threshold for low-quality (default: 0.3, range: 0.0-1.0)
--no-duplicates — Skip duplicate detection
--human — Human-readable summary
--output FILE — Write JSON to file

Example Output:
🗑️ JUNK FINDER RESULTS
==================================================
Found:
📸 Screenshots: 328
└─ Old (>30 days): 287
📉 Low Quality: 156
📸 Burst Leftovers: 1,089
👥 Possible Duplicates: 45
Estimated Savings:
Conservative: 2.3 GB
(Old screenshots + burst leftovers)
Aggressive: 5.7 GB
(All screenshots + low quality + bursts + ~50% of duplicates)
What It Finds: screenshots (via the ZISDETECTEDSCREENSHOT flag), low-quality photos, burst leftovers, and possible duplicates.

Usage in Conversation:
User: "Find junk in my Photos"
AI: Runs junk_finder.py, reports totals and estimated savings
User: "How many old screenshots do I have?"
AI: Runs junk_finder.py, focuses on screenshot stats
User: "What can I delete to free up 5GB?"
AI: Runs junk_finder.py, shows conservative/aggressive estimates, suggests next steps
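The two savings estimates can be reproduced from per-category byte totals. A sketch of the arithmetic (the function name is hypothetical, not the script's exact code):

```python
def estimate_savings(old_screenshots, all_screenshots, low_quality, bursts, duplicates):
    """All arguments are byte totals per category.
    Conservative = old screenshots + burst leftovers.
    Aggressive = all screenshots + low quality + bursts + ~50% of duplicates."""
    conservative = old_screenshots + bursts
    aggressive = all_screenshots + low_quality + bursts + duplicates // 2
    return conservative, aggressive
```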
Find duplicate photos and recommend which to keep based on quality, favorite status, and file size.
python3 scripts/duplicate_finder.py [--human] [--output FILE]
Detection Methods: Apple's built-in duplicate detection (the ZDUPLICATEASSETVISIBILITYSTATE column), plus matching on timestamp and dimensions.

Recommendation Logic: keeps the best copy in each group based on quality score, favorite status, and file size.
Example Output:
👥 DUPLICATE FINDER RESULTS
==================================================
Found 12 duplicate groups
Total duplicates: 27
Can safely delete: 15
Total size: 156 MB
Potential savings: 89 MB
Sample groups (showing first 5):
Group 1 (apple_builtin):
✓ KEEP ★ IMG_1234.jpg (4.2 MB, Q:0.823)
DELETE IMG_1234-2.jpg (4.1 MB, Q:0.801)
Group 2 (timestamp_dimensions):
✓ KEEP IMG_5678.heic (2.8 MB, Q:0.756)
DELETE IMG_5678-edited.jpg (3.1 MB, Q:0.654)
Usage in Conversation:
User: "Do I have duplicate photos?"
AI: Runs duplicate_finder.py, reports findings
User: "Find duplicates and tell me which to delete"
AI: Runs duplicate_finder.py, explains recommendations
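A plausible keep/delete heuristic over the factors named above (favorite status first, then quality, then file size). This is an illustrative sketch, not necessarily the script's exact ordering:

```python
def pick_keeper(group):
    """Given duplicate dicts with 'favorite', 'quality', and 'size' keys,
    keep the favorite first, then the highest quality, then the largest file."""
    return max(group, key=lambda a: (a["favorite"], a["quality"], a["size"]))
```

Everything else in the group becomes a delete candidate.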
Detailed storage breakdown: by year, type, source, growth trends, file types, storage hogs.
python3 scripts/storage_analyzer.py [--human] [--output FILE]
Analyzes: storage by type, source, and year, growth trends, file types, and the largest individual files.
Example Output:
💾 STORAGE ANALYSIS
==================================================
Total Storage: 48.3 GB
By Type:
Photo: 32.1 GB (66.5%)
11,234 items, avg 2.9 MB
Video: 16.2 GB (33.5%)
891 items, avg 18.7 MB
By Source:
Photos & Videos: 46.1 GB (95.4%)
Screenshots: 2.2 GB (4.6%)
By Year:
2025: 5.2 GB (1,203 items)
2024: 15.1 GB (3,456 items)
2023: 12.4 GB (2,987 items)
...
Top 10 Largest Files:
1. 📹 287 MB - VID_2024_vacation.mov
2. 📹 245 MB - VID_2024_swim_meet.mov
3. 📹 198 MB - VID_2023_birthday.mov
...
Recent Growth (last 12 months):
Total added: 18.7 GB
Average per month: 1.6 GB
Usage in Conversation:
User: "What's taking up space in my Photos?"
AI: Runs storage_analyzer.py, highlights biggest categories
User: "Show me my largest videos"
AI: Runs storage_analyzer.py, focuses on storage_hogs section filtered by videos
User: "How much storage am I adding per month?"
AI: Runs storage_analyzer.py, reports recent growth stats
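Sizes like "48.3 GB" in the reports come from a standard base-1024 formatter; a stdlib-only sketch (helper name is illustrative):

```python
def human_size(n_bytes: float) -> str:
    """Format a byte count in the style of the reports above (base 1024)."""
    n = float(n_bytes)
    for unit in ("B", "KB", "MB", "GB"):
        if n < 1024:
            return f"{n:.1f} {unit}"
        n /= 1024
    return f"{n:.1f} TB"
```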
Generate narrative summaries of photo activity for any date range. Groups photos into events and includes context: people, locations, scenes.
python3 scripts/timeline_recap.py --start-date YYYY-MM-DD [--end-date YYYY-MM-DD] [--narrative]
Options:
--start-date — Start date (required)
--end-date — End date (optional, defaults to today)
--cluster-hours N — Hours between photos to consider separate events (default: 4)
--narrative — Output narrative text instead of JSON
--output FILE — Write to file

What It Generates: photos grouped into events, with people, location, and scene context attached.
Example Output:
📅 PHOTO TIMELINE RECAP
==================================================
Period: 2025-03-01 to 2025-03-07
Total: 156 photos across 5 days
Events: 12
📆 2025-03-01 (Saturday) - 45 photos
🕐 09:15 (2h 15m)
32 photos, 2 videos ⭐ 5 favorites
👥 Jonah, Silas
🏷️ swimming, pool, sports
📍 41.5369, -90.5776
🕐 18:30 (45m)
13 photos
👥 Jonah, Silas
🏷️ dinner, food, family
Usage in Conversation:
User: "What did I do last week?"
AI: Runs timeline_recap.py with last week's dates, narrates the timeline in story form
User: "Show me my photo activity for February"
AI: Runs timeline_recap.py with Feb 1 - Feb 28, summarizes highlights
User: "Tell me about our vacation photos from August"
AI: Runs timeline_recap.py with August dates, creates a narrative story
AI Tip: When presenting timeline results, narrate them like a story! Don't just dump the JSON. Example:
"You had a busy Saturday on March 1st! In the morning around 9:15, you spent about 2 hours at the pool — 32 photos with Jonah and Silas, mostly swimming and sports shots. You marked 5 as favorites. Then in the evening around 6:30, you captured a family dinner with 13 photos. Looks like a great day!"
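The event grouping that --cluster-hours controls can be sketched as simple time-gap clustering (a hypothetical helper, not the script's exact code):

```python
from datetime import datetime, timedelta

def cluster_events(timestamps, gap_hours=4):
    """Group capture times into events: a gap larger than gap_hours
    between consecutive photos starts a new event (mirrors --cluster-hours)."""
    events, current = [], []
    for ts in sorted(timestamps):
        if current and ts - current[-1] > timedelta(hours=gap_hours):
            events.append(current)
            current = []
        current.append(ts)
    if current:
        events.append(current)
    return events
```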
Plan organized exports by year/month, person, album, or location. Shows what will be exported without actually doing it (unless confirmed).
python3 scripts/smart_export.py --output-dir PATH [--organize-by MODE] [--plan-only]
Options:
--output-dir PATH — Where to export (required)
--organize-by MODE — How to organize: year_month, person, album, location (default: year_month)
--favorites — Export only favorites
--start-date YYYY-MM-DD — Filter by start date
--end-date YYYY-MM-DD — Filter by end date
--person NAME — Export only photos with this person
--album NAME — Export only from this album
--plan-only — Show plan without exporting (recommended first step)

Example Output:
📤 EXPORT PLAN
==================================================
Organization: year_month
Total photos: 3,456
Total size: 15.2 GB
Folders: 36
Folders:
2025/01-January/
123 items, 542 MB
2025/02-February/
156 items, 687 MB
2025/03-March/
89 items, 398 MB
...
Note: Actual export via AppleScript is not fully implemented. This command generates the export plan and folder structure. For now, use this to identify what to export, then do it manually in Photos.app.
Usage in Conversation:
User: "I want to export all my 2024 photos organized by month"
AI: Runs smart_export.py with year filter and --plan-only, shows the plan
User: "Export all photos with Jonah"
AI: Runs smart_export.py with --person "Jonah" and --plan-only, shows what would be exported
Surface your highest-quality photos using Apple's computed quality scores. Find hidden gems — great photos you never favorited.
python3 scripts/best_photos.py [--min-quality N] [--top N] [--hidden-gems] [--human]
Options:
--min-quality N — Minimum quality score threshold (default: 0.7, range: 0.0-1.0)
--top N — Number of top photos to return (default: 50)
--hidden-gems — Only show photos that are NOT favorited
--year YYYY — Filter to specific year
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: top-quality photos, hidden gems (great shots never favorited), and the library's quality distribution.
Example Output:
⭐ BEST PHOTOS / HIDDEN GEMS
==================================================
Photos with quality scores: 11,234
Above threshold (0.7): 2,456
Hidden gems (great but not favorited): 2,100
Already favorited high-quality: 356
Quality Distribution:
🌟 Excellent (≥0.85): 456
✅ Good (≥0.70): 2,000
📊 Average (≥0.50): 5,234
📉 Below avg (≥0.30): 2,544
❌ Poor (<0.30): 1,000
Top 20 Photos:
1. IMG_1234.jpg
Q:0.952 | 4.2 MB | 4032x3024 💎
📐 composition:0.95, lighting:0.92, symmetry:0.88
Usage in Conversation:
User: "Show me my best photos"
AI: Runs best_photos.py with --human flag, highlights top shots
User: "Find hidden gems I haven't favorited"
AI: Runs best_photos.py with --hidden-gems, suggests which to favorite
User: "What are my best photos from 2025?"
AI: Runs best_photos.py with --year 2025, shows quality distribution and top picks
Deep analysis of people detected in your photos: who appears most, who's photographed together, trends over time, best photo of each person.
python3 scripts/people_analyzer.py [--min-photos N] [--top N] [--human]
Options:
--min-photos N — Minimum photos to include a person (default: 5)
--top N — Number of top people to analyze in detail (default: 20)
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: who appears most, who is photographed together, trends over time, and the best photo of each person.
Example Output:
👥 PEOPLE ANALYZER
==================================================
Named people (≥5 photos): 15
Photos with unnamed faces: 1,234
Top People:
Jonah: 3,456 photos (★ 89)
📅 2023:890, 2024:1,200, 2025:1,366
Silas: 3,234 photos (★ 76)
📅 2023:845, 2024:1,100, 2025:1,289
Frequently Photographed Together:
Jonah + Silas: 2,100 photos
Jonah + Mom: 456 photos
Usage in Conversation:
User: "Who's in my photos the most?"
AI: Runs people_analyzer.py, reports top people with counts
User: "Who do I photograph together?"
AI: Runs people_analyzer.py, focuses on co-occurrence analysis
Analyze where your photos were taken. Clusters GPS coordinates into locations, identifies trips, and shows most-photographed places.
python3 scripts/location_mapper.py [--radius N] [--year YYYY] [--human]
Options:
--radius N — Cluster radius in km (default: 1.0)
--year YYYY — Filter to specific year
--min-photos N — Minimum photos per location cluster (default: 3)
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: GPS coverage, clustered locations, and possible trips.
Example Output:
📍 LOCATION / TRAVEL MAPPER
==================================================
Photos with GPS: 8,456 (67.9%)
Without GPS: 3,997
Unique locations: 45
Possible trips: 12
Top Locations:
1. (41.5369, -90.5776)
1,234 photos ⭐12 🧳 | 5.2 GB
📅 2020-01-15 → 2025-03-03
👥 Jonah, Silas
Usage in Conversation:
User: "Where have I taken the most photos?"
AI: Runs location_mapper.py, reports top locations
User: "Show me my trips from 2025"
AI: Runs location_mapper.py with --year 2025, highlights identified trips
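Clustering GPS points within a --radius in km presumably relies on great-circle distance; the standard haversine formula is a reasonable sketch (the script's exact method may differ):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two GPS coordinates."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))
```

Two photos belong to the same location cluster when this distance is at most the chosen radius.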
Search photos by ML-detected scene classifications (beach, sunset, dog, food, etc.) or generate a complete content inventory.
python3 scripts/scene_search.py [--search TERM] [--min-confidence N] [--human]
Options:
--search TERM — Scene name to search for (omit for content inventory)
--min-confidence N — Minimum confidence score (default: 0.0)
--top N — Number of search results (default: 50)
--year YYYY — Filter to specific year
--human — Human-readable summary
--output FILE — Write JSON to file

Modes:
Search (e.g. --search beach) — Find all photos matching a scene label
Inventory (no --search) — Generate a complete content inventory by category

Example Output (inventory):
🏷️ SCENE / CONTENT SEARCH
==================================================
Unique scene labels: 234
Scene-tagged entries: 45,678
Library total: 12,453
By Category:
📂 Nature Outdoor (5,678 photos)
beach: 234
sunset: 189
mountain: 156
📂 Animals (2,345 photos)
dog: 1,234
cat: 567
📂 Food Drink (1,890 photos)
food: 890
coffee: 234
Usage in Conversation:
User: "How many beach photos do I have?"
AI: Runs scene_search.py --search beach, reports count and related scenes
User: "What kinds of photos do I take?"
AI: Runs scene_search.py (inventory mode), summarizes categories
Behavioral analytics: when you shoot most, busiest days, seasonal patterns, streaks, photo vs video ratio trends.
python3 scripts/photo_habits.py [--year YYYY] [--human]
Options:
--year YYYY — Filter to specific year
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: peak shooting times, busiest days, seasonal patterns, streaks, and photo vs video trends.
Example Output:
📊 PHOTO HABITS & INSIGHTS
==================================================
Total: 11,234 photos, 891 videos, 328 screenshots
Average per active day: 8.3
⏰ When You Shoot:
Peak hour: 14:00
Peak day: Saturday
Peak month: Jul
Time of Day:
Morning (6am-12pm): 3,456 (28.1%) █████████
Afternoon (12pm-6pm): 5,123 (41.6%) █████████████
Evening (6pm-12am): 2,987 (24.3%) ████████
Night (12am-6am): 737 (6.0%) ██
🔥 Streaks:
Longest streak: 45 consecutive days
(2024-06-15 → 2024-07-29)
Usage in Conversation:
User: "What are my photo-taking patterns?"
AI: Runs photo_habits.py, narrates key insights
User: "When do I take the most photos?"
AI: Runs photo_habits.py, highlights peak times and days
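The time-of-day breakdown above follows fixed hour ranges; a sketch of the bucketing (illustrative helper, matching the ranges shown in the example output):

```python
def time_bucket(hour: int) -> str:
    """Assign an hour (0-23) to the buckets used in the habits report."""
    if 6 <= hour < 12:
        return "morning"    # 6am-12pm
    if 12 <= hour < 18:
        return "afternoon"  # 12pm-6pm
    if 18 <= hour < 24:
        return "evening"    # 6pm-12am
    return "night"          # 12am-6am
```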
See what you photographed on today's date in previous years. Includes people, scenes, and quality context.
python3 scripts/on_this_day.py [--date YYYY-MM-DD] [--window N] [--human]
Options:
--date YYYY-MM-DD — Target date (defaults to today)
--window N — Include photos ±N days around target (default: 0)
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: photos from the same date in previous years, with people, scene, and quality context.
Example Output:
📅 ON THIS DAY
==================================================
Date: March 3
Photos found: 45 across 4 years
📆 2025 (1 year ago)
12 photos, ⭐ 3
👥 Jonah, Silas
🏷️ swimming, pool
📸 Best: IMG_1234.jpg Q:0.89
📆 2024 (2 years ago)
8 photos
👥 Jonah
🏷️ park, outdoor
Usage in Conversation:
User: "What did I do on this day in past years?"
AI: Runs on_this_day.py, narrates memories year by year
User: "Show me memories from March 3"
AI: Runs on_this_day.py --date 2026-03-03, tells the story
AI Tip: Narrate memories warmly! "2 years ago today, you were at the pool with Jonah and Silas — you took 12 photos and favorited 3 of them. The best shot has a quality score of 0.89!"
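Matching "this day" across years reduces to comparing month/day with an optional ±window. A simplified sketch that ignores Feb 29 and year-boundary edge cases (hypothetical helper, not the script's exact logic):

```python
from datetime import date

def on_this_day(dates, target, window=0):
    """Keep dates whose month/day falls within ±window days of the
    target's month/day, regardless of year (mirrors --window)."""
    hits = []
    for d in dates:
        shifted = d.replace(year=target.year)  # compare within the same year
        if abs((shifted - target).days) <= window:
            hits.append(d)
    return hits
```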
Find orphan photos (not in any album), empty albums, tiny albums, and overlapping albums.
python3 scripts/album_auditor.py [--human]
Options:
--human — Human-readable summary
--output FILE — Write JSON to file

What It Shows: orphan photos, empty and tiny albums, and overlapping albums.
Example Output:
📁 ALBUM AUDITOR
==================================================
Total albums: 45
Albums with photos: 38
Empty albums: 4
Tiny albums (≤3 photos): 7
Photos in albums: 8,234 / 12,453
Orphan photos (no album): 4,219
📭 Orphan Photos:
4,219 photos not in any album
Total size: 18.3 GB
🗑️ Empty Albums (4):
• Old Vacation
• Test Album
🔄 Album Overlaps:
"Vacation 2024" ∩ "Summer 2024"
45 shared (12.3% / 8.9%)
Usage in Conversation:
User: "Are my photos well organized?"
AI: Runs album_auditor.py, reports orphans, empty albums, and overlap
User: "How many photos aren't in any album?"
AI: Runs album_auditor.py, reports orphan count and size
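The overlap line (e.g. "45 shared (12.3% / 8.9%)") is a plain set intersection over each album's asset IDs, with the shared count expressed as a percentage of each album. A sketch:

```python
def album_overlap(album_a: set, album_b: set):
    """Shared asset count, plus that count as a percentage of each album's size."""
    shared = len(album_a & album_b)
    return shared, 100.0 * shared / len(album_a), 100.0 * shared / len(album_b)
```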
Actually move junk photos to trash via AppleScript. Supports old screenshots, burst leftovers, low quality, and duplicates. All items go to Recently Deleted (recoverable for 30 days).
python3 scripts/cleanup_executor.py --category CATEGORY [--execute] [--human]
Options:
--category — What to clean up: old_screenshots, all_screenshots, burst_leftovers, low_quality, duplicates
--screenshot-age N — Screenshot age in days for old_screenshots (default: 30)
--quality-threshold N — Quality threshold for low_quality (default: 0.3)
--limit N — Maximum items to process (default: 500)
--execute — Actually perform the cleanup (without this, preview only)
--batch-size N — Items per AppleScript batch (default: 50)
--human — Human-readable summary

Safety:
Without --execute, shows preview only (dry run)
With --execute, requires typing 'yes' to confirm

Example Usage:
# Preview what would be cleaned
python3 scripts/cleanup_executor.py --category old_screenshots --human
# Actually clean up (with confirmation prompt)
python3 scripts/cleanup_executor.py --category old_screenshots --execute
# Clean burst leftovers
python3 scripts/cleanup_executor.py --category burst_leftovers --execute
Usage in Conversation:
User: "Delete my old screenshots"
AI: Runs cleanup_executor.py --category old_screenshots --human first to show preview, then prompts user before running with --execute
User: "Clean up burst photos"
AI: Runs cleanup_executor.py --category burst_leftovers --human, shows count and size, asks for confirmation
⚠️ AI Tip: Always show the preview first! Never run with --execute without showing the user what will be affected and getting confirmation.
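--batch-size exists because passing very long item lists to a single AppleScript call is fragile; splitting into fixed-size chunks keeps each call small. The chunking itself is straightforward (illustrative helper):

```python
def batches(items, size=50):
    """Yield successive fixed-size chunks of items (mirrors --batch-size)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]
```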
Analyze Live Photos vs still photos: identify Live Photos, compare storage impact, find Live Photos that could be converted to stills to save space.
python3 scripts/live_photo_analyzer.py [--human] [--year YYYY] [--output FILE]
Options:
--year YYYY — Filter to specific year
--human — Human-readable summary
--output FILE — Write JSON to file

What it reports: Live Photo counts, storage impact versus stills, and candidates for conversion to stills.
Analyze the Shared Library in Apple Photos: personal vs shared content, contributor breakdown, storage impact.
python3 scripts/shared_library.py [--human] [--output FILE]
What it reports: personal vs shared content, contributor breakdown, and storage impact.
Note: Requires macOS 13+ / iOS 16+ database format for Shared Library columns.
Check iCloud sync coverage across the library: synced vs local-only, download status, large unsynced items.
python3 scripts/icloud_status.py [--human] [--output FILE]
What it reports: synced vs local-only counts, download status, and large unsynced items.
Find visually similar photos beyond exact duplicates using computed quality feature vectors (composition, lighting, color, patterns, etc.).
python3 scripts/similarity_finder.py [--threshold 0.95] [--year YYYY] [--limit 500] [--human]
Options:
--threshold — Cosine similarity threshold 0-1 (default: 0.95, very similar)
--year YYYY — Filter to specific year
--limit N — Max photos to compare (default: 500, controls runtime)
--human — Human-readable summary

What it reports: groups of visually similar photos with their similarity scores.
Runtime note: O(n²) comparison; use --limit to control runtime for large libraries.
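Cosine similarity over feature vectors, with the brute-force O(n²) pass that --limit bounds, can be sketched as follows (illustrative, stdlib only):

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb) if na and nb else 0.0

def similar_pairs(vectors, threshold=0.95):
    """Brute-force all-pairs comparison; --limit caps len(vectors)
    to keep the O(n^2) loop tractable."""
    pairs = []
    for i in range(len(vectors)):
        for j in range(i + 1, len(vectors)):
            if cosine_similarity(vectors[i], vectors[j]) >= threshold:
                pairs.append((i, j))
    return pairs
```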
Curate the best photos from each season using quality scores, favorites, and scene context.
python3 scripts/seasonal_highlights.py [--year YYYY] [--top 20] [--southern] [--human]
Options:
--year YYYY — Filter to specific year
--top N — Top N photos per season (default: 20)
--southern — Use Southern Hemisphere season definitions
--human — Human-readable summary

What it reports: the top photos for each season, with quality, favorite, and scene context.
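The --southern flag only requires shifting the month-to-season mapping by half a year. A sketch assuming meteorological season boundaries (the script's boundaries may differ):

```python
def season(month: int, southern: bool = False) -> str:
    """Map a month (1-12) to a meteorological season; --southern shifts six months."""
    if southern:
        month = (month + 5) % 12 + 1  # e.g. January -> July's season
    return {12: "winter", 1: "winter", 2: "winter",
            3: "spring", 4: "spring", 5: "spring",
            6: "summer", 7: "summer", 8: "summer",
            9: "autumn", 10: "autumn", 11: "autumn"}[month]
```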
Score face quality per person using Apple Photos' face detection attributes: quality measure, blur, yaw angle, smile, face size, and center position.
python3 scripts/face_quality.py [--person NAME] [--top 10] [--human]
Options:
--person NAME — Filter to specific person name
--top N — Top N best/worst per person (default: 10)
--human — Human-readable summary

What it reports: best and worst portraits per person, with a per-face quality score.
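A per-face score might blend the attributes named above into one number; the weights below are purely illustrative, not Apple's:

```python
def face_score(quality, blur, yaw, smile, size, centered):
    """Blend normalized face attributes into one 0-1 score.
    All inputs are 0-1; blur and |yaw| count against the score.
    Weights are illustrative assumptions, not Apple's."""
    return (0.35 * quality + 0.20 * (1 - blur) + 0.15 * (1 - abs(yaw))
            + 0.10 * smile + 0.10 * size + 0.10 * centered)
```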
Detailed schema documentation, including the key tables, is in references/database-schema.md.
Important: Core Data timestamps are seconds since 2001-01-01, not Unix epoch.
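Converting those timestamps is a one-liner once the epoch offset is known (ZASSET.ZDATECREATED is the typical column; see the schema reference):

```python
from datetime import datetime, timedelta, timezone

# Core Data epoch: 2001-01-01 00:00:00 UTC (978307200 s after the Unix epoch)
APPLE_EPOCH = datetime(2001, 1, 1, tzinfo=timezone.utc)

def apple_to_datetime(seconds: float) -> datetime:
    """Convert a Core Data timestamp to a timezone-aware datetime."""
    return APPLE_EPOCH + timedelta(seconds=seconds)
```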
# Get overview
python3 scripts/library_analysis.py --human
# Find junk
python3 scripts/junk_finder.py --human
# Review and manually clean up in Photos.app
# Analyze storage
python3 scripts/storage_analyzer.py --human
# Find duplicates
python3 scripts/duplicate_finder.py --human
# Find junk with aggressive settings
python3 scripts/junk_finder.py --screenshot-age 14 --quality-threshold 0.4 --human
# Use findings to guide cleanup
# Generate timeline for the year
python3 scripts/timeline_recap.py --start-date 2024-01-01 --end-date 2024-12-31 --narrative
# Get storage stats by year
python3 scripts/storage_analyzer.py | jq '.by_year'
# Get top people for the year
python3 scripts/library_analysis.py | jq '.top_people'
# Photo habits for the year
python3 scripts/photo_habits.py --year 2024 --human
# Best photos of the year
python3 scripts/best_photos.py --year 2024 --top 20 --human
# Plan export
python3 scripts/smart_export.py --output-dir ~/Desktop/Photos-Export --favorites --start-date 2024-01-01 --plan-only
# Review plan, then execute (manual for now)
# Find all junk
python3 scripts/junk_finder.py --human
# Preview old screenshots
python3 scripts/cleanup_executor.py --category old_screenshots --human
# Execute cleanup (with confirmation)
python3 scripts/cleanup_executor.py --category old_screenshots --execute
# Clean burst leftovers
python3 scripts/cleanup_executor.py --category burst_leftovers --execute
# Check album health
python3 scripts/album_auditor.py --human
# See what happened on this day in past years
python3 scripts/on_this_day.py --human
# With a wider window
python3 scripts/on_this_day.py --window 2 --human
# See all your locations
python3 scripts/location_mapper.py --human
# Focus on trips from a specific year
python3 scripts/location_mapper.py --year 2025 --human
# What content you shot at those places
python3 scripts/scene_search.py --human
# Who's in your photos
python3 scripts/people_analyzer.py --human
# Find hidden gems of specific people
python3 scripts/best_photos.py --hidden-gems --human
# Best and worst portraits per person
python3 scripts/face_quality.py --human
# Analyze Live Photos vs stills
python3 scripts/live_photo_analyzer.py --human
# Find similar photos (potential duplicates)
python3 scripts/similarity_finder.py --threshold 0.95 --human
# Check iCloud sync status
python3 scripts/icloud_status.py --human
# Get seasonal highlights
python3 scripts/seasonal_highlights.py --year 2024 --human
# Location review with place names
python3 scripts/location_mapper.py --year 2024 --human
# Photo habits for the year
python3 scripts/photo_habits.py --year 2024 --human
# Check shared library status
python3 scripts/shared_library.py --human
# See who contributed what and storage impact
python3 scripts/shared_library.py --output shared_report.json
1. library_analysis.py to understand the library
2. junk_finder.py to quantify cleanup opportunities

JSON output: Default for programmatic use, includes full details
Human output: Use --human flag for readable summaries
In conversation: Synthesize the data into natural language, don't just read the output
The Photos database lives at ~/Pictures/Photos Library.photoslibrary/database/Photos.sqlite; use --limit where available to bound runtime on large libraries.

Troubleshooting:

"Database not found"
→ Specify path with --library ~/Path/To/Photos Library.photoslibrary
"Permission denied"
→ Close Photos.app first, or run script while Photos.app is open (read-only is safe)
"No quality scores"
→ Not all photos have computed quality attributes; scripts handle NULLs gracefully
"Results don't match Photos.app counts"
→ Scripts exclude trashed items; Photos.app may show different views
All previously planned features have been implemented! Further expansions are possible.
Bottom line: This skill gives you X-ray vision into your Photos library. Use it to understand what you have, find what you don't need, and make cleanup decisions with confidence.