Provide first-turn understanding of the ai-memory repository with minimal tokens. Always use this skill when the user asks what this project does, asks for implementation entrypoints, requests an architecture explanation, or when a subagent needs a narrow read plan for this repository.
Use this skill as the first project-level semantic index before broad file reads.
When the user asks for a quick intro, answer with this file map:

- `ai-memory`: command router, context rendering, inject modes
- `lib/learn.sh`: memory update pipeline from LLM output
- `docs/core-idea.md`: architecture principles
- `note.md`: implementation progress and roadmap
- `.github/copilot-instructions.md`: Copilot routing instructions generated by inject

For an architecture explanation, read `docs/core-idea.md` then `ai-memory`. For implementation entrypoints, start with `ai-memory` first, then specific module files. The memory update pipeline is `lib/learn.sh` and related helper functions in `ai-memory`. For token auditing, see `.github/skills/memory-token-audit/SKILL.md`.
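A subagent's narrow read plan can be sketched as a small shell loop. The paths come from the file map above; the loop, the existence guard, and the 40-line cap are illustrative assumptions, not repository tooling:

```shell
# Minimal-token skim of the entrypoints this skill names.
# Read order: architecture doc first, then the router, then the update pipeline.
for f in docs/core-idea.md ai-memory lib/learn.sh; do
  # Existence guard: skip paths that are absent in the current checkout.
  [ -f "$f" ] && head -n 40 "$f"   # cap each skim at 40 lines
done
echo "skim complete"
```

Broader reads should happen only after this pass, and only into the specific module files the skim points at.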