PRIMARY research source. NotebookLM holds the project's curated knowledge base. Query-only — never creates notebooks or writes artifacts. For any question that might live in a curated notebook, NotebookLM is the FIRST choice, BEFORE local ripgrep and BEFORE Context7. Internet is never used unless the user explicitly requests it.
Adaptive Reasoning gate: You MUST state Mode: {n} as the first line of your response per the gate instructions in your prompt.
NotebookLM is the primary external research source for this project. It holds curated notebooks hand-selected by the team — architecture decisions, onboarding guides, vendor playbooks, Odoo-upgrade notes, and anything else the team decided deserved a durable home.
In V3.1, this skill was promoted from "consulted sometimes" to primary research authority. The research priority order is:
1. NotebookLM (this skill)
2. Local code and docs (ripgrep)
3. Context7
4. Internet (only on explicit user request)
See _shared/research-routing.md for the full routing table.
Topic-key format: `notebooklm/{notebook}/{topic}`. Any attempt to create artifacts must be redirected — tell the user to go to the NotebookLM web app. This skill is read-only on content; it may only edit metadata.
mem_search(query: "notebooklm/{topic-guess}", project: "{project}")
→ if found, read the observation
→ if the finding is fresh (< 7 days old for living docs, < 30 days for frozen), use it
→ if stale or missing, proceed to step 2
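The freshness check in step 1 can be sketched as follows (a minimal helper, assuming the cached observation carries a timezone-aware save timestamp and a living/frozen classification; the real Engram schema may differ):

```python
from datetime import datetime, timedelta, timezone

# Freshness windows from this skill: 7 days for living docs, 30 for frozen.
FRESHNESS = {"living": timedelta(days=7), "frozen": timedelta(days=30)}

def is_fresh(saved_at: datetime, doc_kind: str = "living") -> bool:
    """Return True if a cached Engram observation is still usable."""
    age = datetime.now(timezone.utc) - saved_at
    return age <= FRESHNESS.get(doc_kind, FRESHNESS["living"])
```

A 10-day-old finding is stale for a living doc but still fresh for a frozen one.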
notebooklm_list_notebooks()
→ returns array of {id, title, description, tags}
→ pick the best match by title/tags/description
If none match the topic, fall through to Step 2 of research routing (local code + docs).
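The "pick the best match" step could be implemented as simple keyword-overlap scoring (a hypothetical heuristic; the actual skill may match however it likes — the important property is returning no notebook when nothing matches):

```python
def pick_notebook(notebooks, topic):
    """Score each notebook by keyword overlap between the topic and its
    title/description/tags; return None when nothing matches at all."""
    words = set(topic.lower().split())
    best, best_score = None, 0
    for nb in notebooks:
        haystack = " ".join(
            [nb["title"], nb["description"], *nb.get("tags", [])]
        ).lower()
        score = sum(1 for w in words if w in haystack)
        if score > best_score:
            best, best_score = nb, score
    return best  # None → fall through to local research
```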
notebooklm_query(
notebook_id: "{id}",
query: "{user question, verbatim or paraphrased}"
)
→ returns answer + source citations
mem_save(
title: "notebooklm/{notebook}/{topic}",
topic_key: "notebooklm/{notebook}/{topic}",
type: "external-research",
project: "{project}",
content: "Q: {question}\nA: {answer}\nSources: {citations}\nNotebook: {notebook-id}\nDate: {iso-date}"
)
Next session asking the same question hits the Engram cache instead of re-querying NotebookLM.
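The whole cache-first workflow above can be sketched end to end. The `mem_search`, `nlm_query`, and `mem_save` callables are stand-ins for the real MCP tools (injected here so the sketch is self-contained); the envelope shape mirrors the `cached_hit` flag defined later in this skill:

```python
def research(topic, notebook_id, question,
             mem_search, nlm_query, mem_save, today="1970-01-01"):
    """Cache-first lookup: serve from Engram when possible,
    otherwise query NotebookLM and persist the finding."""
    key = f"notebooklm/{notebook_id}/{topic}"
    cached = mem_search(key)
    if cached is not None:
        return {"answer": cached, "cached_hit": True}
    answer = nlm_query(notebook_id, question)
    mem_save(key, f"Q: {question}\nA: {answer}\nDate: {today}")
    return {"answer": answer, "cached_hit": False}
```

On the second call with the same topic, the Engram lookup short-circuits the NotebookLM query entirely.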
When the active overlay is Odoo (detected via .atl/overlays/odoo-*/), add a code-first constraint to every NotebookLM query:
Query prefix: "Answer with code-first examples from the Odoo source. Quote model
names, field names, and decorators verbatim. Cite the file path (addons/x/models/y.py)
when possible."
This prevents NotebookLM from returning marketing-style prose when the user wants code.
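Applying the overlay constraint is mechanical; a minimal sketch (the overlay-detection string check is an assumption based on the `.atl/overlays/odoo-*/` convention above):

```python
ODOO_PREFIX = (
    "Answer with code-first examples from the Odoo source. Quote model "
    "names, field names, and decorators verbatim. Cite the file path "
    "(addons/x/models/y.py) when possible."
)

def build_query(question, overlay=None):
    """Prepend the code-first constraint when the Odoo overlay is active."""
    if overlay is not None and overlay.startswith("odoo-"):
        return f"{ODOO_PREFIX}\n\n{question}"
    return question
```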
For questions about local code, use ripgrep, not NotebookLM. Do NOT pad NotebookLM queries with filler in hopes of getting a hit. If the topic doesn't match, move on.
Every NotebookLM response gets persisted in Engram. The topic-key format is:
notebooklm/{notebook-slug}/{topic-slug}
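Producing those slugs can be sketched with a small normalizer (a common slugify pattern, assumed here rather than specified by the skill):

```python
import re

def slugify(text: str) -> str:
    """Lowercase; collapse runs of non-alphanumerics into single hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def topic_key(notebook_title: str, topic: str) -> str:
    """Build the Engram topic key in the notebooklm/{nb}/{topic} format."""
    return f"notebooklm/{slugify(notebook_title)}/{slugify(topic)}"
```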
Before calling NotebookLM, ALWAYS check Engram first with mem_search. The orchestrator pays for NotebookLM calls; Engram is free.
Staleness rules: a cached finding is fresh for 7 days (living docs) or 30 days (frozen docs), as in step 1 of the workflow above. Each NotebookLM lookup returns a structured envelope to the orchestrator:
{
"source": "notebooklm",
"notebook_id": "...",
"query": "...",
"answer_summary": "2-3 sentences",
"citation_count": N,
"engram_key": "notebooklm/{nb}/{topic}",
"cached_hit": true|false // was it served from Engram?
}
The orchestrator uses cached_hit to compute the token-savings banner (see Phase E of V3.1 plan).
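The banner computation amounts to counting `cached_hit` envelopes; a sketch, where the per-call token cost is a purely illustrative assumption (the V3.1 plan's actual accounting may differ):

```python
def savings_banner(envelopes, tokens_per_call=1500):
    """Estimate tokens saved: each cached hit avoided one NotebookLM round-trip.
    tokens_per_call is a placeholder figure, not a measured cost."""
    hits = sum(1 for e in envelopes if e.get("cached_hit"))
    return f"{hits} cached hit(s), ~{hits * tokens_per_call} tokens saved"
```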
Failure modes:
- `{source: "notebooklm", status: "unavailable"}` — orchestrator falls through to local research.
- `{source: "notebooklm", status: "no-match"}` — orchestrator falls through.
- `{source: "notebooklm", status: "rate-limited", retry_after_s: N}` — orchestrator waits or falls through.
Related skills:
- `_shared/research-routing.md` — the 4-step priority order
- `mcp-context7-skill/SKILL.md` — tertiary research, defers to this skill
- `ripgrep/SKILL.md` — secondary (local) research
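The orchestrator's handling of those failure envelopes can be sketched as a small dispatcher (the 30-second retry ceiling is an assumption; the skill only says "waits or falls through"):

```python
def handle_failure(resp, retry, fallback_local, wait, max_wait_s=30):
    """Route failure envelopes: short rate limits wait and retry,
    everything else falls through to local research."""
    if resp.get("status") == "rate-limited" and resp.get("retry_after_s", 0) <= max_wait_s:
        wait(resp["retry_after_s"])
        return retry()
    return fallback_local()  # "unavailable", "no-match", or a long rate limit
```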