Guides users through a structured workflow for co-authoring documents such as proposals, tenders, technical specs, decision docs, or similar structured content. Helps users efficiently transfer context, refine content through iteration, and verify the document works for readers. Trigger when the user mentions writing docs, creating proposals, drafting specs, preparing tenders, or similar documentation tasks.
This skill provides a structured workflow for guiding users through collaborative document creation. Act as an active guide, walking users through three stages: Context Gathering, Refinement & Structure, and Reader Testing.
Trigger conditions:
Initial offer: Offer the user a structured workflow for co-authoring the document. Explain the three stages: Context Gathering, Refinement & Structure, and Reader Testing.
Explain that this approach helps ensure the doc works well when others read it. Ask if they want to try this workflow or prefer to work freeform.
If user declines, work freeform. If user accepts, proceed to Stage 1.
IMPORTANT: For any document that represents Open Elements externally (proposals, tenders, presentations, cover letters), load and apply the rules from business-communication.md throughout all stages. These rules govern truthfulness, confidentiality, transparency, attribution, tone, and language.
Key rules to keep in mind at all times:
Use the open-elements-info skill as the primary source of truth.

Stage 1: Context Gathering

Goal: Close the gap between what the user knows and what Claude knows, enabling smart guidance later.
Start by asking the user for meta-context about the document:
Inform them they can answer in shorthand or dump information however works best for them.
If user provides a template or reference document:
If the document is a tender response or formal submission:
Once initial questions are answered, encourage the user to dump all the context they have. Request information such as:
Advise them not to worry about organizing it — just get it all out.
During context gathering:
Verify any uncertain facts against the open-elements-info skill before proceeding.

Asking clarifying questions:
When user signals they have done their initial dump, ask clarifying questions to ensure understanding:
Generate 5-10 numbered questions based on gaps in the context.
Inform them they can use shorthand to answer (e.g., "1: yes, 2: no because backwards compat, 3: see attached doc").
Exit condition: Sufficient context has been gathered when questions show understanding — when edge cases and trade-offs can be asked about without needing basics explained.
Transition: Ask if there is any more context they want to provide at this stage, or if it is time to move on to drafting the document.
Stage 2: Refinement & Structure

Goal: Build the document section by section through brainstorming, curation, and iterative refinement.
Instructions to user: Explain that the document will be built section by section. For each section:
Start with whichever section has the most unknowns, then work through the rest.
Section ordering:
If the document structure is clear (e.g., from a template or tender requirements): Ask which section they would like to start with. Suggest starting with whichever section has the most unknowns.
If user does not know what sections they need: Based on the type of document, suggest 3-5 sections appropriate for the doc type. Ask if this structure works or if they want to adjust it.
Once structure is agreed:
Create the document as a markdown file in the working directory. Name it appropriately (e.g., decision-doc.md, technical-spec.md, tender-response.md).
Create the file with all section headers and brief placeholder text like "[To be written]".
Confirm the file has been created and indicate it is time to fill in each section.
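As a minimal sketch of the scaffold step (the file name, title, and section names below are illustrative examples, not prescribed values):

```python
# Create the document with all section headers and placeholder text.
# File name, title, and sections are examples only.
sections = ["Context", "Options Considered", "Recommendation"]
with open("decision-doc.md", "w") as f:
    f.write("# Decision: Choose a Deployment Platform\n\n")
    for name in sections:
        f.write(f"## {name}\n\n[To be written]\n\n")
```

Every section gets the same placeholder, so later edits can target one placeholder per section unambiguously.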
For each section:
Announce work will begin on the section. Ask 5-10 clarifying questions about what should be included.
Brainstorm 5-20 things that might be included, depending on the section's complexity. Look for:
Ask which points should be kept, removed, or combined. Request brief justifications to learn the user's priorities for later sections.
If user gives freeform feedback instead of numbered selections, extract their preferences and proceed.
Based on what they have selected, ask if there is anything important missing for this section.
For tender responses: Check the selected content against the evaluation criteria. Flag any criteria that are not yet addressed and ask the user how to handle them.
Use str_replace to replace the placeholder text with the actual drafted content.
After drafting, confirm completion and ask the user to read through it and indicate what to change.
Key instruction for user (include when drafting the first section): Instead of editing the doc directly, ask them to indicate what to change. This helps Claude learn their style for future sections.
As user provides feedback:
Use str_replace to make edits (never reprint the whole doc).
Continue iterating until the user is satisfied with the section.
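The edit semantics assumed here are those of a targeted, unique-match string replacement; a rough Python sketch of that behavior (not the actual tool implementation):

```python
def str_replace(path, old, new):
    """Replace exactly one occurrence of `old` with `new` in the file.

    Sketch of the assumed tool behavior: the match must be unique,
    so only the targeted passage changes and the rest of the document
    is never reprinted or rewritten.
    """
    with open(path) as f:
        text = f.read()
    matches = text.count(old)
    if matches != 1:
        raise ValueError(f"expected exactly one match, found {matches}")
    with open(path, "w") as f:
        f.write(text.replace(old, new))
```

The unique-match check is why distinct placeholder context per edit matters: an ambiguous `old` string should fail loudly rather than silently edit the wrong section.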
After 3 consecutive iterations with no substantial changes, ask if anything can be removed without losing important information.
When section is done, confirm it is complete and ask if ready to move to the next section.
Repeat for all sections.
When approaching completion (80%+ of sections done), re-read the entire document and check for:
For business documents, additionally check against the review checklist from business-communication.md:
Verify all facts against the open-elements-info skill or other confirmed sources.

For tender responses, additionally verify:
Provide any findings and suggestions.
When all sections are drafted and refined, ask if ready to move to Reader Testing, or if they want to refine anything else.
Stage 3: Reader Testing

Goal: Test the document with a fresh Claude (no context bleed) to verify it works for readers.
Explain that testing will now occur to see if the document actually works for readers. This catches blind spots — things that make sense to the authors but might confuse others.
Predict what questions readers might ask when reading this document.
Generate 5-10 questions that readers would realistically ask. For tender responses, include questions an evaluator would ask when scoring against the criteria.
Test these questions with a fresh Claude instance (no context from this conversation).
For each question, invoke a sub-agent with just the document content and the question.
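One way to picture the isolation requirement (the `ask` callable below is a hypothetical stand-in for the actual sub-agent invocation):

```python
def reader_test(ask, document, question):
    # Build the prompt from ONLY the document and the question --
    # no conversation history or author context is included, so the
    # sub-agent sees exactly what a fresh reader would see.
    prompt = (
        f"{document}\n\n"
        f"Question: {question}\n"
        "Answer using only the document above."
    )
    return ask(prompt)
```

If the sub-agent can answer well from this prompt alone, the document carries its own context; if not, the gap is in the document, not the reader.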
Summarize what the reader agent got right and wrong for each question.
Invoke a sub-agent to check for:
For business documents, also check:
If issues are found:
When the reader agent consistently answers questions correctly and does not surface new gaps or ambiguities, the doc is ready.
When Reader Testing passes:
If user wants a final review, provide it. Otherwise:
Announce document completion. Provide a few final tips:
Tone:
Handling Deviations:
Context Management:
Quality over Speed: