Export a Dataverse solution ZIP (or use one the user already has), extract its contents, identify Power Automate flows and related metadata, and answer questions such as which flows trigger on Account, how many flows exist, and which flows use a specific connector (e.g., SQL). Use this when the user wants to analyze Power Automate / cloud flow metadata from a Dataverse solution.
Help the user analyze Power Automate flows contained in a Dataverse solution.
This skill must:
Ask whether the user already has the solution ZIP file.
If yes, ask for the ZIP file path.
If no, ask for the solution name and, if needed, confirm or ask for the target Dataverse environment/auth profile.
Verify the Power Platform CLI (pac) is available locally.
Export the solution ZIP to a temporary workspace.
Extract the ZIP contents into the same workspace.
Inspect the extracted files to find Power Automate flow definitions and metadata.
Answer the user’s questions from the extracted artifacts.
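The CLI check and export steps above can be sketched in Python. This is a minimal sketch, assuming the standard pac invocation (`pac solution export --name <name> --path <zip>`); the helper names are illustrative, not part of any existing API:

```python
import shutil
import subprocess

def pac_available() -> bool:
    """Return True if the Power Platform CLI (pac) is on PATH."""
    return shutil.which("pac") is not None

def export_solution(name: str, zip_path: str) -> None:
    """Export a Dataverse solution using the currently selected PAC auth profile.

    Raises CalledProcessError if pac reports a failure, so the caller can
    surface the PAC error message and stop, per the error handling rules below.
    """
    subprocess.run(
        ["pac", "solution", "export", "--name", name, "--path", zip_path],
        check=True,
    )
```

If `pac_available()` returns False, report the PAC-missing message from the error handling rules instead of attempting the export.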
Important operating principles
Prefer a workspace rooted at $env:TEMP\dv-flowanalyzer instead of a hard-coded user profile path. This is more portable and avoids embedding the username. On Windows, $env:TEMP typically resolves under the user's local temp directory.
Create a unique run folder per analysis to avoid collisions:
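A minimal sketch of such a collision-free run folder, using the system temp directory and the dv-flowanalyzer root named above (the `run-` prefix is an illustrative convention):

```python
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def make_run_dir() -> Path:
    """Create a unique run folder under <temp>/dv-flowanalyzer."""
    root = Path(tempfile.gettempdir()) / "dv-flowanalyzer"
    root.mkdir(parents=True, exist_ok=True)
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
    # mkdtemp guarantees a fresh directory even if two runs share a timestamp
    return Path(tempfile.mkdtemp(prefix=f"run-{stamp}-", dir=root))
```

`tempfile.gettempdir()` honors the TEMP environment variable on Windows, so this lands in the same place as $env:TEMP while staying portable.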
Never overwrite an existing ZIP or extracted folder unless the user explicitly asks.
Prefer read-only analysis of the exported solution contents.
Do not modify the user’s Dataverse environment unless the user explicitly requests changes.
If PAC is installed but there is no active auth profile, explain that export cannot proceed until the user signs in or selects an auth profile.
If the solution ZIP is already available, skip export and go straight to extraction and analysis.
If the user only wants counts or quick facts, avoid verbose summaries unless asked.
What to ask the user
Step 1: Determine whether a ZIP already exists
Ask:
Do you already have the Dataverse solution ZIP file available locally?
If the user says yes:
Ask for the full ZIP path.
If the path contains spaces, treat it as a quoted path in commands.
If the user says no:
Ask:
What is the solution name?
Which Dataverse environment/auth profile should be used, if not already selected?
If the environment/auth context is unclear, ask one concise follow-up such as:
I need the Dataverse environment that contains this solution. Do you want me to use the currently selected PAC auth profile, or should I help you choose one?
Store machine-readable outputs in reports/manifest.json and reports/flow-summary.json when possible.
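A sketch of writing the machine-readable manifest; the schema shown here (flowCount plus a per-flow record) is an illustrative assumption, not a fixed format:

```python
import json
from pathlib import Path

def write_manifest(run_dir: Path, flows: list[dict]) -> Path:
    """Write reports/manifest.json under the run folder.

    Each entry in `flows` is assumed to be a dict such as
    {"name": ..., "triggers": [...], "connectors": [...]}.
    """
    reports = run_dir / "reports"
    reports.mkdir(parents=True, exist_ok=True)
    manifest = {"flowCount": len(flows), "flows": flows}
    path = reports / "manifest.json"
    path.write_text(json.dumps(manifest, indent=2), encoding="utf-8")
    return path
```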
How to answer the user’s questions
Once analysis is complete, answer directly from the extracted manifest.
Examples
If asked:
Which flows have triggers on account?
Return:
A concise bullet list of matching flow names.
Mention the trigger name/type where useful.
If asked:
How many flows are there in total?
Return:
The total count of discovered flow definitions.
If relevant, note whether the count includes only cloud flows discovered in the solution export.
If asked:
Which flows use SQL connector?
Return:
A list of flows where connectors include SQL-related API names.
If none are found, say so clearly.
If asked a question that cannot be answered from the extracted solution alone:
Say exactly what is missing.
Do not guess.
Connector detection guidance
When detecting connectors, inspect both:
connectionReferences
inputs.host.apiId, inputs.host.connectionName, and related API metadata in triggers/actions
Common examples:
Dataverse / Common Data Service: shared_commondataserviceforapps
SQL: often connector names containing sql
SharePoint: connector names containing sharepoint
Outlook / Office 365: connector names containing office365, outlook, etc.
Use exact matches when present, otherwise case-insensitive contains checks.
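The detection rules above can be sketched as a helper that walks both connectionReferences and the host metadata in triggers/actions. The exact shape of a flow definition varies between exports, so this assumes one common layout (`properties.connectionReferences` and `properties.definition.triggers/actions`) and treats anything else as absent:

```python
def flow_connectors(definition: dict) -> set[str]:
    """Collect connector identifiers from a parsed flow definition."""
    found: set[str] = set()
    props = definition.get("properties", definition)
    # 1) connectionReferences: api.name or connectionName
    for ref in (props.get("connectionReferences") or {}).values():
        api = (ref.get("api") or {}).get("name") or ref.get("connectionName")
        if api:
            found.add(api)
    # 2) host.apiId / host.connectionName inside triggers and actions
    flow_def = props.get("definition") or {}
    for section in ("triggers", "actions"):
        for node in (flow_def.get(section) or {}).values():
            host = (node.get("inputs") or {}).get("host") or {}
            api_id = host.get("apiId") or host.get("connectionName")
            if api_id:
                found.add(api_id)
    return found

def uses_connector(definition: dict, needle: str) -> bool:
    """Case-insensitive contains check, per the guidance above."""
    return any(needle.lower() in c.lower() for c in flow_connectors(definition))
```

Note this sketch does not recurse into nested scopes; deeply nested actions would need a recursive walk.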
Quality and safety rules
Do not claim certainty when the export format is ambiguous.
If flow artifacts are not found where expected, explain that not all solution exports serialize cloud flow definitions in the same layout, and report what was searched.
Be transparent about partial analysis.
Prefer deterministic parsing over heuristic guessing.
Never expose secrets or credentials if any are present in extracted content.
Never modify extracted files unless the user explicitly asks for transformation or cleanup.
Error handling rules
If PAC is missing
Say:
Power Platform CLI (pac) is not available on this machine, so I can’t export the solution yet. If you want, I can help you install it or proceed with an existing solution ZIP.
If PAC auth is missing
Say:
PAC is installed, but I don’t see a usable authentication profile for the Dataverse environment. Please sign in or tell me which PAC auth profile/environment to use.
If solution export fails
Say:
The solution export failed. I’ll summarize the PAC error and stop so we can correct the solution name, auth profile, or environment.
If extraction fails
Say:
The ZIP export succeeded, but extraction failed. I’ll report the Python error so we can retry with the same ZIP or inspect whether the file is corrupted.
If no flows are found
Say:
I extracted the solution, but I couldn’t find any recognizable flow definitions in the exported contents. I can still summarize what artifacts were found and refine the search.
Suggested execution order
Ask whether the user already has the solution ZIP.
If yes, collect ZIP path.
If no, collect solution name and confirm/select PAC auth/environment.
Check pac availability.
Check PAC auth profiles.
Create the temp workspace.
Export solution ZIP if needed.
Run Python extraction.
Discover flow definition files.
Build a manifest of flows, triggers, actions, entities, and connectors.
Answer the user’s question from the manifest.
Offer follow-up analysis, such as connector inventory or trigger/entity matrix.
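The extraction and discovery steps in this order can be sketched as follows. The heuristic for spotting flow files (any JSON with a `properties.definition` or `properties.connectionReferences`) is an assumption, since solution exports do not all serialize cloud flows in the same layout:

```python
import json
import zipfile
from pathlib import Path

def extract_solution(zip_path: Path, dest: Path) -> Path:
    """Extract the solution ZIP into a fresh folder; never overwrite."""
    dest.mkdir(parents=True, exist_ok=False)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(dest)
    return dest

def discover_flow_files(root: Path) -> list[Path]:
    """Find candidate cloud flow definition files under the extracted tree."""
    candidates = []
    for path in root.rglob("*.json"):
        try:
            # utf-8-sig tolerates the BOM some exported files carry
            data = json.loads(path.read_text(encoding="utf-8-sig"))
        except (json.JSONDecodeError, UnicodeDecodeError):
            continue
        props = data.get("properties", {}) if isinstance(data, dict) else {}
        if "definition" in props or "connectionReferences" in props:
            candidates.append(path)
    return candidates
```

If `discover_flow_files` comes back empty, report what was searched rather than guessing, per the no-flows-found rule above.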
Output style
When reporting results:
Start with a short answer.
Then show the supporting list or count.
Include the workspace path if relevant for troubleshooting.
Offer a follow-up question like:
“Do you want a connector inventory for all flows?”
“Do you want me to list every flow that references the Account table?”
Example follow-up prompts this skill should handle well
Analyze this Dataverse solution for cloud flows.
Which flows trigger on account?
Count all Power Automate flows in this solution.
Which flows use the SQL connector?
Show all Dataverse entities referenced by flows in this solution.
Which flows update accounts?
Which connectors are used across all flows?
Export my solution and tell me how many flows it contains.