Intent-driven query builder for non-technical users. Translates natural language into SQL by exploring available data sources and schemas. Use this sub-skill when the user wants to build a new query and no suitable existing query was found. Not for direct user invocation.
Builds and executes Redash queries from natural language intent. Designed for users who do not know SQL or the underlying data schema.
Base URL: `https://redash.data-bonial.com`
Auth header: `Authorization: Key $REDASH_API_KEY`

## Data Source SQL Dialects

The `GET /api/data_sources` response includes a `type` field for each data source. Use it to determine the correct SQL dialect before writing any query:
| type | SQL engine | Key syntax notes |
|---|---|---|
| `athena` | Presto | `date_add('minute', N, now())`, `from_iso8601_timestamp()`, `\|\|` for string concat, `LIKE` (no `ILIKE`). No `DATE_SUB`, no `DATE_FORMAT`. |
| `pg` | PostgreSQL | Standard PG syntax: `NOW()`, `INTERVAL`, `::` casts, `ILIKE`, `\|\|` for string concat. |
| `rds_mysql` | MySQL | `DATE_SUB(NOW(), INTERVAL N MINUTE)`, `CONCAT()`, `LIKE`. No `ILIKE`, no `::` casts. |
| `results` | n/a | Queries over saved query results; use simple SELECT/WHERE only. |
Always fetch the data source type in Step 2 and apply the matching dialect in Step 4. Never assume the dialect from the data source name.
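The dialect differences above can be made concrete with a short sketch. This is illustrative only, not part of the skill: the helper function and the column name `created_at` are assumptions for the example.

```python
# Illustrative sketch: the same "last N minutes" filter rendered in each
# supported dialect. The column name `created_at` is assumed.
def last_minutes_predicate(ds_type: str, minutes: int = 15) -> str:
    predicates = {
        # Athena uses Presto functions: date_add with a unit string.
        "athena": f"created_at >= date_add('minute', -{minutes}, now())",
        # PostgreSQL uses INTERVAL arithmetic.
        "pg": f"created_at >= NOW() - INTERVAL '{minutes} minutes'",
        # MySQL uses DATE_SUB.
        "rds_mysql": f"created_at >= DATE_SUB(NOW(), INTERVAL {minutes} MINUTE)",
    }
    if ds_type not in predicates:
        raise ValueError(f"No time-filter template for data source type {ds_type!r}")
    return predicates[ds_type]
```

Picking the template by `type` rather than by data source name is exactly the rule above: names are free-form, `type` is authoritative.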
### Step 1: Confirm intent

Confirm what the user wants to know in one sentence. Ask for missing context (time range, filters, groupings) in plain business terms before touching the API.
### Step 2: List data sources

```bash
curl -s -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/data_sources" | jq '.[] | {id, name, type}'
```
Present the list to the user and ask which data source to use, or infer from intent if obvious. Note the type field of the chosen data source — it determines the SQL dialect (see Data Source SQL Dialects above).
### Step 3: Explore the schema

```bash
curl -s -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/data_sources/<id>/schema" | jq '.schema[] | {name, columns: [.columns[]?.name]}'
```
Identify the relevant tables and columns based on the user's intent. Explain which tables and fields you plan to use in plain language (not SQL) before writing the query.
Partition field detection (critical for performance): After identifying the target table, inspect its columns for any that look like partition fields — common names include partition_date, year, month, day, hour, dt, date_partition. If any are found, note them explicitly. They MUST be used in the WHERE clause of any query on that table.
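The partition scan described above can be sketched as a small helper. The schema shape is assumed to match the `/api/data_sources/<id>/schema` response as filtered by the `jq` expression in Step 3; the function itself is illustrative, not part of the skill.

```python
# Sketch of the partition-field scan: given a schema (list of tables with
# their columns) and a target table, return any column whose name matches
# the common partition-field names listed above.
PARTITION_NAMES = {"partition_date", "year", "month", "day", "hour", "dt", "date_partition"}

def find_partition_fields(schema: list[dict], table: str) -> list[str]:
    for entry in schema:
        if entry["name"] == table:
            return [col["name"] for col in entry.get("columns", [])
                    if col["name"].lower() in PARTITION_NAMES]
    return []  # table not found, or no partition-like columns
```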
### Step 4: Write the query

Write the SQL query using the correct dialect for the data source (see Data Source SQL Dialects above). If the target table has partition fields, always include them as WHERE predicates — this is required to avoid expensive full table scans. For time-based queries, derive the partition values from the requested time range (e.g. partition_date = '2026-04-07').
Show the query to the user with a plain-language explanation of what it does, including a note if partition fields are being used for performance. Wait for confirmation before executing.
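Deriving partition values from a requested time range can be sketched as follows, assuming a daily `partition_date` column in `YYYY-MM-DD` format (other layouts such as separate `year`/`month`/`day` columns would need a different template; the helpers are illustrative):

```python
# Sketch: expand a requested date range into daily partition values and
# build the corresponding WHERE predicate, assuming a `partition_date`
# column formatted as YYYY-MM-DD.
from datetime import date, timedelta

def partition_date_values(start: date, end: date) -> list[str]:
    days = (end - start).days
    return [(start + timedelta(days=i)).isoformat() for i in range(days + 1)]

def partition_predicate(start: date, end: date) -> str:
    values = ", ".join(f"'{d}'" for d in partition_date_values(start, end))
    return f"partition_date IN ({values})"
```

For a single day this reduces to the `partition_date = '2026-04-07'` form shown above.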
### Step 5: Execute

Execute ad-hoc via a new query object.
Create a draft query:
```bash
curl -s -X POST \
  -H "Authorization: Key $REDASH_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"name\": \"Draft - <intent summary>\", \"query\": \"<sql>\", \"data_source_id\": <id>, \"options\": {}}" \
  "https://redash.data-bonial.com/api/queries" | jq '{id, name}'
```
Trigger execution:
```bash
curl -s -X POST \
  -H "Authorization: Key $REDASH_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"max_age\": 0}" \
  "https://redash.data-bonial.com/api/queries/<query_id>/results" | jq '.'
```
If a job is returned, poll until complete:
```bash
# Poll every 2 seconds
curl -s -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/jobs/<job_id>" | jq '{status, query_result_id, error}'
```
Job status codes: 1=PENDING, 2=STARTED, 3=SUCCESS, 4=FAILURE, 5=CANCELLED.
Poll up to 30 times (60 seconds total). If status is still PENDING or STARTED after 30 attempts, stop and tell the user: "The query is taking longer than expected. You can check the result later in Redash directly."
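The polling rules above (status codes, 30 attempts, 2-second interval, user-facing timeout message) can be sketched as a loop. `fetch_job` is a stand-in for the `GET /api/jobs/<job_id>` call; the helper itself is illustrative, not part of the skill.

```python
# Sketch of the polling loop: retry up to `attempts` times, `interval`
# seconds apart, and map the documented status codes to outcomes.
import time

PENDING, STARTED, SUCCESS, FAILURE, CANCELLED = 1, 2, 3, 4, 5

def poll_job(fetch_job, attempts: int = 30, interval: float = 2.0, sleep=time.sleep):
    for _ in range(attempts):
        job = fetch_job()  # stand-in for GET /api/jobs/<job_id>
        if job["status"] == SUCCESS:
            return job["query_result_id"]
        if job["status"] in (FAILURE, CANCELLED):
            raise RuntimeError(job.get("error") or "Query failed")
        sleep(interval)
    raise TimeoutError("The query is taking longer than expected. "
                       "You can check the result later in Redash directly.")
```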
Fetch result:
```bash
curl -s -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/query_results/<result_id>" | jq '.query_result.data'
```
### Step 6: Present results

More than 50 rows: show row count + column names + offer to save as `.csv` or `.json`:

```bash
curl -s -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/query_results/<result_id>.csv" -o results.csv
```
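A minimal sketch of this presentation rule, assuming the `columns`/`rows` shape of the `.query_result.data` object (the helper and the return shape are illustrative assumptions):

```python
# Sketch: decide between showing rows inline and summarizing, based on the
# 50-row threshold above. `data` follows the .query_result.data shape:
# {"columns": [{"name": ...}, ...], "rows": [{...}, ...]}
def summarize_result(data: dict, inline_limit: int = 50) -> dict:
    columns = [col["name"] for col in data["columns"]]
    rows = data["rows"]
    if len(rows) <= inline_limit:
        return {"inline": True, "columns": columns, "rows": rows}
    # Too many rows to show inline: report shape and offer a file download.
    return {"inline": False, "row_count": len(rows), "columns": columns}
```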
### Step 7: Iterate and save

If the results are wrong or incomplete, ask the user what to adjust in plain language. Update the SQL, re-execute, and repeat.
Ask the user if they want to save the query with a descriptive name. If yes, update the draft query:
```bash
curl -s -X POST \
  -H "Authorization: Key $REDASH_API_KEY" \
  -H "Content-Type: application/json" \
  -d "{\"name\": \"<user-approved name>\", \"description\": \"<brief description>\"}" \
  "https://redash.data-bonial.com/api/queries/<query_id>"
```
If the job fails, explain `job.error` in plain language and offer to adjust the query. If authentication fails, verify the API key at https://redash.data-bonial.com/users/me.

When editing or deleting an existing query (as opposed to creating a new one), only proceed if the coordinator prompt explicitly states: "The user has confirmed permission to modify this resource." If this statement is absent, do not modify the query and tell the user to confirm via the coordinator.
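The permission gate above amounts to an exact-substring check on the coordinator prompt; a minimal sketch (the helper name is illustrative):

```python
# Sketch: modification of an existing query is allowed only when the
# coordinator prompt contains the exact confirmation sentence.
CONFIRMATION = "The user has confirmed permission to modify this resource."

def may_modify(coordinator_prompt: str) -> bool:
    return CONFIRMATION in coordinator_prompt
```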
To archive (soft-delete) an existing query after coordinator permission is granted:
```bash
curl -s -X DELETE \
  -H "Authorization: Key $REDASH_API_KEY" \
  "https://redash.data-bonial.com/api/queries/<query_id>"
```
Inform the user that the query has been archived and can be restored from the Redash UI if needed.
Always explain in business/data terms, not SQL terms. Instead of "I'm adding a GROUP BY clause", say "I'm grouping the results by country so you can see totals for each one."