Cotera Tools
Authentication Type: No Authentication
Description: Access and query datasets from your Cotera datagraph with powerful filtering capabilities.
Dataset Query Operations
Execute queries against datasets in your organization's datagraph.
Query Dataset
Query any dataset by symbol name with optional filtering. Perfect for data exploration, debugging, and analysis of your Cotera data.
Operation Type: Query (Read)
Parameters:
- symbol (string, required): The name/symbol of the dataset to query
- filterColumn (string, nullable): Column name to filter on (optional)
- filterValue (string, nullable): Value to filter by using the LIKE operator (optional)
- limit (number, nullable): Maximum number of rows to return. Defaults to 100
Returns:
- symbol (string): The dataset symbol that was queried
- data (array of objects): Array of result rows
- metadata (object): Query metadata information
  - rowCount (number): Number of rows returned
  - columns (array of strings): Available column names in the dataset
  - filterApplied (boolean): Whether a filter was applied
  - filter (string, nullable): The filter condition that was applied, if any
Example Usage:
{
"symbol": "customer_transactions",
"filterColumn": "transaction_type",
"filterValue": "purchase",
"limit": 500
}
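The request above can be traced end-to-end with a minimal sketch of the documented semantics: LIKE-style matching on a single column, a default limit of 100, and the documented return shape. The in-memory DATASETS dict and the query_dataset helper below are illustrative assumptions, not the Cotera client API; real queries go through the tool itself.

```python
import re

# Illustrative in-memory stand-in for a datagraph (assumption, not real data).
DATASETS = {
    "customer_transactions": [
        {"id": 1, "transaction_type": "purchase", "amount": 42.0},
        {"id": 2, "transaction_type": "refund", "amount": -10.0},
        {"id": 3, "transaction_type": "purchase", "amount": 99.5},
    ],
}

def like_match(value, pattern):
    """SQL LIKE semantics: % matches any run of characters, _ matches one."""
    regex = "^" + re.escape(pattern).replace("%", ".*").replace("_", ".") + "$"
    return re.match(regex, str(value)) is not None

def query_dataset(symbol, filterColumn=None, filterValue=None, limit=100):
    """Hypothetical helper mirroring the documented parameters and return shape."""
    rows = DATASETS[symbol]
    filtered = filterColumn is not None and filterValue is not None
    if filtered:
        rows = [r for r in rows if like_match(r.get(filterColumn), filterValue)]
    rows = rows[: limit if limit is not None else 100]  # limit defaults to 100
    columns = sorted({c for r in DATASETS[symbol] for c in r})
    return {
        "symbol": symbol,
        "data": rows,
        "metadata": {
            "rowCount": len(rows),
            "columns": columns,
            "filterApplied": filtered,
            "filter": f"{filterColumn} LIKE '{filterValue}'" if filtered else None,
        },
    }

result = query_dataset(
    "customer_transactions",
    filterColumn="transaction_type",
    filterValue="purchase",
    limit=500,
)
print(result["metadata"])  # rowCount 2: both "purchase" rows match
```

Note that the filter here matches exact values unless the caller includes LIKE wildcards (e.g. "purch%"), which is consistent with how a LIKE comparison behaves without % or _ in the pattern.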
Common Use Cases
Data Exploration:
- Query datasets to understand data structure and available columns
- Explore sample data from various datasets in your organization's datagraph
- Use filtering to examine specific subsets of data for pattern analysis
Data Debugging:
- Investigate data quality issues by filtering for specific values or conditions
- Verify data transformations by comparing input and output datasets
- Check for missing or anomalous data patterns using targeted queries
Analytics and Reporting:
- Extract filtered datasets for use in business intelligence and reporting tools
- Query transaction data with date range or category filters for financial analysis
- Retrieve customer data segments for targeted marketing campaign analysis
Development and Testing:
- Access sample datasets during application development and testing
- Validate data pipeline outputs by querying intermediate and final datasets
- Monitor data freshness and completeness across different data sources in your datagraph
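For the pipeline-validation use case above, one hedged sketch: given two results in the documented return shape (the literals below are hypothetical stand-ins for real tool responses), compare row counts and required columns between a pipeline's input and output to flag dropped rows or missing fields.

```python
def validate_pipeline(input_result, output_result, required_columns):
    """Collect discrepancies between two query results that follow the
    documented return shape (metadata.rowCount, metadata.columns)."""
    issues = []
    in_rows = input_result["metadata"]["rowCount"]
    out_rows = output_result["metadata"]["rowCount"]
    if out_rows < in_rows:
        issues.append(f"row count dropped: {in_rows} -> {out_rows}")
    missing = [c for c in required_columns
               if c not in output_result["metadata"]["columns"]]
    if missing:
        issues.append(f"missing columns: {missing}")
    return issues

# Hypothetical results, shaped like the tool's documented return value.
raw = {"metadata": {"rowCount": 100, "columns": ["id", "amount"]}}
final = {"metadata": {"rowCount": 97, "columns": ["id"]}}
print(validate_pipeline(raw, final, ["id", "amount"]))
# flags the dropped rows and the missing "amount" column
```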