# Moose / APIs / Analytics API Documentation – Python
## Included Files
1. moose/apis/analytics-api/analytics-api.mdx
## APIs
Source: moose/apis/analytics-api/analytics-api.mdx
APIs for Moose
# APIs
## Overview
APIs are functions that run on your server and are automatically exposed as HTTP `GET` endpoints.
They are designed to read data from your OLAP database. Out of the box, these APIs provide:
- Automatic type validation and conversion for your query parameters (sent in the URL) and your response body
- Managed database client connection
- Automatic OpenAPI documentation generation
Common use cases include:
- Powering user-facing analytics, dashboards and other front-end components
- Enabling AI tools to interact with your data
- Building custom APIs for your internal tools
### Enabling APIs
Analytics APIs are enabled by default. To explicitly control this feature in your `moose.config.toml`:
```toml filename="moose.config.toml" copy
[features]
apis = true
```
### Basic Usage
`execute` is the recommended way to execute queries. It provides a thin wrapper around the ClickHouse Python client so that you can safely pass `OlapTable` and `Column` objects to your query without needing to worry about ClickHouse identifiers:
```python filename="ExampleApi.py" copy
from moose_lib import Api, MooseClient
from pydantic import BaseModel

# Import the source pipeline
from app.path.to.SourcePipeline import SourcePipeline

# Define the query parameters
class QueryParams(BaseModel):
    filter_field: str
    max_results: int

# Define the response body
class ResponseBody(BaseModel):
    id: int
    name: str
    value: float

SourceTable = SourcePipeline.get_table()

# Define the route handler function (parameterized)
def run(client: MooseClient, params: QueryParams) -> list[ResponseBody]:
    query = """
    SELECT
        id,
        name,
        value
    FROM {table}
    WHERE category = {category}
    LIMIT {limit}
    """
    return client.query.execute(query, {"table": SourceTable, "category": params.filter_field, "limit": params.max_results})

# Create the API
example_api = Api[QueryParams, ResponseBody](name="example_endpoint", query_function=run)
```
Use `execute_raw` with parameter binding for safe, typed queries:
```python filename="ExampleApi.py" copy
from moose_lib import Api, MooseClient
from pydantic import BaseModel

# Import the source pipeline
from app.path.to.SourcePipeline import SourcePipeline

# Define the query parameters
class QueryParams(BaseModel):
    filterField: str
    maxResults: int

# Define the response body
class ResponseBody(BaseModel):
    id: int
    name: str
    value: float

SourceTable = SourcePipeline.get_table()

# Define the route handler function (using execute_raw with typed parameters)
def run(client: MooseClient, params: QueryParams) -> list[ResponseBody]:
    query = """
    SELECT
        id,
        name,
        value
    FROM Source
    WHERE category = {category:String}
    LIMIT {limit:UInt32}
    """
    return client.query.execute_raw(query, {"category": params.filterField, "limit": params.maxResults})

# Create the API
example_api = Api[QueryParams, ResponseBody](name="example_endpoint", query_function=run)
```
```python filename="SourcePipeline.py" copy
from moose_lib import IngestPipeline, IngestPipelineConfig, Key
from pydantic import BaseModel

class SourceSchema(BaseModel):
    id: Key[int]
    name: str
    value: float

SourcePipeline = IngestPipeline[SourceSchema]("Source", IngestPipelineConfig(
    ingest_api=False,
    stream=True,
    table=True,
))
```
The `Api` class takes:
- Route name: The URL path to access your API (e.g., `"example_endpoint"`)
- Handler function: Processes requests with typed parameters and returns the result
The generic type parameters specify:
- `QueryParams`: The structure of accepted URL parameters
- `ResponseBody`: The exact shape of your API's response data
You can name these types anything you want. The first type generates validation for query parameters, while the second defines the response structure for OpenAPI documentation.
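For example, once your dev server is running, the API above is available under `/api/<route name>`. Here is a minimal sketch of calling it from Python with the `requests` library; the port follows the local dev examples later on this page, and the parameter values are illustrative:
```python filename="CallExampleApi.py" copy
# Minimal sketch of calling the API above over HTTP. Assumes the local dev
# server at http://localhost:4000 and the "example_endpoint" route defined
# above; parameter values are illustrative.
import requests

response = requests.get(
    "http://localhost:4000/api/example_endpoint",
    params={"filter_field": "example", "max_results": 10},
)
response.raise_for_status()

for row in response.json():
    print(row["id"], row["name"], row["value"])
```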
## Type Validation
Model your query parameters and response body as Pydantic models. Moose uses these models for automatic type validation and conversion of your query parameters (sent in the URL), and to define the response structure for OpenAPI documentation.
### Modeling Query Parameters
Define your API's parameters as a Pydantic model:
```python filename="ExampleQueryParams.py" copy
from pydantic import BaseModel, Field
from typing import Optional

class QueryParams(BaseModel):
    filterField: str = Field(..., description="The field to filter by")
    maxResults: int = Field(..., description="The maximum number of results to return")
    optionalParam: Optional[str] = Field(None, description="An optional parameter")
```
Moose automatically handles:
- Runtime validation
- Clear error messages for invalid parameters
- OpenAPI documentation generation
Complex nested objects and arrays are not supported. Analytics APIs are `GET` endpoints designed to be simple and lightweight.
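For illustration, here is how a request URL maps onto the model above; Moose performs the equivalent parsing and coercion for you before your handler runs (the request path and values are hypothetical):
```python filename="QueryParamMapping.py" copy
# Hypothetical request:
#
#   GET /api/example_endpoint?filterField=name&maxResults=25
#
# Moose validates the query string against the model, converting "25" from a
# string to an int and leaving the omitted optional parameter as None.
from typing import Optional
from pydantic import BaseModel, Field

class QueryParams(BaseModel):
    filterField: str = Field(..., description="The field to filter by")
    maxResults: int = Field(..., description="The maximum number of results to return")
    optionalParam: Optional[str] = Field(None, description="An optional parameter")

params = QueryParams.model_validate({"filterField": "name", "maxResults": "25"})
print(params)  # filterField='name' maxResults=25 optionalParam=None
```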
### Adding Advanced Type Validation
Moose uses Pydantic for runtime validation. Use Pydantic's `Field` class for more complex validation:
```python filename="ExampleQueryParams.py" copy
from pydantic import BaseModel, Field

class QueryParams(BaseModel):
    filterField: str = Field(pattern=r"^(id|name|email)$", description="The field to filter by")  # Only allow valid column names from the UserTable
    maxResults: int = Field(gt=0, description="The maximum number of results to return")  # Positive integer
```
### Common Validation Options
```python filename="ValidationExamples.py" copy
from pydantic import BaseModel, Field
from typing import Optional

class QueryParams(BaseModel):
    # Numeric validations
    id: int = Field(..., gt=0)
    age: int = Field(..., gt=0, lt=120)
    price: float = Field(..., gt=0, lt=1000)
    discount: float = Field(..., gt=0, multiple_of=0.5)

    # String validations
    username: str = Field(..., min_length=3, max_length=20)
    email: str = Field(..., format="email")
    zipCode: str = Field(..., pattern=r"^[0-9]{5}$")
    uuid: str = Field(..., format="uuid")
    ipAddress: str = Field(..., format="ipv4")

    # Date validations
    startDate: str = Field(..., format="date")

    # Enum validation
    status: str = Field(..., enum=["active", "pending", "inactive"])

    # Optional parameters
    limit: Optional[int] = Field(None, gt=0, lt=100)
```
For a full list of validation options, see the [Pydantic documentation](https://docs.pydantic.dev/latest/concepts/types/#customizing-validation-with-fields).
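You can also exercise a model directly with Pydantic to see how a constraint fails before wiring it into an API; Moose surfaces equivalent failures as `400` responses. The values below are illustrative:
```python filename="ValidationCheck.py" copy
# Illustrative check of Field constraints, run directly with Pydantic.
from pydantic import BaseModel, Field, ValidationError

class QueryParams(BaseModel):
    maxResults: int = Field(..., gt=0)
    zipCode: str = Field(..., pattern=r"^[0-9]{5}$")

try:
    QueryParams(maxResults=0, zipCode="abc")
except ValidationError as e:
    # Each failed constraint is reported with the field name and the reason
    print(e)
```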
### Setting Default Values
You can set default values for parameters by assigning a default to each field in your Pydantic model:
```python filename="ExampleQueryParams.py" copy {4-6}
from pydantic import BaseModel

class QueryParams(BaseModel):
    filterField: str = "example"
    maxResults: int = 10
    optionalParam: str | None = "default"
```
## Implementing Route Handler
API route handlers are regular functions, so you can implement whatever logic you want inside them. Most of the time you will use APIs to expose your data to your front-end applications or other tools:
### Connecting to the Database
Moose provides a managed `MooseClient` to your function execution context. This client provides access to the database and other Moose resources, and handles connection pooling/lifecycle management for you:
```python filename="ExampleApi.py" copy
from moose_lib import Api, MooseClient
from app.UserTable import UserTable

# QueryParams and ResponseBody are defined as in the earlier examples
def run(client: MooseClient, params: QueryParams):
    # You can use a formatted string for a simple static query
    query = """
    SELECT COUNT(*) FROM {table}
    """
    # Optionally pass the table object to the query
    return client.query.execute(query, {"table": UserTable})

# Create the API
example_api = Api[QueryParams, ResponseBody](name="example_endpoint", query_function=run)
```
Use `execute_raw` with parameter binding:
```python filename="ExampleApi.py" copy
from moose_lib import Api, MooseClient
from app.UserTable import UserTable

# QueryParams and ResponseBody are defined as in the earlier examples
def run(client: MooseClient, params: QueryParams):
    # Using execute_raw for safe queries
    query = """
    SELECT COUNT(*) FROM {table:Identifier}
    """
    # Must be the name of the table, not the table object
    return client.query.execute_raw(query, {"table": UserTable.name})

# Create the API
example_api = Api[QueryParams, ResponseBody](name="example_endpoint", query_function=run)
```
### Constructing Safe SQL Queries
#### Basic Query Parameter Interpolation
Validate parameters with Pydantic, then pass the values to your query as bindings rather than interpolating raw strings:
```python filename="SafeQueries.py" copy
from moose_lib import MooseClient
from pydantic import BaseModel, Field

class QueryParams(BaseModel):
    min_age: int = Field(ge=0, le=150)
    status: str = Field(pattern=r"^(active|inactive)$")
    limit: int = Field(default=10, ge=1, le=1000)
    search_text: str = Field(pattern=r'^[a-zA-Z0-9\s]*$')

def run(client: MooseClient, params: QueryParams):
    query = """
    SELECT *
    FROM users
    WHERE age >= {min_age}
      AND status = {status}
      AND name ILIKE {search_text}
    LIMIT {limit}
    """
    return client.query.execute(query, {"min_age": params.min_age, "status": params.status, "search_text": f"%{params.search_text}%", "limit": params.limit})
```
```python filename="SafeQueries.py" copy
from moose_lib import MooseClient
from pydantic import BaseModel, Field

class QueryParams(BaseModel):
    min_age: int = Field(ge=0, le=150)
    status: str = Field(pattern=r"^(active|inactive)$")
    limit: int = Field(default=10, ge=1, le=1000)
    search_text: str = Field(pattern=r'^[a-zA-Z0-9\s]*$')

def run(client: MooseClient, params: QueryParams):
    query = """
    SELECT *
    FROM users
    WHERE age >= {minAge:UInt32}
      AND status = {status:String}
      AND name ILIKE {searchPattern:String}
    LIMIT {limit:UInt32}
    """
    return client.query.execute_raw(query, {
        "minAge": params.min_age,
        "status": params.status,
        "searchPattern": f"%{params.search_text}%",
        "limit": params.limit
    })
```
#### Table and Column References
```python filename="ValidatedQueries.py" copy
from moose_lib import MooseClient
from pydantic import BaseModel, Field
from app.UserTable import UserTable

class QueryParams(BaseModel):
    # Strict validation: only allow interpolating known-safe values
    column: str = Field(pattern=r"^(id|name|email)$", description="Uses a regex pattern to only allow valid column names")
    search_term: str = Field(
        pattern=r"^[\w\s'-]{1,50}$",  # Allows letters, numbers, spaces, hyphens, apostrophes; blocks characters that could be used for SQL injection
        min_length=1,
        max_length=50
    )
    limit: int = Field(
        default=10,
        ge=1,
        le=100,
        description="Number of results to return"
    )

def run(client: MooseClient, params: QueryParams):
    query = """
    SELECT {column}
    FROM {table}
    WHERE name ILIKE {search_term}
    LIMIT {limit}
    """
    return client.query.execute(query, {"column": UserTable.cols[params.column], "table": UserTable, "search_term": f"%{params.search_term}%", "limit": params.limit})
```
```python filename="UserTable.py" copy
from moose_lib import OlapTable, Key
from pydantic import BaseModel

class UserSchema(BaseModel):
    id: Key[int]
    name: str
    email: str

UserTable = OlapTable[UserSchema]("users")
```
### Advanced Query Patterns
#### Dynamic Column & Table Selection
```python filename="DynamicColumns.py" copy
from moose_lib import Api, MooseClient
from pydantic import BaseModel, Field
from typing import Optional
from app.UserTable import UserTable

class QueryParams(BaseModel):
    colName: str = Field(pattern=r"^(id|name|email)$", description="Uses a regex pattern to only allow valid column names from the UserTable")

class QueryResult(BaseModel):
    id: Optional[int] = None
    name: Optional[str] = None
    email: Optional[str] = None

def run(client: MooseClient, params: QueryParams):
    # Put the column and table objects in the parameters dict
    query = "SELECT {column} FROM {table}"
    return client.query.execute(query, {"column": UserTable.cols[params.colName], "table": UserTable})

# Create the API
bar = Api[QueryParams, QueryResult](name="bar", query_function=run)

# Call the API
# HTTP Request: GET http://localhost:4000/api/bar?colName=id
# EXECUTED QUERY: SELECT id FROM users
```
```python filename="UserTable.py" copy
from moose_lib import OlapTable, Key
from pydantic import BaseModel

class UserSchema(BaseModel):
    id: Key[int]
    name: str
    email: str

UserTable = OlapTable[UserSchema]("users")
```
#### Conditional `WHERE` Clauses
Build `WHERE` clauses based on provided parameters:
```python filename="ConditionalColumns.py" copy
from moose_lib import Api, MooseClient
from pydantic import BaseModel, Field
from typing import Optional

class FilterParams(BaseModel):
    min_age: Optional[int] = None
    status: Optional[str] = Field(None, pattern=r"^(active|inactive)$")
    search_text: Optional[str] = Field(None, pattern=r"^[a-zA-Z0-9\s]+$", description="Alphanumeric search text without special characters to prevent SQL injection")

class QueryResult(BaseModel):
    id: int
    name: str
    email: str

def build_query(client: MooseClient, params: FilterParams) -> list[QueryResult]:
    # Build conditions and bound parameters from the validated inputs
    conditions = []
    parameters = {}
    if params.min_age is not None:
        conditions.append("age >= {min_age}")
        parameters["min_age"] = params.min_age
    if params.status:
        conditions.append("status = {status}")
        parameters["status"] = params.status
    if params.search_text:
        conditions.append("(name ILIKE {search_text} OR email ILIKE {search_text})")
        parameters["search_text"] = f"%{params.search_text}%"
    where_clause = f" WHERE {' AND '.join(conditions)}" if conditions else ""
    query = f"""SELECT * FROM users {where_clause} ORDER BY created_at DESC"""
    return client.query.execute(query, parameters)

# Create the API
bar = Api[FilterParams, QueryResult](name="bar", query_function=build_query)

# Call the API
# HTTP Request: GET http://localhost:4000/api/bar?min_age=20&status=active&search_text=John
# EXECUTED QUERY: SELECT * FROM users WHERE age >= 20 AND status = 'active' AND (name ILIKE '%John%' OR email ILIKE '%John%') ORDER BY created_at DESC
```
### Adding Authentication
Moose supports authentication via JSON web tokens (JWTs). When your client makes a request to your Analytics API, Moose will automatically parse the JWT and pass the **authenticated** payload to your handler function as the `jwt` object:
```python filename="Authentication.py" copy
from moose_lib import MooseClient

def run(client: MooseClient, params: QueryParams, jwt: dict):
    # Use parameter binding with JWT data
    query = """SELECT * FROM userReports WHERE user_id = {user_id} LIMIT 5"""
    return client.query.execute(query, {"user_id": jwt["userId"]})
```
Moose validates the JWT signature and ensures the JWT is properly formatted. If JWT authentication fails, Moose will return a `401 Unauthorized` error.
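On the client side, here is a sketch of including the JWT with the request, assuming tokens are sent as a standard `Authorization: Bearer` header and your deployment is configured with the matching JWT settings:
```python filename="CallWithJwt.py" copy
# Sketch of a client call that includes a JWT. Assumes the token is sent as a
# standard `Authorization: Bearer <token>` header; the route and parameters
# are illustrative.
import requests

token = "<your-jwt>"  # issued by your auth provider

response = requests.get(
    "http://localhost:4000/api/example_endpoint",
    params={"filter_field": "example", "max_results": 5},
    headers={"Authorization": f"Bearer {token}"},
)

if response.status_code == 401:
    print("JWT missing, malformed, or failed signature verification")
else:
    print(response.json())
```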
## Understanding Response Codes
Moose automatically provides standard HTTP responses:
| Status Code | Meaning | Response Body |
|-------------|-------------------------|---------------------------------|
| 200 | Success | Your API's result data |
| 400 | Validation error | `{ "error": "Detailed message"}`|
| 401 | Unauthorized | `{ "error": "Unauthorized"}` |
| 500 | Internal server error | `{ "error": "Internal server error"}` |
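A sketch of handling these responses from a Python client, assuming the error bodies follow the JSON shape shown in the table:
```python filename="HandleResponses.py" copy
# Sketch of branching on the status codes above. Assumes error responses carry
# the JSON error shape shown in the table; route and parameters are illustrative.
import requests

response = requests.get(
    "http://localhost:4000/api/example_endpoint",
    params={"filter_field": "example", "max_results": 10},
)

if response.status_code == 200:
    rows = response.json()  # your API's result data
    print(f"Received {len(rows)} rows")
elif response.status_code in (400, 401, 500):
    print(f"Request failed ({response.status_code}): {response.json()['error']}")
else:
    response.raise_for_status()
```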
## Post-Processing Query Results
After executing your database query, you can transform the data before returning it to the client. This lets you rename fields, derive new values, and format data for presentation, as in the example below:
```python filename="PostProcessingExample.py" copy
from datetime import datetime
from moose_lib import Api, MooseClient
from pydantic import BaseModel

class QueryParams(BaseModel):
    category: str
    max_results: int = 10

class ResponseItem(BaseModel):
    itemId: int
    displayName: str
    formattedValue: str
    isHighValue: bool
    date: str

def run(client: MooseClient, params: QueryParams):
    # 1. Fetch raw data using parameter binding
    query = """
    SELECT id, name, value, timestamp
    FROM data_table
    WHERE category = {category}
    LIMIT {limit}
    """
    raw_results = client.query.execute(query, {"category": params.category, "limit": params.max_results})

    # 2. Post-process the results
    processed_results = []
    for row in raw_results:
        processed_results.append(ResponseItem(
            # Transform field names
            itemId=row['id'],
            displayName=row['name'].upper(),
            # Add derived fields
            formattedValue=f"${row['value']:.2f}",
            isHighValue=row['value'] > 1000,
            # Format dates
            date=datetime.fromisoformat(row['timestamp']).date().isoformat()
        ))
    return processed_results

# Create the API
process_data_api = Api[QueryParams, ResponseItem](name="process_data_endpoint", query_function=run)
```
### Best Practices
While post-processing gives you flexibility, remember that database operations are typically more efficient for heavy data manipulation. Reserve post-processing for transformations that are difficult to express in SQL or that involve application-specific logic.
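For instance, an aggregation like the one below is usually better expressed directly in ClickHouse SQL than reconstructed in Python after the fact (table and column names are illustrative):
```python filename="AggregateInSql.py" copy
# Illustrative sketch: push aggregation into ClickHouse rather than looping
# over raw rows in Python. Table and column names are hypothetical.
from moose_lib import MooseClient
from pydantic import BaseModel

class QueryParams(BaseModel):
    category: str

def run(client: MooseClient, params: QueryParams):
    query = """
    SELECT
        category,
        count(*) AS events,
        round(avg(value), 2) AS avg_value
    FROM data_table
    WHERE category = {category}
    GROUP BY category
    """
    return client.query.execute(query, {"category": params.category})
```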
## Client Integration
By default, all API endpoints are automatically integrated with OpenAPI/Swagger documentation. You can integrate your OpenAPI SDK generator of choice to generate client libraries for your APIs.
Please refer to the [OpenAPI](/moose/apis/open-api-sdk) page for more information on how to integrate your APIs with OpenAPI.