APIs are functions that run on your server and are automatically exposed as HTTP GET endpoints. They are designed to read data from your OLAP database, making them well suited for powering dashboards, front-end applications, and other read-heavy integrations.

Analytics APIs are enabled by default. To explicitly control this feature in your `moose.config.toml`:
```toml
[features]
apis = true
```

```typescript
import { Api } from "@514labs/moose-lib";
import { SourcePipeline } from "path/to/SourcePipeline";

// Define the query parameters
interface QueryParams {
  filterField: string;
  maxResults: number;
}

// Model the query result type
interface ResultItem {
  id: number;
  name: string;
  value: number;
}

const SourceTable = SourcePipeline.table!; // Use `!` to assert that the table is not null
const cols = SourceTable.columns;

// Define the result type as an array of the result item type
export const exampleApi = new Api<QueryParams, ResultItem[]>(
  "example_endpoint",
  async ({ filterField, maxResults }: QueryParams, { client, sql }) => {
    const query = sql`
      SELECT
        ${cols.id},
        ${cols.name},
        ${cols.value}
      FROM ${SourceTable}
      WHERE category = ${filterField}
      LIMIT ${maxResults}`;

    // Set the result type to the type of each row in the result set
    const resultSet = await client.query.execute<ResultItem>(query);

    // Return the result set as an array of the result item type
    return await resultSet.json();
  }
);
```

The `Api` class takes:

- A route name (e.g. `"example_endpoint"`), which determines the endpoint URL
- An async handler function that receives the validated query parameters and a context object

The generic type parameters specify:

- `QueryParams`: the structure of accepted URL parameters
- `ResponseBody`: the exact shape of your API's response data

You can name these types anything you want. The first type generates validation for query parameters, while the second defines the response structure for OpenAPI documentation.
You can model the query parameters and response body as interfaces (TypeScript) or Pydantic models (Python). Moose uses these models to provide automatic type validation and type conversion for both the query parameters, which are sent in the URL, and the response body.
Define your API's parameters as a TypeScript interface:

```typescript
interface QueryParams {
  filterField: string;
  maxResults: number;
  optionalParam?: string; // Not required for client to provide
}
```

Moose automatically handles parsing, type conversion, and validation of these parameters from the URL query string.
Complex nested objects and arrays are not supported. Analytics APIs are GET endpoints designed to be simple and lightweight.
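Since only scalar parameters are supported, a common workaround on the calling side (an illustration, not a Moose feature) is to flatten nested data into individual scalar query parameters before calling the endpoint:

```typescript
// Hypothetical client-side helper: flatten an object of scalar values
// into a query string, since Analytics APIs only accept flat parameters.
function toQueryString(params: Record<string, string | number | boolean>): string {
  const search = new URLSearchParams();
  for (const [key, value] of Object.entries(params)) {
    search.set(key, String(value));
  }
  return search.toString();
}

// Instead of sending a nested { filter: { field: "category", max: 10 } },
// send the leaves as flat, scalar parameters:
const qs = toQueryString({ filterField: "category", maxResults: 10 });
// e.g. GET /api/example_endpoint?filterField=category&maxResults=10
```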
Moose uses Typia to extract type definitions and provide runtime validation. Use Typia's tags for more complex validation:

```typescript
import { tags } from "typia";

interface QueryParams {
  filterField: string;
  // Ensure maxResults is a positive integer
  maxResults: number & tags.Type<"int64"> & tags.Minimum<1>;
}
```

```typescript
import { tags } from "typia";

interface QueryParams {
  // Numeric validations
  id: number & tags.Type<"uint32">; // Positive integer (0 to 4,294,967,295)
  age: number & tags.Minimum<18> & tags.Maximum<120>; // Range: 18 <= age <= 120
  price: number & tags.ExclusiveMinimum<0> & tags.ExclusiveMaximum<1000>; // Range: 0 < price < 1000
  discount: number & tags.MultipleOf<0.5>; // Must be a multiple of 0.5

  // String validations
  username: string & tags.MinLength<3> & tags.MaxLength<20>; // Length between 3-20 characters
  email: string & tags.Format<"email">; // Valid email format
  zipCode: string & tags.Pattern<"^[0-9]{5}$">; // 5 digits
  uuid: string & tags.Format<"uuid">; // Valid UUID
  ipAddress: string & tags.Format<"ipv4">; // Valid IPv4 address

  // Date validations
  startDate: string & tags.Format<"date">; // YYYY-MM-DD format

  // Literal validation
  status: "active" | "pending" | "inactive"; // Must be one of these values

  // Optional parameters
  limit?: number & tags.Type<"uint32"> & tags.Maximum<100>; // Optional; if provided: positive integer <= 100

  // Combined validations
  searchTerm?: (string & tags.MinLength<3>) | null; // Either null or a string with >= 3 characters
}
```

Notice it's just regular TypeScript intersection and union types. For a full list of validation options, see the Typia documentation.
You can derive a safe `orderBy` union from your actual table columns and use it directly in SQL:

```typescript
import { OlapTable } from "@514labs/moose-lib";

interface MyTableSchema {
  column1: string;
  column2: number;
  column3: string;
}

const MyTable = new OlapTable<MyTableSchema>("my_table");

interface QueryParams {
  orderByColumn: keyof MyTableSchema; // Validates against the column names in "my_table"
}
```

You can set default values for parameters by assigning defaults in the route handler's destructured function signature:
```typescript
interface QueryParams {
  filterField: string;
  maxResults: number;
  optionalParam?: string; // Not required for client to provide
}

const api = new Api<QueryParams, ResponseBody>(
  "example_endpoint",
  async ({ filterField = "example", maxResults = 10, optionalParam = "default" }, { client, sql }) => {
    // Your logic here...
  }
);
```

API route handlers are regular functions, so you can implement whatever logic you want inside them. Most of the time you will use APIs to expose your data to your front-end applications or other tools:
Moose provides a managed MooseClient to your function execution context. This client provides access to the database and other Moose resources, and handles connection pooling/lifecycle management for you:
```typescript
import { ApiUtil } from "@514labs/moose-lib";
import { UserTable, UserSchema } from "./UserTable"; // UserSchema is the row type of UserTable

async function handler({ client, sql }: ApiUtil) {
  const query = sql`SELECT * FROM ${UserTable}`;
  const data = await client.query.execute<UserSchema>(query);
}
```

Pass the row type of the result to the `client.query.execute<T>()` method to ensure type safety.
The `sql` template literal in Moose provides type-safe query construction with protection against SQL injection. Below are some common patterns for building safe queries:
```typescript
import { UserTable } from "./UserTable";

const minAge = 18;
const userRole = "admin";

const query = sql`
  SELECT * FROM ${UserTable}
  WHERE age > ${minAge}
  AND role = ${userRole}`;

// MooseClient handles type conversion and escaping
const data = await client.query.execute<UserSchema>(query);

// EXECUTION: SELECT * FROM users WHERE age > 18 AND role = 'admin'
```

Reference tables and columns directly from your Moose objects as variables in your `sql` template literals:
```typescript
import { UserTable } from "./UserTable";

const query = sql`
  SELECT
    ${UserTable.columns.id},
    ${UserTable.columns.name},
    ${UserTable.columns.email}
  FROM ${UserTable}
  WHERE ${UserTable.columns.isActive} = true`;

// EXECUTION: SELECT id, name, email FROM users WHERE is_active = true
```

Static type checking ensures you only reference columns that actually exist.
Use ApiHelpers to handle dynamic column and table references in your queries:
```typescript
import { ApiHelpers as CH } from "@514labs/moose-lib";

interface QueryParams {
  sortBy: string; // Column to sort by
  fields: string; // Comma-separated list of columns to select (e.g., "id,name,email")
}

const queryHandler = async ({ sortBy = "id", fields = "id,name" }: QueryParams, { client, sql }) => {
  // Split the comma-separated string into individual fields
  const fieldList = fields.split(",").map((f) => f.trim());

  // Build the query by selecting each column individually
  const query = sql`
    SELECT ${fieldList.map((field) => sql`${CH.column(field)}`).join(", ")}
    FROM ${userTable}
    ORDER BY ${CH.column(sortBy)}
  `;

  // MooseClient converts fieldList to valid ClickHouse identifiers
  return client.query.execute(query);
  // EXECUTION: SELECT id, name FROM users ORDER BY id
};
```

```typescript
import { ApiHelpers as CH } from "@514labs/moose-lib";

interface QueryParams {
  tableName: string;
}

const queryHandler = async ({ tableName = "users" }: QueryParams, { client, sql }) => {
  const query = sql`
    SELECT * FROM ${CH.table(tableName)}
  `;

  // MooseClient converts tableName to a valid ClickHouse identifier
  return client.query.execute(query);
  // EXECUTION: SELECT * FROM users
};
```

Build WHERE clauses based on provided parameters:
```typescript
interface FilterParams {
  minAge?: number;
  status?: "active" | "inactive";
  searchText?: string;
}

const buildQuery = ({ minAge, status, searchText }: FilterParams, { sql }) => {
  const conditions = [];

  if (minAge !== undefined) {
    conditions.push(sql`age >= ${minAge}`);
  }
  if (status) {
    conditions.push(sql`status = ${status}`);
  }
  if (searchText) {
    conditions.push(sql`(name ILIKE ${"%" + searchText + "%"} OR email ILIKE ${"%" + searchText + "%"})`);
  }

  // Build the full query with a conditional WHERE clause
  let query = sql`SELECT * FROM ${userTable}`;
  if (conditions.length > 0) {
    // Join conditions with the AND operator
    const whereClause = conditions.join(" AND ");
    query = sql`${query} WHERE ${whereClause}`;
  }
  query = sql`${query} ORDER BY created_at DESC`;

  return query;
};
```

Moose supports authentication via JSON Web Tokens (JWTs). When your client makes a request to your Analytics API, Moose automatically parses the JWT and passes the authenticated payload to your handler function as the `jwt` object:
```typescript
async ({ orderBy = "totalRows", limit = 5 }, { client, sql, jwt }) => {
  // Use jwt.userId to filter data for the current user
  const query = sql`
    SELECT * FROM userReports
    WHERE user_id = ${jwt.userId}
    LIMIT ${limit}
  `;
  return client.query.execute(query);
}
```

Moose validates the JWT signature and ensures the JWT is properly formatted. If JWT authentication fails, Moose returns a `401 Unauthorized` error.
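For context on what the `jwt` object contains: a JWT's payload is simply base64url-encoded JSON between the token's two dots. The minimal decode below is for illustration only (it performs no signature verification, which Moose handles before your handler runs), and the `userId` claim is a hypothetical example:

```typescript
// Decode a JWT payload WITHOUT verifying it (illustration only --
// Moose verifies the signature before invoking your handler).
function decodeJwtPayload(token: string): Record<string, unknown> {
  const payload = token.split(".")[1]; // token shape: header.payload.signature
  const json = Buffer.from(payload, "base64url").toString("utf8");
  return JSON.parse(json);
}

// Build a token whose payload is {"userId":"u-123"}:
const token =
  "eyJhbGciOiJIUzI1NiJ9." +
  Buffer.from(JSON.stringify({ userId: "u-123" })).toString("base64url") +
  ".signature";

console.log(decodeJwtPayload(token).userId); // prints: u-123
```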
Moose automatically provides standard HTTP responses:
| Status Code | Meaning | Response Body |
|---|---|---|
| 200 | Success | Your API's result data |
| 400 | Validation error | { "error": "Detailed message"} |
| 401 | Unauthorized | { "error": "Unauthorized"} |
| 500 | Internal server error | { "error": "Internal server error"} |
After executing your database query, you can transform the data before returning it to the client, for example to rename fields, compute derived values, or format dates and currencies for display:
```typescript
interface QueryParams {
  category: string;
  maxResults: number;
}

interface ResponseBody {
  itemId: number;
  displayName: string;
  formattedValue: string;
  isHighValue: boolean;
  date: string;
}

const processDataApi = new Api<QueryParams, ResponseBody[]>(
  "process_data_endpoint",
  async ({ category, maxResults = 10 }, { client, sql }) => {
    // 1. Fetch raw data
    const query = sql`
      SELECT id, name, value, timestamp
      FROM data_table
      WHERE category = ${category}
      LIMIT ${maxResults}
    `;
    const resultSet = await client.query.execute<{
      id: number;
      name: string;
      value: number;
      timestamp: string;
    }>(query);
    const rawResults = await resultSet.json();

    // 2. Post-process the results
    return rawResults.map((row) => ({
      // Transform field names
      itemId: row.id,
      displayName: row.name.toUpperCase(),

      // Add derived fields
      formattedValue: `$${row.value.toFixed(2)}`,
      isHighValue: row.value > 1000,

      // Format dates
      date: new Date(row.timestamp).toISOString().split("T")[0],
    }));
  }
);
```

- When working with large amounts of data, perform as much filtering, grouping, and aggregation as possible in your SQL query
- Post-process to reduce response size when needed, especially for user-facing APIs
- Ensure consistent formatting for dates, currencies, and other values in your responses
- Use post-processing to remove or mask sensitive information before returning data to clients
- Include appropriate error handling in your post-processing logic
While post-processing gives you flexibility, remember that database operations are typically more efficient for heavy data manipulation. Reserve post-processing for transformations that are difficult to express in SQL or that involve application-specific logic.
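Two of the practices above, consistent formatting and masking sensitive fields, can be sketched as plain helpers. The field names and masking rule here are illustrative, not part of Moose:

```typescript
// Illustrative post-processing helpers (field names are hypothetical).

// Consistent currency formatting for response payloads.
function formatCurrency(value: number): string {
  return `$${value.toFixed(2)}`;
}

// Mask an email address before returning it to the client,
// keeping only the first character of the local part.
function maskEmail(email: string): string {
  const [local, domain] = email.split("@");
  return `${local[0]}***@${domain}`;
}

// Example: applied to a raw row before it leaves the API.
const row = { value: 1234.5, email: "jane.doe@example.com" };
const safeRow = {
  formattedValue: formatCurrency(row.value), // "$1234.50"
  email: maskEmail(row.email),               // "j***@example.com"
};
```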
By default, all API endpoints are automatically integrated with OpenAPI/Swagger documentation. You can integrate your OpenAPI SDK generator of choice to generate client libraries for your APIs.
Please refer to the OpenAPI page for more information on how to integrate your APIs with OpenAPI.