
APIs

Overview

APIs are functions that run on your server and are automatically exposed as HTTP GET endpoints. They are designed to read data from your OLAP database. Out of the box, these APIs provide:

  • Automatic type validation and conversion for your query parameters (sent in the URL) and your response body
  • Managed database client connection
  • Automatic OpenAPI documentation generation

Common use cases include:

  • Powering user-facing analytics, dashboards and other front-end components
  • Enabling AI tools to interact with your data
  • Building custom APIs for your internal tools

Enabling APIs

Analytics APIs are enabled by default. To explicitly control this feature in your moose.config.toml:

moose.config.toml
[features]
apis = true

Basic Usage

ExampleApi.ts
import { Api } from "@514labs/moose-lib";
import { SourcePipeline } from "path/to/SourcePipeline";

// Define the query parameters
interface QueryParams {
  filterField: string;
  maxResults: number;
}

// Model the query result type
interface ResultItem {
  id: number;
  name: string;
  value: number;
}

const SourceTable = SourcePipeline.table!; // Use `!` to assert that the table is not null
const cols = SourceTable.columns;

// Define the result type as an array of the result item type
export const exampleApi = new Api<QueryParams, ResultItem[]>("example_endpoint",
  async ({ filterField, maxResults }: QueryParams, { client, sql }) => {
    const query = sql`
      SELECT
        ${cols.id},
        ${cols.name},
        ${cols.value}
      FROM ${SourceTable}
      WHERE category = ${filterField}
      LIMIT ${maxResults}`;

    // Set the result type to the type of each row in the result set
    const resultSet = await client.query.execute<ResultItem>(query);

    // Return the result set as an array of the result item type
    return await resultSet.json();
  });

The Api class takes:

  • Route name: The URL path to access your API (e.g., "example_endpoint")
  • Handler function: Processes requests with typed parameters and returns the result

The generic type parameters specify:

  • QueryParams: The structure of accepted URL parameters
  • ResponseBody: The exact shape of your API's response data
MooseTip:

You can name these types anything you want. The first type generates validation for query parameters, while the second defines the response structure for OpenAPI documentation.

Moose automatically handles:

  • URL parameter validation and type conversion
  • SQL query interpolation and execution
  • Response formatting
  • Automated OpenAPI documentation
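
To see these pieces end to end, here is a hedged example of calling the endpoint defined above from a client. The host, port, and path prefix are assumptions for a local dev server; check your Moose configuration and the generated OpenAPI docs for the exact URL:

ExampleClientRequest.ts
// Query parameters travel in the URL; Moose validates them and converts
// maxResults from the string "10" to a number before the handler runs.
const response = await fetch(
  "http://localhost:4000/api/example_endpoint?filterField=books&maxResults=10"
);

if (response.ok) {
  // The body is the handler's return value, serialized as JSON (ResultItem[])
  const rows = await response.json();
  console.log(rows);
}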

Type Validation

Model your query parameters and response body as TypeScript interfaces (or Pydantic models in Python). Moose uses these types to automatically validate and convert the query parameters sent in the URL, and to describe the response body.

Modeling Query Parameters

Define your API's parameters as a TypeScript interface:

ExampleQueryParams.ts
interface QueryParams {
  filterField: string;
  maxResults: number;
  optionalParam?: string; // Not required for client to provide
}

Moose automatically handles:

  • Runtime validation
  • Clear error messages for invalid parameters
  • OpenAPI documentation generation
Warning:

Complex nested objects and arrays are not supported. Analytics APIs are GET endpoints designed to be simple and lightweight.

Adding Advanced Type Validation

Moose uses Typia to extract type definitions and provide runtime validation. Use Typia's tags for more complex validation:

ExampleQueryParams.ts
import { tags } from "typia";

interface QueryParams {
  filterField: string;
  // Ensure maxResults is a positive integer
  maxResults: number & tags.Type<"int64"> & tags.Minimum<1>;
}

Common Validation Options

ValidationExamples.ts
import { tags } from "typia";

interface QueryParams {
  // Numeric validations
  id: number & tags.Type<"uint32">;  // Positive integer (0 to 4,294,967,295)
  age: number & tags.Minimum<18> & tags.Maximum<120>;  // Range: 18 <= age <= 120
  price: number & tags.ExclusiveMinimum<0> & tags.ExclusiveMaximum<1000>;  // Range: 0 < price < 1000
  discount: number & tags.MultipleOf<0.5>;  // Must be a multiple of 0.5

  // String validations
  username: string & tags.MinLength<3> & tags.MaxLength<20>;  // Length between 3-20 characters
  email: string & tags.Format<"email">;  // Valid email format
  zipCode: string & tags.Pattern<"^[0-9]{5}$">;  // 5 digits
  uuid: string & tags.Format<"uuid">;  // Valid UUID
  ipAddress: string & tags.Format<"ipv4">;  // Valid IPv4 address

  // Date validations
  startDate: string & tags.Format<"date">;  // YYYY-MM-DD format

  // Literal validation
  status: "active" | "pending" | "inactive";  // Must be one of these values

  // Optional parameters
  limit?: number & tags.Type<"uint32"> & tags.Maximum<100>;  // Optional; if provided: positive integer <= 100

  // Combined validations
  searchTerm?: (string & tags.MinLength<3>) | null;  // Either null or a string with >= 3 characters
}

Notice that these are plain TypeScript type expressions (intersections with Typia tags and literal unions). For a full list of validation options, see the Typia documentation.

You can derive a safe orderBy union from your actual table columns and use it directly in SQL:

ValidationExamples.ts
import { OlapTable } from "@514labs/moose-lib";

interface MyTableSchema {
  column1: string;
  column2: number;
  column3: string;
}

const MyTable = new OlapTable<MyTableSchema>("my_table");

interface QueryParams {
  orderByColumn: keyof MyTableSchema; // validates against the column names in "my_table"
}
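
Because orderByColumn can only be one of the schema's column names, it is safe to interpolate into the query. A minimal sketch (the endpoint name and result type are placeholders; ApiHelpers.column is covered under Dynamic Column & Table Selection below):

OrderByColumnExample.ts
import { Api, ApiHelpers as CH } from "@514labs/moose-lib";

// Hypothetical endpoint built on MyTable from the snippet above
export const orderedRowsApi = new Api<QueryParams, MyTableSchema[]>("ordered_rows",
  async ({ orderByColumn }, { client, sql }) => {
    // orderByColumn has already been validated against keyof MyTableSchema
    const query = sql`SELECT * FROM ${MyTable} ORDER BY ${CH.column(orderByColumn)}`;
    const resultSet = await client.query.execute<MyTableSchema>(query);
    return await resultSet.json();
  });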

Setting Default Values

You can set default values for parameters by destructuring them with defaults in the API route handler's function signature:

ExampleQueryParams.ts
interface QueryParams {
  filterField: string;
  maxResults: number;
  optionalParam?: string; // Not required for client to provide
}

const api = new Api<QueryParams, ResponseBody>("example_endpoint",
  async ({ filterField = "example", maxResults = 10, optionalParam = "default" }, { client, sql }) => {
    // Your logic here...
  });

Implementing Route Handler

API route handlers are regular functions, so you can implement whatever logic you want inside them. Most of the time you will use APIs to expose your data to your front-end applications or other tools:

Connecting to the Database

Moose provides a managed MooseClient to your function execution context. This client provides access to the database and other Moose resources, and handles connection pooling/lifecycle management for you:

ExampleApi.ts
import { ApiUtil } from "@514labs/moose-lib";
import { UserTable, UserSchema } from "./UserTable";

async function handler({ client, sql }: ApiUtil) {
  const query = sql`SELECT * FROM ${UserTable}`;
  const data = await client.query.execute<UserSchema>(query);
}

Pass the type of the result to the client.query.execute<T>() method to ensure type safety.

Constructing Safe SQL Queries

The sql template literal in Moose provides type-safe query construction with protection against SQL injection. Below are some examples of common patterns for building safe queries:

Basic Query Parameter Interpolation

BasicInterpolation.ts
import { UserTable } from "./UserTable";

const minAge = 18;
const userRole = "admin";

const query = sql`
  SELECT * FROM ${UserTable}
  WHERE age > ${minAge}
  AND role = ${userRole}`;

// MooseClient handles type conversion and escaping
const data = await client.query.execute<UserSchema>(query);

// EXECUTION: SELECT * FROM users WHERE age > 18 AND role = 'admin'

Table and Column References

Reference tables and columns directly from your Moose objects as variables in your sql template literals:

ExampleTableColumnQuery.ts
import { UserTable } from "../tables/userTable";

const query = sql`
  SELECT
    ${UserTable.columns.id},
    ${UserTable.columns.name},
    ${UserTable.columns.email}
  FROM ${UserTable}
  WHERE ${UserTable.columns.isActive} = true`;

// EXECUTION: SELECT id, name, email FROM users WHERE is_active = true
Type Safety

Static type checking ensures you only reference columns that actually exist.
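
For instance, referencing a column that is not part of the table's schema is a compile-time error. A small sketch reusing UserTable and the sql tag from the example above; the misspelled column is hypothetical:

ColumnTypeSafety.ts
// Compiles: `email` exists on UserTable's schema
const ok = sql`SELECT ${UserTable.columns.email} FROM ${UserTable}`;

// Does not compile: Property 'favoriteColor' does not exist on UserTable's columns
// const broken = sql`SELECT ${UserTable.columns.favoriteColor} FROM ${UserTable}`;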

Advanced Query Patterns

SQL Template Helpers

The sql template tag provides helper methods for building dynamic queries safely:

Combine SQL Fragments

Join an array of Sql fragments with a separator. Spaces are automatically added around separators that don't include them:

SqlJoinExample.ts
import { UserTable } from "./UserTable";

const cols = UserTable.columns;

// Join column references
const selectColumns = sql.join([
  sql`${cols.id}`,
  sql`${cols.name}`,
  sql`${cols.email}`
], ",");
// Result: `id`, `name`, `email`

// Join WHERE conditions with AND
const conditions = [
  sql`${cols.age} >= ${18}`,
  sql`${cols.status} = ${"active"}`
];
const whereClause = sql.join(conditions, "AND");
// Result: `age` >= 18 AND `status` = 'active'
Insert Raw SQL

Insert raw SQL strings without parameterization. Use for SQL keywords, functions, or static values that can't be parameterized:

SqlRawExample.ts
import { UserTable } from "./UserTable";

// SQL functions
const timestamp = sql.raw("NOW()");
const query = sql`SELECT *, ${timestamp} as query_time FROM ${UserTable}`;
// Result: SELECT *, NOW() as query_time FROM `users`

// Sort direction
const direction = sql.raw("DESC");
const sortedQuery = sql`SELECT * FROM ${UserTable} ORDER BY created_at ${direction}`;
// Result: SELECT * FROM `users` ORDER BY created_at DESC
Warning:

Only use sql.raw() with trusted, static values. Never pass user input directly to sql.raw() as it bypasses SQL injection protection.

Build Queries Incrementally

Chain SQL fragments together to build queries step by step:

SqlAppendExample.ts
import { Sql } from "@514labs/moose-lib";
import { UserTable } from "./UserTable";

interface QueryParams {
  minAge?: number;
  status?: string;
  limit?: number;
}

const buildQuery = ({ minAge, status, limit = 100 }: QueryParams, { sql }) => {
  // Start with base query
  let query = sql`SELECT * FROM ${UserTable}`;

  // Build conditions
  const conditions: Sql[] = [];
  if (minAge !== undefined) {
    conditions.push(sql`age >= ${minAge}`);
  }
  if (status) {
    conditions.push(sql`status = ${status}`);
  }

  // Append WHERE clause if conditions exist
  if (conditions.length > 0) {
    query = query.append(sql` WHERE ${sql.join(conditions, "AND")}`);
  }

  // Append ORDER BY and LIMIT
  query = query
    .append(sql` ORDER BY created_at ${sql.raw("DESC")}`)
    .append(sql` LIMIT ${limit}`);

  return query;
};
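
A hedged sketch of wiring a builder like this into an Api handler (the endpoint name and row type are placeholders; QueryParams and buildQuery come from the example above):

FilteredUsersApi.ts
import { Api } from "@514labs/moose-lib";

// Hypothetical row shape for the users table
interface UserRow {
  id: number;
  name: string;
}

export const filteredUsersApi = new Api<QueryParams, UserRow[]>("filtered_users",
  async (params, utils) => {
    // Build the query from the validated parameters, then execute it
    const query = buildQuery(params, utils);
    const resultSet = await utils.client.query.execute<UserRow>(query);
    return await resultSet.json();
  });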

Dynamic Column & Table Selection

Use ApiHelpers to handle dynamic column and table references in your queries:

DynamicColumns.ts
import { ApiHelpers as CH } from "@514labs/moose-lib";
import { UserTable } from "./UserTable";

interface QueryParams {
  sortBy: string;  // Column to sort by
  fields: string;  // Comma-separated list of columns to select (e.g., "id,name,email")
}

const queryHandler = async ({ sortBy = "id", fields = "id,name" }: QueryParams, { client, sql }) => {
  // Split the comma-separated string into individual fields
  const fieldList = fields.split(',').map(f => f.trim());

  // Build the query by selecting each column individually
  const query = sql`
    SELECT
      ${sql.join(fieldList.map(field => sql`${CH.column(field)}`), ",")}
    FROM ${UserTable}
    ORDER BY ${CH.column(sortBy)}
  `;

  // MooseClient converts fieldList to valid ClickHouse identifiers
  return client.query.execute(query);

  // EXECUTION: `SELECT id, name FROM users ORDER BY id`
};
DynamicTables.ts
import { ApiHelpers as CH } from "@514labs/moose-lib";

interface QueryParams {
  tableName: string;
}

const queryHandler = async ({ tableName = "users" }: QueryParams, { client, sql }) => {
  const query = sql`
    SELECT * FROM ${CH.table(tableName)}
  `;

  // MooseClient converts tableName to a valid ClickHouse identifier
  return client.query.execute(query);

  // EXECUTION: `SELECT * FROM users`
};

Conditional WHERE Clauses

Build WHERE clauses based on provided parameters using sql.join() and Sql.append():

ConditionalColumns.ts
import { Sql } from "@514labs/moose-lib";
import { UserTable } from "./UserTable";

interface FilterParams {
  minAge?: number;
  status?: "active" | "inactive";
  searchText?: string;
}

const buildQuery = ({ minAge, status, searchText }: FilterParams, { sql }) => {
  const conditions: Sql[] = [];

  if (minAge !== undefined) {
    conditions.push(sql`age >= ${minAge}`);
  }

  if (status) {
    conditions.push(sql`status = ${status}`);
  }

  if (searchText) {
    conditions.push(sql`(name ILIKE ${'%' + searchText + '%'} OR email ILIKE ${'%' + searchText + '%'})`);
  }

  // Build the full query with conditional WHERE clause
  let query = sql`SELECT * FROM ${UserTable}`;

  if (conditions.length > 0) {
    // Use sql.join() to combine conditions with AND
    query = query.append(sql` WHERE ${sql.join(conditions, "AND")}`);
  }

  // Use append() to add ORDER BY clause
  query = query.append(sql` ORDER BY created_at ${sql.raw("DESC")}`);
  return query;
};

Adding Authentication

Moose supports authentication via JSON web tokens (JWTs). When your client makes a request to your Analytics API, Moose will automatically parse the JWT and pass the authenticated payload to your handler function as the jwt object:

Authentication.ts
async (
  { orderBy = "totalRows", limit = 5 },
  { client, sql, jwt }
) => {
  // Use jwt.userId to filter data for the current user
  const query = sql`
    SELECT * FROM userReports
    WHERE user_id = ${jwt.userId}
    LIMIT ${limit}
  `;

  return client.query.execute(query);
}
JWT Error Handling

Moose validates the JWT signature and ensures the JWT is properly formatted. If the JWT authentication fails, Moose will return a 401 Unauthorized error.
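
For context, here is a hedged sketch of wiring a handler like the one above into an Api. The endpoint name, parameter and result types, and the jwt payload fields (e.g. userId) are assumptions that depend on how your tokens are issued:

UserReportsApi.ts
import { Api } from "@514labs/moose-lib";

interface ReportParams {
  orderBy?: string;
  limit?: number;
}

interface ReportRow {
  user_id: string;
  totalRows: number;
}

// Hypothetical endpoint; jwt is only populated when a valid token is supplied,
// so guard against a missing payload if your API also allows unauthenticated calls.
export const userReportsApi = new Api<ReportParams, ReportRow[]>("user_reports",
  async ({ orderBy = "totalRows", limit = 5 }, { client, sql, jwt }) => {
    const query = sql`
      SELECT * FROM userReports
      WHERE user_id = ${jwt.userId}
      LIMIT ${limit}
    `;
    const resultSet = await client.query.execute<ReportRow>(query);
    return await resultSet.json();
  });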

Understanding Response Codes

Moose automatically provides standard HTTP responses:

Status Code | Meaning               | Response Body
200         | Success               | Your API's result data
400         | Validation error      | { "error": "Detailed message" }
401         | Unauthorized          | { "error": "Unauthorized" }
500         | Internal server error | { "error": "Internal server error" }
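
On the client side you can branch on these status codes. A minimal sketch; the URL reuses the assumed local dev address from the earlier request example, and assumes maxResults carries the Minimum<1> tag from the advanced validation example:

ClientErrorHandling.ts
// maxResults=-1 violates the assumed validation rule, so this request
// is expected to come back as a 400 with an error message in the body.
const res = await fetch(
  "http://localhost:4000/api/example_endpoint?filterField=books&maxResults=-1"
);

if (res.ok) {
  const rows = await res.json();
  console.log(rows);
} else if (res.status === 400) {
  const { error } = await res.json();
  console.error("Validation failed:", error);
} else {
  console.error("Request failed with status", res.status);
}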

Post-Processing Query Results

After executing your database query, you can transform the data before returning it to the client. Common post-processing operations include:

  • Transform field names or data formats
  • Calculate derived values
  • Filter or sort results
  • Aggregate or group data
  • Apply business logic

PostProcessingExample.ts
import { Api } from "@514labs/moose-lib";

interface QueryParams {
  category: string;
  maxResults: number;
}

interface ResponseBody {
  itemId: number;
  displayName: string;
  formattedValue: string;
  isHighValue: boolean;
  date: string;
}

const processDataApi = new Api<QueryParams, ResponseBody[]>(
  "process_data_endpoint",
  async ({ category, maxResults = 10 }, { client, sql }) => {
    // 1. Fetch raw data
    const query = sql`
      SELECT id, name, value, timestamp
      FROM data_table
      WHERE category = ${category}
      LIMIT ${maxResults}
    `;

    const resultSet = await client.query.execute<{
      id: number;
      name: string;
      value: number;
      timestamp: string;
    }>(query);
    const rawResults = await resultSet.json();

    // 2. Post-process the results
    return rawResults.map(row => ({
      // Transform field names
      itemId: row.id,
      displayName: row.name.toUpperCase(),

      // Add derived fields
      formattedValue: `$${row.value.toFixed(2)}`,
      isHighValue: row.value > 1000,

      // Format dates
      date: new Date(row.timestamp).toISOString().split('T')[0]
    }));
  });

Best Practices

Post-Processing Best Practices

  • Prefer database processing for large datasets: When working with large amounts of data, perform as much filtering, grouping, and aggregation as possible in your SQL query.
  • Keep response size reasonable: Post-process to reduce response size when needed, especially for user-facing APIs.
  • Format dates and numbers consistently: Ensure consistent formatting for dates, currencies, and other values in your responses.
  • Handle sensitive data appropriately: Use post-processing to remove or mask sensitive information before returning data to clients.
  • Add clear error handling: Include appropriate error handling in your post-processing logic.

MooseTip:

While post-processing gives you flexibility, remember that database operations are typically more efficient for heavy data manipulation. Reserve post-processing for transformations that are difficult to express in SQL or that involve application-specific logic.
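
As a concrete illustration, an aggregation that could be done in post-processing is usually better pushed into the query. A minimal sketch, assuming the same data_table used above and running inside a handler with client and sql in scope:

AggregateInSqlExample.ts
// Instead of fetching every row and summing in TypeScript,
// let ClickHouse group and aggregate, then post-process only for presentation.
const perCategoryQuery = sql`
  SELECT
    category,
    count() AS itemCount,
    sum(value) AS totalValue
  FROM data_table
  GROUP BY category
`;

const resultSet = await client.query.execute<{
  category: string;
  itemCount: number;
  totalValue: number;
}>(perCategoryQuery);

// One small row per category; formatting is the only work left in application code
const summaries = (await resultSet.json()).map(row => ({
  category: row.category,
  formattedTotal: `$${row.totalValue.toFixed(2)}`,
}));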

Client Integration

By default, all API endpoints are automatically integrated with OpenAPI/Swagger documentation. You can integrate your OpenAPI SDK generator of choice to generate client libraries for your APIs.

Please refer to the OpenAPI page for more information on how to integrate your APIs with OpenAPI.
