5-Minute Quickstart
Prerequisites
Before you start

TypeScript:
- Node.js 20+ (required for TypeScript development)
- Docker Desktop (for the local development environment; requires at least 2.5GB of memory)
- macOS/Linux (Windows works via WSL2)

Python:
- Python 3.12+ (required for Python development)
- Docker Desktop (for the local development environment; requires at least 2.5GB of memory)
- macOS/Linux (Windows works via WSL2)

Check that your prerequisites are installed by running the following commands:

node --version    # TypeScript projects
python --version  # Python projects
docker ps

Make sure Docker Desktop has at least 2.5GB of memory allocated. To check or change this setting, open Docker Desktop, go to Settings → Resources → Memory, and adjust the slider if needed. Learn more about Docker Desktop settings.
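If you're unsure how much memory Docker currently has, you can also check from the terminal. A minimal sketch, assuming a standard Docker Desktop install (docker info reports the Docker VM's total memory in bytes):

docker info --format '{{.MemTotal}}'
# Prints total memory available to Docker in bytes;
# anything at or above ~2684354560 (2.5GB) is enough for this tutorial.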
Already have ClickHouse running?
Skip the tutorial and add Moose as a layer on top of your existing database
Step 1: Install Moose (30 seconds)
Run the installation script
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose

You should see this message: Moose vX.X.X installed successfully! (where X.X.X is the actual version number)
If you see an error instead, check Troubleshooting below.
Reload your shell configuration
This step is required. Your current terminal doesn’t know about the moose command yet.
source ~/.zshrc

If your shell is bash, run source ~/.bashrc instead.

Verify the moose command works

moose --version

You should see:

moose X.X.X

If the command fails, try these steps in order:
- Re-run the correct source command for your shell
- Close this terminal completely and open a new terminal window
- Run moose --version again
- If it still fails, see Troubleshooting
You should see the moose version number. Do not proceed to Step 2 until moose --version works.
Step 2: Create Your Project (1 minute)
Initialize your project
TypeScript:

moose init my-analytics-app typescript

You should see output like:

✓ Created my-analytics-app
✓ Initialized TypeScript project

Python:

moose init my-analytics-app python

You should see output like:

✓ Created my-analytics-app
✓ Initialized Python project

Navigate to your project directory
cd my-analytics-app

Python: a virtual environment isolates your project’s dependencies. We recommend creating one for your project.

Create a virtual environment (Python, recommended)

python3 -m venv .venv

Activate your virtual environment (Python, recommended)

source .venv/bin/activate

This creates a .venv folder and activates it. Your terminal prompt should now look something like this:

(.venv) username@computer my-analytics-app %

Install dependencies

TypeScript:

npm install

Wait for installation to complete. You should see dependencies installed successfully with no errors.

Python:

pip install -r requirements.txt

Wait for installation to complete. You should see successful installation messages ending with:

Successfully installed [list of packages]

You should see (.venv) in your prompt and dependencies installed with no errors.
Start your development environment
moose dev

Moose is:
- Downloading Docker images for ClickHouse, Redpanda, and Temporal
- Starting containers
- Initializing databases
- Starting the development server
Do not proceed until you see the “Started Webserver” message.
Created docker compose file
⡗ Starting local infrastructure
Successfully started containers
Validated clickhousedb-1 docker container
Validated redpanda-1 docker container
Successfully validated red panda cluster
Validated temporal docker container
Successfully ran local infrastructure
Node Id: my-analytics-app::b15efaca-0c23-42b2-9b0c-642105f9c437
Starting development mode
Watching "/path/to/my-analytics-app/app"
Started Webserver. 👈 WAIT FOR THIS
Next Steps
💻 Run the moose 👉 `ls` 👈 command for a bird's eye view of your application and infrastructure
📥 Send Data to Moose
Your local development server is running at: http://localhost:4000/ingest

Keep this terminal running. This is your Moose development server. You’ll open a new terminal for the next step.
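Optional smoke test: you can send a single record to the ingest endpoint yourself from a second terminal. This is a sketch only; the Foo field names (primaryKey, timestamp, optionalText) are assumed from the model comments later in this guide, so check app/ingest/models.ts (or models.py) if the request is rejected:

curl -X POST "http://localhost:4000/ingest/Foo" \
  -H "Content-Type: application/json" \
  -d '{"primaryKey": "quickstart-1", "timestamp": 1735689600, "optionalText": "hello"}'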
Step 3: Understand Your Project (1 minute)
Your project includes a complete example pipeline:
Python:
- models.py
- transform.py
- main.py

TypeScript:
- models.ts
- transform.ts
- index.ts
Important: While your pipeline objects are defined in the child folders, they must be imported into the root index.ts (TypeScript) or main.py (Python) file for the Moose CLI to discover and use them.
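For example, if you later created a hypothetical new file such as ingest/moreModels.ts, the Moose CLI would not discover it until you re-exported it from the root file:

// index.ts — one extra line registers the (hypothetical) new file
export * from "./ingest/moreModels";

The default root files look like this: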
export * from "./ingest/models"; // Data models & pipelines
export * from "./ingest/transforms"; // Transformation logic
export * from "./apis/bar"; // API endpoints
export * from "./views/barAggregated"; // Materialized views
export * from "./workflows/generator"; // Background workflowsfrom app.ingest.models import * # Data models & pipelines
from app.ingest.transform import * # Transformation logic
from app.apis.bar import * # API endpoints
from app.views.bar_aggregated import * # Materialized views
from app.workflows.generator import * # Background workflowsStep 4: Test Your Pipeline (2 minutes)
Keep your moose dev terminal running. You need a second terminal for the next commands.
Navigate to your project in the new terminal
In your new terminal window (not the one running moose dev):
cd my-analytics-app

Python: if the virtual environment is not automatically activated, activate it:

source .venv/bin/activate

Run the data generator workflow
Your project comes with a pre-built Workflow called generator that acts as a data simulator:
moose workflow run generator

You should see:

Workflow 'generator' triggered successfully

Watch for data processing logs
Switch to your first terminal (where moose dev is running). You should see new logs streaming:
POST ingest/Foo
[POST] Data received at ingest API sink for Foo
Received Foo_0_0 -> Bar_0_0 1 message(s)
[DB] 17 row(s) successfully written to DB table (Bar)

These logs show your pipeline working: Workflow generates data → Ingestion API receives it → Data transforms → Writes to ClickHouse
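To see the shape of that middle step, here is a minimal TypeScript sketch of the Foo → Bar mapping. The field names and the seconds-since-epoch timestamp are assumptions taken from the model comments in this guide; the real logic lives in app/ingest/transform.ts (or transform.py) and may differ:

// Sketch of the transform logic only, not the exact file contents.
interface Foo {
  primaryKey: string;
  timestamp: number;      // seconds since epoch (assumed)
  optionalText?: string;
}

interface Bar {
  primaryKey: string;
  utcTimestamp: Date;
  hasText: boolean;
  textLength: number;
}

function fooToBar(foo: Foo): Bar {
  return {
    primaryKey: foo.primaryKey,
    utcTimestamp: new Date(foo.timestamp * 1000), // convert to a UTC Date
    hasText: foo.optionalText !== undefined,      // true when optionalText is present
    textLength: foo.optionalText?.length ?? 0,    // 0 when there is no text
  };
}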
If you don’t see logs after 30 seconds:
- Verify moose dev is still running in Terminal 1
- Check Terminal 2 for error messages from the workflow command
- Run docker ps to verify containers are running
The workflow runs in the background, powered by Temporal. You can see workflow status at http://localhost:8080.
Query your data
Your application has a pre-built API that reads from your database. The API runs on localhost:4000.
In Terminal 2, call the API with curl:
curl "http://localhost:4000/api/bar"You should see JSON data like:
[
{
"dayOfMonth": 15,
"totalRows": 67,
"rowsWithText": 34,
"maxTextLength": 142,
"totalTextLength": 2847
},
{
"dayOfMonth": 14,
"totalRows": 43,
"rowsWithText": 21,
"maxTextLength": 98,
"totalTextLength": 1923
}
]

You should see JSON data with analytics results. Your complete data pipeline is working!
Try query parameters:
curl "http://localhost:4000/api/bar?limit=5&orderBy=totalRows"- Port 4000: Your Moose application webserver (all APIs are running on this port)
- Port 8080: Temporal UI dashboard (workflow management)
- Port 18123: ClickHouse HTTP interface (direct database access)
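Because these are plain HTTP APIs, any client works, not just curl. A small illustrative consumer in TypeScript (Node 20+ has fetch built in; the endpoint, limit, orderBy, and field names all come from the examples above):

// query-bar.ts — fetch rows from the local Moose API and print a summary
async function main(): Promise<void> {
  const res = await fetch("http://localhost:4000/api/bar?limit=5&orderBy=totalRows");
  if (!res.ok) {
    throw new Error(`Request failed with status ${res.status}`);
  }
  const rows: Array<{ dayOfMonth: number; totalRows: number }> = await res.json();
  for (const row of rows) {
    console.log(`day ${row.dayOfMonth}: ${row.totalRows} rows`);
  }
}

main().catch((err) => {
  console.error(err);
  process.exit(1);
});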
Step 5: Hot Reload Schema Changes (1 minute)
- Open app/ingest/models.ts (TypeScript) or app/ingest/models.py (Python)
- Add a new field to your data model:
TypeScript (app/ingest/models.ts):

/** Analyzed text metrics derived from Foo */
export interface Bar {
  primaryKey: Key<string>;   // From Foo.primaryKey
  utcTimestamp: DateTime;    // From Foo.timestamp
  hasText: boolean;          // From Foo.optionalText?
  textLength: number;        // From Foo.optionalText.length
  newField?: string;         // Add this new optional field
}

Python (app/ingest/models.py):

from datetime import datetime
from enum import IntEnum, auto
from typing import Optional

from pydantic import BaseModel
from moose_lib import Key, StringToEnumMixin

class Baz(StringToEnumMixin, IntEnum):
    QUX = auto()
    QUUX = auto()

class Bar(BaseModel):
    primary_key: Key[str]
    utc_timestamp: datetime
    baz: Baz
    has_text: bool
    text_length: int
    new_field: Optional[str] = None  # New field

- Save the file and watch your terminal
Switch to Terminal 1 (where moose dev is running). You should see Moose automatically update your infrastructure:
TypeScript:

⠋ Processing Infrastructure changes from file watcher
~ Table Bar:
  Column changes:
    + newField: String

Python:

⠋ Processing Infrastructure changes from file watcher
~ Table Bar:
  Column changes:
    + new_field: String

You should see the column change logged. Your API, database schema, and streaming topic all updated automatically!
Try it yourself: Add another field with a different data type and watch the infrastructure update in real-time.
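For instance, a numeric field exercises a different column type. A sketch reusing the Bar interface above (the score field name is hypothetical):

export interface Bar {
  primaryKey: Key<string>;   // From Foo.primaryKey
  utcTimestamp: DateTime;    // From Foo.timestamp
  hasText: boolean;          // From Foo.optionalText?
  textLength: number;        // From Foo.optionalText.length
  newField?: string;         // The field added in this step
  score?: number;            // Hypothetical: a different data type to watch hot reload
}

Save again, and Terminal 1 should log another column change for score.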
Recap
You’ve built a complete analytical backend with: