5-Minute Quickstart
Prerequisites
Before you start

- Node.js 20+: required for TypeScript development
- Docker Desktop: runs the local development environment (requires at least 2.5GB of memory)
- macOS/Linux: Windows works via WSL2

Check that your prerequisites are installed by running the following commands:

node --version
docker ps

Docker Desktop Memory Requirements
Make sure Docker Desktop has at least 2.5GB of memory allocated. To check or change this setting, open Docker Desktop, go to Settings → Resources → Memory, and adjust the slider if needed.
Step 1: Install Moose (30 seconds)
Run the installation script
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose

You should see this message: Moose vX.X.X installed successfully! (X.X.X is the actual version number)
If you see an error instead, check Troubleshooting below.
Reload your shell configuration
This step is required. Your current terminal doesn't know about the moose command yet.
source ~/.zshrc

(If your shell is bash rather than zsh, run source ~/.bashrc instead.)

Verify the moose command works
moose --version

You should see:
moose X.X.X

If you see 'command not found'
Try these steps in order:
- Re-run the correct source command for your shell
- Close this terminal completely and open a new terminal window
- Run moose --version again
- If it still fails, see Troubleshooting
Checkpoint
You should see the moose version number. Do not proceed to Step 2 until moose --version works.
Step 2: Create Your Project (1 minute)
Initialize your project
moose init my-analytics-app typescript

You should see output like:
✓ Created my-analytics-app
✓ Initialized TypeScript project

Navigate to your project directory
cd my-analytics-app

Install dependencies
npm install

Wait for installation to complete.
Checkpoint
Dependencies installed successfully with no errors.
Start your development environment
moose dev

For AI assistants, or when you need to run other commands in the same terminal:
nohup moose dev &

This may take a minute the first time.
Moose is:
- Downloading Docker images for ClickHouse, Redpanda, and Temporal
- Starting containers
- Initializing databases
- Starting the development server
Do not proceed until you see the "Started Webserver" message.
Created docker compose file
⡗ Starting local infrastructure
Successfully started containers
Validated clickhousedb-1 docker container
Validated redpanda-1 docker container
Successfully validated red panda cluster
Validated temporal docker container
Successfully ran local infrastructure
Node Id: my-analytics-app::b15efaca-0c23-42b2-9b0c-642105f9c437
Starting development mode
Watching "/path/to/my-analytics-app/app"
Started Webserver. 👈 WAIT FOR THIS

Next Steps
💻 Run the moose 👉 `ls` 👈 command for a bird's eye view of your application and infrastructure
📥 Send Data to Moose
Your local development server is running at: http://localhost:4000/ingest

Checkpoint
Keep this terminal running. This is your Moose development server. You'll open a new terminal for the next step.
Step 3: Understand Your Project (1 minute)
Your project includes a complete example pipeline:
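The layout below is inferred from the exports in the root index.ts shown in this step; the file names are assumed to match the export paths:

app/
├── index.ts             (root file that re-exports everything below)
├── ingest/
│   ├── models.ts        (data models & pipelines)
│   └── transforms.ts    (transformation logic)
├── apis/
│   └── bar.ts           (API endpoints)
├── views/
│   └── barAggregated.ts (materialized views)
└── workflows/
    └── generator.ts     (background workflows)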
Important: While your pipeline objects are defined in the child folders, they must be imported into the root index.ts (TypeScript) or main.py (Python) file for the Moose CLI to discover and use them.
export * from "./ingest/models"; // Data models & pipelines
export * from "./ingest/transforms"; // Transformation logic
export * from "./apis/bar"; // API endpoints
export * from "./views/barAggregated"; // Materialized views
export * from "./workflows/generator"; // Background workflows
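For example, if you later create a new API file (the path ./apis/myNewApi here is hypothetical), re-export it from index.ts the same way so the Moose CLI can discover it:

export * from "./apis/myNewApi"; // hypothetical new file; without this line Moose won't pick it up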
Step 4: Test Your Pipeline (2 minutes)

Important: Open a second terminal window
Keep your moose dev terminal running. You need a second terminal for the next commands.
Navigate to your project in the new terminal
In your new terminal window (not the one running moose dev):
cd my-analytics-app

Run the data generator workflow
Your project comes with a pre-built Workflow called generator that acts as a data simulator:
moose workflow run generator

You should see:
Workflow 'generator' triggered successfully

View Temporal Dashboard
The workflow runs in the background, powered by Temporal. You can see workflow status at http://localhost:8080.
Watch for data processing logs
Switch to your first terminal (where moose dev is running). You should see new logs streaming:
POST ingest/Foo
[POST] Data received at ingest API sink for Foo
Received Foo_0_0 -> Bar_0_0 1 message(s)
[DB] 17 row(s) successfully written to DB table (Bar)

Data is flowing!
These logs show your pipeline working: the workflow generates data → the ingest API receives it → the transform converts Foo records to Bar records → rows are written to ClickHouse.
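The generator workflow isn't the only way to feed the pipeline: you can also POST records to the ingest endpoint yourself. The sketch below is a minimal TypeScript example; the Foo field names (primaryKey, timestamp, optionalText) are assumptions based on the comments in the Bar model shown in Step 5:

// sendFoo.ts: post a single record to the local ingest API (Node 20+, built-in fetch)
async function sendFoo(): Promise<void> {
  // Assumed payload shape; compare with app/ingest/models.ts in your project
  const record = {
    primaryKey: "demo-1",                // assumed, per Bar's "From Foo.primaryKey"
    timestamp: new Date().toISOString(), // assumed name and format, per "From Foo.timestamp"
    optionalText: "hello moose",         // assumed, per "From Foo.optionalText"
  };

  const res = await fetch("http://localhost:4000/ingest/Foo", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(record),
  });

  console.log(res.status, await res.text());
}

sendFoo().catch(console.error);

If the payload matches your model, the same [POST] and [DB] log lines should appear in Terminal 1.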
Query your data
Your application has a pre-built API that reads from your database. The API runs on localhost:4000.
In Terminal 2, call the API with curl:
curl "http://localhost:4000/api/bar"You should see JSON data like:
[ { "dayOfMonth": 15, "totalRows": 67, "rowsWithText": 34, "maxTextLength": 142, "totalTextLength": 2847 }, { "dayOfMonth": 14, "totalRows": 43, "rowsWithText": 21, "maxTextLength": 98, "totalTextLength": 1923 }]Checkpoint
Checkpoint

You should see JSON data with analytics results. Your complete data pipeline is working!
Try query parameters:
curl "http://localhost:4000/api/bar?limit=5&orderBy=totalRows"Port Reference
Port Reference

- Port 4000: Your Moose application webserver (all APIs are running on this port)
- Port 8080: Temporal UI dashboard (workflow management)
- Port 18123: ClickHouse HTTP interface (direct database access)
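To confirm the ClickHouse HTTP interface on port 18123 is reachable, you can hit ClickHouse's standard /ping endpoint (a ClickHouse feature, not a Moose API):

// clickhousePing.ts: liveness check against ClickHouse's HTTP interface
fetch("http://localhost:18123/ping")
  .then((r) => r.text())
  .then((body) => console.log(body.trim())) // prints "Ok." when ClickHouse is up
  .catch(console.error);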
Step 5: Hot Reload Schema Changes (1 minute)
- Open app/ingest/models.ts
- Add a new field to your data model:

/** Analyzed text metrics derived from Foo */
export interface Bar {
  primaryKey: Key<string>; // From Foo.primaryKey
  utcTimestamp: DateTime; // From Foo.timestamp
  hasText: boolean; // From Foo.optionalText?
  textLength: number; // From Foo.optionalText.length
  newField?: string; // Add this new optional field
}

- Save the file and watch your terminal (Terminal 1, where moose dev is running). You should see:
⠋ Processing Infrastructure changes from file watcher
+ Table Bar: Column changes:
  + newField: String

Checkpoint
You should see the column change logged. Your API, database schema, and streaming topic all updated automatically!
Try it yourself: Add another field with a different data type and watch the infrastructure update in real-time.
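For instance, adding a numeric field (the name score is arbitrary) should log another column change in Terminal 1:

/** Analyzed text metrics derived from Foo */
export interface Bar {
  primaryKey: Key<string>; // From Foo.primaryKey
  utcTimestamp: DateTime; // From Foo.timestamp
  hasText: boolean; // From Foo.optionalText?
  textLength: number; // From Foo.optionalText.length
  newField?: string; // added in the step above
  score?: number; // new: a number should map to a numeric ClickHouse column type
}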
Recap
You've built a complete analytical backend. What's working:

- Type-safe ingestion pipeline with API and stream processing
- ClickHouse database with dynamic schema management
- Analytics API with filtering
- Hot-reload development
Need Help?
Still stuck? Join our Slack community or open an issue.