# Templates Documentation – TypeScript

## Included Files

1. templates/adsb.mdx
2. templates/brainmoose.mdx
3. templates/github.mdx
4. templates/heartrate.mdx

## Aircraft Transponder (ADS-B) Template

Source: templates/adsb.mdx

Build a real-time aircraft tracking dashboard with Moose, Next.js, and open ADS-B data

# Aircraft Transponder (ADS-B) Template

This template demonstrates how to build a modern full-stack application that ingests and analyzes aircraft transponder data (ADS-B) from military aircraft. It showcases a complete architecture combining a Next.js frontend with a Moose backend for real-time data processing and storage.

Aircraft carry transponders, and a community of enthusiasts around the world records the transponder data and publishes it openly to [ADSB.lol](https://adsb.lol/). Let's grab that data and create something interesting.

View Source Code →

## Getting Started

### Prerequisites

Before getting started, make sure you have the following installed:

* [Docker Desktop](https://www.docker.com/products/docker-desktop/)
* [NodeJS 20+](https://nodejs.org/en/download)
* [Anthropic API Key](https://docs.anthropic.com/en/api/getting-started)
* [Claude Desktop 24+](https://claude.ai/download)
* [Cursor](https://www.cursor.com/)

### Install Moose and Sloan

To get started, install Moose (an open source developer platform for data engineering) and Sloan (an AI data engineering product).

```bash filename="terminal" copy
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose,sloan
```

This installation will ask for your Anthropic API key if you want to use the AI features. If you don't have one, you can follow the [setup guide at Anthropic's website](https://docs.anthropic.com/en/api/getting-started).
### Create a new project using the ADS-B template

```bash filename="terminal" copy
moose init aircraft ads-b-frontend
```

This template creates a project "aircraft" with two subdirectories:

```
project/
├── frontend/               # Next.js frontend application
│   ├── src/
│   └── package.json
│
├── moose/                  # Moose backend
│   ├── app/
│   │   ├── datamodels/     # Data models for aircraft tracking
│   │   ├── functions/      # Processing functions
│   │   └── scripts/        # Workflow scripts
│   ├── moose.config.toml   # Moose configuration
│   └── package.json
```

The frontend is a template NextJS application to be used by the LLM for generating a frontend. The Moose backend ingests the data, then processes and stores it.

### Set up the project

Navigate to the moose subdirectory and install dependencies:

```bash filename="terminal" copy
cd aircraft/moose
npm install
```

### Start the Docker services

Make sure Docker Desktop is running, then start the Moose development server:

```bash filename="terminal" copy
moose dev
```

This will start all necessary local infrastructure, including ClickHouse, Redpanda, Temporal, and the Rust ingest servers. This will also start the ingestion workflow, which retrieves data from the adsb.lol military aircraft tracking API, processes it according to the data model, and ingests it into ClickHouse. You should see a large amount of data being ingested in the Moose CLI.
```
[POST] Data received at ingest API sink for AircraftTrackingData
Received AircraftTrackingData -> AircraftTrackingProcessed - 1 1 message(s)
POST ingest/AircraftTrackingData
[POST] Data received at ingest API sink for AircraftTrackingData
POST ingest/AircraftTrackingData
Received AircraftTrackingData -> AircraftTrackingProcessed - 1 1 message(s)
[POST] Data received at ingest API sink for AircraftTrackingData
Received AircraftTrackingData -> AircraftTrackingProcessed - 1 1 message(s)
POST ingest/AircraftTrackingData
[DB] 136 row(s) successfully written to DB table (AircraftTrackingProcessed)
[POST] Data received at ingest API sink for AircraftTrackingData
```

### Test Moose deployment and data ingestion

To see what infrastructure has been set up, run `moose ls` or `moose ls --json`. If you connect to the locally deployed ClickHouse database, you can query the data directly. Here's a query that returns the unique types of aircraft in the dataset:

```sql
SELECT DISTINCT aircraft_type FROM aircraft_tracking_data;
```

## Explore the Data

Using Sloan's MCP tools, you can explore the ingested aircraft data with AI tools like Claude Desktop or Cursor. We recommend using Claude for exploring your data and ad-hoc analytics, and Cursor for productionizing your results.

### Explore with Claude Desktop

Claude Desktop can help you analyze the data through natural language queries.

#### Initialize the Sloan MCP for your project with the Claude client

```bash filename="terminal" copy
cd /path/to/your/project/moose
sloan setup --mcp claude-desktop
```

#### Explore your data in Claude

Try asking Claude exploratory questions like:

- "Tell me about the data in my ClickHouse tables"
- "Tell me about the flow of data in my Moose project"
- "Create a pie chart of the types of aircraft in the air right now"
- "Create a visualization of aircraft type against altitude"

### Productionize Your Results with Cursor

For a code-forward workflow, Cursor provides a great environment to productionize your queries.
#### Initialize the Sloan MCP for your project with the Cursor client

```bash filename="terminal" copy
sloan setup --mcp cursor-global
```

This will create a `/.cursor/mcp.json` file with Sloan's MCP configuration.

#### Enable the Sloan Cursor MCP

Enable the MCP in Cursor by going to `cursor > settings > cursor settings > MCP` and clicking `enable` and `refresh`.

Try asking Cursor to help you productionize your analysis:

- "Could you create a query that returns all the aircraft types that are currently in the sky?"
- "Could you create an egress API to furnish that data to a frontend?"

## Integration Points

If you created an egress API above, you can create a frontend using the provided NextJS project.

### Install NextJS dependencies

```bash filename="terminal" copy
cd path/to/your/project/frontend
npm i
```

### Run the local development server

```bash filename="terminal" copy
npm run dev
```

Drag the frontend folder to the chat as context, as well as the file containing the generated egress API, and prompt:

- "Using the packages you've been given, generate a frontend showing the data offered by the API."

---

## Live Brainwave Analytics Template

Source: templates/brainmoose.mdx

Build a real-time brainwave monitoring and analytics platform using Moose, EEG devices, and advanced data processing

# Brainwaves Template

## Overview

The Brainwaves template demonstrates how to build a comprehensive brain mapping and movement analytics platform using Moose. It features real-time collection, analysis, and visualization of brainwave data from EEG devices like the Muse Headband, with support for both live device streaming and simulation using datasets.
## Features

- Real-time brainwave data collection and analysis from EEG devices
- Live terminal dashboard with interactive charts and visualizations
- Session-based data logging and tracking
- Movement and relaxation state analysis
- Dual-application architecture (DAS + Brainmoose backend)
- Optional OpenAI-powered insights and analysis
- Support for both live device streaming and CSV simulation

## Architecture

### DAS: Data Acquisition Server

The Data Acquisition Server is a Node.js/TypeScript application that handles:

1. **Real-time Data Ingestion**
   - UDP/OSC data collection from Muse devices or simulators
   - Session-based CSV logging with unique identifiers
2. **Live Analysis & Visualization**
   - Terminal dashboard with real-time charts and tables
   - Brainwave band analysis (Alpha, Beta, Delta, Theta, Gamma)
   - Movement detection using accelerometer and gyroscope data
3. **Data Forwarding**
   - Automatic forwarding to the Moose backend for storage and analytics

### Brainmoose: Analytics & API Backend

The Moose-powered backend provides:

- Modular data ingestion and storage pipeline
- Advanced analytics blocks for movement scoring
- RESTful APIs for querying session insights
- Optional OpenAI GPT-4o integration for enhanced analysis
- ClickHouse-based data warehouse for complex queries

## Getting Started

Prerequisites:

- Node.js 20+
- Moose CLI
- Optional: Muse Headband EEG device

```bash
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose
```

1. Create a new Moose project from the template:

```bash
moose init moose-brainwaves brainwaves
cd moose-brainwaves
```

2. Install dependencies for both applications:

```bash
# Install DAS dependencies
cd apps/das
npm install

# Install Brainmoose dependencies
cd ../brainmoose
npm install
```

3. Start with simulation (no device required):

```bash
# Download sample data
cd ../das
./download.sh

# Start DAS
npm run dev -- --sessionId=MyTestSession
```

4.
In another terminal, start the Moose analytics backend:

```bash
cd {project_root}/apps/brainmoose
moose dev
```

5. In another terminal, run the simulation (no device required):

```bash
cd {project_root}/apps/das
./sim.sh brain_data_coding.csv
```

## API Endpoints

The template exposes a key API endpoint for accessing brainwave analytics:

- `/api/sessionInsights`
  - Parameters: sessions (comma-separated sessionId|sessionLabel pairs)
  - Returns: Movement scores and session analytics for specified sessions

## Data Models

### Brain Data Schema

```typescript
interface BrainData {
  timestamp: DateTime;
  bandOn: boolean;
  acc: {
    x: number;
    y: number;
    z: number;
  };
  gyro: {
    x: number;
    y: number;
    z: number;
  };
  alpha: number;
  beta: number;
  delta: number;
  theta: number;
  gamma: number;
  ppm: {
    channel1: number;
    channel2: number;
    channel3: number;
  };
  sessionId: string;
}
```

## Analytics & Insights

The template includes sophisticated analytics for:

### Movement Analysis

- Accelerometer-based movement scoring
- Gyroscope rotation detection
- Combined movement metrics calculation

### Brainwave Band Analysis

- Alpha wave relaxation indicators
- Beta wave focus measurements
- Delta, Theta, and Gamma band processing
- Real-time state classification

### Session Scoring

Example movement score calculation:

```sql
SELECT
    sessionId,
    SUM(sqrt((arrayElement(acc, 1)^2) + (arrayElement(acc, 2)^2) + (arrayElement(acc, 3)^2))) AS acc_movement_score,
    SUM(sqrt((arrayElement(gyro, 1)^2) + (arrayElement(gyro, 2)^2) + (arrayElement(gyro, 3)^2))) AS gyro_movement_score
FROM Brain_0_0
WHERE sessionId = '1735785243'
GROUP BY sessionId;
```

## Device Support

### Muse Headband Integration

- Direct OSC data streaming support
- Real-time EEG signal processing
- Multi-channel brainwave analysis

### Simulation Support

- CSV data playback for testing
- Sample datasets included
- Configurable replay speeds

## Customization

You can customize the template by:

- Extending brainwave analysis algorithms
- Adding new
visualization components to the terminal UI
- Implementing custom movement detection logic
- Integrating additional EEG devices or data sources
- Enhancing the OpenAI analysis prompts and insights
- Creating custom API endpoints for specific analytics needs

## Educational Resources

Listen to these ~15 minute podcasts generated using NotebookLM:

- [Muse Headband Overview](https://downloads.fiveonefour.com/moose/template-data/brainwaves/podcasts/MuseHeadband.mp3)
- [Research Using Consumer EEG Devices](https://downloads.fiveonefour.com/moose/template-data/brainwaves/podcasts/ResearchUsingConsumerEEGDevices.mp3)

---

## GitHub Trending Topics Template

Source: templates/github.mdx

Build a real-time GitHub trending topics dashboard with Moose and Next.js

# GitHub Trending Topics

This template demonstrates how to build a real-time data pipeline and dashboard for tracking GitHub trending topics. It showcases a complete full-stack architecture that combines a Next.js frontend with a Moose backend for data ingestion, processing, and API generation.

View Source Code →

## Architecture Overview

The template implements a modern full-stack architecture with the following components:

1. **Frontend Layer (Next.js)**
   - Real-time dashboard for trending topics
   - Interactive data visualization
   - Type-safe API integration with Moose Analytics APIs
   - Built with TypeScript and Tailwind CSS
2. **Backend Layer (Moose)**
   - GitHub Events data ingestion
   - Real-time data processing pipeline
   - Type-safe APIs with Moose Analytics APIs
   - ClickHouse for analytics storage
3.
**Infrastructure Layer**
   - ClickHouse for high-performance analytics
   - Redpanda for event streaming
   - Temporal for workflow orchestration
   - Type-safe HTTP API endpoints

```mermaid
graph TD
    A[Next.js Dashboard] --> B[Moose Analytics APIs]
    B --> C[ClickHouse]
    D[GitHub API] --> E[Moose Workflow]
    E --> F[Moose Ingest Pipeline]
    F --> C
```

## Getting Started

### Clone the repository

To get started with this template, run the following command:

```bash filename="terminal" copy
moose init moose-github-dev-trends github-dev-trends
```

### Install dependencies

```bash filename="terminal" copy
cd moose-github-dev-trends/moose-backend && npm install
```

## Project Structure

The template is organized into two main components:

```
moose-github-dev-trends/
├── dashboard/                # Frontend dashboard
│   ├── app/                  # Next.js pages and routes
│   ├── components/           # React components
│   ├── generated-client/     # Auto-generated API client
│   └── lib/                  # Utility functions
│
├── moose-backend/            # Backend services
│   ├── app/
│   │   ├── apis/             # Analytics API definitions
│   │   ├── ingest/           # Data ingestion logic
│   │   ├── blocks/           # Reusable processing blocks
│   │   └── scripts/          # Workflow scripts
│   └── moose.config.toml     # Moose configuration
```

### Run the application

```bash filename="terminal" copy
moose dev
```

### Watch the polling workflow run

When you run `moose dev`, you'll see the workflow run in the Moose logs. This workflow runs a script every 60 seconds to poll GitHub for trending topics. [Learn more about scheduled workflows](/moose/building/workflows#scheduling-workflows).

### (Optional) Configure GitHub API Token

```bash filename="terminal" copy
touch .env
```

```.env filename=".env" copy
GITHUB_TOKEN=
```

Without authentication, you're limited to 60 requests/hour. With a token, this increases to 5,000 requests/hour.
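The rate-limit difference comes down to whether each request carries an `Authorization` header. As a minimal sketch (a hypothetical helper, not the template's actual polling code), a script can attach the token only when `GITHUB_TOKEN` is set:

```typescript
// Hypothetical helper: build GitHub REST API request headers.
// The header names follow GitHub's documented conventions; the token
// handling here is an illustrative assumption, not the template's code.
function githubHeaders(token?: string): Record<string, string> {
  const headers: Record<string, string> = {
    Accept: "application/vnd.github+json",
  };
  if (token && token.length > 0) {
    // Authenticated requests get the 5,000 requests/hour limit.
    headers["Authorization"] = `Bearer ${token}`;
  }
  return headers;
}

// Read the token from the environment, as configured in .env above.
const headers = githubHeaders(process.env.GITHUB_TOKEN);
```

Unauthenticated requests simply omit the header and fall back to the 60 requests/hour limit.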
## Development Workflow

The visualization combines several modern technologies:

- **Moose** for type-safe API generation
- **Tanstack Query** for data management
- **Recharts** for visualization
- **shadcn/ui** for UI components
- **Tailwind CSS** for styling

The template demonstrates a complete integration between Moose and Next.js:

### Moose Backend

The backend is responsible for data ingestion and API generation:

```bash filename="terminal" copy
cd moose-backend && moose dev
```

Key components:

- **GitHub Event Polling**: A workflow in `app/scripts/` fetches public events and posts them to the Moose Ingest Pipeline
- **Ingest Pipeline**: Data model and infrastructure for ingesting GitHub Events in `app/ingest/WatchEvent.ts`
- **Analytics APIs**: Analytics APIs in `app/apis/` expose the data to the frontend

#### Start the Moose Backend

Spin up the Moose dev server:

```bash filename="terminal" copy
moose dev
```

This will automatically start the workflow that polls GitHub for trending topics and ingests the data into ClickHouse.
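To make the backend's role concrete, the kind of aggregation the Analytics APIs perform over ingested events can be sketched as a pure function. The event shape below is a simplified stand-in, and the real aggregation runs as SQL in ClickHouse; see `app/ingest/WatchEvent.ts` and `app/apis/` for the template's actual definitions:

```typescript
// Simplified stand-in for an ingested event (assumption; the template's
// real data model lives in app/ingest/WatchEvent.ts).
interface WatchEventLike {
  repoName: string;
  topics: string[];
}

// Count topic occurrences across events and return the top `limit`
// topics by count, most frequent first -- the same shape of result a
// trending-topics Analytics API would return.
function topTopics(events: WatchEventLike[], limit: number): [string, number][] {
  const counts = new Map<string, number>();
  for (const event of events) {
    for (const topic of event.topics) {
      counts.set(topic, (counts.get(topic) ?? 0) + 1);
    }
  }
  return [...counts.entries()]
    .sort((a, b) => b[1] - a[1])
    .slice(0, limit);
}
```

In the real pipeline this grouping and ordering happens in ClickHouse (`GROUP BY topic ORDER BY count DESC LIMIT n`), which is what keeps the API fast over large event volumes.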
### Frontend Integration

The frontend automatically integrates with the Moose backend through generated APIs:

#### API Client Generation

Moose generates an OpenAPI spec from your Analytics APIs, which is used with the [OpenAPI Generator](https://openapi-generator.tech/):

- Generated client located in `live-dev-trends-dashboard/generated-client`
- Based on the OpenAPI schema from `moose-backend/.moose/openapi.yaml`

When running the following command, make sure:

- You're in the `moose-backend` directory
- The dev server is running: `moose dev`

```bash filename="terminal" copy
cd moose-backend
openapi-generator-cli generate -i .moose/openapi.yaml -g typescript-fetch -o ../dashboard/generated-client
```

Remember to regenerate the client when you:

- Update Analytics API schemas
- Modify Ingest Pipeline definitions
- Change API endpoints or parameters

#### Data Flow & Visualization

The dashboard implements a modern data flow pattern using [Tanstack Query](https://tanstack.com/query/latest/docs/framework/react/react-query-data-fetching) and [Shadcn/UI Charts](https://ui.shadcn.com/docs/components/charts):

```ts filename="lib/moose-client.ts" copy
const apiConfig = new Configuration({
  basePath: "http://localhost:4000",
});
const mooseClient = new DefaultApi(apiConfig);

export type TopicTimeseriesRequest = ApiTopicTimeseriesGetRequest;
export type TopicTimeseriesResponse = TopicTimeseriesGet200ResponseInner;
```

```ts filename="components/trending-topics-chart.tsx" copy
// components/trending-topics-chart.tsx
export function TrendingTopicsChart() {
  const [interval, setInterval] = useState("hour");
  const [limit, setLimit] = useState(10);
  const [exclude, setExclude] = useState("");

  const { data, isLoading, error } = useQuery({
    queryKey: ["topicTimeseries", interval, limit, exclude],
    queryFn: async () => {
      const result = await mooseClient.ApiTopicTimeseriesGet({
        interval,
        limit,
        exclude: exclude || undefined,
      });
      return result;
    },
  });

  // Handle loading and error states
  if (isLoading)
    return; // loading-state UI (component elided in source)
  if (error)
    return; // error-state UI (component elided in source)
}
```

The dashboard creates an animated bar chart of trending topics over time:

```tsx filename="components/trending-topics-chart.tsx" copy
// Transform the current time slice of data for the chart
const chartData = data[currentTimeIndex].topicStats.map((stat) => ({
  eventCount: parseInt(stat.eventCount),
  topic: stat.topic,
  fill: `var(--color-${stat.topic})`,
}));

// Render vertical bar chart with controls
return (
);
```

#### Start the Frontend

```bash filename="terminal" copy
cd dashboard && npm install && npm run dev
```

Visit [http://localhost:3000](http://localhost:3000) to see real-time trending topics.

## Next Steps

Once you have the data flowing, you can:

1. Add custom metrics and visualizations
2. Implement additional GitHub data sources
3. Create new API endpoints for specific analyses
4. Build alerting for trending topic detection

Feel free to modify the data models, processing functions, or create new APIs to suit your specific needs!

## Deployment

Deploying this project involves deploying the Moose backend service and the frontend dashboard separately.

**Prerequisites:**

* A GitHub account and your project code pushed to a GitHub repository.
* A [Boreal](https://boreal.cloud/signup) account for the backend.
* A [Vercel](https://vercel.com/signup) account (or similar platform) for the frontend.

### 1. Push to GitHub

Push your code to a GitHub repository. Configure a remote repository for your Moose project:

```bash filename="terminal" copy
git remote add origin
git push -u origin main
```

### 2. Deploying the Moose Backend (Boreal)

* **Create Boreal Project:**
  * Log in to your Boreal account and create a new project.
  * Connect Boreal to your GitHub account and select the repository containing your project.
  * Configure the project settings, setting the **Project Root** to the `moose-backend` directory.
* **Configure Environment Variables:**
  * In the Boreal project settings, add the `GITHUB_TOKEN` environment variable with your GitHub Personal Access Token as the value.
* **Deploy:** Boreal should automatically build and deploy your Moose service based on your repository configuration. It will also typically start any polling sources (like the GitHub event poller) defined in your `moose.config.toml`.
* **Note API URL:** Once deployed, Boreal will provide a public URL for your Moose backend API. You will need this for the frontend deployment.

### 3.
Deploying the Frontend Dashboard (Vercel)

* **Create Vercel Project:**
  * Log in to your Vercel account and create a new project.
  * Connect Vercel to your GitHub account and select the repository containing your project.
  * Set the **Root Directory** in Vercel to `dashboard` (or wherever your frontend code resides within the repository).
* **Configure Environment Variables:**
  * This is crucial: the frontend needs to know where the deployed backend API is located.
  * Add an environment variable in Vercel that points to your Boreal API URL. The variable name depends on how the frontend code expects it (e.g., `NEXT_PUBLIC_API_URL`). Check the frontend code (`dashboard/`) for the exact variable name.

```
# Example Vercel Environment Variable
NEXT_PUBLIC_API_URL=https://your-boreal-project-url.boreal.cloud
```

* **Deploy:** Vercel will build and deploy your Next.js frontend.

Once both backend and frontend are deployed and configured correctly, your live GitHub Trends Dashboard should be accessible via the Vercel deployment URL.

---

## Live Heart Rate Monitoring Template

Source: templates/heartrate.mdx

Build a real-time health analytics dashboard with Moose, Streamlit, and Python

# Live Heart Rate Leaderboard Template

## Overview

The Live Heart Rate Leaderboard template demonstrates how to build a real-time health monitoring application using Moose. It features a Streamlit-based dashboard that displays live heart rate data, calculates performance metrics, and maintains a competitive leaderboard.

## Features

- Real-time heart rate monitoring dashboard with interactive graphs
- Live leaderboard tracking multiple users
- Heart rate zone visualization
- Performance metrics calculation (power output and calories burned)
- User-specific data tracking and visualization

## Architecture

### Moose Data Pipeline Backend

The template implements a three-stage data processing pipeline:

1.
**Raw Data Ingestion** (`RawAntHRPacket`)
   - Captures raw heart rate data from ANT+ devices
   - Includes basic device and timestamp information
2. **Data Processing** (`ProcessedAntHRPacket`)
   - Transforms raw data into a processed format
   - Adds calculated metrics and validation
3. **Unified Format** (`UnifiedHRPacket`)
   - Standardizes heart rate data for analytics
   - Includes user information and derived metrics

### Streamlit Frontend Dashboard

The Streamlit dashboard (`streamlit_app.py`) provides:

- Real-time heart rate visualization
- Performance metrics display
- Interactive user selection
- Live-updating leaderboard
- Heart rate zone indicators

## Getting Started

Prerequisites:

- Python 3.12+
- Moose CLI

```bash
bash -i <(curl -fsSL https://fiveonefour.com/install.sh) moose
```

1. Create a new Moose project from the template:

```bash
moose init moose-heartrate live-heartrate-leaderboard
cd moose-heartrate
```

2. Create a new virtual environment and install the dependencies:

```bash
python -m venv venv
source venv/bin/activate
pip install -r requirements.txt
```

3. Configure your environment variables in `.env`

4. Start the Moose pipeline:

```bash
moose dev
```

5.
Launch the Streamlit dashboard:

```bash
streamlit run streamlit_app.py
```

## API Endpoints

The template exposes two main API endpoints:

- `/api/getUserLiveHeartRateStats`
  - Parameters: user_name, window_seconds
  - Returns: Recent heart rate statistics for a specific user
- `/api/getLeaderboard`
  - Parameters: time_window_seconds, limit
  - Returns: Ranked list of users based on performance metrics

## Data Models

### UnifiedHRPacket

```python
from moose_lib import Key
from pydantic import BaseModel
from datetime import datetime

class UnifiedHRPacket(BaseModel):
    user_id: Key[int]
    user_name: str
    device_id: int
    hr_timestamp_seconds: float
    hr_value: float
    rr_interval_ms: float
    processed_timestamp: datetime
```

## Performance Calculations

The template includes calculations for:

- Heart rate zones (1-5)
- Estimated power output
- Cumulative calories burned
- Average performance metrics

## Customization

You can customize the template by:

- Modifying heart rate zone thresholds
- Adjusting performance calculation formulas
- Extending the data pipeline with additional metrics
- Customizing the dashboard layout and visualizations
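The zone classification mentioned under Performance Calculations can be sketched as follows. Note that the template's actual implementation is Python; this TypeScript sketch uses common percent-of-max thresholds as an assumption, since the template's thresholds are meant to be customized:

```typescript
// Hedged sketch of heart-rate zone classification (zones 1-5).
// The percent-of-max thresholds below are assumed defaults, not the
// template's actual values -- adjust them to match your configuration.
function heartRateZone(hrValue: number, maxHr: number): number {
  const pct = hrValue / maxHr;
  if (pct < 0.6) return 1; // very light
  if (pct < 0.7) return 2; // light
  if (pct < 0.8) return 3; // moderate
  if (pct < 0.9) return 4; // hard
  return 5; // maximum
}
```

A leaderboard query can then bucket each user's recent `hr_value` samples by zone before ranking, which is exactly the kind of threshold you would tune under Customization above.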