Getting Started
TypeStream is a streaming data platform that compiles pipeline definitions into Kafka Streams topologies. You can build pipelines in three ways: an interactive CLI with Unix-style syntax, config-as-code with .typestream.json files, or a visual drag-and-drop GUI.
All three converge on the same execution engine -- pick the one that fits your workflow.
Quick setup
Install the CLI and start the local environment:
brew install typestreamio/tap/typestream
typestream local dev
Demo data generators start automatically with the local environment -- topics like web_visits, crypto_tickers, and wikipedia_changes will appear within a few seconds.
For detailed setup instructions, see the installation page.
Your first pipeline: filter a stream
Let's filter the web_visits stream to find successful requests. Here's the same pipeline built three ways.
CLI (interactive shell)
Start the TypeStream shell:
typestream
Then run:
grep /dev/kafka/local/topics/web_visits [.status_code == 200]
Output:
{"ip_address":"203.0.113.42","url_path":"/products","status_code":200,"http_method":"GET",...}
Press Ctrl+C to stop (streaming pipelines run until cancelled).
To write the filtered results to a new topic:
grep /dev/kafka/local/topics/web_visits [.status_code == 200] > /dev/kafka/local/topics/web_visits_ok
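As a mental model, the grep filter keeps each record whose status_code field equals 200 and drops the rest. The sketch below shows that per-record behavior in plain Python with made-up sample records; it is an illustration of the predicate, not TypeStream's actual implementation.

```python
# Hypothetical sample records shaped like the web_visits demo topic.
records = [
    {"ip_address": "203.0.113.42", "url_path": "/products", "status_code": 200},
    {"ip_address": "198.51.100.7", "url_path": "/missing", "status_code": 404},
    {"ip_address": "192.0.2.10", "url_path": "/checkout", "status_code": 200},
]

# The filter condition [.status_code == 200] applied record by record:
# records that match pass through, everything else is dropped.
ok = [r for r in records if r["status_code"] == 200]
print(ok)
```

In the real pipeline this predicate runs continuously over the stream rather than once over a finite list, and matching records are forwarded to the output topic.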
Config-as-code
Create a file called filter-visits.typestream.json:
{
"name": "filter-visits",
"version": "1",
"description": "Filter web visits to successful requests",
"graph": {
"nodes": [
{
"id": "source-1",
"kafkaSource": {
"topicPath": "/dev/kafka/local/topics/web_visits",
"encoding": "AVRO"
}
},
{
"id": "filter-1",
"filter": {
"expression": ".status_code == 200"
}
},
{
"id": "sink-1",
"kafkaSink": {
"topicName": "web_visits_ok"
}
}
],
"edges": [
{ "fromId": "source-1", "toId": "filter-1" },
{ "fromId": "filter-1", "toId": "sink-1" }
]
}
}
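Before running plan or apply, it can help to sanity-check the graph yourself: every edge's fromId and toId must name a node that exists in the nodes list. The snippet below is a small standalone check (a hypothetical helper, not part of the TypeStream CLI) run against the definition above.

```python
import json

# The pipeline definition from filter-visits.typestream.json, inlined here
# so the check is self-contained.
definition = json.loads("""
{
  "name": "filter-visits",
  "version": "1",
  "graph": {
    "nodes": [
      {"id": "source-1", "kafkaSource": {"topicPath": "/dev/kafka/local/topics/web_visits", "encoding": "AVRO"}},
      {"id": "filter-1", "filter": {"expression": ".status_code == 200"}},
      {"id": "sink-1", "kafkaSink": {"topicName": "web_visits_ok"}}
    ],
    "edges": [
      {"fromId": "source-1", "toId": "filter-1"},
      {"fromId": "filter-1", "toId": "sink-1"}
    ]
  }
}
""")

# Every edge endpoint must reference a declared node id.
node_ids = {node["id"] for node in definition["graph"]["nodes"]}
for edge in definition["graph"]["edges"]:
    assert edge["fromId"] in node_ids, f"unknown node: {edge['fromId']}"
    assert edge["toId"] in node_ids, f"unknown node: {edge['toId']}"

print("graph edges are consistent")
```

If an edge points at a typo'd id, the assertion pinpoints it immediately instead of surfacing as a planning error later.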
Preview what will happen:
typestream plan filter-visits.typestream.json
Apply the pipeline:
typestream apply filter-visits.typestream.json
Visual GUI
Open the TypeStream UI at http://localhost:5173, navigate to the graph builder, and:
- Drag a Kafka Source node onto the canvas and select the web_visits topic
- Drag a Filter node and connect it to the source, then set the expression to .status_code == 200
- Drag a Kafka Sink node and connect it to the filter, then set the output topic
- Click Create Job
Where to go from here
- Three Ways to Build -- understand the CLI, config-as-code, and GUI approaches in depth
- Node Reference -- all 18 node types with config fields and schema behavior
- How-to Guides -- step-by-step recipes for common tasks