REST Intro
Get started with Zilla by deploying our Docker Compose stack. Before proceeding, you should have Docker Compose installed.
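If you want to confirm Docker Compose is available before continuing, a quick version check works with either the Compose plugin or the standalone binary (the standalone `docker-compose` command is the one used in the steps below):

```bash
# Compose v2 plugin
docker compose version

# standalone binary
docker-compose --version
```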
CRUD on a Kafka event stream
Running this Zilla sample stands up a simple API for creating and listing items; all of the data is stored on a Kafka topic.
Setup
Create these files, `zilla.yaml` and `docker-compose.yaml`, in the same directory.
zilla.yaml
```yaml
name: REST-example
bindings:
  # Proxy service entrypoint
  north_tcp_server:
    type: tcp
    kind: server
    options:
      host: 0.0.0.0
      port: 7114
    exit: north_http_server
  north_http_server:
    type: http
    kind: server
    routes:
      - when:
          - headers:
              :scheme: http
              :authority: localhost:7114
        exit: north_http_kafka_mapping
  # Proxy REST endpoints to a Kafka topic
  north_http_kafka_mapping:
    type: http-kafka
    kind: proxy
    routes:
      - when:
          - method: POST
            path: /items
        exit: north_kafka_cache_client
        with:
          capability: produce
          topic: items-snapshots
      - when:
          - method: GET
            path: /items
        exit: north_kafka_cache_client
        with:
          capability: fetch
          topic: items-snapshots
          merge:
            content-type: application/json
  # Kafka sync layer
  north_kafka_cache_client:
    type: kafka
    kind: cache_client
    exit: south_kafka_cache_server
  south_kafka_cache_server:
    type: kafka
    kind: cache_server
    options:
      bootstrap:
        - items-snapshots
    exit: south_kafka_client
  # Connect to local Kafka
  south_kafka_client:
    type: kafka
    kind: client
    exit: south_kafka_tcp_client
  south_kafka_tcp_client:
    type: tcp
    kind: client
    options:
      host: ${{env.KAFKA_HOST}}
      port: ${{env.KAFKA_PORT}}
    routes:
      - when:
          - cidr: 0.0.0.0/0
```
docker-compose.yaml
```yaml
version: '3'
services:
  zilla:
    image: ghcr.io/aklivity/zilla:latest
    pull_policy: always
    depends_on:
      - kafka
    ports:
      - 7114:7114
    environment:
      KAFKA_HOST: kafka
      KAFKA_PORT: 29092
    volumes:
      - ./zilla.yaml:/etc/zilla/zilla.yaml
    command: start -v -e
  kafka:
    image: bitnami/kafka:3.2
    hostname: kafka
    ports:
      - 9092:9092
      - 29092:29092
    environment:
      ALLOW_PLAINTEXT_LISTENER: "yes"
      KAFKA_CFG_NODE_ID: "1"
      KAFKA_CFG_BROKER_ID: "1"
      KAFKA_CFG_CONTROLLER_QUORUM_VOTERS: "1@127.0.0.1:9093"
      KAFKA_CFG_LISTENER_SECURITY_PROTOCOL_MAP: "CLIENT:PLAINTEXT,INTERNAL:PLAINTEXT,CONTROLLER:PLAINTEXT"
      KAFKA_CFG_CONTROLLER_LISTENER_NAMES: "CONTROLLER"
      KAFKA_CFG_LOG_DIRS: "/tmp/logs"
      KAFKA_CFG_PROCESS_ROLES: "broker,controller"
      KAFKA_CFG_LISTENERS: "CLIENT://:9092,INTERNAL://:29092,CONTROLLER://:9093"
      KAFKA_CFG_INTER_BROKER_LISTENER_NAME: "INTERNAL"
      KAFKA_CFG_ADVERTISED_LISTENERS: "CLIENT://localhost:9092,INTERNAL://kafka:29092"
      KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE: "true"
  kafka-init:
    image: bitnami/kafka:3.2
    command:
      - "/bin/bash"
      - "-c"
      - |
        /opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server kafka:29092 --create --if-not-exists --topic items-snapshots
    depends_on:
      - kafka
    init: true
```
Run Zilla and Kafka
```bash
docker-compose up -d
```
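Once the containers are up, an optional sanity check is to confirm the services started and that the `items-snapshots` topic exists. The container name, broker address, and script path below come straight from the compose file above:

```bash
# List the project's containers and their status
docker-compose ps

# Confirm the items-snapshots topic was created by the kafka-init service
docker-compose exec kafka \
  /opt/bitnami/kafka/bin/kafka-topics.sh --bootstrap-server kafka:29092 --list
```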
Use `curl` to send a greeting

```bash
curl -X POST http://localhost:7114/items -H 'Content-Type: application/json' -d '{"greeting":"Hello, world"}'
```
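The POST is proxied to Kafka as a produce to the `items-snapshots` topic. If you want to see the record on the topic itself, one option is the console consumer that ships in the Bitnami Kafka image (same script path and broker address as the `kafka-init` service above):

```bash
docker-compose exec kafka \
  /opt/bitnami/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server kafka:29092 \
  --topic items-snapshots \
  --from-beginning \
  --max-messages 1
```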
Use `curl` to list all of the greetings

```bash
curl http://localhost:7114/items
```

```json
[{"greeting":"Hello, world"}]
```
Remove the running containers
```bash
docker-compose down
```
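`docker-compose down` stops and removes the containers but leaves the pulled images on disk. If you also want to reclaim that space, Compose can remove the images it pulled for this stack:

```bash
docker-compose down --rmi all
```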
See more of what Zilla can do
Go deeper into this concept with the http.kafka.crud example.
Going Deeper
Try out more HTTP examples: