sse-kafka

Defines a binding with sse-kafka support, with proxy behavior.

The proxy kind sse-kafka binding adapts sse data streams into kafka data streams, so that kafka messages can be delivered to sse clients.

Filtering can be performed by kafka message key, message headers, or a combination of both message key and headers, extracting the parameter values from the inbound sse path.
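For example, a proxy kind sse-kafka binding can route an sse path to a kafka topic and filter messages by a key taken from a path parameter. The sketch below uses illustrative names (sse_kafka_proxy, kafka_cache_client, the /tasks/{key} path, and the tasks topic); only the overall shape follows the binding's configuration:

    bindings:
      sse_kafka_proxy:                # illustrative binding name
        type: sse-kafka
        kind: proxy
        routes:
          - when:
              - path: /tasks/{key}    # embedded path parameter
            exit: kafka_cache_client  # illustrative exit binding
            with:
              topic: tasks            # kafka topic backing the sse stream
              filters:
                - key: ${params.key}  # deliver only messages whose key matches the path parameter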
Progress across kafka topic partitions is conveyed to the sse client via event id, and when the stream is implicitly paused during sse client reconnect, the last-event-id header in the sse reconnect request contains the last received event id value, allowing the sse stream to resume reliable message delivery automatically.
The event id can be configured to include the message key and etag of each message, avoiding the need to duplicate the key in the message body, and making it suitable for integration with the http-kafka binding's use of etag for conditional if-match operations.
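For example, a route's event id format can be selected in its with section; the combined format below includes both the base64-encoded message key and the etag (the tasks topic is illustrative):

    with:
      topic: tasks
      event:
        id: '["${base64(key)}","${etag}"]'  # event id carries the message key and etag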
When a kafka tombstone (null value) message is received by the sse-kafka binding, it delivers a delete event to the sse client. This informs the client which specific message has been deleted by observing the message key from the sse delete event id.
The properties below configure how the binding adapts sse data streams to kafka data streams.

type*
    const "sse-kafka"
    Adapts sse data streams to kafka data streams.

kind*
    enum [ "proxy" ]
    Behave as an sse-kafka proxy.

exit
    string
    Default exit binding when adapting sse data streams to kafka data streams.

routes[].guarded
    object as named map of string array
    Roles required by each named guard to authorize the route.

routes[].exit*
    string
    Routed exit binding when the conditional route matches, adapting sse data streams to kafka data streams.

routes[].when[].path*
    string
    Path with optional embedded parameter names, such as /{topic}.

routes[].with
    object
    Kafka parameters for the matched route when adapting sse data streams to kafka data streams.

routes[].with.filters[].key
    string
    Message key to filter by, optionally referencing a path parameter such as ${params.key}.

routes[].with.filters[].headers
    object
    Message headers to filter by, with values optionally referencing path parameters such as ${params.headerX}.

routes[].with.event.id
    enum [ "${etag}", "[\"${base64(key)}\",\"${etag}\"]" ]
    Format of the id field in the sse event. Defaults to "${etag}".
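Putting the properties together, a guarded, header-filtered route might be sketched as follows; the guard name jwt, the read:tasks role, the category header, and the other names are illustrative assumptions rather than values required by the binding:

    bindings:
      sse_kafka_proxy:
        type: sse-kafka
        kind: proxy
        routes:
          - guarded:
              jwt: [ "read:tasks" ]              # roles required by the named guard
            when:
              - path: /tasks/{category}
            exit: kafka_cache_client
            with:
              topic: tasks
              filters:
                - headers:
                    category: ${params.category} # match messages whose header equals the path parameter
              event:
                id: '${etag}'                    # default event id format, shown explicitly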