Monosodium Glutamate

A collection of tools for working with protobuf messages in datastores, over GRPC, and in Kafka.

Index

  • GRPC-GUI - Send/receive GRPC protobufs, viewing and typing them as JSON
  • kat - Produce/consume to/from Kafka via piped stdin/stdout
  • proto - Encode/decode JSON to/from protobufs using various encodings
  • kgb - Run a GRPC gateway to perform simple Kafka operations
  • qs - Pipe the output of other commands into a rocksdb

Running

The command line tools (everything except GRPC-GUI) are packaged into a Docker image. To run it in interactive mode, with your protobuf schemas accessible to the tools:

docker run --mount type=bind,src=/your/proto/schemas,dst=/home/schemas -it markosindustries/monosodium-glutamate

Building

Each tool's README explains how to build it independently, but if you just want to build the Docker image yourself:

# Clone this repository
git clone https://github.com/markosindustries/monosodium-glutamate
# Go into the repository
cd monosodium-glutamate
# Build the docker image
docker build . -f docker/Dockerfile -t msg

Example Usage

These tools are at their best when combined. Here are some examples of real use cases.

Tail a Kafka topic in realtime as JSON

Consume a topic called my.Topic and deserialise the values as my.MessageType protobuf messages

kat consume my.Topic binary \
  --from latest --until forever |\
  proto transform my.MessageType binary json
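Because proto transform emits one JSON record per line, the stream can be post-processed with ordinary text tools. As a sketch (the record fields below are hypothetical, and the stream is simulated with printf rather than a live topic), filtering for records with a particular field value looks like:

```shell
# Simulate two JSON records on stdin (hypothetical fields), then keep only
# the records whose "level" field is "error" - in real use, the printf would
# be replaced by the kat consume | proto transform pipeline above:
printf '%s\n' '{"level":"info","msg":"ok"}' '{"level":"error","msg":"boom"}' \
  | grep '"level":"error"'
```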

Fill a Kafka topic with valid protobuf messages

Produce protobuf messages of type my.MessageType, serialise them, and send to topic my.Topic with random keys

proto spam my.MessageType binary | \
  kat produce my.Topic binary

Extract a partition time range from Kafka as insert statements

Construct a file of INSERT statements from my.MessageType protobuf messages in topic my.Topic, filtered to records on partition #3

kat consume my.Topic msg.TypedKafkaRecord \
  --from 1636900200000 --until 1636900800000 \
  --schema my.MessageType |\
  proto transform msg.TypedKafkaRecord binary json -f "{\"partition\":3}" \
  -t "INSERT INTO SomeTable(Id,Name) VALUES ('${msg.value.id}','${msg.value.name}')" > /tmp/script.sql
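Before running the generated file against a database, it can be sanity-checked with standard tools. For example, counting how many INSERT statements were produced (a sketch; /tmp/script.sql is the file written by the pipeline above):

```shell
# Count the INSERT statements in the generated script before applying it:
grep -c '^INSERT' /tmp/script.sql
```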

Mirror a topic to another topic/cluster

Use the same keys/values produced to clusterA to populate a topic on clusterB

kat consume my.TopicA -b clusterA msg.KafkaRecord | \
  kat produce my.TopicB -b clusterB msg.KafkaRecord
