Add Markers list from ScreenFlow to YouTube Table of Contents
Using Confluent Cloud when there is no Cloud (or internet)
How to install kafkacat on Fedora
Converting from AsciiDoc to Google Docs and MS Word
A quick and dirty way to monitor data arriving on Kafka
I’ve been poking around recently with capturing Wi-Fi packet data and streaming it into Apache Kafka, from where I’m processing and analysing it. Kafka itself is rock-solid, because I’m using ☁️ Confluent Cloud and someone else worries about provisioning it, scaling it, and keeping it running for me. But whilst Kafka works just great, my side of the setup (tshark running on a Raspberry Pi) is less than stable. For whatever reason it sometimes stalls and I have to restart the Raspberry Pi and the capture process.
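The capture side of that pipeline is just tshark writing one line per packet to stdout, piped into kafkacat as a producer. A minimal sketch, assuming a monitor-mode interface wlan0, a topic named wifi_pcap, and Confluent Cloud credentials in environment variables (all of these names are illustrative assumptions, not from the original setup):

    # Stream Wi-Fi frame metadata from tshark into a Kafka topic via kafkacat.
    # wlan0, wifi_pcap, and the CCLOUD_* variables are illustrative assumptions.
    tshark -i wlan0 -I -l \
           -T fields -e wlan.sa -e wlan.bssid -e radiotap.dbm_antsignal \
           -E separator=',' | \
    kafkacat -b "$CCLOUD_BROKER" -t wifi_pcap -P \
             -X security.protocol=SASL_SSL \
             -X sasl.mechanisms=PLAIN \
             -X sasl.username="$CCLOUD_API_KEY" \
             -X sasl.password="$CCLOUD_API_SECRET"

If tshark stalls, the pipe goes quiet and nothing downstream complains, which is exactly why it’s worth keeping an eye on the rate of data arriving on the topic.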
Are Tech Conferences Dead?
Streaming Wi-Fi trace data from Raspberry Pi to Apache Kafka with Confluent Cloud
Kafka Connect JDBC Sink - setting the key field name
I wanted to get some data from a Kafka topic:
    ksql> PRINT PERSON_STATS FROM BEGINNING;
    Key format: KAFKA (STRING)
    Value format: AVRO
    rowtime: 2/25/20 1:12:51 PM UTC, key: robin, value: {"PERSON": "robin", "LOCATION_CHANGES": 1, "UNIQUE_LOCATIONS": 1}

…into Postgres, so I did the easy thing and used Kafka Connect with the JDBC Sink connector.
Adventures in the Cloud, Part 94: ECS
My name’s Robin, and I’m a Developer Advocate. What that means in part is that I build a ton of demos, and Docker Compose is my jam. I love using Docker Compose for the same reasons that many people do:
- Spin up and tear down fully-functioning multi-component environments with ease. No bespoke builds, no cloning of VMs to preserve "that magic state where everything works" (see the sketch after this list).
- Repeatability. It’s the same each time.
- Portability. I can point someone at a docker-compose.yml that I’ve written and they can run the same on their machine with the same results almost guaranteed.
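For a flavour of that spin-up/tear-down cycle, here’s a minimal sketch of a throwaway single-broker Kafka environment (the image tags and service names are illustrative assumptions, not from any particular demo):

    # Write out a minimal two-service environment and bring it up.
    # Image tags and service names here are illustrative assumptions.
    cat > docker-compose.yml <<'EOF'
    version: '3'
    services:
      zookeeper:
        image: confluentinc/cp-zookeeper:5.4.0
        environment:
          ZOOKEEPER_CLIENT_PORT: 2181
      kafka:
        image: confluentinc/cp-kafka:5.4.0
        depends_on:
          - zookeeper
        environment:
          KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
          KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
          KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1
    EOF

    docker-compose up -d     # spin the whole stack up
    docker-compose down -v   # ...and tear it down again, volumes included

The same file checked into a repo gives anyone else the same environment with the same two commands, which is the whole appeal.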