How to Write Great Kafka Connectors

Available On-Demand

Apache Kafka, with its simple but essential offerings, has left deep footprints in the software industry. As the Kafka ecosystem has grown and matured, Kafka Connect lets us focus on data transformation rather than on Kafka's nitty-gritty details, which must be handled by hand when using Kafka's Producer/Consumer APIs directly. Kafka Connect, in contrast, gives developers and operators a simple way to access, transform, and deliver data, connecting an organization's applications to its event streaming platform through connectors.
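To illustrate how little plumbing a connector requires compared to the raw Producer/Consumer APIs, here is a minimal standalone configuration for the FileStreamSource example connector that ships with Apache Kafka. The file and topic names are placeholders, not part of the talk:

```properties
# Connector instance name (must be unique within the Connect cluster)
name=local-file-source
# FileStreamSource ships with Apache Kafka as a simple example connector
connector.class=FileStreamSource
# Number of tasks used to parallelize the work
tasks.max=1
# Source file to tail, and the Kafka topic to write its lines to
file=test.txt
topic=connect-test
```

With a worker started via connect-standalone, this configuration streams each line of the file into the topic with no custom producer code at all.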

Confluent's partner HashedIn Technologies has created many Kafka connectors. In this online talk, HashedIn shares their best practices on how to write great ones.

HashedIn will cover:

  • Field-tested best practices for writing great connectors, covering both sink connectors and source connectors that transform and move data in and out of diverse external systems such as AWS SQS, AWS S3, Firebase, Hadoop File System, InfluxDB, JDBC, Prometheus, Salesforce, and Windows event logs.
  • How to unlock the full potential of Confluent Platform and move petabytes of data each day while honoring the strongest connector guarantees: exactly-once or at-least-once message delivery, retry mechanisms, restart/rebalance behavior, and ordering guarantees.
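Some of the delivery and retry behavior described above is surfaced through Kafka Connect's built-in error-handling configuration rather than connector code. As a sketch (the dead letter queue topic name and timeout values below are illustrative, not from the talk), a sink connector deployment might specify:

```properties
# Keep retrying a failed operation for up to 10 minutes before giving up
errors.retry.timeout=600000
# Cap the backoff between successive retries at 30 seconds
errors.retry.delay.max.ms=30000
# Tolerate unprocessable records instead of failing the task...
errors.tolerance=all
# ...routing them to a dead letter queue topic for later inspection
errors.deadletterqueue.topic.name=dlq-my-sink
errors.deadletterqueue.context.headers.enable=true
```

Pushing these concerns into configuration keeps connector code focused on the data movement itself, which is part of what makes a connector "great" in practice.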

Speakers

Fahad Sheikh, Tech Lead, HashedIn Technologies