What is stream processing, and how does it work?
Stream processing is a data processing technology used to collect, store, and manage continuous streams of data as they are produced or received. Also known as event streaming or complex event processing (CEP), stream processing has grown exponentially in recent years because of its ability to simplify data architectures, provide real-time insights and analytics, and react to time-sensitive data, such as data from IoT devices, multiplayer video games, or location-based applications, as it happens.
Today, stream processing is often the backend process behind everything from billing, fulfillment, and fraud detection to Netflix recommendations and ride-share apps like Lyft: workloads that need to be decoupled from a frontend where users expect instant results at the click of a button. Apache Kafka® has become a de facto standard for ingesting event-based data and is considered the central nervous system for data in many organizations.
In this three-part online talk series, we will cover everything you need to know about stream processing, including:
Available On-Demand
The event-driven model provides many benefits: it decouples services from one another, adds a degree of pluggability to the architecture, and enables services to evolve independently.
Such systems typically use Apache Kafka as their foundation. Kafka acts as a central data plane that holds shared events and keeps services in sync. Its distributed cluster technology provides the availability, resiliency, and performance properties that strengthen the architecture, leaving developers to simply write and deploy client applications that run load-balanced and highly available.
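To make that last point concrete, here is a minimal sketch of such a client: a plain Kafka consumer that joins a consumer group. The topic name "orders", the group id "billing-service", and the broker address are illustrative assumptions rather than details from the talks; the point is that every instance started with the same group id shares the topic's partitions, and Kafka rebalances the work automatically if an instance fails.

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class BillingServiceConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Hypothetical broker address, group id, and topic, for illustration only.
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "billing-service");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            // Every instance started with the same group.id shares the topic's
            // partitions; Kafka rebalances them when instances come or go.
            consumer.subscribe(Collections.singletonList("orders"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("partition=%d offset=%d value=%s%n",
                            record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```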
This session will cover the use of Apache Kafka as a platform for streaming data and how stream processing can make your data systems more flexible and less complex.
Register now to learn:
Available On-Demand
An event streaming platform would not live up to its name if data could not be processed the moment it arrives. The Streams API in Apache Kafka is a powerful, lightweight library that enables on-the-fly processing. With it you can aggregate, define windowing parameters, join data within a stream, and much more. And that's not all: it runs as a plain Java application on top of Kafka, so your workflow remains unaffected and there are no additional clusters to worry about.
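As a rough sketch of what such a Streams application can look like (the topic names "pageviews" and "pageview-counts", the one-minute window, and the broker address are assumptions for illustration, and TimeWindows.ofSizeWithNoGrace assumes a reasonably recent Kafka Streams release), the following Java snippet counts events per key in tumbling one-minute windows and writes the results back to Kafka:

```java
import java.time.Duration;
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;
import org.apache.kafka.streams.kstream.TimeWindows;

public class PageviewCountApp {
    public static void main(String[] args) {
        Properties props = new Properties();
        // The application id doubles as the consumer group id; broker address is assumed.
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "pageview-count-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> pageviews = builder.stream("pageviews");

        pageviews
                .groupByKey()
                // Tumbling one-minute windows: an example of the windowing mentioned above.
                .windowedBy(TimeWindows.ofSizeWithNoGrace(Duration.ofMinutes(1)))
                .count()
                .toStream()
                // Flatten the windowed key into a plain string for the output topic.
                .map((windowedKey, count) -> KeyValue.pair(
                        windowedKey.key() + "@" + windowedKey.window().start(), count))
                .to("pageview-counts", Produced.with(Serdes.String(), Serdes.Long()));

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```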
Register now to learn:
Available On-Demand
You’ve got streams of data that you want to process and store? Events from which you’d like to derive state or build aggregates? And you want to do all of this in a scalable, fault-tolerant way? Good thing Kafka and ksqlDB exist!

Do you have experience building traditional applications on top of relational databases? ksqlDB makes developing event streaming applications feel just as simple and familiar. As an event streaming database, ksqlDB simplifies the underlying architecture so that powerful real-time systems can be built with just a few SQL statements.
This talk will cover the concepts and capabilities of ksqlDB. We’ll show how to apply transformations to a stream of events as it flows from one Kafka topic to another, and we’ll discuss using ksqlDB connectors to bring in data from other systems and using that data to join and enrich streams.
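To keep the examples in a single language, here is a hedged sketch of that kind of pipeline submitted through the ksqlDB Java client rather than the ksqlDB CLI; the stream names, column definitions, and the server address localhost:8088 are illustrative assumptions rather than material from the talk, and the connector and join steps mentioned above are omitted for brevity.

```java
import io.confluent.ksql.api.client.Client;
import io.confluent.ksql.api.client.ClientOptions;

public class KsqlDbPipeline {
    public static void main(String[] args) throws Exception {
        // Assumed ksqlDB server location; adjust for your environment.
        ClientOptions options = ClientOptions.create()
                .setHost("localhost")
                .setPort(8088);
        Client client = Client.create(options);

        // Declare a stream over an existing Kafka topic (illustrative schema).
        client.executeStatement(
                "CREATE STREAM orders (id VARCHAR KEY, customer_id VARCHAR, amount DOUBLE) "
                + "WITH (KAFKA_TOPIC='orders', VALUE_FORMAT='JSON');").get();

        // A persistent query: transform events from one topic into another, filtering as it goes.
        client.executeStatement(
                "CREATE STREAM large_orders AS "
                + "SELECT id, customer_id, amount FROM orders WHERE amount > 100 EMIT CHANGES;").get();

        client.close();
    }
}
```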
Register now to learn: