Introduction: #
- The source of an event can be an internal or external input. Events can be generated by a user, such as a mouse click or keystroke; by an external source, such as sensor output; or by the system itself, such as a program loading.
- This is a modern way for different applications to communicate. It has two components: publish and subscribe. Examples include Netflix and YouTube.
- An event is any significant occurrence or change in state for system hardware or software. An event is not the same as an event notification, which is a message or notification sent by the system to notify another part of the system that an event has taken place.
The Kafka Adapter is based on event-based communication, which has two main components: publish (producer) and subscribe (consumer). A publisher uploads a message into a queue or topic hosted by a middleware tool such as Kafka or Pulsar. The middleware maintains the messages, follows the publish/subscribe principle, and delivers them to consumers; any application can then subscribe. For example, Netflix plays the role of the publisher, and we play the role of the subscriber (consumer). The Kafka Adapter we will see in this tutorial works the same way.
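To make the publish/subscribe principle above concrete, here is a minimal in-memory sketch in Python. It is a toy stand-in, not the actual Kafka or Pulsar API: the producer publishes a message to a topic once, and the broker fans it out to every subscriber of that topic.

```python
from collections import defaultdict

class MessageBroker:
    """Toy stand-in for middleware such as Kafka or Pulsar: it tracks
    subscribers per topic and delivers each published message to all
    of them."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # topic -> list of callbacks

    def subscribe(self, topic, callback):
        # A consumer registers interest in a topic once.
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        # The producer publishes once; the broker delivers the message
        # to every subscriber of the topic.
        for callback in self._subscribers[topic]:
            callback(message)

# The "Netflix" producer publishes one movie; both subscribers receive it.
broker = MessageBroker()
received_a, received_b = [], []
broker.subscribe("movies", received_a.append)
broker.subscribe("movies", received_b.append)
broker.publish("movies", "new-release.mp4")
```

Note what the sketch leaves out on purpose: real middleware also persists messages, partitions topics, and tracks each consumer's read position, which is why the publisher never has to resend anything.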
Event-based Communication: #
- An event-driven architecture can help organizations achieve a flexible system that can adapt to changes and make decisions in real-time. Real-time situational awareness means that business decisions, whether manual or automated, can be made using all of the available data that reflects the current state of your systems.
- Events are captured as they occur from event sources such as Internet of Things (IoT) devices, applications, and networks, allowing event producers and event consumers to share status and response information in real-time.
- Organizations can add event-driven architecture to their systems and applications to improve the scalability and responsiveness of applications and access to the data and context needed for better business decisions.
- This is a modern way for different applications to communicate. It has two components: publish and subscribe. Examples include Netflix and YouTube.
- We can see the picture given below for reference. SAP is the producer pushing messages, while Kafka or Pulsar acts as middleware here and maintains the messages. All the other applications, such as Twitter, HubSpot, and Salesforce, are consumers (subscribers). So SAP only needs to publish a message once; it does not need to send it to each application individually.
How does event-driven architecture work? #
SKYVVA Integration is a comprehensive set of integration and messaging technologies to connect applications and data across hybrid infrastructures. It is an agile, distributed, containerized, and API-centric solution. It provides service composition and orchestration, application connectivity and data transformation, real-time message streaming, change data capture, and API management—all combined with a cloud-native platform and toolchain to support the full spectrum of modern application development.
- The event-driven architecture is made up of event producers and event consumers. An event producer detects or senses an event and represents the event as a message. It does not know the consumer of the event or the outcome of an event.
- After an event has been detected, it is transmitted from the event producer to the event consumers through event channels, where an event processing platform processes the event asynchronously. Event consumers need to be informed when an event has occurred. They might process the event or may only be impacted by it.
- The event processing platform will execute the correct response to an event and send the activity downstream to the right consumers. This downstream activity is where the outcome of an event is seen.
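The three steps above can be sketched in a few lines of Python. The event types, handler names, and payload fields below are hypothetical; the routing table stands in for the event processing platform, which picks the right consumers for each event while the producer knows nothing about them.

```python
import json

handlers = {}  # event type -> list of consumer functions

def on(event_type):
    """Register a consumer function for one event type."""
    def register(fn):
        handlers.setdefault(event_type, []).append(fn)
        return fn
    return register

audit_log = []

@on("order.created")
def bill_customer(event):
    audit_log.append(f"billed {event['order_id']}")

@on("order.created")
def notify_warehouse(event):
    audit_log.append(f"warehouse told about {event['order_id']}")

def produce(event_type, payload):
    # The producer only represents the event as a message; it does not
    # know who will consume it or what the outcome will be.
    dispatch(json.dumps({"type": event_type, **payload}))

def dispatch(message):
    # The "platform": executes the correct response and sends the
    # activity downstream to the right consumers.
    event = json.loads(message)
    for handler in handlers.get(event["type"], []):
        handler(event)

produce("order.created", {"order_id": "A-1"})
```

Both consumers react to the single published event, and the downstream activity (the audit log entries) is where its outcome becomes visible.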
- Here is one more example for better understanding.
We all use Netflix to watch movies. It uses queuing and streaming tools. Netflix plays the role of the producer, and we play the role of the consumer: Netflix uploads a movie once, and everyone who subscribes to Netflix can watch it.
Subscribers do not need to request the movie again and again. Sending it to each subscriber individually would take far too long; instead, it is uploaded once and all subscribers receive it.
Event-Driven Technology Terminology: #
There are five main event-driven terms:
- Publish
- Subscribe
- Streaming
- Producer
- Consumer
Check the picture below for a description.
Kafka Adapter: #
There are two Kafka Adapters:
- Inbound/Consumer Kafka Adapter:
We use the Inbound Kafka Adapter when somebody pushes a message into a topic (for example, a topic named customer) on the Kafka side and we consume it. An event-driven listener, which is a Camel consumer, receives the message, so we do not need a scheduler and can consume immediately. This is the inbound process because we are receiving data here; the adapter plays the role of the consumer.
- How to use the Agent Kafka Adapter for a consumer
- Kafka consumer adapter details and properties
- Check on the Message Board that the record was inserted successfully
The Agent Kafka Consumer reads data from a topic into Salesforce, which means we send data from a topic in Kafka to Salesforce. With this adapter, we can do inbound processing with CSV, XML, and JSON data records, but we need to make sure the adapter is configured to match the file type.
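The following Python sketch illustrates why the configured file type must match the payload: the same logical record needs a different parser for each format. The field names and the `parse_record` helper are hypothetical illustrations, not the actual SKYVVA agent code.

```python
import csv
import io
import json
import xml.etree.ElementTree as ET

def parse_record(payload: str, file_type: str) -> dict:
    """Parse one consumed message into a flat record. The file_type
    must match the format the producer actually published, mirroring
    the adapter configuration requirement above."""
    if file_type == "JSON":
        return json.loads(payload)
    if file_type == "CSV":
        header, row = csv.reader(io.StringIO(payload))
        return dict(zip(header, row))
    if file_type == "XML":
        return {child.tag: child.text for child in ET.fromstring(payload)}
    raise ValueError(f"unsupported file type: {file_type}")

# The same logical record in each supported format:
as_json = '{"Name": "Acme", "City": "Berlin"}'
as_csv = "Name,City\r\nAcme,Berlin"
as_xml = "<record><Name>Acme</Name><City>Berlin</City></record>"
```

If the adapter were configured for JSON but the topic carried CSV, parsing would fail, which is exactly the mismatch the configuration step prevents.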
- Outbound/Producer Kafka Adapter:
We need the Outbound Kafka Adapter when we want to send data out of Salesforce. Here we need to create a topic first. From Salesforce, we send data messages to our Agent's API endpoint, and a Camel producer then publishes them into the topic (for example, a topic named invoice) on the Kafka side. This is the outbound process because we are sending data out here; the adapter plays the role of the producer.
- How to use the Agent Kafka Adapter for a producer
- Kafka producer adapter information
- Check on the Message Board that the record was inserted successfully
The Agent Kafka Producer writes data from Salesforce to a topic in Kafka, meaning we send outbound data from Salesforce to Kafka. The Agent Adapter for a producer can work with CSV, XML, and JSON data records, but the file type must be configured in the adapter as well. Follow this guide to learn how to use the Agent Kafka adapter for consumers.
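On the producer side the file-type configuration works in the opposite direction: an outbound record must be serialized into the format the topic's consumers expect before it is published. The sketch below is a hypothetical illustration in Python, not the agent's actual implementation; the `serialize_record` helper and the invoice fields are invented for the example.

```python
import csv
import io
import json

def serialize_record(record: dict, file_type: str) -> str:
    """Turn one outbound record into the payload format the adapter is
    configured for; consumers of the topic must expect the same format."""
    if file_type == "JSON":
        return json.dumps(record)
    if file_type == "CSV":
        buf = io.StringIO()
        writer = csv.writer(buf, lineterminator="\n")
        writer.writerow(record.keys())   # header row
        writer.writerow(record.values()) # data row
        return buf.getvalue().rstrip("\n")
    if file_type == "XML":
        fields = "".join(f"<{k}>{v}</{k}>" for k, v in record.items())
        return f"<record>{fields}</record>"
    raise ValueError(f"unsupported file type: {file_type}")

# A hypothetical record leaving Salesforce for the "invoice" topic:
invoice = {"InvoiceId": "INV-001", "Amount": "120.50"}
payload = serialize_record(invoice, "CSV")
```

The resulting string is what would be handed to the Camel producer for publishing; switching the configured file type changes only the serialization step, not the publish step.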