Start From A Real Scenario:
Let's say we are a technology company, and we have several isolated systems:
- Business System (mainly serves the end clients with business logic)
- Search System (provides search services to other systems)
- Data Mining System (mines data, generates reports, and provides cleaned data for other services)
- Notification System (notifies users when something happens)
The workflow will be like this:
Then let's list some requirements:
- the "Business System" needs a Queue to handle some heavy processes: dispatch the request to a Queue, and handle the request in a backend process asynchronously
- If something happens in the "Business System", other systems should be informed.
For example: a user updates his profile.
Then the "Search System" should update the related data as well,
the "Data Mining System" should record this new event,
and the "Notification System" should notify the user of the changes by mail or other means.
- the "Search System" subscribes to all search-related "Events", and acts according to each "Event"
- the "Data Mining System" listens to all "Events" happening among all systems, and records the "Events" to the data warehouse in some kind of understandable format.
- the "Notification System" listens to "Notify Events" and also emits "Events", e.g. when a user's email bounces.
- Easily track the lifetime of an "Event"
- ...
These requirements cover most of the important parts of an EventBus: a system that distributes "Events"/"Messages" among multiple systems in a unified format. Let's satisfy these requirements with EventBus.
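To make the "unified format" idea concrete, an "Event" could be a small JSON envelope carrying an id, a name, and a payload. The field names below are purely illustrative assumptions, not the exact format EventBus mandates:

```python
import json
import time
import uuid

def make_event(name, body):
    """Build an illustrative event envelope (field names are hypothetical)."""
    return {
        "id": str(uuid.uuid4()),       # unique id, useful for tracking the event's lifetime
        "name": name,                  # e.g. "user.profile-updated"
        "body": body,                  # arbitrary business payload
        "timestamp": int(time.time())  # when the event was emitted
    }

event = make_event("user.profile-updated", {"userId": 42, "field": "email"})
print(json.dumps(event))
```

Because every system sends and receives the same envelope shape, a subscriber only needs to inspect the event name to decide how to react.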
Installation
Requirements
- Java 1.8+ installed
- Sbt
- Zookeeper
- Kafka 0.10+ (only when using Kafka Source / Kafka Sink)
- Cassandra (only when using Cassandra Fallback)
Install Zookeeper
Install Kafka
Install EventBus
From Source
> git clone https://github.com/thenetcircle/event-bus.git
Launch EventBus
Setup
After we have installed and started all the dependencies, we can set up EventBus through its configuration (please check the Configuration section).
For example, let's change the zookeeper address in application.conf to:
zookeeper {
  servers = "localhost:2181"
  rootpath = "/testnode"
}
Compile & Run
EventBus includes two main components, Runner and Admin, which are also the two sub-projects in the source code (Runner is inside core).
- Let's stage the project first
> cd ${the_root_path_of_event_bus}
> sbt stage
- Launch Runner
> # use environment variables for some settings; you can also set them inside application.conf directly
> EB_APPNAME=${application_name} EB_DEV=dev EB_RUNNERNAME=default-runner ./target/universal/stage/bin/runner
- Launch Admin
> # change the admin listen port to 8080; the default is 8990
> EB_APPNAME=${application_name} EB_DEV=dev ./target/universal/stage/bin/admin -Dapp.admin.port=8080
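Alternatively, assuming the `-D` flag above maps to the same path inside application.conf, the port could be set in the configuration file instead (a sketch based on that assumption):

```hocon
app {
  admin {
    port = 8080  # overrides the default 8990
  }
}
```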
Now open the URL http://localhost:8080 and you will see the homepage of the admin interface.
Workflow
Workflow of EventBus
Internally, EventBus includes a list of stories. The word "story" is a virtual concept which describes a scenario of transferring data from one point to another.
For more details please check Overview Section
A story includes a Source, a Sink, and possibly a couple of Transforms and a Fallback.
The internal structure of a story looks like this:
Data/Events come in from the left side and eventually reach the right side; that's the end of the story.
We can have several different stories running in parallel.
For example: one story listens on an HTTP port and delivers data to Kafka, while another one listens on Kafka topics and delivers data to an HTTP endpoint.
There can be several different Source/Sink/Transform/Fallback implementations (for now only HTTP Source/Sink, Kafka Source/Sink, and Cassandra Fallback are implemented). In the future there could be a Redis Source, a JMS Sink, etc.
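The story structure described above can be sketched as a simple pipeline: the source produces events, each transform may reshape them, the sink consumes them, and the fallback catches delivery failures. This is an illustrative model only, not EventBus's actual internals:

```python
def run_story(source, transforms, sink, fallback):
    """Minimal story model: source -> transforms -> sink, with a fallback on failure."""
    for event in source:
        for transform in transforms:
            event = transform(event)
        try:
            sink(event)
        except Exception as error:
            fallback(event, error)  # e.g. persist the event somewhere for a later retry

delivered, fallen_back = [], []

def sink(event):
    # Stand-in for an HTTP or Kafka sink; rejects empty events.
    if not event:
        raise ValueError("empty event")
    delivered.append(event)

run_story(
    source=["profile-updated", "", "email-bounced"],    # stand-in for an HTTP/Kafka source
    transforms=[str.upper],                             # a single illustrative transform
    sink=sink,
    fallback=lambda event, error: fallen_back.append(event),
)
print(delivered)    # ['PROFILE-UPDATED', 'EMAIL-BOUNCED']
print(fallen_back)  # ['']
```

The key property of the pipeline is that a failed delivery never stops the story: the bad event is handed to the fallback and the next event keeps flowing.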
Workflow of Our Current Scenario:
Back to our scenario: what does the workflow look like?
How do the different systems work together through EventBus?
- The Business System sends Events to EventBus by HTTP request
- EventBus stores the request in Kafka
- Several EventBus stories subscribe to Kafka topics and send the data to specific systems by HTTP requests
Create Stories
Let's open the Admin Interface (http://localhost:8080)
Click the "New Story" button in the navigation bar.
As we mentioned before, we need a couple of stories to satisfy the workflow.
The first story listens on an HTTP port and transfers data to Kafka. It should look like this:
After the story is created, we also need to assign it to a Runner (which runs stories). For more details, please check the Overview section.
Okay, now the first story is created and running. We also need to create a couple of other stories that subscribe to specific Kafka topics and send data to specific systems by HTTP.
They should look like this (don't forget to assign them to a Runner):
Now the configuration of EventBus is done. An HTTP request sent to the port the first story listens on will be sent directly to Kafka.
The other stories will then fetch the request from Kafka and send it on to the different systems.
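Put together, the fan-out can be sketched with an in-memory publish/subscribe stand-in for Kafka: the HTTP story publishes to a topic, and each per-system story receives its own copy of the event (much like separate Kafka consumer groups). The topic name below is a hypothetical example:

```python
from collections import defaultdict

subscribers = defaultdict(list)  # topic -> list of delivery callbacks

def subscribe(topic, deliver):
    """Register a story that consumes a topic and delivers events by HTTP (simulated)."""
    subscribers[topic].append(deliver)

def publish(topic, event):
    """The HTTP-to-Kafka story: each subscribing story gets its own copy of the event."""
    for deliver in subscribers[topic]:
        deliver(dict(event))

search_inbox, mining_inbox = [], []
subscribe("event-user", search_inbox.append)  # Search System story
subscribe("event-user", mining_inbox.append)  # Data Mining System story
publish("event-user", {"name": "user.profile-updated", "userId": 42})
print(len(search_inbox), len(mining_inbox))   # 1 1 -- both systems received the event
```

This mirrors the scenario from the start of the document: one profile update enters EventBus once and reaches every interested system without the Business System knowing about any of them.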
Fallback
Tracking
Monitoring
Grafana
We use Grafana to present metrics for monitoring the health of EventBus.
Sentry
And we use Sentry for error tracking.