Learning and understanding any technology (a.k.a. ramp-up): the more abstract the learning, from design principles down to technical implementation, the more transferable and reusable the knowledge. Or simply put, understand a product or solution to its utmost depth (bordering on the academic), so that the knowledge thus gained is valid across technologies and stands perennial.
I was given a task to understand and compare "Complex Event Processing" engines like
Azure Stream Analytics
I started with reading an excellent article on Event Driven Architecture by Brenda M. Michelson: https://dl.dropboxusercontent.com/u/20315902/EventDrivenArchitectureOverview_ElementalLinks_Feb2011.pdf.
This blog is my attempt to restate Brenda's architecture in my own terms, as I have experienced it.
In an "Event Based Architecture", the EVENT is the central abstraction and the basis of everything that happens after it. EVENT is highly generalized, to include anything that happens, be it due to sensors or smart devices, shop floor machines, business transactions (order or inventory management systems) flowing through services and solutions, or the environment itself. The EVENT concept is all-encompassing: anything that happens in the "Universal Model", all the way back to the BIG BANG, can be viewed as an event. Philosophically, life is a sequence of notable and forgettable events, in sequences or layers.
One implicit assumption with EVENT as the base is that the event occurred in the past and was detected and captured. "EVENT BASED ARCHITECTURE" deals with what to do with an event and how to handle events when scale is necessary.
The question should not be whether events occur; the more apt questions are:
Do we / can we capture all events that occur?
If yes, are they interesting and what can be learnt?
How do we learn from an event, and what action (if required) should be taken?
Events are sensed (sensors), probed (labs), or detected; then analyzed and consumed.
An "Event" is a notable thing that happens, either internal or external to the scoped system. The scoped system is the field of interest; it could be an organization or any other entity.
The term "Event" is generally used to indicate both the "Occurrence" and the "Specification" of a notable thing. The specification captures the what of an event (including the "Event Header" and "Event Body"), while the occurrence captures the when.
The specification of an event needs to be as complete as possible for downstream applications to analyze and act on the event.
Event Header: //More generic in nature; common to all events
    Event ID: <Unique identification number>
    Event Type: <Type of event>
    Event Name: <Name of event>
    Event Timestamp: <When the event was generated>
    Event Origin: <Who/what generated the event>
Event Body: //Specific to the scope of work; standardization across event generators/consumers is necessary
    Other Event Details:
    Event Deviation Details:
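As a concrete sketch, the specification above could be modeled as a simple structure. The field and class names here are my own illustration, not any standard event schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class EventHeader:
    # Generic fields, common to all events
    event_id: str
    event_type: str
    event_name: str
    timestamp: datetime   # when the event was generated
    origin: str           # who/what generated the event

@dataclass
class Event:
    header: EventHeader
    # Body is specific to the scope of work; keys should be
    # standardized across event generators and consumers
    body: dict = field(default_factory=dict)

# A hypothetical shop-floor sensor event
evt = Event(
    header=EventHeader(
        event_id="evt-001",
        event_type="SensorReading",
        event_name="TemperatureExceeded",
        timestamp=datetime.now(timezone.utc),
        origin="shop-floor-sensor-42",
    ),
    body={"temperature_c": 92.5, "threshold_c": 85.0},
)
```

The split mirrors the header/body distinction above: the header stays generic so any consumer can route the event, while the body carries scope-specific detail.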
Event Generators: Events are generated by a source. Source is a generic abstraction that includes anything from a software application, service, sensor, or transmitter to a business process. Not every event may be necessary in all cases; only interesting events (like deviations from a threshold) may be needed by downstream applications, provided thresholds or deviation boundaries are well defined. Each event is evaluated, and from the universal set of all events, a subset of notable events is filtered for further processing by downstream applications. As indicated, evaluation and filtering of events is based on context and the type of work.
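The threshold-based filtering described above can be sketched in a few lines; the threshold value and event fields are made up for illustration:

```python
# Deviation boundary, assumed to be well defined for this scope of work
THRESHOLD_C = 85.0

def is_notable(event: dict) -> bool:
    # An event is "interesting" if the reading deviates beyond the threshold
    return event.get("temperature_c", 0.0) > THRESHOLD_C

# Universal set of all events raised by the source
all_events = [
    {"event_id": "e1", "temperature_c": 72.0},
    {"event_id": "e2", "temperature_c": 92.5},
    {"event_id": "e3", "temperature_c": 85.1},
]

# Only the notable subset is forwarded downstream
notable = [e for e in all_events if is_notable(e)]
```

Here only e2 and e3 pass the filter; e1 is within bounds and never reaches downstream applications.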
Event Channel: The messaging backbone that transports events for processing by downstream entities. The event channel may need to persist state and ensure events are durable and not lost in transit.
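A minimal in-process stand-in for an event channel is just a queue between generator and processor. Real backbones (message brokers and the like) add the durability and persistence noted above; this sketch shows only the transport role:

```python
import queue

# The channel decouples the two sides: the generator only puts events in,
# the processor only takes events out.
channel: "queue.Queue[dict]" = queue.Queue()

# Generator side
channel.put({"event_id": "e1", "event_name": "OrderPlaced"})

# Processor side
event = channel.get()
channel.task_done()
```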
Event Processing: Upon receiving events, events are evaluated against processing rules (analyze phase) and actions are initiated as prescribed (act phase). Evaluation of events and actions are defined as required at the event processing layer and are completely independent of the needs of event generators. For the event processor, how to evaluate an event and what action to take is the primary concern, rather than who or what raised the event. Similarly, event generators are concerned only with raising an event; event processing is a black box to them.
This decoupling works best when event generators provide all necessary data points related to events, so that event processors are not stuck for lack of data.
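The analyze/act split can be sketched as rules that pair a predicate (analyze phase) with an action (act phase). The rules and event shapes here are hypothetical; the point is that the processor never asks who raised the event:

```python
actions_taken = []

# Each rule: (analyze-phase predicate, act-phase action)
rules = [
    (lambda e: e.get("temperature_c", 0) > 85.0,
     lambda e: actions_taken.append(f"ALERT {e['event_id']}")),
    (lambda e: e.get("event_name") == "OrderPlaced",
     lambda e: actions_taken.append(f"INVOICE {e['event_id']}")),
]

def process(event: dict) -> None:
    # The processor evaluates the event against its rules; the
    # generator's identity is irrelevant to this layer.
    for predicate, action in rules:
        if predicate(event):
            action(event)

process({"event_id": "e1", "temperature_c": 92.5})
process({"event_id": "e2", "event_name": "OrderPlaced"})
```

Note that the rules only work because the events carry the needed data points (temperature, event name): exactly the decoupling condition described above.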
Relationship between Event Generators and Event Processors:
Event generators and event consumers are very loosely coupled; the intermediate channel is the only connecting point between them.
Due to this loose coupling:
Event generators or event processors can be scaled out independently of each other.
A lag could be introduced between the time an event was generated and the time it was processed.
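That lag is simply the gap between the event's timestamp and when the processor picks it up, which is one reason the header timestamp matters. A tiny sketch with fabricated times:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical times: the event occurred at noon, the processor
# (perhaps backlogged behind the channel) picked it up 3 seconds later.
generated_at = datetime(2016, 1, 1, 12, 0, 0, tzinfo=timezone.utc)
processed_at = generated_at + timedelta(seconds=3)

# Processing lag, computable only because the event carried its timestamp
lag = processed_at - generated_at
```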
Event processing patterns are of three types:
Simple Event Processing
Stream Event Processing
Complex Event Processing
Frankly speaking, I have not yet got a grasp of Stream Event Processing from the given examples; I am still trying to understand, in my own terms, what the use cases for each pattern are, so that I can explain them in more detail.
Once I get my head around these three patterns of Event Processing, I will put out my thoughts and ideas.