Why is everyone talking about streaming analytics these days? Real-time data analytics has become one of the most valuable assets a business can have. Organizations across all sectors now recognize that real-time insights are worth more than insights delayed by hours or days: critical decisions are made on fresh data, and stale data quickly loses its value.
This video explains common use cases for real-time data analytics and how Amazon Kinesis fits into a data analytics architecture.
Real-time analytics workloads can be categorized by their latency characteristics. First, there is millisecond-level latency, typified by a messaging layer used for asynchronous communication between microservices. Second, there is seconds-level latency, typically based on log ingestion. And finally, nearly every workload these days uses an online transaction processing (OLTP) database, whose data is exported to a data lake and then used for analytics.
The data processing for each category can differ depending on how a business responds to customer demand, inventory levels, and so on.
There are many tools and applications that enable real-time analytics. The process starts with the source, from which the data is received. The data is then ingested using a tool such as Amazon Kinesis. Once ingested, the data lands in a stream storage layer. From there, many consumers can read and process the streaming data in real time. The last component is the destination: the processed data can be stored in a database or in a data lake.
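To make the producer/stream-storage/consumer pattern concrete, here is a toy in-memory sketch in Python. It is not real Kinesis code (the `Stream` class, shard count, and event fields are all illustrative), but it captures two properties the architecture relies on: records with the same partition key land on the same shard in order, and each consumer reads independently with its own offset.

```python
from collections import deque

# Toy in-memory model of a stream storage layer (illustrative only,
# not the Kinesis API): producers append records to shards chosen by
# a partition key; consumers read shards without removing records.
class Stream:
    def __init__(self, shard_count=2):
        self.shards = [deque() for _ in range(shard_count)]

    def put_record(self, partition_key, data):
        # Records sharing a partition key map to the same shard,
        # so per-key ordering is preserved.
        shard_id = hash(partition_key) % len(self.shards)
        self.shards[shard_id].append(data)

    def read_shard(self, shard_id, offset=0):
        # Consumers track their own offsets, so many consumers can
        # process the same stream independently and in real time.
        return list(self.shards[shard_id])[offset:]

stream = Stream()
stream.put_record("user-1", {"event": "click"})
stream.put_record("user-1", {"event": "purchase"})
```

In the real service, the producer role is played by an ingestion client writing to Kinesis, and the consumer role by applications that read the stream and deliver results to the destination database or data lake.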