What is Streaming Analytics?

Organizations adopt streaming data feeds so they can react to events in real time, as they happen. This is the opposite of batch processing, where data is collected and analyzed after the fact. If you’re looking to visualize and analyze streaming data, you know there’s immense value in acting on the freshest data. You also know that interpreting real-time events often requires a look back in history for context.

Today, enterprises reap value from high-velocity, real-time data such as device logs, sensor readings, social media feeds, IoT data, and more. Real-time analysis of these streams requires a paradigm shift from batch-oriented architectures. Recent technologies such as Apache Kafka, Spark Streaming, Apache Storm, Apache Apex, Apache NiFi, and Amazon Kinesis have emerged to manage the velocity of big data.
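
To make this concrete, ingesting one of these feeds is often as simple as subscribing to a topic. The sketch below uses the open-source kafka-python client; the broker address, topic name, and payload fields are assumptions made for the example, not details from this article.

    # Minimal sketch: consuming a high-velocity stream from Apache Kafka.
    # Assumes a local broker and a topic named "sensor-readings" (illustrative).
    import json

    from kafka import KafkaConsumer  # pip install kafka-python

    consumer = KafkaConsumer(
        "sensor-readings",
        bootstrap_servers="localhost:9092",
        value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
        auto_offset_reset="latest",  # start from the newest events
    )

    # Each record arrives as soon as the producer publishes it.
    for message in consumer:
        reading = message.value
        print(reading["device_id"], reading["temperature"])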

While much of the discussion on real-time data focuses on machine processing, helping data users see these streams through real-time analytics and visualization is just as important.  

Streaming Analytics for Internet of Things (IoT)

IoT data processing brings several challenges to data analytics platforms. First, there’s simply a lot of it, and the volume of IoT data will balloon in the next decade as billions of IoT devices come online. In fact, industry analysts including Gartner predict 25 billion IoT devices by 2020. Even if that prediction turns out to be wrong by half, that’s still a virtually unimaginable amount of data.

The promise of streaming analytics for IoT is the chance to gather and analyze real-time information about every aspect of a business. Although IoT data arrives in real time, the speed at which it can be ingested and processed is another matter, and that is the challenge facing the architects of streaming analytics platforms.

Zoomdata Streaming Analytics

Zoomdata provides enterprise-grade real-time visualization and streaming analytics built on self-service, interactive, sub-second responses to ad hoc queries of high-velocity data. It’s purpose-built for streaming data, with a scalable architecture that pushes updates from the source through our stream processing engine to end users over a WebSocket connection.
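
To illustrate the push model generically (not Zoomdata’s actual API), the sketch below sends periodic aggregate updates to a connected client over a WebSocket. It assumes the open-source Python websockets package (version 10 or later); the port, metric name, and update interval are made up for the example.

    # Generic sketch: pushing streaming updates to clients over a WebSocket.
    # The metric, port, and payload shape are illustrative assumptions.
    import asyncio
    import json
    import random

    import websockets  # pip install websockets


    async def push_updates(websocket):
        # Send a fresh aggregate to the connected client once per second.
        while True:
            update = {"metric": "events_per_sec", "value": random.randint(100, 500)}
            await websocket.send(json.dumps(update))
            await asyncio.sleep(1)


    async def main():
        async with websockets.serve(push_updates, "localhost", 8765):
            await asyncio.Future()  # serve until cancelled


    if __name__ == "__main__":
        asyncio.run(main())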

Zoomdata features an interface for integrating with any stream processing infrastructure, independent of the exact streaming technology used. More than simply monitoring real-time data feeds, Zoomdata enables streaming analytics based on visual interaction with the stream: users can dynamically filter the stream and change the aggregation level, as well as pause, rewind, replay, and fast-forward it.
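
As a rough sketch of what filtering a stream and changing its aggregation level look like in code (a generic illustration, not Zoomdata’s implementation), the example below filters events by region and rolls them up into tumbling time windows whose size can be changed on the fly. The event fields and window sizes are assumptions.

    # Generic sketch: filter a stream of events and aggregate them into
    # tumbling time windows. The event shape and window sizes are illustrative.
    from collections import defaultdict


    def windowed_counts(events, window_seconds=60, region=None):
        """Group events into fixed (tumbling) windows and count them.

        events:         iterable of dicts with "timestamp" (epoch seconds) and "region"
        window_seconds: aggregation level, e.g. 1, 60, or 3600 for per-second,
                        per-minute, or per-hour rollups
        region:         optional filter applied before aggregation
        """
        counts = defaultdict(int)
        for event in events:
            if region is not None and event["region"] != region:
                continue  # dynamic filter on the stream
            window_start = int(event["timestamp"] // window_seconds) * window_seconds
            counts[window_start] += 1
        return dict(counts)


    # Example: re-aggregate the same events at a coarser level, with a filter.
    sample = [
        {"timestamp": 0, "region": "east"},
        {"timestamp": 45, "region": "west"},
        {"timestamp": 70, "region": "east"},
    ]
    print(windowed_counts(sample, window_seconds=60))                   # per-minute counts
    print(windowed_counts(sample, window_seconds=3600, region="east"))  # hourly, east only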

Zoomdata Data DVR

Data DVR is another reason Zoomdata is well suited to visualizing big data, especially streaming data. Working with streaming data demands a different approach than querying historical data; for example, updates are pushed to the user interface rather than fetched on demand.

Traditionally, organizations have been forced to use different tools and techniques for real-time data visualization versus historical data analysis. The Zoomdata Data DVR unifies these techniques in a single user experience. It brings a video DVR (digital video recorder) model to streaming data, making it accessible to business users through this familiar interface. 

As the name implies, Data DVR works like the DVR you use to watch and record live television. It has similar controls, so rather than simply monitoring real-time data, you can pause, rewind, replay, or fast-forward it. In addition, Data DVR’s time controls function the same way for historical data replayed as a stream. Users can transition seamlessly from real-time monitoring to historical analysis just by expanding the time bar.

The time controls let you choose the time period to display for a data set by using the time slider. Expand the time window from monitoring the live feed to a longer time horizon to analyze the full history. To see changes reflected in your chart or graph, simply scroll forward or backward in time. You can also change the speed at which a visualization progresses: every second, every minute, every hour, or every day.
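
To make the DVR analogy concrete, the sketch below replays a slice of historical events as a stream at an adjustable speed. It is a generic illustration of the idea rather than how Data DVR is implemented, and the record fields and playback speeds are assumptions.

    # Generic sketch: replay historical records as a stream at an adjustable speed.
    # A playback_speed of 60 plays one minute of history per second of wall-clock time.
    import time


    def replay(records, playback_speed=1.0):
        """Yield records in timestamp order, pacing them like a live feed.

        records:        iterable of dicts with a "timestamp" field (epoch seconds)
        playback_speed: 1.0 = real time, 60.0 = one minute of data per second, etc.
        """
        ordered = sorted(records, key=lambda r: r["timestamp"])
        previous = None
        for record in ordered:
            if previous is not None:
                gap = (record["timestamp"] - previous) / playback_speed
                time.sleep(max(gap, 0))
            previous = record["timestamp"]
            yield record


    # Example: fast-forward through an hour of history in a few seconds.
    history = [{"timestamp": t, "value": t % 7} for t in range(0, 3600, 300)]
    for event in replay(history, playback_speed=600):
        print(event)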

Real-Time Data Visualization Demo

What are the Advantages of Streaming Analytics?

The point of any analytics is to gain deeper insight into a situation or problem with an eye towards solving it or producing a better outcome. Because streaming analytics gives visibility into what’s happening right now, it can:

  • Enable decision makers to act sooner
  • Manage key performance indicators (KPIs) daily rather than weekly or monthly
  • Understand the root cause of problems more quickly
  • Detect patterns that emerge across diverse data sets such as sales figures and weather variations

A good example of the power of streaming analytics is a company like GuestDNA, which helps many of the largest retailers transform huge volumes of transaction and sentiment data into a consistent and pleasant experience for their guests, employees, and franchisees. To do this, the company built a data lake on AWS and chose Zoomdata as its visual analytics tool. GuestDNA updates data in real time, 24 hours a day, from thousands of sites handling millions of transactions.

Commonly Asked Questions about Streaming Analytics

What is Internet of Things (IoT) analytics?
Internet of Things (IoT) analytics refers to analyzing the data from devices other than computers that are connected to the Internet and can send and receive data. For example, it could be data from smart appliances in the home or sensor data from an automobile.

What are some streaming analytics use cases?
Streaming data sources and analytics use cases go hand in hand. As data sources increase, so do use cases. Right now, on the consumer side, personal fitness apps that track and analyze real-time heart rate, blood pressure, sleep quality, and so on are very popular. But it's easy to see how streaming data analytics might predict the imminent breakdown of automobile components or manufacturing equipment failures. Likewise, financial services apps may help people and institutions make smarter decisions. Look for a spate of real-time streaming analytics use cases to emerge once the revised Payment Services Directive takes effect in Europe in January 2018.

What is edge analytics?
In edge analytics, data collection and analysis occur automatically at a sensor, network switch, or other device instead of the data being sent to a central data store.

What is sensor data?
Sensor data is simply the output of a device that detects and responds to some type of activity in the physical environment. Sensors can respond to nearly anything: heat, moisture, movement, pressure, velocity, and more.

What separates modern data frameworks from legacy ones?
Without going into the weeds, two things separate modern and legacy data frameworks. The first is the ability to handle huge volumes of data without "breaking." Hadoop is what comes to mind first in most discussions around this topic, but there are many other options; it seems like a new big data processing framework pops up almost every week. The second is the type of data that the framework can manage. Legacy data frameworks are typically relational, and they can only process structured data. Most modern data frameworks can manage structured, unstructured, and semi-structured data. If you have an analytics platform that can handle any data type, you need a data store that can as well.