Podcast: How time series data is revolutionizing data management


Time series data is an important component of making IoT devices like smart cars or medical equipment work properly, because it captures measurements tied to time values.

To learn more about the crucial role time series data plays in today’s connected world, we invited Evan Kaplan, CEO of InfluxData, onto our podcast to talk about this topic.

Here is an edited and abridged version of that conversation:

What is time series data?

It’s actually fairly easy to understand. It’s basically the idea that you’re collecting measurements or instrumentation based on time values. The easiest way to think about it is, say, sensors and sensor analytics, things like that. Sensors could measure pressure, volume, temperature, humidity, light, and it’s usually recorded as a time-based measurement, a timestamp, if you will, every 30 seconds or every minute or every nanosecond. The idea is that you’re instrumenting systems at scale, and so you want to watch how they perform. One, to look for anomalies, but two, to train future AI models and things like that.

And so that instrumentation stuff is done, typically, with a time series foundation. In years gone by it might have been done on a general-purpose database, but increasingly, because of the amount of data that’s coming through and the real-time performance requirements, specialty databases have been built. A specialized database to handle this sort of stuff really changes the game for system architects building these sophisticated real-time systems.
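To make that concrete, here is a minimal Python sketch, not tied to any particular database, of the kind of time-stamped readings a sensor produces. The device name, values, and 30-second interval are purely illustrative:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
import random


@dataclass
class Reading:
    """One time series point: what was measured, where, and when."""
    measurement: str      # e.g. "pressure" or "temperature"
    device: str           # which sensor produced it
    value: float
    timestamp: datetime   # the time value the measurement is keyed on


def simulate(device: str, start: datetime, interval_s: int = 30, count: int = 5):
    """Emit one reading every `interval_s` seconds, the way a sensor would."""
    for i in range(count):
        yield Reading(
            measurement="pressure",
            device=device,
            value=101.3 + random.uniform(-0.5, 0.5),
            timestamp=start + timedelta(seconds=i * interval_s),
        )


if __name__ == "__main__":
    for reading in simulate("pump-1", datetime.now(timezone.utc)):
        print(reading)
```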

So let’s say you have a sensor in a medical device, and it’s just throwing data off, as you said, rapidly. Now, is it collecting all of it, or is it just flagging when an anomaly comes along?

It’s both about data in motion and data at rest. So it’s collecting the data, and there are some applications that we support that are billions of points per second — think hundreds or thousands of sensors reading every 100 milliseconds. And we’re looking at the data as it’s being written, and it’s available for being queried almost instantly. There’s almost zero delay, but it’s a database, so it stores the data, it holds the data, and it’s capable of long-term analytics on the same data.
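As a rough illustration of that write-then-query-almost-immediately pattern, here is a sketch that assumes an InfluxDB 2.x instance and the influxdb-client Python package; the URL, token, org, and bucket names are placeholders, not a prescribed setup:

```python
from datetime import datetime, timezone

from influxdb_client import InfluxDBClient, Point, WritePrecision
from influxdb_client.client.write_api import SYNCHRONOUS

# Connection details are placeholders for a local InfluxDB 2.x instance.
client = InfluxDBClient(url="http://localhost:8086", token="my-token", org="my-org")
write_api = client.write_api(write_options=SYNCHRONOUS)
query_api = client.query_api()

# Write one point as it arrives from a sensor ...
point = (
    Point("pressure")
    .tag("device", "pump-1")
    .field("value", 101.3)
    .time(datetime.now(timezone.utc), WritePrecision.NS)
)
write_api.write(bucket="telemetry", record=point)

# ... and query it back almost immediately with Flux.
tables = query_api.query(
    'from(bucket: "telemetry") |> range(start: -1m) '
    '|> filter(fn: (r) => r._measurement == "pressure")'
)
for table in tables:
    for record in table.records:
        print(record.get_time(), record.get_value())

client.close()
```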

So storage, is that a big issue? If all this data is being thrown off, and if there are no anomalies, you could be collecting hours of data in which nothing has changed?

If you’re getting data — some regulated industries require that you keep this data around for a really long period of time — it’s really important that you’re skillful at compressing it. It’s also really important that you’re capable of delivering it in an object storage format, which is not easy for a performance-based system, right? And it’s also really important that you be able to downsample it. Downsampling means we’re taking measurements every 10 milliseconds, but every 20 minutes we want to summarize that. We want to downsample it to look for the signal that was in that 10- or 20-minute window, and we evict a lot of data and just keep the summary data. So you have to be very good at that kind of stuff. Most databases are not good at eviction or downsampling, so it’s a really specific set of skills that makes these systems highly useful, not just ours, but our competitors’ too.
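Here is a loose sketch of that downsampling step, done with pandas on synthetic readings rather than with any particular database’s retention tooling: summarize each 20-minute window of 10-millisecond readings, keep the summaries, and the raw points can then be evicted.

```python
import numpy as np
import pandas as pd

# Synthetic raw readings: one pressure measurement every 10 milliseconds.
idx = pd.date_range("2024-01-01", periods=120_000, freq="10ms")  # about 20 minutes of data
raw = pd.DataFrame({"pressure": 101.3 + np.random.randn(len(idx)) * 0.2}, index=idx)

# Downsample: keep one summary row per 20-minute window instead of every raw point.
summary = raw["pressure"].resample("20min").agg(["mean", "min", "max", "count"])
print(summary)

# In a real system, raw points older than the retention period would now be
# evicted, and only the summaries (plus recent raw data) would be kept.
```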

We were talking about edge devices, and now artificial intelligence is coming into the picture. So how does time series data augment those systems or benefit from those advances? How can it help move things along even further?

I think it’s pretty darn fundamental. The concept of time series data has been around for a long time. So if you built a system 30 years ago, it’s likely you built it on Oracle or Informix or IBM Db2. The canonical example is financial Wall Street data, where you know how stocks are trading from one minute to the next, one second to the next. So it’s been around for a really long time. But what’s new and different about the space is that we’re sensifying the physical world at an incredibly fast pace. You mentioned medical devices, but smart cities, public transportation, your cars, your home, your industrial factories, everything’s getting sensored — I know that’s not a real word, but it’s easy to understand.

And so sensors speak time series. That’s their lingua franca. They speak pressure, volume, humidity, temperature, whatever you’re measuring over time. And it turns out, if you want to build a smarter system, an intelligent system, it has to start with sophisticated instrumentation. Say I want to have a very good self-driving car: I want a very, very high-resolution picture of what that car is doing and what the environment around the car is doing at all times, so I can train a model with all the potential awareness that a human driver, or better, might have in the future. In order to do that, I have to instrument. I then have to observe, then I have to re-instrument, and then I have to observe again. I run that process of observing, correcting and re-instrumenting over and over again, 4 billion times.

So what are some of the things that we might look forward to in terms of use cases? You mentioned a few of them just now, with cities and cars and things like that. So what other areas do you see this moving into?

So first of all, where we’re really strong is energy, aerospace, financial trading, and network telemetry. Our largest customers are everybody from JPMorgan Chase to AT&T to Salesforce to a variety of others. So it’s a horizontal capability, that instrumentation capability.

I think what’s really important about our space, and becoming increasingly relevant, is the role that time series data plays in AI, and really the importance of understanding how systems behave. Essentially, what you’re trying to do with AI is to say what happened, in order to train your model, and what will happen, in order to get the answers from your model and to get your system to perform better.

And so, “what happened?” is our lingua franca. That’s a fundamental thing we do: getting a very good picture of everything that’s happening around that sensor at that time, collecting high-resolution data, feeding that to training models where people do sophisticated machine learning or robotics training, and then taking action based on that data. So without that instrumentation data, the AI stuff is missing its foundational pieces, particularly the real-world AI. I’m not necessarily talking about the generative LLMs; I’m talking about cars, robots, cities, factories, healthcare, that sort of stuff.
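As a loose illustration of that “what happened, what will happen” framing (a generic sketch, not a description of any specific InfluxData or customer pipeline), high-resolution readings are commonly sliced into history-and-label windows before being fed to a forecasting or anomaly model:

```python
import numpy as np


def make_training_windows(series: np.ndarray, window: int = 100, horizon: int = 1):
    """Slice a 1-D sensor trace into (history, next value) training pairs.

    Each example is `window` consecutive readings ("what happened"); the label
    is the reading `horizon` steps later ("what will happen").
    """
    X, y = [], []
    for start in range(len(series) - window - horizon + 1):
        X.append(series[start : start + window])
        y.append(series[start + window + horizon - 1])
    return np.stack(X), np.array(y)


if __name__ == "__main__":
    # Synthetic trace standing in for high-resolution instrumentation data.
    trace = np.sin(np.linspace(0, 20 * np.pi, 5_000)) + np.random.randn(5_000) * 0.05
    X, y = make_training_windows(trace)
    print(X.shape, y.shape)  # (4900, 100) (4900,)
```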
