Turn on CNBC and watch the stock ticker go by for a few minutes. (I’ll wait.)
Are you done? Good! What did you learn? It’s overwhelming, isn’t it?
We live in an incredible time with vast amounts of data. But that’s a double-edged sword. We’ve been falsely led to believe that more data is always better. Most of us have yet to develop the internal filters to determine which data is relevant and which isn’t.
Put a Wall Street trader in front of CNBC. They won’t have any problems understanding it because they innately understand what data matters and what doesn’t. The same is true in IT.
Our networks generate a vast (and diverse!) amount of data. Those who aren’t adept at processing that kind of data quickly become overwhelmed. What’s important? What’s not?
Our team of operators looks at this data 24/7/365. More importantly? They see this kind of data across multiple types of organizations. They innately separate the relevant from the irrelevant.
But that’s not the totality of our secret sauce.
We also leverage multiple tools to help us filter even further. To dig deeper and pinpoint the truly important data.
Your engineers? They aren’t as well-versed. Their internal filters aren’t as finely tuned, and they don’t have access to the same tools we do. This isn’t a judgment of your engineering team. In fact, you don’t want them to become capable operators, because then they’d lose the thing that makes them great, innovative engineers.
You want your engineers to retain part of their cowboy mentality; you want them to explore white space and define the edges of what’s possible. Operators don’t do that.
The Problem is Part of the Solution
Think about your doctor. They have a vast amount of data about you: height, weight, gender, marital status, smoker vs. non-smoker, blood pressure, any medications you’re on, etc. But let’s say you go in with a sprained ankle. How much of all that information is relevant to your sprained ankle? The problem you present your doctor with is part of the solution. It’s how the doctor builds the decision models that determine which data is relevant and which isn’t, and those models plot your course of treatment.
The doctor’s job is to understand the problem and filter through the relevant data to determine the underlying cause. Operators do the same thing. They start with the problem. That enables them to begin filtering out the irrelevant and zero in on the relevant.
Time Is at a Premium
I don’t think a single person would dispute that time is the great commodity of our day. Everyone I know wishes they had more of it. The secret is to create more time, and we do that by understanding what’s important and what isn’t. How many of us hand wash our laundry? I’d suspect very few, if any. It seems almost ridiculous to use such an archaic, time-consuming method when we have technology to perform that task for us. Technology that frees up our time!
Access to all of this data is great. Sifting through it using traditional methods? That’s a bit like hand-washing laundry. Sure, it’ll get the job done, but at a significant time cost. We have to leverage technology and new skill sets to filter the data correctly; still getting the job done, but saving ourselves time.
The velocity of change appears to be accelerating. The amount of data is predicted to explode over the next several decades, and the variety of data coming in is becoming increasingly exotic. We have to leverage all available resources to make sense of that data and put it to use.