
The Data Growth Dilemma

You spent how much on packet capture?

ExtraHop CIO John Matthews

There are two truths in IT that I always have to wrestle with:

  • Our needs are ever expanding, yet somehow our budgets are shrinking.
  • And I'll have to spend an inordinate amount of money on keeping things running.

I'm sure you all struggle with the same exact equation. There are a lot of very cool projects you want to put into place this fiscal year - maybe a new cloud environment, an IoT implementation, or a Big Data solution to help the business - but more than likely you're sweating the amount of money you're going to spend on something like a monitoring solution.

Yes, the guy who works at an IT analytics company is going to talk to you about the economics of your monitoring solution (didn't see that one coming, did you?). Now, I don't need to tell you how critical it is to monitor everything in your environment, but have you stopped to consider how you're going to budget for growth?

Data Growth Diagram

In Cisco's Global Cloud Index we see that data is growing at a CAGR of 20-25%. Have you given any consideration to what that means for your monitoring? Sure, that's not likely to do much to your log or agent costs, but it is going to dramatically change what you're spending on network monitoring, and worse yet, it could break all your workflows.

Packet Capture Scale Out

So you're spending, what, $260k for a 10Gbps packet capture appliance, right? What happens when you max out the first packet capture box? You go spend another $260k on a second one? This isn't just some fuzzy math in a TCO calculator; this is a likely expense you're going to incur. Unless, of course, you just decide you don't need to see that traffic… I'm sure the security guys would love that answer.
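Just to make that concrete, here's a back-of-the-envelope sketch in Python of how appliance count and spend compound. The $260k price and the 20-25% growth range come from above; the starting traffic and the exact growth rate are assumptions for illustration, not sizing guidance.

    # Illustrative only: appliance scale-out cost under compounding traffic growth.
    # The $260k/10Gbps figure is cited above; the 22% CAGR and 8Gbps starting
    # point are assumptions for the example.
    import math

    APPLIANCE_CAPACITY_GBPS = 10        # one packet capture appliance
    APPLIANCE_COST_USD = 260_000        # figure cited above
    CAGR = 0.22                         # assumed midpoint of the 20-25% range

    traffic_gbps = 8.0                  # assumed traffic monitored today
    for year in range(6):
        boxes = math.ceil(traffic_gbps / APPLIANCE_CAPACITY_GBPS)
        print(f"Year {year}: {traffic_gbps:5.1f} Gbps -> {boxes} appliance(s), "
              f"~${boxes * APPLIANCE_COST_USD:,} in capture hardware")
        traffic_gbps *= 1 + CAGR

In this toy model you're at three boxes by year five - and that's before you've touched storage.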

Have You Considered the Look Back?

It's not just the amount of throughput you have to worry about; it's how long it takes to solve problems in an increasingly complex environment. This forces you to purchase additional marked-up storage or risk the packets falling off forever before you've traced down the problem. It adds up pretty quickly, especially when coupled with scale-out. Oh, you need 1.2PB of data? That will be an extra $1M, please. Yeah... that's not what you want to hear when your lines of business are asking how you'll support their digital transformation efforts. I hate paying vendors more money to do simple stuff.
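If you want to sanity-check that, the lookback math is simple enough to sketch. The ~$833/TB figure below is just what $1M per 1.2PB works out to; the 10Gbps sustained rate is an assumed example, not anyone's quote.

    # Illustrative only: storage needed for N days of packet lookback at a
    # sustained capture rate, priced at the ~$833/TB implied by "$1M for 1.2PB".
    def lookback_storage_tb(rate_gbps: float, days: float) -> float:
        """Raw packet storage (TB) to retain `days` of traffic at `rate_gbps`."""
        bytes_per_sec = rate_gbps / 8 * 1e9
        return bytes_per_sec * 86_400 * days / 1e12

    COST_PER_TB = 1_000_000 / 1_200     # ~$833/TB, implied by the example above

    for days in (1, 3, 7, 14):
        tb = lookback_storage_tb(rate_gbps=10, days=days)
        print(f"{days:2d} days at 10Gbps ≈ {tb:5.0f} TB ≈ ${tb * COST_PER_TB:,.0f}")

Two weeks of full-rate lookback at 10Gbps already lands in seven-figure territory at that price.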

Finally, and Most Importantly, Have You Considered the People Cost?

You probably have your own equation based on the amount of throughput - maybe two network admins per 1Gbps to analyze packet captures. That isn't sustainable as you grow 20% YoY. Honestly, most of these folks would probably rather be doing more rewarding work anyway, and you probably see high turnover and burnout in these roles. So it isn't just a matter of the cost of the infrastructure itself, but how much it weighs down your people in time-consuming, monotonous labor.
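The same compounding applies to headcount. The staffing ratio below is the rule of thumb above; the fully loaded cost per admin and the starting load are purely illustrative assumptions.

    # Illustrative only: headcount under the "2 admins per 1Gbps" rule of thumb,
    # with an assumed $120k fully loaded annual cost per admin and 20% YoY growth.
    import math

    ADMINS_PER_GBPS = 2
    LOADED_COST_PER_ADMIN = 120_000     # assumption, not a benchmark
    GROWTH = 0.20

    load_gbps = 3.0                     # assumed analysis load today
    for year in range(5):
        admins = math.ceil(load_gbps * ADMINS_PER_GBPS)
        print(f"Year {year}: {load_gbps:4.1f} Gbps -> {admins} admins, "
              f"~${admins * LOADED_COST_PER_ADMIN:,}/yr")
        load_gbps *= 1 + GROWTH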

We Believed the Traditional Approach to NPM Was Flawed

Having your vendor tax you for making their product functional, especially when its deficiencies forced you into needing that extra lookback, seems like paying protection money. Oh... you need to sift through a massive packet dump? Download it, come back in an hour when it's finished, and only then begin your search. Oh... you can't look through packets that fast? Well, you're going to need additional lookback then. While ExtraHop will supply the hardware in cases where folks need it, we're just as happy to provide the spec and point you to a vendor who won't be marking it up because of a proprietary, closed system. We actually think you should have more lookback, not less.

Yeah, it's pretty awesome that you get to save budget for something transformative instead of just keeping things running, but cheaper hardware doesn't help with the challenge facing your employees. What can be done for them? ExtraHop initially avoided packet capture because we felt the workflow was broken, but a little over a year ago we were able to link the underlying packets to the metadata we have always collected, making them truly searchable for the first time. We've brought the world of network monitoring into the new age of real-time IT analytics. That means instead of sifting through 500GB packet dumps, you can search based on transaction details and collect just the packets you need.

In Closing

Guys, I know this isn't the most exciting topic in the world (let's be honest here - much of what we do in IT is not exciting). It's not the machine learning juggernaut (although we do that too), or IoT, or cloud, but we're talking about a substantial portion of your budget being earmarked for old and outdated technologies that can't keep up with where you're going. You don't have to throw out your traditional workflows, but modernizing them and tapping into the richness of data currently lying hidden in your network can get you there. It's time to stop looking at the network as plumbing or as the problem; it's actually the solution to managing complex environments at scale.
