
It’s Time to Stop Settling for Bad Data

While tech teams enable the data collection and automation that drive the rest of the business, their own data practices have lagged. Let’s change that.

Perspective

Across industries, data is driving unprecedented productivity, performance and competition.

Hospitals use admissions data and predictive analytics to forecast staffing needs, providing better patient care and reducing overtime costs. In education, administrators use data to identify at-risk students, providing early intervention and reducing dropout rates. In global commerce, factories and supply chains use data to predict maintenance downtime and increase efficiency.

The people at the heart of this digital transformation sit in the IT department, building the software that collects, analyzes and protects great data for every team that needs it. 

Great data for every team, that is, except IT.

For years, IT and security have grown accustomed to working with bad data. That sounds odd, but the reason is simple. As environments have grown more heterogeneous, collecting telemetry has grown more complex. Most of the tools built to analyze IT health, performance and security provide visibility only into a narrow slice of the environment (say, a single cloud provider) or for a targeted use (like threat detection).

In fact, as recently as 2019, some 55% of organizations surveyed by Forrester Consulting for Tanium said they used more than 20 tools across IT and security for endpoint management. In addition, 60% of the respondents said their tools came from five or more vendors, and 38% said they came from 10 or more vendors.

This sprawling toolkit is a mess, requiring IT to manually cobble together multiple data sets just to do daily work. The result, at best, is data that is 85% complete. It’s often outdated, inappropriate for the task at hand, and duplicative or error-ridden. In many environments, the data is so bad that IT teams can’t say with any confidence what assets they have, where those assets are, who is using them, or what vulnerabilities they carry.

Working alongside our customers, which include the largest government agencies and enterprises around the world, we’ve seen many variations of this data issue. An IT organization may have great security data but weak operations data — or great operations data and weak security data. It might possess keen insight into what’s happening at its own data centers, but be blind to what’s going on in the public cloud.

That lack of visibility and situational awareness has to change. But to do that, it’s helpful to understand where things stand today for IT teams.

The challenge of remote work

If anything, the shift to remote work last year made this issue even more acute. In the early weeks of the COVID-19 pandemic, IT teams scrambled to enable the remote work environment we now know so well. Overnight, they scaled VPNs, bought and rolled out videoconferencing, shipped laptops by the thousands, and reconfigured entire workforces to take decision-making out of the conference room and place it at the endpoint.

But that overnight pivot “put an unprecedented strain on cybersecurity professionals to move and secure remote environments,” according to 2020’s annual Cybersecurity Workforce Study by (ISC)2, an association for cybersecurity professionals. Among the 3,790 security practitioners it surveyed, (ISC)2 found an 18% increase in security incidents after the shift to remote work.

[Read also: How IT decision-makers will lead in 2021]

For IT teams, moving to an all-remote environment broke their tools and made the data even worse. Devices and systems they could once see went dark as those devices began connecting remotely. SaaS applications proliferated as online collaboration tools became mission critical.

The lack of visibility not only hobbled normal processes, but also created a sprawling attack surface. Every organization had to harden its defenses as hackers sought to exploit the pandemic through phishing emails, ransomware, and other scams meant to disrupt vital business and government functions. Data breaches and computer outages are always disasters. But when your workforce relies on a single device for every job function, the magnitude of impact is much worse.

The data-driven future

For IT, data is about understanding what assets you have, what information is being accessed and by whom, what devices are connecting, and which connections are failing. But data also powers automation and enables action-taking. Automation with incomplete or inaccurate data is a recipe for disaster.

Imagine a self-driving car operating on incomplete telemetry (or data). You would never be willing to get into a car that had only 85% of the right inputs. By that logic, why would you trust the very backbone of modern business — your IT infrastructure — to run on 85% data?

The answer is that, really, you can’t. And you shouldn’t.

In a world of distributed devices, IT automation will be more critical than ever. Powering IT service management for remote employees, quickly quarantining infected devices, managing configuration changes, distributing software, patching devices, alerting users to potentially malicious activity, killing rogue processes to prevent breaches — all of this needs to happen in a matter of seconds, not weeks, and without requiring human intervention every single time.


When we look back on 2020, there will be many lessons learned: what worked, what didn’t, what we could have prepared for, and what was impossible to anticipate. For IT teams, one of the most important lessons was that they could do things they thought would take months or years — like equipping and enabling entire remote workforces — in a matter of weeks or even days. That lesson should be an empowering one, but also a reminder: with the right data — with accurate data — anything is possible.

[Read also: What Lumentum learned about security in the WFH age]

If 2020 was about surviving, 2021 is an opportunity to thrive. For IT and security teams, that means actively rethinking what they believed could not be done — including doing away with tools and processes that don’t give them the data they need to succeed.

In 2021, it’s time for IT and security to find solutions that serve more than a single function. These solutions must deliver visibility across hybrid environments, including on-premises infrastructure and private and public cloud services. They must also provide integration that will enable the sharing and cross-functional analysis of data. Only through this approach will organizations get to the level of accuracy they need to truly support the modern distributed enterprise.

Orion Hindawi

Orion Hindawi is co-founder and executive chairman of Tanium.
