Data quality now outranks security in new governance initiatives

Enterprise Strategy Group (ESG) and systems management provider Quest Software today released their 2022 State of Data Governance and Empowerment Report, an annual study [subscription required] that identifies challenges and innovations in data governance, data management and dataops. 

A key takeaway is that the pursuit of high-quality data has overtaken data security as the most important motivator for data governance initiatives. Forty-one percent of IT leaders agreed that their business decision-making relies fundamentally on trustworthy, high-quality data. At the same time, 45% of those surveyed contend that problems with data quality are the biggest detractor from return on investment in data governance efforts.

The findings are based on ESG’s survey of 220 business and IT professionals responsible for and/or familiar with data governance and empowerment strategies, investments and operations at their organizations. All the organizations represented in the research have at least 1,000 employees and annual revenues of $100 million or more.

Using data to the max

While data management leaders recognize the importance of high-quality data, according to the survey, they are still struggling to improve their data and to strategically leverage and maximize its use in practice.

Security remains a major concern for all data executives. But the tide has shifted toward maintaining high data quality standards first in order to get the right data in place – even before safeguarding it in whatever storage arrangement the enterprise selects.

“We saw the convergence of data-quality initiatives into data governance some time ago,” Heath Thompson, president and GM of Quest’s information systems management (ISM) business, told VentureBeat. “When I looked at this ESG report, it was interesting to see how important data quality has really become as a front-and-center topic.” 

Once the foundation for data governance is laid, Thompson said, there are many operational things an organization can begin to do, including building policy, whether for data security, dataops or something else. “That’s how the industry is starting to do it now.”

Dataops being used in data-quality initiatives

Dataops is a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and data consumers across an organization.
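
To make that concrete, here is a minimal sketch of the kind of automated data-quality check a dataops pipeline might run as data flows from producers to consumers. It is illustrative only: the pandas-based check, the column names and the thresholds are hypothetical assumptions, not drawn from the report.

```python
# A minimal sketch of an automated data-quality gate of the kind a
# dataops pipeline might run between data producers and consumers.
# The column names and thresholds are hypothetical examples.
import pandas as pd

def quality_report(df: pd.DataFrame, max_null_ratio: float = 0.05) -> dict:
    """Flag duplicate rows and columns whose null ratio exceeds a threshold."""
    report = {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "columns_over_null_threshold": [
            col for col in df.columns
            if df[col].isna().mean() > max_null_ratio
        ],
    }
    report["passed"] = (
        report["duplicate_rows"] == 0
        and not report["columns_over_null_threshold"]
    )
    return report

if __name__ == "__main__":
    # Hypothetical customer dataset with a duplicate row and a mostly-empty column.
    df = pd.DataFrame({
        "customer_id": [1, 2, 3, 3],
        "email": ["a@x.com", None, None, None],
    })
    print(quality_report(df))  # flags 1 duplicate row and the 'email' column
```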

While the challenges of data visibility and observability are different across industries, dataops was overwhelmingly recognized in the survey as the primary solution to drive forward data empowerment, Thompson said. Ninety percent of those surveyed agreed that strengthening dataops capabilities improves data quality, visibility and access issues across their businesses. 

The biggest opportunities to improve dataops accuracy and efficiency lie in investing in automated technologies and deploying time-saving tools, such as metadata management. The survey reported that only 37% of respondents describe their dataops processes as automated, and a similarly small proportion report having automated data cataloging and mapping today (36% and 35%, respectively).
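
As an illustration of what automated data cataloging involves, the sketch below scans a database’s tables and records basic metadata in a catalog. It is a hypothetical example against a throwaway SQLite database; production catalogs target data warehouses and capture far richer metadata, lineage and usage statistics.

```python
# A minimal sketch of automated data cataloging: scan a database's
# tables and record basic metadata (name, columns, row count).
# Uses an in-memory SQLite database purely for illustration.
import sqlite3

def build_catalog(conn: sqlite3.Connection) -> list[dict]:
    catalog = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info returns one row per column; index 1 is the name.
        columns = [row[1] for row in conn.execute(f"PRAGMA table_info({table})")]
        rows = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]
        catalog.append({"table": table, "columns": columns, "rows": rows})
    return catalog

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    conn.execute("INSERT INTO orders VALUES (1, 9.99)")
    print(build_catalog(conn))
    # [{'table': 'orders', 'columns': ['id', 'amount'], 'rows': 1}]
```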

“Trustworthy data and efficient data operations have never been more influential in determining the success or failure of business goals,” Quest CEO Patrick Nichols said in a media advisory. “When people lack access to high-quality data and the confidence and guidance to use it properly, it’s virtually impossible for them to reach their desired outcomes.”

The report also revealed that business leaders struggle not only to make sense of their data but to locate and use it in the first place, Thompson said. Forty-two percent of survey respondents said at least half of their data was “dark data” – that is, retained by the organization but unused, unmanageable and unfindable. An influx of dark data and a lack of data visibility often lead to downstream bottlenecks, impeding the accuracy and effectiveness of operational data, Thompson said.

“We’re not talking about the ‘dark web’ here,” Thompson said. “The dark data we’re concerned with is all that data that is collected but never used by an enterprise. Dataops automation is being used to fix this.”
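
As one hypothetical example of how such automation might surface dark data, the sketch below flags cataloged datasets that nothing has read within a cutoff window. The catalog entries and the 180-day cutoff are illustrative assumptions, not Quest’s method.

```python
# A minimal sketch of surfacing dark data: flag cataloged datasets
# with no recorded reads inside a cutoff window. The catalog entries
# and the 180-day default are hypothetical examples.
from datetime import datetime, timedelta

def find_dark_data(catalog: list[dict], max_idle_days: int = 180) -> list[str]:
    cutoff = datetime.now() - timedelta(days=max_idle_days)
    return [
        entry["table"] for entry in catalog
        if entry.get("last_accessed") is None or entry["last_accessed"] < cutoff
    ]

if __name__ == "__main__":
    catalog = [
        {"table": "orders", "last_accessed": datetime.now()},
        {"table": "legacy_exports", "last_accessed": None},  # never read
    ]
    print(find_dark_data(catalog))  # ['legacy_exports']
```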
