Corporate data is remarkably flawed and likely won't get more accurate for quite some time, according to a new report from Gartner Inc.
According to the Stamford, Conn.-based research firm, 25% of critical data within Fortune 1000 companies will continue to be inaccurate through 2007.
"Most organizations have a culture where they view data as a necessary IT evil and not really a business asset the way they think of things like employees and buildings," said Ted Friedman, principal analyst for Gartner. "There's not enough in the way of good data quality controls in place to make sure data is maintained in the right way going forward."
Too often, companies look to technology to solve their data quality problems, Friedman said. Management pushes data issues to IT, and IT answers by installing new technology. This accounts for many of the failures of large CRM and business intelligence (BI) rollouts, Friedman said.
Friedman advocates a process that begins with measuring data quality and determining the gap between a company's current data accuracy and 100% accuracy. Businesses can then determine how the chasm translates into revenue loss and customer churn.
"Most organizations come up with some eye-opening figures," he said. "They find they are losing millions as a result of the issue. Ultimately, this will get management on board in investing people and money in the problem."
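The measurement step Friedman describes can be sketched as a simple audit: score records against basic validity rules, compute the gap between current accuracy and 100%, then translate that gap into a dollar figure. The rules, fields, and revenue numbers below are hypothetical illustrations, not figures from the Gartner report:

```python
# Hypothetical sketch of the measure-the-gap approach: score each record
# against simple validity rules, find the distance from 100% accuracy,
# and translate it into an illustrative revenue-loss estimate.

import re

customers = [
    {"name": "Acme Corp", "email": "ops@acme.example", "phone": "203-555-0100"},
    {"name": "", "email": "not-an-email", "phone": "203-555-0101"},
    {"name": "Globex", "email": "info@globex.example", "phone": ""},
]

def record_is_accurate(rec):
    """A record passes only if every field satisfies its rule."""
    return (
        bool(rec["name"].strip())
        and re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", rec["email"]) is not None
        and bool(rec["phone"].strip())
    )

accurate = sum(record_is_accurate(r) for r in customers)
accuracy = accurate / len(customers)   # current data accuracy
gap = 1.0 - accuracy                   # distance from 100% accuracy

# Illustrative translation of the gap into revenue loss: assume
# (hypothetically) $5M of annual revenue depends on this data set and
# each point of inaccuracy costs proportionally.
ANNUAL_REVENUE_AT_RISK = 5_000_000
estimated_loss = gap * ANNUAL_REVENUE_AT_RISK

print(f"accuracy={accuracy:.0%} gap={gap:.0%} estimated_loss=${estimated_loss:,.0f}")
```

A real audit would use business-defined rules per field (the data steward's job, as described below), but even a crude pass like this produces the kind of "eye-opening figure" Friedman mentions.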
The answer, Friedman said, lies in all legs of the proverbial three-legged stool -- people, processes and technology.
Data management's cutting edge
Forward-thinking companies put the business side of the organization, rather than IT, in charge of data quality, Friedman said. Business managers serve as data stewards, each responsible for a slice of the company's data landscape.
For example, the manager of the call center would be the data steward for the company's customer contact data. While IT staff supports the data stewards, the steward is ultimately responsible for defining good data and putting processes in place to ensure it.
Some organizations even tie compensation to how data is maintained and improved, Friedman said.
"That's how important data quality has become," he said.
When it comes to processes, data quality management needs to be ongoing, Friedman said. Organizations must constantly measure and quantify their data quality.
While Gartner's research focused on Fortune 1000 companies, data quality is just as important in the midmarket, where many functions are outsourced.
"More of their data and processes are outside of their own four walls," Friedman said. "When the data come in, you have to have good data controls at the border of your enterprise and a way to assess the quality and reject what is not good quality."
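The border controls Friedman describes can be sketched as a gatekeeper that checks each incoming record from an outside provider and accepts or rejects it before it reaches internal systems. The field names and rules here are illustrative assumptions:

```python
# Hypothetical sketch of data quality controls "at the border of your
# enterprise": incoming records from an outsourced provider are checked
# and either accepted or rejected before entering internal systems.

def validate_incoming(record):
    """Return a list of problems; an empty list means the record is accepted."""
    problems = []
    if not record.get("customer_id"):
        problems.append("missing customer_id")
    if "@" not in record.get("email", ""):
        problems.append("invalid email")
    if record.get("country", "").upper() not in {"US", "CA", "UK"}:
        problems.append("unknown country code")
    return problems

def ingest(batch):
    """Split an incoming batch into accepted and rejected records."""
    accepted, rejected = [], []
    for rec in batch:
        problems = validate_incoming(rec)
        (accepted if not problems else rejected).append((rec, problems))
    return accepted, rejected

batch = [
    {"customer_id": "C-100", "email": "a@example.com", "country": "US"},
    {"customer_id": "", "email": "b@example.com", "country": "CA"},
    {"customer_id": "C-102", "email": "no-at-sign", "country": "FR"},
]

accepted, rejected = ingest(batch)
print(f"accepted={len(accepted)} rejected={len(rejected)}")  # accepted=1 rejected=2
```

Keeping the rejection reasons alongside each rejected record gives the outsourcing partner concrete feedback, rather than silently dropping bad data.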
Much of the data quality concern today is centered on CRM data because companies have made major investments in capturing information about their customers, Friedman said. Those investments have often shown no return because the large CRM or BI applications have bad data to work with in the first place.
Data quality problems are only going to get worse with the adoption of new information-intensive technologies like radio frequency identification (RFID), Friedman predicted. "We're out there trying to spread the message that companies need to think more holistically than just about CRM," he said.