Getting BI and data quality together

Clean data is vital to an effective BI project, but the case for integrating the two technologies is not as clear-cut.

Two years ago, the LexisNexis Group set about creating a marketing data warehouse to bring together source data from the myriad companies it owns.

The company sends out 12 million pieces of direct mail a year and had been using an outside mail house to run the operation, only to see some customers receive 18 copies of the same piece, or mail arrive at the homes of people long deceased.

"The mail house wasn't interested in cutting down on the mail we sent out, so we took it upon ourselves," said Bill Welch, marketing systems manager.

With data quality technology from Cary, N.C.-based DataFlux Corp., LexisNexis now downloads all its customer information, runs it through DataFlux's schemes and verification process and deduplicates the overlapping records. What emerges is one view of the customer across all its sources. For example, from its Martindale-Hubbell legal databases, LexisNexis retrieves a lawyer's date of birth and the year he or she passed the bar. Combined with accurate mailing addresses, the company can send out targeted mailings. The system also works with new acquisitions, integrating the new database and identifying overlapping customer records.
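That single-customer-view step is, at heart, a fuzzy record-matching problem. The sketch below is not DataFlux's actual process; it is a minimal illustration, using made-up records and only the Python standard library, of how names and addresses from overlapping sources might be normalized, compared and merged into one record.

```python
from difflib import SequenceMatcher

# Hypothetical customer records pulled from two source databases.
records = [
    {"source": "Legal directory",  "name": "Jane A. Smith", "address": "12 Main St, Dayton, OH"},
    {"source": "Direct-mail list", "name": "Jane Smith",    "address": "12 Main Street, Dayton, OH"},
    {"source": "Direct-mail list", "name": "Robert Jones",  "address": "4 Elm Ave, Akron, OH"},
]

def normalize(text):
    """Crude standardization: lowercase, strip punctuation and extra whitespace."""
    return " ".join(text.lower().replace(".", "").replace(",", "").split())

def similar(a, b, threshold=0.8):
    """Fuzzy comparison of two normalized strings."""
    return SequenceMatcher(None, normalize(a), normalize(b)).ratio() >= threshold

def deduplicate(records):
    """Merge records whose name and address look like the same person."""
    merged = []
    for rec in records:
        match = next(
            (m for m in merged
             if similar(rec["name"], m["name"]) and similar(rec["address"], m["address"])),
            None,
        )
        if match:
            # Same person seen before: just note the additional source.
            match.setdefault("sources", [match["source"]]).append(rec["source"])
        else:
            merged.append(dict(rec))
    return merged

for customer in deduplicate(records):
    print(customer)
```

Run as-is, the two "Jane Smith" rows collapse into a single record that lists both sources, which is the effect the mailing operation needs: one customer, one piece of mail.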

At the time LexisNexis was shopping for a data quality application, the Cary, N.C.-based SAS Institute was also shopping for a data quality vendor to complement its business intelligence (BI) products. When LexisNexis bought its data quality tool from DataFlux, SAS bought DataFlux itself. LexisNexis still essentially treats the two as separate companies, but it has seen some benefits from combined training materials and documentation, Welch said.

The need for clean data when running BI is well established, and SAS has been playing up the integration between its two offerings. Yet the products don't necessarily need to come from the same vendor.

"Your data quality issues will certainly rear their ugly head in a BI system," said Keith Gile, principal analyst with Cambridge, Mass.-based Forrester Research Inc. "I don't know that means you have to have it attached. I don't think it has to be physically in the same engine. Nor does it have to be delivered by the same vendor."

Data quality tools do, however, need to be close to the decision engine and cleanse the data before it goes in, Gile said.
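In pipeline terms, Gile's point is simply that the cleansing step runs before anything is loaded into the store the BI tools query, whoever supplies it. A minimal sketch with hypothetical field names:

```python
def extract(source):
    """Pull raw rows from a source system (stand-in for a real extract step)."""
    return source

def cleanse(rows):
    """Standardize and reject obviously bad rows before they reach the BI engine."""
    cleaned = []
    for row in rows:
        if not row.get("customer_id"):               # drop rows missing a key
            continue
        row["state"] = row["state"].strip().upper()  # standardize a code field
        cleaned.append(row)
    return cleaned

def load(rows, warehouse):
    """Only cleansed rows ever land in the warehouse the BI tools query."""
    warehouse.extend(rows)

warehouse = []
raw = [{"customer_id": "A1", "state": " oh"}, {"customer_id": None, "state": "KY"}]
load(cleanse(extract(raw)), warehouse)
print(warehouse)  # [{'customer_id': 'A1', 'state': 'OH'}]
```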

The SAS approach, in which everything from modeling to extract, transform and load (ETL) tools to analysis comes under the umbrella of BI, is not necessarily shared by everyone, Gile said. But for companies looking at a wider data management strategy, data quality is vital.

In fact, it has only been in the past two years or so that data quality has emerged as an important area of concern, due partly to the failure of systems such as CRM and partly to the increased pressure of regulatory compliance. Many organizations have too many BI tools, a situation that has arisen from departmental purchases of BI applications, while they often have too few data quality tools.

"The partnerships and acquisitions are going to fill into that line as well," Gile said. "It's part of that data management issue. If you get everything from Oracle, it makes sense that an Oracle quality tool fits your approach."
