Solving data integrity issues with Master Data Management

The enterprise needs tools to make data cleaner and more consumable.

Continuing this week's interest in enterprise topics, I spoke with Carl Lehmann, Senior Vice President of Strategy at Advanced Data Exchange, about what's going on in ERP (enterprise resource planning) and MDM (master data management).

Companies have invested millions with ERP vendors to bind distinct software modules into business process flows that span financial, customer relationship, and supply chain functions. Even when SOA principles of loose coupling and services come into play, serious data integrity issues can arise.

Inaccurate or erroneous data entry prevents the benefits of the cross-functional process flows sought in environments such as build-to-order manufacturing. Master Data Management (MDM) is the technology approach to solving this pain.

Business professionals have the knowledge of the market needs and competitive landscape that many IT folks don't have. Likewise, IT folks have knowledge of the type and structure of data and how best to optimize the systems that use the data (computers, software, etc.) but don't necessarily understand how best to apply data analysis to business decisions.

MDM lays the foundation for data quality management whose output can be used more easily and reliably by business intelligence applications (e.g., market trend analysis, supply chain performance optimization, cost analysis, product mix) controlled by business professionals, thus making data more valuable to the enterprise.

In this economy, incremental ERP investments in solutions like MDM may be delayed. Nevertheless, the IT landscape in 2009 will be asked to do more with less. The need to maximize the return on ERP investment will emerge as high priority, and the need to improve cross-functional process flows will not abate.

Simpler and less costly approaches to improving data quality are available, possibly as value-added prerequisites to future MDM investments. These prevent non-compliant or erroneous data from being entered into the ERP system in the first place.

Rather than having customers or suppliers send paper-based business documents to an employee for data entry, have them enter the data using familiar tools that validate data accuracy before it is accepted into the ERP system.
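The idea of validating data at the point of entry can be sketched in a few lines of Python. The field names and rules below are purely illustrative, standing in for whatever checks a real smart form or portal would enforce against the company's master data:

```python
# Hypothetical sketch: validate a purchase-order record before it is
# accepted into the ERP system. Field names and rules are assumptions,
# not any particular vendor's schema.

VALID_PART_NUMBERS = {"A-100", "A-200", "B-350"}  # stand-in for master data


def validate_order(order: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record is clean."""
    errors = []

    # Part numbers must match the master catalog exactly.
    if order.get("part_number") not in VALID_PART_NUMBERS:
        errors.append(f"unknown part number: {order.get('part_number')}")

    # Quantities must be positive whole numbers.
    qty = order.get("quantity")
    if not isinstance(qty, int) or qty <= 0:
        errors.append("quantity must be a positive integer")

    # PO numbers must follow the expected format.
    if not str(order.get("po_number", "")).startswith("PO-"):
        errors.append("PO number must start with 'PO-'")

    return errors


order = {"part_number": "A-100", "quantity": 5, "po_number": "PO-12345"}
print(validate_order(order))  # → []
```

A record that fails any check is bounced back to the trading partner for correction immediately, rather than being keyed into the ERP system and cleaned up later.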

This trend is proliferating. For example, banks have recently introduced ATMs that require depositors to insert checks individually for real-time image scanning. In effect, banks have outsourced the check-scanning process to the customer, saving the time and cost of opening deposit envelopes collected from ATMs.

A similar approach outsources data quality management to trading partners. Look for the emergence of third-party B2B integration and commerce management service providers that support data entry and validation for all trading partners. Integrated suites of direct system-to-system integration and Web portal services will be supplemented with combined e-mail and smart-form technologies, solving the data quality problem associated with paper-based exchanges with small and occasional trading partners.

Tags:
Software
About the author

Dave Rosenberg has more than 15 years of technology and marketing experience that spans from Bell Labs to startup IPOs to open-source and cloud software companies. He is CEO and founder of Nodeable, co-founder of MuleSoft, and managing director for Hardy Way. He is an adviser to DataStax, IT Database, and Puppet Labs. Disclosure. You can contact Dave via e-mail at softwareinterrupted@gmail.com.

 
