
Data quality can also spell scope creep, so you had better spell out your requirements up front.

Data management, master data management, and systems integration, however, are much more critical priorities: they ensure a minimum level of data quality and interoperability, and they are arguably much easier to achieve.

It’s nearly impossible to guarantee that our data will always be as up-to-date as we need it to be, or that it will always match the real-world people, companies, products, parts, materials, and locations it describes.

It’s unrealistic (and even counterproductive) to expect all of the data to match exactly, internally and externally, across so many different systems and so many different real-world entities.

Each business system is unique and has been deployed to fulfil unique business requirements. That means the system and its data will also differ from every other system’s, which makes it difficult to integrate and ‘sync’ with the rest of the business systems.

That’s where the breakdown happens. It’s usually the data.

But we can match and link similar data, within and across systems, internally and externally, using data matching and entity resolution, effectively integrating those systems. This lets us find and resolve duplicates within each system, and enables cross-system communication, or interoperability, by linking the data and automating the most important data updates.
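To make this concrete, here is a minimal sketch of duplicate detection within a single system, using fuzzy string comparison from Python’s standard library. The record values, the 0.85 threshold, and the function names are illustrative assumptions, not part of any specific product or method described above.

```python
from difflib import SequenceMatcher

def similarity(a, b):
    """Return a 0..1 similarity ratio between two normalized strings."""
    return SequenceMatcher(None, a.lower().strip(), b.lower().strip()).ratio()

def find_duplicates(records, threshold=0.85):
    """Compare every pair of records and return likely duplicates.

    A real entity-resolution pipeline would block/index records first
    to avoid this O(n^2) comparison; a pairwise scan is fine for a demo.
    """
    pairs = []
    for i in range(len(records)):
        for j in range(i + 1, len(records)):
            score = similarity(records[i], records[j])
            if score >= threshold:
                pairs.append((records[i], records[j], round(score, 2)))
    return pairs

# Hypothetical customer names with one near-duplicate.
customers = ["Acme Corp", "ACME Corporation", "Globex Inc", "Acme Corp."]
print(find_duplicates(customers))
```

Once likely duplicates are found, they can be reviewed and merged, or the pairs can feed a survivorship rule that picks the best value for each field.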

At their core, master data management and systems integration are disciplines that enable interoperability. They are usually built on APIs and ETL to move and transform the data, plus some form of data matching and entity resolution to identify related data, using matching rules at a minimum.
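The rules-based linking mentioned above can be sketched in a few lines: match records across two systems by an exact rule first (normalized email), then fall back to a fuzzy rule (name similarity). The system names, field names, thresholds, and sample records are all hypothetical assumptions for illustration.

```python
from difflib import SequenceMatcher

def normalize(s):
    """Lowercase and collapse whitespace so rules compare like with like."""
    return " ".join(s.lower().split())

def link_records(crm, erp, name_threshold=0.9):
    """Link records across two systems with simple ordered matching rules:
    rule 1: exact match on normalized email;
    rule 2: fuzzy match on normalized name above a threshold."""
    links = []
    for c in crm:
        for e in erp:
            if c["email"] and normalize(c["email"]) == normalize(e["email"]):
                links.append((c["id"], e["id"], "email"))
            elif SequenceMatcher(
                None, normalize(c["name"]), normalize(e["name"])
            ).ratio() >= name_threshold:
                links.append((c["id"], e["id"], "name"))
    return links

# Hypothetical records from two business systems.
crm = [{"id": "C1", "name": "Jane Doe", "email": "jane@acme.com"},
       {"id": "C2", "name": "John Q. Public", "email": ""}]
erp = [{"id": "E7", "name": "Doe, Jane", "email": "JANE@ACME.COM"},
       {"id": "E9", "name": "John Q Public", "email": "jqp@globex.com"}]
print(link_records(crm, erp))
```

The link table this produces is the glue for interoperability: once "C1" and "E7" are known to be the same customer, updates can flow between the systems without forcing their data models to be identical.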

Author: Ben Cutler

Inquiries: bcutler@matchdatapro.com
