Blogpost
Masters of the data

Dirk Möbus
Partner
Frankfurt Office, Central Europe
+49 69 29924-6579
May 19, 2016

At many companies, disillusionment has set in. Considerable efforts are invested in system harmonization to simplify IT landscapes, only for reality to fall short of expectations: master data is still not used efficiently. One reason is that at many companies, data harmonization and master data management play only a peripheral role. This has consequences, sometimes costly ones, in the following areas:

  • Location information — mistakes in geographical data lead to errors in logistics planning and in turn to higher freight costs.
  • Supplier data — duplicates in the supplier master mean that incorrect sales figures are calculated for a specific supplier, which can be a drawback when negotiating new terms.
  • Customer data — missing or incorrect information, for instance regarding the relationships between parent companies and subsidiaries, could result in credit limits being overdrawn.

To correct such mistakes, departments and IT sometimes need considerable personnel capacity that must be diverted from other areas. This can incur serious costs, and it also pulls these people away from important strategic projects.


Just what causes poor master data quality? One main reason is the relationship between perceived quality and the level at which the data is viewed. An employee of a national subsidiary, for example, may say the master data is generally good. Yet as you move up through the levels, the defects in quality quickly become apparent. Standards of data maintenance frequently differ considerably among the various national subsidiaries. This forces many companies either to make manual adjustments when consolidating data or to resign themselves to distorted conclusions.

Data analytics—a greater demand on quality

As planning and analysis departments continue to work with data analytics and big data, demands on master data quality are increasing. Confidence in data reliability is an essential requirement for making key decisions, yet a lack of standards for data maintenance shakes that confidence, leaving optimization potential untapped. A simple example illustrates this: if employees don't trust the product dimensions stored in the system, warehouse space won't be fully utilized and trucks won't be fully loaded.

To improve the quality of their master data, companies need a clear overview for cleaning up existing data.

At the other end of the spectrum are companies that have already perfected data usage. One example is a certain large US group. For each of its more than 300,000 employees, the group calculates the probability that the employee plans to leave the company. This allows managers to take countermeasures in good time or at least plan for replacements.

Solving the problem—success factors and where to start

To solve these problems and improve the quality of their master data, companies need a clear overview and a foundation for cleaning up existing data. The first step is to precisely determine the scope of their master data. Ideally, every master data element should be the responsibility of a specific department, such as product development, marketing, or sales. Interviewing employees from various departments can also help define the most urgent problems connected to master data quality and identify fast and simple improvements, such as clearing duplicates out of the supplier master.
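A quick win like clearing duplicates out of the supplier master can often be scripted. The sketch below is a minimal, hypothetical illustration: the supplier records, the umlaut-folding normalization, and the similarity threshold are all illustrative assumptions, not a description of any specific company's system.

```python
# Hypothetical sketch: flagging likely duplicates in a supplier master.
# Records, normalization rules, and the 0.9 threshold are illustrative assumptions.
from difflib import SequenceMatcher
from itertools import combinations

suppliers = [
    ("1001", "Mueller Logistics GmbH"),
    ("1002", "Müller Logistics GmbH"),
    ("1003", "Acme Industrial Supply"),
]

def normalize(name: str) -> str:
    """Lower-case and fold common umlaut spellings before comparing."""
    return (name.lower()
                .replace("ü", "ue").replace("ö", "oe")
                .replace("ä", "ae").replace("ß", "ss"))

def likely_duplicates(records, threshold=0.9):
    """Return pairs of supplier IDs whose normalized names are near-identical."""
    pairs = []
    for (id_a, name_a), (id_b, name_b) in combinations(records, 2):
        score = SequenceMatcher(None, normalize(name_a), normalize(name_b)).ratio()
        if score >= threshold:
            pairs.append((id_a, id_b, round(score, 2)))
    return pairs

print(likely_duplicates(suppliers))
```

In practice, a candidate list like this would go to the responsible department for review rather than being merged automatically, since only the data owners can confirm that two records really describe the same supplier.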

What processes are needed for master data administration?

Data quality can only be improved long-term if data management processes are also taken sufficiently into account. This means defining and firmly establishing clear rules and responsibilities, and all departments concerned should be involved in developing the process guidelines to ensure that all requirements are considered and all employees have the same knowledge base for implementation.

In addition, the master data should be prioritized. This makes it possible to steadily improve its quality without overburdening the organization. Theoretically, clear processes and rules for data management should be enough to improve data quality. Yet in practice, a combined approach (upstream data cleansing plus clear processes) achieves the greatest success.


Three success factors for master data management

  1. Since master data is managed by the departments, not by IT, master data management is primarily their responsibility. However, support from IT, such as preparing reports of duplicates or maintaining system interfaces, is always necessary.
  2. Data harmonization and master data management are long-term programs that frequently require procedural and organizational changes. Strong approval and support from management are therefore a fundamental requirement for the success of the program.
  3. To measure overall progress and sustainably improve data quality, companies need to establish controls and KPI systems.
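One way to make the third success factor concrete is a simple completeness KPI: the share of required master data fields that are actually filled. The sketch below is an illustrative assumption, not a prescribed KPI system; the field names and records are invented for the example.

```python
# Hypothetical sketch: a completeness KPI for customer master data.
# Records and required fields are illustrative assumptions.
records = [
    {"id": "C-001", "name": "Acme GmbH", "country": "DE", "credit_limit": 50000},
    {"id": "C-002", "name": "Beta AG",   "country": None, "credit_limit": 20000},
    {"id": "C-003", "name": "",          "country": "FR", "credit_limit": None},
]

required_fields = ["name", "country", "credit_limit"]

def completeness_rate(records, fields):
    """Share of required fields that are filled, across all records."""
    filled = sum(
        1 for r in records for f in fields
        if r.get(f) not in (None, "")
    )
    return filled / (len(records) * len(fields))

print(f"Completeness KPI: {completeness_rate(records, required_fields):.0%}")
```

Tracked over time, and broken down by subsidiary or data domain, a handful of such metrics shows whether the rules are actually being applied, which is the point the next paragraph makes.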

Experience has shown that setting up rules alone is often not enough—it’s only by ensuring that they are applied correctly that they achieve the desired effects.

  • Photo Credits: Caiaimage; Tom Merton / Getty Images