MDM Can Challenge Traditional Development Paradigms

I’ve been making the point for the past several years that master data management (MDM) development projects are different and come with unique challenges. Because of the “newness” of MDM and its distinctive value proposition, MDM development can challenge traditional IT development assumptions.
MDM is very much a transaction-processing system: it receives application requests, processes them, and returns a result. The complexities of transaction management, near-real-time processing, and the details associated with security, logging, and application interfaces are a handful. Most OLTP applications assume that the provided data is usable; if the data is unacceptable, the application simply returns an error. Most OLTP developers are accustomed to addressing these types of functional requirements. Dealing with imperfect data has traditionally been unacceptable because it slowed down processing; ignoring it or returning an error was considered a best practice.
The difference with MDM development is the focus on data content (and value-based) processing. The whole purpose of MDM is to deal with all data, including the unacceptable stuff. It can’t assume the data is good enough. MDM code assumes the data is complex and “unacceptable” and focuses on figuring out the values. The development methods associated with deciphering, interpreting, or decoding unacceptable data to make it usable are very different. They require a deep understanding of a different type of business rule: those associated with data content. Because most business processes have data inputs and data outputs, there can be dozens of data content rules associated with each business process. Traditionally, OLTP developers didn’t focus on data content rules; they focused on automating business processes.
MDM developers need to be comfortable addressing the various data content processing issues (identification, matching, survivorship, etc.) along with the well-understood issues of OLTP development (transaction management, high performance, etc.). We’ve learned that the best MDM development environments invest heavily in data analysis and data management during the initial design and development stages. They invest in profiling and analyzing each system of creation. They also differentiate hub development from source on-boarding and hub administration. The team that focuses on application interfaces, CRUD processing, and transaction and bulk processing requires different skills from the developers focused on match processing rules, application on-boarding, and hub administration. The developers focused on hub construction are different from the team members focused on the data changes and value questions coming from data stewards and application developers. This isn’t about differentiating development from maintenance; it’s about differentiating the skills associated with the various development activities.
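To make the content-rule mindset concrete, here is a minimal survivorship sketch. Everything in it is hypothetical for illustration: the source-system names, the trust ranking, and the record fields are invented, and real MDM hubs apply far richer matching and survivorship logic. The point is simply that the code expects conflicting, partially empty data and works out the winning values instead of rejecting the records.

```python
# Hypothetical survivorship sketch: merge the "same" customer record
# from several systems of creation into a single golden record.
# Source names and trust ranking are invented for illustration.

SOURCE_TRUST = {"crm": 3, "billing": 2, "web_signup": 1}  # higher wins

def survive(records):
    """Pick, per attribute, the non-empty value from the most trusted source."""
    golden = {}
    # Consider the most trusted sources first.
    for rec in sorted(records, key=lambda r: SOURCE_TRUST[r["source"]], reverse=True):
        for field, value in rec.items():
            if field == "source" or not value:
                continue  # tolerate empty values instead of returning an error
            golden.setdefault(field, value)  # first (most trusted) value wins
    return golden

records = [
    {"source": "web_signup", "name": "J. Smith",   "phone": "555-0100", "email": ""},
    {"source": "billing",    "name": "",           "phone": "555-0199", "email": "js@example.com"},
    {"source": "crm",        "name": "Jane Smith", "phone": "",         "email": ""},
]
print(survive(records))
# {'name': 'Jane Smith', 'phone': '555-0199', 'email': 'js@example.com'}
```

An OLTP-style routine would have rejected two of these three records for missing fields; the survivorship rule instead produces a usable composite.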
If the MDM team does its job right, it can dramatically reduce the data errors that cause application processing and reporting problems. It can also identify and quantify data problems so that other development teams can recognize them, too. This is why MDM development is critical to creating the single version of truth.
Image via cafepress.com.
Staring at the Lights: Your Data Warehouse Isn’t a Commodity
There are far too many data warehouse development teams solely focused on loading data. They’ve completely lost sight of their success metrics.
Why have they fallen into this rut? Because they’re doing what they’ve always done. One of the challenges in data warehousing is that as time passes, the people on the data warehouse development team are often not the same people who launched it. This erosion of experience has diluted the original vision and degraded the team’s effectiveness.
One client of mine actually bonused their data warehouse development team based on system usage and capacity. Was there a lot of data in the data warehouse? Yep. Were there multiple sandboxes, each with its own copy of data? Yep. Was this useful three years ago? Yep. Does any of this matter now? Nope. The original purpose of the data warehouse—indeed, the entire BI program—has been forgotten.
In the beginning, the team understood how to help the business. They were measured on business impact. Success was based on new revenues and lower costs outside of IT. The team understood that evolving the applications and data supporting BI was critical to continually delivering business value. There was an awareness of what was next. Success was based on responding to new business needs: sometimes that meant reusing data in new reports, sometimes it meant new data, sometimes it was just adjusting a report. The focus was on aggressively responding to business change and the resulting business needs.
How does your BI team support decision making? Does it still deliver value to business users? Maybe your company is like some of the companies that I’ve seen: the success of the data warehouse and the growth of its budget propelled it into being managed like an operational system. People have refocused their priorities to managing loads, monitoring jobs, and throwing the least-expensive, commodity skills at the program. So a few years after BI is introduced, the entire program has become populated with IT order-takers, watching and managing extracts, load jobs, and utilization levels.
Then an executive asks: “Why is this data warehouse costing us so much?”
You’ve built applications, you’ve delivered business value, and you’ve managed your budget. Good for you. But now you have to do more. IT’s definition of data warehouse success is you cutting your budget. Why? Because IT’s definition of success isn’t business value creation, it’s budget conformance.
Because BI isn’t focused on automating business operations, as many operational systems are, it can’t thrive in a maintenance-driven mode. In order to continue to support the business, BI must continually deliver new information and new functionality. Beware the IT organization that wants to migrate the data warehouse to an operational support model measured on budgets, not business value. This can jeopardize more than just your next platform upgrade; it can imperil the BI program itself. The tunnel vision of service-level agreements, manpower estimates, and project plan maintenance isn’t doing you any favors; none of it can be done devoid of business drivers.
When there are new business needs, business users may try to enlist IT resources to support them. But they no longer see partners who will help realize their visions and deliver valuable analytics. They see a few low-cost, less experienced technicians monitoring system uptime and staring at the blinking lights.
photo by jurvetson via Flickr (Creative Commons License)
MDM Streamlines the Supply Chain
I’ve always been a little jealous of ERP development teams. They operate on the premise that you have to standardize business processes across the enterprise. Every process feeds another process until the work is done. There are no custom processes: if you suddenly modify a business process there are upstream and downstream dependencies. Things could break.
We don’t have that luxury when we build MDM solutions for our clients. This was on my mind this past week when I was teaching my “Change Management for MDM” class in Las Vegas. The fact is that business people constantly add and modify their data. What’s important is that a consistent method exists for capturing and remediating these changes. The whole premise of MDM is that reference data changes all the time. Values are added, changed, and removed.
Let’s take the poster-child-du-jour, Toyota. Toyota has already announced that it will stop manufacturing its FJ Cruiser model in a few years. In the interest of its dealers, repair facilities, and after-market parts retailers, Toyota will need to get out in front of this change. There are catalogs to be modified, inventories to sell off, and cars to move. Likewise, MDM environments can deal with data changes in advance. The hub needs to be prepared to respond to and support data changes at the right time.
We work with a retailer that is constantly changing its merchandise as purchase patterns and seasons fluctuate. Adding spring merchandise to the inventory means new SKUs, new prices, and changes in product availability. Not every staff member in every store can anticipate all these new changes. Neither can the developers of the myriad operational systems. But with MDM they don’t have to keep up with all the new merchandise. The half-dozen applications that deal with inventory details can leverage the MDM hub as a clearinghouse of detailed changes, allowing them to be deployed in a scheduled manner according to the business calendar.
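One common way a hub supports this scheduled deployment is effective-dated reference data: changes are staged ahead of time and become visible only on their effective date. The sketch below is hypothetical (the SKU, prices, and dates are invented, and a production hub would persist versions and handle time zones and audit history), but it shows the basic idea of staging a change against the business calendar.

```python
# Hypothetical sketch of effective-dated reference data in an MDM hub:
# changes are staged in advance and take effect on the business calendar,
# so consuming applications never need one-off update code.
# The SKU, prices, and dates are invented for illustration.
from datetime import date

class ReferenceHub:
    def __init__(self):
        self._versions = {}  # key -> list of (effective_date, attributes)

    def stage(self, key, effective, attributes):
        """Register a change in advance; it applies on its effective date."""
        self._versions.setdefault(key, []).append((effective, attributes))
        self._versions[key].sort(key=lambda v: v[0])

    def lookup(self, key, as_of):
        """Return the attributes in effect for `key` on `as_of`, if any."""
        current = None
        for effective, attributes in self._versions.get(key, []):
            if effective <= as_of:
                current = attributes  # latest version on or before as_of
        return current

hub = ReferenceHub()
# Stage the spring markdown weeks before it hits the stores.
hub.stage("SKU-1042", date(2024, 1, 5), {"desc": "Winter parka", "price": 129.00})
hub.stage("SKU-1042", date(2024, 3, 1), {"desc": "Winter parka", "price": 79.00})

print(hub.lookup("SKU-1042", date(2024, 2, 15))["price"])  # 129.0
print(hub.lookup("SKU-1042", date(2024, 3, 15))["price"])  # 79.0
```

Every consuming application asks the hub the same question, and the answer changes on schedule without anyone redeploying code.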
No more developers having to understand the details of hundreds of product categories and subcategories. No more one-off discussions between stores and suppliers. No more intensive manual work to change suppliers or substitute merchandise. No more updating POS systems with custom code. With MDM it’s all transparent to the applications—and to the people who use them.
Our most successful MDM engagements have confirmed what many of our clients already suspected but could never prove: that there are far more consumers of data than they knew. MDM formalizes the processes to ensure that data changes can scale to escalating volumes. It automates the communication of changes to the business areas and individuals who need to know about those changes, without needing to know each individual change.
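The automated communication described above is essentially a publish/subscribe pattern: consumers register interest in a category of change, and the hub broadcasts each change without knowing who needs it. This sketch is hypothetical (the subscriber roles and category names are invented for illustration):

```python
# Hypothetical sketch of hub change notification as publish/subscribe.
# Subscriber roles and category names are invented for illustration.

class ChangePublisher:
    def __init__(self):
        self._subscribers = {}  # category -> list of callbacks

    def subscribe(self, category, callback):
        self._subscribers.setdefault(category, []).append(callback)

    def publish(self, category, change):
        # The hub doesn't know its consumers; it just broadcasts.
        for callback in self._subscribers.get(category, []):
            callback(change)

notifications = []
hub = ChangePublisher()
hub.subscribe("apparel", lambda c: notifications.append(f"store-ops: {c}"))
hub.subscribe("apparel", lambda c: notifications.append(f"finance: {c}"))

hub.publish("apparel", "SKU-2001 added to spring line")
print(notifications)
# ['store-ops: SKU-2001 added to spring line', 'finance: SKU-2001 added to spring line']
```

New consumers can subscribe at any time, which is how the number of data consumers can keep growing without the hub team tracking each one.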
With spring, shoppers may be thinking about new Easter outfits, gourmet items, or children’s clothes. But suppliers think about trucking capacity. Store managers can anticipate shelf and floor space requirements. Finance staff can prepare for potential product returns. Distribution center staff can allocate warehouse space. You can’t know everyone who needs the information. But the supply chain can become incredibly flexible and streamlined as a result of MDM.
And—okay, this makes me feel much better—it doesn’t even matter whether you have ERP or not!
Note: Evan will be presenting The Five Levels of MDM (and Data Governance!) Maturity next week at TDWI’s Master Data Quality and Governance Solutions Summit in Savannah, Georgia. The event is sold-out, so if you were lucky enough to get in, please stop by and say hello!
Photo by Rennett Stowe via Flickr (Creative Commons License)