Master Data Management (MDM) continues to play a foundational role in the Data Management Architecture of every 21st-century enterprise. In a forward-looking organization, MDM also plays a significant role in the Enterprise Integration Hub.
This article was originally published in DATAVERSITY on July 27, 2021
My first job out of graduate school was to develop an e-commerce application for a manufacturing behemoth. The idea was to build a web-based application that let internal sales teams place orders on behalf of customers and eliminate paper processing. Don’t judge it by today’s standards. At the time, almost two decades ago, it was a revolutionary idea and a tremendous undertaking. As a new computer science grad, I was excited and took on the task with immense enthusiasm.
This article was originally published in DATAVERSITY on July 19, 2021
Most companies are either considering the cloud or already on a cloud adoption journey. If you are responsible for Master Data Management (MDM) in your company, you are likely considering moving MDM to the cloud or implementing it there. Although there are still some challenges and kinks to work out in cloud technology, the cost efficiencies, agility benefits, and many other advantages should be driving you to look at the cloud as the foundation for your current and future MDM implementations.
The year and a half after the COVID crash has left consumer confidence adrift on a never-before-seen kind of bubble. This one is different from what the Baby Boomers rode in the late ’90s leading up to the tech bubble burst, or from the pre-2008 bacchanalia that got people believing they could take out multiple mortgages on the strength of their own “worth,” when they were really just a means of pumping up mortgage-backed securities. Credit bureaus have started to sound the alarm on delinquency trends, but it’s difficult to fathom what’s to come because income estimation has gotten harder (despite the far greater volume of data available on anyone with an online footprint). While consumer confidence metrics look green based on how much people are spending, the consumer psyche is becoming ever more habituated to negative financial events and their consequences. Six factors contribute to this conditioning:
If we want to win, we need to learn at speed. This applies to business as well. Winning enterprises have a data ecosystem that exposes all data, makes it trusted, applies machine learning, and eventually lets it self-learn and evolve – the genesis of a data-driven enterprise. The enterprise that learns and executes the fastest always wins. To accomplish this, enterprises need an Enterprise Intelligence Hub (EIH).
Organizations across industries understand the importance of Customer Experience and are making large investments, rightly so, to offer an enhanced experience to their customers. Companies that lead in Customer Experience outperformed those farther behind by almost 80% (The Total Economic Impact of Qualtrics CustomerXM, 2019). However, it’s hard to improve the customer experience without a clear understanding of customers’ actual experience. Mapping out the customer journey is the first essential step a business must take to improve its overall customer experience.
Technology spend has been shifting to the business for some time, and more than 60% of businesses now allow non-IT functions to lead technology decisions (KPMG & Harvey Nash, 2019). Further, most technology workers now reside outside the traditional IT function (TechRepublic, 2020). Together, these trends show that the business is becoming increasingly dominant, not just in making technology choices, but also in determining technology strategy.
Antifragility is a strong criterion: it says that if the inputs experience shocks or unexpected variance, the model’s performance (by whatever measure of performance you choose) will actually improve. Most machine learning models can be made reasonably robust, but I have never seen an example of a “classic” machine learning model that is antifragile. Yes, machine learning benefits from exposure to more data and more diversity (when that variability is related to the underlying process), but obviously not all shocks lead to improved performance in such systems.
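To make the robust-versus-antifragile distinction concrete, here is a minimal sketch (my own illustration, assuming scikit-learn and a synthetic dataset rather than the entity matching models discussed later): it compares a model trained on clean inputs with the same model trained under an input “shock” of injected Gaussian noise. A merely robust model holds its score roughly steady under the shock; an antifragile one would have to improve.

```python
# Minimal sketch: probing robust vs. antifragile behavior under an input shock.
# Library and dataset choices (scikit-learn, synthetic data) are illustrative assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(42)
X, y = make_classification(n_samples=2000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Baseline: train on clean inputs.
baseline = LogisticRegression(max_iter=1000).fit(X_train, y_train)
baseline_acc = accuracy_score(y_test, baseline.predict(X_test))

# "Shock": the same training data with heavy Gaussian noise injected.
X_shocked = X_train + rng.normal(scale=1.0, size=X_train.shape)
shocked = LogisticRegression(max_iter=1000).fit(X_shocked, y_train)
shocked_acc = accuracy_score(y_test, shocked.predict(X_test))

print(f"clean-trained accuracy: {baseline_acc:.3f}")
print(f"shock-trained accuracy: {shocked_acc:.3f}")
# Robust: the shocked score stays close to the baseline.
# Antifragile (rare in classic ML): the shocked score would exceed the baseline.
```

In practice, the shocked score typically degrades or at best holds steady, which is why robustness, rather than antifragility, is the realistic goal for classic machine learning models.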
In the movie “The Core” [spoiler alert!] there is a great example (completely made up, of course) of a theoretical element called, with tongue in cheek, Unobtainium. It has the unique property that it becomes stronger (gains structural integrity) as it is exposed to increasing levels of heat and pressure. So it turns out to be the ideal material for the fantastic ship they use to journey to the center of the Earth. Unobtainium strikes me as an idealized example of an antifragile physical material.
[Figure: A scene from “The Core” illustrating the physical magic of Unobtainium.]
A client recently asked if our entity matching algorithms are “antifragile.” This got me thinking; it is a really interesting question. Bear with me as I take you on a mind trip to explore it and its implications. First, let’s start with a common understanding of “antifragile.” We’ve all read the Google definition: when a system gains from stressors, shocks, volatility, noise, disorder, mistakes, faults, attacks, or failures, it is termed “antifragile.”