The Data Migration Nightmare That Could Have Been Avoided
Introducing the concept of a Target Operating Model (TOM)
“Have you ever stepped into a project and immediately sensed disaster? That’s exactly what happened when I replaced someone who had burned out… only to realise I was on the same path.”
I walked into the project full of optimism, but that didn’t last long. My task? Deduplicate 10,000 business partner addresses.
The catch? Excel was the only tool.
No documentation. No structured approach. No clear handover. Just a chaotic spreadsheet filled with inconsistent data.
At first, I thought, “Okay, this will take time, but it’s doable.” Then I started noticing patterns: some addresses had been stripped down to plain Latin characters, while others kept their Polish diacritics. Different people had entered data in different ways, and there was no standard rule for handling duplicates.
Worse, I wasn’t the first person thrown into this mess. The person before me had burned out, and before them, someone else had also quit due to stress. I was the third person in the burnout cycle.
It was only a matter of time before the next person in line (me) either fixed the problem—or became another name on the list of people who tried and failed.
This wasn’t just a bad migration task—this was a governance disaster waiting to explode.
That’s when I learned one of the most important lessons in data migration:
💡 The best time to fix a problem is before it happens.
The Root of the Problem – Governance Neglected Until It Was Too Late
At first, I tried to be logical about the task. “Okay, let’s take a structured approach,” I thought. I started by scanning the spreadsheet, hoping to spot obvious duplicates.
But then I realized—there was no single pattern.
🔹 Some addresses had been stripped to plain Latin characters, while others kept Polish diacritics (“Pawla” vs “Pawła”).
🔹 Company names were entered differently for the same entity: sometimes in full, sometimes abbreviated, sometimes with stray spaces.
🔹 Street names varied: one row might say “Jana Pawła II,” another “J. P. II,” and yet another might simply misspell it. (The sketch below shows why exact matching can never reconcile variants like these.)
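To make this concrete, here is a minimal Python sketch, written after the fact purely as an illustration (the project itself had nothing but Excel). The normalise/similarity helpers and the 0-to-1 score are my assumptions, built only on the standard library, showing the kind of rule a governance model would have pinned down up front:

```python
import re
import unicodedata
from difflib import SequenceMatcher

def normalise(address: str) -> str:
    """Reduce an address to a comparable form: lowercase, diacritics stripped,
    punctuation removed, whitespace collapsed."""
    text = address.lower().replace("ł", "l")  # ł/Ł has no Unicode decomposition
    # Split accented letters into base letter + combining mark, then drop the marks
    text = unicodedata.normalize("NFKD", text)
    text = "".join(ch for ch in text if not unicodedata.combining(ch))
    text = re.sub(r"[^\w\s]", " ", text)       # drop punctuation such as "." and ","
    return re.sub(r"\s+", " ", text).strip()   # collapse runs of whitespace

def similarity(a: str, b: str) -> float:
    """Rough duplicate score between two addresses, from 0.0 to 1.0."""
    return SequenceMatcher(None, normalise(a), normalise(b)).ratio()

# Two spellings that Excel's exact matching treats as completely different rows:
print(similarity("ul. Jana Pawła II 12", "Ul. Jana Pawla II 12"))  # 1.0 after normalising
# The abbreviated form scores much lower -- is it the same street?
# Someone has to decide the threshold, and that decision is governance.
print(similarity("ul. Jana Pawła II 12", "J. P. II 12"))
```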
I soon understood why the previous two people had burned out.
➡️ With no predefined rules for handling duplicates, every person had to invent their own method.
➡️ With no governance in place, Excel became the default tool, even though it wasn’t suited for the job.
➡️ With no structured handover, each new person had to start from scratch, making the same mistakes over and over.
📌 If you give 10 people this assignment with Excel, you get 10 different ways of doing it. But without governance, you also get 10 different sets of errors.
It wasn’t just frustrating. It was a ticking time bomb.
The Breaking Point – Realizing This Was a Governance Failure, Not a Data Issue
The deeper I dug into the dataset, the worse it got. Every attempt to clean up the mess created more inconsistencies, more errors, and more unanswered questions.
I started asking around:
- “Who originally created these records?” – Nobody knew.
- “Are there existing rules for handling duplicates?” – Blank stares.
- “Should we prioritize certain business partners over others?” – Silence.
It hit me: This wasn’t just a data problem. This was a governance failure.
🔴 The Lack of Governance Created a Vicious Cycle
💡 Without predefined rules, each person made their own decisions about what was correct and what wasn’t.
💡 Without standard validation methods, errors multiplied instead of decreasing.
💡 Without accountability, bad data kept flowing into the system unchecked.
This wasn’t just about duplicate addresses—it was about the entire migration process lacking structure.
📌 We weren’t fixing a broken dataset. We were recreating the same mistakes in the new system.
That’s when I realised: No amount of Excel filtering could solve this problem. What we needed wasn’t just data cleanup—it was a structured, well-defined governance approach.
And we needed it before migration, not after.
The Solution – Embedding Data Governance Before Migration
At the time, I wasn’t aware of the concept of a Target Operating Model (TOM). I had never encountered it in any of my SAP projects, and no one in the team had mentioned it either. We were all focused on fixing the data rather than questioning why the problem existed in the first place.
But looking back, I now realise that if a TOM had been in place before migration, the entire approach would have been different. Instead of firefighting with Excel and wasting weeks on trial and error, the project could have had a structured, well-defined governance model that prevented these issues from escalating.
What Is a Target Operating Model (TOM) and Why Does It Matter?
Had I known about TOM, I would have recognised that we were missing a crucial step in the migration process. A Target Operating Model (TOM) isn’t a piece of software or a tool—it’s a structured framework that defines exactly how data governance should work before, during, and after migration.
It’s essentially an instruction manual that answers critical questions such as:
✔ Who owns the data? (Roles & accountability)
✔ What rules should be followed for data quality and deduplication?
✔ How will inconsistencies—such as multiple alphabets—be handled?
✔ What technology will be used instead of relying on Excel?
At its core, TOM ensures that data governance is not an afterthought but an integral part of the migration strategy. If it had been applied in my project, we wouldn’t have had people burn out trying to fix the same issues manually.
Frequently Asked Questions About TOM – Answered
📌 Is TOM a piece of software?
No. TOM is a framework, not a tool. However, it helps define which tools should be used for governance.
For example, instead of relying on Excel for deduplication, a TOM would specify SAP MDG (Master Data Governance) or a specialised address-cleaning solution.
📌 Is TOM just a document?
Yes and no. It starts as a document, but it’s more than that—it’s a blueprint for running data governance effectively.
Think of it as a restaurant operations manual—you don’t just write it and forget it; you use it to ensure consistency across all operations.
📌 Who creates a TOM?
A cross-functional team involving IT, data stewards, business owners, and compliance teams should create it before migration starts. This prevents last-minute firefighting.
📌 What would have happened differently if a TOM had been in place in my project?
If a TOM had existed before migration, it would have ensured:
✅ Deduplication rules were standardised—so no one had to guess how to handle multiple versions of the same address.
✅ Proper tools were used—instead of Excel, we would have implemented a structured data-cleansing process.
✅ A clear handover process existed—so new team members didn’t have to start from scratch every time.
✅ Data ownership was defined—so there was accountability rather than a ‘free-for-all’ approach.
Had I been aware of TOM back then, I would have raised these issues before migration, rather than attempting to fix them afterwards.
How to Implement a TOM in Data Migration Projects
📌 1. Start Governance Discussions Early
- Before migration, define who owns what data and what rules apply to it.
- Ensure business users, IT, and compliance teams collaborate in setting governance policies.
📌 2. Standardise Data Quality & Validation Rules
- Define naming conventions, deduplication criteria, and acceptable data quality levels.
- Document and test these rules before migration starts (see the sketch just below).
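To show what “documented and testable” can look like in practice, here is a hypothetical rule set sketched in Python. The field names, the Polish NN-NNN postal-code pattern, and the 0.90 merge threshold are illustrative assumptions, not rules from any real project:

```python
import re

# Hypothetical rule set: thresholds and conventions are written down once,
# instead of being reinvented by each person who inherits the task.
RULES = {
    "required_fields": ["name", "street", "city", "postal_code"],
    "postal_code_pattern": r"^\d{2}-\d{3}$",  # Polish format, e.g. 00-950
    "duplicate_threshold": 0.90,              # score above which two records merge
}

def validate(record: dict) -> list[str]:
    """Return the list of rule violations for one business-partner record."""
    errors = []
    for field in RULES["required_fields"]:
        if not record.get(field, "").strip():
            errors.append(f"missing field: {field}")
    postal = record.get("postal_code", "")
    if postal and not re.match(RULES["postal_code_pattern"], postal):
        errors.append(f"postal code not in NN-NNN format: {postal}")
    return errors

print(validate({"name": "Acme Sp. z o.o.", "street": "Jana Pawła II 12",
                "city": "Warszawa", "postal_code": "00950"}))
# ['postal code not in NN-NNN format: 00950']
```

Once rules live in code (or in a governance tool) instead of in someone’s head, “correct” stops being a matter of individual judgement.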
📌 3. Use the Right Tools—Not Just Excel
- A TOM would specify data governance tools like SAP MDG, Collibra, or Informatica instead of relying on spreadsheets.
📌 4. Document Everything for Handover
- A structured data governance document should exist, so if someone new joins mid-project, they don’t have to start from zero.
📌 5. Test the Governance Model Before Go-Live
- Instead of assuming legacy data processes will work, a TOM allows teams to test governance strategies on both old and new systems before migration is complete (see the smoke-test sketch below).
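Continuing the hypothetical sketches above, and assuming the normalise/similarity/validate helpers were saved in a module called dedup_rules, a pre-go-live smoke test can be as small as this. Run it with pytest against a sample of real legacy records so surprises surface in a test run, not in production:

```python
# Hypothetical smoke test: run the agreed rules against sample legacy records
# *before* go-live. Module name and sample data are assumptions for illustration.
from dedup_rules import similarity, validate

LEGACY_SAMPLE = [
    {"name": "Acme Sp. z o.o.", "street": "ul. Jana Pawła II 12",
     "city": "Warszawa", "postal_code": "00-950"},
    {"name": "ACME sp. z o.o.", "street": "Ul. Jana Pawla II 12",
     "city": "Warszawa", "postal_code": "00-950"},
]

def test_sample_passes_validation():
    for record in LEGACY_SAMPLE:
        assert validate(record) == [], f"dirty record reached staging: {record}"

def test_known_duplicates_are_caught():
    a, b = LEGACY_SAMPLE
    assert similarity(a["street"], b["street"]) >= 0.90

if __name__ == "__main__":  # also runnable without pytest
    test_sample_passes_validation()
    test_known_duplicates_are_caught()
    print("governance smoke tests passed")
```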
Final Thought: A TOM Saves You from Cleaning Up a Mess Later
At the time, I thought I was just dealing with bad data. What I didn’t realise was that I was experiencing the consequences of bad governance.
If a TOM had been in place from the beginning, I wouldn’t have spent weeks drowning in Excel, trying to make sense of inconsistent records with no guidelines. The previous two people wouldn’t have burned out. The entire project would have been structured to prevent these issues from happening in the first place.
💡 The best time to fix a problem is before it happens, and a TOM is the framework that makes that possible.
Conclusion – The Best Time to Fix a Problem Is Before It Happens
When I first joined that project, I thought my job was simple: clean up a dataset. But what I was really doing was fixing the symptoms of a deeper governance failure.
I didn’t know about TOM at the time, but if I had, I would have approached the project differently. Instead of spending weeks trying to make sense of 10,000 inconsistent business partner addresses in Excel, we could have established governance rules upfront, defined standardised data cleansing methods, and used the right tools to ensure accuracy.
Instead, we tried to fix the problem after the damage had already been done—and that’s why two people burned out before me.
🔹 We could have defined deduplication rules before migration.
🔹 We could have set up a clear governance framework so no one had to reinvent the wheel.
🔹 We could have prevented the same mistakes from being copied into the new system.
But we didn’t—because governance was an afterthought.
📌 The lesson? If you don’t embed data governance into migration, you’re setting yourself up for failure.
💡 The best time to fix a problem is before it happens.
That’s why, in every project since, I’ve made sure to prioritise governance before migration, not after. Because no one wants to be the next person in line to burn out fixing the same mistakes over and over again.