Why less is more in data migration

As the pandemic continues, companies are rushing to move data from old, overloaded IT systems to more agile, modern configurations so they can launch new online services and keep operations running remotely. But few of these large-scale initiatives are going as planned or delivering the results promised. Many multiyear data migration programs fail, often at great cost.

Businesses can reduce their chances of running into trouble by accepting that “less is more.” Below, we share three principles companies can follow to move data to new systems in months instead of years, fueling faster innovation: identifying the critical data that must be migrated; leaving behind “nice to have” data; and relaxing data quality standards, even if only by a percentage point or less.

Start with a minimum viable data set

When businesses migrate data, they typically aim to move every underlying data structure from their existing systems to the new IT system. But migrations can go much faster if companies first select the new system and then work backwards, with the goal of migrating only the minimum viable set of legacy data it requires.

For example, a financial services company moved the data for one of its products in four months instead of two years after re-examining which data it really needed to carry forward. The legacy system had accumulated thousands of columns of historical data capturing how the product’s value changed whenever its fees or interest rate were adjusted. But managers only needed to migrate the product’s current value and its transaction history. Every incremental change in value over the past decade could be recalculated in the new system if necessary. As a result, managers saved years on the migration project by transforming only hundreds of columns of data and leaving the rest behind.
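To make the idea concrete, here is a rough sketch in Python; the file names, columns, and recalculation rule are illustrative assumptions, not details from the company described above. The migration job extracts only the current value and the transaction history, and any past valuation is recomputed on demand in the new system rather than migrated.

import pandas as pd

# Hypothetical legacy export: thousands of columns, only a few of which matter.
legacy = pd.read_csv("legacy_product_export.csv")

# Minimum viable data: current value plus the transaction history needed to
# reconstruct past values later, if anyone ever asks for them.
minimal_columns = ["product_id", "current_value", "txn_date", "txn_amount"]
legacy[minimal_columns].to_csv("migration_staging.csv", index=False)

# Historical values are not migrated; they can be recalculated in the new
# system by replaying the transaction history from a known starting value.
def value_as_of(transactions: pd.DataFrame, start_value: float, as_of: str) -> float:
    past = transactions[transactions["txn_date"] <= as_of]
    return start_value + past["txn_amount"].sum()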

Find data to leave out

Because managers now have access to more accessible and affordable data storage options, they can easily and securely park legacy data in cold storage and later retrieve it from other systems as needed. These options make the traditional all-data approach (migrating all legacy system data in one operation with a single cutover date) outdated. The all-data, “big bang” strategy may appear to offer up-front benefits, such as shorter implementation times and lower initial costs. But it is easier, cheaper, and faster to defer, drop, or archive unnecessary data instead.
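As a rough sketch of the archive-instead-of-migrate option (the database, table names, and paths are hypothetical), non-essential legacy tables can be written to compressed files in cheap storage rather than transformed, and read back only if someone actually needs them later:

import os
import sqlite3
import pandas as pd

# Hypothetical legacy database and a list of tables judged non-essential.
conn = sqlite3.connect("legacy_system.db")
archive_only = ["fee_history_detail", "audit_log_2012", "retired_product_codes"]

os.makedirs("cold_storage", exist_ok=True)
for table in archive_only:
    df = pd.read_sql_query(f"SELECT * FROM {table}", conn)
    # Compressed file in cold storage instead of migration and cleanup.
    df.to_parquet(f"cold_storage/{table}.parquet", compression="gzip")

conn.close()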

The most efficient way to select which data to migrate is to sort the candidate data sets into three buckets: essential data required for security or compliance reasons; necessary data that is critical to achieving the business objective; and superfluous, “nice to have” data.
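A minimal sketch of that triage, with made-up data set names and simplified rules, tags each candidate data set and keeps only the first two buckets in the migration plan:

# Hypothetical catalog of candidate data sets and a simple triage rule.
datasets = {
    "customer_accounts":   {"regulated": True,  "used_by_core_process": True},
    "open_transactions":   {"regulated": False, "used_by_core_process": True},
    "marketing_campaigns": {"regulated": False, "used_by_core_process": False},
}

def triage(attrs: dict) -> str:
    if attrs["regulated"]:
        return "essential"      # required for security or compliance
    if attrs["used_by_core_process"]:
        return "necessary"      # critical to the business objective
    return "nice_to_have"       # archive or leave behind

migration_plan = {name: triage(attrs) for name, attrs in datasets.items()}
to_migrate = [name for name, bucket in migration_plan.items() if bucket != "nice_to_have"]
print(to_migrate)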

Be prepared to move more data sets into the “nice to have” bucket as the project progresses. One company for which inventory compliance was critical, for example, expected to move the counts of maintenance parts and tools in its bins to a new operating system. It then found that developers were spending months validating and cleaning the historical data: every time they cleaned the data set, one or two parts were missing or the serial numbers didn’t match exactly. Ultimately, managers concluded that it would be cheaper and faster to have someone simply count the inventory and update the data manually instead.

Weigh speed against perfection

Finally, managers should encourage technologists to push back on requests for perfection in non-critical data. When managers outside of IT are involved in setting data standards, their preference for cleaner data must be weighed against competing demands and goals, such as getting the new operating system running quickly. Otherwise, migrating data to a new system can take forever.

Operating system transformations often fall behind schedule because, for years, companies have repurposed data columns, relaxed data definitions, or duplicated data unnecessarily to solve immediate business problems. Legacy data items are often spread across multiple systems, only partially available, or entangled in subsets of other data. Sometimes they are just plain wrong.

Reasonable, quantifiable, and testable data quality standards are in fact better than perfect precision. Squeezing the last 1% or 2% of accuracy out of a data set may take longer than reaching a high level of accuracy – say, 97% – in the first place.
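In practice, a quantifiable, testable standard can be written down as an acceptance check in the migration pipeline. The sketch below is illustrative (the 97% threshold, key, and field names are assumptions): the load fails only when reconciliation accuracy drops below the agreed level, rather than whenever any record mismatches.

import pandas as pd

ACCURACY_TARGET = 0.97  # agreed, testable standard; deliberately not 100%

def passes_quality_gate(legacy: pd.DataFrame, migrated: pd.DataFrame,
                        key: str, field: str) -> bool:
    # Compare one reconciliation field row by row on a shared key.
    merged = legacy.merge(migrated, on=key, suffixes=("_old", "_new"))
    accuracy = (merged[f"{field}_old"] == merged[f"{field}_new"]).mean()
    return accuracy >= ACCURACY_TARGET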

Keep the goal in sight

It’s easy to fall into the trap of waiting for perfect data, but the challenges of the pandemic have crystallized for many companies how the perfectionist approach loses sight of the overarching goal: getting the new information system up and running so that it better serves business needs. Following these principles will help your organization manage and migrate data more efficiently.

Sean N. Ayres