The risks of delaying data migration for healthcare systems

Sashank Purighalla, Founder and CEO of BOS Framework

It only takes one data breach to cripple an entire industry. Hospitality offers a cautionary example. In 2018, Marriott International disclosed that its Starwood guest reservation system had been compromised in a cyberattack dating back to 2014, exposing the records of up to 500 million customers. The scandal spilled into the political sphere when the attack was reportedly attributed to Chinese state-sponsored hackers. Hotel reservations are one thing; just imagine the reaction to a similar hack exposing the medical records of entire nations.

Attacks on hospitals and medical care centers are not a question of if, but of when. Protenus’ annual “Breach Barometer” report details that 41.1 million patient records were breached across 572 detected incidents in 2020. IBM estimates that the average cost of a single data breach at a major healthcare provider exceeds $7 million. The coronavirus pandemic has made the territory even more precarious, with an Interpol report attesting to an alarming increase in cyberattacks “exploiting the fear and uncertainty caused by the unstable social and economic situation.”

As healthcare becomes fully digitized, so does the enormous amount of private patient data the sector generates. A single patient’s electronic health record (EHR) can grow by several gigabytes per month.

Telehealth, which has grown exponentially during the pandemic, has deepened the data pool further. In 2020, 43.5% of primary care visits in the United States were delivered via telehealth, which means healthcare providers need fast and secure communication between multiple data storage services to manage the influx. It is equally essential that highly sensitive Personally Identifiable Information (PII) and Protected Health Information (PHI) are well guarded.

To manage such a large amount of sensitive data, there is only one long-term solution: structure the data according to sound principles of data management and storage, and migrate it to the cloud while ensuring the resulting systems are cloud native. Complex as it is, providers must face the challenges of data migration head on, because every day data continues to live in older structures and systems, it becomes more vulnerable.

Sticking with legacy systems

The healthcare industry is a late adopter of cloud computing, which puts providers and their patients at risk. Until now, most healthcare providers have tried to delay data migration by extending the life of legacy systems. Data migration is a complex process that requires consideration of technical, process, personnel, and business factors.

Maintaining business continuity and ensuring a non-disruptive user experience add complexity, time, and cost. Additionally, businesses need to find creative ways to provide analytics, reports, and insights from this data to stay competitive.

In this context, it is tempting to resort to approaches that create a quick win or a feeling of comfort. But this invariably results in myopic measures: network-level security perimeters, sophisticated authentication routines, niche security tools, data warehouses that aggregate data, and controlled, on-demand access for authorized persons.

While these measures may bring some gains, they invariably create a false sense of security. The bigger problem is that they delay the comprehensive work needed to build sustainable data security and business intelligence. In the meantime, providers keep most of their sensitive data on outdated physical storage systems, in an ecosystem that grows more vulnerable and obsolete every day.

Before the pandemic, migrating data systems was already expensive and time-consuming. The situation has worsened with the pandemic, which has seen businesses move rapidly to the cloud, causing demand and prices to spike. Some companies have recorded cost increases of 20% to 50%.

If healthcare providers can’t afford to outsource migration to large firms, it falls to their IT staff to handle it on their own. However, data migration requires not only IT expertise but also knowledge of healthcare and regulatory requirements. When migrating data to cloud systems, even a small mistake increases the probability of a successful cyberattack.

Hospitals are chronically understaffed. A study by the Healthcare Information and Management Systems Society (HIMSS) found that more than half of hospitals surveyed did not employ even a single chief information or technology officer. If providers are being brutally honest with themselves, they have to come to terms with the labor and expertise shortages they face when considering migrating data on their own. Even when fully staffed with trained professionals who are familiar with data-handling best practices, undertaking a migration project while maintaining business continuity is an extremely difficult process.

Data migration includes:

  • restructuring data for PII/PHI separation and encryption,
  • moving data from legacy systems to the cloud,
  • ensuring systems are designed to be cloud native, and
  • ensuring security with network isolation controls and least privilege.

It must also take into account:

  • constructs for on-demand and continuous access,
  • communication between at-rest and streaming data repositories,
  • physical isolation of data per tenant in multi-tenant systems, and
  • an architecture that natively lends itself to delivering robust data analytics, reporting, and insights.

To do this effectively, health systems must seek to avoid carrying their “current baggage” with them when moving to a new system.

This can be achieved using an automated strategy that emphasizes sound engineering principles in every layer of the ecosystem. With the right constructs guiding the thought process, and with proper planning and “cloud engineering automation,” it can be done as part of the migration itself.

This migration process can be thought of as several iterative cycles in which each application and its data are migrated from source to its new cloud destination, one application at a time. One tactic is to use machine learning to reliably spot errors or missing data points when collecting data from the various applications.
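As a hedged illustration of that tactic, the sketch below (plain Python with pandas and scikit-learn; all field names, data, and thresholds are invented for the example, and this is not the BOS Framework implementation) combines a simple completeness check with statistical outlier detection to flag records for human review before they leave the legacy system:

```python
# Minimal sketch: flag suspicious records in a migration batch.
import pandas as pd
from sklearn.ensemble import IsolationForest

# Hypothetical extract from a legacy system, mid-migration.
batch = pd.DataFrame({
    "record_id":   [1, 2, 3, 4, 5],
    "age":         [34, 29, 41, 342, 57],   # 342 is a likely data-entry error
    "visit_count": [2, 5, 3, 4, None],      # missing data point
})

# Check 1: completeness -- rows with missing values need review.
incomplete = batch[batch.isna().any(axis=1)]

# Check 2: statistical outlier detection on the complete numeric rows.
complete = batch.dropna()
model = IsolationForest(contamination=0.25, random_state=0)
complete = complete.assign(flag=model.fit_predict(complete[["age", "visit_count"]]))
outliers = complete[complete["flag"] == -1]   # -1 marks an outlier

print("Needs review (incomplete):", incomplete["record_id"].tolist())
print("Needs review (outliers):  ", outliers["record_id"].tolist())
```

In a real pipeline, flagged records would be routed to a review queue rather than silently dropped, so no patient data is lost in transit.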

Automation can also ensure compliance with the security standards and corporate policies set by the CTO/CIO/CISO for the organization. The DevOps team can introduce phase gates to ensure policies are adhered to throughout the data lifecycle.
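A phase gate can be as simple as a pipeline script that halts promotion when a provisioned resource violates policy. The sketch below is a minimal, hypothetical example; the policy keys and resource descriptions are invented stand-ins for what a real deployment tool would emit:

```python
# Minimal phase gate: fail the pipeline on any policy violation.
import sys

ORG_POLICY = {
    "encryption_at_rest": True,   # set by the CTO/CIO/CISO
    "public_access": False,
}

# Resource descriptions as a deployment tool might report them.
resources = [
    {"name": "ehr-operational-db", "encryption_at_rest": True,  "public_access": False},
    {"name": "pii-vault",          "encryption_at_rest": True,  "public_access": False},
    {"name": "legacy-export",      "encryption_at_rest": False, "public_access": False},
]

violations = [
    f"{r['name']}: {key} must be {want}"
    for r in resources
    for key, want in ORG_POLICY.items()
    if r.get(key) != want
]

if violations:
    print("Phase gate FAILED:\n" + "\n".join(violations))
    sys.exit(1)   # block promotion to the next migration phase
print("Phase gate passed.")
```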

For compliance, PII/PHI data must be physically separated from the rest of the operational data. Proper application of data encryption and least privilege, combined with physical separation of data by tenant in multi-tenant systems, can help reduce the possibility of data being held for ransom. It also greatly reduces the blast radius of any breach.
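To make the separation pattern concrete, here is a minimal sketch (using the third-party Python `cryptography` package; the field names are invented, and a real system would fetch keys from a managed vault rather than generating them inline): identifiers are encrypted into a PII vault, while the operational store keeps only de-identified clinical data, linked by a surrogate key.

```python
# Minimal sketch of PII/PHI separation -- illustrative, not a compliance recipe.
import json
import uuid
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # in practice, retrieved from a managed KMS/vault
cipher = Fernet(key)

incoming = {"name": "Jane Doe", "ssn": "000-00-0000", "diagnosis_code": "E11.9"}
surrogate_id = str(uuid.uuid4())

# PII vault: encrypted identifiers, held in physically separate storage.
pii_store = {
    surrogate_id: cipher.encrypt(
        json.dumps({"name": incoming["name"], "ssn": incoming["ssn"]}).encode()
    )
}

# Operational store: de-identified clinical data only.
operational_store = {
    surrogate_id: {"diagnosis_code": incoming["diagnosis_code"]}
}

# Re-identification requires access to BOTH stores plus the key.
restored = json.loads(cipher.decrypt(pii_store[surrogate_id]))
print(restored["name"], operational_store[surrogate_id]["diagnosis_code"])
```

An attacker who breaches the operational store alone obtains no identities; one who breaches the vault alone obtains only ciphertext.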

In addition, a well-developed open API standard allows a security team to see how interactions take place between data repositories and how APIs communicate with databases. The API system’s operating model requires network isolation controls and the application of least privilege, which lets applications interact with each other while remaining separate and secure. This allows security teams to observe traffic between databases and analyze potential threats.
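As a rough illustration of least privilege at the API layer, the sketch below gates each call on the scopes carried by the caller’s token. The scope names, functions, and token shape are hypothetical; a real system would enforce this in an API gateway or framework middleware rather than in application code.

```python
# Minimal, hypothetical least-privilege gate for service-to-service calls.
from functools import wraps

def requires_scope(scope):
    """Reject a call unless the caller's token carries the needed scope."""
    def decorator(func):
        @wraps(func)
        def wrapper(token, *args, **kwargs):
            if scope not in token.get("scopes", []):
                raise PermissionError(f"missing scope: {scope}")
            return func(token, *args, **kwargs)
        return wrapper
    return decorator

@requires_scope("operational:read")
def read_operational_record(token, surrogate_id):
    return {"surrogate_id": surrogate_id, "diagnosis_code": "E11.9"}

@requires_scope("pii:read")
def read_pii_record(token, surrogate_id):
    ...   # would decrypt from the PII vault

# A reporting service's token can read operational data but never PII.
reporting_token = {"service": "reporting", "scopes": ["operational:read"]}
print(read_operational_record(reporting_token, "abc-123"))
try:
    read_pii_record(reporting_token, "abc-123")
except PermissionError as exc:
    print("Blocked:", exc)
```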

Finally, it is important that all communications are encrypted. No one can expect constant vigilance from those who use the communication channels; at any given moment, patients or providers may share very sensitive information. Encryption in transit ensures that information cannot fall into the wrong hands.
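Enforcing encryption in transit can be as simple as refusing to negotiate anything below a modern TLS version. A minimal Python standard-library sketch, with an illustrative host:

```python
# Minimal sketch: require certificate verification and TLS 1.2 or newer.
import ssl
import urllib.request

context = ssl.create_default_context()            # verifies certificates
context.minimum_version = ssl.TLSVersion.TLSv1_2  # reject legacy protocols

with urllib.request.urlopen("https://example.com/", context=context) as resp:
    print(resp.status, resp.getheader("Content-Type"))
```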

Migrating data to the cloud is not just an efficiency initiative. There is an urgent need for healthcare providers to migrate their data to the cloud and make their systems cloud native. In today’s environment, however, healthcare providers need to take a broader view to ensure sustainable security and efficiency (both in terms of capital and resources) and to make data a strategic business asset.

The central idea should always be an end state based on sound engineering and data architecture principles. Technology leaders and architects should remember that the problem is rarely the tools they use; it is the ecosystem that inevitably leads them to compromise the integrity of the data and of the system, which creates vulnerabilities.

It is important to understand that lasting security and data integration can only result from a strong holistic architecture, not from patching niche tools onto outdated systems with inconsistent configurations and implementations.


Sashank Purighalla is the founder/CEO of BOS Framework, a cloud engineering platform that automates the seamless transition to microservices and DevSecOps, enabling enterprises to drive data and product strategies with security, scalability, and compliance.
