The Application Decommissioning Process

By Bill Young, Senior Application Decommissioning Consultant

Two common questions we’re asked are “What is the process for decommissioning a legacy system?” and “Do you have an application decommissioning diagram?”

To answer both questions, we put together a 9-minute video that discusses the five basic steps of the application decommissioning process. The five steps are summarized in this infographic.

The five basic steps include:

1. Application Portfolio Analysis

The purpose of this step is to review the applications in your overall portfolio and consider:

  • Whether the data in those applications is still needed at all;
  • Whether the data should be migrated to a new production environment for continued active use and updates;
  • Whether the data is still useful for analytics or compliance reasons, but is no longer changing and does not depend on all the features of the original application, and so may be archived; or
  • A combination of the last two, with some data identified as unchanging and appropriate for archiving, and other data needed in an active, production system.
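The branching logic of this triage can be sketched as a small decision helper. All field names and rules below are hypothetical simplifications for illustration, not part of any real assessment tool:

```python
from dataclasses import dataclass

# Hypothetical sketch of the portfolio triage described above.
# Field names and thresholds are illustrative assumptions only.
@dataclass
class AppProfile:
    name: str
    data_still_needed: bool   # any business, analytics, or compliance value?
    actively_updated: bool    # are records still being created or changed?
    retention_expired: bool   # past compliance/audit/regulatory retention dates?

def triage(app: AppProfile) -> str:
    """Return one of: 'shut down', 'migrate', 'archive'."""
    if not app.data_still_needed or app.retention_expired:
        return "shut down"    # data past its "expiration date": simply turn it off
    if app.actively_updated:
        return "migrate"      # active records belong in a production system
    return "archive"          # static but valuable data suits an archive
```

A real portfolio analysis weighs many more factors (cost, risk, integrations, the mixed fourth case above); this only illustrates how the three basic outcomes relate.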

Application decommissioning with data archiving is the third of these alternatives, and our archiving platform also supports the fourth alternative, sometimes referred to as active or live archiving.

Sometimes features of newly available applications supersede and significantly enhance those of older applications, or a home-grown, purpose-built application has data which will be more easily supported and maintained by a commercial, off-the-shelf application. But the data from the older applications is still necessary for the business. CRM, HR, healthcare, and other systems containing actively used and updated records are common examples. In these cases, migrating the data to the new application before turning off the old application may be the required solution.

If the data on the legacy system isn’t required (is effectively past its “expiration date” for compliance, audit, or regulatory purposes), you can shut down the system. You can simply turn it off.

If the data still has value, but is no longer being updated, or the data must be retained to satisfy compliance requirements, archiving provides a low-cost alternative to retaining hardware and software licenses for otherwise minimally used applications. The archive provides access to the data, without the full capabilities of expensive, full-featured software.

2. Business and Data Analysis

Once the decision has been made to archive data, the next step is a business and data analysis. This entails:

  • Determining what data must be retained. Sometimes only transaction details are required, without full identification of accounts or personally identifiable information. In other cases, all the data must be retained to enable full reporting.
  • Determining how to retain the data. Can the data be stored in summary format (extracting reports from the source system), or is full data retention necessary to allow building highly detailed or analytical reports?
  • Identifying related data sources that should also be archived to provide complete reporting.
  • Deciding whether additional data modeling (for example, denormalization of data or materialized views) will help with reporting and performance.
  • Determining who will require access to the data, how frequently, and in what formats.
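The data-modeling point above can be made concrete with a small example: denormalizing two related source tables into one flat, read-optimized table for the archive. The table and column names here are hypothetical:

```python
import sqlite3

# Illustrative denormalization sketch: join two normalized source tables
# into a single wide table so archive reporting queries need no joins.
# All table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE accounts (account_id INTEGER PRIMARY KEY, holder TEXT);
    CREATE TABLE transactions (
        txn_id INTEGER PRIMARY KEY,
        account_id INTEGER REFERENCES accounts(account_id),
        amount REAL, txn_date TEXT
    );
    INSERT INTO accounts VALUES (1, 'Acme Corp'), (2, 'Globex');
    INSERT INTO transactions VALUES
        (100, 1, 250.00, '2014-03-01'),
        (101, 2,  75.50, '2014-03-02');

    -- One flat, read-optimized table for the archive: reporting no
    -- longer depends on reproducing the source system's joins.
    CREATE TABLE archive_txn_report AS
    SELECT t.txn_id, t.txn_date, t.amount, a.holder
    FROM transactions t JOIN accounts a USING (account_id);
""")
rows = conn.execute(
    "SELECT txn_id, holder FROM archive_txn_report ORDER BY txn_id"
).fetchall()
print(rows)  # [(100, 'Acme Corp'), (101, 'Globex')]
```

Whether such flattening (or a materialized view) is worthwhile depends on the reporting questions identified during this analysis phase.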

3. Design and Build

The analysis phase provides user stories describing the user requirements for access to the archive. The steps required to meet these requirements are:

  • Deploy and configure the archive. Configuration includes integration with directory services (AD, LDAP) and/or single sign-on (SSO) applications, storage configuration, encryption settings, backup and recovery processes, and high-availability configuration (if required).
  • Move the data from the source system to the archive via the ETL (Extract, Transform, Load) process. The data is extracted (or retrieved) from the legacy system, transformed into a format appropriate for the archive, and loaded into the archive. The ETL process is followed by chain-of-custody checking to ensure the desired data has been captured and has not been changed during the ETL process.
  • Define queries and user interfaces for well-defined, frequently used reporting, much as the source application had defined queries and reporting screens. This portion of the process does not reproduce the source system, but it provides access to the data in ways familiar to users of the source application.
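Chain-of-custody checking can be as simple as comparing record counts and content fingerprints computed before extraction and after load. A minimal sketch, assuming a canonical-ordering hash scheme (an illustrative choice, not any specific product's method):

```python
import hashlib

def record_fingerprint(record: dict) -> str:
    """Hash a record's fields in a canonical order so identical content
    yields the same digest regardless of column ordering."""
    canonical = "|".join(f"{k}={record[k]}" for k in sorted(record))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

def custody_check(source_rows: list, archived_rows: list) -> bool:
    """Verify nothing was lost or altered between extract and load:
    row counts match and the sorted per-row fingerprints match."""
    if len(source_rows) != len(archived_rows):
        return False
    return (sorted(map(record_fingerprint, source_rows))
            == sorted(map(record_fingerprint, archived_rows)))

# Example: one archived row was silently altered during transformation.
src = [{"id": 1, "amount": 250.0}, {"id": 2, "amount": 75.5}]
ok  = [{"amount": 250.0, "id": 1}, {"id": 2, "amount": 75.5}]
bad = [{"id": 1, "amount": 250.0}, {"id": 2, "amount": 99.9}]
print(custody_check(src, ok))   # True  (same content, different key order)
print(custody_check(src, bad))  # False (content changed during ETL)
```

In practice the transform step often changes formats intentionally, so the fingerprint must be computed over the agreed-upon canonical form of each record, not the raw source bytes.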

4. Deliver

Delivering the project means verifying that the archive satisfies the requirements captured in the analysis phase and the underlying user needs. This is accomplished through user acceptance testing of the reporting interfaces.

Delivery also includes training users on the interface, which will necessarily differ from the legacy application. Beyond end users, administrators and maintainers of the archive may also require training.

5. Decommission

Now that any data needed for safekeeping is in the archive, you can decommission the legacy system. This means turning off and reclaiming the hardware assets, canceling any support contracts for the decommissioned software, and so on.

Application decommissioning offers the opportunity for cost recovery: savings on the initial cost and annual maintenance of servers, reduced storage and software licensing costs, and reduced support costs (e.g., backup, recovery, and upgrades).
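Tallying those savings categories is simple arithmetic. The figures below are hypothetical placeholders, not benchmarks from any real engagement:

```python
# Back-of-the-envelope sketch of the savings categories above.
# All dollar figures are hypothetical placeholders, not benchmarks.
legacy_annual_costs = {
    "server_maintenance": 20_000,
    "storage": 12_000,
    "software_licenses": 35_000,
    "support_backup_recovery_upgrades": 8_000,
}
archive_annual_cost = 15_000  # assumed running cost of the archive platform

annual_savings = sum(legacy_annual_costs.values()) - archive_annual_cost
print(f"Estimated annual savings: ${annual_savings:,}")  # $60,000
```

The point of the exercise is that an archive replaces several recurring cost lines with one smaller one; your own line items will differ.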

If you have legacy systems and are unsure how to start, consider engaging Flatirons for an application portfolio analysis. An analysis can give you the information you need to understand what the process might look like for you and what the specific benefits for your organization could be.