When Application Decommissioning Is, and Is Not, an Effort in Reverse Engineering

By Patrick Moan, Program Manager

The goal of application decommissioning is to move data from legacy systems to a lower-cost, secure, read-only archive and then shut down the outdated legacy applications.

Legacy datasets can often occupy terabytes of storage. Having decommissioned hundreds of applications for clients, we have found that, for business continuity purposes, most organizations only require access to a subset of their data.

To validate data requirements, we begin by working with clients to understand their overall business goals and objectives along with specific search, reporting, and data retention requirements. From there, we identify the specific data to bring into the archive.

Extract, Transform, and Load (ETL)

Once the data has been identified, it must be extracted from the legacy system, transformed, and loaded into the archive. This is the ETL, or Extract, Transform, and Load, process. The extraction can often be performed using an application-specific connector that understands the translation between the source legacy data format and the target archive format. If an application-specific connector does not exist for the legacy data, then decommissioning the application becomes an exercise in reverse engineering.
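At its core, the ETL step is a mechanical pipeline. The sketch below illustrates the three phases using SQLite as a stand-in for both the legacy database and the archive; the table names, fields, and date format are hypothetical, not taken from any actual engagement.

```python
import sqlite3

# --- Extract: pull the identified rows from the (stand-in) legacy database ---
legacy = sqlite3.connect(":memory:")
legacy.execute("CREATE TABLE PROC_HIST (PROC_CD TEXT, PROC_DT TEXT, PHYS_ID TEXT)")
legacy.execute("INSERT INTO PROC_HIST VALUES ('47562', '20190301', 'D104')")
rows = legacy.execute("SELECT PROC_CD, PROC_DT, PHYS_ID FROM PROC_HIST").fetchall()

# --- Transform: normalize legacy conventions for the archive (e.g. ISO dates) ---
def transform(row):
    code, raw_date, phys = row
    iso_date = f"{raw_date[0:4]}-{raw_date[4:6]}-{raw_date[6:8]}"  # YYYYMMDD -> YYYY-MM-DD
    return (code, iso_date, phys)

archive_rows = [transform(r) for r in rows]

# --- Load: write the transformed rows into the read-only archive store ---
archive = sqlite3.connect(":memory:")
archive.execute("CREATE TABLE procedures (code TEXT, date TEXT, physician TEXT)")
archive.executemany("INSERT INTO procedures VALUES (?, ?, ?)", archive_rows)
archive.commit()
```

In a real engagement each phase is far richer, but the separation of concerns holds: extraction knows the legacy schema, the transform knows both formats, and the load knows only the archive.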

How to Extract Data Without an Application-Specific Connector

When reverse engineering, one needs to identify which legacy database fields drive the search and reporting required in the archive. For example, take a case where a hospital requires a report on the various procedures a patient has undergone. We'd need to know the tables and fields associated with procedure-specific information such as the procedure code, procedure name, date, and physician.

In most cases, when reverse-engineering an application, SQL tracing can be used to determine which database fields are accessed by a given report in the legacy system. The investigation consists of turning on SQL tracing for the legacy application, running a report of interest, then analyzing the SQL trace logs to determine which fields in a given database table were accessed. Not infrequently, further analysis needs to be performed to make sense of complex joins or to decompile a stored procedure that manipulates the data in some way.
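The log-analysis step can often be partially automated. As a minimal sketch, assuming the trace produces plain SELECT statements (real trace formats vary widely by database vendor), a script like the following can harvest candidate tables and columns; the table and column names shown are invented for illustration.

```python
import re

# Hypothetical excerpt from a SQL trace log captured while the report ran
trace_log = """
EXEC SELECT p.PROC_CD, p.PROC_DT, d.PHYS_NM FROM PROC_HIST p JOIN DOCTORS d ON p.PHYS_ID = d.PHYS_ID WHERE p.PAT_ID = ?
EXEC SELECT PAT_NM FROM PATIENTS WHERE PAT_ID = ?
"""

tables, columns = set(), set()
for stmt in trace_log.strip().splitlines():
    # Table names follow FROM or JOIN keywords
    tables.update(re.findall(r"(?:FROM|JOIN)\s+(\w+)", stmt))
    # Column names sit between SELECT and FROM; drop any alias prefix
    select_list = re.search(r"SELECT\s+(.*?)\s+FROM", stmt)
    if select_list:
        for col in select_list.group(1).split(","):
            columns.add(col.strip().split(".")[-1])
```

A pass like this narrows the search quickly, but the complex joins and stored procedures mentioned above still require manual analysis.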

There are certainly cases where SQL tracing is not available. Take the case of a 1970s-era mainframe application using a VSAM database. In this scenario, the COBOL copybook, which describes how the data is stored in the VSAM database, must be relied upon. Our customer would provide data dumps out of the VSAM database, and we'd analyze the data using the copybook as a guide when writing logic to transform and load it into the archive.
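Because a copybook defines fixed field widths, the dump can be sliced into named fields mechanically. The sketch below assumes a hypothetical three-field copybook (the field names and widths are illustrative only) and parses one character-mode record accordingly:

```python
# Hypothetical copybook fragment this layout is derived from:
#   01 PROC-REC.
#      05 PROC-CD  PIC X(5).
#      05 PROC-DT  PIC 9(8).
#      05 PHYS-ID  PIC X(4).
LAYOUT = [("PROC-CD", 5), ("PROC-DT", 8), ("PHYS-ID", 4)]

def parse_record(raw: str) -> dict:
    """Slice a fixed-width record into named fields per the copybook layout."""
    fields, offset = {}, 0
    for name, width in LAYOUT:
        fields[name] = raw[offset:offset + width].strip()
        offset += width
    return fields

record = parse_record("4756220190301D104")
```

Production copybooks are far more involved (EBCDIC encoding, packed-decimal COMP-3 fields, REDEFINES and OCCURS clauses), but the principle is the same: the copybook is the only map of the data, so the transform logic follows it field by field.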

Regardless of the approach we take to archive your data, it's always helpful to have a subject matter expert in your organization who understands how your data is laid out in the legacy database. Sometimes this person exists; sometimes, due to retirements or turnover, they don't. The key point is that decommissioning any application benefits from targeted collaboration with the technical expertise within your organization.

In advance of any engagement with Flatirons Digital Innovations, we’d aim to understand up-front which of the scenarios described here apply to the applications you’re considering for decommissioning so that mutually understood expectations are established at the outset.