How to Become Year 2000 Compliant

Changing all the date references in software code is complicated and time-consuming work, and yet it must be done if agency computer networks are to survive into the next millennium. Information technology companies are offering a wide variety of products to help analyze source code, convert date fields and test applications.

"The first step in any year 2000 project should be drawing up a detailed plan that determines the methodologies to be used and the proper mix of technical resources," says Kathleen Adams, associate commissioner for systems design and development at the Social Security Administration, which began year 2000 preparations in 1989. "The trickiest part is finding all the code that needs to be changed."

Software tools are available to help agencies analyze source code and locate all date fields. These products work several different ways. Some use clock simulators to flag applications that produce errors when the calendar rolls over to 2000. Others use browsers that scan text for date references.
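The browser-style approach can be sketched in a few lines. This illustrative Python routine (the token list and sample record layout are assumptions for demonstration, not any vendor's actual product) scans source text and flags lines containing date-like identifiers:

```python
import re

# Tokens that commonly suggest a date field in data definitions.
DATE_TOKENS = re.compile(r"\b\w*(DATE|YEAR|YY|YR|DT)\w*\b", re.IGNORECASE)

def find_date_lines(source: str) -> list[tuple[int, str]]:
    """Return (line number, line text) for lines mentioning a date-like token."""
    hits = []
    for num, line in enumerate(source.splitlines(), start=1):
        if DATE_TOKENS.search(line):
            hits.append((num, line.strip()))
    return hits

# Hypothetical Cobol record layout used only to exercise the scanner.
sample = """\
01 EMP-RECORD.
   05 EMP-NAME     PIC X(30).
   05 HIRE-DATE    PIC 9(6).
   05 EMP-SALARY   PIC 9(7)V99.
"""
print(find_date_lines(sample))  # flags only the HIRE-DATE line
```

Real analysis tools go further, parsing the language rather than pattern-matching text, which is why they can catch date arithmetic hidden behind unhelpful field names.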

Adpac Corp.'s SystemVision Year 2000 software uses parsing algorithms to identify all date occurrences in customized Cobol code on mainframes. Database vendor Informix offers a COTS 2000 program that identifies which legacy programs are best candidates for replacement with commercial, off-the-shelf (COTS) software.

COTS software can be a viable option for many administrative systems, but will not work for detailed applications based on rapidly changing laws and regulations. When the Federal Reserve began gearing up for the year 2000 problem five years ago, it decided to write new programs for its legacy systems. The homegrown software recently replaced old mainframe programs controlling securities transfers, accounting and other treasury functions.

Some software tools examine relationships between programs and systems, and then generate statistics on how components are cross-referenced. After conducting line-by-line impact analyses of all date references within systems, the tools determine the complexity of the required changes and project the conversion costs. Some even offer detailed revision plans.

Both the Energy Department and the IRS have relied on semi-automated software from Viasoft Inc. The planning tools helped the agencies estimate the extent of the required reprogramming.

Code Conversion

Finding date references is only half the battle. Software code must be made Y2K-compliant. The most common conversion methodology is simply to change all two-digit year fields to four-digit year fields. But such field expansion requires significantly more storage capacity.
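A field-expansion pass can be sketched as follows. This illustrative Python routine assumes records store dates as six-digit YYMMDD strings and that every date already in the data belongs to the 20th century; the data must be migrated before any year 2000 dates arrive:

```python
def expand_date(yymmdd: str, century: str = "19") -> str:
    """Expand a six-digit YYMMDD field to an eight-digit CCYYMMDD field.

    Assumes all existing records belong to the given century.
    """
    if len(yymmdd) != 6 or not yymmdd.isdigit():
        raise ValueError(f"not a YYMMDD field: {yymmdd!r}")
    return century + yymmdd

print(expand_date("991231"))  # 19991231
```

Every converted field grows by two bytes, which is exactly the storage penalty described above, multiplied across every date in every record of every file.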

An alternative is to use specialized compression technology that forces four-digit year references into two-digit date fields. Dates then can be decompressed when output to printers or other systems. This approach, however, is fairly complex and therefore requires a high degree of conformity across programs.
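One illustrative compression scheme (an assumption for demonstration, not any particular vendor's technique) encodes the year's offset from 1900 as a two-digit base-36 number, so a four-digit year still fits the old two-character field:

```python
DIGITS = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def compress_year(year: int) -> str:
    """Pack a four-digit year into two characters as base-36 (year - 1900).

    Covers 1900 through 3195 without widening the record.
    """
    offset = year - 1900
    if not 0 <= offset < 36 * 36:
        raise ValueError(f"year out of range: {year}")
    return DIGITS[offset // 36] + DIGITS[offset % 36]

def decompress_year(field: str) -> int:
    """Recover the four-digit year for printing or external interfaces."""
    return 1900 + DIGITS.index(field[0]) * 36 + DIGITS.index(field[1])

print(compress_year(1999), compress_year(2010))  # 2R 32
```

The catch is visible even in this toy version: every program that touches the field must agree on the encoding, and every output path must remember to decompress, which is why the approach demands such a high degree of conformity.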

Two-digit year references can be converted to three-digit year fields that count from 1900, dropping the leading digit for dates in the 21st century. But this technique also is complex and can be problematic when the fields are translated for external consumption.
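Under one common reading of the three-digit scheme, the field holds the year's offset from 1900, so 1950 becomes 050 and 2010 becomes 110. A sketch, assuming that interpretation:

```python
def to_three_digit(year: int) -> str:
    """Store a year as a three-digit offset from 1900: 1950 -> '050', 2010 -> '110'."""
    offset = year - 1900
    if not 0 <= offset <= 999:
        raise ValueError(f"year out of range: {year}")
    return f"{offset:03d}"

def from_three_digit(field: str) -> int:
    """Recover the full year from the three-digit internal field."""
    return 1900 + int(field)

print(to_three_digit(1950), to_three_digit(2010))  # 050 110
```

The external-consumption problem shows up the moment a field such as 110 leaves the system: a trading partner expecting two- or four-digit years has no way to interpret it without the same conversion logic.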

One popular conversion methodology is called windowing logic, in which two-digit years greater than or equal to 50 are interpreted as 1900s dates, and years less than 50 are interpreted as 2000s dates. The number 50, for instance, would read as 1950, while 10 would read as 2010. The only problem is that the methodology cannot be used in applications whose dates span more than 100 years. For that reason, windowing logic generally is reserved for legacy systems that soon will be replaced by client-server networks.
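Windowing logic with the 50/49 pivot described above can be sketched as:

```python
PIVOT = 50  # two-digit years >= 50 fall in the 1900s; < 50 fall in the 2000s

def window_year(yy: int) -> int:
    """Interpret a two-digit year against a fixed 1950-2049 window."""
    if not 0 <= yy <= 99:
        raise ValueError(f"not a two-digit year: {yy}")
    return 1900 + yy if yy >= PIVOT else 2000 + yy

print(window_year(50), window_year(10))  # 1950 2010
```

Because 1949 and 2049 map to the same two digits, any record set whose dates span more than the window's 100 years is ambiguous; that is the limitation that confines the technique to short-lived legacy systems.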

Intersolv, a provider of client-server development programs, offers tools for making changes to Cobol applications. Computer Horizons Corp. has a five-phase process for managing the year 2000 conversion across enterprises. Several companies, such as Adpac, Peritus Software Services and Trans-Century Data Systems, use artificial intelligence to fix date references.

Software Testing

Perhaps the most important phase of year 2000 projects is testing and validation. If even 1 percent of the dates are skipped or incorrectly converted, bad dates could infiltrate clean data, and information could be lost or compromised. "If you don't build in filters, you're leaving information systems as vulnerable as they were before the date conversions were done," says Adams.

Customized test data sets must be created to run unit tests on individual pieces of code. The converted code then has to be integrated with the rest of the system and validated for accuracy. Next, the code must be deployed into a simulated year 2000 environment to verify that it will survive. Finally, a clean database has to be created so the converted system does not get polluted by unconverted data.
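A unit test along these lines exercises the rollover boundaries with a customized data set. The routine under test here is a hypothetical converted date interpreter, assumed purely for illustration:

```python
def interpret_year(yy: int, pivot: int = 50) -> int:
    """Hypothetical converted routine under test: fixed-window year logic."""
    return 1900 + yy if yy >= pivot else 2000 + yy

# Boundary cases a year 2000 test data set should exercise:
# the century rollover itself and both edges of the window.
BOUNDARY_CASES = {99: 1999, 0: 2000, 50: 1950, 49: 2049}

def run_unit_tests() -> None:
    for yy, expected in BOUNDARY_CASES.items():
        got = interpret_year(yy)
        assert got == expected, f"yy={yy}: expected {expected}, got {got}"
    print("all boundary cases passed")

run_unit_tests()
```

Real year 2000 test suites were far larger, of course: the labor lay in building representative data for every file format and running the converted system against a clock rolled forward past the century mark.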

"Testing represents 50 percent of the cost in year 2000 projects because it is so labor-intensive," says Robert Molter, computer scientist for the Information Technology Directorate in the Office of the Assistant Secretary of Defense for Command, Control, Communications and Intelligence. "Although it's expensive and difficult, the process is worth every penny because it ensures that data won't lose its integrity and that year 2000 projects will be successful."
