December 19, 1999

In Final Year 2000 Testing, Focus Is on Smallest Flaws

By BARNABY J. FEDER

WALTHAM, Mass. -- The strings of letters and numbers flashing on his computer screen were unintelligible to Kishore Parwani, but he knew that each was the name of a computer program with hundreds or even thousands of lines of code. He also knew something was wrong, because the screen was telling him the code was free of Year 2000 faults, a result as believable as a report that a unicorn had won the Kentucky Derby.

The programs, sent by one of the nation's largest commercial real estate companies for testing by Data Integrity Inc., had been certified as Year 2000-ready by the large software company that created them. But Data Integrity and other companies that review code to verify software claims typically find scores, even hundreds, of serious oversights.

"Our best clients still have 40 to 50 errors per million lines of code," said Richard E. Evans, an analyst with Meta Group of Stamford, Conn., a consulting firm that provides information on verification tools and services. "Half of those could corrupt data or crash systems." That adds up to thousands of potentially serious flaws for banks, insurance companies and others.

The government and most of corporate America have declared that virtually all of their critical systems will function normally when Jan. 1 arrives. But because only a portion of most computer code is actually tested to make sure the year "00" will be correctly interpreted, even the most confident computer managers anticipate at least minor flaws.

Thus, as the repairs and testing wind up, Year 2000 boils down to one pressing question: Since stamping out every Year 2000 date problem is impossible, has the caseload of miscalculations and crashes been reduced to manageable levels?

For the real estate company, the mirage of clean code disappeared when Mr. Parwani adjusted his scanning tactics for the obscure code in which the program was written.
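The "00" misinterpretation the article describes can be made concrete with a small sketch. The function names, the 365-day simplification and the windowing pivot below are illustrative assumptions, not anything from the article; they show how two-digit years break date arithmetic and how a common "windowing" repair restores it.

```python
# Illustrative sketch of the classic two-digit-year flaw: the century
# is never stored, so arithmetic across 1999 -> 2000 goes wrong.

def days_overdue_two_digit(due_yy: int, today_yy: int) -> int:
    """Naive overdue count using two-digit years (365-day years for simplicity)."""
    return (today_yy - due_yy) * 365

# A payment due in 1999 ("99"), checked in 2000 ("00"):
print(days_overdue_two_digit(99, 0))  # -36135: the payment now looks 99 years early

# A "windowing" repair: two-digit years below a pivot are read as 20xx.
def expand_year(yy: int, pivot: int = 50) -> int:
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(0) - expand_year(99))  # 1: the centuries now compare correctly
```

Flaws of this kind rarely crash anything outright, which is why, as the article notes, they surface as quiet data errors such as a misreported overdue rent.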
A three-day review of close to 2.5 million lines of the software vendor's supposedly Year 2000-ready code identified about 250 flaws. "We found at least 10 flaws that would have required several days to fix," said a programmer for the real estate company, which allowed a reporter to observe the procedure on the condition that it not be identified. "They would not have stopped business but they might have interfered with things like tracking how long rents are overdue."

While true showstoppers rarely turn up in such inspections, the number of flaws uncovered naturally raises questions about whether the government and many corporations are overstating their readiness. The prevailing confidence is probably justified as far as the New Year's weekend goes, but the longer-term picture is murkier, according to verification-tool providers like Data Integrity.

As with software flaws in general, system crashes are usually less troublesome than malfunctions that generate faults not immediately apparent. "Less than 10 percent of the problems we find would cause something to stop," said Scott Hilson, director of technical support for Reasoning Inc., a Palo Alto, Calif.-based rival of Data Integrity. "This is more like termites than an earthquake."

The Year 2000 termites might be more dangerous than normal bugs because they are expected to peak in the first weeks of January, when many computer workers are already stretched thin handling malfunctions that occur as the old year ends and the new one begins. "Anywhere from 2 percent to 5 percent of computer jobs normally fail in late December and early January," Mr. Evans of Meta Group said. That rate will more than double this year, according to projections by the Gartner Group, a technology consulting firm in Stamford, Conn. Unfortunately, thanks to the Year 2000 challenge, even the normal problems are likely to be more common in coming weeks.
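The flaw densities quoted so far can be checked with back-of-the-envelope arithmetic. The 250 flaws, 2.5 million lines, and Meta Group's 40-to-50-per-million figure are from the article; the 100-million-line portfolio at the end is a hypothetical size chosen only to show how "thousands of potentially serious flaws" falls out of the numbers.

```python
# Flaw density of the real estate company's vendor code (figures from
# the article; the calculation itself is just arithmetic).
flaws_found = 250
lines_scanned = 2_500_000

density_per_million = flaws_found / lines_scanned * 1_000_000
print(density_per_million)  # 100.0 flaws per million lines

# Meta Group's "best clients": 40-50 flaws per million, half of them
# serious. Take the midpoint, 45, and halve it.
best_client_serious = 45 / 2  # ~22.5 serious flaws per million lines

# Scaled to a hypothetical 100-million-line bank portfolio:
print(best_client_serious * 100)  # 2250.0 potentially serious flaws
```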
In addition to the Year 2000 flaws, programmers will be wrestling with other errors inadvertently added to the code during repair efforts. Based on an examination of 30 years of software records, Capers Jones, chairman of Software Productivity Research Inc. in Burlington, Mass., predicts that Year 2000 workers have introduced 7 flaws for every 100 they fixed. Moreover, many computer users bought new software to retire programs that had date problems, and newly installed software tends to have more bugs than average.

Nevertheless, vendors like Data Integrity have found contracts harder to come by than analysts had originally projected. Sales of the automated tools the vendors provide have reached about $200 million, the analysts estimate, but more had been expected.

The tools are generally modified versions of programs that find and repair date flaws. One of their strongest selling points, beyond blazing speed, is that they scan every line in a program. By contrast, although running test programs provides a better window on real-world use, such tests require intricate scripts -- and even complex simulations, like those conducted by Wall Street in March and April, can check only a small percentage of a large user's code.

Most verification tools start with vast glossaries of all the terms that might be used for dates in coding and for operations using them, like tracking how many days a payment is overdue. Data Integrity's approach is somewhat different: it looks not for date terms but for coding that indicates a mathematical operation, like "compare," and then tries to filter out commands that could not involve Year 2000 errors.

Regardless of approach, users generally pay for verification tools based on the number of lines of code to be scanned. Rates generally run from 4 to 10 cents a line, with a typical contract covering hundreds of thousands of lines at a minimum. "Price can be a sticking point for some smaller organizations," Mr. Burgess said, adding that Data Integrity's contracts have run from $90,000 to more than $2 million.

Analysts estimate that the big computer users in the United States with comprehensive Year 2000 programs have used either tools or human checks on 40 to 60 percent of their repaired code. "Once you get outside the U.S. and a few countries, you get down to single digits in how much of the code has been independently verified," Mr. Evans said.

The first generation of automated Year 2000 tools, in the mid-1990's, was difficult to use. The tools frequently overwhelmed programmers with lists of potential glitches that on closer inspection had nothing to do with dates. Many computer users turned instead to so-called body shops like I.B.M., CAP Gemini and Keane, which relied more heavily on human programmers than on automation for verifying code. Many of the tools now have better filters, but programmers acquainted with how the code is used must still determine which of the flaws are worth investigating.

As Data Integrity's contract with the real estate company demonstrated, the independent verification business has not dried up even though January is just around the corner. But many prospective clients are suffering from "Y2K fatigue," Mr. Burgess said. "At the end of October, they started saying it was too late to do anything more."

Today, the rooms in Data Integrity's office in a business park near Route 128 outside Boston are mostly empty. But that is not because the company is shriveling as demand for its main product slows. The office is a new one that the company moved into in anticipation of adding to its staff of 30 next year as its tools, like those of its rivals, are adapted to cleaning up other faults in software. Thanks to Year 2000, they should get at least a chance to make pitches to most of the world's biggest computer users.
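The glossary-style scanning described above -- flag code that touches date-like terms, then filter out lines that could not involve a Year 2000 error -- can be sketched in miniature. The glossary, the filtering rule and the COBOL-flavored sample lines below are invented for illustration; real tools of the era used far larger term lists and language-aware parsing.

```python
import re

# Sketch of a glossary-style Year 2000 scanner: flag lines whose
# identifiers match a date glossary, then keep only lines that also
# perform a comparison or arithmetic, since a date merely displayed
# cannot miscalculate. Glossary and sample code are invented.

DATE_GLOSSARY = re.compile(r"\b(date|year|yy|yr|duedate|expiry)\w*\b", re.IGNORECASE)
MATH_OP = re.compile(r"[<>=]|\b(ADD|SUBTRACT|COMPUTE)\b")

def scan(source_lines):
    """Return (line_number, text) pairs that merit human review."""
    hits = []
    for n, line in enumerate(source_lines, start=1):
        if DATE_GLOSSARY.search(line) and MATH_OP.search(line):
            hits.append((n, line.strip()))
    return hits

sample = [
    "MOVE CUST-NAME TO PRINT-NAME",         # no date term: ignored
    "IF DUE-YY > CUR-YY PERFORM LATE-FEE",  # date term plus compare: flagged
    "DISPLAY REPORT-DATE",                  # date term, no math: filtered out
]
for n, text in scan(sample):
    print(n, text)
```

As the article notes of the mid-1990's tools, the hard part is the filtering: a crude glossary floods programmers with false positives, which is why human review of the flagged lines remained essential.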