Legacy Computer Systems
Please understand that "legacy" simply means inherited, in place, incumbent. A legacy system is not necessarily obsolete or even old - it is simply what is already there. The term usually comes up in three situations: when two companies merge (each brings its legacy systems), when a system was developed a long time ago (in computing terms) and is very often proprietary, or when a company is planning to replace or upgrade a system.
"Legacy system ... A computer system or application program which continues to be used because of the cost of replacing or redesigning it and often despite its poor competitiveness and compatibility with modern equivalents. The implication is that the system is large, monolithic and difficult to modify.
"If legacy software only runs on antiquated hardware the cost of maintaining this may eventually outweigh the cost of replacing both the software and hardware unless some form of emulation or backward compatibility allows the software to run on new hardware."
Reference: http://wombat.doc.ic.ac.uk/foldoc/foldoc.cgi?legacy+system (2004)
Legacy applications are a challenge to list, since most mainframe systems are unique; they are often cited in online discussions by their initials, such as CYB. Eventually software development companies arose that wrote 'packaged' applications. One such company is Computer Associates, which started as a storefront operation in Garden City, Long Island, NY, back in the 1970s. Databases for the mainframes were non-relational and bulky, but could rely on the capacity of the mainframe to hold the information and make it available to all terminal users; interfaces were simple text.
Languages used were FORTRAN, BASIC, and COBOL, along with dialects of each, code tied to particular environments such as AIX (IBM's Unix) or VMS (the VAX operating system), and a rash of unique languages. All of these languages are considered "third generation" - they need a compiler (or interpreter) to change them into machine code. I had even worked on systems that used two languages merged together - one for the processes and a second for the graphics; the computers that used this type of coding had two processors in them, and you had to tell the computer when to switch processors.

First generation (machine language) - actual zeroes and ones - is still used in certain cases, such as space applications. Second generation languages are Assembler/Assembly languages. These require knowledge of the specific CPU being used, since the instructions are all orders on how to move bytes and words from stack to heap to register and back. While Assembler is still taught in schools, it is taught mainly to let the student understand the actual processing involved, and usually for an ancient (but simpler) processor such as the 808x series.

COBOL was created partly to end the 'blackmailing' of companies by programmers who would write undocumented code in obscure languages just to make the company keep them on the job; it became the mainstay of business - airlines, travel agencies, colleges, and many more. Thus, in 1998, COBOL programmers had job security as Y2K repairers. A lot of companies took the opportunity to switch to completely new applications, which were not being written in COBOL. So starting in mid-2000, those COBOL programmers hit the streets again and found they no longer had a market.
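The translation from a third-generation language down to machine-level instructions can be glimpsed even in a modern language. Here is a minimal sketch (in Python, chosen purely for illustration - none of the legacy systems above used it) using the standard-library dis module to show the lower-level instructions a simple function is compiled into:

```python
import dis

def add_tax(price, rate):
    """A 'third generation' style statement: say what to compute, not how."""
    return price + price * rate

# The compiler has already translated the source into lower-level
# instructions (bytecode here; FORTRAN or COBOL compilers go a step
# further and emit actual machine code for the target CPU).
for instr in dis.Bytecode(add_tax):
    print(instr.opname)
```

The printed opcodes (loads, arithmetic operations, a return) are exactly the kind of "orders on how to move bytes and words around" that a second-generation Assembler programmer writes by hand.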
Legacy network systems include Pathworks, LANtastic, IBM's foray into networking (the name escapes me), and Banyan Vines. Some people might consider Novell a legacy system, but I know huge companies running on partial or full Novell networking. Microsoft was aware enough of that to build a Novell bridge and a conversion package into their NT network. We might even consider NT 4.0 a legacy system, since it has been superseded by Windows 2000 and XP. Those of us who had our NT4 patches ripped off our shoulders aren't worried, though, since the migration of a network operating system (NOS) is so expensive it's hard to justify.
Legacy operating systems (OSs) would of course include DOS, NT, and Windows 95 and 98. Beyond that there are companies struggling with pockets of Macintoshes, GEM (the graphical OS that worked alongside DOS), and even older Apple machines in school systems and some offices.
MANAGEMENT OF LEGACY SYSTEMS: This can be relatively simple, since such systems are not upgraded (if it ain't broke....). I have a friend who lives and plays the trumpet in California; he is flown into Boston two weeks of every month, to his own apartment, just to maintain an insurance company's customer service system, written in some obscure language I can't remember - and he is paid handsomely for it! But in general the maintenance of a legacy system, be it an application, an OS or whatever, can be taught to successive employees.

The problems arise when conversion and retirement are considered. The data is often in a unique format, and someone has to convert it to a new format - sometimes a formidable task. Even retirement of an application would almost demand the conversion of its data in a business, so that historical data remains retrievable if it is needed. Since operating systems also obsolesce, if legacy applications or data depend on one OS, separate machinery may need to be maintained to keep the legacy application going. This can become quite cumbersome, and you will need time to train personnel on the management of the hardware as well as the software. Sometimes, as in my friend's case, specially talented personnel need to be retained, and they are very expensive. Manufacturers of hardware and software close down, or simply refuse to support obsolescent applications and OSs. There comes a point of no return when a company has to consider retirement.
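To make the data-conversion problem concrete: suppose (hypothetically) a legacy application stored its customer records as fixed-width text fields. A sketched Python script to pull those records into a delimited format a modern system can import might look like this - the layout and field names are invented for illustration:

```python
import csv
import io

# Hypothetical fixed-width layout from a legacy customer file:
# columns 0-9 account number, 10-39 name, 40-49 balance in cents.
FIELDS = [("account", 0, 10), ("name", 10, 40), ("balance_cents", 40, 50)]

def convert_record(line):
    """Slice one fixed-width record into named, trimmed fields."""
    return {name: line[start:end].strip() for name, start, end in FIELDS}

def convert_file(legacy_text, out):
    """Write the legacy records out as CSV, a format new systems can load."""
    writer = csv.DictWriter(out, fieldnames=[f[0] for f in FIELDS])
    writer.writeheader()
    for line in legacy_text.splitlines():
        writer.writerow(convert_record(line))

# Example run with two made-up records, padded to the fixed widths.
legacy = "\n".join([
    "0000012345" + "John Q. Customer".ljust(30) + "0000004250",
    "0000067890" + "Jane Doe".ljust(30) + "0000191000",
])
buf = io.StringIO()
convert_file(legacy, buf)
print(buf.getvalue())
```

In practice the formidable part is rarely the slicing itself; it is discovering the layout in the first place, when the only documentation left is the person who maintains the system.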
MIDDLEWARE: This is a relatively new term, even though the concept has been around for a while. The success of huge enterprise-wide packages such as SAP and PeopleSoft has made this a market of its own. These packages are not cheap - they can go for as much as a quarter of a million dollars - but without middleware, companies were not going to take a chance on moving off systems that had served them for ten years or more.

Middleware may also be a communication system. While trying out different software and hardware, companies accumulated collections of OSs and NOSs. Middleware was developed that bridged the different platforms, so the company could get back to intercommunication. This way a person could have a single PC on the desk instead of a mainframe terminal plus a Mac plus possibly an Intel PC. Despite the move toward standardized hardware (inspired by the success of networking), some companies still maintain specialty platforms, such as Macintosh graphics networks; these still need middleware to save to a repository, print on special printers, and manage regular work.

Apple may have made the first middleware when it designed its 3.5" floppy drives to accept and read Intel-format disks, along with software to write to them. This made the transition between Mac and Intel simple to achieve. I used it regularly, since MS Office was on both machines and I could take advantage of the strengths of both platforms. For example, I had a client whose company was run on a Mac network. They had a vendor offering parts lists in ASCII - four different price lists for 35,000 parts. I would download the ASCII information onto disks, then import it into Excel on an Intel PC, because the Macs used so much RAM for their system that they could not handle the huge data transfers the Intel machines could. Once I had the spreadsheets set up, I broke them into smaller collections of spreadsheets the Mac could handle, then converted them to Mac disks.
The whole process took 6-7 hours, so my client decided that it was better to keep the vendor’s machine and have his salesmen walk over to it when they needed information, since these parts lists came out every month!
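The workaround in that story - splitting a dataset too large for one machine into pieces a smaller machine can digest - is a pattern that still turns up in migrations. A toy Python sketch, with all numbers invented for illustration:

```python
def split_into_chunks(rows, chunk_size):
    """Break a large parts list into pieces a smaller machine can load."""
    return [rows[i:i + chunk_size] for i in range(0, len(rows), chunk_size)]

# 35,000 parts, as in the story; say the Mac of the day could only
# comfortably open 5,000 rows per spreadsheet (that figure is made up).
parts = [f"part-{n:05d}" for n in range(35000)]
chunks = split_into_chunks(parts, 5000)
print(len(chunks))  # 7 smaller files instead of one huge one
```

The arithmetic is trivial; the expensive part, then as now, was the hours of shuttling data between machines - which is exactly why my client chose not to automate it at all.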
Despite the newness of the term, middleware has been around for a while, especially for monitoring networks or applications; such tools were simply referred to as third-party add-ons. Called sniffers, some were developed to watch the activity on a network to identify logjams and problem users. Others were developed to help applications communicate with web browsers (ever hear of a plug-in?), to cross platforms, or to do statistical analysis of information in an application (such as reviewing client use of banking accounts). When third parties started developing migration software to move information from legacy systems to newer ones, the term middleware cropped up and became a buzzword. Now a lot of the companies offering new packages also offer the conversion middleware, to encourage migration.
Part of any migration decision is to evaluate the legacy systems and determine whether it is feasible to migrate them, or more cost-effective to simply connect into them.