What is legacy?

Legacy refers to older or outdated technology, such as software, code, hardware, or systems, that remains in use because it still performs essential functions for a business or organization.

Let's break it down

Legacy technology can be split into three main categories:

  • Legacy code: programs written years ago, often in older programming languages or dated styles (see the sketch after this list).
  • Legacy hardware: physical devices like mainframes or old servers that are still running.
  • Legacy systems: whole applications or platforms that were built long ago and have not been replaced.
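
To make the legacy code bullet concrete, here is a minimal, hypothetical Python sketch. The billing scenario and the function name calc_inv_total are invented for illustration; the point is that legacy code is often terse, thinly documented, and written in a dated style, yet still produces correct results.

```python
# Hypothetical example of "legacy" code: still correct, still in use,
# but dated in style (percent-formatting, no type hints, terse names)
# and with almost no documentation.

def calc_inv_total(items, tax):
    # items: list of (description, unit_price, quantity) tuples
    total = 0
    for i in items:
        total = total + i[1] * i[2]
    total = total + total * tax
    return "TOTAL: %.2f" % total


if __name__ == "__main__":
    # The routine still does its essential job, which is exactly why
    # code like this tends to stay in production for years.
    print(calc_inv_total([("widget", 9.99, 3), ("gadget", 24.50, 1)], 0.08))
```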

Why does it matter?

Legacy matters because it keeps critical business processes running, but it also creates challenges. Maintaining old code can be costly, applying security updates may be difficult, and finding people who understand the technology is increasingly hard. Balancing its reliability with the need for modernization is a key concern.

Where is it used?

Legacy technology is common in banks, insurance companies, government agencies, healthcare providers, and large enterprises that have built complex systems over decades. Organizations that cannot afford downtime often keep legacy components alive.

Good things about it

  • Proven stability: it has been tested in real‑world conditions for years.
  • Business continuity: replacing it risks breaking essential services.
  • Low immediate risk: no need for a big, disruptive migration project right now.
  • Staff familiarity: existing teams have worked with it for a long time and know its quirks.

Not-so-good things

  • Hard to maintain: code may be poorly documented and use outdated languages.
  • Security vulnerabilities: older systems often no longer receive modern security patches.
  • Integration issues: connecting legacy tech with new tools can be complex and costly (see the adapter sketch after this list).
  • Talent shortage: fewer developers know how to work with old languages or hardware.
  • Stifles innovation: resources spent on upkeep can’t be used for new features.
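
One common way teams cope with the integration issue above is to wrap the legacy routine behind a small adapter so newer code can work with structured data instead of raw text. The sketch below is a minimal, hypothetical Python example; the names legacy_get_balance and AccountBalance are invented for illustration and do not refer to any real system.

```python
from dataclasses import dataclass


def legacy_get_balance(account_id):
    # Stand-in for an old routine that returns a flat text record,
    # the kind of output a mainframe batch job might produce.
    return "ACCT:%s BAL:001250.75 CUR:USD" % account_id


@dataclass
class AccountBalance:
    account_id: str
    amount: float
    currency: str


def get_balance(account_id: str) -> AccountBalance:
    # Adapter: call the legacy routine, then parse its text format
    # into a typed object that modern services can consume directly.
    raw = legacy_get_balance(account_id)
    fields = dict(part.split(":", 1) for part in raw.split())
    return AccountBalance(
        account_id=fields["ACCT"],
        amount=float(fields["BAL"]),
        currency=fields["CUR"],
    )


if __name__ == "__main__":
    print(get_balance("12345"))
```

The adapter keeps the old routine untouched, which preserves its proven behavior, while giving new code a clean, typed interface to build on.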