If any company had a reason to dump Oracle, it’s Amazon. And yet, 14 years after Amazon lamented its “straining database infrastructure on Oracle” and started to “evaluate if we could develop a purpose-built database that would support our business needs for the long term,” the commerce and cloud provider won’t be free of Oracle until the first quarter of 2020, as reported by CNBC’s Jordan Novet.
That “I can’t quit you, baby” reality, to borrow a Led Zeppelin lyric, is a testament not so much to Oracle’s database prowess as to the friction inherent in moving data. Or, as Gartner analyst Merv Adrian once put it, “The greatest force in legacy databases is inertia.”
Why even mighty Amazon is stuck on its legacy Oracle databases
Amazon may have pushed Oracle’s database beyond its ability to scale as early as 2004, as Amazon CTO Werner Vogels has noted, but only a decade later did the company seriously consider replacing the venerable technology. As Novet’s interviews reveal:
Amazon began moving off Oracle about four or five years ago, said one of the people, who asked not to be named because the project is confidential. Some parts of Amazon’s core shopping business still rely on Oracle, the person said, and the full migration should wrap up in about 14 to 20 months. Another person said that Amazon had been considering a departure from Oracle for years before the transition began but decided at the time that it would require too much engineering work with perhaps too little payoff.
“Too much engineering work with perhaps too little payoff” perfectly captures why most legacy tech sticks around. Once an application is written to run on the mainframe, there’s often little point in rewriting it to run elsewhere. In Adrian’s words, “When someone has invested in the schema design, physical data placement, network architecture, etc. around a particular tool, that doesn’t get lifted and shifted easily.”