Having been a long-time scientist myself, I've observed time and again one very persistent approach to innovation among most of my fellow scientists: take what's been done, and improve it. Not a single project I've participated in could skip this important step: survey what's already been done, study the literature, talk to those who walked the path before you, learn what their approaches do well and where they fall short, and see if you can keep the "good stuff" while avoiding the pitfalls, generally by tweaking things here and there. Granted, most technology does come from this approach: learning more and more about the specific methods and polishing them to perfection, until hardly anything can be improved. At that point the field proudly declares the method "the state of the art" and "the best it can ever be", mathematicians formulate theorems proving that nothing better can be done with this technology no matter how hard you try, and the method enters the classical textbooks as "the way to go". Until someone invents a new technology that totally outperforms the "old and tried" ways, making everyone wonder what just happened…
You make it obsolete by introducing a superior methodology.
Remember vacuum tubes? Neither do I. Perhaps the only vacuum tubes surviving these days are the CRTs in TVs and computer monitors, and even those are becoming increasingly obsolete. With the invention of the transistor, electronics suddenly became cheaper, more energy-efficient, and far more compact. I remember playing with transistors as a kid, soldering simple radios and amplifiers for fun home projects.