Have you ever been ordered to debug an application written ages ago? Was the original programmer still around? Did the users contribute any valuable info besides "it's not our fault" or "the other guy knew how to solve this"? Did you think that the people who wrote software in technologies which are now obsolete are idiots or incompetent programmers (no offense)?
Well, since I'm the one writing these questions, you can guess what my answers would be. This week I got to work on an issue in an application written before the turn of the century, for people who still think (!) they should solve every problem with VB and Oracle Forms. Of course the issue was critical, and the blame fell on us - the IT and IS departments. After three days of hard work, and the complete waste of about 5 software developers' time, the issue was solved. And still, even after solving the problem, we don't know what caused it. Code just disappeared from the application. We consider it "voodoo", and blame ourselves for giving the users complete control over the application, so they could destroy it by accident (they'll never admit that, and we cannot prove it).
Some would say I'm too harsh, or mistaken. These things happen, and they might be entirely our fault. But I believe our only mistake was to let users keep working on such a legacy application.
What is the life span of software developed in-house? 7 years? 10 years? And what are the costs of prolonging a legacy application's life? Which is better: to rewrite using modern methodologies, or to patch the software until computer architecture changes so dramatically that the software simply stops working?
I wish I had textbook answers. Now I remember why I hate in-house software development so much. Nothing good comes out of it. And when problems arise, it happens so many years after the software was written that nobody knows how to solve them. So the software costs money (and resources) when it's written, when it's fixed, when it's rewritten, and all over again.