Is most software development really as bad as I think?
Following my layoff over the summer from Insurance.com, I've bounced around quite a bit doing contract work and other sub-optimal things. Now I'm finally in what I hope is a solid job with an online marketing agency that also does app dev. As the company's technical architect, I'm not writing as much code myself as I have previously, but I do get to see a lot of stuff inherited from other places.
The common theme is that almost everything I see is bad. I mean, throw-it-out-if-possible-and-start-over bad. Copy-and-paste-from-countless-blog-entries bad. Uses-DataTables-and-Session bad. Written-in-VB.NET bad. (OK, sorry, that was uncalled for. ;))
I'm trying to figure out the reason for this. I know part of it is the scarcity issue, that there just aren't enough good people to go around. But one would assume that every organization has some senior people in charge of maintaining some level of non-suck. Yet a lot of suck gets out into the wild.
I feel fortunate that I work with developers who are eager to learn and surprisingly open to culture change. They want to be ninjas. I now realize just how high-end the development was at ICOM, and I'm grateful to have had that experience and to see others now benefit from it. I like the idea that you can ease the burden of your own legacy code, and others down the road will thank you for it as well.
Of course, that sounds ironic coming from a guy who often says he can do without the computer science lessons, but come on, the basic design patterns make your life infinitely better and they're not that hard to learn!
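To make that concrete, here's a minimal sketch of one of those basic patterns (Strategy), in Java. The class and method names are purely illustrative, not from any real codebase; the point is just how little code it takes to replace a switch statement scattered across an app with something you can extend without touching existing code.

```java
// A hypothetical pricing example: each discount rule is its own small
// class behind a common interface, instead of a growing if/else chain.
interface DiscountStrategy {
    double apply(double subtotal);
}

class NoDiscount implements DiscountStrategy {
    public double apply(double subtotal) { return subtotal; }
}

class PercentOff implements DiscountStrategy {
    private final double pct;
    PercentOff(double pct) { this.pct = pct; }
    public double apply(double subtotal) { return subtotal * (1 - pct); }
}

class Order {
    private final double subtotal;
    private final DiscountStrategy discount;
    Order(double subtotal, DiscountStrategy discount) {
        this.subtotal = subtotal;
        this.discount = discount;
    }
    // The order doesn't know or care which rule it was given.
    double total() { return discount.apply(subtotal); }
}
```

Adding a new rule (say, a flat-amount coupon) is one new class; nothing that already works gets edited or retested.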
What kinds of nonsense have you seen?