Joel got it wrong - Why learning C is a waste of your college time and money
Back in January, Joel Spolsky wrote a Career Advice column. It gave some generally good advice to college students, the main points being:
- Learn how to write before graduating.
- Learn C before graduating.
- Learn microeconomics before graduating.
- Don't blow off non-CS classes just because they're boring.
- Take programming-intensive courses.
- Stop worrying about all the jobs going to India.
- No matter what you do, get a good summer internship.
[update: note that Joel's advice is to college students in general, not just to computer science majors]
I agreed with everything but (2). I believe learning C before graduating may help some, but hurt just as many. While it might have helped Joel, I don't think it's universally good advice. It's been bugging me for a month now, so it's time to get it in writing. In the interest of hopefully getting comments from someone other than my pals at www.online-cheap-meds-4u.ru and www.poker-gambling-casino.net , I aim for the nearest windmill and present:
Jon's 6 reasons why learning C is a waste of your college time and money:
- It's not a skill you'll use in most of the software development jobs you'd want to have
- It can give you a false sense of control
- It can teach you to get in the way
- It can make it hard for you to love framework based development
- It can teach you philosophies which will prevent you from really understanding modern programming
- It can divert you from the real challenges of software engineering
1. It's not a skill you'll use in most of the software development jobs you'd want to have
Without trying to offend anyone who's into this kind of stuff (weirdos!), here are the kinds of things you might use C for these days - writing some kind of device driver, maintaining extremely old applications, embedded development on very limited hardware, or maybe writing an operating system or compiler or something. I guess you could like hack away on the Linux (or is that GNU/Linux) kernel.
Hey, if that sounds fun then off to the races. Consider this though - you're not really going to be solving any new problems. If you want to do web development, database development, desktop applications - you know, stuff you can talk about at a party - you're probably not going to be working in C.
2. It can give you a false sense of control
Quote from Joel's article:
I don't care how much you know about continuations and closures and exception handling: if you can't explain why while (*s++ = *t++); copies a string, or if that isn't the most natural thing in the world to you, well, you're programming based on superstition, as far as I'm concerned: a medical doctor who doesn't know basic anatomy, passing out prescriptions based on what the pharma sales babe said would work.
That sounds good on the surface, but does his code sample really tell you how the string is being copied? It's a symbolic representation of a process which is moving memory, sure, but still several levels of abstraction away from shuttling the bits through memory. Why is this the magic level of abstraction that gives you the edge? Why not assembler?
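For the record, the idiom does make sense once unrolled. Here's a sketch of the same logic with the pointer tricks spelled out (copy_string is an illustrative name, not from Joel's article) - and notice that even this "explicit" version says nothing about caches, paging, or what the compiler does behind your back:

```c
#include <assert.h>

/* Equivalent to:  while (*s++ = *t++);
   The copied character is also the loop condition, so the loop
   stops right after copying the terminating NUL (value zero). */
void copy_string(char *dst, const char *src) {
    for (;;) {
        char c = *src;   /* read one byte from the source */
        *dst = c;        /* write it to the destination   */
        src++;           /* advance both pointers         */
        dst++;
        if (c == '\0')   /* the copied NUL ends the loop  */
            break;
    }
}
```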
Worse still, it can make you think that programming is about telling the computer what to do with its registers and memory. Oops, wait - that memory was magically paged from disk without your knowledge. The compiler applied some optimizations behind your back. Your code is running on top of an operating system, which is rationing out CPU cycles between tens of processes, and yours gets a measly 10% timeslice. And, hey, what CPU are you on? Any sneaky little tricks going on with your 64-bit hyperthreaded chip? What about two years from now, when you run your app on a virtual server on a multicore chip?
Thinking that you're in the pilot's seat because you're handling pointers is silly. Better to understand that you're asking the CPU(s), through multiple levels of abstraction, to copy a string. Do it politely - this ain't no rowboat anymore, it's a durned space ship.
3. It can teach you to get in the way
That false sense of control I mentioned before can lead to outsmarting yourself. Some examples:
- Trying to pool database connections or holding a global connection, not knowing that the data layer is automatically pooling if you let it
- Overriding the .NET Garbage Collector, assuming that Java / Micro$oft doesn't know how to manage memory nearly as well as you can
In both these cases, a little bit of knowledge is (as rumored) a dangerous thing. If all you learned from C is that you are the boss, you will most certainly write code that plays poorly with others.
4. It can make it hard for you to love framework based development
Modern programming languages run on top of frameworks. .NET apps use the .NET Framework, Java uses J2EE (et al.), and arguably web apps run on top of a browser / server combination that constitutes an application framework. The list could go on (GTK, XUL, web services, Flash). Most good frameworks are standards based, and all of them host your solutions so you only solve new problems.
C code, by and large, is not about frameworks. At its best, it uses some libraries and links to some API's. C gives you a toolset that can solve just about any problem, but requires that you solve each and every problem every time. Frameworks were created to obviate solving the same problems in each program you write. Software development has made a steady progression from code to libraries to components to frameworks. Thankfully, you don't need to retrace this path just as you don't need to experience the dark ages to live in the post-renaissance world.
To be productive as a programmer these days, you either need to be learning to get the most out of existing frameworks or helping to build tomorrow's frameworks. To learn to be a productive programmer, you need to learn how to work with frameworks.
Learning to work with a framework takes work. You need to learn research skills that go far beyond fixing syntax errors. You need to learn how to optimally integrate with your environment, what the environment will provide for you, and what's expected of you.
Thankfully, learning to work with a framework is a transferable skill, so the exact framework doesn't matter so much as that you learn how to use a framework - pick an environment and get going.
5. It can teach you philosophies which will prevent you from really understanding modern programming
C code is procedural. That means you describe an exact set of steps you want your program to accomplish. If you want to find a value in an array, you write code that loops through each element in the array - a house to house search.
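That house to house search, sketched in C (find_index is an illustrative name):

```c
#include <assert.h>
#include <stddef.h>

/* Procedural style: we spell out every step of the visit,
   one house (element) at a time. */
int find_index(const int *arr, size_t n, int target) {
    for (size_t i = 0; i < n; i++) {
        if (arr[i] == target)
            return (int)i;  /* found it: report the house number */
    }
    return -1;              /* visited every house, no luck */
}
```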
Modern programming is often declarative, meaning that you describe what you want and something else decides the best way to get it. Databases, for instance, respond to queries - requests for information in a descriptive language. The database maintains indexes to find the data without looking at every row, and good databases self-optimize based on the questions they're asked most often. To continue the analogy from before, we replace the house to house search with a 411 call. Cursors in database queries are almost always a failure on the programmer's part to describe the problem, not a limitation of the database engine.
A surprising amount of work in current software is described rather than prescribed. HTML, for instance, is a markup language that describes content and layout to a browser, which does the hard work of wrapping text, applying styles, and enforcing user-defined security settings. XPath and regular expressions are two more current technologies that implement declarative programming.
The point here is that modern programming is moving towards Domain Specific Languages (DSL's) which efficiently communicate programmer intent to CPU cycles. C is not a good prototype for any of these DSL's.
6. It can divert you from the real challenges of software engineering
There is significant work to be done today in software engineering that requires fresh, energetic minds. Some examples:
- We are on the verge of ubiquitous AI - self organizing systems (software and hardware) that will collaboratively solve problems for us. We're still figuring out how to manage them.
- We've got a lot of disparate systems that still don't communicate well.
- We are being bombarded by more information than we can currently process. Fixing the U.S. intelligence system to the point that pre-9/11 dots could have been connected requires storage and analysis of 36 terabytes per week ongoing (quote from recent IT Conversations - Accelerating Change audiocast). RSS and e-mail volume continue to grow.
- Computer users need both security and privacy. Neither is reliable today.
- Our current methodologies routinely result in failed projects. (Examples - the recent failed $170M FBI Virtual Case File project, Mars mission software errors, etc.)
- Bob in accounting needs a report on recent payment orders to office supply companies, but the data is stored in Excel files on Mary's desktop. Mary is on vacation.
- Walt in IT just gave two weeks notice. He just told you that he's been managing several daily FTP transfers of critical company data. He manually inserts it into the production database using some scripts his roommate helped him figure out.
- Marketing promised product X by June. Management budgeted for product .25X by June 2025. You are the developer for product X.
In the movie Real Women Have Curves, the main character is a Mexican American girl who's been offered a full scholarship to Columbia. Her mother won't let her accept, giving some bogus reasons about needing to stay near her family. When her father challenges it, she tells him the real reason: she's had to work in a sweatshop her whole life to provide for the family, and there's no way her daughter is going to skate from school to college to the easy life without some time in the sweatshops. It's her turn to work.
When an experienced programmer tells you to learn C nowadays, they're either trying to apply their out-of-context life experiences to yours, or they're trying to throw you in the sweatshop because that's what they had to do.
There's plenty to learn in computer science circa 2005. Don't waste your time on history unless you want to be a historian.