Wednesday, March 12, 2008

We called him Tortoise 'cause he taught us!


We're interviewing for junior software developers right now. I've just completed about 20 phone interviews and the lack of useful experience is appalling.

Let's not rehash the arguments about what is "computer science" and how to hire and about how most people are no good and all that.

I'm not even talking about how smart they are. I'm just talking about the coursework.

It's time for "computer science" to become "science" and not math. Meaning: Learning to use the tools and techniques of the trade, not just formulas and mathematical analysis.

For example, of the 100 resumes I've reviewed and 20 phone interviews I've done in the past month:

  • Only 3 had ever used a version control system; only 1 of those could describe what version control is.
  • Only 5 even claimed to know anything about SQL; none of them could tell me what an "outer join" was.
  • Only 6 had ever done a unit test; zero had ever done one in school.

I can't understand why, for example, "SQL" isn't a required course. I'm not asking for much -- OK if you want to emphasize the mathematical properties of queries, OK if you want to teach data normalization theory. I'd prefer practical things like "never delete data" and "auto-increment versus GUIDs" and "how to diagnose slow queries" and "how database vendors differ," but I'll take anything at all where they could at least form basic queries and know roughly how they work.
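
To make the bar concrete, here's roughly the level I mean -- a minimal sketch using Python's built-in sqlite3 module and two made-up tables (customers and orders), showing an auto-increment key and why an outer join is different from a plain join:

```python
# Minimal sketch: the level of SQL I'd hope a graduate could read and write.
# Uses Python's built-in sqlite3; the customers/orders tables are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Auto-incrementing integer keys -- one side of the "auto-increment versus GUIDs" question.
cur.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY AUTOINCREMENT, name TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY AUTOINCREMENT, customer_id INTEGER, total REAL)")

cur.executemany("INSERT INTO customers (name) VALUES (?)", [("Alice",), ("Bob",)])
cur.execute("INSERT INTO orders (customer_id, total) VALUES (1, 25.00)")

# An outer join keeps every customer, even those with no orders.
# Bob shows up with a NULL total; an inner join would silently drop him.
cur.execute("""
    SELECT c.name, o.total
    FROM customers c
    LEFT OUTER JOIN orders o ON o.customer_id = c.id
    ORDER BY c.id
""")
print(cur.fetchall())  # [('Alice', 25.0), ('Bob', None)]

conn.close()
```

If a candidate can explain why Bob still shows up in that result, that's all I'm asking for.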

Even "unprofessional" open-source projects all use version control. The vast, vast majority of software companies do too. Version control is almost as important as the compiler. I'd be happy if there was a course where you learned things like branching theory, but I'd be content if they just used it at all, at any time. Schools love group projects, so how is it that the group project doesn't involve version control?

And no unit testing? Ever? That's just lazy. Why is that not required for all coursework? By the way, if you required it, almost all assignments would be correct when they were turned in. Isn't it better to teach people how to make sure they always get an 'A'? How is unit testing not on par with knowing how to debug?
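
For the skeptics, here's what I mean by unit tests on homework -- a sketch using Python's built-in unittest, with a made-up median() function standing in for whatever the assignment actually was:

```python
# Sketch: unit tests turned in alongside the homework itself.
# median() is a stand-in for the assignment; the tests prove it works.
import unittest


def median(values):
    """Return the median of a non-empty list of numbers."""
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2.0


class TestMedian(unittest.TestCase):
    def test_odd_length(self):
        self.assertEqual(median([3, 1, 2]), 2)

    def test_even_length(self):
        self.assertEqual(median([4, 1, 3, 2]), 2.5)

    def test_single_element(self):
        self.assertEqual(median([7]), 7)


if __name__ == "__main__":
    unittest.main()
```

Run it, the tests pass, and you've proven your own 'A' before the grader ever sees it.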

Which is another thing. I never had any instruction on how to use a debugger. To me that should be an entire class -- how to find out what's wrong with code, especially when you didn't write it. And unit testing should be a part of that.
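
Even an exercise as small as this would do -- a sketch using Python's built-in pdb, with a deliberately buggy function made up for illustration:

```python
# Sketch of a debugger exercise using Python's built-in pdb.
# The buggy average() function is made up for illustration.
import pdb


def average(values):
    total = 0
    for v in values:
        total += v
    return total / (len(values) - 1)  # bug: off-by-one in the divisor


if __name__ == "__main__":
    # Break just before the call; at the (Pdb) prompt you can step into
    # average(), inspect total and len(values), and spot the bad divisor.
    pdb.set_trace()
    print(average([2, 4, 6]))  # prints 6.0 instead of the expected 4.0
```

The point isn't pdb specifically; it's that nobody ever shows students how to interrogate running code, especially code they didn't write.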

I can think of two reasons why these obvious things aren't taught:
  1. Professors don't know these things either, because they don't keep up with new trends and practical processes. Of course I'm making a broad, unfair generalization, but witness, for example, the slow acceptance of Java.
  2. Departments don't want to "sully" their pure, theoretical, artes liberales culture with practicalities. Analogy: the math department is mostly theoretical; if you want applied math, be a physicist or an engineer. My problem with this is that there is no alternative. OK, so have a theoretical computer science degree, but then where's the "engineering" alternative? Some colleges have this concept, but at UT, for example, the alternative just meant fewer hard CS classes, not more practical classes.

Oh well. From what I can tell it's the same in a lot of disciplines. Most of the MBAs I know can't run QuickBooks or read a P&L, and the ones who can tell me they learned it on the job.

OK, I like on-the-job learning -- it's the best kind -- but throw me a bone. At least broach these concepts somewhere in the 4-5 year curriculum.