We're interviewing for junior software developers right now. I've just completed about 20 phone interviews and the lack of useful experience is appalling.
Let's not rehash the arguments about what counts as "computer science," how to hire, how most people are no good, and all that.
I'm not even talking about how smart they are. I'm just talking about the coursework.
It's time for "computer science" to become "science" and not math. Meaning: Learning to use the tools and techniques of the trade, not just formulas and mathematical analysis.
For example, of the 100 resumes I've reviewed and 20 phone interviews I've done in the past month:
- Only 3 had ever used a version control system; only 1 of those could describe what version control is.
- Only 5 even claimed to know anything about SQL; none of them could tell me what an "outer join" was (see the sketch after this list).
- Only 6 had ever written a unit test; none had ever written one in school.
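For reference, an outer join keeps unmatched rows from one (or both) tables instead of dropping them. A minimal sketch, using hypothetical employees and departments tables:

    -- LEFT OUTER JOIN: every employee appears in the result, even those
    -- with no matching department; unmatched rows get NULL for dept_name.
    SELECT e.name, d.dept_name
    FROM employees e
    LEFT OUTER JOIN departments d ON e.dept_id = d.id;

A plain inner join would silently drop the employees with no matching department. That's the whole distinction.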
Even "unprofessional" open-source projects all use version control. The vast, vast majority of software companies do too. Version control is almost as important as the compiler. I'd be happy if there was a course where you learned things like branching theory, but I'd be content if they just used it at all, at any time. Schools love group projects, so how is it that the group project doesn't involve version control?
And no unit testing? Ever? That's just lazy. Why is that not required for all coursework? By the way, if you required it, almost all assignments would be correct when they were turned in. Isn't it better to teach people how to make sure they always get an 'A'? And how is unit testing not on par with knowing how to debug?
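To make that concrete: a unit test is just a small piece of code that checks another piece of code against expected answers. A minimal JUnit-style sketch in Java, where AssignmentTest and the sum function are hypothetical stand-ins for a homework assignment:

    import org.junit.Test;
    import static org.junit.Assert.assertEquals;

    public class AssignmentTest {
        // Hypothetical homework function: sum the elements of an array.
        static int sum(int[] xs) {
            int total = 0;
            for (int x : xs) {
                total += x;
            }
            return total;
        }

        // Each test checks one expected behavior. Run them all
        // before turning the assignment in.
        @Test
        public void sumOfEmptyArrayIsZero() {
            assertEquals(0, sum(new int[] {}));
        }

        @Test
        public void sumAddsAllElements() {
            assertEquals(6, sum(new int[] {1, 2, 3}));
        }
    }

If every assignment shipped with tests like these, the student would know it was correct before handing it in.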
Which is another thing. I never had any instruction on how to use a debugger. To me that should be an entire class -- how to find out what's wrong with code, especially when you didn't write it. And unit testing should be a part of that.
I can think of two reasons why these obvious things aren't taught:
- Professors don't know these things either, because they don't keep up with new trends and practical processes. Of course I'm making a broad, unfair generalization, but witness, for example, the slow acceptance of Java.
- Departments don't want to "sully" their pure, theoretical, artes liberales culture with practicalities. Analogy: the math department is mostly theoretical; if you want applied math, become a physicist or an engineer. My problem with this is that there is no alternative. OK, so have a theoretical computer science degree, but then where's the "engineering" alternative? Some colleges have this concept, but at UT, for example, the alternative just meant fewer hard CS classes, not more practical classes.
OK, I like on-the-job learning -- it's the best kind -- but throw me a bone. At least broach these concepts somewhere in the 4-5 year curriculum.