You’ve heard it said many times that “no one who takes a high school programming course is qualified for a software development job.” Sounds reasonable. But what about people graduating from universities not being qualified? That question has shown up in a number of posts and news articles lately. Keith Ward asks “Over-Educated, Yet Under-Qualified?” in a recent editorial in MSDN Magazine. He references an article from InfoWorld’s website, “The sad standards of computer-related college degrees.” Is it really that bad? Well, I hear mixed reviews of recent graduates.
One VP at a major software company said that it takes an average of a year and a half for a recent college graduate to become a fully productive member of a development team. On the other hand, I know high school students who have gotten software development or testing jobs while still in high school. Some of my students were in that category. Hélène Martin reports, as an aside in a great post reviewing a TeachScheme workshop she attended, that “I’ve got 5 students doing technical internships this summer.” (The post is TeachScheme Workshop at Brown University and brings up a lot of important points that are well worth reading and considering.)
As usual, the truth lies somewhere between the extremes. Some high school students do get technical internships, and some of those even involve programming or at least software testing. And some university graduates are not as prepared as many employers would like. Part of the problem, though, may be expectations. One expects a lot more from a university graduate than from a high school student. Entry level positions, rather than just internships, usually demand more experience with larger teams and larger projects. A large university project may include 3-5 students working for a couple of months. A small commercial project may involve 20 people working for a year. Vastly different scales.
I’m not convinced that the problem is a dumbing down of university curricula, though. I don’t think that is happening. Yes, many departments are trying to make their programs more interesting by adding robotics, game development, and similar creative areas of study. But they are working hard not to do so in a way that waters down the curriculum. Fun and dumb are not synonyms. The XNA game curriculum resources that Microsoft offers, for example, are serious and often complex materials. (XNA Resources and Teaching Tools for your Classroom) No, I think the problem is that professional software development has grown more complex at a faster rate than education programs have adapted.
When I was an undergraduate, the projects I worked on as a student were close in scale to the much smaller commercial projects that were common in the early to mid 70s. Those days are gone, but have universities adapted? Maybe not enough. Some have. As an industry advisor to Taylor University's computer science program, I have been pleased to see them make real-world projects a key part of the department’s programs. (Challenging, Real-World Projects) Obviously I’m biased as a graduate, but I think they have been doing a great job of connecting to the real world for a long time. I do think there are models there that could be replicated at other schools.
Other universities have great co-op and similar programs that place students in the workforce while they are still in school. These are great ways to learn the “real world” at the same time that students build theoretical knowledge in class. Having the chance to move from theory to practice and back again can be outstanding preparation for a post-graduation career. Students can and should also try hard to get internships during their summers. There are probably not enough of those opportunities, and they can be hard to find and get. But someone who works hard to find and land internships is going to be better prepared for a career later.
So I am not convinced that university computer science programs are really in tough shape. I think most of them are quite good. But some could be better. Well, when push comes to shove, everyone could be better.
In schools the accent is on theory and algorithms; the small projects target the learning of a technology, and their complexity and difficulty are quite small compared with real-life applications. Taking an application from design to production and on through the support phase requires time and a mix of knowledge from different fields, something quite difficult to reproduce in a school project, while the structural context in an organization, with its requirements and teamwork, is again quite different. Going beyond the basic features of a programming language takes time; it depends on the learning curve of the language, the capacity of the learner, the complexity of the tasks attempted, and the knowledge made available. I can’t say that schools can do much in this direction, because it’s quite difficult to cover all the aspects in just 8-20 classes in which students are introduced to the concepts and some basic applications. What schools could do to support their students is provide the required infrastructure (mainly computers), bring the technologies and learning material up to date, shift the focus gradually from theory to application, and eventually help students gain some additional experience in organizations. It’s up to the students to make the most of the learning experience in school, though often, even when the desire, need, and infrastructure are there, fighting the lack of time is quite hard.
One of the tough realities in IT is that it takes time to connect the dots, and as you already highlighted, it takes about a year for a college graduate to become really productive. Now I have to say that this also depends on the organization’s culture and environment: how it supports the learning process, and how it helps the newcomer become part of the team and become productive. I say that because I’ve seen companies doing the minimum in this direction, just expecting the newcomer to catch everything on the fly and be productive in a matter of weeks. Those who have worked in IT for a longer time know that this is not entirely realistic, though there are some exceptions. There are also organizations that train newcomers, assign them tasks of increasing complexity based on each person’s skills, provide resources (software tools, books, courses, and other learning material), and create an environment that facilitates learning. Having time allocated for learning new things, participating in activities that spread knowledge within a team, having professionals whom you can ask questions or who can mentor you through the learning process: I consider all of these essential for a modern IT organization.
The theory learned in school needs to be supported by hands-on experience in order to make the most of the learning process, and IT organizations are perhaps the best places for that, though I’m not sure how much of it is possible. There are schools, organizations, and governments that support this type of learning, though unfortunately it is not possible everywhere, or at least not for everybody. I think it’s in everybody’s interest to make the most of the learning process: for schools, to produce highly skilled graduates; for organizations, to have productive employees and a pool of college graduates from which to select potential hires; for students, to be skilled and thus have better chances of finding a job; and for governments, since in theory this could lead to a lower unemployment rate. I find the constructive involvement of all parties important; now, I wonder how many schools, organizations, or governments are actually trying to change something in this direction.