The computer and information technology department where I teach part time has seen enrollment drop steadily ever since the dotbomb. It's hardly alone, though: just about every computer science and technology school in the country has watched enrollment leak away.
Our students have supposedly been frightened off by myths about offshoring and employment gluts. Apparently, they hadn't heard the anguished moans from IT managers nationwide as they hunt in vain for employees. Gartner (www.gartner.com) reported as recently as December that there was a "massive and devastating skills shortage." CIO Insight magazine (www.cioinsight.com) followed with a January article saying essentially the same thing, citing the low unemployment figures for IT workers as evidence that IT hiring is, indeed, stretched to its limits.
Or maybe the vanishing students were right after all, and we are wrong. Maybe there isn't a labor shortage in IT at all.
In a March article in Baseline magazine (www.baseline.com), Ericka Chickowski gets right up in the grill of the myth of an IT labor shortage. I have to admit that, throughout my sober doomsaying, I haven't actually seen many statistics to back up the claim of a shortfall. There have been a lot of anecdotes, but the plural of "anecdote" is not "data." I just parroted the doom and gloom. And in retrospect, I have to say I haven't noticed IT workers' salaries rising substantially or bonuses being offered, as would happen if good people were as scarce as oases. Is there, or is there not, an IT labor scarcity?
The other day, I had a hallway conversation with a student that helped illuminate this conundrum for me. He's close to graduating with a four-year degree in computer technology, but he complains that he hasn't learned skills he can walk in and put to use on the job right away.
I assured him that employers recognize that computer departments couldn't possibly teach the latest thing every semester, and that they would give him time to "train up" to whatever they wanted him to do, using the generalized skills from his degree as a base. Computer languages arise and fade, but programming mind-sets are good indefinitely.
Maybe he's right to be worried, though. Linda Musthaler in a 2006 article in Network World magazine (www.networkworld.com) essentially says that IT hiring practices are dysfunctional. Computer jobs of all kinds tend to be volatile, winking in and out of existence according to the economic conditions surrounding the company. Employees learn to be mistrustful of assurances and seek jobs with security rather than challenge, and if those jobs are in another field, they'll switch. Strike one.
Then companies play the "Mr. Right" game, waiting for someone who isn't just good, but perfect. Computer skills come in many forms and flavors. A "Microsoft shop" doesn't want to interview a candidate with Java skills. He may be overqualified for a position. Or the candidate isn't certified in something or other. Strike two. That resume ends up round-filed. This, too, the student knows.
Afterward, the HR manager cries out that she just can't find enough IT workers to fill her jobs and must go overseas to find help, but the IT work force considers that wailing to be merely an excuse to hire lower-priced labor. Strike three. All the aspiring worker knows is that he sends out dozens of resumes, only to have few interviews and no job offers, while India and China get ever-bigger contracts.
If this is the case, it explains the apparent paradox of low unemployment coupled with low job-hunting satisfaction. Demand is high, but narrow. Students know they have to pick a particular technological path, but picking one is as harrowing as selecting a lifetime mate.
For example, there are at least five major server platforms in common use, and nobody is an expert on them all. There are three huge database managers and many smaller ones, and all of them are different. The Web uses a minimum of six major programming languages and many less common ones. Such languages are so rich in power and variety that even good programmers in one job can have a substantial learning curve in another.
Then there's the legacy problem: companies can't bear to give up outdated technology, and new employees have fits trying to maintain or update it. Which of these skills should a student acquire in depth? For many, it's a life-and-death decision, something that will lead either to a job or to a gypsy existence trying to find a skills match.
Companies can cure much of this malaise by giving new, promising minds some time to get caught up, but many say they just can't afford partial productivity. It's "day one" or nothing. Too many high school students know it, and are opting for something else.
Altom is an independent local technology consultant. His column appears every other week. Listen to his column via podcast at www.ibj.com or read his blog at usabilitynome.blogspot.com. He can be reached at email@example.com.