I have been working in the IT industry for close to 30 years and have seen a lot of change. Over the years, I have been both the interviewer and the interviewee for various IT jobs, and the questions of education and certification always come up.
What skills are in demand?
Right now I am working part time in education, and people ask me what one needs to know to land a job in IT. Should it be industry certifications (Microsoft, Cisco, VMware, CompTIA), an IT certificate program, a college degree, or some combination? The problem for educators is knowing what skills to teach and making sure those are the same skills employers are demanding in their new hires. Being in public education, we can only offer a limited number of classes, and I think you would be horrified at the skills graduating high school students have or, I should say, don’t have. Imagine trying to teach subnetting or binary math to students who have to use a calculator for simple addition and subtraction.
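To show the kind of binary math subnetting demands, here is a minimal Python sketch (the address and mask are just example values, not anything from a real classroom exercise). Finding a network address is nothing more than a bitwise AND of the host address against the subnet mask:

```python
# The binary math behind subnetting: find the network address of
# 192.168.1.130/26 by ANDing the host address with the subnet mask.

def to_bits(octets):
    """Pack four dotted-quad octets into a single 32-bit integer."""
    value = 0
    for octet in octets:
        value = (value << 8) | octet
    return value

def to_octets(value):
    """Unpack a 32-bit integer back into four dotted-quad octets."""
    return [(value >> shift) & 0xFF for shift in (24, 16, 8, 0)]

host = to_bits([192, 168, 1, 130])     # last octet: 1000 0010
mask = to_bits([255, 255, 255, 192])   # /26 mask; last octet: 1100 0000

network = host & mask                  # the bitwise AND is the whole trick
print(".".join(str(o) for o in to_octets(network)))  # -> 192.168.1.128
```

A student who can AND 1000 0010 with 1100 0000 on paper and get 1000 0000 has the skill; a student who needs a calculator for the addition inside those conversions is the problem I keep running into.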
When I began my career in IT, I learned by doing. The only classes colleges taught were in programming, and networking classes were found only in the electrical engineering programs at top universities. Times have changed. YouTube is filled with thousands of videos on server administration and networking.
This semester, I am teaching CompTIA’s A+ course. It’s been 10 years since I’ve really played around with the hardware. But to my surprise, when I reviewed the content of the A+ textbook, I felt like I knew everything: ISA, EISA, Micro Channel, IDE, parallel ports, Thicknet, Thinnet. This was all stuff I used 10 to 20 years ago, and I wondered why this material would be covered in an A+ textbook today. Some of these technologies are 30 years old, and some of this stuff I haven’t seen in years. This really has me questioning what we should be teaching students today. Should we even be teaching PCI, PCI-X, and AGP? Aren’t these technologies rapidly fading into history? As I flipped through the 1,000-page book, it looked more like a history of computer hardware than a textbook about hardware for new IT professionals.
I’m also wondering just how much an IT professional needs to know about hardware these days. In the past, I added RAM, swapped CPUs, and installed larger hard drives, but is that something we are still doing today? I think it’s been 15 years since I installed a faster CPU. When you look at new computers such as Microsoft’s Surface, Apple’s iPad, and ultra-thin laptops, aren’t all of these sealed units? I’m just having a hard time believing anyone is upgrading RAM or installing a faster CPU these days.
I’m thinking we should be filling the RAM in our students’ brains with current and more relevant information. After all, aren’t most businesses upgrading computer hardware every three to five years? It seems to me hardware that is over five years old just isn’t worth upgrading. Or am I way off base here?
Value in certification?
I’m also wondering about the value of industry certifications. I’m familiar with Microsoft’s MCSA and MCSE, CompTIA’s A+, Net+, and Security+, Cisco’s CCNA, and VMware’s VCP certification exams, but do employers see any value in them? As I research industry certifications, I’m learning there’s an entirely new series of certifications that are far easier and have fewer questions than the existing exams, or what I will call the “real” exams. The IT industry has created so many certifications that I’m questioning whether there’s any value in becoming certified anymore. How is a student or employer to know which ones are valuable and which are not? I meet many starving students who have spent $500 to $800 to take these junior certification exams that are meaningless to employers. These junior certifications do not lead to more advanced or prestigious certifications such as the MCSA or MCSE.
I am now questioning the value of an IT education and of industry certifications. Are they worth it? What should we be teaching our students, the next generation of IT workers and your future employees? For the past decade I’ve been teaching the basics: networking fundamentals (and I don’t mean networking according to Cisco, but how computer networks transmit data), Windows Server/AD administration for the current version of Windows Server, server virtualization, and cloud. I make sure my students get a heavy dose of troubleshooting and practice communication skills, documentation writing, and help desk ticketing. Many of these skills are difficult to test.
I also teach security classes. Looking at the security textbooks, I just cannot believe how out of date the current books are. The A+ textbook talks about security flaws and attacks against Windows NT and Windows 95. Who cares? These classes are supposed to be teaching current job skills, not the history of malware. I really feel guilty teaching my students about the “I Love You” or “Code Red” viruses from a $175 textbook that says nothing about the types of attacks that have occurred in the past five years.
Believing I was doing a disservice to my students and to future employers, I began having my students compete in online cyber competitions. Not only are my students getting real-world, hands-on experience, but they are learning about the latest cyberattacks and how to detect them instead of threats from 10 to 20 years ago. I’m finding my students really like this method of teaching. Not only are they learning about current methods used by attackers, they are also approaching security the same way they would in a real job.
If you are a hiring manager, I’d like to hear what skills and certifications you look for in candidates for the IT positions you hire for. If you are a recent hire, what skills and certifications did you need to get hired?