So after yet another disappointing realization that I don't have the skills most employers want, I have decided to forgo brushing up on Clojure, or learning D...or OpenGL. Sigh.
Unfortunately, I work in a kind of half-way world where my skills don't seem to be valued. I am not a firmware developer or device driver developer (anymore), and I don't have the enterprise skills that most other businesses want (my SQL knowledge is very basic, and I know very little about web development, either back end or front end).
So for the last 3 weeks or so, I have been learning OSGi. I have made an Eclipse plugin before, but that was mostly trial and error because the documentation was so bad (even the SWT documentation was pretty bad, but I somehow managed to make a JFace-based plugin). So this is the first time I've really dived into OSGi, and I did so because of something I've been seeing at my workplace: non-modularity.
Spaghetti code has, of course, a bad connotation. But I think some of its original meaning has been lost. Spaghetti code is code in which codepaths and/or dependencies get so interwoven that it is no longer feasible or possible to change one thing without that change cascading into a lot of other code. When you have non-modular code, you almost by definition have spaghetti code. I have now seen firsthand what happens when you only need some classes for a project, but because those classes have dependencies on other classes, and those classes on others (ad nauseam), you end up in a monolithic, all-or-nothing scenario.
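To make the all-or-nothing problem concrete, here is a minimal, hypothetical Java sketch (all class names are invented, not from any real codebase). In the tangled version, a report class hard-wires a concrete database, which hard-wires a connection pool, so reusing the report class drags in the whole chain; depending on a small interface instead cuts the chain.

```java
// Hypothetical illustration of the dependency cascade described above.

// Tangled version: ReportGenerator would depend directly on this concrete
// Database, which itself hard-wires a ConnectionPool -- so pulling in one
// class pulls in everything behind it.
class ConnectionPool {
    String acquire() { return "connection"; }
}

class Database {
    private final ConnectionPool pool = new ConnectionPool(); // hard-wired
    String query(String sql) { return "rows for: " + sql; }
}

// Modular alternative: the report module depends only on this small
// interface, so it can be compiled, tested, and reused without the
// database module existing at all.
interface RowSource {
    String query(String sql);
}

class ReportGenerator {
    private final RowSource source;
    ReportGenerator(RowSource source) { this.source = source; }
    String generate() { return "report from " + source.query("SELECT *"); }
}

public class DependencyDemo {
    public static void main(String[] args) {
        // A trivial stand-in RowSource: no Database, no ConnectionPool needed.
        ReportGenerator report = new ReportGenerator(sql -> "stub rows");
        System.out.println(report.generate()); // prints "report from stub rows"
    }
}
```

The interface is doing the modularity work here: the consumer names only the contract, never the implementation.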
I decided to learn OSGi precisely to combat this problem (despite the offending non-modular platform not being in Java). What I am discovering is that it is harder to design applications than it is to code them. Admittedly, I'm getting a bit stuck on Services at the moment, but to my mind, the hardest part is just structuring your application into modules in the first place. I am also hoping that when Java 8 comes out along with Project Jigsaw, it won't be too much of a transition for me to use a more modular approach to programming. Also, I can finally (hopefully) figure out what Inversion of Control and Dependency Injection are. I've already been looking at iPOJO in Felix, and it seems interesting, though a bit confusing. Component-based frameworks have a lot in common with modularity, and they seem to work well together.
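Since OSGi Services tripped me up, here is a toy, hand-rolled registry in plain Java that mimics only the *shape* of the OSGi service model (register an implementation under an interface, look it up by interface) — this is NOT the real OSGi API, just a sketch of the idea:

```java
import java.util.HashMap;
import java.util.Map;

// Toy stand-in for an OSGi-style service registry. The real mechanism
// lives in org.osgi.framework.BundleContext; this sketch exists only to
// show why publishing by interface decouples provider from consumer.
class ServiceRegistry {
    private final Map<Class<?>, Object> services = new HashMap<>();

    <T> void register(Class<T> type, T impl) {
        services.put(type, impl);
    }

    <T> T lookup(Class<T> type) {
        return type.cast(services.get(type));
    }
}

interface Greeter {
    String greet(String name);
}

public class ServiceDemo {
    public static void main(String[] args) {
        ServiceRegistry registry = new ServiceRegistry();

        // The "provider" module registers an implementation under the
        // interface; the implementing class is never exported.
        registry.register(Greeter.class, name -> "Hello, " + name);

        // The "consumer" module looks the service up by interface, so
        // provider and consumer can be swapped or versioned independently.
        Greeter greeter = registry.lookup(Greeter.class);
        System.out.println(greeter.greet("OSGi")); // prints "Hello, OSGi"
    }
}
```

This is also the heart of Inversion of Control: the consumer never constructs its dependency, it just receives (or looks up) something satisfying a contract.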
If there's one thing I've discovered being an SDET for the last 3.5 years, it's that there is definitely a need for engineers who can bridge the hardware/software/enterprise gap. It astounds me that the SCM team doesn't understand what "Software as a Service" is, or what Service Oriented Architecture is. Hell, even just trying to show what distributed computing is was a challenge. I have now had the opportunity to see several organizations' "Test Automation Frameworks", and to say that they were...to put it politely...behind the times is an understatement. State of the art for many test groups is a pile of scripts written in a ton of languages, with no means to tie together Requirements -> TestCases -> Scripts -> automated installation -> automatic argument passing. They just pass on by tribal knowledge that Script A is for TestCase B, and that you have to install a whole bunch of software on the system under test. Oh, and the tester has to manually fill in all those annoying little things, like when the test started, when it ended, and whether it passed or failed (not to mention manually uploading log files or other useful debugging info).
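The missing traceability chain could be as simple as data instead of tribal knowledge. A hypothetical sketch (every ID and script path here is invented) of indexing test cases by the requirement they cover, so "which scripts exercise REQ-7?" becomes a query rather than a guess:

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// One row of the Requirements -> TestCases -> Scripts chain.
// All identifiers below are invented for illustration.
record TestCase(String id, String requirementId, String scriptPath) {}

public class TraceabilityDemo {
    // Index test cases by the requirement they cover.
    static Map<String, List<TestCase>> byRequirement(List<TestCase> cases) {
        Map<String, List<TestCase>> index = new HashMap<>();
        for (TestCase tc : cases) {
            index.computeIfAbsent(tc.requirementId(), k -> new ArrayList<>())
                 .add(tc);
        }
        return index;
    }

    public static void main(String[] args) {
        List<TestCase> cases = List.of(
            new TestCase("TC-1", "REQ-7", "scripts/boot_check.sh"),
            new TestCase("TC-2", "REQ-7", "scripts/link_up.py"),
            new TestCase("TC-3", "REQ-9", "scripts/fw_update.sh"));

        // Two scripts trace back to REQ-7; no tribal knowledge required.
        System.out.println(byRequirement(cases).get("REQ-7").size()); // prints 2
    }
}
```

A real framework would persist this mapping and attach start/end times, pass/fail status, and log artifacts to each run, but the core idea is just this lookup.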
Being an SDET at a hardware company is a bit challenging, because I have to provide not just an automation framework, but ad-hoc tools too. While the automation aspect mostly deals with the enterprise side of computing, the ad-hoc tool creation requires understanding hardware, firmware, and drivers. Talk to many Computer Science grads today about interrupts, stack frames, logic analyzers, IOCTLs, SerDes, or the finer points of memory allocation...and they will probably just gaze at you with blank eyes. The majority of CS majors I have met took only cursory C or C++ classes, and took neither Logic Design nor Microcontrollers. Conversely, Electrical Engineers often don't understand the principles of good software engineering: they have to be dragged into using revision control, given an explanation of why writing 1000-line functions is not a good idea (inline them if you are worried about performance), shown that creating unit tests is helpful, and told that copying and pasting code is usually a really bad idea.
I definitely think there is a place for people like me. I'm not a device driver guru, but I have done that work in the past. Ditto for embedded firmware (I even used to write a little assembly). I'm not a SQL master, but I can create simple tables with constraints. I don't understand load balancing for web servers, but I can write a simple servlet for Jetty to enable web services. But I think the most important skill I have is figuring out which technology should be used. Maybe it's because I'm not a master of any one thing, but I know a broader swath of technologies than many engineers with 20 years of specialized experience, and I know many ways to tackle a problem.
Perhaps one day, an employer will discover that it's not so much about the skills you have now; it's about how quickly you can adapt and learn, and how well an engineer can integrate all of his skills and knowledge together.