I was at a holiday gathering the other day and during the usual course of “…And what do you do?” I replied that I was a developer. The inference was that I was a Real Estate Developer; I had to explain that I was a Make the Computer Do Useful Things Developer. I was talking to two ladies about my age (Hi, I’m 40), and was surprised at the reply: “Oh, that’s unusual!”
I suppose I should not have been. I know a lot of women in IT, but darned few who do development. To be clear: most of the women I know in the Information Technology space were at one point developers, or have a passing knowledge of some development language. They moved into Project or Product Management, or Business Analyst roles. Those roles require knowing what code can do without actually having to write any of it, so if you get tired of the incessant progress of development technology, that is one way up and out (and it is a way I took, about five years ago).
Careers arc and opportunities knock and itches flare up and I am once again a developer. And I find myself, when talking to people who don’t work with or know other developers, battling not only the usual misconceptions about development, but the gender-based ones as well.
Development (in IT terms) is the handle one applies to the concept of using a series of commands (code) to tell the box (tower, laptop, server, etc.) what you want it to do: whether you want it to take something in, whether you want it to spit something out. In order to create this blog post, many people did varying forms of development (from creating the templates that instruct the browser how to make this post look all shiny, to the protocols that tell the server where to put this post, to the widgets on the front end that tell you things like the fact that I haven’t posted in a while). If I had typed it in MS Word, that would have required a bunch of other development by a bunch of other people.
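If you’ve never seen what that “series of commands” actually looks like, here is a tiny, purely illustrative sketch in Java (the class name and the wording are made up for this post, not taken from any real project). It takes something in, does a small bit of work, and spits something back out – which is, at its core, the whole job:

```java
import java.util.Scanner;

// A tiny example of "telling the box what to do":
// take something in, do a little work, spit something out.
public class HelloDeveloper {
    public static void main(String[] args) {
        Scanner input = new Scanner(System.in);          // take something in
        System.out.print("What's your name? ");
        String name = input.nextLine();
        System.out.println("Hello, " + name + ". The computer just did a Useful Thing.");  // spit something out
        input.close();
    }
}
```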
Development is not:
- Something you can do on five screens while drinking 3 bottles of wine to create a “worm” that appears as a graphic on your screen (as in Swordfish), and it usually doesn’t involve a developer leaving an Easter Egg of themselves in a bad Elvis costume with sound effects (as in Jurassic Park)*. If I drank 3 bottles of wine and was looking at 5 screens, they’d probably be the ones you see in a hospital room, and the only graphic I would see appearing would be the “worm” that is my heart rate monitor flat-lining. And while I have buried Easter Eggs and commentary in code myself, it was never that elaborate, because you don’t typically have time to build elaborate things. You’re busy rewriting all of the stuff you just wrote because someone decided to change the scope of your work.
- Anything involving a graphical user interface (GUI). When a developer talks about manipulating objects, those objects are typed-out phrases, not boxes that are dragged and dropped. Some development environments do offer up a GUI in tandem with the “scripting” – that bit about writing out words I was talking about – but more often than not it is there to illustrate what you have scripted, not to assist in your scripting.
- Finite. Development technology is constantly changing, and no one developer knows all of the development methods or languages. That would be like someone knowing all of the spoken languages in the world. Rather, you’ll typically find a developer who “speaks” one development language really well, or maybe a branch of languages (much as you run into a person who can speak Spanish, French, and Italian because they share the same Latin “base”, it’s not uncommon to find someone who can code in ASP.Net and VB.Net and C#.Net, because they’re all of the Microsoftian .Net base). No one hires “a developer”; they hire a .Net Developer or a Java Developer or a Ruby Developer or what have you. Specialization exists because the base is so broad.
Modern cinema has done developers an injustice by making what we do seem both simple and sexy; the “shiny” environments typified by the interfaces “hackers” use on-screen look really slick and probably took real developers hours of work to make look good… with absolutely no real purpose. That said, actual development can be simple (with clear requirements and a decent knowledge of the things you can and can’t do) and can be quite sexy (if you’re sapiosexual). It’s just not well translated in current media. (To wit: Jeff Goldblum uploaded a virus to an alien system from a PowerBook. He didn’t have to know the alien system’s base language, machinery, indexes, program constraints, functions, etc. And it was on a Mac, in the ’90s, when development was not one of its strengths.)
Most of what development is, is trying to solve a problem (or two), and generating endless logic loops and frustrations along the way. You build a “thing”, you think it works, you go to compile it or make it run, it fails, you go dig through what you wrote, find you’re missing a “;” or a “,” or an “END” or a “GO” or a “}”, re-run, find it fails, and go dig through some more. For every hour you spend writing out what you want it to do, you spend about an hour figuring out why it won’t do it. This process of “expected failure” is not sexy or shiny or ideal, and that’s why it doesn’t show up on-screen.
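For the curious, here is roughly what that “missing ;” moment looks like in practice – a hypothetical Java snippet (the class name and scenario are invented for illustration). As written it compiles and runs; knock out the semicolon on the marked line or misplace a brace, and the compiler refuses to build it and sends you digging:

```java
// A small, compiles-fine example of the write/compile/fail/dig loop described above.
public class CountdownDemo {
    public static void main(String[] args) {
        int launches = 3;                       // delete this ';' and javac stops you with "';' expected"
        for (int i = launches; i > 0; i--) {    // drop the closing '}' below and the build fails too
            System.out.println("T-minus " + i);
        }
        System.out.println("Liftoff.");         // you only get here once the syntax is clean
    }
}
```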
These are misconceptions every developer, regardless of gender, has had to deal with at some point. Some deign to explain, some gloss over, some simply ignore; and much as I really hope we get a socially functional, intelligent person on-screen soon, I also hope we get a showcase for the simple elegance of real development.
It would be great, too, if there were more female developers on “display” (and not for their bodies, hence the scare quotes). Think through every movie you’ve ever seen that shows people doing any real development, “hacking” even (a term that is abused beyond recognition); how many were female? Go back to the movie “Hackers” – did Angelina Jolie actually, ever, really type anything? You inferred that she did, but the real development, the real “hacking”, was done by the crew of guys. Oh, and that’s right, she was the only girl. The Matrix? Carrie-Anne Moss spent precious little time in front of a computer there. She did look damn good in skin-tight leather.
Fast-forward a decade (or two) and we’re pretty much in the same boat. You see women behind computers on-screen, but they are typing in word processing programs or moving the mouse to click it on the shiny picture of the Murderer/Prospective Boyfriend (or, you know, both). They aren’t buried under a desk trying to trace a network cable or eyeballing multicolored text trying to figure out *WHY* it won’t compile, they’re delivering the shiny printout to the Chief/Doctor/Editor from which Decisions Will Be Made.
We find it surprising in social circles, I suppose, for women to be in development, because we don’t see it exemplified or displayed in any of our media. TV, movies, even proto-development toys for children often feature eager-looking boys interacting with them, while the girls are reserved for the beading kits and temporary tattoo sets (actually, there’s precious little out there for getting your child, regardless of gender, to learn code, but that is changing). We have crime-solving anthropologists, we have NCIS ass-kickers, we have cops and coroners; maybe it’s time we had a developer.
*Jurassic Park is a good example of both great and poor development display. Right before tripping that “Dennis Nedry Elvis Graphic”, Samuel L. Jackson’s character is eyeballing Nedry’s code. That stuff that looks like sentences that don’t make sense? That’s code. That’s what it looks like, for the most part. Unfortunately, later on when the little girl is hacking the “Unix System” that “she knows”, it’s all graphical. And that’s not accurate.