Developer Culture Check

“This $5000 programming class is on sale for $17!”

I only had to scroll Facebook for 30 seconds to find this. That should tell you something.

Oh boy!

I can’t tell you how many half-baked ads like this I’ve seen claiming that learning {x, y, z} technology is going to {build a path to a better future, double your income, land the tech job of your dreams}. How many YouTube thumbnails claim “THIS ONE TRICK CHANGED MY CAREER”.

Already, you may be thinking “They’re obviously exaggerating. It’s an ad. That’s what they do.” If it were just the ads, that’d be one thing.

But it’s not about the ads. It’s about the culture.

Let me back up.

Over the past couple of decades, Computer Science exploded around the world. Tech giants laid a foundation on silicon and ambition. At first, it was a computer in every home, then every pocket, and finally every device that could be improved by one. Today, almost any app you could ask for is at your fingertips, any story you could imagine tucked in a video game. Research, medicine, finance, and virtually every modern field benefited from the digital age.

So, it’s no surprise that the desire to learn about this world — how tech is created, software imagined — has similarly made its way to the corners of the globe.

It was inevitable that learning these skills, even if just a lick of Python or JavaScript, would become valuable.

Nowadays many trades recommend picking up programming to stay current, get a competitive edge, or adapt to new software. Understandable, as in general knowing even a little bit can take you a long way. There are tailored intro CS classes for lawyers, doctors, geologists, you name it. Don’t get me wrong — this is the best thing that could happen to CS. After all, there’s a prevalent misconception that Computer Science is about programming. I remember thinking this as an intro CS student, and I remember the profound mix of awe and dread when my intro CS professor sat our class down, first day, put on a flat face and said,

“Computer Science is the study of problem solving.”

She went on to explain that we would use computers, learn Java and programming fundamentals, and generally survey modern CS. But if our intent was to major in CS, we should abandon whatever preconceptions we had about it. Learning to code would be like learning to write — a tool, a medium, a means to an end. But (again like writing) through it we would find so much more — an art, a mindset, catharsis, opportunity.

The skills you sharpen when learning to code are indispensable and highly sought after in other professions, and I’m not even talking about the actual application of coding itself, which is valuable in its own right.

I’m talking about problem-solving, abstract thinking, and patience.

Returning to my previous point, this is why it’s great that CS and learning to code have become so prevalent. The best things to come out of CS are always going to be its practical applications. Sure, settling P vs. NP and making advances in theoretical computation would be groundbreaking, but that alone isn’t what draws most people to CS, at least not at first. And I say that as someone who loves theory, algorithms, and the like (it was by far what my school’s CS program focused on the most).

So why is it so bad that everyone and their mom are trying to get you to learn to code online?

Inherently, it’s not.

Accessibility has historically been one of the great challenges to CS and learning to code. Even with the (relatively) recent push to bring in a more diverse set of developers, the CS/dev world is still overwhelmingly wealthy, white/Asian, male, and egocentric. As you might expect, so is the bubble surrounding learning CS, and that makes it particularly challenging to feel like you fit in if you don’t fall into the above. This effect is amplified by the fact that CS and coding are difficult to learn in their own right. You know what really sucks? When you feel like you aren’t getting something, and you can’t relate to anyone around you. Make no mistake, let’s welcome every perspective we can. We should be doing everything in our power to queer this shit up.

The problem is that more and more frequently the culture of learning to code is disingenuous. To dissect this, let’s briefly examine the three clear ways to become a developer today.

The classic: A 2–4+ Year Degree from a College/University

I will say upfront this was my approach, so I can speak from experience, whereas for the other two I can only speak as an observer.

You enroll and study for a degree in Computer Science, Computer Engineering, or a related area of study. This is the straightforward, no-nonsense approach. You will learn the fundamentals and then some. You’ll do it alongside peers, academic and industry professionals. You might even learn a little bit about how you learn, an invaluable skill in its own right.

There are a wide range of approaches that CS departments take:

You have something like my alma mater, Oberlin, a small liberal arts college, which offered an almost entirely theoretical, abstract, and math-heavy approach. There’s less emphasis on what you’re doing with what you’re learning, and instead, a lot placed on how and why.

On the other hand, you have departments that offer strict real-world focused curricula. You’re expected to pick up fundamentals, but then there’s more emphasis on practicing systems design, learning a breadth of languages, and development experience.

And then there’s everything in between. There are pros and cons to each. If you’re interested in continuing your education, a theory-heavy course load is likely what you want. If you’re trying to hit the ground running as a developer, a technically focused department might be the way to go. Both can be rigorous, and they aren’t mutually exclusive. I often see people in one camp berating the other, which I think is a waste of time. People have different goals.

Code Bootcamp

These are the one-stop shop for learning how to code. They’re typically short-term, demanding programs that teach hard skills and often focus on landing students a career in tech. They can be immersive deep dives into a single topic or shallow surveys of many, and they range from cheap to expensive.

I don’t know if this is still the norm, but I remember them appearing attractive to many because they offered deferred payment plans until they found you a job. That is, you didn’t pay anything until you started making money from what you’d learned.

They can be online, in person, or some combination. The majority offer courses in web, game, and/or mobile app development, as well as UI/UX design.

Self-Taught

This is exactly what it sounds like. You boot up your Google machine, find a guide that looks interesting, or download an e-book/e-course. If that sounds like an incomplete description, it’s because it is. There’s no “right” way to teach yourself. It’s up to you.

Teaching yourself to code is no new thing. There are countless books, articles, video series, and every other medium you can come up with to learn. I think this is a Very Good Thing. The education is out there for free, for those who want to put in the time and learn it.

Since it’s self-paced as well, you have total flexibility as to how, when, and what you’re learning.

This is where the problem starts.

Let me begin by saying there’s no end to people going back and forth, again, berating each other over how these approaches stack up. For every die-hard self-taught developer, there’s another who swears by code bootcamps. Don’t even get me started on the elitist “I went to X prestigious university and your education is worthless” attitudes out there.

While that’s a separate issue in its own right, that’s not what I’m here to rant about today.

No, the real issue is that everyone wants a piece of the pie and is willing to sacrifice the bigger picture to get it. We’re trading quality for quantity.

All of these ads that claim they will transform you into some kind of coding powerhouse billionaire are misleading to the uninitiated public. They don’t always paint an accurate picture of what being a developer really looks like, nor do they encourage the image of what, as I said above, CS is all about: problem-solving.

They appeal only to wide-angle shots of salary and status, wrapping a technical career in jargon that the unfamiliar can’t decipher well enough to know whether they actually want to be a developer.

And I can’t entirely blame them. IT is the new white collar. It’s a started-from-the-bottom Cinderella story that sells and reeks of the “American dream.” Like I said before, this isn’t inherently a bad thing, but it deserves scrutiny.

The fact of the matter is, that’s an unrealistic portrayal of what the work and lifestyle actually look like. Not everyone should be out here trying to write code for 8 hours a day. I’m willing to bet that it’s further contributing to the stereotypes that already exist within the culture (re: lack of diversity).

Even for those who do want to pursue a career as a developer, and would excel at one, it feels dishonest to have the educational narrative wrapped tightly around learning to code just to get a job.

Because, again, the technical act of coding is not what should command respect in learning to code. This culture of picking it up as fast as possible isn’t doing the community justice. Worse, there’s a trend of devs being pushed into the industry so fast that they’re thrown to the wayside a year later for producing spaghetti code, security vulnerabilities, and the like. This not only lowers hiring expectations but creates a toxic environment for junior and aspiring developers.

Unfortunately, this culture is pervasive, carried along by social media and the ever-growing hunger for attention and status.

My intention is not to come across as some kind of CS evangelist.

Quite the opposite. I believe that over the next decade we will continue to see a shift toward learning the fundamentals of CS and coding as a necessary and normal part of continuing, and even primary, education.

That’s why, right now, as developers and mentors it’s our responsibility to be hyper-critical of how our field is being conveyed. What is the focus of our lessons? Who is hearing them? What interests do they have that set them apart and how can learning benefit them outside of monetary gain? It is paramount that we capture what inspires us and make sure that’s presented at the forefront of our community.

/rant