Learning to Code? You’re Doing It Wrong

You’ve probably seen an uptick in the number of people wanting to learn to code these days. So why aren’t all the developer jobs gone if seemingly everybody is learning to code? The sheer volume of free tutorials and documentation should be enough to educate all the people we need in software development, and yet the demand for developers continues to rise.

According to the Bureau of Labor Statistics, “Employment of software developers is projected to grow 21 percent from 2018 to 2028, much faster than the average for all occupations.” So what is stopping people from scooping up these jobs? At least one culprit is something called “Tutorial Hell,” an extremely common trap people fall into when they start learning to code.

I majored in education at my university, so I already had the meta-awareness that just “coding along” with tutorial videos wasn’t going to be enough to teach me how to code on my own. This may confuse some readers; after all, some people are visual learners. Shouldn’t they at least be able to learn to code just by watching someone else do it? The answer is yes and no.

We can gain familiarity with a subject by watching, which is certainly better than nothing. What we will never get from a tutorial video, but what is crucial to becoming a hirable developer, is the set of soft skills that enables a developer to work through new and unfamiliar problems. The catch-22 of learning to code well is much like the catch-22 of the developer job market:

You need to know how to code to solve new and complex problems. You can’t learn how to code without solving new and complex problems.

-is similar to-

You need experience to land a job. You need a job to gain experience.

If you look closer, however, you will find that this catch-22 is only a paradox on the surface.

For the job market dilemma, you can generate your own experience with side projects relevant to the job you want. This is easily done in the tech and creative industries, where you can build prototypes and small applications for your personal portfolio, which you can then show off on job applications to attract employers.

For the learning-to-code dilemma, you have to accept that the most important part of the process is figuring things out yourself. Think about it. There will not always be a tutorial for the problem you need to solve on the job. You need to decide how you’re going to approach an unfamiliar problem when the documentation is sloppy or the tutorial hasn’t been written yet. How did the people who wrote those first tutorials manage? Certainly magic or pure genius was not required in every instance.

Most likely, the first person to release a tutorial on a subject gathered information related to their problem, took the time to understand the inner workings of the system they were working with, and attempted their own solutions until something finally worked. Failure needs to be embraced as a necessary part of the learning process, not something to be skipped over. Of course, it’s much more pleasant to skip this scary, uncertain part of learning and opt to be led by tutorials instead. It’s easier never to take risks, never to watch your program throw errors and break while you refine it. This is why Tutorial Hell is so common.

In The Fifth Discipline, Peter Senge outlined 11 laws that can guide people to become more effective at learning and at life in general. Two of these 11 laws explain why Tutorial Hell happens to people who only want to be led when learning to code and don’t want to take risks of their own:

Law #4 → The easy way out usually leads back in.

It’s easier to watch tutorials and copy the code the instructors write. If you do this, you will end up in the same position at the end of the tutorial that you were in at the beginning. You will not know how to create the project from scratch on your own because you were relying on the instructor to tell you what to do. If you encounter bugs, you will have a much harder time solving them because you didn’t fully understand what you were writing.

Law #6 → Faster is slower.

Learning “the hard way,” by building your own projects and suffering through the uncertainty, feels much slower, but ultimately you will reach your learning goals faster this way. Tutorials have value, but they are meant to be used as tools to fill in knowledge gaps, not to build your projects for you.

The best advice I have received from speaking with other developers at work is that most people have it backwards. They look at the most popular tools out there and try to learn them before they have a reason to use any particular tool. Instead, aim to solve a specific problem in the world. What kind of app does the world need? What kind of problem could be solved by an application? Is there something you’ve always wanted to build just for fun? Once you have a problem you want to solve, you can go about planning the design. That is when you need to select the best tools for the job, and when you can decide whether that “hot” new tool is really the best choice for your application.

Remember how we were discussing all of those developer jobs that nobody seems to be taking? This is another reason for that. In developer interviews, candidates are often asked to explain why they chose X technology or Y tool when building Z application. Candidates who don’t have a clue why they used it other than “it was a popular technology” will immediately get the boot. Developers aren’t just supposed to use the “latest and greatest” tech on the market. They’re supposed to make informed choices about the tools they use on the job.

Not all tools are created equal, and not all new tools end up being as good as everyone said they would be. As with new businesses, only the best ideas have any staying power, and they aren’t always recognized for what they are while they’re still in their infancy. Being an early adopter always carries some risk. That’s why even the best inventions take time to gain popularity; not everyone is keen to be the first to find out that there’s a major problem with the new thing.

In the software world, though, new developers see these new technologies as shiny things to boost their resumes, so they can’t jump on the bandwagon fast enough. It is worth being cautious and fully understanding what a new technology can offer your project before you start using it. Doing this analysis up front ensures that you don’t end up learning a technology you have no use for other than as a resume builder. Employers are much more interested in seeing that you’ve made sound decisions about the technology stack in your projects. To be fair, popular technologies often do have many valid use cases; that’s why they’re popular in the first place.

The main takeaways I want you to get from this are:

1. Don’t learn something just because it’s popular. Learn it because it’s the best choice for your project, and be able to articulate why that is.
2. Don’t let tutorials build your projects for you. Use tutorials to fill in knowledge gaps, then go back to your project to implement the new information.
3. You can’t learn anything well by being led. You have to take risks, fail a lot, and trust the process.
4. Trust yourself to problem-solve and Google answers, because that’s what even the most experienced developers do every day.

I hope this helps anyone stuck in Tutorial Hell, as well as anyone interested in beginning their journey to learn how to code. Happy Coding!