Much has been written about learning how to code. Instead of rehashing the best of what is out there, of which there is already a great deal, I’ll write about how I personally learned. If you’re like me, this might be a good way to learn. If not, it might be a terrible way. There are many paths, some no doubt more efficient than others; this is mine.
I was first exposed to code by my Dad. My Dad is a programmer, nominally self-taught along the lines of the old-school programmers you read about on Wikipedia. He was obsessed with the early computer hardware he had access to as a teen, and taught himself by starting with games and other small programs. Eventually he and some friends started a software company.
I didn’t actually appreciate code that much while I was living at home, beyond a few early formative experiences. One of them was at camp, taking a course on building a website. The power of code really sunk in at that point: I was in lots of STEM courses, but code seemed much more interesting and practical than the other stuff we talked about. The second was when I became interested in businesses, and online businesses in particular. I missed the boat on being Jeff Bezos and getting in on Web 1.0, but I still feel like there are so many opportunities to build things that are useful to people and deliver them over the internet. I’ve worked in ecommerce, directly or indirectly, my whole career. Maybe I will in the future too.
People think learning to code is hard, because movies show a nerd slinging unrecognizable garbage into a black-and-white screen full of text and hitting enter, then the door unlocks or the power grid turns off or something. That isn’t programming. Programming is starting with a goal, like keeping track of books or animating a cartoon bunny, and involves a tight cycle of Googling, tutorials, copy/pasting code, running it, fixing the broken stuff, running it again, fixing the broken stuff again, and eventually deciding you are done. Even people who do it for a living mostly do that. There are many cooks, and few chefs. The chefs of programming are in fact doing incredibly cool things, likely on a black-and-white screen. But they still spend a disturbing (for people with poorly set expectations) amount of time Googling and failing to do it properly.
I found that it took me a really long time to learn to code properly. I wouldn’t necessarily say I can even do it now, despite many years of experience. The weird thing about code, compared to almost anything else, is that it’s a video game with unlimited levels. You can continuously learn how to program, starting with the most trivial tasks and upgrading over time to more complex ones. I’ve spent years of effort in total writing code, and I still learn materially important things at least once a week. It is frustrating, because unlike most skills, you can be many orders of magnitude better when you get good than when you started out. In hockey, you can skate maybe five times faster once you’re great. With code, you can solve a problem in literally 30 seconds that might have taken days when you started out. And that’s basically the product of the continuous learning: it never gets easier, you just get better and faster at it.
The progression in terms of skill starts out with just exploring and copying/pasting things. The next level involves less copy/pasting, but it still takes you a really long time to debug problems. The last stage is basically having enough experience that you can debug things really fast, or at least figure out the nature of the problem fast enough for customers, users, or yourself to be satisfied with how long it takes. That took me years, but I’m sure there are people for whom it doesn’t take as long to get to that level. And likewise, if you are really talented, you can not only move faster but also solve harder problems.
On reflection, I’m not sure I would change anything about how I learned. Programming properly is hard. I don’t find many things hard, but programming is consistently hard, and I don’t think that’s because I lack talent. Computers are weird, so much weirder than you would think as a user. And programming is actually very rudimentary and primitive once you realize what you’re doing. It’s possible that tooling and such will get better in the future, but my experience has been the opposite: as I get better at programming, I yearn for fewer tools, or none at all, in favor of a simple, powerful language where I can write things in a really deterministic way.
If you wanted a formula from this article, I’ll share a rough outline of what my approach was: do things that interest you, start with the front-end because the cause and effect is more obvious, move on to server-side stuff and true “programming,” and expect a lot of pain and failure. Once you can throw a web application together without Googling, you’re good enough to do this for a living, and you can kind of decide where to take it. There are bootcamps and online learning tools that try to get people to this proficiency. In my opinion, progression beyond it takes a great deal of learning from failure, pushing yourself, finding interesting and difficult problems to work on, and mentorship. You don’t know what you don’t know. Just learning what HTML is can be eye opening, and if you can’t enjoy it, you probably have an answer as to whether to attempt the more hardcore stuff.
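To give a sense of how direct that front-end cause and effect is, here’s a minimal sketch of the kind of starting point I mean — the filename and the text inside are illustrative, not from any particular tutorial:

```html
<!-- Save this as index.html and open it in any browser. -->
<!DOCTYPE html>
<html>
  <head>
    <title>My first page</title>
  </head>
  <body>
    <h1>Hello, world</h1>
    <!-- Change the text below, refresh the browser, and the
         change appears instantly. That immediate feedback loop
         is what makes the front-end a forgiving place to start. -->
    <p>Edit me, then refresh.</p>
  </body>
</html>
```

There is no compiler, no server, and no setup: the file is the program, and the browser shows you exactly what you wrote.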
My last point is that for every coding job the economy has generated, at least three related jobs are created in sales, support, HR, and other functions inside of those fancy companies where they give you free lunch. So if your sibling is learning to code, you may be just as well off focusing on learning how to sell software, or support it, or hire programmers. Don’t force yourself to be someone you aren’t, and appreciate that a world dominated by software requires a variety of skills. You should definitely aspire to understand software, though, at least well enough to use it for work. Because every job in the future will be impacted by software in a meaningful way.