
Vanilla JS is a web performance decision

One of the big reasons I started learning vanilla JS a decade ago, and then started evangelizing it to anyone who would listen, is that it's so much better for performance than libraries and frameworks.

Libraries can be awesome, and definitely have their place. I use a few myself.

But platform-native JavaScript will almost always be faster than a library that adds weight and layers of abstraction. It's not just file size that's the issue. The further you move away from the native methods and APIs, the slower things get. Abstractions add latency.
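To make that concrete, here's a minimal sketch (the [data-count] button is my own example, not something from this post): a counter updated with platform-native methods is one event listener and one property assignment, while a typical framework routes the same change through state updates, re-renders, and DOM diffing before it touches the same node.

	// Assumes a button with a [data-count] attribute exists in the markup
	let count = 0;
	let button = document.querySelector('[data-count]');

	button.addEventListener('click', function () {
		count++;
		// One direct property assignment; no re-render or virtual DOM diff in between
		button.textContent = 'Clicked ' + count + ' times';
	});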

If you work on a modern device with a fast internet connection, you might not notice. But not everyone does.

This morning, Katie Sylor-Miller, author of the amazing Oh Shit, Git!?!, tweeted:

This past weekend, I had an experience that really reinforced why Web Performance is so vitally important. We were on vacation in a remote part of coastal Maine. There was limited 3G service, no TV (my kids hated that part), and very slow broadband with limited range wifi … However, when the power briefly went out, the annoyance became a major problem! I couldn’t look up the power company phone number, or go online to report the outage b/c sites loaded so slowly. Luckily the neighbor took care of it, but if they hadn’t been there, we were stuck.

This is not the first story of its kind that I’ve read.

When a hurricane slammed into New Orleans in the US last week, I saw at least one tweet about how the electric company's website kept crashing that person's device because of how much JavaScript it was loading. Several years back, I read an article that said more or less the same thing (for the life of me I can't find it, even though I know I wrote about it at the time).

If you actually care about your users, use as little JavaScript as possible, and make sure the critical parts of your site work without it.
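As a sketch of what that can look like in practice (the /report-outage endpoint is hypothetical, purely for illustration): ship the critical feature as plain HTML that works entirely on its own, then layer JavaScript on top only as an enhancement.

	<!-- Works with no JavaScript at all: a plain form post to the server -->
	<form action="/report-outage" method="post">
		<label for="account">Account number</label>
		<input id="account" name="account" type="text">
		<button>Report outage</button>
	</form>

	// Optional enhancement: submit in the background if fetch is supported
	let form = document.querySelector('form[action="/report-outage"]');
	if (form && 'fetch' in window) {
		form.addEventListener('submit', function (event) {
			event.preventDefault();
			fetch(form.action, {method: 'POST', body: new FormData(form)}).catch(function () {
				// If the enhanced path fails, fall back to the native form submission
				form.submit();
			});
		});
	}

If the JavaScript never loads, or fails on a slow connection, the form still does its job.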