Chapter 4. Sustainable Web Development

Aside from the general benefits of reliability, security, and maintainability, well-written code is inherently more efficient. If you can write in ten lines of code what you used to write in a hundred lines, you’ve not only reduced the size of the file, but likely reduced the amount of work the server has to do to process that file.

Think about the way you write code from a structural point of view. Can you organize your files to be more streamlined and avoid duplication of styles and functions? Can you simplify logic to reduce the number and complexity of queries required to deliver a specific piece of functionality? Can you code the website without using bloated libraries and plugins that contain functionality you don’t even need? It might take a bit more time to plan an efficient structure or to write code from scratch, but the benefits in terms of performance and sustainability can be significant.

Of course, we live in a reality where time is short. Hand-coding everything might deliver technical efficiency, but might not be a good use of our own time. Likewise, we might not have the experience or confidence to code everything from scratch.

We don’t need to be dogmatic about it, but we do need to commit to making informed decisions about our solutions. Even if we need to use an off-the-shelf option, a little research could reveal a more appropriate and less bloated solution.

Let’s drop the assumption that because developers’ time is more expensive than computer time, computing efficiency is not worth pursuing. According to Nikita Prokopov, this attitude in our industry results in us “wasting computers at an unprecedented scale” (http://bkaprt.com/swd/04-01/). It reminds me of something Guy Singh-Watson, founder of Riverford Organic Farms, wrote in one of his veg box newsletters: “Until energy is more expensive, I suspect we will keep thinking of new ways to waste it” (http://bkaprt.com/swd/04-02/). We must start valuing energy and natural resources as much as we value our own time and effort.

Energy Efficient Languages

If efficient code helps us create more sustainable web services, it’s worth asking what difference programming languages themselves make to energy efficiency.

A team of six researchers from Portuguese universities set out to investigate exactly that (http://bkaprt.com/swd/04-03/, PDF). They ran server energy tests of twenty-seven popular programming languages, using ten standardized algorithmic problems from a free software project called the Computer Language Benchmarks Game (http://bkaprt.com/swd/04-04/). 

They found the energy efficiency of different languages varies significantly, and although the results varied by context, there was a clear pattern that compiled languages delivered the highest levels of energy efficiency (Fig 4.1): “On average, compiled languages consumed 120 J [joules] to execute the solutions, while for a virtual machine and interpreted languages this value was 576 J and 2365 J, respectively.” Those are some profound differences! (The notable exception to this rule was Java, which came in at just 114 joules.)

The researchers found that the three least energy-intensive programming languages were C, Rust, and C++, while the three most energy-intensive were Perl, Python, and Ruby. Popular web programming languages JavaScript and PHP ranked 17th and 21st respectively, which is a little troubling when we consider these languages power the majority of modern websites. Some small consolation was that JavaScript and PHP were two of the most energy-efficient languages when manipulating strings with regular expressions, even though “they tend to be not very energy efficient in other scenarios,” according to the report.

Fig 4.1: The variation in energy efficiency of programming languages is astonishing.

When looking at actual energy consumption (rather than ranking), JavaScript is far more efficient than PHP or Ruby, using just 4.4 joules to complete an average task, compared to 29.3 joules for PHP and 69.9 joules for Ruby. In other words, JavaScript uses 15 percent of the energy of PHP and just 6.3 percent of that consumed by Ruby. 

In practice, we have to use programming languages appropriate to the project and with which we have sufficient proficiency to deliver our work, but knowing which languages are most efficient helps us decide which languages to use in those cases where we do have a choice. It also provides guidance to help us intentionally choose which programming languages to learn and use in the future.

Use JavaScript with care

Despite the comparative benefits of JavaScript, I’m going to single it out as a language we should try to use less. I know what you’re thinking: “This guy isn’t making any sense!”

Let me explain. Although JavaScript might be more energy efficient in processing algorithmic problems, it will always be less efficient than serving static files in cases where we don’t need our code to “solve problems.”

The most common example is the use of JavaScript for animating elements in a web design. CSS can now achieve many animation effects far more efficiently than JavaScript: it minimizes the need for the CPU to “think,” and it generally requires far smaller file sizes, minimizing the energy used to transfer data.

We should think carefully about whether the code we use is necessary. For example, does our website really need jQuery to deliver the functionality specified, and is it appropriate to use a front-end framework like React, Vue, or Angular for simple websites that don’t have any need for them?

Likewise, much of the functionality JavaScript adds consumes energy and slows websites down without contributing real value for the user. And that brings me to tracking scripts.

Tracking scripts

One of the most common uses of JavaScript is for advertising and tracking scripts, which are at best a distraction and in many cases an invasion of privacy.

Simple analytics tracking scripts vary enormously in size. The standard Google Analytics script runs 17 KB, while Google Tag Manager is significantly heavier at 75 KB; more streamlined options are a fraction of that at 1.5 KB for Minimal GA and 1.2 KB for Fathom at the time of writing (http://bkaprt.com/swd/04-05/). It’s worth questioning what functionality you need from your tracking script to justify the extra weight. 

News websites are notorious for having an excess of tracking and advertising scripts. In 2015, the New York Times reviewed the data consumption of fifty popular news websites and found that, on average, over half of the data consumption came solely from advertising scripts. In addition to causing significant performance issues, these scripts end up costing mobile users a large chunk of their monthly data allotments. A user could spend up to thirty-two cents of a typical data plan loading just the ads on a news homepage—while the editorial content would cost only eight cents (http://bkaprt.com/swd/04-06/). 

Tracking scripts can undo a lot of our great work in creating an efficient website. To solve this, we need to be less passive about simply installing all the scripts requested by the marketing department at the end of the design and development process, and engage in an open conversation about what exactly we are trying to learn from the data. There may be much smaller tracking scripts that can provide the data we actually need.

We can also reduce the impact of tracking scripts by asking when they are needed. We may not need to use tracking scripts indefinitely, but instead could install them temporarily to track a campaign or understand a specific UX problem, and then remove them once we have the necessary insights.
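As a sketch of that idea, the snippet below only injects a tracking script during a fixed campaign window. The dates and script path are hypothetical placeholders, not part of any real analytics product.

```javascript
// Sketch: load a tracking script only during a fixed campaign window,
// so the extra weight disappears once the insights are gathered.
// The dates and script URL below are hypothetical placeholders.

function shouldLoadTracker(now, start, end) {
  return now >= start && now <= end;
}

// In the page, something like:
// if (shouldLoadTracker(new Date(), new Date('2024-03-01'), new Date('2024-03-31'))) {
//   const s = document.createElement('script');
//   s.src = '/js/analytics.min.js'; // hypothetical path
//   s.defer = true;
//   document.head.appendChild(s);
// }
```

Once the campaign ends, the condition fails and no visitor pays the cost of the script again.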

We might not always have full control over what tracking scripts end up on the websites we create, but by asking the right questions and proactively suggesting more efficient alternatives, we can help steer things in the right direction.

Optimizing Font Files 

When we decided to use Inter UI as the typeface on the Wholegrain Digital website, we were concerned that a single font weight came standard as a 298 KB TTF file—larger than the rest of the front-end code put together. So we investigated how we could reduce the size. We started with the original Inter UI font file in TTF format and converted it to a more efficient WOFF2 file format using Font Squirrel’s Webfont Generator (Fig 4.2) (http://bkaprt.com/swd/04-07/). This reduced the file size to 77 KB with no loss of quality whatsoever.

Fig 4.2: It seems all the best tools are run by cute animals. Font Squirrel’s Webfont Generator is a simple way to convert fonts to the optimal format in no time.

The next question was whether the font included any elements we didn’t need. Using the character map viewer provided by Font Drop, we were able to see that our Inter UI font file contained 2,192 characters supporting thirty-nine languages, most of which our website simply didn’t need (http://bkaprt.com/swd/04-08/). We used the font subsetting tool from Everything Fonts to strip out the unused characters, leaving us with a subsetted version of Inter UI containing only ninety-eight characters (http://bkaprt.com/swd/04-09/). The final file was a mere 7 KB, a reduction of 97.7 percent in file size compared to the official Inter UI file we’d started with, and without any negative side effects for users of our website. 
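If you want to subset a font yourself, the first step is knowing which code points your content actually uses. Here’s a rough sketch of turning sample text into the kind of unicode-range list subsetting tools accept; it’s illustrative, not a production tool.

```javascript
// Sketch: work out which Unicode code points a body of text actually uses,
// expressed as a unicode-range style list -- the input a subsetting tool needs.

function unicodeRanges(text) {
  // Unique code points, sorted ascending.
  const points = [...new Set([...text].map((c) => c.codePointAt(0)))].sort((a, b) => a - b);
  const ranges = [];
  for (const p of points) {
    const last = ranges[ranges.length - 1];
    if (last && p === last[1] + 1) {
      last[1] = p; // extend the current contiguous run
    } else {
      ranges.push([p, p]); // start a new run
    }
  }
  const hex = (n) => n.toString(16).toUpperCase().padStart(4, '0');
  return ranges
    .map(([a, b]) => (a === b ? `U+${hex(a)}` : `U+${hex(a)}-${hex(b)}`))
    .join(', ');
}
```

Run over your site’s rendered text, the output tells a subsetting tool exactly which glyphs to keep.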

So long as you have control of the font files, and the foundry’s license terms allow for web use and optimization, you can repeat this process for almost any font, improving performance and sustainability with just a few minutes of effort.

Staying Static 

If you were born before 1990, you’ll probably remember the days when we used to build websites in static HTML, and when we made our dreams come true with Dreamweaver. It turns out that those basic websites were really, really ecofriendly because they generally had really small files and placed minimal load on the server.

The interaction between the browser and the server amounted to something like this:

Browser: “Can I have the homepage?” 
Server: “Sure, here it is.”

With the introduction of content management systems (CMSes), that process has become a lot more complicated. The actual HTML files no longer exist on the web server; the server has to generate the files every time someone requests the web page. Think of old-fashioned static HTML web pages as microwave-ready meals, while CMS-based web pages are raw ingredients that need chopping and cooking before anyone can eat them. Despite the incredible power CMSes have given us to take control of our content, the technology enabling this control is a lot less efficient, resulting in higher energy consumption and slower load times.

It’s difficult to put a number on exactly how much extra energy this uses, but we can get an indication by comparing the amount of computation required to load a simple HTML file with the amount required to deliver the same content from a PHP file (used by popular CMSes like WordPress and Drupal).

At the Web Performance for People and Planet event held in London in 2019, the CTO of the Positive Internet Company, Nick Mailer, gave a presentation in which he suggested using the number of system calls on the server as an indicator of the relative amount of energy used to load a file. He created a static HTML file and an equivalent PHP script to deliver the exact same content to the browser: the string “hello, world.” The static file used forty-six system calls, while the PHP file used 888 system calls—nearly twenty times the amount of work on the server.

Fortunately, solutions exist that can reduce the number of system calls on the server. Let’s take a look at some of them.

Page caching

One common solution is page caching, where most of the page is generated on the fly by the CMS when the first visitor loads the page, and all subsequent visits receive a cached version of the page. Although the server still has to do the resource-intensive work of generating and populating the markup for the first visitor to a new or updated page, every subsequent visitor to that page will receive that same generated copy of the page’s HTML.

This not only saves a lot of computational energy, it also greatly improves web performance, because the user doesn’t need to wait for the page to be “assembled” on the server before it can be sent to their browser.
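The core of the idea fits in a few lines. This sketch stands in for what caching plugins do under the hood; `renderPage` is a placeholder for whatever expensive work the CMS performs.

```javascript
// Sketch of the page-caching idea: the first request pays the cost of
// generating the page; every later request gets the stored copy.
// renderPage stands in for whatever expensive work the CMS does.

const pageCache = new Map();

function getPage(path, renderPage) {
  if (!pageCache.has(path)) {
    pageCache.set(path, renderPage(path)); // expensive, happens once per path
  }
  return pageCache.get(path); // cheap for every subsequent visitor
}
```

The expensive render runs once per page; every visitor after that receives the stored HTML. (A real cache also needs invalidation when content changes, which is exactly what the plugins handle for us.)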

A plugin like WP Rocket is the easiest way to implement caching in a CMS such as WordPress. WP Rocket generates the cached version of our web pages for us, and can also pregenerate the cache by simulating the first visit. This helps us avoid slow load times for the first visitor to each newly edited page, and also helps search engine rankings by ensuring all pages are cached before search engine crawlers visit.

However, when we implement a caching solution inside our CMS, the CMS itself still has to fire up and process every request before sending the static files to the visitor. We can eliminate this initial overhead by moving the caching layer onto the hosting itself. The server sends the cached files directly to the user without querying the CMS, saving energy and improving load times even further. Server-side caching technologies like Varnish are widely available and come standard in some hosting packages, so even if you don’t have the technical knowledge to set caching up yourself, you can still apply it to your projects.

Page caching also increases resilience to traffic spikes. For example, one of our clients’ sites took a million and a half hits inside a week from bots trawling it. The server held up just fine because it didn’t have to do any of that heavy WordPress stuff. To quote my colleague Josh Stopper, it was like the server just said, “Here’s the traffic. Here’s the file. Bye.” Job done!

JAMstack

Caching isn’t the only solution for eliminating the inefficiency caused by content management systems dynamically generating web pages. An emerging approach that can help us create a more sustainable web is the JAMstack.

JAMstack might sound like something you would eat for breakfast, but it actually stands for JavaScript, APIs, and Markup. By using the growing power of JavaScript in modern web browsers, combined with APIs that can interact with other web services, JAMstack is a way to decouple the CMS from the front-end user experience.

A JAMstack website uses a content management system to edit and publish content, and then the web pages themselves prerender as static web pages. This approach delivers the performance, security, and energy efficiency of purely static web pages—and unlike the static websites of the early web, it can also deliver the rich interactions, advanced functionality, and CMS-based content editing we expect from the modern web. JAMstack websites typically also host the static files on a content delivery network (CDN), saving energy by reducing the distance files have to travel to each visitor and creating resilience by eliminating any single point of failure.

It’s an approach rather than a technology, but it’s becoming ever more accessible to apply to our projects. There are a growing number of JAMstack tools, including static site generators such as Jekyll, Hugo, and Gatsby, and hosting platforms such as Netlify.

This approach is similar to hosting services (such as Strattic and Shifter) that do the hard work of converting our preferred content management systems into static websites. When people visit these websites, they are hitting static web pages with no connection to the CMS. In fact, the CMS doesn’t even exist unless an editor is logged in and editing content. So-called “serverless” hosting solutions are designed so that when an editor logs in, the service spins up a live version of the CMS, generates static files for the web pages, and then winds down the CMS again when the editor logs out.

This makes perfect sense. Web-based CMSes are extremely useful for empowering people to publish and manage content on the web, but if visitors are being served static files from a CDN, then there’s no need for the CMS to be running on a server and using electricity unless someone is actually logged into the CMS editing the website.

Josh Lawrence, cofounder of serverless WordPress host Strattic, told me that, although it’s hard to obtain exact numbers, a static implementation of WordPress is likely to use several hundred times less energy than using WordPress in its default form.

As for whether the JAMstack approach is more efficient than a well-cached conventional CMS website, that’s a matter of debate. The difference is marginal if you’ve set up your CMS to work efficiently. The key difference is that JAMstack delivers the efficiency benefits of static websites by default, whereas a conventional CMS requires careful attention to detail to ensure everything is set up to perform efficiently, which is often not the case.

Progressive Web Apps 

Progressive Web App (PWA) technology is a close ally of static web technology—it helps us develop more efficient websites by bringing many of the features previously unique to native mobile apps into the web browser. PWAs are now supported in most web browsers, and offer numerous benefits in terms of user experience, such as the ability to cache files on a user’s device with a high degree of control.

Carefully designed caching reduces the number of times the browser makes requests to the server, as well as the amount of data transfer. This is particularly effective on pages that users are likely to visit multiple times but don’t change very often, such as reference articles. In addition to benefiting the environment, it also improves load speeds and makes websites seem more stable when the web connection is a bit flaky, such as when commuting on the train—not to mention providing resilience during natural disasters when telecoms networks are often compromised.
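At the heart of a service worker’s fetch handler is a routing decision: serve from cache, or go to the network? Here’s a sketch of one such policy; the file extensions are an illustrative choice, and the commented-out handler shows roughly where the decision would plug in.

```javascript
// Sketch: the routing decision inside a service worker's fetch handler.
// Static assets that rarely change are served cache-first; everything else
// goes to the network first. The extension list is an illustrative policy.

const CACHE_FIRST = ['.css', '.js', '.woff2', '.svg', '.png'];

function pickStrategy(url) {
  const pathname = new URL(url).pathname;
  return CACHE_FIRST.some((ext) => pathname.endsWith(ext)) ? 'cache-first' : 'network-first';
}

// In the service worker itself, something like:
// self.addEventListener('fetch', (event) => {
//   if (pickStrategy(event.request.url) === 'cache-first') {
//     event.respondWith(caches.match(event.request).then((hit) => hit || fetch(event.request)));
//   }
// });
```

Every request answered from the local cache is one the server never has to process, and one less round trip of data transfer.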

In Progressive Web Apps, Jason Grigsby argued that there really is no good reason for any website not to harness the benefits of PWAs: “Even if you don’t think of your website as an ‘app,’ the core features of progressive web apps can benefit any website. Who wouldn’t profit from a fast, secure, and reliable website?” Add “energy-efficient” to that list!

Compress Your Code

Having written your super-clean, well-organized code, there’s one more thing to do before you upload it to your hosting account and call it a day: compression!

No matter how well you write your code, it can always be more streamlined. As humans, we like certain “niceties” in code that machines don’t care about: white space for legibility, comments to help us understand it, and intuitive naming of classes and functions. What’s more, we’re not perfect, and will often include some unused or duplicated code.

While these extras are fine in the development process, they make our files larger than they need to be. If we can strip them out of our work and generate a version of our code that’s purely machine-readable, then we can save energy and improve web performance.
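As a toy illustration of what that stripping looks like, here’s a tiny CSS “minifier.” Real build tools do far more, and handle edge cases this sketch happily ignores.

```javascript
// Toy illustration of what a minifier does to CSS: strip the comments and
// collapse the whitespace that help humans but mean nothing to the browser.
// Real build tools do much more (and handle edge cases this sketch ignores).

function minifyCss(css) {
  return css
    .replace(/\/\*[\s\S]*?\*\//g, '')  // drop comments
    .replace(/\s+/g, ' ')              // collapse runs of whitespace
    .replace(/\s*([{};:,])\s*/g, '$1') // no spaces around punctuation
    .trim();
}
```

Even this crude version shrinks typical hand-written CSS noticeably; dedicated minifiers also rewrite values, merge rules, and remove unused code.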

Build tools help us do this by converting the development code we read and write as humans into a production-ready version optimized for machines. Build tools can be used for many things, including checking code against web standards and browser compatibility, but it’s their ability to compress code that we’re interested in here.

It can be complicated to set up build tools because there’s a different tool for every optimization we want to apply. CodeKit is a piece of software written for people fed up with managing multiple open-source tools to produce a website build (http://bkaprt.com/swd/04-10/). It will take pretty much any code you give it, run it through build tools, and compile it into the most efficient format. It works with images, JavaScript, CSS, PHP, almost anything you like. It doesn’t have all of the flexibility you’d get from setting up custom build tools, but as a starting point, it’s powerful.

Block the Bots

Having spent most of this chapter explaining ways you can make your code more efficient, there’s one really easy way you can reduce the environmental impact of your websites with minimal effort.

In one word—bots!

As web designers we’re very much focused on creating products that serve the needs of the user—but human users aren’t the only ones visiting our websites. A huge percentage of website traffic comes from bots.

Bots visit websites for a wide variety of reasons—web scrapers, SEO bots like Ahrefs and SEMRush, brute force attackers trying to break in, RSS reader apps, competitors monitoring each other’s websites, and “script kiddies” just playing with bots.

Akshat Choudhary, founder of the WordPress backup and security service BlogVault, told me that their data shows “bots often use up 50 percent of resources such as processing and bandwidth. We observe this consistently across all sites.”

Remember that I mentioned in Chapter 1 that if the internet were a country, it would be equal to Germany as the sixth worst polluter in the world? Now imagine if 50 percent of Germany’s electricity was being siphoned off by bots that are up to no good. Surely it would be a national priority to keep the bots out.

This is how we should be thinking about bots in our web projects. We might not see them, but they’re there, trying to extract information and breach our security. In the process, they put additional load on our servers and slow down our websites.

We need to keep the bots out, and luckily, in many cases, it’s not too difficult. Bots can be blocked primarily by using a firewall, which can be added to most websites using a service such as Cloudflare or Malcare, or by your hosting provider. These services will block the majority of bots, but still let in the ones you want (hello, Google!).
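As a rough sketch of one signal such firewalls use, here’s a user-agent filter that blocks some known crawlers while letting welcome bots through. The lists are illustrative, not a recommendation, and real firewalls combine many more signals than the user-agent string alone.

```javascript
// Sketch of user-agent filtering: block unwanted crawlers while letting
// welcome bots through. The lists below are illustrative examples only;
// real firewalls use many more signals than the user-agent string.

const BLOCKED = ['ahrefsbot', 'semrushbot', 'mj12bot'];
const ALLOWED = ['googlebot', 'bingbot'];

function isBlockedBot(userAgent) {
  const ua = (userAgent || '').toLowerCase();
  if (ALLOWED.some((bot) => ua.includes(bot))) return false; // hello, Google!
  return BLOCKED.some((bot) => ua.includes(bot));
}
```

A check like this would run before any expensive CMS work, so blocked requests cost almost nothing.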

With this in place, you can not only reduce the carbon footprint of your website, you can also improve load speeds, enhance security, and stop competitors from scraping your information. Win!

Better, More Sustainable Code

The bottom line when developing websites and apps: remember just how much influence you have over the sustainability of the end product, whether or not you were involved in the design process. A thoughtful, detail-oriented approach to development leads to websites that are far more efficient, with no perceivable difference to the user other than improved performance. That’s a pretty good side effect.

No matter what type of web project you work on or what coding technologies you use, one thing is certain—it will need to be hosted somewhere, and we’re talking about that next.
