Green Code: Reducing your website's carbon footprint

Photo by Ellen Melin on Unsplash


Quantifying and reducing your web application's carbon emissions


This post was inspired by one of the React India '22 topics.

Everyone thought going digital was better for the environment, but few considered the ecological impact of large data centres and edge locations: their emissions and the heat they generate.

Now is the time to start improving this. But it is difficult to improve something we cannot measure. Fortunately, a few organisations have come up with a fairly well-accepted way of measuring the carbon impact.

How to measure the impact: the formula

tl;dr: a lot of mathematics

The constants

According to this source (link)

  1. Annual internet energy: 1988 TWh (projected figure for 2020; includes consumer devices, networks, data centres, and the energy consumed in hardware production)
  2. Annual end-user traffic: 2444 EB (projected figure for 2020, for expected traffic across wired and wireless channels)
  3. Carbon factor (global grid): 442 g/kWh source
  4. Carbon factor (renewable energy sources): 50 g/kWh

Dividing the two: Annual Internet Energy / Annual End User Traffic = 1988 TWh / 2444 EB ≈ 0.81 TWh/EB, which is the same as 0.81 kWh/GB (since 1 TWh = 10⁹ kWh and 1 EB = 10⁹ GB).

Energy per visit [E]:

Energy consumed by first-time visitors = data transfer (GB) x 0.81 kWh/GB x 0.75

Energy consumed by returning visitors = data transfer (GB) x 0.81 kWh/GB x 0.25 x 0.02

Here 0.75 and 0.25 are the assumed shares of first-time and returning visitors, and 0.02 reflects the assumption that returning visitors download only about 2% of the data, the rest being served from cache.

Therefore, E = energy consumed by first-time visitors + energy consumed by returning visitors

Emissions of CO2 per visit [C]:

C = E x 442 g/kWh (or alternative/region-specific carbon factor)

Annual energy in kWh (AE):

AE = E x Monthly Visitors x 12

Annual emissions in grams CO2e (AC):

AC = C x Monthly Visitors x 12

Source for this calculation: sustainablewebdesign.org/calculating-digita..
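
To make the arithmetic concrete, here is a minimal TypeScript sketch of the calculation above. The constants and the 75/25 visitor split come from the model described at that source; the sample page weight and visitor count are made-up inputs.

```ts
// Constants from the sustainable web design model described above.
const KWH_PER_GB = 0.81; // annual internet energy / annual end-user traffic
const GRID_CARBON = 442; // g CO2e per kWh, global grid average

function emissionsPerVisit(pageWeightGb: number): { energyKwh: number; gramsCo2: number } {
  const firstTime = pageWeightGb * KWH_PER_GB * 0.75;        // 75% new visitors, full download
  const returning = pageWeightGb * KWH_PER_GB * 0.25 * 0.02; // 25% returning, ~2% downloaded
  const energyKwh = firstTime + returning;
  return { energyKwh, gramsCo2: energyKwh * GRID_CARBON };
}

// Example inputs: a 2 MB page with 10,000 monthly visitors.
const monthlyVisitors = 10_000;
const { energyKwh, gramsCo2 } = emissionsPerVisit(2 / 1024);

console.log(`Annual energy (AE): ${(energyKwh * monthlyVisitors * 12).toFixed(2)} kWh`);
console.log(`Annual emissions (AC): ${(gramsCo2 * monthlyVisitors * 12).toFixed(0)} g CO2e`);
```
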

end tl;dr

How to reduce your website or application's carbon footprint

It is evident from the above computations that the volume of data transferred to the browser is the only variable that directly influences your website's carbon footprint.

That volume includes CSS, JS, and fonts, as well as dynamic data interactions like form posts or REST API requests.

There are a few improvements you can make to your website. Conveniently, all of these factors improve your website's performance too.

Code level improvements

Remove unnecessary libraries:

  • In modern web development, libraries for making REST API calls are redundant: axios, unfetch, and $.ajax can all be dropped, as every modern browser ships with the fetch API (see the sketch after this list).
  • MomentJS is a tremendously heavy library and does not support tree shaking, so your bundle is bloated with all of MomentJS's features even if you don't need them. My recommendation is date-fns, which supports tree shaking. Refer here
  • Redux is often used in modern web development with React. With the advent of the Context API in 2018, you frequently don't need Redux any more. And when you remove Redux, make sure you also remove redux-saga, redux-thunk, and other "side-effect" middlewares, as useEffect handles side effects in React perfectly well.
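
As a quick sketch of the first two points: a native fetch call covers most of what axios is typically used for, and named imports from date-fns keep the date-handling bundle small. The `/api/users` endpoint and the `User` type below are hypothetical.

```ts
// Tree-shakeable date handling: import only the functions you use.
import { formatDistanceToNow } from "date-fns";

interface User { id: string; name: string }

// Native fetch instead of axios/unfetch; "/api/users" is a made-up endpoint.
async function getUsers(): Promise<User[]> {
  const res = await fetch("/api/users", { headers: { Accept: "application/json" } });
  if (!res.ok) throw new Error(`Request failed: ${res.status}`); // fetch does not reject on HTTP errors
  return res.json();
}

const postedAgo = formatDistanceToNow(new Date("2022-08-01")); // e.g. "about 2 months"
```
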

Image Optimization:

If your website uses images, consider optimizing them. A few ways to do that:

  1. Resize. If you're showing an image at 640x480 px, serve a 640x480 image; don't ship a heavier 14-megapixel original and scale it down in the browser.
  2. Format. WebP is often a more efficient format than JPEG.
  3. Image optimizers. If you're using Gatsby or Next.js, gatsby-plugin-image or next/image automatically optimizes image delivery (see the sketch below).

There's a good set of advice available here: web.dev/fast/#optimize-your-images
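
A minimal next/image sketch, assuming a Next.js project; the file name and dimensions are examples:

```tsx
import Image from "next/image";
import hero from "../public/hero.jpg"; // statically imported so Next.js can analyse it at build time

export function Hero() {
  return (
    // Next.js serves resized, WebP/AVIF-encoded variants where the browser supports them.
    <Image src={hero} alt="Hero banner" width={640} height={480} />
  );
}
```
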

Heavy Fonts:

Custom fonts can be heavy, so don't use too many of them.

  1. Use WOFF or WOFF2 formats.
  2. Google Fonts allows you to include only the weights you need (see the sketch below).
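
A quick sketch of point 2, assuming a Next.js app using next/head; the font family and weights here are just examples:

```tsx
import Head from "next/head";

// Request only the weights actually used (400 and 700 here); Google Fonts
// then serves compact WOFF2 files to modern browsers.
export function Fonts() {
  return (
    <Head>
      <link
        rel="stylesheet"
        href="https://fonts.googleapis.com/css2?family=Inter:wght@400;700&display=swap"
      />
    </Head>
  );
}
```
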

Heavy Page-level CSS:

The CSS that loads at the top of your page is often the biggest performance deterrent.

  1. Avoid CSS libraries that bundle all their classes together. Remove unused CSS, and chunk CSS according to routes in your application.
  2. Use AMP, which mandates that a page's CSS stay under 50 KB.
  3. Use an efficient component library if you're using React.
  4. I prefer styled-components or similar CSS-in-JS libraries; they ensure efficient CSS bundling and minimal global or page-level CSS (see the sketch below).
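
A small illustration of point 4 with styled-components: the CSS lives next to the component, so it is bundled (and code-split) only with the components that actually render it. The component names are hypothetical.

```tsx
import styled from "styled-components";

// This button's CSS ships only in the chunk that contains this component,
// rather than in a global stylesheet loaded on every page.
const Button = styled.button`
  padding: 0.5rem 1rem;
  border-radius: 4px;
`;

export const SaveButton = () => <Button type="submit">Save</Button>;
```
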

JS Bundling and Code Chunking

Minimising JS code and an appropriate chunking mechanism ensure that only the relevant JS bundle loads on a page. The good news is that almost all modern JS web development frameworks do this intelligently. If you're using create-react-app, you can influence code chunking with React.lazy() and Suspense, as sketched below.
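
For instance, a route component can be split into its own chunk like this ("Settings" is a hypothetical component):

```tsx
import React, { lazy, Suspense } from "react";

// "Settings" gets its own chunk and is downloaded only when first rendered.
const Settings = lazy(() => import("./Settings"));

export function App() {
  return (
    <Suspense fallback={<p>Loading…</p>}>
      <Settings />
    </Suspense>
  );
}
```
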

Headers and Cookies

Cookies are often unavoidable. However, developers in many cases store data in cookies. This should be avoided: cookies travel in request headers and add to the network load. Similarly, custom headers should be discouraged, as they also add to the network load.
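
A small sketch of the idea, keeping client-only state in Web Storage rather than cookies; the keys and values are hypothetical:

```ts
// Client-only preferences: localStorage never leaves the browser.
const prefs = { theme: "dark", rowsPerPage: 25 };
localStorage.setItem("uiPrefs", JSON.stringify(prefs));

// Cookies, by contrast, ride along in the headers of every matching request,
// so keep them down to what the server genuinely needs (e.g. a session id,
// ideally set server-side with HttpOnly).
document.cookie = "sessionId=abc123; Secure; SameSite=Lax";
```
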

Infra Level Optimizations:

Static Generation

  • With static HTML, you need less JavaScript.
  • Avoid Angular or create-react-app for applications that are not dynamic. If you're building a blog or personal website, go with static generation via Gatsby or Next.js.
  • Use static hosting to serve your content, like S3/CloudFront. This is cost-effective, as no compute happens in the backend, and with CloudFront's edge-location caching your content reaches the browser faster.
  • Next.js with hybrid rendering can be very useful for dynamic sites as well, since the first render happens on the server, so less data is transferred compared to a completely client-side-rendered application. However, this involves running servers or executing code in the backend, so it might be costly (see the sketch below).
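
As a sketch of static generation with Next.js (pages router), a page can be pre-rendered at build time and periodically revalidated; the posts endpoint and types here are hypothetical:

```tsx
interface Post { id: string; title: string }

// Runs at build time; the page is re-generated at most once an hour.
export async function getStaticProps() {
  const posts: Post[] = await fetch("https://example.com/api/posts").then((r) => r.json());
  return { props: { posts }, revalidate: 3600 };
}

export default function Blog({ posts }: { posts: Post[] }) {
  return (
    <ul>
      {posts.map((p) => (
        <li key={p.id}>{p.title}</li>
      ))}
    </ul>
  );
}
```
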

Caching Headers

Appropriate caching headers ensure that browsers don't make repeated round trips to the server for the same content. Set a suitable cache longevity so that content, once served, stays cached in the browser. This is particularly useful for returning visitors to your site.
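
As a sketch, assuming an Express server serving fingerprinted build assets (the paths are examples):

```ts
import express from "express";

const app = express();

// Fingerprinted assets (e.g. main.3f2a1b.js) can be cached for a year and
// marked immutable: a new filename is generated whenever the content changes.
app.use(
  "/static",
  express.static("build/static", { maxAge: "1y", immutable: true })
);

app.listen(3000);
```
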

Compression

Compression headers ensure that data is compressed when served to browsers. gzip is the most common option, and all API gateways and content delivery networks support it. Brotli is a better option, although it is not as widely adopted yet.

On the AWS stack, S3 and CloudFront, combined with Brotli and Cache-Control headers, make a very powerful content delivery mechanism.
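
A minimal sketch using the compression middleware for Express; CDNs such as CloudFront can apply gzip or Brotli at the edge instead:

```ts
import express from "express";
import compression from "compression";

const app = express();
app.use(compression()); // gzip responses above the middleware's size threshold
app.use(express.static("build"));
app.listen(3000);
```
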

Design Level

UX and design play a crucial role in reducing network load. There are a few areas where design alone can reduce network load.

Designing for mobile first

If you're building a public website, the design must support mobile phones and other handheld devices, where screen real estate is limited. A lot of data transfer can therefore be optimized away for such devices. For example:

  • Image sizes can be optimized for smaller devices.
  • Long text can be truncated with an ellipsis (...) or a Read More link; the full content is fetched from the server only when the user asks for it.
  • For tables with many rows, data can be fetched progressively as the user scrolls.
  • For sections below the visible area, images and UI can be fetched progressively using intersection observers as the user scrolls there (see the sketch after this list).
  • Tables with many columns can be restructured to show only the relevant columns in mobile views.
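
A sketch of the intersection-observer approach from the fourth bullet, assuming images are marked up with a data-src placeholder (that attribute name is a convention, not an API):

```ts
// Start downloading an image only when it approaches the viewport.
const io = new IntersectionObserver(
  (entries) => {
    for (const entry of entries) {
      if (!entry.isIntersecting) continue;
      const img = entry.target as HTMLImageElement;
      img.src = img.dataset.src!; // swap the placeholder for the real source
      io.unobserve(img);
    }
  },
  { rootMargin: "200px" } // prefetch slightly before the image scrolls into view
);

document.querySelectorAll<HTMLImageElement>("img[data-src]").forEach((img) => io.observe(img));
```

For simple cases, the native loading="lazy" attribute on img achieves much of the same effect without any script.
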

Dark theme

Supporting dark mode in the UI may not affect network load, but it can significantly reduce the power consumption of modern OLED-based mobile devices. source
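
A small sketch for honouring the user's system preference (the data-theme attribute is an assumed convention your CSS would key off):

```ts
// Follow the OS-level colour-scheme preference and react to changes.
const query = window.matchMedia("(prefers-color-scheme: dark)");

function applyTheme(dark: boolean) {
  document.documentElement.dataset.theme = dark ? "dark" : "light";
}

applyTheme(query.matches);
query.addEventListener("change", (e) => applyTheme(e.matches));
```
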

Data-heavy web apps

  • When rendering a table with many rows, add suitable sorting or pagination to the UI instead of fetching and rendering the whole data set at once (see the sketch below).
  • In some cases a filter or typeahead search lets you fetch just the relevant data instead of the whole data set.
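
A tiny sketch of paginated fetching; the endpoint and query-parameter names are hypothetical:

```ts
// Fetch one page of rows at a time instead of the entire data set.
async function fetchRows(page: number, pageSize = 50) {
  const res = await fetch(`/api/rows?page=${page}&pageSize=${pageSize}`);
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}
```
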

Digital Marketing Junk

For e-commerce applications, a lot of digital marketing junk is loaded into UIs through tag-management channels, and users often have no control over it. This not only adds to network traffic but is often a privacy risk as well. Some of this data capture is genuinely useful for supporting users, but the website provider should minimize it and make every attempt to reduce the network load.

Conclusion

It's true that attempting to reduce your carbon footprint also improves your web application's performance to a great extent. However, there are a few scenarios where network load and performance may not go hand in hand; a pragmatic approach is necessary in such situations.
