Updated at 2021-03-10 13:19

The most common caching approaches are:

  • Client Cache: Clients cache the content. Can significantly reduce network load, but it is harder to force a refresh.
  • Proxy Cache: An extra component between client and server handles the caching, such as a reverse proxy or CDN (Squid, Varnish, AWS CloudFront). Transparent for both parties and adds minimal network overhead, if any, but requires managing an extra component.
  • Server Cache: Servers cache the content. Gives the server the most control, usually done with Redis or another in-memory database. The easiest to reason about and to invalidate.
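The server-side approach is commonly implemented as a cache-aside pattern: check the cache, fall back to the origin on a miss, and write the result back with a TTL. A minimal sketch, using a plain dict in place of Redis (the `fetch_user` loader, key format, and TTL are illustrative):

```python
import time

TTL_SECONDS = 60
cache = {}  # stands in for Redis; values are (expires_at, payload)

def fetch_user(user_id):
    # placeholder for a slow database or upstream call
    return {"id": user_id, "name": f"user-{user_id}"}

def get_user(user_id):
    key = f"user:{user_id}"
    entry = cache.get(key)
    if entry and entry[0] > time.time():
        return entry[1]                             # cache hit
    value = fetch_user(user_id)                     # cache miss: go to origin
    cache[key] = (time.time() + TTL_SECONDS, value) # write-back with TTL
    return value

def invalidate_user(user_id):
    # server caches are the easiest to invalidate: just delete the key
    cache.pop(f"user:{user_id}", None)
```

With real Redis the dict lookup becomes a `GET` and the write-back a `SET` with an `EX` expiry, but the control flow is the same.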

Choose one caching approach and stick to it. Having multiple layers of caching is hard to control.

HTTP has extensive caching controls; use them if possible: Cache-Control, Expires, ETag, If-None-Match, 304 Not Modified conditional GETs, etc.
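A conditional GET works by the server attaching an ETag to a response; the client echoes it back in If-None-Match, and if it still matches, the server answers 304 with no body. A hedged sketch of that handshake (the function names and hash-based ETag scheme are illustrative, not a specific framework's API):

```python
import hashlib

def make_etag(body: bytes) -> str:
    # a strong ETag derived from a content hash
    return '"' + hashlib.sha256(body).hexdigest()[:16] + '"'

def conditional_get(request_headers: dict, body: bytes):
    """Return (status, headers, body) for a cache-aware GET."""
    etag = make_etag(body)
    if request_headers.get("If-None-Match") == etag:
        # client's copy is still fresh: send 304 and no body
        return 304, {"ETag": etag}, b""
    return 200, {"ETag": etag, "Cache-Control": "max-age=60"}, body
```

The 304 path saves the body transfer entirely, which is most of the bandwidth win for large responses.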

Be careful with Expires: you can't control ISP or user browser caches, so once content is cached there is no way to push an update. You will have a bad time unless you can change the URL to force a refresh.
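Changing the URL to force a refresh is usually done by embedding a content hash in the path or query string, so a changed file automatically gets a new URL that bypasses every stale cached copy. A minimal sketch (the query-parameter name and hash length are arbitrary choices):

```python
import hashlib

def busted_url(path: str, content: bytes) -> str:
    # a short content hash as a version tag: same content, same URL;
    # changed content, new URL that no cache has seen before
    digest = hashlib.md5(content).hexdigest()[:8]
    return f"{path}?v={digest}"
```

Build pipelines typically generate these URLs at deploy time and serve the hashed assets with a very long Expires, since the URL itself guarantees freshness.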

Try to protect the origin in large-scale projects. The best way is to make cache writes asynchronous, so that client requests never hit the origin directly; a separate processing queue populates the cache instead. This is overengineering on small and medium projects, though.
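The idea above can be sketched as a read path that only ever touches the cache, with a single worker draining a refresh queue as the one component allowed to call the origin. A minimal in-process sketch, assuming a pre-warmed cache (the key names and `fetch_from_origin` stub are illustrative; a real setup would use a broker and a separate process):

```python
import queue
import threading

cache = {"greeting": "hello"}   # pre-warmed; client reads never miss to origin
refresh_queue = queue.Queue()

def read(key):
    # clients only ever touch the cache, never the origin
    return cache.get(key)

def request_refresh(key):
    # enqueue instead of fetching inline: the origin is shielded from
    # request spikes because only the worker talks to it
    refresh_queue.put(key)

def fetch_from_origin(key):
    # placeholder for the real (slow, fragile) origin call
    return f"fresh-{key}"

def worker():
    while True:
        key = refresh_queue.get()
        if key is None:                 # shutdown sentinel
            break
        cache[key] = fetch_from_origin(key)
        refresh_queue.task_done()
```

Even if the origin goes down, reads keep serving the last cached value; only freshness degrades.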