Comment

Morris de Oryx

An in-memory cache *should* be faster, and such tools absolutely have a place. Under the right conditions, a tool such as Redis can make an unusable system great. It would be cool if Postgres added in-memory tables in v13 or v14.

In this case, unless the work is compounded in a loop, even the slowest times are *imperceptible* to a human being:

https://www.nngroup.com/articles/response-times-3-important-limits/

Put another way, some of your results are "20x faster" to a computer but imperceptibly different to a person; both the slow and the fast versions sit well under the ~0.1 second threshold that article cites for a response to feel instantaneous.

Replies

Peter Bengtsson

The numbers add up, and it's nice to eliminate slow pieces when the application actually does a little more than just that one call.

Denique De Nique

The problem with your response is that human perception isn't the only factor. Another thing to consider is the cost, at scale, of grinding away for 10x or 20x the time. A lot of infrastructure, particularly cloud infrastructure, is sensitive to this, so cutting the computation by 20x can save a considerable amount of money even when it isn't happening in a loop.
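
To put rough numbers on that, here is a back-of-the-envelope sketch in Python. The traffic volume, per-call timings, and compute price are all assumed, illustrative figures, not measurements from the post above:

```python
# Back-of-the-envelope: per-call savings that are imperceptible to a user
# can still add up to real CPU time (and money) at scale.
# All numbers below are hypothetical, not taken from the benchmark.

requests_per_day = 100_000_000       # assumed traffic
slow_call_s = 0.010                  # assumed 10 ms per uncached lookup
fast_call_s = slow_call_s / 20       # the "20x faster" cached lookup, 0.5 ms
cost_per_cpu_hour = 0.10             # assumed cloud price in dollars

saved_seconds = requests_per_day * (slow_call_s - fast_call_s)
saved_cpu_hours = saved_seconds / 3600
saved_dollars = saved_cpu_hours * cost_per_cpu_hour

print(f"CPU time saved per day: {saved_cpu_hours:.0f} hours")
print(f"Rough compute cost saved per day: ${saved_dollars:.2f}")
```

With those made-up inputs the per-request difference is invisible to any single user, yet it still works out to hundreds of CPU-hours per day.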