Is there some sort of shortage of CPU time?
Why not allow me to see 500 or 1000 rows per page so I can scroll through the things you’re trying to sell me without needing to press “next” every 5 seconds?
Developers please, there’s no shortage of “rows”. Give us more.
Only thing worse than pagination is infinite scrolling.
For your question specifically, I think the following explains it well:
…locating a previously found item on an extremely long page is inefficient, especially if that item is placed many scrolling segments down. It’s much easier for people to remember that the item is on page 3 than it is to gauge where the item is positioned on an extremely long page.
Latency is not an issue, and neither is displaying complex information. You can pull and display 400 dummy items no problem.
Loading time for the first "page" is extra valuable: ideally you'd want it under 200 ms or so. So one trick is to load 3 items first, then 30 or so.
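A minimal sketch of that trick, assuming a hypothetical `fetchItems(offset, limit)` API (simulated here) standing in for a real backend call: paint a tiny first batch immediately, then top up the page in the background.

```javascript
// Hypothetical API stand-in: returns `limit` item ids starting at `offset`.
async function fetchItems(offset, limit) {
  return Array.from({ length: limit }, (_, i) => offset + i);
}

// Render a tiny first batch so the page paints fast, then fill to 30 items.
async function loadFirstPage(render) {
  const first = await fetchItems(0, 3);   // small request: something on screen ASAP
  render(first);
  const rest = await fetchItems(3, 27);   // then the rest of the first page
  render(rest);
}
```

The exact split (3, then 27) is just the commenter's example; the point is that the first request is small enough to come back well under the latency budget.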
Also you have to look at actual cost. Perhaps loading an extra 15 queries on the home screen costs $0.0004, but when you have 1m daily active home page users, that's an extra $400 per day. In unoptimized pieces of code, the cost could well be 20x higher.
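The back-of-envelope arithmetic from the comment above, written out (the per-load figure is the commenter's example, not a measured number):

```javascript
// Cost per extra page load × daily loads = daily cost.
const extraCostPerLoad = 0.0004;          // USD for ~15 extra queries (example figure)
const dailyLoads = 1_000_000;             // 1m daily active home page users
const dailyCost = extraCostPerLoad * dailyLoads;  // ≈ 400 USD/day
```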
If you have a very high average user value like Jira, that's fine. But for say, a free manga site or something like imdb, you want to shave off costs wherever possible.
This all changed with the advent of infinite scrolling enabled via the whole ajax revolution, but this was (is still? I haven’t written front end code in a decade or so) difficult to get right.
In sum, this happened because of technical difficulties on the front end side (browser limitations or code complexity), and probably just stuck as convention. Perhaps some nice analyst then A/B tested the arrangement and found that it was happily optimal.
The big sites have definitely A/B tested changing the number of results per page and have stuck with 10 for some user-metric-driven reason.
- Payload size: 1000 rows makes the payload huge which adds even more latency
- Frontend performance: More modest machines will struggle to render very large dynamic tables of rich content.
I have a table with 6K rows: 300 ms to insert into the page on paper (console.time(1); console.timeEnd(1)), but in reality the browser freezes for 3 seconds (1.8 s style, 900 ms layout, 300 ms update tree). The freezing goes away with position: absolute, but it still takes 3 seconds to show up after .appendChild. I tried replacing the table with flex divs; the speed was even worse.
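A common workaround for this kind of freeze (not something the commenter above tried, just a sketch of the usual approach) is windowing: keep all 6K rows in memory but only insert the rows near the viewport into the DOM. The core of it is a pure function from scroll position to a visible slice:

```javascript
// Windowing sketch: given scroll position, compute which rows to actually
// put in the DOM. `overscan` rows above/below the viewport smooth scrolling.
function visibleSlice(rows, scrollTop, rowHeight, viewportHeight, overscan = 5) {
  const first = Math.max(0, Math.floor(scrollTop / rowHeight) - overscan);
  const count = Math.ceil(viewportHeight / rowHeight) + 2 * overscan;
  return { first, rows: rows.slice(first, first + count) };
}
```

On scroll you re-run this and swap out the small set of rendered rows (offsetting them with absolute positioning or a spacer), so style/layout cost stays proportional to the viewport, not to the 6K-row dataset. This assumes fixed row heights; variable heights need more bookkeeping.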
The worst I've seen is in Bluehost's domain management pages: an unsorted list of domains with infinite scroll, where you have to scroll, scroll and scroll, hoping your domain will be in the next block that pops up (because, of course, they keep all your long-abandoned domains in the list). And they do it again in their DNS zone file editor: it only shows the first 10 or so lines of each section, so naturally you think lines are missing and try to re-enter them. The comparison with the plain old cPanel interface is mind-boggling. I have complained; they don't listen.
Progress!
I wonder how I can start a campaign to legally ban infinite scrolling.
Their philosophy is to show less initially, but as you page through, each additional page shows more results per page until you hit a maximum. For example, you could get 25 results on page 1, 50 on page 2, 100 on page 3, etc.
It's a happy medium. Keep your initial result minimal and focused but if a user wants more then keep giving them more in an efficient way (less clicking).
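One way to sketch that escalating scheme (the specific doubling-with-a-cap rule is my assumption; the comment only gives the 25/50/100 example):

```javascript
// Escalating page size: start small, double each page, cap at a maximum.
function pageSize(pageNumber, initial = 25, max = 100) {
  return Math.min(initial * 2 ** (pageNumber - 1), max);
}
// pageSize(1) → 25, pageSize(2) → 50, pageSize(3) → 100, pageSize(4) → 100
```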
I intentionally put all of my Elixir screencasts on a single page without much whitespace on Alchemist Camp (https://alchemist.camp/episodes). I can't think of a single employer who would have been cool with that design choice, but I frequently get emails and comments from people thanking me for it.
There is no meaningful implementation difference between providing 20/50/100 options and 20/50/100/500 options, with 20 as the default in either case.
I would be very surprised if this decision is even discussed for more than 1 minute on most projects, let alone if any user research is done. But I'm happy to be proved wrong!
Source: I spent a couple of years wrestling constantly with these sorts of problems in a React-based toolset we were developing.
One strategy I've used before to improve the user experience and help people miss Ctrl+F less is to load the full data set into memory, even if only a small slice of it is actually rendered at a time, and then do all paging/searching/sorting/filtering client-side. JavaScript has no issue handling six digits of items this way, and it keeps things really snappy.
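A minimal sketch of that strategy, assuming hypothetical item objects with a `name` field: everything stays in one in-memory array, and paging, filtering, and sorting are just array operations over it; only the returned slice ever gets rendered.

```javascript
// Client-side paging/filtering/sorting over the full in-memory dataset.
// `items` is the complete set; only the returned page gets rendered.
function queryItems(items, { filter = "", sortKey = "name", page = 1, perPage = 20 } = {}) {
  const needle = filter.toLowerCase();
  return items
    .filter(it => it.name.toLowerCase().includes(needle)) // case-insensitive match
    .sort((a, b) => String(a[sortKey]).localeCompare(String(b[sortKey])))
    .slice((page - 1) * perPage, page * perPage);          // just the current page
}
```

Since `filter` returns a fresh array, the `sort` never mutates the master dataset, so every interaction is a cheap recomputation rather than a round trip to the server.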
I hate how so many websites manage to break the back button.
I would not assume most users want a lot of choices. Perhaps choosing from a list of 5-10 options is much less intimidating than choosing from 100. If someone clicks next page, then maybe the calculus changes.
If we pull, say, 1000 rows in the foreground, you have to keep in mind when designing the UI that you have to wait for the data. E.g. you have to design some sort of indication that the data is not yet 100% loaded if the user tries to search within the page.
Another point (maybe; I have no experience in UI design) is that users favor a fast-loading page, even if it only has 10 rows.
Additionally, depending on how your backend is designed and your choice of ORM, sometimes requests can be slow. In those cases it probably makes sense to design a UI that encourages users to rely on search and filter controls vs giving them a large list.
So, a lot of the time, developers will just limit the page size in order to guarantee performance on the majority of devices...