User:Brooke Vibber/ResourceLoader and latency
This is an essay. It expresses the opinions and ideas of some mediawiki.org users, but may not have wide support. Feel free to update this page as needed, or use the discussion page to propose major changes.
I'm doing a lot of conference travel this month, which means I'm using slower internet more -- throttled 3G instead of my fast LTE at home, or even worse... satellite-based in-flight wifi. :) Latency is up, bandwidth is down, and I'm experiencing more of the pain of "the common people". :)
It occurs to me that some of how we've factored our JavaScript loading with ResourceLoader isn't ideally suited to these environments.
ResourceLoader was created in the pre-SPDY, pre-HTTP 2 era, when making multiple requests to the server was always expensive; concatenating multiple script modules into a single response was a smart optimization that reduced total latency by avoiding multiple connection openings and closings.
HTTP 1.1 supported 'pipelining' in theory -- asking for a second resource on a connection before the first response had been entirely received -- but browsers tended not to use it because it wasn't reliably implemented in servers and proxies.
HTTP 2 improves on this in several ways including:
- request headers are compressed, further reducing overhead of making multiple requests
- multiple streams can be multiplexed, so requesting a resource that's large/slow doesn't block the next thing you ask for on the 'pipeline'
If we could start relying on this, we could consider some changes to how ResourceLoader works that I think would benefit users on high-latency and slow connections.
One problem I'm seeing: currently, an RL bundle of modules must be _completely downloaded_ before any of its code executes.
For one thing, this means that if we have some modules we might or might not need at runtime, we have two choices:
- always load them with the main bundle
- load them in a second bundle, after the entire first bundle loads and we have a chance to do runtime checks
In the first case, we waste bandwidth and download time for the user if the module isn't needed after all.
In the second case, we add latency because by the time our code is executing, the first bundle has already completed its download -- we lose the chance for pipelining. If a lot of modules do this, that latency can add up.
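To make that second case concrete, here's a rough sketch of the two-step pattern with today's loader API; the module names and the feature check are hypothetical, but mw.loader.using() is the real entry point:

```javascript
// Hypothetical two-step load: the conditional check can only run after
// the *entire* first bundle has downloaded and executed.
mw.loader.using( 'ext.myfeature.core' ).then( function () {
	// Only now can we decide whether the extra module is needed,
	// which costs a second network round trip.
	if ( !window.SomeNativeFeature ) {
		return mw.loader.using( 'ext.myfeature.fallback' );
	}
} ).then( function () {
	// The feature is ready, with or without the fallback module.
} );
```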
A possibility is to stop concatenating so many modules into a single request. Fire off multiple requests, let them pipeline/multiplex, and let them execute as they come in.
(ResourceLoader already has tools to ensure that modules are executed in dependency order, so if one module comes in earlier than expected that's ok. But we would probably want to request them in dependency order as much as possible.)
Thus, a module that gets loaded early on in the bunch wouldn't have to wait for the rest to load before saying "oh hey, I'm on IE 11, I'm going to also need to load some IE-specific compatibility code!" ... it can then add its extra module onto the loading queue _while_ other modules are still loading.
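As a sketch of what that early check might look like (the module name and the user-agent test are invented for illustration; mw.loader.load() just queues a module for download):

```javascript
// Runs as soon as this module arrives, while sibling modules may still
// be downloading on the same connection.
if ( /Trident\/7\./.test( navigator.userAgent ) ) {
	// IE 11: queue the hypothetical compat module right away so its
	// request can multiplex alongside the ones already in flight.
	mw.loader.load( 'ext.myfeature.ie11-compat' );
}
```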
So this gives two latency gains:
- latency between requesting a bunch of modules, and each module executing, is reduced
- additionally requested modules don't always force another network-latency wait
The second might only be useful in some cases, though browser-compatibility shims seem like an obvious win when the shims are non-trivial. I'm thinking among others of the ogv.js video player bits. ;)
The first has additional possible wins when it comes to doing progressive enhancement on large pages.
There are three things we may have to wait on before a page is "working":
- main stylesheet has to load before we see anything
- some portion of HTML has to load before we see stuff, but at some point the browser starts doing partial layout...
- all of our main JS bundle has to load before it executes...
- ...and much of our JS ends up waiting to do anything until the HTML is done loading ('DOMReady' event)
- (then stuff like images that we mostly don't care about here)
So we have two major possibilities for JS that's loaded from the page's <head> section:
- the HTML finishes loading before the JS does
- the JS finishes loading before the HTML does
If the difference is small, nobody really notices. If the difference is large, there are unpleasant consequences:
In the first case, you have elements of the page that don't yet fully work while waiting for the JS to load. Possibly you have "flashes" of moving content when the JS finally loads and starts changing things, which is annoying to the user.
In the second case... you get much the same effect, because the JS that adds event handlers, rewrites UI bits, etc. doesn't run until the HTML _finishes_ loading -- yet the beginning and middle of a long page may already be displayed and interacted with by the user.
In both cases, we could probably be smarter about how we update things. When the HTML loads first, a more pipeline-friendly JS loader would allow for the various progressive enhancements to start trickling in as they become available, instead of waiting for the last bit of the last module.
When the HTML is large and finishes last, we could structure our JS so it can run periodically, adding enhancements to bits of HTML as they come in.
You'd want to make sure a module working in this way is idempotent -- that is, when it executes a second time, it doesn't mess up anything it already transformed the first time. :) Don't double up event handlers, for instance.
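A minimal sketch of such an idempotent pass, with invented class names and a data attribute used as the "already done" marker:

```javascript
// Enhance any collapsible sections currently in the DOM; safe to call
// repeatedly as more of the page streams in.
function enhanceCollapsibleSections() {
	document.querySelectorAll( '.my-collapsible' ).forEach( function ( el ) {
		// Skip nodes we handled on an earlier pass, so click handlers
		// never get bound twice.
		if ( el.dataset.myEnhanced ) {
			return;
		}
		el.dataset.myEnhanced = '1';
		el.addEventListener( 'click', function () {
			el.classList.toggle( 'my-collapsed' );
		} );
	} );
}

// Run once immediately, and again when the rest of the HTML has arrived.
enhanceCollapsibleSections();
document.addEventListener( 'DOMContentLoaded', enhanceCollapsibleSections );
```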
Many popular Wikipedia articles have gotten really big; around a megabyte of HTML is not uncommon for pages like w:Russia, w:United States, w:Hillary Clinton, w:Vladimir Putin, etc. These gzip down to 150-250 KiB, but that's still going to be slowish on a 2G or 3G network.
Loading modules separately can also be a big win for caching. Instead of invalidating and re-downloading the entire module set when any one item changes, we can let the local browser cache treat each RL module as a separate entity that can be replaced individually when needed.
This should also work better with Service Workers -- a system being added to modern browsers that essentially lets you run a JavaScript program as an in-browser, customizable caching proxy just for your site.
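As an illustration, a Service Worker acting as a cache-first proxy for module requests might look roughly like this; the cache name and the load.php URL test are assumptions, and real cache invalidation would need more care:

```javascript
// sw.js -- hypothetical sketch of a cache-first proxy for RL requests.
self.addEventListener( 'fetch', function ( event ) {
	var url = new URL( event.request.url );
	if ( url.pathname.indexOf( 'load.php' ) === -1 ) {
		// Not a ResourceLoader request; let it go to the network as usual.
		return;
	}
	event.respondWith(
		caches.open( 'rl-modules-v1' ).then( function ( cache ) {
			return cache.match( event.request ).then( function ( cached ) {
				if ( cached ) {
					return cached; // serve straight from the local cache
				}
				return fetch( event.request ).then( function ( response ) {
					cache.put( event.request, response.clone() );
					return response;
				} );
			} );
		} )
	);
} );
```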