Users — and publishers — were complaining about news sites’ slow load times long before initiatives like Facebook’s Instant Articles or Google’s Accelerated Mobile Pages came along with plans to drastically cut pageload times.
“Mobile web performance is bad — I challenge you to find someone who disagrees with that,” Mic’s chief strategy officer Cory Haik told me last month when we chatted about the official rollout of Google AMP. “When our pages load too slowly on mobile, as a publisher, we’re losing an audience, and that is painful.”
Researchers at MIT’s Computer Science and Artificial Intelligence Laboratory have designed a new system, Polaris, aimed at speeding up those pageloads.
As the researchers explain, loading a webpage requires a browser to resolve a “dependency graph”; this partial ordering constrains the sequence in which a browser can process individual objects. Unfortunately, many edges in a page’s dependency graph are unobservable by today’s browsers. To avoid violating these hidden dependencies, browsers make conservative assumptions about which objects to process next, leaving the network and CPU underutilized.
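To see what “resolving a dependency graph” means in practice, here is a minimal sketch (not Polaris itself) using Kahn’s topological sort over a toy page. The file names and edges are hypothetical; the point is that a child object, like a JSON blob fetched by a script, cannot be processed until everything it depends on has been handled:

```python
from collections import defaultdict, deque

def load_order(edges, objects):
    """Return one valid processing order via Kahn's topological sort.

    edges: (parent, child) pairs meaning child depends on parent.
    objects: all objects on the page.
    """
    indegree = {obj: 0 for obj in objects}
    children = defaultdict(list)
    for parent, child in edges:
        children[parent].append(child)
        indegree[child] += 1

    # Objects with no unmet dependencies can be processed immediately.
    ready = deque(obj for obj in objects if indegree[obj] == 0)
    order = []
    while ready:
        obj = ready.popleft()
        order.append(obj)
        for child in children[obj]:
            indegree[child] -= 1
            if indegree[child] == 0:
                ready.append(child)
    return order

# Hypothetical page: the HTML references a script and a stylesheet;
# the script, once run, fetches a JSON blob — a "hidden" dependency
# a browser only discovers after executing the script.
objects = ["index.html", "app.js", "style.css", "data.json"]
edges = [("index.html", "app.js"),
         ("index.html", "style.css"),
         ("app.js", "data.json")]
print(load_order(edges, objects))
# → ['index.html', 'app.js', 'style.css', 'data.json']
```

A browser that cannot see the `app.js → data.json` edge must either guess or wait until the script runs before fetching `data.json`; Polaris’s contribution is measuring those hidden edges ahead of time so the scheduler works from the real graph.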
With a Polaris-enabled page, however, the system relies on a more accurate dependency graph to figure out a more efficient loading order, reducing the number of round trips over the network. This graph of three sites shows how Polaris reduces pageload times (average reductions relative to baseline loading times):
The code hasn’t been made available publicly yet, but the researchers are figuring out how best to release it for general use, according to a spokesperson.
You can read the full paper, co-authored by Ravi Netravali, Ameesh Goyal, James Mickens, and Hari Balakrishnan, here.