
It’s an interesting (albeit somewhat technical) read. In short, Facebook’s primary concerns were shortening network time (the time it takes for data to be transmitted between the user’s computer and Facebook) and render time (the time it takes the user’s web browser to process a response from Facebook and display the page). They managed to speed up the site primarily by reducing the number of cookies and cutting back on JavaScript.
Finally, they divided a typical Facebook page into parts (which they call pagelets), which can be loaded one after another (instead of waiting for the entire page to load). From the post:
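The pipelining idea can be sketched in a few lines: flush a page skeleton with empty placeholders right away, then stream each pagelet's content as its data becomes ready. This is a minimal illustration, not Facebook's actual implementation; the pagelet names, the `fill()` client-side function, and the timings are all hypothetical.

```python
import time

# Hypothetical pagelets mapped to simulated backend fetch times (seconds).
PAGELETS = {
    "newsfeed": 0.02,
    "suggestions": 0.01,
    "advertisement": 0.005,
}

def render_page_pipelined():
    """Yield HTML chunks as each pagelet becomes ready, instead of
    waiting for the entire page to be generated first."""
    # 1. Flush the skeleton immediately so the browser can start rendering.
    placeholders = "".join(f'<div id="{name}"></div>' for name in PAGELETS)
    yield f"<html><body>{placeholders}"
    # 2. Generate and flush each pagelet as soon as its data arrives;
    #    a (hypothetical) fill() script injects it into its placeholder.
    for name, fetch_time in PAGELETS.items():
        time.sleep(fetch_time)  # stand-in for a backend query
        yield f'<script>fill("{name}", "...content...");</script>'
    yield "</body></html>"

page = "".join(render_page_pipelined())
```

Because the browser receives (and can render) the skeleton before any pagelet is computed, the user sees partial content early, which is exactly the perceived-speed win the post describes.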
“Over the last few months we’ve implemented exactly this ability for Facebook pages. We call the whole system BigPipe and it allows us to break our web pages up in to logical blocks of content, called Pagelets, and pipeline the generation and render of these Pagelets. Looking at the home page, for example, think of the newsfeed as one Pagelet, the Suggestions box another, and the advertisement yet another. BigPipe not only reduces the TTI of our pages but also makes them seem even faster to users since seeing partial content earlier feels faster than seeing complete content a little bit later.”

While this is nice to know, it’s hard not to notice the recent user complaints that Facebook (

