
How Facebook Became Twice as Fast (But Still Not Fast Enough)

19 Feb

There’s an interesting post today at Facebook’s engineering blog detailing how Facebook’s engineers made the site twice as fast over roughly six months, from June 2009 to the beginning of 2010.

It’s an interesting (albeit somewhat technical) read. In short, Facebook’s primary concerns were shortening network time (the time it takes for data to be transmitted between the user’s computer and Facebook) and render time (the time it takes the user’s web browser to process a response from Facebook and display the page). They managed to speed up the site primarily by reducing the number of cookies and cutting back on JavaScript.
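To make the network/render distinction concrete, here’s a minimal sketch of how a page could measure that split for itself in the browser using the Navigation Timing API. This is just an illustration of the two metrics the post talks about, not Facebook’s own instrumentation:

```typescript
// Minimal sketch: splitting total page-load time into a "network"
// component and a "render" component via the Navigation Timing API.
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd has been populated.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation"
    ) as PerformanceNavigationTiming[];

    // Network time: from sending the request until the last byte
    // of the response arrives.
    const networkTime = nav.responseEnd - nav.requestStart;

    // Render time: what the browser spends after the response on
    // parsing, running scripts, layout, and painting.
    const renderTime = nav.loadEventEnd - nav.responseEnd;

    console.log(`network: ${networkTime.toFixed(0)} ms`);
    console.log(`render:  ${renderTime.toFixed(0)} ms`);
  }, 0);
});
```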

Finally, they divided a typical Facebook page into parts (which they call pagelets) that can be loaded one after another, instead of waiting for the entire page to load. From the post:

“Over the last few months we’ve implemented exactly this ability for Facebook pages. We call the whole system BigPipe and it allows us to break our web pages up into logical blocks of content, called Pagelets, and pipeline the generation and render of these Pagelets. Looking at the home page, for example, think of the newsfeed as one Pagelet, the Suggestions box another, and the advertisement yet another. BigPipe not only reduces the TTI of our pages but also makes them seem even faster to users since seeing partial content earlier feels faster than seeing complete content a little bit later.”
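(The TTI in the quote is Facebook’s shorthand for “time to interact,” the point at which the page becomes usable.) The core trick behind this is chunked HTTP output: the server flushes the page skeleton immediately, then streams each pagelet’s content the moment it’s ready. Here’s a minimal sketch of that pattern as a Node.js server; the pagelet names and delays are invented for illustration, and the real BigPipe (PHP on the server with a JavaScript arbiter on the client) is considerably more involved:

```typescript
import { createServer } from "node:http";

// Hypothetical pagelets: an id, a simulated generation delay, and the
// HTML fragment each produces. Names and timings are invented.
const pagelets = [
  { id: "newsfeed",    delayMs: 300, html: "<p>news feed items</p>" },
  { id: "suggestions", delayMs: 150, html: "<p>friend suggestions</p>" },
  { id: "ads",         delayMs: 50,  html: "<p>advertisement</p>" },
];

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

createServer(async (_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });

  // Flush the page skeleton right away: one empty placeholder per
  // pagelet, plus a tiny client-side helper that fills a placeholder
  // in whenever its content arrives.
  res.write(`<!doctype html><html><body>
    ${pagelets.map((p) => `<div id="${p.id}"></div>`).join("\n")}
    <script>
      function onPageletArrive(id, html) {
        document.getElementById(id).innerHTML = html;
      }
    </script>`);

  // Generate all pagelets concurrently and flush each the moment it
  // finishes, rather than waiting for the slowest one.
  await Promise.all(
    pagelets.map(async (p) => {
      await sleep(p.delayMs); // stand-in for real server-side work
      res.write(
        `<script>onPageletArrive(${JSON.stringify(p.id)}, ${JSON.stringify(p.html)})</script>`
      );
    })
  );

  res.end("</body></html>");
}).listen(8080);
```

Because the browser executes each script chunk as it streams in, the news feed can show up while the advertisement is still being generated server-side, which is exactly the “partial content earlier” effect the quote describes.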

While this is nice to know, it’s hard not to notice the recent user complaints that Facebook is slower than ever (it’s been that way for me, too). Of course, the Facebook experience differs from country to country, so it’s hard to say whether the slowdown is global, but one thing is certain: with Facebook’s user base growing the way it is, keeping the site fast enough will always be a challenge.



Tags: facebook, social media, social networking, trending, web development