Improving the performance of SPDY for mobile devices

SPDY [2] is an application-layer protocol developed to reduce web latency. It is estimated that about 3.6% of all websites use SPDY today [3], but that statistic hides how many web transactions use SPDY: it has rapidly gained adoption at Google, Facebook, Twitter, Yahoo, Reddit, and WordPress. While only a small fraction of all websites use SPDY today, SPDY-capable proxies are being deployed to provide the benefits of SPDY over the "last mile," while traffic between proxies and the origin webserver remains HTTP/1.x for the time being.

SPDY improves web performance through several mechanisms, including multiplexing multiple HTTP transfers over a single connection, header compression, HTTP request prioritization, and server push, which sends content to the client before it is requested. Push can help when the client is underpowered and is slowly processing JavaScript or app code, or is waiting for the user to interact with the displayed content. The relative benefits of multiplexing, header compression, and request prioritization have been measured and analyzed by several blogs and research papers [6, 1, 4]. However, the proactive pushing of HTTP content to the client has received relatively less scrutiny. The promise of push in SPDY is that, through machine learning on prior access patterns or otherwise, the server can intelligently push content to the client before the client realizes it is needed, thereby significantly reducing user-perceived latency.

Push is problematic for mobile devices because it can waste both battery and bandwidth. This happens either because the server blindly sends content that is already in the client's cache, or because it sends content that is never consumed by the user. A reasonable solution may be to simply turn off push when the mobile client is on a limited data plan or has low battery. This binary decision mitigates the worst-case scenario while retaining the powerful latency advantage of push only in the ideal scenario. We can do better.

We propose two basic mechanisms that the HTTP 2.0 standard should adopt to dynamically adjust the overall performance (speed, battery consumption, data consumption) of mobile clients:

• Cache hints – we propose a lightweight mechanism for the client to indicate what cached content it has, so that the server can modulate its intended set of push objects. When the client initiates the connection, it sends the server an array of Bloom filters with the first HTTP request. The Bloom filter at index n of that array represents the objects that expire in less than 2^n seconds; a sketch of this exchange is given below.

[Figure: Client – Proxy – Server]
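To make the cache-hint idea concrete, the following Python sketch shows one way the client could bucket its cached URLs into an array of Bloom filters and how a server or proxy could consult those hints before pushing an object. The BloomFilter class, the bucket count, and the function names (build_cache_hints, should_push) are illustrative assumptions rather than part of SPDY or any HTTP 2.0 draft, and the serialization of the filters into a header or frame is not shown.

import hashlib
import math
import time


class BloomFilter:
    """Minimal Bloom filter over URL strings (illustrative only)."""

    def __init__(self, size_bits=1024, num_hashes=4):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _indexes(self, item):
        # Derive num_hashes bit positions from salted SHA-256 digests of the item.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.size

    def add(self, item):
        for idx in self._indexes(item):
            self.bits[idx // 8] |= 1 << (idx % 8)

    def __contains__(self, item):
        return all(self.bits[idx // 8] & (1 << (idx % 8))
                   for idx in self._indexes(item))


def build_cache_hints(cache_entries, num_buckets=16):
    """Client side: cache_entries is an iterable of (url, expiry_unix_time)
    pairs from the local cache. Returns a list of Bloom filters in which the
    filter at index n holds the objects that expire in less than 2**n seconds
    from now."""
    now = time.time()
    filters = [BloomFilter() for _ in range(num_buckets)]
    for url, expiry in cache_entries:
        ttl = expiry - now
        if ttl <= 0:
            continue  # already stale; never advertise it to the server
        # Smallest n with ttl < 2**n, clamped to the available buckets.
        bucket = min(num_buckets - 1, max(0, math.floor(math.log2(ttl)) + 1))
        filters[bucket].add(url)
    return filters


def should_push(url, hints):
    """Server side: skip the push if any filter reports the URL, i.e. the
    client appears to hold a copy that has not yet expired."""
    return not any(url in f for f in hints)


# Example: the client advertises /logo.png, cached for another hour, so the
# server refrains from pushing it.
hints = build_cache_hints([("https://example.com/logo.png", time.time() + 3600)])
assert not should_push("https://example.com/logo.png", hints)

Because Bloom filters admit false positives but no false negatives, the cost of this scheme under these assumptions is an occasionally suppressed (but useful) push, never a redundant transfer of an object the client already holds.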