2.2 The Web and HTTP part 2

Understanding HTTP Performance Enhancements

Introduction to HTTP Basics

  • The study begins with a review of HTTP basics, including types of connections (persistent and non-persistent), message types (requests and responses), and the role of cookies.

Techniques for Improving User-Perceived Performance

  • Focus shifts to techniques that enhance user-perceived performance, specifically reducing latency through web caching and conditional GET requests.

Web Caching Explained

  • Web caching is introduced as a powerful method to improve performance by reducing load on origin servers. Users configure browsers to point to local caches.
  • When a browser requests an object, it first checks the cache. If found, the cache serves it directly; if not, it retrieves from the origin server before caching it for future requests.
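The hit/miss flow above can be sketched in a few lines, assuming a simple in-memory cache keyed by URL (the names `cache`, `handle_request`, and `fetch_from_origin` are illustrative, not from the lecture):

```python
# Minimal web-cache sketch: serve a hit directly; otherwise fetch from
# the origin server, store the object, and serve it.

cache = {}  # url -> object bytes (hypothetical in-memory store)

def handle_request(url, fetch_from_origin):
    """Return the object for `url`, consulting the cache first."""
    if url in cache:
        return cache[url]          # hit: no origin-server round trip
    body = fetch_from_origin(url)  # miss: the cache acts as a client
    cache[url] = body              # store for future requests
    return body
```

On a miss, the cache plays both roles the notes describe: server to the browser, client to the origin.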

Cache Functionality in Action

  • An animation illustrates how a client request is processed: if an object is cached, it's served immediately without involving the origin server.
  • The cache acts both as a server for clients and as a client when fetching data from the origin server.

Caching Behavior Control

  • Origin servers can dictate caching behavior via response headers like Cache-Control, specifying maximum cache duration or prohibiting caching altogether.
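A cache honoring these directives might parse the header roughly as follows (a simplified sketch; real caches implement many more Cache-Control directives and edge cases):

```python
# Simplified Cache-Control handling: "no-store" prohibits caching,
# while "max-age=N" bounds how long a cached copy may be reused.

def is_cacheable(cache_control):
    directives = [d.strip() for d in cache_control.lower().split(",")]
    return "no-store" not in directives

def max_age_seconds(cache_control):
    for d in cache_control.lower().split(","):
        d = d.strip()
        if d.startswith("max-age="):
            return int(d.split("=", 1)[1])
    return None  # no explicit freshness lifetime given
```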

Benefits of Web Caching

  • Key benefits include reduced response times due to proximity of caches and decreased traffic on institutional access links since fewer requests reach the origin server.

Quantifying Performance Improvements

  • A scenario outlines network conditions: an access link speed of 1.544 Mbps with high utilization (0.97), leading to potential queuing delays at this bottleneck.

Access Link Utilization Analysis

  • The average incoming data rate closely matches access link capacity, indicating high utilization which can lead to significant delays in processing requests.

Delay Components Breakdown

  • End-to-end delay consists of internet delay (2 seconds), queuing delays at the access link (high due to utilization), and minimal LAN transmission delays.
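The utilization figure can be checked directly; the 1.5 Mbps arrival rate is the average request traffic used in the lecture's scenario:

```python
# Access-link utilization in the no-cache scenario.
access_link_mbps = 1.544   # access link capacity
arrival_rate_mbps = 1.50   # average request traffic from the institution
utilization = arrival_rate_mbps / access_link_mbps  # ≈ 0.97

# At utilization this close to 1, queuing delay at the access link
# dominates, on top of the ~2 s internet delay.
```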

Solutions for Improved User Performance

Web Caching and HTTP Performance Enhancements

Understanding Web Caching Benefits

  • The discussion begins with the quantification of benefits from web caching, estimating link utilization and end-to-end delay. It is assumed that 40% of requests are served by the cache within an institutional network.
  • Requests served by the cache experience minimal delays (milliseconds or less), as data travels over a local gigabit link, while 60% still require access to the origin server.
  • The access link remains 1.544 Mbps and clients still request data at about 1.5 Mbps on average, but only 60% of that traffic now crosses the link to the public internet, cutting access link utilization to roughly 0.58.
  • The average end-to-end delay is calculated: 60% of page loads incur a two-second delay from origin servers plus minor queuing delays, while 40% benefit from millisecond delays at the cache.
  • The overall average end-to-end delay is approximately 1.2 seconds, demonstrating that web caching not only costs less than upgrading link speed but also significantly reduces user page load times.
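These estimates reduce to a few lines of arithmetic (the 2.01 s miss delay is an assumption: the 2 s internet delay plus a small queuing delay):

```python
# With-cache estimates: 40% of requests hit the local cache.
hit_rate = 0.40
miss_rate = 1.0 - hit_rate
utilization_no_cache = 0.97
utilization_with_cache = miss_rate * utilization_no_cache  # ≈ 0.58

miss_delay_s = 2.01   # ~2 s internet delay + minor queuing (assumed)
hit_delay_s = 0.0     # milliseconds on the gigabit LAN, negligible
avg_delay_s = miss_rate * miss_delay_s + hit_rate * hit_delay_s  # ≈ 1.2 s
```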

Conditional GET and Client-Side Caching

  • A second form of caching involves using the client's own computer and browser to avoid unnecessary data transmission if an up-to-date copy exists locally.
  • Clients include an "If-Modified-Since" header in HTTP requests to check for updates to content they already hold; this prevents redundant transmission of unchanged objects.
  • If the content is unchanged, the server responds with a "304 Not Modified" message and an empty body; if it has been updated, the server sends the new content with a "200 OK" response.
  • Both web caching and conditional GET improve user-perceived performance by reducing latency and conserving network resources—an effective dual advantage.
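The server side of a conditional GET can be sketched as follows (plain integer timestamps stand in for HTTP dates; `conditional_get` is an illustrative name):

```python
# Conditional GET, server side: reply 304 with an empty body when the
# client's copy is current, else 200 with the fresh object.

def conditional_get(if_modified_since, last_modified, body):
    if if_modified_since is not None and last_modified <= if_modified_since:
        return 304, b""   # Not Modified: client reuses its local copy
    return 200, body      # fresh content with 200 OK
```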

Advancements in HTTP Protocol

Transitioning to HTTP/2

  • A newer version of the protocol, HTTP/2, aims to enhance user experience further by decreasing the delays associated with multi-object requests.
  • Key changes in HTTP/2 include client-specified object priority for transmission order and server capabilities to push unrequested objects proactively to clients.

Addressing Head-of-Line Blocking

  • Large objects can be divided into frames in HTTP/2, allowing interleaved frame transmissions that mitigate head-of-line blocking—a common issue where smaller objects wait for larger ones to finish transmitting first.
  • An analogy: in a supermarket checkout line, small purchases are delayed behind large carts; interleaving is like ringing up the small purchases between items of the large cart, so they finish far sooner while the large cart is delayed only slightly.
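A toy scheduler makes the effect concrete. Object sizes are measured in frames and one frame is sent per time slot; all numbers are illustrative:

```python
# Compare first-come-first-served transmission with HTTP/2-style
# round-robin frame interleaving, returning each object's finish slot.

def finish_times(sizes_in_frames, interleave):
    remaining = list(sizes_in_frames)
    done = [None] * len(remaining)
    t = 0
    while any(r > 0 for r in remaining):
        if interleave:
            # one frame of every unfinished object per round
            order = [i for i, r in enumerate(remaining) if r > 0]
        else:
            # one frame of the first unfinished object (FCFS)
            order = [next(i for i, r in enumerate(remaining) if r > 0)]
        for i in order:
            t += 1
            remaining[i] -= 1
            if remaining[i] == 0:
                done[i] = t
    return done

# One large object (8 frames) ahead of two small ones (1 frame each):
# FCFS finishes them at slots [8, 9, 10]; interleaving at [10, 2, 3].
```

Interleaving delays the large object by only two slots while the small objects finish almost immediately, which is exactly the checkout-line intuition.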

Future Developments: HTTP/3

  • HTTP/2 mitigates head-of-line blocking at the application layer, but it still runs over a single TCP connection, so a lost packet can stall all of its streams, and TCP itself provides no built-in security. These remaining issues are addressed in HTTP/3, expected in 2021.

Conclusion on Web Performance Enhancements

Video description

Video presentation: Computer Networks and the Internet, 2.2 The Web and HTTP (part 2): web caches, conditional GET, HTTP/2. Computer networks class, Jim Kurose. Textbook reading: Section 2.2, Computer Networking: A Top-Down Approach (8th edition), J.F. Kurose, K.W. Ross, Pearson, 2020. See http://gaia.cs.umass.edu/kurose_ross for more open student resources.