An experimental study of TCP/IP's Van Jacobson header compression behavior in a lossy space environment

Some work has studied the behavior of TCP/IP's Van Jacobson header compression (VJHC) over channels with high bit-error rates (BER) and long delays, such as those in the space environment. However, most studies have been limited to discussing the definitions and methodologies of the scheme; experimental results supporting these discussions are scarce. This paper presents an experimental examination of VJHC behavior in a lossy GEO-satellite environment, simulated using a test-bed. The experimental results show that VJHC causes go-back-n retransmission behavior: when a single packet is corrupted or lost, all packets arriving after it, up to one bandwidth-delay product's worth, are also lost and must be retransmitted. This causes significant performance degradation in an environment with a very high BER (around 10^-5) and long link delay. In comparison, when VJHC is disabled, only the corrupted or lost packets themselves are retransmitted, avoiding unnecessary retransmissions. This suggests that TCP/IP's VJHC is better disabled over very lossy channels.
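The scale of the go-back-n penalty can be illustrated with a back-of-the-envelope calculation: the number of packets in flight over one bandwidth-delay product is the number that may need retransmission after a single loss desynchronizes the compressor state. The sketch below uses hypothetical link parameters (the 2 Mbit/s rate, 550 ms GEO round-trip time, and 1460-byte segment size are illustrative assumptions, not figures from the paper).

```python
# Hedged sketch: estimate how many packets follow a single corrupted
# packet within one bandwidth-delay product (BDP) on a GEO link.
# With VJHC's go-back-n behavior, roughly this many packets are lost
# and retransmitted per error; with VJHC disabled, only one is.

LINK_RATE_BPS = 2_000_000   # assumed 2 Mbit/s satellite channel
RTT_S = 0.55                # assumed ~550 ms GEO round-trip time
SEGMENT_BYTES = 1460        # typical TCP maximum segment size

def packets_per_bdp(rate_bps: float, rtt_s: float, seg_bytes: int) -> int:
    """Number of full segments 'in flight' over one BDP."""
    bdp_bytes = rate_bps / 8 * rtt_s   # bits/s -> bytes over one RTT
    return int(bdp_bytes // seg_bytes)

if __name__ == "__main__":
    n = packets_per_bdp(LINK_RATE_BPS, RTT_S, SEGMENT_BYTES)
    print(f"~{n} packets per BDP retransmitted after one error")
```

Under these assumed parameters, a single bit error can trigger on the order of a hundred retransmissions, which is consistent with the degradation the experiments observe at a BER near 10^-5.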