The error exponent with delay for lossless source coding

In channel coding, reliable communication takes place at rates below capacity at the fundamental cost of end-to-end delay. Error exponents tell us how much faster the probability of error converges to zero when we settle for less rate. For lossless source coding, entropy takes the place of capacity, and error exponents tell us how much faster the error probability converges when we use more rate. While in channel coding without feedback the block error exponent is a good proxy for studying the more fundamental tradeoff with fixed end-to-end delay, it is not so in source coding. Block-coding error exponents are quite conservative (despite being tight!) when it comes to the tradeoff with delay. Nonblock codes can achieve much better performance with fixed delay, and we present both the fundamental bound and how to achieve it in a delay-universal manner. The proof gives substance to Shannon's cryptic statement about how the duality between source and channel coding is like the duality between the past and the future.
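
For concreteness, recall the standard block-coding benchmark (the notation here is illustrative and not necessarily that used later): for an iid source with marginal $p$ compressed at a fixed rate $R > H(p)$, the best block codes of length $n$ have probability of error decaying roughly as $2^{-n E_b(R)}$ (rates and entropies in bits), where
\[
E_b(R) \;=\; \min_{q \,:\, H(q) \ge R} D(q \,\|\, p),
\]
so that spending more rate above the entropy buys a larger exponent. The point of what follows is that when the relevant parameter is the end-to-end delay rather than the block length, this block exponent is the wrong benchmark: nonblock codes achieve a better exponent with respect to fixed delay.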