How quickly can we approach channel capacity?
Dror Baron, Mohammad Ali Khojastepour, Richard G. Baraniuk
DOI: 10.1109/ACSSC.2004.1399310   (Citations: 7)
Recent progress in code design has made it crucial to understand how quickly communication systems can approach their limits. To address this issue for the channel capacity C, we define the nonasymptotic capacity C_NA(n, ε) as the maximal rate of codebooks that achieve a probability ε of codeword error while using codewords of length n. We prove for the binary symmetric channel that C_NA(n, ε) = C − K(ε)/√n + o(1/√n), where K(ε) is available in closed form. We also describe similar results for the Gaussian channel. These results may lead to more efficient resource usage in practical communication systems.
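As an illustration of the C − K(ε)/√n expansion above, the sketch below evaluates a normal-approximation rate for a binary symmetric channel. It takes K(ε) = √V · Φ⁻¹(1−ε), with V the channel dispersion; this is an assumed form consistent with the central-limit-theorem flavor of the result, not the paper's exact closed-form K(ε), which should be consulted for the precise constant.

```python
import math
from statistics import NormalDist


def bsc_normal_approx_rate(p: float, n: int, eps: float) -> float:
    """Approximate the best achievable rate (bits/channel use) for a
    BSC with crossover probability p, blocklength n, and error
    probability eps, using the C - K(eps)/sqrt(n) expansion.

    Assumes K(eps) = sqrt(V) * Phi^{-1}(1 - eps), where V is the
    channel dispersion; an illustrative form, not the paper's
    closed-form constant.
    """
    # Capacity of the BSC in bits: C = 1 - h(p), h the binary entropy.
    h = -(p * math.log2(p) + (1 - p) * math.log2(1 - p))
    capacity = 1 - h
    # Channel dispersion of the BSC (bits^2 per channel use).
    dispersion = p * (1 - p) * (math.log2((1 - p) / p)) ** 2
    # Back off from capacity by K(eps)/sqrt(n).
    k_eps = math.sqrt(dispersion) * NormalDist().inv_cdf(1 - eps)
    return capacity - k_eps / math.sqrt(n)
```

For example, at p = 0.11 and ε = 10⁻³ the approximation sits noticeably below the capacity of about 0.5 bits at n = 1000 and closes the gap as n grows, matching the O(1/√n) convergence the abstract describes.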
    • ...We note that although Strassen’s result is classical, there is a renewed interest in his setup; see, e.g., [4], [5], [6], [7], [8], and references therein...

    Y. Altug et al. Moderate deviation analysis of channel coding: Discrete memoryless cas...

    • ...Proceeding heuristically from the reliability function in [7], the expansion in (16) was put forward in [6] with o(n^{1/2}), for the case of per-codeword power constraint...

    Yury Polyanskiy et al. Dispersion of Gaussian channels

    • ...We use Theorem 3 in [9] (see below) to determine the set of packet loss and physical layer transmission rates for packets with length L bits transmitted over the channels using optimal forward error mechanisms at the physical layer...

    Ramin Khalili et al. On the performance of Random Linear Network Coding in relay networks

    • ...To deal with input packets of finite length k, we adapt the following theorem from our recent work [5] (see also Wolfowitz [6] and references therein)...
    • ...Theorem 1: [5] For a binary symmetric channel with crossover probability p, there exists a constant Q1 such that, if the number n of channel uses in the first transmission satisfies...
    • ...Our previous results [5] show how to compute Q1 in closed form for the binary symmetric channel (BSC)...
    • ...(The latter expenditure is necessary to combat non-asymptotic effects [5, 9, 11].) Furthermore, because �� is monotone...

    Dror Baron et al. Coding vs. Packet Retransmission over Noisy Channels

    • ...Can we do better? We begin with a lower bound by Wolfowitz [2] (see also Baron et al. [8,9]) on the penalty for using finite length sequences in communication systems in which the statistics are known...
    • ...Theorem 2 [2,8,9] For known statistics and a fixed ε, the penalty for using finite length sequences in coding schemes that rely on joint typicality is Θ(√n) bits...
    • ...universality. The remaining terms account for the penalty in non-asymptotic channel coding [2,9]...
    • ...The Φ^{-1}(ε_i) comes from the Central Limit Theorem [2,9]; backing off ε...
    • ...also appears in refined versions of Theorem 2 [2,8,9], we have a tight order bound...

    Shriram Sarvotham et al. Variable-Rate Coding with Feedback for Universal Communication Systems
