Keywords (4): Binary Symmetric Channel, Channel Capacity, Communication System, Gaussian Channel
How quickly can we approach channel capacity?
(Citations: 7)
Dror Baron, Mohammad Ali Khojastepour, Richard G. Baraniuk
Recent progress in code design has made it crucial to understand how quickly communication systems can approach their limits. To address this issue for the channel capacity C, we define the nonasymptotic capacity C_NA(n, ε) as the maximal rate of codebooks that achieve a probability ε of codeword error while using codewords of length n. We prove for the binary symmetric channel that C_NA(n, ε) = C − K(ε)/√n + o(1/√n), where K(ε) is available in closed form. We also describe similar results for the Gaussian channel. These results may lead to more efficient resource usage in practical communication systems.
Conference: Asilomar Conference on Signals, Systems & Computers (ASILOMAR), 2004
DOI: 10.1109/ACSSC.2004.1399310
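The K(ε)/√n penalty in the abstract has the same form as the standard normal (dispersion) approximation for the BSC. A minimal numerical sketch, assuming the textbook expressions C = 1 − h₂(p) and dispersion V = p(1−p)·log₂²((1−p)/p) so that K(ε) = √V · Q⁻¹(ε) (the paper's exact closed-form K(ε) may differ in its constants; this is not the authors' code):

```python
from statistics import NormalDist
import math

def h2(p):
    """Binary entropy function in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_rate_normal_approx(p, n, eps):
    """Normal-approximation achievable rate for a BSC(p) at blocklength n
    and codeword error probability eps:
        R(n, eps) ~ C - sqrt(V / n) * Q^{-1}(eps),
    i.e. capacity minus a K(eps)/sqrt(n) penalty."""
    C = 1 - h2(p)                                   # Shannon capacity of BSC(p)
    V = p * (1 - p) * math.log2((1 - p) / p) ** 2   # channel dispersion
    q_inv = NormalDist().inv_cdf(1 - eps)           # Q^{-1}(eps), inverse Gaussian tail
    return C - math.sqrt(V / n) * q_inv

# How quickly the rate approaches capacity as blocklength n grows:
for n in (100, 1000, 10000):
    print(n, bsc_rate_normal_approx(0.11, n, 1e-3))
```

The √n in the denominator is the point of the abstract: quadrupling the blocklength only halves the gap to capacity.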
View Publication: ieeexplore.ieee.org
Citation Context (5)
...We note that although Strassen's result is classical, there is a renewed interest in his setup; see, e.g., [4], [5], [6], [7], [8], and references therein...
Y. Altug, et al., Moderate deviation analysis of channel coding: Discrete memoryless case
...Proceeding heuristically from the reliability function in [7], the expansion in (16) was put forward in [6] with n = o(n^{1/2}), for the case of per-codeword power constraint...
Yury Polyanskiy, et al., Dispersion of Gaussian channels
...We use Theorem 3 in [9] (see below) to determine the set of packet loss and physical layer transmission rates for packets with length L bits transmitted over the channels using optimal forward error mechanisms at the physical layer...
Ramin Khalili, et al., On the performance of Random Linear Network Coding in relay networks
...To deal with input packets of finite length k, we adapt the following theorem from our recent work [5] (see also Wolfowitz [6] and references therein)...
...Theorem 1: [5] For a binary symmetric channel with crossover probability p, there exists a constant Q1 such that, if the number n of channel uses in the first transmission satisfies...
...Our previous results [5] show how to compute Q1 in closed form for the binary symmetric channel (BSC)...
...(The latter expenditure is necessary to combat nonasymptotic effects [5, 9, 11].) Furthermore, because �� is monotone...
Dror Baron, et al., Coding vs. Packet Retransmission over Noisy Channels
...Can we do better? We begin with a lower bound by Wolfowitz [2] (see also Baron et al. [8, 9]) on the penalty for using finite length sequences in communication systems in which the statistics are known...
...Theorem 2 [2, 8, 9] For known statistics and a fixed ε, the penalty for using finite length sequences in coding schemes that rely on joint typicality is Θ(√n) bits...
...universality. The remaining terms account for the penalty in nonasymptotic channel coding [2, 9]...
...The Φ⁻¹(εᵢ) comes from the Central Limit Theorem [2, 9]; backing off ε...
...also appears in refined versions of Theorem 2 [2, 8, 9], we have a tight order bound...
Shriram Sarvotham, et al., Variable-Rate Coding with Feedback for Universal Communication Systems
References (7)
A Mathematical Theory of Communication (Citations: 9707)
Claude Shannon. Published in 1948.
Coding Theorems of Information Theory (Citations: 148)
J. Wolfowitz. Published in 1978.
Design of capacity-approaching irregular low-density parity-check codes (Citations: 1371)
Thomas J. Richardson, Mohammad Amin Shokrollahi, Rüdiger L. Urbanke
Journal: IEEE Transactions on Information Theory (TIT), vol. 47, no. 2, pp. 619–637, 2001
Probability of error for optimal codes in a Gaussian channel (Citations: 229)
C. E. Shannon
Lower Bounds to Error Probability for Coding on Discrete Memoryless Channels, I (Citations: 132)
Claude E. Shannon, Robert G. Gallager, Elwyn R. Berlekamp
Journal: Information and Control (IANDC), vol. 10, no. 1, pp. 65–103, 1967
Citations (7)
Moderate deviation analysis of channel coding: Discrete memoryless case (Citations: 1)
Y. Altug, A. B. Wagner
Conference: IEEE International Symposium on Information Theory (ISIT), 2010
Dispersion of Gaussian channels (Citations: 10)
Yury Polyanskiy, H. Vincent Poor, Sergio Verdu
Conference: IEEE International Symposium on Information Theory (ISIT), 2009
On the performance of Random Linear Network Coding in relay networks (Citations: 4)
Ramin Khalili, Majid Ghaderi, Jim Kurose, Don Towsley
Conference: Military Communications Conference (MILCOM), 2008
Universal Fixed-Length Coding Redundancy
Cheng Chang, A. Sahai. Published in 2007.
Coding vs. Packet Retransmission over Noisy Channels (Citations: 1)
Dror Baron, Shriram Sarvotham, Richard G. Baraniuk
Conference: Conference on Information Sciences and Systems (CISS), 2006