Complexity in Information Theory

By Yaser S. Abu-Mostafa

The scope and goals of information theory and computational complexity have grown considerably closer over the past decade. Common analytic tools, such as combinatorial mathematics and information flow arguments, have been the cornerstone of VLSI complexity and cooperative computation. The fundamental assumption of limited computing resources is the premise of cryptography, where the distinction is made between available information and accessible information. Several other examples of common goals and tools between the two disciplines have shaped a new research area of 'information and complexity theory'. This volume is intended to convey to the research community some of the recent significant topics along this theme. The contributions selected here are all very basic, presently active, fairly well established, and stimulating for substantial follow-ups. This is not an encyclopedia on the subject; it is concerned only with timely contributions of sufficient coherence and promise. The styles of the six chapters cover a wide spectrum, from specific mathematical results to surveys of large areas. It is hoped that the technical content and theme of this volume will help establish this general research area. I would like to thank the authors of the chapters for contributing to this volume. I also wish to thank Ed Posner for his initiative in treating this subject systematically, and Andy Fyfe and Ruth Erlanson for proofreading several of the chapters.



Best information theory books

Database and XML Technologies: 5th International XML Database Symposium, XSym 2007, Vienna, Austria, September 23-24, 2007, Proceedings

This book constitutes the refereed proceedings of the 5th International XML Database Symposium, XSym 2007, held in Vienna, Austria, in September 2007 in conjunction with the International Conference on Very Large Data Bases, VLDB 2007. The 8 revised full papers, together with 2 invited talks and the extended abstract of 1 panel session, were carefully reviewed and selected from 25 submissions.

Global Biogeochemical Cycles

Describes the transformation and movement of chemical substances in a global context and is designed for courses dealing with various aspects of biogeochemical cycles. Organized in three sections, it covers Earth sciences, element cycles, and a synthesis of contemporary environmental issues.

Extra info for Complexity in Information Theory

Example text


In the rest of this chapter we apply these properties to prove lower bounds on the number of bits that must be transmitted in error-free computation of a function.

2 Worst-Case Complexity

So far, we have not addressed the relation between the computed value of a protocol and the function it is supposed to compute. To define the number of transmitted bits needed to compute a function, we need to do this first. Denote the number of bits in a string b by $|b|$, and let $l(x, y) \triangleq |b(x, y)| = \sum_i |b_i(x, y)|$ denote the total number of bits transmitted according to a protocol $\phi$ when Px knows x and Py knows y.
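As a concrete illustration of this bookkeeping, the Python sketch below counts the transmitted bits of a toy deterministic protocol. The protocol itself (Px sends x in full and Py replies with the bitwise XOR), the function it computes, and all names are illustrative assumptions, not the chapter's; the point is only how l(x, y) sums the lengths of the exchanged messages b_i(x, y).

# A minimal sketch, assuming a toy two-party protocol: Px holds x, Py holds y,
# and the parties exchange bit strings b_1, b_2, ... until Py knows f(x, y).
# The protocol and function below are illustrative, not from the chapter.

def toy_protocol(x: str, y: str):
    """Px sends x to Py in full; Py replies with f(x, y) = x XOR y (bitwise).

    Returns (computed_value, transcript), where transcript lists the
    transmitted bit strings b_1, b_2, ... in order.
    """
    transcript = []
    transcript.append(x)                                   # b_1: Px -> Py
    value = "".join(str(int(a) ^ int(b)) for a, b in zip(x, y))
    transcript.append(value)                               # b_2: Py -> Px
    return value, transcript

def total_bits(transcript) -> int:
    """l(x, y) = |b(x, y)| = sum over i of |b_i(x, y)|."""
    return sum(len(b) for b in transcript)

value, transcript = toy_protocol("1011", "0110")
print(value)                   # 1101
print(total_bits(transcript))  # 8 bits for this naive protocol on N = 4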

Then Z is also the largest sequence among all cyclic shifts of Y. Both Px and Py can find Z. Therefore, Px transmits to Py the number of times Z should be cyclically right-shifted to obtain X ($\lceil \log N \rceil$ bits), and Py does the same.

Example 12 [EO 84]. Px has a sequence X of N bits and Py has a sequence Y of N bits. The two sequences are known to differ in at most K locations ($d_H(X, Y) \le K$). How many bits must be transmitted in the worst case for each person to find the other's sequence? Here, $S_P = \{(x, y) \in \{0,1\}^N \times \{0,1\}^N : d_H(x, y) \le K\}$.
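One generic way to see a lower bound for Example 12 is a counting argument: given Y, every X within Hamming distance K of Y is consistent with the promise, so Px must send at least the logarithm of the number of such candidates in the worst case. The Python sketch below, with hypothetical helper names and arbitrary N and K, works this out; it is the standard counting bound, not necessarily the chapter's own derivation.

# A small illustration of the setting of Example 12 plus the generic counting
# lower bound. Helper names and the specific N, K are assumptions of this
# sketch, not the book's.
from math import comb, ceil, log2

def hamming_distance(x: str, y: str) -> int:
    """d_H(x, y): the number of positions where the two bit strings differ."""
    return sum(a != b for a, b in zip(x, y))

N, K = 8, 2

# Number of N-bit strings within Hamming distance K of a fixed string.
candidates = sum(comb(N, i) for i in range(K + 1))

# To let Py identify X among these candidates, Px must send at least
# ceil(log2(candidates)) bits in the worst case (and symmetrically for Py).
lower_bound_px_to_py = ceil(log2(candidates))

print(candidates)            # 1 + 8 + 28 = 37 candidate strings for X given Y
print(lower_bound_px_to_py)  # at least 6 bits from Px to Py in the worst case

# Sanity check that a pair differing in two places satisfies the promise.
x, y = "10110100", "10010110"
print(hamming_distance(x, y), hamming_distance(x, y) <= K)   # 2 True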

