
A Lower Bound for Cooperative Broadcast in the Presence of Noise

Release time: 2010-06-24

Speaker: Michael Saks (www.math.rutgers.edu/~saks/), Rutgers, The State University of New Jersey

Time: 2010-06-24 14:00-2010-06-24 15:00

Venue: FIT 1-222

Abstract:

In a noisy broadcast channel, processors communicate as follows: in each time step, one processor broadcasts a bit. Each of the other processors receives a bit, but the received bit is incorrect with some known probability p. Reception errors are assumed to be independent. In such an environment, a group of n broadcasters, each possessing a bit, wish to transmit all n bits to a receiver so that, with probability close to 1, the receiver learns all of the bits correctly. This can be done easily with O(n log n) broadcasts, by having each processor broadcast its input bit C log n times (for some sufficiently large constant C) and having the receiver recover each bit by majority vote. This naive algorithm was improved by Gallager who, in 1988, gave a surprising algorithm that uses only O(n log log n) broadcasts. I'll talk about my proof with Navin Goyal and Guy Kindler that Gallager's algorithm is optimal up to a constant factor.
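For illustration only, here is a minimal simulation sketch of the naive O(n log n) repetition protocol described in the abstract; it is not taken from the talk, and the names noisy_copy and naive_protocol, as well as the parameter choices n, p, and C, are assumptions made for this sketch.

```python
import math
import random

def noisy_copy(bit: int, p: float) -> int:
    """Return the bit flipped with probability p (independent reception noise)."""
    return bit ^ 1 if random.random() < p else bit

def naive_protocol(inputs: list[int], p: float, C: int = 10) -> list[int]:
    """Naive protocol: each broadcaster repeats its bit C*ceil(log2 n) times;
    the receiver decodes each broadcaster's bit by majority vote."""
    n = len(inputs)
    repetitions = max(1, C * math.ceil(math.log2(max(n, 2))))
    decoded = []
    for bit in inputs:
        # Each repetition is received independently corrupted with probability p.
        received = [noisy_copy(bit, p) for _ in range(repetitions)]
        decoded.append(1 if sum(received) * 2 > repetitions else 0)
    return decoded

if __name__ == "__main__":
    random.seed(0)
    n, p = 64, 0.1          # assumed example values
    inputs = [random.randint(0, 1) for _ in range(n)]
    outputs = naive_protocol(inputs, p)
    print("all bits recovered:", outputs == inputs)
```

With C log n repetitions per bit, a Chernoff bound makes each majority vote fail with probability at most 1/poly(n), so a union bound over the n bits gives overall success probability close to 1, at the cost of the O(n log n) broadcasts the abstract mentions.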

Short Bio:

Michael Saks is a professor in the Mathematics Department at Rutgers University. His interests range over a number of areas of the theory of computing and discrete mathematics. Most recently he has been thinking about lower bounds in concrete computational models (decision trees, communication complexity), polynomial identity testing, exponential algorithms for NP-hard problems, and various algorithmic problems related to monotonicity.
