Event Type: Seminar
Friday, November 1, 2019 2:00 PM
Yuval Wigderson

In 1948, Claude Shannon published a paper that simultaneously invented the field of information theory and more or less answered all its major questions. One of Shannon's key ideas was the introduction of the entropy function, defined by the mysterious expression −∑ p_i log p_i. In this talk, I'll go through several of Shannon's main theorems, always with an eye towards understanding in what sense this magical quantity captures the "information" of a probability distribution. Time permitting, I'll also discuss polar codes, which I consider among the most important mathematical developments of the 21st century.
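For readers who want to see the formula in action before the talk, here is a minimal Python sketch of the entropy computation −∑ p_i log p_i (the function name shannon_entropy and the coin examples are illustrative choices, not taken from the talk):

```python
import math

def shannon_entropy(probs, base=2):
    """Entropy H(p) = -sum of p_i * log(p_i), in bits by default.

    Terms with p_i == 0 contribute nothing, following the convention
    that 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of information per flip...
print(shannon_entropy([0.5, 0.5]))   # 1.0
# ...while a heavily biased coin carries much less.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The examples hint at the interpretation the talk will develop: the more predictable a distribution is, the less "information" each sample conveys.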