Derandomization in functional analysis
Thursday evening there is a party at the American Embassy. The invitation, mysteriously, requires "cocktail attire." The clothes that people wear in San Francisco while drinking cocktails in bars do not seem appropriate for an Embassy, so I have to search on google what this means. (Men have to wear jacket and tie.) Ronitt Rubinfeld isn't here yet, Avi Wigderson has a strict no-jacket policy, and so at the party I recognize less than a handful of people. After entering, there is a receiving line. A mathematician from UC San Diego (who, I believe, was one of the organizers of the party) meets the guests first; then he introduces each guest to a jovial fellow whom I assume to be the Ambassador (he is, instead, a government official, something like the science advisor to the Secretary of State). Finally, a woman says "welcome to the United States embassy." It's my turn. Hello, I say to the mathematician, I am Luca Trevisan from U.C. Berkeley. Oh, he says, you are a graduate student. Once we are inside, the three of them give one short speech each. The woman regrets to inform us that the Ambassador had to be somewhere else in Spain and so he could not be with us. He is hiding in the next room, is my comment. He is on vacation, someone else says.
On Friday, Jon Kleinberg gives his plenary talk. He frames his work by considering two ways of viewing computers (and hence, computer science, and hence theoretical computer science). One is that we have inputs, on which we have a computation that gives an output, and we are interested in the efficiency of computation. This would be the design of algorithms and many of the problems studied in computational complexity. In the other view, computers are "mediators" between people, and they are the means of interactions that happen in various forms (emails, electronic markets, blogs,...).
Studying the information exchanges of the second type one is led to study networks, and an important point is that such networks are not artifacts that have been designed; they are phenomena that happen. After giving some concrete examples, he focuses on his results on "small world" networks.
After the break, I go to see Emmanuel Candes's talk on "compressive sampling." Here the idea, not unlike Goldreich-Levin (or Kushilevitz-Mansour), is that we want to reconstruct the Fourier coefficients of a function f:{1,...,N} -> R by looking only at a small number of values of f, under the promise that only a bounded number of Fourier coefficients of f are non-zero. It turns out that, if k Fourier coefficients are non-zero, it is sufficient to read f at O(k log N) random points and then find the function that, subject to having those values at those points, has the smallest possible l1 spectral norm. Such a minimization can be carried out in time poly(N) using linear programming. Similar ideas also lead to LP-based algorithms for error-correcting codes. Amusingly, compressive sampling is commonly abbreviated CS, and one slide reads
CS: a new paradigm for analog to digital.
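The l1-minimization step can be sketched in a few lines. Below is a minimal illustration (my own, not from the talk) of the generic "basis pursuit" recipe: recover a k-sparse vector from a few random linear measurements by minimizing the l1 norm subject to the measurement constraints, recast as a linear program. For simplicity it uses a random Gaussian measurement matrix rather than random point evaluations of a Fourier-sparse function; the dimensions and the SciPy solver are my choices.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 50, 20, 3  # ambient dimension, number of measurements, sparsity

# A k-sparse signal to recover.
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.standard_normal(k)

# Random measurements b = A x (Gaussian A stands in for random samples).
A = rng.standard_normal((m, n))
b = A @ x_true

# Basis pursuit: minimize ||x||_1 subject to A x = b.
# Standard LP reformulation with auxiliary variables t >= |x|:
#   minimize sum(t)  s.t.  A x = b,  x - t <= 0,  -x - t <= 0.
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
I = np.eye(n)
A_ub = np.vstack([np.hstack([I, -I]), np.hstack([-I, -I])])
b_ub = np.zeros(2 * n)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b,
              bounds=[(None, None)] * (2 * n))
x_hat = res.x[:n]

# With m = O(k log n) random measurements, recovery is typically exact.
print("max recovery error:", np.max(np.abs(x_hat - x_true)))
```

The LP has 2n variables, so this runs in time polynomial in n, matching the poly(N) claim above; the interesting theorem is that the l1 minimizer coincides with the sparse signal with high probability over the random measurements.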
Then, Avi takes me to a lecture by Stanislaw Szarek on geometric functional analysis. The beginning of the lecture, devoted to putting the problems in context, is too abstract for me to follow. But then come the concrete problems. Surprisingly (to me), the probabilistic method is used routinely in functional analysis, and, in finite-dimensional cases, they are interested in explicit constructions. As partial results, they are interested in constructions that use few random bits, and one result that is mentioned involves random walks on expander graphs. Some of the derandomization open questions that he discusses have applications in compressive sampling, the area earlier described by Candes.