Eminent economist Willem Buiter gives an impassioned rant against Google on his blog. I like Buiter's blog, but I'm still a fan of Google. Concerning the invasion-of-privacy issue, I am still more concerned with the Government's tendency to invade our privacy -- if you don't like some private company, at least you don't have to use it.
This article is attracting attention on academic blogs; here is a lengthy discussion. My thoughts: it is healthy to get challenges to some of the ingrained assumptions about how academic life ought to be. Most of the proposals are a bit off the wall and lacking in focus, but I think it is well worth questioning the merit of academic tenure in the US. As Craig Venter remarked in his autobiography, really smart people don't need tenure to keep them motivated, or to maintain their job security (I am paraphrasing from memory).
From Policy Exchange, the think tank that gave us the notorious Cities Unlimited report (which I discussed earlier), comes this report, on why British universities should not be prevented from closing down due to bankruptcy. The fundamental error in the report is its failure to recognise that, in reality, universities can already close down in all but name -- there may well be legal obstacles to making one cease to exist as a legal entity, but for all practical purposes, bankruptcy would be just as it is for a private company.
By now the list of accepted papers for ACM-EC is last week's news, already mentioned here and here. A nice feature of that list is that there is a version of it complete with abstracts, so we can get a better feel for what sort of papers appear there. I tried reading them; let me have a go at making comments (this year, I am an impartial observer: I did not submit a paper or serve on the program committee).
It's easy to classify papers according to who wrote them; it is harder to classify them according to (more importantly!) what they are about. Out of 40 papers, just over 31 come from the USA (dividing up credit for a paper equally amongst its co-authors), followed by Israel on just over 3, then England takes third place with just under 2, leaving France, Germany, etc simply nowhere.
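For concreteness, the fractional-credit count works like this (a minimal sketch with made-up papers, not the actual EC data): each paper is worth one point, split equally among its co-authors' countries.

```python
# Sketch of the fractional country-credit count described above.
# The paper lists here are hypothetical, purely for illustration.
from collections import defaultdict

# Each paper is represented by the list of its co-authors' countries.
papers = [
    ["USA", "USA"],        # two US co-authors: USA gets the full point
    ["USA", "Israel"],     # half a point each
    ["England"],           # sole author: full point to England
]

credit = defaultdict(float)
for countries in papers:
    share = 1.0 / len(countries)
    for c in countries:
        credit[c] += share

print(dict(credit))  # {'USA': 1.5, 'Israel': 0.5, 'England': 1.0}
```

Run over the real list of 40 papers, this is the tally that puts the USA at just over 31 points.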
There's a big emphasis on mechanism design, and most of the papers address the issue of truthful mechanisms in various contexts. Only one of the abstracts mentions price of anarchy; another paper is about price of mediation (the ratio of the social cost of correlated equilibria to the social cost of Nash equilibria). Maybe that kind of topic is more likely to go to SAGT these days(?). (However, just to keep up the competitive pressure, (via the comsoc mailing list) I just got an announcement for AMMA, the first Conference on Auctions, Market Mechanisms and Their Applications.) Returning to EC, I see maybe 2 papers relating to coalitional games, another 2 to market equilibria, one on voting... actually, it's kind of hard to find a feature that is shared by approximately half of the papers. I wondered about classifying them according to the number of agents they study, i.e. 2, 3, n or infinity, but in nearly all cases it seems to be n.
In an epsilon-Nash equilibrium, the players choose distributions over their actions in such a way that no player can gain more than epsilon (in expected utility) by switching to some alternative strategy. But in general, there is no guarantee that an epsilon-Nash equilibrium is anywhere near a true Nash equilibrium. (Here "near" can be defined using the variation distance between two probability distributions.)
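To make the definition concrete, here is a small sketch of my own (Matching Pennies is my choice of example, not one from the discussion): for a mixed profile (x, y) in a bimatrix game, the smallest epsilon for which it is an epsilon-Nash equilibrium is the larger of the two players' maximum gains from deviating, and a best deviation can always be taken to be a pure strategy.

```python
# Matching Pennies: the row player wants to match, the column player
# wants to mismatch. The unique Nash equilibrium is uniform for both.
A = [[1, -1], [-1, 1]]    # row player's payoffs
B = [[-1, 1], [1, -1]]    # column player's payoffs (zero-sum)

def expected(M, x, y):
    """Expected payoff under matrix M when row mixes x and column mixes y."""
    return sum(x[i] * M[i][j] * y[j] for i in range(2) for j in range(2))

def max_regret(x, y):
    """Largest expected gain either player can get by deviating unilaterally
    (it suffices to check pure-strategy deviations)."""
    row_best = max(sum(A[i][j] * y[j] for j in range(2)) for i in range(2))
    col_best = max(sum(x[i] * B[i][j] for i in range(2)) for j in range(2))
    return max(row_best - expected(A, x, y), col_best - expected(B, x, y))

uniform = [0.5, 0.5]                         # the true Nash equilibrium
print(max_regret(uniform, uniform))          # 0.0
skewed = [0.6, 0.4]
print(round(max_regret(skewed, uniform), 3)) # 0.2: a 0.2-Nash equilibrium
```

Here the skewed profile happens to be close to the true equilibrium, but the point of the post is exactly that, in general games, a small max_regret gives no such guarantee.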
After my talk at BCTCS (on algorithms for computing approximate Nash equilibria) Arno Pauly asked me the following interesting question. How hard is it to find an epsilon-Nash equilibrium that is guaranteed to be within epsilon of a true Nash equilibrium? (My immediate answer: don't know.)
Focus on the 2-player case. The problem is in TFNP, because a non-deterministic Turing machine can find a genuine Nash equilibrium, which of course qualifies. But there's an interesting twist here, which is that a solution to the problem cannot, on its own, in any obvious way, be checked in deterministic polynomial time. This is in contrast to most search problems. A solution does not seem to constitute its own certificate. By contrast, an unrestricted epsilon-Nash equilibrium is its own certificate.
I guess that any NP search problem (not just the total ones) has related problems where a solution cannot be checked in deterministic poly-time. E.g. SAT: suppose you ask for an assignment to the propositional variables that is Hamming distance at most n/4 from a satisfying assignment. The problem is still in NP, but you can't (seemingly) check a solution deterministically. The problem of finding an approximation to a genuine Nash equilibrium looks more natural, of course.
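A toy illustration of this SAT variant (my own example formula, nothing from the post): the only obvious way to verify a claimed solution is to search for a satisfying assignment nearby, which is exponential, whereas an ordinary SAT witness checks in linear time.

```python
# Brute-force "verifier" for the relaxed SAT problem: accept an
# assignment iff it is within Hamming distance n/4 of SOME satisfying
# assignment. Note the check itself enumerates all 2^n assignments --
# the claimed solution is not its own certificate.
from itertools import product

def satisfies(clauses, a):
    # A clause is a list of ints: +i means variable i true, -i means false.
    return all(any((lit > 0) == a[abs(lit) - 1] for lit in c) for c in clauses)

def is_valid_answer(clauses, n, a):
    """Exponential check: is `a` within n/4 of a satisfying assignment?"""
    for b in product([False, True], repeat=n):
        if satisfies(clauses, b) and sum(u != v for u, v in zip(a, b)) <= n // 4:
            return True
    return False

# Toy formula on 4 variables: (x1 or x2) and (not x1 or x3) and (x2 or not x4)
clauses = [[1, 2], [-1, 3], [2, -4]]
print(is_valid_answer(clauses, 4, (True, False, True, False)))  # True
print(is_valid_answer(clauses, 4, (True, False, False, True)))  # False
```

The nondeterministic verifier, by contrast, simply guesses the nearby satisfying assignment, which is what puts the problem in NP.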
So, is this problem any harder than finding an unrestricted epsilon-Nash equilibrium? As explained in the talk, little is known about the complexity of finding epsilon-Nash equilibria, although the fact that they can be found in subexponential time for any epsilon>0 suggests that the answer may be that the problem is in P.
On the other hand, finding epsilon-approximations to true Nash equilibria feels like it ought to be tougher. One possibility that I would be tempted to look for: any such algorithm can be modified to find genuine Nash equilibria. A possible analogy is with boosting in computational learning theory, where, in Schapire's classic paper, a PAC algorithm that is only guaranteed to do slightly better than random guessing can have its performance boosted (by running it repeatedly on variously filtered data) to achieve standard PAC learning.
I am attending the 25th British Colloquium for Theoretical Computer Science (today thru Thursday). From the web page, it looks like the meeting has nearly 100 participants, quite a healthy number. I've not been for some time; last time I went to BCTCS the conference was at Liverpool and I worked at Warwick; now it's the other way around. So, a good time to meet up with various people I've not seen for a while.