Monday, May 25, 2009

Putting the VC into VCG

There are two problems with blogging about papers you've read recently: maybe the paper was one you had to review anonymously for a conference or journal, or else it relates to current research (and co-authors may be displeased if you tell all about the research problem of interest). But the following paper seems to escape both.

This paper appeared recently on arXiv (VC v. VCG: Inapproximability of Combinatorial Auctions via Generalizations of the VC Dimension, by Elchanan Mossel, Christos Papadimitriou, Michael Schapira and Yaron Singer). It relates to two topics of interest to me: mechanism design and the Vapnik-Chervonenkis (VC) dimension. The following is a summary of my sketchy understanding of the paper, concluding with a wild guess at a future research direction that may arise.

So, the VCG mechanism requires one to find the socially optimal outcome of an auction. Suppose a bunch of items are being sold to two bidders who have limited budgets, which means you impose a ceiling on the value a bidder may assign to any set of items. Finding the social optimum is now hard, I guess because it can encode a SUBSET SUM type of problem: you can easily find yourself trying to identify a bundle of items whose value just fits some bidder's budget, and if you overshoot, you end up undershooting for the other bidder.
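Just to make the flavour concrete, here is a toy sketch of my own (not from the paper): two bidders with budget-capped additive valuations, and a brute-force search for the welfare-maximizing allocation. Notice how the budgets behave like subset-sum targets.

```python
from itertools import product

def capped_value(item_values, bundle, budget):
    # Budget-limited additive valuation: sum of per-item values, capped at the budget.
    return min(sum(item_values[i] for i in bundle), budget)

def optimal_welfare(values1, values2, budget1, budget2):
    # Brute force over all 2^n ways of splitting the items between the two bidders.
    n = len(values1)
    best = 0
    for mask in product([0, 1], repeat=n):
        bundle1 = [i for i in range(n) if mask[i] == 0]
        bundle2 = [i for i in range(n) if mask[i] == 1]
        welfare = (capped_value(values1, bundle1, budget1)
                   + capped_value(values2, bundle2, budget2))
        best = max(best, welfare)
    return best

# Both bidders value each item at its "weight"; the budgets act as subset-sum targets.
weights = [3, 5, 8, 9]
print(optimal_welfare(weights, weights, 12, 13))  # 25: bundles {3,9} and {5,8} fit exactly
```

The exhaustive loop takes time 2^n, which is the point: hitting both budgets exactly is where the subset-sum difficulty lives.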

In looking for an approximately optimal mechanism, you might try for a maximal-in-range mechanism, in which you simply delete some of the possible outcomes from consideration. If there are n items being sold, there are 2^n ways to share them between the bidders (these are the outcomes); if you restrict attention to a limited subset of those, you still have a truthful mechanism, and the question becomes whether you can find such a subset that results in an approximately optimal mechanism. What you need is a set of partitions such that, if the socially optimal partition is not a member of your set, the set contains one that is close.
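As an illustration (my own sketch, with made-up names, not the paper's notation), a maximal-in-range mechanism just maximizes welfare over a bank of allocations fixed in advance of any bids; that fixity is what lets VCG payments keep it truthful. The crude mechanism mentioned later, which bundles all items together, is the special case where the bank has only two entries.

```python
def mir_outcome(allocation_bank, valuations):
    # Maximal-in-range: pick the welfare-maximizing allocation from a FIXED bank.
    # `allocation_bank` is a list of (bundle1, bundle2) pairs chosen before bidding;
    # `valuations` is a pair of functions mapping a bundle to its value.
    v1, v2 = valuations
    return max(allocation_bank, key=lambda a: v1(a[0]) + v2(a[1]))

items = frozenset({0, 1, 2})
# The crude "bundle everything" range: all items to bidder 1, or all to bidder 2.
crude_bank = [(items, frozenset()), (frozenset(), items)]

v1 = lambda b: min(len(b), 2)  # unit values, budget 2
v2 = lambda b: min(len(b), 4)  # unit values, budget 4
print(mir_outcome(crude_bank, (v1, v2)))  # everything goes to bidder 2
```

Restricting the bank makes the maximization easy; the paper's point is that any bank small enough to search is too coarse to approximate well.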

The way to argue that you still need exponentially many partitions is to consider valuations in which, for some target partition, one bidder assigns unit value to each of his items and the other bidder assigns unit value to each of the remaining items. The approximation quality of any alternative partition then corresponds to its Hamming distance from the target partition, and for distances less than n/2, the Hamming balls contain only a very small fraction of the 2^n partitions...
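To see just how small those Hamming balls are, here is a quick calculation of my own (not from the paper): the fraction of the 2^n partitions within a given Hamming distance of a fixed target.

```python
from math import comb

def hamming_ball_fraction(n, radius):
    # Fraction of the 2^n bit-strings lying within Hamming distance
    # `radius` of any fixed target string.
    return sum(comb(n, d) for d in range(radius + 1)) / 2 ** n

for n in (20, 40, 60):
    print(n, hamming_ball_fraction(n, n // 4))
```

Already at n = 20 a ball of radius n/4 covers only about 2% of the partitions, and the fraction shrinks exponentially as n grows, so any sub-exponential bank of partitions must leave some target far away.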

So you still have to optimize over a large number of partitions. So many, in fact, that there exists a set of sqrt(n) of the items such that every partition of those sqrt(n) items is realised by one of the partitions used by the maximal-in-range mechanism. This is the part that uses results about the VC dimension (defined on Wikipedia, so no need for me to repeat the definition; the VC dimension is a combinatorial measure of a set system, and in the context of learning theory it mainly comes in handy when the set system is infinite; the key result used in this paper, by contrast, concerns the VC dimension of specifically finite set systems).
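For what it's worth, here is a naive toy illustration (mine, not the paper's construction) of the underlying notion: a set of items is shattered by a family of partitions if every subset of it is realised as the trace of some member of the family.

```python
from itertools import combinations

def is_shattered(family, subset):
    # `subset` is shattered if all 2^|subset| of its subsets occur as
    # intersections of `subset` with members of `family`.
    traces = {frozenset(s & subset) for s in family}
    return len(traces) == 2 ** len(subset)

def vc_dimension(family, universe):
    # Brute-force VC dimension of a finite set system (exponential; toy sizes only).
    d = 0
    for k in range(1, len(universe) + 1):
        if any(is_shattered(family, set(c)) for c in combinations(universe, k)):
            d = k
    return d

family = [set(), {0}, {1}, {0, 1}, {2}]
print(vc_dimension(family, {0, 1, 2}))  # 2: {0,1} is shattered, no 3-set can be
```

The paper's argument runs this logic in the other direction: a family with exponentially many distinct traces must shatter some set of about sqrt(n) items, which is where the hard sub-auction hides.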

Finally, what this leaves us with is a (standard, unrestricted) budget-limited auction for sqrt(n) items (since all other items can be given zero value). Since sqrt(n) is polynomially related to n, you are back where you started. Conclude that maximal-in-range mechanisms cannot both give a good approximately-optimal solution and alleviate the computational difficulty.

Where to go from here? While Roberts' Theorem suggests that your options are rather limited in designing a truthful mechanism, it applies to auctions where there are no restrictions on the values that bidders can give to outcomes. But here the 2^n outcomes arise from only 2n valuations, so there is actually a lot of structure in the valuations that has so far gone unexploited. Maybe some clever ad-hoc mechanism could achieve a better approximation than the crude one mentioned in the paper (where you just bundle all n items together and sell them to one of the bidders).

Thursday, May 21, 2009

Two new postdoc positions in computational game theory

Note the update to the job ads on the left-hand side of this page! The project, funded by EPSRC, is entitled Efficient Decentralised Approaches in Algorithmic Game Theory. It addresses some of the main research issues in computational game theory, some of which were highlighted at GAMES 2008 (the 3rd World Congress of the Game Theory Society) as being of considerable interest to the game theory community. This is a preliminary announcement, not an official ad; updates will follow. Note the link (on the left) to some more details.

Monday, May 18, 2009

joining Facebook

Everyone I know seems to be on Facebook, so finally I have gotten around to signing up. So, no need for me to discuss my first impressions, since anyone reading this is no doubt familiar with it already.

Wednesday, May 13, 2009

Monbiot on scientific research

George Monbiot is a (mainly environmentalist) commentator who writes for The Guardian; he gets a lot of things wrong but I have to admit I quite like him since his heart is clearly in the right place. In this article he criticises the requirement for academics seeking grants to describe the economic impact of the proposed work. He also criticises the way business leaders are being put in charge of research councils.

Really, I'd say his attack is off-target -- it's not disastrous to have to describe the economic impact of proposed research, so long as it is OK to be quite speculative. And I don't have a principled objection to business leaders running research councils -- with an outsider you get a sporting chance that the person concerned doesn't imagine he knows everything about science. The point he should be making is that we need to recognise that scientific progress is unpredictable, and you can't plan it the way you plan (say) a construction project. And also, "investing in research" is more analogous to premium bonds than it is to saving in an interest-bearing account.

Thursday, May 07, 2009

Paperless communications

For our last department meeting we got sent the agenda as an email attachment, and I must admit to not reading it; that's the kind of thing that really ought to be distributed on paper, in order to attract loyal readers. Let's face it, just about any communication is more likely to be read if the sender feels the need to print it off rather than just send an electronic copy. I reckon that Dagstuhl's reluctance to use this newfangled "Internet" for the purpose of distributing its invitations to workshops plays a part in its brand recognition and loyalty. On the downside, to agree to attend seminar 10101 I had to search the stationery closet for an envelope (just underneath the spare daisy-wheels and typewriter ribbons), find that it's so old the glue no longer works, then try to remind myself how to use the fax machine... But once that's done, I feel kind of committed.