There's a problem with blogging about papers you've read recently. Actually, there are two problems: maybe the paper was one you had to review anonymously for a conference or journal, or else it relates to your current research (and co-authors may be displeased if you tell all about the research problem of interest). The following paper seems to escape both.
This paper appeared recently on arXiv (VC v. VCG: Inapproximability of Combinatorial Auctions via Generalizations of the VC Dimension; authors: Elchanan Mossel, Christos Papadimitriou, Michael Schapira, Yaron Singer). It relates to two topics of interest to me: mechanism design and the Vapnik-Chervonenkis (VC) dimension. What follows is a summary of my sketchy understanding of the paper, concluding with a wild guess at a future research direction that may arise from it.
So, the VCG mechanism requires one to find the socially optimal outcome of the auction. Suppose a bunch of items are being sold to two bidders who have limited budgets, which means you impose a ceiling on the value they may assign to any set of items. Finding the social optimum is now hard, I guess because it can encode a SUBSET SUM type of problem: you can easily find yourself trying to identify a bundle of items whose value just fits one bidder's budget, and if you overshoot for that bidder, you end up undershooting for the other.
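To make the search concrete, here is a minimal sketch (my own, not the paper's) of welfare maximization with capped-additive ("budgeted") valuations, done by exhaustive enumeration of all 2^n partitions; the function names and toy numbers are just for illustration.

```python
from itertools import combinations

def welfare(bundle_a, n, values_a, values_b, cap_a, cap_b):
    """Welfare of giving bundle_a to bidder A and the remaining items to bidder B,
    under capped-additive (budgeted) valuations."""
    va = min(cap_a, sum(values_a[i] for i in bundle_a))
    vb = min(cap_b, sum(values_b[i] for i in range(n) if i not in bundle_a))
    return va + vb

def optimal_partition(values_a, values_b, cap_a, cap_b):
    """Exhaustive search over all 2^n partitions -- exponential, which is the point."""
    n = len(values_a)
    best_bundle, best_welfare = set(), float("-inf")
    for k in range(n + 1):
        for bundle in combinations(range(n), k):
            w = welfare(set(bundle), n, values_a, values_b, cap_a, cap_b)
            if w > best_welfare:
                best_bundle, best_welfare = set(bundle), w
    return best_bundle, best_welfare

# Hitting bidder A's cap exactly, without starving B, has a SUBSET SUM flavour.
print(optimal_partition([3, 5, 2, 7], [3, 5, 2, 7], cap_a=9, cap_b=9))
```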
In looking for an approximately optimal mechanism, you might try a maximal-in-range mechanism, in which you simply delete some of the possible outcomes from consideration. If there are n items being sold, there are 2^n ways to share them between the two bidders (these are the outcomes); now suppose you restrict attention to a limited subset of those. You still have a truthful mechanism, and the question becomes: can you find such a subset that results in an approximately optimal mechanism? What you need is a set of partitions such that, if the socially optimal partition is not a member of your set, there exists a member that is close to it.
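Here is a sketch of the idea (again my own illustration): the range of partitions is fixed in advance, independently of the bids, and the mechanism optimises reported welfare only within that range; charging the corresponding VCG payments over the restricted range is what keeps it truthful.

```python
def mir_allocate(allowed_bundles_for_a, values_a, values_b, cap_a, cap_b):
    """Maximal-in-range allocation: optimise reported welfare only over a fixed,
    bid-independent collection of partitions (each given as the set of items for A)."""
    n = len(values_a)

    def reported_welfare(bundle_a):
        va = min(cap_a, sum(values_a[i] for i in bundle_a))
        vb = min(cap_b, sum(values_b[i] for i in range(n) if i not in bundle_a))
        return va + vb

    return max(allowed_bundles_for_a, key=reported_welfare)

# A deliberately tiny range: either A gets everything or B gets everything.
n = 4
tiny_range = [set(range(n)), set()]
print(mir_allocate(tiny_range, [1, 1, 1, 1], [2, 0, 0, 1], cap_a=4, cap_b=4))
```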
The way to argue that you still need exponentially many partitions is to consider valuations in which, for some target partition, one bidder assigns unit value to each of his items and the other bidder assigns unit value to each of the remaining items. The approximation quality of any alternative partition then corresponds to its Hamming distance from the target partition, and for distances less than n/2, the Hamming balls contain only a very small fraction of the 2^n partitions.
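A quick back-of-the-envelope check of that counting step (my own sanity check, not the paper's calculation): the fraction of all 2^n partitions lying within a given Hamming distance of a fixed target.

```python
from math import comb

def hamming_ball_fraction(n, radius):
    """Fraction of the 2^n partitions (bit-strings) within Hamming distance <= radius
    of a fixed target partition."""
    return sum(comb(n, d) for d in range(radius + 1)) / 2 ** n

# For a radius noticeably below n/2 the ball covers a vanishing fraction,
# so a range that approximates every possible target must itself be exponentially large.
for n in (20, 40, 80):
    print(n, hamming_ball_fraction(n, n // 4))
```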
So you still have to optimize over a large number of partitions. So many, in fact, that there exists a set of sqrt(n) of the items such that any partition of those sqrt(n) items can be realised by one of the partitions used by the maximal-in-range mechanism. This is the part that uses results about the VC dimension (defined on Wikipedia, so no need for me to repeat the definition; the VC dimension is a combinatorial measure of a set system, and in learning theory it mainly comes in handy when the set system is infinite; the key result used in this paper, by contrast, is about the VC dimension of finite set systems).
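For finite set systems, the classical reference point is the Sauer-Shelah lemma: a family with more than the sum of C(n, i) for i up to d must have VC dimension above d. Below is a small illustration of that counting bound and of what "shattering" means; note that the paper actually needs a generalization of the VC dimension to families of partitions, so this is just the flavour, not the exact statement it uses.

```python
from math import comb

def shatters(family, subset):
    """True if the family of sets realises every one of the 2^|subset|
    intersection patterns on `subset`."""
    patterns = {frozenset(s) & frozenset(subset) for s in family}
    return len(patterns) == 2 ** len(subset)

def vc_lower_bound(n, family_size):
    """Lower bound on VC dimension via Sauer-Shelah: if the family has more than
    sum_{i<=d} C(n, i) sets, its VC dimension must be at least d+1."""
    d = 0
    while family_size > sum(comb(n, i) for i in range(d + 1)):
        d += 1
    return d

# An exponentially large range of partitions of n = 100 items is forced to have a
# sizeable VC dimension; the paper's generalization turns this into a shattered set
# of roughly sqrt(n) items for the range used by a maximal-in-range mechanism.
print(vc_lower_bound(100, 2 ** 50))
```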
Finally, what this leaves us with is a (standard, unrestricted) budget-limited auction on those sqrt(n) items (since all other items can be given zero value). Since sqrt(n) is only polynomially smaller than n, the hard problem you started with reappears at polynomially related size, so you are back where you started. Conclude that maximal-in-range mechanisms cannot both give a good approximately-optimal solution and alleviate the computational difficulty.
Where to go from here? While Roberts' theorem suggests that your options are rather limited in designing a truthful mechanism, it applies to auctions where there are no restrictions on the values that bidders can assign to outcomes. Here, by contrast, the values of the 2^n outcomes are generated by only about 2n numbers (each bidder's per-item values and budget cap), so there is actually a lot of structure in the valuations that has so far gone unexploited. Maybe some clever ad hoc mechanism could achieve a better approximation than the crude one mentioned in the paper (where you just bundle all n items together and sell them all to one of the bidders).
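For reference, that crude mechanism is easy to state; here is my reading of it as a second-price auction on the grand bundle, which is truthful and, because the valuations are monotone, obtains at least half of the optimal welfare.

```python
def grand_bundle_auction(values_a, values_b, cap_a, cap_b):
    """Bundle all n items and run a second-price auction between the two bidders.
    Since the optimum splits the items between A and B, and valuations are monotone,
    max(v_A(all items), v_B(all items)) is at least half the optimal welfare."""
    bid_a = min(cap_a, sum(values_a))
    bid_b = min(cap_b, sum(values_b))
    if bid_a >= bid_b:
        return "A", bid_b  # winner and second-price payment
    return "B", bid_a

print(grand_bundle_auction([1, 1, 1, 1], [2, 0, 0, 3], cap_a=3, cap_b=4))
```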
1 comment:
It would be very interesting indeed to improve over the naive 1/2-approximation mechanism that simply allocates all the items to the highest bidder. It seems that this may require techniques that are inherently different from the ones we currently have for designing truthful mechanisms for multi-parameter environments like combinatorial auctions. Approaches that rely on VCG are bound to fail (as shown in the paper), and so will attempts to use random-sampling-based techniques (that have a logarithmic factor lower bound).