Wednesday, October 23, 2013

article on privatisation of universities

I recommend the article Sold Out by Stefan Collini, in the London Review of Books. It reviews two books about the process of privatisation/marketisation of higher education that has taken place in the UK over roughly the past 30 years. Collini’s article provides a highly informative overview of the development and current state of play of private-sector provision of HE in the USA and UK. It then explains very lucidly the problems inherent in the current system of student funding, and is very convincing on why it cannot be expected to last. My only criticism: the article is passionate, but doesn’t propose remedial action. That may be because there’s a bigger picture (missing from the article) that includes, for example, the recent Royal Mail privatisation. On this bigger-picture theory, remedial action would not be specific to universities, but would involve a restoration of trust in public institutions and the public-service ethic.

(added later:) ...And this article (Economics students need to be taught more than neoclassical theory) in the Guardian looks relevant to the general topic. See also this earlier article (Economics students aim to tear up free-market syllabus). Their criticism of neoclassical economics is mainly that it failed to predict the financial crisis. It could also be blamed for an approach to student funding that consists of, firstly, assuming all students are selfish agents, and secondly, repeatedly raising the price of higher education in search of some kind of market equilibrium.

Wednesday, October 09, 2013

What are the equilibria of this game?

In the course of writing a paper, we often construct examples of the sort of mathematical objects under analysis, as a means to figure out what’s true in general. Then the paper gets written, and the examples are stripped away, like the scaffolding from a completed building. Before it gets forgotten, I now provide an online description of the following game (which came up while writing this paper), since it sets you thinking about alternative solution concepts (exact/approximate Nash/correlated equilibria, well-supported or otherwise), and which one(s) we are supposed to like best. Or maybe it could be used as part of a game theory exercise. (It may be helpful to point out that a well-supported ε-approximate equilibrium is one where no player ever uses a strategy whose payoff falls more than ε short of his best response. An approximate equilibrium that is NOT well-supported could allow some player to occasionally play a very poor strategy, provided that he does so with low probability.)

Suppose there are n players, numbered 1 through n. Each player has 2 pure strategies, call them a and b. Payoffs will be either 0 or 1, according to the following rule. Player 1 gets paid 1 to play a and 0 to play b, UNLESS all other players play a, in which case it’s the other way around. Meanwhile, for i>1, player i gets paid 1 to play the same way as player i−1, and 0 otherwise.
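For concreteness, here is the payoff rule in code (a minimal sketch in Python; the 0-indexing of players and the 'a'/'b' encoding are just presentational choices):

def payoff(i, profile):
    """Payoff to player i (0-indexed) when profile is a list of 'a'/'b'
    entries, one per player."""
    if i == 0:
        others_all_a = all(s == 'a' for j, s in enumerate(profile) if j != 0)
        if others_all_a:
            return 1 if profile[0] == 'b' else 0   # incentive is reversed
        return 1 if profile[0] == 'a' else 0
    # every later player is paid to copy his predecessor
    return 1 if profile[i] == profile[i - 1] else 0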

(If instead player 1 were paid to differ from player n, we would have a generalized Jordan game, with a unique Nash equilibrium where all players flip fair coins to choose a or b.)

So — spoiler alert — I reckon the answers are as follows. For exact Nash equilibrium, it’s not hard to see that players 1 and 2 can’t play pure strategies, and in fact players 1 and 2 should flip fair coins, while all the other players play a. For exact correlated equilibrium, one solution (different from the NE) is for player 1 to flip a fair coin, while all the other players flip a single shared fair coin, i.e. they play the all-a’s vector or the all-b’s vector with equal probability. It looks like there is some freedom for any player to deviate slightly from this, while keeping it well-supported. In the neighbourhood of the original NE, only players 1 and 2 can deviate slightly, if we want the approximate equilibrium to be well-supported. (In fact, I think we can say, more strongly, that all the well-supported approximate Nash equilibria are in the vicinity of the exact NE.)
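If you prefer to check the claimed exact NE numerically rather than by hand, a brute-force best-response test will do. This sketch reuses the payoff function above; it enumerates all 2^n pure profiles, so it is only for small n:

from itertools import product

def is_nash(mixed, payoff, tol=1e-9):
    """mixed[i] is the probability that player i plays 'a'.  Returns True if
    every pure strategy in each player's support is a best response."""
    n = len(mixed)
    for i in range(n):
        exp = {}
        for s in 'ab':
            total = 0.0
            for prof in product('ab', repeat=n):
                if prof[i] != s:
                    continue
                pr = 1.0
                for j in range(n):
                    if j != i:
                        pr *= mixed[j] if prof[j] == 'a' else 1 - mixed[j]
                total += pr * payoff(i, list(prof))
            exp[s] = total
        best = max(exp.values())
        for s in 'ab':
            in_support = mixed[i] > 0 if s == 'a' else mixed[i] < 1
            if in_support and exp[s] < best - tol:
                return False
    return True

n = 5
print(is_nash([0.5, 0.5] + [1.0] * (n - 2), payoff))   # True: the claimed NE
print(is_nash([1.0] * n, payoff))                      # False: all-a is not a NE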

Interestingly, when we give up on the “well-supported” requirement for correlated equilibria, lots of other things become possible. Notably, all the players can play a with high probability 1−O(1/n), and this is what would start to happen if natural learning dynamics were applied to learn an equilibrium. Eventually, the dynamics should do something different; maybe converge to something in the region of the exact NE.

Tuesday, October 01, 2013

2 research posts in Algorithms and Complexity, Oxford

2 research positions in algorithms and complexity at the University of Oxford, each tenable for up to five years

Professor Leslie Ann Goldberg is looking for two postdoctoral researchers to join the ERC project Mapping the Complexity of Counting at the University of Oxford. (This post at Luca Aceto’s blog contains a link to the announcement of this and other ERC grants.) This is a five-year project from 1 Mar 2014 to 28 Feb 2019. The positions are available for the duration of the project (subject to satisfactory performance).

The salary range is £29,541 - £36,298, depending on experience.

Deadline for applications: 15 Nov 2013
Details and application form at:
https://www.recruit.ox.ac.uk/pls/hrisliverecruit/erq_jobspec_version_4.jobspec?p_id=110002

The overall objectives of the project are:
  • To map out the landscape of computational counting problems (both exact and approximate), determining which problems are tractable, and which are intractable (quantifying the extent of tractability).
  • To discover complexity characterisations which elucidate the features that make counting problems tractable or intractable (telling us why each problem is tractable or intractable).

Within the context of these overall objectives, the goal is to study classes of counting problems which are as wide as possible (obviously including the problems that arise in practice), and within these classes, to develop complexity dichotomies/characterisations which are as precise as possible. The key problem domains that will be considered are graph homomorphism problems (including generalised weighted homomorphism problems), counting constraint satisfaction problems and holant problems. Key challenges include determining the effect of restrictions such as planarity, bipartiteness and bounded-degree constraints on the complexity of graph-theoretic counting problems.

Candidates should have a PhD in algorithms and complexity or in a closely related area of discrete mathematics.

Expertise in one or more of the following would be helpful: combinatorics or discrete probability, graph polynomials or partition functions, mixing rates of Markov chains, constraint satisfaction problems including algebraic methods for classifying their difficulty, holographic algorithms or holant problems.

REF submission deadline approaching

At the New Insights into Computational Intractability workshop, I find that some of our distinguished overseas speakers are blissfully unaware (or maybe: blissfully vaguely aware) of ongoing work on submissions to the Research Excellence Framework. They’re unfamiliar with the heavy-duty administrative machinery that UK universities have set up to process the submissions, which will finally fall silent on the deadline of November 29th. There are gaps in their knowledge of how the number of researchers a ‘unit of assessment’ (i.e., academic department) can submit depends on the number of impact cases. They don’t know that “evidence” is a verb, not just a noun. They don’t know the word “REF-able”. And so on.

My advice: it’s a time to indulge your UK colleagues when they gripe about the REF. Especially if they’ve taken on any kind of responsibility for preparing a department’s REF submission. Buy them drinks. Appear sympathetic, and remind them that it’ll all be over in a few weeks.

Sunday, September 29, 2013

CALL FOR NOMINATIONS FOR EATCS-FELLOWS 2014

Note: the EATCS Fellow scheme is new, so at this stage there is no risk that anyone you may be thinking of nominating for this recognition already has it; the field is wide open! Here is the link on the EATCS web site.

Please note: all nominees and nominators must be EATCS Members

Submit by December 31 of the current year for Fellow consideration by email to the EATCS Secretary (secretary@eatcs.org). The subject line of the email should read "EATCS Fellow Nomination - <surname of candidate>".

REQUIREMENTS FOR EATCS NOMINATION:

The EATCS Fellows Program is established by the Association to recognize outstanding EATCS Members for their scientific achievements in the field of Theoretical Computer Science. The Fellow status is conferred by the EATCS Fellows-Selection Committee upon a person having a track record of intellectual and organizational leadership within the EATCS community. Fellows are expected to be "model citizens" of the TCS community, helping to develop the standing of TCS beyond the frontiers of the community.

In order to be considered by the EATCS Fellows-Selection Committee, candidates must be nominated by at least four EATCS Members.  Please verify your membership at http://www.eatcs.org/.

The EATCS Fellows-Selection Committee consists of
  • Rocco De Nicola (IMT Lucca, Italy)
  • Paul Goldberg (Oxford, UK)
  • Anca Muscholl (Bordeaux, France, chair)
  • Dorothea Wagner (Karlsruhe, Germany)
  • Roger Wattenhofer (ETH Zurich, CH)

INSTRUCTIONS

A nomination should consist of answers to the questions below. It can be co-signed by several EATCS members. At least two nomination letters per candidate are recommended. If you are supporting the nomination from within the candidate's field of expertise, it is expected that you will be specific about the individual's technical contributions.

To be considered, nominations for 2014 must be received by December 31, 2013.

1 Name of candidate
Candidate's current affiliation and position
Candidate's email address, postal address and phone number
Nominator(s) relationship to the candidate

2 Short summary of candidate’s accomplishments (citation — 25 words or less)

3 Candidate’s accomplishments: Identify the most important contributions that qualify the candidate for the rank of EATCS Fellow according to the following two categories:

A) Technical achievements
B) Outstanding service to the TCS community

Please limit your comments to at most three pages.

4 Nominator(s):
Name(s)
Affiliation(s), email and postal address(es), phone number(s)

Tuesday, September 17, 2013

Books on display

A visitor is pleased to see a copy of Mas-Colell, Whinston, and Green on my office bookcase. Admittedly I have not read much of it, but it’s the thought that counts. It, like other books, is a conversation-starter. A book collection on display is a way to talk about yourself, and your interests, in an acceptable, take-it-or-leave-it way. If we all stop using paper books and switch to e-books, there will be a downside. There is not much online worrying about this downside of e-books (maybe I used the wrong search terms), but there are a few articles on the merits of bookcases that other people can look at. This web page explains why paper books are nice (as artifacts), but doesn’t quite make the point about books as a personal statement. The bookshelf as a personal statement is discussed in this article, which has a link to a website where you can upload a picture of your bookshelf.

Just for balance, this article “Breaking The Sentimental Attachment To Books” offers advice on helping you get rid of books you don’t need.

Friday, September 13, 2013

improving text by cutting it down

In making revisions to a paper that was accepted to a journal, I was told to make sure the abstract had no more than 150 words. Of course, my immediate assumption was that the abstract had already been perfected, and that having to cut about 40 words could only make it worse. In the event I managed to improve the text as a result, and was left admitting that it was a useful exercise, purely for the purpose of good writing. Maybe I should always cut my text by 20% as part of the process of polishing the writing.

Yet I’m still reluctant to put that idea into practice. It’s probably due to some kind of ingrained resistance to the idea that you can improve something by making an economy. Economising isn’t supposed to be beneficial, it’s supposed to be onerous, right? Maybe the worry is that if I cut 20% voluntarily, I’ll be required to cut another 20%, and there must be some point at which the “less is more” principle stops applying...

Monday, September 09, 2013

Postdoc positions in algorithmic game theory at the University of Oxford

Three postdoc positions in Algorithmic Game Theory are available at the University of Oxford, funded by the ERC Advanced Grant "Algorithms, Games, Mechanisms, and the Price of Anarchy" (Principal Investigator: Elias Koutsoupias). The researchers will be based in the Department of Computer Science and will be part of the newly-established Algorithms group in Oxford (http://www.cs.ox.ac.uk/research/algorithms). Work for these posts includes investigating, designing, and analyzing mechanisms; extending the theory of the price of anarchy and stability; and studying game-playing issues that arise from applying the algorithmic approach (e.g. limitations in computation and communication) to classical game theory.

Candidates should have a first degree and a doctorate in computer science or a related discipline, and a record of research in algorithmic game theory or a related area.
Area: Algorithmic Game Theory
Number of positions: 3
Duration: 1 year (extendable to 5 years)
Deadline for applications: 12 noon on 10th October 2013
Further information and applications: https://www.recruit.ox.ac.uk/pls/hrisliverecruit/erq_jobspec_version_4.jobspec?p_id=109515

Informal inquiries are welcome and should be directed to Elias Koutsoupias (elias@cs.ox.ac.uk).

Wednesday, June 19, 2013

2013 Game Theory and Computer Science Prize

Here is a link to the announcement and details. Congratulations to Benjamin Edelman, Michael Ostrovsky, Michael Schwarz, and Hal R. Varian, for their papers:
Benjamin Edelman, Michael Ostrovsky, and Michael Schwarz, “Internet Advertising and the Generalized Second-Price Auction: Selling Billions of Dollars Worth of Keywords” American Economic Review 97, pages 242-259, 2007
and
Hal R. Varian, “Position Auctions” International Journal of Industrial Organization 25, pages 1163-1178, 2007
As one of the members of the prize committee, I would like to thank everyone who submitted nominations.

Wednesday, June 05, 2013

The economic impact of astronomy

The Wikipedia page on Computer Science quotes the folklore one-liner that Computer Science is no more about computers than astronomy is about telescopes. On the other hand, it turns out that algorithmic mechanism design is sometimes about telescopes — provided that it’s a telescope that everyone wants to use.

The paper Telescope time without tears – a distributed approach to peer review (link to journal version) proposes a mechanism for allocating telescope time to astronomers, and this mechanism is being tested by the NSF in their review process for grant applications. There is a critique and discussion here at Michael Mitzenmacher’s blog. The general approach of the mechanism is that each competitor gets to rank some of the other competitors; these rankings are aggregated to get an overall ranking; then (the controversial bit) a competitor gets a small bonus if his partial ranking has good agreement with the overall ranking. The risk is that a competitor tries to second-guess the consensus rather than express his honest opinion.
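To make the structure concrete, here is a toy version in Python. This is a sketch only: the Borda-style aggregation and the pairwise-agreement bonus are placeholder choices made here for illustration, not necessarily the scoring rules used by the paper or the NSF.

from itertools import combinations

def aggregate(partial_rankings, proposals):
    """Borda-style aggregation: a ranking of length m awards m-k points to
    its k-th ranked proposal (k=0 for the top choice); proposals are then
    sorted by total points."""
    points = {p: 0 for p in proposals}
    for ranking in partial_rankings.values():
        for k, p in enumerate(ranking):
            points[p] += len(ranking) - k
    return sorted(proposals, key=lambda p: -points[p])

def agreement_bonus(ranking, overall, bonus=0.1):
    """Bonus proportional to the fraction of pairs in a reviewer's partial
    ranking that are ordered the same way as in the overall ranking."""
    pos = {p: k for k, p in enumerate(overall)}
    pairs = list(combinations(ranking, 2))
    if not pairs:
        return 0.0
    agree = sum(1 for x, y in pairs if pos[x] < pos[y])
    return bonus * agree / len(pairs)

proposals = ['P1', 'P2', 'P3', 'P4']
partial = {'alice': ['P2', 'P1', 'P3'], 'bob': ['P2', 'P3', 'P4']}
overall = aggregate(partial, proposals)
print(overall)   # ['P2', 'P3', 'P1', 'P4']
print({name: agreement_bonus(r, overall) for name, r in partial.items()})

Even in the toy, the second-guessing problem is visible: the bonus rewards agreement with the aggregate, not the quality of the underlying judgements.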

The scheme being proposed by the NSF is very faithful to the one proposed in the Telescope Time paper. I read the paper — it is very readable and does a great job of articulating well-known problems in the scientific world, such as the problem of finding reliable referees, the lack of incentive to be a reliable referee (the main reward for which is that you get more refereeing to do), and the problem of allocating scarce resources to competing proposals, where the resource could be telescope time, funding, or presentation time at a conference. It makes lots of points that will resonate with many readers, for example that the ideal of a correct ranking of proposals in order of merit is “at best an over-simplification and at worst a fantasy”. The paper also carefully considers the limitations and shortcomings of the approach. Regarding the main concern noted above, the authors write:
Perhaps the greatest potential concern is that this procedure will drive the allocation process toward conservative mediocrity, since referees will be dissuaded from supporting a long-shot application if they think it will put them out of line with the consensus. This effect would need careful monitoring, but it could also be addressed explicitly in the instructions given to applicants as to the criteria they should apply in their assessments: if they are aware that all applicants have been encouraged to support scientifically-exciting speculative proposals, then they are likely to rank such applications highly themselves to remain in line with the consensus.
The following quote explains some background to the paper:
In his attempts to put off assessing telescope proposals by learning more about the subject, MRM came into contact with this work’s co-author (DGS) who has extensive expertise in both astrophysics and the mathematical theory of voting and its associated complexities.
Finally, here is a link to an attempt I made to address a similar problem, in the context of reviewing papers for conferences.

Sunday, May 12, 2013

A refinement of approximate correlated equilibrium

One reason to prefer correlated equilibrium (CE) over Nash equilibrium (NE) is that correlated equilibria are easier to compute. We have various PPAD/PLS-hardness results for exact Nash equilibria, and it is possible that approximate Nash equilibria are similarly ‘hard’ to compute. In contrast, approximate correlated equilibria are not just easy to compute, but are obtainable by simple ‘learning dynamics’ type algorithms. The price you pay is that CE is a much weaker solution concept: every Nash equilibrium is a correlated equilibrium, and there may be CEs that are far from any NE. For example, some congestion games have CEs whose social cost is much higher than that of the worst NE. Some games have unrealistic Nash equilibria, which motivates the study of various refinements of NE: restrictions that hopefully weed out some or all of the unnatural ones.

Given that CE is a relaxation of NE, there is even more motivation to impose restrictions that get rid of unnatural equilibria. For CE, I reckon that any such restriction should try to chip away at the set of approximate correlated equilibria, via some mathematical condition that removes undesirable ones while preserving the property that the remaining solutions are still easy to find, preferably via simple algorithms. The following definition is derived from the notion of approximate well-supported Nash equilibrium, but the version for correlated equilibrium doesn’t seem to have been considered previously. Here I just give the definition and a motivating example, and leave it as a guess that the restriction makes these equilibria not too much harder to compute.

In a well-supported ε-NE, a player is not allowed to allocate positive probability to a pure strategy whose payoff falls short of the best response by more than ε. (In contrast, an ε-NE that is not well-supported allows a player to play a really bad response, provided that he does so with such low probability that his average payoff does not fall short of the best response by more than ε.) For bimatrix games with payoffs in [0,1], it is currently known how to compute, in polynomial time, an ε-NE for ε just over ⅓; but for well-supported ε-NE, it is only known how to do this for ε slightly less than ⅔.
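The distinction is easy to make operational. Here is a small checker for bimatrix games (a minimal sketch in Python, assuming payoffs in [0,1]; R and C are the row and column players’ payoff matrices):

import numpy as np

def approximation_gaps(R, C, x, y):
    """Returns (eps_ne, eps_ws): the smallest eps for which (x, y) is an
    eps-NE, and the smallest eps for which it is a well-supported eps-NE."""
    pay_rows = R @ y   # row player's payoff for each pure row against y
    pay_cols = x @ C   # column player's payoff for each pure column against x
    eps_ne = max(pay_rows.max() - x @ pay_rows,
                 pay_cols.max() - pay_cols @ y)
    eps_ws = max(max(pay_rows.max() - pay_rows[i] for i in range(len(x)) if x[i] > 0),
                 max(pay_cols.max() - pay_cols[j] for j in range(len(y)) if y[j] > 0))
    return eps_ne, eps_ws

# A dominated row played with probability 0.05: a 0.05-NE, but only
# a 1-well-supported NE.
R = np.array([[1.0, 1.0], [0.0, 0.0]])
C = np.zeros((2, 2))
x, y = np.array([0.95, 0.05]), np.array([0.5, 0.5])
print(approximation_gaps(R, C, x, y))   # approximately (0.05, 1.0)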

Applying that “well-supported” principle to ε-correlated equilibria corresponds to a requirement that when a player observes his recommended action (a sample from his marginal distribution under the approximate CE), he does not gain more than ε by switching to some other pure strategy. Here’s an example of how it makes a difference. We start with a trivial example and then make it more interesting. The trivial example has one player with actions {0,1}, who gets paid 0 to play 0, and 1 to play 1. In a standard ε-approximate CE, he can play 0 with probability up to ε, but if the approximate CE is to be well-supported, he must play 1 with probability 1. To make the example more interesting, suppose you have a sequence of players 1,2,...,n, each with 2 pure strategies 0 and 1, where player 1 is like the original player in the trivial one-player game, and player i>1 gets paid 1 to copy player i−1 and 0 to play the opposite strategy. In a well-supported ε-CE, all players must play 1 with probability 1. By contrast, in a standard ε-CE, the probability that player i+1 plays 1 may differ by up to ε from the corresponding probability for player i. Consequently, the probability that player i plays 1 may fluctuate arbitrarily between 0 and 1 as i increases, subject to the constraint that it changes by at most ε each time i increases by 1. Thus the “well-supported” constraint seems to rule out a large number of outcomes. Whether you think this “well-supported” constraint is a good one depends on whether you think these outcomes really ought to be ruled out...
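Here is the chain example in code, together with one concrete “drifting” distribution: draw a shared threshold L uniformly from {0,...,n}, and let player i play 1 exactly when i < L. (A sketch in Python, with players 0-indexed; this “staircase” distribution is just one way to realise the drifting marginals.)

def chain_payoff(i, prof):
    """Player 0 is paid to play 1; player i>0 is paid to copy player i-1."""
    if i == 0:
        return prof[0]
    return 1 if prof[i] == prof[i - 1] else 0

def ce_gains(dist, n):
    """dist maps pure profiles (tuples of 0/1) to probabilities.  Returns the
    largest switching gain, both unconditional (standard eps-CE) and
    conditional on the recommendation (well-supported eps-CE)."""
    standard, well_supported = 0.0, 0.0
    for i in range(n):
        for s in (0, 1):
            pr_s = sum(p for prof, p in dist.items() if prof[i] == s)
            if pr_s == 0:
                continue
            for t in (0, 1):
                gain = sum(p * (chain_payoff(i, prof[:i] + (t,) + prof[i+1:])
                                - chain_payoff(i, prof))
                           for prof, p in dist.items() if prof[i] == s)
                standard = max(standard, gain)
                well_supported = max(well_supported, gain / pr_s)
    return standard, well_supported

n = 5
staircase = {tuple(1 if i < L else 0 for i in range(n)): 1.0 / (n + 1)
             for L in range(n + 1)}
print(ce_gains(staircase, n))   # approximately (0.167, 1.0), i.e. (1/(n+1), 1)

So the staircase is a standard 1/(n+1)-CE, but it fails to be well-supported for any ε < 1: conditioned on being told to play 0, player 0 always gains 1 by switching.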

Sunday, May 05, 2013

end of lectures?

There’s an interesting discussion in an article in the Guardian about the possible demise of traditional lectures at universities; it was inspired by a speech by Jimmy Wales, reported in more detail here (Jimmy Wales: Boring university lectures ‘are doomed’). Any academic must be tempted to have an opinion on this topic. One thing that I haven’t seen come up in the discussion is the freedom the lecturer gets, in an unrecorded lecture, to take liberties with the subject matter; for example, to be a bit imprecise when discussing a mathematical topic, in a way that helps the audience’s intuition but involves saying something that is not strictly correct. With a textbook, or a recorded lecture, there is pressure to play it safe and avoid the possibly helpful imprecision.

Lectures are often criticised for not being very interactive (although, these days, any reasonable lecturer attempts to be interactive). When a lecture fails to be interactive, it is noted that you might as well watch a video of the lecture (or better yet, some version of it delivered by a great lecturer). What the video loses, however, is the unrealised possibility of interaction: a setting in which an awkward question may be asked is different from one in which it cannot be. At least, that may be why I’ve found recorded lectures to be less interesting than the live version, other things being equal.

Finally, the prophets of lecture-doom need to explain why theatres have not been completely wiped out by movies, and indeed why lectures themselves have not already been replaced by recorded ones (after all, the technology has been around for a while).

Tuesday, March 19, 2013

A real-world path auction (sort of)

Path auctions have received quite a lot of attention in the Algorithmic Game Theory community. The general scenario envisages a single buyer who wants to procure a path through a network, and has to buy a set of links that contains an acceptable path. The links are available for sale from multiple agents; a common special case of interest has each link belonging to a single agent. It’s a line of work that has led to some interesting results and techniques; for example, I like the “cheap labor can be expensive” paradox of this paper by Chen and Karlin, which says that adding extra (and cheap) links/sellers to the network can sometimes drive up the cost to the buyer.

Now, the standard “practical” motivation for path auctions given in the AGT literature usually refers to a requirement to route network traffic (either data over a computer network or, alternatively, road traffic). However, I have not seen any concrete examples of such path auctions actually taking place. But here’s an auction that does take place, in a context that is conceptually very similar to a path auction. Adapting Auctions for the Provision of Ecosystem Services at the Landscape Scale considers payments for Ecosystem Services (ES), in which landowners in Australia are paid to promote conservation goals on their land. Quoting the paper:
Most ES auctions adopt a sealed bid, discriminatory price mechanism, in which successful landholders are paid their bid price (e.g. Stoneham et al., 2003; Windle et al., 2009). A rational landholder will request at least the opportunity cost of providing the ES; they can ask for more, but the higher their price the less likely they are to have their bid accepted.
Now, the environmental gain from ES is greater if parcels of land are adjacent, since the protected land then provides wildlife corridors, and the like. The greater value of certain combinations of land parcels has influenced the mechanism used to set the prices. The mechanism in use proceeds in a number of iterations: bids are received, a set of winners is announced, and based on that (provisional) outcome, sellers get the opportunity to change their bids. Another quote:
A critical problem in corridor formation is individuals not participating, or holding out for excessively high prices. In this form of iterated auction there will be greater opportunity for participants to identify and work around such hold-outs. Where there are different ways of forming a corridor across a landscape, corridors can evolve over multiple rounds according to the bidding behaviour of individual landholders.
The paper reports on lab-based experiments aimed at optimizing the parameters of this kind of mechanism, in particular the number of rounds of the iterated auction. Another paper I found that may be of interest is An Experimental Study of Iterative Auctions for Ecosystem Services Delivery, which experiments with an iterative descending-price mechanism.
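For what it’s worth, here is a toy sketch in Python of the round structure of such an iterated, discriminatory-price procurement auction. The budget, the adjacency bonus, the brute-force winner selection, and the naive 20% price cut by provisional losers are all invented here for illustration, and differ from the actual experimental designs in the papers.

from itertools import combinations

def env_value(selected, bonus=1.0):
    """One unit of value per parcel, plus a bonus per adjacent selected pair
    (parcels are positions along a line, so adjacency means consecutive)."""
    base = len(selected)
    adjacent = sum(1 for p, q in combinations(sorted(selected), 2) if q == p + 1)
    return base + bonus * adjacent

def select_winners(bids, budget):
    """Brute force: the affordable set of parcels of highest environmental value."""
    best, best_val = frozenset(), 0.0
    parcels = list(bids)
    for r in range(1, len(parcels) + 1):
        for combo in combinations(parcels, r):
            if sum(bids[p] for p in combo) <= budget:
                value = env_value(combo)
                if value > best_val:
                    best, best_val = frozenset(combo), value
    return best

bids = {0: 3.0, 1: 9.0, 2: 4.0, 3: 5.0}   # asking prices; parcel 1 holds out
budget = 12.0
for rnd in range(4):
    winners = select_winners(bids, budget)
    print(f"round {rnd}: winners {sorted(winners)}")
    # provisional losers revise their asking prices downwards
    bids = {p: (b if p in winners else 0.8 * b) for p, b in bids.items()}

In this toy run, the hold-out (parcel 1) is initially worked around, with parcels 0, 2 and 3 winning; by the final round its price has dropped enough that the corridor re-forms through parcels 0, 1 and 2.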

Tuesday, February 26, 2013

Elsevier Editorial System

Last time I did a review for an Elsevier journal, I was prompted to set up an account on the Elsevier Editorial System, which aggregated all (or most) of the reviews I had previously done for various Elsevier journals. From now on, it would seem, future reviews for Elsevier journals will get added to a bulging portfolio. While my general understanding is that the main reward for reviewing papers will occur in the afterlife, there’s something rather satisfying about seeing this collection in one place. (And I never knew there were so many Elsevier journals, he adds, naively...)

If there is indeed any value to having a centralised record of (anonymous) reviews, it might provide a new justification for journals to belong to large commercial publishers. On the other hand, review requests lack the personal touch that they had until about 10 years ago. And there’s presumably a risk of their database getting hacked, and reviews getting de-anonymised.

On another topic, here is a list of the top 30 Computer Science blogs. I am not sure I deserve to be listed there, but anyway, some of the others look interesting; check them out. Here is a similar list that I was notified of at about the same time.

Wednesday, January 23, 2013

decem quid pro quo?

I received from a colleague an email bearing the above subject line. It reads (slightly edited):

The Bank of England is considering putting Alan Turing (no need to explain who he is) on the next £10 note. An online petition has been created to collect signatures from people who would like to see this.

So far it has received ~26,000 signatures, and it needs over 100,000 signatures before it will be considered by the design committee.

The petition is here:
https://submissions.epetitions.direct.gov.uk/petitions/31659

Personally, I would love to see this come to fruition, and I am sure many of you would too.

If you do (and you’re British), I would appreciate it if you could take the time to sign the petition.

Let’s hope that this highly influential computer scientist finally gets the wide-reaching recognition he deserves.

I am happy to advertise this!