Monday, December 13, 2010

Another critique of tuition fees

The problem with the Internet is that for nearly all topics, something is available out there that does a better job than one's own efforts could achieve. This new post at the excellent Exquisite Life blog does a great job of criticizing the dire state of higher education policy in the UK, and should be mandatory reading for anyone who wants to engage with the associated debate. The main focus of the article is the failure of the new tuition fees to fix the fiscal problems that supposedly motivated them; it also explains the flaws in the arguments that the fees regime is more “progressive” than the current one.

The Campaign for the Public University (which I also found out about via the Exquisite Life blog) has a collection of articles and resources on the topic.

Sunday, December 12, 2010

Game-theoretic board games

'Tis the season (almost) to sit around the fire playing board games and pretending to enjoy it. I recommend two board games, namely Poleconomy and Apocalypse, that I played a few times when I was a student; these came up in a conversation recently because they have some interesting game-theoretic content. I gave a talk at Microsoft Research (Cambridge) on Thursday, and Yoram Bachrach told me about the ripoff game, which has been played by human volunteers for cash prizes. Each individual game takes about one minute and works as follows. Each player is allocated a number, a fraction in the range [0,1] which is his “weight”. A subset of the players can form a winning team if their weights add up to at least 1. However, the winning team has not won the round until they have agreed on how to share the prize (worth 1 pound). For this purpose, each player gets to control another number in the range [0,1], which is the fraction of the prize that he requests from the winning team. The team only wins (and shares the prize) once those fractions add up to at most 1. Apparently, the players sit in the same room but interact via computers, so the negotiation is somewhat stylized. Anyway, it turns out that the Shapley value is quite a good predictor of the winnings associated with weights (although there is variation from round to round, and some players are better at the game than others). A computational agent was implemented, which computed its Shapley value and then added 10%, and it performed well in competition.
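As an aside, the Shapley value of such a weighted game is easy to estimate by sampling; here is a minimal Monte-Carlo sketch in Python (the weights in the example are made up), using the fact that in a simple game a player's Shapley value is the probability of being pivotal in a uniformly random ordering of the players.

```python
import random

# Monte-Carlo Shapley values for the "ripoff" game: a coalition wins the
# 1-pound prize iff its weights sum to at least 1. In a simple game, a
# player's Shapley value is the probability that he is the pivotal player
# (the one who tips the coalition into winning) in a random ordering.

def shapley(weights, samples=100_000):
    n = len(weights)
    credit = [0.0] * n
    order = list(range(n))
    for _ in range(samples):
        random.shuffle(order)
        total = 0.0
        for player in order:
            total += weights[player]
            if total >= 1:
                credit[player] += 1   # pivotal player in this ordering
                break
    return [c / samples for c in credit]

print(shapley([0.6, 0.5, 0.3, 0.2]))  # rough predicted shares of the prize
```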

This reminded me of an aspect of Poleconomy, a board game that simulates the interactions of politicians who happen to occupy corporate directorships “on the side”. (The game was developed in the early '80s.) From time to time, an “election” takes place, in which the players cast dice to determine the number of “votes” they obtain, and a Government may be formed by any subset of players who happen to have received a majority of the votes between them. There is some advantage to being in Government, so the immediate outcome of an election is a flurry of mental arithmetic and bargaining amongst the players to identify and agree upon a winning subset.

In Apocalypse, a player's move consists of a sequence of attacks. In each attack, the player chooses a number of units with which to attack another player, a whole number in the range 1...6, which is identified by placing a standard die with the chosen number uppermost underneath a cup; the player being attacked tries to guess the number. If the attacked player guesses correctly, that is the number of units lost by the attacking player; otherwise, the attacked player loses a single unit (and the attacking player gains a nuclear weapon, or part thereof, so there is an incentive to make lots of attacks). Thus, each attack is a kind of generalized matching pennies, where in equilibrium the probability of choosing a smaller number is clearly larger than the probability of choosing a larger number, but all probabilities are positive.
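For what it's worth, treating a single attack in isolation as a zero-sum game over the units lost (and ignoring the nuclear-weapon bonus), the indifference conditions pin the equilibrium down exactly; here's a small sketch that works it out.

```python
from fractions import Fraction

# The attacker commits k units (k = 1..6); the defender guesses g. A correct
# guess costs the attacker k units; otherwise the defender loses 1 unit.
# For a defender mix q, the attacker's expected gain from playing k is
# (1 - q_k) - k*q_k = 1 - (k+1)*q_k, so indifference forces q_k proportional
# to 1/(k+1); the symmetric calculation gives the same mix for the attacker.

Z = sum(Fraction(1, k + 1) for k in range(1, 7))
p = {k: Fraction(1, k + 1) / Z for k in range(1, 7)}
for k, prob in p.items():
    print(k, prob)                     # smaller numbers get larger probability
print("attacker's expected gain per attack:", 1 - 1 / Z)   # 83/223 > 0
```

The positive per-attack value is consistent with the incentive to attack often.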

Are there any other board games out there with interesting game-theoretic aspects?

Wednesday, December 01, 2010

CS as STEM

In the Guardian, Stefan Collini imagines an alternative world where the British government is proposing to withdraw all financial support for the teaching of sciences, as opposed to humanities. Meanwhile, a recent flurry of email on the CPHC mailing list1 addresses the status of Computer Science as a STEM field. (That's “Science, Technology, Engineering and Mathematics”.) The background is that these STEM subjects are the ones that in fact benefit from favoritism2 from the Government due to their economic importance.

In more detail, STEM subjects will continue to receive a fee premium from Government. CS is STEM, right? Well, maybe not, if STEM is a shorthand for “HEFCE band A or B”; CS narrowly misses out on band B, as a result of reductions in the cost of computing facilities. The Browne review hints that “priority programmes” are primarily bands A and B, with a bit of representation from band C.

The email was about lobbying for recognition as a STEM subject, or alternatively, as “strategically important and vulnerable (SIV)”. Quoting the UUK report Changes in student choices and graduate employment:
The strategically important and vulnerable (SIV) subjects are chemistry, engineering, mathematics, physics and some area studies, quantitative social science and modern foreign languages (HEFCE, 2010a). They are considered so in relation to the anticipated demands of the economy rather than the exercise of student choice.


While CS is not sufficiently “vulnerable” to qualify as SIV, one email argued that computer programming in particular should perhaps qualify; it gave some anecdotal evidence of a decline of computer programming in schools in the UK, blamed the decline on the teaching of ICT in schools, and contrasted it with a relatively high interest in programming in India. ICT, as currently taught here, tends to evade the interesting technical challenges and is an example of the bad driving out the good.

One lesson we should learn: if we really care about our STEM status, it's not just a matter of lobbying to retain that recognition. It's a matter of ensuring that the intellectual content justifies the claims. We need to keep both programming and rigorous mathematics right at the heart of CS.

1Council of Professors and Heads of Computing. I sometimes consider getting myself demoted so that I don't have to be on the mailing list.

2Relative, not absolute!

Thursday, November 25, 2010

Student protests

It's pretty hard to come up with anything very original to say about the student protests yesterday and earlier. The fact that there is no major political party here that would reverse this increase in university fees leaves me with the sense that the protesters are praying to a god that does not exist. As an atheist, I fully acknowledge the benefits some people derive from praying to gods that do not exist, but I don't think the protests will do any good politically.

This comment article in the Guardian makes most of the points that are worth making. This article in the Daily Mash is, I must admit, a trenchant and effective critique of the protest movement. Any case in favour of "higher education as public good" should take it into account. This blog post calls attention to the unresolved issues with the current proposals. We learn that Nick Clegg is "haunted" by his election promise. (That's good of him. I wish Tony Blair and David Blunkett were haunted by their 2001 manifesto commitment not to introduce tuition fees. Or should we all accept that making and breaking promises is the price of winning an election?)

I actually struggle to make a principled case for HE as public good. It's relatively easy to make a practical case: e.g. that there's a slippery slope -- next we'll expect people to pay for post-16 school education, and (following Dearing's logic), since the main beneficiaries of the NHS are the patients, patients should pay back the costs of their treatment. But the slippery-slope argument evades the question of what's wrong with high tuition fees themselves. The other practical (not principled) case is that the fees make us out-of-step with most of the industrialized world (as I hinted previously). To elaborate on that, one could accept that high tuition fees are correct in principle, but also accept that nations compete amongst each other for academic talent and skills. Hence we risk a new brain drain, not just of staff but of students. Although, that may help with the government's efforts to reduce net immigration.

Sunday, November 07, 2010

Universities, conservatism and Hotelling's Law

I'm not sure what to make of Simon Jenkins' recent article in the Guardian, in which he accuses universities of being lazy, wasteful, addicted to public money, and resistant to any form of change or innovation. He is probably being more harsh than he really feels, in order to provoke a reaction in the comments contributed by readers, and any reader who takes his article at face value will find that the comments that follow are a useful antidote.

Let's take the article seriously for a moment, and consider whether there is any truth in the central charge that conservatism is the driving force behind academic life. Unreasonable resistance to change is so widespread that it would be amazing if universities were free of it. You only have to look at some of the reaction to Obama's health care reforms to verify this: the hysteria, fear and anger of the rhetoric against these highly incremental reforms is evidence of a psychological inability to cope with any form of change and innovation, one that affects tens of millions of Americans.

But there's another reason why universities might not make the changes that Jenkins urges, and that's Hotelling's Law. That is the observation that in a lot of competitive marketplaces, the rival providers (of goods or services) tend to position themselves very close to each other, rather than go for product differentiation. If one provider does decide to aim at a particular section of the market (say, lower-cost goods) then in order to do so, he just offers goods at a very slightly lower cost than his rivals: he does not consider his cost in isolation, but relative to the competition. (As an aside, I'm not sure that hotels obey Hotelling's law, but other things do, e.g. political parties.)
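As a toy illustration (a sketch only; the step EPS is an artificial discretization): two vendors on a line of uniformly spread consumers, each repeatedly relocating to a best response against the other, end up together in the middle.

```python
# Two vendors on [0, 1]; consumers are uniform and buy from the nearer vendor.
# The best response to a rival at position x is to stand just on the side of x
# that captures the bigger share of the line.

EPS = 1e-3  # smallest possible sidestep

def best_response(rival):
    return rival - EPS if rival > 0.5 else rival + EPS

a, b = 0.1, 0.9
for _ in range(1000):       # alternating best responses
    a = best_response(b)
    b = best_response(a)
print(a, b)                 # both hover within EPS of 0.5: no differentiation
```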

In an increasingly global marketplace for staff and students, universities are understandably reluctant to go to extremes in order to capture some segment of that market. Despite Jenkins' call for 2-year degrees and year-round teaching, even if there is indeed a big market for those things, universities should be reluctant to go there. British ones are already taking a fairly extreme position on short degree timescales (most countries have 4 or more years for first degrees). And they are, reluctantly, taking an extreme position on low levels of state support for their activities. They are already well into the danger zone and should go no further.

Sunday, October 31, 2010

complexity and elections

I just read an interesting review article (Using Complexity to Protect Elections, by Piotr Faliszewski, Edith Hemaspaandra, Lane A. Hemaspaandra) in the CACM on the complexity of election schemes, and various "attacks" on them, such as searching for voter manipulation opportunities. The article is aimed at a general computer-science audience, and written in quite a colourful style. It touches on a problem with this line of research, one that has bothered me since I started following its progress:
... perhaps the greatest single worry related to using NP-hardness to protect elections—a worry that applies to NP-hardness results not just about manipulation, but also about control and bribery. That worry is that NP-hardness is a worst-case theory, and it is in concept possible that NP-hard sets may be easily solved on many of their input instances even if P and NP differ.
They go on to mention the theory of average-case complexity, but it seems not to have been applied in this context. Is it ever going to be worth using a highly artificial voting system, just to ensure that a potential manipulator faces a problem that is hard in the worst case, but may generally be easy in practice? Perhaps one should look at other sources of difficulty for the manipulator, such as uncertainty about the other voters' preferences.
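To illustrate what "easy" can look like: for the unweighted Borda rule with a single manipulator (a case known to be polynomial-time solvable), a simple greedy test answers the manipulation question exactly. A sketch, assuming ties are resolved in favour of the manipulator's candidate:

```python
# scores[c] = Borda score of candidate c from all the *other* voters' ballots.
# The manipulator ranks p first; it is then safe (and optimal, by an exchange
# argument) to hand the smallest leftover Borda points to the strongest rivals.

def borda_manipulation(scores, p):
    m = len(scores)
    target = scores[p] + (m - 1)          # p's final score
    rivals = sorted((c for c in scores if c != p),
                    key=lambda c: scores[c], reverse=True)
    # rival i receives i extra points (0 to the strongest, m-2 to the weakest)
    if any(scores[c] + i > target for i, c in enumerate(rivals)):
        return None                       # no manipulating ballot exists
    return [p] + rivals[::-1]             # a successful ballot, best to worst

print(borda_manipulation({'p': 4, 'a': 7, 'b': 6, 'c': 2}, 'p'))
# ['p', 'c', 'b', 'a'] -- final scores: p 7, a 7, b 7, c 4; p wins on ties
```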

Added 1.11.10: Another nice survey, which Ariel Procaccia pointed me to ("AI’s War on Manipulation: Are We Winning?" by Piotr Faliszewski and Ariel D. Procaccia), considers in more detail the problem of NP-hardness being just a worst-case obstacle for the manipulator; it reviews some work giving negative results, i.e. fundamental obstacles to the project of designing voting systems that are typically computationally resistant to manipulation.

Thursday, October 21, 2010

SAGT 2010

Here are some notes from the 2010 Symposium on Algorithmic Game Theory (SAGT), mostly typed up at Athens airport. SAGT had about 60 participants and next year will take place in Amalfi, near Naples. If you want to host it in 2012 then I believe the steering committee would like to hear from you. In the following I mention some of the papers that caught my eye; it is by no means a complete overview, being biased towards my own interests, plus I suspect that "conceptual" papers tend to be more eye-catching than "technical" ones.

A special tutorial session on "Games played in physics", along with one of the regular papers, "Mixing time and stationary expected social welfare of logit dynamics", highlights a line of research that looks like it could go a long way. Given a game, there's an associated Markov chain whose states are pure-strategy profiles, and transitions consist of the selection of a random player, who then updates his strategy in a way that makes better responses more likely than worse ones (although, it is possible for him to switch to a worse response). Specifically, the probability assigned to a response with payoff x is proportional to exp(βx), where the parameter β is the "noise rate": β=0 is entirely noisy and players move at random; as β increases the noise goes down and players prefer better responses. The solution concept for the game is the stationary distribution of the chain. The topic of interest is the mixing rate (a function of β). A connection with physics is that the Glauber dynamics on the Ising model (a topic of enduring interest in the math/CS community) corresponds to a party affiliation game where the players lie on a grid; in the ferromagnetic version you want to agree with your neighbours, and in the antiferromagnetic version you want to disagree with them.
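For concreteness, here's a quick simulation sketch of logit dynamics on the ferromagnetic party affiliation game on a grid (payoff = number of agreeing neighbours; up to a rescaling of β this is the Glauber dynamics just mentioned):

```python
import numpy as np

# Logit dynamics on an n-by-n torus: each player holds a party in {-1, +1},
# and a randomly selected player revises, choosing s with probability
# proportional to exp(beta * payoff(s)), payoff(s) = #neighbours agreeing with s.

def logit_step(spins, beta, rng):
    n = spins.shape[0]
    i, j = rng.integers(n), rng.integers(n)
    neigh = sum(spins[(i + di) % n, (j + dj) % n]
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)))
    # payoff(+1) - payoff(-1) equals the sum of the neighbouring spins
    p_plus = 1.0 / (1.0 + np.exp(-beta * neigh))
    spins[i, j] = 1 if rng.random() < p_plus else -1

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(20, 20))
for _ in range(200_000):               # beta = 0 would be a pure random walk
    logit_step(spins, beta=2.0, rng=rng)
print("magnetisation:", abs(spins.mean()))  # grows towards 1 as consensus forms
```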

2 papers from the "Learning and dynamics" session: one is "On the rate of convergence of Fictitious Play", which I have mentioned earlier. An important point: the lower bounds on convergence rate (for 2-player zero-sum games) apply to convergence to the "correct" strategies, as opposed to convergence to the value of the game. The paper "On learning algorithms for Nash equilibria" forms part of a proposed search for general lower-bound results stating that a broad class of algorithms should fail to find Nash equilibria, even in the 2-player case. They get some negative results for iterative weight-update approaches. They mention that the question of the convergence rate of FP for zero-sum 2-player games to the value of the game is something of a long-standing open question, and seemingly the only convergence rate known is the very slow one that results from the 1951 proof of Julia Robinson.

A cute result: "Responsive lotteries" by Feige and Tennenholtz consider the problem of incentivising someone to reveal their true preference-ranking of a set of items, by awarding one of them to the person, selected from a distribution that is derived from their declared ranking. You have to design the distribution carefully.

Peter Bro Miltersen's talk was about NP-hardness and square-root-sum hardness of testing equilibria for being trembling-hand stable; I like the topic since it relates to the complexity of finding equilibria that are restricted to comply with some equilibrium selection/refinement concept. The focus on the "doomsday game" example was nice, and he gave a nice overview of the 3-player hide-and-seek game of Borgs et al (The myth of the folk theorem).

2 papers on network congestion I will mention: Yonatan Aumann's talk on "Pareto efficiency and approximate Pareto efficiency in routing and load balancing games" started by noting that the concept of Pareto efficiency can be used to distinguish the inefficiency of the Braess paradox network from that of the Pigou network - in the Pigou network, when you move from equilibrium to optimum flow, some but not all agents benefit; in Braess, all benefit. They continue by studying a notion of approximate Pareto efficiency, focussing on parallel links. Then a very nice talk by Martin Macko studies a model of dynamic flows on networks, and shows Braess-paradox-like results... In the model, backlogs may develop at nodes in a flow network, like the build-up of a traffic jam at a junction whose capacity is lower than the rate of incoming traffic. Assuming everyone has up-to-date traffic information, there comes a point in time where some other route is preferred (and possibly, itself becomes overloaded as a result, again not only due to the usage it attracts, but possibly due to a build-up of backlog at some point along it...). And, they get some results characterising the topology of networks that produce Braess-like paradoxes, which are different from the characterisation for the standard Braess paradox.

OK, so I'm a bit of a sucker for cake-cutting... Omer Tamuz ("Truthful fair division") gave a nice talk on a protocol where you get the players to declare their valuations of items to be shared, and then you share them out in an envy-free manner. Of course, you want the sharing rule to incentivise the players to be truthful (at a high level, the set-up is similar to the "responsive lotteries" topic noted above). So, this can be done, but if you want a "super-fair" truthful mechanism, it cannot be deterministic.

Thursday, October 14, 2010

Dearing Report

I took a look, by way of reminder, at a copy of the Dearing Report, which has a lot in common with the Browne Review discussed previously. I reckon that you need to know about Dearing in order to write a sensible analysis of Browne, and to understand the mess we're in and how we got there. They are both reports with recommendations on higher education funding that were commissioned by unpopular Governments shortly before an election that they duly lost. The Dearing Report got a lot of coverage and discussion in the Times Higher at the time, especially in the run-up to its publication in 1997. Reading it now, it seems terribly overtaken by events -- despite being written only 13 years ago, it seems to address a situation a lifetime ago, when the basic assumptions about HE finance were totally different. It can be blamed for playing a part in the process of change and decay that it purported to manage. The underlying narrative of UK higher education funding over the past few decades has been a path of least resistance. It has not been guided by principles; what has happened instead is that certain principles have been retro-fitted to the lazy decisions that have been made.

Dearing's report purported to set the HE funding scene for the next 20 years, but was in practice mainly in the business of short-term fixes rather than an attempt to settle the matter decisively. Its context was a decline in the unit of resource for teaching of about 40% over the previous 20-odd years, this decline having been billed as an "efficiency gain". Universities (some of them) were making noises about charging top-up fees. The report concluded that the sector could manage a further 2% cut over the next 2 years, but that the planned 6.5% cut would lead to a loss of quality. The report had some good stuff: a recognition that international competitiveness was important, and a recommendation that public spending on HE should increase in real terms by 1% a year. It broached the subject of income-contingent repayment of tuition costs, decided that a graduate tax is unworkable, and proposed a scheme for repaying 25% of a student's tuition costs. Obviously it's a shame we did not stick to that figure of 25%; it looks benign by present standards.

Some other stuff that caught my eye: It recommended a target of 45% participation in HE, which for some reason was rounded up to 50% by the subsequent Govt, but wasn't achieved. With regard to the short-term funding gap resulting from student loans coming from Govt, it recommended that national accounting rules should be changed so that the loans could be treated as a Govt asset.

Here's a point made by Dearing whose dread significance was probably not so very apparent at the time. The Dearing commission considered the question of the benefits to society arising from higher education, and concluded that the main beneficiaries are the graduates themselves, due to improved job opportunities. Not a very remarkable or surprising conclusion to reach, but it has allowed that aspect to crowd out all other considerations of the wider social benefit. Participation in HE has been reduced to an act of economic rationality, an act of selfishness.

Tuesday, October 12, 2010

Browne Review

The Browne Review (wikipedia page) came out today and is being widely discussed. Here is a link to one of many news articles. (Added later: this blog post reports on a devastating critique of the Browne report. Added still later: another great critique in the London Review of Books. Added 20.8.11: another critique (by the same writer, Stefan Collini) with more overview of the historic context.)

On the plus side, it does an efficient job of demolishing the ludicrous "graduate tax" idea. Also, it acknowledges that
Other countries are increasing investment in their HEIs and educating more people to higher standards
And the graphic design is striking in a rather retro way. It would have been improved by being embellished with the dirty fingerprints of assorted Labour party politicians, since the previous Government commissioned the report, but although the fingerprints are missing, the following quote serves that purpose:
Students do not pay charges, only graduates do; and then only if they are successful. The system of payments is highly progressive. No one earning under £21,000 will pay anything.

We estimate that only the top 40% of earners on average will pay back all the charges paid on their behalf by the Government upfront; and the 20% of lowest earners will pay less than today. For all students, studying for a degree will be a risk free activity. The return to graduates for studying will be on average around 400%.
Of course, the above is too good to be true. It reeks of Gordon Brownism, making promises too good to be true: no-one pays anything, you get much more back than you pay in later, and Government can print all the money needed to fill the short-term funding gap. There's also some stuff about student charters that looks suitably new-Labourish.

However, the present government seem happy to accept this gift. The headline figure of 7000 pounds per year to study at a UK university is dismaying many people, although not nearly as much as the lack of any limit on the fees that may be charged.

Here's a David Blunkett quote that I found here
This is a complete betrayal by the Liberal Democrats of everything that they have ever said on higher education and of the platform they stood on at the general election. The Tories have already performed a volte-face on their previous policy. This leaves only the Labour party with any credibility on student funding and the future of our great universities ...

The fact that the Labour party introduced university fees in the first place, and commissioned this report, seems to have escaped his attention! And here is the single take-home message of this blog post: Labour is to blame (or, if you like fees, they get the credit). Don't ever forget that. And don't ever forgive people like Blunkett for trying to pass the blame on to his opponents.

What happens next? More specifically, how much will universities charge? Probably there will be a high "sticker price" embellished with a system of discounts and bursaries for students with good exam results. It is tempting to assume that Oxford and Cambridge will gleefully impose very high fees, but they will be reluctant to be seen to be shutting out poorer candidates. Below them, prestigious universities will want to be seen to have a relatively high fee, since university degrees are a Veblen good, but then they will have concerns about being able to attract enough students at the basic sticker price. If high fees do not deter too many UK students, then overseas students may be a casualty, at least if they no longer pay substantially higher fees than home students.

Friday, October 01, 2010

trip to Glasgow

Congratulations to Eric McDermid, who successfully defended his PhD at his viva yesterday at the School of Computing Science, University of Glasgow (myself as external examiner, Patrick Prosser as internal examiner). Title of thesis: A Structural Approach to Matching Problems with Preferences; I thought it was very nice work. I was also fortunate to be able to attend Rob Irving's retirement presentation, which coincidentally took place on the same day.

Thursday, September 23, 2010

GAMES 2010

The games-for-verification crowd seems to be a different community from those of us who do algorithmic game theory (coming from economics). The topic is not covered in the textbook (fair enough I suppose, the book is big enough already). I dropped in on them to give a tutorial on Nash equilibria and computational complexity at the meeting GAMES 2010, the annual workshop of the ESF Networking Programme on Games for Design and Verification. I got the impression that France and Poland are well-represented in that community, although that may just be an artifact of the composition of the network.

Their main open problem is: what is the complexity of solving parity games? (Google: did you mean party games?) Parity games are known to be in both NP and co-NP, and also known to be mildly subexponential (Zwick and Paterson), so they are believed to be in P. Currently there are no proposed algorithms that are conjectured to do the job; apparently there used to be one, but it was recently shown to be exponential in the worst case, for an artificial class of examples. The parity games problem is derived from the problem of checking whether a formula in modal mu-calculus is satisfied by a given Büchi automaton, and I gather that is a poly-time reduction. There is also an exponential-time reduction if instead the formula is in monadic second-order logic. However, the work on parity games seems to take on a life of its own, without the need to refer to model-checking. So for example, I am not sure that the question of finding a mixed equilibrium for a parity game relates to solving a model-checking problem, but it's an interesting topic.
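By way of illustration, here is a sketch of Zielonka's classical recursive algorithm for solving parity games (worst-case exponential, but one of the standard approaches); it assumes every vertex has at least one outgoing edge:

```python
# Vertices are owned by player 0 (Even) or player 1 (Odd) and carry integer
# priorities; player 0 wins a play iff the highest priority occurring
# infinitely often is even. zielonka() returns the two winning regions.

def attractor(vertices, edges, owner, target, player):
    """Vertices from which `player` can force the play into `target`."""
    attr = set(target)
    changed = True
    while changed:
        changed = False
        for v in vertices - attr:
            succ = [w for w in edges[v] if w in vertices]
            if owner[v] == player and any(w in attr for w in succ):
                attr.add(v); changed = True
            elif owner[v] != player and succ and all(w in attr for w in succ):
                attr.add(v); changed = True
    return attr

def zielonka(vertices, edges, owner, priority):
    if not vertices:
        return set(), set()
    p = max(priority[v] for v in vertices)
    player = p % 2                        # the player who likes priority p
    top = {v for v in vertices if priority[v] == p}
    A = attractor(vertices, edges, owner, top, player)
    W = zielonka(vertices - A, edges, owner, priority)
    if not W[1 - player]:                 # opponent wins nowhere in the rest
        return (vertices, set()) if player == 0 else (set(), vertices)
    B = attractor(vertices, edges, owner, W[1 - player], 1 - player)
    W2 = zielonka(vertices - B, edges, owner, priority)
    return (W2[0], W2[1] | B) if player == 0 else (W2[0] | B, W2[1])

# A 2-vertex example: a cycle v0 -> v1 -> v0, both vertices owned by player 0.
V = {0, 1}; E = {0: [1], 1: [0]}; own = {0: 0, 1: 0}; pri = {0: 2, 1: 1}
print(zielonka(V, E, own, pri))   # top priority on the cycle is even: 0 wins
```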

Thursday, September 16, 2010

cap on non-EU migrants

I got an email a few days ago that was circulated to all staff, notifying us that all of a sudden, the university will only be able to employ a very small number of people from outside the EU. It applies to all universities; see this article. It's a big problem for universities that deserves wider attention, but as yet I haven't seen it discussed in the Times Higher. The article linked-to above quotes the chief executive of Universities UK, who spells out the problem clearly:
The proposed cap will be difficult for universities as a significant proportion of the academic workforce is, and always has been, international. In the UK, over 10 per cent of all our academic staff are non-EU nationals and many are working in key subject areas such as science, technology and engineering.

...

The success of the UK’s higher education sector depends on our ability to attract the most highly talented people to work and study here. Anything that diminishes our ability to do this will undermine the quality of what we do and our ability to compete internationally.

This lamentably crude approach to reducing immigration to the UK does not make one optimistic that they will come up with a more sensible way to regulate overseas students. If they end up choking off the flow of overseas students, that would be a complete catastrophe (a short-term one; the impact on staff hiring is a long-term one).

Monday, August 30, 2010

Peer to peer lending

I've been reading about Funding Circle in this article in the Guardian (Funding Circle's home page also contains links to other news articles that have featured it). It's a new internet-based market that allows individuals to lend money to small businesses. So, it is similar to Zopa, which supports lending of money amongst individuals.

I've not tried it myself but am tempted... you just need a few hundred pounds to make available for loans, and your money gets divided amongst 20 different businesses, so as to spread the risk; the claim is then that you make on average about 6% on your savings, as opposed to the 2-3% you currently make in a savings account. Interesting features include: lenders can bid their interest rate, resulting in a trade-off in that if you set a higher rate, it may take longer for your funds to be accepted. You can also set up or join a "circle" (see the help centre) to specialise in which businesses you lend to. If you want your money back, you have to sell your outstanding loans to another lender (I assume that effectively results in a penalty for early withdrawal). The system supports "autobid", which manages the division of your funds amongst distinct businesses, and recycles repayments into new loans so that your funds continue to earn interest. Finally, on the downside, my impression from reading the help page is that the tax treatment of income is not very lenient.

Monday, August 16, 2010

Two shocks to the (British university) system

Two recent news articles caught my attention: this one in yesterday's Observer, and this one in today's Guardian. In different ways, they raise profound questions about how British universities will evolve over the next ten years.

The first of the above articles draws attention to a “recruitment drive” by EU universities, aimed at would-be university students in this country. It may come as news to many such students (or their parents) that they have the option to study for degrees in various other EU countries, where the fees are lower than in the UK, and get tuition in English. The article focuses on Maastricht University, by way of example. It's an attractive option: nice town, English-language tuition in some subjects, strong research. The recruitment drive is claimed to be driven by our own supply-and-demand problem: too many UK students chasing too few places. The interesting question is whether foreign universities can attract students who actually could get places over here. That potentially reverses higher education's role as a net exporter for the UK. Can this be used as an argument for better state support for universities?

The second article (similar one here), entitled London is most cost-effective UK city for students, publishes a league table, the “2010 NatWest Student Living Index”, which evaluates the cost of living for students in 25 cities in the UK, and favours London, surprisingly — the catch is that cost-effectiveness is a function of a student's earning potential in the city, rather than just cost of living. Seemingly, students are now “meant” to be working part-time while studying (at least, in order for the ranking to be meaningful). Of course, many students get support from their parents and so some of them have the option to study full-time, but (as mentioned in both articles) 46% do not. Universities themselves, meanwhile, remain pretty much oblivious to these inequities — if a student fails an exam, you can cut him some slack on the grounds that his grandmother died recently, but you can't make allowances for him having been pulling pints in the local bar every evening. I think the best fix that is likely to be achievable would be for universities to be more flexible about the duration of study. The current rigid three-year schedule is dictated by the (declining) Government funding scheme for students, as devised when full-time study was the standard. We could move towards a more German approach of getting your degree in however long it takes you (and we should accept a higher drop-out rate).

added 17.8.10: A new Guardian article:
A landmark review into university finance is expected to recommend that student loans, now only available to those on full-time courses, are extended to part-time students to cover the fees they must currently pay upfront, the Guardian has learned. Such a move would pave the way for a major change in the way university education is viewed, with a three-year stint in a new city no longer a given.
Part-time study is a way forward, but I wonder whether you need a binary divide between full-time and part-time study, or whether we should be able to allow for study schedules that fall between the two.

Tuesday, August 10, 2010

Fictitious play

Fictitious play is a very simple iterative procedure for finding a Nash equilibrium of a game. Before stating an open problem of interest, let me give a quick summary of how it works (or, you can read the Wikipedia page for a more careful description). It's an iterative process that, if it converges, must converge to a Nash equilibrium, but unfortunately it doesn't always converge. However, it always works for 2-player zero-sum games; more on that below.

Assume you have a 2-player game. To begin with, each player chooses a single pure strategy. Then, both players construct sequences of strategies according to the following rule: At each step, a player considers the sequence chosen by the other player, he supposes that the other player will randomize uniformly over that sequence, and he chooses a best response to that mixed strategy. (So, a player's sequence is treated as a multiset of strategies, one of which is selected uniformly at random.) Technically, we should explain how to choose a best response in the event of a tie (2 equally good responses) - there are various options, but I won't discuss them. Anyway, the chosen best-response is added to a player's sequence, and that's how the sequences get extended.
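Here is a minimal sketch, using simultaneous updates and lowest-index tie-breaking (one of the options just alluded to), applied to rock-paper-scissors:

```python
import numpy as np

# Fictitious play for a bimatrix game (A, B): each player best-responds to the
# empirical mixture of the opponent's past play. Ties broken by lowest index.

def fictitious_play(A, B, steps=10_000, i0=0, j0=0):
    row_counts = np.zeros(A.shape[0])
    col_counts = np.zeros(A.shape[1])
    i, j = i0, j0                        # the initial pure strategies
    for _ in range(steps):
        row_counts[i] += 1
        col_counts[j] += 1
        i = int(np.argmax(A @ (col_counts / col_counts.sum())))
        j = int(np.argmax((row_counts / row_counts.sum()) @ B))
    return row_counts / steps, col_counts / steps

# rock-paper-scissors, zero-sum: frequencies approach (1/3, 1/3, 1/3)
A = np.array([[0, -1, 1], [1, 0, -1], [-1, 1, 0]])
print(fictitious_play(A, -A))
```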

I have recently been pestering various people with the following question: What is the complexity of the following problem:

input: a bimatrix game. For each player, an initial pure strategy.

question: does FP converge when those initial strategies are chosen at the start?

Additional question: If it converges, give the NE that it converges to.

(and just to be clear, this is not the same thing as asking about the complexity of FP itself.)

I recently read one of the early papers on the topic, namely: Robinson, J. (1951) "An Iterative Method of Solving a Game", Annals of Mathematics 54, 296-301. It is a beautiful paper, just 6 pages; I would recommend anyone to read it. It proves the result mentioned above, that for any zero-sum game, the payoffs obtained by the players do indeed converge to the value of the game. (The convergence rate that is implicitly obtained is very slow - maybe it can be improved?) Note, that is not the same thing as showing that the players play the correct strategies after any given number of iterations - a new paper in SAGT (Felix Brandt, Felix Fischer and Paul Harrenstein "On the Rate of Convergence of Fictitious Play") indicates it may take arbitrarily long to find the right strategies, depending on the payoff values.

Another nice question follows up on a paper by Vince Conitzer, "Approximation guarantees for fictitious play" (Allerton 09). This paper studies the quality of approximate equilibria obtained by FP, in a scenario where the number of iterations is less than the number of pure strategies per player. But, if it's applied to an n×n game, and allowed to run for more than n iterations, for all we know it may actually do better than the best currently-known polynomial-time approximation algorithms (which obtain roughly a 1/3-approximation, when all payoffs are in [0,1]). Although, I will pessimistically guess that some troublesome games that haven't yet been identified, have it obtaining just a 1/2-approximation.

Wednesday, July 28, 2010

Mathematical conversation-starters

It comes up, from time to time, in discussions we have with each other. You're chatting with that long-suffering creature, the Intelligent Layperson, and you feel the urge to explain your professional interests to him/her. And, it has been established that n×n chessboards just don't cut it, or even 100×100 chessboards.

It's time to identify some topics that really work - here's one I tried recently. Consider the following quote from the beginning of the paper Minimal Subsidies in Expense Sharing Games by Meir, Bachrach and Rosenschein, to appear in SAGT.
Three private hospitals in a large city plan to purchase an X-ray machine. The standard type of such machines cost $5 million, and can fulfill the needs of up to two hospitals. There is also a more advanced machine which is capable of serving all three hospitals, but it costs $9 million. The hospital managers understand that the right thing to do is to buy the more expensive machine, which will serve all three hospitals and cost less than two standard machines, but cannot agree on how to allocate the cost of the more expensive machine among the hospitals. There will always be a pair of hospitals that (together) need to pay at least $6 million, and would then rather split off and buy the cheaper machine for themselves.
The question you ask your audience is, what will be the outcome of the negotiation between the hospitals? Hopefully, someone will begin by saying that 2 hospitals will share a $5M machine, and with any luck, someone else will suggest that the 3rd hospital will offer to share a $5M machine with one of the first two, and pay more than 50%. At this stage, you are in good shape.

Now, you might object that this has nothing to do with computational complexity, which is sort of true; however you can introduce some later on in the discussion if you feel the urge (non-constant number of hospitals or machines). What makes this a nice mathematical topic is that - assuming your audience start to consider a sequence of offers and counter-offers - it raises the issue of making a proper mathematical model of the negotiation (so, if 2 hospitals make an agreement, is it meant to be binding? Presumably not, if the 3rd one can "attack" it with a more attractive offer. But if it's not binding, how can the process come to an end?) Finally, despite the fact that the problem is ill-posed, there is still a cute answer that is analogous to the answer to this ill-posed problem: when asked what the outcome should be, you say "by symmetry, the hospitals will share a $9M machine". (Actually, don't use the phrase "by symmetry".)
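The source of the trouble, by the way, is that this cost-sharing game has an empty core, and that can be checked mechanically; here is a sketch as a feasibility LP (using scipy):

```python
from scipy.optimize import linprog

# Core of the hospitals' cost game: an allocation (x1, x2, x3) with
# x1 + x2 + x3 = 9 (the advanced machine is bought) such that no pair pays
# more than the 5 it would pay for a standard machine on its own.

A_ub = [[1, 1, 0],    # x1 + x2 <= 5
        [1, 0, 1],    # x1 + x3 <= 5
        [0, 1, 1]]    # x2 + x3 <= 5
b_ub = [5, 5, 5]
res = linprog(c=[0, 0, 0], A_ub=A_ub, b_ub=b_ub,
              A_eq=[[1, 1, 1]], b_eq=[9], bounds=[(0, None)] * 3)
print(res.status)  # 2 = infeasible: adding the pair constraints gives
                   # 2(x1 + x2 + x3) <= 15, i.e. a total of at most 7.5 < 9
```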

Saturday, July 10, 2010

Research funding, pensions

Quite a flurry of genuinely important material for the Times Higher. The story that the REF will be delayed a year to resolve the dispute over "impact" is good news; regardless of whether this aspect of the REF is a good thing, any delay is a win, simply because of the enormous expense (mainly on academics' time) of national research assessment. Lots of comments follow the article, with Philip Moriarty in good form. (added 12.7.10: 2 more interesting links: This article (5.12.09) discusses the "James Ladyman index" and attracted a lot of comments; this one (1.7.10) has a comment by Ladyman)

The outcome of the USS meeting I mentioned previously is that probably the USS will switch to career-average earnings to compute pensions. In an attempt to make that fact sound interesting, let me put it this way: In the future, your pension contributions statement will not state the number of years of service you have accrued, but it will state your pension entitlement as a sum of money. Oh well, if that bores you, maybe you are wiser than I am.

This article reports on a speech by David Willetts on the case for science funding. So, he believes in it, that's a good thing. From the article:
Mr Willetts said he could not talk about spending commitments until the Comprehensive Spending Review is published this autumn, but warned that the UK could not afford to emulate the examples of the US, Canada and France, which had reacted to the recession by spending more on science.
Full marks, at any rate, for admitting that these other countries are raising spending on science.

Finally, sort of a weak forecast: maybe full economic costing of research grants is not going to collapse under the weight of its own stupidity, as I thought it would. Here's the argument. The Government wants to cut spending on universities but at the same time, wants to protect the strong against the weak. Now, one way to do that is to identify specific universities for preferential treatment, and I guess that's sort of what they're doing in Germany, but it's delicate, to say the least. Alternatively, you can identify characteristics of "strong" and show favoritism to universities having those characteristics. And research grants are quite handy, for that purpose. So, artificially inflate the value of all research grants, and the Matthew effect is duly enforced.

Thursday, July 01, 2010

USS changes

The Times Higher is a good magazine to read if you're feeling a bit euphoric, there's an unaccountable spring in your step, and you need to simmer down a bit. Last week's issue raised the alarm about the sustainability of the Universities Superannuation Scheme (USS). The article notes a forthcoming meeting (7th July) between representatives of the University and College Union (UCU) and USS to negotiate changes. The UCU's question-and-answer page on changes to the USS argues that no change is needed. Trouble is, we all know that very few final-salary pension schemes still exist; in the private sector most of them closed to new members during the past 10 years. In a recent blog post, Robert Peston gives an overview of cuts to the benefits from the BBC's pension scheme (which, like the USS and private-sector schemes, is a funded pension scheme). From Peston's post:
Even those on relatively high salaries, who don't expect their earnings to rise, will have to think twice about whether it makes sense to continue with contributions.
This presents scheme managers with a dilemma - they have to keep the deal sweet enough to retain members and attract new ones, a problem that becomes much more urgent when a scheme is in deficit. When that happens, the whole thing is at risk of being sustained by promises that managers cannot ensure they will be able to keep.

To be honest, I think the UCU is obstructing the dose of bad news that will be necessary to ensure the USS's sustainability. One problem that's come up: the USS's 3rd-largest equity holding is BP, with a market value of £693.7M (the list I got that information from is updated only every 3 months, so BP is likely to be lower down by now.) USS has a total of about £22bn in invested capital. (Not that I blame them for owning some BP. It looked like a good bet a few months ago.)

Here's a quote from that UCU question-and-answer page.
Q: Aren't all public sector pension funds going to have to make changes to save money? Why should academics have special treatment?

A: USS is not a public service pension scheme, so it has to meet the same targets as other funded occupational pension schemes in the private sector. Its big advantages over such schemes are that the HE sector as a whole is less likely to go bust than any individual group of companies, so that it can afford to take a longer view. That does not mean that the politics of envy will not give traction to attacks on USS; but that is no good reason for the preemptive cringe that is part of what characterises the stance of the employers' negotiators.

Full marks for emphasizing that the USS is not public-sector, but the usage of "politics of envy" made me cringe. Finally, here's a fragment of the USS investments page to show that in some respects they have impeccable taste.

(It's the Go game that caught my eye.)

Monday, June 28, 2010

Accepted papers for SAGT

The web site for SAGT 2010 (3rd Symposium on Algorithmic Game Theory) now has the list of 28 accepted papers (and a nice facility to show the abstracts). Lots of interesting topics, especially if you're into equilibrium computation. (30.6.10. Hmm, their link to the papers seems to have disappeared. Here it is.)

Friday, June 11, 2010

Competition for rank

It looks like today's the day that my co-author Carmine Ventre gives a talk at ACM-EC on the paper Ranking Games that Have Competitiveness-based Strategies, our take on competition for rank in league tables. At the same time, news that Liverpool University has gone up 8 places to 39th in the Guardian's league table of UK universities feels a bit like the news that BP's share price has gone up by 5% this morning. Historically, the university has performed poorly in the Guardian's league table. Regardless of that, positive progress is always welcome.

As I've sometimes mentioned in talks on the topic, the thing about competition for rank is, that when you start thinking about it, you start to see it absolutely everywhere. The games that we consider in the paper seem to capture a lot of the features of this type of competition, and (so far!) we have positive results, by which I mean polynomial-time algorithms. So what we've got is, a class of games that describe a lot of real-world competitions, but for which Nash equilibria (or at any rate, approximate ones) can be found in polynomial time. Of course, my hope is for more work to appear on this topic.

Sunday, May 16, 2010

Research in an age of austerity

Coincidence: the new British government is a coalition that is united mainly by the belief that the budget deficit should be tackled sooner rather than later. At the same time, the crisis in the Eurozone has led to severe austerity measures in Greece, Portugal and Spain. All of a sudden, it seems like cost-cutting is "in the air"; our own government will no doubt be encouraged in cutting spending by being able to answer its critics "go to Spain and see how much worse things are over there", or perhaps "if we don't cut now, we'll end up like Greece."

I expect research to be in the line of fire; an open question is whether cuts to the research councils' budgets will be "game-changing", i.e. will they have an impact on the way funding is handed out, and on researchers' approach to competing for it? A bit of reform could be something of a silver lining on the cloud.

Let's try to foresee some changes.

Increased importance of EU funding. This is easy, just look at the figures. European Research Council funds have been spiralling up in recent years, just as EPSRC is facing a cut.

Full economic costing of research grants has got to go. In a nutshell, the background is this. About 10 years ago, universities complained to the Government that their research activities were bankrupting them, because research grants did not cover the expenses incurred. It is important to note that this problem did not actually stop them from fighting like rats in a sack over these supposedly inadequate research grants. And the Govt duly boosted Britain's science budget, but... the way the funds were handed out was essentially by doubling the cost of individual grants. The result: heightened competition for grants, and a boom/bust pattern to people's (and universities') research incomes. Reviewers of research proposals are told to ignore all value-for-money considerations, and comment only on the quality of the research, leading to anomalies like successful applicants getting more than 100% of their salary paid by their grants... enough of this!

A simplified research funding regime. I'm entering the wishful-thinking zone here. The rationale is that if you've only got ten quid to dish out to the research community, there's no point making them spend hours poring over impact statements, and peer-reviewing each others' long and highly-technical grant proposals. You should just give it to a subset of universities, or share it equally amongst anyone who can exhibit a decent track-record, or something. The new government is likely to cause a welcome delay to the Research Excellence Framework.

Theoretical research may be well-placed to survive a worst-case scenario. This is appropriate, since a lot of theoretical CS addresses worst-case performance of algorithms. Actually, CS generally may be in better shape than other sciences, since computers are cheap.

And finally, a tongue-in-cheek suggestion: Since London-based academics receive London weightings to their salaries, they end up doing the same work at a higher cost. It makes sense to support academic research (and indeed teaching) in less expensive locations.

Wednesday, May 12, 2010

UK election game

Here's a fun game-theory problem; Martin Gairing helped me find a solution during lunch, which I will add later.

There are 2 political parties and N constituencies; each party wants to win as many of them as possible. Both parties have an amount M of money (to spend on election campaigning) which they split amongst the N constituencies; each constituency is won by the party that allocated it the larger amount of money. A party's payoff is the number of constituencies it wins, so it's a zero-sum game. The problem is to find a Nash equilibrium. You can assume that M is infinitely divisible, or if not, you're allowed to find an approximate solution with error proportional to 1/M.

Note that there is no pure equilibrium; if a party fails to randomize, the other one will be able to narrowly defeat it in nearly all constituencies while allocating no money to the one(s) that it loses.
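A small discretized instance can be solved exactly by linear programming; here's a sketch for N=3 constituencies and M=6 indivisible units, where ties split a constituency, so the payoff matrix below records the margin (wins minus losses), which is strategically equivalent:

```python
import itertools
import numpy as np
from scipy.optimize import linprog

N, M = 3, 6
S = [s for s in itertools.product(range(M + 1), repeat=N) if sum(s) == M]
P = np.array([[sum(np.sign(x - y) for x, y in zip(a, b)) for b in S] for a in S],
             dtype=float)

k = len(S)
# maximize v subject to: p is a distribution and p^T P >= v against every
# opposing pure strategy (the standard LP for a zero-sum game)
c = np.zeros(k + 1); c[-1] = -1.0
A_ub = np.hstack([-P.T, np.ones((k, 1))])
A_eq = np.hstack([np.ones((1, k)), np.zeros((1, 1))])
res = linprog(c, A_ub=A_ub, b_ub=np.zeros(k), A_eq=A_eq, b_eq=[1.0],
              bounds=[(0, None)] * k + [(None, None)])
p, v = res.x[:k], res.x[-1]
print("game value:", round(v, 6))       # 0, by symmetry
for s, q in zip(S, p):
    if q > 1e-6:
        print(s, round(q, 3))           # the support of an optimal mix
```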

Obviously lots of generalizations are possible...

Friday, April 30, 2010

Experiments with envy-prone cake cutting

Suppose that 3 people share a cake as follows. Player 1 cuts it into 3 pieces, then player 2 takes a piece, then player 3 takes one, then player 1 takes the remaining one. Player 1 can avoid envy by cutting into pieces that he values equally, player 2 can avoid envy since he gets first choice, but the worst case for player 3 is terrible - he may see player 2 walk off with a piece that he (player 3) deems to contain all the value in the entire cake. Consider the following "fix": after player 3 has taken a piece, he has the right to combine it with player 2's and challenge him to cut-and-choose. In this case, player 1 can end up envious: he can ensure his own piece is worth 1/3 of the total, but he may watch as one of the other players ends up with a share that he values as 2/3. Indeed, player 2 can also end up envious of player 1, under this scheme.

An obvious way to quantify the envy of a cake-cutting scheme is to say that each player values the cake at 1 unit, and look for the worst-case envy of any player, i.e. the maximum amount by which he may be forced to value another player's share more than his own. The first scheme above has an envy of 1 in the worst case; the second version has 1/3.

We can analyze schemes in an ad-hoc manner to figure out their worst-case envy, but can we do it automatically? For an undergraduate programming project I got the student (Amir Niknafs-Kermani) to try doing this as follows. Model the cake as a unit line segment. Represent a player's value function by a set of N (typically about 1000) points on the line, each worth 1/N. Generate, for each player, an initially random value function, run the scheme on them, and measure the level of envy (which is typically zero or very low for a sensible scheme, since these value functions all more or less agree with each other). Then, try to tweak the value functions by adding and removing points that define those functions. The hope is that points get added in "sensitive" regions of the line, so that when you re-run the scheme, the envy goes up. Repeat until you don't manage to raise it further, and you've got a lower bound for the worst-case envy.
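Here is a rough, much-simplified sketch of that kind of experiment, for the first 3-player scheme above (the actual project differed in details, e.g. how the value functions get perturbed):

```python
import random

# Cake = [0,1]; player i's value function is a set of N equal-weight points.
# Scheme: player 1 cuts at his own 1/3- and 2/3-value quantiles; player 2 then
# player 3 pick their favourite remaining piece; player 1 takes the last piece.

N = 300

def value(points, piece):
    lo, hi = piece
    return sum(lo <= p < hi for p in points) / len(points)

def envy_of_scheme(vals):
    q = sorted(vals[0])
    pieces = [(0.0, q[N // 3]), (q[N // 3], q[2 * N // 3]), (q[2 * N // 3], 1.0)]
    free, owner = {0, 1, 2}, {}
    for player in (1, 2):                      # players 2 and 3 choose greedily
        pick = max(free, key=lambda j: value(vals[player], pieces[j]))
        owner[player] = pick; free.discard(pick)
    owner[0] = free.pop()                      # player 1 gets what's left
    return max(value(vals[i], pieces[owner[j]]) - value(vals[i], pieces[owner[i]])
               for i in range(3) for j in range(3))

vals = [[random.random() for _ in range(N)] for _ in range(3)]
best = envy_of_scheme(vals)
for _ in range(3000):                          # hill-climb on the value functions
    i, j = random.randrange(3), random.randrange(N)
    old = vals[i][j]; vals[i][j] = random.random()
    e = envy_of_scheme(vals)
    if e >= best: best = e
    else: vals[i][j] = old
print("lower bound on worst-case envy:", round(best, 3))
```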

And the results of the actual project, briefly, are that it works pretty well, at least for simple schemes. The envy level that is found by the algorithm varies quite a lot (depending on the initial random choice of value functions) but is often close to the theoretical limit. But the performance gets worse with larger numbers of players.

Since it is unknown whether there is a 4-player envy-free scheme with a finite number of cuts, perhaps it is interesting to study the above "level of envy" measure, and how fast it can go down towards zero, as a function of the number of cuts in specific types of schemes.

Sunday, April 25, 2010

On May 6th, I'm voting for France

The United Kingdom is no longer capable of governing itself. Labour is simply awful, and the Tories are incompetent. The Liberal Democrats meanwhile, are trying as hard as they can to position themselves in the exact centre of gravity of the other two.

Luckily, help is at hand. A cursory glance at history will convince anyone that every nation-state harbours the ambition to conquer and swallow up its rivals. What the UK needs to do is to select another country, and invite them to take us over. For administrative purposes, it may be necessary to declare war on them and promptly surrender, but that is a technical detail -- the main challenge is to choose the best country.

I have compiled a list of what I consider to be the strongest candidates, arranged in descending order of my own assessment of their merits. I discuss the case to be made for each one, and some of the possible objections.

France. France has a natural geographic advantage, which is likely to be the cause of the extensive history of takeover bids that have occurred in both directions over the last thousand years. In 2009, for the fourth year running, France headed the International Living Quality of Life league table. A short excerpt from that web site gives some of the explanation:
The French believe that every day is a pleasure to be slowly savored—and lingering at the dinner table for three hours in conversation isn’t considered abnormal. Family, friends, and good food are all vitally important to the French—and so is having enough time to appreciate them all.
Sounds good to me! On the downside, we would have to speak their language. However, since any red-blooded Englishman knows that he is naturally gifted at driving, golf, and speaking French, this should not be a major problem.

While academic salaries are somewhat lower in France than in the UK, this is undoubtedly offset by a lower cost of living. Finally, while French academics are apt to complain extensively about their Government's higher education policy, I suspect that is just Gallic bloody-mindedness.

The United States of America. For the better part of a century, the USA has served as a beacon to would-be migrants all over the world, especially academics. Furthermore, if the UK became a part of the USA, we would gain a strong sense of destiny having been fulfilled. The USA's federal structure may ease the administrative process of assimilation, since the UK could become a new state of the union. Readers of the Daily Telegraph and the Daily Mail would be delighted to revert to feet and inches, gallons and degrees Fahrenheit. Finally, anything that reduces the inconvenience of transatlantic air travel is surely to be welcomed.

The downside (based on conversations with many Americans) is that the USA may decide to cherry-pick Scotland and Ireland (and possibly Wales) but leave England hanging out to dry.

Germany. While Germany is a strong candidate, most of the arguments in favour apply in at least equal measure to France. The language is said to be easier, being adequately supplied with consonants. The main downside is that it would make nonsense of our highly cherished World War 2 movies.

China. There are some significant cultural synergies from a takeover by China. On the one hand you have a communist country with a capitalist work ethic, on the other hand you have a capitalist country with a communist work ethic. The Chinese "one-child" policy would not be popular here, but it would do wonders for primary school class sizes.

India. The great thing about soliciting a takeover by India is that they may just possibly agree to do it. They might make the mistake of viewing it as a status symbol to be in charge of the UK.

Saturday, April 24, 2010

3 papers for my reading list

Last week I was at the EPSRC Symposium Workshop on Game theory for finance, social and biological sciences at Warwick University, an event that lived up to its multidisciplinary title; here are 3 papers that people told me about, which I hereby undertake to try to read.

"On the rate of convergence of fictitious play" by Brandt, Fischer and Harrenstein, available from Felix Brandt's web site, actually qualifies to be mentioned here at Noam Nisan's blog, where he put out a call for recommendations for as-yet-unpublished algorithmic game theory papers. The paper explains clearly what is meant by Fictitious Play (which was a prominent topic at the above workshop) and for types of game for which is known to converge (to Nash equilibrium), shows it takes exponential time. Surely a very timely set of results, in view of recent related results in CS about the convergence rate of various dynamic processes in game-theoretic settings.

"Fairness and Desert in Tournaments" by Gill and Stone, available here at David Gill's web site, I find interesting since it relates to my (and 3 co-authors) paper on competition for placement in a ranking, to appear in EC. The word "tournament" is not being used in quite the same sense as in social choice (a mechanism to pick a winner using a program of pairwise contests); it's (related) meaning is a competition where one player gets a prize, but there is quantified value to the prize, as well as cost for alternative levels of effort that go in to winning it. The emphasis is on the impact of players' sense that they deserve to win or lose, something that is modeled mathematically.

"Chaos in learning a simple two-person game" by Sato, Akiyama and Farmer, available here, in fact focuses on rock-paper-scissors, and shows that a particular learning process may exhibit chaotic behaviour, perhaps because if the behaviour was predictable, then a player could predict it and take advantage of what is about to happen.

Friday, April 23, 2010

impact of the election (or vice versa)

I received another email update yesterday on the campaign against assessing research by impact; here is a link to the last post I made on the previous update. Rather than quote the whole thing, here is a quick summary: it is optimistic about progress; most concretely, it points out that HEFCE is ready to delay the REF by another year or two to work through the debate on the topic (story in Times Higher) - that is itself a good achievement even if they don't back down. Both the Conservatives and the Lib Dems are skeptical about the usage of impact statements for research funding.

By the way, this article by Simon Jenkins contributes the following great quote to the debate:
This dirigisme reached its logical conclusion when Lord Mandelson took universities into his "business, innovation and skills" department, and rendered their planning a matter of political infallibility. Last year, many universities lost the will to live when he demanded a measure of every scholar's "contribution to demonstrable economic and social impacts", with reference to "public policy, cultural impact and improving the quality of life". It was a Leninist parody.

Monday, April 12, 2010

email addresses with/without academic domain names

Most academic colleagues use email addresses that end with .edu or .ac.uk, or related endings for other countries. But some (more often, younger people) use gmail, for example. I suspect that this is a mistake, and that academic domain names are a genuinely good thing, simply because they help to certify the identity of the sender. I have a gmail address but I only use it for personal, not professional, correspondence.

If someone emails me who is interested in a job or studentship with my research group, I give the message more credibility if it comes from an academic address. (Is it wrong to do so? I try not to be biased in how I respond to the message... but there is certainly a bigger risk that it looks, at first sight, like spam.) Or, if someone writes me a reference on behalf of someone else, the fact that it originates from a sender with a university email address is as good as a signature on paper - better, in fact, since I don't recognize most people's signatures.

Is the above opinion sensible, or old-fashioned (or possibly, both)?

Wednesday, March 31, 2010

2 new faculty posts at Liverpool, UK

Having just sent the ad to various mailing lists, I may as well post it on the blog too...

THE UNIVERSITY OF LIVERPOOL -- DEPARTMENT OF COMPUTER SCIENCE
LECTURESHIPS IN ECONOMICS AND COMPUTATION (2 POSTS)

The Department was ranked in the top 10 UK Computer Science departments in RAE2008, and building on this success, we seek to significantly expand current research around existing strengths at the intersection of economics/game theory and computer science. You will join Professor Paul Goldberg, Dr Piotr Krysta, Dr Martin Gairing and Dr Rahul Savani in our research group on Economics and Computation. The group carries out research in the computational foundations of economics and game theory and enjoys close collaborative links with other strong research groups in the Department. You will be expected to contribute to our MSc in Computation and Game Theory.

You should have a PhD in Computer Science or related discipline and an excellent track record of research at the intersection of computer science and economics/e-commerce/game theory.

Relevant topics of interest include (but are not restricted to):

  • algorithmic game theory;
  • e-commerce;
  • mechanism design and auction theory;
  • complexity and computation of solution concepts;
  • optimisation problems in economics;
  • computational social choice.

salary in range GBP 36,715 - GBP 41,323 pa; Job Ref: A-570581; Closing Date: 30 April 2010

Further details

online applications/job description

For informal discussions please contact Prof Paul Goldberg, Head of Group (mailto:goldberg@liverpool.ac.uk).

For full details, or to request an application pack, visit here
e-mail jobs@liv.ac.uk
tel +44 151 794 2210 (24 hr answerphone).

Thursday, March 11, 2010

Dagstuhl 10101

I am at Dagstuhl seminar 10101 (schedule) on computational social choice; it ends tomorrow. Much of the chatter in the evenings has been about the AAAI conference, due to author feedback on reviews being made available a couple of days ago; to a lesser extent, about the ACM-EC accepted papers, announced at around the same time. I will not try to do a complete overview; let me try a more vignette-like approach.

A nice talk by Jérôme Lang was in the context of conducting a collection of yes/no votes where there are preferential dependencies among the issues, meaning that a voter's support for one issue may depend on the outcome of the vote on one or more of the other issues. The example used was voting to build a swimming pool and voting to build a tennis court, where some voters would like one or the other, but not both (too expensive). Each voter is represented by a preference relation on the 4 possible outcomes (for n yes/no issues, he uses a more concise representation, "CP-nets"). The question is, how hard is it for the "chair" (who chooses the order in which the issues are voted on) to control the outcome? Commonly NP-hard, which is taken to be good news, although it is noted that this raises the question of easier manipulation in typical or average cases. That issue is analogous to the hardness of a voter choosing a strategic vote (a ranking of the candidates that is not his true ranking) so as to get a better outcome. While that is NP-hard for some voting schemes, it may often be easy in practical settings. Indeed, Edith Hemaspaandra's talk was about polynomial-time manipulability when preferences are single-peaked over the candidates.

The rump session (not covered in the above schedule) was 11 talks each of 5 minutes, really aimed at stating problems where no results have been obtained. I gave an introduction to the "chairman's paradox" (to ask about related computational issues) -- it was identified by Farquharson in 1969 and goes as follows. Say you have a committee of 3 voters {1,2,3} who have to choose one of 3 outcomes {A,B,C}. Let voter 1 be the "chair" and voters 2 and 3 be the ordinary members. The special role of the chair is that if all 3 outcomes get one vote each, then the one supported by the chair is the winner. The paradox is that if the voters' preference lists are generated at random, and your solution concept is pure Nash equilibrium, then the chair gets what he wants less often than the other members. For example, consider the (Condorcet cycle) preferences where 1 has preferences ABC (in descending order), 2 has preferences BCA and 3 has preferences CAB. Then, voter 2 will vote for C since that results in C rather than A winning (2 and 3 supporting C). Voters 1 and 3 continue to support A and C respectively, having no incentive to switch. That is the only pure Nash equilibrium that results from iterative removal of weakly dominated strategies, and notice that 1 (the chair) gets his worst outcome C. I will note that the last time I had to chair a committee of this nature, I felt disadvantaged, although not quite for this reason. The session contained an interesting talk by Kóczy on a method for ranking economics journals (someone has to do it I suppose; see this article on the ranking obsession factor). Finally I should surely mention Felix Brandt's entertaining talk on the "kicker theorem".
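Here is a quick brute-force script (mine, not Farquharson's) that mechanises the example: it iteratively deletes weakly dominated votes, simultaneously for all three voters, and confirms that the surviving profile is (A, C, C), electing the chair's worst outcome.

```python
# The chairman's paradox by brute force: iterated elimination of
# weakly dominated strategies in 3-voter plurality, chair breaks ties.
from itertools import product

PREFS = {1: "ABC", 2: "BCA", 3: "CAB"}    # best-to-worst; voter 1 is the chair

def winner(votes):                         # votes: dict voter -> candidate
    tally = {c: sum(1 for v in votes.values() if v == c) for c in "ABC"}
    top = max(tally.values())
    leaders = [c for c in "ABC" if tally[c] == top]
    # with 3 votes, any tie is 3-way, so the chair's vote is always a leader
    return leaders[0] if len(leaders) == 1 else votes[1]

def util(voter, cand):
    return 2 - PREFS[voter].index(cand)    # 2 best, 1 middle, 0 worst

def weakly_dominated(voter, strategies):
    """Votes of `voter` weakly dominated by another surviving vote."""
    others = [u for u in (1, 2, 3) if u != voter]
    out = set()
    for s in strategies[voter]:
        for s2 in strategies[voter]:
            if s2 == s:
                continue
            diffs = []
            for opp in product(*(strategies[u] for u in others)):
                prof = dict(zip(others, opp))
                diffs.append(util(voter, winner({**prof, voter: s2}))
                             - util(voter, winner({**prof, voter: s})))
            if all(d >= 0 for d in diffs) and any(d > 0 for d in diffs):
                out.add(s)
                break
    return out

strategies = {v: set("ABC") for v in (1, 2, 3)}
changed = True
while changed:
    removals = {v: weakly_dominated(v, strategies) for v in (1, 2, 3)}
    changed = any(removals.values())
    for v in (1, 2, 3):
        strategies[v] -= removals[v]

print(strategies)   # {1: {'A'}, 2: {'C'}, 3: {'C'}}
print(winner({v: next(iter(strategies[v])) for v in (1, 2, 3)}))  # 'C'
```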

Edith Elkind's talk on plurality voting with abstentions was related to the above in using pure Nash equilibrium as the solution concept with a bunch of voters and alternatives (not just 3). An interesting open problem she raised is: Suppose you have sets of voters and alternatives, and for each voter/alternative pair there's a numerical utility of that outcome to that voter. Assume that in the event of a tie, the voter's utility is the average (furthermore, let's assume that all numbers and averages are distinct). Suppose voters cast their votes in some prescribed order, and consider the solution concept of subgame perfect equilibrium. What is the complexity of computing their votes? (On a less formal note, Aviv Zohar told me about a "taxicab game" he had played rather poorly, in which a sequence of (say) 10 people board a minibus with 11 seats, and you aim to end up next to the vacant seat, or failing that, maybe some seats/neighbours are better than others. OK, it needs to be specified more precisely.)
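To make the open problem concrete, here is a brute-force backward-induction sketch (my formulation, with made-up utility numbers chosen so that single values and tie-averages are all distinct, as the problem assumes). It computes the subgame perfect outcome by exhaustive recursion, which is exponential in the number of voters; the question is whether one can do better.

```python
# Sequential plurality voting, solved by backward induction (brute force).
from functools import lru_cache

# hypothetical utilities, UTIL[voter][candidate]
UTIL = [(3.0, 1.7, 0.6),
        (1.1, 3.2, 2.3),
        (2.4, 0.7, 3.1)]
N, M = len(UTIL), len(UTIL[0])

def payoff(votes, voter):
    tally = [votes.count(c) for c in range(M)]
    top = max(tally)
    leaders = [c for c in range(M) if tally[c] == top]
    return sum(UTIL[voter][c] for c in leaders) / len(leaders)  # tie: average

@lru_cache(maxsize=None)
def spe(votes):
    """Terminal vote profile reached under subgame perfect play after `votes`."""
    v = len(votes)                 # it is voter v's turn to vote
    if v == N:
        return votes
    # voter v picks the vote whose induced final outcome he likes best
    return max((spe(votes + (c,)) for c in range(M)),
               key=lambda final: payoff(final, v))

outcome = spe(())
print(outcome, [payoff(outcome, v) for v in range(N)])
```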

Here is another recent blog post on computational social choice, just to prove I have been paying attention.

Friday, March 05, 2010

inaugural lecture



The illustration is from one of my slides - it is intended to give an idea of how to make a hard instance of 2D-SPERNER. I must thank my colleagues in the EcCo research group for the talks they gave in the morning, and also Troels Bjerre Sorensen, David Manlove, Kousha Etessami and Bernhard von Stengel for visiting and giving very nice talks in the afternoon, as well as the other visitors and everyone who attended the talks.

Wednesday, February 24, 2010

Writing references

At Cambridge, Mary Beard blogged recently about the reference-writing burden. Here's a quote:
evaluating students, ex-students and colleagues is an important part of my job; I'm not complaining about being asked to do it (so no need to feel remotely guilty about asking me) -- I'm complaining about the cumbersome, inefficient and sometimes downright obstructive infrastructure.

Regarding academic references (for students applying for postgraduate study), Beard notes:
To save themselves money and to maximise your irritation, many departments now have feeble, barely secure systems where you hand the reference back to the student in an envelope, signed across the seal and then covered with sellotape.


From recent experience, having produced a bunch of references for students applying for MSc study, I can reveal that Cambridge is by far the worst offender in this respect. At York and Edinburgh, they email you a URL, you go there, and upload the reference in PDF. At Oxford, it's a bit worse: they email you a URL, username and password, you log in and have to provide details of your contact info, affiliation and next of kin (I exaggerate slightly), which gets checked by "inspector" software, and then you finally get to upload a PDF. At Cambridge, the student has to come by your office with a reference form in triplicate (you don't often get to use that word these days) and you have to print and sign three copies of the reference (one for each reference form), then go through all the amateur cloak-and-dagger stuff with the signature and sellotape.

The point is, I guess, that Cambridge (and to some extent Oxford) are the only universities that can afford to be so obstructive to potential customers. The trouble is, they are mainly wasting the time of the referees, not just the students, and referees are ethically obliged to cooperate with whatever stupid system is being used. But I will complain to them about their system, and report here on any response I receive.

Monday, February 08, 2010

Forthcoming inaugural lecture



My inaugural lecture (on Computational Game Theory) is on the 2nd of March; I'm currently working on the slides. Here is the schedule of a one-day mini-workshop we are having on topics in computational game theory, to take place alongside the lecture. Here is an electronic invitation, to which anyone reading this is most welcome. Here is a link to the university's web site on the current series of inaugural lectures, which includes a registration facility for anyone who wants to attend.

Thursday, February 04, 2010

mainly about cuts

I went to a university staff meeting today where our Vice-chancellor began with a presentation on funding cuts, and after he was done, I asked him which way we should all vote in the next election. It's the kind of question you ask when you've just finished marking 40 exam scripts. His response touched on the fact that the USA, Japan, Germany and France are all spending more rather than less on higher education, as part of their fiscal stimulus packages. I knew about the USA, but it's great to hear that all these other places also regard universities as part of the solution and not the problem. (added a day later): This beautifully-written article in the Guardian is a must-read for anyone who is interested in this topic!

The VC also noted that the funding formula now allocates more weight than it previously did to research that received the highest RAE rating (4*). It is starting to look like it is wrong to refer to "3* and 4* research" in the same breath; in reality we should all be chasing after 4* research, and disregarding anything less.

Back to exam marking - no-one likes doing it, but it's nice when someone gets everything more or less correct, and you think, hey, I really got through to this guy.

On Facebook, Ulle Endriss called attention to this web site about the closure of the Group of Logic, Language and Computation at Kings College London. I would guess that the people being laid off would not get so much sympathy in the wider community, at a time when unemployment is still rising in the USA, and is still very high over here. It's a reminder that universities are not in the public sector, and are all in thrall to fiscal constraints.

Finally, a definition: meritocracy: government by people with a powerful sense of entitlement.

Thursday, January 28, 2010

New MSc course

We have started advertising our new MSc degree course in Computation and Game Theory. (I am hoping it will not be confused with computer games; the summary below should help.) Here are details on the department's web site. Here is the link on the university's web site for applying to join the course.

The following is a 100-word summary that should appear in findamasters:
Success stories of Internet giants like Google have generated great interest in new techniques for e-commerce. New career opportunities are emerging that exploit the rapidly expanding research area at the intersection of economics and computer science, both in research and in commercial development. The MSc in Computation and Game Theory program aims to provide students with a broad understanding of current issues and a specialist qualification in this field. The program covers a number of foundational theoretical areas, including cutting-edge modules such as algorithmic mechanism design, and modern applications such as Google's sponsored search auctions.
This is of course the general topic of the Economics and Computation Research Group.

Piotr Krysta is the main contact for details.

Tuesday, January 19, 2010

Tories to review HEFCE's plans on impact

Following on from my earlier post, the latest email in the series is entitled REF campaign update - some good news. Apart from the news in the title of this post, scientists are also being invited (see below) to submit feedback on this to the Science and Technology Select Committee.

REF campaign update:

I wanted to write to you to update you with some good news, showing that our lobbying and the publicity we have generated over the ‘impact’ campaign is having an effect. Please see below:

Tories call for REF to be shelved:

David Willetts announced last week that if elected, the Tories would shelve HEFCE’s plans on impact until the completion of a two year review. The shadow minister for higher education, David Willetts, said he would delay proposals that would force 25% of future research to be assessed on 'economic impacts' by two years in order to listen to the concerns of the academic community. The news comes just a week after a UCU poll of top professors revealed that over a third (35%) would consider pursuing their academic career abroad if the plans were introduced. Read more here: http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=409991&c=2 This is a measure of the pressure we are building up on political parties and HEFCE and a testament to your support, so thank you again.

Calling all scientists! – Science and Technology Committee calls inquiry on funding and ‘impact’

Following UCU lobbying, the Science and Technology Select Committee has announced an inquiry into Science funding and one aspect the committee will be particularly interested in is the proposals for ‘impact’. The Committee is interested in “what evidence there is on the feasibility or effectiveness of estimating the economic impact of research, both from a historical perspective (for QR funding) and looking to the future (for Research Council grants)”. If you are a researcher in the sciences, this is your chance to speak directly to the politicians by making an individual or group submission.

How to send a submission:
  • Focus on the feasibility of an impact measure in your field – can you measure impact over the short term? What would it do to pure science and basic research?
  • Keep your submission to a maximum of 3000 words and put it in Word format (no later than 2003) and number your paragraphs
  • The deadline for submissions is 27 January so time is short.
  • Don’t leave it to others! With the disappearance of a committee that specifically represents universities it’s harder than ever for the academic voice to be heard. Volume of responses will be important. If you have something to say, now’s the time to say it!

Please send your submissions by e-mail to scitechcom@parliament.uk and marked "Research funding cuts" and please send a copy to us too at jwhite@ucu.org.uk. An additional paper copy should be sent to: The Clerk, Science and Technology Committee House of Commons, 7 Millbank, London SW1P 3JA.

You can read full details of the Inquiry’s remit here: http://www.parliament.uk/parliamentary_committees/science_technology.cfm

Best wishes

Jonathan

Jonathan White

Deputy Head of Campaigns

UCU

Saturday, January 16, 2010

Surds and the pursuit of happiness

My son Arthur mentioned that he had learned about surds in maths, so we asked him what one was. The definition I vaguely recalled from my school days was that a surd is an irrational number, but of course that's not the whole story, since it would seem that not every irrational number is a surd. Arthur did not know an exact definition, and it would seem that no-one else has tried very hard to pin it down precisely. Higher GCSE Mathematics for Edexcel by Alan Smith, p492, states:
Some quantities in mathematics can only be written exactly using a square root symbol.

For example, if x² = 5, then the exact value of x is √5 (or -√5).

Quantities like these, written using roots, are called surds.

Based on discussions and exercises on the following pages, it appears that a number like 1+√2 is a "surd expression" rather than just a surd, but it was not ruled out as a legitimate surd either. The book gave no hint about whether, for example, the cube (as opposed to square) root of 2 is a surd.

Other sources are similarly imprecise. Wikipedia indicates that a surd is an N-th root (presumably, an N-th root of a positive integer, where N is also a positive integer). It says here that
An unresolved root, especially one using the radical symbol, is often referred to as a surd.

Based on the usage of the word in that web page (which also explains its origin) it looks like it's supposed to be a (real-valued) positive integer root of a positive integer.

This web page states the most restrictive definition: "A surd is a square root which cannot be reduced to a whole number." Presumably they mean: a square root of a positive integer, and not a number like √(9/4) = 3/2. Wiktionary says: "An irrational number, especially one expressed using the √ symbol." (which would appear to allow 1+√2).

With a view to inducting my sons into the family trade, I thought that it would be a worthwhile mathematical exercise to discuss what the right definition should be. (The definition itself will not be interesting mathematically, but the pursuit of one is of great value; by analogy, the chap who coined the phrase "Life, liberty and the pursuit of happiness" clearly figured out that the pursuit of happiness, rather than happiness itself, was the point.) It's a topic that touches on all sorts of issues, such as which, if any, of the alternative definitions are equivalent, and why. More fundamentally, it addresses the issue of what constitutes a genuine mathematical definition, as opposed to some general guidelines on usage. Finally, the alternative definitions have various different merits, such as whether the resulting set of numbers is closed under addition (see the example below). In the event the discussions did not get very far, but it looks like a good one to have in high school math lessons.
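To give the flavour of the closure question (my example, not from the book): (√2 + √3)² = 2 + 2√6 + 3 = 5 + 2√6, so √2 + √3 = √(5 + 2√6). Hence the sum of two surds fails to be a surd under the narrowest definition (a square root of a non-square positive integer), but qualifies under any definition generous enough to admit nested radicals.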

(added later: Mark Jerrum pointed out this link on mathematical terminology; in the case of surds, it contains more historical detail than wikipedia's page.)

Thursday, January 07, 2010

UCU takes the gloves off?

This post follows on from two previous posts quoting email updates on the UCU's campaign against the proposed ways that UK academic research will be measured in the REF. Again I quote the entire thing below - it has some useful links, and it is noteworthy in highlighting the risk of a renewed brain drain if the proposals go ahead.

One thing you learn from the study of bargaining and negotiation from a computational perspective is that to make the case for a particular price, you need to appeal to the marketplace. In selling a house, it is no good to say to a buyer "you should pay me more because my house is worth more than your offer". You must say "you should pay me more because some other potential buyer would pay me more". Likewise, I believe that a researcher who is threatened with financial and reputational penalties if he refuses to bend to the Government's agenda is possibly mistaken to focus on explaining that pure research is valuable. Rather, he should say "There are other buyers out there for the services I prefer to sell".


Dear colleagues,

UCU poll shows one third of professors considering leaving the country if impact pushed through:

I am just emailing to update you on recent progress in UCU’s campaign against HEFCE’s ‘impact’ proposals. The REF campaign hit the press in a big way today as three of the broadsheets feature a UCU poll showing that more than one third of professors would consider pursuing their academic careers abroad if HEFCE’s impact proposals are pushed through. One in five professors polled also said they knew someone already considering leaving. You can read more about the poll in a double page spread in the Independent:

http://www.independent.co.uk/news/science/nobel-laureates-dont-put-money-before-science-1860138.html

You can also read it in the Times and the Telegraph:

http://www.timesonline.co.uk/tol/life_and_style/education/article6978437.ece

http://www.telegraph.co.uk/education/6943360/Warning-over-academic-brain-drain.html

The revelation that the impact agenda could trigger a brain drain in UK academia should give pause to the funding council and the government, providing yet more evidence of the danger posed by these proposals.

But are they listening? Peter Mandelson’s recent HEFCE grant letter, besides announcing swingeing cuts to the unit of resource, appeared to pre-empt the results of the REF consultation, committing the government firmly to the impact agenda. Lord Mandelson wrote to HEFCE: "On research, securing greater economic and social impact will be important over the next year. I want you to continue to develop proposals for the Research Excellence Framework, following the consultation that ended on 16 December. These should provide significant incentives to enhance the economic and social impact of research."

The press attention to our poll shows that the enormous opposition to the impact proposals is finding public expression. Our task now is to turn this into pressure at a political level.

The foundation of this campaign has been the support shown by you and your colleagues in signing the 18,000 strong petition. If we are to raise the pressure on the government we will need your support again and will shortly be writing to tell you how you can help us put pressure on your MP. Watch this space.

Thank you again for your support,

Best wishes

Jonathan White

Deputy Head of Campaigns, UCU

Friday, January 01, 2010

contributing to Wikipedia

What Wikipedia really needs, it occurred to me recently, is an article about the complexity class PPP. Not everyone will agree with that assessment; some people reckon that what Wikipedia really needs is to get rid of its liberal bias. For myself, I gave up waiting for the article on PPP to appear, made an account at Wikipedia, negotiated the rather unconventional syntax you use to edit articles, and wrote the above-linked-to page (that is, the first one, not the second one). Then I added links from the articles on PPAD and PPP (disambiguation). Then I wrote an article about the complexity class PPA, which, as the whole world needs to know, is another denizen of the terra incognita that lies between PPAD and FNP. And if you think there's anything wrong with any of those articles, don't come hassling me about it, go and edit the pages yourself! It's not like they belong to me.

Then I contributed fifty quid to Wikipedia, after repeatedly seeing all those fundraising appeals from its founder Jimmy Wales.

Then I added most of the content on the current version of the page on the Research Excellence Framework (REF). Like the fundraising appeals, the REF is a bit hard to ignore, at least for UK academics. If you don't know what it is, consider yourself fortunate. If you want to know, follow the above link, it's as good a starting-point as any, in my unbiased opinion.

Happy new year, by the way.