Thursday, December 10, 2009

"Stand up for Research" again

OK, a slightly incremental update to my previous post... the UCU petition is continuing to pick up more and more support; see this copy of an email I got today.

Stand up for Research - Last push for 20,000

Dear colleagues,

Thank you for signing UCU’s ‘Stand up for Research’ statement. The response to this campaign has been overwhelming, with more than 16,500 academics signing up.

As a result of the petition’s growth, the campaign is winning more press attention. Clive James used his ‘A Point of View’ column on Radio 4 to attack the proposals. You can read this here: http://news.bbc.co.uk/1/hi/magazine/views/a_point_of_view/

Also, read Sally Hunt’s comment in the Times Higher Magazine here:

http://www.timeshighereducation.co.uk/story.asp?sectioncode=26&storycode=409381&c=1

BUT we still need more.

We have less than one week now to push as close to 20,000 as possible. Please take a moment to help us to get there. Forward this email to all your colleagues with a last request to sign up. We will be submitting this petition to HEFCE as part of our response to their consultation and every signature will count.

Colleagues can sign the statement here: http://www.ucu.org.uk/standupforresearch

Thanks again for all your help,

Jonathan


Jonathan White
Deputy Head of Campaigns
UCU
Carlow Street
London
NW1 7LH

Meanwhile, Universities UK are proving to be a bunch of spineless yes-men; in this article their president is quoted as saying, about the pre-budget statement: "However, the sector is already absorbing considerable efficiency savings and the announcement that by 2012-2013, £600 million will be cut from higher education and science and research budgets will be extremely challenging for universities." (translates as: we won't lift a finger to fight back.)

Let me conclude this post with some more links etc: This image was sent to me by Greg Kochanski. This article on science budget cuts is enough to turn me into a Tory. An enthusiastic endorsement of Educators for Reform - read the blurb on the web site - great stuff.

Wednesday, November 25, 2009

"Stand up for Research" petition update

Here's an email I received this morning - sign the petition if you have not yet done so! (I posted earlier about this topic here and here; and this page is a collection of links in support of the campaign, in addition to links that make the general case for curiosity-driven research.)

Dear colleague,

I wanted to update you on the progress of the “Stand up for Research” campaign.

More than 12,500 academics have now signed, making this the biggest ever such petition UCU has run. The signatories represent all disciplines, applied as well as pure research and come from every kind of university, defying the way in which the proposals trade research communities off against one another.

You can also read more about the campaign in the forthcoming edition of UC magazine, in which Professor Philip Moriarty from Nottingham University argues that “The imposition of impact criteria in peer review and in the REF is nothing less than an assault on core academic principles and the ethos of university research. It will also, perhaps counter-intuitively, be immensely damaging to the long-term socio-economic impact of academia.”

We now have only THREE WEEKS left to make this statement impossible to ignore.

Please help us do this by forwarding the link to the page to someone you know and asking them to sign it: http://www.ucu.org.uk/standupforresearch

Yours sincerely

Jonathan White

Friday, November 13, 2009

Journal special issues for conferences - why bother?

Question: why do we bother with special issues of journals for conferences?

In an effort to appear scholarly, I googled a bit and found this article ("If Special Issues of Journals Are Not So Special, Why Has Their Use Proliferated?" by Richard T. Mowday in the Journal of Management Inquiry). That article considers special issues devoted to specific research subfields, rather than conferences, so is not very relevant to my question. (Various arguments for and against are dismissed as invalid, but they don't include the ones I mention below.) The topic arises in this blog post (Lance Fortnow, "Are conferences worth fixing?") but it is merely touched on in some of the comments.

My general understanding is that your conference paper is supposed to acquire a seal of approval from being invited to the special issue. The other motivation is that the journal paper should appear more rapidly than usual, but in my experience this does not always happen, and the delay of having to coordinate one's paper with half a dozen others is partly to blame. So we return to the "prestige" motivation. The trouble is that the journal hosting the special issue is not necessarily the one you would have submitted the paper to in the absence of a special issue. Some people decline the invitation to the special issue (and submit to a different journal), and that seems to severely undermine this purpose of a special issue.

Am I right that special issues are supposed to be prestigious? I realise that any answer is, to some extent, a self-fulfilling prophecy.

Friday, November 06, 2009

Full professorship in Economics/Game Theory and Computation

Note: this is the first of three positions we will be advertising (see below), with the aim of building an internationally leading research group in this area. An official advert, with details of how to apply, will appear on Nov 12th at the URL given below.

THE UNIVERSITY OF LIVERPOOL -- DEPARTMENT OF COMPUTER SCIENCE

CHAIR IN COMPUTER SCIENCE -- ECONOMICS AND COMPUTATION

Salary negotiable

The Department of Computer Science was ranked in the top 10 UK Computer Science departments in RAE2008 and, building on this success, seeks to significantly expand current research around existing strengths at the intersection of economics/game theory and computer science. To this end, we invite applications for a full Professorship to be attached to the newly established Economics & Computation Research Group.

Led by Professor Paul Goldberg, the Group carries out research in the computational foundations of economics/game theory and economic theory in computer science. In addition to Prof Goldberg, the group currently has three other faculty members: Dr Martin Gairing, Dr Piotr Krysta, and Dr Rahul Savani.

Current areas of research activity in the group include:

* algorithmic game theory;

* mechanism design and auction theory;

* complexity of solution concepts, algorithms for solution concepts;

* optimisation problems in economics;

* computational social choice.

We welcome applications from candidates in these areas, as well as more application-oriented areas, such as recommender systems, and related areas, such as computational finance and computational economics.

Two further faculty positions have been approved for this group, and will be advertised after the professorial appointment. It is expected that the successful candidate will be actively involved with these appointments.

The successful candidate will have an excellent track record of research leadership at the intersection of computer science and economics/game theory, and will join a dynamic, world-class Department.

Informal enquiries may be directed to the head of group:

       Professor Paul W. Goldberg

       mailto:P.W.Goldberg@liverpool.ac.uk

       phone: +44 151 795 4259

** Job Ref: A-570583

** CLOSING DATE FOR APPLICATIONS: 8 January 2010 **

Further details and an application pack will be available from the following URL after 12 November:

       http://www.liv.ac.uk/working/job_vacancies/

** AN EQUAL OPPORTUNITY EMPLOYER

Thursday, November 05, 2009

EPSRC's Schlimmbesserung

I was recently circulated a letter from EPSRC (the UK's main scientific research funding body) announcing that, in order to "alleviate pressure involved in our peer review process", they would
  • no longer accept uninvited resubmissions of proposals

  • constrain "repeatedly unsuccessful applicants" to submit only one application over a 12-month period

In the letter there is a definition of "repeatedly unsuccessful applicant" which does not bear repeating here.

I have figured out what is wrong with this policy: far from relieving pressure on peer review, it has the opposite effect. Note first that the research proposals that now end up getting prohibited were always easy to criticize, while the proposals that survive this cull are the ones where you have to think hard about how they rate in competition with each other. That does not in itself explain why it becomes harder to review a proposal --- the reason why it gets harder, is that a reviewer now has a much heavier responsibility to "get it right": if a proposal fails, someone's research agenda has just been closed down permanently (they can't resubmit) and in the worst case, they get personally blacklisted, just for good measure.

Based on some brief web searches, it seems that Schlimmbesserung is a variant of a more standard German word Verschlimmbesserung, meaning an improvement that makes things worse. (The claim is that English-speaking fans of long German words usually prefer "Schlimmbesserung". This page is informative.)

Tuesday, November 03, 2009

Stereotyping universities

God, I hate being stereotyped. I don't think I'm absent-minded, nor do I wear a mortar-board. When Peter Mandelson says that universities should not be "islands or ivory towers" (see here) he is of course insinuating that universities are just that, out of touch with the real world, and so on. Well, it's a salutary reminder of what it must be like to belong to a social or racial group that gets itself stereotyped on a more frequent basis. Mandelson's remark forms part of a dismal-looking package of proposed reforms to the higher education system that purport to enhance the "customer experience"... not much in the way of concrete proposals, which is probably just as well; it's mainly a vague attempt to get universities perceived as just another industry. Mind you, if that's where he's coming from, can someone point out to him that we are, at least, a net exporter? Bloody politicians - they're all the same, got nothing better to do than spend half their time spouting off whatever cheap populist slogans it takes to get elected (not that we can accuse Mandelson of getting elected), and the other half making fabulous sums of money on after-dinner speeches and consulting contracts. All right, back to marking my COMP209 class test...

Tuesday, October 27, 2009

question about random sequences

OK, a respite from my last two posts about the Research Excellence Framework. The following question occurred to me; I make no promise that it is interesting.

Suppose you repeatedly roll a fair die and write down the sequence of numbers you get. I'm going to delete numbers from the sequence according to the following rule. Let X be the first number in the sequence. I delete the first block of consecutive X's (which will usually just be the single X of course), then I delete the second block of consecutive X's, but retain everything in between. Then I do the same thing to the rest of the sequence of dice rolls - it will start with some Y not equal to X, so I get rid of the first two blocks of consecutive Y's and continue.

For example, I would delete the elements shown in [brackets] in the following:

[2], 3, 6, 1, 1, [2], [2], [4], 2, 3, 3, 5, [4], [5], [5], 4, 6, 2, 3, 2, 4, 1, [5], [2], 3, [2], ...

And the question is: is the sequence of undeleted numbers indistinguishable from a completely random (i.i.d. uniform) sequence? It seems to work for 2-sided dice (coins).
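Here is a minimal simulation sketch of the deletion rule (the function name and the frequency checks are mine, and a matching empirical distribution is of course only evidence, not proof):

```python
# Apply the deletion rule to a long i.i.d. uniform sequence, then compare the
# symbol (and pair) frequencies of the retained subsequence with uniform.
import random
from collections import Counter

def delete_blocks(seq):
    retained, i = [], 0
    while i < len(seq):
        x = seq[i]
        while i < len(seq) and seq[i] == x:   # delete the first block of x's
            i += 1
        while i < len(seq) and seq[i] != x:   # retain everything in between
            retained.append(seq[i])
            i += 1
        while i < len(seq) and seq[i] == x:   # delete the second block of x's
            i += 1
    return retained

rolls = [random.randint(1, 6) for _ in range(10**6)]
kept = delete_blocks(rolls)
print(Counter(kept))                                 # roughly uniform on 1..6?
print(Counter(zip(kept, kept[1:])).most_common(5))   # pair frequencies too
```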

Monday, October 19, 2009

Research Excellence Framework (part 2)

Continuing from my previous post; once again, foreigners who are not into schadenfreude should read no further. Some new web links follow. First a couple of petitions:
A new collection of web pages highlighting this topic: The Danger of Assessing Research By Economic Impact

The topic has led to a flurry of emails on the CPHC mailing list; opinions there are divided: some don't mind the proposal, some are defeatist, and some object to it. Some discussion has addressed whether one should push for a broader definition of "impact" beyond economic impact, and also the general burden of assessment -- it is costly to have to stop what you're doing every 5 years and enter into an episode of acute navel-gazing, and the present Government does not propose to compensate us for that! Outside of CS, there is a stronger consensus against the proposed definition of "impact". Do not be defeatist - the REF is a version of (and probably largely plagiarized from) Australia's Research Quality Framework (RQF), a similar Gradgrind-like model which was cancelled in December 2007 due to a change in government. Notice that in the UK, prospects for a change in government are very strong -- let's see if history repeats itself!

And finally, let me quote from an article in the Guardian yesterday by Madeleine Bunting, which addresses the point that market theory closed down public discourse about injustice, but that we urgently need to describe what we should value. From the article:
But don't look to economists to get us out of this hollow mould of neoliberal economics and its bastard child, managerialism – the cost-benefit analysis and value-added gibberish that has made most people's working lives a mockery of everything they know to value.

Friday, October 16, 2009

Research Excellence Framework

Readers from outside the UK may wish to stop reading at this point, unless they are into schadenfreude. I recommend readers from the UK to sign this petition, which is sponsored by the Universities and College Union (UCU). It relates to the Research Excellence Framework (REF). The following text accompanies the petition; below I add some of my own comments.

The latest proposal by the higher education funding councils is for 25% of the new Research Excellence Framework (REF) to be assessed according to 'economic and social impact'. As academics, researchers and higher education professionals we believe that it is counterproductive to make funding for the best research conditional on its perceived economic and social benefits.

The REF proposals are founded on a lack of understanding of how knowledge advances. It is often difficult to predict which research will create the greatest practical impact. History shows us that in many instances it is curiosity-driven research that has led to major scientific and cultural advances. If implemented, these proposals risk undermining support for basic research across all disciplines and may well lead to an academic brain drain to countries such as the United States that continue to value fundamental research.

Universities must continue to be spaces in which the spirit of adventure thrives and where researchers enjoy academic freedom to push back the boundaries of knowledge in their disciplines.

We, therefore, call on the UK funding councils to withdraw the current REF proposals and to work with academics and researchers on creating a funding regime which supports and fosters basic research in our universities and colleges rather than discourages it.

It is not only the UCU which is expressing grave concerns about the REF; universities and societies that represent academic disciplines are similarly concerned, and I will give examples of these in later posts. For the moment, the REF seems to be doing the impossible, namely making us feel nostalgic for the Research Assessment Exercise (RAE). At least the RAE was exactly that, a research assessment exercise. It did not set out to distort the meanings of the words it uses, such as "impact" and "excellence".

The REF -- in its proposed form -- discriminates against theoretical work and imposes an artificial incentive to do work that has short-term economic impact. And you know what? I've got nothing against economic impact. But if a certain kind of research is able to make money, that should be its own reward; government-funded money-making is ridiculous. And just don't call it "research excellence", it's not the same thing.

Some articles

Article in the Independent Against The Grain: 'I didn't become a scientist to help companies profit' by Philip Moriarty

See the comments that follow this article in the Guardian (the comments that get highly-recommended are correct)

Monday, October 05, 2009

State the open problems!

A few incoming paper-review requests remind me of the following long-standing gripe.

A nice feature of theoretical computer science is that the results we obtain tend to raise lots of very well-defined research questions. I'm very much in favour of the practice of stating these open problems in detail in our research papers. Indeed, a thorough discussion of future work is arguably as important as including references to all appropriate previous papers -- the latter puts you in touch with the past and the former makes the connection with the future. Some papers take a lot of care to spell out what the open problems are, but some (most?) don't bother. Maybe the authors think it should be obvious what the open problems are. Occasionally they perhaps don't want the reader to pick up on an open problem they've got lined up for a follow-up paper.

Let's consider the case where the open problem being raised is "obvious". For instance, suppose you give an algorithm that approximates some objective within a factor of 2.5. Clearly, any approximation ratio of better than 2.5 is (implicitly) raised by the paper's result, so why bother to point it out? I would say it usually is worth pointing out, partly just to confirm that you really would find further progress to be interesting, and partly because you might have some useful additional discussion to add, such as a consideration of the prospects for improving beyond a ratio of 2, say.

If someone writes a paper that can claim to have solved an open problem stated explicitly in a previous work, that's usually a good piece of relatively objective evidence that the result is interesting. Most papers do not manage to achieve this - they usually solve some variant of an open problem posed previously, and indeed "posed" may just mean implicitly rather than explicitly. That is, there is a scarcity of "official" open problems (those that have been raised and deemed to be interesting in a published paper).

(There are some web sites that help, e.g. Comp geometry problems has links to other open problem pages, and the Open problem garden is a wiki for general math problems.)

Monday, September 28, 2009

New academic year

My first lecture was at 9am today. Welcome to new lecturers Rahul Savani and Martin Gairing who join our Economics and Computation research group, from the University of Warwick and UC Berkeley, respectively. In the coming year we will be looking to appoint three new permanent faculty members to this research group; watch this space if you think you might be interested.

Monday, September 21, 2009

No time to raise tuition fees

As Gordon Brown's spending spree judders to a halt, the diners at the UK's economic table look at each other uneasily, as the time approaches for someone to pick up the bill. Enter the Confederation of British Industry, with a modest proposal to raise tuition fees for university students.

You've got to admire their chutzpah. Over the past decade, business leaders have been regularly enjoying double-digit pay increases. Meanwhile, the one single aspect where the present government has actually saved money, and not spent more of it, is on student finance. Specifically, they stopped paying for students' tuition, and now expect them to borrow the cash instead. Admittedly, the government lends it to them, at low interest rates.

That's not good enough for the CBI however. They've awarded themselves the above-mentioned pay increases on a regular basis, and only received a pusillanimous, relatively recent, 10% tax increase on incomes over £150k. So what we now have is a bunch of fat cat business leaders telling a bunch of penniless students to pick up the bill for Gordon Brown's uncontrollable spending habit. It's blindingly obvious that taxes will have to go up -- my own as well as the CBI's. You don't have to be an economist (or perhaps you do have to be a non-economist) to figure that out.

Monday, September 07, 2009

Homotopy methods for Nash equilibrium computation

In computational game theory, we prefer fast algorithms to slow ones, but we also prefer algorithms that seem "natural", capable of being carried out in the real world. This is perhaps a special feature of computational game theory (in comparison with most other areas of algorithmics) - we do not just want a solution, but the solution should represent something that happens "out there". (This recent blog post by Vohra makes the stronger claim that if a game-theory problem is polynomially solvable, there is usually a natural algorithm for it.)

Homotopy methods are based on a simple and appealing idea - given some game of interest, you start by constructing a version of that game where the numbers involved (that is, the payoffs associated with the various strategy combinations) have been changed so that there is a single "obvious" solution. (And let us assume that it is Nash equilibrium that we are interested in.) Then you gradually move all the numbers towards the ones that represent the game of interest, keeping track of this Nash equilibrium, which should change continuously.

The "starting" game (i.e. the one with the obvious, easy-to-find Nash equilibrium) can perhaps be assumed to represent some kind of prior belief by the players about what each other are doing, or what is the best action in the absence of knowledge about the other players. In that case, the above process seems to represent some kind of natural learning process that you can believe might actually happen. What we get in addition is a criterion for equilibrium selection --- we would like to believe that there really is a single correct answer to the question of "what is the outcome of this game?" and where there are multiple Nash equilibria, this kind of procedure usually identifies just one of them.

Browder's Fixpoint Theorem

Browder's fixpoint theorem -- not to be confused with Brouwer's fixpoint theorem -- can be used to show that such a path of Nash equilibria actually exists. The theorem says the following. Suppose you have a compact domain X, so that Brouwer's (not Browder's) fixpoint theorem promises that any continuous function from X to X has a fixpoint. Suppose that for each t in [0,1], f_t is a continuous function from X to X. Furthermore, the functions f_t form a continuum of continuous functions: if u converges to w then f_u converges to f_w. Then there is a path connecting some fixpoint of f_0 with a fixpoint of f_1. But, it should be noted that the path need not be monotonic as a function of t. In the context of games, it is easy to construct 2-player 2-action games where you have to backtrack in moving from one to the other.
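For definiteness, here is the statement in the form just described (my paraphrase of the above; note the usual statement promises a connected set of fixpoints rather than a literal path):

```latex
% Paraphrase of the statement in the post; "path" weakened to "connected set".
\begin{theorem}[Browder]
Let $X \subseteq \mathbb{R}^n$ be nonempty, compact and convex, and let
$f \colon [0,1] \times X \to X$ be continuous; write $f_t = f(t,\cdot)$.
Then the fixpoint set $F = \{(t,x) : f_t(x) = x\}$ has a connected component
meeting both $\{0\} \times X$ and $\{1\} \times X$. In particular, some
fixpoint of $f_0$ is connected through $F$ to a fixpoint of $f_1$, although
$t$ need not vary monotonically along the way.
\end{theorem}
```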

Some literature

I recently read the following 2 papers in some detail, but so far have probably just scratched the surface. The first paper (in Economic Theory, 2009) listed below gives a survey of different methods and how they work. It includes a nice explanation of how the Lemke-Howson algorithm can be seen to be a homotopy method.

The second paper (in GEB, 2002) addresses computation of approximate Nash equilibria for multiplayer games. It uses the notion of ε-Nash equilibria that some of us in the CS community worked with more recently. Recall that for multiplayer games, one needs to give up on exact NE in general, since they may require irrational numbers to specify them. (For 3+ players, a NE may need to be specified by the solutions of high-order polynomial equations.) Given a homotopy between multiplayer games, they first partition the space into simplices, then approximate the payoffs within each simplex by linearly interpolating between its vertices (i.e. you evaluate the true payoffs at the vertices, but inside a simplex the approximate payoffs are allowed to differ from the correct ones). Having discretised the space in this way, the algorithm implicitly constructs a graph on simplices and sub-simplices, of degree at most 2, whose degree-1 vertices are approximate equilibria of the games at the extremes of the homotopy. So, to get from the NE of the "starting game" to the game of interest, you follow this path. I haven't understood the details yet; not even sure if the graph is directed like in PPAD problems.
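The path-following step itself is generic. A minimal sketch (neighbours() is a hypothetical oracle standing in for the paper's simplex pivoting, which I have not reproduced):

```python
# Walk an implicit graph of maximum degree 2 from a degree-1 vertex (the easy
# game's equilibrium) to the other endpoint of its path (an approximate
# equilibrium of the game of interest).
def follow_path(start, neighbours):
    prev, cur = None, start
    while True:
        nxt = [v for v in neighbours(cur) if v != prev]
        if not nxt:              # no fresh neighbour: the other endpoint
            return cur
        prev, cur = cur, nxt[0]
```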

The paper does not apply Browder's theorem, but Browder's theorem essentially follows from the construction of the graph, at least in the context of approximations to multiplayer games.

Homotopy methods to compute equilibria in game theory by Herings and Peeters

Computation of the Nash Equilibrium Selected by the Tracing Procedure in N-Person Games by Herings and van den Elzen

Wednesday, September 02, 2009

Competitive journals

I just got an email circular to "Editors and Reviewers of Games and Economic Behavior" (I am just a humble reviewer, not an editor). It noted that last year the journal received over 500 submissions, and it anticipates publishing under 80 papers per year. Two points raised were: please raise the bar (only accept papers of broad interest), and faster publication - each new issue to contain about one-third of the papers in the current publication queue.

In the context of recent discussions about whether CS journals can, or ought to, be more competitive than conferences, these points look like good ones to bear in mind for journals that want to be competitive. (Of course, it's fun to stereotype people and research areas, and say that it's appropriate that a game theory journal should have figured that out. But most other research areas also have the feature that the top journals are the ones that matter.)

Thursday, August 27, 2009

The need for copyright reform

The UK Pirate Party has picked up press coverage recently, e.g. here and here in the Guardian. I am quite tempted to give them a protest vote at the next election; I have been dismayed by the way copyright law has been changed recently, as well as by new proposals. The articles relate to draconian proposals that could cut off your internet access if your kid downloads a copyrighted file without paying.

The Public Domain: Enclosing the Commons of the Mind is a new, freely-downloadable book by James Boyle (at Duke University law school) that discusses these issues. This paper by Hal Varian analyses the economics of copyright. (credit to this blog entry for pointing me to Varian's paper.)

Page 127 of Varian's article gives a brief history of how the term of copyright has been repeatedly extended in the USA. Now, copyright holders like to talk about "theft" and "stealing" in the context of copyright infringement. Well, if theft is by definition illegal, I guess what rights holders themselves have done does not qualify. On the other hand, if theft is simply the appropriation of something of value from someone else, without their consent, then extensions to copyright most definitely qualify. It's time to get angry with those people! They call us immoral, but they have stolen, and stolen repeatedly, from the public domain, all sorts of great works that ought to belong to us all. An enormous amount of work by long-dead authors is just about inaccessible, the collateral damage of copyright extensions that were motivated by the desire to "protect" only a small number of works. The collateral damage also includes all sorts of trivial stuff - books you once picked up in a used-book store, comics and magazines you once read as a kid, stuff that someone, somewhere would have put online, if only it were legal, but it's not, so instead it all gets banished to a diminishing collection of dusty paper copies that no-one can ever find. All this for the sake of being able to profit from the creativity of someone who died over 70 years ago.

I'm not urging some kind of collectivist utopia where no intellectual property belongs to anyone. But copyright should be like patents, which expire after (I think) 17 years. That's plenty of time to make money out of the intellectual property.

Wednesday, August 12, 2009

Time to move on from three-year degrees

UK-centric post follows... This post is to discuss a situation that is becoming increasingly anomalous and unsustainable, namely the system in English universities in which the standard period of study time for an undergraduate degree is just three years. (Earlier I criticised a similar problem in the context of PhD degrees.) Here is the historical explanation for it:

  • English schools encouraged early specialisation, traditionally three "A levels" during the last two years of study (so you would focus for example, on Maths, Physics and Chemistry)

  • Students at university are financed by the local education authority, which would pay for three years of study only


Both of the above reasons are getting tenuous. There is a general understanding that A levels have gotten watered-down, and school-leavers are not so specialised (the latter is probably a good thing!). And students increasingly have to pay for their own education, either through government loans, or (for the lucky ones) by getting their parents to foot the bill.

Reasons why the three-year system is coming under pressure, and not merely losing its historical justification, include:

  • Students have to work part-time to avoid large debts. So they can't be expected to put in three years of intensive full-time study, as was the traditional expectation. With respect to this, most people over age 40 seem to maintain an outmoded image of the student lifestyle.

  • A consequence of the three-year timescale is that it can be very unforgiving for students who fail exams. This aspect gains added urgency from the recent debate (here's a nice article) about what universities should be doing to help disadvantaged students (who are more likely to have part-time jobs, and hence fail exams).

  • International standards in general, and the EU's in particular, envisage at least four years of undergraduate study. I see loads of CVs from prospective PhD students from overseas and they nearly all have 4 years of undergraduate study, and often have additional masters-level study.

  • Putting the above points together, the reason why in most countries you expect to take 4+ years is precisely because you want the flexibility to re-take exams where necessary while holding down a part time job.


What's the way forward? I think the change can be achieved incrementally; for starters, universities should allow students to graduate "a year late": if a student needs to pass one or two modules in order to get a degree, let him re-take the exams the following year, attend the courses, and pay reduced fees for that year.

Monday, July 27, 2009

bacterial computation

This article in the Guardian is attracting attention; rather over-inflated claims about the ability of bacteria to solve hard computational problems... (I did not post any of the comments attached to that article, but they are fun to read.) Could swine flu turn out to be even smarter than E. coli?

I am now posting from Athens, visiting Koutsoupias and Papadimitriou; the instructions that blogger provides for logging in are rather hard to read...

Wednesday, July 22, 2009

Balanced set systems

I've been trying to understand Scarf's core theorem, which says that if a coalitional non-tranferable utility game is balanced, then it has a non-empty core. In the context of transferable utility games, there is another definition of "balanced game" which should not be confused with the one for NTU games. Osborne and Rubinstein's book studies balanced TU games in some detail, but only briefly mentions balanced NTU games.

The definition (NTU case) uses the notion of a balanced set system, defined as follows...

Given a set N, suppose that C is a collection of subsets of N, i.e. C is a subset of 2^N. Let C = {C_1,...,C_z}. We say that C is balanced if there are non-negative real numbers α_i, one for each C_i, such that, for every element of N, the sum of the α values over all the C_i containing that element is 1.

The above definition is used in Scarf's 1967 paper.

The definition can also be characterized as follows: associate each member of N with a coordinate in Euclidean space; in fact, let's associate it with the point representing the unit vector pointing in the direction of that coordinate. More generally, associate any subset of N with the average, or center of gravity, of the points corresponding to the members of that subset. An equivalent definition of "balanced collection of subsets of N" says that the point corresponding to the entire set N should be a convex combination of the points corresponding to these subsets of N. (The equivalence is straightforward; it is pointed out in this paper and probably quite a lot of others.)

There's an alternative definition that requires that the α_i should be positive, rather than just non-negative (e.g. this paper). This is not equivalent in terms of which set systems are balanced (but the definitions are indeed the same modulo their use in the definition of a balanced NTU game). Let's use the phrase "positively balanced" to refer to set systems where the α_i can all be positive. To see how they differ, suppose N={1,2,3} and consider the subsets {1},{2,3},{3}. These subsets are balanced, using the α values 1,1,0 respectively. It is easy to see that {3} must be given the value 0, so that these sets are balanced but not positively balanced.
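Both notions reduce to small linear programs, so they are easy to test mechanically. A minimal sketch (my own formulation, using scipy's LP solver; to test positive balancedness it maximises the minimum weight t subject to α_i ≥ t):

```python
from scipy.optimize import linprog

def balancedness(N, subsets):
    """Classify a collection of subsets of the ground set N."""
    m = len(subsets)
    # Variables: alpha_1..alpha_m and t.  For each element e of N, require
    # the sum of alpha_i over the C_i containing e to equal 1.
    A_eq = [[1.0 if e in C else 0.0 for C in subsets] + [0.0] for e in N]
    b_eq = [1.0] * len(N)
    # t <= alpha_i for each i, i.e. t - alpha_i <= 0; maximise t (minimise -t).
    A_ub = [[-1.0 if i == j else 0.0 for j in range(m)] + [1.0] for i in range(m)]
    b_ub = [0.0] * m
    res = linprog(c=[0.0] * m + [-1.0], A_ub=A_ub, b_ub=b_ub,
                  A_eq=A_eq, b_eq=b_eq,
                  bounds=[(0, None)] * m + [(None, None)], method="highs")
    if not res.success:
        return "not balanced"
    return "positively balanced" if res.x[-1] > 1e-9 else "balanced but not positively"

print(balancedness([1, 2, 3], [{1}, {2, 3}, {3}]))        # balanced but not positively
print(balancedness([1, 2, 3], [{1, 2}, {2, 3}, {1, 3}]))  # positively (alpha = 1/2 each)
print(balancedness([1, 2, 3], [{1}, {2}]))                # not balanced
```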

From a geometrical perspective, "positively balanced" seems rather unsatisfying, since a set system that is balanced but not positively balanced is, from the perspective of the α_i, nevertheless arbitrarily close to being positively balanced. I am wondering if there is some nice alternative combinatorial characterization of positively balanced set systems. On the other hand, for set systems that are not balanced (under the original definition) the set of all non-negative weighted sums of their corresponding points in space is presumably bounded away from the average of all the points. Maybe it's interesting to ask how close you can get? (My guess: for N={1,2,...,n}, take the n-1 subsets N-{1}, ..., N-{n-1}; these are not balanced; but give each one α_i=1/(n-2), so for each i<n you get a total of 1, and for n you get a total of (n-1)/(n-2); pretty close to the all-ones vector.)

Wednesday, July 15, 2009

playing with javascript

One thing missing from Facebook's "shite gifts for academics" is "web page of colleague who has just discovered javascript". Here is a modest example - my reason for learning javascript is partly that I teach a class on formal language theory, and I thought it would be useful motivational material to show the students how javascript lets you use regular expressions to verify user input to a web page. And I wanted to improve the functionality of COMSOC's program page, although by now I guess that collection of papers is yesterday's news.

BTW, another missing gift is "invitation to review paper by unspecified authors that requires you to log on to the publisher's web site using a hard-to-remember user name" - I recently got one of those, but not via Facebook.

Wednesday, July 08, 2009

Game theory and ad-hoc networks

I am wondering if this could be a growth area... so there's been some work at this intersection, e.g. this paper posted to arxiv (CS and Game Theory) on Tuesday 7th July. Ad-hoc networks are composed of nodes, which in a game-theoretic context would constitute the agents or players.

I recently read another paper, "New strategies for Revocation in Ad-Hoc Networks" (link), which doesn't have a game theory spin; I took a look at it because it was reported on in the New Scientist, and it relates to general issues I'm interested in: networks of decentralised agents trying to reach an agreement (so maybe there's an element of social choice theory?) where there are good and bad agents, and the good ones have to avoid being misled by the bad ones... "revocation" means removing the bad agents from the system, or at any rate, "blackballing" them.

For a theorist the paper is a bit unsatisfying to read since it doesn't spell out a mathematical model of what's going on (in terms of how agents can communicate, and what they are trying to achieve). And perhaps there is no nice model that wouldn't rule out the sort of approach one would want to use in practice, or else rule out the sort of attacks that the bad agents are supposed to be capable of. From time to time, agents may observe other agents misbehave, and when that happens, the observing agent, assuming it is one of the "good" agents, would want to report on the "bad" (misbehaving) agent to the other good ones. However, it is apparently possible for a bad agent, or agents, to bear false witness, so care must be taken in how to process one of these accusations. The only advantage the good agents have is that they are in the majority (much less than one-third of agents are assumed bad). The paper mentions Byzantine generals, but doesn't really discuss the relationship. It would be interesting to come up with a model and relate it to the ones in The Byzantine Generals Problem and maybe also the FLP paper (Impossibility of distributed consensus with one faulty process).

Tuesday, June 30, 2009

Advertising PhD studentships

(The following is motivated by my work on PhD admissions at Liverpool CS.)

Terence Tao opines that you should move from place to place when you progress from undergraduate to postgraduate to postdoctoral research. I tend to agree, but my sense is that within the UK most students are reluctant to do so.

One obstacle is that departments like to keep their best undergraduate students on their own PhD programmes, another is lack of information about outside opportunities, and which research strengths are located in which departments. New and new-ish web sites that advertise PhD study opportunities may alleviate this problem. Here's a list I made after hunting around; I thought that findaphd.com is probably the best for finding phd studentships, and the discussions on the postgraduateforum will be of interest to most current PhD students. These web sites are mainly geared towards specific projects; to locate general research strengths still requires a prospective student to ask around, or visit individual department web sites.

Tuesday, June 23, 2009

new postdoc positions (update)

A couple of weeks ago I announced two new postdoc jobs (each of 3 years duration) in Algorithmic Game Theory, on the project Efficient Decentralised Approaches in Algorithmic Game Theory (to be run by me and colleagues at Liverpool, and Artur Czumaj at Warwick). At that time, we made preliminary announcement web pages here and here.

Here is the official ad (went live today) for the one at the University of Liverpool and here is the official ad for the one at the University of Warwick (See another copy here.) For both of these the closing date for applications is July 17th.

Here and here are the EPSRC summary pages for the grants for this joint project.

Monday, June 15, 2009

perpetual motion at Sainsbury's

This article caught my eye in the Guardian's technology section: a branch of Sainsbury's supermarket is installing speed bumps that harvest energy "for free" from the cars passing over them.

Hang on a minute. You can only get energy from passing vehicles if you slow them down as they pass. I guess this could validly produce energy that would otherwise be wasted, provided the speed bumps are installed on a stretch of road where most drivers would brake anyway (this is not pointed out in the article). Although for me in my Toyota Prius, braking is supposed to recycle that energy by using it to recharge the battery, so if I pass over those speed bumps I lose out -- ending up powering the supermarket and not the car battery.

Friday, June 12, 2009

Mac switchover

Today I finally got around to getting my desktop PC replaced with a Mac that has been waiting in the wings while I come to terms with giving up much of the software I've gotten used to. My laptop is already a Mac, so now the switchover is complete; my souvenir of Windows is a couple of aging laptops left over from earlier.

The switchover was mainly driven by peer pressure; now I can whole-heartedly join in the discussions about the deficiencies of Windows (mainly the start-up time; one of my laptops actually "runs" Vista but I never used it because of a shortage of time).

Wednesday, June 10, 2009

New degree course in e-Finance

My interest in this new degree programme dates back to just under a year ago, when my head of department asked me to be in charge of setting it up. Luckily, various colleagues at CS and the University of Liverpool Management School (ULMS) are also involved; this is not the sort of thing you want to do on your own.

So, it's a 3-year undergraduate course to be taught jointly with ULMS, and it was formally approved a couple of weeks ago. Here is a link that has recently shown up; actually, I should get some minor errors fixed in some of the information it contains.
I did not blog about it earlier in case something went horribly wrong with the approval process (approval processes are there to make us resist the urge to set up degree programmes without due care and consideration); we have been under time pressure to get it started up for this September.

Background: lots of students at XJTLU (Liverpool's partner university in China) spend their first 2 years over there and then another 2 years over here. There are about 20 of them who will want to take this new course. Starting in 2010 we will accept new students (mainly UK and EU) to start in year 1 of the course; we don't know how many students it will attract but the students coming from China are enough to justify setting it up.

There is more work to be done, including setting up a new taught module in CS (probably to be titled "electronic markets"), but I'm happy we got the degree approved in time for the first cohort of XJTLU students. Credit must go to Ullrich Hustadt (CS dept expert on XJTLU, and on setting up new degrees) and Frank Steffen (main contact at ULMS). This degree should help to develop further collaborations between the two departments.

Monday, June 08, 2009

European elections

The way that votes within a region are converted into a set of winning candidates is described in a rather unintuitive way here -- here is the relevant passage:

Once the vote count is complete and confirmed:

1. The party with the highest number of votes is allocated the first seat. The candidate at the top of the party's list is elected.
2. The votes of the party winning the first seat are then divided by the number of seats they have won + 1 ( i.e. 2 ) and the resultant number of votes go forward to be compared with the number of votes for the other parties and candidates to decide who wins the next seat.
3. The exercise is then repeated. As each successful party wins its first seat, its votes are divided by 2. This divisor is then increased by 1 each time this party wins a further seat. (NB it is the total number of votes cast for the party which is divided by the divisor, NOT the number resulting from the previous division.)
4. In the event that an independent candidate is elected or all the candidates on a party's list have been elected, the votes cast for that party are excluded from the remainder of the exercise.

This generalises plurality to the election of k representatives: scale the votes obtained by each party by dividing them by a common factor chosen so that, when you round the scaled numbers down to the nearest integers, they add up to k. That's how many representatives each party gets.
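This is the d'Hondt rule; a minimal sketch of the procedure quoted above (party names and vote totals invented for illustration):

```python
# Repeatedly award a seat to the party with the highest quotient
# votes / (seats_won + 1), as in steps 1-3 of the quoted passage.
def dhondt(votes, k):
    seats = {p: 0 for p in votes}
    for _ in range(k):
        winner = max(votes, key=lambda p: votes[p] / (seats[p] + 1))
        seats[winner] += 1
    return seats

print(dhondt({"A": 100000, "B": 80000, "C": 30000}, 8))
# {'A': 4, 'B': 3, 'C': 1}
```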

My region (North-west England) has suffered the embarrassment of having elected a BNP candidate (one of 8 representatives), with the Green party narrowly failing to edge them out. Note that the number of votes obtained by the BNP actually went down. In my opinion the BNP unfairly targeted us by putting up their leader as a candidate in our region, and this result has more to do with low voter turnout than with any surge in their support. Both the Greens and the Lib Dems deserved to do better than they did.

Tuesday, June 02, 2009

MFI'09

I am at the 5th Modèles Formels de l'Interaction; tomorrow I give a talk on the PPAD-completeness of computing Nash equilibrium. The meeting is in Lannion, in Brittany; I gather it's following the Dagstuhl approach of being out of the way in order for people to socialise. Although getting there by train is pretty straightforward, even allowing for the poor quality of the web site that sells the tickets; it's not as out-of-the-way as the Estonia winter school in March. Today I attended some talks on computational social choice and met up with some of the people who came to COMSOC. Air France lost my suitcase, and my time spent online will be brief to conserve my laptop's battery (the charger is in the suitcase). Any questions by email relating to recent job announcements will have to wait until Friday.

Monday, May 25, 2009

Putting the VC into VCG

There's a problem with blogging about papers you've read recently - actually there are two problems - maybe the paper was one that you had to review anonymously for a conference or journal, or else it was one that relates to current research (and so co-authors may be displeased if you tell all about the research problem of interest). But, the following paper seems to escape those.

This paper appeared recently on arxiv (VC v. VCG: Inapproximability of Combinatorial Auctions via Generalizations of the VC Dimension; Authors: Elchanan Mossel, Christos Papadimitriou, Michael Schapira, Yaron Singer). So, it relates to 2 topics of interest to me, i.e. mechanism design and the Vapnik-Chervonenkis dimension. The following is a summary of my sketchy understanding of the paper, concluding with a wild guess at a future research direction that may arise.

So, the VCG mechanism requires one to find the socially optimum outcome of an auction. Suppose a bunch of items are being sold to 2 bidders, who have limited budgets, which means you impose a ceiling on the value they may assign to any set of items. Finding the social optimum is now hard, I guess because it can encode a SUBSET SUM type of problem, i.e. you could easily find yourself trying to identify a bundle of items that just fits some bidder's budget, and if you overshoot, you end up undershooting for the other bidder.

In looking for an approximately optimal mechanism, you might try for a maximal-in-range mechanism, in which you simply delete from consideration some of the possible outcomes - if there are n items being sold, there are 2^n ways to share them between the bidders (these are the outcomes) - now suppose you restrict to a limited subset of those: you still have a truthful mechanism, and the question becomes, can you find such a subset that results in an approximately optimal mechanism. What you need to do is find a set of partitions such that, if the socially optimal partition is not a member of your set, there exists one that is close.

The way to argue that you still need exponentially many partitions is to consider valuations in which, for some target partition, one bidder assigns unit value to each of his items, and the other bidder assigns unit value to each of the other items. The approximation quality of any alternative partition will correspond to its Hamming distance from the target partition, and for distances less than n/2, the Hamming balls only contain a very small fraction of the 2^n partitions...
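A quick numerical check of that last claim (parameters made up; math.comb needs Python 3.8+):

```python
# Fraction of the 2^n partitions (0/1 vectors) within Hamming distance r of a
# fixed target: sum_{d<=r} C(n,d) / 2^n.  For r noticeably below n/2 this
# fraction is exponentially small.
from math import comb

def ball_fraction(n, r):
    return sum(comb(n, d) for d in range(r + 1)) / 2**n

for n in (40, 80, 160):
    print(n, ball_fraction(n, n // 4))
```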

So you still have to optimize over a large number of the partitions. So many in fact, that there exists a set of sqrt(n) of the items such that any partition of those sqrt(n) items can be realised by one of the partitions that is being used by the maximal-in-range mechanism. This is the part that uses results about the VC dimension (defined on Wikipedia so no need for me to repeat the def; the VC dimension is a combinatorial measure of a set system, and in the context of learning theory, mainly comes in handy when the set system is infinite; in this paper the key result being used, by contrast, is about the VC dimension of specifically finite set systems).

Finally, what this leaves us with is a (standard, unrestricted) budget-limited auction for sqrt(n) items (since all other items can be given zero value). And sqrt(n) is polynomially related to n, so you are back where you started. Conclude that maximal-in-range mechanisms cannot both give a good approximately-optimal solution and alleviate the computational difficulty.

Where to go from here? While Roberts' theorem suggests that your options are rather limited in designing a truthful mechanism, it applies to auctions where there are no restrictions on the values that bidders can give to outcomes. But the 2^n outcomes get their values from only 2n per-item values (n for each bidder), so there is actually a lot of structure in the valuations that has so far gone unexploited. Maybe some clever ad-hoc mechanism could have better approximation than the crude one mentioned in the paper (where you just bundle all n items together and sell them all to one of the bidders.)

Thursday, May 21, 2009

Two new postdoc positions in computational game theory

Note the update to the job ads on the left hand side of this page! The project (funded by EPSRC) is entitled Efficient Decentralised Approaches in Algorithmic Game Theory; it addresses some of the main research issues in computational game theory, some of which were raised at GAMES 2008 (the 3rd World Congress of the Game Theory Society) as being of considerable interest to the game theory community. This is a preliminary announcement, not an official ad. Updates will follow. Note the link (on the left) to some more details.

Monday, May 18, 2009

joining Facebook

Everyone I know seems to be on Facebook, so finally I have gotten around to signing up. So, no need for me to discuss my first impressions, since anyone reading this is no doubt familiar with it already.

Wednesday, May 13, 2009

Monbiot on scientific research

George Monbiot is a (mainly environmentalist) commentator who writes for The Guardian; he gets a lot of things wrong but I have to admit I quite like him since his heart is clearly in the right place. In this article he criticises the requirement for academics seeking grants to describe the economic impact of the proposed work. He also criticises the way business leaders are being put in charge of research councils.

Really, I'd say his attack is off-target --- it's not disastrous to have to describe the economic impact of proposed research, so long as it is OK to be quite speculative. And I don't have a principled objection to business leaders running research councils -- with an outsider you get a sporting chance that the person concerned doesn't imagine he knows everything about science. The point he should be making is that we need to recognise that scientific progress is unpredictable, and you can't plan it the way you plan (say) a construction project. And also, "investing in research" is more analogous to premium bonds than it is to saving in an interest-bearing account.

Thursday, May 07, 2009

Paperless communications

For our last department meeting we got sent the agenda as an email attachment, and I must admit to not reading it; that's the kind of thing that really ought to be distributed on paper, in order to attract loyal readers. Let's face it, just about any communication is more likely to be read if the sender feels the need to print it off rather than just send electronic copy. I reckon that Dagstuhl's reluctance to use this newfangled "Internet" for the purpose of distributing its invitations to workshops, plays a part in its brand recognition and loyalty. On the downside, to agree to attend seminar 10101 I had to search the stationery closet for an envelope (just underneath the spare daisy-wheels and typewriter ribbons), find it's so old that the glue no longer works, then try to remind myself how to use the fax machine... But once that's done, I feel kind of committed.

Wednesday, April 29, 2009

Miscellaneous political stuff


Eminent economist Willem Buiter gives an impassioned rant against Google on his blog. I like Buiter's blog, but I'm still a fan of Google. Concerning the invasion-of-privacy issue, I am still more concerned with the Government's tendency to invade our privacy -- if you don't like some private company, at least you don't have to use it.

This article is attracting attention on academic blogs; here is a lengthy discussion. My thoughts - it is healthy to get challenges to some of the ingrained assumptions about how academic life ought to be. Most of the proposals are a bit off the wall and lacking in focus, but I think it is well worth questioning the merit of academic tenure in the US. As Craig Venter remarked in his autobiography, really smart people don't need tenure in order to keep them motivated, or to maintain their job security (I am paraphrasing from memory).

From Policy Exchange, the think tank that gave us the notorious Cities Unlimited report (which I discussed earlier), comes this report, on why British universities should be allowed to go bankrupt and close down. The fundamental error in the report is its failure to recognise that in reality, universities can already close down in all but name -- it may well be that there are legal obstacles to making one cease to exist as a legal entity, but for all practical purposes, bankruptcy would be just like for a private company.

Monday, April 20, 2009

ACM-EC'09 papers

By now the list of accepted papers for ACM-EC is last week's news, already mentioned here and here. A nice feature of that list is that there is a version of it complete with abstracts, so we can get a better feel for what sort of papers appear there. I tried reading them; let me have a go at making comments (this year, I am an impartial observer: I did not submit a paper or serve on the program committee).

It's easy to classify papers according to who wrote them, harder to classify them according to (more importantly!) what they are about. Out of 40 papers, just over 31 come from the USA (dividing up credit for a paper equally amongst its co-authors), followed by Israel on just over 3, then England takes third place with just under 2, leaving France, Germany, etc simply nowhere.
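The country scores above use fractional counting; a small sketch of that bookkeeping (affiliations made up for illustration):

```python
# Each paper contributes one point, split equally among its co-authors; each
# author's share is credited to their country.
from collections import defaultdict

def country_credit(papers):
    credit = defaultdict(float)
    for author_countries in papers:
        for c in author_countries:
            credit[c] += 1.0 / len(author_countries)
    return dict(credit)

print(country_credit([["USA", "USA", "Israel"],
                      ["England"],
                      ["USA", "France"]]))
# {'USA': 1.166..., 'Israel': 0.333..., 'England': 1.0, 'France': 0.5}
```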

There's a big emphasis on mechanism design, and most of the papers address the issue of truthful mechanisms in various contexts. Only one of the abstracts mentions price of anarchy; another paper is about price of mediation (ratio of social cost of correlated equilibria to social cost of Nash equilibria). Maybe that kind of topic is more likely to go to SAGT these days(?). (However, just to keep up the competitive pressure, I just got an announcement (via the comsoc mailing list) for AMMA, the first Conference on Auctions, Market Mechanisms and Their Applications.) Returning to EC, I see maybe 2 papers relating to coalitional games, another 2 to market equilibria, one on voting... actually, it's kind of hard to find a feature that is shared by approximately half of the papers. I wondered about classifying them according to the number of agents they study, i.e. 2, 3, n or infinity, but in nearly all cases it seems to be n.

Wednesday, April 08, 2009

Localising Nash equilibria

In an epsilon-Nash equilibrium, the players choose distributions over their actions in such a way that no player can gain more than epsilon (in expected utility) by switching to some alternative strategy. But in general, there is no guarantee that an epsilon-Nash equilibrium is anywhere near a true Nash equilibrium. ("near" can be defined by using the variation distance between two probability distributions.)
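For bimatrix games the epsilon of a given profile is easy to compute, which is what makes an epsilon-Nash equilibrium its own certificate (a minimal sketch; the game below is made up):

```python
# epsilon = the larger of the two players' incentives to deviate: best pure
# response payoff minus current expected payoff, for row (A) and column (B).
import numpy as np

def regret(A, B, x, y):
    row_gap = (A @ y).max() - x @ A @ y
    col_gap = (x @ B).max() - x @ B @ y
    return max(row_gap, col_gap)

A = np.array([[3.0, 0.0], [5.0, 1.0]])   # prisoner's-dilemma-style payoffs
B = A.T
x = y = np.array([0.5, 0.5])
print(regret(A, B, x, y))                # 0.75: uniform play is 0.75-Nash here
```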

After my talk at BCTCS (on algorithms for computing approximate Nash equilibria) Arno Pauly asked me the following interesting question. How hard is it to find an epsilon-Nash equilibrium that is guaranteed to be within epsilon of a true Nash equilibrium? (My immediate answer: don't know.)

Focus on the 2-player case. The problem is in TFNP, because a non-deterministic Turing machine can find a genuine Nash equilibrium, which of course qualifies. But there's an interesting twist here, which is that a solution to the problem cannot, on its own, in any obvious way, be checked in deterministic polynomial time. This is in contrast to most search problems. A solution does not seem to constitute its own certificate. By contrast, an unrestricted epsilon-Nash equilibrium is its own certificate.

I guess that any NP search problem (not just the total ones) has related problems where a solution cannot be checked in deterministic poly-time. E.g. SAT: suppose you ask for an assignment to the propositional variables that is Hamming distance at most n/4 from a satisfying assignment. The problem is still in NP, but you can't (seemingly) deterministically check a solution. The problem of finding an approximation to a genuine Nash equilibrium looks more natural, of course.

So, is this problem any harder than finding an unrestricted epsilon-Nash equilibrium? As explained in the talk, little is known about the complexity of finding epsilon-Nash equilibria, although the fact that they can be found in subexponential time for any epsilon>0 (by exhaustively searching over strategy profiles with supports of logarithmic size) suggests that maybe the answer is that the problem should be in P.

On the other hand, finding epsilon-approximations to true Nash equilibria feels like it ought to be tougher. One possibility that I would be tempted to look for: any such algorithm can be modified to find genuine Nash equilibria. A possible analogy is with boosting in computational learning theory, where, in Schapire's classical paper, a weak PAC algorithm that is only guaranteed to do slightly better than random guessing can have its performance boosted (by running it repeatedly on variously filtered data) to achieve standard PAC learning.

Monday, April 06, 2009

BCTCS 2009

I am attending the 25th British Colloquium for Theoretical Computer Science (today thru Thursday). From the web page, it looks like the meeting has nearly 100 participants, quite a healthy number. I've not been for some time; last time I went to BCTCS the conference was at Liverpool and I worked at Warwick; now it's the other way around. So, a good time to meet up with various people I've not seen for a while.

Monday, March 16, 2009

Wiebe Fest 2009

Here at Liverpool Computer Science we just had Wiebe Fest 2009, "A workshop in honour of our friend and colleague Prof. Wiebe Van Der Hoek on the occasion of his 50th birthday" (that is from the front cover of the proceedings). Occasions like this are a wonderful morale-booster: they remind us of the intrinsic interest of our research topics and that we have some great colleagues, and they distract our attention from all the nonsense that besets the academic world. (This event was arranged as a complete surprise for Wiebe, and I did not have much more notice than he did; my only contribution was to turn up to it about half an hour late. photos.)

Wednesday, March 11, 2009

Senate meeting

I attended my first meeting of Senate at Liverpool University - at the beginning I was introduced by the vice-chancellor, along with about 5 other new members, and this was obviously the main highlight of the meeting.

At Liverpool, Senate is described on the web pages as "a governing body of the university". I was also a member of Senate back at Warwick University - there it is "the supreme academic authority" of the university - and how could anyone possibly refuse the chance to join such an august body? In practice, university senates spend most of their time noting and receiving reports from various academic subcommittees. It's as boring as hell, quite frankly.

Warwick and Liverpool senates differ in the physical format of the meetings. At Warwick, there were about 40 of us sitting around a table, and the vice-chancellor (who chairs the meeting) was flanked by the registrar, the deputy VC and a couple of secretaries. Over here, the format seems to put the VC on the spot - he stands in front of an audience with only a single assistant off to one side to count the votes. Also, at Liverpool every prof is automatically a member of Senate, so a well-attended meeting can attract 200 people. I am reliably informed that most meetings are less well-attended than this one was.

The most memorable Senate meeting I attended at Warwick took place about 10 years ago, and was interrupted by a student demo against tuition fees. The then VC, Sir Brian Follett, had just left so that we could come to some agreement on who would be acting VC when he retired, and while he was out, about 300 students occupied the building. They let it be known that no-one would be allowed to enter or leave until we passed some form of resolution condemning tuition fees, which was the Government's big idea of the time. (Pretty quickly, someone spotted a technical obstacle to complying with this demand, namely that the meeting's chairman was unavoidably detained outside.) When we found out what was going on, I remember catching the eye of another young lecturer (this was about 10 years ago, remember), and our common reaction was dismay that we were now so old as to be on the receiving end of a student protest - the horror!

These days of course, I accept the onset of middle age with all its associated inconveniences. There were about 300 students outside the Victoria building protesting against the proposal to discontinue 8 of our RAE "units of assessment", up for discussion at Senate. The idea is that we should concentrate on things we are good at, one of which is Computer Science, so I am biased in that regard. This was not the only item of business (it was item 16 on the agenda) but it was definitely a crowd-puller. The Jack Leggate theatre provides grand surroundings for making impassioned speeches about the ideals of universities, although the acoustics are not great, and were not helped by the demo and nearby road works.

The resulting discussion was interesting, touching on basic issues of the nature and objectives of universities, and the obligations that a university has to its academic staff to insulate them from the real world. Many speeches were made, some of them passionate. I did not speak at the meeting; the following are my thoughts on a few of the points raised.

Point raised: "A Russell group university cannot do without a philosophy department" (one of the "units of assessment" threatened with closure), and related: "A civic university needs to have a philosophy department" ["civic" seems to mean: attracts local students who can't afford to live further afield, so if they want to study philosophy, where are they supposed to go?]. My answer to this is: the worst attitude you can have in an organisation is "my employer can't fire me, I'm indispensable". And w.r.t. civic universities in particular, it is most important that Liverpool University should have higher ambitions that this - it is essential for the regional economy that it continue to have an international scope and outlook.

Point raised: "The RAE results are inaccurate and do not reflect quality of research." My answer: Everyone, at some point, has to accept external scrutiny of the quality of their work. The alternative, allowing someone to always certify the quality of his own work, is patently absurd. And while RAE scores are inaccurate, they are not off by more than about 20%. If your score is terrible you are not great, you are at best mediocre.

Point raised: "All this controversy will deter prospective students". My answer: In the short term, perhaps. In the long term it will be worse if we give the impression we just do not care about research quality, and are afraid to deal with problems.

Saturday, March 07, 2009

some last notes on EWSCS

The British Colloquium for Theoretical Computer Science (BCTCS) is the nearest British event to EWSCS. The main difference is that BCTCS consists mainly of contributed talks; at EWSCS the 5 mini-courses took up most of the time, with just 8 student presentations. Making the slides for my introduction to computational game theory took more time than I'd anticipated, probably because previous talks I've done on the topic have been research talks, with less emphasis on a detailed introduction to the background. The format in which lecturers give 4.5 hours of lectures seems like a good idea, but requires quite a lot of care on the part of lecturers - if you get it wrong, the loss is of course bigger than for just a 1-hour talk.

The students who attended EWSCS seemed to be well-prepared and asked plenty of searching questions. The Russian contingent all came from St Petersburg, probably due to its proximity; also, this seems to be something of a tradition. At least 2 of them were undergraduates, who had heard about this event through being in a club for students whose interest in CS goes beyond the content of their degree course - such an organisation must be a good way to find prospective PhD students; probably better than just looking for students who got good marks, which correlates positively but weakly with interest in pursuing further research.

I came home late on Friday, after passing on the opportunity to visit Tallinn's historic centre; that will have to wait until another trip. On the way back to the airport (via the bus station) there is no hint of this tourist attraction - just a gloomy procession of rust-stained, Soviet-era concrete apartment blocks, going on for mile after mile. (I did not bother to check that these buildings were Soviet-era; indeed, the phrase is essentially a handy label for all the concrete stumps that to this day, continue to go up in all parts of the world. Le Corbusier has a lot to answer for.)

I thank Helger Lipmaa for inviting me to lecture at EWSCS 2009.

Wednesday, March 04, 2009

CRAPcon'09


I just attended CRAPcon 2009, an entertaining event co-located with EWSCS. Details are at the web site. (For completeness, here are links to CRAPcon 2008 and CRAPcon 2007.)

Monday, March 02, 2009

Winter school; pre-PhD training

They do winters properly here in Estonia - there's plenty of snow on the ground, and the ice on the lake looks solid enough to walk on, not that I would try it.

There are about 40 students here, mainly from Eastern Europe. They seemed to like my 1st talk, an intro to Nash equilibrium, and asked plenty of questions. Time to dust off my pet peeve, as I confirm that PhD students outside the UK are much better prepared than UK ones. (Helger Lipmaa, one of the local organisers, seems to have spotted this problem earlier on, from back when he worked in the UK.) In Estonia and Latvia, 4 years u/g and 2 years masters is standard. In Sweden it's 3+2, but they leave high school at 19 (a year later than in most countries). Russia: 4+2+3 (for u/g, masters and PhD respectively). The Bologna agreement is for 3+2+4 in Europe, according to Helger. Which of the following will come first to Britain?

  • The euro

  • driving on the right-hand side of the road

  • the Bologna process


...answers on a postcard please. I'm not sure the following is true - it sounds too good to be true: a UK student can go and take a 2-year Masters degree in Sweden, without paying tuition fees (just personal expenses), and the teaching is in English! A year or 2 ago, I heard a similar claim concerning undergrad study in Denmark. It gives me some new ideas about where to send my kids to university, anyway.

Friday, February 27, 2009

Estonia Winter School

Here is an advert for Christos Papadimitriou's talk at Liverpool next week.

...to my shame, I will not be there; I will be giving lectures at the 14th Estonia Winter School in Computer Science. This school is for research students mainly from Eastern Europe, and my lectures will be an intro to game theory and computational complexity, including an introduction to the complexity classes of total search problems introduced by Christos in the early 90s.

The school is located at Palmse Manor, a remarkably isolated-looking spot (based on checking google maps) about 80km east of Tallinn.

Sunday, February 22, 2009

A mild defense of bibliometrics

In an ideal world, bibliometrics would be a character from Asterix (the village librarian, presumably). In real life, they are one of the dark clouds that hang over the academic world, and give it cause for concern. They are going to be used in Britain's Research Excellence Framework, which may be enough to antagonize a few people. And yet, I'd like to propose that they have some merit; to make a half-hearted case in their favour. I don't have a particularly good bibliometric standing myself, so this defense is not motivated by self-interest.

Here are two places where I have recently seen bibliometrics coming under fire. This post at Luca Aceto's blog has a go at them, and links to a statement by the editorial board of a journal that criticises them. More recently, I have been following a debate that was begun by a letter to the Times Higher a week ago that criticised the emphasis on economic benefit of research proposals in funding decisions; one place this debate has been continued is here at Steven Hill's blog. Hill commends bibliometrics as evidence that Britain's scientific prowess is undiminished; his opponents challenge this evidence. Hill's status as a funding chief marks him out as the villain of this particular battle, thus bibliometrics, as a measure of a person's scientific contribution, are further tainted.

So what do I like about them, then? Actually, as a measure of scientific contribution, they are indeed inaccurate. What I like is not so much the way they measure a researcher as the way they incentivise one. Let's accept as a given that one's scientific output must, from time to time, be assessed for quality. The act of measuring will inevitably affect behaviour, since the measurement is announced ex ante, so that measurement-as-incentive is just as important as measurement for the sake of measurement, if not more so. Now, to boost your citation count, what must you do? Write stuff that other people find interesting, seems like an obvious answer. This seems positively virtuous. (An alternative way to measure a person's research output, which is not unfamiliar to most of the scientific community, is to compute his research grant income. Grant income may indeed correlate with research quality, but it seems clear that the pursuit of grant income is by no means as socially virtuous as the pursuit of citations.)

To develop this observation about measurement-as-incentive in more detail, consider the h-index, for which I will now include a definition for the sake of completeness. A person's h-index is the largest value of N such that he/she has written at least N papers, each of which has been cited at least N times. Again, as a measure there are problems with this - if my h-index is 10 and I write a new paper that picks up fewer than 11 citations, it cannot improve my h-index. But surely that paper should make a positive contribution to my research output? Yes, but the h-index encourages a researcher to "raise his game" as his h-index increases; the better you do, the more ambitious your next effort should be -- and this seems like a good thing to encourage.
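(In code, the definition is a few lines; a minimal sketch in Python, with made-up citation counts:)

    def h_index(citations):
        # the largest N such that at least N papers have >= N citations each
        counts = sorted(citations, reverse=True)
        h = 0
        for i, c in enumerate(counts, start=1):
            if c >= i:
                h = i
        return h

    print(h_index([25, 8, 5, 3, 3]))      # 3
    # adding a new paper with only 3 citations leaves the index unchanged:
    print(h_index([25, 8, 5, 3, 3, 3]))   # still 3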

Now, let me turn to the most obvious fault of bibliometrics, which is their weak correlation with research quality. My hope, as a technocrat, is that technology will in due course alleviate this problem. Let me give one example: their current failure to distinguish between a high-quality and a low-quality citation. By way of example (I exaggerate to make the distinction obvious) a paper may be cited in a context like: "...finally, the papers [4,5,6,11,12,13,17,24] also study minor variations of this model, thus providing useful evidence that this is a 'hot topic'." Or it may be cited in a context like: "We make repeated use of a fundamental and masterful theorem of Haskins [13], without which the present paper could never have been written." Both of these sentences would result in [13] picking up a single citation; clearly, one would like the second to carry more weight. In the future, better computer analysis of citation contexts may well allow this distinction to be made. One may also hope that better citation analysis will be able to detect other undesirable artifacts, such as a flurry of mediocre papers that all cite the previous ones but have little or no outside influence. Another idea I have is a simple one - take the age of a cited paper into account. A citation should be more valuable when you cite a 10-year-old paper than when you cite a 1-year-old one.
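(To illustrate that last idea, here is one crude way it might be encoded - the linear weighting and the factor of 10 below are entirely arbitrary choices of mine, not a scheme from the literature:)

    def weighted_citations(publication_year, citing_years):
        # each citation counts for more, the older the cited paper
        # was at the time the citation was made
        return sum(1.0 + (year - publication_year) / 10.0
                   for year in citing_years)

    # a citation to a 10-year-old paper counts double...
    print(weighted_citations(1998, [2008]))   # 2.0
    # ...while a citation to a 1-year-old paper barely counts extra
    print(weighted_citations(1998, [1999]))   # 1.1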

Finally, a possible strength of bibliometrics, acknowledged in Oded Goldreich's (otherwise critical) essay On the Evaluation of Scientific Work, is that the errors they make are the work of many, not the work of a few, as is the case for expert evaluation. So, can "the wisdom of crowds" be harnessed more effectively? Perhaps. Indeed, making it work could turn out to be quite a technically interesting problem.

Tuesday, February 17, 2009

Overseas research proposals good, domestic ones bad

Simon Rattle once observed that Liverpool is a city that turns its back to England, and looks outward to the rest of the world. At any rate, I vaguely recall some such sentiment being attributed to him in a concert programme that I was perusing a few months ago. If there's anything to it, I offer the following in the best Liverpudlian spirit.

I got an email asking me to review a research proposal from a foreign country's equivalent of EPSRC. Without having perused the proposal in any detail, I feel much happier about having to review it than I would for one submitted to EPSRC. EPSRC, and probably most other national research funding organisations, are biased towards getting reviews from within their own country. But there are good reasons for seeking them from outside.

The main reason: foreign reviewers do not have a conflict of interest. This problem is particularly acute in the UK, where the value of research grants has become extremely inflated as a result of "full economic costing". In the zero-sum game of research funding, when a rival institution attracts a grant, one's own institution is disadvantaged as a direct consequence. This is especially true in the present economic climate, which may be roughly characterized as "everyone is running out of money". Ignoring this problem increasingly requires one to take up residence in an ivory tower, and become the sort of complacent academic who imagines that higher education is recession-proof. The other obvious source of conflict of interest is that one's own taxes are being used to fund the proposed research. Clearly, this problem goes away when one reviews a foreign research proposal.

It enlarges the pool of expert reviewers. Most individuals' research interests are highly specialised, and in a globalised scientific community there is no particular reason to assume that there exist any reviewers within one's own country who are competent to do the job. I've got an example in mind; the less said about that the better.

It exposes a national research community to external scrutiny. This is related to the previous point, but by no means the same. A national research agenda can become misguided, or a line of research of questionable value can thrive unchecked, within a system where such scrutiny is absent.

There are practical hurdles to the globalisation of research-grant reviewing that I am advocating. At a meeting I attended a few years ago, an EPSRC official was asked whether it was OK to list overseas reviewers on the usual list of potential reviewers that forms part of a proposal, and he or she advised that they had trouble getting reviews from foreign reviewers. Still, that's how we review research papers. One might also object that a foreigner might fail to appreciate the "grade inflation" whereby a proposal that is not rated very highly is likely to fail. But anything that reverses that grade inflation, and allows reviewers to use the rating system in a more balanced way, would surely be a fine thing.

Thursday, January 29, 2009

New Research Group: Economics and Computation

Concepts from economics (markets, incentives, social welfare, etc.) have gained increasing currency within computer science. We now have quite a well-developed research community at this interface, with meetings like ACM-EC, WINE and SAGT having arisen to cater to it.

Here at Liverpool we're starting up a new research group in the Computer Science dept: Economics and Computation (EcCo). Initially, Piotr Krysta and I are the members, and we are now looking to recruit two further faculty members --- see below! And of course, later we hope to attract students and RAs, and grow by another 2-3 faculty members.

The new group builds on existing expertise and research activity here, and thus there are excellent opportunities for collaborations with other current research groups here, notably the Complexity Theory and Algorithms Group and the Agent Applications, Research and Technology group. I can promise a great research environment, and am happy to answer questions from potential candidates about any aspect of these posts.


Here is a copy of an ad that has been emailed out to various mailing lists:


THE UNIVERSITY OF LIVERPOOL -- DEPARTMENT OF COMPUTER SCIENCE

LECTURESHIPS IN ECONOMICS AND COMPUTATION (2 POSTS)

SALARY IN RANGE GBP 30,594 - GBP 35,469 pa

The Department of Computer Science was ranked in the top 10 UK Computer Science departments in RAE2008, and building on this success, we seek to significantly expand current research around existing strengths at the intersection of economics/game theory and computer science. The successful candidates will join Professor Paul Goldberg and Dr Piotr Krysta in establishing a new research group in this area.

The group will carry out research in the computational foundations of economics/game theory and economic theory in computer science. The group will enjoy close collaborative links with the existing Complexity, Theory and Algorithms research group and the Agent Applications, Research, and Technology research group. Relevant topics of interest include (but are not restricted to):

** algorithmic game theory;

** mechanism design and auction theory;

** complexity and computation of solution concepts;

** optimization problems in economics; and

** computational social choice.

You should have an excellent track record of research at the intersection of computer science and economics/game theory, and will join a world-class Department.

The post attracts a special HEFCE-funded `Golden Hello' to the value of GBP 9K, subject to individuals satisfying the eligibility criteria.

Job Ref: A-569104 ** Closing Date: 27 February 2009 **

For informal discussions please contact Prof Paul Goldberg, Head of Group (mailto:goldberg@liverpool.ac.uk).

For full details, or to request an application pack, visit

http://www.liv.ac.uk/working/job_vacancies/

or

e-mail jobs@liv.ac.uk
tel 0151 794 2210 (24 hr answerphone).

Please quote Job Ref A-569104 in all enquiries.

Tuesday, January 27, 2009

Is green the new black?

Compared with other vices, envy gets a bad press. For years, the phrase "politics of envy" has been a trusty weapon of political discourse, wheeled out again and again whenever someone rails against excessive boardroom pay rises, or suggests raising the top level of income tax. Recently, things have changed - articles like this one are becoming fashionable, as we find out about the pay received by top bankers whose organizations are now being bailed out by the taxpayer.

It is a fair counter-argument that the excessive bonuses are a small percentage of the overall losses that banks are passing on to the public, and that to curb them would not cut the taxpayer's bill very much. But, there is a growing understanding that curbing other people's excessive bonuses simply feels good for its own sake -- it requires no financial justification. We don't penalise the bankers in order to save money -- we do so because it makes us happy.

In psychology and economics, envy means being prepared to pay a small amount of money in order to see someone else lose a larger amount of money. The above attitude appears to qualify, therefore: if you get your kicks out of seeing some overpaid banker have to give up his 100-foot yacht, then there exists some sufficiently small amount of money that you would be prepared to pay for the spectacle.

Maybe this will provide a boost for envy-related research? There's an interesting line of research that has attracted some computer scientists, on the problem of envy-free cake cutting. The general topic is: how do you generalize the I-cut-you-choose idea when there are more than 2 people who want to share a divisible good (such as a cake)? I just read a very nice paper: Thou Shalt Covet Thy Neighbor's Cake, by Ariel Procaccia. It shows that envy-freeness is harder to achieve than proportionality, in the sense that it requires more cuts as a function of the number of players. (Envy-freeness means that everyone can guarantee that they value their own share at least as much as anyone else's; proportionality means that everyone can guarantee a "fair share", but with the risk that they end up thinking certain other players did better. So, proportionality is a weaker requirement.) The paper uses the Robertson-Webb model of cake-cutting procedure, in which a centralized algorithm is allowed to submit certain queries to the players about what value they would assign to various parts of the cake. For me, a very interesting aspect of this model is the parallels with the query protocols that have been studied in computational learning theory.
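(For just 2 players, I-cut-you-choose is already envy-free, and is easy to phrase in terms of Robertson-Webb-style queries. A minimal sketch in Python - the cake is the interval [0,1], and the query-function names are my own invention:)

    def cut_and_choose(eval1, eval2, cut1):
        # eval_i(a, b): player i's value for the piece [a, b]  (an "eval" query)
        # cut1(a, v): the point x with eval1(a, x) = v          (a "cut" query)
        # Player 1 cuts the cake into two pieces she values equally;
        # player 2 takes whichever piece he prefers. Neither envies the other.
        mid = cut1(0.0, eval1(0.0, 1.0) / 2.0)
        if eval2(0.0, mid) >= eval2(mid, 1.0):
            return (mid, 1.0), (0.0, mid)   # (player 1's piece, player 2's piece)
        return (0.0, mid), (mid, 1.0)

    # if both players value the cake uniformly, the cut falls at 0.5
    uniform = lambda a, b: b - a
    print(cut_and_choose(uniform, uniform, lambda a, v: a + v))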

Thursday, January 22, 2009

Exams, migration

My exam on Decision, Computation and Language took place yesterday, co-located with another one on Animal Parasitology. I could hopefully have scraped together a few marks had I taken the rival paper; I could probably make a few constructive comments on the distinction between parasites, parasitoids and predators, not to mention lice and ticks. Not sure how those biologists would have gotten on with regular expressions for words that contain "aaa" as a substring, and the like. After that, I listened to a talk by former Liverpool postdoc Edith Elkind on "beyond weighted voting games". She convinced me (not during the talk) that Singapore is, for climate, a nice place at this time of year, or failing that, Southampton. One good thing about heading south for the winter is that your reduced domestic heating probably offsets the carbon you burn in order to travel.
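(For the curious biologists, such things are easily tested; a minimal sketch in Python, where I have assumed the alphabet {a, b} - my choice here, not necessarily the exam's:)

    import re

    # words over {a, b} that contain "aaa" as a substring
    pattern = re.compile(r"[ab]*aaa[ab]*")

    for w in ["babaaab", "aabaab", "aaa"]:
        print(w, bool(pattern.fullmatch(w)))
    # babaaab True, aabaab False, aaa True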

The latest CACM merits inspection. (Added a day later: that is, the Feb 09 issue, which became available online a day before my paper copy of the Jan 09 issue arrived! The Jan issue has an article on "Computational Challenges in E-Commerce" that I really should read.)

Friday, January 09, 2009

RAE Sector Overview report

Here is where one can download the "subject overview reports" from the RAE panels. Panel F is the one that includes Computer Science. It is a brief 4-page summary of their general findings about the state of CS research in the UK. One paragraph notes:

There was a significant increase in research in algorithms and complexity, including the establishment of several new centres of excellence. This increased activity was not, however, limited to the research groups specialising in this area, but permeated research in many other areas, such as data-mining, vision, evolutionary computing, agents, etc.

This increase came about as a result of the 2001 International Review of UK Research in Computer Science, which (rightly) pointed out that algorithms and complexity was, at the time, under-represented here. But the most encouraging aspect of the above observation is the bit about algorithmic analysis gaining recognition for its importance outside of its own community.

Another paragraph noted what might be regarded as a more general trend, namely that the research being assessed was generally more rigorous than in the previous RAE (2001), i.e. more mathematical analysis, and more careful and detailed experiments.