cognitive science and more

When a manuscript is submitted to an academic journal, it generally undergoes peer review. That is, the manuscript is read by other researchers who rate the quality of the manuscript, suggest improvements, etc. Based on these peer reviews, the editor of the journal then either rejects the paper, or accepts it for publication, usually after one or two rounds of revision.

A thorny question in this process is whether reviewers should remain anonymous, or whether they should disclose their identity by signing their reviews. There is a widespread belief that signing reviews is dangerous, because authors may not appreciate your critical comments, and may even retaliate, for example by trashing your manuscript when it's their turn to review. This would be especially dangerous for early-career researchers, who don't have permanent positions, and are therefore vulnerable to career damage.

A few days ago, Hilda Bastian voiced this concern in a blog post. I feel that her post is somewhat alarmist, in the sense that it starts from the assumption that signing reviews is indeed dangerous. (Although she also points out that it can in some cases help to build a reputation.)

And this prompted me to share my own experiences here.

Robert De Niro as an easily offended author.

I have signed all of my reviews, starting from the very first one, which I believe was in 2011 when I was still a junior PhD student. Since then I've reviewed about 100 manuscripts and grant proposals, most of them undergoing multiple rounds …

Read more »

Why do the US and the UK dominate the World University Rankings?

Every year, all universities in the world are ranked by academic excellence. (I'm only going to use the term 'academic excellence' once, because every time I write it, I vomit a little in my mouth.) These rankings are created by three self-appointed authorities: QS, Times Higher Education, and the Shanghai Ranking.

Below you can see where the Top 20 universities come from:

The most striking feature of these rankings is that the Top 20 consists almost entirely of US and UK institutes. In fact, only five countries appear in any of the three Top 20s: US (42×), UK (11×), Switzerland (4×), Singapore (2×), and Australia (1×). And the three Top 3s even consist entirely of US (6×) and UK (3×) institutes.

Not all universities are equally good, and I have no problem accepting that Oxford (#6 according to QS) is in all respects a better university than my previous academic home of Aix-Marseille Université (#411-420 according to QS). And universities may, on average and by some measures, be a bit better in one country than another. That's fine.

But the suggestion that, to a good approximation, the US and the UK are the only countries in the world where you can find good universities is ridiculous. What about Japan? What about Germany? What about France—how did Emmanuel Macron become president after receiving an education (well, if you can call it that!) at lowly Paris Nanterre (#801-1000 according to QS)? Compare that to Donald Trump, who was educated at …

Read more »

Mind the gap! Income inequality in the European Union

Inequality is a hot topic. Prominent economists, such as Thomas Piketty and Joseph Stiglitz, have raised public awareness of economic inequality: differences between people in income (what you earn) and capital (what you own). They and others have shown that economic inequality is not only huge, but steadily increasing: the proverbial "one percent" own more and more of everything there is to own. Others have focused on social inequality, such as differences in the opportunities given to women and ethnic minorities, compared (generally) to white men.

Social and economic inequality are, of course, related; in a sense, economic inequality is a subtype of social inequality. But it's a subtype that is (relatively) easy to quantify—and that's exactly what I will try to do in this post, using data from Eurostat, a public database curated by the European Commission.
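One standard way to quantify income inequality within a country is the Gini coefficient, which Eurostat also reports. As a minimal sketch of how it can be computed from a list of incomes (the function name and the sample incomes are my own illustration, not necessarily the measure or data used below):

```python
def gini(incomes):
    """Gini coefficient: 0 means perfect equality;
    values approaching 1 mean extreme inequality."""
    xs = sorted(incomes)
    n = len(xs)
    total = sum(xs)
    # Cumulative sum weighted by rank: richer people, who come
    # later in the sorted list, contribute with a larger weight.
    weighted = sum(i * x for i, x in enumerate(xs, start=1))
    return (2 * weighted) / (n * total) - (n + 1) / n

# Four identical incomes: no inequality at all.
print(gini([100, 100, 100, 100]))  # 0.0
# One person earns everything: close to maximal inequality.
print(gini([0, 0, 0, 100]))        # 0.75
```

A single number like this hides a lot of detail, of course, but it makes it possible to compare inequality across countries at a glance.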

We all have some gut feeling of the kinds of inequalities that exist: You probably don't need to see an income distribution to know that, on average, men earn more than women for the same job. But how much more? Do men make 50% more, or only 5%? And how does this compare to differences between people with different levels of education? Or to differences between countries? Or to differences between the very rich and the very poor within a country?

It's important to have some idea of the magnitude of these different kinds of inequality, because only then can you have an informed debate.

(Disclaimer: This is my best attempt …

Read more »

About (the fact) that

A famous writer (I forgot who or where) once complained about not being able to avoid 'the fact that' in his writing. Such inelegance! Yet he just couldn't bring himself to remove it from every sentence—which he could have done, because reducing 'the fact that' to a plain 'that' always results in a grammatically correct sentence. In fact, I once worked with a copy editor who did just that: She returned my manuscript with every instance of 'the fact that' reduced to 'that'. In all cases, the result was grammatically correct. But in some cases, the result was also atrocious.

So why do some sentences just seem to require a 'the fact that', even when it is grammatically redundant and almost universally despised? I have given this matter a disproportionate amount of thought, and arrived at the conclusion that it is all about expectations.

The word 'that' can have several grammatical roles. It can be an adjective, as in: 'that capybara'. (Which capybara? That one!) Or it can be a conjunction, which is a word that introduces a subclause, as in: 'Do you know that capybaras are the largest rodents?' ('That' can also be a pronoun, of course, but let's forget about that for now.)

Now here's the thing: You often don't know which role 'that' has until you've read the entire sentence. And that's confusing. For example, after reading 'I like that …', you still don't know whether 'that' will be an adjective ('I like that capybara') or a …

Read more »

The inner world of very small creatures

I spent the past few days trying to photograph very small things. I recently bought a camera, and—like any amateur photographer—quickly discovered that close-ups look amazing. But there are only so many close-ups of bees and flowers you can take before it gets old. So I became hungry for even more extreme close-ups. And because I'm not one to exercise restraint, I bought a camera extension for getting really, really close to your model—which can consequently be really, really tiny.

In principle, I can now take close-ups of very, very small things. In practice, I rarely manage, because this requires perfect lighting, a completely stable image, and some idea of what you're doing. (And probably an even better camera.)

But I managed to take a few semi-decent photos. Not BBC-nature-documentary quality. But they got me thinking.

The creatures in the photos are, I think, different kinds of aphids, all less than a millimeter long (except the winged one below, which was a bit larger). To the naked eye, they look like slowly moving motes of dust.

To give you some idea of how tiny these creatures are, here's a clumsy aphid who got stuck on a hair while traversing the mountainous ridges of my index finger.

My best guesstimate is that their brains are less than one hundred micrometers in size, and consist of about 100,000 neurons. (Which may seem like a lot, but the human brain consists of about 100 billion neurons, or a million times more.)

But …

Read more »