The NIPS consistency experiment was an amazing, courageous move by the organizers this year to quantify the randomness in the review process. They split the program committee down the middle, effectively forming two independent program committees. Most submitted papers were assigned to a single side, but 10% of submissions (166 papers) were reviewed by both halves of the committee. This let them observe how consistent the two committees were on which papers to accept. For fairness, they ultimately accepted any paper that was accepted by either committee.

The results were revealed this week: of the 166 papers, the two committees disagreed on the fates of 43 of them (25.9%). But this “25%” number is misleading, and most people I’ve talked to have misunderstood it: it actually means that the two committees disagreed more than they agreed on which papers to accept.
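A back-of-the-envelope calculation shows why. Assuming an acceptance rate of roughly 22.5% (the figure commonly cited for NIPS that year; the post itself doesn't give per-committee counts), each committee accepted about 37 of the 166 papers:

```python
# Rough arithmetic behind the "disagreed more than they agreed" claim.
# The 22.5% acceptance rate is an assumption, not stated in the post.
papers = 166
disagreements = 43                        # accepted by exactly one committee

accepted_per_committee = 0.225 * papers   # ~37 papers each
# If each committee accepts k papers and d papers are accepted by exactly
# one committee, the number accepted by BOTH is k - d/2.
accepted_by_both = accepted_per_committee - disagreements / 2

print(round(accepted_per_committee))      # ~37 accepted per committee
print(round(accepted_by_both))            # ~16 accepted by both
```

So of the papers accepted by at least one committee, the 43 disputed ones outnumber the roughly 16 that both committees agreed to accept.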

Read the full article “The NIPS Experiment” by Moritz Hardt.

>50% of NIPS papers would be rejected if the review process were rerun


During World War Two, conscientious objectors in the US and the UK were asked to volunteer for medical research. In one project in the US, young men were starved for six months to help experts decide how to treat victims of mass starvation in Europe.

Read the full article on BBC News.

The Minnesota starvation experiment


In January, Ranganath took on the task of building a prototype for a new Foursquare app. By the spring, even he had to admit that the project was a mess. It caused batteries to drain after just a few hours. It gave bad directions. It sent alerts at the wrong times — tossing users recommendations for a nearby fashion boutique when they were comfortably seated at a bar around the corner.

The problem was the method the prototype was using to identify location — a straightforward combination of GPS, Wi-Fi signals, and cell towers. It couldn’t always find the right signals, and even if it did, it tended to seriously drain the battery as it searched.

But when Ranganath told Shaw about the problems, the data scientist had an idea. Why not take a shortcut? Foursquare already had a massive database of check-ins — location information about the places its users most liked to go. And this data didn’t just include the place where someone had checked in. It showed how strong the GPS signal was at the time, how strong each surrounding Wi-Fi hotspot signal was, what local cell towers were nearby, and so on. Leveraging this data meant that Foursquare could still grab a good current location even if users were underground, near a source of radio interference, or facing some other signal obstacle. Chances are, some prior Foursquare user had seen the world through the same flawed eyes and reported his or her location.
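The check-in-as-fingerprint idea amounts to a nearest-neighbor lookup: compare the signals you see right now against the signals past users recorded at known venues. A minimal sketch, with invented venue names and signal values (this is illustrative, not Foursquare's actual code):

```python
# Illustrative fingerprint matching, not Foursquare's actual implementation.
# Each historical check-in stores the venue plus the Wi-Fi signal strengths
# (in dBm, keyed by a hotspot identifier) observed at the time.
checkins = [
    {"venue": "corner bar",       "wifi": {"aa:11": -40, "bb:22": -70}},
    {"venue": "fashion boutique", "wifi": {"aa:11": -75, "cc:33": -45}},
]

def match_venue(scan, history):
    """Return the venue whose stored fingerprint best matches `scan`."""
    def distance(fingerprint):
        shared = set(scan) & set(fingerprint)
        if not shared:
            return float("inf")  # no hotspots in common: can't compare
        # Mean squared difference in signal strength over shared hotspots.
        return sum((scan[h] - fingerprint[h]) ** 2 for h in shared) / len(shared)
    return min(history, key=lambda c: distance(c["wifi"]))["venue"]

# A current scan that resembles the bar's fingerprint, not the boutique's:
print(match_venue({"aa:11": -42, "bb:22": -68}, checkins))  # corner bar
```

Because the lookup only compares a fresh scan against stored signal snapshots, it works even when GPS is unavailable, and it avoids the battery cost of a prolonged signal search.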

Read the full article on

The Brilliant Hack That Brought Foursquare Back From the Dead


Most setup guides for Nginx tell you the basics – apt-get a package, modify a few lines here and there, and you’ve got a web server! And, in most cases, a vanilla nginx install will work just fine for serving your website. However, if you’re REALLY trying to squeeze performance out of nginx, you’ll have to go a few steps further. In this guide, I’ll explain which settings in nginx can be fine-tuned in order to optimize performance for handling a large number of clients. As a note, this isn’t a comprehensive guide for fine-tuning. It’s a [brief] overview of some settings that can be tuned in order to improve performance. Your mileage may vary.
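As a taste of the kind of knobs the guide covers, here is a minimal sketch of worker and connection tuning; the values are illustrative starting points, not the article's recommendations:

```nginx
# Illustrative tuning sketch; values are starting points, not prescriptions.
worker_processes auto;          # one worker per CPU core
worker_rlimit_nofile 65535;     # raise the per-worker open-file limit

events {
    worker_connections 4096;    # connections each worker may hold open
    multi_accept on;            # accept as many new connections as possible
}
```

Roughly, max simultaneous clients is bounded by worker_processes × worker_connections, which is why these two directives come up first in most tuning guides.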

Read the full article on Zachary Orr.

Battle ready Nginx – an optimization guide