Dan Kaminsky seems to have rocked the cyber-world with a presentation at Black Hat in Las Vegas. The security expert received a massive amount of publicity for “releasing” – er, talking about – a free software tool he is calling N00ter. N00ter is supposed to be incredibly exciting because it can detect when an Internet service provider (ISP) is slowing down or speeding up traffic to and from a website.

We found it really hard to get excited about this.

First, we are talking about vaporware here. Kaminsky hasn't actually released his tool yet, so no one can tell whether it has the magnificent properties he claims for it. I am sure he will eventually release something – “in coming weeks,” as he is reported to have said – but we don't know what it is, how usable it is, or anything about its performance. And we don't have a store of data it generated to test for anomalies and problems. Wouldn't it be nice if such breathless reports were based on actual facts rather than a bunch of claims made in a conference presentation?

Second, and more importantly, there are at least half a dozen other tools out there already that do the same thing. Kaminsky's presentation – and the fawning, pack-journalism reportage on it – made it seem as if he was the pioneering originator of the very idea of developing tests for network-based discrimination among protocols and services. The Forbes article, at least, noted that the FCC has already called for an open competition to develop such tools – but still, such tools have been around for at least four years. Check out Google's Measurement Lab site, for example, which hosts both Glasnost and ShaperProbe. The EFF has a tool called Switzerland. There are others.

Our research project on deep packet inspection (DPI) has used the data generated by Glasnost extensively. Developed by researchers at Germany's Max Planck Institute for Software Systems, Glasnost does a rough but workable job of detecting interference with BitTorrent – it has an error rate of 5-10%. More recent updates extended the testing to six or seven other protocols. Based on our real-world experience with this tool, we greet Kaminsky's claims with skepticism.

The article quotes Kaminsky as saying, “Bing is fifty milliseconds slower than Google. Is this the ISP or the million other things that could be slowing the site down?” Kaminsky goes on to claim: “We will always be able to know if an ISP is changing your traffic.” As someone familiar with the concept of measurement error and with the variability of conditions on the Internet, I seriously doubt that any tool could infallibly attribute a 50 ms difference in website delivery speed to an ISP policy. Only two conditions would make that possible: a) strictly controlled, artificially simplified lab conditions, or b) seeing a difference of the same magnitude hold up over tens of thousands of independent measurements from thousands of different sources at many different times. If Kaminsky thinks he can run one test, say “aha!” and point a finger at an ISP, he needs to learn a bit more about how real social science is conducted. Things are not so easy in the wild.
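The measurement-error point can be made concrete with a small simulation. The numbers below are illustrative assumptions, not real measurements: two sites whose true latencies differ by 50 ms, observed through 80 ms of random jitter – jitter that can easily swamp the effect being measured. A single probe tells you almost nothing; only averaging over thousands of probes recovers the real gap.

```python
import random
import statistics

def measure(base_ms, jitter_ms, n, rng):
    """Simulate n latency probes: a fixed base latency plus Gaussian jitter."""
    return [base_ms + rng.gauss(0, jitter_ms) for _ in range(n)]

rng = random.Random(42)

# Hypothetical sites: a true 50 ms difference, hidden under 80 ms of jitter.
site_a = measure(100, 80, 10_000, rng)
site_b = measure(150, 80, 10_000, rng)

# One pair of probes is dominated by noise and can even have the wrong sign...
single_diff = site_b[0] - site_a[0]

# ...but the mean over thousands of probes converges on the true 50 ms gap.
mean_diff = statistics.mean(site_b) - statistics.mean(site_a)

print(f"single probe: {single_diff:.0f} ms, mean of 10,000: {mean_diff:.0f} ms")
```

This is the social-science point in miniature: attributing a 50 ms difference to ISP policy requires enough independent measurements to beat down the noise, not one test and an “aha!”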

Third, the idea of testing for minute differences in website delivery speed shows that Kaminsky and other would-be net neutrality posses may be a couple of steps behind those cattle-rustling, black hat ISPs. What the more sophisticated ISPs are doing with DPI and traffic-shaping technologies is focused more on monetization than on speed discrimination. It's actually pretty difficult to think of a business case for an ISP to make some random website 50 ms slower, or even a full second slower, than another; it's not clear that any payoff from that would be worth the trouble and risk. ISPs have a lot of other tricks up their sleeves, many of which have nothing to do with speed differences, such as advertising-related redirection of DNS queries or disabling mobile phones so they can't use Skype over WiFi. Some of the most interesting and likely-to-be-controversial interventions will involve discrimination among subscribers, not among websites. Bit caps may be enforced through subscriber awareness, for example. U.S. ISPs, in cahoots with copyright holders, have threatened to slow users' speeds if they exceed a “six strikes” threshold. How will Kaminsky's tool make that transparent? ISPs may offer unique services at prioritized speeds, but those services may not be directly comparable – in the protocol used or the access mechanism – to services offered at other sites. Of what use will a simple speed test be in that case?

But maybe we are underestimating the estimable Kaminsky. It may be that this security researcher, who gained fame as the discoverer of a “bug” that made the domain name system vulnerable to attack, has uncovered another dangerous vulnerability. Perhaps his demonstration in Las Vegas revealed that even the most astute technical experts and the media that cover them can be unduly influenced by a “man in the spotlight” attack, which cache-poisons Google searches on important topics and makes reporters and techies alike the slaves of popular personalities. Perhaps Dan is subtly renewing his fame by calling our attention to this devastating vulnerability.