Too many techies still don't understand the concept of due process, and opportunistic law enforcement agencies, which tend to view due process constraints as an inconvenience, are very happy to take advantage of that. That's the lesson to draw from VeriSign's sudden withdrawal yesterday of its proposed new “domain name anti-abuse policy.”
On September 12, China, the Russian Federation, Tajikistan and Uzbekistan released a resolution for the UN General Assembly entitled “International code of conduct for information security.” The resolution proposes a voluntary, 12-point code of conduct based on “the need to prevent the potential use of information and communication technologies for purposes that are inconsistent with the objectives of maintaining international stability and security and may adversely affect the integrity of the infrastructure within States…” The Code seems intended to preserve and protect national sovereignty over information and communication.
That was the eye-catching subject line of a recent note from Randy Bush to the North American Network Operators Group (NANOG) about secure Border Gateway Protocol (S-BGP). His note critiqued a paper, Let the Market Drive Deployment: A Strategy for Transitioning to BGP Security, recently presented at SIGCOMM and NANOG meetings. The paper argued that under certain conditions, the transition to secure Internet routing could be driven by ISPs' incentive to increase their revenue-generating traffic. But as Bush noted, focusing on the economic incentives affecting ISP routing decisions under S-BGP may miss the point. For him, the problem of secure routing deployment is grounded in the economic and institutional issues surrounding the Resource Public Key Infrastructure (RPKI), something we identified in a paper earlier this year. While there certainly is a need to understand the micro-foundations of adopting Internet security standards like RPKI, S-BGP or DNSSEC, understanding and resolving the institutional problems must happen simultaneously.
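For readers who have not followed the RPKI debate, the technical core of the system is route origin validation: a relying party checks a BGP announcement against cryptographically signed Route Origin Authorizations (ROAs). Here is a rough sketch of that classification logic, following the states defined in RFC 6811; the ROA entries and AS numbers are made up for illustration, and real deployments obtain ROAs from validated RPKI repositories rather than a hard-coded list.

```python
import ipaddress

# Hypothetical, simplified ROAs: (prefix, max length, authorized origin AS).
# In practice these come from cryptographically validated RPKI repositories.
ROAS = [
    (ipaddress.ip_network("192.0.2.0/24"), 24, 64496),
    (ipaddress.ip_network("198.51.100.0/22"), 24, 64497),
]

def origin_validation(prefix: str, origin_as: int) -> str:
    """Classify a BGP announcement as valid / invalid / not-found (RFC 6811)."""
    net = ipaddress.ip_network(prefix)
    covered = False
    for roa_net, max_len, roa_as in ROAS:
        if net.subnet_of(roa_net):  # some ROA covers this prefix
            covered = True
            if net.prefixlen <= max_len and origin_as == roa_as:
                return "valid"
    return "invalid" if covered else "not-found"

print(origin_validation("192.0.2.0/24", 64496))    # valid
print(origin_validation("192.0.2.0/24", 64511))    # invalid (wrong origin)
print(origin_validation("203.0.113.0/24", 64496))  # not-found (no covering ROA)
```

The mechanism itself is simple; the institutional question is who runs the repositories and certificate hierarchy that make those ROAs trustworthy.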
Dan Kaminsky seems to have rocked the cyber-world with a presentation at Black Hat in Las Vegas. The security expert received a massive amount of publicity for “releasing” – er, talking about – a free software tool he is calling N00ter. N00ter is supposed to be incredibly exciting because it can detect when an Internet service provider (ISP) is slowing down or speeding up traffic to and from a website.
We found it really hard to get excited about this.
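Why the yawn? Measuring differential treatment is conceptually simple: fetch the same content along a path the ISP can classify and along one it cannot, and compare. The sketch below is not how N00ter itself is implemented; it is a back-of-the-envelope illustration of that comparison, with a hypothetical test URL and a hypothetical tunnel endpoint standing in for the "neutral" path.

```python
import time
import urllib.request

# Hypothetical test payload and tunnel endpoint; both are placeholders.
DIRECT_URL = "http://example.com/testfile.bin"
TUNNEL_PROXY = {"http": "http://my-tunnel.example.net:8080"}

def throughput(url, proxies=None):
    """Fetch url (optionally via a proxy) and return bytes per second."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler(proxies or {}))
    start = time.monotonic()
    data = opener.open(url, timeout=30).read()
    return len(data) / (time.monotonic() - start)

direct = throughput(DIRECT_URL)
tunneled = throughput(DIRECT_URL, TUNNEL_PROXY)

# A large, repeatable gap suggests differential treatment on the path;
# a single measurement proves nothing (congestion, caching, peering).
print(f"direct/tunneled throughput ratio: {direct / tunneled:.2f}")
```

The hard part is not the measurement but ruling out the many innocent explanations for a performance gap, which is one reason the announcement left us cold.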
Jeff Moss is famous in the security community as the founder of DEF CON and Black Hat. He is in Internet governance news today because ICANN has just hired him as its new “Chief Security Officer.” The corporation has issued a self-congratulatory news release, prepared by its London public relations firm, in which various prominent people effusively praise the hire. We offer up our own observations and a cautionary note.
As expected, VeriSign placed its key material in the root zone yesterday. Secure resolvers can now authenticate the .com key starting from the root zone and validate DNSSEC-secured domains in the .com zone. Certainly a big accomplishment for the technical community. But a big question still remains: is there any incentive for resolvers to validate?
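For the curious, here is a minimal sketch of what the validation step looks like using the dnspython library (which needs its cryptography dependency installed). It only verifies that the .com DNSKEY RRset is correctly self-signed; a real validating resolver additionally walks the chain of trust from the root zone's key through the DS record for .com, which this toy check skips. The resolver address is an arbitrary choice of a DNSSEC-aware recursive server.

```python
import dns.dnssec
import dns.message
import dns.name
import dns.query
import dns.rdataclass
import dns.rdatatype

RESOLVER = "8.8.8.8"  # any DNSSEC-aware recursive resolver will do
ZONE = dns.name.from_text("com.")

# Ask for the zone's DNSKEY RRset with the DO bit set so RRSIGs come back;
# TCP avoids truncation of the large DNSKEY response.
query = dns.message.make_query(ZONE, dns.rdatatype.DNSKEY, want_dnssec=True)
response = dns.query.tcp(query, RESOLVER, timeout=10)

dnskey = response.find_rrset(response.answer, ZONE,
                             dns.rdataclass.IN, dns.rdatatype.DNSKEY)
rrsig = response.find_rrset(response.answer, ZONE,
                            dns.rdataclass.IN, dns.rdatatype.RRSIG,
                            dns.rdatatype.DNSKEY)

# Check that the RRSIG over the DNSKEY RRset verifies against the keys
# themselves; raises dns.dnssec.ValidationFailure on a bad signature.
dns.dnssec.validate(dnskey, rrsig, {ZONE: dnskey})
print(".com DNSKEY RRset signature verifies")
```

Running checks like this costs resolver operators CPU and support calls while the benefits accrue to others, which is exactly the incentive problem the question above points at.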
As mentioned briefly in a post last Friday, our recently completed study on ISP botnet mitigation showed that between 5 and 10 percent of all broadband subscribers in the Netherlands had their machines recruited into a botnet at some point in 2009. This week we offer a closer look at that finding, which is conservatively based on the unique IP sources present in three distinct datasets of malicious network hosts: a large spam trap, the DShield distributed intrusion detection system, and Conficker sinkholes. Our results indicate that, from an economic perspective, the use of automation in botnet mitigation has an interesting effect on the incentives of ISPs. Read on to find out more and download the report.
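To give a flavor of the kind of counting involved (the report's actual methodology is considerably more careful), here is a toy sketch that attributes malicious source IPs to ISPs and tallies unique addresses per ISP. The prefix map and the sightings feed are hypothetical stand-ins for BGP/WHOIS data and the three datasets named above.

```python
import ipaddress
from collections import defaultdict

# Hypothetical mapping of ISP access networks to ISP names; real studies
# derive this from BGP routing tables and WHOIS allocations.
ISP_PREFIXES = {
    ipaddress.ip_network("192.0.2.0/24"): "ISP-A",
    ipaddress.ip_network("198.51.100.0/24"): "ISP-B",
}

# Hypothetical merged feed of malicious sources (spam trap, DShield,
# Conficker sinkholes), one (date, source IP) pair per sighting.
SIGHTINGS = [
    ("2009-03-01", "192.0.2.10"),
    ("2009-03-01", "192.0.2.11"),
    ("2009-03-02", "192.0.2.10"),
    ("2009-03-02", "198.51.100.7"),
]

unique_per_isp = defaultdict(set)
for _date, ip in SIGHTINGS:
    addr = ipaddress.ip_address(ip)
    for net, isp in ISP_PREFIXES.items():
        if addr in net:
            unique_per_isp[isp].add(addr)

# Caveat: unique IPs seen over a long window are a noisy proxy for infected
# machines, since DHCP churn hands one infected host many addresses over
# time; counts like these have to be interpreted with that in mind.
for isp, ips in sorted(unique_per_isp.items()):
    print(f"{isp}: {len(ips)} unique malicious source IPs")
```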
Last night I got a chance to view the excellent 2009 documentary film “The Most Dangerous Man in America: Daniel Ellsberg and the Pentagon Papers.”
Of course, it is impossible to mention the Pentagon Papers now without thinking “Wikileaks,” and I admit that it was an interest in the parallels and differences between the cases that put that selection in my Netflix queue. It turned out to be a far more rewarding choice than I had expected. The film brings the 40-year-old Ellsberg/Pentagon Papers sequence of events to life as vividly as the Private Manning/Wikileaks case is alive now, and without that historical knowledge and context one's awareness of the Wikileaks case is impoverished. A fascinating aspect of this film is the way it documents how different the technological and publishing environment was, yet one is also struck by the similarities in the political debate. Despite efforts to drive a wedge between Ellsberg and Wikileaks, this documentary, made more than a year before the Wikileaks controversy hit, shows how fundamentally similar the cases are.