The first part of this blog exposed some of the hidden assumptions underlying the Solarium Commission’s recommendations and provided a general overview of the report. Now we take a closer look at two sets of recommendations in particular: those dealing with information sharing and those centralizing authority in CISA.

Information Sharing

Among the slew of buzzwords encountered in the field of cybersecurity is the notion of information sharing. Harmonizing information sharing, security-related or otherwise, is arguably one of the biggest challenges facing the federal government today, which is why the Solarium Commission’s treatment of the subject deserves some reflection. To their credit, Solarium experts highlight the importance of sharing information across branches of the federal government and the private sector and acknowledge the need to address the related problem of over-classification of information (p. 99). The commission’s solution is a bill making information sharing mandatory in the Defense Industrial Base (DIB). How? The report proposes a “joint collaborative environment,” a cloud platform that shares and fuses threat information across the federal government and the private sector.
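To make the “share and fuse” idea concrete, here is a minimal sketch of what a single fused threat record in such an environment might look like. Everything in it – the FusedIndicator structure, the field names, and the merge rule – is an illustrative assumption on our part; the report does not specify a data model.

```python
from dataclasses import dataclass, field

@dataclass
class FusedIndicator:
    """One threat indicator as it might look in a shared environment (illustrative only)."""
    value: str                                 # e.g., a suspicious IP address or file hash
    kind: str                                  # "ip", "domain", "sha256", ...
    sources: set = field(default_factory=set)  # who reported it (agency, ISAC, vendor)
    sightings: int = 0                         # how many independent reports were fused

def fuse(reports):
    """Merge duplicate reports of the same indicator coming from different contributors."""
    merged = {}
    for r in reports:
        key = (r.kind, r.value)
        if key not in merged:
            merged[key] = FusedIndicator(value=r.value, kind=r.kind)
        merged[key].sources |= r.sources
        merged[key].sightings += r.sightings or 1
    return list(merged.values())

# Example: the same IP reported by a federal agency and a private-sector ISAC
reports = [
    FusedIndicator("203.0.113.7", "ip", sources={"CISA"}),
    FusedIndicator("203.0.113.7", "ip", sources={"FS-ISAC"}),
]
print(fuse(reports))  # one record, two sources, two sightings
```

Even this toy version hints at the hard questions the report glosses over: who gets to see the sources field, at what classification level, and under what liability regime.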

While superficially appealing, this proposal reflects an old and oft-repeated tech fallacy, and it is worth explaining why the commission fell prey to it. Three aspects of the discussion need to be kept apart, given the distinct considerations each entails: 1) public sector Knowledge Management (KM); 2) private sector information sharing; and 3) the intersection of the two.

Public sector KM

Efforts to “improve information sharing” for homeland security go back at least as far as the 9/11 attacks. A 2003 Government Accountability Office (GAO) report claimed that “Information on threats, methods, and techniques of terrorists is not routinely shared; and the information that is shared is not perceived as timely, accurate, or relevant.” After the Homeland Security Act, the Federal Enterprise Architecture (EA) explicitly mandated procedures for the transfer and sharing of information between government agencies, subject to review by the GAO.

Redesigning the processes and operations of government organizations so that they align with changing IT capabilities starts with applying general principles of KM to government, and those principles are typically laid out in the EA. The Federal EA is the policy framework at the core of inter-agency knowledge transfer and information sharing. It ensures that investments in IT are tied to the President’s agenda and sets out knowledge management as one of the four capabilities under its Service Component Reference Model.

In 2007, a GAO report on the Department of Homeland Security (DHS) Enterprise Architecture stated that “investing in IT programs without having an EA to guide the process often results in systems that are duplicative, not well integrated, unnecessarily costly to maintain, and limited in terms of meeting mission needs and optimizing mission performance.” If information security is ever to be integrated into an overall public-organization strategy, it would have to be conceived at the level of the Federal EA. Notably, the EA is not mentioned once in the Cyber Solarium Commission report.

Calling for more – and mandatory – information sharing while remaining oblivious to the history of prior attempts within the federal government is bad enough. The problem is compounded by the fact that an estimated 85% of critical infrastructure is owned and operated by the private sector. Consequently, information sharing must take place within the context of public-private collaboration, which makes the problem even more complex. Can this bigger problem be solved while the crucial structural and procedural considerations of governmental KM are ignored?

Set your own house in order before you regulate the world 

After the infamous 2015 OPM hack – an APT intrusion that exposed millions of records of highly sensitive personnel data – the GAO conducted a performance audit of 23 civilian agencies in late 2018 and found egregious shortcomings: federal agencies were not implementing basic cyber hygiene. While one may be inclined to blame DHS for failing to adequately blow the whistle, DHS is supposed to provide support through the National Cybersecurity Protection System (NCPS); the actual implementation remains each agency’s responsibility. When logs are fastidiously collected but never matched against a threat signature, or when basic Intrusion Detection Systems are simply not turned on, one can invariably point to a badly integrated EA as the cause. Every administration has its take on “big government,” and while the Trump administration may opt to ‘streamline’, there is no excuse for bad management of this kind.
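For readers who have never seen it done, the sketch below shows the bare-minimum version of that missing step – matching collected logs against a known threat signature. The log format and the indicator list are made up for illustration; this is not how NCPS works internally.

```python
# Minimal sketch: scan collected logs for known bad indicators.
# The indicator set and log format are hypothetical, purely for illustration.
KNOWN_BAD_IPS = {"198.51.100.23", "203.0.113.7"}

def match_logs(log_lines, indicators=KNOWN_BAD_IPS):
    """Return (line_number, line) pairs whose source IP appears in the threat feed."""
    hits = []
    for lineno, line in enumerate(log_lines, start=1):
        parts = line.split()  # assume "<timestamp> <src_ip> <action>"
        if len(parts) >= 2 and parts[1] in indicators:
            hits.append((lineno, line))
    return hits

logs = [
    "2020-03-12T08:01:44Z 10.0.0.5 GET /index.html",
    "2020-03-12T08:01:59Z 203.0.113.7 POST /login",
]
print(match_logs(logs))  # flags the second line
```

The point is not that this step is technically hard; it is that collecting logs without ever running it is an organizational failure, not a technological one.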

The IT productivity paradox of the 1990s tells the story of how investments in IT did not correlate with gains in productivity when the requisite organizational changes were not made. That is why proposals such as the Modernizing Government Technology (MGT) Act look good politically but are dubious in terms of results. The same fallacy is perpetuated in the Cyber Solarium report: one of its recommendations (p. 86) proposes introducing new tech (in the form of a new cloud) and modernizing old IT for local governments, as if the technology alone would deliver results. These recommendations should be viewed with a very skeptical eye, despite the many parties eager to jump on the ‘IT upgrade’ bandwagon.

We cannot assume that a “certified” cloud solution will magically solve the problem of sharing classified information and translating it into non-classified, actionable information. There is another problem, however. How will every part of the Department of Defense Information Network, the DIB, and private-sector CIP providers participate in a joint cloud whose user-access control policies map information to the right clearance level, while avoiding Snowden-magnitude disasters? Also, imagine a paper trail of information identifiers created between information sharer and information receiver on a need-to-know basis: how are we to prevent dangerous information asymmetries from arising ex ante if the penalties are legally arbitrated after the fact? Disincentives only work when responsibility is clearly assigned. The 9/11 Commission report was very thorough in finding out how information wasn’t shared, but that only became evident after the fact and after a lot of finger-pointing. An all-‘sticks’, no-‘carrots’ approach to policymaking has no teeth, given that a failure to share timely information could only be established after a major disaster.
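To see how quickly the clearance-mapping problem explodes, consider a deliberately naive sketch of clearance-based access control with an audit trail. The ordering of classification levels is standard; the check itself, the compartment logic, and the audit format are assumptions of ours, not a description of any DoD system.

```python
import datetime

# Deliberately naive clearance check with an audit trail (the "paper trail" above).
LEVELS = {"UNCLASSIFIED": 0, "CONFIDENTIAL": 1, "SECRET": 2, "TOP SECRET": 3}
AUDIT_LOG = []  # who asked for what, and whether access was granted

def can_access(user_clearance, user_compartments, doc_level, doc_compartments):
    """Grant access only if clearance is high enough AND need-to-know compartments match."""
    level_ok = LEVELS[user_clearance] >= LEVELS[doc_level]
    need_to_know = doc_compartments <= user_compartments  # subset check
    return level_ok and need_to_know

def request(user, clearance, compartments, doc_id, doc_level, doc_compartments):
    granted = can_access(clearance, compartments, doc_level, doc_compartments)
    AUDIT_LOG.append({
        "time": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "user": user,
        "doc": doc_id,
        "granted": granted,
    })
    return granted

# A SECRET-cleared analyst without the right compartment is denied, and the denial is logged.
print(request("analyst1", "SECRET", {"CYBER"}, "doc-42", "SECRET", {"CYBER", "HUMINT"}))
print(AUDIT_LOG[-1])
```

Now multiply this by every enclave of the DoDIN, every DIB contractor, and every private CIP operator, and the governance problem dwarfs the engineering one.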

While a cloud certification scheme may, in theory, be good for security, the report should have focused on carefully considering rules of information sharing (mandating vs. incentivizing) as well as the organizational structures, processes, and human information networks supporting the entire system.

Is centralization the answer? 

The report sets out to centralize authority in, and increase funding for, DHS’s Cybersecurity and Infrastructure Security Agency (CISA). Ironically, the report calls for “speed and agility” while simultaneously demanding further centralization of authority; those things don’t mix. Whether this is grandstanding or simply empty rhetoric, it should be noted that information relevant to cybersecurity is distributed throughout different sectors of the economy, where experts draw on all kinds of subtypes of information. In government, we already have 15 years of experience with DHS (now CISA), the Office of the Director of National Intelligence, and Information Sharing and Analysis Centers (ISACs). In the private sector, there are already dozens of open-source and commercial threat-intel services and all kinds of informal networks for sharing cybersecurity information. Centralizing all of this might make sense from a SIGINT/national security standpoint, especially given the NSA’s dual-hat functions, but it does nothing for public-private information sharing. Five years of not very salutary experience with the 2015 Cybersecurity Information Sharing Act’s attempt to centralize information sharing in the Automated Indicator Sharing (AIS) program does not inspire confidence. Genuine ‘speed and agility’ is a byproduct of decentralized organizational structures, not of further centralization.

Interestingly enough, the report recognizes that CISA’s capacity to facilitate public-private collaboration “is not widely understood or consistently recognized.” Yet the report still proposes to throw money at the problem. As it points out, critical infrastructure is mostly owned by the private sector; yet beyond further centralization at CISA, no high-level guidance is provided on how Sector-Specific Agencies (SSAs) are supposed to go about getting the private sector to collaborate willingly.

A National Cyber Director?

Since its inception, the Critical Infrastructure Protection (CIP) regime has mobilized the different sectors on a voluntary and cooperative basis. The catch is that attention is divided, collaborative authority is dispersed, and incentives for action are misaligned. A “czar” over federal CIP could foster cohesion and coordination across the sectors. Instead, the commission recommends the creation of a National Cyber Director responsible for one sector only. This would not solve the aforementioned collective-action problems, even though cyber permeates all the other sectors.

Info sharing and org theory

Information sharing is not simply a technical problem; rather, it is a multifaceted social challenge that needs to be addressed at the intersection of organizational and institutional theory. Organizations’ incentives to share information or collaborate typically depend on the distributional outcomes of the rules in place and on actors’ ability and willingness to change those rules through iterative bargaining. The shortcomings of the Strategic Infrastructure Coordinating Council (SICC) provide a good example. The SICC is a loose structure set up after 9/11 and staffed by CEOs operating across three “lifeline sectors”: finance, electric power, and telecommunications. These are the sectors the commission now refers to as “systemically important critical infrastructure” because of their interdependence. A Fortune 500 CEO’s motivation for participating stems from a desire to maintain some control and preempt the regulatory environment (lest the government do it for them). The problem is that SICC members’ involvement was sporadic at best: the telecommunications sector was not committed to those meetings, citing liability and privacy issues (despite the 2015 Cybersecurity Information Sharing Act).

When stakeholders can change rules and affect distributional benefits (as in the energy sector), they tend to participate. When the stakeholder groups are large and diverse (as in telecommunications), they perceive their involvement as wasteful, given how unlikely a win-win outcome is. DOE’s Cybersecurity Risk Information Sharing Program (CRISP) tells a related story. The program can detect and isolate threat signatures by analyzing enormous amounts of data from across the electric grid and associate them with distinct threat actors; but how can we expect the national labs to collaborate informally and share information if they are competing for dollars?
When it comes to information sharing for cyber investigations and audit trails, public agencies and private-sector operators work at cross-purposes, and their incentives are misaligned. It is natural to expect a private infrastructure operator to want to shut a threat down as quickly as possible, whereas a public agency may prefer to let the situation evolve in order to investigate it and gather intelligence.

When it comes to organization theory, there are two meaningful levels of analysis. At the intra-organizational level, we can consider new ways of rewarding people for their ideas in order to encourage learning and sharing. Fostering an open information-sharing culture built on trust is a hard challenge, whether the information shared is threat intelligence or technical know-how. How do we set up mechanisms that credit whoever shares useful information? Special attention should be given to boundary workers who wear dual hats and operate at the intersections between parts of the organization.

At the strategic, inter-organizational level, we might consider how public-private responsibilities are codified. The commission report states on page 16 that: “Afraid of creating moral hazard, the federal government invests little in protecting the cybersecurity of commercial infrastructure or key systems controlled by states and local municipalities”. A valuable insight, but the report falls short of providing any useful recommendations on how to achieve sector coordination. The economic literature on formal Public-Private Partnerships (PPPs) can help frame policy that would one day assign specific control or property rights for the secure operation of certain critical infrastructure sectors. In a nutshell, when control rights are allocated to whoever has the highest willingness to pay for a project, economic surplus is maximized: whichever entity values security most will have the incentive to invest optimally in its provision (a stylized sketch of this logic follows below). The question of how to allocate control or property rights for critical infrastructure, including how security is valued in the first place, is a complex one. A sophisticated discussion of public goods in this context needs to distinguish the national interest (what cybersecurity protects) from national security (what critical infrastructure protection is about). Lessons on economic modeling and valuation can also be extrapolated from research on cyber-insurance.
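To make the willingness-to-pay argument slightly more precise, here is a stylized formulation under strong simplifying assumptions of our own (two candidate holders of control rights, transferable utility, a convex cost of security effort, and valuations that are known). It is a sketch of the intuition, not a model taken from the PPP literature or the report.

```latex
% Two candidate holders of control rights, i in {G, P} (government, private operator).
% Party i values security effort e at v_i * e and bears a convex cost c(e).
% Whoever holds the rights chooses effort to maximize its own payoff:
\[
  e_i^{*} \;=\; \arg\max_{e \ge 0}\; \bigl( v_i\, e - c(e) \bigr)
  \qquad\Longrightarrow\qquad
  c'\!\bigl(e_i^{*}\bigr) = v_i .
\]
% Because the maximized payoff is increasing in v_i, assigning the control rights to the
% party with the highest valuation (willingness to pay) yields the largest surplus:
\[
  i^{*} \;=\; \arg\max_{i \in \{G,P\}} \bigl( v_i\, e_i^{*} - c(e_i^{*}) \bigr)
  \;=\; \arg\max_{i \in \{G,P\}} v_i .
\]
```

The hard part, as noted above, is that nobody quite knows how to measure those valuations when the “project” is the security of a lifeline sector.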

This is not to dismiss the countless hours of work undertaken by the commission, but to impart two important lessons. First, we already have tools for valid policy analysis at our disposal. The comprehensive knowledge base of organization and institutional theory addresses information management within hierarchical systems inside and out; it would be advisable to pay heed to it, lest we repeat the same pattern every 10 years. Second, and at the risk of wishful thinking, before passing any bills that mandate information sharing, Congress should make sure a public comment period allows the IT lobby, academia, and civil society to weigh in: disciplinary expertise needs to cut through political rhetoric.