Friday, February 26, 2010

US Federal agencies and cloud computing: Reason to be cautious

Cloud computing in the enterprise has generated plenty of hype – and plenty of eye-rolling among wary CIOs of federal government agencies. Federal IT leaders continue to express legitimate concerns about the effectiveness of cloud environments in securing sensitive government and consumer data. Many are also skeptical of the promised cost savings and service quality.

Amid the healthy skepticism, however, lies increasing pressure to give serious consideration to cloud computing initiatives. The Obama Administration is focused on lowering the cost of government operations while driving innovation. Many local, state and federal government agencies are evaluating cloud computing, though few are actually implementing cloud-based solutions at the federal level.

While there are no quick answers, there are some clear steps federal agencies – not just CIOs, but all senior decision-makers – can take now to prepare their organizations for cloud-based computing environments. Here are three key points federal agencies should consider in an effort to break through the hype and lay the groundwork for a clear, reasoned path to cloud computing.

1. It’s OK to say no.

Federal agencies under pressure to embrace cloud computing are fully justified in saying no. The fact is that most federal agencies should not venture into a “classic” public cloud any time soon, for a simple reason: incumbent cloud service providers such as Google, Amazon and Microsoft do not comply with the current Certification & Accreditation rules dictated by the Federal Information Security Management Act (FISMA). This makes the public cloud a non-starter for most federal organizations, while providing a clear litmus test for the future viability of cloud providers.

To their credit, commercial cloud providers are making some progress with regard to federal compliance. Both Microsoft and Google, for example, are close to receiving accreditation for FISMA compliance, and Google has reportedly completed a System Security Plan for its Google Apps platform.

Key issues around compliance involve where the data actually resides in a commercial or public cloud environment and how vulnerable it will be to cyber attacks. CIOs are rightly concerned that their data could be stolen by hackers, mixed with data from their cloud providers’ other customers, or inadvertently exposed. The recent cyber attacks on Google and other organizations emanating from China will do nothing to quell concerns around moving sensitive information into cloud environments.

As Robert Carey, CIO of the US Navy, noted at a federal executive forum in November 2009: “Public clouds are not necessarily appropriate for Army or Navy information to be just sitting out there, and therefore the models that you would use to describe the security of that information might have to, and probably will have to, change a great deal.”

2. Not all clouds are created equal.

Although the public cloud is not a near-term option for federal agencies, other options do exist to help federal and intelligence agencies gain more flexibility and achieve cost efficiencies for their IT infrastructures.

The basic technologies of a cloud environment – high-speed Internet connections, browsers, grid computing and virtualization – are well established and can be duplicated by any organization. This makes it possible for government agencies or departments to build “private clouds” – infrastructures that use cloud technologies but are more aligned with current security requirements.

Adopting virtualization and other cloud technologies, even in a closed environment, could increase efficiencies and reduce infrastructure costs by breaking down the silos created through the use of proprietary technologies. Consider the efficiencies of something as basic as a shared email system; these and other non-core applications are good candidates for early migration into private cloud environments.

The DoD’s Defense Information Systems Agency (DISA) is moving toward what arguably could be the world’s largest private cloud, as it looks to integrate the Defense Information Systems Network, its data centers, and its command and control applications. This cloud-based approach to sharing information could serve as a model for other agencies – and for commercial providers as well.

The type of information each agency handles will influence its approach to cloud solutions. Just as most governments tag data with different levels of sensitivity, from low level (published widely, with no restrictions) to ultra-secure (classified information for top government leaders only), they can also begin to design cloud architectures for different levels of information. For example:

·         Any agency that deals largely in public-domain information – such as the EPA, the Census Bureau, or the Department of the Interior – could serve as a “canary in a coal mine” test case for a cloud infrastructure.
·         The GSA’s recent launch of apps.gov – a storefront for cloud-based business, productivity and other applications featuring non-sensitive data – is an example of where federal agencies can begin testing the waters. The GSA has already seen positive results from moving the USA.gov portal to the cloud. The transition significantly lowered GSA's costs, saving taxpayers an estimated $1.7 million annually.
·         New public programs that must launch quickly to meet legislative mandates – Cash for Clunkers is one recent example – are candidates for testing the ability of cloud solutions to scale quickly.

On the other hand, Homeland Security and other agencies that deal with ultra-secure data have little or no incentive to consider a cloud solution. In situations where data security presents an untenable risk, the government may choose to pursue other avenues, such as optimizing the infrastructure already in place using traditional IT practices, or decoupling data from processing so that public or private cloud infrastructure can be used without jeopardizing data security.
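The “decoupling data and processing” idea above can be sketched in code. The example below is a minimal illustration, not any agency’s actual practice: sensitive field values are replaced with opaque tokens before records leave the agency, the token map stays on premises, and results coming back from the cloud are re-identified locally. The record fields and function names are hypothetical.

```python
import secrets


def tokenize(records, sensitive_fields):
    """Replace sensitive field values with opaque tokens.

    The token map stays on premises; only tokenized records
    would be sent to cloud infrastructure for processing.
    """
    token_map = {}  # token -> original value, kept locally
    tokenized = []
    for record in records:
        safe = dict(record)
        for field in sensitive_fields:
            if field in safe:
                token = secrets.token_hex(8)
                token_map[token] = safe[field]
                safe[field] = token
        tokenized.append(safe)
    return tokenized, token_map


def detokenize(records, token_map):
    """Restore original values once results return from the cloud."""
    return [
        {key: token_map.get(value, value) for key, value in record.items()}
        for record in records
    ]


# Example: the SSN never leaves the agency in clear form.
records = [{"name": "Jane Doe", "ssn": "123-45-6789", "zip": "20500"}]
safe_records, token_map = tokenize(records, sensitive_fields=["ssn"])
assert safe_records[0]["ssn"] != "123-45-6789"
assert detokenize(safe_records, token_map)[0]["ssn"] == "123-45-6789"
```

The design choice here is that the cloud provider only ever sees meaningless tokens, so the data-security question shifts from “can the provider protect our data?” to “can we protect a small token map on our own infrastructure?” – typically a much easier problem.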

3. Who’s accountable?

The issue of governance is perhaps the biggest obstacle to federal agency adoption of cloud computing. Regulatory mandates require that an agency know precisely who has access to its data and where that data resides, both physically and logically. Some providers will guarantee that data remains within the U.S.; others will not, or cannot, prove their ability to do so. In addition, agencies must have considerable transparency into the operations of the service provider. This raises two key questions:

·         Who will oversee the passage of data throughout a cloud environment?
·         Whose role is it to ensure that this data is continuously managed and protected?

Agencies will need to establish comprehensive policies for issues such as encryption key management along with network, application, and data-level mechanisms that enable the verification of data movement and storage in cloud-based environments.
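One building block for the data-level verification mechanisms mentioned above is a keyed integrity tag computed before data leaves the agency and re-checked on retrieval. The sketch below uses Python’s standard-library HMAC support; the key, payload, and function names are illustrative assumptions, and a real deployment would draw the key from a managed key store as part of the agency’s key-management policy.

```python
import hashlib
import hmac

# Hypothetical agency-held key; in practice this would come from a
# managed key store on agency premises, never from the cloud provider.
AGENCY_KEY = b"example-key-held-on-premises"


def seal(data: bytes) -> str:
    """Compute a keyed integrity tag before data leaves the agency."""
    return hmac.new(AGENCY_KEY, data, hashlib.sha256).hexdigest()


def verify(data: bytes, tag: str) -> bool:
    """Check that data retrieved from the cloud is unmodified.

    compare_digest avoids timing side channels during comparison.
    """
    return hmac.compare_digest(seal(data), tag)


# Example: tag on upload, verify on retrieval.
payload = b"2010 census extract, non-sensitive fields"
tag = seal(payload)
assert verify(payload, tag)
assert not verify(payload + b"tampered", tag)
```

Because the key never leaves the agency, a valid tag demonstrates that the stored object was neither altered by the provider nor corrupted in transit – one concrete way to back up governance policy with a technical check.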

The good news is that government agencies do have plenty of experience in establishing governance policies for external service providers. The same risk assessments used over the past decade to qualify third-party storage facilities or other service providers can be applied, at least in part, to cloud computing vendors.

Agencies can also rely on existing rules and structures that govern IT decision-making; these policies and processes can be adapted to determine the chain of command for decisions and activities related to cloud computing.