Thursday, May 20, 2010

At the NIST Cloud Computing Forum It Was Baby Steps for Baby Feet, or So It Seems

I attended the Cloud Computing Forum & Workshops hosted by the National Institute of Standards and Technology (NIST). Venue: United States Department of Commerce, Washington DC. NIST’s mission in life is to work collaboratively with the public and private sectors on standards and technology. Vivek Kundra, United States Chief Information Officer, was the keynote speaker.

The workshop atmosphere (it was really just one long day of presentations and panels) was naturally US government-centric. Patrick Gallagher, Director of NIST, pegged the event as a launch pad for a “robust dialogue” to define how the US government will procure and use cloud computing.

Here is a snippet of NIST’s widely circulated definition of cloud computing:

“Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”

As the day unfolded, it became clear that NIST is faced with a Catch-22: agencies need standards to move forward, yet NIST is shying away from issuing standards lest it get them wrong too soon.

History, markets and economics tell us that buyers and sellers alike benefit when there are global norms that remove ambiguity, build confidence, establish best practices and clarify legal ramifications. It’s like traffic signs and speed limits.

As the world’s largest purchaser of information technology, the US government runs 10,679 IT systems (with lots of duplication) on an annual budget of more than $76 billion. These impressive stats are a small indication of the high stakes being cast around. [The government serves over 300 million “customers” (students, employees, etc.) on a daily basis and employs 1.9+ million federal workers.]

Vivek, while a big fan of cloud computing, reiterated that the initiative he launched in September 2009 is still mostly about “rethinking investments in IT budgets” through data center consolidation across agencies. A report on the state of the union was just released here.

There is meaningful progress toward cloud computing within state and federal agencies, and some illustrative use cases can be found on page 12 of the 38-page report. Here are a few examples:

• The SEC uses Salesforce to shorten the time to handle investor advocacy cases. Closing a case used to take 30 days; now it takes less than 7.

• For the 2010 count, the US Census Bureau launched 2010census.com. The site is media-rich, and there was no way to predict the traffic volume. Rather than stand up a litany of servers for the 2010 count, the Census used Akamai to deliver content “from the edge”. The Census also selected Everbridge for mass notifications to its 1+ million temporary employees.

• Recovery.gov moved to the Amazon cloud. Recovery.gov, a tool that “shines light into Federal spending”, must handle millions of visitors. The shift to Amazon EC2 led to savings of $750,000, funding that was redirected to help identify fraud, waste and abuse.

• The State of Utah is moving to a “hybrid cloud” (70% complete), going from 1,800 physical servers to 400 virtual servers, with $4 million in annual savings out of a $140 million IT budget. The cloud solution includes Salesforce, Google Earth and Wikispaces.

• The City of Los Angeles moved to Gmail from an antiquated communications system. The city calculates savings of $5.5 million in direct costs and an ROI of $20-30 million.

• The US Department of the Interior is midway through consolidating multiple email systems, covering 80,000 mailbox users, into a cloud model. It was not unusual for a single email server to manage only 6 email accounts.

• NASA is re-evaluating a $1.5 billion data center effort in the context of a “cloud first” strategy.

A promising initiative was “revealed”, so to speak, by Katie Lewin, Director of the GSA Cloud Computing PMO. The Federal Risk and Authorization Management Program (FedRamp) is the official name for the effort that Microsoft and Google are involved in to get their clouds “FISMA-compliant”. The program is still in its formation phase.

What I gleaned about the spirit and purpose of FedRamp:

• The program lives under the Federal CIO and seeks to ease tough questions of security controls, authorization procedures and continuous monitoring for large outsourced and multi-agency systems. It would work in the context of the NIST Risk Management Framework (NIST 800-37 revision 1).

• Instead of working with each vendor independently to ensure compliance, an agency would depend upon stamps of approval that would be given to a cloud provider.

• An agency would review the authorization given to a vendor and, if convinced by the safeguard assurances, tie the knot with that vendor.

• It is unclear what sort of baselines will be published, or how they would apply across international borders and to strict (high-sensitivity) agency requirements. The agency would still hold the responsibility for determining suitability.

• FedRamp will follow up with participating cloud vendors to ensure “continuous monitoring” and flag any drift from the original certification.

All told, the tone in the hallways of the US federal government is one of optimism about the future of cloud computing. There is a "cloud first" mindset. It will be a long haul, though. John J. Shea, Director in the DoD Office of the CIO, reminded us that cloud is a "2-10" technology: 2 years of hype and 10 years of adoption. Plenty of time to get rid of the "sins of the past".
