Sunday, December 12, 2010

A Model for Credible and Responsive Security Operations

The focal point for keeping tabs on security and compliance activities has been the Security Operations Center (SOC): a physical location that serves as the front line to handle incident reports, review system logs and constantly monitor the environment - 24x7. 

Take the latest Wikileaks extravaganza (details left for another blog post), which pits the security operations of some of the world’s most IT-savvy companies (Visa, MasterCard, PayPal and Amazon.com) against relatively unorganized “hacktivists”. 

The Wall Street Journal online gives a picture of what a SOC looks and feels like: “in PayPal's network operation center, charts showing total payments processed per minute and total traffic to the site, along with other data, are projected on a large, curved wall in front of around 20 workstations, each holding three to five computer monitors.” Add security events and blinking lights for threat alerts and you get a SOC.

According to the reports, PayPal did not suffer any downtime, and neither did Amazon.com. MasterCard and Visa didn’t fare so well. The article speculated that MasterCard and Visa simply did not invest in their security operations to “gird for attacks from a more-sophisticated cyber army”.  

That sort of after-the-fact (let's investigate what happened and then make things better) approach to security operations can be very costly. A survey of 45 organizations by the Ponemon Institute found that, on average, a cybercrime takes 4 weeks to investigate, and that cybercrime costs each organization an average of $3.8 million per year in financial loss, response and remediation costs.

Security operations also serve another omnipresent master: regulatory compliance mandates and legal enforcement. Hundreds of laws are introduced every year at the US state level that affect the collection, use, handling, and disclosure of personal data in one way or another. These laws may be introduced as “privacy” laws, or may be attached to financial services, health care, employment, children’s services or other legislation.

And so the conventional model of security operations is to pick and choose from a menu of out-of-the-box technologies and tools, with no over-arching strategy or long-tail capabilities road-map. By the latter I mean a lack of investment in niche and customizable applications, such as detecting illicit insider activity or dealing with persistent threats. The standard functions are sourced and acquired either in-house or as a managed service: infrastructure security, device monitoring and management, security incident and event management, security incident response and forensics, and threat research and vulnerability management. 

If we are not getting any safer and compliance demands are unabated – is doing the same thing over and over again not insanity? The thought is a new working model that helps guide an organization towards a better-quality operational picture, one that is responsive rather than reactive. I am not suggesting new technology or standards, but rather a framework to orient the enterprise in sourcing, acquiring and deploying the arsenal.

The model starts with an abstract layer of requirements to help evaluate the maturity of an organization - not unlike the common principles (integration, automation, service-oriented architectures) that apply to CRM, ERP or SCM, a domain that is being re-framed by cloud computing.

Here are some of the competencies or master specifications that could apply to “next generation” security operations:

Process Automated Responses
  • Operator activity should run as automated processes that accommodate human-in-the-loop work-flows for decision making and can be optimized. The aspiration here is processes that shuttle information to policy makers, engineers, the C-suite and application developers. If data encryption is ramping up, then security operations would be prepared to deal with the ramifications of additional reporting. An anti-virus clean-up task would trigger notifications to engineers and analysts that spell out countermeasures.
  • Strategic plans and courses of action would govern pre- and post-incident activity and be designed to avoid disruption to the mission or business. Basically, some sort of “rules engine” makes the system hum (a toy sketch of such a rules engine follows below). EINSTEIN 3 is a system that will be deployed by US government agencies that “will have the ability to automatically detect and respond appropriately to cyber threats before harm is done, providing an intrusion-prevention system supporting dynamic defense”.
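
To make the “rules engine” idea concrete, here is a minimal sketch in Python. It is illustrative only: the event types, severity thresholds and courses of action are invented, not drawn from any particular product or playbook. The point is simply that pre-planned responses can be encoded so routine incidents execute automatically while higher-impact decisions are queued for a human.

```python
# Minimal sketch of a course-of-action rules engine.
# Event types, thresholds and actions are hypothetical.

AUTO = "automated"
HUMAN = "human-in-the-loop"

# Each rule: (event type, minimum severity, response mode, course of action)
PLAYBOOK = [
    ("antivirus_cleanup", 1, AUTO,  "notify engineers and analysts with countermeasures"),
    ("ddos_alert",        3, AUTO,  "divert traffic to a scrubbing service"),
    ("data_exfiltration", 2, HUMAN, "escalate to the CISO and legal for a decision"),
]

def respond(event_type, severity):
    """Return the pre-planned course of action for an incident."""
    for rule_type, min_severity, mode, action in PLAYBOOK:
        if event_type == rule_type and severity >= min_severity:
            if mode == HUMAN:
                return f"queue for approval: {action}"
            return f"execute now: {action}"
    return "log only: no matching course of action"

print(respond("antivirus_cleanup", 2))   # execute now: notify engineers ...
print(respond("data_exfiltration", 4))   # queue for approval: escalate ...
```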
Enablement of Virtual Resources
  • The ability to rapidly source high-powered computing resources to process sudden or unplanned volumes of traffic, as well as to test countermeasures. For example, a bandwidth-based attack like a denial-of-service would be met with defenses that shield parts of the network from collateral damage. The current trend is also to break apart malware, study it for intent and origin, and then design countermeasures. There are vendors that offer tools to simulate the behavior of malware in safe, contained test-bed environments. Support for integration and interoperability is key.
  • Private and public partnerships are used to analyze indicators of attacks and to catch adversaries early in their planning phases. Imagine, if you will, the National Security Agency (in the US) working with private firms to alert them of possible cyber strikes. A slippery slope of government intervention. Google is working with the NSA to help sort through the Chinese hack of its computers.
  • A virtual team of experts and analysts will make up the diverse mix of users and consumers. Decades of research and technology in IT security have shown us that absolute security is a fallacy. Now it's also clear that we have not invested in nurturing the right crop of professionals. A recent study by the Center for Strategic and International Studies drives home the point for the US: we simply don’t have the talent to stay steps ahead. So outside collaboration, and the means to do so, will be critical.
Analytics-Driven Security
  • We know that fusion of all sorts of data is pivotal to create higher-fidelity alerts about intruders and to spot subtle yet suspicious activity. The idea is to monitor ingress and egress points and pull data from human resources, firewall logs and even law enforcement. Trouble is that today’s intrusion detection systems are short-sighted and often blind-sided. The US government’s EINSTEIN 2 system is meant to up the ante; it is installed at all connections between government computer systems and the public Internet. The system gathers and sketches out threat signatures from foreign intelligence and DoD information assurance missions.
  • Absorbing and monitoring Internet traffic and human, software and computer activity is one thing; trying to make sense of it all is quite another. The mathematics behind statistical analysis and analogous data-mining techniques offer powerful aids to understand past and present events. Forecasting models can be used to project the probability that a threat scenario will come to pass.
  • Privacy-enhancing mechanisms will be needed to limit the collection and retention of personally identifiable information. With any surveillance recommendation, the temptation will be to over-reach authority, and the risk of privacy violation is all too real. Basically, the principle here should be to redact or obfuscate sensitive content that is not pertinent to down-stream threat analysis and won’t help the advancement of compliance (a toy sketch of this fuse-score-redact idea follows below).
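
As a toy illustration of the fuse-score-redact idea above, the Python sketch below scores one activity indicator against a user’s own history and strips fields that downstream analysis does not need. The field names, history values and three-sigma threshold are assumptions for the example, not a recommended design.

```python
# Toy sketch: flag statistically unusual activity, then redact fields
# that are not needed downstream. Field names and thresholds are invented.
from statistics import mean, stdev

def anomaly_score(history, today):
    """Z-score of today's count against the user's own history."""
    if len(history) < 2 or stdev(history) == 0:
        return 0.0
    return (today - mean(history)) / stdev(history)

def redact(record, keep=("user_hash", "failed_logins", "bytes_out")):
    """Drop fields that are not pertinent to downstream threat analysis."""
    return {k: v for k, v in record.items() if k in keep}

record = {"user_hash": "a1b2c3", "full_name": "Jane Doe",
          "home_address": "not needed downstream",
          "failed_logins": 14, "bytes_out": 9_500_000}

history_of_failed_logins = [1, 0, 2, 1, 3, 0, 1]
score = anomaly_score(history_of_failed_logins, record["failed_logins"])

if score > 3:  # roughly three standard deviations above the user's norm
    print("suspicious activity:", redact(record), "score:", round(score, 1))
```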
Do these themes make sense? Send me a note at walid.negm@acecnture.com. I’d like to hear your thoughts on whether they are relevant to CIOs, CTOs, CISOs and business executives struggling to get their heads around where to prioritize their security investments.

Wednesday, December 8, 2010

Dumbstruck - Wikileaks, law, cyberwar and politics (regularly updated)

Wikileaks as of 12/9/2010 has yet to be convicted of any crime by the US government.
Most reasonable folk can agree that the unauthorized release (leak) of sensitive information should be handled with care. The trouble is that most folk are neither reasonable, nor in agreement about what sensitive information is.
And the story continues to unfold...

Update on Legal Position
12/09/2010: "The U.S. government indicated 12/09/2010 that WikiLeaks spokesman Julian Assange could be in legal jeopardy for disclosing classified information because he is "not a journalist." When asked whether "traditional media" organizations that republish secret documents could be prosecuted, State Department spokesman P.J. Crowley said that the administration applauds "the role of journalists in your daily pursuits." "In our view, Mr. Assange is not a journalist," Crowley added". Source here.

12/10/2010: "Wikileaks founder Julian Assange, the man behind the publication of more than a 250,000 classified U.S. diplomatic cables, could soon be facing spying charges in the U.S. related to the Espionage Act, Assange's lawyer said today" Source: here

12/13/2010: The World War I-era Espionage Act criminalizes anyone who possesses or transmits any "information relating to the national defense" which an individual has "reason to believe could be used to the injury of the United States or to the advantage of any foreign nation." The Espionage Act was not written to distinguish between the leaker or spy and the recipient. 

Streaming Reactions
The US government is, rightfully so, more than livid and is exploring its options. A slew of companies have pulled the plug on the Wikileaks organization, including Visa, MasterCard, PayPal and EveryDNS.

Here is Amazon Web Services getting an earful from some of their customers for pulling the plug.


A new Wikileaks kid on the block is on tap, the Swedish newspaper Dagens Nyheter reported today. The new project, Openleaks, is said to be online any time now. "The two organizations are similar in that aspect that both are focusing on providing means for whistleblowers to anonymously provide the public with information,” as stated by an insider.

The Politics: 
Varying positions continue to be voiced. One side of the debate: Fox News; the other side... The Atlantic

The Attacks and Counter Attacks
12/8/2010: Let's get this party started
Wikileaks plays it cool and diversifies -- real quick -- spreading its documents on mirror sites, adding redundancy to "caller-id" DNS look-ups and doing a variety of other things so it can take a licking and keep on ticking.

And things all of a sudden start to escalate into extremist and rash retaliation. Reuters: "More cyber attacks in retaliation for attempts to block the WikiLeaks website are likely in a "data war" to protect Internet freedom, a representative of one of the groups involved said Thursday"

12/9/2010: Operation Payback in the news.
"A collective of hackers who have set their sights on those companies that have denied service to WikiLeaks and its founder are now trying to take down Amazon.com. They announced via Twitter that they would begin their attack at 11 a.m. ET". Source: http://mashable.com/2010/12/09/operation-payback-amazo/

Espionage Act 'Makes Felons of Us All' - legal experts

Quotes 
"To me, New York Times has committed at least an act of bad citizenship, and whether they've committed a crime, I think that bears very intensive inquiry by the Justice Department," IS Senator Joe Lieberman


"In a time of universal deceit, telling the truth becomes a revolutionary act." -1984, George Orwell 

I think in today's climate, telling the truth is classified as "terrorism"
 
"In our view, Mr. Assange is not a journalist" State Department spokesman P.J. Crowley


"Leaks of classified information to the press have only rarely been punished as crimes, and we are aware of no case in which a publisher of information obtained through unauthorized disclosure by a government employee has been prosecuted for publishing it," - Jennifer Elsea, a legal researcher for the US Congress

Techie Section
It's just like a video game: Operation Payback is asking its followers to download a piece of software called LOIC to fire off a distributed denial of service attack at targets. The question of course is “what if I get caught?” Here is a snippet from one of their FAQs. By the way, V& stands for Van’d – as in when the FBI shows up at your house in a van:
  • You probably won't. It's recommended that attack with over 9000 other anons while attacking alone pretty much means doing nothing. If you are a complete idio and LOIC a small server alone, there is a chance of getting V&. No one will bother let alone have the resources to deal with DDoS attacks that happens every minute around the world. Then theres always the botnet excuse. Just say your pc was infected by a botnet and you have since ran antivirus programs and what not to try to get rid of it. Or just say you have NFI what a DDoS is at all.


Thursday, October 21, 2010

Cloud Services With Strings Attached...

Without trust, banks can't exist. That was the essence of President Franklin D. Roosevelt's first fireside chat with the citizens of the United States on March 12, 1933. The radio broadcast came only eight days into his first term, after Roosevelt had to close down all banks because of a run on assets.

Since then a variety of regulations and faith-building steps were put into place, until of course the financial banking calamity of recent years put into question "who is running this ship" - and brought another loss of confidence. Trust is what makes an economy flourish or flounder: trust in companies, in the relationships we build and in the assurances we get that the products we buy are fresh, well built and without hazard.

Trust is given and taken each time we open our mailboxes... the physical ones.
 
(Believe it or not) 87% of 9,000 Americans surveyed by the Ponemon Institute in its 2010 Privacy Trust Study of the US Government ranked the Postal Service first amongst 75 federal government agencies. Simply put, we trust that the US Postal Service is able to keep our information safe and secure. It has a 230-year history. When you go into a USPS office, you expect reliable and safe delivery of your packages. 

OK, if you are extra cautious about those tax forms and sensitive merger documents, you call in FedEx or any number of courier services. It's more expensive, but it lowers risk and increases peace of mind with guaranteed delivery and signatures. 

Oh, but I am blogging about online, a.k.a. cloud, services.

A cloud provider (LLC, corporation etc.) that cannot be trusted will not exist. Sounds simple enough. I will add some other predictions for patently untrustworthy cloud service providers. They will:

(1) be put out of business because of the natural order of things
(2) be relegated into a bazaar of low-cost and low-quality offerings
(3) be shut down by the government after a major data breach
(4) if large enough and a "critical infrastructure" be nationalized in the event of a national security issue
(5) address a commodity market that deals with pretty much worthless customer data

It's hard to see that an enterprise (hospital, bank, law firm, pharma etc.) that is going paper-less will disregard the weight and meaning of trust as it goes about doing its business, in particular when talking about proprietary information. And while the security of sensitive data in the cloud is being hashed out, there is not enough focus on the larger context and meaning of trust: how it's created, how it's maintained and how it's destroyed. The conversation dives into passwords, encryption, regulatory compliance and he-said, she-said.

OK sure, as consumers we store sensitive information (banking information, personal letters, legal documents) on any number of popular online mail and storage accounts. Truth of the matter, most of these online service providers have been in business less than 10 years. There is a social shift toward an expectation of low (or no) privacy in cyberspace (another topic).

With that said, the more we have to lose, the less likely we'll trust just "anybody".  And by trust, I mean our willingness to depend (or interdependence) on someone (or something) else. In my simple mind, there are four behaviors that we exhibit in the real-world which are naturally present in cyberspace:

1. Carelessness          
2. Paranoia          
3. Practicality: Most businesses and consumers would like to be in this mindset, as we are bombarded with new technology and the peer pressure to move to the state of the art. We rank convenience high. We may take care of the basic check boxes, but we are not finicky.
4. Prudence
   
As enterprises (and consumers) ramp up the volume of outsourced data storage, shared application hosting and third-party data processing, we'll see more of the prudent mind-set start to change the marketplace for cloud services.

Vendors will have to respond to a demand for "assured" cloud services that offer more than one-sided technical security controls, standards and empty promises.

The service providers in this sector will demonstrate financial viability. They will have a history and reputation. They will see your data as a currency with a Dollar, Yen or Euro value to it. There will be unambiguous obligations to provide compensation for disruption, damage or loss of data. It's nothing new in terms of old-school expectations of any service provider.

It's an understanding that there are some relationships that simply must be built on confidence, that some mistakes were built to last - and that you can in fact measure trust in cyberspace.

President Roosevelt told his countrymen, "there is an element in the readjustment of our financial system more important than currency, more important than gold, and that is the confidence of the people". Replace "financial system" with "critical cloud services".

Tuesday, August 17, 2010

Reminiscing


A little more than 2 years ago I put down on paper some rudimentary thoughts (and borrowed some good ones, of course) about the risks of cloud computing. Since then there has been no stopping the cloud tsunami. I recently got introduced to Apple's iDisk. A near-perfect utility: an innocent-looking icon on the desktop where you can move all your beloved stuff to the "cloud". 

Where the photos and documents will go, no one knows.

Here are some of those risk-related properties of cloud computing that were swirling in my mind not so long ago:

  • Trust, and the lack thereof: How (or why) do you trust a cloud provider to do the right thing? The root of the matter is putting a believable trade-off in place between the risks and one's alternatives. No surprise here. 
  • Ease of Reach: Anything (data, machines, applications) that will be neatly placed "out there, somewhere" will be at an elevated risk of abuse by some disgruntled employee, hacker or [insert favorite bad nation here]. The network is the hack. Like black magic, an invisible hand will reach over the ether to tinker with, break into and cause mischief.
  • Dispersed Data: Personal, private, pseudo-classified and classified data ... all sitting side-by-side. It just sounds and feels so unnerving... no matter what precautions or promises are made by the trusted provider. Of course there is an answer: isolation. On the spectrum of shared everything, or shared nothing you will have to pick your position. 
  • Virtual Time: Google, salesforce.com or any cloud provider will take advantage of a secret sauce called virtualization. Long story short: virtual machines and storage live in a world of virtual time (and space). Without proper accounting, the space-time continuum can get out of order. Realistically, an anti-virus scan can get tripped up.    
  • Mobility: Those virtual servers (which are essentially files) will be placed and then moved around the network whether for maintenance, resiliency or due to randomness. The files will take with them whatever -- data, malicious ware, outdated policies. 
  • Fate Sharing: A multi-tenant application or infrastructure that is hit by a catastrophic attack will affect all customers. An unlikely event - but those are famous last words.
  • Old Foundations: The internet was not designed for a hostile setting. It is anonymous. It is about speed. There are no safeguards for privacy. It is about openness. All at odds with locks, keys and body guards.
  • Emergent Properties: My favourite one of all. The "I don't know what's about to hit me, 'cause this is all so new". Have a house? Add a window. You have added change. Change = vulnerability. Have a cloud? Who knows what's going to be exploited...

Saturday, July 31, 2010

DEFCON 18

The session titles are tinged with cloak and dagger, anarchy and freedom of expression: "We don't need no stinking badges: hacking electronic door access controllers", "Your ISP and the Government: best friends forever", "Practical cell phone spying".

My personal impression of the somewhat cult-like DEFCON security conference can be characterized as smart people instinctively driven to share knowledge and unadulterated research for a greater good - whether it's the protection of civil liberties, revealing stupid security vulnerabilities and flaws in products, or unabashedly calling out vendors on incompetent engineering.

A smattering of speaker comments offers a peek into the topics for this year's conference:
  • There is no patch for stupidity 
  • 15 year device life-time == long tail for bad decisions 
  • Clever does not mean secure 
  • What appears secure is not
  • Privacy is a subtle thing
  • The warm, fleece-y Snuggie of Obscurity
  • Software moves power on the grid
  • Cute smart meter is cute
  • The dumbest lock design ever
  • Assumed to be trustworthy - 543 million devices shipped in first half of 2010
  • Download games at your own risk
  • An attack on any one node of an electric grid could take that entire grid down
  • My life as a spyware developer and why I'm probably going to Hell 
  • There is no such thing as privacy. It is dead. Get over it.
  • Malware scanners are mostly stupid
  • What can you do with Twitter that is utterly evil? Lots and lots of things 
  • There are 155,693 public water systems - serving 286 million Americans
  • I don't think you need a sophisticated exploit; there will always be a certain number of people that will click "yes" no matter what
  • Social engineering has a long history and works just fine on the Internet

Wednesday, June 30, 2010

The Impact of Scale


The topic of scale is best illustrated through an analogy. 

If you live in a mega-city, then city planning and transportation come to mind. With so many people living in one city, responsible governments must deal with public safety and creating livable structures. Individual buildings, transportation routes, water supply systems - all are part of the city. Over time we got clever and started to architect and build vertically to deal with scarce physical space. As the number of cars on the streets zooms upwards, we study traffic patterns and congestion and invest in alternative modes of transportation. All told, these "parts" of the city are a system.

The dimensions we use to characterize whether a "system" is large include the number of elements (people, hardware, "things"), tasks, relationships, policies, domains of interest, enforcement points, etc.

If we are investing in smart buildings, smart cities, smart transportation, smart grids, healthcare infrastructures, etc., we are talking large scale. We are talking about a "system of systems" that is complex and in fact ultra-large.

The theme of ultra-large-scale IT systems is explored in "Ultra-Large-Scale Systems: The Software Challenge of the Future" here, a report published in 2006.

Ask the statisticians, and they would agree, big is in: 
  • Number of Cell Phones Worldwide Hits 4.6B in 2010
  • 4,000: The number of lines of code in MSDOS 1.0 - Microsoft's first operating system
  • 50 million: The number of lines of code estimated to be in Microsoft Vista
  • Data storage requirements for smart meters will increase at a rate that resembles a natural logarithmic rate
  • The 2009 movie Avatar is reported to have taken over one petabyte of local storage for the rendering of the 3D CGI effects
  • The US DoD has 3 million desktops, with just one data-center housing 18 terabytes of storage
Of course, big things are broken down into manageable parts in an attempt to make sense of the pieces: cars, routes, traffic control, toll plazas, etc. 

However, we are still left with menacing problems:
  • Emergent properties, or in plain English: stuff will happen that we can't cope with because it is brand new and never happened before (e.g. the impact of mobile phones on driving behavior)
  • We cannot completely define, let alone measure, the properties of all components a priori, i.e. some things are just outside our view-lens
  •  It is impossible to update all elements of the system (as each changes) without leaving some window of vulnerability or ambiguity 
  • There is continuous evolution of our surrounding environment with unknowable outcomes and inconsistent (changing) states (e.g. new devices, new personalities, new enemies)
  • There is no clear ownership, possession and boundaries (e.g. cloud computing)
The Internet and the typical IT infrastructure of an enterprise or government agency are very complex. We think about external networks, web sites, data flows, applications and users, insiders, hardware/software, transactions and supply chains. The Internet, and everything in it and connected to it, is very much an ecosystem: a dynamic community of interdependent and competing organizations in a complex and changing environment. The elements have an intrinsic adaptive behavior, and we can measure the health and sense troublesome indications. 

To deal with that sort of large-scale complexity, rich research is needed (and is going on) to:
  • Understand the biological metaphor and its applicability to tech-centric views
  • Use heuristics and learn how to better predict behaviors 
  • Learn how to respond to external stimuli with keener intelligence gathering 
  • Assure the software and hardware components that are used to build systems 
  • Survive hostile intentions and operate in a continuously changing system

Thursday, June 17, 2010

It's a Messy World

We live in a world that is rather messy. 


Russell Ackoff defined "a mess" as "interacting problems or issues that are not easy to appreciate as a whole" (Flood & Carson, 1993). You are in a mess if you can't put any structure to the situation.  


Organizations are not closed systems. You can't measure everything, because things change so quickly. The Internet is a vexing source of "unknown unknowns". We depend on software that, if you look closer, is made up of piece parts whose source is ambiguous at best. It's "Office 2.0" for most employees, with no definite way for the employer to tell what's good or bad behavior. 


It is safe to say, that when it comes to IT we are dealing with ... an unstructured situation. 


The growth in terms of volume, speed and diversity of data, devices and threats is non-linear. So our thinking about IT Security must also be non-linear. 


For example, the correlation analysis that is used to flag a security incident or track an impending threat is very much divorced from cause-and-effect accuracy. We are left with false emergencies, lots of noise and inaction on priorities. We are still building "naive" applications instead of making them "street smart" and engineering them with the knowledge that they will be broken into and constantly attacked. Finally, there is a tolerance for living with broken links. Yet to make informed security decisions we need to connect the rationale and outcomes of our policy decisions, cultural expectations, counter-intelligence activities and the lessons learnt from incidents. 


We have to shift from securing an environment to surviving an ever changing ecosystem.

Monday, May 24, 2010

A Wicked Web We Weave

How do you get in the middle of the plans or actions of a knowledgeable employee who intends to inflict damage on your company? 
It’s very difficult to control the flow of information within today’s work-place. In any normal “business” day there could be foreign-national interactions, USB key exchanges, work from home and contractors – all greatly increasing the opportunity for espionage and data theft. There is plenty of room for mischief, and the amount of harm done by a well-informed saboteur is non-trivial – an order of magnitude more damaging than a stranger.

Definition of an insider by CERT: a current or former employee, contractor, or business partner who
• has or had authorized access to an organization’s network, system, or data, and
• intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization’s information or information systems 

The trouble with an "insider" is that they have legitimate access. They are working within regulation, aware of policy and unlikely to break rules. So access control won't work, nor will "intrusion detection". There is no intrusion.
Most of the time an employee will be reported for unusual behavior by a co-worker or an audit. There are also personnel screening processes that have to be in place before hiring. Training and awareness help employees notice unusual activity.

That's old-school stuff - and it works and is crucial. In addition, there are technical approaches that have gained a rightful place on the "must-have" list. Most companies should turn on network and application activity logging, file integrity checks and data loss monitoring. Stitching together the alerts that are generated by "big brother" helps spot things that are "out of the norm". For example, if a file changes or an email is sent with a sensitive attachment, a user or behavior can be marked as "bad" and declared suitable for further surveillance.

Unfortunately there is still plenty of noise obstructing an accurate and clear reading of what is good, bad or ugly behavior. We hop on and off social networks, plug and unplug cables, head to work late, forget to submit expense reports, make travel arrangements out of policy, skip virus updates and get overly zealous downloading information. The list goes on. Human behavior can seem hopelessly impossible to predict. 

Statistics brings the magic of mathematics to make sense of data and tell us "what's up". It is a science that fills in blanks in our memory, keeps us honest about the present and paints a rough approximation of our future. There is good research literature on the topic of user behavior analysis. We are working to take tried-and-true algorithms such as root cause analysis, propensity analysis, link analysis and econometric forecasts, and then rigorously apply them to hard-to-solve cyber security problems. A rough sketch of the idea follows.
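
As a rough sketch of what a propensity-style score could look like (not our actual algorithms; the indicator names, weights and threshold are invented for illustration):

```python
# Rough sketch of a propensity-style insider risk score.
# Indicator names, weights and the review threshold are invented.

RISK_WEIGHTS = {
    "after_hours_downloads": 0.4,  # bulk data pulls outside work hours
    "policy_violations":     0.2,  # out-of-policy travel, expenses, etc.
    "hr_flags":              0.3,  # e.g. recent administrative leave
    "malware_site_visits":   0.1,
}

def insider_risk(indicators):
    """Weighted sum of behavioral indicators, each normalized to 0..1."""
    return sum(RISK_WEIGHTS[name] * min(max(value, 0.0), 1.0)
               for name, value in indicators.items()
               if name in RISK_WEIGHTS)

employee = {"after_hours_downloads": 0.9, "policy_violations": 0.2,
            "hr_flags": 1.0, "malware_site_visits": 0.5}

score = insider_risk(employee)
if score > 0.6:  # review threshold, also invented
    print(f"flag for analyst review, risk score = {score:.2f}")
```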

If you are interested in assessing threats across transactions, individuals and groups that are unobservable to the naked human eye, please contact me. While tackling the problem of employee betrayal can be hard, we can turn to “big data” and analytics to help trip up the enemy within. 

Thursday, May 20, 2010

At the NIST Cloud Computing Forum it was Baby Steps for Baby Feet, or so it Seems

I attended the Cloud Computing Forum & Workshops hosted by the National Institute of Standards and Technology (NIST). Venue: United States Department of Commerce, Washington DC. NIST’s mission in life is to work collaboratively with the public and private sector on standards and technology. Vivek Kundra, United States Chief Information Officer, was the key note speaker.

The workshop atmosphere (it was really just one long day of presentations and panels) was naturally US-government-centric. Patrick Gallagher, Director of NIST, pegged the event as a launch pad for a “robust dialogue” to define how the US government will procure and use cloud computing.

Here is a snippet of NIST’s widely circulated definition of cloud computing:

“Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”

As the day unfolded it became clear that NIST is faced with a catch-22. Agencies need standards to move forward. At the same time, NIST is shying away from standards lest it get them wrong too soon.

History, markets and economics tell us that buyers and sellers alike benefit when there are global norms that remove ambiguity, build confidence, establish best practices and clarify legal ramifications. It’s like traffic signs and speed limits.

The US government is the world’s largest purchaser of information technology, with 10,679 IT systems (lots of duplication) and an annual IT budget of more than $76 billion. These impressive stats are a small indication of the high stakes being cast around. [The government serves over 300 million “customers” (students, employees etc.) on a daily basis and has 1.9+ million federal employees.]

Vivek, while a big fan of cloud computing, reiterated that the initiative he launched in September 2009 is still mostly about “rethinking investments in IT budgets” through data center consolidation across agencies. A report on the state of the union was just released here.

There is meaningful progress towards cloud computing within state and federal agencies, and some illustrative use-cases can be found on page 12 of the 38-page report. Here are a few examples:

• The SEC uses Salesforce to shorten the time to handle investor advocacy cases. It used to take 30 days to close a case; now it takes less than 7 days.

• For the 2010 count, 2010census.com was launched by the US Census Bureau. It is very media-rich, and there was no means to evaluate the traffic volume in advance. The Census Bureau did not want to stand up a litany of servers for the 2010 count; instead it used Akamai to deliver content “from the edge”. The Census also selected Everbridge for mass notifications to the 1+ million temporary census employees.

• Recovery.gov moved to the Amazon cloud. Recovery.gov is a tool that “shines light into Federal spending” and must handle millions of visitors. The shift to Amazon EC2 led to savings of $750,000, with the funding redirected to help identify fraud, waste and abuse.

• The State of Utah is moving to a “hybrid cloud” (70% complete), going from 1,800 physical servers to 400 virtual servers, with a $4 million annual savings out of a $140 million IT budget. The cloud solution includes Salesforce, Google Earth and Wikispaces.

• The city of Los Angeles moved to Gmail from an antiquated communications system. They calculate a savings of $5.5 million in direct costs and an ROI of 20-30 million.

• The US Department of the Interior is in mid-stream of an effort to consolidate multiple email systems, with 80,000 mailbox users, into a cloud model. It was not unusual to have a single email server managing only 6 email accounts.

• NASA is re-evaluating a $1.5 billion data center effort in the context of a “cloud first” strategy.

A promising initiative was “revealed”, so to speak, by Katie Lewin, Director of the GSA Cloud Computing PMO. The Federal Risk and Authorization Management Program (FedRAMP) is the official name for what Microsoft and Google are involved in to get their clouds “FISMA-compliant”. The program is still in its formation phase.

What I gleaned about the spirit and purpose of FedRAMP:

• The program lives under the Federal CIO and seeks to ease tough questions of security controls, authorization procedures and continuous monitoring for large outsourced and multi-agency systems. It would work in the context of the NIST Risk Management Framework (NIST 800-37 revision 1).

• Instead of working with each vendor independently to ensure compliance, an agency would depend upon stamps of approval that would be given to a cloud provider.

• An agency would review the authorization given to a vendor and, if convinced by the safe-guard assurances, tie the knot with the vendor.

• It is unclear what sort of base-lines are going to be published, or how they will apply across international borders and to strict (high-sensitivity) agency requirements. The agency would still hold the responsibility to determine suitability.

• FedRAMP will follow up with participating cloud vendors to ensure “continuous monitoring” and flag any drift from the original certification.

All told, the tone in the hallways of the US federal government is one of optimism about the future of cloud computing. There is a "cloud first" mind-set. It will be a long haul though. John J. Shea, Director in the DoD Office of the CIO, reminded us that cloud is a 2-10 technology: 2 years of hype and 10 years of adoption. Plenty of time to get rid of the "sins of the past".

Saturday, May 8, 2010

General Hayden - Words of Wisdom

I was privileged to listen to one of my all-time favorite presentations by retired General Michael Hayden, former director of NSA and CIA. The venue: TTI / Vanguard Conference on CyberInsecurity in Washington DC. He was reflecting on the current "bogey man" known as Cyber security. 

Now to most of us, cyber space is a Hollywood (sci-fi genre) conjured term used synonymously with the Internet. Its origins are actually in a 1984 novel called Neuromancer. In the national security discourse the term cyber-space is the "5th domain", with implications far removed from a Hollywood set. There is complex language of "mutually assured dependence" and cyber warfare. Why the 5th domain? Well, the other four are land, sea, air and space. General Hayden artfully distinguished cyber-space as the only domain that was created by man; the rest were "created by god" and natural events. Adding: "we did not do such a good job". Simply put, the Internet is inherently untrusted, and was designed for anonymity and information movement - not for security.

In the US we are still grappling with the ownership of the "cyber thing", which Hayden points out is very much fractured. There are voices that seek to frame the problem as a matter of commerce to be shepherded along by the private sector. On the other hand, cyber is a matter of national security, with bills that envision a nation navigating both times of cyber peace... and war. Clearly it's not a trivial topic. The private sector owns a lot of critical infrastructure, from our energy supplies and power grids to financial markets. And those networks and systems are not entirely isolated from this thing called cyber-space. For ill-doers this is a prize. 


The Internet was designed and engineered for friend not foe.


The growing voices suggesting that cyber space should be regulated, and regulated by government, immediately send chills down the spine of civil libertarians and privacy mavens. They fear, as Hayden put it, that the US government will make the dubious offer that it cannot protect its citizens unless it is allowed to monitor them on the Internet. A fear reinforced when General Hayden gave a wink to his own prior employer's (the NSA's) "abridged sense of privacy".


His message carried a sense of urgency without being alarmist. He encouraged industry to put energy into thinking about security and privacy, and not just the ease of use of the Internet, and stressed the need for doctrine about what can and should not happen in cyber space (think about the lack of norms against a cyber attack on a hospital's patient management system).

As John Negroponte (also a presenter) put it: in cyber space, "proceed with caution"



Tuesday, April 6, 2010

In Cyber Space, It Pays ... To Pay Attention

The ability to imagine the future, and do so rather inaccurately, is a uniquely human quality. Lest we forget the faulty real-estate asset valuations and risky gambles some financial services firms undertook. That inability to "get it right" in the midst of plenty of relevant information saw us enter one of the worst economic downturns. 

Now, while we do imagine creatively (flying machines, submarines, the Internet, smart-phones and electric cars, to name a few), it is still difficult for us humans to get the future right, because of some familiar limitations. First, we are locked in the present as we try to predict the future; in other words, the future almost always looks like a different version of the present, at least for most of us. And second, we are very subjective in our forecasts. We can get stuck believing that our own point of view reigns supreme, doubting the claims of others whenever we evaluate them against our own. 

Let’s say you regularly drive your car down a home-bound route and we'd like to evaluate your driving behavior. For this test we've created 3 driver proficiency categories. You are either someone that drives on “auto-pilot”, a directionally challenged driver (and a lost cause), or an individual that is extremely in tune with your surroundings. 

If you find yourself in the last category, you are a near-perfect driver. You know the distance from your car to the next. You observe the erratic behavior of a truck 20 feet ahead and two lanes across. You are tracking the changing weather conditions. You are aware. There is actually a term you don’t fall into: Driving Without Awareness (DWA), a state in which there is no active attention to the task of driving. 

Congratulations, you’ve managed to free yourself from simply focusing on the precise task of driving. You are pretty good at making forecasts because you are not totally centered on yourself; instead you are actively absorbing (and filtering) information from your environment. And, in relation to the introduction of this blog post, you are someone that does not ignore subtle cues and signals.

If you were to program all those keen skills into a next-generation drive-assist system, it would have features such as defensive driving heuristics and map-based reasoning, and it would use your own experience to predict traffic flows. Moreover, it would be smart enough to respond to changing situations with more acuity, with or without you in the loop. 

And so it is also true that the goal of better understanding our surroundings exists all around us: in airline traffic control, supply chain management and the battle-field, controllers, doctors and other critical decision makers must all maintain some level of situational awareness in dynamic and tricky environments.

The process of raising that situational awareness barometer starts with differentiating the status (of something) from events. It thus relies heavily on surveillance (more passive monitoring) and reconnaissance (actively targeting someone or something) to recognize errant behavior, read the terrain and environmental conditions, track targets, and sense indicators and early warning signs. 

Think of an air traffic controller and the tools they need to get and maintain the right attention to keep fast-moving objects from colliding with each other in mid-air. 

It is increasingly apparent that in Cyber space (as in land, air and sea) there is virtual terrain and dimensions of time and space. To conduct commerce, serve citizens and communicate without some sort of handle on one’s surroundings is akin to walking in a dark alley with no perception whatsoever. It’s out of the question. 

Organizations of course rely on intrusion detection systems, event monitoring, incident response and readiness teams, anti-virus scanners and well managed applications and operating systems. Hopefully that pristine infrastructure or application is under a digital microscope where anything that is out of place or odd will be observed. 

The challenge is that observing or witnessing an event is one thing; forecasting or predicting an outcome is different, and harder.

For example, consider a trusted insider that is observed downloading sensitive files, after hours, for an extended period of time. On the surface there may have been no reason to suspect any misuse of privileges. There may have been no “rule-breaking” behavior. With some projection and connection of the dots, there may be an opportunity to prevent a serious incident of data theft. Consider if that same individual 4 months earlier was placed on administrative leave and 1 year earlier had visited a web site that is known to distribute malware. 
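
Mechanically, connecting those dots is a correlation of events from separate sources over time. Here is a minimal sketch in Python; the source names, event types and dates are hypothetical, chosen only to mirror the scenario above:

```python
# Minimal sketch: escalate an otherwise unremarkable observation when
# known precursor events appear earlier in the timeline.
# Source names, event types and dates are hypothetical.
from datetime import date

events = [  # (source, event, date)
    ("web_proxy", "visited_known_malware_site",     date(2009, 4, 2)),
    ("hr_system", "placed_on_administrative_leave", date(2009, 12, 1)),
    ("dlp",       "after_hours_bulk_download",      date(2010, 4, 6)),
]

PRECURSORS = {"visited_known_malware_site", "placed_on_administrative_leave"}

def needs_investigation(events, trigger="after_hours_bulk_download"):
    """True if the trigger event is preceded by any known precursor."""
    seen = set()
    for _, event, _ in sorted(events, key=lambda e: e[2]):
        if event == trigger and seen & PRECURSORS:
            return True
        seen.add(event)
    return False

print(needs_investigation(events))  # True: the download warrants a closer look
```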

The point is that most of today’s IT security systems that help us glean what’s happening, what has happened and what is about to happen are either disconnected or, more likely, not in place at all. 

As we live, work and play in cyber space, organizations and all of us must raise our own situational awareness, in different ways - whether it’s changing passwords on a regular basis, updating anti-virus definitions or avoiding that tempting link in our emails. 

Organizations and government agencies must also up the ante in terms of accurately detecting suspicious behavior, putting in place credible deterrents and automating responses that will minimize the impact of a potential threat actually occurring - whether that threat is known or unknown. They must also get better at working with a wider latitude of information that originates in cyber-space and must be correlated to the physical world.


Friday, February 26, 2010

US Federal agencies and cloud computing: Reason to be cautious

Cloud computing in the enterprise has generated plenty of hype – and plenty of eye-rolling among wary CIOs of federal government agencies. Federal IT leaders continue to express legitimate concerns about the effectiveness of cloud environments in securing sensitive government and consumer data. Many are also skeptical of the promised cost savings and service quality.

Amid the healthy skepticism, however, lies increasing pressure to give serious consideration to cloud computing initiatives. The Obama Administration is focused on lowering the cost of government operations while driving innovation. Many local, state and federal government agencies are evaluating cloud computing, though few are actually implementing cloud-based solutions at the federal level.

While there are no quick answers, there are some clear steps federal agencies – not just CIOs, but all senior decision-makers – can be taking now to prepare their organizations for cloud-based computing environments. Here are three key points federal agencies should consider in an effort to break through the hype and lay the groundwork for a clear, reasoned path to cloud computing.

1. It’s OK to say no.

Federal agencies under pressure to embrace cloud computing are fully justified in just saying no. The fact is that most federal agencies should not venture into a “classic” public cloud any time soon, for a simple reason: incumbent cloud service providers such as Google, Amazon and Microsoft do not comply with current Certification & Accreditation rules dictated by the Federal Information Security Management Act. This makes the public cloud a non-starter for most federal organizations, while providing a clear litmus test for the future viability of cloud providers.

To their credit, commercial cloud providers are making some progress in regards to federal compliance. Both Microsoft and Google, for example, are close to receiving accreditation for FISMA compliance, and Google has reportedly completed a System Security Plan for its Google Apps platform.

Key issues around compliance involve where the data actually resides in a commercial or public cloud environment and how vulnerable it will be to cyber attacks. CIOs are rightly concerned that their data could be stolen by hackers, mixed with data from their cloud providers’ other customers, or inadvertently exposed. The recent cyber attacks on Google and other organizations emanating from China will do nothing to quell concerns around moving sensitive information into cloud environments.

As Robert Carey, CIO of the US Navy, noted at a federal executive forum in November 2009:  “Public clouds are not necessarily appropriate for Army or Navy information to be just sitting out there, and therefore the models that you would use to describe the security of that information might have to, and probably will have to change a great deal.”

2. Not all clouds are created equal.

Although the public cloud is not a near-term option for federal agencies, other options do exist to help federal and intelligence agencies gain more flexibility and achieve cost efficiencies for their IT infrastructures.

The basic technologies of a cloud environment – high speed Internet connections, browsers, grid computing and virtualization – are well established and can be duplicated by any organization.  This makes it possible for government agencies or departments to build “private clouds” – infrastructures that use cloud technologies but are more aligned with current security requirements.

Adopting virtualization and other cloud technologies, even in a closed environment, could increase efficiencies and reduce infrastructure costs by breaking down the silos created through the use of proprietary technologies. Consider the efficiencies of something as basic as a shared email system; these and other non-core applications are good candidates for early migration into private cloud environments.

The DoD’s Defense Information Systems Agency (DISA) is moving toward what arguably could be the world’s largest private cloud, as it looks to integrate the Defense Information Systems Network, its data centers, and its command and control applications. This cloud-based approach to sharing information could serve as a model for other agencies – and for commercial providers as well.

The type of information each agency handles will influence its approach to cloud solutions. Just as most governments tag data with different levels of sensitivity, from low level (published widely and no restrictions) to ultra secure (classified security information for top government leaders only), they can also begin to design cloud architectures for different levels of information. For example:

·         Any agency that deals in the public domain – such as the EPA, the Census Bureau, or the Department of the Interior – could serve as a “canary in a coal mine” test case for a cloud infrastructure.
·         The GSA’s recent launch of apps.gov – a storefront for cloud-based business, productivity and other applications featuring non-sensitive data – is an example of where federal agencies can begin testing the waters. The GSA has already seen positive results from moving the USA.gov portal to the cloud. The transition significantly lowered GSA's costs, saving taxpayers an estimated $1.7 million annually.
·         New public programs that must launch quickly to meet legislative mandates – Cash for Clunkers is one recent example – are candidates for testing the ability of cloud solutions to scale quickly.

On the other hand, Homeland Security and other agencies that deal with ultra-secure data have little or no incentive to consider a cloud solution. In situations where data security presents an untenable risk, the government may choose to pursue other avenues, such as optimizing the infrastructure in place using traditional IT practices or decoupling data and processing to allow use of public or private cloud infrastructure without jeopardizing data security.

3. Who’s accountable?

The issue of governance is perhaps the biggest obstacle to federal agency adoption of cloud computing. Regulatory mandates require that an agency must know precisely who has access to data and where the data resides, both physically and logically. Some providers will guarantee the presence of data in the U.S., while others will not, or cannot, prove their ability to do so. In addition, agencies must have considerable transparency into the operations of the service provider. This raises two key questions:

·         Who will oversee the passage of data throughout a cloud environment?
·         Whose role is it to ensure that this data is continuously managed and protected?

Agencies will need to establish comprehensive policies for issues such as encryption key management along with network, application, and data-level mechanisms that enable the verification of data movement and storage in cloud-based environments.

The good news is that government agencies do have plenty of experience in establishing governance policies for external service providers. The same risk assessments used over the past decade to qualify third-party storage facilities or other service providers can be applied, at least in part, to cloud computing vendors.

Agencies can also rely on existing rules and structures that govern IT decision-making; these policies and processes can be adapted to determine the chain of command for decisions and activities related to cloud computing.




Wednesday, January 13, 2010

When a closed mind bears the standard, pity those who follow



Most cloud vendors are setting themselves up as the gold standard in terms of everything from how their products are priced to how they are labeled and serviced. They are patiently waiting for signs of a critical mass to declare victory over important matters that will influence customer defection rates and unbending loyalty. History is littered with examples of technology standards pitted against each other with the consumer watching the battle unfold - most recently Blu-ray, HD DVD, USB, GSM and CDMA. The consumer cares about standards and those obscure protocols because they impact our wallets, our sanity and our life-long experience with everyday products. 

To pick and choose standards for cloud computing we need not look further than the ‘web services/SOA’ craze of the last 5 years. That hoopla spawned standards and vendor specifications to last us a lifetime. All of us (tech savvy or not) will see XML as one of those unsung heroes and a saving grace that pulls us back from the brink of mass confusion. It is THE currency of data portability and, like the electric socket in our homes, will force cloud vendors to conform in some fashion. I can pull out all my blog posts into a neatly folded ‘XML’ file and take it wherever I want. I can do the same with iTunes.
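
To make the portability point concrete, here is a small Python sketch that pulls post titles out of an exported archive. It assumes a generic Atom-style export file named blog-export.xml; the file name and element names are illustrative, not any specific vendor's schema.

```python
# Small sketch: read post titles out of an exported blog archive.
# Assumes an Atom-style export named "blog-export.xml" (illustrative only).
import xml.etree.ElementTree as ET

ATOM = "{http://www.w3.org/2005/Atom}"

def post_titles(path="blog-export.xml"):
    tree = ET.parse(path)
    return [entry.findtext(f"{ATOM}title", default="(untitled)")
            for entry in tree.getroot().iter(f"{ATOM}entry")]

for title in post_titles():
    print(title)
```

Because the export is plain XML, any other tool that speaks the same schema can read it back in - which is exactly the lock-in antidote the paragraph above is describing.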

On the flip side, vendors still must be willing to cooperate and commit to that sort of openness, secure cloud integration, application portability and data portability. Will the US home developer actually put that 110-volt electric socket in all of the rooms?


Some would say that the higher you go up the cloud stack, the more difficult it is to reclaim your content. The prevailing sentiment is that an infrastructure provider allows you to move in, and move out, with all your belongings - like a hotel room. A software-as-a-service provider gets to keep all that code - which it owns - and if you walk away you are down more than your data. Here is a blog I wrote about reclaiming your data and applications.

If salesforce.com conformed to ideal standards it would allow an organization to export all that goodness and make it much easier as it hunts for a replacement. Standards have been around for thorny 'middle-ware' since 2004: interoperability and business process management. SaaS and PaaS vendors are in the driver's seat and will want to stay that way. If I can't increase stickiness, I will lock you in. Telecom carriers, alongside COTS application vendors, have long played the lock-in game. In the end they hold the shorter end of the stick. Instead they should focus on customer and brand loyalty, service levels and building trust and legitimate interdependence.

The nuance that gets lost in the discourse: plenty of standards already exist. Yet vendors are avoiding adoption as others play into sound-bites until they get critical mass. Another simple example: all the standards for security are ready for prime-time cloud computing (SAML, XACML, KEYPROV, ISO 15489, EDRM, PKCS, WS-Federation, Liberty ID-FF, etc.).

There are a number of notorious topics that do deserve special attention. Service Level Agreements and contract terms and conditions are not uniform across service providers. Count on government bodies such as the US GSA to swiftly drive the convergence to common attributes around performance metrics, incident response details etc.

The National Institute of Standards and Technology (NIST) calls out customer on-boarding, service provisioning and inter-cloud interaction as prime topics that deserve attention:
  • VM image distribution (e.g., DMTF OVF)
  • VM provisioning and control (e.g., Amazon EC2 API)
  • Inter-cloud VM exchange
  • Persistent storage (e.g., Azure Storage, S3, EBS, GFS, Atmos)
  • VM SLAs that are machine readable (uptime, resource guarantees, storage redundancy) - see the sketch after this list
  • Secure VM configuration (e.g. NIST Security Content Automation Protocol)
  • Workflow and business rules import and export
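For instance, a machine-readable VM SLA is just a structured document that a broker or monitoring tool can evaluate automatically. A minimal illustration in Python; the field names, values and checks are invented, not taken from any standard:

```python
# Illustrative only: a machine-readable SLA as structured data, plus a
# check a monitoring tool could run. Field names and values are invented.
sla = {
    "uptime_pct_min": 99.9,
    "cpu_cores_guaranteed": 4,
    "storage_redundancy_copies": 3,
    "measurement_window_days": 30,
}

observed = {"uptime_pct": 99.95, "cpu_cores": 4, "storage_copies": 2}

violations = []
if observed["uptime_pct"] < sla["uptime_pct_min"]:
    violations.append("uptime below guarantee")
if observed["cpu_cores"] < sla["cpu_cores_guaranteed"]:
    violations.append("fewer cores than guaranteed")
if observed["storage_copies"] < sla["storage_redundancy_copies"]:
    violations.append("insufficient storage redundancy")

print("SLA violations:", violations or "none")
```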
In a world of constant motion and innovation we should expect to live with some degree of proprietary technology and feature loss or gain. What we should expect is some sort of minimum code of interoperability, ethics and portability.