Monday, May 24, 2010

A Wicked Web We Weave

How do you get in the middle of the plans or actions of a knowledgeable employee who intends to inflict damage on your company?
It's very difficult to control the flow of information within today's workplace. In any normal "business" day there could be foreign-national interactions, USB key exchanges, work from home, and contractors, all of which greatly increase the opportunity for espionage and data theft. There is plenty of room for mischief, and the harm done by a well-informed saboteur is non-trivial: an insider can be an order of magnitude more damaging than a stranger.

CERT's definition of an insider: a current or former employee, contractor, or business partner who
• has or had authorized access to an organization's network, system, or data, and
• intentionally exceeded or misused that access in a manner that negatively affected the confidentiality, integrity, or availability of the organization's information or information systems.

Trouble with an "insider" is they have legitimate access. They are working within regulation, aware of policy and  unlikely to break rules. So access control won't work nor will "intrusion detection". There is no intrusion.
Most of the time an employee will be reported for unusual behavior by a co-worker or an audit. There are also personnel screening processes that have to be in place before hiring. Training and awareness helps employee's notice unusual activity.

That's old-school stuff, and it works and is crucial. In addition, there are technical approaches that have earned a rightful place on the "must-have" list. Most companies should turn on network and application activity logging, file integrity checks and data loss monitoring. Stitching together the alerts generated by "big brother" helps spot things that are "out of the norm". For example, if a file changes, or an email is sent with a sensitive attachment, a user or behavior can be marked as "bad" and declared suitable for further surveillance, as sketched below.
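
To make the alert-stitching idea concrete, here is a minimal sketch in Python. The alert names, weights and threshold are all assumptions invented for illustration, not a vetted risk model; a real deployment would feed this from actual logging and data loss monitoring systems.

```python
# Hypothetical sketch: correlate security alerts per user and flag
# anyone whose combined score crosses a review threshold.
from collections import defaultdict

# Alert types and weights are invented for illustration.
ALERT_WEIGHTS = {
    "sensitive_file_modified": 3,
    "email_with_sensitive_attachment": 4,
    "usb_device_attached": 2,
    "after_hours_login": 1,
}

REVIEW_THRESHOLD = 6  # arbitrary cutoff for this example

def users_to_review(alerts):
    """alerts: iterable of (user, alert_type) pairs from the logging systems."""
    scores = defaultdict(int)
    for user, alert_type in alerts:
        scores[user] += ALERT_WEIGHTS.get(alert_type, 0)
    # Only users whose accumulated score crosses the cutoff get flagged.
    return {user: s for user, s in scores.items() if s >= REVIEW_THRESHOLD}

sample = [
    ("alice", "after_hours_login"),
    ("bob", "sensitive_file_modified"),
    ("bob", "email_with_sensitive_attachment"),
    ("alice", "usb_device_attached"),
]
print(users_to_review(sample))  # {'bob': 7}
```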

Unfortunately there is still plenty of noise obstructing an accurate and clear reading of what is good, bad or ugly behavior. We hop on and off social networks, plug and unplug cables, head to work late, forget to submit expense reports, make travel arrangements out of policy, skip virus updates and get overly zealous downloading information. The list goes on. Human behavior can seem hopelessly unpredictable.

Statistics brings the magic of mathematics to make sense of data and tell us "what's up". It is a science that fills in the blanks in our memory, keeps us honest about the present and paints a rough approximation of our future. There is good research literature on the topic of user behavior analysis. We are working to take tried-and-true algorithms such as root cause analysis, propensity analysis, link analysis and econometric forecasts, and then rigorously apply them to hard-to-solve cyber security problems.
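
As a taste of what even simple statistics can do here, the sketch below flags days when a user's activity deviates sharply from their own baseline. It uses a plain z-score, a deliberately simple stand-in for the richer propensity and link-analysis methods mentioned above; the data and cutoff are made up for illustration.

```python
# A minimal z-score sketch of behavioral anomaly detection,
# assuming a per-user daily event count (downloads, logins, etc.).
import statistics

def anomalous_days(daily_counts, z_cutoff=1.5):
    """Return days whose activity deviates sharply from this user's norm.

    daily_counts: list of (day, count) pairs for a single user.
    """
    counts = [c for _, c in daily_counts]
    mean = statistics.mean(counts)
    stdev = statistics.stdev(counts)
    if stdev == 0:
        return []  # perfectly uniform history, nothing stands out
    return [(day, c) for day, c in daily_counts
            if abs(c - mean) / stdev > z_cutoff]

# Example: a quiet user who suddenly downloads far more than usual.
history = [("Mon", 12), ("Tue", 9), ("Wed", 11), ("Thu", 10), ("Fri", 95)]
print(anomalous_days(history))  # [('Fri', 95)]
```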

If you are interested in assessing threats across transactions, individuals and groups that are unobservable to the naked human eye, please contact me. While tackling the problem of employee betrayal can be hard, we can turn to “big data” and analytics to help trip up the enemy within. 

Thursday, May 20, 2010

At the NIST Cloud Computing Forum it was Baby Steps for Baby Feet, or so it Seems

I attended the Cloud Computing Forum & Workshops hosted by the National Institute of Standards and Technology (NIST). Venue: United States Department of Commerce, Washington DC. NIST's mission in life is to work collaboratively with the public and private sectors on standards and technology. Vivek Kundra, United States Chief Information Officer, was the keynote speaker.

The workshop atmosphere (it was really just one long day of presentations and panels) was naturally US-government centric. Patrick Gallagher, Director of NIST, pegged the event as a launch pad for a "robust dialogue" to define how the US government will procure and use cloud computing.

Here is a snippet of NIST's widely circulated definition of cloud computing:

“Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources (e.g. networks, servers, storage, applications, and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction”

As the day unfolded it became clear that NIST is faced with a catch-22. Agencies need standards to move forward. At the same time, NIST is shying away from standards lest it get them wrong too soon.

History, markets and economics tell us that buyers and sellers alike benefit when there are global norms that remove ambiguity, build confidence, establish best practices and clarify legal ramifications. It's like traffic signs and speed limits.

The US government is the world's largest purchaser of information technology, with 10,679 IT systems (lots of duplication) and an annual IT budget of $76+ billion. These impressive stats are a small indication of the high stakes being cast around. [The government serves over 300 million "customers" (students, employees, etc.) on a daily basis and has 1.9+ million federal employees.]

Vivek, while a big fan of cloud computing, reiterated that the initiative he launched in September 2009 is still mostly about "rethinking investments in IT budgets" through data center consolidation across agencies. A report on the state of the union was just released here.

There is meaningful progress towards cloud computing within state and federal agencies, and some illustrative use cases can be found on page 12 of the 38-page report. Here are a few examples:

• The SEC uses Salesforce to shorten the time to handle investor advocacy cases. Cases that used to take 30 days to close now take less than 7.

• For the 2010 count, the US Census launched 2010census.com. The site is very media rich and there was no way to gauge the traffic volume in advance, and the Census did not want to stand up a litany of servers for a one-time event. Instead it used Akamai to deliver content "from the edge". The Census also selected Everbridge for mass notifications to the 1+ million temporary census employees.

• Recovery.gov moved to the Amazon cloud. Recovery.gov is a tool that "shines light into Federal spending" and must handle millions of visitors. The shift to Amazon EC2 led to a savings of $750,000, with the funding redirected to help identify fraud, waste and abuse.

• The State of Utah is moving to a "hybrid cloud" (70% complete), going from 1,800 physical servers to 400 virtual servers, with a $4 million annual savings out of a $140 million IT budget. The cloud solution includes Salesforce, Google Earth and Wikispaces.

• The city of Los Angeles moved to Gmail from an antiquated communications system. They calculate a savings of $5.5 million in direct costs and an ROI of $20-30 million.

• The US Department of the Interior is midway through consolidating multiple email systems, covering 80,000 mailboxes, into a cloud model. It was not unusual for a single email server to manage only 6 email accounts.

• NASA is re-evaluating a $1.5 billion data center effort in the context of “cloud first” strategy.

A promising initiative was "revealed", so to speak, by Katie Lewin, Director of the GSA Cloud Computing PMO. The Federal Risk and Authorization Management Program (FedRamp) is the official name for the program Microsoft and Google are involved in to get their clouds "FISMA-compliant". The program is still in its formation phase.

What I gleaned about the spirit and purpose of FedRamp:

• The program lives under the Federal CIO and seeks to ease tough questions of security controls, authorization procedures and continuous monitoring for large outsourced and multi-agency systems. It would work in the context of the NIST Risk Management Framework (NIST 800-37 revision 1).

• Instead of working with each vendor independently to ensure compliance, an agency would depend upon stamps of approval that would be given to a cloud provider.

• An agency would review the authorization given to a vendor and, if convinced by the safeguard assurances, tie the knot with the vendor.

• It is unclear what sort of baselines will be published, or how they will apply across international borders and to strict (high-sensitivity) agency requirements. Each agency would still hold the responsibility to determine suitability.

• FedRamp will follow-up with participating cloud vendors to ensure “continuous monitoring” and flag any drift from the original certification.

All told, the tone in the hallways of the US federal government is one of optimism about the future of cloud computing. There is a "cloud first" mind-set. It will be a long haul though. John J. Shea, a Director in the DoD Office of the CIO, reminded us that cloud is a "2-10" technology: 2 years of hype and 10 years of adoption. Plenty of time to get rid of the "sins of the past".

Saturday, May 8, 2010

General Hayden - Words of Wisdom

I was privileged to listen to one of my all-time favorite presentations by retired General Michael Hayden, former director of the NSA and CIA. The venue: the TTI/Vanguard Conference on CyberInsecurity in Washington DC. He was reflecting on the current "bogeyman" known as cyber security.

Now, to most of us, cyberspace is a Hollywood (sci-fi genre) conjured term used synonymously with the Internet. Its origins are actually in the 1984 novel Neuromancer. In the national security discourse, cyberspace is the "5th domain", with implications far removed from a Hollywood set. There is complex language of "mutually assured dependence" and cyber warfare. Why the 5th domain? Well, the other four are land, sea, air and space. General Hayden artfully distinguished cyberspace as the only domain created by man; the rest were "created by god" and natural events. Adding: "we did not do such a good job". Simply put, the Internet is inherently untrusted, designed for anonymity and information movement, not for security.

In the US we are still grappling with the ownership of the "cyber thing", which Hayden points out is very much fractured. There are voices that seek to frame the problem as a matter of commerce to be shepherded along by the private sector. On the other hand, there are voices that frame cyber as a matter of national security, with bills that contemplate a nation navigating both times of cyber peace .. and war. Clearly it's not a trivial topic. The private sector owns a lot of critical infrastructure, from our energy supplies and power grids to our financial markets. And those networks and systems are not entirely isolated from this thing called cyberspace. For ill-doers this is a prize.


The Internet was designed and engineered for friend, not foe.


The growing voices suggesting that cyberspace should be regulated, and regulated by government, immediately send chills down the spines of civil libertarians and privacy mavens. They fear, as Hayden put it, that the US government will make the dubious offer that it cannot protect its citizens unless it is allowed to monitor them on the Internet. That fear was reinforced when General Hayden gave a wink to his own prior employer's (the NSA's) "abridged sense of privacy".


His message carried a sense of urgency without being alarmist. He encouraged industry to put energy into thinking about security and privacy, not just the ease of use of the Internet, and stressed the need for doctrine about what can and should not happen in cyberspace (think of the lack of norms against a cyber attack on a hospital's patient management system).

As John Negroponte (also a presenter) put it, in cyberspace "proceed with caution".