This month I had the pleasure of attending a talk given by Warren Udy, Director of Information Assurance and Cyber Security at the US Department of Energy. If you are in the Arlington, VA area on June 22, you should try to catch Warren at the “Cloud Computing Committee Meeting.” His presentation was not only very entertaining and enlightening on the new Federal Risk and Authorization Management Program (FedRAMP), but it also started me thinking about a few other recent developments on the federal cloud front. Before discussing FedRAMP, let us look at the recent changes to FISMA, the push for open government and the cloud, and the General Services Administration (GSA) reissuing the request for quote (RFQ) to Infrastructure as a Service (IaaS) vendors. We will conclude with examples of government cloud adoption going on today. Things are getting interesting on the federal front.

Changes to FISMA

Last month the Obama administration announced new standards for agency reporting under FISMA as part of an effort to get agencies to shift from paper-based reports to real-time monitoring of systems. Vivek Kundra, the Federal Chief Information Officer, was interviewed by Federal News Radio in the post "OMB outlines shift on FISMA." Vivek expressed the vision: "What we need to do, when it comes to information security, is shift to a model across the federal government, with a focus that is much more of a real-time basis. And you'll see forthcoming, in terms of the FISMA reporting guidance, more centered on continuous performance monitoring and Cyberscope."

Ben Bain reports in the article "NASA's new FISMA approach and what it means for you" that NASA’s Deputy Chief Information Officer for IT Security Jerry Davis is developing a new program for the security authorization process based on continuous monitoring, automated tools, and reduced paperwork. NASA hopes to have it in place for fiscal 2011. “Security is still going to be done. Certification and accreditation will still be done, but the way we do it is going to change significantly and the frequency of it will change,” he said. “Instead of every three years, you’re really going to be doing it, in a sense, on like a weekly or monthly basis, you’re always going to be looking at those controls and adjusting them for changes."

Alan Paller, director of research at the SANS Institute, is quoted on how the new approach will help to correct flaws in the original FISMA legislation: "It's a move toward being able to know the status of every machine at every minute. So that when something bad is coming at you, you know where you can target and where you can't so you can act quickly. It's a complete change from what we've had before. This started during the Clinton Administration, and it was the Senate that created it in the bill called GISRA, and then it became FISMA. It was an error made by people who didn't understand the threat, and the error was that you can manage fast-moving attacks with slow moving paper."
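
To make the contrast concrete, here is a minimal sketch of what "continuous" control monitoring could look like in code, as opposed to a three-year paper exercise: an automated job walks an asset inventory, evaluates a handful of control checks, and emits findings every time it runs. The hostnames, reported states, and control labels below are hypothetical placeholders, not any agency's actual Cyberscope feed or 800-53 assessment.

```python
# A minimal sketch of continuous control monitoring: automated checks against
# an asset inventory on a schedule, instead of a periodic paper report.
# All hosts, states, and thresholds here are invented for illustration.
import datetime

# Hypothetical inventory with the last-reported state of a few controls per host.
INVENTORY = {
    "web-01.example.gov": {"patch_age_days": 3,  "av_signature_age_days": 1, "fips_mode": True},
    "db-02.example.gov":  {"patch_age_days": 45, "av_signature_age_days": 9, "fips_mode": False},
}

# Each "control" is just a label plus a predicate over the reported state.
CONTROLS = [
    ("patches current (< 30 days)",    lambda s: s["patch_age_days"] < 30),
    ("AV signatures fresh (< 7 days)", lambda s: s["av_signature_age_days"] < 7),
    ("FIPS-validated crypto enabled",  lambda s: s["fips_mode"]),
]

def assess(inventory):
    """Return a point-in-time snapshot: every failing control on every host."""
    return [(host, name)
            for host, state in inventory.items()
            for name, check in CONTROLS
            if not check(state)]

if __name__ == "__main__":
    # In a real deployment this would run on a weekly, monthly, or faster cycle
    # and feed an automated report rather than a binder of paper.
    now = datetime.datetime.now().isoformat(timespec="seconds")
    for host, control in assess(INVENTORY):
        print(f"{now}  FAIL  {host}  {control}")
```

The interesting part is not the checks themselves but the cadence: the same snapshot logic run continuously is what turns a compliance document into an operational picture.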

Joe Faraone, aka Vlad the Impaler, offers a caution in his post "Machines Don’t Cause Risk, People Do!": "Desiring to know everything about everything may seem to some to be a worthy goal, but may be beyond many organization’s budgets. *Everything* is a point in time snapshot, no matter how many snapshots you take or how frequently you take them. Continuous, repeatable security processes followed by knowledgeable, responsible practitioners are what government needs. But you cannot develop these processes without starting from a larger, enterprise view. Successful organizations follow this–dare I say it–axiom whether discussing security governance, or system administration."

Open Government and the Cloud

Effective security approaches being beyond many organizations’ budgets might just be at the heart of the matter. Recall Vivek Kundra’s statement that he sees two overarching trends now happening in computing:

  1. The increasing use of mobile devices and the app ecosystems they support.
  2. Cloud computing, which can cut IT costs and drastically improve access to information.

With that in mind, it is not surprising that Nick Eaton reports in his post "Obama's CIO ready to bring government tech up to speed" that the first two major tech initiatives launched by the Obama administration consist of:

  1. Data.gov, which is a repository for open government datasets that people can access to create applications, do scientific research, and more. It launched with 47 datasets and now includes more than 169,000. Since its launch in May 2009, New York, San Francisco, Seattle, and other local governments have launched similar services. Vivek has stated that a big difference between public-sector and private-sector technology is that the commercial world is focused on front-end customer needs, whereas government IT is usually focused on the back end. Kundra wants to change that by creating accessible user interfaces to online government services, and as a result make "government cool again."
  2. Apps.gov, which is hosted by the U.S. General Services Administration. It is a clearinghouse for hundreds of cloud-computing applications, both free and paid, mostly from private vendors.

Cloud computing can be a solution that allows for continuous monitoring and a unified, risk-based approach across government agencies, all while reducing costs. A major stumbling block is resolving agencies' compliance issues with respect to cloud vendors.

GSA Reissues RFQ

The GSA released the RFQ on its E-Buy system in mid-May, asking for bids from IaaS providers on cloud storage services, virtual machines, and cloud web hosting. Fed Cloud Blog interviewed Dave McClure, GSA’s Associate Administrator of Citizen Services and Communications, concerning the RFQ and the new contract. Dave discussed several of the differences:

We’re raising the security level to the moderate level. I think that’s where the public sector in general is headed — greater security in these cloud provisioning agreements. So, we’ve raised this up to the moderate level. I think that’s a significant improvement and difference from the prior RFQ. We also are making it much easier and clearer to map the industry offerings to the contract line items in this BPA instrument that we’re using. There was some confusion about whether specific services and prices for some of the industry offerings — how they’ve mapped to the contract line items in this BPA. We’ve gone back and actually cleaned that up and had conversations with industry on how that mapping process can work very effectively. So I think that will also create a much better instrument than what we had before. The third big difference is that things that are awarded off of this instrument will be candidates that will go into the FedRAMP centralized C&A approval process. I think that will make a difference, as well — knowing that your product or service will actually go through one C&A and then be usable across the entire government.

FedRAMP

This month FedRAMP was officially announced. Peter Mell, FedRAMP Program Manager, discusses the program in his presentation from last month. Peter explains that FedRAMP is a government-wide initiative to provide joint authorizations and continuous security monitoring services. It provides unified government-wide risk management and allows agencies to leverage FedRAMP authorizations (when applicable).

FedRAMP’s initial focus is on cloud computing, with the program working with cloud vendors (currently Microsoft and Google are in pilot mode) to evaluate their overall security environment against government security controls. The controls will be based on the new NIST security framework. There will still be some gaps between civilian, DoD, and intelligence agencies, so moving to the cloud will still require some security work. The goal of FedRAMP is to create a unified risk management process that:

  • increases security through focused assessments.
  • eliminates duplication of effort and the associated costs.
  • enables rapid acquisition by leveraging pre-authorized solutions.
  • provides agency-vetted, transparent security requirements and authorization packages.
  • facilitates multi-agency use of shared systems.
  • ensures integration with government-wide security efforts.

Peter states, "An advantage of this program is that [vendors] primarily work with one security assessment and authorization body, or one risk management program, and they don't have to independently meet all of the security requirements of the many, many different agencies." In an interview with Eric Chabrow, Mell goes on to state, "Agencies, by leveraging FedRAMP authorization, will save a lot of money and enable rapid acquisition, but they're still in control. They get to choose whether or not they leverage it. They can choose if they want to do additional work to assure systems meet the security needs of their agency."

Mell believes the primary hurdle to securing government adoption of cloud computing is the lack of government-wide authorization capabilities. Mell states:

Currently, with each federal agency independently doing risk management with these large outsourced systems in cloud computing, you have got duplication of effort, but you have also got incompatible policies being levied, because the Federal Information Security Management Act is all about a framework by which agencies communicate or enforce their policies on a system. So you get 40 agencies together, enforcing their policies on a single system, and the intersection of those policies is likely not achievable. Likely, they will disagree on the finer points of server configuration, for example, and it just won't be possible, and that is a source of great frustration for cloud vendors. It also means that acquisition is very slow, with lengthy compliance processes, and then there is inconsistent application of these government-wide security programs.

To solve that, and I think this is common sense, I don't think we are doing anything unexpected or unusual here, though it's certainly new, the proposed solution is found within FedRAMP – the Federal Risk and Authorization Management Program. The idea is to create a government-wide risk management program that can be optionally used by the agencies. It provides joint authorization services and continuous monitoring services and again, I will stress that it is optional.

FedRAMP would perform assessment and authorization of these very large systems; these government-wide authorizations can then be optionally leveraged by agencies so that they can adopt these services with a minimum of additional security effort required. FedRAMP would perform security assessments based on an agreed-upon government-wide security baseline that agencies can leverage. That is what I mean by most of the work will be done, because that baseline will have been assessed and authorized.

Agencies do have unique missions and risk tolerances and security needs, and so agencies are always welcome to do incremental additional security testing, require additional security controls to be implemented, and so forth. But again, the idea is to complete the bulk of the work for the agencies; do it once and do it well, and thereby reduce an enormous amount of duplication of effort, enable rapid acquisition by federal agencies, and eliminate that concern of security requirements not being compatible when multiple agencies levy them on a particular resource-pooled cloud system. And lastly, ensure consistent application of federal government-wide security programs. There is the Trusted Internet Connection program, there is ITM, there is Einstein, and the list goes on.
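
The "do it once, leverage it, add only your delta" idea Mell describes can be shown with a toy sketch: treat the jointly authorized baseline and an agency's required controls as sets, leverage the overlap, and assess only the difference. The control identifiers below are made up for illustration and are not the actual FedRAMP baseline.

```python
# Toy illustration of leveraged authorization: the agency reuses whatever the
# joint authorization already covered and assesses only its incremental needs.
# These control sets are invented for illustration, not the real baseline.
FEDRAMP_ASSESSED = {"AC-2", "AC-17", "AU-6", "CM-6", "IR-4", "SC-7", "SI-2"}
AGENCY_REQUIRED = {"AC-2", "AC-17", "AU-6", "CM-6", "IR-4", "SC-7", "SI-2", "PE-3", "MP-6"}

leveraged = AGENCY_REQUIRED & FEDRAMP_ASSESSED      # covered by the joint authorization
agency_delta = AGENCY_REQUIRED - FEDRAMP_ASSESSED   # the agency must still assess these

print("Leveraged from the joint authorization:", sorted(leveraged))
print("Agency must still assess:", sorted(agency_delta))
```

In practice the comparison happens control by control against the agreed government-wide baseline, but the leveraging logic is the same: the bulk of the assessment is done once, and each agency pays only for its delta.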

As to the question of authorization, Mell explains, "this fits perfectly within existing law, OMB policy, and even NIST security guidance. What we did do is in the new NIST risk management framework, in particular the NIST Special Publication 800-37, we added an Appendix s.6. That appendix talks about this notion of joint authorization being performed by the joint authorization board and then this concept of leveraged authorization where the agencies are leveraging the outcome of this joint authorization. We put the sort of foundational underpinnings of FedRAMP into the new NIST management framework. And by the way, FedRAMP is designed to follow that NIST risk management framework and focus a lot on that continuous monitoring aspect."

There are real issues that need to be worked out as FedRAMP develops. For example, Michael Smith in his post, “NIST Cloud Conference Recap” shares his personal experience with a certifier that said, “we don’t recognize common controls so even though you’re just a simple web application you have to justify every control even if it’s provided to you as infrastructure.” Michael goes on to list several pieces that he has not seen FedRAMP address yet (follow the link and read his blog). I will add two more:

  1. Vendor lock-in: if a cloud provider is authorized at some point but later stops meeting the security controls, causing its authorization to be revoked, how do agencies switch cloud providers without cost and/or loss of service?
  2. Contamination containment: when classified material leaks into the cloud, how is that dealt with? It does happen. Current requirements are to have the affected drives pulled and destroyed, which is not possible under current cloud configurations where data is spread over thousands of drives.

So, everything is not rainbows and unicorns. It never is in security. There are real challenges to be faced. It is great that a discussion is taking place and folks are working hard at addressing these issues.

Federal Cloud Adoption

This past week, a new Federal CIO Council report, "The State of Public Sector Cloud Computing" was released. The executive summary states, "As we move to the cloud, we must be vigilant in our efforts to ensure that the standards are in place for a cloud computing environment that provides for security of government information, protects the privacy of our citizens, and safeguards our national security interests. This report provides details regarding the National Institute of Standards and Technology’s efforts to facilitate and lead the development of standards for security, interoperability, and portability." Kevin Jackson in his post, "Vivek Kundra – State of Public Sector Cloud Computing" describes how the report "not only details Federal budget guidance issued to agencies to foster the adoption of cloud computing, but it also describes 30 illustrative case studies at the Federal, state and local government level."

Deniece Peterson in the post, "Security, Standards and Budget Initiatives to Spark Cloud Computing Adoption" discusses the NIST forum and workshop she attended (slides are available). Deniece describes the morning session as including a panel of industry representatives from Intel, Microsoft, the Cloud Security Alliance, Amazon.com, and the Center for Democracy and Technology. The panelists' wish list consisted of:

  • Keep going with FedRAMP (the security certification effort), but don't stop there.
  • Develop standards in collaboration with both industry and international stakeholders.
  • Recognize that interoperability needs can vary case by case; no one size fits all.
  • Don't stifle innovation by setting standards too quickly; focus on building the framework.
  • ID management, access control, and cryptographic key management are the main security issues surrounding cloud computing and can have a serious impact on scalability.
  • Push vendors to be more transparent about their security controls.
  • Traditional notions based on physical boundaries will need to change.
  • SLAs must include meaningful metrics for performance and security (see the sketch after this list).
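
On that last point about SLAs, here is a toy illustration of what a "meaningful metric" might look like: targets expressed as measurable numbers that can be checked every reporting period rather than as vague promises. Both the targets and the measured values below are invented for this example.

```python
# Hypothetical monthly SLA check; the metrics, targets, and measured values
# are invented for illustration, not any provider's actual figures.
sla_targets = {
    "availability_pct": (99.9, "min"),          # service up at least this much
    "mean_time_to_patch_hours": (72.0, "max"),  # critical patches applied within this window
}
measured = {"availability_pct": 99.95, "mean_time_to_patch_hours": 96.0}

for metric, (target, kind) in sla_targets.items():
    value = measured[metric]
    met = value >= target if kind == "min" else value <= target
    print(f"{metric}: measured {value} vs target {target} ({kind}) -> {'MET' if met else 'MISSED'}")
```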

"We want to be pragmatic, but aggressive," Kundra told the Washington crowd, noting that the government's consolidation of federal data centers and several other "game-changing approaches" will further fuel the move to the cloud. Andrew R Hickey in his article, "Federal CIO Says Cloud Standards Needed For Government Adoption" describes how NIST has also started the Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC) initiative that will validate and communicate interim specifications to agencies in the areas of security, interoperability and data portability. "We're not trying to write cloud computing standards, but are trying to do some testing on reasonable system interfaces or specifications of systems and make the test results available so people can see something is absolutely possible because the the test results show it," NIST senior computing scientist Lee Badger said. NIST will also launch a publicly accessible Web portal to facilitate collaborative development of standards to support cloud computing requirements, Dawn Leaf, NIST senior executive for cloud computing, told attendees. Leaf expects the portal to be available sometime before the end of 2010. Currently, business use cases are now available on the CIO Web site.

Alex Howard reports that recovery.gov will be moving to Amazon's cloud. Earl Devaney, chairman of the recovery board, stated this move represents one of the "first bricks in the foundation that we're laying" throughout the federal government in terms of cloud computing. Vivek would direct us to "look at the Department of Interior: The CIO is considering moving 80,000 emails to the cloud. Look at the investments made at GSA or a recent RFI [Request for Information] around email. Across federal government, you're seeing a number of agencies putting in a plan." J. Nicholas Hoover reports in his article "Gov 2.0: Google Readies Government Cloud" that Google's existing Google Apps customers include the city of Los Angeles and Lawrence Berkeley National Laboratory. In the federal sector, more than 100 federal agencies are already customers of Google's other products, including Google Earth, Google Maps, and Google Enterprise Search. Google Enterprise president Dave Girouard reports, "we have a lot of state and local interest, and, increasingly, with FISMA certification arriving soon, think we have an opportunity with the federal sector." Girouard said that in addressing the federal government's unique cybersecurity demands, the majority of Google's work thus far has centered on documenting, clarifying, and explaining Google's security rather than re-inventing or changing its security posture.

Final Thoughts

Mary Engelbreit, the famous children's book illustrator, once wrote, "If you don't like something change it; if you can't change it, change the way you think about it." Is the government making real changes? If so, are these the kind of changes necessary to make cloud computing a reality in federal departments?

Lori MacVittie in her post, “Can the Cloud survive regulation?” points out that “we are just beginning to see the impact of what sharing and ‘international’ really means: an increasingly complex web of requirements and regulations. That may very well make the cloud a battle-zone unsuitable for any organizational use until the conflicts between security, regulations, reliability, and privacy are addressed.” Lori also considers that we might just “see the rise of regulated clouds; clouds within clouds specifically designed to meet the demanding needs of the myriad governmental and industry-specific privacy and data protection regulations. Regulated clouds set aside – at a premium of course – for those users and organizations who require a broader set of solutions to remain compliant even in the cloud.”

In the post “Cloud: Security Doesn’t Matter (Or, In Cloud, Nobody Can Hear You Scream)” Chris Hoff offers the opinion, “the only thing that will budge the needle on this issue is how agile those who craft the regulatory guidelines are or how you can clearly demonstrate why your compensating controls mitigate the risk of the provider of service if they cannot.” Chris goes on to state, “We need the regulators and examiners to keep pace with technology — as painful as that might be in the short term — to guarantee our success in the long term.” Chris also recommends organizations “manage compliance, don’t let it manage you.” Novell has produced a very funny short video based on the post, along with other entertaining short videos you will want to check out.

I do not agree with everything that is going on in government. I believe solutions will be found through trained security professionals. Security tools can be empowering, but they are not the end-all solution. A monkey with a computer, even if it is a high-performance computer, is no William Shakespeare. Adding more monkeys will not make any difference; it just creates a zoo. I do believe in the possibilities created by change, especially when you find yourself in a place where things are not working. You build upon the knowledge of your people, utilizing what does work.

What gives me greatest hope is that the federal government seems to be listening to experts like Chris, Deniece, Joe, Lori, and Michael, and making a solid effort to create an environment that fosters the adoption of cloud computing. These are not just cosmetic changes in how we think about computing, but real changes in how we will operate. For those who like the challenges brought on by change, it is an exciting time to be in security.
