The use of standards-based solutions can be an important risk reduction approach for Government.
Please describe current standards that could help the Government in its adoption of cloud computing. Also, what cloud standards efforts would you like to see in the future?
- Winston Bumpus, Director of Standards Architecture, Office of the CTO, VMware
- Ron Knode, Director, GSS, LEF Research Associate, CSC
- James A. St.Clair, Senior Manager, Global Public Sector, Grant Thornton LLP
- Lew Moorman, President, Cloud and Chief Strategy Officer, Rackspace Hosting
- Peter Coffee, Director of Platform Research, salesforce.com inc.
- Teresa Carlson, Vice President, Microsoft Federal
- Marie Francesca, Director Corporate Engineering Operations, The MITRE Corporation
Winston Bumpus
Director of Standards Architecture, Office of the CTO
VMware
Over the last year, much progress has been made on new standards for improved cloud interoperability and reduced vendor lock-in. Standards development organizations (SDOs) have been applying the expertise of their constituencies to the problem, and new organizations, such as the Cloud Security Alliance, have emerged to focus on the unique challenges of cloud computing. Existing standards are also being adapted to address cloud interoperability, among them the Open Virtualization Format (OVF) from the Distributed Management Task Force (DMTF). OVF was originally developed to address portability concerns between various virtualization platforms. It consists of metadata about a virtual machine image, or a group of images, that can be deployed as a unit. It provides an easy way to package and deploy services as a virtual appliance, or to prepackage known configurations of virtual machine images for use within an enterprise.
For example, an OVF package may specify the number of CPUs and the memory required to run effectively, along with network configuration information. It can also contain digital signatures to ensure the integrity of the machine images being deployed, as well as licensing information in the form of a machine-readable EULA (End User License Agreement), so that the terms can be understood before the image or images are deployed.
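As a concrete illustration, here is a minimal sketch (not an official DMTF tool; the descriptor file name is hypothetical) that uses Python's standard library to read the CPU and memory requirements out of an OVF descriptor, which is plain XML:

```python
# Minimal sketch: read virtual hardware requirements from an OVF
# descriptor using only the Python standard library. The namespace
# URIs follow the OVF 1.x specification; "appliance.ovf" is a
# hypothetical file name.
import xml.etree.ElementTree as ET

OVF = "http://schemas.dmtf.org/ovf/envelope/1"
RASD = ("http://schemas.dmtf.org/wbem/wscim/1/cim-schema/2/"
        "CIM_ResourceAllocationSettingData")

tree = ET.parse("appliance.ovf")
for item in tree.iter(f"{{{OVF}}}Item"):
    kind = item.findtext("rasd:ResourceType", namespaces={"rasd": RASD})
    qty = item.findtext("rasd:VirtualQuantity", namespaces={"rasd": RASD})
    if kind == "3":    # CIM resource type 3 = processor
        print("virtual CPUs:", qty)
    elif kind == "4":  # CIM resource type 4 = memory
        print("memory quantity:", qty)
```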
Work is under way to determine whether additional metadata should be added to OVF to improve the automation of inter-cloud workload deployment. Candidates include standardized SLAs (Service Level Agreements), richer inter-virtual-machine network configuration and switching information, and software license information covering all of the components that make up the workload.
Other standards are still emerging, including a common cloud API (Application Programming Interface). These higher-level APIs will be important as the market settles into a series of cloud offerings, each with a different underlying implementation but a standard manner of interaction. Having a standard set of foundation APIs, similar to the POSIX standards of the past, will help ensure that cloud management and compliance tools do not become overly complex in order to handle different cloud implementations.
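To make the POSIX analogy concrete, here is a toy sketch of what programming against such a foundation API might look like; the interface and the provider class are invented purely for illustration, not drawn from any actual specification:

```python
# Hypothetical sketch of a common "foundation" cloud API: management
# tools are written against the abstract interface, while each cloud
# supplies its own implementation. All names here are invented.
from abc import ABC, abstractmethod


class CloudCompute(ABC):
    """The standard surface that tools would program against."""

    @abstractmethod
    def launch(self, image_id: str, cpus: int, memory_mb: int) -> str:
        """Start an instance and return its identifier."""

    @abstractmethod
    def terminate(self, instance_id: str) -> None:
        """Stop and release an instance."""


class ProviderA(CloudCompute):
    """One vendor's implementation; others would differ internally."""

    def launch(self, image_id, cpus, memory_mb):
        # ...translate to this provider's native provisioning calls...
        return "a-instance-001"

    def terminate(self, instance_id):
        pass  # ...this provider's native teardown...


def compliance_check(cloud: CloudCompute) -> None:
    # A tool like this works unchanged on any conforming cloud.
    instance = cloud.launch("gold-image", cpus=2, memory_mb=4096)
    cloud.terminate(instance)
```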
Efforts such as NIST's Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC) and the Federal Risk and Authorization Management Program (FedRAMP) are two key initiatives to help both government and industry better define a robust, scalable, and economically viable set of standards to accelerate the adoption of cloud computing.
For more information on what is going on in the various SDOs and on the current and emerging industry standards for cloud computing, please go to www.cloud-standards.org.
Posted: July 13, 2010
Ron Knode
Director, GSS, LEF Research Associate
CSC
Cloud Standards Now!?
Wouldn't it be wonderful if we could simply point to cloud standard(s) and claim that such standard(s) could reliably lubricate government adoption of safe, dependable, accreditable cloud computing?! Sadly, we cannot. At least, not yet. And, this fact is as true for commercial adoption of cloud computing as it is for government adoption.
However, what we do have is the collective sense that such standards are needed, and the energy to try to build them. Furthermore, while the "standards" we need do not yet exist, we are not without the likely precursors to such standards, e.g., guidelines, so-called best practices, threat lists, special publications, and all manner of "advice-giving" items that try to aim us in the right direction (or at least aim us away from the very wrong direction). In fact, we have so many contributors working on cloud standards of one kind or another that we are in danger of suffering the "lesson of lists" for cloud computing.
Nevertheless, given our desire to reap some of the benefits of cloud computing, should we not try to accelerate the production, publication, and endorsement of cloud computing standards from the abundance of sources we see today?
Wait a minute! Standards can be a blessing or a curse. On the one hand, standards make possible reasonable expectations for such things as interoperability, reliability, and the assignment and recognition of authority and accountability. On the other hand, standards, especially those generated in haste and/or without widespread diligence and commentary, can bring unintended consequences that actually make things worse. Consider, for example, the Wired Equivalent Privacy (WEP) part of 802.11 or the flawed outcomes and constant revisions for the PCI DSS (remember Hannaford and Heartland!?).
What we seek are standards that lead us into trusted cloud computing, not just "secure" cloud computing or even "compliant" cloud computing. Ultimately, any productive stack of standards must deliver transparency to cloud computing. Simply having cloud standards for their own sake does not bring any enterprise closer to the promised payoffs of the cloud. So, let's proceed with all deliberate speed through the worthy efforts ongoing, but not declare success merely for the sake of an artificial deadline or competitive advantage. The cloud definition and certification efforts sponsored by NIST and GSA, the security threat and guidance documents authored by the Cloud Security Alliance, the cloud modeling work of the OMG, the cloud provider security assertions technique proposed by Cloudaudit.org, and the CloudTrust Protocol, an SCAP-like extension to reclaim transparency for cloud computing: all of these efforts certainly hold promise for accelerating the adoption of cloud computing for government and industry.
Let's push and participate in the actions of these and other groups. Ask questions, experiment, build prototypes, and seek extensive and deliberate peer review. Standards that survive such a process can be endorsed. But, like fine wines, cheeses (and even thunderstorms), "We will accept no cloud standard before its time."
See the full blog response at www.trustedcloudservices.com.
For further information, please contact Ron Knode at: rknode@csc.com
Posted: July 21, 2010
James A. St.Clair
Senior Manager, Global Public Sector
Grant Thornton LLP
While the Cloud does prompt consideration of unique standards, many of the "same old" standards still pertain and should be considered.
At a high level, existing compliance standards such as the Health Insurance Portability and Accountability Act (HIPAA), Sarbanes-Oxley (SOX), and the Federal Information Security Management Act (FISMA), embodied in NIST guidance, all provide specific security considerations that pertain to any computing environment. In basic terms, systems are expected to provide a level of confidentiality, integrity, and availability commensurate with the sensitivity of the information and the identified level of risk, whether the system is a one-server LAN or a cloud-provisioned infrastructure.
However, how provisioned services and cloud computing manage information and limit risk arguably differs from traditional client/server architecture. As such, new standards are developing that enable an "apples to apples" comparison between traditional security control objectives and new cloud-provisioned services:
- The Cloud Security Alliance. The Cloud Security Alliance (CSA) is a non-profit organization formed to promote the use of best practices for providing security assurance within cloud computing, and to provide education on the uses of cloud computing to help secure all other forms of computing. The CSA's "Security Guidance for Critical Areas of Focus in Cloud Computing" is one of the first security standards for cloud computing, based on the contributions of many industry participants.
- FedRAMP. The Federal Risk and Authorization Management Program (FedRAMP) is a unified government-wide risk management program focused on large outsourced and multi-agency systems. The program will initially focus on cloud computing but will expand to other domains as it matures. FedRAMP provides security authorizations and continuous monitoring of shared systems that agencies can leverage to reduce their security compliance burden. FedRAMP authorization standards are built upon the tested application of tailored security controls and enhancements from NIST Special Publication 800-53.
Additionally, other groups are fostering technical and architectural standards that provide important elements of a framework to develop virtualized environments and provision cloud computing:
- For several years, The Open Group has developed and promoted The Open Group Architecture Framework (TOGAF), now at version 9. TOGAF has been built as an industry consensus framework and method for enterprise architecture that is available for use, and for download, by any organization around the world.
- The Open Web Application Security Project (OWASP) is a global, collaborative, not-for-profit association dedicated to advancing knowledge and awareness of application security. Perhaps most importantly, OWASP tools and initiatives directly benefit virtualization and cloud services, which heavily leverage web applications for service delivery.
It is envisioned that continuing adoption of these and other standards will help drive broader adoption of more transparent and secure cloud computing services. Ultimately, whether an organization pursues creating its own cloud or leverages hybrid or public clouds, these standards will ease adoption and promote cost-effective IT investments.
For further information, please contact James A. St.Clair at: Jim.StClair@gt.com
Posted: July 27, 2010
Lew Moorman
President, Cloud and Chief Strategy Officer
Rackspace Hosting
Many suggest that standards are the key to encouraging broader adoption of cloud computing. I disagree; I think the key is openness and a competitive market. What's the difference? In the standards approach, a cloud would look and work as described by the standard it is implementing. If only one commercial implementation of the standard exists, this limits choice and freedom. Open clouds, on the other hand, could come in many different flavors, but they would share one essential feature: all of the services they'd offer could be run by the enterprises or agencies themselves without requiring a service provider.
Why is openness so important? If the web has taught us anything, it is that open systems, portability, and choice drive innovation. The open Linux system brought us a mountain of software and tools to help accomplish almost any task. And each open component, whether a database or a widget, could be moved in and out freely to get the job done. These components often followed standards but were not limited or locked in by them; as long as their changes were open and accessible, they saw adoption.
Last week, Rackspace announced a new open source cloud platform with the support of more than 25 technology industry leaders: OpenStack. With initial source code contributions from Rackspace and the NASA Nebula cloud platform, OpenStack forms a foundation of cloud technologies used at scale in production today, including a compute provisioning engine, OpenStack Compute, and a fully distributed storage engine, OpenStack Object Storage.
We expect an open source cloud platform like OpenStack to enable several things. One, anyone will be able to run this cloud and do it anywhere. Enterprises and agencies will be able to build private clouds. Workloads will move easily among these clouds, from private to community to public, or among different service providers, without the software having to be rewritten. Two, the entire tech ecosystem can build around this open source foundation. With wide adoption, there will be a more robust market of services around this flexible engine, from storage systems to monitoring tools to management systems. Three, the cloud will advance faster than ever.
It has been rewarding to see more formality around cloud standards development (from the DMTF, OGF, and others), as well as a coalescing of various standardization efforts (e.g., http://cloud-standards.org). But customers demand a faster pace. That's why we established OpenStack to accelerate innovation around open cloud software, and why we have also committed to adopt and support open and extensible standards as they emerge.
For further information, please visit http://www.rackspace.com
Posted: July 29, 2010
Peter Coffee
Director of Platform Research
salesforce.com inc.
In the cloud, if you're not interoperable, you're irrelevant. Any cloud service that can't interact with other services, and integrate with legacy IT assets, is too crippled to be competitive: it will never be important enough to make its proprietary nature a problem to any large community of users.
Even so, the world of the cloud still demands a role for standards, but not the role that standards have played in the past.
With locally operated IT, a dominant vendor could exploit increasing returns to scale: the more of that vendor's technology a customer adopted, the greater the incentive to use the same vendor for the next incremental need. Customers needed standards to protect them from dominant vendors' temptation to strengthen barriers to competition.
The cloud's best-paved path does not lead to natural monopoly: rather, the cloud invites any provider of even the narrowest specialty capability to offer that service through an appropriate API, accessible to other services through established Web service protocols. A vendor of risk management tools, for example, need not create its own sources of data or its own facilities for documenting results: other cloud services can perform those functions, while the vendor puts its resources into what it does best.
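A purely hypothetical sketch of that composition pattern follows; every endpoint and field name is invented for illustration. The risk vendor writes only the scoring function, its actual specialty, and consumes the rest as services:

```python
# Hypothetical illustration of cloud "symbiosis": a specialist risk
# service composes two other cloud services over standard web APIs
# instead of building those capabilities itself.
import json
import urllib.request


def fetch_positions(portfolio_id: str) -> list:
    # Consume a (hypothetical) market-data service via REST + JSON.
    url = f"https://data.example.com/v1/portfolios/{portfolio_id}"
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)["positions"]


def score(positions: list) -> float:
    # The vendor's real specialty: the risk model itself.
    return sum(abs(p["exposure"]) for p in positions)


def file_report(portfolio_id: str, risk: float) -> None:
    # Hand the result to a (hypothetical) reporting service.
    body = json.dumps({"portfolio": portfolio_id, "risk": risk}).encode()
    req = urllib.request.Request(
        "https://reports.example.com/v1/risk",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


def assess(portfolio_id: str) -> None:
    file_report(portfolio_id, score(fetch_positions(portfolio_id)))
```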
The cloud is an ecosystem that favors specialization and symbiosis, rather than generalization and top-predator dominance. The question that should be asked, therefore, is not, "are standards desirable?" The question should rather be, "what standards will benefit customers and users?"
In the old IT model, customers needed standards that made different vendors' products readily substitutable, even if this led to a commoditized marketplace, such as the desert of "beige box" Windows/Intel desktop PCs that competed only on price.
The cloud invites providers to develop distinctive competence, while customers focus on standards that maximize interoperability. Already, these standards largely define the cloud marketplace: for example, more than half the workload borne by salesforce.com systems supports other services invoking salesforce.com APIs, rather than executing salesforce.com's own cloud applications.
Cloud service customers will still benefit from some degree of substitutability, both commercially (providers will not be able to raise prices without inviting competition) and technically (rare service interruptions may be inconvenient, but need not be catastrophic).
Customers will do well, though, to think on a higher level than mere substitution: should cloud storage services, for example, be purely interoperable, or should there be stratified storage offerings with varying reliability/speed/cost that can provide "defense in depth" against common-mode failures? What protections matter most?
Any competitive IT platform should give application developers and customers the freedom to put some things where their productivity and capability will be greatest, while putting other things where substitutability is maximized. When writing an application with a useful life of months, developers want maximum leverage; when developing intellectual property with a lifetime of years, they may prefer a path that can lead to as many different environments as possible.
The proper role of standards is to provide, and preserve, a choice among varied paths.
For further information, please contact Peter Coffee at: pcoffee@salesforce.com
Posted: July 30, 2010
Teresa Carlson
Vice President
Microsoft Federal
Standards can be extremely valuable in providing security and privacy assurances to organizations exploring cloud computing options, and they are also critical to laying a foundation of interoperability within the IT industry. Interoperability is essential because it promotes competition, innovation, and customer choice, all of which are key to ensuring the government has access to the best solutions at the best prices. It is important to think of standards as a means to this end, because creating standards for the sake of creating standards can hinder innovation.
History shows that standards tend to emerge as they are needed. Industries adopt standards when organizations demand them. They have the power to level the playing field and provide access to the best solutions. In the cloud, this means security standards, protocols, Internet standards, and storage standards. Some of the best cloud standards that exist today are the same ones that advanced the Web, including HTTP, the Simple Object Access Protocol (SOAP), Representational State Transfer (REST), and the Extensible Markup Language (XML). These are all market-tested standards carried over from Web 2.0 and grid computing, and they all support interoperability. In some cases these existing standards are not a perfect fit for the cloud, especially when it comes to connectivity, datacenter proximity, and privacy, but new standards that address these issues will continue to build upon Web services and REST-based approaches. Our developers often note that high-level, semantic standards tend to work better than syntactic standards because they avoid the friction and overlap that can actually hinder progress.
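For a sense of how those carried-over Web standards combine in practice, here is a minimal sketch of a RESTful request over HTTP that returns XML; the endpoint URL and element names are hypothetical:

```python
# Minimal sketch of REST over HTTP with an XML payload, using the
# same Web-era standards discussed above. The endpoint and element
# names are hypothetical.
import urllib.request
import xml.etree.ElementTree as ET

req = urllib.request.Request(
    "https://cloud.example.gov/api/v1/instances",  # hypothetical
    headers={"Accept": "application/xml"},
)
with urllib.request.urlopen(req) as resp:
    doc = ET.fromstring(resp.read())

# Any client that speaks HTTP and XML can consume this response,
# which is exactly what makes the interface interoperable.
for inst in doc.iter("instance"):
    print(inst.get("id"), inst.findtext("state"))
```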
But standards don't create interoperability on their own. If two providers implement the same standard within their cloud offerings, there is no guarantee that those products will be able to interoperate with each other. Interoperability still requires ongoing technical collaboration among companies, governments, and standards-setting organizations. In the federal space, standardized terms, language, and processes will go a long way toward achieving this goal. On the security side, FedRAMP is a great example of collectively addressing standards and streamlining the process of evaluating cloud solutions. Making authorization government-wide will eliminate much of the duplicated time and resources that individual agencies must otherwise devote to risk management.
The best standards come from an open, collaborative process that is driven by market need. Worthwhile standards need to be tied to a specific "use case" that addresses a critical outcome like interoperability or security. Great standards already exist today that we can build upon to address current cloud gaps, keeping the playing field level and offering the best solutions for federal agencies.
For more information, please see Teresa Carlson's FutureFed blog at: http://blogs.msdn.com/USPUBLICSector/
Posted: August 5, 2010 6:00 pm.
Marie Francesca
Director Corporate Engineering Operations
The MITRE Corporation
Many thanks to this month's submitters for sharing their insights and perspectives on cloud standards. As they have noted, there are multiple ongoing activities by government and industry, with many market-leading companies participating. Winston Bumpus states that much recent progress has been made; however, more effort is needed to facilitate widespread government adoption. NIST is leading the way in government, and industry-based organizations such as the DMTF are pursuing standards that can move the community to the next level. The history of technical standards has shown that they can be highly successful in facilitating interoperability and portability as well as lowering costs and enabling new products.
Successful standards development involves people, process, and technology, and all three aspects must be addressed. Lew Moorman hinted at this in the discussion of openness in his response. Several sources, including the IEEE, indicate that better standards are themselves open (versus closed/proprietary) and are developed in an open, community process. Business opportunities need to outweigh the loss of competitive advantage attained through a closed, proprietary approach. In his Essential Guide to Standards, Andrew Updegrove places significant weight on having the right membership in a standards development consortium, including a broad list of market-leading vendors and government. Peter Mell and Tim Grance of NIST both indicate that minimum cloud standards should ensure interoperability without inhibiting innovation. Similarly, several of our blog responders commented that preserving innovation (and thus competition and business opportunity) is a necessary attribute of standards.
Peter Coffee proffers an important question that all of us should ask: "What standards will benefit customers and users?" This speaks to the question of relevance, which Peter J. Ashendon mentions in What Makes a Good Standard?. There should be a clear business purpose that standards address. Ron Knode suggests that we should seek cloud standards that lead us to "trusted cloud computing" and Jim St.Clair points out a number of activities already underway.
In examining successful modern technical standards, the degree of adoption has been a critical tipping point. HP CTO Paul Congdon coined the term "thud factor" to describe the sheer amount of documentation needed to specify a standard. If a standard is difficult to understand, adoption is unlikely, or it will not be implemented fully, leading to non-standard subsets. The W3C also makes the case that "learnability" is important. As several sources note, an open process is likely to help reduce complexity.
Cloud computing comprises many technologies, and there is likely a need for standards that address different aspects of cloud implementations. For example, portability and interoperability standards have very different implications for IaaS than for PaaS or SaaS. Successful standards committees will need to narrow the scope of specific standards while looking for opportunities to have a positive impact on both consumers and providers.
In summary, cloud computing standards should be open, simple, interoperable, and relevant to business needs as well as targeted for a given application. They should enable innovation and the process by which they are developed should support community-based participation.
For further information please contact Marie Francesca at cloudbloggers-list@lists.mitre.org.
Posted: August 5, 2010 5:03 pm.