
COTS Cloud security reference design and related NIST workshop

By Bob Gourley

Since the beginning of the modern Cloud movement (which we trace to November 2006; see here if you want to know why), technologists have been seeking ways to mitigate key risks. At the top of our list:

1) The increased risk due to multi-tenancy

2) The mission need for availability (including an always-available path to resources)

3) New and at times nuanced challenges regarding data confidentiality

4) New challenges regarding data integrity

There are many other policy-related risks that planners must consider, including how to establish the best user authentication methods and how to ensure compliance with the regulations and laws of the geography that holds the data. But for a technologist, the four above are a continual concern, and mitigating them makes the remaining concerns much easier to address.

That is why we read with such great pleasure a recent announcement that NIST is continuing to work with industry to advance cloud security. The NIST National Cybersecurity Center of Excellence (NCCoE) in Rockville, MD is a focal point for many great industry/government interactions, including a January 14 workshop at their facility that we are especially excited about.

This workshop is on the topic of Trusted Geolocation in the Cloud. It covers a proof-of-concept implementation built on what has proven to be the most scalable technology on the globe: Intel processors. Technologists from Intel, EMC-RSA, NIST, and the NCCoE will present and discuss these developments. This will be a great workshop that includes hands-on demonstrations of the technology, and we believe it will show ways to help mitigate all four of the challenges we list above.

Following the workshop, the NCCoE will host a two-day cloud computing event (details can be found here).

From the workshop flyer:

An upcoming workshop will be held at the NIST National Cybersecurity Center of Excellence (NCCoE) facility in Rockville, MD on Monday, January 14th: Trusted Geolocation in the Cloud: Proof of Concept Implementation.

NIST and private industry will deliver a very interesting workshop for a technical audience next week on Monday the 14th, covering a cloud use case that addresses the security challenges of Infrastructure as a Service (IaaS) cloud computing technologies and geolocation.

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.
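To make this concrete, here is a minimal sketch of what "accessed using secure protocols to assert the integrity of the platform" might look like from a verifier's point of view. This is our own illustration, not code or field names from the NISTIR: a real hardware root of trust would rely on TPM-based remote attestation rather than the simple shared-key HMAC shown here.

```python
import hashlib
import hmac
import json

# Illustrative only: the report fields, the shared key, and the HMAC scheme
# are simplifying assumptions standing in for TPM-based attestation.

def make_report(platform_hash, geotag, key):
    """Host side: sign a report binding the platform measurement to a geotag."""
    payload = json.dumps({"platform": platform_hash, "geo": geotag}, sort_keys=True)
    sig = hmac.new(key, payload.encode(), hashlib.sha256).hexdigest()
    return {"payload": payload, "sig": sig}

def verify_report(report, key):
    """Verifier side: return the report contents only if the signature checks out."""
    expected = hmac.new(key, report["payload"].encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(expected, report["sig"]):
        return None  # tampered report: do not trust its claimed location
    return json.loads(report["payload"])
```

If the signature fails, the host's claimed location is simply ignored, which is what lets a geolocation restriction be enforced automatically rather than by manual audit.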

At the heart of the solution is a reference design built from commercial off-the-shelf (COTS) products provided by Intel, VMware, and RSA Archer. The use case is of significant relevance to US Federal agencies in solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

NIST now moves in conjunction with private industry in a workshop specific to this research that explains and details how to implement this trusted cloud solution on January 14th at the NIST National Cybersecurity Center of Excellence (NCCoE).

Audience 

This workshop and IR document have been created for security researchers, cloud computing practitioners, system integrators, and other parties interested in techniques for solving the security problem in question: improving the security of virtualized infrastructure cloud computing technologies by enforcing geolocation restrictions.

Agenda:

2:00 PM – 2:15 PM: NCCoE Introduction (NIST)
2:15 PM – 2:30 PM: Trusted Cloud Description (NIST)
2:30 PM – 2:45 PM: Trusted Geolocation in the Cloud Implementation: Trusted Measurement and Remote Attestation (Intel Corporation)
2:45 PM – 3:00 PM: Trusted Geolocation in the Cloud: Monitoring of Measurements in a Governance, Risk, and Compliance Dashboard (EMC-RSA)
3:00 PM – 3:15 PM: Trusted Cloud Demonstration (Intel, EMC-RSA, and NIST)
3:15 PM – 4:00 PM: Questions and Answers / Hands-on Session (Intel, EMC-RSA, and NIST)

Participation from all parties is welcome. To register for this workshop, please send an email with the attendee's name, affiliation, and email address in the body of the message to [email protected], with the subject "Trusted Location in the cloud," by January 13, 2013.

This workshop is now part of their Big Data and Cloud Computing Workshop to be held at the NIST HQ in Gaithersburg, MD on January 15-17. http://www.nist.gov/itl/cloud/cloudbdworkshop.cfm

The importance of this secure cloud computing proof of concept can be seen in the NIST draft publication linked below, which details this reference design and clearly delineates how to stand up this secure cloud structure. The NIST Interagency Report (NISTIR) is a public/private collaboration with co-authors from both NIST and private industry and is now taking public comments: http://csrc.nist.gov/publications/drafts/ir7904/draft_nistir_7904.pdf

____________________________________________________________________________________

Background Information taken from NISTIR 7904:

Shared cloud computing technologies are designed to be very agile and flexible, transparently using whatever resources are available to process workloads for their customers. However, there are security and privacy concerns with allowing unrestricted workload migration. Whenever multiple workloads are present on a single cloud server, there is a need to segregate those workloads from each other so that they do not interfere with each other, gain access to each other’s sensitive data, or otherwise compromise the security or privacy of the workloads. Imagine two rival companies with workloads on the same server; each company would want to ensure that the server can be trusted to protect their information from the other company.

Another concern with shared cloud computing is that workloads could move from cloud servers located in one country to servers located in another country. Each country has its own laws for data security, privacy, and other aspects of information technology (IT). Because the requirements of these laws may conflict with an organization’s policies or mandates (e.g., laws, regulations), an organization may decide that it needs to restrict which cloud servers it uses based on their location. A common desire is to only use cloud servers physically located within the same country as the organization. Determining the approximate physical location of an object, such as a cloud computing server, is generally known as geolocation. Geolocation can be accomplished in many ways, with varying degrees of accuracy, but traditional geolocation methods are not secured and they are enforced through management and operational controls that cannot be automated and scaled, and therefore traditional geolocation methods cannot be trusted to meet cloud security needs.
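Once a host's location can be attested rather than merely asserted, the restriction itself becomes an automatable placement check. The sketch below is purely our illustration (the host records and field names are invented, not any real scheduler's API): a workload is admitted only onto hosts whose attested country satisfies its policy.

```python
# Illustrative sketch: admit a workload only onto hosts whose *attested*
# country satisfies its policy. Hosts lacking a trusted attestation are
# excluded outright, never given the benefit of the doubt.

def eligible_hosts(hosts, allowed_countries):
    """Return the hosts on which a workload may be placed."""
    return [
        h for h in hosts
        if h.get("attested") and h.get("country") in allowed_countries
    ]
```

A workload restricted to servers in its own country would then only ever migrate among the hosts this filter returns, which is exactly the automated, scalable enforcement traditional geolocation methods lack.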

The motivation behind this use case is to improve the security of cloud computing and accelerate the adoption of cloud computing technologies by establishing an automated hardware root of trust method for enforcing and monitoring geolocation restrictions for cloud servers. A hardware root of trust is an inherently trusted combination of hardware and firmware that maintains the integrity of the geolocation information and the platform. The hardware root of trust is seeded by the organization, with the host’s unique identifier and platform metadata stored in tamperproof hardware. This information is accessed using secure protocols to assert the integrity of the platform and confirm the location of the host.
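The seeding step described above can be pictured as a lookup against organization-maintained known-good values: if what a host reports at attestation time does not match what was provisioned into its tamperproof hardware, the host is not trusted. Again, this is a hedged sketch with invented identifiers and field names, not the NISTIR design.

```python
# Illustrative: known-good values recorded by the organization at seeding
# time. Host IDs, measurements, and geotags here are invented placeholders.
KNOWN_GOOD = {
    "host-42": {"measurement": "sha256:a1b2c3", "geotag": "US-MD"},
}

def attest(host_id, reported_measurement, reported_geotag):
    """Trust a host only if its reported values match the seeded record."""
    record = KNOWN_GOOD.get(host_id)
    if record is None:
        return False  # unknown host: never trusted
    return (record["measurement"] == reported_measurement
            and record["geotag"] == reported_geotag)
```

The design choice worth noting is fail-closed behavior: an unknown host or a mismatched measurement yields no access at all, rather than a warning for a human to review later.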


For media interviews and comments, please contact:

Kevin Fiftal

Intel Corporation

 


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
