
Applying MaaS to the DaaS (Database as a Service) Contract: An Introduction to the Practice

The Cloud offers a great opportunity to manage highly available and scalable databases while reducing cost, time, and risk. We have already introduced [4] how the DaaS life cycle helps in applying best practices when migrating to the Cloud or administering day-to-day Cloud activities. Taking into consideration the risks associated with Cloud contracts, we introduce here a set of best practices that assist organizations in defining the best possible DaaS agreement. Best practices help define the regulatory controls that determine when and how applications can be deployed in the Cloud. This matters because Cloud computing platforms are made up not only of components from a variety of vendors but also of a variety of legal jurisdictions (countries, politics, risk management, and compliance).

MaaS applied to drawing up the DaaS contract (and to controlling the services)

Applying MaaS can help manage data storage by using location constraints to check where your data is deployed and how it is implemented. Such constraints need to be clearly defined in the contract; persistence and dependencies have to match what is classified (and regularly updated) in the data model, in order to standardize the platform technologies that underpin the service provided. The main obligations that must be stipulated in the DaaS contract are the following:

1. Integrity defined at the model level has to be maintained throughout the service. Monitoring driven by the data model, for example, has to match what is defined in the initial data structure and is classified in the same way;

2. Country location has to be defined in the model partition and regularly monitored and compared. Any mismatch is an infringement of the agreement and must be reconciled with the terms outlined in the SLA;

3. Include and specify the international regulations that both the Provider and the Vendor are responsible for during the service life cycle. In particular, highlight directives containing data-breach rules. Provider and Vendor are protected, although any violation incurs a service penalty, and the data owner must notify both Provider and Vendor in case of a breach;

4. Specify location properties, and not only in terms of country. The site housing machines, racks, and so on has to be an appropriate one (floor load per square meter, fire safety, flood protection, employee privileges, and security service personnel);

5. Identify trust boundaries throughout the IT architecture. Data models and partitions are the right way to define trust boundaries and stewardship to prevent unauthorized access and sharing;

6. Include the method used to encrypt data transmitted across the network. If different encryption is used by the Provider/Vendor, specify what is to be used and when. The contract has to include how encryption is run on multi-tenant storage. List the rules concerning key management;

7. When data has to be deleted, specify that data retention and deletion are the responsibility of the Provider. Based on the data model mapping, data has to be destroyed in all defined and monitored locations. The Provider has to specify whether data has, for any reason, been copied to different media and subsequently shredded. The contract must include a provision for the customer to request an audit in order to certify that data has been deleted. This is strategic because it satisfies two important clauses:

7.1) Service Closure: the Provider should not be able to terminate the service at its convenience. Mergers, acquisitions, and other unpredictable events cannot stop the service (clause of irrevocable guarantee of continuous service). In case the service has to be shut down, the Provider has the obligation to retain the data (and services) for an acceptable period of time and to migrate them to the new provider at no cost. Of course, data retention and unrecoverable deletion after the migration remain the responsibility of the Provider;

7.2) Right to Closure: in case the contract’s clauses are not respected (value proposition violated, extra-charged upgrades, infrastructure maintenance without appropriate assistance, services not rendered adequately, location security out of order, and so on), the customer should be able to terminate the contract without penalties. Again, the Provider has the obligation to retain the data (and services) for an acceptable period of time and then to migrate them to the new provider.

8. Models are key to ensuring that logical data segregation and control remain effective after backup and recovery, test, and compare operations are completed. Include in the contract that a data model must be used to define the data architecture throughout the data life cycle. MaaS preserves the right to audit and to test that all agreed clauses are honored: the data models keep them in scope.
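To make the location obligation (item 2 above) concrete, the country constraints declared in the data model can be compared programmatically against the regions the provider actually reports. The sketch below is a minimal illustration; the partition names, the `allowed_countries` attribute, and the shape of the provider report are all hypothetical assumptions, not part of any specific MaaS tool.

```python
# Hypothetical sketch: detect SLA mismatches between model-declared
# country constraints and the provider-reported deployment locations.

# Location constraints as classified in the data model (assumed shape).
model_constraints = {
    "customer_data": {"allowed_countries": {"DE", "FR"}},
    "billing_data": {"allowed_countries": {"DE"}},
}

# Locations reported by the provider's monitoring (assumed shape).
deployed_locations = {
    "customer_data": {"DE", "IE"},
    "billing_data": {"DE"},
}

def find_violations(constraints, deployments):
    """Return partitions stored outside their model-declared countries."""
    violations = {}
    for partition, rules in constraints.items():
        actual = deployments.get(partition, set())
        out_of_bounds = actual - rules["allowed_countries"]
        if out_of_bounds:
            violations[partition] = sorted(out_of_bounds)
    return violations

print(find_violations(model_constraints, deployed_locations))
# {'customer_data': ['IE']} -- a mismatch to reconcile against the SLA
```

Any non-empty result is exactly the kind of infringement the contract should tie to the reconciliation terms of the SLA.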

Although the best practices introduced above are helpful guidelines for defining DaaS contracts, negotiating the contractual clauses of your Cloud agreement is the first constraint. Ensure that all standard functionality is guaranteed, and that special measures are taken to secure data and services both in transit from/to the Provider and during storage:

1) Enforce and ensure security compliance with the ISO 27001/27002 guidelines. Schedule vulnerability assessments and maintain regular, real-time visibility into data and applications. MaaS can define “on premise” how multi-tenancy is handled in the provider’s infrastructure and applications. Models map the service requirements onto a given infrastructure; compliance officers then have to periodically verify the requirements assessment and its outcomes across the infrastructure;

2) Apply SSL and IPSec constraints to secure data movement into the data center. Perimeter protection is essential to prevent denial-of-service threats;

3) Consider and include VLAN and VPN rules to secure data movement from/to the data center;

4) Include full disclosure. The Provider’s employees and data administrators have to be certified against regulatory and compliance obligations. ISO 27001/27002 have to be the Provider’s standards (extended to its employees) with regard to privacy and data residency. Always include in the contract who is responsible for establishing the compliance policy.
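The encrypted-transit requirement in item 2 can be illustrated with a few lines of configuration. This is a minimal sketch using Python's standard `ssl` module, showing only how a client context is hardened before any connection is made; it is not a full data-transfer implementation, and the choice of TLS 1.2 as the floor is an assumption the contract would need to state explicitly.

```python
# Minimal sketch: a TLS client context that refuses unverified or
# legacy-protocol connections, the baseline for data in transit.
import ssl

context = ssl.create_default_context()

# Reject protocol versions older than TLS 1.2 (assumed contractual floor).
context.minimum_version = ssl.TLSVersion.TLSv1_2

# create_default_context() already enables the checks a DaaS contract
# should mandate: certificate validation and hostname verification.
print(context.verify_mode == ssl.CERT_REQUIRED)   # True
print(context.check_hostname)                     # True
```

A context configured this way would then be passed to the transport layer (e.g. an HTTPS client) so that every connection to the Provider is both encrypted and authenticated.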

Conclusion

MaaS is the “compass” for defining, on premise, DaaS (Database as a Service) properties such as security scope, DB partitioning and scaling, multi-tenancy, and geo-location, so that all required assets can be defined “early”. Moreover, models increase the efficiency of defining, updating, and sharing data models and database designs. In other words, models provide continuity with the databases’ structure, extending to the Cloud preconfigured levels of security, compliance, and whatever has been registered inside the data models.

References
[1] N. Piscopo – ERwin® in the Cloud: How Data Modeling Supports Database as a Service (DaaS) Implementations
[2] N. Piscopo – CA ERwin® Data Modeler’s Role in the Relational Cloud
[3] D. Burbank, S. Hoberman – Data Modeling Made Simple with CA ERwin® Data Modeler r8
[4] N. Piscopo – Best Practices for Moving to the Cloud using Data Models in the DaaS Life Cycle
[5] N. Piscopo – Using CA ERwin® Data Modeler and Microsoft SQL Azure to Move Data to the Cloud within the DaaS Life Cycle
[6] R. Livingstone – Four Barriers to Cloud Due Diligence
[7] N. Piscopo – MaaS (Model as a Service) is the emerging solution to design, map, integrate and publish Open Data http://cloudbestpractices.net/2012/10/21/maas/
[8] N. Piscopo – MaaS Workshop, Awareness, Courses Syllabus
[9] N. Piscopo – DaaS Workshop, Awareness, Courses Syllabus
[10] N. Piscopo – DaaS Contract templates: main constraints and examples, in press


More Stories By Cloud Best Practices Network

The Cloud Best Practices Network is an expert community of leading Cloud pioneers. Follow our best practice blogs at http://CloudBestPractices.net
