
Machine Learning - Azure vs AWS

The importance of machine learning

Machine Learning, the process of predicting future patterns and incidents from models built on past data, is arguably the most important ingredient in the success of the Internet of Things in both the enterprise and consumer space. Without machine learning, the entire backbone of the Internet of Things - event acquisition, event processing, event storage and event reporting - is merely a live display of events happening elsewhere and provides little value to its consumers. Think of a smart monitor on an oil well that tracks climatic conditions and other factors that can cause a failure; unless the monitor can predict a failure and trigger corrective action, the usefulness of such a solution is quite limited.

MLPaaS - Azure vs AWS
In that context, Machine Learning Platform as a Service (MLPaaS) has become a major component of the major cloud platforms. Both Azure and AWS offer equivalent services; the comparison below looks at the major building blocks of a machine learning service and how each cloud provider handles them.

For each machine learning component below, the capabilities of Azure and of Amazon AWS are summarized in turn.

Training Data Enablement: Since machine learning falls into the two major categories of supervised learning and unsupervised learning, proper training data is one of the most important aspects of a successful machine learning experiment, and how well an MLPaaS facilitates the availability and usage of training data is a key factor.

Azure: Azure ML has extensive options for data input and manipulation. Data sources can be Hive, Azure SQL, Blob Storage or web-based data feeds, and data can even be entered manually.

Input data from a source can rarely be used directly as training data, so Azure ML offers an array of transformation functions such as Filter, Data Manipulation, Split and Reduce.

With effective use of these options, Azure ML provides an effective means of integrating training data into the machine learning process.
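As a minimal sketch, assuming the azureml Python client library for Azure ML Studio and placeholder workspace credentials and dataset names (none of which come from the article), training data already registered in a workspace could be pulled into Python for inspection before being wired into an experiment:

```python
# Sketch: read a dataset registered in an Azure ML Studio workspace.
# Assumes the "azureml" Python client library; the workspace ID, token and
# dataset name are placeholders.
from azureml import Workspace

ws = Workspace(
    workspace_id="<workspace-id>",        # placeholder
    authorization_token="<auth-token>",   # placeholder
)

# Datasets uploaded to or generated in the workspace are addressable by name.
frame = ws.datasets["training-data.csv"].to_dataframe()
print(frame.head())
```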

Amazon AWS: AWS Machine Learning also supports multiple data sources within its ecosystem.

Amazon Simple Storage Service (Amazon S3) is the storage service of the AWS cloud platform, and Amazon ML uses Amazon S3 as its primary data repository.

Amazon ML also allows you to create a datasource object from data residing in Amazon Redshift, the Data Warehouse Platform as a Service, or from data stored in a MySQL database in Amazon Relational Database Service (Amazon RDS).

In addition, Amazon ML provides a rich set of data transformation functions, such as the N-gram transformation, the Orthogonal Sparse Bigram transformation and more.
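As an illustrative sketch using boto3 (the bucket, schema, attribute names and IDs below are placeholders, not values from the article), a CSV file in S3 can be registered as an Amazon ML datasource:

```python
# Sketch: register a CSV in S3 as an Amazon ML datasource (boto3).
# The bucket, schema, attribute names and IDs are placeholders.
import json
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

schema = {
    "version": "1.0",
    "targetAttributeName": "failure",
    "dataFormat": "CSV",
    "dataFileContainsHeader": True,
    "attributes": [
        {"attributeName": "temperature", "attributeType": "NUMERIC"},
        {"attributeName": "pressure", "attributeType": "NUMERIC"},
        {"attributeName": "failure", "attributeType": "BINARY"},
    ],
}

ml.create_data_source_from_s3(
    DataSourceId="ds-sensor-training",
    DataSourceName="Sensor training data",
    DataSpec={
        "DataLocationS3": "s3://my-bucket/sensor-training.csv",
        "DataSchema": json.dumps(schema),
    },
    ComputeStatistics=True,  # statistics are needed before training a model
)
```

Transformations such as ngram or the Orthogonal Sparse Bigram are then applied through the model's recipe rather than on the datasource itself.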

Support for the Machine Learning Life Cycle: Developing and consuming a machine learning model for an enterprise use case is an ecosystem in itself. Multiple players - data scientists, data analysts, ETL developers, visualization engineers and business users - are involved, and each plays an important role. Hence any machine learning service should support this life cycle as a workflow.

Azure: One of the key success factors of Azure ML is the positioning of Azure ML Studio, whose user-friendly graphical interface and supporting workflows make the machine learning process highly collaborative and interactive.

The concept of a Workspace nicely allows for separation of duties as well as seamless integration with the rest of the Azure ecosystem, such as storage. Typically a data scientist first creates models and trains them with various combinations of parameters and data, and rich visualization features help the data scientist test the results easily.

Once a model is trained successfully, Azure provides easy options to create a scoring experiment, which can ultimately be published as a web service to be consumed by client applications.

Amazon AWS: The graphical interface of Amazon ML provides a very similar experience and feature set for creating and training models.

While there is no separation between a training and a scoring experiment, Amazon ML provides plenty of options for model evaluation and interpretation.

When an ML model is evaluated, Amazon ML provides an industry-standard metric and a number of insights to review the predictive accuracy of the model.
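As a hedged sketch of that evaluation step in boto3 (the model and datasource IDs are placeholders), an evaluation is created against a held-out datasource and, for a binary model, the returned industry-standard metric is the area under the ROC curve:

```python
# Sketch: evaluate a trained Amazon ML model and read back its metric (boto3).
# The model and datasource IDs are placeholders.
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

ml.create_evaluation(
    EvaluationId="ev-sensor-model",
    EvaluationName="Sensor model evaluation",
    MLModelId="ml-sensor-model",
    EvaluationDataSourceId="ds-sensor-evaluation",
)

# Once the evaluation is COMPLETED, PerformanceMetrics holds the metric
# (AUC for binary classification models).
result = ml.get_evaluation(EvaluationId="ev-sensor-model")
print(result["PerformanceMetrics"]["Properties"])
```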

Algorithm Support: This is probably the most important piece in evaluating a machine learning service, as different algorithms apply to different situations.

Almost all machine learning solutions fall under three major categories - clustering, classification and regression - depending on whether supervised or unsupervised machine learning is needed. The real challenge, however, is choosing the particular algorithm that suits each of these three analysis categories.

Azure: Azure Machine Learning supports a whole array of algorithms - Decision Trees, Logistic Regression, Bayes Point Machine, Neural Networks and K-Means, to name just a few.

One important aspect of Azure Machine Learning is the democratization of these advanced algorithms: even without programming knowledge of machine learning languages like R, they can be deployed effectively for a given use case.

Amazon AWS: Amazon ML supports three types of ML models: binary classification, multiclass classification, and regression.

As the name indicates, binary classification is used to predict one of two possible outcomes. Multiclass classification is used to predict one of three or more possible outcomes. Regression is used to predict a continuous numeric value.

However, per the documentation there does not seem to be an option within Amazon ML to select an individual algorithm, such as K-Means, as part of building and evaluating a model.
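A brief sketch of how the model type is chosen in Amazon ML via boto3 (the IDs are placeholders); the algorithm is implied by MLModelType rather than selected explicitly:

```python
# Sketch: create a binary classification model in Amazon ML (boto3).
# Amazon ML infers the learning algorithm from MLModelType; there is no
# per-algorithm choice such as K-Means. IDs below are placeholders.
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

ml.create_ml_model(
    MLModelId="ml-sensor-model",
    MLModelName="Sensor failure predictor",
    MLModelType="BINARY",              # or "MULTICLASS" / "REGRESSION"
    TrainingDataSourceId="ds-sensor-training",
)
```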

Consumer Applications: Once a model is trained, it has to be put into practice, and the most natural usage is for the results of machine learning to be consumed by client applications - in today's context, mostly mobile-based consumers. So a robust machine learning service should support multiple consumer applications too.

Azure: Azure Machine Learning provides ready-to-go client-side code for the web services that are published. It supports clients for both the request/response model and batch execution, and produces sample client-side code in C#, Python and R. It also provides an easy interface for testing the request and response parameters. For batch execution, Azure Machine Learning provides APIs for submitting and starting a job, with sample code again available in C#, Python and R. With this, Azure Machine Learning offers excellent support for developing client-side applications.
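As a minimal request/response sketch (the endpoint URL, API key and column names are placeholders), a published Azure ML web service is called over HTTPS with the API key in the Authorization header:

```python
# Sketch: call a published Azure ML Studio web service (request/response mode).
# The endpoint URL, API key and input columns are placeholders.
import json
import requests

url = ("https://ussouthcentral.services.azureml.net/workspaces/<workspace-id>"
       "/services/<service-id>/execute?api-version=2.0")
api_key = "<api-key>"

payload = {
    "Inputs": {
        "input1": {
            "ColumnNames": ["temperature", "pressure"],
            "Values": [["98.6", "1.2"]],
        }
    },
    "GlobalParameters": {},
}

response = requests.post(
    url,
    headers={
        "Authorization": "Bearer " + api_key,
        "Content-Type": "application/json",
    },
    data=json.dumps(payload),
)
response.raise_for_status()
print(response.json()["Results"])
```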

Amazon AWS: Amazon supports both batch predictions and real-time predictions, with an API for each of these tasks.

The Amazon ML API has batch prediction operations such as Create, Update and Delete, which can be used to build batch applications. Similarly, real-time prediction API samples are available for platforms such as Java, Python and Scala.
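A combined sketch of both consumption modes in boto3 (the IDs, S3 output location and input record below are placeholders):

```python
# Sketch: consume an Amazon ML model in batch and in real time (boto3).
# IDs, S3 locations and the input record are placeholders.
import boto3

ml = boto3.client("machinelearning", region_name="us-east-1")

# Batch: score a whole unlabeled datasource and write the results to S3.
ml.create_batch_prediction(
    BatchPredictionId="bp-sensor-scoring",
    BatchPredictionName="Nightly sensor scoring",
    MLModelId="ml-sensor-model",
    BatchPredictionDataSourceId="ds-sensor-unlabeled",
    OutputUri="s3://my-bucket/predictions/",
)

# Real time: create an endpoint once (it takes a few minutes to become READY),
# then score individual records on demand.
endpoint = ml.create_realtime_endpoint(MLModelId="ml-sensor-model")
prediction = ml.predict(
    MLModelId="ml-sensor-model",
    Record={"temperature": "98.6", "pressure": "1.2"},
    PredictEndpoint=endpoint["RealtimeEndpointInfo"]["EndpointUrl"],
)
print(prediction["Prediction"])
```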

Pricing aspects are not discussed in this comparison because PaaS offerings like machine learning are charged per usage - either per prediction or per prediction hour - and enterprises typically worry more about the capabilities of the platform when choosing a machine learning service.

Also, without significant machine learning case studies we cannot comment in depth on the algorithms and their support; however, a higher-level view indicates that Azure Machine Learning supports more algorithms and allows an individual choice of algorithm within a category such as clustering or classification, which may be of interest to seasoned data scientists. Many data scientists also predict that the future of machine learning lies in unsupervised learning, which has good support in Azure in the form of clustering algorithms, especially K-Means.

More Stories By Srinivasan Sundara Rajan

Srinivasan is highly passionate about utilizing digital technologies to enable the next-generation enterprise, and believes in enterprise transformation through the natives (Cloud Native & Mobile Native).
