
BEA, IBM, Oracle, SAP, IONA, Siebel and Sybase Announce "Service Component Architecture" Specification

SCA Spec – "A Deployment Descriptor On Steroids" – Has Been Specifically Designed for SOA

BEA, IBM, Oracle, SAP, IONA, Siebel, and Sybase today announced their support for a new specification for building and packaging applications, called Service Component Architecture (SCA), that allows developers to focus on writing business logic.

More directly, SCA is "a deployment descriptor on steroids" that works with any language, not just Java: procedural languages as well as declarative languages like BPEL and XSLT. The key difference is its notion of declarative policies for concerns such as security, transactions, and reliable messaging.
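
To make that concrete, compare business logic written directly against a middleware API with the same logic under an externally declared policy. This is a sketch of the idea only, not code from the spec: UserTransaction is the standard JTA interface, while the class, methods, and Account type are invented for illustration.

```java
import javax.naming.InitialContext;
import javax.transaction.UserTransaction;

public class FundsTransfer {

    // Without declarative policy: transaction demarcation is hand-coded
    // against JTA, entangling middleware concerns with business logic.
    public void transferWithApi(Account from, Account to, double amount) throws Exception {
        UserTransaction tx = (UserTransaction)
                new InitialContext().lookup("java:comp/UserTransaction");
        tx.begin();
        try {
            from.debit(amount);
            to.credit(amount);
            tx.commit();
        } catch (Exception e) {
            tx.rollback();
            throw e;
        }
    }

    // SCA style: the method is pure business logic; the requirement that it
    // run inside a transaction is declared in the deployment descriptor.
    public void transfer(Account from, Account to, double amount) {
        from.debit(amount);
        to.credit(amount);
    }
}

// Invented helper type so the sketch compiles on its own.
class Account {
    private double balance;
    void debit(double amount)  { balance -= amount; }
    void credit(double amount) { balance += amount; }
}
```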


One thing that sets SCA apart is that it has been designed for SOA, unlike other systems such as J2EE, which have been adapted to it. It focuses on describing assemblies of components that may be written in a variety of programming models and connected over a variety of protocols.

The goal of SCA is to make building applications easier. It allows the development of application assemblies without regard to specific middleware APIs or languages.

At the heart of SCA is the notion of a service and its related implementation. A service is defined by an interface comprising a set of operations. A service implementation can in turn use other services; these dependencies are called references. A service may also have one or more properties: data values that can be configured externally.
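
A minimal Java sketch of these three notions follows; the loan-approval domain, the names, and the setter-injection style are all invented for illustration, not taken from the spec:

```java
// The service: defined by an interface with a set of operations.
public interface LoanService {
    boolean approve(String customerId, double amount);
}

// A second service that the implementation depends on.
interface CreditCheckService {
    int scoreFor(String customerId);
}

// The implementation: it exposes LoanService, holds a *reference* to
// CreditCheckService, and has a *property* (minimumScore) that can be
// configured externally by the deployment descriptor.
class LoanServiceImpl implements LoanService {

    private CreditCheckService creditCheck; // reference, wired by the container
    private int minimumScore;               // property, set from configuration

    public void setCreditCheck(CreditCheckService creditCheck) {
        this.creditCheck = creditCheck;
    }

    public void setMinimumScore(int minimumScore) {
        this.minimumScore = minimumScore;
    }

    public boolean approve(String customerId, double amount) {
        return creditCheck.scoreFor(customerId) >= minimumScore;
    }
}
```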

A key actor in the SCA universe is the Service Data Object (SDO). SDOs represent business data, including the parameters and return values of service invocations, and provide a way to represent data as it travels across a service network. Note that XMLBeans and other data-binding technologies can be used as well.
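
SDO exposes data through a dynamic, path-based API rather than a generated class per type. The sketch below assumes the commonj.sdo interfaces published with SDO 2.0, and it assumes a Customer type has already been registered with the runtime (for example, from an XML schema); the type URI and property names are invented:

```java
import commonj.sdo.DataObject;
import commonj.sdo.helper.DataFactory;

public class CustomerData {
    public static void main(String[] args) {
        // Create an instance of a (hypothetical) previously registered type.
        DataObject customer = DataFactory.INSTANCE.create(
                "http://example.com/customers", "Customer");

        // Dynamic, path-based access: the same data graph can travel across
        // a service network without every node sharing generated classes.
        customer.setString("name", "Ada Lovelace");
        customer.setInt("creditScore", 720);

        System.out.println(customer.getString("name"));
    }
}
```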

SCA components are composed into assemblies: service-level applications that are collections of services connected together and appropriately configured. An SCA assembly operates at two levels: a set of loosely coupled components connected within a system, and a set of loosely coupled components connected within a module. The distinction between the two is (roughly) that a module is a collection of components, while a system is a collection of modules. A system corresponds to "programming in the large," or megaprogramming, while a module is "programming in the small," akin to building one of today's typical applications.
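
As a loose structural analogy (the component and module names below are invented, and real SCA composition is declarative rather than class-based), the two levels can be pictured like this:

```java
// "Programming in the small": a module composes closely related components.
class BillingModule {
    final InvoiceComponent invoices = new InvoiceComponent();
    final TaxComponent tax = new TaxComponent();
}

// "Programming in the large": a system composes whole modules, each of
// which may come from a different team, language, or container.
class OrderSystem {
    final BillingModule billing = new BillingModule();
    final ShippingModule shipping = new ShippingModule();
}

// Stub types so the sketch compiles on its own.
class InvoiceComponent {}
class TaxComponent {}
class ShippingModule {}
```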

This wiring of components to the services they depend on is how service networks are "assembled." Assembly is nothing new; it has been pioneered in a number of technologies and frameworks such as CORBA, J2EE, ATG Dynamo, and Spring. These technologies have shown that assembly provides important benefits (some of them do a better job than others), such as easier iterative development and business logic that does not depend on middleware containers.
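
What the assembly layer buys can be seen by writing the wiring by hand. The following self-contained sketch (all names invented) shows exactly the boilerplate that an SCA assembly, or a framework like Spring, moves out of code and into declarative configuration:

```java
public class ManualAssembly {

    interface CreditCheckService { int scoreFor(String customerId); }

    static class LoanComponent {
        private CreditCheckService creditCheck; // a reference to another service
        private int minimumScore;               // an externally configurable property

        void setCreditCheck(CreditCheckService c) { this.creditCheck = c; }
        void setMinimumScore(int s) { this.minimumScore = s; }

        boolean approve(String customerId) {
            return creditCheck.scoreFor(customerId) >= minimumScore;
        }
    }

    public static void main(String[] args) {
        // Hand-wiring: construct each component, connect its references,
        // and set its properties, all in code.
        LoanComponent loans = new LoanComponent();
        loans.setCreditCheck(customerId -> 700); // stub credit scorer
        loans.setMinimumScore(650);

        System.out.println(loans.approve("c-42")); // prints: true
    }
}
```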

SCA uses the concept of assembly to solve key problems presented by SOA development, including:

  1. Separation of business logic from underlying infrastructure, qualities of service, and transport
  2. Linking programming in the small with programming in the large
  3. Providing a unified way to move between architectural design, coding, and operational deployment, whether working top-down or bottom-up.

It should be noted that companies such as IBM and BEA have been shipping pieces of SCA, like SDO, for some time.

Why does SCA matter?

SCA matters because it is the first technology to promise a compositional model that can enable the Service Network and support the building of the next generation of service-oriented applications.

Each innovation in this field introduced a new layer of abstraction that made new tiers of applications possible. C allowed applications to be built that could not have been built in assembler, or whose effort would have been prohibitive. C++ allowed things that could not be built in C, and Java allowed things not possible in C++. All of these are progenitors of SCA, and SCA appears to be part of a promising future for building large-scale enterprise composite applications.





Most Recent Comments
Denis 01/30/06 11:29:52 PM EST

Killer article!!! Congratulations. We do need a higher level of abstraction to make the task of assembling Composite Applications more productive.

Joseph Jonas 12/08/05 10:29:06 AM EST

Faint, useless attempt to gain some attention in the SOA space. I do not see the point, nor the correlation to SOA; I think IBM themselves cannot... Bringing some old concepts into a new light and trying to create confusion around SOA where they have less to say...

Javier Cámara 12/01/05 11:41:13 AM EST

I do not see the point of SCA/SDO at all. All I see is technology from IBM, and not new at all:

- I have read developerWorks articles about the "IBM SOA Programming Model" from at least June 2005

- The SDO is yet another Java data abstraction layer from IBM dating back at least to September 2003. It seems to also have been submitted as JSR 235 by IBM and BEA... in December 2003. And, at any rate, I have never seen its relationship to SOA, or services in general.

- The SCA seems to be something quite tied to WebSphere. Besides, it sounds like it has a huge overlap with JBI.

- BEA has something called "CommonJ" in which it talks about these things... dating back to at least March 2005

So for me it seems like old IBM stuff that collides with many other things and whose net effect is to blur the landscape a little more.

Michael O'Connell 12/01/05 03:33:09 AM EST

The new Service Component Architecture (SCA 0.9) and updated Service Data Objects (SDO 2.01) specs (and related docs and tools) are now available at IBM developerWorks. The direct URL: http://www.ibm.com/developerworks/library/specification/ws-scasdosumm/

