RE: What makes a data component good for standardizing?

Roger, your example is weak.  If the name is not useful outside its
context, then neither is the address.  Is it a shipping address? Home
address? Birth place? Current residence? Future residence? For sale? For
rent? Target? Party house?  Yes, you can put it in your contact list,
but you can do that with the name as well, and with just about as much
value.

If I understand him correctly, I agree with Len: what is or isn't worth
standardizing is negotiated with at least a surrogate for the
anticipated but as-yet-unknown user.  Who buys after-market motorcycle
parts?  You might not yet know the person, but you know the role they
play in the transaction you build into your system, based on the
established market dynamics that you've studied carefully and documented
fully for your SBA loan approval.  That's the context you use to decide
what to standardize.  Without that, you don't even have a market, and
therefore no competitors or related services, so why bother
standardizing anything when there aren't any other business services to
interoperate with?

Some data clumps have a context larger than a specific market segment,
such as name and address and credit card info.  They represent the
social infrastructure that makes interoperability in a specific market
segment possible.  A particular user (role player) might be
unanticipated, but the role they play is not.

The value in standardization is in reducing friction in a transaction,
which in a market is measured by changes in profit.  In an academic
context, standardization might mean reducing effort in getting
published, or the cost of publication, all tending to improve one's
standing in the community.  In every case I can think of, the decision
to standardize or not comes down to weighing the benefit against the
effort involved.
If the data already has a standard representation, someone has answered
that question in the affirmative.  They've done so on the basis of some
abstract model of the transaction, probably derived from historical
records, regulations, custom, or other past experience.  Surely, some
such decisions are just wrong, due to poor analysis or based on poor,
tacit, or seat-of-the-pants models.

What to standardize for the sake of interoperability is based on the
model of the operation, which has to include all the roles for potential
users, even in the hypothetical nirvana of the semantic web.  That stuff
just pushes it to a higher level of abstraction, making it all the more
difficult to achieve since there will be less chance of agreement on the
model and (I'm going to really stick my neck out here) absolutely no
chance of basing the model on any actual human behavior.  How can we
expect people to behave in accord with an abstract model that is
completely foreign to them?  Does this mean that the semantic web
actually makes interoperability more difficult rather than less?  Hmmm.
But that's completely off topic.  Sorry.

Enough rambling.

Bruce B Cox
Manager, Standards Development Division
OCIO/SDMG
571-272-9004


-----Original Message-----
From: Len [mailto:[email protected]] 
Sent: Saturday, January 10, 2009 11:49 AM
To: 'Costello, Roger L.'; xml-dev@l...
Subject: RE: What makes a data component good for standardizing?

The user anticipates you.

It is a negotiation.

len

-----Original Message-----
From: Costello, Roger L. [mailto:[email protected]] 
Sent: Saturday, January 10, 2009 8:34 AM
To: '[email protected]'
Subject:  What makes a data component good for standardizing?


Hi Folks,

Suppose you set out to create some standard data components. Your goal is
to improve interoperability by creating standardized data components. In
particular, you want these standardized data components to improve
interoperability between systems that weren't a priori coded to understand
each other's data exchange format (i.e., you want to improve
interoperability with the "unanticipated user").

What makes one data component good, and another bad? 

(By "good" I mean the data component would in fact help improve
interoperability with the unanticipated user. By "bad" I mean the data
component would do little, if anything, to improving interoperability
with
the unanticipated user.)

I'll share my initial thoughts. I'd like your feedback on my initial
thoughts, and I'd also like to hear your thoughts.

Note: by "data component" I mean a chunk of markup that can be reused in
multiple XML vocabularies.
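
For instance, a chunk along these lines could be dropped as-is into any
number of host vocabularies. This is only a sketch; the namespace URI and
element names are invented purely for illustration:

import xml.etree.ElementTree as ET

# A hypothetical standardized postal-address chunk, held here as a string.
# The namespace URI and element names are invented for illustration.
POSTAL_ADDRESS_CHUNK = """\
<addr:PostalAddress xmlns:addr="urn:example:postal-address">
  <addr:Street>123 Accelerator Rd</addr:Street>
  <addr:City>Springfield</addr:City>
  <addr:PostalCode>62704</addr:PostalCode>
</addr:PostalAddress>
"""

# The chunk is self-describing (it carries its own namespace declaration),
# so it parses on its own, outside any host vocabulary.
component = ET.fromstring(POSTAL_ADDRESS_CHUNK)
print(component.tag)  # {urn:example:postal-address}PostalAddress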


MY INITIAL THOUGHTS

I think that some data components would be good to standardize, while
others would not be useful.

I'll start with two examples of data components that would be good to
standardize.

Think about a postal address. It would be a good data component to
standardize. It's a useful data component even if I don't understand the
context in which it's being used. 
 
      For example, suppose some nuclear physicist unexpectedly sends
      me a document containing data that I have no clue how to
      interpret, but embedded in it is a postal address.
      I may not be able to process all that data about
      subatomic particles (quarks, neutrinos, etc.), but I can
      pluck out the postal address and store it in my address book.
 
That's interoperability between unanticipated users, albeit limited.
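
As a rough sketch of that plucking step, assuming some hypothetical
postal-address namespace (every name and the physics content here is
invented for illustration):

import xml.etree.ElementTree as ET

# Hypothetical namespace of the standardized postal-address component.
ADDR_NS = "urn:example:postal-address"

# An unfamiliar physics document; the only part I recognize is the
# embedded addr:PostalAddress chunk.
document = """\
<phys:experiment xmlns:phys="urn:example:subatomic"
                 xmlns:addr="urn:example:postal-address">
  <phys:neutrinoFlux>42</phys:neutrinoFlux>
  <addr:PostalAddress>
    <addr:Street>123 Accelerator Rd</addr:Street>
    <addr:City>Springfield</addr:City>
    <addr:PostalCode>62704</addr:PostalCode>
  </addr:PostalAddress>
</phys:experiment>
"""

root = ET.fromstring(document)
# Pluck out every postal address and ignore everything else.
for address in root.findall(".//{%s}PostalAddress" % ADDR_NS):
    street = address.findtext("{%s}Street" % ADDR_NS)
    city = address.findtext("{%s}City" % ADDR_NS)
    print(street, city)  # this is where it would go into my address book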
 
Another example of a useful data component is a business card (vcard).
Again, that's a useful data component that I can immediately utilize, even
if I have no clue what the rest of the document is talking about.
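
A minimal sketch of such a chunk as a vCard 3.0 payload (the person and
address are made up); any vCard-aware contact manager could import it
as-is, no matter what kind of document delivered it:

# A minimal vCard 3.0 payload; the person and address are invented.
VCARD = """\
BEGIN:VCARD
VERSION:3.0
N:Example;Pat;;;
FN:Pat Example
ORG:Example Physics Lab
ADR;TYPE=WORK:;;123 Accelerator Rd;Springfield;IL;62704;USA
END:VCARD
"""

# vCard lines are CRLF-terminated, so translate the \n endings on write;
# a vCard-aware contact manager can then import the file directly.
with open("pat_example.vcf", "w", newline="\r\n") as handle:
    handle.write(VCARD)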
 
These data components are useful independent of their context. I can use
the data components even if I can't use all the stuff that they reside in.
 
Now I'll give an example of a data component which I think would not be
useful to standardize.

Both postal address and vcard give a person's name (along with other
data). Suppose I decide that I want data components with finer granularity
than postal address or vcard. Would "person name" make a good component
for standardizing?

I think not. A person's name would not be useful independent of context.

 
      For example, the same nuclear physicist
      sends me the same document, but this time it contains the
      standardized PersonName data component for
      a person named "Jim Brown".
      I am PersonName-aware, so I am able to pluck out that
      Jim Brown information, even though I have
      no clue what the rest of the document says.
      Have I gained anything? No. It could be Jim Brown
      the ex-football player or some other person by that name.
      To make sense of the data component I need to
      understand its context.
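
In code terms, the best an unanticipated receiver could do is something
like this sketch (names and namespace invented, as before):

import xml.etree.ElementTree as ET

# Hypothetical namespace of a standardized person-name component.
NAME_NS = "urn:example:person-name"

# The same kind of unfamiliar document, this time carrying only a name.
document = """\
<phys:experiment xmlns:phys="urn:example:subatomic"
                 xmlns:pn="urn:example:person-name">
  <phys:neutrinoFlux>42</phys:neutrinoFlux>
  <pn:PersonName>Jim Brown</pn:PersonName>
</phys:experiment>
"""

root = ET.fromstring(document)
name = root.findtext(".//{%s}PersonName" % NAME_NS)
# Extraction succeeds mechanically, but all I hold is the string
# "Jim Brown" -- without context there is nothing meaningful to do with it.
print(name)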
 
I propose these two metrics for evaluating the usefulness of a data
component:
 
    1. The data component must be standardized
       and broadly adopted (see below).
    2. If I can meaningfully use the data component
       without understanding any of the context in 
       which it resides then it is a good data
       component. If I must understand its context
       then it is a bad data component. 
  
Standardizing is good. It enables two parties to understand each other,
i.e., interoperate. 
 
But standardization is not enough. I want more than interoperability
between two parties that have a priori agreed to a data interchange
format. I want interoperability between two parties that haven't a priori
agreed to a data interchange format. I want interoperability between
unanticipated parties.
 
So the key is to not only standardize, but standardize the right things.
 

SUMMARY 

We would go a long way toward advancing interoperability of unanticipated
systems if we focused on creating standardized components that are useful
independent of context.

What do you think?
 
/Roger  

