RE: What makes a data component good for standardizing?

Roger, your example is weak.  If the name is not useful outside its
context, then neither is the address.  Is it a shipping address? Home
address? Birthplace? Current residence? Future residence? For sale? For
rent? Target? Party house?  Yes, you can put it in your contact list,
but you can do that with the name as well, and with just about as much
value.

If I understand him correctly, I agree with Len: what is or isn't worth
standardizing is negotiated with at least a surrogate for the
anticipated but as yet unknown user.  Who buys after-market motorcycle
parts?  You might not yet know the person, but you know the role they
play in the transaction you build into your system, based on the
established market dynamics that you've studied carefully and documented
fully for your SBA loan approval.  That's the context you use to decide
what to standardize.  Without that, you don't even have a market, and
therefore no competitors or related services, so why bother
standardizing anything when there aren't any other business services to
interoperate with?

Some data clumps have a context larger than a specific market segment,
such as name, address, and credit card info.  They represent the
social infrastructure that makes interoperability in a specific market
segment possible.  A particular user (role player) might be
unanticipated, but the role they play is not.

The value in standardization is in reducing friction in a transaction,
which in a market is measured by changes in profit.  In an academic
context, standardization might mean reducing the effort of getting
published, or the cost of publication, all tending to improve one's
standing in the community.  In every case I can think of, the decision
to standardize or not comes down to weighing cost against benefit for
the effort involved.  If the data already has a standard representation,
someone has answered that question in the affirmative.  They've done so
on the basis of some abstract model of the transaction, probably derived
from historical records, regulations, custom, or other past experience.
Surely, some such decisions are just wrong, due to poor analysis or to
poor, tacit, or seat-of-the-pants models.

What to standardize for the sake of interoperability is based on the
model of the operation, which has to include all the roles for potential
users, even in the hypothetical nirvana of the semantic web.  That just
pushes the problem to a higher level of abstraction, making it all the
more difficult to solve, since there will be less chance of agreement on
the model and (I'm going to really stick my neck out here) absolutely no
chance of basing the model on any actual human behavior.  How can we
expect people to behave in accord with an abstract model that is
completely foreign to them?  Does this mean that the semantic web
actually makes interoperability more difficult rather than less?  Hmmm.
But that's completely off topic.  Sorry.

Enough rambling.

Bruce B Cox
Manager, Standards Development Division
OCIO/SDMG
571-272-9004


-----Original Message-----
From: Len [mailto:[email protected]] 
Sent: Saturday, January 10, 2009 11:49 AM
To: 'Costello, Roger L.'; xml-dev@l...
Subject: RE: What makes a data component good for standardizing?

The user anticipates you.

It is a negotiation.

len

-----Original Message-----
From: Costello, Roger L. [mailto:[email protected]] 
Sent: Saturday, January 10, 2009 8:34 AM
To: '[email protected]'
Subject: What makes a data component good for standardizing?


Hi Folks,

Suppose you set out to create some standard data components, with the
goal of improving interoperability. In particular, you want these
standardized data components to improve interoperability between
systems that weren't a priori coded to understand each other's data
exchange format (i.e., you want to improve interoperability with the
"unanticipated user").

What makes one data component good, and another bad? 

(By "good" I mean the data component would in fact help improve
interoperability with the unanticipated user. By "bad" I mean the data
component would do little, if anything, to improving interoperability
with
the unanticipated user.)

I'll share my initial thoughts below. I'd like your feedback on them,
and I'd also like to hear your own thoughts.

Note: by "data component" I mean a chunk of markup that can be reused in
multiple XML vocabularies.
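
For instance, a standardized postal-address chunk might look like this
(a hypothetical component; the element names and namespace URI are
invented for illustration):

    <addr:postalAddress xmlns:addr="urn:example:postal-address">
      <addr:street>123 Main St</addr:street>
      <addr:city>Springfield</addr:city>
      <addr:postalCode>01101</addr:postalCode>
    </addr:postalAddress>

Any vocabulary that embeds this element unchanged lets a receiver
recognize it regardless of what surrounds it.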


MY INITIAL THOUGHTS

I think that some data components would be good to standardize, while
others would not be useful.

I'll start with two examples of data components that would be good to
standardize.

Think about a postal address. It would be a good data component to
standardize. It's a useful data component even if I don't understand the
context in which it's being used. 
 
      For example, suppose a nuclear physicist unexpectedly sends 
      me a document containing data that means nothing to me, 
      but embedded in it is a postal address. 
      I may not be able to process all that data about 
      subatomic particles (quarks, neutrinos, etc.), but I can 
      pluck out the postal address and store it in my address book. 
 
That's interoperability between unanticipated users, albeit limited.
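
To make the plucking concrete, here is a minimal sketch of that idea,
assuming a hypothetical shared namespace for the standardized address
component (the physics markup and both namespace URIs are invented for
illustration):

    import xml.etree.ElementTree as ET

    # A document in a vocabulary I don't understand, except for one
    # standardized component in a (hypothetical) shared namespace.
    doc = """<experiment xmlns="urn:example:physics"
                         xmlns:addr="urn:example:postal-address">
      <quark flavor="strange" charge="-1/3"/>
      <addr:postalAddress>
        <addr:street>123 Main St</addr:street>
        <addr:city>Springfield</addr:city>
        <addr:postalCode>01101</addr:postalCode>
      </addr:postalAddress>
    </experiment>"""

    root = ET.fromstring(doc)

    # Pluck out every postal address by its namespace, ignoring the
    # physics markup entirely.
    for address in root.iter("{urn:example:postal-address}postalAddress"):
        print(ET.tostring(address, encoding="unicode"))

The receiver needs no knowledge of the physics vocabulary at all;
agreement on the address component's namespace is enough.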
 
Another example of a useful data component is a business card (vCard).
Again, that's a useful data component that I can immediately use, even
if I have no clue what the rest of the document is talking about.
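
A minimal vCard, for instance, is usable entirely on its own (the
contents are invented for illustration):

    BEGIN:VCARD
    VERSION:3.0
    FN:Jane Q. Physicist
    N:Physicist;Jane;Q.;;
    EMAIL:[email protected]
    TEL:+1-555-555-0100
    END:VCARD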
 
These data components are useful independent of their context. I can
use the data components even if I can't use all the stuff that they
reside in.
 
Now I'll give an example of a data component which I think would not be
useful to standardize.

Both postal address and vCard give a person's name (along with other
data). Suppose I decide that I want data components with finer
granularity than postal address or vCard. Would "person name" make a
good component for standardizing?

I think not. A person's name would not be useful independent of context.

 
      For example, the same nuclear physicist above 
      sends me the same document, but containing the 
      standardized PersonName data component, about 
      a person named "Jim Brown." 
      I am PersonName-aware, so I am able to pluck out that 
      Jim Brown information, even though I have 
      no clue what the rest of the document says. 
      Have I gained anything? No. It could be Jim Brown 
      the ex-football player or some other person by that name. 
      To make sense of the data component I need to 
      understand its context. 
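
The extracted markup itself would be perfectly well formed; it just
wouldn't tell me anything actionable (element names and namespace URI
invented for illustration):

    <pn:PersonName xmlns:pn="urn:example:person-name">
      <pn:given>Jim</pn:given>
      <pn:family>Brown</pn:family>
    </pn:PersonName>

Nothing inside the component says which Jim Brown is meant; that
information lives only in the surrounding document, which by assumption
I cannot read.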
 
I propose these two metrics for evaluating the usefulness of a data
component:
 
    1. The data component must be standardized
       and broadly adopted (see below).
    2. If I can meaningfully use the data component
       without understanding any of the context in 
       which it resides then it is a good data
       component. If I must understand its context
       then it is a bad data component. 
  
Standardizing is good. It enables two parties to understand each other,
i.e., interoperate. 
 
But standardization is not enough. I want more than interoperability
between two parties that have a priori agreed to a data interchange
format. I want interoperability between two parties that haven't a
priori agreed to a data interchange format. I want interoperability
between unanticipated parties.
 
So the key is to not only standardize, but standardize the right things.
 

SUMMARY 

We would go a long way toward advancing the interoperability of
unanticipated systems if we focused on creating standardized components
that are useful independent of context.

What do you think?
 
/Roger  