Interoperability

Interoperability is the ability of diverse systems and organizations to work together (inter-operate). The term is often used in a technical systems engineering sense, or alternatively in a broad sense that takes into account social, political, and organizational factors affecting system-to-system performance.

While interoperability was initially defined for IT systems or services, allowing only for information to be exchanged (see definition below), a more generic definition could be this one:

Interoperability is a property of a product or system, whose interfaces are completely understood, to work with other products or systems, present or future, without any restricted access or implementation.

This generalized definition can then be applied to any system, not only information technology systems. It defines several criteria that can be used to distinguish systems that are "really" inter-operable from systems that are sold as such but are not, because they fail one of the following criteria:

  • non-disclosure of one or several interfaces
  • implementation or access restriction built into the product/system/service

The IEEE Glossary defines interoperability as:

the ability of two or more systems or components to exchange information and to use the information that has been exchanged.[1]

James A. O'Brien and George M. Marakas define interoperability as:[2]

Being able to accomplish end-user applications using different types of computer systems, operating systems, and application software, interconnected by different types of local and wide area networks.

Syntactic interoperability

If two or more systems are capable of communicating and exchanging data, they are exhibiting syntactic interoperability. Specified data formats, communication protocols and the like are fundamental. XML or SQL standards are among the tools of syntactic interoperability. This is also true for lower-level data formats, such as ensuring alphabetical characters are stored in ASCII format in all the communicating systems.

Syntactic interoperability is a necessary condition for any further interoperability.
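
A rough illustration, as a minimal Python sketch (the system names and record fields are hypothetical), shows what this agreement looks like in practice: both sides commit to the same serialisation format and character encoding, so each can parse exactly what the other produces.

    import json

    # Minimal sketch of syntactic interoperability: System A and System B
    # (hypothetical) agree on JSON as the data format and UTF-8 as the
    # character encoding, so each side can parse the other's output.

    def system_a_export(record: dict) -> bytes:
        """System A serialises a record using the agreed format and encoding."""
        return json.dumps(record).encode("utf-8")

    def system_b_import(payload: bytes) -> dict:
        """System B parses the payload because it expects the same format."""
        return json.loads(payload.decode("utf-8"))

    message = system_a_export({"record_id": 42, "name": "Ada"})
    print(system_b_import(message))  # {'record_id': 42, 'name': 'Ada'}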

Semantic interoperability

Beyond the ability of two or more computer systems to exchange information, semantic interoperability is the ability to automatically interpret the information exchanged meaningfully and accurately in order to produce useful results as defined by the end users of both systems. To achieve semantic interoperability, both sides must refer to a common information exchange reference model. The content of the information exchange requests is unambiguously defined: what is sent is the same as what is understood.
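
A minimal sketch of this idea (the field names, keys and units are hypothetical) maps each system's local vocabulary onto a shared reference model, so that a value sent by one side carries exactly the same meaning on the other:

    # Minimal sketch of semantic interoperability: both systems translate
    # their local representations to a common reference model (hypothetical
    # field names and units), so the exchanged value means the same thing.

    REFERENCE_MODEL_UNITS = {"body_temperature": "celsius"}  # agreed meaning

    def from_system_a(record: dict) -> dict:
        # System A stores the temperature in Fahrenheit under the key "temp_f".
        return {"body_temperature": round((record["temp_f"] - 32) * 5 / 9, 1)}

    def to_system_b(common: dict) -> dict:
        # System B expects the key "temperatur", in Celsius.
        return {"temperatur": common["body_temperature"]}

    print(to_system_b(from_system_a({"temp_f": 98.6})))  # {'temperatur': 37.0}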

Interoperability and open standards

Interoperability must be distinguished from open standards. Although the goal of each is to provide effective and efficient exchange between computer systems, the mechanisms for accomplishing that goal differ. Open standards imply interoperability ab initio, i.e., by definition, while interoperability does not, by itself, imply wider exchange between a range of products, or between similar products from several different vendors, or even between past or future revisions of the same product. Interoperability may be developed post facto, as a special measure between two products, while excluding the rest, or when a vendor is forced to adapt its system to make it interoperable with a dominant system.

Open standards

Open standards rely on a broadly consultative and inclusive group, including representatives from vendors, academics and others holding a stake in the development, that discusses and debates the technical and economic merits, demerits and feasibility of a proposed common protocol. After the doubts and reservations of all members are addressed, the resulting common document is endorsed as a common standard. This document is subsequently released to the public, and henceforth becomes an open standard. It is usually published and available freely or at a nominal cost to any and all comers, with no further encumbrances. Various vendors and individuals (even those who were not part of the original group) can use the standards document to make products that implement the common protocol defined in the standard and are thus interoperable by design, with no specific liability or advantage for any customer choosing one product over another on the basis of standardised features. The vendors' products compete on the quality of their implementation, user interface, ease of use, performance, price, and a host of other factors, while keeping the customer's data intact and transferable even if the customer chooses to switch to a competing product for business reasons.

Post facto interoperability

Post facto interoperability may be the result of the absolute market dominance of a particular product in contravention of any applicable standards, or because effective standards were not present at the time of that product's introduction. The vendor behind that product can then choose to ignore any forthcoming standards and not co-operate in any standardisation process at all, using its near-monopoly to insist that its product sets the de facto standard by its very market dominance. This is not a problem if the product's implementation is open and minimally encumbered, but it may well be both closed and heavily encumbered (e.g. by patent claims). Because of the network effect, achieving interoperability with such a product is both critical for any other vendor that wishes to remain relevant in the market and difficult to accomplish because of the lack of co-operation on equal terms with the original vendor, who may well see the new vendor as a potential competitor and threat. Newer implementations often rely on clean-room reverse engineering in the absence of technical data to achieve interoperability. The original vendor can provide such technical data to others, often in the name of 'encouraging competition', but such data are invariably encumbered, and may be of limited use. Availability of such data is not equivalent to an open standard, because:

  1. The data are provided on a discretionary basis by the original vendor, who has every interest in blocking the effective implementation of competing solutions, and who may subtly alter or change its product, often in newer revisions, so that competitors' implementations are almost, but not quite, interoperable, leading customers to consider them unreliable or of lower quality. These changes can either not be passed on to other vendors at all, or passed on only after a strategic delay, maintaining the market dominance of the original vendor.
  2. The data themselves may be encumbered, e.g. by patents or pricing, leading to a dependence of all competing solutions on the original vendor, and possibly leading to a revenue stream from the competitors' customers back to the original vendor. This revenue stream is only a result of the original product's market dominance and not a result of any innate superiority.
  3. Even when the original vendor is genuinely interested in promoting healthy competition (so that it may also benefit from the resulting innovative market), post facto interoperability may often be undesirable, as many defects or quirks can be directly traced back to the original implementation's technical limitations. Although in an open process anyone may identify and correct such limitations, and the resulting cleaner specification may be used by all vendors, this is more difficult post facto, as customers already have valuable information and processes encoded in the faulty but dominant product, and other vendors are forced to replicate those faults and quirks, even if they could design better solutions, for the sake of preserving interoperability. Alternatively, it can be argued that even open processes are subject to the weight of past implementations and imperfect past designs, and that the power of the dominant vendor to unilaterally correct or improve the system and impose the changes on all users facilitates innovation.
  4. Lack of an open standard can also become problematic for customers, as in the case of the original vendor's inability to fix a certain problem that is an artifact of technical limitations in the original product. The customer wants the fault fixed, but the vendor has to maintain the faulty behaviour, even across newer revisions of the same product, because that behaviour is a de facto standard, and many more customers would have to pay the price of any break in interoperability caused by fixing the original problem and introducing new behaviour.

Telecommunications

In telecommunication, the term can be defined as:

  1. The ability of systems, units, or forces to provide services to and accept services from other systems, units or forces and to use the services exchanged to enable them to operate effectively together.
  2. The condition achieved among communications-electronics systems or items of communications-electronics equipment when information or services can be exchanged directly and satisfactorily between them and/or their users. The degree of interoperability should be defined when referring to specific cases.[3][4]

In two-way radio, interoperability is composed of three dimensions:

  • compatible communications paths (compatible frequencies, equipment and signaling),
  • radio system coverage or adequate signal strength, and
  • scalable capacity.

Search

Search interoperability refers to the ability of two or more information collections to be searched by a single query.

Specifically related to web-based search, the challenge of interoperability stems from the fact that designers of web resources typically have little or no need to concern themselves with exchanging information with other web resources. Federated search has emerged as one solution to search interoperability challenges. In addition, standards such as OAI-PMH, RDF, and SPARQL have emerged recently that also help address the issue of search interoperability related to web resources. Such standards also address broader topics of interoperability, such as allowing data mining.
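
A minimal sketch of the federated approach (the collections and their search functions are purely illustrative) fans a single query out to several independently designed collections and merges the results on a shared relevance measure:

    # Minimal federated-search sketch: one query, several collections,
    # results merged by a common relevance score (all names hypothetical).

    def search_collection_a(query: str) -> list:
        return [{"title": "Result from collection A", "score": 0.9}]

    def search_collection_b(query: str) -> list:
        return [{"title": "Result from collection B", "score": 0.7}]

    def federated_search(query: str, collections: list) -> list:
        results = []
        for search in collections:
            results.extend(search(query))
        # Sort by the shared score so results from different sources compare.
        return sorted(results, key=lambda r: r["score"], reverse=True)

    hits = federated_search("interoperability", [search_collection_a, search_collection_b])
    print(hits)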

Software

(Image caption) Interoperability: playing a two-role network game, in which one of the player clients (top left) runs under Sun Microsystems Java and another under GNU Classpath with JamVM. The applications execute the same bytecode and interoperate using standard RMI-IIOP messages for communication.

With respect to software, the term interoperability is used to describe the capability of different programs to exchange data via a common set of exchange formats, to read and write the same file formats, and to use the same protocols. (The ability to execute the same binary code on different processor platforms is not implied by this definition of interoperability.) The lack of interoperability can be a consequence of a lack of attention to standardization during the design of a program. Indeed, interoperability is not taken for granted in the non-standards-based portion of the computing world.
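
A minimal illustration of this (the two programs and column names are hypothetical, and CSV merely stands in for any agreed exchange format) shows two independently written programs remaining interoperable by reading and writing the same file format rather than a private layout:

    import csv
    import io

    # Minimal sketch: program A writes the agreed file format, program B
    # reads it back; neither needs to know how the other works internally.

    def program_a_save(rows: list) -> str:
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=["id", "name"])
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()

    def program_b_load(text: str) -> list:
        return list(csv.DictReader(io.StringIO(text)))

    shared_file = program_a_save([{"id": "1", "name": "Ada"}])
    print(program_b_load(shared_file))  # [{'id': '1', 'name': 'Ada'}]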

According to ISO/IEC 2382-01, Information Technology Vocabulary, Fundamental Terms, interoperability is defined as follows: "The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units."[5]

Note that this definition is somewhat ambiguous, because the user of a program can be another program and, if the latter is part of the set of programs that is required to be interoperable, it might well need to have knowledge of the characteristics of the other units.

This definition focuses on the technical side of interoperability, while it has also been pointed out that interoperability is often more of an organizational issue: often interoperability has a significant impact on the organizations concerned, raising issues of ownership (do people want to share their data?), labor relations (are people prepared to undergo training?) and usability. In this context, a more apt definition is captured in the term "business process interoperability".

Interoperability can have important economic consequences, such as network externalities. If competitors' products are not interoperable (due to causes such as patents, trade secrets or coordination failures), the result may well be monopoly or market failure. For this reason, it may be prudent for user communities or governments to take steps to encourage interoperability in various situations. In the United Kingdom, for example, there is an eGovernment-based interoperability initiative called e-GIF, while in the United States there is the NIEM initiative. Standards Defining Organizations (SDOs) provide open public software specifications to facilitate interoperability; examples include the OASIS Open organization and buildingSMART (formerly the International Alliance for Interoperability). As far as user communities are concerned, Neutral Third Party is creating standards for business process interoperability. Another example of a neutral party is the RFC documents from the Internet Engineering Task Force (IETF).

The OSLC (Open Services for Lifecycle Collaboration) community is working on finding a common standard so that software tools can share and exchange data, e.g. bugs, tasks, requirements, etc. The final goal is to agree on an open standard for interoperability of open-source Application Lifecycle Management (ALM) tools.[6]

Medical industry

New technologies are being introduced in hospitals and labs at an ever-increasing rate, and many of these innovations have the potential to interact synergistically if they can be integrated effectively. The need for “plug-and-play” interoperability – the ability to take a medical device out of its box and easily make it work with one’s other devices – has attracted great attention from both healthcare providers and industry.

Interoperability helps patients get the most out of technology, and it also encourages innovation in the industrial sphere. When different products can be combined without complicated and expensive interfaces, small companies can enter a field and make specialized products. Without interoperability, hospitals are forced to turn to large vendors that provide suites of compatible devices but that do not specialize in any one area. Interoperability promotes competition, and competition encourages innovation and quality.

From the perspective of Intel, a major producer of consumer healthcare devices, there are six major factors that affect an industry’s ability to achieve interoperability. First there needs to be a demand for interoperable products. Second, there must be standards, or rules, defining what interoperability means in the field. Third, business conditions must encourage manufacturers to make their products interoperable. Fourth, guidelines must exist that make the often-complicated standards easier for companies to interpret. Fifth, compliance must be verified by independent testing; and finally, interoperability must be actively promoted. The rapid rise of wireless technology illustrates that interoperability is attainable.

Conditions in the biomedical industry are still in the process of becoming conducive to the development of interoperable systems. A potential market of interested hospitals exists, and standards for interoperability are being developed. Nevertheless, it seems that current business conditions do not encourage manufacturers to pursue interoperability. Only sixteen to twenty percent of hospitals, for example, use electronic medical records (EMR). With such a low rate of EMR adoption, most manufacturers can get away with not investing in interoperability. In fact, not pursuing interoperability allows some of them to tout the inter-compatibility of their own products while excluding competitors. By promoting EMR adoption, companies such as Intel hope to create an environment in which hospitals will have the collective leverage to demand interoperable products.

eGovernment

From an eGovernment perspective, interoperability refers to the ability of cross-border services for citizens, businesses and public administrations to work together. Exchanging data can be a challenge due to language barriers, different specifications of formats and varieties of categorisations. Many more hindrances can be identified.

If data is interpreted differently, collaboration is limited, takes longer and is inefficient. For instance, if a citizen of country A wants to purchase land in country B, the person will be asked to submit the proper address data. Address data in both countries include full name details, street name and number, as well as a post code. The order of the address details might vary. Within the same language, ordering the provided address data is not an obstacle; across language barriers it becomes more and more difficult, and if the language requires other characters it is almost impossible unless translation tools are available.
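
A minimal sketch of such an exchange (the country-specific layouts and field names are invented for illustration) maps each national address format onto one shared schema before the data crosses the border:

    # Minimal sketch: two hypothetical national address layouts are mapped
    # onto one shared schema so the receiving administration can interpret
    # the data unambiguously, regardless of field order or labels.

    SHARED_SCHEMA = {"full_name", "street", "number", "post_code"}

    def from_country_a(addr: dict) -> dict:
        # Country A labels the post code "zip" and the house number "no".
        return {"full_name": addr["name"], "street": addr["street"],
                "number": addr["no"], "post_code": addr["zip"]}

    def from_country_b(addr: dict) -> dict:
        # Country B uses different labels and a different field order.
        return {"full_name": addr["navn"], "street": addr["gate"],
                "number": addr["nr"], "post_code": addr["postnummer"]}

    a = from_country_a({"name": "J. Smith", "street": "High St", "no": "12", "zip": "AB1 2CD"})
    b = from_country_b({"navn": "K. Hansen", "gate": "Storgata", "nr": "5", "postnummer": "0155"})
    assert set(a) == set(b) == SHARED_SCHEMA  # both now use the shared schema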

Hence eGovernment applications need to exchange data in a semantically interoperable manner. This saves time and money and reduces sources of errors. Fields of practical use are found in every policy area, be it justice, trade or participation etc. Clear concepts of interpretation patterns are required.

Many organizations are dedicated to interoperability. All have in common that they want to push the development of the World Wide Web towards the semantic web. Some concentrate on eGovernment, eBusiness or data exchange in general. In Europe, for instance, the European Commission and its IDABC programme issue the European Interoperability Framework. They also initiated the Semantic Interoperability Centre Europe (SEMIC.EU). A European Land Information Service (EULIS) was established in 2006, as a consortium of European National Land Registers. The aim of the service is to establish a single portal through which customers are provided with access to information about individual properties, about land and property registration services, and about the associated legal environment.[7] In the United States, the government's CORE.gov service provides a collaboration environment for component development, sharing, registration, and reuse and related to this is the National Information Exchange Model (NIEM) work and component repository.

Public safety

Interoperability is an important issue for law enforcement, fire fighting, EMS, and other public health and safety departments, because first responders need to be able to communicate during wide-scale emergencies. Traditionally, agencies could not exchange information because they operated widely disparate and incompatible hardware. Agencies' information systems, such as computer-aided dispatch (CAD) systems and records management systems (RMS), functioned largely in isolation, in so-called "information islands." Agencies tried to bridge this isolation with inefficient, stop-gap methods while large agencies began implementing limited interoperable systems. These approaches were inadequate, and the nation's lack of interoperability in the public safety realm became evident during the 9/11 attacks[8] on the Pentagon and World Trade Center structures. Further evidence of a lack of interoperability surfaced when agencies tackled the aftermath of the Hurricane Katrina disaster.


In contrast to the overall national picture, some states, including Utah, have already made great strides forward. The Utah Highway Patrol and other departments in Utah have created a statewide data-sharing network using technology from a company based in Bountiful, Utah, FATPOT Technologies.

The Commonwealth of Virginia is one of the leading states in the United States when it comes to improving interoperability, and it is continually recognized as a national best practice by the Department of Homeland Security (DHS). Virginia's practitioner-driven governance structure ensures that all the right players are involved in decision making, training and exercises, planning efforts, etc. The Interoperability Coordinator leverages a regional structure to better allocate grant funding around the Commonwealth so that all areas have an opportunity to improve communications interoperability. Virginia's strategic plan for communications is updated yearly to include new initiatives for the Commonwealth; all projects and efforts are tied to this plan, which is aligned with the National Emergency Communications Plan, authored by the Department of Homeland Security's Office of Emergency Communications (OEC).

The State of Washington seeks to enhance interoperability statewide. The State Interoperability Executive Committee (SIEC), established by the legislature in 2003, works to assist emergency responder agencies (police, fire, sheriff, medical, hazmat, etc.) at all levels of government (city, county, state, tribal, federal) to define interoperability for their local region.

Washington recognizes that collaborating on system design and development for wireless radio systems enables emergency responder agencies to efficiently provide additional services, increase interoperability, and reduce long-term costs.

This important work saves the lives of emergency personnel and the citizens they serve.

The U.S. government is making a concerted effort to overcome the nation's lack of public safety interoperability. The Department of Homeland Security's Office for Interoperability and Compatibility (OIC) is pursuing the SAFECOM and CADIP programs, which are designed to help agencies as they integrate their CAD and other IT systems.

The OIC launched CADIP in August 2007. This project will partner the OIC with agencies in several locations, including Silicon Valley. The program will use case studies to identify the best practices and challenges associated with linking CAD systems across jurisdictional boundaries. These lessons will help create the tools and resources public safety agencies can use to build interoperable CAD systems and communicate across local, state, and federal boundaries.

Achieving software interoperability

Software interoperability is achieved through five interrelated approaches:

  1. Product testing
    Products produced to a common standard, or to a sub-profile thereof, depend on the clarity of the standards, but there may be discrepancies in their implementations that system or unit testing does not uncover. This requires that systems be formally tested in a production scenario – as they will finally be implemented – to ensure that they actually will intercommunicate as advertised, i.e. that they are interoperable. Interoperable product testing is different from conformance-based product testing, because conformance to a standard does not necessarily engender interoperability with another product that is also tested for conformance. A sketch of such cross-implementation testing follows this list.
  2. Product engineering
    Implements the common standard, or a sub-profile thereof, as defined by the industry/community partnerships with the specific intention of achieving interoperability with other software implementations also following the same standard or sub-profile thereof.
  3. Industry/community partnership
    Industry/community partnerships, either domestic or international, sponsor standards workgroups with the purpose of defining a common standard that may be used to allow software systems to intercommunicate for a defined purpose. At times an industry/community will sub-profile an existing standard produced by another organization to reduce options, thus making interoperability more achievable for implementations.
  4. Common technology and IP
    The use of a common technology or IP may speed up and reduce the complexity of interoperability by reducing variability between components from different sets of separately developed software products, thus allowing them to intercommunicate more readily. This technique has some of the same technical results as using a common vendor product to achieve interoperability. The common technology can come through third-party libraries or open-source developments.
  5. Standard implementation
    Software interoperability requires a common agreement that is normally arrived at via an industrial, national or international standard.

Each of these has an important role in reducing variability in intercommunication software and enhancing a common understanding of the end goal to be achieved.
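
A minimal sketch of the cross-implementation testing mentioned under product testing above (the two encoder/decoder pairs stand in for independently developed implementations of the same standard) checks that each implementation can consume the other's output, rather than merely passing its own conformance suite:

    import json

    # Minimal sketch of interoperability testing (as opposed to pure
    # conformance testing): two hypothetical implementations of the same
    # standard must each decode what the other encodes.

    def impl_one_encode(data): return json.dumps(data, separators=(",", ":"))
    def impl_one_decode(text): return json.loads(text)

    def impl_two_encode(data): return json.dumps(data, indent=2)
    def impl_two_decode(text): return json.loads(text)

    def test_interoperability(sample: dict) -> None:
        # Cross-test both directions; each implementation conforming to the
        # standard on its own is not enough to guarantee this.
        assert impl_two_decode(impl_one_encode(sample)) == sample
        assert impl_one_decode(impl_two_encode(sample)) == sample

    test_interoperability({"order": 7, "items": ["a", "b"]})
    print("cross-implementation round trip succeeded")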

Interoperability as a question of power and market dominance

Interoperability tends to be regarded as an issue for experts and its implications for daily living are sometimes underrated. The European Union Microsoft competition case shows how interoperability concerns important questions of power relationships. In 2004, the European Commission found that Microsoft had abused its market power by deliberately restricting interoperability between Windows work group servers and non-Microsoft work group servers. By doing so, Microsoft was able to protect its dominant market position for work group server operating systems, the heart of corporate IT networks. Microsoft was ordered to disclose complete and accurate interface documentation, which will enable rival vendors to compete on an equal footing (“the interoperability remedy”). As of June 2005 the Commission is market testing a new proposal by Microsoft to do this, having rejected previous proposals as insufficient.

Interoperability has also surfaced in the software patent debate in the European Parliament (June/July 2005). Critics claim that because patents on techniques required for interoperability are kept under RAND (reasonable and non-discriminatory) licensing conditions, customers will have to pay license fees twice: once for the product and, in the appropriate case, once for the patent-protected programme the product uses.

Railways

Railways have greater or lesser interoperability depending on their conformance to standards of gauge, couplings, brakes, signalling, communications, loading gauge, and operating rules, to mention a few parameters. North American railroads are highly interoperable, while those of Europe, Asia, Africa, Central and South America, and Australia are much less so. The parameter most difficult to overcome (at reasonable cost) is incompatibility of gauge, though variable-gauge axle systems such as SUW 2000 are beginning to address this problem.

References

  1. ^ Institute of Electrical and Electronics Engineers. IEEE Standard Computer Dictionary: A Compilation of IEEE Standard Computer Glossaries. New York, NY: 1990.
  2. ^ James O'Brien, George Marakas: Introduction to Information Systems. McGraw-Hill/Irwin, 13th edition. ISBN 0073043559
  3. ^  This article incorporates public domain material from the General Services Administration document "Federal Standard 1037C" (in support of MIL-STD-188).
  4. ^  This article incorporates public domain material from the United States Department of Defense document "Dictionary of Military and Associated Terms".
  5. ^ http://jtc1sc36.org/doc/36N0646.pdf
  6. ^ http://www.slideshare.net/olberger/presentation-icssea2011
  7. ^ Aims of EULIS, European Land Information Service
  8. ^ Grier, Robin. "Interoperability Solutions". Interoperability. Catalyst Communications. http://www.catcomtec.com/index.php?option=com_content&view=article&id=19&Itemid=34. Retrieved 28 May 2011. 
