Interoperability

Interoperability Definition

Interoperability refers to the basic ability of different computerized products or systems to readily connect and exchange information with one another, without restrictions on implementation or access.

Figure: Interoperability diagram showing the connectedness of various stakeholders, technology systems, and dashboards (image from South Western Communications, SWC).

FAQs

What is Interoperability?

Interoperability is the property that facilitates unrestricted sharing and use of data or resources between disparate systems, typically over local area networks (LANs) or wide area networks (WANs). There are two levels of data interoperability: syntactic interoperability, which enables different software components to cooperate so that two or more systems can communicate and exchange data in a common format, and semantic interoperability, which refers to the ability of computer systems to exchange meaningful data with unambiguous, shared meaning. Syntactic interoperability is a prerequisite for semantic interoperability.
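To make the distinction concrete, the Python sketch below assumes a hypothetical pair of systems exchanging a JSON vital-signs record: both can parse the message (the syntactic level), but only a shared unit vocabulary makes its meaning unambiguous (the semantic level). The field names and unit codes are illustrative, not drawn from any real standard.

    import json

    # Syntactic interoperability: both systems agree on JSON as the common
    # format, so the receiver can parse the message without errors.
    message = json.dumps({"patient_id": "12345", "temp": 98.6})
    parsed = json.loads(message)  # parsing succeeds -> syntactic level only

    # Semantic interoperability: parsing alone does not say whether 98.6 is
    # Fahrenheit or Celsius. A code from a shared, controlled vocabulary
    # (hypothetical codes below) removes the ambiguity.
    SHARED_UNITS = {"degF": "degrees Fahrenheit", "degC": "degrees Celsius"}

    semantic_message = json.dumps({
        "patient_id": "12345",
        "observation": {"code": "body-temperature", "value": 98.6, "unit": "degF"},
    })

    record = json.loads(semantic_message)
    obs = record["observation"]
    print(f"Temperature: {obs['value']} {SHARED_UNITS[obs['unit']]}")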

Efficient, automated data sharing between applications, databases, and other computer systems is a crucial component of networked computing, and especially of interoperability in healthcare information and management systems.

How Does Interoperability Work?

For two or more systems to be interoperable, they must be able to exchange, interpret, and present shared data in a way that each can understand. This is accomplished by first establishing syntactic interoperability, which involves adopting a common data format and common data structure protocols, and then semantic interoperability, which involves adding metadata that links each data element to a controlled, shared vocabulary. This shared vocabulary is in turn linked to an ontology, a data model that represents a set of concepts within a domain and the relationships among those concepts.
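Building on the hypothetical vital-signs example above, the sketch below shows how metadata can link each data element to a controlled, shared vocabulary that is itself backed by a small ontology of concepts and relationships. The concept identifiers and relations are illustrative only, not taken from any real ontology.

    # A toy ontology: concepts from a shared vocabulary and the
    # relationships among them (purely illustrative identifiers).
    ONTOLOGY = {
        "body-temperature": {"is_a": "vital-sign", "unit_kind": "temperature"},
        "heart-rate":       {"is_a": "vital-sign", "unit_kind": "frequency"},
        "vital-sign":       {"is_a": "clinical-observation"},
    }

    def annotate(element_name, value, concept_code):
        """Attach metadata linking a raw data element to the shared vocabulary."""
        if concept_code not in ONTOLOGY:
            raise ValueError(f"{concept_code!r} is not in the shared vocabulary")
        return {
            "element": element_name,          # the sender's local field name
            "value": value,
            "concept": concept_code,          # shared, unambiguous meaning
            "parent_concept": ONTOLOGY[concept_code].get("is_a"),
        }

    # The receiving system can interpret 'temp' without knowing the sender's
    # local naming conventions, because the concept code carries the meaning.
    print(annotate("temp", 98.6, "body-temperature"))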

The adoption of these common standards enables the transmission of meaningful information that is independent of any information system. The benefits of interoperability include increased productivity, reduced costs, and reduced errors.

System and software interoperability capabilities are essential in such fields as:

  • Healthcare: hospitals and labs are increasingly adopting new technologies and devices that are driven by sophisticated software, which must integrate at the point of care and with electronic systems, such as electronic medical records
  • eGovernment: interoperable solutions address challenges such as language barriers, differing format specifications, and varying categorization schemes in the delivery of cross-border services to citizens, businesses, and public administrations
  • Public Safety: addresses the ability of law enforcement, fire fighting, EMS, and general public health and safety first responders to effectively communicate between different agencies during wide-scale emergencies
  • Military: force interoperability refers to the ability of the forces of two or more nations to operate together coherently, effectively, and efficiently to execute Allied tactical, operational, and strategic objectives
  • Flood Risk Management: in the context of urban flood risks, the ability of a water management system to redirect water and make use of other systems to maintain or enhance its performance during water exceedance events

Integration vs Interoperability

While both integration and interoperability involve connecting applications and facilitating data transmission, the main difference lies in how the systems communicate. Interoperability is real-time data exchange between systems that speak directly to one another in the same language, instantly interpreting incoming data and presenting it as it was received while preserving its original context.

Integration refers to the process of combining multiple applications so that they function together as one uninterrupted system, often with the help of middleware. Integration provides an environment in which a series of products can talk to each other in their current state while also maintaining compatibility with future versions of each product; in contrast, interoperable systems can lose their interoperability when a system change or upgrade departs from the shared standard. Most industries that do not require interoperability exchange data through data integration.
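The contrast can be sketched in a few lines of Python, using entirely hypothetical systems and field names: with integration, a middleware adapter translates one system's private format into another's; with interoperability, both systems read and write the same shared format directly.

    # Integration: a middleware adapter translates System A's private format
    # into System B's private format. If either format changes, the adapter
    # must change too.
    def middleware_adapter(system_a_record):
        return {
            "patientId": system_a_record["pid"],        # A calls it 'pid'
            "temperatureF": system_a_record["temp_f"],  # B expects 'temperatureF'
        }

    # Interoperability: both systems read and write the same shared format,
    # so no per-pair translation layer is needed.
    SHARED_FORMAT_FIELDS = {"patient_id", "body_temperature_f"}

    def is_shared_format(record):
        return SHARED_FORMAT_FIELDS.issubset(record)

    a_private = {"pid": "12345", "temp_f": 98.6}
    b_view = middleware_adapter(a_private)                        # integration path
    shared = {"patient_id": "12345", "body_temperature_f": 98.6}  # interoperable path

    print(b_view, is_shared_format(shared))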

Compatibility vs Interoperability

Compatibility is the capacity of two or more applications or systems to operate within the same environment, each performing its expected tasks independently without interfering with the performance of the others. Compatibility is not concerned with interoperability: the components are not required to communicate with one another, only to coexist in the same environment.

What is Interoperability Testing?

Interoperability testing is the process by which systems or applications are formally tested in a production scenario to ensure that clear standards have been established and to detect potential discrepancies. Factors in interoperability testing include ease-of-use features, syntax and data format compatibility, and sufficient logical and physical connection methods. A common interoperability testing methodology follows the PDCA (Plan, Do, Check, Act) cycle (a minimal test sketch follows the list):

  • Plan: determine the functionality, behavior, input, and output for all applications and test individually for defects
  • Do: execute functional and non-functional testing, log and resolve defects, re-test and regression test the system as a whole, and report results
  • Check: revisit test results and validate whether all the expected requirements are met and whether every application in the chain was exercised
  • Act: identify and continue executing good practices; identify poor practices and develop steps to rectify them
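As an illustration of the Do and Check steps, the sketch below uses Python's built-in unittest module to verify that a hypothetical producer emits the agreed shared format and that a hypothetical consumer interprets it with the agreed meaning. The systems, field names, and expected values are all assumptions made for the example.

    import json
    import unittest

    # Hypothetical systems under test: a producer that emits the shared format
    # and a consumer that interprets it.
    def producer_emit():
        return json.dumps({"patient_id": "12345", "body_temperature_f": 98.6})

    def consumer_interpret(message):
        record = json.loads(message)
        return f"Patient {record['patient_id']}: {record['body_temperature_f']} F"

    class InteroperabilityTest(unittest.TestCase):
        def test_syntactic_compatibility(self):
            # The message must parse and contain every field of the shared format.
            record = json.loads(producer_emit())
            self.assertTrue({"patient_id", "body_temperature_f"} <= record.keys())

        def test_semantic_agreement(self):
            # The consumer must interpret the producer's data with the agreed meaning.
            self.assertEqual(consumer_interpret(producer_emit()),
                             "Patient 12345: 98.6 F")

    if __name__ == "__main__":
        unittest.main()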

Does HEAVY.AI Offer Data Integration Solutions?

HEAVY.AI offers solutions for seamless big data integration. Today’s data managers are challenged with a growing ecosystem of data sources and warehouses, making big data integration more complex than ever. The HeavyDB open source database acts as a hot cache for analytical datasets and is capable of ingesting millions of records per second.