Lessons learned from big interoperability in government
The implementation of standards germane to interoperability in specific verticals is also becoming more commonplace. The Enterprise Data Management Council’s championing of the Financial Industry Business Ontology (FIBO) in finance proves that point, as do CDISC’s efforts to standardize aspects of clinical trials. In fact, the need to move to new standards is itself one of the drivers increasing the propensity toward interoperable data.
“It takes a long time to move into new standards,” Coyne admits. “But gradually government is doing this, in many cases because they have real use case drivers saying there’s no other good way to do this. It’s been done in a way with too much overhead, [that’s] too chaotic, for too long.” CDISC is advocating interoperable standards for clinical trial data to streamline the process of validating substances submitted to the FDA for approval. “They’re interested in semantic technology because they’re coming out with new mandatory requirements that clinical trial stuff be submitted in semantic standards,” Coyne says.
Longstanding sustainability
Another critical byproduct of the interoperable standards adopted by various public-sector entities is a greater capacity to ensure regulatory compliance. Coyne calls the influx of regulations flooding most verticals today “an indirect way the government has built huge leverage on the direction things are moving.” When governmental entities or regulatory bodies require reports illustrating data lineage and other facets of regulations, the typical reaction is to build a silo solely for that purpose.
Coyne mentions recent findings by the EDM Council that, for compliance, several organizations “assemble all this data and map it together by human heads for one-time reporting and when they have to do it again, they have nothing to reuse.” By contrast, the superior governance and lineage capabilities that characterize interoperable data are readily reusable, often providing insight into the very areas of diligence regulatory reports demand. The result is increased stability, sustainability and flexibility for regulations or other data-driven needs that may arise in the future.
Future-proof
The final merit of standardized, model-driven approaches to data management is the assurance that they suitably prepare organizations for the future. By compiling metadata into a single repository that influences the real-time operations of downstream applications, issuing each datum its own unique identifier, and harmonizing data with standardized models and classification systems, organizations can assemble data for almost any purpose. These benefits go beyond an improved ability to manage assets, conduct search, determine lineage or govern data. They include uniformity in the management of diverse data assets at scale and a profound capacity to quickly mobilize data for whatever applications become the most meaningful tomorrow.
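To make those three ingredients concrete, here is a minimal sketch in Python of what they might look like in practice: a single metadata registry that issues each incoming datum a unique identifier and harmonizes source field names to a shared standard model. All class, field, and variable names here are illustrative assumptions, not any particular vendor's or standard body's API.

```python
import uuid

# Illustrative mapping from idiosyncratic source field names to a
# shared, standardized model (names are hypothetical examples).
STANDARD_MODEL = {
    "ptnt_nm": "patient_name",
    "dob": "birth_date",
}

class MetadataRegistry:
    """A toy single repository of harmonized records."""

    def __init__(self):
        self.records = {}

    def register(self, source_record):
        # Harmonize: rename source fields to their standard-model names,
        # passing through any fields the model does not cover.
        harmonized = {STANDARD_MODEL.get(k, k): v
                      for k, v in source_record.items()}
        # Issue the datum its own unique identifier.
        uid = str(uuid.uuid4())
        self.records[uid] = harmonized
        return uid

registry = MetadataRegistry()
uid = registry.register({"ptnt_nm": "A. Example", "dob": "1970-01-01"})
```

Because every record is keyed by a stable identifier and expressed in one model, downstream applications can reuse the same repository for search, lineage, or reporting rather than rebuilding a silo per request.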