In today’s digital era, every operation leaves behind a digital footprint. Paradoxically, while medical imaging generates massive amounts of data daily, it remains nearly impossible for providers to leverage this data to improve operations and business performance.
The idea of integrating data from multiple sources such as DICOM and HL7, extracting valuable operational insights, and using them to reinvent the management of imaging operations has recently gained traction. Modern organizations increasingly understand the need for data, and how data-driven strategies will transform imaging operations the same way they transformed the airline industry.
A core need is the creation of a middleware interface, or unified data layer, that assimilates data from the different sources within imaging facilities, harmonizes and combines it, and allows downstream applications to seamlessly query it, regardless of the vendors, systems, or software versions used.
This raises a fundamental, strategic question for imaging providers: should they venture into building their own middleware unified data layer, or is it more strategic to partner with a technology company for whom data is the core competency?
We have a strong opinion on this point, which is why we spun Quantivly out of Boston Children’s Hospital: imaging providers will benefit from outsourcing this middleware interface to a technological partner charged with maintaining the infrastructure layer, rather than rebuilding a solution internally. We have seen this lesson learned the hard way: internal tools often produce a “worst of both worlds” scenario, consuming significant resources (time, talent, capital) for development and maintenance while delivering subpar performance and reliability, lacking the flexibility to meet future challenges, and missing out on the innovations and expertise specialized technological partners bring.
To help hospital and radiology IT teams avoid this pitfall, we outline below the primary reasons why a partnership is likely your best bet, both in the short and long term.
1. It’s not just about aggregating data in a data store
If the problem were just gathering data in a common data store, things would be easy. But if you want useful data, the problem extends far beyond this. It requires a complex process of cleaning, harmonizing, combining, deriving, and effectively organizing data to transform it into the trustworthy, reliable source you need to generate meaningful insights.
- Cleaning – Raw data often contains errors, duplications, and inconsistencies. To ensure data reliability, complex guardrails and algorithms need to be developed to both detect erroneous data and rectify it.
- Harmonizing – Vendors have many unique variations of each “standard” (e.g., DICOM, Enhanced DICOM, Siemens Mosaic, Siemens XA, all GE flavors of DICOM). To ensure compatibility and seamless integration, data harmonization is crucial.
- Deriving – Key operational concepts are often absent from the raw data and need to be derived. For instance, DICOM, a storage format, lacks the concepts of “examination” and “acquisition”. It was designed for a different purpose, around the concepts of “series” and “study”, which are not directly useful for operations (e.g., measuring how long acquisitions and exams truly take, or avoiding double-counting when data is duplicated). Recovering “acquisition” and “exam” from raw DICOM requires complex algorithms, and sometimes AI, before the data can be used for operations management.
- Combining – Data is most powerful when combined, for example, to compare “what was scheduled” to “what was performed”, allowing in turn the derivation of new key metrics (e.g., slot utilization ratio, delays, etc.). This requires a deep understanding of data standards, their limitations, the data relationships, and technical solutions to cross-reference between data, even when explicit links are missing.
- Organizing – Lastly, the data needs to be methodically arranged in a new, scalable data schema designed to support millions of entries, while allowing for easy retrieval and analysis, and minimizing redundancy to protect data integrity.
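To make the “deriving” and “combining” steps above concrete, here is a deliberately simplified sketch. It groups DICOM series into acquisitions by merging series separated by small time gaps, then compares actual scanner use against a booked slot to compute a slot utilization ratio. The two-minute gap threshold, the `Series` structure, and the function names are illustrative assumptions, not Quantivly’s actual algorithms; a production system must handle duplicates, missing timestamps, and vendor quirks.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class Series:
    """Simplified DICOM series: only the fields this sketch needs."""
    uid: str
    start: datetime
    end: datetime

def group_into_acquisitions(series, max_gap=timedelta(minutes=2)):
    """Heuristic 'deriving' step: merge temporally adjacent series
    into acquisitions. A new acquisition starts whenever the gap
    since the previous series exceeds max_gap."""
    ordered = sorted(series, key=lambda s: s.start)
    acquisitions, current = [], []
    for s in ordered:
        if current and s.start - current[-1].end > max_gap:
            acquisitions.append(current)
            current = []
        current.append(s)
    if current:
        acquisitions.append(current)
    return acquisitions

def slot_utilization(scheduled_start, scheduled_end, acquisitions):
    """'Combining' step: scanner time actually used ("what was
    performed") divided by the booked slot ("what was scheduled")."""
    used = sum((acq[-1].end - acq[0].start).total_seconds()
               for acq in acquisitions)
    booked = (scheduled_end - scheduled_start).total_seconds()
    return used / booked if booked else 0.0
```

Even this toy version shows why the derivation matters: without grouping series into acquisitions, duplicated or split series would silently inflate the utilization metric.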
Without trust in your data, its value diminishes drastically. Each stage of this process calls for a unique skill set, and any misstep can undermine data integrity and usability. In essence, your outcomes are only as good as the data you can trust.
2. The development of a data layer requires expertise in many different domains
Building a unified data layer for imaging operations is not just an IT challenge. It requires a unique combination of expertise and skills that are seldom found within a single team.
First, this isn’t simply about articulating radiological concepts to a group of programmers. The programmers themselves need deep industry expertise, such as the physics of MR and CT and the intricate details – and limitations – of healthcare protocols to design the right system.
Consider these complexities:
- How do you reverse-engineer private DICOM tags for all vendors and software versions? How do you know whether you have the right information?
- How do you recover acquisition duration, so central for many metrics, even when the information is not encoded in DICOM?
- How do you robustly associate HL7 orders with DICOM data? Does it still work when the technologists move the patient to a different scanner? Or with combo exams? (Do your programmers know what a combo exam is?)
- How do you transform the DICOM data model, an “entity-attribute-value” model, into a relational data schema that can accommodate every DICOM dataset ever produced? You have to capture 100% of acquisitions, or else the recovered exam timelines will have holes and your metrics will be wrong.
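The last point, going from entity-attribute-value to relational, can be illustrated with a minimal sketch: pivot a curated subset of DICOM tags into typed columns, and preserve everything else (including private tags) in an EAV side table so no data is silently dropped. The table layout, the `ingest` helper, and the curated tag list are illustrative assumptions only; the two standard tags used ((0008,0050) Accession Number and (0008,0060) Modality) are real, while (0019,100a) stands in for an arbitrary private tag.

```python
import sqlite3

# Hypothetical curated mapping: standard tags promoted to typed columns.
CURATED = {"0008,0050": "accession_number", "0008,0060": "modality"}

def ingest(conn, sop_uid, tags):
    """Pivot curated (tag, value) pairs into relational columns;
    keep unrecognized tags in an EAV side table."""
    cols = {"sop_uid": sop_uid, "accession_number": None, "modality": None}
    leftovers = []
    for tag, value in tags.items():
        if tag in CURATED:
            cols[CURATED[tag]] = value
        else:
            leftovers.append((sop_uid, tag, value))
    conn.execute(
        "INSERT INTO instance (sop_uid, accession_number, modality) "
        "VALUES (:sop_uid, :accession_number, :modality)", cols)
    conn.executemany(
        "INSERT INTO extra_tag (sop_uid, tag, value) VALUES (?, ?, ?)",
        leftovers)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE instance (sop_uid TEXT PRIMARY KEY, "
             "accession_number TEXT, modality TEXT)")
conn.execute("CREATE TABLE extra_tag (sop_uid TEXT, tag TEXT, value TEXT)")
ingest(conn, "1.2.3", {"0008,0050": "ACC123", "0008,0060": "MR",
                       "0019,100a": "vendor-private"})
```

The design choice here is the crux of the question above: promoting every tag to a column is impossible (vendors invent new private tags constantly), while keeping everything in EAV form makes queries slow and brittle; a hybrid like this is one common compromise.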
If your in-house team has these skills, excellent! That’s part of the basic requirements. However, it’s not all; you also need expertise in:
- Algorithm development: to create efficient processes for data cleaning, harmonization, and analysis.
- High-performance database design: to handle massive datasets at scale and ensure swift queries even as your database grows.
- Parallel programming: to enable efficient computation and handle the continuous stream of incoming data.
- Memory management: to prevent your platform from falling apart under the data influx.
- Security: to protect sensitive patient data from breaches and maintain patient trust.
- Compliance: to ensure your systems adhere to all relevant healthcare regulations and standards.
- Infrastructure design: to provide a robust and reliable framework that supports all data operations and allows for smooth platform upgrades that do not “miss some data”.
- Platform observability: to continuously monitor and optimize system performance.
Each of these domains is a specialized field, and assembling a team with expertise in all these areas involves substantial investment in time, training, and resources.
Technology companies focusing on healthcare data management have refined these skills over many years in the field. They can harness their collective experience to deliver carefully designed solutions that work effectively right from the start.
3. Economies of Time and Cost: A Strategic Advantage
Creating a robust, efficient, and secure data layer is no simple task. By collaborating with a specialized data company, organizations gain immediate access to years of refined expertise and dedicated resources, translating into significant economies of time and cost.
A technology company rooted in healthcare data management has already navigated the complex pathways of data integration and faced down numerous challenges. By drawing on that experience, you can avoid the pitfalls they have already encountered and solved. This not only mitigates the risks associated with deploying a new system but also accelerates the process significantly.
The result? You can benefit from the data on day one, focusing on what truly matters: providing better imaging care to more patients through data-driven strategies.
Importantly, the development of such a system isn’t a one-off task. It requires a continuous commitment through regular maintenance, updates, and bug fixes. An external provider, equipped with a dedicated team, is solely focused on these tasks, ensuring your system is always optimized and up-to-date with the latest advancements.
Finally, a company specializing in data solutions can distribute the cost of development and maintenance across its entire user base. This economies-of-scale approach results in more affordable solutions, helping you achieve cost efficiency while benefiting from a proven system with robust data security, streamlined workflow, and continuous access to the latest technological advancements.
In essence, partnering with a technology company for your data needs not only saves you significant time and costs, it also gives you an opportunity to benefit from what was learned across many sites while enhancing your operational efficiency.
4. Scalability and Future Proofing
An in-house solution may serve your current needs, but it might not be scalable or adaptable enough to accommodate future growth or changes in the industry.
Engaging with a company whose core business revolves around data ensures that your platform will continually evolve and improve. As your organization grows and your data needs become more complex, you won’t be left scrambling to scale up your internal solution – your data partner will handle this for you, ensuring seamless adaptation and performance regardless of the volume or complexity of your data.
Even more importantly: these companies are at the forefront of technological advancements in the data realm. While your team might be busy implementing and maintaining an in-house solution, data-centric companies like Quantivly are already applying AI to push the boundaries of what’s possible. They continuously invest in research and development, finding innovative ways to leverage the power of AI and machine learning for even more insightful and actionable data. For example, at Quantivly, we will soon release AI models that grade image quality on the fly from pixel data or label organs in images. This means you will be able to use those descriptors in any query, without having to find AI engineers to develop them for you.
So, partnering with an imaging data company doesn’t just mean access to an exceptional data layer on day one. It also means future-proofing your operations. You’ll benefit from all their innovative developments in real-time, making your system perpetually cutting-edge, and giving you a significant competitive advantage in an industry that continues to evolve rapidly.
The digital transformation is not a trend – it’s the new normal. Yet, oddly enough, imaging providers have not been able to tap into their data to improve operations – the operational data is simply fragmented and inaccessible. Providers urgently need a unified data layer that integrates different data sources and creates, for the first time, an unbiased, vendor-agnostic, and objective source of knowledge about imaging operations.
While the idea of building an in-house data solution may initially seem attractive, the inherent complexities, significant time investment, substantial cost, and ongoing maintenance required most often outweigh the perceived benefits. In contrast, partnering with a specialized data company like Quantivly not only mitigates these challenges; it also grants immediate access to reliable, trustworthy, and actionable data, all while potentially saving considerable time and cost and benefiting from the insights we’ve gathered across our customer base.
Importantly, such partnerships offer more than just a solution to immediate needs – they provide a strategic advantage. A data company like Quantivly, continuously innovating and pushing the boundaries of data technology, ensures that your operations stay at the cutting edge, giving you the leverage to enhance patient care through data-driven strategies.
Ultimately, the decision isn’t merely about solving a current need; it’s about strategically positioning your organization for success in the increasingly data-driven future of medical imaging and imaging operations.
CEO @ Quantivly
Quantivly’s mission is to provide better imaging care to more patients. We do this by unlocking data, and making it accessible to imaging providers to make data-driven decisions and improve imaging operations. It’s time to replace an ad-hoc and reactive approach with an engineering mindset. The result is a win-win-win for staff, the department, and most importantly, patients.