In spring 2019, Dicom Systems collaborated with WinguMD to host a webinar about how the health care industry is working to make imaging workflows more accommodating of mobile devices. A representative of Nicklaus Children’s Hospital was also on hand to give practical feedback. The participants discuss how personal phones and other mobile devices can be woven into the existing infrastructure of imaging systems, and they speculate about where the future of imaging will take medicine.

Featuring

  • Florent Saint-Clair, EVP, Dicom Systems
  • Manabu Tokunaga, CEO/Co-Founder, WinguMD, Inc
  • Nolan Altman, MD, Pediatric Neuroradiologist, Chief, Department of Radiology (Nicklaus Children’s Hospital)

Florent Saint-Clair (FSC): So the context for the conversation today is a really well-admired hospital in Miami. Nicklaus Children’s Hospital has been highly innovative and creative in the care they put into using technology to advance patients’ health, especially children’s. Some of the most interesting innovations we’ve done as a company, as Dicom Systems, have involved Nicklaus Children’s, because they push the envelope pretty routinely. We were fortunate to be introduced to Nicklaus Children’s by our partners at Cerner, a company that is completely committed to the world of interoperability. Their commitment to interoperability is what’s allowed us to work some wonders in clinical contexts that are typically closed off to other vendors. And so this has been thanks to the Chief Nursing Information Officer, Elise Hermes, and of course, the leadership and stewardship of Dr. Altman in the adoption of this platform to advance some of the clinical workflows we’re gonna be looking at today.

Optimizing Personal Devices for Medicine

Our typical customers at Dicom Systems are PACS admins, network admins, directors of IT, CIOs, people who are concerned with the enterprise infrastructure that’s serving up the images for the physicians. Over the past 11 years, Dicom Systems has grown to the point where we now pretty routinely run about 9 billion images through our systems annually. Some of our largest installations do as many as 5,000 daily commits of exams to our archives. We also power about 12 million annual Telerad reads, not counting relevant priors on top of that number, and we’ve also increasingly gotten involved in workflows involving machine learning. Some of the largest we’ve done had as many as 5.5 million exams that needed to be de-identified before they could be fed into a machine learning algorithm to learn new things. Using this platform as kind of a universal adaptor has allowed us to essentially plug in the ZenSnap app that we’re gonna be spending our time on today, along with Dr. Altman and Manabu Tokunaga. This interoperability layer plugs into Cerner and into virtually any enterprise imaging node that needs to talk to other nodes, whether that’s over HL7, FHIR, or DICOM. The interoperability layer is the Dicom Systems Unifier Platform.

One of the most important things to consider in this conversation is the role of mobile devices in today’s hospital context. All of us, physicians included, are using smartphones for virtually everything, personal and professional. Physicians are extremely resourceful individuals, and it doesn’t matter whether a hospital has sanctioned a method of communication or not; they’re gonna do whatever is best for their patients, even if that includes using a mobile device that doesn’t belong to the hospital. Everybody knows that happens every day, and so that’s one of the key problems we solve with this platform. These are some of the issues that plague not just enterprise imaging, but every department that uses imaging one way or the other. Number one, unauthorized and unsecured personal mobile devices. That means iPads, iPhones, Android devices. These are devices that typically belong to the individual using them. And in some cases, believe it or not, even with all the HIPAA education that goes on every day, a lot of people still use email, text, or other unsanctioned methods to share images with one another. Another key issue is that those mobile devices have zero enterprise integration, so the EHR functionality and the PACS functionality are siloed away from the mobile devices people actually carry.

Imaging with Mobile Devices

How do we make this a seamless integration, where everybody wants to use mobile devices but they’re not doing it in a controlled environment? One of the key layers is the existing infrastructure. Most hospitals have LDAP or Active Directory, they have a PACS, they have an EHR. In this case, Cerner provides both the EHR and the PACS at Nicklaus Children’s. If you’re gonna be doing mobile device imaging, you don’t want to have to completely reinvent the infrastructure. The hospital has already invested a lot of capital in the acquisition and customization of the EHR and the PACS. So to accommodate mobile devices, we have to be unobtrusive, we have to be benign in our impact on the existing infrastructure. The ability to use DICOM Modality Worklist has typically not been available to mobile devices, because an iPhone or an iPad is not a typical modality that gets its worklist from a RIS or a PACS. We need to be able to reduce human error in data entry on a mobile device by giving the mobile device access to a worklist. Another key aspect of this platform is that you can download the app from iTunes today for free. It leverages an app paradigm that is very natural for people to learn, so there’s virtually no end-user adoption curve. We started in radiology, but adoption is now pervasive throughout the hospital; any other department that could be using this now wants to, because it’s such a natural toolset to use with no training. This is also sanctioned by the hospital, so the CIO doesn’t have to think twice about it. The fleet of devices now belongs to the hospital: the iPads were purchased by the hospital and deployed to the physicians and technologists to do their job with. Security and compliance are taken care of because these are the tools being given to the physicians and caregivers by the hospital. You can actually view any type of modality that is available through the integration with the PACS. And by leveraging the DICOMweb standards, WADO-RS, QIDO-RS, STOW-RS, all of which are deployed through the Dicom Systems Unifier, everything is available through this user interface. I’d like to give the mic to Manabu to speak to this a little bit.
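
To make the DICOMweb piece a little more concrete, here is a minimal sketch of what a client-side QIDO-RS search and WADO-RS metadata retrieve can look like in Python. The base URL, patient ID, and lack of authentication are made-up placeholders for illustration, not the actual Unifier endpoint or the ZenSnap implementation.

```python
import requests

# Hypothetical DICOMweb base URL; a real deployment would point at the
# hospital's Unifier/PACS endpoint and use proper authentication.
BASE = "https://dicomweb.example-hospital.org/dicom-web"
HEADERS = {"Accept": "application/dicom+json"}

# QIDO-RS: search for ultrasound studies belonging to a patient by MRN.
studies = requests.get(
    f"{BASE}/studies",
    params={"PatientID": "12345678", "ModalitiesInStudy": "US"},
    headers=HEADERS,
    timeout=30,
).json()

for study in studies:
    study_uid = study["0020000D"]["Value"][0]  # StudyInstanceUID
    # WADO-RS: retrieve the study metadata; pixel data could likewise be
    # fetched from f"{BASE}/studies/{study_uid}", and new instances stored
    # back with a STOW-RS POST.
    metadata = requests.get(
        f"{BASE}/studies/{study_uid}/metadata",
        headers=HEADERS,
        timeout=30,
    ).json()
    print(study_uid, len(metadata), "instances")
```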

Manabu Tokunaga (MT): Thank you very much, Florent. I have been in this industry for a long time, and throughout all this time there have been two key problems I was always asked to solve: how can I save time, and how can I accurately communicate what I’ve seen to other people? Then mobile communication came along, which is really perfect for addressing the ‘save time and accuracy’ part, and now the security part is coming along as well. So that’s how I’ve been making an effort to develop this into a completely cohesive package that can be used in clinical situations. And down the line, we’re also starting to add the patient engagement part, which would make it very easy for the patient to provide information as well as the clinical side, because pretty much everybody has an Android or an iPhone. So this would be a perfect platform to address all three of these issues. What people want in a clinical setting is this: right now they come in in the morning, have a rounds conference, and talk about the case, but what they really want is for that communication to continue throughout the day. And that’s the core of this application.

Communication flowchart for how mobile supports patient care continuum

Department Collaboration Through Imaging

FSC: This is really the crux of our conversation. When we started discussing this project with Nicklaus Children’s, I asked, why would radiology need photography in the practice of diagnostics? Radiology uses ultrasound, MRI, nuc med. They use internal medical imaging to articulate the diagnosis. Why do you need photography? You have a radiology workstation, you have a PACS, you have medical images open on the desktop, you’re projecting images on the wall, and side by side, you now have all of these physicians collaborating on problematic cases, able to see not just the internal images they need to discuss, and the next steps in the care of the patient, but also the outward manifestation of symptoms. Taking a picture of the patient, taking a picture of the skin, in addition to showing the medical images, provides a very effective, holistic approach to collaborating in that room. What was really interesting for us to witness is how our technology was being deployed in a very innovative way, in a clinical conference. At any tumor board, any clinical conference, people get together for an hour or 45 minutes in the morning, and the collaboration is typically very effective because they’re all in the same room, in the same context, looking at the same information. Then at 8:00 AM everybody gets up, they go do their jobs on different floors, they attend to their patients, and the collaboration that was so good for an hour gets stopped dead in its tracks. Now, using this platform, the collaboration continues throughout the day. It’s almost like you have a Twitter feed on each patient case that was discussed during the clinical conference. Dr. Altman, this would be an interesting point for you to bring to the audience.

Dr. Nolan Altman (NA): Thank you very much, Florent. I’m glad to be part of this program, because we’re glad to have you under the hood as far as our Dicom collaboration goes, allowing us to do many things through your intricate rules engine that Cerner wouldn’t quite allow us to do. And now this new endeavor with Manabu lets us build on things we have been doing for some time, beyond our daily 7:00 AM meetings with the residents and the attendings, which we hold across the different disciplines in our department to help everyone stay on the same page. I think all of us realize that our communicator device is our cell phone, and that’s where everything really resides. I’ve been using my personal cell phone to communicate about a lot of different things regarding imaging. From the resident shooting iPhone pictures off of the PACS workstation to me at home, to the other attendings within the hospital shooting me pictures of cases they may see at other hospitals, everything ends up residing in the iPhone. So what you all have allowed us to do is to do this the proper way: off of our Dicom-integrated PACS worklist, we have the ability to have images all reside in the patient’s worklist, so we can see them quickly and share them with our other colleagues. And that’s kind of what I wanna show. For quite some time, we’ve been documenting the patient’s skin over the areas of concern. I’ll show some cases, anywhere from looking along the patient’s spine to see if there are underlying neurocutaneous disorders, to patients having interventional procedures. Not only at the hospital, but at our 12 satellite centers where ultrasound exams are being done, once the technologists can see the lesion, we get not only the ultrasound images, and at some of our satellites MR images, but also the clinical pictures, and we can put them all together, say as another pulse sequence in the case of MR images. We just drop it all into the PACS system, and we can see all of the images of the patient at one time.

Mobile Devices vs. Specialist Medical Imaging Equipment

FSC: So Dr. Altman, there’s actually an important distinction in the way images get created on a mobile device versus an ultrasound. Ultrasound, CT, MR, nuc med, all of those devices typically use something called a DICOM Modality Worklist in order to know what’s next, who’s the next patient, and to get the proper demographics from the EHR automatically populated in the DICOM headers. It’s a little bit different when you’re dealing with a mobile device. Historically, when you were using a mobile device, you didn’t really have a place to put metadata into the images, so you would have orphan images that had to be manually handled, Dicom-ized, and then placed into the PACS. That also means you have a different kind of workflow that doesn’t involve the worklist, and that’s called an encounter-based imaging event. Encounter-based imaging is a totally different sport, but it can utilize the same underlying infrastructure to provide a worklist in order to reduce human error in data entry. On an iPhone, it would be very easy to introduce typos in the patient’s name or other fields. By providing a worklist to an iPad or an iPhone, as if it were any other conventional imaging equipment in the hospital, you dramatically reduce the amount of time it takes not just to begin an encounter, but also to make it ingestible by the PACS and available to the rest of the enterprise through the PACS viewer. So those two different types of workflows are interesting to discuss, and I’ll let you talk about that.

MT: You’ll be hearing about the IHE encounter-based workflow, and Nicklaus Children’s is one of the earlier places that’s gonna be able to do this, mainly because of the flexibility of the Dicom Systems Unifier. As a ZenSnap user, you just look at it as if it’s a Dicom Modality Worklist. The Dicom Systems Unifier will get all these encounter events, like ADT events, as well as the radiology events, and make it work the same way: you look for the patient, take the photo, and we’ll just route it to the right place, either to your Podcharts folder or into the Dicom series, like another pulse sequence. So this is gonna be a very, very flexible solution, and it’s not just mobile photo capture. We do a lot of this kind of work, and then we integrate it with other AI tools behind the scenes, to make the context available much more easily and sooner.
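
As a rough illustration of what “looking at it as if it’s a Dicom Modality Worklist” can mean for a client application, here is a minimal sketch of a worklist C-FIND query using the open-source pynetdicom library. The AE titles, hostname, port, date, and station name are hypothetical placeholders, not the actual Nicklaus Children’s or Unifier configuration, and a real mobile integration would sit behind the Unifier rather than querying a RIS directly.

```python
from pydicom.dataset import Dataset
from pynetdicom import AE
from pynetdicom.sop_class import ModalityWorklistInformationFind

# Hypothetical worklist SCP address and AE titles.
ae = AE(ae_title="MOBILE_SCU")
ae.add_requested_context(ModalityWorklistInformationFind)

# Build the worklist query: ask for today's scheduled steps for one station.
ds = Dataset()
ds.PatientName = ""   # return keys: filled in by the worklist SCP
ds.PatientID = ""
sps = Dataset()
sps.ScheduledStationAETitle = "IPAD01"
sps.ScheduledProcedureStepStartDate = "20190401"
sps.Modality = "XC"   # external-camera photography
ds.ScheduledProcedureStepSequence = [sps]

assoc = ae.associate("worklist.example-hospital.org", 104, ae_title="RIS_SCP")
if assoc.is_established:
    for status, identifier in assoc.send_c_find(ds, ModalityWorklistInformationFind):
        # Pending matches carry demographics already populated from the EHR,
        # so the mobile app never has to re-key patient name or MRN.
        if status and status.Status in (0xFF00, 0xFF01) and identifier:
            print(identifier.PatientName, identifier.PatientID)
    assoc.release()
```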

FSC: You can take a picture of a document, and OCR will make that document searchable on your mobile device.

MT: Yeah. Not only searchable, but we can match that with the MRN; we can locate the MRN, accession number, order number, that sort of thing.
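
The exact matching logic inside ZenSnap isn’t described in the webinar, but as a loose illustration, the step Manabu describes can be thought of as OCR followed by identifier extraction. The patterns below are hypothetical MRN and accession-number formats, not a real hospital’s conventions.

```python
import re

# Hypothetical identifier formats; a real deployment would use the
# hospital's actual MRN and accession-number conventions.
MRN_PATTERN = re.compile(r"\bMRN[:\s]*([0-9]{6,10})\b", re.IGNORECASE)
ACCESSION_PATTERN = re.compile(r"\bACC(?:ESSION)?[:\s#]*([A-Z0-9]{8,14})\b", re.IGNORECASE)

def extract_identifiers(ocr_text: str) -> dict:
    """Pull candidate patient identifiers out of OCR'd document text."""
    mrns = MRN_PATTERN.findall(ocr_text)
    accessions = ACCESSION_PATTERN.findall(ocr_text)
    return {"mrn": mrns[0] if mrns else None,
            "accession": accessions[0] if accessions else None}

# Example: text as it might come back from an OCR engine.
sample = "Nicklaus Children's Hospital  MRN: 00456789  Accession # RAD20190412"
print(extract_identifiers(sample))  # {'mrn': '00456789', 'accession': 'RAD20190412'}
```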

FSC: Nicklaus Children’s still owns all those Nikon cameras, which they still use occasionally. I think you can buy a $40 Bluetooth adapter for the Nikon camera, connect it to the iPad, take pictures, and still make them available through the same mechanism.

MT: Not only that, you can add an otoscope, a fundoscope, even a microscope as an attachment on your iPhone, and you’d be able to do a much wider range of photo acquisitions, all backed by the modality worklist, all ending up in the right place, right patient, right chart.

NA: As the end-user, I think it’s very important that we have it all organized, but we just wanna see it quickly, and we wanna see it accurately, and that’s what you’ve allowed us to do. While we’ve had many devices that we can use, I think the common worklist, and the applications of your rules engine that allow us to do that, are what’s really pushing this product forward.

Use Cases for Mobile Devices with Imaging

FSC: We’ll leave the floor now to the physician in the room. Dr. Altman, these are the cases that you leverage this technology for.

NA: This is a little girl who was seen in an outside imaging center, and she has these different spots on her scalp. The technologist knows that if any patient has a lump or a bump, she needs to take a picture of it for us so we know what we’re looking at. This little girl has these little spots on her scalp, and they’re red, and they’re raised, and we can see that, quickly and cleanly, with this picture the technologist at the satellite center has taken of the little girl who came in, and she sent it along with the ultrasound pictures. On the ultrasound that was done, the left side is a gray-scale ultrasound, and the right side is a color Doppler study of the same area. You can see there’s a lot of color in here, and we know, as radiologists, that this indicates a vascular malformation. There are different types of vascular malformations that you can see: venous malformations, hemangiomas. But with the picture of the child, we know for sure that this is a hemangioma, and there’s nothing to do about these things. They’ll go away, and we can tell the mom and dad before they leave the imaging area, “This is what it is. You don’t have to worry about it.”

Another patient comes in and has a little lump, and we had the technologist do the ultrasound and send it to us. When we see the clinical image and compare it with this color ultrasound image, which may look the same as that of a hemangioma, we know in an instant that this isn’t one. This is an infection. It’s an abscess. It needs to be drained. And we then went on to IR, interventional radiology, and drained the abscess on this baby.

So here’s a boy who comes in with a bit of a lump along the side of his jaw. We had the ultrasound done. We did the X-ray to make sure it wasn’t a bone lesion, which we didn’t see. And we actually got a CT, and you could see there’s a little soft tissue swelling and this lesion that’s under his jaw bone. When we saw that, we went ahead and did an ultrasound. We see that there is a lump. The grayscale shows that it’s relatively hypoechoic, but it has no vascular pattern to it. So we know that this is not a vascular lesion, but we weren’t quite sure what it was. There are some lesions, such as cysts and other low-flow lesions, that this could possibly be, and it looks more like a cystic structure. But the question always is, could this be a tumor? Is this a cyst, either congenital or acquired? So in this case, we did go on and do an ultrasound-guided needle aspiration. We turned it over to the interventional radiologist, and once we did that, we could document it, which we did, and we can do this all seamlessly within the system that you all have helped develop with us. So we actually have a document of the aspirate that we retrieved, and when we looked at it, it’s mucus material. And so we know what this is; it’s called a “ranula”. It’s an obstruction of a duct, and it fills up with mucus. It’s actually an obstruction of the sublingual duct, and we can then safely refer this to our ENT colleagues to have it removed. The reason we do this is that there are other things, such as a lymphatic malformation, where we would go in directly after we do the aspirate, fill it with a sclerosant material, and obliterate it that way. So this helps us to determine not only what the lesion is, but how to treat it. These are just a few examples of how, yes, as radiologists who sit in a box, we actually like to see our patients, we like to document what we see with our patients, and it helps us not only to diagnose, but also to treat.

FSC: The reality is, in a pediatric environment, the physicians and nurses and technologists don’t just treat the infant or the child. The parents are also patients, in a sense; they are the ones making the decisions for their kids. And so these communication tools make it very effective for Dr. Altman and his team to communicate to the parents what’s happening with their child. Quite a few aspects of this technology were discussed today, not just on the clinical side, but also on the IT side, the security side, all of which can present some interesting points of discussion. One area that we didn’t discuss in detail is the AI possibilities presented by this technology.

Web-based Image Processing

MT: There are a couple of next phases we are going to look at. One is, of course, the web-based piece, which is actually going to integrate the Dicom Systems web-based radiology directory into the app. Since we already have the context of which patient you’re looking at, it would be easy to actually show all of the relevant radiology images. This also has an additional benefit: sometimes you may have PACS downtime, or for whatever reason you cannot access the PACS, in which case you could use this as a backup mechanism to look at the images as they’re acquired. We are going to show the images as they hit the Dicom Systems server, which is the first stage. Of course, we pull the priors, too, for the given patient. Annotation on screen you have already seen, so you can actually draw arrows, circles, and that sort of thing, which of course you are familiar with. We also have a patented size measurement algorithm, so that with the aid of a specifically coded ruler, you’ll be able to actually measure the size of the actual object. I know that some dedicated wound care users have more detailed 3D-type equipment, but essentially, you can turn your iPhone or iPad into a viable size-measuring device at the kind of price point you pay for them. So that’s really great. Facial AI: we are actually going to do the… looking at the various faces. When you walk into the doctor’s office, many doctors can tell you right away what’s wrong with the patient. And so we’re gonna bring that excitement in terms of AI. By taking the photo, it can tell what the patient might potentially have, so it will aid in triaging and that kind of diagnostic support. And then we have the Alexa of AI technology. Using our chat mechanism, you can just talk to the engine, and the engine can answer questions like: what’s the blood cell count, is the lab complete, is infection indicated. You can just ask questions rather than going into the EHR and trying to find the answers yourself. It’s voice-based, natural-language, smart processing. Another exciting thing that Nicklaus Children’s is getting.
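
The patented size measurement algorithm itself isn’t public, but the general idea of calibrating a photo with a reference object of known size can be illustrated with a minimal sketch. Everything here, including the 50 mm ruler width, the pixel coordinates, and the function names, is a made-up example of the principle, not WinguMD’s actual method.

```python
import math

def pixels_per_mm(ruler_px: float, ruler_mm: float = 50.0) -> float:
    """Scale factor derived from a reference ruler of known physical width."""
    return ruler_px / ruler_mm

def lesion_size_mm(p1, p2, scale: float) -> float:
    """Convert a pixel-space distance between two tapped points into millimetres."""
    dist_px = math.dist(p1, p2)
    return dist_px / scale

# Example: the coded ruler spans 400 px in the photo and is 50 mm wide,
# and the clinician taps two points across the lesion.
scale = pixels_per_mm(ruler_px=400.0)                      # 8 px per mm
print(round(lesion_size_mm((120, 340), (184, 388), scale), 1), "mm")  # 10.0 mm
```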

FSC: Interesting use cases. We did get a couple of questions, by the way; we’ll go through them in a minute. But one interesting project at Nicklaus Children’s that you completed involved a lot of historic pictures that were taken by various physicians, including in plastic surgery and reconstructive surgery. The ability to Dicom-ize and process those, categorize them, and place them in a metadata-rich environment is also important, right? Going forward you can use the new app, but there are also a lot of historic images that can be handled through this platform.

MT: Correct. Imagine that your parents one day give you a box full of loose photos and say, “This is for you. You do whatever you want to do with it.” Now you want to put them in an album and put them… So our plastic surgeon at Nicklaus Children’s gave us the equivalent of that, and my task was to put them in the right place. Because they’re digital images, of course, we can tell which clinic they came from. Luckily, they also took a picture of the chart, so I can OCR it and fish out the information. We can get about 90% accuracy in sorting out the images; in just a matter of half an hour, we were able to sort out about 2,000 images. If anyone in the audience has a bunch of loose photos that have to be handled as part of a migration, do talk to us. We have the technology to help you with that.

To continue on to the Q&A section, access the full webinar recording here >