The North American Spine Society (NASS) 2018 event that took place in September seemed quite similar to the ones I’ve attended in previous years. The overwhelming focus (on the expo floor, anyway) was on technologies and solutions involved in spinal fusion procedures. From metal, plastic, and hybrid implants to biologic-based fillers, the company booths showcased mostly familiar technologies, with only a few exceptions in the way of new innovations or product releases.
Fortunately, if you really work the aisles, you can find newer offerings from companies that separate themselves from the pack. Attending for all three days certainly presents that opportunity. And this year, the reward for the effort was locating vendors highlighting the growing field of virtual reality (VR). Better still, VR gives completely untrained laymen like myself (at least in the surgical sense) the opportunity to try out a technology without worry. In fact, given my propensity for video games since childhood, I may have even held an advantage over some of the surgical professionals who also attended the event. Well…maybe not. But it still made for an exciting experience.
The first company I visited was ImmersiveTouch to sample the experience provided through the company’s ImmersiveView system. According to its website, the firm focuses on “improving patient safety and satisfaction by using cutting edge technology to enhance the surgeon’s work flow efficiency and patient outcomes.”
The equipment used when interfacing with the ImmersiveView system included the VR headset and a controller for each hand. The handheld devices were surprisingly intuitive, and with relatively little instruction on which button did what, I was flying along in moments, reviewing a 3D rendering of a patient’s spine. The system uses CT or MRI data to create the visual effect. Further, I was able to “peel away” layers of the patient anatomy with ease, giving me an untrammeled view of any area I wanted to focus upon. In addition, I could identify points or regions I needed to measure, enlarge, or reduce in a certain segment, or even travel “within” a portion of the patient anatomy to better understand a problem.
While ideally you’d have the opportunity to “drive” the technology yourself, the company offers a quick video demo of it.
Moving along, I headed over to another VR vendor, Augmedics. This system wasn’t so much a VR technology as an augmented reality innovation. The company is involved with surgical navigation and instrument placement in real time, leveraging the capabilities of the technology (more specifically, a headset and visor) to enable a more natural approach to viewing the patient anatomy during a surgical procedure.
In most instances, a surgeon looks at a monitor away from the surgical field to see the pathway or placement of an instrument. While surgeons become accustomed to this technique, the positioning is certainly awkward and unnatural. It would undoubtedly be better if the surgeon could look down at the surgical field where his or her hands are working. Augmedics’ xvision allows for exactly this. The overlay the surgeon sees through the headset’s visor provides a “look inside” the patient, and the instrumentation can be viewed in real time as well.
While the technology is not yet approved, those interested in seeing how it works should view the company’s video.
The last company I visited was Fundamental Surgery. The system here was a surgical simulator that provided a completely risk-free environment for physician training, but honestly, that’s completely understating the innovation. Without question, this technology left the most lasting impression on me. Since this was the NASS event, the simulation was for a spinal procedure. With the headset in place, I took hold of the handpiece controls and began to perform a spinal fusion (the early stages of one, anyway). The true impact of this technology came when, through the instrumentation, I “felt” the patient’s spine. I could run an instrument across a vertebra and “feel” the bumps on the bone. Further, when I touched another area of the patient’s anatomy, the feedback I received was different from the feel of the bone. The system’s incorporation of haptics was remarkable. Quite honestly, for someone who’s never performed any sort of surgical procedure, I found the experience unsettling. That said, for a surgical student or even a seasoned veteran, that type of experience would likely be invaluable for training on a new procedure.
Although it hardly does this technology justice, you can get a better idea of how it works from the company’s video. If you have the occasion, however, I strongly recommend trying this system in person.