Designing a Hybrid or Robot Room takes intense staff participation, coordination, and feedback to incorporate clinical workflow and best practices, build in future flexibility, and manage expectations. Here are some lessons learned from a recent project:
1. Define the guiding principles of the room's intent and design. Keep these guiding threads in view throughout the project to avoid defaulting to current staff preferences, to improve current workflows, and to incorporate flexibility.
2. Conduct a mock surgery that brings all staff and equipment into a simulation setting to define and discuss workflow patterns, collaboration, technology details, portable equipment connections, anesthesia needs, etc.
3. As construction begins, plan a strategic leadership workflow workshop that includes key physicians, clinical staff, and vendors (Boom, OR Integration, Equipment Planner, IT, Biomed, Equipment Vendors, etc.) to finalize the functionality matrix, equipment, responsibilities, training, and coordination. (Surprises will come up.)
4. Discuss scheduling priorities and emergency cases (trauma) to create a template for case scheduling and manage physician expectations.
Congratulations Pam Redden – selected for the prestigious MD Anderson Rogers Award. (Employee Notes, 2012)
Pam is being recognized for her contribution as Director of Clinical Operations Development. Pam embodies the mission of the Nursing Institute for Healthcare Design as a clinical advocate, liaison between clinicians and facilities and visionary leader. We salute you for your Nurse Leadership and Clinical Advocacy within the Design and Construction Community.
MIT’s Picard demos emotion monitoring at DESIGNEast, and creeps me out
Written by Patrick Mannion – September 20, 2012
The above examples were but some of the application-level, and sometimes accidental, offshoots of the fundamental research into emotion-aware systems being undertaken by MIT professor Rosalind W. Picard, and demonstrated during her keynote here at UBM's DESIGNEast conference this morning.
Some quick background: Picard is founder and director of the Affective Computing Research Group at the Massachusetts Institute of Technology (MIT) Media Laboratory, co-director of the Things That Think Consortium, and leader of the new and growing Autism & Communication Technology Initiative at MIT. She is also, in her spare time, co-founder, chief scientist and chairman of Affectiva Inc.
Her work, in essence, involves translating indicators of emotional states based on sympathetic nervous system arousal (fight or flight, and emotions) into something that can be acted upon by a computing device or system. These indicators include skin conductance, heart rate, and facial expression.
Why is this so important? As Picard herself said in her keynote, "Emotion is the fourth vital sign," and as many in the medical field are now recognizing, emotional states in many ways affect, and can even determine, physical health. Also, for commercial applications, being able to measure heart rates, smiles, and frowns can improve delivery and feedback mechanisms for digital signage or even live demonstrations, such as keynotes. Today, if such a system were active, Picard would have gotten the green light.
She had the crowd at the first demonstration. In it, she showed how Affectiva’s Q Sensor measures electro-dermal activity (EDA), particularly conductance. That part’s straightforward: skin moisture increases or decreases based on your arousal or general excitation level.
Affectiva's Q Sensor combines electro-dermal activity measurement with a 3-axis accelerometer and wireless connectivity to help, among other things, get insight into the emotional state of autistic children.
In addition, the Q Sensor combines that conductance measurement with a 3-axis accelerometer. This detects regular motion as well as rocking, arm flapping, and other movements characteristic of Rett syndrome or autism. The data can then be wirelessly uploaded via Bluetooth to a computer. The sensor also tracks ambient temperature and other environmental information.
This is all well and good, and it's a clear path ahead to being able to get a bead on someone's emotional state, which, as anyone with experience with autism can attest, is a huge breakthrough in and of itself. Anyone in a relationship could have a lot of fun with this too, but before you go off on a mental tangent, hang on. There's a kicker.
Seizure prediction, say cheese
In a classic case of discovery through chance and observation, Picard had a surprise up her sleeve. It turns out that a student of hers borrowed the equipment to monitor his younger brother. When she looked at the data soon after, she noticed a spike in the readout but could not deduce the cause. She called her student, and it turned out the spike correlated perfectly with the time his brother went into a seizure.
Picard called some experts in the field of seizures; they put two and two together and figured out the system could detect the onset of both grand-mal and petit-mal seizures.
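The spike Picard noticed is the essence of the detection problem: an abrupt jump in skin conductance that stands far outside the signal's normal sample-to-sample variation. A minimal sketch of that idea, assuming a simple standard-deviation threshold on the first difference (the function name, sampling rate, and threshold are illustrative assumptions, not Affectiva's actual algorithm):

```python
import numpy as np

def detect_eda_spikes(eda, z_thresh=3.0):
    """Flag samples where skin conductance jumps sharply.

    eda: 1-D array of skin conductance readings (microsiemens).
    z_thresh: how many standard deviations above the typical
    sample-to-sample change counts as a spike. Both the approach
    and the threshold are illustrative, not a clinical algorithm.
    """
    diffs = np.diff(eda)
    mu, sigma = diffs.mean(), diffs.std()
    # Indices (into the original array) where the jump is anomalous
    return np.where(diffs > mu + z_thresh * sigma)[0] + 1

# Simulated trace: flat baseline, then one abrupt conductance rise
signal = np.concatenate([np.full(100, 2.0),
                         np.linspace(2.0, 8.0, 10),
                         np.full(50, 8.0)])
print(detect_eda_spikes(signal))  # indices land in the rise region
```

A real system would also have to reject motion artifacts, which is presumably one reason the Q Sensor pairs the conductance electrodes with an accelerometer.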
Inversion: The signs are looking at you now
Picard was introduced on stage by Jeff Bier, founder of the Embedded Vision Alliance and president of BDTI. The founding principle of the Alliance is that we've only scratched the surface of how embedded vision can be applied.
In her keynote, Picard demonstrated this clearly, showing how a webcam can be used to detect heart rate by discerning changes in facial color due to the pulsing of the heartbeat.
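The trick behind that demo is that each heartbeat faintly modulates skin color, so averaging a color channel over the face and finding the dominant frequency in the physiological band recovers the pulse. A toy sketch of the idea under those assumptions (not Picard's actual pipeline; the function and parameters are hypothetical):

```python
import numpy as np

def heart_rate_from_green(means, fps=30.0):
    """Estimate pulse (BPM) from per-frame mean green-channel values.

    means: mean green intensity of the face region, one value per frame.
    fps: camera frame rate in Hz.
    Picks the strongest spectral peak between 0.7 and 4 Hz
    (42-240 BPM), the plausible range for a human heartbeat.
    """
    x = np.asarray(means, dtype=float)
    x = x - x.mean()                       # remove the DC offset
    spectrum = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fps)
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_hz = freqs[band][np.argmax(spectrum[band])]
    return peak_hz * 60.0                  # Hz -> beats per minute

# Synthetic 10-second "face color" signal pulsing at 1.2 Hz (72 BPM)
np.random.seed(0)
t = np.arange(0, 10, 1 / 30.0)
fake = 100 + 0.5 * np.sin(2 * np.pi * 1.2 * t) \
           + 0.05 * np.random.randn(len(t))
print(heart_rate_from_green(fake))  # close to 72
```

A real implementation would first need face detection and tracking to isolate the skin pixels; this sketch assumes that step has already produced the per-frame means.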
In another example, Picard demonstrated Affdex, an application of visual analysis that reads emotional states such as liking and attention from facial expressions using a webcam. This gives marketers faster, more accurate insight into consumer response to brands and media.
This reminded me of a conversation I had with Intel last week at its Intel Developers Forum in San Francisco. There we discussed its new Intelligent Systems Framework and how that can be used to connect the systems needed to truly enable the Internet of things.
One of these 'intelligent' things to be Internet-ed is digital signage. Now, instead of advertisers and retailers guessing how many people may have looked at their digital signage ad, they can get measurable results using Intel's Audience Impression Metrics Suite (Intel AIM Suite).
This leverages cameras embedded in the signage that 'anonymously' monitor viewer metrics, such as gender, age bracket, and length of attention, and analyze the data in real time so advertisers can instantly tailor the featured content of digital signs to align with viewer demographics.
Of course, the supposed upside for us with Affdex and AIM is that we get more relevant, customized advertising, which is good, I guess, but there's something just plain creepy about cameras monitoring my eye movements and the data going to some database somewhere. I don't, for one minute, buy the 'anonymous' part. It may start that way, but…
But I digress. On the more fun side, Picard showed an application that examines the faces in a crowd and displays them on a screen, overlaying emotive images on each face to show whether that person was smiling or frowning.
It turns out that when the people in the crowd (it looked like a mall) saw the smiles on their own faces, they smiled wider. And those who weren't smiling actually started to smile, and kept on smiling, leading to an increased sense of well-being. Which recalls Drucker's maxim: if you want people to smile, measure their smiles.
Fifteen minutes after her keynote, I still couldn't get near Dr. Picard. Her talk stimulated the creative neurons of everyone at DESIGNEast. Also, her company, Affectiva Inc., announced they're recruiting, right now. Send in your resume!
You can try out Affdex at the Affectiva site. It'll activate your webcam, so be aware of that before you do. Say "cheese"!
We have had the privilege of working with Lisa Charrin on several recent projects. Her knowledge, client advocacy, and expertise have impressed us greatly. Lisa is an architect, project manager, and equipment strategist. Her understanding of the connection between the design, the clinical, and the equipment is why she is one of the nation's most recognized industry experts in future technology planning. Lisa Charrin will be one of the panelists at our Healthcare Technology Summit on October 9, 2013 at Johns Hopkins Medical Center. Lisa will focus her discussion on creating flexible design strategies for ever-evolving medical technology. Interoperability, OR Integration, and Imaging Equipment will be among the subjects Lisa will touch on at the conference, and she will lead a breakout session to dive further into these topics. Join in our discussions at the Healthcare Technology Summit. For more information visit www.ssr-inc.com/summit.php