Interview with Richard M. Satava, MD FACS, Professor Emeritus of Surgery, Department of Surgery, University of Washington Medical Center, Seattle, Washington.

Dr. Satava has held numerous leadership and management positions in the medical community. Prior assignments and positions include Professor of Surgery at Yale University and a military appointment as Professor of Surgery (USUHS) in the Army Medical Corps, assigned to General Surgery at Walter Reed Army Medical Center, and Program Manager of Advanced Biomedical Technology at the Defense Advanced Research Projects Agency (DARPA). He has also served on the White House Office of Science and Technology Policy Committee on Health, Food and Safety.

His previous positions of leadership and influence in medical societies include membership on the Emerging Technologies, Resident Education, and Informatics committees of the American College of Surgeons (ACS), and now on the Accredited Educational Institutes committee.

Group editor Marty Kauchak conducted a wide-ranging interview with Dr. Satava on topics of interest to our community on July 22, 2013.

MEdSim: Given your scope and breadth of experience and accomplishments in the community, you have certainly “seen the future.” Please share your insights on how medical simulation has evolved.

Dr. Satava: The real change began in 1987 when I started building the first virtual reality simulator. After submitting it for publication a number of times to multiple journals, I finally had the paper accepted; it was titled “Virtual Reality Surgical Simulation – the First Steps”.

In 1992, when I was still on active duty, I received orders to report to DARPA because of the work I had done on simulation and virtual reality mainly, and on robotics (I was on the team that developed the first surgical robot). It was a “detailed” duty assignment under which I was also assigned to the [now closed] Walter Reed Army Medical Center, where I could keep my surgical practice going.

I was a program manager at DARPA. That gave me the opportunity to craft solicitations and provide sufficient amounts of funding in robotics and surgical simulation, and in other areas as well.

In the 14 years I was at DARPA, the program comprised tens of millions of US dollars in surgical and medical simulation – mainly surgical skills and basic technical skills training – because both virtual reality and medical simulation were totally new fundamental sciences for healthcare.

I worked extensively with some of the pioneers in simulation. Jeff Cooper, MD, at Massachusetts General Hospital, was one early expert in team training and communication skills.

And then there was the pioneer in simulation, Dave Gaba, MD, at Stanford University-VA Clinic. He completed all of his simulation on a mannequin whereas the DARPA projects focused on virtual reality.

M: Reflecting on DARPA’s significant, early investments in these technologies, was there an adequate return on investment for the US taxpayer and, as important, the military stakeholder?

Dr. S.: No – with a small yes. And that was because it was very difficult to get the surgical community to pay attention to simulation and training. A fair amount of that was my fault. We had developed simulators initially, and discovered about five to six years into the overall program that it wasn’t about the simulator at all; it was about the curriculum. We finally got very heavily into the critical piece – curriculum development and validation.

The problem we discovered, at least in the first few years, is that every company that was funded found their favorite doctor and did what that doctor said. Time was the main metric that nearly everyone used, but it turns out the education community and, in particular, those responsible for certification – the boards of surgery, urology and others – didn’t care how fast people operated. Because we did not focus on patient safety, we were not able to get these appropriate surgical authorities to accept the simulator and its curriculum, let alone to mandate the use of simulation. Medical educators were rightfully skeptical: if simulation was not required by the certification authorities (the boards), it was reasoned, why should they exert a significant effort on simulation?

M: I imagine that during your DARPA assignment you also worked closely with the US Army’s PEO STRI in Orlando?

Dr. S.: Yes. I had the opportunity to be closely in touch with PEO STRI, which was the command funding all of the other, non-healthcare simulation for our service and other military commands.

I learned from them not only the components of simulators but also the essence of simulation from the program management and curriculum development standpoints.

M: You have also collaborated with behavioral psychologists and non-medical learning professionals during your career.

Dr. S.: During my assignment at Yale University’s Department of Surgery, and concurrently at the NASA Commercial Space Center, I continued to support DARPA. I had a Fulbright Scholar, Tony Gallagher, who spent two years with us; he was a behavioral psychologist, and I learned an enormous amount from him and gained a great amount of respect for his profession. Through Dr. Gallagher’s efforts, as well as those of the medical educators, psychometricians and statisticians, it was possible to prove the effectiveness of the curriculum and simulators.

In 2002, the VR to OR validation study was accepted by the American Surgical Association. After it was accepted, the American Board of Surgery soon required that simulation become part of surgical training. That was the major step forward in simulation becoming accepted.

M: And today’s healthcare community requires a rigorous curriculum with outcomes?

Dr. S: Yes, they require a well-constructed curriculum with the appropriate outcomes, with validation proving that it is effective in training the learner. Without that validation it doesn’t matter how famous your surgeon is, or how important your company is; the authorities are not going to approve it [the simulator or simulation].

M: So how is the community doing in validating the technologies we see in medical simulation centers and in conference exhibition halls?

Dr. S.: It’s been a very, very long struggle. Some simulation-based curricula have been accepted, but by and large it’s been very, very slow.

Part of the reason is that the medical education community does not accept any simulations and training curricula until they have been unequivocally proven to be effective. There is not a lot of experience in the medical community with the rigors of curriculum development, validation and implementation. These practices are slowly being adopted by more and more professionals in the medical profession, who are finally beginning to understand the rigor necessary to make a curriculum acceptable and a required part of training.

Most surgeons do not have the time to spend to understand the stringent requirements for getting their curriculum developed. The number one problem we have today for procedural-based training, where you use a lot of simulation, is that clinicians will say, “I have a great idea for training a procedure.” They find a company and together build a simulator to their own specifications – as a health care provider and physician. In essence, this approach is one in which the clinician says, “This is the way you should do it because this is the way I do it.” And then the clinician and company cannot understand why no one will buy their simulator. The obvious explanation is that the developers frequently do not spend the time conducting the validation trial. In some cases, when the developers do attempt to validate the simulator, they try to do it themselves, without the proper inputs of the educational and behavioral psychology community, and they miss their mark.

There is currently a small but growing community of medical simulation experts – 40 or 50 people, perhaps a bit more – who truly understand how to develop a curriculum, how to create a supporting simulator, how to conduct a rigorous validation study, and then how to design a high-stakes test that will be accepted by the certification bodies. This is referred to as the full life cycle development of a curriculum; it is a very rigorous process that takes at least two to three years.

M: And the cost?

Dr. S: It costs a lot of money – hundreds of thousands of dollars to actually design a curriculum, incorporate it into a simulator, conduct the validation study, and design the high-stakes test that accompanies it. All of that must be completed before applying to the certification authority (such as a board) to consider making the curriculum a required part of surgical training.

 

"It becomes very difficult for the simulation centers to have enough funding and manage at a level they should have," Dr. Satava. Image Credits: University of Washington

M: Can you contrast that with the military simulator or simulation procurement model?

Dr. S: Well, the irony of this is that I have known about this since the early 2000s because of my association with the military simulation community. That community does not expect to build a simulator for a couple hundred thousand dollars and have the curriculum implemented. They understand that it is not possible to perform the full life cycle development of a curriculum for less than a few million dollars. And once a simulator is completed, it is necessary to spend a year or two, and nearly a million dollars, validating it in order to have a curriculum that is unequivocally, rigorously tested so it can demonstrate to the users that it is effective. This order of magnitude of effort is sometimes difficult for the medical community to understand.

M: So is one solution to reduce medical community procurement costs to accept, as the military does in many of its training devices, the “80 percent solution”?

Dr. S.: There is the issue of technology. We are experts in basic skills, team training and the like. For advanced procedures, the medical community does not currently have the money to invest to make a simulator that allows you to practice not just a skill but the full surgical procedure for an operation – something on the order of a high-fidelity flight, tank or driving simulator. Such a medical simulator would be very, very expensive and extremely technically challenging, costing on the order of tens to hundreds of millions of dollars for one that is biologically correct and not just physically correct.

There is not an available source of funding to invest in such a high-fidelity simulator. All the money that is being put into simulators is for very simple simulators. Trying to get to the next level is expensive and difficult, and there are not enough people working in this area because there is no return on investment. It is really, really hard to take that next step – to produce a biologically based simulator for a specific procedure.

With respect to the 80 percent solution, you do not have an 80 percent solution for a flight simulator, do you?

M: For the full flight simulators, no.

Dr. S.: When people’s lives are not at stake, the 80 percent solution would be fabulous. But right now we are at the 20-30 percent solution in healthcare. We need a major, huge step forward to get there. We have hit the “technology wall”.

M: What are the impediments that prevent the medical community from embracing a more military-like procurement or life cycle business model for simulators?

Dr. S.: There are a number of educational and behavioral issues. The vast majority of people do not truly understand the rigor required in medical education. The problem is that the clinicians, who are absolutely critical for determining what training needs to be developed (i.e., the outcome measures), have no clue how to do it. Some clinicians claim that they know exactly how to do a particular procedure and therefore will claim, “This is the way you do it – you watch me and we’ll have a curriculum.” They do not understand the principles required for developing the outcome measures before you develop the curriculum, nor do they understand the full life cycle development.

The only good news is that we now have a number of accredited simulation centers that have this rigorous understanding, and we are beginning to get curricula developed which meet the standards necessary to be accepted by those who really count – the certification boards.

M: It also sounds like there’s a compelling need to bring others from outside the healthcare community into this process: educators, technicians to refine medical simulation center course content, and other subject matter experts.

Dr. S.: Yes, that is absolutely true. Developing a medical education curriculum is a multi-disciplinary task: it takes clinicians who know what is necessary, and also behavioral psychologists to help them design what the curriculum will be and to literally extract from them the critical outcome measures. My experience is that there are very, very few surgeons or physicians who truly understand how to design a curriculum, beginning with the outcome measures and metrics that are required. They haven’t gone through the task analysis and all the other essential details that are required. In addition, they are reticent to reach out to those who can – the behavioral psychologists, statisticians and others. You cannot develop a high-stakes curriculum that will result in certification if you do not have a multi-disciplinary team.

M: You noted the importance of the American College of Surgeons (ACS) in advancing the state of the art in medical simulation.

Dr. S.: I did. Back in 2003 or so, Dr. Ajit Sachdeva [Director, Division of Education, ACS] made a very profound statement: “It is not about the simulator, it is about the curriculum.” That was the “Ah ha!” moment for the simulator community. When he made that statement, Dr. Sachdeva also said the other major problem facing the medical education community is that, even as simulation was beginning to become accepted and required, there was no quality assurance for the training – the simulation centers being built at that time were so varied that there was no assurance that the training being performed had merit.

Under his direction, and with the impetus of the ACS Board of Regents under Dr. Carlos Pellegrini and others, the ACS declared that although they believed in simulation in healthcare, they were not certain that the process for managing a simulation (medical education) center was rigorous enough to develop confidence in the training for patient safety. Therefore they set up an accreditation process that would guarantee the quality of education at a simulation center: the ACS Accredited Educational Institutes (ACS-AEI) process for certification of simulation centers. This was every bit as rigorous as the certification process hospitals go through, and as the requirements resident training programs must meet from their Residency Review Committees.

By 2006, the ACS-AEI had piloted the process, modified it and tested it again, and that year they began to certify simulation centers to train surgeons in technical skills using simulation.

There are now 77 certified centers, 14 of which are international. All have gone through the rigorous ACS-AEI process, which includes not only specific skills training but also multi-disciplinary and inter-professional team training.

M: Team training is one topic of increasing interest in agendas and on the exhibit floors of recent conferences.

Dr. S: What the ACS has done to greatly facilitate that interest is that the ACS-AEI has emphasized the importance of patient safety with the sub-header “Improving patient safety through surgical education and training.”

Several things have happened because of ACS:

1. The focus on training and education of the healthcare provider has become a secondary (supporting) issue. The primary issue is patient safety; and

2. Simulation as a technology gave us opportunities to train to quantitative measurements that must be met in order to train to proficiency.

 

M: We talked offline about the airlines. While it appears the medical community is lagging behind the airline industry in effectively embracing learning technologies, the medical community is belatedly making progress.

Dr. S: There is no doubt, but there are a lot of issues facing the medical community, both technical and non-technical, not the least of which is financial.

There is a fundamental problem in healthcare – adequately funding medical education at the resident level. Quite literally, there is no funding for resident medical education at the national level. A small percentage has come out of the graduate medical education budget (which is mainly intended for medical students). My understanding is that there are no specific dedicated teaching funds for structured resident education, and any such “educational funding” is usually used to pay for faculty time and residents’ salaries.

So it is difficult for simulation centers to survive, because education is expensive and it is not profit-making. It becomes very difficult for the simulation centers to have enough funding and to manage at the level they should.

M: So the medical simulation centers turn to alternate funding sources – foundation grants and the like – to pay for infrastructure.

Dr. S: That is so true. The problem is nobody wants to invest in infrastructure. Everyone who is willing to invest will donate to create a simulation center, but there is rarely any sustainment money to run it – to hire the necessary, qualified people to take care of it over time. Benefactors can put their name on a simulation center, but cannot put their name on paying for electricity and for the faculty to come and teach. This is a huge problem.

M: Getting back to education and patient safety, it appears simulation, as part of the learning process, has an opportunity to make a huge impact – much as the Flexner Report did when it changed medical training from “follow me around” apprenticeship to a structured, scientific educational program.

Dr. S: Until now, there has not been a method to measure technical performance. All performance assessment was by the attending surgeon observing the resident, and the faculty deciding when they “were ready.” Simulation has made it possible to establish objective measurements and to train to a benchmark level of proficiency (determined as the mean of experienced surgeons’ performance). We’re moving technical skills from a time-based educational system (i.e., a strict 5-year training program) to a proficiency-based one: you have to continue training until you meet all the established milestones.

Another important change is that with simulators, as we previously discussed, it is possible to implement multi-professional education through inter-professional team training.

The above factors are why this is a fundamental revolution in medical education – it is made possible by the technology of performance metrics through simulation and the process of objective structured assessment.

M: Let’s finally follow up on your interest in robotics; share with us the state of that technology with regard to simulation.

Dr. S.: Yes, you are referring to the Fundamentals of Robotic Surgery (FRS) curriculum. It was decided to develop the FRS curriculum from first principles. Compared to the Fundamentals of Laparoscopic Surgery (FLS), which was developed principally as a high-stakes test, the FRS is being developed for the full spectrum, from outcome measures through high-stakes testing and certification. The enormous value of the FLS is that it was the very first high-stakes test for simulation skills training, and it is now a required minimally invasive surgery component for a surgeon who wants to be certified in general surgery.

The FRS is a very precisely defined study. Based upon 19 years of experience in non-healthcare simulation, the outcome measures were defined and the appropriate metrics were chosen. Then the task analysis was conducted to determine what skills are required in the curriculum to meet the outcomes, measurements and metrics. This is a huge project, requiring multiple consensus conferences of subject matter experts. Then the validation study design had to be constructed with enough learners to produce an unequivocal validation as well as to meet the conditions of an evidence-based scientific study. And now the validation process is in place.

One other final accomplishment is that through the ACS-AEI and some industry support, the Alliance of Surgical Specialties for Education and Training (ASSET) has been established. This comprises 14 surgical specialty societies (including the Department of Defense and the Veterans Administration hospitals) that perform robotic surgery and have participated in developing the FRS curriculum. This is the first time that multiple specialties have ever collaborated in developing a single common curriculum for training. When it is completed, the individual specialty societies will review the curriculum and consider adopting it for their specialty; in addition, it will be considered for adoption by the ACS’s accredited institutes – all 77 of them. It is hoped that this will become a national and then a global curriculum providing a common set of robotic surgery skills in which all surgeons, regardless of specialty or nation, will be trained and assessed. Personally, it is unclear to me why two surgeons in different parts of the nation or the globe would be trained to do the identical operation differently; perhaps this could be the first step toward breaking down the silos that separate the medical education and training community.