The Role of Human Factors in Medical Product Development

 

Memorable Quotes:

  • “Where the Chinese government has basically allowed international players like many manufacturers to come in, like if their device or drug has been approved for emergency use, then they can import it into this region. And these academic institutions, medical institutions will be able to use them and trial them with Chinese patients. And then through the real world evidence that's gathered there, from my understanding these manufacturers can start to go through the approval process to get into China.” - Rita Lin

  • “I find it interesting, having worked in medical device and then combination products, there's a lot of similarities in language, but there's some nuanced differences, like ICH Q9 versus ISO 14971, the risk management standards, and even CAPA. CAPA is defined in the CFR for medical devices and it's a process in medical device land, but it's an object in pharma land.” - Shannon Hoste

Transcript:

Intro - 00:00:04: Welcome to The Factor, a global medical device podcast series powered by Agilis by Kymanox. Today we're back for part two of our conversation with Rita Lin, Director of Human Factors Engineering at Kymanox. In part one, Rita and Shannon discussed the role of empathy in human factors engineering, and Rita shared some of her eye-opening experiences with medical devices in third world countries. Today they discuss the recent RAPS Convergence in Montreal, Rita's presentation on human factors considerations for combination product development, and the global regulatory landscape. Here's Shannon.

Shannon - 00:00:41: You've been working with us here at Kymanox for two months. I can't believe that already. But you've actually already had the opportunity to go out and attend and present at the RAPS Convergence in Montreal. So I know that was just, I think, a week ago at this point. So I wanted to catch up and hear how that went, what was your presentation on and then hear how the conference went a little bit.

Rita - 00:01:08: Yeah, of course. The conference was awesome. It was my first time back in that sort of environment since before the pandemic. And same with presenting. So, I love the experience. I thank Agilis and Kymanox for the experience. It was such an honor to be able to go and represent the company.

Shannon - 00:01:32: A quick question, was it a little overwhelming? I found my first trip back to a conference after COVID to be a little bit overwhelming.

Rita - 00:01:42: It was a lot of people, but I think everyone was in the same boat. It was the first RAPS Convergence since before the pandemic, from my understanding. So the energy was very raw. I found it very easy to meet new people and network. So I also had never been before either. So I was just blown away by the sheer grandiosity of the whole spectacle. 

Shannon – 00:02:09: It’s a great conference. 

Rita – 00:02:10: Yeah. The main room looked like the next iPhone was going to be presented. Lights going off and a DJ. So it was nice to see regulatory affairs professionals at the top of their career ladder getting cheered. They were like rock stars. Everyone was saying, “oh my gosh, Janet Woodcock, she's here”. So that was neat to see. For my own presentation, I had the chance to present in the last slot of the second day. I had logged onto my app a couple of days before, just to make sure my slides were there. And I was curious to see who was attending, because I wanted to, even last minute, make sure the content that I was planning on bringing was going to be helpful and worthy of their time. So at the time I saw, okay, maybe 23 people had decided to show up, which is a great crowd. Maybe we could even have breakout sessions. And then when I showed up to the room to give my initial run-through, I saw how big the room was, and then later it filled up. So I was so grateful and glad to see during that talk that there were heads nodding. It was about human factors considerations for combination product development. And thank you, Quinn, from Kymanox for your help on that, Stephanie as well. I focused the talk on assuming that the audience was coming from either a pharmaceutical background wanting to work with a platform device company to market a combination product, or vice versa, someone from the medical device field who was just curious about combination products and whether their company could get into that space as well. Considering that over time combination products have gotten more and more complicated, we're seeing mobile medical apps also in that space, and AI/ML possibly too with on-body injectors. So yeah, overall, I talked about risk management always needing to be a part of design controls, always needing to be a part of device development, because risk management, from my understanding, is not inherently built into drug development the way it's understood by medical device manufacturers. So-

Shannon - 00:05:10: I find it interesting, having worked in medical device and then combination products, there's a lot of similarities in language, but there's some nuanced differences, like ICH Q9 versus ISO 14971, the risk management standards, and even CAPA. CAPA is defined in the CFR for medical devices and it's a process in medical device land, but it's an object in pharma land. So it's interesting how we have very similar languages, but just some nuanced differences as we try to work through and navigate our...

Rita - 00:05:48: Yeah, that's a good point. I think another thing that could have helped in a presentation like that would be to try to understand and bridge those differences. But for myself, coming from medical devices, I guess I had always assumed that design controls at least were newer to pharmaceutical companies. So yeah, tying human factors to risk management to design controls was really the point of my presentation, basically walking them through examples of combination products, and why a combination product's use-related risks would be unique compared to the device risks or the drug risks on their own. So just talking them through the users, use environments, and training differences that we've discussed earlier in this conversation.

Shannon - 00:06:59: Yeah. I found that interesting from working on medical device drug delivery platforms. So from the traditional medical device perspective, you can analyze the risks of your product and understand what failure modes you might have, what your mechanical failure modes or your process failure modes are that can lead to risk. But until you identify what drug is going to be in there, you really can't truly evaluate risk, because you don't know the potential harms, right? So it's almost like, whether you're in the drug or the device manufacturing space, you can only go so far with platform devices and combination products until you have to look at the overall combination product as a whole and then assess the risks associated with that. It's an interesting dynamic in that risk management space.

Rita - 00:07:52: Is it accurate for pharmaceutical companies to be on guard as they're debating who to partner with in the platform device space? They do need to make sure that the platform device company has done their due diligence and completed their design controls and risk management activities as part of human factors activities, right?

Shannon - 00:08:20: Yeah. Absolutely.

Rita - 00:08:22: Yeah, but that's not enough because then the combination product company needs to do their own run through of all of that.

Shannon - 00:08:28: It's an interesting space, which I think is why we have whole conferences. I'm heading out to PDA next, actually today, for the Universe of Pre-Filled Syringes in Sweden. So as you think about it, it sounds like a very specific area of the industry, but there are so many considerations and things that factor into it.

Rita - 00:08:57: For sure. Going back to your point about the drug being a big factor, the examples that I gave were related to how the drug, if it were viscous, for example, could affect the way the user is able to hold it. If it's going through a syringe or a prefilled syringe, whether they're able to hold for the requisite amount of time, or whether it's going to be painful. You know, the last time you had a shot might deter you from wanting to take that shot many times a day, so you could start thinking about compliance. So I gave all the different reasons for manufacturers to consider human factors, not only from safety, but also business advantage. Like if we're talking about compliance, you can build a product that is very safe, but if the patient isn't going to be complying, that's also part of safety too, that they're not going to be getting their full dose. And then also, you're going to be able to make certain sales or marketing claims, or even touch reimbursement to some extent. So I think it was just bringing it back to the real world in a sense. Sometimes I fall into the FDA hat, so I try to take a step back from “this is what the guidance documents say, and this is what the FDA wants”, and make the argument more approachable. Part of that was also encouraging them to have early interactions with the FDA and really breaking down, okay, let's say you're at this stage of your clinical trials. What could you do in terms of human factors? When should you be considering submitting a protocol, et cetera? So I think it was helpful. I got many questions, and a lot of them sounded very specific to their own companies. But one interesting question was with regards to our time at the FDA, asking, “what were the most common issues that you saw in submissions?” So I talked about the use-related risk analysis. Like we have talked about before, shout out to Hannaby for pulling the data showing that the majority of the deficiencies that FDA CDRH was writing were related to the use-related risk analysis. A couple of folks in the audience had human factors backgrounds, actually. I had done a quick poll at the beginning of my talk. So people also raised their hands and gave their own experiences as well. Someone mentioned success criteria not being defined. And then, going back to use environments, someone asked how realistic the simulated use needs to be, with the FDA wanting to see how realistic the simulated use environment or its setup is compared to expected real world use. But those are all conversations I was very comfortable having.

Shannon - 00:12:26: Yeah, yeah. There's a lot of pieces that come into it. One of the diagrams I use a lot in presentations is kind of the pyramid idea of building up your understanding of the users and the use environment, and then from there you're assessing risk. So you're building up this pyramid of information, but if you miss something along the way, then it doesn't necessarily stand, right? If I miss a user group altogether, then you don't have all of the information to carry up into your risk assessment and into your validation.

Rita - 00:12:57: And then when it rains, it trickles in through the hole.

Shannon - 00:13:01: The whole thing falls down. So one example that always comes to my mind when I think about that was respirators, and respirator equipment being developed in different countries, where the user groups are potentially different in different countries. So for example, we have our respiratory therapists in the US, and that may not be a dedicated role in other countries. It may be the nurses or the clinicians that are managing that, and things like that. So making sure for the areas that you're launching in that you're understanding those user groups and not leaving any holes in your understanding.

Rita - 00:13:55: It also helps for the post-market too. We were just discussing this earlier, Shannon. Maybe you will be able to launch to some extent, but if you haven't been able to do your due diligence, I believe there's going to be some sort of medical device karma at the end of the day. Not in the sense of punishing a manufacturer, but people could get confused, and you might end up in a situation where you are getting customer complaints or, even worse, people getting hurt, maybe product recalls. So yeah, that's also usually part of the argument too. But I don't want to just keep talking about it. I want to start helping to move the needle for people to think about HF a lot earlier.

Shannon -00:14:55: Yeah. I agree. 

Rita – 00:14:57: Like even at the FDA, I think with the 2016 guidance, it was so focused on validation testing. And even the 2022 draft is about what you should submit, so we're already just talking about those marketing submissions. Well, it does have material about earlier preliminary evaluation. But anyway.

Shannon - 00:15:21: My hope is, and maybe I'm altruistic, but my hope is that the regulators are identifying that, “hey, this data is important and critical to understanding safety of use”, right? And that as companies start to do that work, they start to quickly see the value of that work and how it not only helps them develop a safer product, and makes good business sense because it reduces complaints and issues in the field, but also that they're more competitive, they have better products, and maybe they even have fewer iterations on their products, right? So they start to see the wins of this type of work and this type of information, and then start to incorporate it earlier when it's even more valuable. That's my hope. And I think I've been seeing that evolution over time.

Rita - 00:16:16: I think I have as well, but more from a third-party perspective, just observing the kinds of questions, for example, that we've been asked over the years at different human factors conferences. It seems like more people are bought in now, and they're digging into the details of what that would look like.

Shannon - 00:16:36: Yeah. I always think about the life cycle of products, right? And the ecosystem that we live in. So we're developing products and they're on the market. We spend one to six years in development, or more, and then that product goes on the market for hopefully 20, 30 years or more. So the majority of the product's life cycle is on the market. But over that whole timeframe, the human factors work and the understanding of the users is going to happen. It's just a matter of when, right? How much can you do upfront, so that you understand and can optimize the product upfront, versus putting it on the market and having your patients or your health care provider customers do that human factors research for you in the form of complaints and feedback post-market? So it's going to happen. I mean, that human factors work, quote unquote, is going to happen. Real world evidence.

Rita - 00:17:40: Yeah, jumping back real quickly to the RAPS Convergence, I was really drawn to the talks about real world evidence. I didn't know this, but there's a region in China, in Hainan in the south, called the Boao region, where the Chinese government has basically allowed international players, like many manufacturers, to come in. If their device or drug has been approved for emergency use, then they can import it into this region, and these academic institutions, medical institutions, will be able to use them and trial them with Chinese patients. And then through the real world evidence that's gathered there, from my understanding, these manufacturers can start to go through the approval process to get into China. So that's really interesting, really revolutionary. It was new to me, so I hope I'm explaining it correctly for our listeners. But it's only been in the past 10 years or so that this program has been in place. The conference also opened my eyes to other activities that have been going on outside of the FDA, outside of the US too. Even some FDA regulators said they're open to collaboration, they're open to other ways of thinking about things. AI/ML was a pretty hot topic too, but we didn't get anywhere with it. It was more about collaboration, realizing that no individual silo had the answer for everything. I think that was the big takeaway from that conference. There are big problems, not problems, but big opportunities on the horizon, and government on its own is not going to be able to figure it out. I think we saw at the FDA, especially with new technologies, that the reviewers would oftentimes rely a lot on the manufacturers.

Shannon - 00:20:00: Yeah, I mean, technology is advancing so fast that regulatory science has to keep up, right? How you establish safety, and how you monitor and track that safety and effectiveness, has to keep up and evolve just as fast.

Rita - 00:20:17: So, yeah. Okay, given the amount of time, I still wanted to ask you, how did your talk with the University of Michigan go?

Shannon - 00:20:29: Oh, that was excellent. So I had a chance to do a webinar for the University of Michigan Innovation Partners on human factors for clinical decision support and software as a medical device. We talked a little bit about AI/ML. And we had a chance to talk briefly about the new physiological closed-loop control systems (PCLCS) guidance that just came out. So yeah, it went super well and we had a chance to kind of unpack that. And what I like about that guidance is that it identifies some of the specific concerns, the human factors related concerns, with AI and ML or clinical decision support, that whole field. Part of that is automation bias and considerations about how people interpret the data, how much trust they have in that data, do they know when they should and shouldn't trust it, do they know what's going into the decision, so that they can make an informed decision on that. It's bringing up a lot of these considerations, identifying them, and then calling out, “hey, when you're doing your human factors, you need to make sure you're looking for the specific risks associated with AI/ML”. My background is in cognitive systems engineering. After post-market issues left me confused about what people could do with products, I went back to study cognitive systems. And in general, in cognitive systems, you're learning a few things. You're learning, one, people behave rationally in the moment. When they're making decisions at that moment in time, based on the information they have around them and everything that they're being exposed to at that time, that decision makes sense to them. That's why they're doing it, right? In hindsight, it may not make sense, but at that moment it does. And so what is it about the system that's giving them the information and supporting them, or not supporting them, in that situation? My research in graduate school was all about these events that happened and where that breaks down, where it went wrong, from Three Mile Island onward, right? Accident after accident. It was a little bit macabre, but I developed a little bit of an eye twitch every time I hear, “hey, automation is going to make things great and take out human error”. I start to twitch because I'm like, oh, there are other things we now need to think about. So I like that guidance because it starts to unpack that a little bit, in a way that applies across the board. It's not just speaking to the cognitive systems folks out there, right? It's speaking to medical device developers. So we had a chance to unpack that a little bit. It was fun.

Rita - 00:23:22: Was it interactive enough that you could unpack it together? Like, was there-

Shannon - 00:23:26: It wasn't interactive through the lecture part, the discussion part, but we had quite a few questions at the end that we were able to talk through a little bit. So I'm hoping it at least piqued people's interest, and I wanted to provide a lot of information that they could dig into further if they wanted to.

Rita - 00:23:47: Yeah, but trust in software is everywhere that we look.

Shannon - 00:23:55: And when to intervene. I mean, it's an issue with autopilot or automated driving, automation bias in automobiles, right? If that driver needs to intervene in the moment, can they? Are they going to be aware of the situation fast enough, situational awareness, in order to intervene, and things like that? So it applies anytime we as humans are interacting with some technology that's helping us make decisions.

Rita - 00:24:26: Yeah. That's good that it got technical. It sounds like the audience was very engaged.

Shannon - 00:24:34: It was a great group, with the work that they're doing, research across the spectrum, from students and researchers within the University of Michigan system to health care practitioners and so forth that are building up different innovations. So it's neat.

Rita - 00:24:55: I think sometimes what we're doing is not only, I mean, yes, for the business case, getting Agilis' name out there, Kymanox's name out there, but it's almost a continuation of the work that we were doing at the FDA also. So bringing these human factors principles and getting everyone excited about why it should matter, that I think is always going to be something I'm interested in.

Shannon - 00:25:25: Yeah, I think that's part of it. So Agilis historically was focused on human factors and instructional materials, but now as part of Kymanox, we're part of a system that can support companies from beginning to end in product development. And it's exciting to be able to work in that space, because we have so many more resources and so much more expertise that we can pull from and tap into. So when we're engaged with a client, we can help them identify needs and then identify how to go about solving them, whether it's an HF need or a Process Validation need. I've had that before with clients, where they had a human factors concern and we were able to identify ways within their process to take that out of the hands of users and do it within the product. But then I was like, “okay, now you have a Process Validation issue, not a human factors issue”. So it's exciting to be able to work with them across a broad range of needs and be able to support that. So, okay, we're running out of time, but I wanted to thank you for a couple of things. One, I'm super excited that you're with us here at Kymanox now. I wanted to thank you for joining us and jumping right in, getting involved in projects, working with the staff and the people that are reporting to you, and building that up, and presenting at conferences. So thank you for that. And then I also wanted to thank you for everything that you bring to our industry. As you go out and do these presentations and the work that you've done over the years, you bring your expertise and your experiences, from understanding global cultures to all the information you have accumulated as a whole person, and you bring that to bear for improving medical devices and health care. So thank you for that. Thank you for working with us. And thank you for helping us all to be better. Gosh, thanks.

Rita - 00:27:40: Thank you, Shannon. Thanks for always taking a chance on me and believing in me. It's been great to work with you.

Shannon - 00:27:49: Well, I guess we'll wrap it up and thank everybody for listening.

Rita - 00:27:53: Thank you, everyone.

Outro - 00:27:59: That was Rita Lin and Shannon Hoste. Thank you so much for listening to or watching this episode. Be sure to subscribe or follow this podcast in whatever app you're using right now, or follow Agilis by Kymanox on LinkedIn for all updates. For more information on what Kymanox offers, visit kymanox.com. That's kymanox.com. This episode was edited and produced by Earfluence. Thanks for listening, and we'll talk to you again soon on The Factor.
