Welcome to 3PBIOVIAN On Air.
Audiopost 2 is based on an article co-written by our subject-matter experts (SMEs) in analytical development and quality control, Andrés Guerrero and Jani Yömaa.
Discover the competitive advantages of partnering with CDMOs that possess integrated, in-house analytical capabilities, an approach increasingly chosen by biopharma and biotech companies to reduce risk, cost, and time.
CDMOs With In-House Analytical Capabilities Save Sponsors Risk, Cost, and Time
At 3PBIOVIAN, we believe that innovation also lies in how we share knowledge. With the help of AI, we’ve transformed our scientists’ original articles into audio, making our research more accessible while preserving the authorship and scientific rigor of our experts.
Full transcription:
Welcome to the Deep Dive.
Today, we’re jumping into something really crucial in biopharma development. It’s all about analytical testing and manufacturing. Yeah, that interface. It’s critical. Exactly. And for you, our listener, you’re probably grappling with outsourcing choices all the time. So the big question really is, how do you keep things fast, safe, and top quality when you’re deciding who does your testing?
Well, that question is pushing a big shift in strategy. Our sources today are pretty clear on this. Companies are increasingly looking for CDMOs, contract development and manufacturing organizations, that have strong integrated analytical capabilities in-house. The thinking is, look, fragmenting this stuff, sending testing out here and there, it just creates too much risk. Analytics really underpin compliance.
Okay, so let’s dig into that. Our mission here isn’t just about logistics, like shipping samples. It’s about the actual, maybe even quantifiable value of keeping it all together. Why does splitting these services cause so many headaches, technically, and with regulators? And what do you really gain by having development, manufacturing, analysis, all under one roof?
Understanding that value, that’s how you tell providers apart now. It’s key. Makes sense. Yeah, it’s definitely not just about finding the lab with the lowest price per test. It’s much deeper. It’s about making sure the data you get is directly tied to the actual process. So you have real control, real understanding of the product.
Okay, let’s start with the dangers of splitting things up. The sources talk about the hazards when samples physically exit the building. Right. Now, we get it, some outsourcing you just have to do. Maybe super specialized things like adventitious virus testing, things you don’t do often. Exactly. That kind of thing might need a special lab. But the problems really start when you rely too heavily on third parties for your, let’s say, standard routine testing.
And what kind of problems are we talking about first? Well, right off the bat, it’s samples that are time-sensitive or critical for safety. Think about in-process controls, microbiology tests, endotoxin levels. Things you need results for now. Precisely. Or even just products that aren’t very stable, certain intermediates, sensitive proteins. If these get compromised during shipping, maybe a temperature shift, maybe just a delay, you don’t just lose that one data point. You could potentially compromise the control over your entire GMP batch. The whole thing.
Wow. Okay. So beyond the sample getting ruined, what about the paperwork side, the sort of regulatory burden of using lots of different testing labs? Oh, it’s huge. It’s the qualification and auditing. See, the CDMO can’t just blindly send samples out. They have a responsibility. They have to audit and qualify every single external lab they use. Every single one? Every one. That’s a massive ongoing job for their quality assurance team. It adds costs, ties up people, and stretches out the project timeline before you’ve even shipped sample number one. Right.
And there’s a really critical point here about data integrity, isn’t there? For the sponsor company, the one who owns the drug. Outsourcing means you kind of lose sight of how your molecule behaves outside the manufacturing suite. That is such a key insight. Shipping itself introduces variables, you know, temperature changes, vibration, just the time it takes. None of that is part of your controlled manufacturing process. So if the outside lab sends back a weird result, say unexpected degradation, yeah, or aggregation or something else, how do you troubleshoot it? Was it a problem in the manufacturing step? Or did the sample degrade bouncing around in a delivery truck? Shipping adds this layer of variability, this ambiguity that’s completely separate from production. And regulators hate ambiguity. Really hate ambiguity, especially in a GMP environment. It’s just not acceptable.
Okay, that leads us straight into another big challenge. Transferring analytical methods between different labs. The sources really stressed this point. Moving a method, say, from the development team to the CDMO, or from the CDMO to an external test lab, it’s hard. Oh, it’s notoriously difficult. Why? I mean, if you have the same written procedure, shouldn’t you get the same result? You’d think so, wouldn’t you? But it almost never works out that way. It’s the subtle things, the sort of devil in the details that regulators look very closely at. Like what kind of details? Well, tiny differences in how labs make their buffers, maybe, or the specific brand or even batch of a chromatography column they use, variations in how instruments are calibrated, even just operator technique, how one person does something versus another. So each time you transfer a method, you’re looking at extensive and frankly expensive revalidation work. Complex studies just to prove that Lab A’s result actually means the same thing as Lab B’s result. So every transfer is basically a risk, a technical liability, and it costs time and money. Exactly. Every single time.
Okay, so if splitting things up brings all these risks, logistical headaches, validation nightmares, data ambiguity, what’s the actual payoff for true integration, keeping it all together? Speed. Speed that comes from building knowledge faster. When the analytical team is right there, in-house, they start learning about your specific molecule, your API, right from day one of process development. So it’s parallel. Yes, that’s the key, parallel development. The manufacturing process gets developed alongside the methods that will be used to test and release it. They inform each other. And I imagine that pays off massively when you’re getting ready to talk to regulators, like for an IND submission. Oh, hugely, because those release methods, they’ve been developed and tested right alongside the actual production process. They’re inherently more robust, more reliable, much easier to explain and justify in your regulatory filings, like the CMC section, or when the agency comes back with questions. Plus, when the analytical team and the manufacturing team report into the same quality system, the documentation, the scientific reasoning, it’s all consistent, homogeneous, much easier to defend. That makes sense.
I also found the bit about operational control really interesting. How does having everyone, analysts, project managers, production folks, physically close actually help manage the day-to-day workflow? It allows for really smart control points. They can sit down together and figure out the best places to sort of pause the process, retention points, they call them. Places where it makes sense to stop and wait for a critical analytical result before you commit expensive materials and valuable time to the next manufacturing step. Especially for those time-sensitive tests you mentioned earlier. Absolutely vital for those. You just cannot get that kind of tight, rapid process control if critical samples are being packed in a box and shipped overnight. It’s either extremely complex or, frankly, impossible.
And what happens when something does go wrong, like an OOS result, out of specification? Integration must be a huge advantage then. Oh, it’s night and day for OOS investigations, totally transformative. When you get an OOS, the clock starts ticking immediately, regulatory pressure, batch potentially at risk. With an integrated setup, the analytical scientists who ran the test might be literally just a short walk down the hall from the engineer who ran that part of the process. So they can talk immediately. Instantly. They’ve got near-instant access to retained samples. They can retest quickly on the exact same instrument if needed. And this is crucial. They can start troubleshooting the reason for the failure together right away. The scientist isn’t just giving you a number back. They’re providing context. They might even have a good idea why it happened based on what they know about the process. So that means the whole organization’s knowledge about your specific molecule, your protein or vector, just keeps getting deeper. Exactly. An outside lab just sends back data points. An integrated partner builds applied knowledge about your product. That is the absolute core value. The integrated team understands the molecule’s quirks, its stability, where it might fail. And that understanding grows with every single batch. It’s institutional knowledge. And that knowledge helps prevent future problems. You got it. It helps head off future OOS results, potentially saving millions in failed batches and blown timelines down the road.
Okay. Let’s pivot then. Section three is about choosing, about vetting a CDMO. Now, there’s this kind of maybe lingering skepticism out there. This idea that because a CDMO’s main business is manufacturing volume, their analytical labs might somehow be, well, second-tier compared to a dedicated analytical testing shop. How do we address that perception? Yeah, I hear that sometimes. We counter it with two things: regulatory reality and sheer breadth of experience. First, the regulators, FDA, EMA, whoever, they inspect a CDMO’s in-house labs using the exact same stringent standards they apply to standalone analytical labs. No difference in the inspection standard.
None whatsoever. There’s no CDMO discount on quality standards. They hold them to the same bar. Okay. That deals with the quality-standard myth.
What about experience? Doesn’t a big CDMO actually see a wider range of projects? That’s the second point. And it’s huge. Think about the volume and diversity. A large, experienced CDMO might work on hundreds of different projects, different molecules, different challenges for many different clients over the years. So they build up a kind of wisdom. Exactly. Organizational wisdom. They don’t just run tests. They develop platform processes. They build quality systems specifically designed to handle the kinds of issues that pop up again and again in biopharma: identifying risks early, handling QA challenges, troubleshooting tricky OOS results efficiently. Every problem they solve refines how they do things internally. So it’s much more than just having a list of analytical services they offer. It’s about whether they can actually use those services intelligently and put the data into context for your project. Precisely. An experienced analytical team, one that works closely with manufacturing, might see something subtle, maybe a tiny bit of protein precipitation early on, or some minor aggregation forming. And they don’t just report the number. They connect the dots. They think, okay, why did we see that? What might it mean downstream? The whole point of useful analytics isn’t just getting a clear yes/no result. It’s getting the context around that result to help you make the right next decision.
Okay, so for our listener who’s in the process of choosing a partner, what kind of questions should they be asking? How do you get beyond the glossy brochure and figure out if a CDMO really has that deep analytical experience, not just “Do you have an HPLC?” Right. You need to probe for that organizational wisdom. Ask for specific examples, like case studies: how have they resolved OOS investigations for molecules similar to yours? Okay. Ask to see examples of their data packages, look for consistency, clarity, how well the information flows between different groups like manufacturing and QC. And crucially, ask them how their analytical team actively contributes to improving their own platform processes. How did they learn from the past? Exactly. If they can’t clearly explain how past challenges led to concrete improvements and how they operate today, then maybe their experience is just, well, observational, not truly applied. That distinction feels really important. Observational vs. applied experience. It’s critical. It’s the difference between someone who just runs the machine and gets a number, and a scientific partner who applies years of collective learning to your specific molecule’s challenges.
Okay, let’s pull this all together then. What’s the final value proposition here? We’ve established integrated analytics means better efficiency, faster timelines, stronger data integrity. It seems like streamlined development with that real-time monitoring is the engine driving it. Yes. And we absolutely have to stress the financial side too. The cost of failure, it’s sometimes hard to put an exact dollar figure on avoiding a two-week delay, right? But the cost of losing an entire GMP batch because something went wrong with outsourced testing, or because the data couldn’t be properly interpreted in time, that cost is very real, very tangible, often millions of dollars. And the sources suggest those big failures can often be traced back to the risks of fragmentation. Frequently. Things like a miscommunication between sites, data arriving without context, or just a fundamental failure to connect the testing back to the actual manufacturing process effectively. These are the risks that integration mitigates.
So it sounds less like this is just one option among many, and more like, well, where the industry best practice is heading. I think that’s fair to say. Working with a CDMO partner that has strong integrated in-house analytics is becoming a core part of a robust quality assurance strategy. It’s a smarter way to manage risk. You’re transferring that risk to a partner who is set up technically and organizationally to handle it proactively.
So the takeaway for you, our listener, seems clear. You have to do your homework. Really dig deep when vetting CDMOs. Don’t just ask if they have analytical capabilities. Ask about the depth of their experience with molecules like yours, and how truly integrated their quality systems are. Yeah, look for the partners who can anticipate problems based on their experience, not just report results after the fact. Which brings us to our final thought for today, kind of building on that value idea. We talked about the high cost of a failed batch. But maybe the calculation you need to be making is a bit different. How do you actually measure the value of a disaster that didn’t happen? Hmm, interesting way to put it.
If fragmentation introduces risks that could lead to losing that multi-million dollar GMP batch, what’s the true financial advantage of choosing an integrated partner who prevents that failure? Even if their upfront service cost seems a bit higher. You’re essentially paying for proactive risk management, for insurance against catastrophic failure. Yeah, you’re paying for reliability and regulatory peace of mind, not just test results. That seems like the ultimate metric of integration to think about. What’s the real value of avoiding the worst case scenario? That’s a critical calculation for anyone managing a drug development pipeline. Absolutely.
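[Editor’s note: to make that calculation concrete, here is a simple expected-cost comparison. The figures below are purely illustrative assumptions chosen to show the arithmetic; neither the article nor the episode gives specific failure probabilities or batch values.

Expected cost per batch = upfront testing cost + (probability of batch failure × cost of a lost batch)

Fragmented testing: $300,000 + (5% × $3,000,000) = $450,000
Integrated testing: $400,000 + (1% × $3,000,000) = $430,000

Under these assumed numbers, the integrated partner comes out cheaper overall despite the higher upfront price, because even a small reduction in failure probability on a multi-million dollar batch outweighs the premium.]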
Well, thank you for sharing the insights from the sources on this. It’s a complex but vital area.
Glad to dive into it. Thanks for joining us on the deep dive. We’ll catch you next time.