Wednesday, September 22, 2010

Manufacturing Biologics with CHO Cells? What’s the Risk for Viral Contamination?

by Dr. Ray Nims

Chinese hamster ovary (CHO) cells are frequently used in the biopharmaceutical industry for the manufacture of biologics such as recombinant proteins, antibodies, peptibodies, and receptor ligands. One of the reasons that CHO cells are often used is that these cells have an extensive safety track record for biologics production. This is considered to be a well-characterized cell line, and as a result the safety testing required may be less rigorous in some respects (e.g., retroviral safety) than that required for other cell types. But how susceptible is the cell line to viral contamination?

There are a couple of ways of answering this question. One is to examine, empirically, the susceptibility of the cell type to productive infection by model exogenous viruses. This type of study has been conducted at least three times over the past decades by different investigators. Wiebe and coworkers (In: Advances in Animal Cell Biology and Technology for Bioprocesses. Great Britain, 1989; 68-71) examined over 45 viruses from 9 virus families for their ability to infect CHO-K1 cells, using immunostaining and cytopathic effect to detect infection. Only 7 of the viruses (Table 1) were capable of infecting the cells. Poiley and coworkers (In Vitro Toxicol. 4: 1-12, 1991) followed with a similar study in which 9 viruses from 6 families were evaluated for their ability to infect CHO-K1 cells, as detected by cytopathic effect, hemadsorption, and hemagglutination. This study did not add any new viruses to the short list (Table 1). The most recent study, by Berting et al., involved 14 viruses from 12 families, including several known to have contaminated CHO cell-derived biologics in the past two decades, and therefore did add some new entries to the list in Table 1. Still, the list of viruses known to replicate in CHO cells is relatively short.



Chinese hamster cells possess an endogenous retrovirus which expresses its presence in the form of retroviral particles; however, these particles have consistently been found to be non-infectious for cells from other species, including human cells. This endogenous retrovirus therefore does not present a safety threat (Dinowitz et al. Dev. Biol. Stand. 76:201-207, 1992).

A second way of looking at the question of viral susceptibility of CHO cells is to examine the incidence and types of reported viral contaminations of manufacturing processes employing CHO cell substrates. This subject has been reviewed a number of times, most recently by Berting et al. The types of viral contaminants fill a fairly short list (Table 2). In most cases, the contaminations have been attributed to the use of a contaminated animal-derived raw material, such as bovine serum.

Sources: Rabenau et al., 1993; Garnick, 1996; Oehmig et al., 2003; Nims, Dev. Biol. 123:153-164, 2006; Nims et al., 2008; Genzyme, 2009.

Considering the frequency with which CHO cell substrates have been used in biologics production, this history of viral contamination is remarkably sparse. This is further testament to the overall safety of this particular cell substrate.






Wednesday, September 8, 2010

FDA to viral vaccine makers: it's time to update viral testing methods

By Dr. Ray Nims

If you have been following the recent (2010) unfolding of the discovery of porcine circovirus DNA contamination in rotavirus vaccines from GSK and Merck, you may not be surprised to hear that the FDA has asked viral vaccine manufacturers to outline, by October, their plans to update their testing methodologies to prevent future revelations of this type.
 
I had predicted earlier that biologics manufacturers would be asked to provide evidence, going forward, that their porcine raw materials (trypsin being the most common) are free of porcine circovirus. This testing has not been mandatory in the past, but adding it to the porcine raw material virus screening battery moving forward is a prudent action in light of the recent rotavirus vaccine experience.

The FDA has appropriately gone a little further in its request to the viral vaccine manufacturers. The regulators would like to ensure that the future will not bring additional discoveries of viral contaminants in licensed vaccines, and the best way to accomplish this at the moment appears to be to request implementation of updated viral screening methodologies. Does this mean that viral vaccine makers will need to employ deep sequencing on a lot-by-lot basis? Most likely not. It appears, however, that reliance on the in vivo and in vitro virus screening methods which have been the gold standards since the 1980s will no longer be sufficient. So what does this leave us with? What the FDA appears to be asking for is a relatively sensitive universal viral screening method.

The in vivo and in vitro methods were, until now, the best option for this purpose. These methods detect infectious virus only and depend upon the ability of the virus to cause an endpoint response in the system (cytopathic effect, hemagglutination, hemadsorption, or pathology in the laboratory animal species used). Viral genomic material alone would therefore not be detected, and the methods have had to be supplemented with specific nucleic acid-based tests for viruses which could not otherwise be detected (e.g., HIV, hepatitis B, human parvovirus B19, porcine circovirus).

Some options for sensitive and universal viral screening methods which might fit the requirements include DNA microarrays and universal sequencing methods performed on cell and viral stocks. The latter technology may be preferable, as microarrays are constructed to detect known viruses, while the desire is that the technology be universal in the sense that it detect both known and unknown viruses. Such a test will provide additional assurance that the virus and cell banks used to manufacture viral vaccines do not harbor a viral contaminant.

Other universal viral screening methods that are less labor-intensive than the sequencing technologies may be developed in the near future, and adding one of these to the release testing battery for viral vaccine lots may need to be considered in order to satisfy the FDA's goals.

Wednesday, September 1, 2010

Is Clarence calculating clearance correctly?

by Dr. Ray Nims

As pointed out by Dr. Rudge in a recent posting, “Do we have clearance, Clarence?”, spiking studies conducted for the purpose of validating impurity clearance are often done at only one spiking level (indeed, often at the highest impurity load attainable). This is especially true for validation of adventitious agent (virus and mycoplasma) clearance in downstream processes. The studies are done in this way in order to determine the upper limit of agent clearance (in terms of log10 reduction) achievable by the process step. The log10 reduction factors from individual process steps are then summed to determine the overall capability of the downstream process to clear adventitious agents. The regulatory agencies have fairly clear expectations for such clearance capabilities, which must generally be met by biologics manufacturers.
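Because log10 reduction values (LRVs) add while fold reductions multiply, the summation across steps can be sketched in a few lines of Python. The step names and LRVs below are invented for illustration only and are not taken from any actual validation study:

```python
# Hypothetical downstream process steps and their individual log10 reduction
# values (LRVs), as would be determined in separate spiking studies.
step_lrvs = {
    "low-pH inactivation": 4.2,
    "anion-exchange chromatography": 3.1,
    "virus-retentive filtration": 4.5,
}

# LRVs are logarithms, so they sum; the corresponding fold reductions multiply.
total_lrv = sum(step_lrvs.values())
total_fold_reduction = 10 ** total_lrv

print(f"Overall clearance: {total_lrv:.1f} log10 "
      f"(~{total_fold_reduction:.1e}-fold)")
```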

The limiting factor in such clearance studies is typically the amount or titer of agent that can be spiked into the process solution, which is determined by: 1) the titer of the stock used for spiking, and 2) the maximum dilution of the process solution allowed during spiking (typically 10%). Under these circumstances, as Scott points out, the determined clearance efficiency (i.e., the percentage of the load which is cleared during the step) may underestimate the clearance that might be obtained at lower impurity loading levels.
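The arithmetic behind this ceiling can be sketched as follows. The spike titer, volumes, and detection limit are assumed values chosen for illustration, not figures from any particular study:

```python
import math

def lrv(load_in: float, load_out: float) -> float:
    """Log10 reduction value from total agent load before and after the step."""
    return math.log10(load_in / load_out)

def percent_cleared(lrv_value: float) -> float:
    """Percent of the spiked agent removed/inactivated for a given LRV."""
    return (1.0 - 10 ** (-lrv_value)) * 100.0

# A 10% v/v spike of a 1e8 TCID50/mL stock into a 100 mL process pool
# (illustrative numbers) caps the total input load, and hence the maximum
# LRV the study can ever demonstrate, regardless of how effective the step is.
spike_titer = 1e8                           # TCID50/mL of the spiking stock (assumed)
spike_volume = 10.0                         # mL, i.e., 10% of a 100 mL pool
input_load = spike_titer * spike_volume     # total TCID50 spiked

# If the post-step sample is at an assumed detection limit of 1 TCID50,
# the demonstrable ceiling is log10(1e9 / 1) = 9 log10.
print(f"Maximum demonstrable LRV: {lrv(input_load, 1.0):.1f}")
print(f"3 log10 reduction = {percent_cleared(3.0):.1f}% cleared")
```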

Adventitious agent clearance comprises two possible modalities: removal and inactivation. Removal refers to physical processes designed to eliminate the agent from the process solution, usually through filtration or chromatography. Removal efficiency through filtration would not be expected to vary with impurity loading. Chromatographic separation of agents (by, for example, ion-exchange columns), on the other hand, may saturate at the highest loadings, so use of the highest possible loading levels may underestimate removal efficiency at lower (i.e., more typical) impurity levels.

Inactivation refers to physical or chemical means of rendering the agent non-infectious. Agent inactivation is not always a simple, first-order reaction; it may be more complex, with a fast phase 1 stage of inactivation followed by a slow phase 2 stage. An inactivation study is therefore designed so that samples are taken at multiple time points, allowing an inactivation time course to be constructed. As with removal studies, the highest possible impurity levels are typically used to determine inactivation kinetics.

Source: Omar et al. Transfusion 36:866-872, 1996.
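The biphasic behavior described above can be illustrated with a simple two-population decay model: a fast-inactivating majority plus a small resistant tail that inactivates slowly. All parameters below are invented for illustration and are not fitted to the Omar et al. data:

```python
import math

def biphasic_survivors(t: float, f_fast: float = 0.99,
                       k_fast: float = 2.0, k_slow: float = 0.1) -> float:
    """Surviving infectious fraction at time t (minutes) under a two-population
    model: a fraction f_fast inactivates with rate k_fast per minute, and the
    remainder (a resistant tail) inactivates with the slower rate k_slow.
    Parameter values are illustrative, not from any cited study."""
    return f_fast * math.exp(-k_fast * t) + (1.0 - f_fast) * math.exp(-k_slow * t)

# Tabulate the time course: LRV rises quickly in phase 1, then the resistant
# tail dominates and the curve flattens into the slow phase 2.
for t in [0, 1, 5, 15, 30, 60]:
    s = biphasic_survivors(t)
    print(f"t = {t:>3} min: surviving fraction = {s:.2e}, "
          f"LRV = {-math.log10(s):.2f}")
```

The flattening of such a curve is one reason sampling at several time points matters: extrapolating first-order kinetics from early time points alone would overstate the inactivation achieved later in the hold.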

While the information obtained through clearance studies of this type may be incomplete from the point of view of understanding the relationships between impurity loading levels and clearance efficiency, the results obtained are consistent with the regulatory expectation that the clearance modalities be evaluated under worst-case conditions. Therefore, at least in the case of adventitious agent clearance validation, I would say that Clarence is calculating clearance correctly!