In recent years there has been a large increase in the type and number of products packaged using Modified Atmosphere Packaging (MAP) procedures. The majority of these processes concern the removal of oxygen from packaging systems in order to reduce oxidative degradation of the product and increase shelf life. Whilst most of the historical development of MAP took place within the food industry, over the last 20 years there has been substantial growth in its application within the pharmaceutical sector. The classic MAP application in the pharmaceutical industry uses medical-grade nitrogen to purge atmospheric oxygen and has been applied to many systems, including containers equipped with septum-capped closures, pressurised gas cans, blister packs, glass snap-ampoules, foil/plastic sachets, infusion bags and medical devices.
Since oxidation of susceptible compounds proceeds most readily in liquid systems, MAP was initially applied in the pharmaceutical world to products such as parenterals. However, for ease of use, many oxidisable active pharmaceutical ingredients (APIs) are formulated in solid dose form, and blister strips are the favoured packaging for these. Blisters have the advantage that each dose is housed in what is essentially its own individual container, so a single dose may be taken without exposing the remaining units to ambient air. Other advantages include the ease of complying with prescribed dosage frequency and the fact that any tampering is evident. In Europe, over 80% of all solid dose forms are presented to consumers as blister strips, and these offer the greatest technical challenge in terms of MAP where Quality Control (QC) analysis for residual oxygen is concerned.
There are two technologies routinely applied in QC analysis to confirm the removal of oxygen: Oxygen Probe Technology and Gas Chromatography. There are many “Oxygen Probe” analysers on the market, based on paramagnetic and zirconia cells. Butterworth has invested in both of these technologies in order to meet the prescribed requirements for testing the high-purity nitrogen gas used in MAP to both Ph. Eur. and USP/NF monographs. These analysers are very competitive in terms of cost (purchase, running and maintenance), accuracy, bench footprint and ease of training, as long as large sample volumes are available. However, as one manufacturer of Oxygen Probe systems reported, when a 0.5mL sample of known 1% oxygen content was tested, a result of 3% was obtained. When analysing blister packs, which may contain only a few hundred microlitres of headspace, such instrumentation cannot compete with traditional Gas Chromatography (GC) using Thermal Conductivity Detection (TCD), which allows meaningful quantitation with as little as 10µL of sample.
There are no absolute pharmacopoeial specifications or regulatory requirements for levels of residual oxygen in MAP. Product-specific specifications are established from the stability studies used to obtain shelf-life data for new products. Any specification set should take into consideration the lowest residual oxygen level technically achievable, the maximum level shown to have no effect on product stability, and the limit of quantitation (LOQ) of the analytical technique employed. If the specification is set needlessly low, production batches may be rejected without any real cause, and the analytical challenges associated with routine QC will be increased without reason.
Generally, the initial analytical challenge in residual oxygen analysis is sampling an accurately measured aliquot of the sample atmosphere from the product packaging without contaminating that aliquot with atmospheric oxygen in the process. For example, while it may be straightforward to remove a sample aliquot from a large-volume, septum-capped container using a syringe, other containers such as small glass snap-ampoules or blister strips are much harder to process in the laboratory.
When running chromatographic analysis using traditional external calibration, it is important to ensure not only that accurately measured volumes of standard and sample are introduced onto the GC system, but also that these volumes are at the same absolute pressure. In order to avoid the problems of reduced pressure, methods using external standard quantitation require the addition of an equal volume of water to the sample atmosphere intended for analysis. This prevents a partial vacuum forming, which would affect quantitation, but also increases the risk of contamination. Many laboratories assume that sample containers are initially filled with headspace at atmospheric pressure, which in practice is not always the case.
The Ratio Calibration Procedure verified at Butterworth negates both volume- and pressure-related sources of analytical error, as well as contamination during the sampling procedure. The analysis is based on the peak area response ratio of oxygen to nitrogen determined for standards and samples, rather than the absolute areas for oxygen used by traditional external standard methods. It can be applied to sample containers equipped with septum-capped closures, pressurised gas cans, blister packs, glass snap-ampoules, foil/plastic sachets, infusion bags and medical devices containing headspace purged with nitrogen. Verification was performed using blister strips, as these were considered the most difficult sample container type to process due to the extremely small sample headspace available.
Air by volume is approximately 21% oxygen and 78% nitrogen. The chromatogram below shows the oxygen (smaller 1st peak) and nitrogen (larger 2nd peak) responses obtained from injection of 50µL of air into the chromatograph.
Figure 1: 50µL air injection chromatogram
For ease of explanation, it is assumed that the areas of the oxygen and nitrogen peaks are 21 and 78 units respectively. This gives an oxygen to nitrogen ratio of 0.269 for the 50µL GC injection. If, due to a partial vacuum (or indeed higher than atmospheric pressure) being present in the sample headspace at the time of sampling, or due to difficult sample handling, a second injection of only 25µL of air is made, the absolute peak responses would be halved, giving areas for oxygen and nitrogen of 10.5 and 39. If an external standard calibration were being applied, considering only the two oxygen responses, it would be calculated that the second injection contained only half as much oxygen as the first, even though this difference is due to the differing volumes and not the actual sample constitution. However, calculating the ratio of oxygen to nitrogen in the second injection gives a figure of 0.269, confirming that in reality both injected aliquots contained the same percentage of oxygen.
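The worked example above can be sketched in a few lines of code. The peak areas are the illustrative figures from the text, not real detector output:

```python
# Hypothetical peak areas from the worked example: a 50µL air
# injection and a 25µL injection of the same atmosphere.
full_injection = {"oxygen": 21.0, "nitrogen": 78.0}   # 50µL aliquot
half_injection = {"oxygen": 10.5, "nitrogen": 39.0}   # 25µL aliquot

# External standard quantitation compares absolute oxygen areas,
# so the halved volume wrongly looks like half the oxygen content.
external_std_bias = half_injection["oxygen"] / full_injection["oxygen"]
print(external_std_bias)  # 0.5: apparent halving of oxygen

# The oxygen:nitrogen area ratio is unaffected by the injected volume.
ratio_full = full_injection["oxygen"] / full_injection["nitrogen"]
ratio_half = half_injection["oxygen"] / half_injection["nitrogen"]
print(round(ratio_full, 3), round(ratio_half, 3))  # 0.269 0.269
```

The same ratio is recovered from both injections, which is the whole basis of the procedure.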
The GC ratio analysis at Butterworth calibrates by determining the oxygen to nitrogen area ratios for a range of concentrations of oxygen in nitrogen, which, as shown above, are volume independent. Thereafter, and again independent of the volume injected, when a sample of unknown oxygen concentration is injected, the area ratio obtained is compared with those of the standards, and an accurate figure for the actual concentration of oxygen in the sample can thus be calculated.
The procedure, however, will only work when nitrogen gas is used for purging and essentially only the two gases, oxygen and nitrogen, are present in the samples being tested.
All sampling procedures must be carried out inside a helium-purged glove box or bag. The latter is a large, durable plastic bag, filled with ultra-high purity helium, with a sealable opening for the insertion and removal of equipment. Since the GC-TCD equipment uses helium as carrier (and reference) gas, any helium introduced into the sampled aliquot gives no response and hence does not interfere with the oxygen to nitrogen ratios in any way.
Figures 2 and 3: Sample preparation under ultra-high purity helium
In routine use at Butterworth, the method is operated over a range of 0.2 to 5.0% residual oxygen, given an approximate 50µL GC injection volume. During verification, linearity was demonstrated with an r² of 1.000. In practice, the blank contribution is not significant, and injection of helium from the glove bag typically gives oxygen results of c. 0.05%. Using calculations based on the actual response of the 0.2% oxygen in nitrogen standard, a detection limit of 0.02% is easily obtained. Spike recoveries within 2% relative have been demonstrated. It is noted that 0.2% represents a 100-fold reduction in oxygen content compared to atmospheric oxygen, and this has been adequate for the majority of client applications.
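A spike recovery acceptance check of the kind described above amounts to a one-line calculation. The spiked and measured levels here are hypothetical figures chosen to be consistent with the "within 2% relative" performance stated in the text:

```python
# Hypothetical spike-recovery check for the ratio method.
spiked_level = 1.00      # % oxygen spiked into a nitrogen-purged container
measured_level = 0.985   # % oxygen found by the ratio calibration method

recovery = 100.0 * measured_level / spiked_level   # recovery in %
relative_error = abs(recovery - 100.0)             # % relative deviation

print(round(recovery, 1))   # 98.5
assert relative_error <= 2.0  # passes the 2% relative criterion
```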
In conclusion: sample and standard preparation is carried out under a helium atmosphere, eliminating contamination and removing the requirement for blank correction. The volume sampled and injected into the GC is largely irrelevant, making the procedure simpler, quicker and more accurate than any external standard calibration method. This also translates to cost savings through reduced analysis time, reduced analyst training in the method, and a lower likelihood of erroneous Out of Specification (OOS) results.
Frank Judge BSc (Hons) CChem MRSC MChromSoc CSci
Consultant Chemist, Chromatography
Frank started his career at Kings and Co Ltd as a Senior Technician before joining Berridge Environmental Labs as Organic Analysis Team Leader in 1990. After a short spell with Pharmaco LSR in their Department of Aquatic Toxicology Studies, he joined the Chromatography Department of Butterworth Laboratories in 1994 and has progressed through various roles to his current position. Frank has spoken at JPAG meetings on Organic Volatile Impurities analysis and has been trained in preparing expert witness statements.