Definitions of Antimalarial Drug Resistance

The standard definition of antimalarial drug resistance is "the ability of a parasite strain to survive and/or multiply despite the administration and absorption of a drug given in doses equal to or higher than those usually recommended but within the tolerance of the subject", with the later addendum that "the drug in question must gain access to the parasite or the infected red blood cell for the duration of the time necessary for its normal action" (Bloland 2001). This definition is generally interpreted as referring only to the persistence of parasites after the administration of an antimalarial drug and must be differentiated from prophylaxis failure, which implies a reinfection. It also requires the demonstration of malarial parasitemia in the presence of adequate plasma drug and metabolite concentrations, established using any of the available methodologies. In practice, this is rarely done, and demonstration of the persistence of parasites in a patient receiving directly observed therapy is usually accepted as proof. Therefore, when serum drug concentrations are not measured, caution should guide the interpretation of in vivo therapeutic failure data.

7.4.3 Mechanisms of Antimalarial Resistance: The Host

In areas where malaria is endemic, partial immunity against the most severe forms of disease (death and severe disease) is progressively acquired (Gupta et al. 1999), followed by protection against clinical episodes and finally suppression of parasitemia to low or undetectable levels. Such protection requires a continued booster effect and does not confer sterilising immunity, as individuals may become infected without developing clinical symptoms. Acquired immunity plays a central role in preventing the emergence and spread of resistance in high-transmission settings, as it reduces the parasite burden, a well-known determinant of antimalarial effectiveness. This phenomenon may partially explain the relatively late rise of resistance in highly endemic areas such as sub-Saharan Africa. In such settings, the immune system removes the parasites that have not been adequately cleared by the antimalarial drug, which explains why infections treated with inefficient drugs may still be cured. In low-transmission areas, the degree of acquired immunity is lower or nonexistent, and the majority of infections are symptomatic and associated with a higher parasite biomass. In addition, such infections are generally treated far more often than asymptomatic infections in areas of higher endemicity, increasing the drug pressure as parasites are confronted with antimalarial drugs. Factors that affect the immune system, either physiologically (pregnancy) or pathologically (drugs, diseases), could therefore play a critical role in the development of antimalarial drug resistance. There have been suggestions that immunosuppression secondary to malnutrition (Wolday et al. 1995) or to HIV infection (Ayisi et al. 2003) may compromise acquired immunity to malaria and possibly enhance drug resistance. This needs to be confirmed (Laufer et al. 2007), but if true, the high prevalence of these illnesses in malaria-endemic areas could pose a tremendous threat to existing and future antimalarial drugs.

Variant alleles of human genes encoding enzymes important in certain metabolic pathways may also play a role in the development of resistance by affecting the pharmacokinetics of antimalarial drugs. Variants of the gene encoding the cytochrome P450 enzyme CYP2C19 correlate with slow or rapid metabolism of some antimalarial agents (Baird 2005). Interactions observed when using certain drugs (e.g. oral contraceptives) could be explained by the effect of such drugs on this same enzyme (McGready et al. 2003).

7.4.4 Mechanisms of Antimalarial Resistance: The Parasite, the Vector and the Environment

Some evidence suggests that certain combinations of drug-resistant parasites and vector species could enhance the transmission of drug resistance. Two important malarial vectors in Southeast Asia (Anopheles stephensi and Anopheles dirus) appear to be more susceptible to drug-resistant parasites than to drug-sensitive ones (Sucharit et al. 1977). The opposite may also be true and may partially explain the pockets of chloroquine sensitivity that remain in the world despite very similar human populations and drug-pressure conditions (Bloland 2001).

The initial burden of parasitemia also plays a role in the risk of resistance. High levels of parasitemia require longer exposure to effective drug levels and carry a relatively higher risk of treatment failure than lower ones.

Although it is generally recognized that the level of transmission influences the rate of development and spread of drug resistance, whether malarial transmission intensity plays an independent role in this spread is still a matter of debate. It has been suggested that intensity of transmission is an important determinant of drug resistance because of its relationship to clone multiplicity (Babiker and Walliker 1997). Higher transmission would increase the number of infecting clones per individual and therefore multiply the chances of developing resistance. The exact nature of this relationship is not yet well understood and has given rise to contrasting theories (Talisuna et al. 2004), postulating that the development of drug resistance is increased in both low- (Hastings and D'Alessandro 2000) and high- (Molyneux et al. 1999) transmission settings. The contradictory implications of such findings complicate their use in malaria control strategies, but a recent review (Talisuna et al. 2007) in East Africa suggested that P. falciparum resistance to chloroquine and SP was highest where malarial transmission was most intense and that vector control was associated with an increase in the efficacy of these drugs, presumably by decreasing transmission intensity.

7.4.5 Mechanisms of Antimalarial Resistance: The Drugs

Drug misuse has certainly contributed to the development of resistance, and circumstantial evidence for its particular role is highlighted by the observation that resistance to chloroquine developed in different areas whose common denominator was long-term use of this drug for either prophylaxis or treatment (Payne 1988). However, the appearance of resistance to drugs such as SP in areas with relatively low SP use suggests that other factors, in addition to drug pressure, may also play a role in the spread of resistance.

Drug elimination half-life plays an important role in the evolution of parasite resistance (Hastings et al. 2002). Drugs with a long elimination half-life (e.g. SP, mefloquine) have multiple therapeutic advantages. Patient compliance is improved, as treatments are normally taken as a single dose or a short regimen that can be supervised by the clinician. The prolonged elimination period maintains therapeutic plasma drug levels, which offer some protection against the re-emergence of parasitemia for several weeks and give patients time to recover from anaemia, a major cause of malaria-related morbidity in areas of intense malarial transmission. Subtherapeutic drug concentrations, however, eliminate the most susceptible parasites and leave those better able to recover and reproduce. During this long elimination period, parasites from new infections, or recrudescent parasites from infections that did not fully clear, are exposed to blood drug levels insufficient to provide protection but high enough to exert selective pressure (Watkins and Mosobo 1993). The initial individual benefit conferred by drugs with long half-lives may therefore be counterproductive at the population level, as they create a potent selective pressure capable of accelerating the evolution of resistance.
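The selective-window effect described above can be sketched numerically. Assuming simple first-order (exponential) elimination, drug concentration falls below the level that inhibits resistant parasites long before it falls below the level that inhibits sensitive ones; in between lies the window of selective pressure. All concentrations, inhibitory thresholds and the half-life below are hypothetical illustration values, not clinical parameters:

```python
import math

def concentration(c0, half_life_days, t_days):
    """First-order elimination: the concentration halves every half-life."""
    return c0 * 0.5 ** (t_days / half_life_days)

def selective_window(c0, half_life_days, mic_sensitive, mic_resistant):
    """Return (start, end) in days of the period during which the drug
    still suppresses sensitive parasites but no longer suppresses
    resistant ones (requires mic_resistant > mic_sensitive).
    Solving c0 * 0.5**(t/h) = mic gives t = h * log2(c0 / mic)."""
    t_open = half_life_days * math.log2(c0 / mic_resistant)
    t_close = half_life_days * math.log2(c0 / mic_sensitive)
    return t_open, t_close

# Hypothetical values: initial level 1000 (arbitrary units), 8-day
# half-life, resistant parasites tolerate 10x the sensitive threshold.
start, end = selective_window(c0=1000, half_life_days=8,
                              mic_sensitive=10, mic_resistant=100)
print(f"Selective window: day {start:.1f} to day {end:.1f}")
```

With these illustrative numbers the window spans roughly four weeks, showing why a long half-life, beneficial for the individual patient, extends the period in which reinfecting parasites meet subtherapeutic, selecting drug levels.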

The development of resistance to one drug can facilitate the development of resistance to others, provided they are closely related chemically. This phenomenon, cross-resistance, has been well described between chloroquine and amodiaquine, two structurally related drugs within the 4-aminoquinoline family (Bray et al. 1996; O'Neill et al. 1998). Similarly, development of resistance to mefloquine may also lead to resistance to quinine or halofantrine. Because of its structural similarity to other antifolates, the promising combination chlorproguanil-dapsone (LapDap) may be at risk even before it is deployed. Antimalarials with new modes of action need to be developed to avoid this problem.

We have seen that certain genotypes may influence the bioavailability of antimalarial drugs. Some treatments may also interact with antimalarial drugs by competing with them for shared metabolic pathways. Similarly, certain foods may further affect bioavailability by increasing or reducing gastrointestinal absorption, thus modulating drug effectiveness. An example is the enhancement of artemether-lumefantrine absorption by fatty foods (Piola et al. 2005).
