Malaria

Malaria is a vector-borne infectious disease caused by protozoan parasites. It is widespread in tropical and subtropical regions, including parts of the Americas, Asia, and Africa. Each year, there are approximately 515 million cases of malaria, killing between one and three million people, the majority of whom are young children in Sub-Saharan Africa.[1] Malaria is commonly associated with poverty, but is also a cause of poverty and a major hindrance to economic development.

Malaria is one of the most common infectious diseases and an enormous public health problem. The disease is caused by protozoan parasites of the genus Plasmodium. Four species of the Plasmodium parasite commonly infect humans; the most serious forms of the disease are caused by Plasmodium falciparum and Plasmodium vivax, but the related species Plasmodium ovale and Plasmodium malariae can also affect humans. This group of human-pathogenic Plasmodium species is usually referred to as the malaria parasites.

Malaria parasites are transmitted by female Anopheles mosquitoes. The parasites multiply within red blood cells, causing symptoms of anemia (light-headedness, shortness of breath, tachycardia, etc.), as well as general symptoms such as fever, chills, nausea, flu-like illness, and, in severe cases, coma and death. Malaria transmission can be reduced by preventing mosquito bites with mosquito nets and insect repellents, or by mosquito control measures such as spraying insecticides inside houses and draining the standing water where mosquitoes lay their eggs.

Although some are under development, no vaccine is currently available for malaria; preventative drugs must be taken continuously to reduce the risk of infection. These prophylactic drug treatments are often too expensive for most people living in endemic areas. Most adults from endemic areas have a degree of long-term infection, which tends to recur, and also possess partial immunity (resistance); this resistance wanes with time, and such adults may become susceptible to severe malaria if they have spent a significant amount of time in non-endemic areas. They are strongly recommended to take full precautions if they return to an endemic area. Malaria infections are treated with antimalarial drugs, such as quinine or artemisinin derivatives, although drug resistance is increasingly common.

Distribution and impact

Areas of the world where malaria is endemic as of 2003 (coloured yellow).[17]

Malaria causes about 400–900 million cases of fever and approximately one to three million deaths annually,[18][19] representing at least one death every 30 seconds. The vast majority of cases occur in children under the age of five;[20] pregnant women are also especially vulnerable. Despite efforts to reduce transmission and increase treatment, there has been little change in which areas are at risk of the disease since 1992.[21] Indeed, if the prevalence of malaria stays on its present upward course, the death rate could double in the next twenty years.[18] Precise statistics are unknown because many cases occur in rural areas where people do not have access to hospitals or the means to afford health care; consequently, the majority of cases are undocumented.[18]
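The "one death every 30 seconds" figure follows from the cited annual totals; a quick back-of-the-envelope check (the arithmetic here is illustrative, not from the source):

```python
# Average interval between malaria deaths implied by the cited
# annual totals of one to three million deaths per year.
SECONDS_PER_YEAR = 365 * 24 * 60 * 60  # 31,536,000 seconds

for deaths_per_year in (1_000_000, 3_000_000):
    interval = SECONDS_PER_YEAR / deaths_per_year
    print(f"{deaths_per_year:,} deaths/year -> one every {interval:.1f} s")
```

At one million deaths per year the interval is about 31.5 seconds, shortening to about 10.5 seconds at three million, which is the range behind the quoted figure.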

Although co-infection with HIV and malaria causes increased mortality, this is less of a problem than HIV/tuberculosis co-infection, because the two diseases usually attack different age ranges: malaria is most common in the young, and active tuberculosis in the old.[22] Although HIV/malaria co-infection produces less severe symptoms than the interaction between HIV and TB, HIV and malaria do contribute to each other's spread: malaria increases viral load, and HIV infection increases a person's susceptibility to malaria infection.[23]

Malaria is presently endemic in a broad band around the equator, in areas of the Americas, many parts of Asia, and much of Africa; however, it is in sub-Saharan Africa where 85–90% of malaria fatalities occur.[24] The geographic distribution of malaria within large regions is complex, and malaria-afflicted and malaria-free areas are often found close to each other.[25] In drier areas, outbreaks of malaria can be predicted with reasonable accuracy by mapping rainfall.[26] Malaria is more common in rural areas than in cities; this is in contrast to dengue fever, where urban areas present the greater risk.[27] For example, the cities of Vietnam, Laos and Cambodia are essentially malaria-free, but the disease is present in many rural regions.[28] By contrast, in Africa malaria is present in both rural and urban areas, though the risk is lower in the larger cities.[29] The global endemic levels of malaria have not been mapped since the 1960s. However, the Wellcome Trust, UK, has funded the Malaria Atlas Project[30] to rectify this, providing a more contemporary and robust means of assessing current and future malaria disease burden.

Socio-economic effects

Malaria is not just a disease commonly associated with poverty, but is also a cause of poverty and a major hindrance to economic development. The disease has been associated with major negative economic effects on regions where it is widespread. A comparison of average per capita GDP in 1995, adjusted for purchasing power parity, between malarious and non-malarious countries demonstrates a fivefold difference ($1,526 USD versus $8,268 USD). Moreover, in countries where malaria is common, average per capita GDP rose (between 1965 and 1990) only 0.4% per year, compared to 2.4% per year in other countries.[31] However, correlation does not demonstrate causation, and the association arises at least partly because these regions lack the financial capacity to prevent malaria. In its entirety, the economic impact of malaria has been estimated to cost Africa $12 billion USD every year. This impact includes costs of health care, working days lost to sickness, days lost in education, decreased productivity due to brain damage from cerebral malaria, and loss of investment and tourism.[20] In some countries with a heavy malaria burden, the disease may account for as much as 40% of public health expenditure, 30–50% of inpatient admissions, and up to 50% of outpatient visits.[32]
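The gap between 0.4% and 2.4% annual growth compounds substantially over the 25 years cited; a small illustration of the cumulative effect (the growth rates are from the text above, the compounding arithmetic is ours):

```python
# Compound the cited per-capita GDP growth rates over 1965-1990.
years = 1990 - 1965  # 25 years

malarious = 1.004 ** years  # 0.4%/year compounds to roughly a 10% total rise
others = 1.024 ** years     # 2.4%/year compounds to roughly an 81% total rise

print(f"malarious countries: {malarious:.2f}x over {years} years")
print(f"other countries:     {others:.2f}x over {years} years")
```

Over that single generation, the cited growth gap alone widens the income ratio between the two groups by a further factor of about 1.6.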


Symptoms

Symptoms of malaria include fever, shivering, arthralgia (joint pain), vomiting, anemia (caused by hemolysis), hemoglobinuria, and convulsions. There may be a feeling of tingling in the skin, particularly with malaria caused by P. falciparum.[citation needed] The classical symptom of malaria is the cyclical occurrence of sudden coldness followed by rigor and then fever and sweating lasting four to six hours, occurring every two days in P. vivax and P. ovale infections and every three days in P. malariae infections.[33] P. falciparum can cause recurrent fever every 36–48 hours, or a less pronounced and almost continuous fever. For reasons that are poorly understood, but which may be related to high intracranial pressure, children with malaria frequently exhibit abnormal posturing, a sign indicating severe brain damage.[34] Malaria has been found to cause cognitive impairments, especially in children: it causes widespread anemia during a period of rapid brain development, as well as direct brain damage. This neurologic damage results from cerebral malaria, to which children are more vulnerable.[35]

Severe malaria is almost exclusively caused by P. falciparum infection and usually arises 6–14 days after infection.[36] Consequences of severe malaria include coma and death if untreated; young children and pregnant women are especially vulnerable. Splenomegaly (enlarged spleen), severe headache, cerebral ischemia, hepatomegaly (enlarged liver), hypoglycemia, and hemoglobinuria with renal failure may occur. Renal failure may cause blackwater fever, in which hemoglobin from lysed red blood cells leaks into the urine. Severe malaria can progress extremely rapidly and cause death within hours or days.[36] In the most severe cases of the disease, fatality rates can exceed 20%, even with intensive care and treatment.[37] In endemic areas, treatment is often less satisfactory and the overall fatality rate for all cases of malaria can be as high as one in ten.[38] Over the longer term, developmental impairments have been documented in children who have suffered episodes of severe malaria.[39]

Chronic malaria is seen in both P. vivax and P. ovale, but not in P. falciparum. Here, the disease can relapse months or years after exposure, due to the presence of latent parasites in the liver. Describing a case of malaria as cured by observing the disappearance of parasites from the bloodstream can therefore be deceptive. The longest incubation period reported for a P. vivax infection is 30 years.[36] Approximately one in five P. vivax malaria cases in temperate areas involve overwintering by hypnozoites (i.e., relapses begin the year after the mosquito bite).[40]

Causes


A Plasmodium sporozoite traverses the cytoplasm of a mosquito midgut epithelial cell in this false-color electron micrograph.

Malaria parasites

Malaria is caused by protozoan parasites of the genus Plasmodium (phylum Apicomplexa). In humans, malaria is caused by P. falciparum, P. malariae, P. ovale, P. vivax and P. knowlesi. P. falciparum is the most common cause of infection, responsible for about 80% of all malaria cases and about 90% of malaria deaths.[41] Parasitic Plasmodium species also infect birds, reptiles, monkeys, chimpanzees and rodents.[42] There have been documented human infections with several simian species of malaria, namely P. knowlesi, P. inui, P. cynomolgi,[43] P. simiovale, P. brazilianum, P. schwetzi and P. simium; however, with the exception of P. knowlesi, these are mostly of limited public health importance. Although avian malaria can kill chickens and turkeys, this disease does not cause serious economic losses to poultry farmers.[44] However, since being accidentally introduced by humans it has decimated the endemic birds of Hawaii, which evolved in its absence and lack any resistance to it.[45]

Mosquito vectors and the Plasmodium life cycle

The parasite's primary (definitive) hosts and transmission vectors are female mosquitoes of the Anopheles genus. A mosquito becomes infected when it takes a blood meal from an infected human. Once ingested, the parasite gametocytes taken up in the blood differentiate into male or female gametes and then fuse in the mosquito's gut. This produces an ookinete that penetrates the gut lining and forms an oocyst in the gut wall. When the oocyst ruptures, it releases sporozoites that migrate through the mosquito's body to the salivary glands, where they are ready to infect a new human host. This type of transmission is occasionally referred to as anterior station transfer.[46] The sporozoites are injected into the skin, alongside saliva, when the mosquito takes a subsequent blood meal.

Only female mosquitoes feed on blood, thus males do not transmit the disease. The females of the Anopheles genus of mosquito prefer to feed at night. They usually start searching for a meal at dusk, and will continue throughout the night until taking a meal. Malaria parasites can also be transmitted by blood transfusions, although this is rare.[47]

Pathogenesis

The life cycle of malaria parasites in the human body. A mosquito infects a person by taking a blood meal. First, sporozoites enter the bloodstream and migrate to the liver. They infect liver cells (hepatocytes), where they multiply into merozoites, rupture the liver cells, and escape back into the bloodstream. The merozoites then infect red blood cells, where they develop into ring forms, then trophozoites (a feeding stage), then schizonts (a reproduction stage), and then back into merozoites. Sexual forms called gametocytes are also produced, which, if taken up by a mosquito, will infect the insect and continue the life cycle.

Malaria in humans develops via two phases: an exoerythrocytic phase (exo = outside; erythrocytic = red blood cell), occurring mainly in the liver (hepatic), and an erythrocytic phase. When an infected mosquito pierces a person's skin to take a blood meal, sporozoites in the mosquito's saliva enter the bloodstream and migrate to the liver. Within 30 minutes of being introduced into the human host, they infect hepatocytes, multiplying asexually and asymptomatically for a period of 6–15 days. Once in the liver, these organisms differentiate to yield thousands of merozoites which, following rupture of their host cells, escape into the blood and infect red blood cells, thus beginning the erythrocytic stage of the life cycle.[48] The parasite escapes from the liver undetected by wrapping itself in the cell membrane of the infected host liver cell.[49]


Within the red blood cells the parasites multiply further, again asexually, periodically breaking out of their host cells to invade fresh red blood cells. Several such amplification cycles occur. The classical waves of fever thus arise from simultaneous waves of merozoites escaping and infecting red blood cells.

Some P. vivax and P. ovale sporozoites do not immediately develop into exoerythrocytic-phase merozoites, but instead produce hypnozoites that remain dormant for periods ranging from several months (6–12 months is typical) to as long as three years. After a period of dormancy, they reactivate and produce merozoites. Hypnozoites are responsible for long incubation and late relapses in these two species of malaria.[50]

The parasite is relatively protected from attack by the body's immune system because for most of its human life cycle it resides within liver and blood cells and is relatively invisible to immune surveillance. However, circulating infected blood cells are destroyed in the spleen. To avoid this fate, the P. falciparum parasite displays adhesive proteins on the surface of infected blood cells, causing the blood cells to stick to the walls of small blood vessels, thereby sequestering the parasite from passage through the general circulation and the spleen.[51] This "stickiness" is the main factor giving rise to hemorrhagic complications of malaria. High endothelial venules (the smallest branches of the circulatory system) can be blocked by the attachment of masses of these infected red blood cells. The blockage of these vessels causes symptoms such as those of placental and cerebral malaria. In cerebral malaria the sequestered red blood cells can breach the blood–brain barrier, possibly leading to coma.[52]

Although the red blood cell surface adhesive proteins (called PfEMP1, for Plasmodium falciparum erythrocyte membrane protein 1) are exposed to the immune system they do not serve as good immune targets because of their extreme diversity; there are at least 60 variations of the protein within a single parasite and perhaps limitless versions within parasite populations.[51] Like a thief changing disguises or a spy with multiple passports, the parasite switches between a broad repertoire of PfEMP1 surface proteins, thus staying one step ahead of the pursuing immune system.

Some merozoites turn into male and female gametocytes. If a mosquito pierces the skin of an infected person, it potentially picks up gametocytes within the blood. Fertilization and sexual recombination of the parasite occurs in the mosquito's gut, thereby defining the mosquito as the definitive host of the disease. New sporozoites develop and travel to the mosquito's salivary gland, completing the cycle. Pregnant women are especially attractive to the mosquitoes,[53] and malaria in pregnant women is an important cause of stillbirths, infant mortality and low birth weight,[54] particularly in P. falciparum infection, but also in other species infection, such as P. vivax.[55]

Thalassaemias

A well-documented set of mutations in the human genome associated with malaria are those causing the blood disorders known as thalassaemias. Studies in Sardinia and Papua New Guinea have found that the gene frequency of β-thalassaemias is related to the level of malarial endemicity in a given population. A study of more than 500 children in Liberia found that those with β-thalassaemia had a 50% decreased chance of getting clinical malaria. Similar studies have found links between gene frequency and malaria endemicity for the α+ form of α-thalassaemia. Presumably these genes have been selected for in the course of human evolution.

Duffy antigens

The Duffy antigens are antigens expressed on red blood cells and other cells in the body, where they act as a chemokine receptor. The expression of Duffy antigens on blood cells is encoded by the Fy genes (Fya, Fyb, Fyc, etc.). Plasmodium vivax uses the Duffy antigen to enter blood cells. However, it is possible to express no Duffy antigen on red blood cells (Fy-/Fy-). This genotype confers complete resistance to P. vivax infection. The genotype is very rare in European, Asian and American populations, but is found in almost all of the indigenous population of West and Central Africa.[57] This is thought to be due to very high exposure to P. vivax in Africa over the last few thousand years.

G6PD

Glucose-6-phosphate dehydrogenase (G6PD) is an enzyme that normally protects red blood cells from the effects of oxidative stress. Paradoxically, a genetic deficiency in this enzyme results in increased protection against severe malaria.

HLA and interleukin-4

HLA-B53 is associated with low risk of severe malaria. This MHC class I molecule presents liver-stage and sporozoite antigens to T cells. Interleukin-4, encoded by IL4, is produced by activated T cells and promotes proliferation and differentiation of antibody-producing B cells. A study of the Fulani of Burkina Faso, who have both fewer malaria attacks and higher levels of antimalarial antibodies than neighboring ethnic groups, found that the IL4-524 T allele was associated with elevated antibody levels against malaria antigens, raising the possibility that this might be a factor in increased resistance to malaria.[58]

Diagnosis


Blood smear from a P. falciparum culture (K1 strain). Several red blood cells have ring stages inside them. Close to the center there is a schizont and on the left a trophozoite.

Severe malaria is commonly misdiagnosed in Africa, leading to a failure to treat other life-threatening illnesses. In malaria-endemic areas, parasitemia does not ensure a diagnosis of severe malaria because parasitemia can be incidental to other concurrent disease. Recent investigations suggest that malarial retinopathy is better (collective sensitivity of 95% and specificity of 90%) than any other clinical or laboratory feature in distinguishing malarial from non-malarial coma.[59]

Symptomatic diagnosis

Areas that cannot afford even simple laboratory diagnostic tests often use only a history of subjective fever as the indication to treat for malaria. Using Giemsa-stained blood smears from children in Malawi, one study showed that unnecessary treatment for malaria was significantly decreased when clinical predictors (rectal temperature, nailbed pallor, and splenomegaly) were used as treatment indications, rather than the current national policy of using only a history of subjective fevers (sensitivity increased from 21% to 41%).[60]

Microscopic examination of blood films

The most economical, preferred, and reliable diagnosis of malaria is microscopic examination of blood films, because each of the four major parasite species has distinguishing characteristics. Two sorts of blood film are traditionally used. Thin films are similar to usual blood films and allow species identification, because the parasite's appearance is best preserved in this preparation. Thick films allow the microscopist to screen a larger volume of blood and are about eleven times more sensitive than the thin film, so picking up low levels of infection is easier on the thick film; however, the appearance of the parasite is much more distorted, and distinguishing between the different species can therefore be much more difficult. Given the pros and cons of both preparations, both thick and thin smears should be used when attempting to make a definitive diagnosis.[61]

From the thick film, an experienced microscopist can detect parasite levels (parasitemia) as low as 0.0000001% of red blood cells. Diagnosis of species can be difficult because the early trophozoites ("ring forms") of all four species look identical; it is never possible to diagnose species on the basis of a single ring form, and species identification is always based on several trophozoites.

Field tests

In areas where microscopy is not available, or where laboratory staff are not experienced at malaria diagnosis, there are antigen detection tests that require only a drop of blood.[62] Immunochromatographic tests (also called malaria rapid diagnostic tests, antigen-capture assays or "dipsticks") have been developed, distributed and field-tested. These tests use finger-stick or venous blood, the completed test takes a total of 15–20 minutes, and a laboratory is not needed. The threshold of detection by these rapid diagnostic tests is in the range of 100 parasites/µl of blood, compared to 5 parasites/µl by thick film microscopy. The first rapid diagnostic tests used P. falciparum glutamate dehydrogenase as the antigen.[63] pGluDH was soon replaced by P. falciparum lactate dehydrogenase (pLDH), a 33 kDa oxidoreductase [EC 1.1.1.27]. It is the last enzyme of the glycolytic pathway, essential for ATP generation, and one of the most abundant enzymes expressed by P. falciparum. pLDH does not persist in the blood but clears at about the same time as the parasites following successful treatment. This lack of antigen persistence after treatment makes the pLDH test useful in predicting treatment failure; in this respect, pLDH is similar to pGluDH. The OptiMAL-IT assay can distinguish between P. falciparum and P. vivax because of antigenic differences between their pLDH isoenzymes.

Molecular methods

Molecular methods are available in some clinical laboratories and rapid real-time assays (for example, QT-NASBA based on the polymerase chain reaction)[65] are being developed with the hope of being able to deploy them in endemic areas.

Laboratory tests

OptiMAL-IT will reliably detect falciparum down to 0.01% parasitemia, and non-falciparum down to 0.1%. Paracheck-Pf will detect parasitemias down to 0.002%, but will not distinguish between falciparum and non-falciparum malaria. Parasite nucleic acids can be detected using the polymerase chain reaction; this technique is more accurate than microscopy, but it is expensive and requires a specialized laboratory. Moreover, levels of parasitemia are not necessarily correlated with the progression of disease, particularly when the parasite is able to adhere to blood vessel walls. Therefore, more sensitive, low-tech diagnostic tools need to be developed in order to detect low levels of parasitemia in the field.[66]
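These detection limits are quoted as a percentage of red blood cells infected; they can be related to the parasites-per-microlitre figures used elsewhere in this article. A minimal sketch, assuming a typical red-cell count of roughly 5 million per µl of blood (that count is our assumption, not a figure from the source):

```python
# Convert parasitemia, expressed as % of red blood cells infected,
# into parasites per microlitre of blood.
RBC_PER_UL = 5_000_000  # assumed typical red-cell count per microlitre

def parasites_per_ul(parasitemia_percent: float) -> float:
    """Parasites/ul implied by a given parasitemia percentage."""
    return RBC_PER_UL * parasitemia_percent / 100.0

print(parasites_per_ul(0.01))   # OptiMAL-IT falciparum limit -> 500.0
print(parasites_per_ul(0.002))  # Paracheck-Pf limit -> 100.0
```

Under this assumption the 0.002% Paracheck-Pf limit corresponds to about 100 parasites/µl, matching the threshold quoted for rapid diagnostic tests in the field-tests section.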


Treatment

Active malaria infection with P. falciparum is a medical emergency requiring hospitalization. Infection with P. vivax, P. ovale or P. malariae can often be treated on an outpatient basis. Treatment of malaria involves supportive measures as well as specific antimalarial drugs. When properly treated, someone with malaria can expect a complete recovery.[67]

Antimalarial drugs

There are several families of drugs used to treat malaria. Chloroquine is very cheap and, until recently, was very effective, which made it the antimalarial drug of choice for many years in most parts of the world. However, resistance of Plasmodium falciparum to chloroquine has spread recently from Asia to Africa, making the drug ineffective against the most dangerous Plasmodium strain in many affected regions of the world. In areas where chloroquine is still effective, it remains the first choice. Unfortunately, chloroquine resistance is associated with reduced sensitivity to other drugs such as quinine and amodiaquine.[68]

There are several other substances which are used for treatment and, in some cases, for prevention (prophylaxis). Many drugs may be used for both purposes; larger doses are used to treat cases of malaria. Their deployment depends mainly on the frequency of resistant parasites in the area where the drug is used. One drug currently being investigated for possible use as an antimalarial, especially for treatment of drug-resistant strains, is the beta blocker propranolol. Propranolol has been shown to block both the parasite's ability to enter red blood cells and establish an infection, and its replication within them. A December 2006 study by Northwestern University researchers suggested that propranolol may reduce the dosages required for existing drugs to be effective against P. falciparum by 5- to 10-fold, suggesting a role in combination therapies.[69]

Currently available anti-malarial drugs include:[70]

Artemether-lumefantrine (therapy only; trade names Coartem and Riamet)
Artesunate-amodiaquine (therapy only)
Artesunate-mefloquine (therapy only)
Artesunate-sulfadoxine/pyrimethamine (therapy only)
Atovaquone-proguanil, trade name Malarone (therapy and prophylaxis)
Quinine (therapy only)
Chloroquine (therapy and prophylaxis; usefulness now reduced due to resistance)
Cotrifazid (therapy and prophylaxis)
Doxycycline (therapy and prophylaxis)
Mefloquine, trade name Lariam (therapy and prophylaxis)
Primaquine (therapy in P. vivax and P. ovale only; not for prophylaxis)
Proguanil (prophylaxis only)
Sulfadoxine-pyrimethamine (therapy; prophylaxis for semi-immune pregnant women in endemic countries as "intermittent preventive treatment", IPT)
Hydroxychloroquine, trade name Plaquenil (therapy and prophylaxis)

The development of drugs was facilitated when Plasmodium falciparum was successfully cultured.[71] This allowed in vitro testing of new drug candidates.

Extracts of the plant Artemisia annua, containing the compound artemisinin or semi-synthetic derivatives (substances unrelated to quinine), offer over 90% efficacy rates, but their supply is not meeting demand.[72] In 2007, the Bill & Melinda Gates Foundation contributed $13.6m to support research at the University of York to develop fast-growing, high-yield strains of Artemisia, with researchers predicting an increase in yield of up to 1,000% compared to current varieties.[73] One study in Rwanda showed that children with uncomplicated P. falciparum malaria demonstrated fewer clinical and parasitological failures on post-treatment day 28 when amodiaquine was combined with artesunate, rather than administered alone (OR = 0.34); however, increased resistance to amodiaquine during the study period was also noted.[74]

Since 2001 the World Health Organization has recommended artemisinin-based combination therapy (ACT) as first-line treatment for uncomplicated malaria in areas experiencing resistance to older medications. The most recent WHO treatment guidelines for malaria recommend four different ACTs. While numerous countries, including most African nations, have adopted the change in their official malaria treatment policies, cost remains a major barrier to ACT implementation: because ACTs cost up to twenty times as much as older medications, they remain unaffordable in many malaria-endemic countries. The molecular target of artemisinin is controversial, although recent studies suggest that SERCA, a calcium pump in the endoplasmic reticulum, may be associated with artemisinin resistance.[75] Malaria parasites can develop resistance to artemisinin, and resistance can be produced by mutation of SERCA.[76] However, other studies suggest the mitochondrion is the major target for artemisinin and its analogs.[77]

In February 2002, the journal Science and other press outlets[78] announced progress on a new treatment for infected individuals. A team of French and South African researchers had identified a new drug they were calling "G25".[79] It cured malaria in test primates by blocking the ability of the parasite to copy itself within the red blood cells of its victims. In 2005 the same team of researchers published their research on achieving an oral form, which they refer to as "TE3" or "te3".[80] As of early 2006, there is no information in the mainstream press as to when this family of drugs will become commercially available.

In 1996, Professor Geoff McFadden stumbled upon the work of British biologist Ian Wilson, who had discovered that the plasmodia responsible for causing malaria retained remnants of chloroplasts,[81] organelles usually found in plants, complete with their own functioning genomes. This led Professor McFadden to the realisation that any number of herbicides might be effective against malaria, and so he set about trialling large numbers of them, enjoying a 75% success rate.

These "apicoplasts" are thought to have originated through the endosymbiosis of algae[82] and play a crucial role in fatty acid biosynthesis in plasmodia.[83] To date, 466 proteins have been found to be produced by apicoplasts,[84] and these are now being examined as possible targets for novel antimalarial drugs.

Although effective anti-malarial drugs are on the market, the disease remains a threat to people living in endemic areas who have no proper and prompt access to effective drugs. Access to pharmacies and health facilities, as well as drug costs, are major obstacles. Médecins Sans Frontières estimates that the cost of treating a malaria-infected person in an endemic country was between US$0.25 and $2.40 per dose in 2002.[85]

Counterfeit drugs

Sophisticated counterfeits have been found in several Asian countries, such as Cambodia,[86] China,[87] Indonesia, Laos, Thailand and Vietnam, and are an important cause of avoidable death in these countries.[88] The WHO has said that studies indicate that up to 40% of artesunate-based malaria medications are counterfeit, especially in the Greater Mekong region, and has established a rapid alert system to enable information about counterfeit drugs to be reported quickly to the relevant authorities in participating countries.[89] There is no reliable way for doctors or lay people to detect counterfeit drugs without help from a laboratory. Companies are attempting to combat the persistence of counterfeit drugs by using new technology to provide security from source to distribution.

Prevention and disease control

Anopheles albimanus mosquito feeding on a human arm. This mosquito is a vector of malaria and mosquito control is a very effective way of reducing the incidence of malaria.

Methods used to prevent the spread of disease, or to protect individuals in areas where malaria is endemic, include prophylactic drugs, mosquito eradication, and the prevention of mosquito bites. The continued existence of malaria in an area requires a combination of high human population density, high mosquito population density, and high rates of transmission from humans to mosquitoes and from mosquitoes to humans. If any of these is lowered sufficiently, the parasite will sooner or later disappear from that area, as happened in North America, Europe and much of the Middle East. However, unless the parasite is eliminated from the whole world, it could become re-established if conditions revert to a combination that favours the parasite's reproduction. Many countries are seeing an increasing number of imported malaria cases due to extensive travel and migration. (See Anopheles.)
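This threshold argument is the basis of the classic Ross-Macdonald model of malaria transmission (a standard epidemiological model, not taken from this article). The sketch below, with parameter values invented purely for illustration, shows how the basic reproduction number R0 scales with each factor; the parasite disappears wherever control measures push R0 below 1.

```python
import math

def r0(m, a, b, c, g, n, r):
    """Ross-Macdonald basic reproduction number (illustrative sketch).
    m: mosquitoes per human, a: bites per mosquito per day,
    b: mosquito-to-human transmission probability per bite,
    c: human-to-mosquito transmission probability per bite,
    g: mosquito death rate per day, n: parasite incubation in the
    mosquito (days), r: human recovery rate per day.
    Malaria persists in an area only while r0 > 1."""
    return (m * a**2 * b * c * math.exp(-g * n)) / (r * g)

# Invented baseline parameters for a hypothetical endemic area.
baseline = r0(m=10, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01)
# Halving mosquito density halves R0 (it enters linearly); the biting
# rate a enters squared, so bed nets that halve a quarter R0.
halved_m = r0(m=5, a=0.3, b=0.5, c=0.5, g=0.1, n=10, r=0.01)
print(baseline > 1, abs(halved_m * 2 - baseline) < 1e-9)
```

Because the biting rate appears squared, interventions that reduce human-mosquito contact (nets, repellents) have a disproportionately large effect on R0 in this model.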

There is currently no vaccine that will prevent malaria, but this is an active field of research.

Many researchers argue that prevention of malaria may be more cost-effective than treatment of the disease in the long run, but the capital costs required are out of reach of many of the world's poorest people. Economic adviser Jeffrey Sachs estimates that malaria can be controlled for US$3 billion in aid per year. It has been argued that, in order to meet the Millennium Development Goals, money should be redirected from HIV/AIDS treatment to malaria prevention, which for the same amount of money would provide greater benefit to African economies.[90]

Brazil, Eritrea, India, and Vietnam have, unlike many other developing nations, successfully reduced the malaria burden. Common success factors included conducive country conditions, a targeted technical approach using a package of effective tools, data-driven decision-making, active leadership at all levels of government, involvement of communities, decentralized implementation and control of finances, skilled technical and managerial capacity at national and sub-national levels, hands-on technical and programmatic support from partner agencies, and sufficient and flexible financing.[91]

Vector control

Before DDT, malaria was also successfully eradicated or controlled in several tropical areas by removing or poisoning the breeding grounds of the mosquitoes or the aquatic habitats of the larval stages, for example by filling in or applying oil to places with standing water. These methods have seen little application in Africa for more than half a century.[92]

Efforts to eradicate malaria by eliminating mosquitoes have been successful in some areas. Malaria was once common in the United States and southern Europe, but the draining of wetland breeding grounds and better sanitation, in conjunction with the monitoring and treatment of infected humans, eliminated it from affluent regions. In 2002, there were 1,059 cases of malaria reported in the US, including eight deaths. In five of those cases, the disease was contracted in the United States. Malaria was eliminated from the northern parts of the USA in the early twentieth century, and the use of the pesticide DDT eliminated it from the South by 1951. In the 1950s and 1960s, there was a major public health effort to eradicate malaria worldwide by selectively targeting mosquitoes in areas where malaria was rampant.[93] However, these efforts have so far failed to eradicate malaria in many parts of the developing world; the problem is most prevalent in Africa.

Sterile insect technique is emerging as a potential mosquito control method. Progress towards transgenic, or genetically modified, insects suggests that wild mosquito populations could be made malaria-resistant. Researchers at Imperial College London created the world's first transgenic malaria mosquito,[94] with the first plasmodium-resistant species announced by a team at Case Western Reserve University in Ohio in 2002.[95] Successful replacement of existing populations with genetically modified populations relies upon a drive mechanism, such as transposable elements, to allow for non-Mendelian inheritance of the gene of interest.

On December 21, 2007, a study published in PLoS Pathogens found that the hemolytic C-type lectin CEL-III from Cucumaria echinata, a sea cucumber found in the Bay of Bengal, impaired the development of the malaria parasite when produced by transgenic mosquitoes.[96][97] This could potentially be used one day to control malaria by using genetically modified mosquitoes refractory to the parasites, although the authors of the study recognize that there are numerous scientific and ethical problems to be overcome before such a control strategy could be implemented.

Prophylactic drugs

Several drugs, most of which are also used for treatment of malaria, can be taken preventively. Generally, these drugs are taken daily or weekly, at a lower dose than would be used for treatment of a person who had actually contracted the disease. Use of prophylactic drugs is seldom practical for full-time residents of malaria-endemic areas, and their use is usually restricted to short-term visitors and travelers to malarial regions. This is due to the cost of purchasing the drugs, negative side effects from long-term use, and because some effective anti-malarial drugs are difficult to obtain outside of wealthy nations.

Quinine was used starting in the seventeenth century as a prophylactic against malaria. The development of more effective alternatives such as quinacrine, chloroquine, and primaquine in the twentieth century reduced the reliance on quinine. Today, quinine is still used to treat chloroquine-resistant Plasmodium falciparum, as well as severe and cerebral stages of malaria, but is not generally used for prophylaxis. It is of historical interest that Samuel Hahnemann observed in the late 18th century that over-dosing with quinine produces a symptomatic state very similar to that of malaria itself. This led Hahnemann to develop the medical Law of Similars, and the subsequent medical system of homeopathy.

Modern drugs used preventively include mefloquine (Lariam), doxycycline (available generically), and the combination of atovaquone and proguanil hydrochloride (Malarone). The choice of which drug to use depends on which drugs the parasites in the area are resistant to, as well as side-effects and other considerations. The prophylactic effect does not begin immediately upon starting the drugs, so people temporarily visiting malaria-endemic areas usually begin taking the drugs one to two weeks before arriving and must continue taking them for four weeks after leaving (with the exception of atovaquone-proguanil, which only needs to be started two days prior and continued for seven days afterwards).
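The timing rules just described can be turned into a small scheduling helper. This is an illustrative sketch, not medical advice: the function name and lookup table are mine, and the 14-day lead time simply picks the upper end of the "one to two weeks" range.

```python
from datetime import date, timedelta

# (days before arrival, days after departure), per the rules above.
SCHEDULES = {
    "mefloquine": (14, 28),
    "doxycycline": (14, 28),
    "atovaquone-proguanil": (2, 7),
}

def prophylaxis_window(drug, arrival, departure):
    """Return (first_dose_date, last_dose_date) for a trip to an
    endemic area, given the drug's lead and tail windows."""
    lead, tail = SCHEDULES[drug]
    return arrival - timedelta(days=lead), departure + timedelta(days=tail)

# A hypothetical two-week trip: doses start 2 days before arrival
# and continue 7 days after departure for atovaquone-proguanil.
start, stop = prophylaxis_window("atovaquone-proguanil",
                                 date(2008, 6, 1), date(2008, 6, 14))
print(start, stop)  # 2008-05-30 2008-06-21
```

The same call with "mefloquine" would push the first dose two weeks before arrival and the last dose four weeks after departure, which is why the shorter Malarone window is attractive to short-term travellers despite the higher drug cost.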

Indoor residual spraying

Indoor residual spraying (IRS) is the practice of spraying insecticides on the interior walls of homes in malaria-affected areas. After feeding, many mosquito species rest on a nearby surface while digesting the bloodmeal, so if the walls of dwellings have been coated with insecticides, the resting mosquitoes will be killed before they can bite another victim and transfer the malaria parasite.

The first and historically the most popular insecticide used for IRS is DDT. While it was initially used exclusively to combat malaria, its use quickly spread to agriculture. In time, pest control, rather than disease control, came to dominate DDT use, and this large-scale agricultural use led to the evolution of resistant mosquitoes in many regions. During the 1960s, awareness of the negative consequences of its indiscriminate use increased, ultimately leading to bans on agricultural applications of DDT in many countries in the 1970s.

Though DDT has never been banned for use in malaria control and there are several other insecticides suitable for IRS, some advocates have claimed that bans are responsible for tens of millions of deaths in tropical countries where DDT had once been effective in controlling malaria. Furthermore, most of the problems associated with DDT use stem specifically from its industrial-scale application in agriculture, rather than its use in public health.[98]

The World Health Organization (WHO) currently advises the use of 12 different insecticides in IRS operations. These include DDT and a series of alternative insecticides (such as the pyrethroids permethrin and deltamethrin) to both combat malaria in areas where mosquitoes are DDT-resistant, and to slow the evolution of resistance.[99] This public health use of small amounts of DDT is permitted under the Stockholm Convention on Persistent Organic Pollutants (POPs), which prohibits the agricultural use of DDT.[100] However, because of its legacy, many developed countries discourage DDT use even in small quantities.[101]

Mosquito nets and bedclothes

Mosquito nets help keep mosquitoes away from people, and thus greatly reduce the infection and transmission of malaria. The nets are not a perfect barrier, so they are often treated with an insecticide designed to kill the mosquito before it has time to search for a way past the net. Insecticide-treated nets (ITN) are estimated to be twice as effective as untreated nets,[90] and offer greater than 70% protection compared with no net.[102] Since the Anopheles mosquitoes feed at night, the preferred method is to hang a large "bed net" above the center of a bed such that it drapes down and covers the bed completely.

The distribution of mosquito nets impregnated with insecticide (often permethrin or deltamethrin) has been shown to be an extremely effective method of malaria prevention, and it is also one of the most cost-effective methods of prevention. These nets can often be obtained for around US$2.50 - $3.50 (2-3 euro) from the United Nations, the World Health Organization, and others.

For maximum effectiveness, the nets should be re-impregnated with insecticide every six months. This process poses a significant logistical problem in rural areas. New technologies like Olyset or DawaPlus allow for production of long-lasting insecticidal mosquito nets (LLINs), which release insecticide for approximately 5 years,[103] and cost about US$5.50. ITNs have the advantage of protecting people sleeping under the net and simultaneously killing mosquitoes that contact the net. This has the effect of killing the most dangerous mosquitoes. Some protection is also provided to others, including people sleeping in the same room but not under the net.
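The figures above allow a rough annualized cost comparison between a conventional ITN and an LLIN. This is a back-of-the-envelope sketch: the per-re-treatment cost of US$0.50 is an assumed figure for illustration, not taken from the text.

```python
def annualized_cost(purchase_usd, yearly_upkeep_usd, lifetime_years):
    """Average cost per year of protection over the net's lifetime."""
    return (purchase_usd + yearly_upkeep_usd * lifetime_years) / lifetime_years

# Conventional ITN: ~US$3 up front, re-impregnated every six months.
# The $0.50 per re-treatment (so $1/year) is an illustrative assumption.
itn = annualized_cost(3.00, 2 * 0.50, 5)
# LLIN (e.g. Olyset, DawaPlus): ~US$5.50, insecticide lasts ~5 years.
llin = annualized_cost(5.50, 0.0, 5)
print(round(itn, 2), round(llin, 2))  # 1.6 1.1
```

Under these assumptions the LLIN is cheaper per protected year despite the higher purchase price, and it also removes the six-monthly re-impregnation logistics that are hardest to sustain in rural areas.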

Unfortunately, the cost of treating malaria is high relative to income, and the illness results in lost wages. Consequently, a mosquito net is often unaffordable to people in developing countries, especially those most at risk. Only 1 in 20 people in Africa owns a bed net.[90] Although shipped to Africa, mainly from Europe, as free development aid, the nets quickly become expensive trade goods. They are mainly used for fishing, and by combining hundreds of donated mosquito nets, whole river sections can be completely shut off, catching even the smallest fish.[104]

A study among Afghan refugees in Pakistan found that treating top-sheets and chaddars (head coverings) with permethrin has similar effectiveness to using a treated net, but is much cheaper.[105]

A new approach, announced in Science on June 10, 2005, uses spores of the fungus Beauveria bassiana, sprayed on walls and bed nets, to kill mosquitoes. While some mosquitoes have developed resistance to chemicals, they have not been found to develop a resistance to fungal infections.[106]

Vaccination

Vaccines for malaria are under development, with no completely effective vaccine yet available. The first promising studies demonstrating the potential for a malaria vaccine were performed in 1967 by immunizing mice with live, radiation-attenuated sporozoites, providing protection to about 60% of the mice upon subsequent injection with normal, viable sporozoites.[107] Since the 1970s, there has been a considerable effort to develop similar vaccination strategies in humans. It was determined that an individual can be protected from a P. falciparum infection if they receive over 1,000 bites from infected, irradiated mosquitoes.[108]


It has been generally accepted that it is impractical to provide at-risk individuals with this vaccination strategy, but that has recently been challenged by the work of Dr. Stephen Hoffman of Sanaria, one of the key researchers who originally sequenced the genome of Plasmodium falciparum. His most recent work has revolved around solving the logistical problem of isolating and preparing parasites equivalent to those from 1,000 irradiated mosquitoes for mass storage and inoculation of human beings. The company has recently received several multi-million dollar grants from the Bill & Melinda Gates Foundation and the U.S. government to begin early clinical studies in 2007 and 2008.[109] The Seattle Biomedical Research Institute (SBRI), funded by the Malaria Vaccine Initiative, assures potential volunteers that "the [2009] clinical trials won't be a life-threatening experience. While many volunteers [in Seattle] will actually contract malaria, the cloned strain used in the experiments can be quickly cured, and does not cause a recurring form of the disease." "Some participants will get experimental drugs or vaccines, while others will get placebo."[110]

Instead, much work has been performed to try and understand the immunological processes that provide protection after immunization with irradiated sporozoites. After the mouse vaccination study in 1967,[107] it was hypothesized that the injected sporozoites themselves were being recognized by the immune system, which was in turn creating antibodies against the parasite. It was determined that the immune system was creating antibodies against the circumsporozoite protein (CSP) which coated the sporozoite.[111] Moreover, antibodies against CSP prevented the sporozoite from invading hepatocytes.[112] CSP was therefore chosen as the most promising protein on which to develop a vaccine against the malaria sporozoite. It is for these historical reasons that vaccines based on CSP are the most numerous of all malaria vaccines.

Presently, there is a huge variety of vaccine candidates on the table. Pre-erythrocytic vaccines (vaccines that target the parasite before it reaches the blood), in particular vaccines based on CSP, make up the largest group of research for the malaria vaccine. Other vaccine candidates include: those that seek to induce immunity to the blood stages of the infection; those that seek to avoid more severe pathologies of malaria by preventing adherence of the parasite to blood venules and placenta; and transmission-blocking vaccines that would stop the development of the parasite in the mosquito right after the mosquito has taken a bloodmeal from an infected person.[113] It is hoped that the sequencing of the P. falciparum genome will provide targets for new drugs or vaccines.[114]

The first vaccine developed that has undergone field trials is SPf66, developed by Manuel Elkin Patarroyo in 1987. It presents a combination of antigens from the sporozoite (using CS repeats) and merozoite parasites. During phase I trials a 75% efficacy rate was demonstrated, and the vaccine appeared to be well tolerated by subjects and immunogenic. The phase IIb and III trials were less promising, with the efficacy falling to between 38.8% and 60.2%. A trial carried out in Tanzania in 1993 demonstrated an efficacy of 31% after a year's follow-up; however, the most recent (though controversial) study in the Gambia did not show any effect. Despite the relatively long trial periods and the number of studies carried out, it is still not known how the SPf66 vaccine confers immunity; it therefore remains an unlikely solution to malaria. The CSP was the next vaccine developed that initially appeared promising enough to undergo trials. It is also based on the circumsporozoite protein, but additionally has the recombinant (Asn-Ala-Pro15Asn-Val-Asp-Pro)2-Leu-Arg(R32LR) protein covalently bound to a purified Pseudomonas aeruginosa toxin (A9). However, at an early stage a complete lack of protective immunity was demonstrated in those inoculated. The study group used in Kenya had an 82% incidence of parasitaemia, whilst the control group had an 89% incidence. The vaccine was intended to cause an increased T-lymphocyte response in those exposed; this was not observed either.

The efficacy of Patarroyo's vaccine has been disputed, with some US scientists concluding in The Lancet (1997) that "the vaccine was not effective and should be dropped", while the Colombian scientist accused them of "arrogance", attributing their assertions to the fact that he came from a developing country.

The RTS,S/AS02A vaccine is the candidate furthest along in vaccine trials. It is being developed by a partnership between the PATH Malaria Vaccine Initiative (a grantee of the Gates Foundation), the pharmaceutical company GlaxoSmithKline, and the Walter Reed Army Institute of Research.[115] In the vaccine, a portion of CSP has been fused to the immunogenic "S antigen" of the hepatitis B virus; this recombinant protein is injected alongside the potent AS02A adjuvant.[113] In October 2004, the RTS,S/AS02A researchers announced results of a Phase IIb trial, indicating the vaccine reduced infection risk by approximately 30% and severity of infection by over 50%. The study looked at over 2,000 Mozambican children.[116] More recent testing of the RTS,S/AS02A vaccine has focused on the safety and efficacy of administering it earlier in infancy: in October 2007, the researchers announced results of a phase I/IIb trial conducted on 214 Mozambican infants between the ages of 10 and 18 months, in which the full three-dose course of the vaccine led to a 62% reduction of infection with no serious side-effects save some pain at the site of injection.[117] Further research will delay this vaccine from commercial release until around 2011.[118]
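The efficacy percentages quoted for these trials follow the standard field-trial definition: efficacy is one minus the relative risk of infection in the vaccinated group versus the control group. A minimal sketch, with illustrative attack rates rather than the trials' actual counts:

```python
def vaccine_efficacy(attack_rate_vaccinated, attack_rate_control):
    """Standard vaccine efficacy: 1 - relative risk of infection."""
    return 1 - attack_rate_vaccinated / attack_rate_control

# Illustrative numbers chosen to reproduce a ~30% risk reduction,
# comparable in size to that reported for RTS,S/AS02A; not trial data.
print(round(vaccine_efficacy(0.21, 0.30), 2))  # 0.3
```

By the same formula, the 82% versus 89% parasitaemia incidences reported for the early CSP vaccine in Kenya correspond to an efficacy of only about 8%, which is why that candidate was judged to offer essentially no protection.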

Other methods

Education in recognising the symptoms of malaria has reduced the number of cases in some areas of the developing world by as much as 20%. Recognising the disease in its early stages can also stop it from becoming a killer. Education can also inform people to cover over areas of stagnant, still water, e.g. water tanks, which are ideal breeding grounds for the parasite and mosquito, thus cutting down the risk of transmission between people. This is most often put into practice in urban areas, where large centres of population are confined in a small space and transmission is most likely.

The Malaria Control Project is currently using downtime computing power donated by individual volunteers around the world (see Volunteer computing and BOINC) to simulate models of the health effects and transmission dynamics in order to find the best method or combination of methods for malaria control. This modeling is extremely computer intensive due to the simulations of large human populations with a vast range of parameters related to biological and social factors that influence the spread of the disease. It is expected to take a few months using volunteered computing power, compared to the 40 years it would have taken with the resources previously available to the scientists who developed the program.[119]

An example of the importance of computer modelling in planning malaria eradication programs is the paper by Águas and others, which showed that eradication of malaria is crucially dependent on finding and treating the large number of people in endemic areas with asymptomatic malaria, who act as a reservoir for infection.[120] The human malaria parasites do not infect other animal species, and therefore eradication of the disease from the human population would be expected to be effective.
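The role of the asymptomatic reservoir can be illustrated with a deliberately simplified stochastic model. Everything below (the SIS-style structure, parameter values, and function names) is invented for illustration and is not taken from the Águas paper; it only demonstrates the qualitative point that untreated asymptomatic carriers keep seeding new infections.

```python
import random

def simulate(days, treat_asymptomatic, seed=1):
    """Toy SIS-style transmission model: a fixed fraction of cases are
    asymptomatic and are only found and cured if a screening programme
    actively looks for them. All parameters are illustrative."""
    random.seed(seed)
    pop, infected = 1000, 100
    beta = 0.3          # infectious contacts per case per day
    cure = 0.25         # daily cure probability once a case is found
    asymptomatic = 0.7  # fraction of cases with no symptoms
    for _ in range(days):
        new = sum(1 for _ in range(infected)
                  if random.random() < beta * (pop - infected) / pop)
        # Without screening, only symptomatic cases reach treatment.
        find = 1.0 if treat_asymptomatic else 1 - asymptomatic
        cured = sum(1 for _ in range(infected)
                    if random.random() < find * cure)
        infected = min(pop, max(0, infected + new - cured))
    return infected

# Treating only symptomatic cases leaves a large endemic reservoir;
# screening and treating asymptomatic carriers drives prevalence down.
print(simulate(365, treat_asymptomatic=False),
      simulate(365, treat_asymptomatic=True))
```

Real planning models, like those run by the Malaria Control Project, track far more biological and social parameters, which is what makes them so computationally expensive.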

Antimalarial drugs

Antimalarial drugs are designed to prevent or cure malaria. Some antimalarial agents, particularly chloroquine and hydroxychloroquine, are also used in the treatment of rheumatoid arthritis and lupus-associated arthritis. There are many of these drugs currently on the market; quinine is the oldest and most famous. Two types of antimalarial drugs are to be distinguished: prophylactic drugs, which are taken as prevention and require continuous administration to reduce the risk of infection, and therapeutic drugs, which are taken once a person is already infected.

Prophylactic drugs

Quinine

Quinine has a long history stretching from Peru, and the discovery of the cinchona tree and the potential uses of its bark, to the current day and a collection of derivatives that are still frequently used in the prevention and treatment of malaria. Quinine is an alkaloid that acts as a blood schizonticide and weak gametocide against Plasmodium vivax and Plasmodium malariae. As an alkaloid, it accumulates in the food vacuoles of Plasmodium species, especially Plasmodium falciparum. It acts by inhibiting hemozoin biocrystallization, thus facilitating the aggregation of cytotoxic heme. Quinine is less effective and more toxic as a blood schizonticidal agent than chloroquine; however, it is still very effective and widely used in the treatment of acute cases of severe P. falciparum malaria. It is especially useful in areas known to have a high level of resistance to chloroquine, mefloquine, and sulfa drug combinations with pyrimethamine. Quinine is also used in post-exposure treatment of individuals returning from an area where malaria is endemic.

The treatment regimen of quinine is complex and is determined largely by the parasite's level of resistance and the reason for drug therapy (i.e. acute treatment or prophylaxis). The World Health Organization recommendation for quinine is 8 mg/kg three times daily for 3 days (in areas where the level of adherence is questionable) or for 7 days (where parasites are sensitive to quinine). In areas where there is an increased level of resistance to quinine, 8 mg/kg three times daily for 7 days is recommended, combined with doxycycline, tetracycline or clindamycin. Doses can be given by oral, intravenous or intramuscular routes. The recommended method depends on the urgency of treatment and the available facilities (i.e. sterilised needles for IV or IM injections).
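As a worked example of the weight-based arithmetic behind the 8 mg/kg regimen (an illustrative sketch only; the function name is mine, and real dosing must also account for salt versus base, resistance level, and clinical context):

```python
def quinine_course(weight_kg, days):
    """Per-dose and total quinine for the WHO regimen described above:
    8 mg/kg, three times daily, for the given number of days."""
    per_dose_mg = 8 * weight_kg
    total_mg = per_dose_mg * 3 * days
    return per_dose_mg, total_mg

# A hypothetical 60 kg adult on the 7-day course for resistant areas:
dose, total = quinine_course(weight_kg=60, days=7)
print(dose, total)  # 480 10080
```

The same 480 mg dose over the shorter 3-day course totals 4,320 mg, which shows why adherence matters: stopping after 3 days in a resistant area delivers less than half the intended exposure.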

Use of quinine is characterised by a frequently experienced syndrome called cinchonism. Tinnitus (ringing in the ears), rashes, vertigo, nausea, vomiting and abdominal pain are the most common symptoms. Neurological effects are experienced in some cases due to the drug's neurotoxic properties; these are mediated through interactions of quinine that decrease the excitability of the motor neuron end plates. This often results in functional impairment of the eighth cranial nerve, resulting in confusion, delirium and coma. Quinine can cause hypoglycaemia by stimulating insulin secretion; this occurs at therapeutic doses, and it is therefore advised that glucose levels be monitored in all patients every 4-6 hours. This effect can be exaggerated in pregnancy, and therefore additional care in administering and monitoring the dosage is essential. Repeated doses or overdose can result in renal failure and death through depression of the respiratory system.

Other Alkaloids

Quinimax and quinidine are the two most commonly used alkaloids related to quinine in the treatment or prevention of malaria. Quinimax is a combination of four alkaloids (quinine, quinidine, cinchonine and cinchonidine). This combination has been shown in several studies to be more effective than quinine alone, supposedly due to a synergistic action between the four cinchona derivatives. Quinidine is a direct derivative of quinine; it is a diastereoisomer and thus has similar anti-malarial properties to the parent compound. Quinidine is recommended only for the treatment of severe cases of malaria.

Chloroquine

Chloroquine was until recently the most widely used anti-malarial. It was the original prototype from which most other methods of treatment are derived. It is also the least expensive, best tested and safest of all available drugs. The emergence of drug-resistant parasitic strains is rapidly decreasing its effectiveness; however, it is still the first-line drug of choice in most sub-Saharan African countries. It is now suggested that it be used in combination with other antimalarial drugs to extend its effective usage.

Chloroquine is a 4-aminoquinolone compound with a complicated and still unclear mechanism of action. It is believed to reach high concentrations in the vacuoles of the parasite, where, due to its alkaline nature, it raises the internal pH. It prevents the conversion of toxic heme to hemozoin by inhibiting the biocrystallization of hemozoin,[1] thus poisoning the parasite through the build-up of toxic heme. Other potential mechanisms through which it may act include interfering with the biosynthesis of parasitic nucleic acids and the formation of a chloroquine-haem or chloroquine-DNA complex. The most significant level of activity found is against all forms of the schizonts (with the obvious exception of chloroquine-resistant P. falciparum and P. vivax strains) and the gametocytes of P. vivax, P. malariae and P. ovale, as well as the immature gametocytes of P. falciparum. Chloroquine also has a significant anti-pyretic and anti-inflammatory effect when used to treat P. vivax infections, and thus may still remain useful even when resistance is more widespread. According to a report on the Science and Development Network website's sub-Saharan Africa section, there is very little drug resistance among children infected with malaria on the island of Madagascar, but what drug resistance there is exists against chloroquine.

A slightly different drug, chloroquine phosphate (also called nivaquine), has also been developed. Popular drugs based on this compound are Chloroquine FNA, Resochin and Dawaquin.

Children and adults should receive 25 mg of chloroquine per kg of body weight, given over 3 days. A pharmacokinetically superior regimen, recommended by the WHO, involves giving an initial dose of 10 mg/kg, followed 6-8 hours later by 5 mg/kg, then 5 mg/kg on each of the following 2 days. For chemoprophylaxis, 5 mg/kg per week (single dose) or 10 mg/kg per week divided into 6 daily doses is advised. Chloroquine is recommended as a prophylactic drug only in regions affected solely by P. vivax and sensitive P. falciparum strains. Chloroquine has been used in the treatment of malaria for many years, and no abortifacient or teratogenic effects have been reported during this time; it is therefore considered very safe for use during pregnancy. However, itching can occur at intolerable levels.
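The split schedule can be checked arithmetically: the four doses of the WHO regimen sum to the same 25 mg/kg as the simple 3-day course. A small illustrative sketch (the function name is mine; this is not a dosing tool):

```python
def chloroquine_regimen(weight_kg):
    """WHO split schedule described above: 10 mg/kg initially,
    5 mg/kg 6-8 hours later, then 5 mg/kg on each of the next
    two days. The four doses total 25 mg/kg."""
    doses_mg = [10 * weight_kg, 5 * weight_kg, 5 * weight_kg, 5 * weight_kg]
    assert sum(doses_mg) == 25 * weight_kg  # same total as the 3-day course
    return doses_mg

# A hypothetical 20 kg child:
print(chloroquine_regimen(20))  # [200, 100, 100, 100]
```

Front-loading the first dose reaches therapeutic plasma concentrations faster, which is the pharmacokinetic advantage the text refers to, while keeping the cumulative dose unchanged.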

Amodiaquine

Amodiaquine is a 4-aminoquinolone anti-malarial drug similar in structure and mechanism of action to chloroquine. It is most frequently used in combination with chloroquine, but is also very effective when used alone. It is thought to be more effective than chloroquine in clearing parasites in uncomplicated malaria, thus leading to a faster rate of recovery. However, some fatal adverse effects of the drug were noted during the 1980s, reducing its usage in chemoprophylaxis. The WHO's most recent advice on the subject still maintains that the drug should be used when the potential risk of not treating an infection outweighs the risk of developing side effects. It has also been suggested that it is particularly effective, and less toxic than other combination treatments, in HIV-positive patients.

The drug should be given in doses between 25 mg/kg and 35 mg/kg over 3 days, in a manner similar to that used in chloroquine administration. Adverse reactions are generally similar in severity and type to those seen in chloroquine treatment. In addition, bradycardia, itching, nausea, vomiting and some abdominal pain have been recorded. Some blood and hepatic disorders have also been seen in a small number of patients.

Pyrimethamine


Pyrimethamine is used in the treatment of uncomplicated malaria. It is particularly useful in cases of chloroquine-resistant P. falciparum strains when combined with sulphadoxine. It acts by inhibiting dihydrofolate reductase in the parasite, preventing the biosynthesis of purines and pyrimidines and thereby halting the processes of DNA synthesis, cell division and reproduction. It acts primarily on the schizonts during the hepatic and erythrocytic phases.

Sulphadoxine

The action of sulphadoxine is focused on inhibiting the use of para-aminobenzoic acid during the synthesis of dihydropteroic acid. When combined with pyrimethamine, the two key stages in DNA synthesis in the plasmodia are blocked. It also acts on the schizonts during the hepatic and erythrocytic phases. It is mainly used for treating P. falciparum infections and is less active against other Plasmodium strains. However, usage is restricted by the long half-life of the combination, which exerts a potentially large selection pressure on the parasite and hence encourages the development of resistance. This combination is not recommended for chemoprophylaxis because of the severe skin reactions commonly experienced; however, it is used frequently for clinical episodes of the disease.

Proguanil

Proguanil (chloroguanide) is a biguanide, a synthetic derivative of pyrimidine. It was developed in 1945 by a British antimalarial research group. It has many mechanisms of action, but its effect is primarily mediated through conversion to the active metabolite cycloguanil pamoate, which inhibits the malarial dihydrofolate reductase enzyme. Its most prominent effect is on the primary tissue stages of P. falciparum, P. vivax and P. ovale. It has no known effect against hypnozoites and is therefore not used in the prevention of relapse. It has weak blood schizonticidal activity and is not currently recommended for therapy on its own; however, when combined with atovaquone (a hydroxynaphthoquinone), it has been shown to be effective against multi-drug-resistant strains of P. falciparum. Proguanil is used as a prophylactic treatment in combination with another drug, most frequently chloroquine. The advised dosage is 3 mg/kg per day (hence the approximate adult dosage is 200 mg). The pharmacokinetic profile of the drug indicates that a half dose taken twice daily maintains plasma levels with greater consistency, thus giving a greater level of protection. The proguanil-chloroquine combination does not provide effective protection against resistant strains of P. falciparum. There are very few side effects to proguanil, with slight hair loss and mouth ulcers occasionally reported following prophylactic use.

Mefloquine

Mefloquine was developed during the Vietnam War and is chemically related to quinine. It was developed to protect American troops against multi-drug-resistant P. falciparum. It is a very potent blood schizonticide with a long half-life. It is thought to act by forming toxic heme complexes that damage parasitic food vacuoles. It is now used solely for resistant strains of P. falciparum, despite being effective against P. vivax, P. ovale and P. malariae. Mefloquine is effective both in prophylaxis and for acute therapy, and is usually combined with artesunate. Chloroquine/proguanil or sulfa drug-pyrimethamine combinations should be used in all other Plasmodium infections.

The major commercial manufacturer of mefloquine-based malaria treatment is Roche Pharmaceuticals, which markets the drug under the trade name "Lariam". Lariam is fairly expensive, at around €3 per tablet (2000 pricing).

A dose of 15–25 mg/kg is recommended, depending on the prevalence of Mefloquine resistance. The increased dosage is associated with a much greater level of intolerance, most noticeably in young children, with the drug inducing vomiting and oesophagitis. The effects during pregnancy are unknown, although it has been linked with an increased number of stillbirths. It is not recommended for use during the first trimester, although it is considered safe during the second and third trimesters. Mefloquine frequently produces side effects, including nausea, vomiting, diarrhoea, abdominal pain and dizziness. Several associations with neurological events have been made, namely affective and anxiety disorders, hallucinations, sleep disturbances, psychosis, toxic encephalopathy, convulsions and delirium. Cardiovascular effects have also been recorded, with bradycardia and sinus arrhythmia consistently recorded in 68% of patients treated with Mefloquine (in one hospital-based study).

Mefloquine can only be taken for a period of up to 6 months (due to side effects, ...). After this, other drugs (such as those based on paludrine/nivaquine) need to be taken. [2]

Atovaquone

Recently, a new type of antimalarial drug, Atovaquone, has become available. It is very effective because parasite populations have not yet been exposed to it long enough to develop resistance. The drug also lacks side effects such as the cardiovascular effects of mefloquine, which can trigger heart rhythm problems. It is available under the name Malarone, but is very expensive (costing considerably more than Lariam).

Primaquine

Primaquine is a highly active 8-aminoquinoline that is used in treating all types of malaria infection. It is most effective against gametocytes but also acts on hypnozoites, blood schizonts and the dormant plasmodia in P. vivax and P. ovale. It is the only known drug to cure both relapsing malaria infections and acute cases. The mechanism of action is not fully understood, but it is thought to act in part by creating oxygen free radicals that interfere with the plasmodial electron transport chain during respiration.

For the prevention of relapse in P. vivax and P. ovale, 0.15 mg/kg should be given for 14 days. As a gametocytocidal drug in P. falciparum infections, a single dose of 0.75 mg/kg repeated 7 days later is sufficient. This treatment method is only used in conjunction with another effective blood schizonticidal drug. There are few significant side effects, although it has been shown that Primaquine may cause anorexia, nausea, vomiting, cramps, chest weakness, anaemia, some suppression of myeloid activity and abdominal pains. In cases of over-dosage, granulocytopenia may occur.

Artemisinin and derivatives

Artemisinin (Qinghaosu) is a Chinese herbal remedy that has been used in the treatment of fevers for over 1,000 years[3], thus predating the use of Quinine in the western world. It is derived from the plant Artemisia annua, with the first documentation of its successful use as a therapeutic agent in the treatment of malaria being in 340 AD by Ge Hong in his book Zhou Hou Bei Ji Fang (A Handbook of Prescriptions for Emergencies).[4] The active compound was first isolated in 1971 and named artemisinin. It is a sesquiterpene lactone with a chemically rare peroxide bridge linkage, which is thought to be responsible for the majority of its anti-malarial action. At present it is strictly controlled under WHO guidelines, as it has proven effective against all forms of multi-drug resistant P. falciparum; every care is therefore taken to ensure compliance and adherence, together with other behaviours associated with the development of resistance. It is also only given in combination with other anti-malarials.

Artemisinin has a very rapid action, and the vast majority of acute patients treated show significant improvement within 1–3 days of receiving treatment. It has demonstrated the fastest clearance of all anti-malarials currently used and acts primarily on the trophozoite phase, thus preventing progression of the disease. It is converted to the active metabolite dihydroartemisinin, which then inhibits the sarcoplasmic/endoplasmic reticulum calcium ATPase encoded by P. falciparum. On the first day of treatment 20 mg/kg should be given; this dose is then reduced to 10 mg/kg per day for the 6 following days. Few side effects are associated with artemisinin use. However, headaches, nausea, vomiting, abnormal bleeding, dark urine, itching and some drug fever have been reported by a small number of patients. Some cardiac changes were reported during a clinical trial, notably non-specific ST changes and a first-degree atrioventricular block (these disappeared when the patients recovered from the malarial fever).
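The seven-day course described above (20 mg/kg on day 1, then 10 mg/kg daily for the six following days) adds up to a fixed total per kilogram, which a quick check makes explicit. This is an illustrative calculation only, using the figures quoted in the text:

```python
def course_total_mg_per_kg(day1=20, later=10, later_days=6):
    """Total mg/kg over the 7-day regimen quoted above."""
    return day1 + later * later_days

total = course_total_mg_per_kg()
print(total)  # 80 mg/kg over the full 7 days
```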

Artemether is a methyl ether derivative of Dihydroartemisinin. It is similar to Artemisinin in mode of action but demonstrates a reduced ability as a hypnozoiticidal compound, instead acting more significantly to decrease gametocyte carriage. Similar restrictions are in place, as with Artemisinin, to prevent the development of resistance; therefore it is only used in combination therapy for severe acute cases of drug-resistant P. falciparum. It should be administered in a 7-day course, with 4 mg/kg given per day for 3 days, followed by 1.6 mg/kg for 3 days. Side effects of the drug are few, but include potential neurotoxicity developing if high doses are given.

Artesunate is a hemisuccinate derivative of the active metabolite Dihydroartemisinin. Currently it is the most frequently used of all the Artemisinin-type drugs. Its only effect is mediated through a reduction in gametocyte transmission. It is used in combination therapy and is effective in cases of uncomplicated P. falciparum. The dosage recommended by the WHO is a 5- or 7-day course (depending on the predicted adherence level) of 4 mg/kg for 3 days (usually given in combination with Mefloquine), followed by 2 mg/kg for the remaining 2 or 4 days. In large studies carried out on over 10,000 patients in Thailand, no adverse effects have been shown.

Dihydroartemisinin is the active metabolite to which Artemisinin is reduced. It is the most effective Artemisinin-type compound and the least stable. It has a strong blood schizonticidal action and reduces gametocyte transmission. It is used for therapeutic treatment of cases of resistant and uncomplicated P. falciparum. Doses of 4 mg/kg are recommended on the first day of therapy, followed by 2 mg/kg for 6 days. As with Artesunate, no side effects of treatment have thus far been recorded.

Arteether is an ethyl ether derivative of Dihydroartemisinin. It is used in combination therapy for cases of uncomplicated resistant P. falciparum. The recommended dosage is 150mg/kg per day for 3 days, given by intramuscular (IM) injection. With the exception of a small number of cases demonstrating neurotoxicity following parenteral administration, no side effects have been recorded.

Therapy drugs

Halofantrine

Halofantrine is a relatively new drug, developed by the Walter Reed Army Institute of Research in the 1960s. It is a phenanthrene methanol, chemically related to Quinine, and acts as a blood schizonticide effective against all Plasmodium parasites. Its mechanism of action is similar to other anti-malarials: cytotoxic complexes are formed with ferriprotoporphyrin IX that cause plasmodial membrane damage. Despite being effective against drug-resistant parasites, Halofantrine is not commonly used in the treatment (prophylactic or therapeutic) of malaria due to its high cost. It has very variable bioavailability and has been shown to have potentially high levels of cardiotoxicity. It is still a useful drug and can be used in patients that are known to be free of heart disease and are suffering from severe and resistant forms of acute malaria. A popular drug based on halofantrine is Halfan. The level of governmental control and the prescription-only basis on which it can be used contribute to the cost, thus Halofantrine is not frequently used.

A dose of 8 mg/kg of Halofantrine is advised, given in three doses at six-hour intervals for the duration of the clinical episode. It is not recommended for children under 10 kg, despite data supporting its use and demonstrating that it is well tolerated. The most frequently experienced side effects include nausea, abdominal pain, diarrhoea, and itching. Severe ventricular dysrhythmias, occasionally causing death, are seen when high doses are administered; this is due to prolongation of the QTc interval. Halofantrine is not recommended for use in pregnancy and lactation, in small children, or in patients that have taken Mefloquine previously. Lumefantrine is a relative of halofantrine that is used in some combination antimalarial regimens.[5]

Other agents

Doxycycline

Doxycycline is a Tetracycline compound derived from Oxytetracycline. The tetracyclines were one of the earliest groups of antibiotics to be developed and are still widely used in many types of infection. Doxycycline is a bacteriostatic agent that inhibits protein synthesis by binding to the 30S ribosomal subunit, preventing aminoacyl-tRNA from binding to the ribosome. It is used primarily for chemoprophylaxis in areas where chloroquine resistance exists. It can be used in resistant cases of uncomplicated P. falciparum but has a very slow action in acute malaria, and therefore should never be used in monotherapy.

When treating acute cases in combination with Quinine, 100 mg of Doxycycline should be given per day for 7 days. In prophylactic therapy, 100 mg (adult dose) of Doxycycline should be given every day during exposure to malaria.

The most commonly experienced side effects are permanent enamel hypoplasia, transient depression of bone growth, gastrointestinal disturbances and increased levels of photosensitivity. Due to its effects on bone and tooth growth, it is not used in children under 8, pregnant or lactating women, or patients with known hepatic dysfunction.

Tetracycline itself is only used in combination for the treatment of acute cases of P. falciparum infections, due to its slow onset. Unlike Doxycycline, it is not used in chemoprophylaxis. For Tetracycline, 250 mg is the recommended adult dosage (it should not be used in children), for 5 or 7 days depending on the level of adherence and compliance expected. Oesophageal ulceration, gastrointestinal upset, interference with the process of ossification and depression of bone growth are known to occur. The majority of side effects associated with Doxycycline are also experienced.

Clindamycin

Clindamycin is a derivative of Lincomycin with a slow action as a blood schizonticide. It is only used in combination with Quinine in the treatment of acute cases of resistant P. falciparum infections, and not as a prophylactic. Being more expensive and toxic than the other antibiotic alternatives, it is used only in cases where the Tetracyclines are contraindicated (for example in children).

Clindamycin should be given in conjunction with Quinine as a 300mg dose (in adults) four times a day for 5 days. The only side effects recorded in patients taking Clindamycin are nausea, vomiting and abdominal pains and cramps. However these can be alleviated by consuming large quantities of water and food when taking the drug.

Pseudomembranous colitis (caused by Clostridium difficile) has also developed in some patients; this condition may be fatal in a small number of cases.

Drug regimens

The following regimens are recommended by the WHO, UK HPA and CDC for adults and children aged 12 and over:

chloroquine 300 to 310 mg once weekly, and proguanil 200 mg once daily (started one week before travel, and continued for four weeks after returning);

doxycycline 100 mg once daily (started one day before travel, and continued for four weeks after returning);

mefloquine 228 to 250 mg once weekly (started two-and-a-half weeks before travel, and continued for four weeks after returning);

Malarone 1 tablet daily (started one day before travel, and continued for 1 week after returning).

Other chemoprophylactic regimens that are available:

Dapsone 100 mg and pyrimethamine 12.5 mg once weekly (available as a combination tablet called Maloprim or Deltaprim): this combination is not routinely recommended because of the risk of agranulocytosis;

Primaquine 30 mg once daily (started the day before travel, and continuing for seven days after returning): this regimen is not routinely recommended because of the need for G-6-PD testing prior to starting primaquine (see the article on primaquine for more information).

Quinine sulphate 300 to 325 mg once daily: this regimen is effective but not routinely used because of the unpleasant side effects of quinine.
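The regimens above share a common shape: a lead-in period before travel, dosing during the trip, and a tail period after returning. As a rough sketch of how the resulting course window could be computed, assuming example travel dates (the `course_window` helper is hypothetical; the lead and tail figures are taken from the doxycycline entry above):

```python
from datetime import date, timedelta

def course_window(departure, return_date, lead_days, tail_days):
    """First and last day of a chemoprophylaxis course around a trip."""
    return (departure - timedelta(days=lead_days),
            return_date + timedelta(days=tail_days))

# Doxycycline: started one day before travel, continued four weeks after returning
start, end = course_window(date(2024, 6, 1), date(2024, 6, 15),
                           lead_days=1, tail_days=28)
daily_doses = (end - start).days + 1  # daily regimen: one dose per day in the window
print(start, end, daily_doses)
```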

Resistance to antimalarials

Anti-malarial drug resistance has been defined as: "the ability of a parasite to survive and/or multiply despite the administration and absorption of a drug given in doses equal to or higher than those usually recommended but within tolerance of the subject. The drug in question must gain access to the parasite or the infected red blood cell for the duration of the time necessary for its normal action." In most instances this refers to parasites that remain following an observed treatment, thus excluding all cases where anti-malarial prophylaxis has failed. In order for a case to be defined as resistant, the patient in question must have received a known and observed anti-malarial therapy whilst the blood drug and metabolite concentrations are monitored concurrently. The techniques used to demonstrate resistance are: in vivo, in vitro, animal model testing and the most recently developed molecular techniques.

Drug-resistant parasites are often used to explain malaria treatment failure. However, these are two potentially very different clinical scenarios: the failure to clear parasitaemia and recover from an acute clinical episode when a suitable treatment has been given, and anti-malarial resistance in its true form. Drug resistance may lead to treatment failure, but treatment failure is not necessarily caused by drug resistance, even though it can assist in its development. A multitude of factors can be involved in the process, including problems with non-compliance and adherence, poor drug quality, interactions with other pharmaceuticals, poor absorption, misdiagnosis and incorrect doses being given. The majority of these factors also contribute to the development of drug resistance.

The generation of resistance can be complicated and varies between Plasmodium species. It is generally accepted to be initiated primarily through a spontaneous mutation that provides some evolutionary benefit, giving the parasite a reduced level of sensitivity to the anti-malarial used. This can be caused by a single point mutation or multiple mutations. In most instances a mutation will be fatal for the parasite, or the drug pressure will remove parasites that remain susceptible; however, some resistant parasites will survive. Resistance can then become firmly established within a parasite population, existing for long periods of time.

The first type of resistance to be acknowledged was to Chloroquine, in Thailand in 1957. The biological mechanism behind this resistance was subsequently discovered to be the development of an efflux mechanism that expels Chloroquine from the parasite before it reaches the level required to effectively inhibit the process of haem polymerization (which is necessary to prevent the build-up of the toxic by-products formed by haemoglobin digestion). This theory has been supported by evidence showing that resistance can be effectively reversed on the addition of substances which halt the efflux. The resistance of other quinoline anti-malarials such as amodiaquine, mefloquine, halofantrine and quinine is thought to have occurred by similar mechanisms.

Plasmodium have developed resistance against antifolate combination drugs, the most commonly used being sulfadoxine and pyrimethamine. Two gene mutations are thought to be responsible, allowing synergistic blockages of two enzymes involved in folate synthesis. Regional variations of specific mutations give differing levels of resistance.

Atovaquone is recommended to be used only in combination with another anti-malarial compound as the selection of resistant parasites occurs very quickly when used in mono-therapy. Resistance is thought to originate from a single-point mutation in the gene coding for cytochrome-b.

Spread of resistance

There is no single factor that confers the greatest degree of influence on the spread of drug resistance, but a number of plausible causes associated with an increase have been acknowledged. These include aspects of economics, human behaviour, pharmacokinetics, and the biology of vectors and parasites.

The most influential causes are examined below:

1. The biological influences are based on the parasite's ability to survive the presence of an anti-malarial, enabling the persistence of resistance and the potential for further transmission despite treatment. In normal circumstances any parasites that persist after treatment are destroyed by the host's immune system; therefore any factors that reduce the elimination of parasites could facilitate the development of resistance. This helps explain the poorer response associated with immunocompromised individuals, pregnant women and young children.

2. There has been evidence to suggest that certain parasite-vector combinations can alternatively enhance or inhibit the transmission of resistant parasites, causing ‘pocket-like’ areas of resistance.

3. The use of anti-malarials developed from similar basic chemical compounds can increase the rate of resistance development, for example cross-resistance between chloroquine and amodiaquine (two 4-aminoquinolines), and mefloquine resistance conferring resistance to quinine and halofantrine. This phenomenon may reduce the usefulness of newly developed therapies prior to large-scale usage.

4. Resistance to anti-malarials may be increased by a process found in some species of Plasmodium in which a degree of phenotypic plasticity is exhibited, allowing the rapid development of resistance to a new drug, even one that has not been previously encountered.

5. The pharmacokinetics of the chosen anti-malarial are key; the choice between a drug with a long half-life and one that is metabolised quickly is complex and still remains unclear. Drugs with shorter half-lives require more frequent administration to maintain the correct plasma concentrations, potentially presenting more problems if levels of adherence and compliance are unreliable, but longer-lasting drugs can increase the development of resistance due to prolonged periods of low drug concentration.

6. The pharmacokinetics of anti-malarials are important when using combination therapy. Mismatched drug combinations, for example those with an 'unprotected' period where one drug dominates, can seriously increase the likelihood of selection for resistant parasites.

7. Ecologically, there is a linkage between the level of transmission and the development of resistance; however, at present this relationship remains unclear.

8. The treatment regimen prescribed can have a substantial influence on the development of resistance. This can involve the drug intake, combinations and interactions, as well as the drug's pharmacokinetic and pharmacodynamic properties.

Prevention of resistance

The prevention of anti-malarial drug resistance is of enormous public health importance. It can be assumed that no therapy currently under development, or to be developed in the foreseeable future, will be totally protective against malaria. Accordingly, there is the possibility of resistance developing to any therapy that is produced. This is a serious concern, as the rate at which new drugs are produced by no means matches the rate of the development of resistance. In addition, the most newly developed therapeutics tend to be the most expensive and are required in the largest quantities by some of the poorest areas of the world. It is therefore apparent that the degree to which malaria can be controlled depends on the careful use of the current drugs to limit, insofar as is possible, any further development of resistance.

Provisions essential to this process include the delivery of prompt primary care, where staff are well trained and supported with the necessary supplies for efficient treatment. This in itself is inadequate in large areas where malaria is endemic, thus presenting an initial problem. One proposed method that aims to circumvent the fundamental gaps in some countries' health care infrastructure is partial privatisation, enabling drugs to be purchased on the open market from sources that are not officially related to the health care industry. Although this is now gaining some support, there are many problems related to limited access and improper drug use, which could potentially increase the rate of resistance development to an even greater extent.

There are two general approaches to preventing the spread of resistance: preventing malaria infections, and preventing the transmission of resistant parasites.

Preventing malaria infections from developing has a substantial effect on the potential rate of development of resistance, by directly reducing the number of cases of malaria and thus decreasing the requirement for anti-malarial therapy. Preventing the transmission of resistant parasites limits the risk of resistant malarial infections becoming endemic, and can be achieved by a variety of non-medical methods including insecticide-treated bed nets, indoor residual spraying, environmental controls (such as swamp draining) and personal protective measures such as mosquito repellent. Chemoprophylaxis is also important in limiting the transmission of malaria infection and resistance in defined populations (for example, travellers).

A hope for the future of anti-malarial therapy is the development of an effective malaria vaccine. This could have enormous public health benefits, providing a cost-effective and easily applicable approach to preventing not only the onset of malaria but also the transmission of gametocytes, thus reducing the risk of resistance developing. Anti-malarial therapy could also be diversified by combining a potentially effective vaccine with current chemotherapy, thereby reducing the chance of vaccine resistance developing.

Combination therapy

The problem of the development of malaria resistance must be weighed against the essential goal of anti-malarial care: to reduce morbidity and mortality. A balance must therefore be reached that attempts to achieve both goals without compromising either too much. The most successful attempts so far have been the administration of combination therapy, which can be defined as 'the simultaneous use of two or more blood schizonticidal drugs with independent modes of action and different biochemical targets in the parasite'. There is much evidence to support the use of combination therapies, some of which has been discussed previously; however, several problems prevent wide use in the areas where it is most advisable. These include: problems identifying the most suitable drug for different epidemiological situations, the expense of combination therapy (it is over 10 times more expensive than traditional mono-therapy), the question of how soon the programmes should be introduced, and problems linked with policy implementation and issues of compliance.

The combinations of drugs currently prescribed can be divided into two categories: non-Artemisinin-based (including Quinine-based) combinations, and Artemisinin-based combinations.

Non-Artemisinin-based combinations

Sulfadoxine-Pyrimethamine (SP)–This combination has been used for many years, and resistance to it is now widespread. It has serious adverse effects but is cheap and is available in a single dose, thus decreasing problems associated with adherence and compliance. The recommended dose is 25 mg/kg of sulfadoxine and 1.25 mg/kg of pyrimethamine.

Sulfadoxine-pyrimethamine plus Chloroquine–This is another cost-effective combination, which benefits from the drugs having similar pharmacokinetic profiles but different biochemical parasitic targets. There is already some level of parasite resistance present and numerous side-effects are associated with the use of SP. Chloroquine is recommended at 25mg/kg over 3 days with a single dose of SP as described above.

Sulfadoxine-pyrimethamine plus Amodiaquine–This combination has been shown to produce a faster rate of clinical recovery than SP and Chloroquine, however there are serious adverse reactions associated with use that have limited its distribution. It is thought to have a longer therapeutic lifetime than other combinations and may be a more cost-effective option to introduce in areas where resistance is likely to develop. This is unlikely to occur until more information regarding its safety has been obtained. The recommended dose is 10mg/kg of Amodiaquine per day for 3 days with a single standard dose of SP.

Sulfadoxine-Pyrimethamine plus Mefloquine–This is produced as a single-dose pill and has obvious advantages over some of the more complex regimes. This combination of drugs has very different pharmacokinetic properties, with no synergistic action. This characteristic is thought potentially to delay the development of resistance; however, it is counteracted by the very long half-life of Mefloquine, which could exert a high selection pressure in areas where intensive malaria transmission occurs. It is also an expensive combination and has not been recommended for use since 1990 due to Mefloquine resistance.

Tetracycline or Doxycycline plus Quinine–Despite the increasing levels of resistance to Quinine, this combination has proven to be particularly efficacious. The longer half-life of the Tetracycline component ensures a high cure rate. Problems with this regime include the relatively complicated drug regimen, where Quinine must be taken every 8 hours for 7 days. Additionally, there are severe side effects to both drugs (cinchonism with Quinine), and Tetracyclines are contraindicated in children and pregnant women. For these reasons this combination is not recommended as first-line therapy, but can be used for non-responders who remain able to take oral medication. Quinine should be taken in 10 mg/kg doses every 8 hours and Tetracycline in 4 mg/kg doses every 6 hours for 7 days.

Artemisinin-based combinations

Artemisinin has a very different mode of action from conventional anti-malarials (see above), which makes it particularly useful in the treatment of resistant infections; however, in order to prevent the development of resistance to this drug, it is only recommended in combination with another non-artemisinin-based therapy. It produces a very rapid reduction in the parasite biomass, with an associated reduction in clinical symptoms, and is known to reduce the transmission of gametocytes, thus decreasing the potential for the spread of resistant alleles. At present there is no known resistance to Artemisinin and very few reported side effects of drug usage; however, the data are limited.

Artesunate and Chloroquine–This combination has been thoroughly tested in randomised controlled trials and has demonstrated that it is well tolerated with few side effects. However, in one study there was less than 85% cure in areas where Chloroquine resistance was known. It is not approved for use in combination therapy and is unadvised in areas of high P. falciparum resistance.

Artesunate and Amodiaquine–This combination has also been tested and proved to be more efficacious than, and similarly well tolerated to, the Chloroquine combination. The cure rate was greater than 90%, potentially providing a viable alternative where levels of Chloroquine resistance are high. The main disadvantage is a suggested link with neutropenia. The recommended dosage is 4 mg/kg of Artesunate and 10 mg/kg of Amodiaquine per day for 3 days.

Artesunate and Mefloquine–This has been used as an efficacious first-line treatment regimen in areas of Thailand for many years. Mefloquine is known to cause vomiting in children and induces some neuropsychiatric and cardiotoxic effects; interestingly, these adverse reactions seem to be reduced when the drug is combined with Artesunate, and it is suggested that this is due to a delayed onset of action of Mefloquine. This is not considered a viable option for introduction in Africa due to the long half-life of Mefloquine, which could potentially exert a high selection pressure on parasites. The standard dose required is 4 mg/kg per day of Artesunate plus 25 mg/kg of Mefloquine, given as a split dose of 15 mg/kg on day 2 and 10 mg/kg on day 3.
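The split dosing in this regimen (25 mg/kg of Mefloquine in total, given as 15 mg/kg and then 10 mg/kg) can be checked with a small worked example. The 20 kg child weight is an assumed illustration only, not a recommendation:

```python
def mefloquine_split_mg(weight_kg, total_mg_per_kg=25, day2_mg_per_kg=15):
    """Split the 25 mg/kg total into the day-2 and day-3 portions quoted above."""
    day2 = weight_kg * day2_mg_per_kg
    day3 = weight_kg * (total_mg_per_kg - day2_mg_per_kg)
    return day2, day3

# Assumed 20 kg child: 300 mg on day 2, 200 mg on day 3 (500 mg total)
d2, d3 = mefloquine_split_mg(20)
print(d2, d3, d2 + d3)
```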

Artemether and Lumefantrine–(Coartem, Riamet, and Lonart) This combination has been extensively tested in 16 clinical trials, proving effective in children under 5, and has been shown to be better tolerated than Artesunate plus Mefloquine combinations. There are no serious side effects documented, but the drug is not recommended in pregnant or lactating women due to limited safety testing in these groups. This is the most viable option for widespread use and is available in fixed-dose formulations, thus increasing compliance and adherence.

Artesunate and Sulfadoxine/Pyrimethamine–This is a well tolerated combination, but the overall level of efficacy still depends on the level of resistance to Sulfadoxine and Pyrimethamine, thus limiting its usage. It is recommended in doses of 4 mg/kg of Artesunate per day for 3 days and a single dose of 25 mg/kg of SP.

Other combinations

There are several anti-malarial combinations currently being developed that are hoped to be highly efficacious, cost-effective, safe and well tolerated. These are to be newly developed compounds and not derivatives of currently used drugs, thus decreasing the likelihood of resistance.

Piperaquine-dihydroartemisinin-trimethoprim (Artecom), and Artecom combined with Primaquine, have been studied in resistant areas of China and Vietnam. The drug has been shown to be highly efficacious (greater than 90%), even against strains resistant to Primaquine. Prior to introduction, more information is required on safety and tolerability in pregnant women and children, as well as toxicology data.

Pyronaridine and Artesunate has been tested and demonstrated a clinical response rate of 100% in one trial in Hainan (an area with high levels of P. falciparum resistance to Pyronaridine).

Chlorproguanil-Dapsone and Artesunate (Lapdap plus) is the most tested drug currently under development and could be introduced in African countries imminently. It is not recommended as a monotherapy due to concerns about resistance developing, which would threaten the future use of related compounds.

Amoebiasis

Amoebiasis, or Amebiasis, is caused by the amoeba Entamoeba histolytica. It is an intestinal infection that may or may not be symptomatic and can be present in an infected person for several years. It is estimated to cause 70,000 deaths per year worldwide. Symptoms, when present, can range from mild diarrhoea to dysentery with blood and mucus in the stool.

When symptoms are present it is generally known as invasive amoebiasis, which occurs in two major forms. Invasion of the intestinal lining causes "amoebic dysentery" or "amoebic colitis". If the parasite reaches the bloodstream it can spread through the body, most frequently ending up in the liver, where it causes "amoebic liver abscesses". When no symptoms are present, the infected individual is still a carrier, able to spread the parasite to others through poor hygienic practices. While symptoms at onset can be similar to bacillary dysentery, amoebiasis is not bacterial in origin and treatments differ, although both infections can be prevented by good sanitary practices.

Transmission

Amoebiasis is usually transmitted by contamination of drinking water and foods with feces, but it can also be transmitted indirectly through contact with dirty hands or objects as well as by anal-oral contact.

Infection is spread through ingestion of the cyst form of the parasite, a resistant structure that is found in stools. There may also be free amoebae, or trophozoites, that do not form cysts but these die quickly after leaving the body and are only rarely the source of new infections. Since amoebiasis is transmitted through contaminated food and water, it is often endemic in the poorer regions of the world due to less well developed waste disposal systems and untreated water supplies.

Contact with contaminated water, for example by washing or brushing your teeth in water from a contaminated source, or ingesting vegetables washed in such water, can lead to infection as well.

Amoebic dysentery is often confused with "traveler's diarrhea", or "Montezuma's Revenge" in Mexico, because of the prevalence of both in developing nations, but in fact most traveler's diarrhea is bacterial or viral in origin. Liver abscesses can occur without previous development of amoebic dysentery.

Prevention

To help prevent the spread of amoebiasis around the home:


Wash hands thoroughly with soap and hot running water for at least 10 seconds after using the toilet or changing a baby's diaper, and before handling food.

Clean bathrooms and toilets often; pay particular attention to toilet seats and taps. Avoid sharing towels or face washers.

To help prevent infection:

Avoid raw vegetables when in endemic areas, as they may have been fertilized using human feces.

Boil water or treat with iodine tablets.

Nature of the disease

Most infected people, perhaps 90%, are asymptomatic, but the disease has the potential to make the sufferer dangerously ill. The World Health Organization estimates that about 70,000 people die from it annually worldwide.

Infections can sometimes last for years. Symptoms take from a few days to a few weeks to develop and manifest themselves, but usually it is about two to four weeks. Symptoms can range from mild diarrhea to dysentery with blood and mucus. The blood comes from amoebae invading the lining of the intestine. In about 10% of invasive cases the amoebae enter the bloodstream and may travel to other organs in the body. Most commonly this means the liver, as this is where blood from the intestine reaches first, but they can end up almost anywhere.

Onset time is highly variable and the average asymptomatic infection persists for over a year. It is theorized that the absence of symptoms or their intensity may vary with such factors as strain of amoeba, immune response of the host, and perhaps associated bacteria and viruses.

In asymptomatic infections, the amoeba lives by eating and digesting bacteria and food particles in the gut, a part of the gastrointestinal tract.[citation needed] It does not usually come in contact with the intestine itself due to the protective layer of mucus that lines the gut. Disease occurs when the amoeba comes in contact with the cells lining the intestine. It then secretes the same substances it uses to digest bacteria, which include enzymes that destroy cell membranes and proteins. This process can lead to penetration and digestion of human tissues, resulting first in flask-shaped ulcers in the intestine. Entamoeba histolytica ingests the destroyed cells by phagocytosis and is often seen with red blood cells inside when viewed in stool samples. Especially in Latin America,[citation needed] a granulomatous mass (known as an amoeboma) may form in the wall of the ascending colon or rectum due to a long-lasting cellular response, and is sometimes confused with cancer.[1]

Theoretically, the ingestion of one viable cyst can cause an infection.


Diagnosis of human illness

Asymptomatic human infections are usually diagnosed by finding cysts shed in the stool. Various flotation or sedimentation procedures have been developed to recover the cysts from fecal matter and stains help to visualize the isolated cysts for microscopic examination. Since cysts are not shed constantly, a minimum of three stools should be examined. In symptomatic infections, the motile form (the trophozoite) can often be seen in fresh feces. Serological tests exist and most individuals (whether with symptoms or not) will test positive for the presence of antibodies. The levels of antibody are much higher in individuals with liver abscesses. Serology only becomes positive about two weeks after infection. More recent developments include a kit that detects the presence of ameba proteins in the feces and another that detects ameba DNA in feces. These tests are not in widespread use due to their expense.
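Because cysts are shed intermittently, a single examination can miss an active infection. The rationale for examining at least three stools can be sketched as follows (the per-examination sensitivity of 0.6 is an assumed, illustrative value, not a figure from the text):

```python
# Why at least three stools are examined: if one examination detects shedding
# with probability p, the chance of at least one positive result rises quickly
# with independent repeat examinations. p = 0.6 is an assumed value.

def detection_probability(p_single, n_exams):
    """P(at least one positive) over n independent examinations."""
    return 1 - (1 - p_single) ** n_exams

for n in (1, 2, 3):
    print(n, round(detection_probability(0.6, n), 3))
# 1 exam: 0.6, 2 exams: 0.84, 3 exams: 0.936
```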

Amoebic dysentery in colon biopsy

Microscopy is still by far the most widespread method of diagnosis around the world. However it is not as sensitive or accurate in diagnosis as the other tests available. It is important to distinguish the E. histolytica cyst from the cysts of nonpathogenic intestinal protozoa such as Entamoeba coli by its appearance. E. histolytica cysts have a maximum of four nuclei, while the commensal Entamoeba coli cyst has up to 8 nuclei. Additionally, in E. histolytica, the endosome is centrally located in the nucleus, while it is usually off-center in Entamoeba coli. Finally, chromatoidal bodies in E. histolytica cysts are rounded, while they are jagged in Entamoeba coli. However, other species, Entamoeba dispar and E. moshkovskii, are also commensals and cannot be distinguished from E. histolytica under the microscope. As E. dispar is much more common than E. histolytica in most parts of the world this means that there is a lot of incorrect diagnosis of E. histolytica infection taking place. The WHO recommends that infections diagnosed by microscopy alone should not be treated if they are asymptomatic and there is no other reason to suspect that the infection is actually E. histolytica.

Relative frequency of the disease

In older textbooks it is often stated that 10% of the world's population is infected with Entamoeba histolytica. It is now known that at least 90% of these infections are due to E. dispar. Nevertheless, this means that there are up to 50 million true E. histolytica infections, and approximately seventy thousand people die each year, mostly from liver abscesses or other complications. Although usually considered a tropical parasite, the first case reported (in 1875) was actually in St Petersburg in Russia, near the Arctic Circle. Infection is more common in warmer areas, but this is both because of poorer hygiene and because the parasitic cysts survive longer in warm, moist conditions.
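The arithmetic behind the 50 million figure can be checked directly (the world population of roughly 5 billion is an assumption matching the era of the 10% textbook estimate):

```python
# Checking the prevalence arithmetic above. The world population figure is an
# assumption for illustration, consistent with the older textbook estimates.

world_population = 5_000_000_000
infected = world_population // 10          # ~10% textbook estimate
true_histolytica = infected // 10          # at least 90% are E. dispar

print(true_histolytica)  # 50000000 true E. histolytica infections
```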

Treatment

E. histolytica infections occur in both the intestine and (in people with symptoms) in the tissue of the intestine and/or liver. As a result, two different classes of drugs are needed to rid the body of the infection, one for each location. Metronidazole, or a related drug such as Tinidazole, Secnidazole or Ornidazole, is used to destroy amoebae that have invaded tissue. These drugs are rapidly absorbed into the bloodstream and transported to the site of infection; because of this rapid absorption, almost none remains in the intestine. Since most of the amoebae remain in the intestine when tissue invasion occurs, it is important to get rid of those as well, or the patient will be at risk of developing another case of invasive disease. Several drugs are available for treating intestinal infections, the most effective of which has been shown to be Paromomycin (also known as Humatin); Diloxanide furoate (also known as Furamide) is used in the US, and Iodoquinol (also known as Yodoxin) is used in certain other countries. Both tissue and luminal drugs must be used to treat infections, with Metronidazole usually being given first, followed by Paromomycin or Diloxanide. E. dispar does not require treatment, but many laboratories (even in the developed world) do not have the facilities to distinguish it from E. histolytica.

For amebic dysentery a multi-prong approach must be used, starting with one of:

Metronidazole, 500-750 mg three times a day for 5-10 days
Tinidazole, 2 g once a day for 3 days, is an alternative to metronidazole

In addition to the above, one of the following luminal amebicides should be prescribed as an adjunctive treatment, either concurrently or sequentially, to destroy E. histolytica in the colon:

Paromomycin, 500 mg three times a day for 10 days
Diloxanide furoate, 500 mg three times a day for 10 days
Iodoquinol, 650 mg three times a day for 20 days

For amebic liver abscess:

Metronidazole, 400 mg three times a day for 10 days
Tinidazole, 2 g once a day for 6 days, is an alternative to metronidazole
Diloxanide furoate, 500 mg three times a day for 10 days (or one of the other luminal amebicides above) must always be given afterwards

Doses for children are calculated by body weight, and a pharmacist should be consulted for help.


Herbal treatments

In Mexico, it is common to use herbal tinctures of chaparro amargo (Castela texana). 30 drops are taken in a small glass of water first thing in the morning, and 30 drops before the last meal of the day, for seven days straight. After taking a seven day break from the treatment, it is resumed for seven days. Some mild cramping may be felt; it is claimed this means that the amoebas are dying and will be expelled from the body. Many Mexicans use the chaparro amargo treatment regularly, three times a year. The efficacy of such treatments has not been scientifically proven.

A 1998 study in Africa suggests that 2 tablespoons per week of papaya seeds may have some antiamoebic action and aid in prevention of amoebiasis, but this remains unconfirmed. Papaya fruit and seeds are often considered beneficial to digestion in areas where this plant is common.

Complications

In the majority of cases, amoebas remain in the gastrointestinal tract of the hosts. Severe ulceration of the gastrointestinal mucosal surfaces occurs in less than 16% of cases. In fewer cases, the parasite invades the soft tissues, most commonly the liver. Only rarely are masses formed (amoebomas) that lead to intestinal obstruction.

Entamoeba histolytica infection is associated with malnutrition and stunting of growth.[2]

Populations at risk

All people are believed to be susceptible to infection, and there is no evidence that individuals with damaged or undeveloped immunity suffer more severe forms of the disease.

Food analysis

E. histolytica cysts may be recovered from contaminated food by methods similar to those used for recovering Giardia lamblia cysts from feces. Filtration is probably the most practical method for recovery from drinking water and liquid foods. E. histolytica cysts must be distinguished from cysts of other parasitic (but nonpathogenic) protozoa and from cysts of free-living protozoa, as discussed above. Recovery procedures are not very accurate; cysts are easily lost or damaged beyond recognition, which leads to many false-negative results in recovery tests.[3]

Outbreaks

The most dramatic incident in the USA was the Chicago World's Fair outbreak in 1933, caused by contaminated drinking water; defective plumbing permitted sewage to contaminate the water. There were 1,000 cases (with 58 deaths). In 1998 there was an outbreak of amoebiasis in the Republic of Georgia. One hundred and seventy-seven cases were reported between 26 May and 3 September 1998, including 71 cases of intestinal amoebiasis and 106 probable cases of liver abscess. In recent times, food handlers have been suspected of causing many scattered infections.

Giardiasis

Giardiasis, popularly known as beaver fever or backpacker's diarrhea, is a disease caused by the flagellate protozoan Giardia lamblia (also sometimes called Giardia intestinalis and Giardia duodenalis).[1] The giardia organism inhabits the digestive tract of a wide variety of domestic and wild animal species, including humans. It is a common cause of gastroenteritis in humans, infecting approximately 200 million people worldwide.

Transmission

Giardiasis is passed via the fecal-oral route. Primary routes are personal contact and contaminated water and food. People who spend time in institutional or day-care environments are more susceptible, as are travelers and those who consume improperly treated water. It is a particular danger to people hiking or backpacking in wilderness areas worldwide. Giardia is suspected to be zoonotic—communicable between animals and humans. Major reservoir hosts include beavers, dogs, cats, horses, cattle and birds.

Symptoms

Symptoms include loss of appetite, lethargy, fever, explosive diarrhea, hematuria (blood in urine), loose or watery stool, stomach cramps, upset stomach, projectile vomiting (uncommon), bloating, flatulence, and burping (often sulphurous). Symptoms typically begin 1–2 weeks after infection and may wane and reappear cyclically. Symptoms are caused by Giardia organisms coating the inside of the small intestine and blocking nutrient absorption. Most people are asymptomatic; only about a third of infected people exhibit symptoms. Untreated, symptoms may last for six weeks or longer.

Symptomatic infections are well recognised as causing lactose intolerance,[2] which, while usually temporary, may become permanent.[3][4] Although hydrogen breath tests indicate poorer rates of carbohydrate absorption in those asymptomatically infected, such tests are not diagnostic of infection.[5] It has been suggested that these observations are explained by symptomatic giardia infection allowing for the overgrowth of other bacteria.[6][5]

Some studies have shown that giardiasis should be considered as a cause of vitamin B12 deficiency, as a result of the problems it causes within the intestinal absorption system.[7]

Treatment


Drugs used to treat adults include metronidazole, albendazole and quinacrine. Furazolidone and nitazoxanide may be used in children. Treatment is not always necessary, as the body can defeat the infection by itself.

The drug tinidazole can treat giardiasis in a single treatment of 2000 mg, instead of the longer treatment of the other medications listed. The shorter duration of treatment may also cause less patient distress. Tinidazole is now approved by the FDA[8] and available to U.S. patients.

Lab Diagnosis

The mainstay of diagnosis of giardiasis is stool microscopy, looking either for motile trophozoites or for the distinctive oval G. lamblia cysts.

The entero-test uses a gelatin capsule with an attached thread. One end is attached to the inner aspect of the patient's cheek, and the capsule is swallowed. Later the thread is withdrawn and shaken in saline to release trophozoites which can be detected microscopically.

A newer immunologic test, the enzyme-linked immunosorbent assay (ELISA), is now available. These tests are capable of a detection rate of 90 percent or more.[9]

Because Giardia lamblia is difficult to detect, often leading to misdiagnoses, it is advised that several tests be conducted over a one week time period.[10]

Anthelmintic

Anthelmintics or antihelminthics are drugs that expel parasitic worms (helminths) from the body, by either stunning or killing them. They may also be called vermifuges (stunning) or vermicides (killing).

Pharmaceutical classes

Examples of pharmaceuticals used as anthelmintics include:

Albendazole – effective against threadworms, roundworms, whipworms, tapeworms, hookworms

Diethylcarbamazine – effective against Wuchereria bancrofti, Brugia malayi, Brugia timori, tropical pulmonary eosinophilia, loiasis

Mebendazole – effective against pinworms, roundworms and hookworms

Niclosamide – effective against tapeworms

Ivermectin – effective against most common intestinal worms (except tapeworms)

Suramin

Thiabendazole – effective against roundworms, hookworms

Pyrantel pamoate – effective against most nematode infections

Levamisole

Piperazine family

Praziquantel – effective against trematodes and cestodes

Triclabendazole – effective against liver flukes

Octadepsipeptides (e.g. Emodepside) – effective against a variety of gastrointestinal helminths

Amino acetonitrile derivatives (e.g. Monepantel) – effective against a variety of gastrointestinal helminths, including those resistant to the other drug classes.

Please note that many of these pharmaceuticals are extremely toxic. Taken in improper dosages, they can be dangerous to humans as well as lethal to parasites.

Natural anthelmintics

Examples of naturally occurring anthelmintics include:

Tobacco (Nicotiana tabacum and Nicotiana rustica)[1]

Black walnut (Juglans nigra)

Wormwood (Artemisia absinthium)

Clove (Syzygium aromaticum)

Tansy tea (Tanacetum vulgare)

Hagenia (Hagenia abyssinica)

Kalonji (Nigella sativa) seeds

Male fern (Dryopteris filix-mas)

Plumeria (P. acutifolia or P. rubra), used in Brazilian folk medicine[2]

Peganum harmala[3]

Please note that many natural vermifuges or anthelmintics are poisonous and, in improper dosages, dangerous to humans as well as parasites.

Anthelmintic resistance

The ability of worms to survive treatments that are generally effective at the recommended dose rate is considered a major threat to the future control of worm parasites of small ruminants and horses.

The clinical definition of resistance is a reduction of 95% or less in a fecal egg count reduction test.
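The calculation behind this definition, the fecal egg count reduction test, can be sketched as follows (the egg counts and function names are illustrative assumptions):

```python
# Sketch of the fecal egg count reduction test (FECRT) behind the resistance
# definition above: resistance is suspected when treatment reduces egg counts
# by 95% or less. Counts (eggs per gram) are invented illustrative values.

def fecal_egg_count_reduction(pre_count, post_count):
    """Percent reduction in eggs per gram after treatment."""
    return 100.0 * (pre_count - post_count) / pre_count

def is_resistant(pre_count, post_count, threshold=95.0):
    """Resistance is flagged when the reduction is at or below the threshold."""
    return fecal_egg_count_reduction(pre_count, post_count) <= threshold

print(fecal_egg_count_reduction(1000, 20))   # 98.0 -> still susceptible
print(is_resistant(1000, 200))               # True: only an 80% reduction
```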


Development of resistance

Treatment eliminates worms whose genotype renders them susceptible. Worms that are resistant survive and pass on their resistance genes. Resistant worms accumulate, and eventually treatment failure occurs.
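This selection process can be sketched as a toy simulation (the survival rates and the starting resistant fraction are assumed values chosen only for illustration):

```python
# Minimal sketch of the selection process described above: each treatment
# round kills susceptible worms far more often than resistant ones, so the
# resistant fraction of the population ratchets upward toward fixation.
# All rates here are assumed, illustrative values.

def resistant_fraction(initial_fraction, rounds,
                       resistant_survival=0.9, susceptible_survival=0.05):
    f = initial_fraction
    for _ in range(rounds):
        r = f * resistant_survival            # resistant worms surviving
        s = (1 - f) * susceptible_survival    # susceptible worms surviving
        f = r / (r + s)                       # new resistant fraction
    return f

print(round(resistant_fraction(0.01, 1), 3))  # 0.154 after one treatment
print(round(resistant_fraction(0.01, 3), 3))  # 0.983: near fixation by three
```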

Bioassay

Bioassay is a commonly used shorthand for biological assay, a type of scientific experiment.

Bioassays are typically conducted to measure the effects of a substance on a living organism. Bioassays may be qualitative or quantitative. Qualitative bioassays are used for assessing the physical effects of a substance that may not be quantified, such as abnormal development or deformity. Quantitative bioassays involve estimation of the concentration or potency of a substance by measurement of the biological response that it produces. Quantitative bioassays are typically analyzed using the methods of biostatistics. Bioassays are essential in the development of new drugs, and monitoring pollutants in the environment. Environmental bioassays are generally a broad-range survey of toxicity, and a toxicity identification evaluation is conducted to determine what the relevant toxicants are.

The uses of bioassays include:

1. measurement of the pharmacological activity of new or chemically undefined substances

2. investigation of the function of endogenous mediators

3. determination of the side-effect profile, including the degree of drug toxicity

4. measurement of the concentration of known substances (alternatives to the use of whole animals have made this use obsolete)

5. assessing the amount of pollutants being released by a particular source, such as wastewater or urban runoff.
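Quantitative potency estimation (uses 1 and 4 above) can be sketched as a simple interpolation of the dose producing a half-maximal response (the dose-response data and the function name are illustrative assumptions):

```python
# Sketch of quantitative bioassay analysis: estimating the dose that gives a
# half-maximal response (an EC50-style potency estimate) by log-linear
# interpolation. The doses and responses are invented illustrative data.

import math

def interpolate_ec50(doses, responses, half_max):
    """Log-linear interpolation of the dose giving `half_max` response."""
    points = list(zip(doses, responses))
    for (d1, r1), (d2, r2) in zip(points, points[1:]):
        if r1 <= half_max <= r2:
            frac = (half_max - r1) / (r2 - r1)
            log_dose = math.log10(d1) + frac * (math.log10(d2) - math.log10(d1))
            return 10 ** log_dose
    raise ValueError("half-maximal response not bracketed by the data")

doses = [1, 10, 100, 1000]       # e.g. mg/L of the test substance
responses = [5, 30, 70, 95]      # percent of maximal biological response
print(round(interpolate_ec50(doses, responses, 50), 1))  # 31.6
```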

Immunosuppressant

An immunosuppressant is a substance that performs immunosuppression of the immune system. Immunosuppressants may be either exogenous, as with immunosuppressive drugs, or endogenous, as with, e.g., testosterone.[1]

After organ transplantation, the body will nearly always reject the new organ(s) due to differences in human leukocyte antigen haplotypes between the donor and recipient. As a result, the immune system detects the new tissue as "hostile", and attempts to remove it by attacking it with recipient leukocytes, resulting in the death of the tissue.


Immunosuppressants are applied as a countermeasure; the side effect is that the body becomes extremely vulnerable to infections and malignancy, much like in an advanced HIV infection.

Immunosuppressive drug

Immunosuppressive drugs, immunosuppressive agents, or immunosuppressants are drugs that inhibit or prevent activity of the immune system. They are used in immunosuppressive therapy to:

Prevent the rejection of transplanted organs and tissues (e.g., bone marrow, heart, kidney, liver)

Treat autoimmune diseases or diseases that are most likely of autoimmune origin (e.g., rheumatoid arthritis, multiple sclerosis, myasthenia gravis, systemic lupus erythematosus, Crohn's disease, pemphigus, and ulcerative colitis).

Treat some other non-autoimmune inflammatory diseases (e.g., long term allergic asthma control).

These drugs are not without side-effects and risks. Because the majority of them act non-selectively, the immune system is less able to resist infections and the spread of malignant cells. There are also other side-effects, such as hypertension, dyslipidemia, hyperglycemia, peptic ulcers, and liver and kidney injury. The immunosuppressive drugs also interact with other medicines and affect their metabolism and action. Actual or suspected immunosuppressive agents can be evaluated in terms of their effects on lymphocyte subpopulations in tissues using immunohistochemistry.[1]

Immunosuppressive drugs can be classified into five groups:

glucocorticoids

cytostatics

antibodies

drugs acting on immunophilins

other drugs

Glucocorticoids

In pharmacologic (supraphysiologic) doses, glucocorticoids are used to suppress various allergic, inflammatory, and autoimmune disorders. They are also administered as posttransplantory immunosuppressants to prevent the acute transplant rejection and graft-versus-host disease. Nevertheless, they do not prevent an infection and also inhibit later reparative processes.

Immunosuppressive mechanism


Glucocorticoids suppress the cell-mediated immunity. They act by inhibiting genes that code for the cytokines IL-1, IL-2, IL-3, IL-4, IL-5, IL-6, IL-8, and TNF-α, the most important of which is IL-2. Reduced cytokine production limits T cell proliferation.

Glucocorticoids also suppress the humoral immunity, causing B cells to express smaller amounts of IL-2 and IL-2 receptors. This diminishes both B cell clone expansion and antibody synthesis.

Antiinflammatory effects

Glucocorticoids influence all types of inflammatory events, no matter what their cause. They induce the synthesis of lipocortin-1 (annexin-1), which then binds to cell membranes, preventing phospholipase A2 from coming into contact with its substrate, arachidonic acid. This leads to diminished eicosanoid production. Cyclooxygenase (both COX-1 and COX-2) expression is also suppressed, potentiating the effect.

Glucocorticoids also stimulate the lipocortin-1 escaping to the extracellular space, where it binds to the leukocyte membrane receptors and inhibits various inflammatory events: epithelial adhesion, emigration, chemotaxis, phagocytosis, respiratory burst, and the release of various inflammatory mediators (lysosomal enzymes, cytokines, tissue plasminogen activator, chemokines, etc.) from neutrophils, macrophages, and mastocytes.

Cytostatics

Cytostatics inhibit cell division. In immunotherapy, they are used in smaller doses than in the treatment of malignant diseases. They affect the proliferation of both T cells and B cells. Purine analogs are the most effective and are therefore the most frequently administered.

Alkylating agents

The alkylating agents used in immunotherapy are nitrogen mustards (cyclophosphamide), nitrosoureas, platinum compounds, and others. Cyclophosphamide is probably the most potent immunosuppressive compound. In small doses, it is very efficient in the therapy of systemic lupus erythematosus, autoimmune hemolytic anemias, Wegener's granulomatosis, and other immune diseases. High doses cause pancytopenia and hemorrhagic cystitis.

Antimetabolites

Antimetabolites interfere with the synthesis of nucleic acids. These include:

folic acid analogues, such as methotrexate

purine analogues, such as azathioprine and mercaptopurine

pyrimidine analogues

protein synthesis inhibitors.


Methotrexate

Methotrexate is a folic acid analogue. It binds dihydrofolate reductase and prevents synthesis of tetrahydrofolate. It is used in the treatment of autoimmune diseases (for example rheumatoid arthritis) and in transplantations.

Azathioprine and Mercaptopurine

Azathioprine is the main immunosuppressive cytotoxic substance. It is extensively used to control transplant rejection reactions. It is nonenzymatically cleaved to mercaptopurine, which acts as a purine analogue and an inhibitor of DNA synthesis. Mercaptopurine itself can also be administered directly.

By preventing the clonal expansion of lymphocytes in the induction phase of the immune response, it affects both the cell and the humoral immunity. It is also efficient in the treatment of autoimmune diseases.

Cytotoxic antibiotics

Among these, dactinomycin is the most important. It is used in kidney transplantations. Other cytotoxic antibiotics include anthracyclines, mitomycin C, bleomycin, and mithramycin.

Antibodies

Antibodies are used as a quick and potent immunosuppression method to prevent the acute rejection reaction.

Polyclonal antibodies

Heterologous polyclonal antibodies are obtained from the serum of animals (e.g., rabbit or horse) that have been injected with the patient's thymocytes or lymphocytes. Antilymphocyte globulin (ALG) and antithymocyte globulin (ATG) are in use. They are part of the treatment of steroid-resistant acute rejection and of severe aplastic anemia. However, they are added primarily to other immunosuppressives to diminish their dosage and toxicity. They also allow transition to cyclosporine therapy.

Polyclonal antibodies inhibit T lymphocytes and cause their lysis, through both complement-mediated cytolysis and cell-mediated opsonization, followed by removal from the circulation by reticuloendothelial cells in the spleen and liver. In this way, polyclonal antibodies inhibit cell-mediated immune reactions, including graft rejection, delayed hypersensitivity (i.e., the tuberculin skin reaction), and graft-versus-host disease (GVHD), but they also influence thymus-dependent antibody production.

As of March 2005, two preparations are available on the market: Atgam (R), obtained from horse serum, and Thymoglobuline (R), obtained from rabbit serum. Polyclonal antibodies affect all lymphocytes and cause general immunosuppression, possibly leading to post-transplant lymphoproliferative disorders (PTLD) or serious infections, especially by cytomegalovirus. To reduce these risks, treatment is provided in a hospital, where adequate isolation from infection is available. They are usually administered for five days intravenously in the appropriate quantity. Patients may stay in the hospital as long as three weeks to give the immune system time to recover to a point where there is no longer a risk of serum sickness.

Because of the high immunogenicity of polyclonal antibodies, almost all patients have an acute reaction to the treatment. It is characterized by fever, rigor episodes, and even anaphylaxis. Later during the treatment, some patients develop serum sickness or immune complex glomerulonephritis. Serum sickness arises seven to fourteen days after the therapy has begun. The patient suffers from fever, joint pain, and erythema that can be soothed with the use of steroids and analgesics. Urticaria (hives) can also be present. It is possible to diminish the toxicity of polyclonal antibodies by using highly purified serum fractions and intravenous administration in combination with other immunosuppressants, for example calcineurin inhibitors, cytostatics, and corticosteroids. The most frequent combination is to use antibodies and cyclosporine simultaneously, in order to prevent patients from gradually developing a strong immune response to these drugs that would reduce or eliminate their effectiveness.

Monoclonal antibodies

Monoclonal antibodies are directed towards exactly defined antigens. Therefore, they cause fewer side-effects. Especially significant are the IL-2 receptor- (CD25-) and CD3-directed antibodies. They are used to prevent the rejection of transplanted organs, but also to track changes in the lymphocyte subpopulations. It is reasonable to expect similar new drugs in the future.

T-cell receptor directed antibodies

As of 2007, OKT3 (also called muromonab-CD3) is the only approved anti-CD3 antibody. It is a murine anti-CD3 monoclonal antibody of the IgG2a type that prevents T-cell activation and proliferation by binding the T-cell receptor complex present on all differentiated T cells. As such, it is one of the most potent immunosuppressive substances, and it is administered to control steroid- and/or polyclonal antibody-resistant acute rejection episodes. As it acts more specifically than polyclonal antibodies, it is also used prophylactically in transplantations.

At present, OKT3's mechanism of action is only partially understood. It is known that the molecule binds the TCR/CD3 receptor complex. In the first few administrations this binding non-specifically activates T-cells, leading to a serious syndrome 30 to 60 minutes later, characterized by fever, myalgia, headache, and arthralgia. Sometimes it develops into a life-threatening reaction of the cardiovascular system and the central nervous system, requiring lengthy therapy. Past this period, OKT3 blocks TCR-antigen binding and causes conformational change or removal of the entire TCR/CD3 complex from the T-cell surface. This lowers the number of available T-cells, perhaps by sensitizing them for uptake by the epithelial reticular cells. The cross-binding of CD3 molecules also activates an intracellular signal causing T cell anergy or apoptosis, unless the cells receive another signal through a co-stimulatory molecule. CD3 antibodies shift the balance from Th1 to Th2 cells.

When deciding to include OKT3 in a treatment, a healthcare practitioner must consider not only its great efficiency but also its toxic side-effects. The risk of excessive immunosuppression and the risk of developing neutralizing antibodies could make it inefficacious. Although CD3 antibodies act more specifically than polyclonal antibodies, they lower cell-mediated immunity significantly, predisposing the patient to opportunistic infections and malignancies.

IL-2 receptor directed antibodies

Interleukin-2 is an important immune system regulator necessary for the clonal expansion and survival of activated T lymphocytes. Its effects are mediated by the trimeric cell surface receptor IL-2R, consisting of α, β, and γ chains. The α chain (CD25, T-cell activation antigen, TAC) is expressed only by already-activated T lymphocytes. It is therefore of special significance to selective immunosuppressive treatment, and research has focused on the development of effective and safe anti-IL-2 antibodies. Through the use of recombinant gene technology, the mouse anti-Tac antibodies were modified, leading to the presentation of two chimeric mouse/human anti-Tac antibodies in 1998: basiliximab (Simulect (R)) and daclizumab (Zenapax (R)). These drugs act by binding the IL-2 receptor's α chain, preventing the IL-2-induced clonal expansion of activated lymphocytes and shortening their survival. They are used in the prophylaxis of acute organ rejection after kidney transplantation, both being similarly effective and with only few side-effects.

Drugs acting on immunophilins

Cyclosporin

Together with tacrolimus, cyclosporin is a calcineurin inhibitor. It has been in use since 1983 and is one of the most widely used immunosuppressive drugs. It is a cyclic fungal peptide composed of 11 amino acids.

Cyclosporin is thought to bind to the cytosolic protein cyclophilin (an immunophilin) of immunocompetent lymphocytes, especially T-lymphocytes. This complex of cyclosporin and cyclophilin inhibits calcineurin, which under normal circumstances induces the transcription of interleukin-2. The drug also inhibits lymphokine production and interleukin release, leading to a reduced function of effector T-cells.

Cyclosporin is used in the treatment of acute rejection reactions, but has been increasingly substituted with newer immunosuppressants, as it is nephrotoxic.

Tacrolimus (Prograf(TM), FK506)

Tacrolimus is a fungal product (Streptomyces tsukubaensis). It is a macrolide lactone and acts by inhibiting calcineurin.

The drug is used particularly in liver and kidney transplantation, although in some clinics it is used in heart, lung, and heart/lung transplants. It binds to an immunophilin, and the resulting complex binds calcineurin and inhibits its phosphatase activity, preventing the cell from passing from the G0 into the G1 phase. Tacrolimus is more potent than cyclosporin and has less-pronounced side-effects.

Sirolimus (Rapamune(TM), Rapamycin)

Sirolimus is a macrolide lactone produced by the actinomycete Streptomyces hygroscopicus. It is used to prevent rejection reactions. Although it is a structural analogue of tacrolimus, it acts somewhat differently and has different side-effects.

Unlike cyclosporine and tacrolimus, which affect the first phase of T-lymphocyte activation, sirolimus affects the second: signal transduction and clonal proliferation. It binds the same immunophilin as tacrolimus, but the resulting complex inhibits not calcineurin but another protein (mTOR). Sirolimus therefore acts synergistically with cyclosporine and, in combination with other immunosuppressants, has few side-effects. It also indirectly inhibits several T-lymphocyte kinases and phosphatases, blocking signal transduction and the transition of the cell cycle from the G1 to the S phase. In a similar manner, it prevents the differentiation of B cells into plasma cells, lowering the quantity of IgM, IgG, and IgA antibodies produced. It acts as an immunoregulatory agent and is also active against tumors that involve the PI3K/AKT/mTOR pathway.

Other drugs

Interferons

IFN-β suppresses the production of Th1 cytokines and the activation of monocytes. It is used to slow down the progression of multiple sclerosis. IFN-γ is able to trigger lymphocytic apoptosis.

Opioids

Prolonged use of opioids may cause immunosuppression of both innate and adaptive immunity.[2] Decrease in proliferation as well as immune function has been observed in macrophages, as well as lymphocytes. It is thought that these effects are mediated by opioid receptors expressed on the surface of these immune cells.[2]

TNF binding proteins

A TNF-α (tumor necrosis factor-alpha) binding protein is a monoclonal antibody or a circulating receptor, such as infliximab (Remicade), etanercept (Enbrel), or adalimumab (Humira), that binds to TNF-α, preventing it from inducing the synthesis of IL-1 and IL-6 and the adhesion of lymphocyte-activating molecules. These agents are used in the treatment of rheumatoid arthritis, ankylosing spondylitis, Crohn's disease, and psoriasis.

TNF or the effects of TNF are also suppressed by various natural compounds, including curcumin (an ingredient in turmeric) and catechins (in green tea).

These drugs may raise the risk of contracting tuberculosis or of causing a latent infection to become active. Infliximab and adalimumab carry label warnings stating that patients should be evaluated for latent TB infection, and that treatment for it should be initiated before starting therapy with these drugs.

Mycophenolate

Mycophenolic acid acts as a non-competitive, selective, and reversible inhibitor of inosine-5′-monophosphate dehydrogenase (IMPDH), a key enzyme in de novo guanosine nucleotide synthesis. In contrast to other human cell types, B and T lymphocytes are highly dependent on this pathway.

Small biological agents

FTY720 is a new synthetic immunosuppressant, currently in phase 3 of clinical trials. It increases the expression or changes the function of certain adhesion molecules (α4/β7 integrin) in lymphocytes, so that they accumulate in lymphatic tissue (lymph nodes) and their number in the circulation is diminished. In this respect, it differs from all other known immunosuppressants.

Myriocin has been reported to be 10 to 100 times more potent than cyclosporin.

Toxicology

Toxicology (from the Greek words toxicos and logos) is the study of the adverse effects of chemicals on living organisms.[1] It is the study of symptoms, mechanisms, treatments, and detection of poisoning, especially the poisoning of people.

Relationship between dose and toxicity

Toxicology is the study of the relationship between dose and its effects on the exposed organism. The chief criterion regarding the toxicity of a chemical is the dose, i.e., the amount of exposure to the substance. Almost all substances are toxic under the right conditions, as Paracelsus, the father of modern toxicology, put it: "Sola dosis facit venenum" (only the dose makes the poison). Paracelsus, who lived in the 16th century, was the first person to explain the dose-response relationship of toxic substances. The term LD50 refers to the dose of a toxic substance that kills 50 percent of a test population (typically rats or other surrogates when the test concerns human toxicity). LD50 estimations in animals are no longer required for regulatory submissions as part of a pre-clinical development package.[citation needed]
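
As a sketch of how an LD50 might be read off experimental mortality data, the following interpolates on a log-dose axis between the two doses that bracket 50% mortality. The function name and the data are hypothetical, for illustration only:

```python
import math

def estimate_ld50(doses_mg_per_kg, fraction_killed):
    """Estimate LD50 by log-linear interpolation between the two
    doses that bracket 50% mortality. Illustrative sketch only."""
    pairs = sorted(zip(doses_mg_per_kg, fraction_killed))
    for (d_lo, f_lo), (d_hi, f_hi) in zip(pairs, pairs[1:]):
        if f_lo <= 0.5 <= f_hi:
            # interpolate on log-dose, since dose-response curves
            # are conventionally plotted on a logarithmic dose axis
            t = (0.5 - f_lo) / (f_hi - f_lo)
            log_ld50 = math.log(d_lo) + t * (math.log(d_hi) - math.log(d_lo))
            return math.exp(log_ld50)
    raise ValueError("50% mortality not bracketed by the data")

# hypothetical data: dose (mg/kg) vs. fraction of test animals killed
doses = [10, 50, 100, 500, 1000]
killed = [0.0, 0.1, 0.3, 0.7, 1.0]
print(round(estimate_ld50(doses, killed), 1))
```

With 50% mortality falling halfway between 100 and 500 mg/kg, the interpolated LD50 is the geometric mean of the two bracketing doses.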

Toxicity of metabolites

Many substances regarded as poisons are toxic only indirectly. An example is "wood alcohol," or methanol, which is chemically converted to formaldehyde and formic acid in the liver. It is the formaldehyde and formic acid that cause the toxic effects of methanol exposure. Many drug molecules are made toxic in the liver, a good example being acetaminophen (paracetamol), especially in the presence of alcohol. The genetic variability of certain liver enzymes makes the toxicity of many compounds differ between one individual and the next. Because demands placed on one liver enzyme can induce activity in another, many molecules become toxic only in combination with others. A family of activities that engages many toxicologists includes identifying which liver enzymes convert a molecule into a poison, what the toxic products of the conversion are, and under what conditions and in which individuals this conversion takes place.

Chemical toxicology

Chemical toxicology is a scientific discipline involving the study of structure and mechanism related to the toxic effects of chemical agents, and encompasses technological advances in research related to chemical aspects of toxicology. Research in this area is strongly multidisciplinary, spanning computational chemistry and synthetic chemistry, proteomics and metabolomics, drug discovery, drug metabolism and mechanisms of action, bioinformatics, bioanalytical chemistry, chemical biology, and molecular epidemiology. Molecular profiling approaches to toxicology are also referred to as toxicogenomics.[5]

Toxicity

Toxicity is the degree to which a substance is able to damage an exposed organism. Toxicity can refer to the effect on a whole organism, such as a human, bacterium, or plant, as well as the effect on a substructure of the organism, such as a cell (cytotoxicity) or an organ (organotoxicity), such as the liver (hepatotoxicity). By extension, the word may be metaphorically used to describe toxic effects on larger and more complex groups, such as the family unit or society at large.

A central concept of toxicology is that effects are dose-dependent; even water can lead to water intoxication when taken in large enough doses, whereas for even a very toxic substance such as snake venom there is a dose below which there is no detectable toxic effect.

The skull and crossbones is a common symbol for toxicity.

Types of toxicity

There are generally three types of toxic entities: chemical, biological, and physical.

Chemical toxicants include inorganic substances, such as lead, hydrofluoric acid, and chlorine gas; organic compounds, such as methyl alcohol; most medications; and poisons from living things.

Biological toxic entities include those bacteria and viruses that are able to induce disease in living organisms. Biological toxicity can be complicated to measure because the "threshold dose" may be a single organism. Theoretically one virus, bacterium or worm can reproduce to cause a serious infection. However, in a host with an intact immune system the inherent toxicity of the organism is balanced by the host's ability to fight back; the effective toxicity is then a combination of both parts of the relationship. A similar situation is also present with other types of toxic agents.

Physically toxic entities include things not usually thought of under the heading of "toxic" by many people: direct blows, concussion, sound and vibration, heat and cold, non-ionizing electromagnetic radiation such as infrared and visible light, and ionizing radiation such as X-rays and alpha, beta, and gamma radiation.

Toxicity can be measured by the effects on the target (organism, organ, tissue, or cell). Because individuals typically have different levels of response to the same dose of a toxin, a population-level measure of toxicity is often used which relates the probability of an outcome for a given individual in a population. One such measure is the LD50. When such data do not exist, estimates are made by comparison to known similar toxic substances, or to similar exposures in similar organisms. Then "safety factors" are added to account for uncertainties in data and evaluation processes. For example, if a dose of toxin is safe for a laboratory rat, one might assume that one tenth that dose would be safe for a human, allowing a safety factor of 10 for interspecies differences between two mammals; if the data are from fish, one might use a factor of 100 to account for the greater difference between two chordate classes (fish and mammals). Similarly, an extra protection factor may be used for individuals believed to be more susceptible to toxic effects, such as in pregnancy or with certain diseases. And a newly synthesized, previously unstudied chemical believed to be very similar in effect to another compound may be assigned an additional protection factor of 10 to account for possible differences in effect that are probably much smaller. This approach is, of course, very approximate, but such protection factors are deliberately very conservative, and the method has been found useful in a wide variety of applications.
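
The safety-factor arithmetic described above can be made concrete. The sketch below simply divides the animal-derived safe dose by the stacked factors from the text; the function name, parameters, and example doses are hypothetical:

```python
def estimated_safe_dose(animal_safe_dose, source,
                        susceptible=False, unstudied_analogue=False):
    """Apply the conservative safety factors described in the text.
    A sketch for illustration, not a regulatory procedure."""
    interspecies = {"rat": 10, "fish": 100}   # mammal vs. other chordate class
    factor = interspecies[source]
    if susceptible:          # e.g. pregnancy, certain diseases
        factor *= 10
    if unstudied_analogue:   # new chemical judged similar but unstudied
        factor *= 10
    return animal_safe_dose / factor

print(estimated_safe_dose(50.0, "rat"))                     # 5.0
print(estimated_safe_dose(50.0, "fish", susceptible=True))  # 0.05
```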

Assessing all aspects of the toxicity of cancer-causing agents involves additional issues, since it is not certain if there is a minimal effective dose for carcinogens, or whether the risk is just too small to see. In addition, it is possible that a single cell transformed into a cancer cell is all it takes to develop the full effect (the "one hit" theory).

It is more difficult to assess the toxicity of chemical mixtures than of single, pure chemicals, because each component displays its own toxicity and components may interact to produce enhanced or diminished effects. Common mixtures include gasoline, cigarette smoke, and industrial waste. Even more complex are situations with more than one type of toxic entity, such as the discharge from a malfunctioning sewage treatment plant, with both chemical and biological agents.

Factors influencing toxicity

Toxicity of a substance can be affected by many different factors, such as the pathway of administration (whether the toxin is applied to the skin, ingested, inhaled, injected), the time of exposure (a brief encounter or long term), the number of exposures (a single dose or multiple doses over time), the physical form of the toxin (solid, liquid, gas), the genetic makeup of an individual, an individual's overall health, and many others. Several of the terms used to describe these factors have been included here.

Acute exposure: a single exposure to a toxic substance which may result in severe biological harm or death; acute exposures are usually characterized as lasting no longer than a day.

Chronic exposure: continuous exposure to a toxin over an extended period of time, often measured in months or years; it can cause irreversible side effects.

Acute toxicity

Acute toxicity describes the adverse effects of a substance which result either from a single exposure[1] or from multiple exposures in a short space of time (usually less than 24 hours).[2] To be described as acute toxicity, the adverse effects should occur within 14 days of the administration of the substance.[2]

Acute toxicity is distinguished from chronic toxicity, which describes the adverse health effects from repeated exposures, often at lower levels, to a substance over a longer time period (months or years).
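
Under the thresholds stated above (exposure within roughly 24 hours, effects within 14 days of administration), the acute/chronic distinction can be sketched as a trivial classifier; the function and its inputs are purely illustrative:

```python
def classify_exposure(exposure_hours, onset_days):
    """Rough classifier following the definitions in the text:
    acute toxicity involves exposure within ~24 hours with effects
    within 14 days; otherwise treat the case as chronic."""
    if exposure_hours <= 24 and onset_days <= 14:
        return "acute"
    return "chronic"

print(classify_exposure(exposure_hours=2, onset_days=3))          # acute
print(classify_exposure(exposure_hours=24 * 365, onset_days=400)) # chronic
```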

It is obviously unethical to test for acute (or chronic) toxicity in humans. However, some information can be gained from investigating accidental human exposures (e.g. factory accidents). Otherwise, most acute toxicity data comes from animal testing or, more recently, in vitro testing methods and inference from data on similar substances.[1][3]

Measures of acute toxicity

Regulatory values

Limits for short-term exposure, such as STELs or CVs, are only defined if there is a particular acute toxicity associated with a substance.

Short-term exposure limit (STEL); threshold limit value-short-term exposure limit (TLV-STEL)

Ceiling value (CV); threshold limit value-ceiling (TLV-C)

Experimental values

No observed adverse effect level (NOAEL)

Lowest observed adverse effect level (LOAEL)

Maximum tolerable concentration (MTC, LC0); maximum tolerable dose (MTD, LD0)

Minimum lethal concentration (LCmin); minimum lethal dose (LDmin)

Median lethal concentration (LC50); median lethal dose (LD50); median lethal time (LT50)

Absolute lethal concentration (LC100); absolute lethal dose (LD100)

Chronic toxicity

Chronic toxicity is a property of a substance that has toxic effects on a living organism when that organism is exposed to the substance continuously or repeatedly, in contrast to acute toxicity.

Two distinct situations need to be considered:

Prolonged exposure to a substance

For example, if a person drinks too much alcohol on a regular basis, their health may suffer as a result. Alcohol does not have a long biological half-life, but it is supplied to the body on a regular basis.

Prolonged internal exposure because a substance remains in the body for a long time

For example, if a person were to ingest radium, much of it would be absorbed into the bones, where it would exert a harmful effect on the person's health. The radium might cause a disturbance in the blood-cell-forming part of the bone (the bone marrow).

Secondary poisoning

Secondary poisoning is damage caused indirectly by non-biological pesticides. It appears in a number of forms, the most prevalent of which is groundwater poisoning, resulting from uncontrolled spraying of pesticides on farm fields in close proximity to where groundwater is stored. Another possibility is the poisoning of insects or domestic rodents, whose toxic carcasses may in turn poison pets and infants that eat food which has been in contact with them.

In general, it is recommended to control domestic insects by spraying materials that damage the insects' nervous system, because their nervous system differs from that of humans.

Toxicity – Alcohol

Alcohols often have an odor described as 'biting' that 'hangs' in the nasal passages. Ethanol in the form of alcoholic beverages has been consumed by humans since pre-historic times, for a variety of hygienic, dietary, medicinal, religious, and recreational reasons. The consumption of large doses results in drunkenness or intoxication (which may lead to a hangover as the effect wears off) and, depending on the dose and regularity of use, can cause acute respiratory failure or death and with chronic use has medical repercussions. Because alcohol impairs judgment, it can often be a catalyst for reckless or irresponsible behavior. The LD50 of ethanol in rats is 10,300 mg/kg.[5]

Other alcohols are substantially more poisonous than ethanol, partly because they take much longer to be metabolized, and often their metabolism produces even more toxic substances. Methanol, or wood alcohol, for instance, is oxidized by alcohol dehydrogenase enzymes in the liver to the poisonous formaldehyde, which can cause blindness or death.[2]

An effective treatment to prevent formaldehyde toxicity after methanol ingestion is to administer ethanol. Alcohol dehydrogenase has a higher affinity for ethanol, thus preventing methanol from binding and acting as a substrate. Any remaining methanol will then have time to be excreted through the kidneys. Remaining formaldehyde will be converted to formic acid and excreted.[6][7]
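
This competitive mechanism can be sketched with classical competing-substrate (Michaelis-Menten) kinetics. All constants below are illustrative, not measured values; ethanol's higher affinity for alcohol dehydrogenase is encoded as a lower Km:

```python
def methanol_oxidation_rate(methanol, ethanol, vmax=1.0,
                            km_methanol=5.0, km_ethanol=0.5):
    """Relative rate of methanol oxidation by alcohol dehydrogenase
    when ethanol competes for the same active site. Constants are
    illustrative only; the lower Km for ethanol reflects its
    higher affinity for the enzyme."""
    return (vmax * methanol / km_methanol) / (
        1 + methanol / km_methanol + ethanol / km_ethanol)

no_ethanol = methanol_oxidation_rate(methanol=2.0, ethanol=0.0)
with_ethanol = methanol_oxidation_rate(methanol=2.0, ethanol=2.0)
# fraction of the methanol-oxidation rate remaining once ethanol competes
print(round(with_ethanol / no_ethanol, 2))
```

With these toy numbers, adding an equal concentration of ethanol cuts methanol oxidation to roughly a quarter of its uninhibited rate, giving unchanged methanol more time to be excreted unmetabolized.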

Methanol itself, while poisonous, has a much weaker sedative effect than ethanol. Some longer-chain alcohols, such as n-propanol, isopropanol, n-butanol, t-butanol, and 2-methyl-2-butanol, do however have stronger sedative effects, but also higher toxicity than ethanol.[8][9] These longer-chain alcohols are found as contaminants in some alcoholic beverages and are known as fusel alcohols,[10][11] and are reputed to cause severe hangovers, although it is unclear whether the fusel alcohols are actually responsible.[12] Many longer-chain alcohols are used in industry as solvents and are occasionally abused by alcoholics,[13][14] leading to a range of adverse health effects.[15]

Treatment – Snake Bite

Determining whether a bite by any given species of snake is life-threatening is not an easy task. A bite by a North American copperhead on the ankle is usually a moderate injury to a healthy adult, but a bite to a child's abdomen or face by the same snake may well be fatal. The outcome of all snakebites depends on a multitude of factors: the size, physical condition, and temperature of the snake; the age and physical condition of the victim; the area and tissue bitten (e.g., foot, torso, vein, or muscle); the amount of venom injected; the time it takes for the patient to find treatment; and finally the quality of that treatment.

Snake identification

Identification of the snake is important in planning treatment in certain areas of the world, but is not always possible. Ideally the dead snake would be brought in with the patient, but in areas where snake bite is more common, local knowledge may be sufficient to recognize the snake.

In countries where polyvalent antivenins are available, such as North America, identification of the snake is not of much significance.

The three types of venomous snakes that cause the majority of major clinical problems are the viper, krait and cobra. Knowledge of what species are present locally can be crucially important, as is knowledge of typical signs and symptoms of envenoming by each species of snake.

Scoring systems can be used to try to determine the biting snake based on clinical features,[9] but these scoring systems are extremely specific to a particular geographical area.

First Aid

Snakebite first aid recommendations vary, in part because different snakes have different types of venom. Some have little local effect, but life-threatening systemic effects, in which case containing the venom in the region of the bite (e.g., by pressure immobilization) is highly desirable. Other venoms instigate localized tissue damage around the bitten area, and immobilization may increase the severity of the damage in this area, but also reduce the total area affected; whether this trade-off is desirable remains a point of controversy.

Because snakes vary from one country to another, first aid methods also vary; treatment methods suited for a rattlesnake bite in the United States might well be fatal if applied to a tiger snake bite in Australia. As always, this article is not a legitimate substitute for professional medical advice. Readers are strongly advised to obtain guidelines from a reputable first aid organization in their own region, and to beware of homegrown or anecdotal remedies.

However, most first aid guidelines agree on the following:

1. Protect the patient (and others, including yourself) from further bites. While identifying the species is desirable in certain regions, do not risk further bites or delay proper medical treatment by attempting to capture or kill the snake. If the snake has not already fled, carefully remove the patient from the immediate area.

2. Keep the patient calm and call for help to arrange for transport to the nearest hospital emergency room, where antivenin for snakes common to the area will often be available.

3. Make sure to keep the bitten limb in a functional position and below the victim's heart level so as to minimize blood returning to the heart and other organs of the body.

4. Do not give the patient anything to eat or drink. This is especially important with consumable alcohol, a known vasodilator, which will speed up the absorption of venom. Do not administer stimulants or pain medications to the victim unless specifically directed to do so by a physician.

5. Remove any items or clothing which may constrict the bitten limb if it swells (rings, bracelets, watches, footwear, etc.)

6. Keep the patient as still as possible.

7. Do not incise the bitten site.

Many organizations, including the American Medical Association and American Red Cross, recommend washing the bite with soap and water. However, do not attempt to clean the area with any type of chemical.

Treatment for Australian snake bites (which may differ from treatment in other areas of the world) stringently recommends against cleaning the wound. Traces of venom left on the skin or bandages from the strike can be used in combination with a snake bite identification kit to identify the species of snake. This speeds determination of which anti-venom to administer in the emergency room.

Pressure immobilization

Pressure immobilization is not appropriate for cytotoxic bites such as those of most vipers,[10][11][12] but is highly effective against neurotoxic venoms such as those of most elapids.[13][14][15] Developed by Struan Sutherland in 1978,[16] pressure immobilization aims to contain venom within a bitten limb and prevent it from moving through the lymphatic system to the vital organs in the body core. This therapy has two components: pressure to prevent lymphatic drainage, and immobilization of the bitten limb to prevent the pumping action of the skeletal muscles. Pressure is preferably applied with an elastic bandage, but any cloth will do in an emergency. Bandaging begins two to four inches above the bite (i.e., between the bite and the heart), winding around in overlapping turns and moving up towards the heart, then back down over the bite and past it towards the hand or foot. The limb must then be held immobile: not used, and if possible held with a splint or sling. The bandage should be about as tight as when strapping a sprained ankle. It must not cut off blood flow, or even be uncomfortable; if it is uncomfortable, the patient will unconsciously flex the limb, defeating the immobilization portion of the therapy. The location of the bite should be clearly marked on the outside of the bandages. Some peripheral edema is an expected consequence of this process.

Apply pressure immobilization as quickly as possible; if you wait until symptoms become noticeable you will have missed the best time for treatment. Once a pressure bandage has been applied, it should not be removed until the patient has reached a medical professional. The combination of pressure and immobilization can contain venom so effectively that no symptoms are visible for more than twenty-four hours, giving the illusion of a dry bite. But this is only a delay; removing the bandage releases that venom into the patient's system with rapid and possibly fatal consequences.

Outmoded treatments

The following treatments have all been recommended at one time or another, but are now considered to be ineffective or outright dangerous, and should not be used under any circumstances. Many cases in which such treatments appear to work are in fact the result of dry bites.

Old-style snake bite kit that should not be used.

Application of a tourniquet to the bitten limb is not generally recommended.

There is no convincing evidence that a tourniquet is an effective first-aid tool as generally applied.[17] Tourniquets have been found to be completely ineffective in the treatment of Crotalus durissus bites,[18] but some positive results have been seen with properly applied tourniquets for cobra venom in the Philippines.[19] Uninformed tourniquet use is dangerous, since reducing or cutting off circulation can lead to gangrene, which can be fatal.[17]

Cutting open the bitten area, often done prior to suction, is not recommended (see also below), since it causes damage and increases the risk of infection.

Sucking out venom, either by mouth or with a pump, does not work and may harm the affected area directly.[20] Suction started after three minutes removes a clinically insignificant quantity (less than one thousandth of the venom injected), as shown in a human study.[21] In a study with pigs, suction not only caused no improvement but led to necrosis in the suctioned area.[22] Suctioning by mouth presents a risk of further poisoning through the mouth's mucous tissues.[23] The well-meaning family member or friend may also release bacteria into the victim's wound, leading to infection.

Immersion in warm water or sour milk, followed by the application of Snake-Stones (also known as Black Stones or la Pierre Noire), which are believed to draw off the poison in much the way a sponge soaks up water.

Application of potassium permanganate.

Use of electroshock therapy. Although still advocated by some, animal testing has shown this treatment to be useless and potentially dangerous.[24][25][26][27]

In extreme cases, where the victims were in remote areas, all of these misguided attempts at treatment have resulted in injuries far worse than an otherwise mild to moderate snakebite. In worst case scenarios, thoroughly constricting tourniquets have been applied to bitten limbs, thus completely shutting off blood flow to the area. By the time the victims finally reached appropriate medical facilities their limbs had to be amputated.

Drug interactions

A drug interaction is a situation in which a substance affects the activity of a drug, i.e., the effects are increased or decreased, or the two produce a new effect that neither produces on its own. Typically, interactions between drugs come to mind (drug-drug interactions). However, interactions may also exist between drugs and foods (drug-food interactions), as well as between drugs and herbs (drug-herb interactions).

Generally speaking, drug interactions are avoided, due to the possibility of poor or unexpected outcomes. However, drug interactions have been deliberately used, such as co-administering probenecid with penicillin prior to mass production of penicillin. Because penicillin was difficult to manufacture, it was worthwhile to find a way to reduce the amount required. Probenecid retards the excretion of penicillin, so a dose of penicillin persists longer when taken with it, and it allowed patients to take less penicillin over a course of therapy.
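
The probenecid example is essentially pharmacokinetic: slowing excretion lengthens the drug's elimination half-life, so a given dose persists longer. Assuming simple first-order elimination (and purely illustrative half-life values, not clinical data), the effect can be sketched as:

```python
import math

def concentration(c0, half_life_h, t_h):
    """First-order elimination: C(t) = C0 * exp(-k*t), with the
    elimination rate constant k = ln(2) / half-life. Half-life
    values used below are illustrative, not clinical data."""
    k = math.log(2) / half_life_h
    return c0 * math.exp(-k * t_h)

# penicillin alone vs. with probenecid slowing renal excretion
alone = concentration(c0=100.0, half_life_h=0.5, t_h=2.0)
with_probenecid = concentration(c0=100.0, half_life_h=1.5, t_h=2.0)
print(round(alone, 1), round(with_probenecid, 1))
```

Tripling the half-life in this toy calculation leaves roughly six times as much drug in circulation two hours after dosing, which is the sense in which probenecid let patients "take less penicillin over a course of therapy."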

A contemporary example of a drug interaction used as an advantage is the co-administration of carbidopa with levodopa (available as Carbidopa/levodopa). Levodopa is used in the management of Parkinson's disease and must reach the brain in an un-metabolized state to be beneficial. When given by itself, levodopa is metabolized in the peripheral tissues outside the brain, which decreases the effectiveness of the drug and increases the risk of adverse effects. However, since carbidopa inhibits the peripheral metabolism of levodopa, the co-administration of carbidopa with levodopa allows more levodopa to reach the brain un-metabolized and also reduces the risk of side effects.

Drug interactions may be the result of various processes. These processes may include alterations in the pharmacokinetics of the drug, such as alterations in its absorption, distribution, metabolism, and excretion (ADME). Alternatively, drug interactions may be the result of the pharmacodynamic properties of the drug, e.g., the co-administration of a receptor antagonist and an agonist for the same receptor.

Metabolic drug interactions

Many drug interactions are due to alterations in drug metabolism.[1] Further, human drug-metabolizing enzymes are typically activated through engagement of nuclear receptors.[1]

One notable system involved in metabolic drug interactions is the enzyme system comprising the cytochrome P450 oxidases. This system may be affected by either enzyme induction or enzyme inhibition, as discussed in the examples below.

Enzyme induction - drug A induces the body to produce more of an enzyme which metabolises drug B. This reduces the effective concentration of drug B, which may lead to loss of effectiveness of drug B. Drug A effectiveness is not altered.

Enzyme inhibition - drug A inhibits the enzyme that metabolises drug B, thus elevating levels of drug B, possibly leading to an overdose.

Bioavailability - drug A influences the absorption of drug B.

The examples described above may have different outcomes depending on the nature of the drugs. For example, if drug B is a prodrug, then enzymatic activation is required for the drug to reach its active form. In that case, enzyme induction by drug A would increase the effectiveness of drug B by increasing its metabolism to its active form, while enzyme inhibition by drug A would decrease the effectiveness of drug B.
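
The induction/inhibition cases above, including the reversal for prodrugs, can be captured in a small decision sketch. The function name and its qualitative outputs are hypothetical, for illustration; this is not a clinical tool:

```python
def active_drug_exposure(is_prodrug, interaction):
    """Qualitative direction of the change in *active* drug exposure
    for drug B when drug A induces or inhibits the metabolising
    enzyme, following the cases described in the text."""
    assert interaction in ("induction", "inhibition")
    faster_metabolism = (interaction == "induction")
    if is_prodrug:
        # metabolism converts the prodrug into its active form,
        # so faster metabolism means more active drug
        return "increased" if faster_metabolism else "decreased"
    # metabolism clears the active drug,
    # so faster metabolism means less active drug
    return "decreased" if faster_metabolism else "increased"

print(active_drug_exposure(is_prodrug=False, interaction="induction"))  # decreased
print(active_drug_exposure(is_prodrug=True, interaction="induction"))   # increased
```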

Additionally, Drug A and Drug B may affect each other's metabolism.