Keeping troops healthy and happy

For most of recorded history, the greatest threat to the armed forces has come not from battlefield injuries but rather from infectious diseases, for which effective drugs were needed.

In the American Revolution, the Civil War, and the Spanish-American War, twice as many soldiers died from disease as from battle.1,2 In World War I, the U.S. Army’s deaths from disease and from battle were roughly equal. In World War II, only half as many died from disease, and in Vietnam, only one-fifth as many.1 Advances in pharmacology accounted for these dramatic reductions in mortality. But soldiers still get sick.

Roman innovations

The Romans were the first to recognize the debilitating effect of epidemics and understood the importance of sanitation to prevent disease.3 Army camps protected the water supply by locating latrines downstream. As the Roman Empire expanded to far-flung regions, the legionnaires established a sophisticated network of military hospitals. Those hospitals were separated from the Legion’s main garrisons and were also organized with attention to sanitation.4

Consequently, the Roman armies suffered somewhat less from epidemic infections than their opponents, but two-thirds of their casualties were still due to disease. Unfortunately, the Roman innovations did not outlive the empire and would not resurface for almost 2000 years.4,5

Reviving the innovations

Effective medical practices re-emerged during the Napoleonic Wars (1794–1815). Baron Dominique-Jean Larrey, Napoleon’s chief surgeon, is credited with implementing many of those innovative practices, including a system for triage, “flying ambulances,” and organized field hospitals.3,6

Until the late 18th century, combat officers did not think it was worth risking the lives of healthy soldiers to retrieve the wounded. Larrey changed that, creating a dedicated medical corps to evacuate injured soldiers during battle.6

Stretcher-bearers carried the wounded to ambulances volantes (“flying ambulances”). Larrey’s vehicles were horse-drawn carts with spring suspensions that could accommodate four patients. The flying ambulance was, in fact, a mobile treatment center with padded mattresses, surgical instruments, supplies, and a dedicated medical staff who could treat the wounded on-site, as well as transport the seriously wounded to nearby field hospitals.5,6

Larrey said that the dangerously wounded must be treated first without regard to rank—a novel concept at the time. “Triage” comes from the French trier, meaning “to sort.” Patients are prioritized according to the urgency or importance of treatment. In a nod to Larrey, the French term is still used.4–6
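
Larrey’s principle translates naturally into a priority sort. The sketch below is only a modern illustration of that idea, not a reconstruction of his procedure; the names, ranks, and severity scale are invented.

    from dataclasses import dataclass

    @dataclass
    class Casualty:
        name: str       # invented example data
        rank: str
        severity: int   # higher number = more urgent wound

    def triage(casualties):
        """Return casualties in treatment order: most urgent first, rank ignored."""
        return sorted(casualties, key=lambda c: c.severity, reverse=True)

    for c in triage([Casualty("Dubois", "General", 2),
                     Casualty("Moreau", "Private", 5),
                     Casualty("Petit", "Sergeant", 3)]):
        print(c.severity, c.rank, c.name)

Because the sort key is severity alone, the gravely wounded private is treated before the lightly wounded general, which is exactly the point Larrey was making.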

Larrey ended the Napoleonic campaign as a hero for his efforts, and his innovations were widely adopted. By the early 19th century, Western military medicine had equaled, and perhaps surpassed, the medicine of the Roman Legions, but there was still a long way to go.4

 

Baron Dominique-Jean Larrey (1766–1842), 1804, portrait by his sister-in-law, Marie-Guillemine Benoist.

Smallpox mandate

In the 18th and 19th centuries, as in Roman times, poor sanitation in stationary camps led to infectious diseases such as typhus, dysentery, plague, and smallpox. Physicians had a variety of folk medicines, but the cause of infections was unknown, and those drugs were ineffective.5

During the American Revolution, more soldiers in the Continental Army died in the hospital than in battle. Although malnutrition and frostbite were widespread, about 25% of the hospitalized patients died from pneumonia, dysentery, or smallpox infections.5

General George Washington instituted policies on camp hygiene and preventative medicine to preserve the health of his troops. At Valley Forge, he wrote detailed instructions for construction of the barracks in a grid pattern, including provisions for adequate sanitation.2

The most significant preventative measure was his directive regarding smallpox immunization. Variolation (inoculating people with pus taken from a smallpox patient’s pustules) had been introduced in England decades earlier. The procedure immunized people against smallpox, but it had a 10% mortality rate and was controversial.5

Despite the controversy, Washington said that the army had more to dread from smallpox than from the sword of the enemy.2 In January 1777, he boldly ordered the entire Continental Army to be inoculated against smallpox.2,5 In addition, he required ongoing inoculation of all incoming recruits and even of local citizens with whom his troops came in contact. It was the first time in history that a commander had ordered mandatory immunization of an entire army.2,7

Malaria

Surgeons in the Civil War (1861–1865) frequently used general anesthesia, but post-operative infections were still common, and surgical outcomes remained dismal. And because large numbers of men lived in crowded, unsanitary conditions, infectious diseases also spread rapidly. On both sides, disease was a greater threat than battle injury.5

Measles, tonsillar abscesses, and upper respiratory infections were common. Deteriorating camp conditions worsened sanitation and led to dysentery, typhoid, lice-borne typhus, skin infections, and mite-borne scabies. Venereal disease was also common, and the Union Army reported 1,315,000 cases of malaria.5

Since the 16th century, quinine, extracted from the bark of the Peruvian cinchona tree, had been effective in treating and preventing malaria.5,8 In the 1860s, British and Dutch adventurers stole seeds from Peru and started cinchona plantations in Java, which subsequently supplied 95% of the world’s quinine. The Union Army’s rations often included whiskey spiked with quinine sulfate.5

The brief Spanish-American War (1898) was fought in the tropics, notably Cuba and the Philippines. So many American soldiers died from typhoid, yellow fever, and malaria that the public demanded better medical care of the troops.1

After the war, the U.S. Army set up the Yellow Fever Commission (also called the Reed Commission) to investigate and make recommendations.1,2,5 Major Walter Reed, an Army surgeon, was chairman of the Commission.

Major Walter Reed, MD.

Drawing on the work of Cuban physician Carlos Finlay, the Reed Commission devised a series of ingenious experiments that documented the first known viral cause of a human disease and showed that the yellow fever virus was transmitted by mosquitoes.5,9

In parallel, British army surgeon Ronald Ross had unraveled the complex life cycle of the Plasmodium parasite that caused malaria and proved that the mosquito was the vector responsible for malaria transmission.5,8

Along with General William Gorgas, Reed developed ways to prevent the spread of malaria and yellow fever.2,9 The procedures largely consisted of mosquito eradication through fumigation, water drainage, and sanitation.5

Better infection management

In World War I, recruits from densely populated urban areas had already acquired significant immunity as children, but those from sparsely populated rural areas had not.5 When the two groups mixed in close quarters at boot camp, outbreaks of childhood diseases such as measles, mumps, meningitis, and scarlet fever were inevitable.

By this time, several infectious diseases (smallpox, typhoid, tetanus, diphtheria, and some types of dysentery) could be prevented with vaccines. Insect-borne diseases such as typhus and trench fever were partly controlled with insecticides; the newly developed DDT would prove especially effective in the next war.5 In addition, antiseptic surgical techniques and instrument sterilization helped prevent post-operative wound infections.9

In World War II, there were significant advances in managing infectious diseases, especially those resulting from poor sanitation and aggravated by dietary deficiency. For example, in the North African campaign, a typhus epidemic in the civilian population threatened the Allied forces. The U.S. Department of Agriculture developed a typhus vaccine, and the U.S. Army’s mass vaccination of the 500,000 deployed soldiers limited the number of their typhus cases to just eleven.5

Allied troops in Europe were vaccinated not only against typhus but also typhoid, paratyphoid, and smallpox.5,9

In the Pacific theater, combat operations were in areas where tropical infections were endemic. Soldiers suffered from dengue fever (an untreatable, incapacitating, but usually self-limited viral disease), fungal skin diseases (“jungle rot”), and diarrhea, in addition to malaria.5

At the beginning of World War II, quinine was the drug of choice for treating malaria. When the Japanese seized the Dutch East Indies (including Java) in 1942, the Allied forces were left without a reliable supply.5,8

Quinine has a number of unpleasant side effects, including vomiting, visual disturbances, ringing in the ears progressing to deafness, headache, and rash. In an effort to find a better alternative, German scientists in 1931 had developed mepacrine (also known as quinacrine), which was marketed as Atabrine.5

Atabrine was remarkably effective in preventing malaria in Burma and New Guinea. But soldiers had to be forced to take it because it tasted bitter, often caused vomiting, tended to turn the skin yellow, and occasionally caused psychosis.5

World War II Atabrine tablets for malaria.

In 1943, clinicians found that these side effects could be minimized by giving a large loading dose followed by much smaller maintenance doses, a practice that had the additional advantage of requiring less of the scarce drug.5
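
The logic of that regimen follows standard pharmacokinetics: a loading dose fills the drug’s volume of distribution to the target concentration all at once, while each maintenance dose only replaces what the body clears. The sketch below illustrates the arithmetic with hypothetical numbers; they are not Atabrine’s actual parameters.

    # Standard loading-dose arithmetic; all numbers are hypothetical.
    def loading_dose(c_target, vd):
        """Dose (mg) that brings the volume of distribution to the target level at once."""
        return c_target * vd

    def maintenance_dose(c_target, clearance, interval_h):
        """Dose (mg) per interval that replaces only the drug cleared during that interval."""
        return c_target * clearance * interval_h

    c_target = 0.03    # mg/L, hypothetical target plasma concentration
    vd = 700.0         # L, hypothetical volume of distribution
    clearance = 1.5    # L/h, hypothetical clearance

    print(loading_dose(c_target, vd), "mg once")                     # 21.0 mg
    print(maintenance_dose(c_target, clearance, 24), "mg per day")   # 1.08 mg per day

For a drug with a large volume of distribution and slow clearance, the daily maintenance dose works out to a small fraction of the loading dose, which is how the 1943 regimen conserved the scarce drug while keeping blood levels protective.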

Malaria is still prevalent in tropical regions, and so far, efforts to develop a malaria vaccine have been only partly successful. In the meantime, antimalarial drugs are used to treat infected patients and minimize symptoms.

Sulfa drugs

One of the most important medical advances during World War II was the introduction of antibiotics.5 In 1932, chemists at IG Farben synthesized Prontosil, the first truly effective antibacterial drug.5,10 In 1935, scientists at the Pasteur Institute in Paris discovered that Prontosil is metabolized to sulfanilamide, a much simpler compound that could be applied topically or taken orally.5

The limited antimicrobial spectrum of Prontosil and sulfanilamide prompted research to find broader-spectrum derivatives. These analogs, collectively referred to as sulfa drugs, were effective against streptococci, staphylococci, meningococci, and gonococci.5,10

From the first months of World War II, sulfa drugs were used both prophylactically and therapeutically. Medics routinely dusted fresh wounds with sulfa powder. Physicians prescribed sulfa tablets for gonorrhea, and the drugs’ effectiveness against dysentery gave American troops a significant advantage in the Pacific theater.5

President Franklin Roosevelt was given a sulfa drug to treat a skin infection, and Prime Minister Winston Churchill took one to treat pneumonia. Unfortunately, the indiscriminate, widespread use of sulfa drugs led to the early development of bacterial resistance.5,10 Penicillin stepped in to take their place.

Penicillin

In 1928, Alexander Fleming, a microbiologist at St. Mary’s Hospital in London, serendipitously discovered a mold that killed his cultures of staphylococcus.5 He called the active agent in the mold “penicillin.” It also killed streptococci, pneumococci, gonococci, meningococci, and diphtheria bacilli.10

Alexander Fleming, MD at work in his lab at St. Mary’s, Paddington, London, 1943.

Unfortunately, penicillin proved to be extraordinarily difficult to extract, and the mold yielded only tiny amounts in culture.5 Those quantities were insufficient for Fleming to test penicillin’s efficacy against bacterial infections in animals and patients.10

Then, in 1940, Howard Florey, along with Ernst Chain and Norman Heatley at Oxford University, managed to extract enough penicillin to successfully treat a bacterial infection in four lab mice.5,10

Howard Florey, MD, 1930s.

On February 1, 1941, they conducted the first clinical test. An Oxford policeman had developed a staphylococcal infection from a cut sustained while shaving. Penicillin immediately knocked down the infection, but Florey did not have enough penicillin to continue treatment. The policeman relapsed and died.5,10

The Oxford group published their production and purification procedures, which allowed several British drug companies to produce laboratory-scale amounts of penicillin. American scientists had better fermentation facilities and expertise for harvesting penicillin, and they were not hampered by wartime bombing.5,10

In the spring of 1941, Florey and Heatley traveled to the U.S. to assist with scaled-up production of penicillin. Then Merck, Pfizer, and Squibb were contracted by the U.S. government to produce penicillin commercially. Still, penicillin remained expensive, and demand far exceeded supply. The standard treatment procedures included recovering the drug from the patients’ urine and reusing it.5,10

Because of the scarcity, the U.S. Army prioritized giving penicillin to soldiers with venereal disease rather than those with severe wounds. The rationale was that the former were more likely to return to combat than the latter.5

Fermentation efficiency and yields steadily improved, and by D-Day, drug companies were producing enough penicillin to treat every British and American casualty during the invasion.5,10 By the end of the war, penicillin had largely replaced sulfa drugs. But sulfa drugs were still prescribed to treat burns, venereal disease, and urinary tract infections.5 In May 1945, penicillin was released to the civilian population at a price of 55 cents per dose.5

Vaccinations

In Korea, infectious and parasitic diseases accounted for 90% of cases among ambulatory soldiers with non-combat medical issues. The most common sick-call complaints were respiratory disease, fevers, and diarrhea. Other infectious conditions included encephalitis, polio, hemorrhagic fever, hepatitis, and venereal disease.5

Infections were also the greatest problem in Vietnam, largely because of the tropical climate. The most common were malaria, viral hepatitis, skin diseases, acute respiratory infections, diarrhea, and venereal disease. It was estimated that 70% of hospital admissions were due to insect-borne diseases, whereas combat injuries accounted for only about 16% of hospitalizations.

Drug-resistant malaria and drug-resistant venereal disease were particularly serious problems. Still, hospital admissions resulting from infectious diseases in Vietnam were significantly lower than they had been in the Pacific theater during World War II, mostly because of better preventative medicine.5

In the second half of the 20th century and early 21st century, infectious disease became a minor contributor to military morbidity and mortality, due to a wider range of antibiotic options and the large number of available vaccines.5 All U.S. military personnel now receive mandatory vaccinations for eleven infectious diseases, and the U.S. Department of Defense provides eight additional vaccines, depending on the service member’s relative risk of exposure.11

Among the most recent innovations are gel bandages infused with silver nanoparticles. Ukrainian researchers have shown that these bandages are microbicidal and can be applied immediately after injury on the battlefield to thwart infection.12

Pain killers

Since ancient times, opium has been one of the few effective medicines, widely used to relieve pain.5,6,13 By the late 17th century, a typical military surgeon’s field chest contained opium, along with a variety of questionable folk remedies.5

In 1832, Rosengarten and Sons of Philadelphia (a predecessor of Merck Sharp & Dohme) began producing morphine, which was extracted from the opium poppy. By the beginning of the Civil War, a variety of opium preparations were available to treat pain, including laudanum (tincture of opium), powdered opium (9–12% morphine), and opium gum. Paregoric (camphorated tincture of opium) was prescribed to manage diarrhea, and Dover’s powders (10% opium and 10% ipecac) were taken to treat colds and malarial fevers.5

The introduction of the hypodermic needle and syringe in the mid-1800s made injected morphine such an effective pain reliever that it went from prescribed use in field hospitals to administration by medics on the battlefield. Battlefield injections not only lessened pain but also minimized the likelihood of shock during evacuation. Soldiers suffering chronic pain were also given the drug and syringes for self-administration.5 During World War I, the U.S. Army supplied “hypo units,” so that a soldier could administer morphine to himself with one hand. Several hundred thousand hypo units were produced during the war.5

In 1938, Squibb, at the request of the U.S. Army and Navy, developed an improved hypo product, called the syrette. It was a sealed squeeze tube containing morphine with a needle on one end. After injection, the syrette was often pinned to the soldier’s collar to inform others of the dose administered. Ultimately, 75 million syrettes were produced during World War II.5

Morphine syrette used by the U.S. Army during World War II (capacity 1.5 cc).

Liquid courage

Besides treating infectious diseases and dulling pain, the military has also taken advantage of drugs to maximize combat success. For thousands of years, alcohol provided soldiers with liquid courage, helping them brace for battle, as well as numb their injuries.13

George Washington, who owned four stills at Mount Vernon, considered alcohol essential and ensured that his army’s rations included rum. At Valley Forge, during the harsh winter of 1777, he doubled the troops’ rum rations. During the Civil War, whiskey helped both sides soldier on.13

When the U.S. entered World War I, General John Pershing arranged for his soldiers to have access to light wines and beers. The trenches along the Western Front in France were said to be drenched in alcohol.13

In World War II, the U.S. government considered alcohol so important to troop morale that the brewing industry was instructed to allocate 15% of its production to the military. During the Korean War, General Douglas MacArthur ordered that soldiers were to receive one free can of beer daily.13

In Vietnam, American soldiers received a beer ration of two cans per day, and they could also purchase up to three cases of beer per month at the post exchange. In fact, in some ways, heavy drinking was actually encouraged. Many officers rewarded kills with free alcohol. Despite the extensive publicity of illicit drug use, alcohol was actually the most abused drug in Vietnam.13

Of course, when used to excess, alcohol desensitized soldiers, making them unreliable and even self-destructive. The current military hierarchy has become less tolerant of alcohol consumption. Military operations are now more complex and rely on increasingly sophisticated machinery.13 Those who operate these high-tech systems need clear heads and rapid reflexes.

Caffeine

A more benign drug to boost performance was caffeine. It helped soldiers fight fatigue and enhanced their energy and stamina.13

In colonial America, tea was the most popular and ubiquitous caffeinated beverage. But because the British had given the East India Company a monopoly on tea imports, colonists were cut off from access to cheaper Dutch tea.13

Enterprising colonial merchants invested heavily in smugglers of the illicit Dutch tea. They circumvented the British duty on East India tea and enjoyed huge profits. About 80% of the tea consumed by colonists in Boston had been smuggled in, and the colonists backed their local merchants, ultimately instigating the Boston Tea Party.13

To support colonial interests, the Continental Congress passed a resolution against tea, and coffee consumption jumped sevenfold.13 Thanks to the American Revolution, coffee has remained dominant in the U.S. ever since.

In 1832, coffee was officially added to American soldiers’ rations, and it was the most valued item in the Union soldiers’ kit during the Civil War. Confederate soldiers, on the other hand, were forced to fight without caffeine, due to the Union’s blockade of southern ports.13 They turned to chicory, a non-caffeinated substitute.

At the Battle of Antietam in 1862, young William McKinley braved enemy fire to haul vats of hot coffee to his exhausted companions. Their commander described the magical effect of the hot brew as the equivalent of “putting a new regiment in the fight.”13 Three decades later, McKinley became president. A monument at Antietam depicts Private McKinley offering a cup of coffee to a soldier.

McKinley “Coffee Break” Monument in Wilmington, Delaware.

During World War I, the U.S. Army was a leading coffee consumer, ultimately roasting 750,000 pounds of coffee beans per day for the troops. The war also brought instant coffee to the Front. Packets of soluble coffee became part of the doughboys’ rations.13

In World War II, coffee became so closely identified with soldiers that it took on the name of the GIs: a “cuppa Joe.” Instant coffee demand spiked. It was distributed globally in small envelopes for individual servings and in olive-drab cans for group use. On the Homefront, the “coffee break” was introduced: a short rest and caffeine boost to stimulate the productivity of American defense workers.13

U.S. World War II Faust instant coffee can.
 


Consumption of other caffeinated beverages also soared during World War II. Coca-Cola, the pioneer of caffeinated soft drinks, was named for the coca leaf and the kola nut. Coca leaves contain small quantities of cocaine, which the company removed from its Coca-Cola recipe early in the 20th century. Kola nuts contain about 2% caffeine, which remained in the beverage.13

To accommodate the GIs, the Coca-Cola company built and maintained 64 bottling facilities around the world and served 10 billion Cokes. The company’s technicians also developed a portable soda fountain for use in the jungle and a slim version that fit in submarines.13

Men of the 133rd Field Artillery Battalion, Battery C, of the 36th Division in the front lines receive Coca-Cola, the first they have received in over a year. 2 March 1944, San Michele Area, Italy. National Archives (111-SC-412503)

Over the years, military researchers developed creative ways of delivering caffeine hits, including caffeinated chewing gum, caffeinated beef jerky, mocha-flavored caffeinated energy bars, caffeinated applesauce and apple pie, and caffeinated chocolate pudding.13

Any item labeled “First Strike” in a Meal Ready to Eat (MRE) was likely heavily spiked with caffeine. In Afghanistan and Iraq, the most popular jolt was provided by caffeinated energy drinks such as Red Bull and Rip It. One Iraq War veteran said, “When platoons had to go on extended patrols, their supply sergeants often did not ask for more MREs but instead asked for more Rip It.”13

The speed of “Speed”

Amphetamine appeared on the scene more recently, and it quickly became popular with fighting forces.

Gordon Alles, a young American chemist, synthesized amphetamine at UCLA in 1927.13 He sold his synthetic process to Smith, Kline & French. In 1932, SKF first marketed amphetamine under the brand name Benzedrine, an inhaled product that was sold over the counter to treat asthma and congestion. A few years later, SKF introduced Benzedrine tablets (“Bennies”), which were aggressively promoted for all sorts of ailments from depression to obesity.13

At the Berlin Olympic Games in 1936, athletes used Benzedrine as a doping agent. The following year, Friedrich Hauschild, a chemist at Temmler-Werke in Germany, synthesized the amphetamine analog, methamphetamine, which was sold under the brand name Pervitin. It became wildly popular and could be found in a range of products, including boxed chocolates spiked with methamphetamine.7,13

Nazi ideology opposed the use of drugs, which were considered both a sign of personal weakness and a symbol of moral decay. Weak people took drugs like opium to escape.7,13 But methamphetamine was the exception.

The German military tested Pervitin and concluded it was “a militarily valuable substance.”13 It stimulated the central nervous system, increased wakefulness and a sense of well-being, and reduced fatigue and appetite. Those were powerful war-facilitating psychoactive effects. One military researcher took daily doses himself and proclaimed, “With Pervitin, you can go on working for 36–50 hours without feeling any noticeable fatigue.”7,13

Pervitin’s effects played into the Third Reich’s obsession with physical and mental superiority. Strong people took methamphetamine to feel even stronger.7,13 It was the perfect Nazi drug. The German Blitzkrieg depended on speed and surprise, requiring soldiers to forgo sleep and yet stay alert. In addition to tablets, Pervitin was dispensed to pilots and tank crews in the form of chocolate bars known as Fliegerschokolade (flyer’s chocolate) and Panzerschokolade (tanker’s chocolate), respectively.7,13

 


Stimulant Pervitin tablets—better known today as crystal meth—were issued to German soldiers during World War II.

High on Pervitin, German tank and artillery drivers made remarkable advances night and day, almost without stopping. And German airmen used Pervitin to stay alert during their long nighttime bombing raids over Britain. “Speed,” in large part, accounted for the speed and success of Blitzkrieg.13

As early as 1940, though, German medical leaders recognized the addictive potential and life-threatening side effects of excessive Pervitin consumption, and they issued guidelines to restrict its use. But individual commanders and medical officers remained free to dispense it at their discretion. Pervitin use actually increased, and Temmler-Werke remained profitable.7,13

Battling the Red Army and surviving the extreme Russian winter of 1941–1942 were more immediately important than the long-term health risks of Pervitin. The Wehrmacht’s medical services sent 10 million methamphetamine tablets to the Eastern Front in early 1942.13

A German advertisement for Pervitin from the 1940s.

British and American troops also became enthusiastic pill poppers. Royal Air Force pilots took Benzedrine during their high-altitude nighttime bombing raids over Germany. They remained alert and less risk-averse while dropping their bombs and stayed awake on the long late-night flight home. Benzedrine distribution to all British aircrews was officially adopted in 1942. General Bernard Montgomery also fostered regular distribution of Benzedrine to his troops in North Africa.13

By 1943, Benzedrine tablets were included in the emergency kits of American bombers. Benzedrine was also in the U.S. Army and Navy medical kits, as well as in the field kits of the U.S. Marines in the South Pacific.13

Japan’s imperial government contracted methamphetamine production to Japanese pharmaceutical companies. Methamphetamine pills were distributed to pilots for long flights and to soldiers in combat. Kamikaze pilots, in particular, injected large doses of methamphetamine before their suicide missions. They were also given tablets that contained methamphetamine and were stamped with the emperor’s crest.13

During the Korean War, SKF produced dextroamphetamine (Dexedrine), which is almost twice as potent as Benzedrine. It was standard issue in the U.S. Army.13 In Vietnam, Dexedrine was handed out freely to U.S. soldiers and Navy SEALs, with little regard to dose or frequency of use. More than 225 million doses of Dexedrine were supplied to the troops. But by 1971, due to increased safety concerns, amphetamines were removed from survival kits.13

During the Gulf War (1990–1991), almost two-thirds of U.S. pilots took amphetamines. Without their “go pills,” they would have fallen asleep during the 15-hour flight across 5–6 time zones.13

Despite their controversial use and strict controls, amphetamines continue to be favored by the military.13 Modern technology requires long hours of focused attention, and amphetamines counter the constant threat of fatigue. In addition, the greater emphasis on nighttime operations and long mission durations threatens performance. Those operations run counter to circadian rhythms, and amphetamines are “useful tactical adjuncts.”13

Combating combat stress

During the Civil War, they called it soldier’s heart. In World War I, it was shell shock. In World War II, combat fatigue.1,5,14,15 In Korea, because of the cold weather operations, the symptoms were even more complex and puzzling. In Vietnam, it was the tropical environment and jungle warfare that triggered those same complex and puzzling symptoms.1 But by whatever name, this syndrome has always incapacitated a significant portion of the armed forces.

The “melancholia” experienced by Civil War soldiers was sometimes expressed as lethargy or excessive emotionality. Physicians diagnosed the symptoms variously as exhaustion, heart conditions, and “cardiac muscular exhaustion.”15 They attributed it to the heavy packs that the soldiers carried, homesickness, or poor motivation in soldiers who had unrealistic expectations of war.15 Little could be done to treat these casualties.

During World War I, shell shock was attributed to traumatic brain injury from the concussive force of artillery shells.1 The concussions supposedly caused small brain hemorrhages, leading to cerebral dysfunction.5 Symptoms included tremors, tics, an exaggerated startle response, fatigue, memory loss, difficulty sleeping, nightmares, delusions, hallucinations, and catatonia.5,15 Shell shock was seen in 20–30% of the battlefield casualties.5

At field hospitals near the Western Front, physicians treated shell shock by giving the casualties sedatives, rest, and reassurance.5 After a month of rest and recuperation, about 85% of these soldiers were returned to the Front. Although this practice seems harsh, it tended to prevent long-term sequelae.1

The term combat fatigue was coined during World War II.1,5 Military doctors described the condition variously as exhaustion, battle exhaustion, war neurosis, cardiac neurosis, or psychoneurosis. The symptoms included excessive fatigue, an exaggerated startle response, tremors, violence, nightmares, delusions, hallucinations, and catatonia.5,15 The syndrome seemed best captured in a 1944 portrait of a U.S. Marine on Peleliu, titled “The Two-Thousand Yard Stare.”

War artist Thomas Lea’s “The Two-Thousand Yard Stare.”

Unlike in previous wars, World War II physicians attributed these symptoms to psychiatric illness rather than to physical brain damage.1,5 But despite attempts to exclude men with psychiatric illness from military service, combat fatigue still accounted for about 30% of Allied combat casualties.5

After a comprehensive review of combat fatigue cases, military psychiatrists concluded that soldiers who were continuously exposed to combat would eventually become nonfunctional. The data suggested that 200 days of constant action was the longest that any soldier could tolerate, and they advocated limiting active combat to 180 days.5

An exhausted U.S. Marine exhibits the two-thousand-yard stare after two days of constant fighting at the Battle of Eniwetok, February 1944.

Enter: PTSD

Because of the protracted fighting in Vietnam, many veterans subsequently developed chronic psychological problems. The National Vietnam Veterans Readjustment Survey methodically documented these cases and highlighted the syndrome as a signature “wound” of the Vietnam War.15

Months after physical or psychological trauma, these soldiers developed symptoms such as feelings of fear, hopelessness, or horror. They tended to reexperience the triggering event in the form of nightmares or flashbacks. Overall, about 15% of Vietnam veterans experienced chronic psychological problems.5

It was generally believed that self-medication with alcohol was a reasonable and possibly effective treatment. Not surprisingly, alcoholism among veterans of both Korea and Vietnam emerged as a serious public health problem. Up to 75% of Vietnam veterans who suffered chronic psychological symptoms also abused alcohol.5

In 1980, the American Psychiatric Association formally acknowledged these symptoms as a distinct mental health condition, which it called “posttraumatic stress disorder” (PTSD). PTSD, including its characteristic symptoms and diagnostic criteria, was added to the third edition of the APA’s Diagnostic and Statistical Manual of Mental Disorders (DSM-III).15

For decades, physicians and psychiatrists have tried to find effective ways to treat PTSD. The numerous psychosocial treatments include exposure therapy and cognitive behavioral therapy. Drug therapies include antidepressants, anticonvulsants, and benzodiazepines, as well as miscellaneous neurological drugs like gabapentin, propranolol, and buspirone. These treatments have met with limited success, and even patients who initially showed improvement have experienced varying degrees of relapse.15

 


Research is ongoing at the U.S. Veterans Administration to find better evidence-based treatments for service members, veterans, and their families.15 Currently, psychedelic drugs, particularly psilocybin and MDMA, are being investigated as adjuncts to psychotherapy. But it has been difficult to obtain controlled data, because participants and investigators can typically tell whether the veteran received the psychedelic drug or a placebo.14

Ukrainian researchers are also actively investigating the causes and treatment of PTSD. According to one researcher, “Unfortunately, there is no shortage of PTSD sufferers in Ukraine.”12

Chemical and biological countermeasures

Poison gas caused massive casualties on both sides in World War I. Protection against these agents consisted mainly of gas masks, which were sometimes effective. Treatment was limited to palliative measures.

The horrendous casualties in World War I led to a general moratorium on the use of chemical warfare. Such weapons saw little use in World War II and subsequent wars, but mutual distrust caused all nations to continue research. The stated objective was to develop protective countermeasures and treatments if “the other side” used chemical weapons. Because of the continuing potential hazard, troops deployed to areas where chemical weapons might be used are still issued autoinjectors of atropine and pralidoxime to counter the effects of nerve agents.5

Following the attacks of September 11, 2001, envelopes containing anthrax spores sent through the U.S. mail renewed concerns about biological warfare. Novel or mutant infectious organisms, whether accidentally or intentionally produced, continue to pose a threat to national security and military readiness.

Cutting-edge research is ongoing in laboratories at the Walter Reed Army Institute of Research, the Naval Medical Research Center, the U.S. Army Medical Research Institute of Infectious Diseases, and a network of military labs around the world. Their efforts focus not only on ways to prevent and treat high-impact infectious diseases such as anthrax, malaria, dengue fever, and Ebola, but also on how to respond to the next as-yet-unidentified infectious microorganism.2

Author

  • Rebecca J. Anderson

    Rebecca J. Anderson holds a bachelor’s in chemistry from Coe College and earned her doctorate in pharmacology from Georgetown University. She has 25 years of experience in pharmaceutical research and development and now works as a technical writer. Her most recent book is Nevirapine and the Quest to End Pediatric AIDS.
