Civil War Hospital Ship

The U.S.S. Red Rover, a captured Confederate vessel, was refitted as a hospital ship.

Evolution of Civil War Nursing

The evolution of the nursing profession in America was accelerated by the Civil War.

The Practice of Surgery

Amputations were the most common surgery performed during the Civil War.

Army Medical Museum and Library

Surgeon-General William Hammond established The Army Medical Museum in 1862. It was the first federal medical research facility.

Civil War Amputation Kit

Many Civil War surgical instruments had handles of bone, wood or ivory. They were never sterilized.

Sunday, January 24, 2016

History of Medical Injection Devices and Hypodermic Syringes

From: milestonescientic.com

The hypodermic syringe gets its name from two words of Greek origin: "hypo", which means "under", and "derma", which means "skin". It is interesting to note that during the past 150 years, the basic design of the hypodermic syringe has not changed very much. The first hypodermic syringes consisted of a cylinder with a movable plunger inside. Notable improvements included the incorporation of a glass piston within the cylinder to prevent leaks and reduce the chance of infection. As plastics developed, they were incorporated into the design to reduce costs and improve safety. Nonetheless, throughout all these years, the basic design, mechanics and manual operation of the syringe have remained essentially unchanged.

The first known use of a syringe-like device to perform a medical procedure dates back to 900 A.D., when the Egyptian surgeon Ammar ibn Ali al-Mawsili devised a thin, hollow glass tube with suction to remove cataracts from patients' eyes. At that time, syringes were only used to remove objects or fluid from the body, not to inject anything into it.

In 1650, Blaise Pascal invented the concept of a syringe (not necessarily hypodermic) as an application of what is now called Pascal's Law.  Forms of intravenous injection and infusion were used in the early and mid-1830s to treat cholera by the use of intravenous saline, but credit for the invention of the hypodermic syringe for medical purposes goes to Doctor Alexander Wood in 1853.

Dr. Wood, the Secretary of the Royal College of Physicians in Edinburgh, Scotland, is credited with performing the first subcutaneous injections for the relief of pain with a regular syringe, which at the time was used for treating birthmarks, by adding a hollow needle. In 1855 he published a short paper in the Edinburgh Medical and Surgical Journal, titled "A new method for treating neuralgia by the direct application of opiates to painful points". In the paper, he showed that the method was not necessarily limited to the administration of opiates.

In the late 1800s, a prominent surgeon, Doctor William Halsted, applied the use of the hypodermic syringe to dentistry, demonstrating that an interstitial injection of aqueous cocaine resulted in an effective inferior alveolar nerve block; that is, a small amount of anesthetic injected into the trunk of a sensory nerve resulted in a numbing of pain in all of that nerve's branches. This discovery ushered in a new era of local pain management for both medicine and dentistry.

The History of Injecting and the Development of the Syringe

From: exchangesupplies.org

The origins of injecting probably lie in the weapons such as blowpipes and poison tipped darts that have been used for thousands of years to introduce substances into the body - albeit involuntarily for most of the recipients.

At its most basic, a syringe is a type of simple pump, and it is likely that syringe-type devices were produced by many people. The earliest and most common syringe-type device was the ‘clyster’, a device for giving enemas.

It is impossible to be precise about when this developed, and when injecting as we know it began - the origins of the hypodermic syringe are clouded in uncertainty because there were numerous parallel processes of evolution and experimentation that led to the development of devices to inject drugs and medicines.

Because of this, various people have been credited with the 'invention' of the syringe, including Christopher Wren, Robert Boyle and Pascal, and intravenous injection is recorded as early as the 17th century.

The Fergusson syringe of 1853 became the forerunner of the modern syringe when Alexander Wood used it for the subcutaneous injection of opiates for the relief of pain.

The first injections
Wren is the first person recorded to have employed intravenous injecting in Britain.

In 1656 he experimented by injecting dogs with opium and other substances (Macht 1916). Wren’s ‘syringe’ was a crude device, consisting of a quill attached to a small bladder. In order to gain access to a vein, an incision first had to be made in the skin. High-risk injecting with crude devices still persists today in places such as prisons, where access to modern sterile equipment is often absent or limited.

Wren also attempted intravenous injection in humans. To do this he used “the delinquent servant of a foreign ambassador”, but it didn't go well:

“…the victim either really or craftily fell into a swoon and the experiment had to be discontinued” (Macht 1915)

It was at least another 100 years before a syringe with an attached needle intended for puncturing the skin was produced.

In 1807 the Edinburgh Medical and Surgical Dictionary defined a syringe as:

“A well known instrument, serving to imbibe or suck in a quantity of fluid and afterwards expel the same with violence. A syringe is used for transmitting injections into cavities or canals.”

Interestingly, the above source also describes injections as being employed almost solely for injecting substances into the blood vessels of corpses for the purpose of enhancing anatomical study. Various developments and refinements towards the modern syringe were made as a result of the study and teaching of anatomy in the 17th and 18th centuries.

In the 17th century, De Graaf made a device that closely resembled the modern syringe, with a metal barrel to which the needle was directly attached. Its purpose was to trace the blood vessels of corpses.

Deliberate subcutaneous injection (under the skin) did not begin until the mid to late 19th century, probably as an extension of the then new practice of inoculation against disease.


Early experiments
Experiments with intravenous injecting continued and techniques were further developed in the 17th century. Numerous drugs were used to attempt to treat various conditions, particularly epilepsy and syphilis.

Opium was one of the first drugs to be injected in this way, but difficulties in reliably accessing veins and the use of substances unsuitable for intravenous injection (such as cinnamon, oil of sulphur and arsenic) gave poor results - which were incorrectly attributed to the route of administration - and probably limited the development of intravenous injecting as a common method of drug delivery.

Absorption of drugs through the skin
The beginning of the 19th century saw an increase in interest in attempts to introduce drugs into the body via the skin itself. Initially, this usually took the form of causing blistering to an area, removing the outer layer of skin and placing a poultice or plaster containing the drug onto it. In 1836, Lafargue further developed this idea by dipping a vaccination lancet in morphine, and pushing it under the skin.

By the middle of the century Lafargue had developed a technique of placing solid morphine-based pellets under the skin. Initially this was achieved by simply making a hole with a large needle and pushing the pellet into the hole. Over time an instrument was developed to aid this procedure, which Lafargue called the ‘seringue seche’ or dry syringe.

Other variations of this method included that of Crombie, who in 1873 used a technique of coating silk thread with morphia and then drawing the impregnated thread under the skin. Crombie developed this method because he felt that the recently developed hypodermic syringe was expensive and easily damaged.

Subcutaneous injecting
Through the 19th and into the early 20th century, subcutaneous injecting was generally seen as a more valuable route of administration than intravenous injection. This may have been because of the earlier interest in the absorption of drugs through the skin, as well as a lack of realisation of the potentially increased potency of intravenous injections.

In 1880, H. H. Kane described intravenous injection as mainly being an unwanted consequence of subcutaneous injection and suggested ways to avoid its occurrence. Writing as late as 1916, Macht said:

“however useful intravenous medication may be in special cases, its field of application is certainly more limited than that of hypodermic (subcutaneous) injection…”

The discovery of systemic action
It seems odd now, but early physicians did not realise that injected substances would have a systemic effect, i.e. travel around the whole body; they thought that the action of the things they injected would be local.

Early understandings of the pain relieving effects of opiates centred on the belief that most of the drug stayed at the site at which it was injected. In fact, drugs administered by any route of injection will eventually permeate throughout the body. Intravenous injection is the fastest route for injected drugs to reach the brain in concentrated form and subcutaneous injection is the slowest injected route.

Alexander Wood, although recognising some systemic action, believed that the action of opiates administered by subcutaneous injection was mainly localised. The use of the syringe rather than previous methods was thought to allow greater accuracy in administering the drug in close proximity to a nerve, hence, it was thought, facilitating better pain relief.

This belief in localised action influenced many doctors at the time. Dr Francis Anstie, editor of The Practitioner, wrote in 1869 that there was no danger associated with the hypodermic injection of remedies, and later:

“it is certainly the case that there is far less tendency with hypodermic than with gastric medication to rapid and large increase of the dose when morphia is used for a long time together”

Charles Hunter, a house surgeon at St George’s Hospital, made the connection that opiates administered by injection exert a systemic action when he was forced to move away from the original site of injection as a result of abscess formation. He found that the patient still experienced similar relief from pain. This, as Berridge and Edwards have noted, “led to a period of sustained and acrimonious debate between Wood and Hunter” about the existence or otherwise of systemic action.

Subcutaneous injecting with a syringe was initially described and popularised by Wood. It has been suggested that the fundamental misunderstanding that dependence could not occur through injected medication was partly responsible for the creation of a large number of patients dependent on morphine, described in the 19th century as ‘morphinists’. This was partly because the effect of the injected drug was thought to be local rather than systemic, and partly because dependence was thought to be centred on the stomach – so the theory went, avoiding ingestion through the stomach would avoid dependence.

Common problems with early injections
19th century injecting was by no means without incident or problems. The following late 19th century account of the problems associated with medical injections has powerful echoes for street injectors in the UK and other parts of the world today who continue to need to add acids to the base forms of brown street heroin and crack cocaine in order to render them soluble for injection.

“The active agent to be injected subcutaneously must be in perfect solution. The solution itself should be neutral (i.e. neither acid nor alkaline), clear and free of foreign matter, and not too concentrated. The difficulty of fulfilling all of these conditions has in the past very materially hindered the more general use of this method of treatment. But comparatively a few years ago many of the alkaloids were only to be had as bases. They were more or less insoluble without the addition of some acid and the slightest excess of the latter caused intense local irritation.” (Sharp & Dohme 1898)

19th century descriptions of frequent subcutaneous injectors can sound similar to the appearance of some frequent injectors of street drugs in the 21st century, particularly those who are having difficulties in accessing veins.

“An extraordinary spectacle was revealed on examination. The entire surface of the abdomen and lower extremities was covered with discolored blotches, resembling small vibices, the marks of the injections. He was spotted as a leopard. For four years he had averaged three or four a day – an aggregate of between 5 and 6 thousand blissful punctures! The right leg was red and swollen, and I discovered a subcutaneous abscess extending from the knee to the ankle and occupying half the circumference of the limb.” (Gibbons 1870)

The growth of the medical use of opiates
A powerful influence on the development of widespread and repeated use of opiates by injection, would have been the obvious and immediately beneficial effects of injected morphine, particularly to those experiencing chronic pain. Doctors at the time, with few truly effective treatments available, would have had difficulty in resisting the impulse to treat pain with something as powerful, fast and effective as injected morphine. Courtwright, when discussing 19th century opiate addiction in North America, has said:

“The administration of opium and morphine by physicians was the leading cause of opiate addiction in the nineteenth century…case histories, clinical notes and remarks in the medical literature support the view that although opium and morphine were ultimately given for such unlikely disorders as masturbation, photophobia, nymphomania and ‘violent hiccough’ it was principally in those suffering from chronic ailments that the use of these drugs led to chronic addiction.” (Courtwright 1982)

The combination of the development and spread of injecting, alongside the widespread availability of opiates and opiate-based patent medicines probably contributed significantly to the increase in numbers of injectors of opiates in this period.

Injecting in the 20th century - the growth of intravenous injecting
Throughout the 19th and early 20th centuries, the most common injected route amongst both medical and ‘non-medical’ injectors was by subcutaneous injection.

Interestingly, early accounts of intravenous injection describe it as something unpleasant and to be avoided, although this is probably as a result of using too large a dose. The preference for the intravenous route of drug administration seems to have become particularly prevalent with illicit users during the 1920’s.

Richard Pates reviewed the literature on the spread of illicit intravenous injecting (Pates 2005) and concluded that early intravenous injectors probably discovered the route accidentally, and learned to use smaller doses than would have been needed for subcutaneous injection. Before 1925 intravenous injection amongst illicit users was relatively rare; by 1945 it had become the norm:

“…in the early 20th century addicts were taking doses that were enormous by today's standards and mostly had overdose experiences when they accidentally hit a vein. But when narcotics started to become more difficult to obtain and the doses became smaller, communication in the drug subculture facilitated the diffusion of the intravenous technique. The fact that (intravenous) injecting is more economical and the enjoyable rapid effect, or 'rush', contributed to the quick diffusion.” (Pates et al 2005)

However, it is very important to understand that medicine was beginning to favour the intravenous route for particular medications in the first decade of the 20th Century, particularly a drug called Salvarsan, a treatment for syphilis. The most effective alkaline form of Salvarsan could only be delivered intravenously. As Patricia Rosales says:

“In order for alkaline Salvarsan to maintain its non-toxicity, it had to be administered intravenously. It therefore required what in 1911 was considered a surgical procedure; a process much more difficult to achieve than today’s shot in the arm.” (Rosales 1997)

Rosales suggests that improvements and standardisation in the design and manufacture of syringes, needles, ampoules and the formulation of drugs, were largely driven by the precision required in the new need to give intravenous injections. It is therefore very likely that medical advances played a crucial part in the diffusion of the intravenous route.

The non-medical intravenous injection of heroin was first described in 1925 (Kolb 1925). Five years previously, B.S. Wyatt had written the following about the intravenous treatment of malaria:

“From the subcutaneous injection to the intramuscular injection was a logical evolution. From the intramuscular injection to the intravenous injection was inevitable. It had to come. It is here to stay. There is every argument for no argument against intravenous therapy. Once admitted that the blood is the medium in which medicine is carried to every organ, tissue and cell of the body…” (Wyatt 1920)

The switch to disposable needles and syringes
Patents for glass disposable syringes were being taken out as early as 1903, but they were probably ideas before their time, and do not seem to have entered production.

The first truly disposable ‘syringes’ to be produced in large quantities were originally designed by James T Greeley around 1912. These were collapsible tin tubes (a bit like a modern tube of superglue) that had an attached needle and contained a specific amount of morphine for subcutaneous injection on the battlefield. These were used in the 1st World War and were further developed during the 1920s and 30s to become the morphine Syrette manufactured by Squibb.

Syrettes were a standard part of the first aid kit carried by U.S. medical orderlies in World War 2. Used Syrettes were pinned to the collar of a casualty in an effort to avoid inadvertent overdosing.

Greeley described the reasons for the development of his disposable device in 1912, talking of the problems with existing syringes he said:

“Asepsis is uncertain, the making of the solution is time-consuming and impossible where water is not available; the joints often leak; the piston occasionally sticks, and the needle becomes dull and rusty from boiling.”

Throughout the 20th century, the production of precision-made glass syringes was gradually refined. The first major advance came with the manufacture of syringes and needles with interchangeable parts made to exact specifications, rather than as ‘one-off’ items. As noted above, the impetus for this standardisation was the need to inject the anti-syphilitic drug Salvarsan intravenously.

Until the 1960s, the majority of needles and syringes used outside of warfare were re-usable and were supplied unsterilised. They had to be sterilised before each use.

The development of plastic disposable syringes
There are several competing claims to the design of the first disposable plastic syringe, but the most plausible is that of the Monoject syringe developed in the USA by Roehr Products in 1955. The development of the Monoject syringe spurred Becton Dickinson into the development of similar plastic syringes (they had previously been developing glass disposables), and BD introduced their own Plastipak syringe in 1961.

Fears about the transmission of hepatitis B by doctors using inadequately sterilised re-usable syringes (and the resulting lawsuits) led to the takeover of the market by plastic disposables. A 1998 article in the San Francisco Chronicle on healthcare needle-stick injuries quotes a BD executive, Joseph Welch, as saying in 1990 of hepatitis B:

“It was probably the reason Becton Dickinson is a $2 billion company today.”

Becton Dickinson produced the first one-piece insulin syringe with integral needle in 1970.

Difficult to re-use syringes
There are many types of difficult to re-use syringes, each with a different mechanism to prevent a syringe being used more than once. They were developed for hospitals and other health care settings where they can prevent the inadvertent re-use of syringes.

Although it might seem that supplying these syringes to illicit drug users would reduce needle and syringe sharing, it is widely believed that their introduction would lead to those syringes already in circulation being kept, re-used and shared more frequently - leading to an increase in hepatitis C and HIV transmission. The United Kingdom Harm Reduction Alliance and the National Needle Exchange Forum have both warned of the potential dangers of these types of syringe.

Accidental sharing and the development of the Nevershare syringe
A video study of injecting risk in Glasgow by Avril Taylor from Paisley University highlighted the prevalence of 'accidental sharing', in which injecting drug users had difficulty in avoiding sharing because all their syringes looked the same.

Exchange Supplies made a documentary film with the researchers to disseminate their findings.

One of the key recommendations of the study was that much of the risk of sharing could be removed if injectors (who are well aware of the risks) were more able to tell their syringes apart.

After several years of lobbying syringe manufacturers to ask them to act on these findings, it became clear that they weren't going to do so of their own volition, so Exchange Supplies embarked on its biggest ever product development project, resulting in the launch, in May 2007, of the 1ml insulin-type Nevershare syringe, with plungers in five different colours to reduce accidental sharing.

The Nevershare was the world's first syringe developed specifically for injecting drug users and in addition to plungers in a range of colours, it has markings in millilitres rather than insulin units, a barrel clear of print so injectors can see the solution, and a 30 gauge needle to reduce vein damage.

In September 2011, we added a 2ml detachable-needle-type Nevershare syringe to the range so that injecting drug users who require a different needle size to the 'traditional' insulin-type syringe can also have access to coloured plungers so they can tell their syringes apart.

References
Macht D I (1916) The history of intravenous and subcutaneous injecting of drugs. The Journal of the American Medical Association LXVI.

Morris R and Kendrick J (1807) The Edinburgh Medical and Surgical Dictionary.

Kane H H (1880) The Hypodermic Injection of Morphia: Its History, Advantages and Dangers. Chas L Bermingham and Co, New York.

Anstie F E (1871) On the effects of prolonged use of morphia by subcutaneous injection. Practitioner 6: 148-57.

Berridge V and Edwards G (1987) Opium and the People: Opiate Use in Nineteenth Century England, pp. 139-40. Yale University Press, USA.

Sharp & Dohme (1898) A brief summary of hypodermic medication, 6th edition, pp. 8-9. Sharp & Dohme, Baltimore. Quoted in Rosales P (1997) A history of the hypodermic syringe, 1850s–1920s. Harvard University thesis, December 1997.

Gibbons H (1870) Letheomania: the result of the hypodermic injection of morphia. Pacific Medical and Surgical Journal 12: 481-495. Quoted in Rosales P (1997) A history of the hypodermic syringe, 1850s–1920s. Harvard University thesis, December 1997.

Courtwright D (1982) Dark Paradise: Opiate Addiction in America before 1940, p. 42. Harvard University Press, USA.

Pates R, McBride A, Arnold K (Eds) (2005) Injecting Illicit Drugs. Blackwell Publishing.

Rosales P (1997) A history of the hypodermic syringe, 1850s–1920s. Harvard University thesis, December 1997.

Kolb L (1925) Pleasure and Deterioration from Narcotic Addiction. Mental Hygiene 9.

Wyatt B S (1920) The intravenous treatment of malaria. New York Medical Journal 112: 366-369.

Editor (1876) Tetanus after hypodermic injection of morphia. Lancet 2: 873-6.

Bartholow R (1891) A manual of hypodermic medication: The treatment of disease by the hypodermatic or subcutaneous method, 5th edition, p. 38. J B Lippincott Company, Philadelphia, USA.

Civil War Medicine

by Dr Julius Bonello, MD

The Union generals stood silently and watched as the long line of wounded made their way back to Washington. Although the morning had started out as glorious for the United States, it had quickly turned into a military debacle. Almost 2,700 Union soldiers had been killed or wounded in a battle fought near a meandering stream known as Bull Run. The generals now knew that this conflict would be a long and costly one. They had greatly underestimated the strength of their enemy. They also realized, at that moment, that they were woefully unprepared for what was to come. Medical supplies that had been sent to the battlefield had never arrived and, according to official reports, not one wounded soldier returned by ambulance after the battle.

From 1861 to 1865, the American Civil War would produce almost 10 million cases of wounds and illness requiring medical attention. At the beginning of the war, the military had only 113 doctors to meet this demand. Something had to be done, and done quickly.

BACKGROUND
The Army Medical Department entered the war unprepared. Its chief, Colonel Thomas Lawson, who was more than 80 years old, considered the purchase of medical books an extravagance and was reported to have flown into a rage upon hearing that one post had two sets of surgical instruments. In January 1861, the United States Army numbered 16,000 soldiers and had a medical staff of 113 surgeons. Soon after the war started, 24 surgeons left for the South, leaving 89 surgeons to serve the Union army. Although nearly all doctors of this period had received their medical education on an apprenticeship basis, younger ones usually had a medical school diploma. Because medical schools had no standardized testing and licensing requirements, and testing varied state by state, doctors of the 1860s varied widely in education, skill and experience. (Table 1)

At the time of the Civil War, there were 100 medical schools in the United States. Medical school consisted of two years, the second year being a complete repetition of the first year. At the beginning of the war, some schools reduced their requirement to one year and counted a year on the battlefield as a year of apprenticeship. Some schools required only six weeks of formal learning before their students began an apprenticeship. Since many states had laws that prevented medical students from dissecting cadavers, graduates often did not see internal organs or any major trauma until their first experience in battle.

After the bombardment at Fort Sumter, southern students left the northern schools to attend southern medical schools. However, by 1863, because of the manpower shortage in the South, these medical schools closed, thereby adding to the woes of the southern medical department toward the end of the war.

Once he received his diploma, whether he liked it or not, the doctor of the day was a surgeon. Civilian doctors had little surgical experience. At that time, civilian surgery involved what we would consider minor procedures, i.e. tooth extractions, laceration repair, drainage of abscesses, foreign body removal and similar conditions. Orthopedic practice was limited to splinting, and a joint space was never entered. True surgery was confined to a few obstetrical and gynecological procedures. No one was prepared for the carnage that was coming.

Because of these changes in medical school requirements and apprenticeships, the northern states were able to field almost 12,000 doctors during the Civil War. The Confederacy probably utilized a total of 8,000 doctors during the same time. Surgeons usually carried a rank of major and assistant surgeons were captains. Depending upon their length of service, a surgeon was paid between $162 and $200 per month.

In the spring of 1861, two men met in New York with a group of devoted women, including Dr Elizabeth Blackwell, the first female physician in the United States. They formed the Women’s Central Association of Relief for the Sick and Wounded. On May 16, delegates of this group descended on Washington, DC, demanding the creation of a sanitary commission. Lincoln, the Secretary of War and the medical department opposed the idea. Fortunately, General Lawson was at home sick, and his replacement, Dr Robert Wood, saw the logic of this proposal. On June 9, 1861, the United States Sanitary Commission was formed. In theory, the commission was to investigate and advise in matters of sanitation and hygiene; in practice, it effected a purging and cleansing of the medical department; monitored camps, hospital food, clothing, medical supplies, ambulance services and recruitment; sent workers into the fields and hospitals to nurse and nourish; and provided everything from chloroform to tobacco. By war’s end, the commission had distributed almost $15 million worth of supplies, wholly provided by the citizens of the United States. (Table 2)

The most significant act produced by the commission was the White Paper of 1861. The commission reorganized itself, created new posts and, best of all, removed Lawson from the position of Surgeon General, replacing him with William Hammond. Hammond was an intelligent, able man with unbound energy and vision. His first move was an order that proper records be kept for all the sick, wounded and killed. This record is available today in a six-volume work found in most urban libraries. Hammond introduced a meaningful system for classifying disease, wrote and edited medical journals, accelerated the procurement of supplies and constantly fought to improve medical care. He recommended an ambulance corps, an army medical school and an army museum. He also proposed that the men, who drove ambulances and nursed the sick, be trained by the medical department. In May 1863, Hammond issued a decree restricting calomel (mercurous chloride), a powerful laxative, which had been used to treat diarrhea. The medical thinking of the 1800s focused on the bowels and bladder. If a good bowel movement or a good stream of urine could be produced, a patient was considered healthy. However, Hammond saw the high rate of mortality among patients with diarrhea and wanted calomel’s use restricted. Most medical doctors considered this directive heresy, and they brought their complaint to Washington. Forcing a trial while Hammond was on tour, they found him guilty of conduct unbecoming an officer and relieved him of duty. Joseph K Barnes replaced him, but continued all of Hammond’s proposals.

Until the Civil War, nurses in the United States were either veterans of earlier wars or the handicapped and mentally retarded. During the war, some nursing was performed by hospital stewards who were non-commissioned officers. Their duties were fully described by Joseph Woodward, a leading physician of his day. Woodward’s manual for stewards outlined, in today’s terms, the responsibilities of a registered nurse. During battles, the musical band that accompanied every regiment provided nursing care. Although almost every major engagement attracted local women who wanted to help administer medical care, their assistance was generally discouraged. The prospect of young women taking care of young men concerned the conservative faction of the nation.

In 1861, Dorothea Dix, well-known founder of institutions for the mentally ill, offered to provide trained nurses to staff military hospitals. In June 1861, she became superintendent of female nurses. Such a radical idea created a degree of public outcry; however, the plan was generally well received by the military and the US Sanitary Commission. In mid-1861, thousands of women submitted their applications in response to Dix’s call. Each candidate had to be “past 30 years of age, healthy, plain almost to repulsion in dress and devoid of personal attractions.” They had to know “how to cook all kinds of low diet” and avoid “colored dresses, hoops, curls, jewelry and flowers on their bonnets.” One such woman was Mary Ann Bickerdyke of Galesburg, Illinois. While on a trip to Cairo, Illinois, to supply the Union soldiers with medical supplies, she found a number of the soldiers hospitalized on beds of filthy straw laid over muddy tent floors, and dying of dysentery and typhoid. Enraged at Army inefficiency and without authorization, she went to work. She washed the casualties in bathtubs, dug the mud off the tent floors and fed her patients food sent down from Galesburg. For the duration of the war, Bickerdyke rode with the Western Army, setting up hospitals, feeding her boys before they went into battle and working in front-line dressing stations. Not surprisingly, she was less popular with the brass. When the wife of an important colonel summoned her to care for her son’s measles, “Mother Bickerdyke” unceremoniously refused, stating that she had plenty of soldiers to work for. The colonel complained to General Sherman, who replied, “You have picked on the one person around here who outranks me. If you want to lodge your complaint against her, you will have to take it up with President Lincoln.” By the end of the war, 3,000 to 4,000 female nurses had worked for the Union.

At the outbreak of the war, the United States was not operating a single general military hospital. The country began a gigantic building program, and by January 1863, the North had built 151 hospitals with 58,000 beds. By 1865, the North operated 204 general military hospitals with 137,000 beds, and by the end of the war, the Confederacy also had 150 hospitals, with one-third centered around Richmond, Virginia. The largest at Chimborazo held 8,000 beds.

MEDICAL CARE
Medical care during the mid-1800s was still quite primitive. Although the stethoscope had been in use in Europe for decades, Harvard Medical School did not have one until three years after the war. The thermometer, which had been employed in Europe for almost 200 years, was almost nonexistent in the United States. Just 20 thermometers were available in northern hospitals during the war. Only a handful of surgeons knew about laryngoscopes and hypodermic needles or how to use them. Because medical thinking of the 1800s was centered on the bowels and bladder, many of the medications were diuretics or laxatives. Quinine had been used for malaria for many years. Opium, which was dusted into wounds or taken by mouth, was prescribed often for pain. Chloroform, which had been discovered as an anesthetic agent just 15 years earlier, was used throughout the war. Digitalis, colchicine, and belladonna were widely used throughout the war. The most commonly used medication, however, was whiskey. Whiskey was the number one analgesic administered after an operation. The dose was one ounce every 15 minutes for pain. Various products were also used, initially to cover the stench in the air of busy and cramped hospitals. Many of these products contained chlorine, bromine, iodine or potassium permanganate and were known to have antiseptic qualities. Toward the end of the war, they were used for dressings or poured into superficial wounds. One product was Labarraque's solution, which is 10 times stronger than our present-day Dakin’s solution. Initially used as a deodorant, it was poured into wounds during the war.

MORBIDITY AND MORTALITY DURING THE WAR
Traditional ideas of Civil War medicine are more similar to a Hollywood movie scene than reality: A tired and harried surgeon, his surgical frock covered with blood standing over a screaming patient, held down by his fellow soldiers. In reality, of the 600,000 soldiers who died during the conflict, two-thirds died from disease and only one-third on the battlefield or from wounds sustained in battle. (Table 3). The number one cause of death was the fluxes, now known as diarrhea or dysentery. Over three million cases were diagnosed during the Civil War, killing 400,000 soldiers. One entire volume of the six-volume Medical History of the War of the Rebellion is dedicated to soldiers suffering from diarrhea.

Of the injuries during the war, 94 percent were due to gunshot wounds, 6 percent to artillery, and less than 1 percent were secondary to bayonet or saber. According to official records, 33 percent of all wounds involved the arms, 35.7 percent involved the legs, 18 percent involved the trunk and 10 percent involved the head. The mortality of gunshot wounds was as follows:

- Penetrating wounds of the abdomen and head: approximately 90 percent were fatal.
- 100 percent were fatal when the small bowel was injured; 59 percent when the colon was injured; and 100 percent when the stomach was injured.
- A penetrating wound to the chest carried a 60 percent mortality rate.

Once the abdominal cavity had been entered, the surgeon had no recourse other than to give opium, whiskey or tobacco for comfort. No surgical interventions were available.

Three-quarters of all the operations performed during the Civil War were amputations. All limbs with open fractures were amputated, usually within the first 24 to 48 hours. The importance of early, prompt and swift surgical intervention was appreciated by the surgeons of this period. Samuel Gross, a surgeon during the war, wrote, "the success of amputations was very fair when they were performed early but most unfortunate when they were put off for any length of time." He warned that the surgeon "must be careful to guard against procrastination. The case must be met promptly and courageously; delay of even a few hours may be fatal or at all events, place limb and life in eminent jeopardy." John Chisolm, a Confederate surgeon, echoed this sentiment with the dictum, "the rule in military surgery is absolute viz; that the amputating knife should immediately follow the condemnation of the limb." Approximately 80,000 amputations were performed under chloroform or ether anesthesia. Most were the flap type—the arteries and veins were tied with silk suture. Of 174,000 Union army wounds of the extremities, almost 30,000 soldiers underwent amputation with an overall mortality rate of 26 percent.

The high mortality from wounds was largely a consequence of the bullet that was used. The Minie ball, named after the French captain who first developed it, was a slow, heavy, soft-lead projectile, which penetrated a body at a velocity of 950 feet per second (slow compared to today’s weapons). In addition to tumbling as it travelled, this bullet caused extensive bleeding, resulting in severe and often lethal shock.

There were 494 thoracotomies attempted, with 200 deaths, resulting in a mortality of approximately 40 percent. In 900 cases, the skull was opened with a mortality of approximately 67 percent. Of 2,818 soldiers diagnosed with sepsis, only 71 lived. Osteomyelitis, erysipelas, gangrene and septicemia were common after surgery. Once gangrene had set in, almost 50 percent of the soldiers died.

Documented pneumonia took the lives of almost 20,000 Federal and 19,000 Confederate soldiers, while smallpox killed 1,000 soldiers in three months in one Virginia hospital. Scarlet fever and measles occasionally caused deaths. Gonorrhea and syphilis were treated fairly commonly in the North and South. There are no statistics on the number of women and subsequent children infected with venereal disease; however, considering the fact that Union physicians treated 170,000 cases of venereal disease, the figures must be staggering. One Civil War researcher estimated that one-third of the men who died in Union and Confederate veterans’ homes were killed by the later stages of venereal disease.

PRISON CAMPS
Nineteen thousand Confederates died in Union prisons during the war, while 26,000 Federal soldiers died in southern prisons. The most famous of these was at Andersonville, Georgia. This prison was built to house 10,000 soldiers but, at its height, confined over 33,000 prisoners, making it the fifth largest city in the Confederacy. Prisoners were allowed no means to build shelter. Their daily ration was one cup of cornmeal, three teaspoons of beans and a teaspoon of salt. For every 1,000 soldiers imprisoned there, 793 died. Stated another way, one prisoner died every 11 minutes. This was almost twice the mortality rate seen in the most infamous northern prison camp, Elmira, New York, where 441 of every 1,000 soldiers died.

SUMMARY
“If one wants to learn surgery, one must go to war,” Hippocrates wrote. The number of deaths surrounding the Civil War is staggering. Of the nearly three million soldiers who participated in the conflict, approximately 618,000 died—two-thirds by disease, one-third in battle. The total mortality of the war represents the loss of 2 percent of the entire United States population at that time. Union statistics document the treatment of almost one-half million injuries and six million cases of illness. Nearly 500,000 men came out of the war permanently disabled. In Mississippi, in 1866, one-fifth of the state’s revenue was spent on artificial limbs. Of the 12,344 surgeons in the Union medical corps, 336 were killed in the line of duty or died while in service. In his manual for military surgeons, Chisolm wrote, “the surgeon on the battlefield must participate in the dangers.”

America has never again witnessed pain and death in such magnitude as the Civil War. More Americans died in that conflict than in all other US wars combined. The battle at Shiloh, Tennessee, caused 24,000 casualties. This number easily surpasses the combined number of Americans who died in the Revolutionary War, the War of 1812 and the Mexican War. The battle at Antietam, Maryland, on September 17, 1862, took 23,000 casualties, making it the bloodiest day in American history. Between July 1-3, 1863, 51,000 people were killed, wounded or missing at Gettysburg, Pennsylvania, almost as many as the number of Americans killed during the 15-year Vietnam conflict. On June 3, 1864, at Cold Harbor, Virginia, in a frontal assault led by General Ulysses S Grant, the Union army lost more than 12,000 men, 7,000 of them dead in the first seven minutes. General Robert E Lee lost 2,500 men.

The American Civil War was the last great conflagration before the discovery of bacteria. Although Louis Pasteur’s work was carried out during the 1850s, it was not available for general knowledge until 10 to 15 years after the war. In 1867, Joseph Lister published his landmark work on surgical antisepsis, Antiseptic Principle. His principles met wide resistance, especially by American physicians, but were finally accepted and put into effect by World War I. In 1878, Robert Koch discovered the role that bacteria play in causing disease. It would take another war, World War II, and the discovery of antibiotics to bring this chapter to a close.

Table 2. A partial list of the supplies and goods that the Sanitary Commission sent to Gettysburg after the July 1863 battle.

Drawers, Woolen: 5,310 pairs
Drawers, Cotton: 1,833 pairs
Shirts, Woolen: 7,158
Shirts, Cotton: 3,266
Pillows: 2,114
Blankets: 1,007
Sheets: 274
Stockings: 5,818 pairs
Shoes: 4,000 pairs
Combs: 1,500
Soap: 250 pounds
Basins and Cups: 7,000
Bandage Linen: 110 barrels
Splinting/Dressing Plaster: 16 rolls
Crutches: 1,200 pairs

Table 3. Civil War Casualties

Union
Battle: 110,070
Disease: 224,586
Accidents/suicide: 24,872
Total: 359,528

Confederacy
Battle: 94,000
Disease: 164,000
Total: 258,000

Total Union and Confederacy: 617,528

ABOUT THE PAINTING
“Island of Mercy: The Pry Mill at Antietam” was painted by Keith Rocco, who is a member of the Society of American Historical Artists. Gordon E Dammann, MD, commissioned the painting to benefit the National Museum of Civil War Medicine in Frederick, Maryland.

On September 17, 1862, Samuel Pry’s grist mill near Antietam Creek’s upper bridge served as a field hospital for the men wounded in the Miller Cornfield, the East and West Woods, and the Bloody Lane. At the time, surgeon Jonathan Letterman served as the new medical director of the Army of the Potomac and was reorganizing the medical corps.

This was the bloodiest single day of the conflict, yet the mill where approximately 200 seriously wounded soldiers are being treated seems almost tranquil. A red flag is apparent rather than the more common yellow hospital flag which was finally standardized in 1864. The four-wheeled Rosecrans ambulances are preferred, but the two-wheeled vehicles are still in use. Dr Letterman, who is constantly moving during and after the battle, gives instructions to surgeons in the foreground. Clara Barton and her assistant Cornelius Welles dispense blankets and other supplies that she has personally brought to the soldiers.

Wounded from both sides receive care, and medical personnel from the Union Second Corps, who wear the green hat bands and half chevrons, assist the surgeon in triage before each wounded soldier is carried into the mill on a Satterlee stretcher.

For additional information about the painting or to order a print, please contact the National Museum of Civil War Medicine.


From: mnwelldir.org

Mrs Elvira J. Stockwell Powers

From: findagrave.com

Birth: Aug. 6, 1827
Auburn
Worcester County
Massachusetts, USA
Death: Sep. 21, 1871
Worcester
Worcester County
Massachusetts, USA

Daughter of James and Prudence Stockwell.

In 1860, Elvira J. Powers, age 30, was a school teacher living in Roscoe, Winnebago county, Illinois.

Army Nurse, and author of "Hospital Pencillings; being a Diary while in Jefferson General Hospital, Jeffersonville, Ind., and others at Nashville, Tennessee, as Matron and Visitor," by Elvira J. Powers, Boston, 1866, pp 211. (Available online at archive.org.)

She died of consumption, age 42 according to her death certificate.

Inscription:
A devoted friend and nurse
to the soldiers in times
of war and a lover of good
works in times of peace has
gone home.

Embracing the Faith that
Christ lived and died for the
salvation of all.

She earnestly desired to
preach this glorious gospel
but while preparing herself
for the ministry she was
called from earth
and joyously passed "over
the river."

Burial:
Hope Cemetery
Worcester
Worcester County
Massachusetts, USA


Civil War Medicine: An Overview of Medicine

From: ehistory.osu.edu

During the 1860s, doctors had yet to develop bacteriology and were generally ignorant of the causes of disease. Generally, Civil War doctors underwent two years of medical school, though some pursued more education. Medicine in the United States was woefully behind Europe. Harvard Medical School did not even own a single stethoscope or microscope until after the war. Most Civil War surgeons had never treated a gunshot wound and many had never performed surgery. Medical boards admitted many "quacks," with little to no qualification. Yet, for the most part, the Civil War doctor (as understaffed, underqualified, and under-supplied as he was) did the best he could, muddling through the so-called "medical middle ages." Some 10,000 surgeons served in the Union army and about 4,000 served in the Confederate. Medicine made significant gains during the course of the war. However, it was the tragedy of the era that medical knowledge of the 1860s had not yet encompassed the use of sterile dressings, antiseptic surgery, and the recognition of the importance of sanitation and hygiene. As a result, thousands died from diseases such as typhoid or dysentery.

The deadliest thing that faced the Civil War soldier was disease. For every soldier who died in battle, two died of disease. In particular, intestinal complaints such as dysentery and diarrhea claimed many lives. In fact, diarrhea and dysentery alone claimed more men than did battle wounds. The Civil War soldier also faced outbreaks of measles, smallpox, malaria, pneumonia, or camp itch. Soldiers were exposed to malaria when camping in damp areas which were conducive to breeding mosquitoes, while camp itch was caused by insects or a skin disease. In brief, the high incidence of disease was caused by a) inadequate physical examination of recruits; b) ignorance; c) the rural origin of many soldiers; d) neglect of camp hygiene; e) insects and vermin; f) exposure; g) lack of clothing and shoes; h) poor food and water. Many unqualified recruits entered the Army, and diseases cruelly weeded out those who should have been excluded by physical exams. There was no knowledge of the causes of disease, no Koch's postulates. Troops from rural areas were crowded together for the first time with large numbers of other individuals and got diseases they had no immunity to. Neglect of camp hygiene was a common problem as well. Ignorance of camp sanitation and scanty knowledge about how disease was carried led to a sort of "trial and error" system. You can read Surgeon Charles Tripler's report on sanitation that is included in this web site for a contemporary view of camp hygiene. An inspector who visited the camps of one Federal Army found that they were "littered with refuse, food, and other rubbish, sometimes in an offensive state of decomposition; slops deposited in pits within the camp limits or thrown out of broadcast; heaps of manure and offal close to the camp." The Federal government even founded a Sanitary Commission to deal with the health problems in army camps. Mary Livermore, a nurse, wrote that "The object of the Sanitary Commission was to do what the Government could not. The Government undertook, of course, to provide all that was necessary for the soldier, . . . but, from the very nature of things, this was not possible. . . . The methods of the commission were so elastic, and so arranged to meet every emergency, that it was able to make provision for any need, seeking always to supplement, and never to supplant, the Government." Both armies faced problems with mosquitoes and lice. Exposure turned many a cold into a case of pneumonia, and complicated other ailments. Pneumonia was the third leading killer disease of the war, after typhoid and dysentery. Lack of shoes and proper clothing further complicated the problem, especially in the Confederacy. The diet of the Civil War soldier was somewhere between barely palatable and absolutely awful. It was a wonder they did not all die of acute indigestion! It was estimated that 995 of 1000 Union troops eventually contracted chronic diarrhea or dysentery; their Confederate counterparts suffered similarly. Disease was particularly rampant in the prisoner-of-war camps, whose conditions were generally worse than the army camps.

To halt disease, doctors used many cures. For bowel complaints, open bowels were treated with a plug of opium. Closed bowels were treated with the infamous "blue mass"... a mixture of mercury and chalk. For scurvy, doctors prescribed green vegetables. Respiratory problems, such as pneumonia and bronchitis, were treated with dosing of opium or sometimes quinine and mustard plasters. Sometimes bleeding was also used. Malaria could be treated with quinine, or sometimes even turpentine if quinine was not available. Camp itch could be treated by ridding the body of the pests or with poke-root solution. Whiskey and other forms of alcohol also were used to treat wounds and disease ... though of questionable medical value, whiskey did relieve some pain. Most medicines were manufactured in the North; southerners had to run the Union blockade in order to gain access to them. On occasion, vital medicines were smuggled into the South, sewn into the petticoats of ladies sympathetic to the Southern cause. The South also had some manufacturing capabilities and worked with herbal remedies. However, many of the Southern medical supplies came from captured Union stores. Dr. Hunter McGuire, the medical director of Jackson's corps, commented after the war on the safety of anesthesia, saying that the Confederacy's good record was due in part to the supplies requisitioned from the North.

Battlefield surgery (see separate web page describing an amputation) was also at best archaic. Doctors often took over houses, churches, schools, even barns for hospitals. The field hospital was located near the front lines -- sometimes only a mile behind the lines -- and was marked (in the Federal Army from 1862 on) with a yellow flag bearing a green "H". Anesthesia's first recorded use was in 1846, and it was commonly in use during the Civil War. In fact, there are 800,000 recorded cases of its use. Chloroform was the most common anesthetic, used in 75% of operations. In a sample of 8,900 uses of anesthesia, only 43 deaths were attributed to the anesthetic, a remarkable mortality rate of less than 0.5%. Anesthesia was usually administered by the open-drop technique. The anesthetic was applied to a cloth held over the patient's mouth and nose and was withdrawn after the patient was unconscious. A capable surgeon could amputate a limb in 10 minutes. Surgeons worked all night, with piles of limbs reaching four or five feet. Lack of water and time meant they did not wash off hands or instruments.

Bloody fingers often were used as probes. Bloody knives were used as scalpels. Doctors operated in pus-stained coats. Everything about Civil War surgery was septic. The antiseptic era and Lister's pioneering works in medicine were in the future. Blood poisoning, sepsis or pyemia (pyemia literally meaning pus in the blood) was common and often very deadly. Surgical fevers and gangrene were constant threats. One witness described surgery as follows: "Tables about breast high had been erected upon which the screaming victims were having legs and arms cut off. The surgeons and their assistants, stripped to the waist and bespattered with blood, stood around, some holding the poor fellows while others, armed with long, bloody knives and saws, cut and sawed away with frightful rapidity, throwing the mangled limbs on a pile nearby as soon as removed." If a soldier survived the table, he faced the awful surgical fevers. However, about 75% of amputees did survive.

The numbers killed and wounded in the Civil War were far greater than in any previous American war. As the lists of the maimed grew, both North and South built "general" military hospitals. These hospitals were usually located in big cities. They were usually single storied, of wood construction, and well-ventilated and heated. The largest of these hospitals was Chimborazo in Richmond, Virginia. By the end of the war, Chimborazo had 150 wards and was capable of housing a total of 4,500 patients. Some 76,000 soldiers were treated at this hospital.

There were some advances, mainly in the field of military medicine. Jonathan Letterman revolutionized the Ambulance Corps system. With the use of anesthesia, more complicated surgeries could be performed. Better and more complete records were kept during this period than they had been before. The Union even set up a medical museum where visitors can still see the shattered leg of flamboyant General Daniel Sickles, who lost his leg at the Trostle Farm at the Battle of Gettysburg when a cannon ball literally left it hanging by shreds of flesh.

The Civil War "sawbones" was doing the best he could. Sadly when American decided to kill American from 1861 to 1865, the medical field was not yet capable of dealing with the disease and the massive injuries caused by industrial warfare.

USCivilWar.Net wants to thank Jenny Goellnitz for compiling this information.
jgoellnitz@yahoo.com

Image: Civil War Surgeons at Petersburg (Library of Congress)

Amputations in the Civil War

(Originally published as "When Johnny Couldn't Come Marching Home: Civil War Amputations")
by Ansley Herring Wegner, Tar Heel Junior Historian Association, NC Museum of History, Fall 2008.

Image: Unidentified soldier with amputated arm in Union uniform in front of painted backdrop showing cannon and cannonballs

Many wounded soldiers during the Civil War (1861–1865), including those from North Carolina, had an operation called an amputation. In an amputation, a person has an arm or leg (or sometimes just a hand or foot) removed from their body because of a terrible injury or infection. Military advances before and during the Civil War meant more powerful, destructive weapons, and more devastating injuries, including shattered bones. Most American doctors, however, were unprepared to treat such terrible wounds. Their experience mostly included pulling teeth and lancing boils. They did not recognize the need for cleanliness and sanitation. Little was known about bacteria and germs. For example, bandages were used over and over, and on different people, without being cleaned.

With so many patients, doctors did not have time to do tedious surgical repairs, and many wounds that could be treated easily today became very infected. So the army medics amputated lots of arms and legs, or limbs. About three-fourths of the operations performed during the war were amputations.

These amputations were done by cutting off the limb quickly—in a circular-cut sawing motion—to keep the patient from dying of shock and pain. Remarkably, the resulting blood loss rarely caused death. Surgeons often left amputations to heal by granulation. This is a natural process by which new capillaries and thick tissue form—much like a scab—to protect the wound. When they had more time, surgeons might use the "fish-mouth" method. They would cut skin flaps (which looked like a fish’s mouth) and sew them to form a rounded stump.

Image: Amputation kit used by Dr. Gordon. The instruments in the kit were used to amputate limbs and perform other surgical procedures. The kit includes two trephines, a variety of knives, an amputation saw, bone nippers, a tourniquet, tweezers, scissors, and a Hey saw. From the Country Doctor Museum, Digital Collections, East Carolina University.

For soldiers who survived amputation and infection, it was natural to want an artificial, or fake, limb—both for looks and for function. An artificial arm will not provide a firm handshake, and an artificial leg will not get rid of a limp. But a prosthetic (another word used for an artificial limb) helps an amputee be less noticeable in public and offers the chance of a more regular daily life. Artificial limbs, especially legs, helped Civil War amputees get back to work to support themselves and their families. Agriculture had declined with so many soldiers away from home. After the war ended, it was important for men to return to their farms and increase production of food and money-making crops. Amputees were no different—they needed to be able to work on their farms, too.

North Carolina responded quickly to the needs of its citizens. It became the first of the former Confederate states to offer artificial limbs to amputees. The General Assembly passed a resolution in February 1866 to provide artificial legs, or an equivalent sum of money (seventy dollars) to amputees who could not use them. Because artificial arms were not considered very functional, the state did not offer them, or equivalent money (fifty dollars), until 1867. While North Carolina operated its artificial limbs program, 1,550 Confederate veterans contacted the government for help.

One Tar Heel veteran, Robert Alexander Hanna, had enlisted in the Confederate army on July 1, 1861. Two years later at Gettysburg, Pennsylvania, Hanna suffered wounds in the head and the left leg, just above the ankle joint. He suffered for about a month, with the wound oozing pus, before an amputation was done. After the war, Hanna received a wooden Jewett’s Patent Leg from the state in January 1867. According to family members, he saved that leg for special occasions, having made other artificial limbs to help him do his farmwork. (One homemade leg had a bull’s hoof for a foot!) The special care helped the Jewett’s Patent Leg last. When Hanna died in 1917 at about eighty-five years old, he had had the artificial leg for fifty years. It is a remarkable artifact—the only state-issued artificial leg on display today in North Carolina. Hanna’s wooden leg, as well as Civil War surgical equipment, may be seen at Bentonville Battlefield State Historic Site in Four Oaks.

*Ansley Herring Wegner is a research historian with the North Carolina Office of Archives and History. She is the author of Phantom Pain: North Carolina’s Artificial-Limbs Program for Confederate Veterans.

Wegner, Ansley Herring. 2004. Phantom Pain: North Carolina's Artificial-Limbs Program for Confederate Veterans. NC Office of Archives and History.

Image 1: Unidentified soldier with amputated arm in Union uniform in front of painted backdrop showing cannon and cannonballs, ca. 1861-1865. Library of Congress Prints and Photographs Division, Washington, D.C. http://hdl.loc.gov/loc.pnp/pp.print

Image 2: Amputation Kit. 1800-1899. From the Country Doctor Museum. Image courtesy of the Eastern North Carolina Digital Library, J.Y. Joyner Library, East Carolina University. https://digital.lib.ecu.edu/13941 (accessed May 4, 2015).

Image 3: Artificial foot that enables limp-free walking. Patent number 16360, issued January 1857 to Benjamin W. Jewett by the U.S. Patent Office.

From: ncpedia.org

Coffin Nails: The Tobacco Controversy in the 19th Century

From: tobacco.harpweek.com

Although a few women shared the tobacco habit in nineteenth-century America, it was overwhelmingly a male endeavor.  The most popular ways for American men of the time to consume tobacco were pipes, cigars, and chewing tobacco.  Snuff had largely gone out of fashion by mid-century, except for occasional sniffs by high-society youth.  Pipes were a favorite of all social classes, and varied in style from expensive, elaborately carved wood or stone to simpler, moderately priced versions to cheap ones made of clay or corncob.  Chewing tobacco was common throughout the century in rural and urban areas alike, requiring spittoons in the cities' public buildings.  It was popular among working-class and poor Americans, but was shunned by polite society in the late-nineteenth century.

American soldiers were introduced to cigars during the Mexican War (1846-1848), and cigars became a fad with men in the 1850s.  Cigars ranged in price from cheap to expensive, but were associated for the rest of the century with affluence, respectability, and success in the male bastions of business, politics, and the military.  Cigars were the most exclusively male of all tobacco products, while cigarettes, in contrast, were considered unmanly until the end of the nineteenth century.  The manufacturing of cigarettes in the United States began in the early 1860s, and mass production was achieved in the early 1880s.  There were two basic kinds of cigarette products:  high-end, expensive brands made of imported tobacco; and low-end, cheap brands made of domestic tobacco.

High-end cigarettes were linked with the moral decadence of Europe, the Middle East, and Latin America, where the products were both manufactured and popular.  Hand-rolled cigarettes, also made from imported tobacco, implied the user had the leisure to indulge the habit, and was not participating in his manly duty to earn a living and provide for a family.  Thus, the cigar represented the family man who gained wealth through hard work and self-sacrifice, and who respected traditional morality and authority; the cigarette symbolized the man whose inherited wealth allowed him to remain in a sort of perpetual, self-centered adolescence—unmarried, disrespectful of conventional morality and authority, and attracted to things foreign.

An article in the May 28, 1870 issue of Harper’s Weekly mocked “The Hero of a Fast Novel” (“fast” means immoral or socially improper).  He was constantly pleasure seeking, and lived in luxury by spending other people’s money or going into debt.  The cigarette was as much an identifying mark of the “fast” youth as his expensive food and wine.  “When out shooting, these gentlemen, who are ‘roughing it,’ toss the dogs foie gras and truffles, and drink delicate Burgundies to their perfumed cigarettes.”

A cartoon in the October 14, 1882 issue depicted a “Swell Struggling with the Cig’rette Poisoner.”  Like the “fast” young man, a “swell” was wealthy, self-indulgent, and egotistical.  In the cartoon, he wore evening clothes and top hat for a riotous night on the town.  His floral boutonniere may have mimicked the practice of Oscar Wilde, who toured the United States that year wearing or carrying lilies and sunflowers (that may be a lily pad in the swell’s buttonhole).  The text and image made it clear that the cartoonist considered cigarette smoking to be addictive and deadly.  The multiple cigarette “ribs” of the snake signified chain-smoking, which restricted his free will by encircling him.  According to the caption (which may have been sarcastic) he struggled (or should have struggled) against its hold on his life.  The skeletal view of the snake and the designation of cigarettes as “poison” delivered the powerful message that cigarette smoking is deadly.

Although cigarettes made with domestic tobacco had always been cheaper than those containing imported tobacco, two developments lowered the price further.  The first was the drastic reduction of the federal tax on cigarette manufacturers in 1883.  The tax had originally been imposed to help pay for the Union war effort during the Civil War; two decades later Congress cut it by almost 29%, and manufacturers like James Buchanan "Buck" Duke of North Carolina passed the savings along to customers.  In the February 25, 1882 issue of Harper's Weekly, a cartoon by Thomas Nast criticized the call for the tax reduction.  The artist presented two stylishly dressed young men in a tobacco haze.  Their cigarette habit made them stupid and lazy, the opposite of the "manly air" they wanted to project.

The second important factor occurred when Duke contracted in 1883 with James Bonsack to perfect the latter’s invention for mass-producing cigarettes.  On April 30, 1884, the machine lasted an entire ten-hour shift, making 120,000 cigarettes.  The device reduced manufacturing costs and, in turn, consumer prices as it increased production and availability.  By the early 1890s, Duke had a near monopoly of the cigarette market via his American Tobacco Company (broken up by the U.S. Supreme Court in 1911).

The markets for cheap, domestic cigarettes in the late-nineteenth century were primarily boys and immigrant men from southern and eastern Europe, where the habit was common.  The low cost and convenience of cigarettes also encouraged urban working-class men to take up the practice over the next few decades.  Cigarettes could be smoked quickly while walking to and from work, on lunch breaks, and when the boss was not around.  They were also less offensive than other tobacco products to nonsmokers in the congested and rapidly growing cities of America.

These groups of cigarette smokers reinforced the stereotype that the habit was unmanly.  Boys were immature, and presumably did not have the responsibility of providing for a family (actually, some working-class boys did, at least in part).  In 1885, an editorial in The New York Times chastised men for smoking cigarettes and warned of the dangers it posed to the nation’s political survival.  “A grown man has no possible excuse for thus imitating the small boy…  The decadence of Spain began when the Spaniards adopted cigarettes and if this pernicious habit obtains among adult Americans[, then] the ruin of the Republic is close at hand…”  (The focus on Catholic Spain probably also reflected anti-Catholic sentiment among the dominant Protestant population in the United States.)

Poor and working-class men (which included most immigrants) were considered financially unsuccessful and not socially (and sometimes morally) respectable.  Therefore, they did not fit the middle-class ideal of American manhood in the late-nineteenth century.  A one-sentence item in the "Personal" column of the July 12, 1888 issue of Harper's Weekly proclaimed, "Washington is proud of the fact that not one Congressman smokes cigarettes."  It is certain, though, that many used cigars, pipes, or chewing tobacco.

The public attitude against men smoking cigarettes began to change with the Spanish-American War in 1898.  Many of the young American soldiers, particularly those from rural areas, were introduced to cigarettes as they came in contact with residents of the (former) Spanish colonies of Cuba, Puerto Rico, Guam, and the Philippines, where cigarettes were smoked by most of the population.  Although U.S. military officials approved enlisted men using pipes and chewing tobacco, they tried unsuccessfully to discourage cigarette smoking.  In fact, American sailors threatened to mutiny if deprived of cigarettes.  Cigarettes had practical advantages over other tobacco products.  They could be carried easily in a uniform pocket, smoked quickly, and did not spoil in humid weather like cigars.  In cramped military quarters, particularly on long sea voyages, the milder cigarette odor would be less offensive to nonsmokers than strong cigar smoke.

At first, Harper’s Weekly presented cigarette smoking as a Spanish practice.  In the August 13, 1898 issue, at one point in a news story on a truce in the fighting in Cuba, the correspondent identified the two sides by their respective choice of tobacco:  “Meanwhile the little men in light blue [Spanish soldiers] sit calmly on the edge of their trenches and smoke cigarettes, while the big men in dark blue [American soldiers] sit on the edge of theirs and good-humoredly cast tobacco juice toward [the city of] Santiago.”  However, in the next week’s issue, in a news story on “The Taking of Guam,” the correspondent referred to American sailors aboard the U.S. transport, Australia, using cigarettes, cigars, and chewing tobacco.  The situation was similar in Puerto Rico, from where Charles L. Hofmann of Battery A, Pennsylvania Light Artillery, wrote home, “You can buy the best cigars here… and their cigarettes are good and strong.”

In the aftermath of the Spanish-American War, E. S. Martin questioned, in his "This Busy World" column from the December 24, 1898 issue of Harper's Weekly, the presumption that cigarettes undermined "American manhood."  Instead, he connected the habit explicitly with American servicemen, who smoked so many cigarettes in Cuba that they ran out of supplies.  He emphasized that cigarette smoking among American military personnel was simply a fact from the recent war, "which every one must recall."  In the February 25, 1899 issue, a feature article on "Our New Possessions—Puerto Rico" discussed the "Tobacco Culture" of the island, a new American territory after the war.  The author argued that with minor changes to cigarettes made in Puerto Rico, they would find a ready market in the United States.

The years between the Spanish-American War and America's entry into World War I in 1917 were a period of transition in the public attitude toward men and cigarette smoking.  The notion in earlier decades that the foreign origin of high-end cigarettes made the manliness and morality of male smokers suspect appears to have diminished by the late 1890s (even before the Spanish-American War).  An advertisement for the Nestor cigarette brand in the March 19, 1898 issue of Harper's Weekly declared proudly that its product was "THE MODERN EMBODIMENT OF ANCIENT ORIENTAL LUXURIOUSNESS":  so much for middle-class respectability and the work ethic.  A Nestor ad in the January 21, 1899 issue marketed to "smokers of refined taste" and featured a sketch of an Arab man (or perhaps a British man in Arab attire).

That there was still uncertainty, however, about cigarettes and American manhood is demonstrated in two other ads.  In the September 3, 1898 issue, the Van Bibber brand of cigarettes was called "LITTLE CIGARS," to associate it with the unquestionably acceptable tobacco product for men.  In the February 18, 1899 issue, an illustrated ad for Lucke's Rolls cigars stated, "They are not a smoke for boys or cigarette smokers."  (Notice another Nestor ad at the top of the page.)  The fact that the cigar ad insulted cigarette smokers also indicates that the cigar company was experiencing competition from cigarette manufacturers.

Historians have attributed a drop in cigarette sales for a few years in the late-1890s and early-1900s primarily to the return of economic prosperity, which allowed more male smokers to afford cigars over the cheaper cigarettes.  In his “This Busy World” column from the August 19, 1899 issue of Harper’s Weekly, E. S. Martin noted that after increasing for 12 years, domestic cigarette production dropped during the previous year.  He observed, though, that the sale of luxury cigarettes had increased (evidence of the improved economy).

In the early-twentieth century, military officials continued their unsuccessful attempts at suppressing cigarette smoking.  In 1903, the Military Academy at West Point banned cigarettes (but promoted pipes) and soon court-martialed two cadets for possession.  However, the prohibition lapsed into disuse when the chief medical officer of West Point admitted in 1915 that “a large percentage” of cadets were “habitual cigarette smokers.”  Similarly, intense opposition from sailors caused the U.S. Navy to reverse a 1907 ban on cigarettes for those less than 21 years old.

By the time the United States entered World War I in 1917, cigarette smoking seems to have been widespread in the military.  The public association of cigarette smoking with the manly image of American servicemen, which had begun with the Spanish-American War in 1898, was reinforced in World War I.  The difference was that instead of military officials resisting the practice, as they had done in the earlier conflict, the U.S. military, federal government, and auxiliary associations officially approved, promoted, and distributed cigarettes among American military personnel.  Whereas cigarettes had been linked previously with moral decadence and considered a “gateway” drug for the use of stronger substances, they were now viewed as a partial substitute for bad (or worse) behavior:  alcohol was banned to servicemen, and military camps had prostitute-free zones around them.

Cigarette smoking would never again be considered unmanly, which was too bad for men.  By 1955, over half of American men smoked cigarettes.  Nine years later, the U.S. Surgeon General issued a report declaring that cigarette smoking caused cancer in men.   Today, despite a lower percentage of male cigarette smokers than in the past, lung cancer (overwhelmingly caused by smoking) is the leading cause of cancer-related deaths among American men.  The habit also increases the risk of heart disease, strokes, and other types of cancer.

Harper's Weekly References
1)  May 28, 1870, p. 343
article, “The Hero of a Fast Novel,” he smokes perfumed cigarettes
2)  October 14, 1882, p. 651, c. 3-4
cartoon, “Swell Struggling with the Cigarette Poisoner”

3)  February 25, 1882, p. 128, c. 1-2
cartoon, “A Bill To Make Idiots,” Nast, regarding proposal to lower tobacco tax; identifies smokers as “lunatic” and “idiot”

4)  July 12, 1888, p. 527, c. 3
item from “Personal” column:  “Washington is proud of the fact that not one Congressman smokes cigarettes.”

5)  August 13, 1898, p. 802-804
“The Truce,” American soldiers chewing tobacco, Spanish smoking cigarettes

6)  August 20, 1898, pp. 829-830, c. 1
“The Taking of Guam,” American sailors using cigars, cigarettes, and chewing tobacco

7)  December 24, 1898, p. 1263, c. 3
item in “This Busy World” column, by E. S. Martin, discusses laws against cigarettes, fear of undermining American manhood, but points out American soldiers in Cuba were smoking them

8)  February 25, 1899, pp. 193 and 196
feature article, “Our New Possessions—Puerto Rico,” discusses the “Tobacco Culture” of the island, argues that making minor changes to the cigarettes would give them a ready market in the United States

9)  March 19, 1898, p. 287, c. 1-2
ad, Nestor brand cigarette, “Oriental Luxuriousness”

10)  January 21, 1899, p. 75, c. 1-2
ad, Nestor brand cigarette, “An absolute necessity to smokers of refined taste”

11)  September 3, 1898, p. 879, c. 4
ad, marketed as “little cigars”

12)  February 18, 1899, p. 179
cigar ad, “They are not a smoke for boys or cigarette smokers.”

13)  August 19, 1899, p. 809, c. 1
item in “This Busy World” column, domestic cigarette production down, but sales of (foreign) luxury cigarettes up

Sources Consulted
Kluger, Richard, Ashes to Ashes:  America’s Hundred-Year Cigarette War, the Public Health, and the Unabashed Triumph of Philip Morris (NY:  Alfred A. Knopf, 1996)
Moyer, David, M.D., “The Tobacco Reference Guide,” http://new.globalink.org/tobacco/trg/Chapter15/table_of_contents.html

“Pennsylvania Volunteers of the Spanish-American War:  Letters Home,” http://paspanishamericanwar.com/letters/hofmann.html

Tate, Cassandra, Cigarette Wars:  The Triumph of the Little White Slaver (NY:  Oxford UP, 1999)

“Tips for Tobacco Users:  Tobacco Use During the Civil War,” http://www.shasta.com/suesgoodco/newcivilians/tobacco.htm

Slade, John, M.D., “Tobacco Epidemic: Historical Lessons,”

Civil War Battlefield Surgery

From: ehistory.osu.edu

A Description of Civil War Field Surgery
The most common Civil War surgery was the amputation. A few words about why there were so many amputations may be appropriate here. Many people have portrayed the Civil War surgeon as a heartless or incompetent individual, and have assumed that this explains the great number of amputations performed. This is false. The medical director of the Army of the Potomac, Dr. Jonathan Letterman, wrote in his report after the battle of Antietam:

The surgery of these battle-fields has been pronounced butchery. Gross misrepresentations of the conduct of medical officers have been made and scattered broadcast over the country, causing deep and heart-rending anxiety to those who had friends or relatives in the army, who might at any moment require the services of a surgeon. It is not to be supposed that there were no incompetent surgeons in the army. It is certainly true that there were; but these sweeping denunciations against a class of men who will favorably compare with the military surgeons of any country, because of the incompetency and short-comings of a few, are wrong, and do injustice to a body of men who have labored faithfully and well. It is easy to magnify an existing evil until it is beyond the bounds of truth. It is equally easy to pass by the good that has been done on the other side. Some medical officers lost their lives in their devotion to duty in the battle of Antietam, and others sickened from excessive labor which they conscientiously and skillfully performed. If any objection could be urged against the surgery of those fields, it would be the efforts on the part of surgeons to practice "conservative surgery" to too great an extent.

Still the Civil War surgeon suffers from being called a butcher or some other derisive term.

The slow-moving Minie bullet used during the American Civil War caused catastrophic injuries. The two Minie bullets, for example, that struck John Bell Hood's leg at Chickamauga destroyed 5 inches of his upper thigh bone. Such damage left surgeons no choice but to amputate shattered limbs. Hood's leg was removed only 4 1/2 inches from his body. Hip amputations, like Hood's, had mortality rates of around 83%. The closer to the body the amputation was performed, the more likely the wound was to prove mortal. An upper arm amputation, as was done on Stonewall Jackson or General Oliver O. Howard (who lost his arm at Fair Oaks in 1862), had a mortality rate of about 24%.

Following is a description of a common battlefield amputation. Missing arms and legs were permanent, very visible reminders of the War. Amputees ranged from the highest-ranking officers, like John B. Hood, Stonewall Jackson, and Oliver O. Howard, all the way down to the enlisted men, such as Corporal C.N. Lapham of the 1st Vermont Cavalry, who lost both of his legs to a cannonball. Hood, Jackson, Howard, and Lapham were certainly not alone in their loss, as 3 out of 4 wounds were to the extremities; in the Federal Army this led to 30,000 amputations.

The wait for treatment could be a day, maybe two, and that was not out of the ordinary. When treatment was finally given to the poor soldier, it was not done antiseptically. It would only be in 1865 that Joseph Lister ushered in the era of antiseptic surgery. Surgeons did not even wash their hands carefully before operating. The doctors wore blood-spattered clothes. When something was dropped, it was simply rinsed in cool, often bloody, water. They used sponges that had been used in previous cases, simply dipped in cold water before being used again on the next patient.

A surgeon recalled: "We operated in old blood-stained and often pus-stained coats, we used undisinfected instruments from undisinfected plush lined cases. If a sponge (if they had sponges) or instrument fell on the floor it was washed and squeezed in a basin of water and used as if it was clean"

The injuries to be dealt with were dreadful and the fault of the soft lead Minie Ball. With the capability to kill at over 1,000 yards, this soft lead bullet caused large, gaping holes, splintered bones, and destroyed muscles, arteries and tissues beyond any possible repair. Those shot with them through the body, or the head, would not be expected to live. Almost all wounds were caused by the bullet, with canister, cannonballs, shells, and edged weapons next on the list.

The weapons (particularly the rifle) of the 1860s were far ahead of the tactics; i.e., the generals still thought that to take a position you needed to go at it with the bayonet. The cylindrical lead bullet, the Minie ball, was rather large and heavy (usually .58 caliber). When it hit bone, it tended to expand. When it hit "guts" (i.e., the intestines), it tended to tear them in ways the old smoothbore musket ball did not. Since these bullets crushed and smashed bone so badly, the doctors did not have much choice but to amputate a limb. Wounds to the stomach were almost always a death sentence.

Civil War doctors were woefully ill-prepared: of 11,000 Northern physicians, only 500 had performed surgery; in the Confederacy, of 3,000, only 27 had. Many doctors got their first introduction to surgery on the battlefield. Doctors usually did not specialize, and medical school, for many, lasted just two years (some less, few more). Surgeons reacted by adapting: they learned surgery on the job. And people died, of course, until they learned and became better. Many doctors were political appointments; there were no licensing boards in the 1860s, and Army exam boards often let in quacks.

Of the wounds recorded in the Civil War, 70%+ were to the extremities. And so, the amputation was the common operation of the Civil War surgeon.

The field hospital was hell on earth. The surgeon would stand over the operating table for hours without a letup. Men screamed in delirium, calling for loved ones, while others lay pale and quiet with the effects of shock. Only the division's best surgeons did the operating, and they were called "operators". Already, they were performing a crude system of triage. The ones wounded through the head, belly, or chest were left to one side because they would most likely die. This may sound cruel or heartless, but it allowed the doctors to save precious time and to operate on those who could be saved with prompt attention.

The surgeon would wash out the wound with a cloth (in the Southern Army sponges were long exhausted) and probe the wound with his finger or a probe, looking for bits of cloth, bone, or the bullet. If the bone was broken or a major blood vessel torn, he would often decide on amputation. Later in the War, surgeons would sometimes experiment with resection, but amputation was far more common.

Deciding upon an amputation, the surgeon would administer chloroform to the patient. Hollywood's portrayal of battlefield surgery is dramatized and largely false; anesthesia was in common and widespread use during the war. It would make more complicated and longer operations possible once the era of antiseptic surgery began in 1865 (too late for the poor Civil War soldier). With the patient insensible, the surgeon would take his scalpel and make an incision through the muscle and skin down to the bone. He would make incisions both above and below, leaving a flap of skin on one side.

Taking his bone saw (hence the Civil War slang for a doctor, "sawbones"), the surgeon would saw through the bone until it was severed. He would then toss the limb onto the growing pile. The operator would then tie off the arteries with horsehair, silk, or cotton threads. The surgeon would scrape the end and edges of the bone smooth so that they would not work back through the skin. The flap of skin left by the surgeon would be pulled across and sewn closed, leaving a drainage hole. The stump would be covered, perhaps with isinglass plaster, and bandaged, and the soldier set aside, where he would wake up thirsty and in pain, the "sawbones" already well into his next case.

A good surgeon could amputate a limb in under 10 minutes. If the soldier was lucky, he would recover without one of the horrible so-called "Surgical Fevers", i.e. deadly pyemia or gangrene.

Fifteen years after the War, surgeon George Otis cited the five principal advances of Civil War surgery: the surgeons had learned "something" about head injuries, how to deal with "ghastly wounds" without dismay, and how to ligate arteries; information on injuries to the spine and vertebrae had been "augmented"; and "theory and practice" in chest wounds had been advanced.

A little about the "Surgical Fevers". These were infections arising from the septic state of Civil War surgery. As you should be able to see by now, the Civil War surgeon was interested not so much in cleanliness as in speed. As a result, and with nothing known of antiseptic surgery, fevers arose. Of these, the most deadly was probably pyemia. Pyemia means, literally, pus in the blood. It is a form of blood poisoning. Nothing seemed to halt pyemia, and it had a mortality rate of over 90%. Other surgical diseases included tetanus (with a mortality rate of 87%); erysipelas, which struck John B. Gordon's arm after he was wounded at Antietam; and osteomyelitis, an inflammation of the bone. There was also something called "Hospital Gangrene". A black spot, about the size of a dime, would appear on the wound. Before long, it would spread, leaving the wound an evil-smelling, awful mess. The hospital gangrene of the Civil War is now an extinct disease.

Primary amputation mortality rate: 28%
Secondary amputation mortality rate: 52%

Image: Confederate soldiers killed near the Wheatfield at Gettysburg (Library of Congress)

USCivilWar.Net wants to thank Jenny Goellnitz for compiling this information.
jgoellnitz@yahoo.com

The Imponderable ‘What-Ifs’: Did the Medical Issues of Three Confederate Generals Cause the South to Lose the War?

By Kevin R. Loughlin, MD, MBA

During the darkest days of World War II, Winston Churchill was credited with saying, "The imponderable 'what-ifs' accumulate." Throughout history, imponderable what-ifs have provoked the observer to consider how historical outcomes might have turned out differently. So it is with the Civil War. It can be reasonably argued that the deaths of Albert Sidney Johnston and Stonewall Jackson and the illness of Robert E. Lee at Gettysburg irrevocably altered the course of the conflict and significantly contributed to the ultimate outcome.

At the outset of the Civil War both the North and South had strategic plans. The estimable Winfield Scott proposed the “Anaconda Plan” for the North, which would start by blockading the seacoast, then drive down the Mississippi to weaken the Confederacy to the west, and conclude with an attack through the heart of the Confederacy.

The South also had plans. Jefferson Davis favored a defensive strategy. He postulated that a war of attrition would frustrate the better-equipped Union and ultimately result in an armistice with the South and recognition of its secession. An alternative Confederate strategy proposed by P.G.T. Beauregard was to attack in the western theatre and to control the river routes, first the Tennessee and Cumberland rivers and, if successful, then the Mississippi. On the eastern front, the Confederates would aim to capture Washington, D.C. for both strategic and psychological purposes and then move northward into Maryland and Pennsylvania.

Albert Sidney Johnston graduated from West Point in 1826. He gained his early experience in the Texas War of Independence and the Mexican-American war. At the start of the Civil War, he was considered among the most gifted of the young generals, and Jefferson Davis placed him in charge of the Western Campaign.

Before the battles of Fort Henry and Fort Donelson, Johnston had urged that the forts be fortified and additional troops be assigned to their defense. However, his pleas were ignored, and Grant captured both forts in February of 1862, which delivered control of the Tennessee and Cumberland rivers to the Union. The control of these rivers emboldened the Union to move toward Shiloh to complete the larger goal of dominance of the western theatre.

The armies led by Johnston and Grant collided at Shiloh, Tennessee, on April 6, 1862. In retrospect, this battle, more than either New Orleans or Vicksburg, ultimately guaranteed Union control of the western theatre and the Mississippi.

Albert Sidney Johnston had repeatedly demonstrated personal valor in battle, and his valor was at its zenith at Shiloh. Like many Confederate generals, he personally led the attack. On the first day of battle at the salient, Johnston was hit below the knee, probably by friendly fire, which severed his popliteal artery. At first he did not recognize the gravity of the wound, perhaps because of nerve damage he had suffered to the same region in a duel in 1837. However, the severity of the wound quickly became clear as blood filled his boot. The consequences of the loss of a leader like Johnston did not go unrecognized; his death was a heavy blow to the Confederacy. Jefferson Davis believed that the loss of Johnston "was the turning point of our fate". Davis would further state, "Without doing injustice to the living, it may safely be asserted that our loss is irreparable; and that among the shining host of the great and good who now cluster around the banner of our country, there exists no purer spirit, no more heroic soul, than that of the illustrious man whose death I join you in lamenting." One can only speculate that, had Johnston survived, the South would have prevailed in the western theatre, which would have forced the Union to wage war on two fronts. Such an obligation would have compromised its presence in Virginia, Maryland, and Pennsylvania.

Throughout American military history, few figures have attained the stature of Thomas J. "Stonewall" Jackson. He was born in 1824 in what is now West Virginia. He was poorly prepared academically for West Point and struggled at first, but, demonstrating the resolve, hard work, and determination that would be the signature of his military career, he graduated 17th out of 59 students in the class of 1846. Two of his classmates were George Pickett and George McClellan.

Jackson served in the Mexican War and taught at the Virginia Military Institute. In 1861, when the Civil War broke out, he was put in command of several Virginia Infantry regiments by the governor of Virginia. He distinguished himself in the Battle of First Manassas where he received his nickname. This occurred when General Bernard Bee, in the midst of encouraging his men to fight on, pointed at Jackson and shouted, “Look, men, there is Jackson standing like a stone wall!” He would later be a major force at Second Manassas, Antietam, and Fredericksburg, during which he led the Second Corps of the Army of Northern Virginia under Robert E. Lee. Jackson was known for his audacity and tactical judgment in battle.

He next saw action at Chancellorsville on May 2, 1863. While returning to camp from a personal reconnaissance of the Yankee front line, he was shot by friendly fire, twice in the left arm and once in the right hand. Members of the 18th North Carolina Regiment mistakenly took him for an advance party of the enemy and opened fire. Because the shots were fired at dusk, he did not receive immediate medical attention, and he reportedly was accidentally dropped from his stretcher twice. His left arm was seriously damaged and was amputated the next day by Hunter Holmes McGuire. Jackson lingered and died eight days later, on May 10, 1863. It has been speculated that he died of either pneumonia or pulmonary embolus, and the scenario and time frame would fit either diagnosis, although sepsis has also been considered. Lee was devastated when informed of Jackson's death and said, "I'm bleeding at my heart." Lee considered the loss of Jackson so significant that he restructured his high command. Before his death, Jackson commanded the 2nd Corps of the Army of Northern Virginia and Longstreet commanded the First. After Stonewall's death, however, Lee split the 2nd Corps in two: the 2nd Corps was to be led by Richard S. Ewell, also known as "Old Bald Head", and the new 3rd Corps by Ambrose P. Hill. Longstreet remained leader of the First Corps. Neither Ewell nor Hill had the battlefield boldness or judgment of Jackson, and their shortcomings contributed to the outcome at Gettysburg.

Robert E. Lee occupies a secure position in the pantheon of great modern military leaders. Historians concur that his military genius was evident throughout the war, except at Gettysburg. It has long been a point of historical interest what caused Lee's apparent aberration as a strategist and tactician at Gettysburg, following so closely on the heels of his brilliant campaigns at Fredericksburg and Chancellorsville. Yes, there were factors other than Lee's generalship that contributed to the defeat there. Certainly, the absence of his trusted cavalryman Jeb Stuart until late on July 2, 1863 was a critical element that deprived Lee of essential reconnaissance.

Lee was in apparent good health until March 1863, when, while camped near the Rappahannock River, he developed sharp chest pain radiating to his arms. His physicians diagnosed an "inflammation of the heart sac" and confined him to bed. Given the state of medical knowledge at the time, no other treatment was initiated, and he apparently recovered. However, several historians have posited that heart disease, which was the cause of Lee's death in 1870, played an important role in his performance at Gettysburg. Angina and pericarditis have both been proposed as contributing elements to what has been considered Lee's poor strategy, particularly on July 2nd. Why was Pickett's charge not initiated on July 2nd to reinforce Barksdale's charge, which appeared to be on the verge of breaking through the Union lines, rather than waiting to give Pickett the order for the ill-fated offensive on July 3rd? This question has never been adequately answered. Lee's apparent strategic indecisiveness on the battlefield in Pennsylvania has never been satisfactorily explained, but clearly it was the seminal event that began the South's demise.

What if Johnston had not been killed at Shiloh? What if he had survived and helped carry out Beauregard's proposal to attack the Union's center in Ohio in late 1862? What if he had been available to Lee at Gettysburg? What if Stonewall Jackson had lived to lead the 2nd Corps at Gettysburg and to support Barksdale's charge on July 2, 1863, or Pickett's on July 3rd? What if Lee had not been distracted at Gettysburg by chest pain? What if the Confederates had prevailed at Gettysburg? What if a defeat at Gettysburg had compelled the North to call an armistice and leave Southern secession in place? These questions remain unanswered. History's only reply is silence. The imponderable 'what-ifs' indeed accumulate.

KEVIN R. LOUGHLIN, MD, MBA is a urologic surgeon at Brigham and Women's Hospital in Boston and a professor of surgery at Harvard Medical School. He completed his undergraduate education at Princeton University and received his MD from New York Medical College. He also holds an MBA from Boston University and an MA (Hon) from Harvard. He is a trustee of the American Board of Urology and a member of the board of directors of the American Urological Association and the American Association of Clinical Urologists. He is an omnivorous reader and particularly enjoys military history. He enjoys running, biking, and playing with his dogs, Finn and Lizzie.

Image: Winfield Scott's "Anaconda Plan," 1861. J.B. Elliott. Library of Congress

From: hektoeninternational.org
