Civil War Hospital Ship

The U.S.S. Red Rover, a captured Confederate vessel, was refitted as a hospital ship.

Evolution of Civil War Nursing

The evolution of the nursing profession in America was accelerated by the Civil War.

The Practice of Surgery

Amputations were the most common surgery performed during the Civil War.

Army Medical Museum and Library

Surgeon General William A. Hammond established the Army Medical Museum in 1862. It was the first federal medical research facility.

Civil War Amputation Kit

Many Civil War surgical instruments had handles of bone, wood or ivory. They were never sterilized.

Tuesday, September 24, 2013



The practice of quarantine—the separation of the diseased from the healthy—has been around a long time. As early as the writing of the Old Testament, for instance, rules existed for isolating lepers. It wasn't until the Black Death of the 14th century, however, that Venice established the first formal system of quarantine, requiring ships to lie at anchor for 40 days before landing. ("Quarantine" derives, via the Venetian quarantena, from the Italian quaranta, "forty.")

The Venetian model held sway until the discovery in the late 1800s that germs cause disease, after which health officials began tailoring quarantines with individual microbes in mind. In the mid-20th century, the advent of antibiotics and routine vaccinations made large-scale quarantines a thing of the past, but today bioterrorism and newly emergent diseases like SARS threaten to resurrect the age-old custom, potentially on the scale of entire cities. In this timeline, follow the evolution of quarantine, from Roman times to the present.

New York State's new Quarantine Act calls for a quarantine office run by a health officer who has the power to detain any ship entering the port of New York for as long as he deems necessary. The health officer can also order all cargo to be removed and a ship cleaned and fumigated.

In April the steamer Virginia arrives in New York harbor from Liverpool, its passengers riddled with cholera. Discovering that 35 steerage passengers and two crew have died during the voyage, the city's health officer orders a swift quarantine. This and other strict quarantines undertaken during the ensuing epidemic prove successful in limiting deaths to about 600, a modest number compared to previous outbreaks.

IMAGE: Quarantine Station
The 128th Reg't was stopped at Quarantine Station on the Mississippi River south of New Orleans while on their way to join Banks's army in Louisiana in late 1862. Many men of the regiment took ill aboard the steamer Arago on the journey south and were kept here in quarantine until they recovered. Many, unfortunately, did not survive.

19th Century Midwifery

Call The Midwife – An Historical Perspective

February 14, 2012 by Come Step Back In Time

The hit TV series "Call The Midwife" is a real gem in the BBC's Sunday night viewing schedule, and its popularity is supported by viewing figures approaching nine million. I am not at all surprised that a second series has just been commissioned. The series finale is on Sunday 19th February, BBC 1 at 8.30pm. The series is based on the books by Jennifer Worth (formerly Lee) about her own real-life experiences as a newly qualified midwife in London's East End during the 1950s. I am currently reading Call The Midwife and will then move on to In The Midst of Life. There are two further books, Shadows of the Workhouse and Farewell to the East End. These books are well-written, fascinating, at times heartbreaking and a must-read for anyone with an interest in the history of medicine. Worth decided to write down her memoirs after reading an article, 'Impressions of a Midwife in Literature' by Terri Coates, which appeared in the January 1998 edition of Midwives Journal. The article struck a chord with her: why were midwives, as Coates concluded, virtually non-existent in literature? Worth immediately decided to rectify this, and thus her wonderful books and the subsequent television series were born.

The history of midwifery is a complex subject, and when conducting research for this article I discovered that there is a definite shortage of academic books on this highly skilled branch of nursing.

During the 17th century, City of London midwives had to serve a seven-year apprenticeship before delivering a baby on their own. Historically, midwives have had a strained relationship with physicians, who often viewed their practices with suspicion. Debates surrounding the creation of strict guidelines for the practice of midwifery are often found in contemporary newspapers and medical journals. In a number of the examples that I found, it struck me just how vulnerable the midwife was at the hands of the law if a delivery went tragically wrong, or if the midwife was simply poorly trained in the first place.

In one such example, from an 1845 edition of the Provincial Medical and Surgical Journal, a thirty-five-year-old carpenter's wife had died soon after giving birth, and an inquest was launched into the circumstances surrounding her death. Her medical history indicated that in two previous pregnancies she had suffered from retention of the afterbirth, and in her current pregnancy the scenario had occurred once more. She was attended by a legally qualified practitioner, a licentiate of the Apothecaries' Company since 1822. This was a home birth that took place in a tiny rural English village. Following the woman's death, the body was examined and found to be missing the entire uterus together with several feet of the large intestine, both of which had been forcibly extracted. A verdict of manslaughter was put forward by the coroner and the practitioner committed for trial at the next assizes. This case highlighted, once again, the need for a regulatory body to be established for all practitioners engaged in midwifery procedures. A petition was subsequently put forward to Parliament. The petition read:

‘That your Petitioners, in the pursuit of their professional duties, have frequently witnessed and deplored the evil consequences ensuing from the indiscriminate practice of Midwifery, not only to themselves, but to society in general, for the want of some adequate legal protection or recognised body to test the competency and qualifications of those who practice in that peculiar department of the medical profession, the existing medical candidates for their diploma as to their obstetric knowledge; and your Petitioners are of opinion that the practice of Midwifery has not hitherto received that degree of attention from the Legislature, or protection from the Government, which is commensurate with its importance.’

In the UK today, practising midwives are governed by strict legislation and guidelines set out by the Nursing and Midwifery Council, which was established in 2002.

Historically, the majority of working-class women gave birth at home; if this wasn't possible, they would be admitted to a 'lying-in' hospital. The reason for this was often social rather than medical, and the most common type of lying-in ward would have been found in the workhouse. St Thomas' Hospital in London had lying-in wards for unmarried mothers as early as the fifteenth century, set up through a charitable donation by Richard Whittington. During the eighteenth and nineteenth centuries, however, the number of non-workhouse lying-in hospitals in London was on the increase:

General Lying-In Hospital, York Road, Lambeth.  Originally, opened in 1767 as the Westminster New Lying-In Hospital in Westminster Bridge Road, Lambeth.  Single mothers as well as married women were admitted.  In 1818 it changed its name to the General Lying-In Hospital and moved to York Road, Lambeth in 1828.  The Hospital closed in 1971 but this fine-looking building still exists today.  Florence Nightingale took a particular interest in the Hospital’s midwifery training programme;
Queen Charlotte’s Hospital, Goldhawk Road.  The Hospital opened in 1809, moved to Marylebone Road in 1813 and Goldhawk Road in 1940.  The Hospital admitted both single and married women;
City of London Lying-In Hospital, City Road, Finsbury.  Opened in 1750.  The building was badly bombed in 1940-1. Eventually the Hospital was moved to Hanley Road, Islington and closed in 1983;
British Lying-In Hospital, Endell Street, Holborn.  Opened in 1749 and closed in 1913.  Only married women were admitted;
New General Lying-in Hospital, Oxford Road, near Hanover Square.  Opened in 1767 under the name Queen’s Hospital.  It moved to Store Street near Tottenham Court Road, where patients did include single women.  The Hospital closed in 1800.
These hospitals were predominantly intended for the “wives of poor industrious tradesmen or distressed House-keepers and the wives of soldiers and sailors”.  London teaching hospitals did not admit women for childbirth before the late nineteenth century.  Medical students and staff sometimes delivered women in their own homes.

In the 18th century, male surgeons began to intervene in the delivery process, and a new group of medical men emerged: men-midwives, or 'accoucheurs'. Developments in obstetrical instrument design helped to improve the chances of a successful labour, particularly for those women whose babies lay in abnormal positions in the birth canal. Pain relief options for women were limited and included opium, brandy and, after 1847, chloroform. The pioneer of the use of chloroform in childbirth was Sir James Simpson. Ergot was also given to women to stem the flow of blood.

Florence Nightingale (1820-1910) is an important figure in the history of midwifery and pressed for midwifery as a career for educated women. She established a training school for midwives at King's College Hospital at the end of 1861. A fully equipped maternity ward was set up at the Hospital, and the physician accoucheurs agreed to give six months' training to the midwives. The midwives were trained to work in hospitals and also to deliver women in their own homes. The tuition was provided free, but the students had to pay for their own board and lodging. However, after just two years the scheme suffered a devastating blow: an outbreak of puerperal sepsis in the lying-in wards, following the delivery of a woman suffering from erysipelas. This event forced the training programme to be shut down. Florence Nightingale was devastated, and she immediately launched an investigation into the incident, writing to numerous physicians to seek opinion and advice. She wanted to establish a set of reliable statistics of mortality in childbirth for women who gave birth in the lying-in wards. She soon discovered that this information would be extremely difficult to come by. The medical profession viewed her 'interference' with suspicion, and many would not co-operate with her repeated requests for data. However, Florence was not of the disposition to give up easily and eventually, with the help of Dr John Sutherland (of the Sanitary Commission), was able to publish a slim volume of her findings in 1871, titled Introductory Notes on Lying-in Institutions. She calculated that the death rate for women giving birth in the lying-in institutions was 33.3 per thousand, while the rate for home births was 5.1 per thousand. The conclusion was drawn that death in institutions was due to the prevalence of puerperal fever, an infection caused by insanitary conditions.
Florence advocated smaller hospitals, individual rooms for delivery, scrupulous cleanliness, shorter stays in hospital and banning medical students from attending births immediately after visiting the dissection room, which was common practice at the time. She believed that in taking these measures the huge morbidity figures could be drastically reduced.

Puerperal sepsis, or childbed fever as it is often called, has claimed the lives of many women over the centuries, including a number of famous individuals such as Henry VIII's wives Jane Seymour and Katherine Parr, Mrs Isabella Beeton, and Mary Shelley's mother, Mary Wollstonecraft. It is an iatrogenic disease, spread by the birth attendants themselves, and it remained a common cause of death in childbirth until the early part of the 20th century. Infectious organisms on the hands of the birth attendants are transferred to the woman's uterus. The most common organism is the virulent beta-haemolytic Streptococcus (group A). The disease usually begins on the third day after delivery. Typical symptoms include high temperature, severe headache, raised pulse, severe abdominal pain, vomiting and diarrhoea. Death occurs when the infection spreads, resulting in peritonitis and septicaemia.

Obstetric forceps first appeared in the 17th century, and many of the instruments were named after the obstetrician who invented them. In each set of delivery instruments there would be two or three forceps, often with ebony or ivory handles, together with perforators, cranioclasts and decapitation hooks. When there was no safe way of delivering a live baby, when the delivery was obstructed and the mother's life hung in the balance, gruesome measures were resorted to. The perforators were used to open the baby's skull, the cranioclasts were then brought in to crush it, and finally the hooks were employed to remove the deceased infant in parts. The vectis, essentially a single forceps blade, was a popular delivery instrument during the nineteenth century. It was used to manoeuvre the foetus into the normal position for childbirth; technically, it converted the impassable brow or shoulder presentations to the normal vertex (top of the head) presentation.

Caesarean section is now a much-practised form of delivery in the UK, with nearly 163,000 procedures performed in 2010-11. Caesarean delivery dates back to ancient times, with Egyptian and Roman law sanctioning its use after the mother's death in order to give the infant a chance to survive. In medieval Christian times it was believed by some that infants who survived such a procedure possessed great strength and special powers, although many medieval Christians also viewed the practice with suspicion, as an 'unnatural' birth. In the Renaissance, midwives were brought in to perform post-mortem caesareans. The first recorded successful caesarean on a living woman took place around 1500. In the eighteenth and nineteenth centuries there were even reports of women performing the operation upon themselves. A midwife called Mary Donally was the first to perform a successful caesarean in Ireland, in 1738. The caesarean nonetheless remained a rare procedure until the late nineteenth century. Eduardo Porro (1842-1902) was an Italian obstetrician who pioneered a technique to minimise the risks of haemorrhage and sepsis in caesarean operations by removing the mother's uterus at the same time. During the 1950s, when Jennifer Worth was practising as a midwife, the caesarean section rate was 3%. Today the rate is approximately 25%.
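A quick sanity check on those percentages (illustrative arithmetic only; it assumes the 163,000 procedures and the ~25% rate describe the same 2010-11 population of deliveries):

```python
# Illustrative arithmetic only: assumes the 163,000 caesareans and the
# ~25% rate quoted above describe the same 2010-11 population.
caesareans_2010_11 = 163_000
rate_today = 0.25   # ~25% of deliveries today
rate_1950s = 0.03   # 3% in Jennifer Worth's day

# If 163,000 operations were ~25% of deliveries, the implied total is:
implied_deliveries = caesareans_2010_11 / rate_today
print(f"Implied total deliveries: {implied_deliveries:,.0f}")

# The same number of deliveries at the 1950s rate would have meant
# far fewer operations:
print(f"Caesareans at the 1950s rate: {implied_deliveries * rate_1950s:,.0f}")
```

On these assumptions the figures are mutually consistent: roughly 650,000 deliveries, of which the 1950s rate would have produced only about 20,000 caesareans rather than 163,000.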

In 1928, Sister Mary Laetitia Flieger, R.N., published a report in The American Journal of Nursing on current midwifery practices in the UK. She reported that pregnant women were instructed to take care of their breasts by washing them with hot and cold water, morning and evening, then rubbing the nipples with a rough towel. In the last month, women were told to clean the nipples with soap and water using a soft nail brush. Sister Flieger reported that diet was taken seriously by the British midwives, who suggested that no red meat should be allowed during the last month, and only a little chicken and rabbit, together with plenty of fish, should be consumed. Once labour had commenced, unless it was to be a breech delivery, a forceful enema would be given. She also mentioned that the Central Midwives' Board (the then governing body for UK midwives) recommended that the number of intimate examinations given to the women should be limited in order to avoid the dangers of sepsis. It is interesting to note that by the 1920s the practice of infection prevention was taken seriously; thanks must go to the hard work and research that Florence Nightingale had conducted nearly seventy years previously. Flieger also reported on the delivery style in the UK for a non-breech delivery: the woman should lie on her left side, with an assistant raising her leg to effect delivery. If there was a pendulous abdomen, a breech delivery or a forceps delivery, then the woman was delivered on her back. During the puerperium (lying-in period) the woman's perineum was swabbed about five times a day with a weak solution of iodine or lysol under thoroughly aseptic conditions. She also gives UK childbirth mortality figures for the mother in 1928 as approximately 3,000 deaths in relation to 800,000 babies born.
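Flieger's raw figures can be put on the same per-thousand footing as Nightingale's 1871 statistics quoted earlier; the following small calculation is mine, not Flieger's:

```python
# Convert the 1928 figures quoted above into a per-thousand maternal
# mortality rate, to set alongside Nightingale's 1871 per-thousand figures.
deaths_1928 = 3_000
births_1928 = 800_000

rate_1928 = deaths_1928 / births_1928 * 1_000
print(f"1928 UK maternal mortality: {rate_1928:.2f} per 1,000 births")

# Nightingale's 1871 figures, for comparison:
lying_in_rate = 33.3  # per 1,000 (lying-in institutions)
home_rate = 5.1       # per 1,000 (home births)
print(f"1871 institutional vs home: {lying_in_rate} vs {home_rate} per 1,000")
```

The 1928 rate works out to 3.75 deaths per 1,000 births, below even Nightingale's 1871 home-birth figure, which is consistent with the article's point that infection prevention had taken hold by the 1920s.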

IMAGE: 19th-century obstetric instruments; vectis shown on the left. The Old Operating Theatre Museum, London.


Civil War Bandages


When Governor Randall made an appeal to Wisconsin women to aid in caring for the soldiers, Sauk County women stepped up to provide medical and economic support. Women began meeting at homes to sew and knit garments for the soldiers. Soon, these informal meetings turned into aid societies with officers, rules, committees and regular schedules of work.

These aid societies raised the money needed to purchase materials for their work by gathering subscriptions (a pledge of money) and by entertaining audiences with tableaus (a type of play in which a group of people create still pictures to tell a story).

Some of the items women made and provided to the troops included:
Bandages - There was no absorbent cotton or gauze. Thousands of bandages were handmade by tearing old sheets into strips and rolling them into hard round rolls. Lint was provided for the bandages by scraping old linen or cotton with a dull knife until it became a fluffy mass. Women also assembled charpie, threads pulled from white cloth and bundled into clumps. Both the lint and charpie were placed over wounds, under rolled bandages.

Opium in the Civil War

Under the Influence: Marching Through the Opium Fog 

by James Street, Jr.

By the beginning of the Civil War, there was probably opium in some form in most household medicine cabinets. In The Plantation Mistress, a 1982 study of women's life in the antebellum South, author Catherine Clinton writes that she found home remedies, all containing opium, for many common illnesses. She observes, "Laudanum was commonly used throughout the antebellum era, prescribed with unfortunate frequency for 'female complaints'... contrary to the 20th Century image... the late 19th Century profile indicates that addicts were disproportionately upper-class, Southern, white and female. The women of the Jefferson Davis family, treated by a Dr. liberal in his dosages, became dangerously addicted." Most people using opiates, however, did not become addicted.

Confederate society figure Mary Chesnut, writing in her diary in Richmond, Virginia, during July 1861, told of her refusal to take laudanum, a tincture of opium mixed with alcohol and water. "I have no intention of drugging myself now," she asserted. "My head is addled enough as it stands, and my heart beats to jump out of my body at every sound." Later, in March 1865, Mrs. Chesnut was a refugee in Lincolnton, N.C. She was accidentally given an overdose of Dover's powder, a mixture of opium and ipecac, and slept for two days and nights. After her doctor remarked that she was hard to kill, Mrs. Chesnut speculated, "Maybe I was saved by the adulteration so often complained of in Confederate medicine."

When called to the colors, whether Union or Confederate, doctors who used opiates liberally on civilian patients continued to use them liberally on their military patients. William H. Taylor was an assistant surgeon in the Confederate Army of Northern Virginia, an organization known for its rapid marches. After the war he wrote that he had simplified sick call on the march to one basic question: "How are your bowels?" If they were open, he administered a plug of opium; if they were shut, a plug of blue mass (an unstable mercury compound). A Federal surgeon devised an even speedier sick-call method: he performed diagnosis from horseback, dispensing morphine powder by pouring it into his hand and letting the patient lick it.

Morphine, injected by the recently developed hypodermic syringe, was the preferred form of opium for treating the wounded. And though syringes were scarce, even in the better-equipped Federal armies, 29,828 ounces of morphine sulphate were dispensed to Union soldiers. That figure seems almost trifling compared to the nearly 10 million opium pills and 2,841,000 ounces of other opiates administered by Federal medical authorities by 1865.

While not as ubiquitous in the Confederate army, opium was in reasonable supply until the very end of the war, thanks to captured medical stores and imports smuggled through the naval blockade of the southern ports. Though opiates were used profusely in the treatment of illnesses, it was in relieving the pain of wounds and surgery that they were most effective. The desire for that relief caused many injured soldiers to become opiate addicts, for pain lingered long after medical treatment in those days. And after the war it was easy to find veterans who suffered agony from war wounds or war-related illnesses for the rest of their lives.

In his book Dark Paradise: Opiate Addiction in America Before 1940, David Courtwright quotes from an 1868 study titled The Opium Habit, with Suggestions as to the Remedy: "Maimed and shattered survivors from a hundred battlefields, diseased and disabled soldiers released from hostile prisons, anguished and hopeless wives and mothers, made so by the slaughter of those dearest to them, have found, many of them, temporary relief from their sufferings in opium."

Just as the Civil War army surgeon was a ready source of alcohol for non-medicinal purposes, so was he or his staff a handy source of opiates--not just for sneaky scroungers but for high-ranking officers too. If a general wanted opium pills, what surgeon would deny him the relief he sought, when the surgeon probably prescribed them in the first place? A surgeon had only to turn to his medicine chest to satisfy such a request, or the officer could help himself from the open stock usually arrayed on shelves in the unit's medical quarters.

Dr. Charles Beneulyn Johnson, a Union regimental medical steward, described the contents of the medical chests. "During a campaign our stocks of medicines were necessarily limited to standard remedies," he recalled, "among which could be named opium, morphine, Dover's powder, quinine, rhubarb, Rochelle salts, Epsom salts, castor oil, sugar of lead, tannin, sulphate of copper, sulphate of zinc, camphor, tincture of iron, tincture of opium, camphorated tincture of opium, syrup of squills, simple syrup, alcohol, whiskey, brandy, port wine, sherry wine, etc. Upon going into camp, where we were likely to remain a few days, these articles were unpacked and put on temporary shelves made from box lids; on the other hand, when marching orders came, the medicines were again packed in boxes, the bottles protected by old papers, etc." Johnson continued, "Practically all the medicines were in powder form or in the liquid state. Tablets had not yet come into use and pills were very far from being as plentiful as they are today..." The doctor noted that one of the very few pills carried in stock "was composed of two grains of camphor and one of opium. Asafetida, valerian and opium and its derivatives (sic) were about all (we) had to relieve nervousness and induce sleep."

Among the aphorisms attributed to that barely literate but extraordinarily effective Confederate, Lieutenant General Nathan Bedford Forrest, is one that states, "War means fighting and fighting means killing." Not all Civil War generals could muster so direct an approach to war and violence. Many preferred to try any means of defeating an enemy except fighting. These were usually the same generals who could not overcome their troops' natural desire to remain where they were, so long as they were safe. Historian T. Harry Williams called this phenomenon the "inertia of war," that moment when "the general's own army begins to offer resistance... when the whole inertia of the war comes to rest on his will, and only the spark of his own purpose and spirit can throw it off... a commander has to have in his make-up a mental strength and moral power that enables him to dominate whatever event or crisis may emerge on the field of battle."

But were the war's inert generals fundamentally flawed leaders, or was there another reason for their lapses into feebleness? When cataloging the attributes of a successful general, Marshal Maurice de Saxe, France's great military mind of the early 18th Century, presented the usual list, including bravery, intelligence and so on; then he added one more: health. It is doubtful that the outcome of the Civil War would have been any different if all the generals had been healthy. But the fact is they were not, and perhaps much of their erratic and lethargic behavior can be ascribed to their frail state of health--and to opium, the panacea their doctors prescribed at every turn.

Braxton Bragg's health should have excluded him from any consideration for a field command. By 1861, when the first shot of the war was fired at Fort Sumter, Bragg had developed a long list of chronic ailments, including malaria, dyspepsia (deranged or impaired digestion) and boils. His wife and friends were aware that the greater the pressure on him, the more he complained and the more likely he was to develop boils, headaches and other painful maladies. His behavior as commander of the Confederate Army of Tennessee was as mystifying to his contemporaries as it is to current scholars. And Bragg's penchant for turning away from victory, quitting battle when he had the upper hand, was the basis for a story that when he died he went to heaven: as he approached the Pearly Gates, they opened; then Bragg retreated.

Some critics and historians offer stupidity, incompetence or cowardice as reasons for Bragg's failures. But his blunders may have resulted from his health and the rudimentary, even primitive, level of medicine prevalent during the war. Bragg's behavior showed signs of opiate use. In the field, he appeared to withdraw as battle developed, to lose track of where he was. He became unable to adapt his plans to changing situations on the battlefield. But Bragg certainly was not stupid, as evidenced by the swiftness of his September 1862 movement from Tennessee into Kentucky to wrest the Bluegrass State from Union Major General Don Carlos Buell. Nor was he a coward, as his record during the Mexican War and at the Battle of Shiloh demonstrated. But as he was promoted to higher command, Bragg became more distant from the troops, appearing to avoid active command during battle. His behavior could have been the result of a combination of poor health and the use of opiates. Bragg may very well have believed the unfounded contents of his Dec. 31, 1862 telegram to President Davis--that he had won victory after the first day of the Battle of Stones River. Opium-induced euphoria could have led him to believe what he so desperately wanted to be true. It could also have led Bragg to back away from Buell's troops after capturing the entire Federal garrison at Munfordville, Kentucky, in Sept. 1862 and seizing Frankfort, the state capital. That same euphoria may have prompted his dispatch to Richmond before the Oct. 1862 Battle of Perryville, Kentucky, claiming his army had joined with Major General Edmund Kirby Smith's when Smith's force was actually more than 100 miles away. Bragg's skewed visions of success, and his paranoia towards his officers after each defeat, could have been the result of his medical care.

The gallant John Bell Hood, aggressive, vigorous and effective while with the Army of Northern Virginia, became a victim of delusions after a series of shattering wounds struck him. He left his finest attributes and his common sense on the surgeon's table. The pain from the stump of his right leg must have been horrendous when he rode strapped to his saddle. The bouncing and jolting, the abrasive rubbing of the stump against the rough cloth of a dressing or pad, could not have been endured without some sort of pain reliever. An opiate was the standard prescription. The drug could have made Hood sleep at Spring Hill while the Federals escaped his trap. The pain was a terrible burden to inflict on Hood, but it was even worse to inflict Hood on the Army of Tennessee.

Union Major General Joseph "Fighting Joe" Hooker's affinity for spirituous liquors and spirited women was a matter of record by the time he led the Army of the Potomac to battle at Chancellorsville, Virginia, in May 1863. If Hooker was truly an alcoholic, and if he kept his pledge not to drink while commanding the army, it is highly likely he was treated with opiates to help him through withdrawal (opiates were commonly used to treat delirium tremens). This medical scenario may account for his poor battlefield performance. Or there may have been another explanation. Hooker's plans for the Battle of Chancellorsville were excellent; it was his leadership that faltered as he became more and more lethargic. The general admitted this much himself. Then, on May 3, Hooker claimed that, while he stood on the porch of a house, he was hit on the head by a column knocked loose by a cannon shot, and that he was in great pain. The Medical Director of the Army of the Potomac, Dr. Jonathan Letterman, later substantiated Hooker's claim, but failed to mention the extent of the injury, the amount of pain, or whether any alcohol or morphine was administered. But Hooker's behavior the rest of that day indicates he may have received an intoxicating prescription, for he abandoned control of his army to sleep in his tent. Opium, in smaller doses than whiskey, is an effective soporific.

The three generals mentioned here were not the only ones to experience radical behavior changes during battle, changes that may indicate the use of opium or alcohol. Bragg, Hood and Hooker were merely the highest-ranking examples. Opiates may have contributed to the timidity of Confederate Lieutenant General Richard Stoddert Ewell, who in July 1863 dashed boldly into Gettysburg, strapped to his horse and minus a leg, and chased the Federals out of town, but then sank into inertia. And what of Bragg's opponent, Buell, who after an accident with his horse sat out the Oct. 1862 Battle of Perryville from behind his lines? There are others, too. This is not to imply that all the Civil War's military leaders were alcoholics or drug addicts.

Ulysses S. Grant was certainly known as a two-fisted drinker, but liquor did not keep him from forging victories. Perhaps a more important observation about Grant is that he was never forced by poor health to rely on the services of a surgeon. That fact alone may have been a blessing for the Union. Due to the state of medical arts and sciences during the Civil War, some officers, tenuously steadied with alcohol or opiates, managed to hold positions of great responsibility even though they were unfit for any military service.

Others were retained after suffering debilitating wounds or illnesses, when they should have been discharged or assigned non-combat roles. But instead, history was sometimes made by men who saw their battlefields through the cloud of intoxication.

From: "Civil War Times" May 1988

Thoughts of Thanksgiving Dinner

NOVEMBER 21, 1863
By Robert E. Denney

After a couple of days of moving troops in the mud around Chattanooga, Sherman was again on the move, crossing the Tennessee at Brown's Ferry and heading northeast for the Confederate right flank around Missionary Ridge. Sherman was to attack the north end of the ridge, Thomas the center. Hooker was to attack the Confederate left flank. There were delays, even more than usual, because of the heavy rains, and the roads were quagmires.

Perry, John G., Asst. Surg., USV, 20th Mass. Vols., near Mountain Run, Va.:

We are perfectly deluged with rain, and my tent, raised on logs, has a deep pool of water around it . . . Next Thursday will be Thanksgiving Day. How I wish our men could have something extra to eat, poor fellows! They have had potatoes only about a dozen times since last June, and are becoming badly run-down. We have received from one of our officers now at the North a quantity of raisins, flour, pickles, etc., for Thanksgiving dinner, and we also have permission to send to Washington for more supplies. We officers do not need these extras, as our pay enables us to buy pickles and such things to prevent scurvy, but the men have not the money for luxuries.

From: "Civil War Medicine: Care & Comfort of the Wounded"

"Haunted Minds: The Impact of Combat Exposure on the Mental and Physical Health of Civil War Veterans" and "Years of Suffering and Change: Modern Perspectives on Civil War Medicine"

Book Reviews

By Captain Rea Andrew Redd

Haunted Minds: The Impact of Combat Exposure on the Mental and Physical Health of Civil War Veterans, Judith Andersen [143-158 with notes] and Years of Suffering and Change: Modern Perspectives on Civil War Medicine, James M. Schmidt and Guy R. Hasegawa (editors), Edinborough Press, 2009, $29.95 (hardcover), $13.95 (paper).

Judith Andersen introduces her discussion of post-traumatic stress in Civil War combat veterans with the disintegration of one veteran's marriage and family. Frank Lang, a native German who served as an infantryman and hospital attendant, waded through suffering and death with Company K, 7th Michigan, in the Army of the Potomac from 1861 to 1865. In 1871 his wife went before a Minnesota judge and testified to the emotional and physical abuse, neglect, and abandonment that she and her children had suffered at his hands. After the divorce was granted, Frank Lang disappeared.

Andersen states that from ancient to modern times, whenever humans suffer an extended period of exhaustion, hunger, hard campaigning, and combat, all at the mercy of nature's elements, the likely outcome is psychiatric disorder. Showing no visible wounds but carrying memories that would leave him cross, morose, reclusive, violent, and quite willing to bring about the destitution of his wife and family, Frank Lang had changed during the course of the war. The family's neighbors confirmed the wife's testimony to the judge.

Physicians in the Civil War noted stress reactions that included partial paralysis, muteness, and intermittent screaming and weeping. Nightmares, flashbacks, anxiety, agitation, irritability, nervousness, and unexplained headaches and nausea with no apparent organic origin stymied doctors' understanding. Often the diagnosis was hysteria or nostalgia. Paranoia, psychosis, hallucinations, illusions, insomnia, confusion, memory problems, delusions, and spontaneous violence were classified as nervous disorders without known cures. 'Irritable heart' and 'cardiac muscular exhaustion' were official diagnoses. Malingering itself was considered an official medical condition during this era.

In 1862, to settle pension issues, the government had to distinguish the malingering soldier from the soldier whose organic disease produced symptoms that resembled malingering. To avoid paying pensions to soldiers feigning illness, pension boards set procedures for investigating claims. A government examiner's first criterion was a record of physical wounding. If no wound was present, affidavits from clergy, relatives, and neighbors had to be submitted recounting the soldier's behavior before and after military service. If a significant change was found, the possibility of a pension was considered. Examining physicians were named to interview the veteran objectively, and multiple examinations were sometimes needed before the examining board of doctors could reach a consensus.

The heritage of Civil War medicine is the reliance on objective evidence for a diagnosis of a nervous disease and the development of a process to collect medical records. The chief advances as they relate to the diagnosis of mental illness were few but essential for the growth of psychiatric medicine.


Dermatology and Skin Disease in the American Civil War

Excerpted from

The American Civil War (1861-1865) took place at an interesting moment in the history of medicine and nursing. Only 5 years before, in the mid-1850s, the well-publicized medical disaster affecting British and French troops in the Crimean War had catalyzed a number of fundamental changes in military and civilian medical practice. Most visible were two innovations championed by the British reformer Florence Nightingale: (a) the introduction of professional, female nurses in the military hospitals and (b) an emphasis on sanitation in hospitals and in military camps. Nightingale's efforts culminated in the formation of the Royal Sanitary Commission, a civilian organization with broad authority over military hospitals. The Royal Sanitary Commission would later serve as the model for the United States Sanitary Commission during the American Civil War.

Much later, from the 1870s on, American medicine and nursing progressed rapidly, as advances in microbiology, aseptic surgery, histopathology, epidemiology, medical education, and professional nursing training--all directly influenced by American wartime medical experiences--propelled the United States into the unaccustomed position of medical world leader.

But in between the Crimean War and the medical boom of the latter third of the 19th century came the Civil War, and it is especially interesting to look back to the 1861-1865 period to find the early traces of our shared history of progress in medicine and in nursing in America. It is not an exaggeration to say that history still resonates in our modern practices. (Some examples: the helmet-like gauze compression dressing still used to bandage large scalp incisions is known as a Nightingale dressing, the American Red Cross is the direct descendant of the U.S. Sanitary Commission, and scabies and lice are treated with pyrethroids first used in the Civil War.)

There was a great deal of skin disease in America during the Civil War, the result of poor hygiene and sanitation, lack of knowledge of microbiology, displacement of civilian populations, and movement of the armies themselves, all of which allowed the spread of infectious diseases and infestations in particular. Table 1 lists the most common skin diseases reported in Federal (Union) soldiers during the war. Table 2 provides a smaller-scale look at skin diseases in a pair of Confederate hospitals in Atlanta. As can be seen in both tables, infections and infestations composed the majority of skin diseases.

It should be mentioned that these patients were not diagnosed and treated by dermatologists. Rather, their doctors were the army surgeons who, despite having little specialized knowledge of skin diseases, generally did the best they could for their patients. There were in fact only about one dozen dermatologists in America at the time of the Civil War, all in the North. There were no dermatologists in the states of the Confederacy, nor were there any military dermatologists in either army. Most of the treatment for skin diseases, such as it was, was administered by the nurses who labored in the army hospitals.


Prevalence of Major Eye Diseases Among US Civil War Veterans, 1890–1910

From: National Institutes of Health
By: Frank A. Sloan, PhD, Daniel W. Belsky, BA, and Idrissa A. Boly, MA

To estimate the prevalence of major eye diseases and low vision or blindness in a national sample of male US Union Army veterans from 1890 to 1910 and to compare these prevalence rates with contemporary rates for the same diseases and visual status.

Longitudinal histories of 16 022 white Union Army veterans receiving disability pensions from 1890 to 1910 were developed from pension board examination records. Prevalence rates of trachoma, corneal opacities, cataract, diseases of the retina and optic nerve, and low vision or blindness were calculated in 1895 and 1910. Changes in prevalence by age were examined.

By 1910, 11.9% of veterans had low vision or were blind in both eyes. Prevalence of cataract increased with age, resulting in 13.1% of veterans having had cataract in one or both eyes. Rates of trachoma were 3.2% in 1895 and 4.8% in 1910. Rates of corneal opacity were 3.0% and 5.1%, respectively. Glaucoma was rarely diagnosed from 1890 to 1910, but diseases of the optic nerve were reported in 2.0% of veterans in 1895 and 3.6% in 1910.

This study documents substantial reductions in the prevalence of low vision or blindness and changes in the composition of eye diseases from an era in which there were few effective therapies for eye diseases to the present.

The 20th century experienced great changes in the treatment of major eye diseases. However, longevity also increased substantially. The US population aged older than 65 years increased from 4.1% in 1900 to 12.4% in 2000,1 and eye disease prevalence often increases with age. Thus, though innovations in treatment should have decreased prevalence, increases in population age may have, at least in part, offset these decreases. Nationally representative historical data documenting trends in prevalence of major eye diseases and low vision and blindness across several decades have been lacking.

Recently, data from a federal program that began in 1862 and paid pensions to Union Army veterans of the US Civil War (1861–1865) became available in machine-readable form. To obtain a pension, veterans had to be examined by a 3-physician panel, which determined whether their illnesses or injuries qualified for compensation. Prior to 1890, only service-connected disabilities were compensated. In 1890, the program was amended to include compensation for non–service-related conditions, which led to a major increase in veterans having examinations to obtain pensions. In 1907, the program was further amended to include old age as an eligibility criterion. Although old age was not recognized by statute as a basis for receiving a pension until 1907, a minimum pension was granted to all those aged 65 years or older from 1890 to 1907, unless the veteran was unusually vigorous.2

This was the first major national pension program in the United States to which a large portion of federal expenditures was allocated in the late 19th century.3 The program covered 85% of Union Army veterans who were alive in 1900 and more than 90% of veterans who were alive in 1910.4,5 Only Union Army veterans were eligible.

The data were produced by the Center for Population Economics at the University of Chicago. A 1-stage cluster sample of 331 Union Army companies was randomly selected by the Center for Population Economics from more than 20 000 company records stored at the National Archives in Washington, DC. These companies yielded a sample of 39 616 veterans. The following 3 public data files were used: surgeons’ certificates,6 military pension and medical records,7 and census records.8

The surgeons’ certificates data consist of medical records used by the US Bureau of Pensions to evaluate pension applications. Each record contains physical examination findings. Veterans could apply for a pension more than once (they could claim >1 disability at each application); therefore there were multiple medical records for some veterans. Surgeons’ certificates data were classified by the Center for Population Economics into 21 (primarily) organ system–based health screenings.

The total surgeons’ certificates sample includes 87 224 examinations of 17 721 veterans. The veterans sample was reduced to 16 022 because of missing information on birth or death dates. Once a veteran’s condition was diagnosed, we assumed that the veteran had that condition until death. In any year, the number of veterans in the sample was well below 16 022, because either the veteran had not yet been examined or had died by then. Our analysis focused on 1895 and 1910. The year 1895 was long enough after the statutory change of 1890 to allow many veterans with non–service-related disabilities to be examined and added to the pension rolls. By 1910, 3 years had passed since the statutory change of 1907, an eligibility expansion that classified old age (≥62 years) as a disability. Also, by 1910, most surviving veterans were aged 65 years or older. The death rate was high enough that sample sizes were too small to calculate reliable estimates of prevalence a decade or a decade and a half later.
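The longitudinal bookkeeping described above can be sketched in a few lines. This is a minimal illustration, not the study's actual code; the miniature records and field names are invented, but the logic follows the stated assumptions: a veteran carries a diagnosis from first examination until death, and a given year's prevalence is computed only over veterans still alive then.

```python
# Hypothetical miniature records: year of first diagnosis (None if never
# diagnosed) and year of death. Field names are illustrative assumptions.
veterans = [
    {"first_dx": 1893, "death": 1904},   # diagnosed before 1895, dead by 1910
    {"first_dx": 1901, "death": 1920},   # not yet diagnosed in 1895
    {"first_dx": None, "death": 1915},   # examined but never diagnosed
    {"first_dx": 1894, "death": 1912},   # counted in both study years
]

def prevalence(records, year):
    """Share of veterans alive in `year` who carry the diagnosis by then."""
    alive = [v for v in records if v["death"] >= year]
    with_dx = [v for v in alive
               if v["first_dx"] is not None and v["first_dx"] <= year]
    return len(with_dx) / len(alive) if alive else 0.0

print(prevalence(veterans, 1895))   # 2 of 4 living veterans diagnosed
print(prevalence(veterans, 1910))   # 2 of 3 living veterans diagnosed
```

Note how the "diagnosis persists until death" assumption makes prevalence a ratchet for each surviving veteran, which is one reason reported prevalence rises between 1895 and 1910 even without new disease.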

We identified veterans with a visual disability who had diagnosed amblyopia or blindness in both eyes. We examined 5 categories of eye disease: trachoma, corneal opacity, cataract, disease of the retina, and disease of the optic nerve. Trachoma, corneal opacity, and cataract were all easily identified then by basic visual inspection. All were well-known causes of blindness and were considered when determining cause of impairment. Diseases of the retina and of the optic nerve, rather than being specific diagnoses, refer to findings from an examination with an ophthalmoscope. Ophthalmoscopes were reasonably well distributed by the 1890s.

Trachoma was identified by searching responses for trachoma within a category dealing with the conjunctiva. We identified cases of corneal opacity by searching for items pertaining to the cornea. Cataract was identified from variables for cataract and cataract extraction and was indicated as specific to the left, right, or both eyes. Diseases of the retina and optic nerve were identified by codes for infection or inflammation of the eye.
Few cases were explicitly classified as glaucoma, but glaucoma was also plausibly included in a separate category, diseases of the optic nerve. We included both in the category disease of the optic nerve.

The sample increased by 50% between 1890 and 1895 (Table 1), reflecting the increase in veterans obtaining examinations after the statutory change in pension law in 1890. In 1895, of the 12 144 veterans in the sample, 84.8% were aged younger than 65 years. By 1910, of the 7782 remaining veterans in the sample, only 17.8% were aged younger than 65 years. Most veterans were aged 65 through 74 years in 1910 (66.2%).

Table 1
Age Distribution of a Sample of US Civil War Veterans for 1895 and 1910a
Cataract was by far the most common of the study diseases, with prevalence ranging from 4.5% for those younger than 55 years to 15.6% for those aged 75 years and older in 1895 (Table 2). In 1910, prevalence of cataract among those aged 75 years and older had risen to 17.1%. For those aged 65 through 74 years, prevalence of cataract was 8.4% in 1895 and 13.0% in 1910. Corneal opacity affected 3.8% and 4.8% of this age group in 1895 and 1910, respectively. Prevalence of trachoma was similar; that for diseases of the retina and optic nerve was much lower, ranging from just under 1% to 2% in 1895.

Table 2
Prevalence of Major Eye Disease Among US Civil War Veterans in 1895 and 1910 by Age
Prevalence of diseases of the retina increased between 1895 and 1910, even on an age-adjusted basis, possibly reflecting better detection in the latter year. However, in data for either year, rates of documented retinal disease did not increase with age, as is now typical in elderly populations. Prevalence of disease of the optic nerve did not change appreciably between 1895 and 1910 on an age-adjusted basis, and the patterns of prevalence rates with respect to age are irregular in both years.
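The "age-adjusted basis" mentioned above refers to direct age standardization: weighting each age stratum's crude rate by a fixed standard population so that the 1895 and 1910 figures can be compared without the confounding effect of the sample growing older. A minimal sketch, with illustrative rates and weights that are not taken from the paper:

```python
# Direct age standardization: apply a common standard population's age
# weights to each year's stratum-specific rates. All numbers are invented
# for illustration, not drawn from the Union Army data.
def age_adjusted_rate(stratum_rates, standard_weights):
    """Weighted average of crude rates using a fixed standard population."""
    total = sum(standard_weights.values())
    return sum(stratum_rates[a] * standard_weights[a]
               for a in stratum_rates) / total

# Crude prevalence by age group (illustrative).
rates_1895 = {"<65": 0.008, "65-74": 0.015, "75+": 0.020}
rates_1910 = {"<65": 0.012, "65-74": 0.022, "75+": 0.030}

# One shared standard population removes the effect of the sample aging.
standard = {"<65": 50, "65-74": 35, "75+": 15}

print(age_adjusted_rate(rates_1895, standard))
print(age_adjusted_rate(rates_1910, standard))
```

If the 1910 adjusted rate still exceeds the 1895 one, as in this toy example, the increase cannot be explained by age composition alone, which is the inference the paragraph above draws (attributing it instead to better detection).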

In 1895 and 1910, respectively, 7.1% and 11.9% of the white male veteran population had low vision or blindness. The prevalence of low vision/blindness increased substantially with age. In 1895, 6.0% of veterans younger than 55 years and 11.0% aged 75 years or older had this diagnosis. In 1910, 14.1% of veterans aged 75 years or older were recorded as having low vision or being blind in both eyes.

Of those veterans with low vision or blindness, most had diagnosed cataract (Table 3). Corneal opacity and trachoma were present in one-quarter or more of these individuals. Diseases of the retina and optic nerve were documented in 13% to 15% and 8% to 10% of these cases, respectively.

Table 3
Prevalence of Major Eye Diseases Among US Civil War Veterans With Low Vision/Blindness

Several major eye diseases and low vision/blindness were highly prevalent at the turn of the 20th century. Prevalence of some major eye diseases, especially cataract, increased substantially with age.

Owing to increased longevity and improved medical knowledge and diagnostic techniques, reported prevalence increased for many eye diseases. Current rates of cataract surgery, a reasonable proxy for cataract prevalence in high-income countries, are well above those for cataract and cataract surgery combined in the Union Army data,8–12 especially for populations aged 70 years or older. Data from populations aged 65 years or older in the 1990s indicate a prevalence rate of about 5% for age-related macular degeneration,13,14 about 7% for diabetic retinopathy,14 and close to 8% for glaucoma.14 The combined rates of 5% and 7% for the retinal diseases are far above the corresponding rates in the Union Army data (considering joint prevalence of both diseases), even allowing for a somewhat higher mean age of the more recent population. Diabetes prevalence among white veterans in 1895 was about 2% and was about 4% in 1910.15 Judging from recent data, far fewer than half of veterans with diagnosed diabetes would have had retinal disease (<1% in 1895 and <2% in 1910).14 Urinalysis was used during this period to diagnose diabetes.15 Recent prevalence of glaucoma is also much higher than the rates diagnosed in the Union Army veteran population, even accounting for some differences in age between the comparison and Union Army data.

However, trachoma, which was highly prevalent in the United States around 1900, is now virtually nonexistent in the developed world, which is largely a result of better sanitation, improved personal hygiene, and antibiotics.16–19 The disease now exists almost exclusively in low-income countries, where prevalence is similar to that found in our study,19–21 though it was sometimes higher depending on the region.22,23

In contrast to increases in reported prevalence of major eye diseases, visual outcomes in the United States have improved dramatically since about 1900. Data from the World Health Organization from 200224 indicate that blindness prevalence (defined as visual acuity <20/400 OU) among those aged 50 years or older in the United States was 0.4%. The Eye Disease Prevalence Research Group (EDPRG)25 reported a combined prevalence of blindness (defined as best-corrected visual acuity<20/200 OU) and low vision (defined as best-corrected visual acuity<20/40 OU) for those aged 40 years or older of less than 3%. Rates for other high-income countries are similar.26–28

Eye disease prevalence in the Union Army veterans data are more similar to reported prevalence in low-income countries, where sanitation and hygiene may be more like that of the United States in the early 20th century. According to the World Health Organization,22 prevalence of blindness in Africa among individuals aged 50 years or older was 9% in 2002. In Southeast Asia, including Indonesia, Malaysia, the Philippines, and Thailand, prevalence in 2002 was 6.3%. Combined prevalence of blindness and low vision (best-corrected visual acuity<20/60 OU and > 20/200 OU) among adults in Pakistan and Bangladesh is around 10%.29,30 Rates for persons aged 60 to 69 years have been reported at 13% in Malaysia31 and 8% in Nepal.32

Access to affordable, safe, and effective cataract surgery is of primary importance in decreasing prevalence of blindness in the developed world. Cataract is the leading cause of blindness globally, accounting for 47.8% of adult-onset blindness.23 The EDPRG estimated that among white Americans, cataract caused less than 9% of blindness,16 even though the US population is proportionally much older than the world as a whole; 7.4% of the global population is aged 65 years or older33 compared with more than 12% of the US population.1
Causes of low vision or blindness cannot be ascertained from the Union Army data; however, prevalence of eye diseases among those with low vision or blindness provide an approximation. These data assign a major role to cataract as causing low vision or blindness. Even with better technology, contemporary estimates of the causes of the 2 conditions in the United States differ appreciably. According to EDPRG’s meta-analysis, cataract, for example, accounts for 59.9% of low vision and only 8.7% of blindness among white Americans. Rates are 3.3% and 6.4% for glaucoma and 22.9% and 54.4% for age-related macular degeneration among white individuals with low vision and blindness, respectively.23 Age-related macular degeneration’s prominence in the EDPRG meta-analysis relative to that in the Union Army data (even considering that some diabetic retinopathy would have also been included in diseases of the retina) primarily reflects current knowledge compared with that of a century ago as well as the higher fraction of younger elderly in the veterans’ analysis.
Unlike cataract, age-related macular degeneration, and glaucoma, corneal opacities were much more common among those with low vision or blindness in the Union Army sample than in contemporary populations in both the United States and low-income countries. Corneal opacities accounted for 3% of blindness in the United States and 8% to 12% of blindness in Africa in 2002.22 Higher prevalence of trachoma, the most important infectious cause of corneal opacity,34 cannot fully explain this effect. While rates of trachoma in Africa (6%–8%24) are higher than those in our data, the reported effect of corneal opacity on vision is substantially lower. Corneal scarring from war-related trauma is the likeliest explanation for the high rates of corneal opacity among those with visual impairments in the veterans’ sample. This explanation is supported by the decline in the prevalence of corneal opacity among those with serious visual impairment as the sample aged and other eye diseases became more common.

A strength of the veterans data is that they are fairly representative of middle-aged and elderly white men who were alive around 1900. About half of adult white men in the North fought in the Civil War.35 Many men whose poor health precluded fighting in the Civil War, owing to, eg, congenital heart disease, probably did not survive to 1895. The pension program covered 85% of all Union Army veterans by 1900 and more than 90% by 1910.4,5 Because of the relaxation of regulations in 1890 that allowed pensions to be granted to almost all veterans older than 65 years and the formalization of this policy in 1907, there was a substantial incentive for even veterans without disabilities to apply for pensions. Fogel36 conducted analyses comparing the examined soldiers with other men of this period, concluding that the veterans sample was representative of the Northern white male population according to geographic distribution, wealth, and cause of death.

We acknowledge several study limitations. First, our data pertain only to adult white men and exclude men in states not held by the Union Army until the end of the Civil War. Second, methods of diagnosing and classifying disease and depth of understanding of disease processes differed substantially between the beginning and the end of the 20th century. Diagnostic techniques around 1900 were very limited, the field of ophthalmology was comparatively young, and the examining physicians were unlikely to be familiar with any but the most common ocular pathologies. Consequently, misdiagnosis and underdiagnosis were likely to have been more common relative to more recent data. For example, around 1900, the presence of cataracts was typically determined by observing a “white pupil” rather than by methods used more recently. Vision loss at the time of cataract surgery today is comparatively mild. Thus, using contemporary criteria, prevalence of cataract in 1895 and 1910 reported in our study is likely to be understated.

However, visual and functional status was observable. Graeff,37 in his book Das Menschliche Auge published in 1933, urged those rating severe visual impairment and blindness to classify them based on the conditions’ effects on patients’ ability to function in their usual activities.

Classification, terminology, and knowledge of disease processes also differed. For example, the 1898 English edition of Fuchs’ Text-Book of Ophthalmology,38 a widely respected reference at the time, defines amblyopia as weak sight or low vision that cannot be corrected by eyeglasses. Similar definitions are found in Alt’s 1884 A Treatise on Ophthalmology for the General Practitioner39 and Higgens’ 1888 Ophthalmic Practice.40 Complete blindness is referred to in these 3 texts as amaurosis. Our analysis considered a diagnosis of amblyopia in both eyes as functional blindness.

There is good support for supposing that a diagnosis of amblyopia, as the term was used then, was associated with severe functional impairment. Wood,41 writing on the army pension program, stated that “No pension is given for a partial loss of sight or for the partial or complete loss of the field of vision or the muscular functions of the eyes.” Graeff,37 writing more than 3 decades after Fuchs,38 Alt,39 and Higgens,40 characterized amblyopia as a historical scientific term for severe visual impairment. He argued that if the criterion of amaurosis had been strictly applied as a basis for admission, 80% to 90% of all residents of institutions for the blind would have been there inappropriately.

Glaucoma provides an example of the lack of understanding of disease processes in the early 20th century. The term glaucoma appeared in ophthalmologic reference books of the period, and it was one of the listed diagnoses the board of surgeons could apply following their examinations of pension applicants; but the consequences of glaucoma were not understood then. There are 2 chief reasons for the rarity of reported glaucoma: (1) The disease was thought to be very uncommon and (2) the relationship between glaucoma and the optic nerve was not well understood. Fuchs38 wrote in 1898 that glaucoma accounted for less than 1% of all eye diseases. In 1900, Deyl and Sattler’s42 chapter “Diseases of the Optic Nerve” in Norris and Oliver’s System of Diseases of the Eye made no reference to glaucoma, though some connection with the optic disc was mentioned in Smith’s43 chapter on glaucoma in the same volume. Fuchs38 identified excavation of the optic nerve as the cause of blindness in advanced glaucoma but went no further. Both Alt39 and Higgens40 discussed glaucoma and the optic nerve separately. However, by the time Graeff’s37 text was published in 1933, examination of the optic nerve was used to diagnose glaucoma. Although tonometers existed in the late 1800s,44 in our data, there were more findings of disease of the optic nerve than there were of diagnoses labeled glaucoma. Thus, in our analysis, the latter was combined with the former group.

This study documents substantial reductions in the prevalence of low vision/blindness and changes in the composition of eye diseases from an era in which there were few effective therapies for eye diseases to the present. Comparisons of major eye diseases over the course of 100 years reveal substantial improvements in visual function. The appreciable reductions in the prevalence of low vision/blindness reflect such technological changes as innovations in the treatment of cataract and glaucoma and economic growth, which provided funds for developing capacity and financing provision of services. Other improvements, such as a reduction in the prevalence of trachoma, reflect improvements in environmental and public health, which is largely a byproduct of a country’s level of economic development.
The burden of chronic eye disease is substantial45,46 and is reflected by (1) the resources devoted to its diagnosis and treatment and (2) the losses in productivity and quality of life it can cause. Both are important components of the total burden. Around 1900, when therapeutic options were limited, the burden of such disease and disability was largely borne outside the health care system. Nevertheless, particularly as reflected in high rates of low vision/blindness, this burden was substantial and much greater than it is now.

Funding/Support: This study was supported in part by grant 2R37-AG-17473-05A1 from the National Institute on Aging.
Role of the Sponsor: The sponsor had no role in the design or conduct of this study.



1. US Census Bureau. US population by age 1900–2002. [Accessed July 14, 2006];Statistical abstract of the United States; No HS-3.
2. Costa DL. Pensions and retirement evidence from Union Army veterans. Q J Econ. 1995;110(2):297–319.
3. Holcombe RG. Veterans interests and the transition to government growth: 1870–1915. Public Choice. 1999;99(3–4):311–326.
4. Campbell A. The invisible welfare state: establishing the phenomenon of twentieth century veterans’ benefits. J Polit Milit Soc. 2004;32(2):249–267.
5. Costa DL. The Evolution of Retirement: An American Economic History, 1880–1990. Chicago, IL: University of Chicago Press; 1998. p. 160.
6. Fogel RW, et al. Inter-University Consortium for Political and Social Research; No 3417. Chicago, IL: University of Chicago, Center for Population Economics; 1999. [Accessed June 1, 2006]. Aging of Veterans of the Union Army: Surgeons’ Certificates 1862–1940.
7. Fogel RW, et al. Inter-University Consortium for Political and Social Research; No 6837. Chicago, IL: University of Chicago, Center for Population Economics; 2006. [Accessed June 1, 2006]. Aging of Veterans of the Union Army: Military, Pension, and Medical Records, 1820–1940.
8. Fogel RW, et al. Inter-University Consortium for Political and Social Research; No 6836. Chicago, IL: University of Chicago, Center for Population Economics; 2000. [Accessed June 1, 2006]. Aging of Veterans of the Union Army: United States Federal Census Records, 1850, 1860, 1900, 1910.
9. Klein BE, Klein R, Linton KL. Prevalence of age-related lens opacities in a population: the Beaver Dam Eye Study. Ophthalmology. 1992;99(4):546–552.[PubMed]
10. Michon JJ, Lau J, Chan WS, Ellwein LB. Prevalence of visual impairment, blindness, and cataract surgery in the Hong Kong elderly. Br J Ophthalmol. 2002;86(2):133–139. [PMC free article][PubMed]
11. Mitchell P, Cumming RG, Attebo K, Panchapakesan J. Prevalence of cataract in Australia: the Blue Mountains Eye Study. Ophthalmology. 1997;104(4):581–588.[PubMed]
12. Williams A, Sloan FA, Lee PP. Longitudinal rates of cataract surgery. Arch Ophthalmol. 2006;124(9):1308–1314.[PubMed]
13. Holcomb CA, Lin MC. Geographic variation in the prevalence of macular disease among elderly Medicare beneficiaries in Kansas. Am J Public Health. 2005;95(1):75–77. [PMC free article][PubMed]
14. Lee PP, Feldman ZW, Ostermann J, Brown DS, Sloan FA. Longitudinal prevalence of major eye diseases. Arch Ophthalmol. 2003;121(9):1303–1310.[PubMed]
15. Humphreys M, Costanzo P, Haynie KL, et al. Racial disparities in diabetes a century ago: evidence from the pension files of US Civil War veterans. Soc Sci Med. 2007;64(8):1766–1775.[PubMed]
16. Cook J, Frick KD, Baltussen R, et al. Loss of vision and hearing. In: Jamison DT, Breman JG, Measham AR, et al., editors. Disease Control Priorities in Developing Countries. New York, NY: Oxford University Press; 2006.
17. Mabey DC, Solomon AW, Foster A. Trachoma. Lancet. 2003;362(9379):223–229.[PubMed]
18. Prüss A, Mariotti SP. Preventing trachoma through environmental sanitation: a review of the evidence base. Bull World Health Organ. 2000;78(2):258–266. [PMC free article][PubMed]
19. Bejiga A, Alemayehu W. Prevalence of trachoma and its determinants in Dalocha District, Central Ethiopia. Ophthalmic Epidemiol. 2001;8(2–3):119–125.[PubMed]
20. Dolin PJ, Faal H, Johnson GJ, Ajewole J, Mohamed AA, Lee PS. Trachoma in the Gambia. Br J Ophthalmol. 1998;82(8):930–933. [PMC free article][PubMed]
21. Tabbara KF, al-Omar Om. Trachoma in Saudi Arabia. Ophthalmic Epidemiol. 1997;4(3):127–140.[PubMed]
22. Schwab L, Whitfield R, Jr, Ross-Degnan D, Steinkuller P, Smartwood J. The epidemiology of trachoma in rural Kenya: variation in prevalence with lifestyle and environment, Study Survey Group. Ophthalmology. 1995;102(3):475–482.[PubMed]
23. Zerihun N, Mabey D. Blindness and low vision in Jimma Zone, Ethiopia: results of a population-based survey. Ophthalmic Epidemiol. 1997;4(1):19–26.[PubMed]
24. Resnikoff S, Pascolini D, Etya’ale D, Kocur I, Pararajasegaram R, Pokharel GP, Mariotti SP. Global data on visual impairment in the year 2002. Bull World Health Organ. 2004;82(11):844–851. [PMC free article][PubMed]
25. Congdon N, O’Colmain B, Klaver CC. The Eye Disease Prevalence Research Group. Causes and prevalence of visual impairment among adults in the United States. Arch Ophthalmol. 2004;122(4):477–485.[PubMed]
26. Evans JR, Fletcher AE, Wormald RPL, et al. Prevalence of visual impairment in people aged 75 years and older in Britain: results from the MRC trial of assessment and management of older people in the community. Br J Ophthalmol. 2002;86(7):795–800. [PMC free article][PubMed]
27. Buch H, Vinding T, Nielsen NV. Prevalence and causes of visual impairment according to World Health Organization and United States criteria in an aged, urban Scandinavian population: the Copenhagen City Eye Study. Ophthalmology. 2001;108(12):2347–2357.[PubMed]
28. Taylor HR, Keeffe JE, Vu HTV, et al. Vision loss in Australia. Med J Aust. 2005;182 (11):565–568.[PubMed]
29. Dineen BP, Bourne RRA, Ali SM, Noorul Huq DM, Johnson GJ. Prevalence and causes of blindness and visual impairment in Bangladeshi adults: results of the National Blindness and Low Vision Survey of Bangladesh. Br J Ophthalmol. 2003;87(7):820–828. [PMC free article][PubMed]
30. Ahmad K, Khan MD, Qureshi MD, et al. Prevalence and causes of blindness and low vision in a rural setting in Pakistan. Ophthalmic Epidemiol. 2005;12(1):19–23.[PubMed]
31. Zainal M, Ismail SM, Ropilah AR, et al. Prevalence of blindness and low vision in Malaysian population: results from the National Eye Survey 1996. Br J Ophthalmol. 2002;86(9):951–956. [PMC free article][PubMed]
32. Pokharel GP, Regmi G, Shrestha SK, Negrel AD, Ellwein LB. Prevalence of blindness and cataract surgery in Nepal. Br J Ophthalmol. 1998;82(6):600–605. [PMC free article][PubMed]
33. Central Intelligence Agency. World [global population statistics] World Fact-book; [Accessed July 19, 2006]. Web site.
34. Whitcher JP, Srinivasan M, Upadhyay MP. Corneal blindness: a global perspective. Bull World Health Organ. 2001;79(3):214–221. [PMC free article][PubMed]
35. Roper R. Glorious dust: the posthumous masterwork of an influential black historian tells how slavery itself undermined the Confederacy. Am Scholar. 2007 Jan;:89–100.
36. Fogel RW. New sources and new techniques for the study of secular trends in nutritional status, health, mortality, and the process of aging. Hist Methods. 1993;26(1):5–43.
37. Graeff R. Das Menschliche Auge. Weimar, Germany: Panses Verlag; 1933.
38. Fuchs E. In: Text-Book of Ophthalmology. Duane A, translator. New York, NY: D Appleton & Co; 1898. p. 445.
39. Alt AA. A Treatise on Ophthalmology for the General Practitioner. Chicago, IL: JH Chambers & Co; 1884.
40. Higgens C. A Manual of Ophthalmic Practice. Philadelphia, PA: P Blakiston Son & Co; 1888.
41. Wood CA. The American Encyclopedia & Dictionary of Ophthalmology. Vol. 27 Chicago, IL: Cleveland Press; 1921. Vertigine visiva to Zygotes; p. 13618.
42. Deyl J, Sattler R. Diseases of the optic nerve. In: Norris WF, Oliver CA, editors. Local Diseases, Glaucoma, Wounds and Injuries, Operations. Vol. 3 Philadelphia, PA: JP Lippincott Co; 1900. pp. 587–628. Systems of Diseases of the Eye.
43. Smith P. Glaucoma: pathogenesis, symptoms, course, and treatment. In: Norris WF, Oliver CA, editors. Local Diseases, Glaucoma, Wounds and Injuries, Operations. Vol. 3 Philadelphia, PA: JP Lippincott Co; 1900. pp. 629–683. Systems of Diseases of the Eye.
44. Glaucoma. [Accessed March 27, 2007];A Historic Tour of Ophthalmology. Web site.
45. Rein DB, Zhang P, Wirth KE, et al. The economic burden of major adult visual disorders in the United States. Arch Ophthalmol. 2006;124(12):1754–1760.[PubMed]
46. Salm M, Belsky D, Sloan FA. Trends in cost of major eye diseases to Medicare, 1991 to 2000. Am J Ophthalmol. 2006;142(6):976–982.[PubMed]



Apart from amputation skills, the Civil War surgeon developed relatively sophisticated techniques in the use of plaster splints. I came across an article, ‘Plaster Splints in the American Civil War’, published in the December 1943 issue of The British Medical Journal by an author identified only as ‘S.W.’ S.W. had discovered a series of essays, lettered ‘A’ through ‘T’, published between 1862 and 1864 by the United States Sanitary Commission and intended for distribution among Army surgeons. The essays covered a wide range of topics in battlefield medicine, including the techniques for creating plaster splints found in essay ‘T’. Army surgeons were recommended to use the Maisonneuve technique, and S.W.’s article details this procedure:

‘Shave or oil the skin. Make a paper pattern of the area to be covered and cut to it two thicknesses of Canton flannel or old muslin, devising windows if wounds are present; the sides of the flannel should remain about one inch apart when in position. Sprinkle plaster into an equal quantity of water to a creamy consistence. Immerse cloth till thoroughly saturated, lay it on a flat surface and smooth with hand. Apply flannel to limb and put snugly over it a roller bandage. The limb is then held for a few minutes, extension being made if necessary until the plaster sets, when the roller bandage is removed. If it is necessary to delay the “setting” of the plaster this may be achieved by adding a small quantity of carpenters’ glue.’

S.W. goes on to discuss the important role that plaster splints played in improving survival rates of Civil War soldiers:

‘..a patient with a much swollen elbow-joint wounded at the Battle of Cross-Keys. The joint had been entered by a round bullet, which was removed two weeks later, when free incisions around the joint were found to be necessary. At this stage a plaster splint was applied to the anterior surface of the arm and retained by a transverse band above the wrist and another at the middle of the humerus, the arm being flexed. This splint was worn for a month and then renewed. The head of the radius came away and the patient recovered with some degree of motion in the joint. Dr Swan employed the plaster splints in several cases of fracture after the seven days’ fighting before Richmond, during M’Clellan’s campaign, and the patients were comfortably transported to Washington.’

Wisconsin Towns Send Food for the Troops


Between 1861 and 1865, Baraboo and Prairie du Sac each sent 50 boxes of relief supplies to the front lines, including knit garments, blankets, reading material, and bandages.

Women also provided food for the troops. In 1863, scurvy (a disease caused by a lack of vitamin C and characterized by spongy gums, loose teeth, and bruising under the skin and in the mouth, nose, and throat) broke out among the Union troops, and a call went out for each state to provide anti-scorbutics (foods or supplements, such as sources of vitamin C, that counteract scurvy). Below is a list of some of the food sent to the troops by women on the home front:

Blackberry juice
Processed horseradish
Dried apples
Wine
Eggs
Cabbage made into sauerkraut
Potatoes packed into barrels with hot vinegar

By November 1863, Wisconsin stood first among Union states in the number of boxes of anti-scorbutics sent to the Union troops.

Tuesday, September 17, 2013

Uniform of a Medical Cadet of the U.S. Army


For a Medical Cadet:

Frock Coat -- an officer's frock coat, with one row of buttons.

Trousers -- of dark blue cloth, plain, without stripe; a welt of buff cloth down the outer seam.

Buttons -- gilt, convex, with spread eagle and star, a plain border; large size, seven-eighths of an inch in exterior diameter; small size, one-half inch.

Forage Cap -- a forage cap according to pattern.

Shoulder Straps -- a strip of gold lace three inches long, half an inch wide, placed in the middle of a strap of green cloth three and three-quarters inches long by one and one-quarter inches wide.

Medical Cadets


Rarely seen in images, medical cadets performed important services in the Union Army's medical department.

More than 200 young men served the Union cause during the Civil War in a little-known organization, the U.S. Army Medical Cadet Corps. In addition to their helpful work as members of the corps, many veteran cadets continued to serve in the army's medical department. About 40 percent of them went on to become surgeons, assistant surgeons, or contract surgeons with the Federal forces. George H. Bosley was one such medical cadet.

The Medical Cadet Corps was formed by an act of Congress in August 1861. As many as 50 cadets at a time were authorized to serve a one-year term in the army as wound dressers and ambulance attendants under the supervision of medical officers. The rules stated that an applicant had to be between 18 and 22 years of age and must have studied medicine for two years and completed at least one course of lectures at a medical college. Applicants also were required to provide testimonials of their physical fitness and character. In April 1862 the corps was enlarged to 70 cadets.

The actual duties of medical cadets ranged far wider than regulations prescribed. Most cadets served in army general hospitals, where in addition to dressing wounds they assisted in operations and postmortem examinations, administered wards, and examined anatomical specimens. Officially ranked as a non-commissioned officer, a medical cadet was paid thirty dollars per month and provided with quarters, fuel, and transportation. A daily ration was later added to his allowance. His uniform consisted of a junior officer's frock coat with green shoulder straps adorned with a half-inch strip of gold lace, trousers with a narrow buff welt, a plain forage cap, and a non-commissioned officer's belt plate and sword.

Jeff Davis Was Blind in His Left Eye


Davis fought health problems for a good part of his life, including a nearly fatal bout with malaria in 1836. He was seriously ill again in the winter of 1857-1858, and by February he was suffering from a relapse of a chronic inflammation of his left eye. The disease was so bad that a visiting ophthalmologist commented, “I do not see why this eye has not burst.” As a result, most photos of Davis are in right profile, thus hiding his left eye.

The eye disease can be traced back to his first bout with malaria. About a dozen years later, during a relapse, Davis suffered a “severe eye attack” such that, in the words of his wife Varina, he could not “bear a ray of light on either eye.” Documents show that the disease recurred almost annually from that time through the Civil War.

In a severe relapse in 1858, Davis was seen by two famous eye physicians of the time – Drs. Robert Stone and Isaac Hayes. Stone described the condition of Davis’s left eye in detail, including “ulceration of the cornea,” “abscess of the eye,” and “hypopyon” (a collection of pus cells in the aqueous humor). It was Hayes who commented that he couldn't see why Davis’s eye had not already burst. Davis was given treatments of the day, including “quiet” and bandages soaked in herbal remedies; he also underwent eye surgeries in 1859 and 1860.

Davis was unquestionably suffering from another siege of metaherpetic keratoiritis (inflammation of the cornea due to structural damage to the cornea) in 1858. The cold and fever that had gripped him, as well as the intense stress over Kansas, could easily have contributed to the timing of the new assault on his left eye.

Dr. Stone's clinical notes specifically mention ulceration of the cornea. His description indicates a ruptured healing descemetocele (hernia of the cornea) filled with iris tissue and a threatened abscess of the eyeball, as well as a possible hypopyon, an accumulation of pus in the anterior chamber of the eye.

Describing Davis upon his return to the Senate in 1857, a reporter underscored how ravaging his illness had been: "a pale ghastly-looking figure, his eye bandaged with strips of white linen passing over the head, his whole aspect presenting an appearance of feebleness and debility."

As a film covered the left eye, he could see only light and darkness but could no longer distinguish objects. Contemporaries used various terms when they mentioned the eye. A close friend mentioned "clouded"; another observer called it "discolored"; even the word "blind" was used. In photographs taken in 1859 and 1860, Davis did not look directly at the camera. Instead, he presented a profile which emphasized his right side and hid his left side and his damaged eye.

While imprisoned, Mr. Davis referred very kindly, and in terms of admiration, to his former friend and medical attendant, Dr. Thomas Miller of Washington, and also to Dr. Stone of Washington, who had made a specialty of the eye and its diseases. From Stone he had received clearer ideas of the power of vision, and the adaptation of the eye to various distances and degrees of light, than from any other source. Referring to his own loss of sight in one eye from leucoma (a white, opaque scar of the cornea), or an ulceration of the cornea, he said he could discern light with it but could not distinguish objects.

Although Davis never again experienced eye disease that remotely resembled the seriousness of the 1858 attack and the 1859 surgery, these episodes left their marks. He turned to eyeglasses, evidently with some temporary help. As time passed, however, the degenerative ocular process connected with his affliction continued, and in all probability phthisis bulbi (a shrunken, non-functioning eye) set in.

In 2006, Dr. R. W. Hertle, a prominent ophthalmologist at Children's Hospital in Pittsburgh, concluded that Davis suffered from “herpes simplex keratouveitis” (herpes simplex infection of the eye), a condition that remains a major cause of injury to the eye.

Miss Mary Safford: The Angel of Cairo

By Robert E. Denney

In Cairo [IL] that summer [1861] a young woman from Vermont was visiting her brother, the most important banker in town, and she became involved in working at the Cairo hospital when she could.

Miss Mary Safford was a well-educated, beautiful, and romantic young woman who captured the heart of every patient, as well as the hearts of the staff, in the hospital. Lacking the homemaking and organizational skills of [Mary Ann] Bickerdyke, she spent her time consoling the patients, writing letters for them, hanging curtains in the windows, smoothing fevered brows and being an angel.

The patients loved her to distraction, and Mrs. Bickerdyke used her to the fullest in her role. In addition, Mrs. Bickerdyke taught her the less glamorous side of nursing: Safford learned to cook, wash the patients' clothing, wash and iron the sheets, and make beds. The patients and staff called Mrs. Bickerdyke "Mother," and they called Mary Safford "The Angel of Cairo," which much pleased Mother Bickerdyke.

From: "Civil War Medicine: Care & Comfort of the Wounded"

How the Civil War Changed Funeral Practices


Wars are often responsible for medical and scientific advances, and the Civil War drove the need for a new science: an improved way to handle the dead. So many men died, and so many died far from home, that there was a growing need for a way to preserve a body for a decent burial once it arrived home. Families wanted to see their fallen sons once more, and railroads added to the urgency by refusing to carry decaying bodies (identifiable by smell).

Today there is increasing interest in “green funerals” (for those looking for eco-friendly solutions), and about one-third of all Americans who die are cremated, according to the National Funeral Directors Association. However, the traditional funeral, along with embalming of the body that began in the Civil War, is still the most popular choice of how to handle the newly departed.

In the mid-19th century, the French developed a method of arterial embalming, and an American, Dr. Thomas Holmes (1817-1900), who had trained and worked as a coroner’s physician in New York in the 1850s, began experimenting with the embalming methods used by the French.

First Military Fatality Embalmed
The first military fatality of the war, Colonel Elmer Ephraim Ellsworth (1837-1861), had worked for Lincoln in Springfield and later helped with the presidential campaign. Ellsworth died on May 24, 1861, when Union troops entered Alexandria, Virginia. Ellsworth sent his men to take over the railroad station, while he went to remove a Confederate flag from the roof of the Marshall House Hotel and was killed on his way down the stairs.

It was said that Dr. Holmes visited Lincoln and offered to embalm the body of Lincoln’s friend at no charge. Ellsworth lay in state at the White House and then was taken to City Hall in New York City where Union soldiers lined up to pay their respects. Ten days after his death, Ellsworth was buried in his hometown, Mechanicsville, New York.

From an account in the New York Times (May 27, 1861), it seems that embalming was still a developing art: “The remains [of Ellsworth] were encased in a metallic coffin, the lid of which was so arranged that through a glass cover the face and breast could be seen. The body was dressed in the Zouave uniform of Colonel Ellsworth’s corps, but it was generally remarked, did not bear that natural look so often seen in cases of rapid death. The livid paleness of the features contrasted strongly with the ruddy glow of health that always characterized the Colonel during his lifetime. The marked features and the firm expression of the mouth were, however, sufficient to remind the beholder of what once was Colonel Ellsworth….”

As a result of this successful effort to preserve the body, Dr. Holmes was given a commission from the Army Medical Corps to embalm the corpses of dead Union officers in order that they might be sent home for burial. Holmes is said to have embalmed as many as 4,000 bodies himself, but he also created a fluid that could be used for embalming and sold it to other physicians for $3 per gallon. (At that time, the chemicals were a mixture of arsenic, zinc and mercuric chlorides, creosote, turpentine and alcohol. Formaldehyde, which soon became the primary ingredient, was not discovered until after the war.)

As physicians began to practice embalming, one challenge was lining up paying customers. “At first, the embalming physicians approached soldiers directly before they went into battle,” says James W. Lowry, a Charleston, West Virginia-based embalmer with the Charleston Mortuary Service, who participates in Civil War reenactments as an embalming physician and is also a frequent speaker on Civil War embalming at conventions of funeral directors and embalmers. “The physician provided soldiers with a card that stated that they had arranged for payment for embalming and transportation if they died.

“It soon became clear that this sales method was bad for morale, so the military put a stop to it,” adds Lowry.

“As the war continued, embalming physicians began to follow the action and would take over a barn or shed near the battlefield or set up a tent and embalm bodies there,” he continues. “Since officers tended to be from well-to-do families, embalming physicians instructed soldiers to bring the bodies of officers only. Then the physician would work out payment with the grieving family.”

Some physicians were unscrupulous and charged extraordinary fees or threatened to hold the body “hostage” if the family didn’t pay.

On December 26, 1862 the New York Times ran a revealing letter to the editor, written by H.W. Rivers, who is identified as a surgeon and medical director of the Ninth Army Corps. Rivers points out that embalming of the remains is a great advance of science but that the expense of the process is beyond the reach of people of modest means… Rivers’ letter provides a recipe of sorts that could be used to preserve a body, but it is doubtful that this letter offered much help to the common man.

When President Lincoln died from a gunshot wound on April 15, 1865, Mary Lincoln had been aware of the treatment received by Colonel Ellsworth, and she requested that Lincoln be embalmed. The funeral train carrying his body left Washington D.C. on April 21, 1865 and stopped for public viewing in Baltimore, Harrisburg, Philadelphia, New York City, Albany, Buffalo, Cleveland, Columbus, Indianapolis, Michigan City, and Chicago. The procession finally pulled into Springfield on the morning of Wednesday, May 3, 1865 with the body well-preserved.

With the end of the Civil War, the practice of embalming died out for a time since people were likely to die near home and could be buried more quickly. Embalming surgeons became a thing of the past, and when interest in embalming returned again in the 1890s, undertakers began to perform these duties. Companies that wanted to sell embalming fluid sent salesmen around the country to demonstrate the process and provide certificates of training, and the practice grew. (State licensing finally entered the picture in the 1930s.)

Though the practice of embalming established itself during the Civil War, the number of people who were embalmed was relatively small. Because of the difficulty of identifying bodies and communicating with families about sending a body home, only about 40,000 of the approximately 650,000 soldiers who died during the Civil War were embalmed.

The reality was that the carnage was often so vast that there was no hope of getting the majority of the soldiers home. Communities near battlefields had little choice but to go out to help cover the dead or put them in mass graves.


