Medicine and surgery before 1800 » Medieval and Renaissance Europe » Salerno and the medical schools
At about the same time that Arabian medicine flourished, the
first organized medical school in Europe was established at
Salerno, in southern Italy. Although the school of Salerno
produced no brilliant genius and no startling discovery, it was
the outstanding medical institution of its time and the parent
of the great medieval schools soon to be founded at Montpellier
and Paris, in France, and at Bologna and Padua, in Italy.
Salerno drew scholars from near and far. Remarkably
liberal in some of its views, Salerno admitted women as medical
students. The school owed much to the enlightened Holy Roman
emperor
Frederick II, who decreed in 1221 that no one should
practice medicine until he had been publicly approved by the
masters of Salerno.
The Salernitan school also produced a literature of its own;
the best-known work, of uncertain date and of composite
authorship, was the
Regimen Sanitatis Salernitanum (“Salernitan Guide to
Health”). Written in verse, it has appeared in numerous
editions and has been translated into many languages. Among its
oft-quoted couplets is the following:
Use three physicians still, first Doctor Quiet,
Next Doctor Merryman, and Doctor Diet.
Salerno yielded its place as the premier medical school of
Europe to
Montpellier in about 1200.
John of Gaddesden, the model for the “doctour of physick” in
Chaucer’s
Canterbury Tales, was one of the English students
there. That he relied upon astrology and upon the doctrine of
the humours is evident from Chaucer’s description:
Well could he guess the ascending of the star
Wherein his patient’s fortunes settled were.
He knew the course of every malady,
Were it of cold or heat or moist or dry.
Medieval physicians analyzed symptoms, examined excreta, and
made their diagnoses. Then they might prescribe diet, rest,
sleep,
exercise, or baths;
or they could administer emetics and purgatives or bleed the
patient. Surgeons could treat fractures and dislocations, repair
hernias, and perform amputations and a few other operations.
Some of them prescribed opium, mandragora, or alcohol to deaden
pain. Childbirth was left to midwives, who relied on tradition
and folklore.
Great hospitals were established during the Middle Ages by
religious foundations, and infirmaries were attached to abbeys,
monasteries, priories, and convents.
Doctors and nurses in these institutions were members of
religious orders and combined spiritual with physical
healing.
Medical practice and learning in
the vibrant city of Salerno were nourished by the Greek past of
southern Italy, favored by healing shrines in the tradition of Cos
and Epidauros, energized by trade with Sicily and across the
Mediterranean, and fostered by monastic bibliophiles at Monte
Cassino.
The fame of Salerno's
practitioners, women as well as men, had spread across Europe by the
end of the eleventh century, when it was surpassed by the
distinction of its teachers and eclipsed by the bookish legacy of
the monk Constantine the African (d. 1087). Nevertheless, practical
concerns remained manifest even in the writings of Constantine, for
example the Viaticum, a medical guide for travellers.
Summaries of all medical
knowledge commonly bore the title Practica. Common-sense dietetics
were compiled and soon became popularized as the Regimen of the
School of Salerno "for the King of England." Through the twelfth
century, Salernitan masters showed a growing interest in theoretical
foundations. They speculated about natural philosophy in numerous
Questions, drew on Aristotle's views of nature (physis), which
contributed to the labeling of medicine as physica or physick
(hence "physician"), saw the need for a better knowledge of anatomy,
and pursued dialectical inquiry in their commentaries on
authoritative treatises. Principles and procedure were interwoven in
encyclopedic manuals of which the Breviary on the Signs, Causes, and
Cures of Diseases by Joannes de Sancto Paulo is a model.
Both theory and practice were
envisioned in the selection of a few simple works for basic medical
education, anchored in the Hippocratic-Galenic "Art" and eventually
standardized in the Articella. Salerno's empirical orientation
endured in the compilation of obstetrical and gynecological lore known as Trotula, while
its theoretical contributions were preferred by a Scholastic master
such as Gerard de Berry who wrote a Commentary on the Viaticum of
Constantine.
The reach and durability of
Salerno's influence are documented particularly in the large number
of Salernitan manuscripts that are extant in English libraries.
Joannes de Sancto Paulo.
Breviarium
de signis, causis, et curis morborum.
13th century. (DeRicci NLM 27)
This Salernitan manual was widely
copied across 13th-century Europe.
Gerard de Berry. Super Viatico
Constantini. 13th
century. (DeRicci NLM 11. Schullian 505)
A commentary on Constantine's
Viaticum, with the original text indicated by underlining. The remains
of an elaborate medieval mend are visible on this page.
Constantinus Africanus.
Viaticum.
13th century. (DeRicci NLM 12.)
This manuscript contains marginal
annotations in many different hands (at least five on this page
alone), showing that it was a heavily used text.
Regimen
sanitatis Salernitanum.
Louvain : Johannes de Westfalia, ca. 1480. (Schullian 387.)
This very early printed edition
of the Regimen
sanitatis, printed in
the Low Countries, must have had an English owner. The marginal note
translates into English doggerel verse the text of the Regimen
that is printed in bold above it:
Sheeps flesh if eaten without wine,
Is better meate then flesh of swine.
If with your meate you use some wine,
Hogges flesh is meate and medicine.
Experimentarius
medicinae.
Argent[orati] [i.e. Strasbourg]: Apud Joannem Schottum, 1544.
The name
Trotula, while actually the title of a compilation, has most
frequently been assigned to a female teacher in the schools of
Salerno.
Whatever her
name, a Salernitan woman appears to be the source of much in this
treatise on diseases of women attributed to Trotula.
Medieval and Renaissance Europe » The spread of new learning
Among the teachers of
medicine in the medieval universities there were many who
clung to the past, but there were not a few who determined
to explore new lines of thought. The new learning of the
Renaissance, born in Italy, grew and expanded slowly.
Two great 13th-century scholars who influenced medicine were
Roger Bacon, an active observer and tireless
experimenter, and
Albertus Magnus, a distinguished philosopher and
scientific writer.
About this time
Mondino dei Liucci taught at Bologna. Prohibitions
against human dissection were slowly lifting, and Mondino
performed his own dissections rather than following the
customary procedure of entrusting the task to a menial.
Although he perpetuated the errors of Galen, his
Anothomia, published in 1316, was the first practical
manual of anatomy. Foremost among the surgeons of the day
was
Guy de Chauliac, a physician to three popes at Avignon.
His
Chirurgia magna (“Great Surgery”), based on
observation and experience, had a profound influence upon
the progress of surgery.
The Renaissance in the 14th, 15th, and 16th centuries was
much more than just a revival of interest in Greek and
Roman culture; it was rather a change of
outlook, an eagerness for discovery, a desire to
escape from the limitations of tradition and to explore new
fields of thought and action. In medicine, it was perhaps
natural that anatomy and physiology, the knowledge of the
human body and its workings, should be the first aspects of
medical learning to receive attention from those who
realized the need for reform.
It was in 1543 that
Andreas Vesalius, a young Belgian professor of anatomy
at the
University of Padua, published De humani corporis
fabrica (“On the Structure of the Human Body”).
Based on his own dissections, this seminal work corrected
many of Galen’s errors. By his
scientific observations and methods, Vesalius showed
that Galen could no longer be regarded as the final
authority. His work at Padua was continued by
Gabriel Fallopius and, later, by
Hieronymus Fabricius ab Aquapendente; it was Fabricius's work on
the valves in the veins,
De venarum ostiolis (1603), that suggested to his
pupil William Harvey his revolutionary theory of the
circulation of the blood, one of the great medical
discoveries.
Surgery profited from the new
outlook in
anatomy, and the great reformer
Ambroise Paré dominated the field in the 16th century.
Paré was surgeon to four kings of France, and he has
deservedly been called the father of modern surgery. In his
autobiography, written after he had retired from 30 years of
service as an army surgeon, Paré described how he had
abolished the painful practice of cautery to stop bleeding
and used ligatures and dressings instead. His favourite
expression, “I dressed him; God healed him,” is
characteristic of this humane and careful doctor.
In Britain during this period surgery, which was
performed by barber-surgeons, was becoming regulated and
organized under royal charters. Companies were thus formed
that eventually became the royal colleges of surgeons in
Scotland and England.
Physicians and surgeons united in a joint
organization in Glasgow, and a
college of physicians was founded in London.
The 16th-century medical scene was enlivened by the
enigmatic physician and alchemist who called himself
Paracelsus. Born in Switzerland, he traveled extensively
throughout Europe, gaining medical skills and practicing and
teaching as he went. In the tradition of Hippocrates,
Paracelsus stressed the power of nature to heal; but unlike
Hippocrates he believed also in the power of supernatural
forces, and he violently attacked the
medical treatments of his day. Eager for reform, he
allowed his intolerance to outweigh his discretion, as when
he prefaced his lectures at Basel by publicly burning the
works of Avicenna and Galen. The authorities and medical men
were understandably outraged. Widely famous in his time,
Paracelsus remains a controversial figure to this day.
Despite his turbulent career, however, he did attempt to
bring a more rational approach to diagnosis and treatment,
and he introduced the use of chemical drugs in place of
herbal remedies.
A contemporary of Paracelsus,
Girolamo Fracastoro of Italy was a scholar cast from a
very different mold. His account of the disease syphilis,
entitled
Syphilis sive morbus Gallicus (1530; “Syphilis or
the French Disease”), was written in verse. Although
Fracastoro called syphilis the
French disease, others called it the Neapolitan disease,
for it was said to have been brought to Naples from America
by the sailors of
Christopher Columbus. Its origin is still questioned,
however. Fracastoro was interested in epidemic infection,
and he offered the first scientific explanation of disease
transmission. In his great work,
De contagione et contagiosis morbis (1546), he
theorized that the seeds of certain diseases are
imperceptible particles transmitted by air or by contact.
The Enlightenment
In the 17th century the
natural sciences moved forward on a broad front.
There were attempts to grapple with the nature of science,
as expressed in the works of thinkers like Francis Bacon,
Descartes, and Newton. New knowledge of chemistry superseded
the theory that all things are made up of earth, air, fire,
and water, and the old Aristotelian ideas began to be
discarded. The supreme 17th-century achievement in medicine
was Harvey’s explanation of the circulation of blood.
The Enlightenment » Harvey and the experimental method
Born in Folkestone,
Eng.,
William Harvey studied at
Cambridge University and then spent several
years at Padua, where he came under the influence of
Fabricius. He established a successful
medical practice in London and by precise
observation and scrupulous reasoning developed his
theory of circulation. In 1628 he published his
classic book Exercitatio Anatomica
de Motu Cordis et Sanguinis in Animalibus (Concerning
the Motion of the Heart and Blood), often
called De Motu Cordis.
That the book
aroused controversy is not surprising. There were
still many who adhered to the teaching of
Galen that the blood follows an ebb and flow
movement in the
blood vessels. Harvey’s work was the result of
many careful experiments, but few of his critics
took the trouble to repeat the experiments, simply
arguing in favour of the older view. His second
great book,
Exercitationes de generatione animalium
(“Experiments Concerning Animal Generation”),
published in 1651, laid the foundation of modern
embryology.
Harvey’s discovery of the circulation of the
blood was a landmark of medical progress; the new
experimental method by which the results were
secured was as noteworthy as the work itself.
Following the method described by the philosopher
Francis Bacon, he drew the truth from experience and
not from authority.
There was one gap in Harvey’s argument: he was
obliged to assume the existence of the
capillary vessels that conveyed the blood
from the arteries to the veins. This link in the
chain of evidence was supplied by
Marcello Malpighi of Bologna (who was born in
1628, the year of publication of De Motu Cordis).
With a primitive microscope Malpighi saw a network
of tiny blood vessels in the
lung of a frog. Harvey also failed to show
why the blood circulated. After
Robert Boyle had shown that air is essential to
animal life, it was Richard Lower who traced the
interaction between air and the blood. Eventually
the importance of oxygen, obscured for a
time by the phlogiston theory, was revealed, although
it was not until the late 18th century that the
great chemist Antoine-Laurent Lavoisier discovered
the essential nature of oxygen and clarified its
relation to respiration.
Although the compound
microscope had been invented slightly earlier,
probably in Holland, its development, like that of
the telescope, was the work of
Galileo. He was the first to insist upon the
value of measurement in science and in medicine,
thus replacing theory and guesswork with accuracy.
The great Dutch microscopist
Antonie van Leeuwenhoek devoted his long life to
microscopical studies and was probably the first to
see and describe bacteria, reporting his results to
the Royal Society of London. In England,
Robert Hooke, who was Boyle’s assistant and
curator to the
Royal Society, published his Micrographia
in 1665, which discussed and illustrated the
microscopic structure of a variety of materials.
The Enlightenment » The futile search for an easy system
Several attempts were made in the 17th century to
discover an easy system that would guide the
practice of medicine. A substratum of superstition
still remained. Richard Wiseman, surgeon to Charles
II, affirmed his belief in the “royal touch” as a
cure for
king’s evil, or scrofula, while even the learned
English physician
Thomas Browne stated that witches really
existed. There was, however, a general desire to
discard the past and adopt new ideas.
The view of
the French philosopher
René Descartes that the human body is a machine
and that it functions mechanically had its
repercussions in medical thought. One group adopting
this explanation called themselves the
iatrophysicists; another school, preferring to view
life as a series of chemical processes, were called
iatrochemists.
Santorio Santorio, working at Padua, was an
early exponent of the iatrophysical view and a
pioneer investigator of metabolism. He was
especially concerned with the measurement of what he
called “insensible perspiration,” described in his
book De statica medicina (1614; “On Medical
Measurement”). Another Italian, who developed the
idea still further, was
Giovanni Alfonso Borelli, a professor of
mathematics at Pisa University, who gave his
attention to the mechanics and statics of the body
and to the physical laws that govern its movements.
The iatrochemical school was founded at Brussels
by
Jan Baptist van Helmont, whose writings are
tinged with the mysticism of the alchemist. A more
logical and intelligible view of iatrochemistry was
advanced by
Franciscus Sylvius, at Leiden; and in England a
leading exponent of the same school was
Thomas Willis, who is better known for his
description of the brain in his Cerebri anatome
nervorumque descriptio et usus (“Anatomy of the
Brain and Descriptions and Functions of the
Nerves”), published in 1664 and illustrated by
Christopher Wren.
It soon became apparent that no easy road to
medical knowledge and practice was to be found along
these channels and that the best method was the
age-old system of straightforward clinical
observation initiated by Hippocrates. The need for a
return to these views was strongly urged by
Thomas Sydenham, well named “the English
Hippocrates.” Sydenham was not a voluminous writer
and, indeed, had little patience with book learning
in medicine; nevertheless he gave excellent
descriptions of the phenomena of disease. His
greatest service, much needed at the time, was to
divert physicians’ minds from speculation and lead
them back to the bedside, where the true art of
medicine could be studied.
The Enlightenment » Medicine in the 18th century
Even in the 18th century the search for a simple way
of healing the sick continued. In Edinburgh the
writer and lecturer
John Brown expounded his view that there were
only two diseases, sthenic (strong) and asthenic
(weak), and two treatments, stimulant and sedative;
his chief remedies were alcohol and opium. Lively
and heated debates took place between his followers,
the Brunonians, and the more orthodox Cullenians
(followers of
William Cullen, a professor of medicine at
Glasgow), and the controversy spread to the medical
centres of Europe.
At the opposite end of the
scale, at least in regard to dosage, was
Samuel Hahnemann, of Leipzig, the originator of
homeopathy, a system of treatment involving the
administration of minute doses of drugs whose
effects resemble the effects of the disease being
treated. His ideas had a salutary effect upon
medical thought at a time when prescriptions were
lengthy and doses were large, and his system has had
many followers.
By the 18th century the
medical school at Leiden had grown to rival that
of Padua, and many students were attracted there
from abroad. Among them was
John Monro, an army surgeon, who resolved that
his native city of
Edinburgh should have a similar
medical school. He specially educated his son
Alexander with a view to having him appointed
professor of anatomy, and the bold plan was
successful.
Alexander Monro studied at Leiden under
Hermann Boerhaave, the central figure of
European medicine and the greatest clinical teacher
of his time. Subsequently, three generations of
Alexander Monros taught anatomy at Edinburgh
University over a continuous period of 126 years.
Medical education was increasingly incorporated into
the universities of Europe, and Edinburgh became the
leading academic centre for medicine in Britain.
In 18th-century London,
Scottish doctors were the leaders in surgery and
obstetrics. The noted teacher
John Hunter conducted extensive researches in
comparative anatomy and physiology, founded
surgical pathology, and raised surgery to the level
of a respectable branch of science. His brother
William Hunter, an eminent teacher of anatomy,
became famous as an obstetrician. Male doctors were
now attending women in childbirth, and the leading
obstetrician in London was
William Smellie. His well-known Treatise on the Theory and Practice of Midwifery,
published in three volumes in 1752–64, contained the
first systematic discussion on the safe use of
obstetrical forceps, which have since saved
countless lives. Smellie placed
midwifery on a sound scientific footing and
helped to establish obstetrics as a recognized
medical discipline.
On the basis of work begun in the 18th century,
René Laënnec, a native of Brittany, who
practiced medicine in Paris, invented a simple
stethoscope, or cylindre, as it was
originally called. In 1819 he wrote a treatise,
De l’auscultation médiate (“On Mediate
Auscultation”), describing many of the curious
sounds in the heart and lungs that are revealed by
the instrument. Meanwhile a Viennese physician,
Leopold Auenbrugger, discovered another method
of investigating diseases of the chest, that of
percussion. The son of an innkeeper, he is said
to have conceived the idea of tapping with the
fingers when he recalled that he had used this
method to gauge the level of the fluid contents of
his father’s casks.
One highly significant medical advance, late in
the century, was
vaccination.
Smallpox, disfiguring and often fatal, was
widely prevalent.
Inoculation, which had been practiced in the
East, was popularized in England in 1721–22 by
Lady Mary Wortley Montagu, who is best known for
her letters. She observed the practice in Turkey,
where it produced a mild form of the disease, thus
securing immunity, although not without danger. The
next step was taken by
Edward Jenner, a country practitioner who had
been a pupil of John Hunter. In 1796 Jenner began
inoculations with material from cowpox (the bovine
form of the disease); and when he later inoculated
the same subject with smallpox, the disease did not
appear. This procedure—vaccination—has been
responsible for eradicating the disease.
Public health and
hygiene were receiving more attention during the
18th century. Population statistics began to be
kept, and suggestions arose concerning health
legislation. Hospitals were established for a
variety of purposes. In Paris,
Philippe Pinel initiated bold reforms in the
care of the mentally ill, releasing them from their
chains and discarding the long-held notion that
insanity was caused by demon possession.
Conditions improved for sailors and soldiers as
well.
James Lind, a British naval surgeon from
Edinburgh, recommended fresh fruits and citrus
juices to prevent
scurvy, a remedy discovered by the Dutch in the
16th century. When the British navy adopted Lind’s
advice—decades later—this
deficiency disease was eliminated. In 1752 a
Scotsman, John
Pringle, published his classic Observations
on the Diseases of the Army, which contained
numerous recommendations for the health and comfort
of the troops. Serving with the British forces
during the
War of the Austrian Succession, he suggested in
1743 that military hospitals on both sides should be
regarded as sanctuaries; this plan eventually led to
the establishment of the
Red Cross organization in 1864.
Two pseudoscientific doctrines relating to
medicine emerged from Vienna in the latter part of
the century and attained wide notoriety. Mesmerism,
a belief in “animal magnetism” sponsored by
Franz Anton Mesmer, probably owed any
therapeutic value it had to suggestions given while
the patient was under
hypnosis.
Phrenology, propounded by
Franz Joseph Gall, held that the contours of the
skull were a guide to an individual’s mental
faculties and character traits; this theory remained
popular throughout the 19th century.
At the same time, sound scientific thinking
was making steady progress, and advances in
physics, chemistry, and the
biological sciences were converging to
form a rational scientific basis for every
branch of clinical medicine. New knowledge
disseminated throughout Europe and traveled
across the sea, where centres of medical
excellence were being established in
America.
The rise of scientific medicine in the 19th century
The portrayal of the history of medicine
becomes more difficult in the 19th century.
Discoveries multiply, and the number of
eminent doctors is so great that the history
is apt to become a series of biographies.
Nevertheless, it is possible to discern the
leading trends in modern medical thought.
The rise of scientific medicine in the 19th century » Physiology
By the beginning of the 19th century, the
structure of the
human body was almost fully known, due
to new methods of microscopy and of
injections. Even the body’s microscopic
structure was understood. But as important
as anatomical knowledge was an understanding
of
physiological processes, which were
rapidly being elucidated, especially in
Germany. There, physiology became
established as a distinct science under the
guidance of
Johannes Müller, who was a professor at
Bonn and then at the
University of Berlin. An energetic
worker and an inspiring teacher, he
described his discoveries in a famous
textbook, Handbuch der Physiologie des
Menschen (“Manual of Human
Physiology”), published in the 1830s.
Among Müller’s illustrious pupils were
Hermann von Helmholtz, who made
significant discoveries relating to sight
and hearing and who invented the
ophthalmoscope; and
Rudolf Virchow, one of the century’s
great medical scientists, whose outstanding
achievement was his conception of the cell
as the centre of all pathological changes.
Virchow’s work Die Cellularpathologie,
published in 1858, gave the deathblow to the
outmoded view that disease is due to an
imbalance of the four humours.
In France the most brilliant physiologist
of the time was
Claude Bernard, whose many important
discoveries were the outcome of carefully
planned experiments. His researches
clarified the role of the pancreas in
digestion, revealed the presence of glycogen
in the liver, and explained how the
contraction and expansion of the
blood vessels are controlled by
vasomotor nerves. He proposed the concept of
the internal environment—the chemical
balance in and around the cells—and the
importance of its stability. His
Introduction à l’étude de la médecine
expérimentale (1865;
An Introduction to the Study of Experimental
Medicine) is still worthy of study
by all who undertake research.
The rise of scientific medicine in the 19th century » Verification of the germ theory
Perhaps the overarching medical advance of
the 19th century, certainly the most
spectacular, was the conclusive
demonstration that certain diseases, as well
as the infection of surgical wounds, were
directly caused by minute living organisms.
This discovery changed the whole face of
pathology and effected a complete revolution
in the practice of surgery.
The idea that
disease was caused by entry into the body of
imperceptible particles was of ancient date.
It had been expressed by the Roman
encyclopaedist
Varro as early as 100 BC, by
Fracastoro in 1546, by
Athanasius Kircher and Pierre Borel
about a century later, and by
Francesco Redi, who in 1684 wrote his
Osservazioni intorno agli animali
viventi che si trovano negli animali viventi
(“Observations on Living Animals Which Are
to Be Found Within Other Living Animals”),
in which he sought to disprove the idea of
spontaneous generation. Everything must
have a parent, he wrote; only life produces
life. A 19th-century pioneer in this field,
regarded by some as founder of the parasitic
theory of infection, was
Agostino Bassi of Italy, who showed that
a disease of silkworms was caused by a
fungus that could be destroyed by chemical
agents.
The main credit for establishing the
science of
bacteriology must be accorded to the
French chemist
Louis Pasteur. It was Pasteur who, by a
brilliant series of experiments, proved that
the fermentation of wine and the souring of
milk are caused by living microorganisms.
His work led to the pasteurization of milk
and solved problems of agriculture and
industry as well as those of animal and
human diseases. He successfully employed
inoculations to prevent anthrax in sheep and
cattle, chicken cholera in fowl, and finally
rabies in humans and dogs. The latter
resulted in the widespread establishment of
Pasteur institutes.
From Pasteur,
Joseph Lister derived the concepts that
enabled him to introduce the
antiseptic principle into surgery. In
1865 Lister, a professor of surgery at
Glasgow University, began placing an
antiseptic barrier of
carbolic acid between the wound and the
germ-containing atmosphere.
Infections and deaths fell dramatically,
and his pioneering work led to more refined
techniques of sterilizing the surgical
environment.
Obstetrics had already been robbed of
some of its terrors by Alexander Gordon at
Aberdeen, Scot.,
Oliver Wendell Holmes at Boston, and
Ignaz Semmelweis at Vienna and Pest
(Budapest), who advocated disinfection of
the hands and clothing of midwives and
medical students who attended confinements.
These measures produced a marked reduction
in cases of
puerperal fever, the
bacterial scourge of women following
childbirth.
Another pioneer in bacteriology was the
German physician
Robert Koch, who showed how bacteria
could be cultivated, isolated, and examined
in the laboratory. A meticulous
investigator, Koch discovered the organisms
of tuberculosis, in 1882, and cholera, in
1883. By the end of the century many other
disease-producing microorganisms had been
identified.
The rise of scientific medicine in the 19th century » Discoveries in clinical medicine and anesthesia
There was perhaps some danger that in the
search for bacteria other causes of disease
would escape detection. Many physicians,
however, were working along different lines
in the 19th century. Among them were a group
attached to Guy’s Hospital, in London:
Richard Bright,
Thomas Addison, and Sir William Gull.
Bright contributed significantly to the
knowledge of kidney diseases, including
Bright’s disease, and Addison gave his
name to disorders of the
adrenal glands and the blood. Gull, a
famous clinical teacher, left a legacy of
pithy aphorisms that might well rank with
those of Hippocrates.
In Dublin Robert
Graves and
William Stokes introduced new methods in
clinical diagnosis and medical training;
while in Paris a leading clinician,
Pierre-Charles-Alexandre Louis, was
attracting many students from America by the
excellence of his teaching. By the early
19th century the
United States was ready to send back the
results of its own researches and
breakthroughs. In 1809, in a small Kentucky
town,
Ephraim McDowell boldly operated on a
woman—without anesthesia or antisepsis—and
successfully removed a large ovarian tumour.
William Beaumont, in treating a shotgun
wound of the stomach, was led to make many
original observations that were published in
1833 as Experiments and Observations on
the Gastric Juice and the Physiology of
Digestion.
The most famous contribution by the
United States to medical progress at this
period was undoubtedly the introduction of
general
anesthesia, a procedure that not only
liberated the patient from the fearful pain
of surgery but also enabled the surgeon to
perform more extensive operations. The
discovery was marred by controversy.
Crawford Long,
Gardner Colton,
Horace Wells, and
Charles Jackson were all claimants for
priority; some used
nitrous oxide gas, and others employed
ether, which was less capricious. There
is little doubt, however, that it was
William Thomas Morton who, on Oct. 16,
1846, at Massachusetts General Hospital, in
Boston, first demonstrated before a
gathering of physicians the use of ether as
a
general anesthetic. The news quickly
reached Europe, and general anesthesia soon
became prevalent in surgery. At Edinburgh,
the professor of midwifery,
James Young Simpson, had been
experimenting upon himself and his
assistants, inhaling various vapours with
the object of discovering an effective
anesthetic. In November 1847
chloroform was tried with complete
success, and soon it was preferred to ether
and became the anesthetic of choice.
The rise of scientific medicine in the 19th century » Advances at the end of the century
While antisepsis and anesthesia placed
surgery on an entirely new footing,
similarly important work was carried out in
other fields of study, such as parasitology
and disease transmission.
Patrick Manson, a British pioneer in
tropical medicine, showed in China, in 1877,
how insects can carry disease and how the
embryos of the
Filaria worm, which can cause
elephantiasis, are transmitted by the
mosquito. Manson explained his views to a
British army surgeon,
Ronald Ross, then working on the problem
of
malaria, and Ross discovered the
malarial parasite in the stomach of the
Anopheles mosquito in 1897.
In Cuba,
Carlos Finlay expressed the view, in
1881, that
yellow fever is carried by the
Stegomyia mosquito. Following his lead,
the Americans
Walter Reed,
William Gorgas, and others were able to
conquer the scourge of yellow fever in
Panama and made possible the completion of
the
Panama Canal by reducing the
death rate there from 176 per 1,000 to 6
per 1,000.
Other victories in
preventive medicine ensued, because the
maintenance of health was now becoming as
important a concern as the cure of disease;
and the 20th century was to witness the
evolution and progress of national health
services in a number of countries. In
addition, spectacular advances in diagnosis
and treatment followed the discovery of
X rays by Wilhelm Conrad Röntgen, in
1895, and of radium by Pierre and
Marie Curie in 1898. Before the turn of
the century, too, the vast new field of
psychiatry had been opened up by
Sigmund Freud. The tremendous increase
in scientific knowledge during the 19th
century radically altered and expanded the
practice of medicine. Concern for upholding
the quality of services led to the
establishment of public and professional
bodies to govern the standards for medical
training and practice.
Douglas James Guthrie; Philip Rhodes
Medicine in the 20th century
The 20th century has produced such a
plethora of discoveries and advances that in
some ways the face of medicine has changed
out of all recognition. In 1901, for
instance, in the
United Kingdom the expectation of life
at birth, a primary indicator of the effect
of
health care on
mortality (but also reflecting the state
of health education, housing, and
nutrition), was 48 years for males and 51.6
years for females. After steady increases,
by the 1980s life expectancy had reached
71.4 years for males and 77.2 years for
females. Other industrialized nations showed
similar dramatic increases. Indeed, the
outlook has so altered that, with the
exception of diseases such as cancer and
AIDS, attention has become focused on
morbidity rather than mortality, and the
emphasis has changed from keeping people
alive to keeping them fit.
The rapid
progress of medicine in this era was
reinforced by enormous improvements in
communication between scientists throughout
the world. Through publications,
conferences, and—later—computers and
electronic media, they freely exchanged
ideas and reported on their endeavours. No
longer was it common for an individual to
work in isolation. Although specialization
increased, teamwork became the norm. It
consequently has become more difficult to
ascribe medical accomplishments to
particular individuals.
In the first half of the century,
emphasis continued to be placed on combating
infection, and notable landmarks were also
attained in endocrinology, nutrition, and
other areas. In the years following
World War II, insights derived from
cell biology altered basic concepts of
the disease process; new discoveries in
biochemistry and physiology opened the way
for more precise diagnostic tests and more
effective therapies; and spectacular
advances in biomedical engineering enabled
the physician and surgeon to probe into the
structures and functions of the body by
noninvasive imaging techniques like
ultrasound (sonar),
computerized axial tomography (CAT), and
nuclear magnetic resonance (NMR). With
each new scientific development, medical
practices of just a few years earlier became
obsolete.
Medicine in the 20th century » Infectious diseases and chemotherapy
In the years following the turn of the
century, ongoing research concentrated on
the nature of infectious diseases and their
means of transmission. Increasing numbers of
pathogenic organisms were discovered and
classified. Some, such as the rickettsias,
which cause diseases like typhus, were
smaller than bacteria; some were larger,
such as the protozoans that engender malaria
and other tropical diseases. The smallest to
be identified were the viruses, producers of
many diseases, among them mumps, measles,
German measles, and poliomyelitis; and
in 1910
Peyton Rous showed that a virus could
also cause a
malignant tumour, a sarcoma in chickens.
There was still little to be done for the
victims of most infectious organisms beyond
drainage, poultices, and ointments, in the
case of local infections, and rest and
nourishment for severe diseases. The search
for treatments aimed at both vaccines and
chemical remedies.
Medicine in the 20th century » Infectious diseases and chemotherapy » Ehrlich and arsphenamine
Germany was well to the forefront in
medical progress. The scientific approach to
medicine had been developed there long
before it spread to other countries, and
postgraduates flocked to German medical
schools from all over the world. The opening
decade of the 20th century has been well
described as the golden age of German
medicine. Outstanding among its leaders was
Paul Ehrlich.
While still a student,
Ehrlich carried out some work on
lead poisoning from which he evolved the
theory that was to guide much of his
subsequent work—that certain tissues have a
selective affinity for certain chemicals. He
experimented with the effects of various
chemical substances on disease organisms. In
1910, with his colleague Sahachiro Hata, he
conducted tests on
arsphenamine, once sold under the
commercial name Salvarsan. Their success
inaugurated the
chemotherapeutic era, which was to
revolutionize the treatment and control of
infectious diseases. Salvarsan, a synthetic
preparation containing arsenic, is lethal to
the microorganism responsible for syphilis.
Until the introduction of penicillin,
Salvarsan or one of its modifications
remained the standard treatment of syphilis
and went far toward bringing this social and
medical scourge under control.
Medicine in the 20th century » Infectious diseases and chemotherapy » Sulfonamide drugs
In 1932 the German bacteriologist
Gerhard Domagk announced that the red
dye
Prontosil is active against
streptococcal infections in mice and humans.
Soon afterward French workers showed that
its active
antibacterial agent is
sulfanilamide. In 1936 the English
physician
Leonard Colebrook and his colleagues
provided overwhelming evidence of the
efficacy of both Prontosil and sulfanilamide
in streptococcal septicemia (bloodstream
infection), thereby ushering in the
sulfonamide era. New sulfonamides, which
appeared with astonishing rapidity, had
greater potency, wider antibacterial range,
or lower toxicity. Some stood the test of
time; others, like the original
sulfanilamide and its immediate successor,
sulfapyridine, were replaced by safer and
more powerful successors.
Medicine in the 20th century » Infectious diseases and chemotherapy » Antibiotics » Penicillin
A dramatic episode in
medical history occurred in 1928,
when
Alexander Fleming noticed the
inhibitory action of a stray mold on a
plate culture of staphylococcus bacteria
in his laboratory at St. Mary’s
Hospital, London. Many other
bacteriologists must have made the
observation, but none had realized the
possible implications. The mold was a
strain of Penicillium—P.
notatum—which gave its name to the
now-famous drug penicillin. In spite of
his conviction that penicillin was a
potent antibacterial agent, Fleming was
unable to carry his work to fruition,
mainly because biochemists at the time
were unable to isolate it in sufficient
quantities or in a sufficiently pure
form to allow its use on patients.
Ten
years later Howard Florey, Ernst Chain,
and their colleagues at
Oxford University took up the
problem again. They isolated penicillin
in a form that was fairly pure (by
standards then current) and demonstrated
its potency and relative lack of
toxicity. By then World War II had
begun, and techniques to facilitate
commercial production were developed in
the United States. By 1944 adequate
amounts were available to meet the
extraordinary needs of wartime.
Medicine in the 20th century » Infectious diseases and chemotherapy » Antibiotics » Antituberculous drugs
While penicillin is the most useful and
the safest antibiotic, it suffers from
certain disadvantages. The most
important of these is that it is not
active against Mycobacterium
tuberculosis, the bacillus of
tuberculosis. In view of the importance
of tuberculosis as a
public health hazard, this is a
serious defect. The position was rapidly
rectified when, in 1944,
Selman Waksman,
Albert Schatz, and
Elizabeth Bugie announced the
discovery of
streptomycin from cultures of a
soil organism,
Streptomyces griseus, and
stated that it was active against M.
tuberculosis. Subsequent
clinical trials amply confirmed this
claim. Streptomycin suffers, however,
from the great disadvantage that the
tubercle bacillus tends to become
resistant to it. Fortunately, other
drugs became available to supplement it,
the two most important being
para-aminosalicylic acid (PAS) and
isoniazid. With a combination of two
or more of these preparations, the
outlook in tuberculosis improved
immeasurably. The disease was not
conquered, but it was brought well under
control.
Medicine in the 20th century » Infectious diseases and chemotherapy » Antibiotics » Other antibiotics
Penicillin is not effective over the
entire field of microorganisms
pathogenic to humans. During the 1950s
the search for antibiotics to fill this
gap resulted in a steady stream of them,
some with a much wider antibacterial
range than penicillin (the so-called
broad-spectrum antibiotics) and some
capable of coping with those
microorganisms that are inherently
resistant to penicillin or that have
developed resistance through exposure to
penicillin.
This tendency of
microorganisms to develop resistance to
penicillin at one time threatened to
become almost as serious a problem as
the development of resistance to
streptomycin by the bacillus of
tuberculosis. Fortunately, early
appreciation of the problem by
clinicians resulted in more discriminate
use of penicillin. Scientists continued
to look for means of obtaining new
varieties of penicillin, and their
researches produced the so-called
semi-synthetic antibiotics, some of
which are active when taken by mouth,
while others are effective against
microorganisms that have developed
resistance to the earlier form of
penicillin.
Dramatic though they undoubtedly
were, the advances in
chemotherapy still left one
important area vulnerable, that
of the
viruses. It was in bringing
viruses under control that
advances in immunology—the study
of immunity—played such a
striking part. One of the
paradoxes of medicine is that
the first large-scale
immunization against a
viral disease was instituted
and established long before
viruses were discovered. When
Edward Jenner introduced
vaccination against the virus
that causes smallpox, the
identification of viruses was
still 100 years in the future.
It took almost another half
century to develop methods of
producing antiviral vaccines that
were both safe and effective.
In the meantime,
however, the process by which
the body reacts against
infectious organisms to generate
immunity became better
understood. In Paris, Élie
Metchnikoff had already detected
the role of
white blood cells in the
immune reaction, and
Jules Bordet had identified
antibodies in the blood serum.
The mechanisms of antibody
activity were used to devise
diagnostic tests for a number of
diseases. In 1906
August von Wassermann gave
his name to the
blood test for syphilis, and
in 1908 the
tuberculin test—the
skin test for
tuberculosis—came into use. At
the same time, methods of
producing effective substances
for inoculation were improved,
and immunization against
bacterial diseases made rapid
progress.
Medicine in the 20th century » Immunology » Antibacterial vaccination » Typhoid
In 1897 the English
bacteriologist
Almroth Wright
introduced a vaccine
prepared from killed typhoid
bacilli as a preventive of
typhoid. Preliminary trials
in the Indian army produced
excellent results, and
typhoid vaccination was
adopted for the use of
British troops serving in
the South African War.
Unfortunately, the method of
administration was
inadequately controlled, and
the government sanctioned
inoculations only for
soldiers that “voluntarily
presented themselves for
this purpose prior to their
embarkation for the seat of
war.” The result was that,
according to the official
records, only 14,626 men
volunteered out of a total
strength of 328,244 who
served during the three
years of the war. Although
later analysis showed that
inoculation had had a
beneficial effect, there
were 57,684 cases of
typhoid—approximately one in
six of the British troops
engaged—with 9,022 deaths.
A bitter controversy over
the merits of the vaccine
followed, but before the
outbreak of
World War I immunization
had been officially adopted
by the army. Comparative
statistics would seem to
provide striking
confirmation of the value of
antityphoid inoculation,
even allowing for the better
sanitary arrangements in the
latter war. In the South
African War the annual
incidence of enteric
infections (typhoid and
paratyphoid) was 105 per
1,000 troops, and the annual
death rate was 14.6 per
1,000; the comparable
figures for World War I were
2.35 and 0.139,
respectively.
It is perhaps a sign of
the increasingly critical
outlook that developed in
medicine in the post-1945
era that experts continued
to differ on some aspects of
typhoid immunization. There
was no question as to its
fundamental efficacy, but
there was considerable
variation of opinion as to
the best vaccine to use and
the most effective way of
administering it. Moreover,
it was often difficult to
decide to what extent the
decline in typhoid was
attributable to improved
sanitary conditions and to what
extent to the greater use of the
vaccine.
Medicine in the 20th century » Immunology » Antibacterial vaccination » Tetanus
The other great hazard of
war that was brought under
control in World War I was
tetanus. This was achieved
by the prophylactic
injection of tetanus
antitoxin into all
wounded men. The serum was
originally prepared by the
bacteriologists
Emil von Behring and
Shibasaburo Kitasato in
1890–92, and the results of
this first large-scale trial
amply confirmed its
efficacy. (Tetanus antitoxin
is a sterile solution of
antibody globulins—a
type of blood protein—from
immunized horses or cattle.)
It was not until the 1930s,
however, that an efficient
vaccine, or
toxoid, as it is known
in the cases of tetanus and
diphtheria, was produced
against tetanus. (Tetanus
toxoid is a preparation of
the toxin—or poison—produced
by the microorganism;
injected into humans, it
stimulates the body’s own
defenses against the
disease, thus bringing about
immunity.) Again, a war was
to provide the opportunity
for testing on a large
scale, and experience with
tetanus toxoid in World
War II indicated that it
gave a high degree of
protection.
The story of diphtheria is
comparable to that of
tetanus, though even more
dramatic. First, as with
tetanus antitoxin, came
the preparation of
diphtheria antitoxin by
Behring and Kitasato in
1890. As the antitoxin came
into general use for the
treatment of cases, the
death rate began to decline.
There was no significant
fall in the number of cases,
however, until a
toxin–antitoxin mixture,
introduced by Behring in
1913, was used to immunize
children. A more effective
toxoid was introduced by the
French bacteriologist Gaston
Ramon in 1923, and with
subsequent improvements this
became one of the most
effective vaccines available
in medicine. Where mass
immunization of children
with the toxoid was
practiced, as in the United
States and Canada beginning
in the late 1930s and in
England and Wales in the
early 1940s, cases of
diphtheria and deaths from
the disease became almost
nonexistent. In England and
Wales, for instance, the
number of deaths fell from
an annual average of 1,830
in 1940–44 to zero in 1969.
Administration of a combined
vaccine against diphtheria,
pertussis (whooping
cough), and tetanus (DPT)
is recommended for young
children. Although an
increasing number of
dangerous
side effects from the
DPT vaccine have been
reported, it continues to be
used in most countries
because of the protection it
affords.
Medicine in the 20th century » Immunology » Antibacterial vaccination » BCG vaccine for tuberculosis
If, as is universally
accepted, prevention is
better than cure,
immunization is the ideal
way of dealing with diseases
caused by microorganisms. An
effective, safe vaccine
protects the individual from
disease, whereas
chemotherapy merely copes
with the infection once the
individual has been
affected. In spite of its
undoubted value, however,
immunization has been a
recurring source of dispute.
Like vaccination against
typhoid (and against
poliomyelitis later),
tuberculosis immunization
evoked widespread
contention.
In 1908
Albert Calmette, a pupil
of Pasteur, and
Camille Guérin produced
an avirulent (weakened)
strain of the tubercle
bacillus. About 13 years
later, vaccination of
children against
tuberculosis was introduced,
with a vaccine made from
this avirulent strain and
known as BCG (bacillus
Calmette-Guérin) vaccine.
Although it was adopted in
France, Scandinavia, and
elsewhere, British and U.S.
authorities frowned upon its
use on the grounds that it
was not safe and that the
statistical evidence in its
favour was not convincing.
One of the stumbling
blocks in the way of its
widespread adoption was what
came to be known as the
Lübeck disaster. In the
spring of 1930, 249 infants
were vaccinated with BCG
vaccine in Lübeck, Ger.; by
autumn, 73 of the 249 were
dead. Criminal proceedings
were instituted against
those responsible for giving
the vaccine. The final
verdict was that the vaccine
had been contaminated, and
the BCG vaccine itself was
exonerated from any
responsibility for the
deaths. A bitter controversy
followed, but in the end the
protagonists of the vaccine
won when a further trial
showed that the vaccine was
safe and that it protected
four out of five of those
vaccinated.
Medicine in the 20th century » Immunology » Immunization against viral diseases
With the exception of smallpox,
it was not until well into the
20th century that efficient
viral vaccines became available.
In fact, it was not until the
1930s that much began to be
known about viruses. The two
developments that contributed
most to the rapid growth in
knowledge after that time were
the introduction of
tissue culture as a means of
growing viruses in the
laboratory and the availability
of the
electron microscope. Once
the virus could be cultivated
with comparative ease in the
laboratory, the research worker
could study it with care and
evolve methods for producing one
of the two requirements for a
safe and effective vaccine:
either a virus that was so
attenuated, or weakened, that it
could not produce the disease
for which it was responsible in
its normally virulent form; or a
killed virus that retained the
faculty of inducing a protective
antibody response in the
vaccinated individual.
The
first of the viral vaccines to
result from these advances was
for yellow fever, developed by
the microbiologist
Max Theiler in the late
1930s. About 1945 the first
relatively effective vaccine was
produced for
influenza; in 1954 the
American physician
Jonas E. Salk introduced a
vaccine for
poliomyelitis; and in 1960
an oral
poliomyelitis vaccine,
developed by the virologist
Albert B. Sabin, came into
wide use.
These vaccines went far
toward bringing under control
three of the major diseases of
the time, although in the case
of influenza, a major
complication is the disturbing
proclivity of the virus to
change its character from one
epidemic to another. Even
so, sufficient progress has been
made to ensure that a pandemic
like the one that swept the
world in 1918–19, killing more
than 15,000,000 people, is
unlikely to occur again. Centres
are now equipped to monitor
outbreaks of influenza
throughout the world in order to
establish the identity of the
responsible viruses and, if
necessary, take steps to produce
appropriate vaccines.
During the 1960s effective
vaccines came into use for
measles and
rubella (German measles).
Both evoked a certain amount of
controversy. In the case of
measles in the Western world it
was contended that, if acquired
in childhood, it is not a
particularly hazardous malady,
and the naturally acquired
disease evokes permanent
immunity in the vast majority of
cases. Conversely, the vaccine
induces a certain number of
adverse reactions, and the
duration of the immunity it
produces is problematical. In
the end the official view was
that universal measles
vaccination is to be commended.
The situation with rubella
vaccination was different.
This is a fundamentally mild
affliction, and the only cause
for anxiety is its proclivity to
induce congenital deformities if
a pregnant woman should acquire
the disease. Once an effective
vaccine was available, the
problem was the extent to which
it should be used. Ultimately
the consensus was reached that
all girls who had not already
had the disease should be
vaccinated at about 12
years. In the United States
children are routinely immunized
against measles, mumps, and
rubella at the age of 15 months.
Medicine in the 20th century » Immunology » The immune response
With advances in cell biology in
the second half of the 20th
century came a more profound
understanding of both normal and
abnormal conditions in the body.
Electron microscopy enabled
observers to peer more deeply
into the structures of the cell,
and chemical investigations
revealed clues to their
functions in the cell’s
intricate metabolism. The
overriding importance of the
nuclear genetic material DNA (deoxyribonucleic
acid) in regulating the
cell’s protein and enzyme
production lines became evident.
A clearer comprehension also
emerged of the ways in which the
cells of the body defend
themselves by modifying their
chemical activities to produce
antibodies against injurious
agents.
Up until the turn of
the century, immunity referred
mostly to the means of
resistance of an animal to
invasion by a parasite or
microorganism. Around
mid-century there arose a
growing realization that
immunity and immunology cover a
much wider field and are
concerned with mechanisms for
preserving the integrity of the
individual. The introduction of
organ
transplantation, with its
dreaded complication of tissue
rejection, brought this broader
concept of immunology to the
fore.
At the same time, research
workers and clinicians began to
appreciate the far-reaching
implications of immunity in
relation to endocrinology,
genetics, tumour biology, and
the biology of a number of other
maladies. The so-called
autoimmune diseases are caused
by an aberrant series of immune
responses by which the body’s
own cells are attacked.
Suspicion is growing that a
number of major disorders such
as diabetes, rheumatoid
arthritis, and
multiple sclerosis may be
caused by similar mechanisms.
In some conditions viruses
invade the genetic material of
cells and distort their
metabolic processes. Such
viruses may lie dormant for many
years before becoming active.
This may be the underlying cause
of many cancers, in which cells
escape from the usual
constraints imposed upon them by
the normal body. The dreaded
affliction of
acquired immune deficiency
syndrome (AIDS) is caused by
a virus that has a long dormant
period and then attacks the
cells that produce antibodies.
The result is that the affected
person is not able to generate
an immune response to infections
or malignancies.
At the beginning of the 20th
century, endocrinology was in
its infancy. Indeed, it was not
until 1905 that Ernest H.
Starling, one of the many
brilliant pupils of Edward
Sharpey-Schafer, the dean of
British physiology during the
early decades of the century,
introduced the term
hormone for the internal
secretions of the
endocrine glands. In 1891
the English physician
George Redmayne Murray
achieved the first success in
treating myxedema (the common
form of hypothyroidism) with an
extract of the
thyroid gland. Three years
later,
Sharpey-Schafer and George
Oliver demonstrated in extracts
of the adrenal glands a
substance that raised the blood
pressure; and in 1901
Jokichi Takamine, a Japanese
chemist working in the United
States, isolated this active
principle, known as
epinephrine or adrenaline.
During the first two decades of
the century, steady progress was
made in the isolation,
identification, and study of the
active principles of the various
endocrine glands, but the
outstanding event of the early
years was the discovery of
insulin by
Frederick Banting,
Charles H. Best, and
J.J.R. Macleod in 1921.
Almost overnight the lot of the
diabetic patient changed
from a sentence of almost
certain death to a prospect not
only of survival but of a long
and healthy life.
For more
than 30 years, some of the
greatest minds in physiology had
been seeking the cause of
diabetes mellitus. In 1889
the German physicians Joseph von
Mering and
Oskar Minkowski had shown
that removal of the pancreas in
dogs produced the disease. In
1901 the American pathologist
Eugene L. Opie described
degenerative changes in the
clumps of cells in the pancreas
known as the
islets of Langerhans, thus
confirming the association
between failure in the function
of these cells and diabetes.
Sharpey-Schafer concluded that
the islets of Langerhans secrete
a substance that controls the
metabolism of carbohydrate. Then
Banting, Best, and Macleod,
working at the
University of Toronto,
succeeded in isolating the
elusive hormone and gave it the
name insulin.
Insulin was available in a
variety of forms, but synthesis
on a commercial scale was not
achieved, and the only source of
the hormone was the pancreas of
animals. One of its practical
disadvantages was that it had to
be given by injection;
consequently an intense search
was conducted for some
alternative substance that would
be active when taken by mouth.
Various preparations—oral
hypoglycemic agents, as they are
known—appeared that were
effective to a certain extent in
controlling diabetes, but
evidence indicated that these
were only of value in relatively
mild cases of the disease. For
the person with advanced
diabetes, a normal, healthy life
remained dependent upon the
continuing use of insulin
injections.
Medicine in
the 20th century
»
Endocrinology
» Cortisone
Another major advance in
endocrinology came from
the Mayo Clinic, in
Rochester, Minn. In 1949
Philip S. Hench and
his colleagues announced
that a substance
isolated from the cortex
of the adrenal gland had
a dramatic effect upon
rheumatoid arthritis.
This was compound E, or
cortisone, as it
came to be known, which
had been isolated by
Edward C. Kendall in
1935. Cortisone and its
many derivatives proved
to be potent as
anti-inflammatory
agents. Although it is
not a cure for
rheumatoid arthritis, as
a temporary measure
cortisone can often
control the acute
exacerbation caused by
the disease and can
provide relief in other
conditions, such as
acute
rheumatic fever,
certain kidney diseases,
certain serious diseases
of the skin, and some
allergic conditions,
including acute
exacerbations of asthma.
Of even more long-term
importance is the
valuable role it has as
a research tool.
Not the least of the
advances in
endocrinology was the
increasing knowledge and
understanding of the sex
hormones. This
culminated in the
application of this
knowledge to the problem
of
birth control. After
an initial stage of
hesitancy, the
contraceptive
pill, with its basic
rationale of preventing
ovulation, was accepted
by the vast majority of
family-planning
organizations and many
gynecologists as the
most satisfactory method
of contraception. Its
risks, practical and
theoretical, introduced
a note of caution, but
this was not sufficient
to detract from the wide
appeal induced by its
effectiveness and ease
of use.
Medicine in the 20th
century
» Vitamins
In the field of
nutrition, the
outstanding advance of
the 20th century was the
discovery and the
appreciation of the
importance to health of
the “accessory food
factors,” or
vitamins. Various
workers had shown that
animals did not thrive
on a synthetic
diet containing all
the correct amounts of
protein, fat, and
carbohydrate; they even
suggested that there
must be some unknown
ingredients in natural
food that were essential
for growth and the
maintenance of health.
But little progress was
made in this field until
the classical
experiments of the
English biologist
F. Gowland Hopkins
were published in 1912.
These were so conclusive
that there could be no
doubt that what he
termed “accessory
substances” were
essential for health and
growth.
The name
vitamine was suggested
for these substances by
the biochemist
Casimir Funk in the
belief that they were
amines, certain
compounds derived from
ammonia. In due course,
when it was realized
that they were not
amines, the term was
altered to vitamin.
Once the concept of
vitamins was established
on a firm scientific
basis it was not long
before their identity
began to be revealed.
Soon there was a long
series of vitamins, best
known by the letters of
the alphabet after which
they were originally
named when their
chemical identity was
still unknown. By
supplementing the diet
with foods containing
particular
vitamins, deficiency
diseases such as
rickets (due to
deficiency of
vitamin D) and
scurvy (due to lack
of
vitamin C, or
ascorbic acid)
practically disappeared
from Western countries,
while deficiency
diseases such as
beriberi (caused by
lack of vitamin B1,
or thiamine), which were
endemic in Eastern
countries, either
disappeared or could be
remedied with the
greatest of ease.
The isolation of
vitamin B12,
or
cyanocobalamin, was
of particular interest
because it almost
rounded off the
fascinating story of how
pernicious anemia
was brought under
control. Throughout the
first two decades of the
century, the diagnosis
of pernicious anemia,
like that of diabetes
mellitus, was nearly
equivalent to a death
sentence. Unlike the
more common form of
so-called secondary
anemia, it did not
respond to the
administration of
suitable
iron salts, and no
other form of treatment
touched it; hence, the
grimly appropriate title
of pernicious anemia.
In the early 1920s,
George R. Minot, one
of the many brilliant
investigators that
Harvard University
has contributed to
medical research,
became interested in
work being done by the
American pathologist
George H. Whipple on
the beneficial effects
of raw beef liver in
severe experimental
anemia. With a Harvard
colleague,
William P. Murphy,
he decided to
investigate the effect
of raw liver in patients
with pernicious anemia,
and in 1926 they were
able to announce that
this form of therapy was
successful. The validity
of their findings was
amply confirmed, and the
fear of pernicious
anemia came to an end.
As so often happens
in medicine, many years
were to pass before the
rationale of liver
therapy in pernicious
anemia was fully
understood. In 1948,
however, almost
simultaneously in the
United States and
Britain, the active
principle,
cyanocobalamin, was
isolated from liver, and
this vitamin became the
standard treatment for
pernicious anemia.
Medicine in the 20th
century
» Malignant disease
While progress was the
hallmark of medicine
after the beginning of
the 20th century, there
is one field in which a
gloomier picture must be
painted, that of
malignant disease, or
cancer. It is the
second most common cause
of death in most Western
countries in the second
half of the 20th
century, being exceeded
only by deaths from
heart disease. Some
progress, however, has
been achieved. The
causes of the various
types of malignancies
are not known, but many
more methods are
available for attacking
the problem; surgery
remains the principal
therapeutic standby, but
radiotherapy and
chemotherapy are
increasingly used.
Soon after the discovery
of
radium was
announced, in 1898, its
potentialities in
treating cancer were
realized; in due course
it assumed an important
role in therapy.
Simultaneously, deep
X-ray therapy was
developed, and with the
atomic age came the use
of
radioactive isotopes.
(A
radioactive isotope
is an unstable variant
of a substance that has
a stable form; during
the process of breaking
down, the unstable form
emits radiation.)
High-voltage X-ray
therapy and radioactive
isotopes have largely
replaced radium. Whereas
irradiation long
depended upon X rays
generated at 250
kilovolts, machines that
are capable of producing
X rays generated at
8,000 kilovolts and
betatrons of up to
22,000,000 electron volts
(22 MeV)
have come into clinical
use.
The most effective of
the isotopes is
radioactive
cobalt. Telecobalt
machines (those that
hold the cobalt at a
distance from the body)
are available containing
2,000 curies or more of
the isotope, an amount
equivalent to 3,000
grams of radium and
sending out a beam
equivalent to that from
a 3,000-kilovolt X-ray
machine.
Of even more
significance have been
the developments in the
chemotherapy of cancer.
Nothing remotely
resembling a
chemotherapeutic cure
has been achieved, but
in certain forms of
malignant disease, such
as leukemia, which
cannot be treated by
surgery, palliative
effects have been
achieved that prolong
life and allow the
patient in many
instances to lead a
comparatively normal
existence.
Fundamentally,
however, perhaps the
most important advance
of all in this field has
been the increasing
appreciation of the
importance of
prevention. The
discovery of the
relationship between
cigarette smoking and
lung cancer is the
classic example. Less
publicized, but of equal
import, is the
continuing supervision
of new techniques in
industry and food
manufacture in an
attempt to ensure that
they do not involve the
use of cancer-causing
substances.
The first half of the
20th century witnessed
the virtual conquest of
three of the major
diseases of the tropics:
malaria,
yellow fever, and
leprosy. At the turn of
the century, as for the
preceding two centuries,
quinine was the only
known drug to have any
appreciable effect on
malaria. With the
increasing development
of tropical countries
and rising standards of
public health, it became
obvious that quinine was
not completely
satisfactory. Intensive
research between World
Wars I and II indicated
that several synthetic
compounds were more
effective. The first of
these to become
available, in 1934, was
quinacrine (known as
mepacrine, Atabrine, or
Atebrin). In World War
II it amply fulfilled
the highest expectations
and helped to reduce
disease among Allied
troops in Africa,
Southeast Asia, and
the Far East. A number
of other effective
antimalarial drugs
subsequently became
available.
An even
brighter prospect—the
virtual eradication of
malaria—was opened up by
the introduction, during
World War II, of the
insecticide
DDT
(1,1,1-trichloro-2,2-bis[p-chlorophenyl]ethane,
or
dichlorodiphenyltrichloroethane).
It had long been
realized that the only
effective way of
controlling malaria was
to eradicate the
anopheline
mosquitoes that
transmit the disease.
Older methods of
mosquito control,
however, were cumbersome
and expensive. The
lethal effect of DDT on
the mosquito, its
relative cheapness, and
its ease of use on a
widespread scale
provided the answer. An
intensive worldwide
campaign, sponsored by
the
World Health
Organization, was
planned and went far
toward bringing malaria
under control.
The major problem
encountered with respect
to effectiveness was
that the mosquitoes were
able to develop a
resistance to DDT; but
the introduction of
other insecticides, such
as dieldrin and lindane
(BHC), helped to
overcome this
difficulty. In recent
years the use of these
and other insecticides
has been strongly
criticized by
ecologists, however.
Yellow fever is
another
mosquito-transmitted
disease, and the
prophylactic value of
modern insecticides in
its control was almost
as great as in the case
of malaria. The forest
reservoirs of the virus
present a more difficult
problem, but the
combined use of
immunization and
insecticides did much to
bring this disease under
control.
Until the 1940s the
only drugs available for
treating
leprosy were the
chaulmoogra oils and
their derivatives.
These, though helpful,
were far from
satisfactory. In the
1940s the group of drugs
known as the
sulfones appeared,
and it soon became
apparent that they were
infinitely better than
any other group of drugs
in the treatment of
leprosy. Several other
drugs later proved
promising. Although
there is as yet no known
cure—in the strict sense
of the term—for leprosy,
the outlook has so
changed that there are
good grounds for
believing that this
age-old scourge can be
brought under control
and the victims of the
disease saved from those
dreaded mutilations that
have given leprosy such
a fearsome reputation
throughout the ages.
William Archibald Robson Thomson
Philip Rhodes
Surgery in the 20th
century
» The opening
phase
Three seemingly
insuperable
obstacles beset the
surgeon in the years
before the mid-19th
century: pain,
infection, and
shock. Once these
were overcome, the
surgeon believed
that he could burst
the bonds of
centuries and become
the master of his
craft. There is
more, however, to
anesthesia than
putting the patient
to sleep. Infection,
despite first
antisepsis
(destruction of
microorganisms
present) and later
asepsis (avoidance
of contamination),
is still an
ever-present menace;
and shock continues
to perplex
physicians. But in
the 20th century,
surgery has
progressed farther,
faster, and more
dramatically than in
all preceding ages.
Surgery in
the 20th century
» The opening phase
» The situation
encountered
The shape of surgery
that entered the new
century was clearly
recognizable as the
forerunner of today’s,
blurred and hazy though
the outlines may now
seem. The operating
theatre still retained
an aura of the past,
when the surgeon played
to his audience and the
patient was little more
than a stage prop. In
most hospitals it was a
high room lit by a
skylight, with tiers of
benches rising above the
narrow, wooden operating
table. The instruments,
kept in glazed or wooden
cupboards around the
walls, were of forged
steel, unplated, and
with handles of wood or
ivory.
The means to
combat
infection hovered
between
antisepsis and
asepsis. Instruments
and dressings were
mostly sterilized by
soaking them in dilute
carbolic acid (or
other antiseptic), and
the surgeon often
endured a gown freshly
wrung out in the same
solution. Asepsis gained
ground fast, however. It
had been born in the
Berlin clinic of
Ernst von Bergmann,
where, in 1886, steam
sterilization had been
introduced. Gradually,
this led to the complete
aseptic ritual, which
has as its basis the
bacterial cleanliness
(as opposed to social
cleanliness) of
everything that comes in
contact with the wound.
Hermann Kümmell, of
Hamburg, devised the
routine of “scrubbing
up.” In 1890
William Stewart Halsted,
of
Johns Hopkins University,
had rubber gloves
specially made for
operating, and in 1896
Johannes von
Mikulicz-Radecki, a Pole
working at Breslau,
Ger., invented the gauze
mask.
Many surgeons,
brought up in a confused
misunderstanding of the
antiseptic
principle—believing that
carbolic would cover a
multitude of sins, many
of which they were
ignorant of
committing—failed to
grasp what asepsis was
all about. Thomas
Annandale, for example,
blew through his
catheters to make sure
that they were clear,
and many an instrument,
dropped accidentally,
was simply given a quick
wipe and returned to
use. Tradition died
hard, and asepsis had an
uphill struggle before
it was fully accepted.
“I believe firmly that
more patients have died
from the use of gloves
than have ever been
saved from infection by
their use,” wrote W.P.
Carr, an American, in
1911. Over the years,
however, a sound
technique was evolved as
the foundation for the
growth of modern
surgery.
Anesthesia, at the
turn of the century,
progressed slowly. Few
physicians made a career
of the subject, and
frequently the patient
was rendered unconscious
by a student, a nurse,
or a porter wielding a
rag and bottle.
Chloroform was
overwhelmingly more
popular than ether, on
account of its ease of
administration, despite
the fact that it was
liable to kill by
stopping the heart.
Although by the end
of the first decade,
nitrous oxide
(laughing gas) combined
with ether had
displaced—but by no
means entirely—the use
of chloroform, the
surgical problems were
far from ended. For
years to come the
abdominal surgeon
besought the anesthetist
to deepen the level of
anesthesia and thus
relax the abdominal
muscles; the anesthetist
responded to the best of
his ability, acutely
aware that the deeper he
went, the closer the
patient was to death.
When other anesthetic
agents were discovered,
the anesthetist came
into his own, and many
advances in spheres such
as brain and heart
surgery would have been
impossible without his
skill.
The third obstacle,
shock, is perhaps
the most complex and the
most difficult to define
satisfactorily. The only
major cause properly
appreciated at the start
of the 20th century was
loss of blood, and once
that had occurred
nothing, in those days,
could be done. And so,
the study of shock—its
causes, its effects on
human physiology, and
its prevention and
treatment—became
all-important to the
progress of surgery.
In the latter part of
the 19th century, then,
surgeons had been
liberated from the
age-old bogies of pain,
pus, and hospital
gangrene. Hitherto,
operations had been
restricted to
amputations, cutting for
stone in the bladder,
tying off arterial
aneurysms (bulging and
thinning of artery
walls), repairing
hernias, and a variety
of procedures that could
be done without going
too deeply beneath the
skin. But the anatomical
knowledge, a crude skill
derived from practice on
dead bodies, and above
all the enthusiasm, were
there waiting. Largely
ignoring the mass of
problems they uncovered,
surgeons launched forth
into an exploration of
the
human body.
They acquired a
reputation for
showmanship; but much of
their surgery, though
speedy and spectacular,
was rough and ready.
There were a few who
developed supreme skill
and dexterity and could
have undertaken a modern
operation with but
little practice; indeed,
some devised the very
operations still in use
today. One such was
Theodor Billroth,
head of the surgical
clinic at Vienna, who
collected a formidable
list of successful
“first” operations. He
represented the best of
his generation—a
surgical genius, an
accomplished musician,
and a kind, gentle man
who brought the breath
of humanity to his work.
Moreover, the men he
trained, including von
Mikulicz, Vincenz
Czerny, and Anton von
Eiselsberg, consolidated
the brilliant start that
he had given to
abdominal surgery in
Europe.
Surgery in
the 20th century
» The opening phase
» Changes before
World War I
The opening decade of
the 20th century was a
period of transition.
Flamboyant exhibitionism
was falling from favour
as surgeons, through
experience, learned the
merits of painstaking,
conscientious
operation—treating the
tissues gently and
carefully controlling
every bleeding point.
The individualist was
not submerged, however,
and for many years the
development of the
various branches of
surgery rested on the
shoulders of a few
clearly identifiable
men. Teamwork on a large
scale arrived only after
World War II. The
surgeon, at first, was
undisputed master in his
own wards and theatre.
But as time went on and
he found he could not
solve his problems
alone, he called for
help from specialists in
other fields of medicine
and, even more
significantly, from his
colleagues in other
scientific disciplines.
The increasing scope of
surgery led to
specialization.
Admittedly, most general
surgeons had a special
interest, and for a long
time there had been an
element of
specialization in such
fields as ophthalmology,
orthopedics, obstetrics,
and gynecology; but
before long it became
apparent that, to
achieve progress in
certain areas, surgeons
had to concentrate their
attention on that
particular subject.
Surgery
in the 20th century
» The opening
phase
» Changes before
World War I
»
Abdominal
surgery
By the start of the
20th century,
abdominal surgery,
which provided the
general surgeon with
the bulk of his
work, had grown
beyond infancy,
thanks largely to
Billroth. In 1881 he
had performed the
first successful
removal of part of
the stomach for
cancer. His next two
cases were failures,
and he was stoned in
the streets of
Vienna. Yet, he
persisted and by
1891 had carried out
41 more of these
operations with 16
deaths—a remarkable
achievement for that
era.
Peptic ulcers
(gastric and
duodenal) appeared
on the surgical
scene (perhaps as a
new disease, but
more probably
because they had not
been diagnosed
previously), and in
1881 Ludwig Rydygier
cured a young woman
of her
gastric ulcer by
removing it. Bypass
operations—gastroenterostomies—soon
became more popular,
however, and enjoyed
a vogue that lasted
into the 1930s, even
though fresh ulcers
at the site of the
juncture were not
uncommon.
The other end of
the
alimentary tract
was also subjected
to surgical
intervention;
cancers were removed
from the large bowel
and rectum with
mortality rates that
gradually fell from
80 to 60 to 20 to 12
percent as the
surgeons developed
their skill. In 1908
the British surgeon
Ernest
Miles carried
out the first
abdominoperineal
resection for cancer
of the rectum; that
is, the cancer was
attacked both from
the abdomen and from
below through the
perineum (the area
between the anus and
the genitals),
either by one
surgeon, who
actually did two
operations, or by
two working
together. This
technique formed the
basis for all future
developments.
Much of the new
surgery in the
abdomen was for
cancer, but not all.
Appendectomy became
the accepted
treatment for
appendicitis (in
appropriate cases)
in the
United States
before the close of
the 19th century;
but in Great Britain
surgeons were
reluctant to remove
the organ until
1902, when King
Edward VII’s
coronation was
dramatically
postponed on account
of his appendicitis.
The publicity
attached to his
operation caused the
disease and its
surgical treatment
to become
fashionable—despite
the fact that the
royal appendix
remained in the
King’s abdomen; the
surgeon, Frederic
Treves, had merely
drained the abscess.
Surgery
in the 20th century
» The opening
phase
» Changes before
World War I
»
Neurosurgery
Though probably the
most demanding of
all the surgical
specialties,
neurosurgery was
nevertheless one of
the first to emerge.
The techniques and
principles of
general surgery
were inadequate for
work in such a
delicate field.
William
Macewen, a
Scottish general
surgeon of
outstanding
versatility, and
Victor Alexander
Haden Horsley,
the first British
neurosurgeon, showed
that the surgeon had
much to offer in the
treatment of disease
of the
brain and
spinal cord.
Macewen, in 1893,
recorded 19 patients
operated on for
brain abscess,
18 of whom were
cured; at that time
most other surgeons
had 100 percent
mortality rates for
the condition. His
achievement remained
unequaled until the
discovery of
penicillin.
An
American,
Harvey Williams
Cushing, almost
by himself
consolidated
neurosurgery as a
specialty. From 1905
on, he advanced
neurosurgery through
a series of
operations and
through his
writings. Tumours,
epilepsy,
trigeminal neuralgia,
and pituitary
disorders were among
the conditions he
treated
successfully.
Surgery
in the 20th century
» The opening
phase
» Changes before
World War I
» Radiology
In 1895 a
development at the
University of
Würzburg had
far-reaching effects
on medicine and
surgery, opening up
an entirely fresh
field of the
diagnosis and study
of disease and
leading to a new
form of treatment,
radiation therapy.
This was the
discovery of
X rays by
Wilhelm Conrad
Röntgen, a professor
of physics. Within
months of the
discovery there was
an extensive
literature on the
subject: Robert
Jones, a British
surgeon, had
localized a bullet
in a boy’s wrist
before operating;
stones in the
urinary bladder
and gallbladder had
been demonstrated;
and fractures had
been displayed.
Experiments began on
introducing
substances that are
opaque to X rays
into the body to
reveal organs and
formations, both
normal and abnormal.
Walter Cannon, a
Boston physiologist,
used X rays in 1898
in his studies of
the alimentary
tract. Friedrich
Voelcker, of
Heidelberg, devised
retrograde
pyelography
(introduction of the
radiopaque medium
into the
kidney pelvis by
way of the ureter)
for the study of the
urinary tract in
1905; in Paris in
1921, Jean Sicard
X-rayed the spinal
canal with the help
of an oily iodine
substance, and the
next year he did the
same for the
bronchial tree; and
in 1924 Evarts
Graham, of St.
Louis, used a
radiopaque
contrast medium
to view the
gallbladder. Air was
also used to provide
contrast; in 1918,
at
Johns Hopkins,
Walter Dandy
injected air into
the ventricles
(liquid-filled
cavities) of the
brain.
The problems of
injecting contrast
media into the
blood vessels
took longer to
solve, and it was
not until 1927 that
António Moniz, of
Lisbon, succeeded in
obtaining pictures
of the arteries of
the brain. Eleven
years later, George
Robb and Israel
Steinberg of New
York overcame some
of the difficulties
of
cardiac
catheterization
(introduction of a
small tube into the
heart by way of
veins or arteries)
and were able to
visualize the
chambers of the
heart on X-ray
film. After much
research, a further
refinement came in
1962, when Frank
Sones and Earl K.
Shirey of Cleveland
showed how to
introduce the
contrast medium into
the
coronary arteries.
The battlefields of the
20th century stimulated
the progress of surgery
and taught the surgeon
innumerable lessons,
which were subsequently
applied in civilian
practice. Regrettably,
though, the principles
of military surgery and
casualty evacuation,
which can be traced back
to the Napoleonic wars,
had to be learned over
again.
World War I
broke, quite
dramatically, the
existing surgical
hierarchy and rule of
tradition. No longer did
the European surgeon
have to waste his best
years in apprenticeship
before seating himself
in his master’s chair.
Suddenly, young surgeons
in the armed forces
began confronting
problems that would have
daunted their elders.
Furthermore, their
training had been in
“clean” surgery
performed under aseptic
conditions. Now they
found themselves faced
with the need to treat
large numbers of grossly
contaminated
wounds in improvised
theatres. They
rediscovered
debridement (the
surgical excision of
dead and dying tissue
and the removal of
foreign matter).
The older surgeons
cried “back to Lister,”
but antiseptics, no
matter how strong, were
no match for
putrefaction and
gangrene. One method of
antiseptic
irrigation—devised by
Alexis Carrel and
Henry Dakin and called
the
Carrel–Dakin treatment—was,
however, beneficial, but
only after the wound had
been adequately debrided.
The scourges of tetanus
and
gas gangrene were
controlled to a large
extent by antitoxin and
antiserum injections,
yet surgical treatment
of the wound remained an
essential requirement.
Abdominal casualties
fared badly for the
first year of the war,
because experience in
the utterly different
circumstances of the
South African War had
led to a belief that
these men were better
left alone surgically.
Fortunately, the error
of continuing with such
a policy 15 years later
was soon appreciated,
and every effort was
made to deliver the
wounded men to a
suitable surgical unit
with all speed. Little
progress was made with
chest wounds beyond
opening up the wound
even further to drain
pus from the
pleural cavity
between the chest wall
and the lungs.
Perhaps the most
worthwhile and enduring
benefit to flow from
World War I was
rehabilitation. For
almost the first time,
surgeons realized that
their work did not end
with a healed wound. In
1915 Robert Jones set up
special facilities for
orthopedic patients, and
at about the same time
Harold Gillies
founded British
plastic surgery in a
hut at Sidcup, Kent. In
1917 Gillies popularized
the pedicle type of
skin graft (the type
of graft in which skin
and
subcutaneous tissue
are left temporarily
attached for nourishment
to the site from which
the graft was taken).
Since then plastic
surgery has given many
techniques and
principles to other
branches of surgery.
Surgery in
the 20th century
» Between the world
wars
The years between the
two world wars may
conveniently be regarded
as the time when surgery
consolidated its
position. A surprising
number of surgical
firsts and an amazing
amount of fundamental
research had been
achieved even in the
late 19th century, but
the knowledge and
experience could not be
converted to practical
use because the human
body could not survive
the onslaught. In the
years between World Wars
I and II, it was
realized that
physiology—in its widest
sense, including
biochemistry and fluid
and electrolyte
balance—was of major
importance along with
anatomy, pathology, and
surgical technique.
Surgery in the 20th
century
» Between the
world wars
» The problem
of
shock
The first
problem to be
tackled was
shock, which
was, in brief,
found to be due
to a decrease in
the effective
volume of the
circulation. To
combat shock,
the volume had
to be restored,
and the obvious
substance was
blood itself. In
1901
Karl Landsteiner,
then in Austria,
discovered the
ABO blood
groups, and in
1914 sodium
citrate was
added to freshly
drawn
blood to
prevent
clotting. Blood
was occasionally
transfused
during World War
I, but
three-quarters
of a pint was
considered a
large amount.
These
transfusions
were given by
directly linking
the vein of a
donor with that
of the
recipient. The
continuous drip
method, in which
blood flows
from a flask,
was introduced
by Hugh Marriott
and Alan Kekwick
at the Middlesex
Hospital,
London, in 1935.
As
blood
transfusions
increased in
frequency and
volume,
blood banks
were required.
Although it took
another world
war before these
were organized
on a large
scale, the first
tentative steps
were taken by
Sergey
Sergeyevich
Yudin, of
Moscow, who, in
1933, used
cadaver blood,
and by Bernard
Fantus, of
Chicago, who,
four years
later, used
living donors as
his source of
supply. Saline
solution,
plasma,
artificial
plasma
expanders, and
other solutions
are now also
used in the
appropriate
circumstances.
Sometimes
after operations
(especially
abdominal
operations), the
gut becomes
paralyzed. It is
distended, and
quantities of
fluid pour into
it,
dehydrating
the body. In
1932 Owen
Wangensteen, at
the
University of
Minnesota,
advised
decompressing
the bowel, and
in 1934 two
other Americans,
Thomas Miller
and William
Abbott, of
Philadelphia,
invented an
apparatus for
this purpose, a
tube with an
inflatable
balloon on the
end that could
be passed into
the
small intestine.
The fluid lost
from the tissues
was replaced by
a continuous
intravenous drip
of saline
solution on the
principle
described by
Rudolph Matas,
of
New Orleans,
in 1924. These
techniques
dramatically
improved
abdominal
surgery,
especially in
cases of
obstruction,
peritonitis
(inflammation of
the abdominal
membranes), and
acute
emergencies
generally, since
they made it
possible to keep
the bowel empty
and at rest.
Surgery in the 20th
century
» Between the
world wars
» Anesthesia
and thoracic surgery
The strides
taken in
anesthesia from
the 1920s onward
allowed surgeons
much more
freedom. Rectal
anesthesia had
never proved
satisfactory,
and the first
improvement on
the combination
of nitrous
oxide, oxygen,
and ether was
the introduction
of the
general
anesthetic
cyclopropane
by
Ralph Waters
of Madison,
Wis., in 1933.
Soon afterward,
intravenous
anesthesia was
introduced;
John Lundy
of the Mayo
Clinic brought
to a climax a
long series of
trials by many
workers when he
used
Pentothal (thiopental
sodium, a
barbiturate) to
put a patient
peacefully to
sleep. Then, in
1942, Harold
Griffith and G.
Enid Johnson, of
Montreal,
produced
muscular
paralysis by the
injection of a
purified
preparation of
curare. This
was harmless
since, by then,
the anesthetist
was able to
control the
patient’s
respiration.
If there was one
person who was
aided more than
any other by the
progress in
anesthesia, it
was the
thoracic
(chest) surgeon.
What had
bothered him
previously was
the collapse of
the
lung, which
occurred
whenever the
pleural cavity
was opened.
Since the end of
the 19th
century, many
and ingenious
methods had been
devised to
prevent this
from happening.
The best known
was the negative
pressure cabinet
of
Ernst
Ferdinand
Sauerbruch, then
at Mikulicz’
clinic at
Breslau; the
cabinet was
first
demonstrated in
1904 but was
destined soon to
become obsolete.
The solution
lay in
inhalational
anesthesia
administered
under pressure.
Indeed, when
Théodore
Tuffier, in
1891,
successfully
removed the apex
of a lung for
tuberculosis,
this was the
technique that
he used; he even
added an
inflatable cuff
around the tube
inserted in the
trachea to
ensure a
gas-tight fit.
Tuffier was
ahead of his
time, however,
and other
surgeons and
research workers
wandered into
confused and
complex byways
before Ivan
Magill and Edgar
Rowbotham,
working at
Gillies’
plastic-surgery
unit, found
their way back
to the
simplicity of
the endotracheal
tube and
positive
pressure. In
1931 Ralph
Waters showed
that
respiration
could be
controlled
either by
squeezing the
anesthetic bag
by hand or by
using a small
motor.
These
advances allowed
thoracic surgery
to move into
modern times. In
the 1920s,
operations had
been performed
mostly for
infective
conditions and
as a last
resort. The
operations
necessarily were
unambitious and
confined to
collapse
therapy,
including
thoracoplasty
(removal of
ribs),
apicolysis
(collapse of a
lung apex and
artificially
filling the
space), and
phrenic crush
(which paralyzed
the diaphragm on
the chosen
side); to
isolation of the
area of lung to
be removed by
first creating
pleural
adhesions; and
to drainage.
The technical
problems of
surgery within
the chest were
daunting until
Harold Brunn of
San Francisco
reported six
lobectomies
(removals of
lung lobes) for
bronchiectasis
with only one
death. (In
bronchiectasis
one or more
bronchi or
bronchioles are
chronically
dilated and
inflamed, with
copious
discharge of
mucus mixed with
pus.) The secret
of Brunn’s
success was the
use of
intermittent
suction after
surgery to keep
the cavity free
of secretions
until the
remaining lobes
of the lung
could expand to
fill the space.
In 1931 Rudolf
Nissen, in
Berlin, removed
an entire lung
from a girl with
bronchiectasis.
She recovered to
prove that the
risks were not
as bad as had
been feared.
Cancer of
the lung has
become a major
disease of the
20th century;
perhaps it has
genuinely
increased, or
perhaps modern
techniques of
diagnosis reveal
it more often.
As far back as
1913 a Welshman,
Hugh Davies,
removed a lower
lobe for cancer,
but a new era
began when
Evarts Graham
removed a whole
lung for cancer
in 1933. The
patient, a
doctor, was
still alive at
the time of
Graham’s death
in 1957.
The thoracic
part of the
esophagus is
particularly
difficult to
reach, but in
1909 the British
surgeon Arthur
Evans
successfully
operated on it
for cancer. But
results were
generally poor
until, in 1944,
John Garlock,
of New York,
showed that it
is possible to
excise the
esophagus and to
bring the
stomach up
through the
chest and join
it to the
pharynx. Lengths
of colon are
also used as
grafts to bridge
the gap.
Surgery in the 20th
century
»
World
War II
and after
» Support
from other
technologies
At first,
perhaps, the
surgeon tried to
do too much
himself, but
before long his
failures taught
him to share his
problems with
experts in other
fields. This was
especially so
with respect to
difficulties of
biomedical
engineering and
the exploitation
of new
materials. The
relative
protection from
infection given
by antibiotics
and chemotherapy
allowed the
surgeon to
become far more
adventurous than
hitherto in
repairing and
replacing
damaged or
worn-out tissues
with foreign
materials. Much
research was
still needed to
find the best
material for a
particular
purpose and to
make sure that
it would be
acceptable to
the body.
Plastics, in
their seemingly
infinite
variety, have
come to be used
for almost
everything from
suture material
to heart valves;
for
strengthening
the repair of
hernias; for
replacement of
the head of the
femur (first
done by the
French surgeon
Jean Judet and
his brother
Robert-Louis
Judet in 1950);
for replacement
of the lens of
the eye after
extraction of
the natural lens
for cataract;
for valves to
drain fluid from
the brain in
patients with
hydrocephalus;
and for many
other
applications.
This is a far
cry, indeed,
from the
unsatisfactory
use of celluloid
to restore bony
defects of the
face by the
German surgeon
Fritz Berndt in
the 1890s. Inert
metals, such as
vitallium, have
also found a
place in
surgery, largely
in orthopedics
for the repair
of fractures and
the replacement
of joints.
The scope of
surgery was
further expanded
by the
introduction of
the operating
microscope.
This brought the
benefit of
magnification
particularly to
neurosurgery and
to ear surgery.
In the latter it
opened up a
whole field of
operations on
the eardrum and
within the
middle ear.
The principles
of these
operations were
stated in 1951
and 1952 by two
German surgeons,
Fritz Zöllner
and Horst
Wullstein; and
in 1952 Samuel
Rosen of New
York mobilized
the footplate of
the stapes to
restore
hearing in
otosclerosis—a
procedure
attempted by the
German Jean
Kessel in 1876.
Although
surgeons aim to
preserve as much
of the body as
disease permits,
they are
sometimes forced
to take radical
measures to save
life; when, for
instance, cancer
affects the
pelvic organs.
Pelvic
exenteration
(surgical
removal of the
pelvic organs
and nearby
structures) in
two stages was
devised by Allen
Whipple of
New York City,
in 1935, and in
one stage by
Alexander
Brunschwig, of
Chicago, in
1937. Then, in
1960, Charles S.
Kennedy, of
Detroit, after a
long discussion
with Brunschwig,
put into
practice an
operation that
he had been
considering for
12 years:
hemicorporectomy—surgical
removal of the
lower part of
the body. The
patient died on
the 11th day.
The first
successful
hemicorporectomy
(at the level
between the
lowest
lumbar vertebra
and the sacrum)
was performed 18
months later by
J. Bradley Aust
and Karel B.
Absolon, of
Minnesota. This
operation would
never have been
possible without
all the
technical,
supportive, and
rehabilitative
resources of
modern medicine.
The attitude of
the medical
profession
toward heart
surgery was for
long
overshadowed by
doubt and
disbelief.
Wounds of the
heart could be
sutured (first
done
successfully by
Ludwig Rehn,
of
Frankfurt am
Main, in
1896); the
pericardial
cavity—the
cavity formed by
the sac
enclosing the
heart—could be
drained in
purulent
infections (as
had been done by
Larrey in 1824);
and the
pericardium
could be
partially
excised for
constrictive
pericarditis
when it was
inflamed and
constricted the
movement of the
heart (this
operation was
performed by
Rehn and
Sauerbruch in
1913). But
little beyond
these procedures
found
acceptance.
Yet, in the
first two
decades of the
20th century,
much
experimental
work had been
carried out,
notably by the
French surgeons
Théodore Tuffier
and Alexis
Carrel. Tuffier,
in 1912,
operated
successfully on
the aortic
valve. In
1923 Elliott
Cutler of Boston
used a tenotome,
a tendon-cutting
instrument, to
relieve a girl’s
mitral stenosis
(a narrowing of
the
mitral valve
between the
upper and lower
chambers of the
left side of the
heart) and in
1925, in London,
Henry Souttar
used a finger to
dilate a mitral
valve in a
manner that was
25 years ahead
of its time.
Despite these
achievements,
there was too
much
experimental
failure, and
heart disease
remained a
medical, rather
than surgical,
matter.
Resistance
began to crumble
in 1938, when
Robert Gross
successfully
tied off a
persistent
ductus
arteriosus
(a fetal blood
vessel between
the
pulmonary artery
and the aorta).
It was finally
swept aside in
World War II by
the remarkable
record of
Dwight
Harken, who
removed 134
missiles from
the chest—13 in
the heart
chambers—without
the loss of one
patient.
After the
war, advances
came rapidly,
with the initial
emphasis on the
correction or
amelioration of
congenital
defects. Gordon
Murray, of
Toronto, made
full use of his
amazing
technical
ingenuity to
devise and
perform many
pioneering
operations. And
Charles Bailey
of Philadelphia,
adopting a more
orthodox
approach, was
responsible for
establishing
numerous basic
principles in
the growing
specialty.
Until 1953,
however, the
techniques all
had one great
disadvantage:
they were done
“blind.” The
surgeon’s dream
was to stop the
heart so that he
could
see what he
was doing and be
allowed more
time in which to
do it. In 1952
this dream began
to come true
when
Floyd Lewis,
of Minnesota,
reduced the
temperature
of the body so
as to lessen its
need for oxygen
while he closed
a hole between
the two upper
heart chambers,
the atria. The
next year
John Gibbon, Jr.,
of Philadelphia
brought to
fulfillment the
research he had
begun in 1937;
he used his
heart–lung
machine to
supply oxygen
while he closed
a hole in the
septum between
the atria.
Unfortunately,
neither method
alone was ideal,
but intensive
research and
development
led, in the
early 1960s, to
their being
combined as
extracorporeal
cooling. That
is, the blood
circulated
through a
machine outside
the body, which
cooled it (and,
after the
operation,
warmed it); the
cooled blood
lowered the
temperature of
the whole body.
With the heart
dry and
motionless, the
surgeon operated
on the coronary
arteries; he
inserted plastic
patches over
holes; he
sometimes almost
remodeled the
inside of the
heart. But when
it came to
replacing valves
destroyed by
disease, he was
faced with a
difficult choice
between human
tissue and
man-made valves,
or even valves
from animal
sources.
Surgery in the 20th
century
»
World
War II
and after
» Organ
transplantation
In 1967 surgery
arrived at a
climax that made
the whole world
aware of its
medicosurgical
responsibilities
when the South
African surgeon
Christiaan
Barnard
transplanted
the first human
heart. Reaction,
both medical and
lay, contained
more than an
element of
hysteria. Yet,
in 1964, James
Hardy, of the
University of
Mississippi,
had transplanted
a chimpanzee’s
heart into a
man; and in that
year two
prominent
research
workers, Richard
Lower and Norman
E. Shumway, had
written:
“Perhaps the
cardiac surgeon
should pause
while society
becomes
accustomed to
resurrection of
the mythological
chimera.”
Research had
been
remorselessly
leading up to
just such an
operation ever
since Charles
Guthrie and
Alexis Carrel,
at the
University of
Chicago,
perfected the
suturing of
blood vessels in
1905 and then
carried out
experiments in
the
transplantation
of many organs,
including the
heart.
New
developments in
immunosuppression
(the use of
drugs to prevent
organ rejection)
have advanced
the field of
transplantation
enormously.
Kidney
transplantation
is now a routine
procedure that
is supplemented
by dialysis with
an
artificial
kidney
(invented by
Willem Kolff in
wartime Holland)
before and after
the operation;
mortality has
been reduced to
about 10 percent
per year.
Rejection of the
transplanted
heart by the
patient’s
immune system
was overcome to
some degree in
the 1980s with
the introduction
of the
immunosuppressant
cyclosporine;
records show
that many
patients have
lived for five
or more years
after the
transplant
operation.
The
complexity of
the liver and
the
unavailability
of supplemental
therapies such
as the
artificial
kidney have
contributed to
the slow
progress in
liver
transplantation
(first performed
in 1963 by
Thomas Starzl).
An increasing
number of
patients,
especially
children, have
undergone
successful
transplantation;
however, a
substantial
number may
require
retransplantation
due to the
failure of the
first graft.
Lung
transplants
(first performed
by Hardy in
1963) are
difficult
procedures, and
much progress is
yet to be made
in preventing
rejection. A
combined
heart-lung
transplant is
still in the
experimental
stage, but it is
being met with
increasing
success;
two-thirds of
those receiving
transplants are
surviving,
although
complications
such as
infection are
still common.
Transplantation
of all or part
of the pancreas
is not
completely
successful, and
further
refinements of
the procedures
(first performed
in 1966 by
Richard Lillehei)
are needed.
Robert G. Richardson
Ed.
Additional Reading
The literature
on the history
of medicine
covers all
topics and
periods and
includes
biographies as
well as
descriptions of
the development
of hospitals,
research
institutes,
health care, and
medical
education in
different
countries.
Introductory
studies include
George T.
Bettany,
Eminent
Doctors: Their
Lives and Their
Work, 2
vol. (1885,
reprinted 1972);
Arturo
Castiglioni,
A History of
Medicine,
2nd rev. ed.
(1947;
originally
published in
Italian, 1927),
a classic work;
Fielding H.
Garrison,
An
Introduction to
the History of
Medicine,
4th rev. ed.
(1929, reprinted
1967), a
scholarly
history;
Douglas Guthrie,
A History of
Medicine,
rev. ed. (1958);
Howard W.
Haggard,
Devils,
Drugs, and
Doctors: The
Story of the
Science of
Healing from
Medicine-Man to
Doctor
(1929, reprinted
1980);
Richard H. Meade,
An
Introduction to
the History of
General Surgery
(1968), a
well-documented
work on
developments in
surgery on
separate organs;
Charles Singer
and
E. Ashworth
Underwood,
A Short
History of
Medicine,
2nd ed. (1962);
Philip Rhodes,
An Outline
History of
Medicine
(1985). The Oxford
Companion to
Medicine, 2
vol., edited by
John Walton,
Paul B. Beeson,
and
Ronald Bodley
Scott
(1986), is a
comprehensive
text of
20th-century
developments and
persons.
Ancient
traditions of
non-Western
medicine are
presented in
P. Kutumbiah,
Ancient
Indian Medicine
(1962);
Heinrich R.
Zimmer,
Hindu
Medicine
(1948, reprinted
1979);
Edward H. Hume,
The Chinese
Way in Medicine
(1940, reprinted
1975);
Paul U. Unschuld,
Medicine in
China: A History
of Ideas
(1985;
originally
published in
German, 1980);
Edward G. Browne,
Arabian
Medicine
(1921, reprinted
1983).
For
developments
from the origin
of Western
medicine to the
end of the 18th
century, see
William G. Black,
Folk-Medicine: A
Chapter in the
History of
Culture
(1883, reprinted
1970);
W.H.R. Rivers,
Medicine,
Magic, and
Religion
(1924, reprinted
1979), a
comprehensive
treatment of
primitive
medicine;
John Scarborough,
Roman
Medicine
(1969, reprinted
1976);
Robert S.
Gottfried,
Doctors and
Medicine in
Medieval
England,
1340–1530
(1986);
A. Wear,
R.K. French,
and
I.M. Lonie
(eds.), The
Medical
Renaissance of
the Sixteenth
Century
(1985);
Katharine Park,
Doctors and
Medicine in
Early
Renaissance
Florence
(1985);
Guy Williams,
The Age of
Agony: The Art
of Healing, c.
1700–1800
(1975, reprinted
1986).
Medicine and
surgery during
the 19th and
20th centuries
are the subject
of
Carl J. Pfeiffer,
The Art and
Practice of
Western Medicine
in the Early
Nineteenth
Century
(1985);
Thomas E. Keys,
The History
of Surgical
Anesthesia,
rev. ed. (1963,
reprinted 1978);
M.H. Armstrong
Davison,
The
Evolution of
Anesthesia
(1965);
Robert G.
Richardson,
The Scalpel
and the Heart
(1970;
U.K. title,
The Surgeon’s
Heart: A History
of Cardiac
Surgery,
1969);
John S. Haller,
Jr.,
American
Medicine in
Transition,
1840–1910
(1981);
Ruth J. Abram
(ed.), Send
Us a Lady
Physician: Women
Doctors in
America,
1835–1920
(1985);
George Rosen,
The
Structure of
American Medical
Practice,
1875–1941
(1983);
A. McGehee
Harvey,
Science at
the Bedside:
Clinical
Research in
American
Medicine,
1905–1945
(1981), a
discussion of
the
institutionalization
of clinical
research;
Lawrence Galton,
Med Tech:
The Layperson’s
Guide to Today’s
Medical Miracles
(1985), a
historical
dictionary.