250 to 300 words on hospice care services. Describe one hospice that will benefit Ella and her family. Full instructions are attached.
Attachments:
week5discussion1.docx
chapter_10.pdf
chapter_11.pdf
1547681554701_family_document.pdf

Attachment preview:

Instructions
National Hospice….. website link

Resources

Instructor Guidance
This week, our discussion focuses on Ella and her decision to be at home, as much as
possible, for the remaining time in her life. When a patient is discharged from the hospital,
at-home care reduces the risk of the patient returning to the hospital. This, in turn, leads to
a decrease in the financial burden of health care (Family Caregiver Alliance, 2009, para.
10). In Ella’s case, her husband will be providing the majority of her care, with help from
the other family members. She will also have hospice services to provide medications,
wound care, and bath assistance.
Your job in this discussion is to learn what services are offered by hospice, beyond what is
listed above, and then find other community services to help the family members
supporting Ella and Joe. You will analyze and evaluate the community service. This means
you will identify what the service will do for the family, and whether or not it is enough to
address their needs fully.
Chapter 10: The U.S. Health Care System and Chronic Illness and Disability
Learning Objectives
1. Gain a broad overview of how the modern U.S. health care system was created
2. Understand how the U.S. health care system is financed through a mixture of public funding, private insurance, and charity care
3. Understand how chronic conditions have increasingly become the leading causes of morbidity and mortality during the 20th century
4. Understand how cost containment became a major concern in contemporary health care policy and how the prevalence of chronic disease makes cost containment difficult
5. Understand the complex challenges that the contemporary U.S. health care system faces, as reflected in the provisions of the Affordable Care Act of 2010
10.1  Introduction: The Creation of the Modern U.S. Health Care System

The system of health care delivery in the United States reflects broad patterns in the evolution of medical research and practice that developed over the course of the 19th and 20th
centuries within western society. Like other industrial democracies, the United States has
had to balance issues of access, cost, and quality in the delivery of health care services (Mechanic
and McAlpine, 2010). As Daniel M. Fox (1993a) argued in his essay “Medical Institutions and the
State,” these issues can be framed in economic terms of supply and demand. The supply of medical
services depends primarily on three related factors: the role and function of hospitals, the nature
of medical education and research, and the regulation of entry and practice of the profession. By
contrast, the demand for medical services (i.e., how they are used and by whom) depends largely
on how the system is financed. In addressing these demands, the U.S. system has come to rely on
a distinctive mix of public and private financing that clearly differentiates it from other industrial
democracies; in this respect, the United States is unique in its “patchwork quilt” of health care
financing. The Affordable Care Act (ACA) of 2010 tries to make the system of financing health care
more uniform; however, in the implementation of the ACA in the years ahead, isolated gaps in
coverage may persist—which means this legislation may have to be revisited in the future.
Supplying Modern Medicine: From Unregulated Craft to Regulated Profession
For centuries, western medicine—both its theories about disease and how it was practiced as
a social and economic activity—changed primarily by increments. Before the 19th century, the
dominant view held that disease resulted from an imbalance in bodily fluids known as “humors.”
Socially and economically, medicine was an unregulated market; nearly anyone could claim to
be a physician provided that he (or less probably, she) could convince patients to pay for clinical
services rendered. During the 19th century, these long-held views and practices were challenged,
primarily because of developments that occurred in Europe.
In the early 19th century, changes in Parisian medical education made hospital training central to
medical education. This training combined clinical observation with autopsies. Also, the availability of a large number of hospital patients made statistical comparisons between therapies possible. As a result, ancient dogmas were shattered: The view of “disease as humoral imbalance” was
replaced with the view of “disease in specific locations.” Depletion therapies such as bloodletting
were shown to be ineffective (empirical evidence indicated that those who were being bled were
dying more frequently than those who were not being bled). Demolishing these archaic views prepared the way for a new understanding, which resulted from reforms in higher education (most
notably in Germany). In particular, the modern research university was developed in Germany
and, in the biomedical context, laboratory research institutes were created to study subjects such
as physiology, pathology, and bacteriology.
Seeing these developments, reformers in American medical education upgraded academic standards. During the 1870s, the medical schools at both Harvard University and the University of
Pennsylvania extended the medical curriculum from 2 to 3 years. During the 1880s, many academic physicians likewise endorsed improvements in medical education to enhance the standing
of the profession overall. In 1890, a national association of medical schools was established; this
organization required that medical education last 3 years (for at least 6 months each year) and
that laboratory instruction be provided in histology, chemistry, and pathology (Starr, 1982).
But the decisive event that helped create the contemporary U.S. health care system was the
founding of Johns Hopkins University in Baltimore, Maryland. Endowed by the Baltimore railroad
investor for whom it was named, the university had a medical school from its inception. As Hopkins wrote to the trustees of the new university in an 1873 letter, Johns Hopkins Hospital should
in construction and arrangement compare favorably with any other institution of
like character in this country or in Europe. . . . [and it] should ultimately form a
part of the medical school of that university for which I have made ample provision by my will. (as cited in Chapman, 1994, pp. 107–108)
Hopkins died later that year, leaving $7 million to create the university, hospital, and medical school that would bear his name—the largest single endowment ever bestowed in the United States at the time.

Daniel Coit Gilman was selected to head Johns Hopkins as its first president. After earning his bachelor’s degree, Gilman had traveled throughout Europe and studied at the University of Berlin, which was founded early in the century as the model research university. In his inaugural address at Johns Hopkins, Gilman chided proponents of the then-common 2-year system of medical education by noting that “in some of our very best colleges, the degree of Doctor of Medicine can be obtained in half the time required to win the degree of Bachelor of Arts” (as cited in Chapman, 1994, p. 127). Gilman later proposed a 3-year program of study as a prerequisite for attending medical school.

[Photograph, Courtesy Everett Collection: Johns Hopkins University Medical School established the framework for training doctors that is still used today. Here four professors sit with the graduating class of 1900.]
In addition to Gilman, the other key figure in the early planning stages of Johns Hopkins was
Dr. John Shaw Billings. In 1876, the Johns Hopkins Hospital board of trustees paid for Billings to
travel to Europe for two months to inspect hospitals. This trip led to a chance meeting with a
young American doctor, William H. Welch, who was studying research techniques at a physiological laboratory in Leipzig. Billings was favorably impressed by Welch and persuaded President
Gilman to recruit him as a member of the medical school faculty; Welch accepted the appointment in 1884. In a similar coordinated effort, Billings and Gilman appointed William Osler
(1849–1919) in 1888 as a professor of medicine in the medical school and physician-in-chief in
the hospital (Chapman, 1994). Like Welch, Osler had completed his medical training in Europe,
where he had learned about new developments in pathology and laboratory sciences (Bynum
& Bynum, 2007). Welch and Osler, in turn, hired the other five members of the initial Johns
Hopkins Medical School faculty, each of whom was recruited based on research ability, as
reflected in scientific output (Bruce, 1987).
In 1893, the Johns Hopkins Medical School admitted its first class of students. Just as high standards had been implemented in the recruitment of faculty, high admission standards were implemented for students: All entering students were required to have a college degree. To
earn the medical degree, each student had to complete 4 years of instruction, which
involved 2 years of basic laboratory sciences and 2 years of observation on the hospital wards
under the supervision of the faculty (Starr, 1982). By conceiving of the graduate school, medical
school, and hospital as different facets of a unified educational institution, Johns Hopkins University Medical School established the basic framework for training doctors that still exists today, the
“teaching hospital.”
Parallel to these education reforms, various states successfully regulated entry into the profession
by establishing boards of medical examiners. By certifying that those who practiced medicine in
the state had proper credentials, the boards elevated and standardized the body of knowledge
a doctor had to master. Eventually, these higher standards required that medical education be
extended from 2 years (as had been customary) to 8 years beyond high school. These developments put many medical schools (especially those that relied on tuition as a primary source of
income) in a financial bind: The extended curriculum led to a decline in enrollment, and the higher
standards required costly upgrades to educational facilities in order to supply laboratory and clinical instruction. As the sociologist Paul Starr (1982) observed, these new economic realities killed
many medical schools from the late 19th through the early 20th century.
The catalyst that hastened these schools’ demise was a famous 1910 report on the state of American medical education authored by Johns Hopkins graduate Abraham Flexner. The background
to the Flexner Report was an earlier 1906 study of 160 American medical schools initiated by
the American Medical Association’s (AMA) Council on Medical Education. In its report, the AMA
approved of the instruction provided at 82 schools; it stated that instruction at 46 schools was
inadequate but salvageable and that instruction at the remaining 32 could not be adequately
upgraded. Because professional ethics prohibited criticism of fellow doctors, the AMA engaged
an outside organization, the Carnegie Foundation for the Advancement of Teaching, to conduct
a similar study. Heading the study was Flexner, who visited each of the medical schools in the
United States. Many of these schools (known as proprietary medical schools) were primarily a
collection of local doctors, often with no formal university affiliation, financed through tuition
(which was split among the faculty). In his report, Flexner documented numerous examples of blatant
falsehoods in the catalogues of many of these medical schools; some advertised the presence of
laboratory facilities that did not exist, libraries that had no books, and medical faculty who did
not teach (they were engaged in private practice instead). Flexner’s standard of adequate medical
education was clearly predicated on the Hopkins model, with its focus on laboratory and clinical
instruction. As he wrote, “The student no longer merely watches, listens, memorizes; he does” (as
cited in Bonner, 1995, p. 293). Like the earlier report, Flexner recommended that first-tier institutions be strengthened in emulation of the Hopkins model, that second-tier schools be elevated so
that they compared to their first-tier counterparts, and that all the remaining medical schools be
closed (Starr, 1982).
In response to Flexner’s report, large amounts of foundation money were directed at the first-tier
schools and those that Flexner believed could be elevated to first-tier status. For example, John
D. Rockefeller gave almost $50 million to improve medical education along the lines Flexner had
advocated. For salvageable schools that did not receive foundation dollars, state legislatures stepped in and provided funding so that an adequate number of doctors would be practicing
in each state. As Robert P. Hudson (1997) observed regarding Flexner’s achievement, “Flexner is
properly remembered not so much for the fire that he set as for his blueprint of the new structure
which was to arise from the ashes” (p. 206).
As Fox (1993b) observed, the general features of the health care system resulting from these
developments were in place by the 1920s:

•  Hospitals grew in size and began increasingly to use expensive “high-tech” equipment;
•  Medical and surgical specialties developed and had increasingly rigorous entrance requirements;
•  Medical school admission requirements became more stringent (e.g., possession of a bachelor’s degree);
•  Hiring, retention, and promotion of medical school faculty became based on laboratory research;
•  Teaching hospitals associated with medical schools became the leading providers of health care services in their regions;
•  Public health agencies at the local and state level focused on preventing, monitoring, and treating infectious diseases; and
•  State-mandated insurance programs provided compensation for workplace-related injuries and exposures to environmental toxins. (pp. 36–37)
Ultimately, these developments structured how health care services were supplied to patients in
the United States, including those who suffered from chronic conditions and disabilities.
Financing the System
The demand for these services was structured, in part, by how these medical services came to be
paid for. As Normand and Thomas (2009) observed, the modern state often plays a role in financing health care by mobilizing funds, developing a method for sharing risks (i.e., insurance), and
subsidizing access. Traditionally, modern states have relied on taxation to generate the needed
funds and have developed a state-administered system of social insurance to share risks among
the entire population. Like other modern countries, the United States has relied on taxes to generate revenue; however, it has never enacted a state-run system of national health insurance.
Instead, the general population has relied primarily on private health insurance, although the U.S.
government has provided funds to cover designated vulnerable segments of the population who
are unlikely to receive coverage from the private sector.
In contrast to the United States, other industrial democracies created publicly administered programs in the late 19th and early 20th centuries. In 1883, Germany enacted a health insurance law
that established compulsory health insurance financed by premiums for its industrial workers.
The premiums were paid in advance, with employers contributing one third and employees two thirds (Porter, 1999). Soon, other European countries followed Germany’s lead: in
1891, Sweden enacted voluntary health insurance legislation; in 1892, Denmark created workers’
voluntary insurance; in 1910, France created a universal state retirement system; and in 1911,
the British enacted the National Insurance Act, which funded coverage for medical care through
payments from employers and employees (Porter, 1999). In Great Britain, the National Health Service Act 1946 went into effect in 1948, establishing the present-day National Health Service in England and Wales.
Although various groups and politicians have advocated for the creation of universal social insurance for health care in the United States, private insurance plans became the norm. One early
attempt at private hospital insurance was the Baylor University Hospital plan. Started in December
1929, the program has been described as the “father” of the Blue Cross movement. It enrolled
public schoolteachers in the Dallas area; in return for a small monthly premium, each enrollee was
guaranteed 21 days of hospital care (Starr, 1982). Eventually, the success of this program led to plans in which all the hospitals in a geographic region were included in a single hospital insurance scheme.
Opponents of compulsory national insurance—such as the AMA and the U.S. Chamber of Commerce—used the emergence of the Blue Cross plans to argue for the appropriateness of private
health care financing arrangements. As one supporter put it, Blue Cross would “eliminate the demand
for compulsory health insurance and stop the reintroduction of vicious sociological bills into the state
legislature year after year” (Rothman, 1994, pp. 14–15). Blue Cross was further portrayed as a way
for the self-reliant middle class to obtain coverage for medical expenses without relying on charity
care, which was associated with care provided to the poor in public hospitals (Rothman, 1994).
Although the early Blue Cross and Blue Shield plans were nonprofit, a viable market for health
insurance soon led commercial, for-profit health insurers to offer coverage. This market expanded
considerably when the federal government passed a series of laws during World War II and afterward that made private health insurance an especially appealing fringe benefit that large employers offered to workers. During the war, the government limited the ability of employers to raise
wages but allowed employers to expand benefits; under this arrangement, offering health insurance became an important recruitment tool to attract scarce workers during wartime. In 1949,
the government ruled that benefits counted as part of the “wage package,” which meant labor unions could negotiate health insurance as part of contract talks. In 1954, the Internal Revenue Service ruled that
employers’ contributions towards purchasing health insurance for their workers were not taxable
as income for the workers; in contemporary parlance, this money was taken out on a “pretax”
basis (Blumenthal, 2006). As a result of these developments, voluntary health insurance expanded
considerably during the 1950s and early 1960s to a point where approximately three fourths of all
families in America had some form of insurance (Numbers, 1997).
However, lack of coverage was still widespread—especially for the elderly. In the mid-1960s,
approximately 85% of the elderly were uninsured (Harrison, 2003; Numbers, 1997). Because the
elderly had medical expenses and hospital stays that were twice as expensive as for those below
age 65 and were twice as likely to have a chronic illness (Oberlander, 2003), commercial insurers
either did not cover them or charged prohibitively high premiums. …