Answer the following question set in a 2-3 page essay, double-spaced, with 1” margins and size 12 font. Properly cite all references and quotes (the readings are downloaded). What do the Schwartz reading and the Taub & Nyhan reading have to do with Plato’s Allegory of the Cave? What does Plato’s Allegory of the Cave have to do with you … and with political science? Plato: https://yale.learningu.org/download/ca778ca3-7e93-…
taub___nyhan___believing_things_that_are_false___2017.docx

schwartz___finding_it_hard_to_focus___2018.docx

Unformatted Attachment Preview

Why People Continue to Believe
Objectively False Things
In a lens, a glimpse of the F.B.I. director, James Comey, and the N.S.A. director, Mike Rogers,
testifying before the House Intelligence Committee on Monday. (Credit: Eric Thayer for
The New York Times)
By Amanda Taub and Brendan Nyhan

March 22, 2017
“Everyone is entitled to his own opinion, but not his own facts,” goes the saying — one
that now seems like a relic of simpler times.
Today, President Trump is sticking with his own facts — his claim that the Obama
administration wiretapped him during the election — in the face of testimony to the
contrary by the F.B.I. director, James Comey.
When asked about the accusations Mr. Trump had made on Twitter, Mr. Comey told a
House committee on Monday, “I have no information that supports those tweets, and
we have looked carefully inside the F.B.I.”
Other government authorities have come to similar conclusions. But Sean Spicer, the
White House press secretary, said after the hearing that Mr. Trump stood by his wiretap
allegations despite Mr. Comey’s testimony.
Mr. Trump’s claims may appear to his opponents to have been embarrassingly
debunked. But social science research suggests that Mr. Trump’s alternative version of
reality may appeal to his supporters.
Partisan polarization is now so extreme in the United States that it affects the way that
people consume and understand information — the facts they believe, and what events
they think are important. The wiretapping allegations could well become part of a
partisan narrative that is too powerful to be dispelled.
Mr. Trump, perhaps unconsciously, has grasped a core truth of modern politics: that
voters tend to seek out information that fits the story they want to believe, usually one in
which members of the other party are the bad guys.
Since the 1980s, Americans have been reporting increasingly negative opinions about
the opposing party. Partisanship, and particularly “negative partisanship,” the rejection
of the opposing party, has now become a kind of tribal identity that shapes how people
define themselves and others, according to Sean Westwood, a professor at Dartmouth
College who has studied partisan polarization. “It drives people to support their team at
any cost, and oppose the opposing team at any cost,” he said.
This partisan polarization affects the way Americans of all political stripes consume
information. People are more likely to believe stories that come from their side of the
political divide, particularly if an authority figure vouches for them. And they are more
likely to share news with their preferred slant as a way of showing they are good
members of their political tribe.
Mr. Trump’s wiretap claim is particularly likely to appeal to that partisan dynamic. At its
core, it is a story about Barack Obama being fundamentally untrustworthy, perhaps
even dangerous to the country. Mr. Trump’s supporters, who are already more likely to
believe that basic narrative, may be more likely to accept his wiretap claims.
Sticking to his version of events will probably strengthen Mr. Trump’s base, said John
Sides, a political-science professor at George Washington University who studies
political communication.
“At the end of the day, those people are going to put a floor under your approval
ratings,” he said. But, he cautioned, the constant cycles of controversy are preventing
him from establishing the broader appeal required to ensure a successful presidency.
One might expect that facts would convince people that Mr. Trump’s claim was baseless,
but other political myths have proved remarkably robust to the sort of debunking Mr.
Comey provided.
For instance, Mr. Trump disavowed the “birther” myth in September 2016, conceding
that Mr. Obama was in fact born in Hawaii. There was an increase afterward in the
number of voters who said they believed Mr. Obama was born in the United States, but
polling by Morning Consult suggests that part of that effect has already faded. In
September, it found that 62 percent of registered voters said they believed Mr. Obama
had been born in the United States, but in a follow-up poll early this month, that
number had dropped to 57 percent.
This decline cannot be attributed simply to partisan bias; it occurred among both
Democrats (who went to 77 percent from 82 percent) and Republicans (down to 36
percent from 44 percent). Over time, people may simply forget the contrary evidence
they’ve heard and fall back on their old beliefs.
Even when myths are dispelled, their effects linger. The Boston College political scientist
Emily Thorson conducted a series of studies showing that exposure to a news article
containing a damaging allegation about a fictional political candidate caused people to
rate the candidate more negatively even when the allegation was corrected and people
believed it to be false.
There are ways to correct information more effectively. Adam Berinsky of M.I.T., for
instance, found that a surprising co-partisan source (a Republican member of Congress)
was the most effective in reducing belief in the “death panel” myth about the Affordable
Care Act.
But in the wiretapping case, Republican lawmakers have neither supported Mr. Trump’s
wiretap claims (which could risk their credibility) nor strenuously opposed them (which
could prompt a partisan backlash). Instead, they have tried to shift attention to a
different political narrative — one that suits the partisan divide by making Mr. Obama
the villain of the piece. Rather than focusing on the wiretap allegation, they have sought
to portray the House Intelligence Committee hearings on Russian interference in the
election as an investigation into leaks of classified information.
In Monday’s hearing, for instance, Representative Trey Gowdy, Republican of South
Carolina, asked Mr. Comey which Obama administration officials would have had access
to the leaked information, implying that officials from the previous administration must
be responsible for the leaks.
The result is that the same hearings may appear completely different to voters of
different parties. Democrats may see a defeat for Mr. Trump because Mr. Comey
rejected his wiretap claims and confirmed that the F.B.I. is investigating whether there
was any coordination between his campaign and the Russian intelligence officials who
interfered in the election.
But Republican voters may see confirmation that the leaks of classified information
were criminal, and that former Obama administration officials had means, motive and
opportunity to carry out the act. A Breitbart News report, for example, breezed past Mr.
Comey’s debunking of the wiretap as “already known,” then focused on leaks to the
press.
The question of whether Mr. Obama wiretapped Mr. Trump has now been answered
clearly and strongly in the negative. But the myth, and its effects, seem likely to continue
unless Republicans denounce it more forcefully.
A version of this article appears in print on March 22, 2017, on Page A18 of the New
York edition with the headline: Why Objectively False Things Continue to Be Believed.
https://www.nytimes.com/2017/03/22/upshot/why-objectively-false-things-continue-to-be-believed.html [last accessed: 8/26/18]
Finding It Hard to Focus? Maybe It’s Not
Your Fault
The rise of the new “attention economy.”
By Casey Schwartz

Aug. 14, 2018
It was the big tech equivalent of “drink responsibly” or the gambling
industry’s “safer play”; the latest milestone in Silicon Valley’s year of
apology. Earlier this month, Facebook and Instagram announced new
tools for users to set time limits on their platforms, and a dashboard to
monitor one’s daily use, following Google’s introduction of Digital
Wellbeing features.
In doing so, the companies seemed to suggest that spending time on the
internet is not a desirable, healthy habit but a pleasurable vice: one
that, if left uncontrolled, may slip into unappealing addiction.
Having secured our attention more completely than ever dreamed, they
now are carefully admitting it’s time to give some of it back, so we can
meet our children’s eyes unfiltered by Clarendon or Lark; go see a movie
in a theater; or, contra Apple’s ad for its watch, even go surfing without —
heaven forfend — “checking in.”
“The liberation of human attention may be the defining moral and
political struggle of our time,” writes James Williams, a technologist
turned philosopher and the author of a new book, “Stand Out of Our
Light.”
Mr. Williams, 36, should know. During a decade-long tenure at Google,
he worked on search advertising, helping perfect a powerful, data-driven
advertising model. Gradually, he began to feel that his life story as he
knew it was coming unglued, “as though the floor was crumbling under
my feet,” he writes.
Increasingly public incidents of attention failure, like Pablo Sandoval,
when he was the third baseman for the Red Sox, getting busted checking
Instagram in the middle of a game (and getting suspended), or Patti
LuPone taking away an audience member’s phone, both in 2015, would
likely come as no surprise to him.
Mr. Williams compares the current design of our technology to “an entire
army of jets and tanks” aimed at capturing and keeping our attention.
And the army is winning. We spend the day transfixed by our screens,
thumb twitching in the subways and elevators, glancing at traffic lights.
We flaunt and then regret the habit of so-called second screening, when
just one at a time isn’t enough, scrolling through our phones’ latest
dispatches while watching TV, say.
One study, commissioned by Nokia, found that, as of 2013, we were
checking our phones on average 150 times a day. But we touch our
phones about 2,617 times, according to a separate 2016 study, conducted
by Dscout, a research firm.
Apple has confirmed that users unlock their iPhones an average of 80
times per day. Screens have been inserted where no screens ever were
before: over individual tables at McDonald’s; in dressing rooms when
one is most exposed; on the backs of taxi seats. For only $12.99, one can
purchase an iPhone holster for one’s baby stroller … or (shudder) two.
This is us: eyes glazed, mouth open, neck crooked, trapped in dopamine
loops and filter bubbles. Our attention is sold to advertisers, along with
our data, and handed back to us tattered and piecemeal.
You’ve Got Chaos
Mr. Williams was speaking on Skype from his home in Moscow,
where his wife, who works for the United Nations, has been posted for
the year.
Originally from Abilene, Tex., he had arrived to work at Google in what
could still be called the early days, when the company, in its idealism,
was resistant to the age-old advertising model. He left Google in 2013 to
conduct doctoral research at Oxford on the philosophy and ethics of
attention persuasion in design.
Mr. Williams is now concerned with overwired individuals losing their
life purpose.
“In the same way that you pull out a phone to do something and you get
distracted, and 30 minutes later you find that you’ve done 10 other
things except the thing that you pulled out the phone to do — there’s
fragmentation and distraction at that level,” he said. “But I felt like
there’s something on a longer-term level that’s harder to keep in view:
that longitudinal sense of what you’re about.”
He knew that among his colleagues, he wasn’t the only one feeling
this way. Speaking at a technology conference in Amsterdam last year,
Mr. Williams asked the designers in the room, some 250 of them, “How
many of you guys want to live in the world that you’re creating? In a
world where technology is competing for our attention?”
“Not a single hand went up,” he said.
Mr. Williams is also far from the only example of a former soldier of big
tech (to continue the army metaphor) now working to expose its cultural
dangers.
In late June, Tristan Harris, a former design ethicist for Google, took the
stage at the Aspen Ideas Festival to warn the crowd that what we are
facing is no less than an “existential threat” from our very own gadgets.
Red-haired and slight, Mr. Harris, 34, has been playing the role of
whistle-blower since he quit Google five years ago. He started the Center
for Humane Technology in San Francisco and travels the country,
appearing on influential shows and podcasts like “60 Minutes” and
“Waking Up,” as well as at glamorous conferences like Aspen, to describe
how technology is designed to be irresistible.
He likes a chess analogy. When Facebook or Google points its
“supercomputers” toward our minds, he said, “it’s checkmate.”
Back in the more innocent days of 2013, when Mr. Williams and Mr.
Harris both still worked at Google, they’d meet in conference rooms and
sketch out their thoughts on whiteboards: a concerned club of two at the
epicenter of the attention economy.
Since then, both men’s messages have grown in scope and urgency. The
constant pull on our attention from technology is no longer just about
losing too many hours of our so-called real lives to the diversions of the
web. Now, they are telling us, we are at risk of fundamentally losing our
moral purpose.
“It’s changing our ability to make sense of what’s true, so we have less
and less idea of a shared fabric of truth, of a shared narrative that we all
subscribe to,” Mr. Harris said, the day after his Aspen talk. “Without
shared truth or shared facts, you get chaos — and people can take
control.”
They can also profit, of course, in ways large and small. Indeed, a whole
industry has sprung up to combat tech creep. Once-free pleasures like
napping are now being monetized by the hour. Those who used to relax
with monthly magazines now download guided-meditation apps like
Headspace ($399.99 for a lifetime subscription).
HabitLab, developed at Stanford, stages aggressive interventions
whenever you enter one of your self-declared danger zones of internet
consumption. Having a problem with Reddit sucking away your
afternoons? Choose between the “one-minute assassin,” which puts you
on a strict 60-second egg timer, and the “scroll freezer,” which creates a
bottom in your bottomless scroll — and logs you out once you’ve hit it.
Like Moment, an app that monitors screen time and sends you or loved
ones embarrassing notifications detailing exactly how much time has
been frittered away on Instagram today, HabitLab gets to know your
patterns uncomfortably well in order to do its job. Apparently, we now
need our phones to save us from our phones.
Researchers have known for years that there’s a difference between “top-down” attention (the voluntary, effortful decisions we make to pay
attention to something of our choice) and “bottom-up” attention, which
is when our attention is involuntarily captured by whatever is going on
around us: a thunderclap, gunshot or merely the inviting bleep that
announces another Twitter notification.
But many of the biggest questions remain unanswered. At the top of that
list is no smaller a mystery than “the relationship between
attention and our conscious experience of the world,” said Jesse
Rissman, a neuroscientist whose lab at U.C.L.A. studies attention and
memory.
Also unclear: the consequence of all that screen time on our bedraggled
neurons. “We don’t understand how modern technology and changes in
our culture impact our ability to sustain our attention on our goals,” Dr.
Rissman said.
Britt Anderson, a neuroscientist at the University of Waterloo in Canada,
went so far as to write a 2011 paper titled “There Is No Such Thing as
Attention.”
Dr. Anderson argued that researchers have used the word to apply to so
many different behaviors — attention span, attention deficit, selective
attention and spatial attention, to name a few — that it has become
essentially meaningless, even at the very moment when it’s more
relevant than ever.
Are the Kids … All Right?
Despite attention’s possible lack of existence, though, many among us
mourn its passing — Ms. LuPone, for example, and others who command
an audience, like college professors.
Katherine Hayles, an English professor at U.C.L.A., has written about the
change she sees in students as one from “deep attention,” a state of
single-minded absorption that can last for hours, to one of “hyper
attention,” which jumps from target to target, preferring to skim the
surface of lots of different things rather than to probe the depths of just one.
At Columbia University, where every student is required to pass a core
curriculum with an average of 200 to 300 pages of reading each week,
professors have been discussing how to deal with the conspicuous
change in students’ ability to get through their assignments. The
curriculum has more or less stayed in place, but “we’re constantly
thinking about how we’re teaching when attention spans have changed
since 50 years ago,” said Lisa Hollibaugh, a dean of academic planning at
Columbia.
In the 1990s, 3 to 5 percent of American school-aged children were
thought to have what is now called attention deficit hyperactivity
disorder. By 2013, that number was 11 percent, and rising, according to
data from the National Survey of Children’s Health.
At Tufts University, Nick Seaver, an anthropology professor, just finished
his second year of teaching a class he designed called How to Pay
Attention. But rather than offering tips for focusing, as one might expect,
he set out to train his students to look at attention as a cultural
phenomenon — “the way people talk about attention,” Dr. Seaver said,
with topics like the “attention economy” or “attention and politics.”
As part of their homework for the “economy” week, Dr. Seaver told his
students to analyze how an app or website “captures” their attention and
then profits from it.
Morgan Griffiths, 22, chose YouTube. “A lot of the media I consume has
to do with ‘RuPaul’s Drag Race,’” Mr. Griffiths said. “And when a lot of
those videos end, RuPaul himself pops up at the very end and says, ‘Hey
friends, when one video ends, just open the next one, it’s called binge
viewing, go ahead, I encourage you.’”
A classmate, Jake Rochford, who chose Tinder, noted the extreme
stickiness of a new “super-like” button. “Once the super-like button came
into play, I noticed all of the functions as strategies for keeping the app
open, instead of strategies for helping me find love,” Mr. Rochford, 21,
said. After completing that week’s assignment, he disabled his account.
But Dr. Seaver, 32, is no Luddite.
“Information overload is something that always feels very new but is
actually very old,” he said. “Like: ‘It is the 16th century, and there
are so many books.’ Or: ‘It is late antiquity and there is so much writing.’
“It can’t be that there are too many things to pay attention to: That
doesn’t follow,” he said. “But it could be that there are more things that
are trying to actively demand your attention.”
And there is not only the attention we pay to consider, but also the
attention we receive.
Sherry Turkle, the M.I.T. sociologist and psychologist, has been writing
about our relationship with our technology for decades. Devices that
come with us everywhere we go, she argues, introduce a brand new
dynamic: Rather than compete with their siblings for their parents’
attention, children are up against iPhones and iPads, Siri and Alexa,
Apple watches and computer screens.
Every moment th …