Kranzberg’s Laws of Technology [...]

Melvin Kranzberg’s six laws of technology:

  • Technology is neither good nor bad; nor is it neutral.
  • Invention is the mother of necessity.
  • Technology comes in packages, big and small.
  • Although technology might be a prime element in many public issues, nontechnical factors take precedence in technology-policy decisions.
  • All history is relevant, but the history of technology is the most relevant.
  • Technology is a very human activity – and so is the history of technology.

Sources: Wikipedia, Hansen 2003

The Ideology of Disgust [...]

A propensity for disgust may be tied to tolerance for inequality and xenophobia.

In 2012, a team of academics from Europe and the U.S.—Yoel Inbar, David Pizarro, Ravi Iyer, and Jonathan Haidt—published a paper titled “Disgust Sensitivity, Political Conservatism, and Voting,” looking at the role disgust plays in political orientation. The researchers posited three different types of disgust: interpersonal disgust (e.g., the feeling produced by drinking from the same cup as someone else); core disgust (the response to maggots, vomit, dirty toilets, etc.); and animal-reminder disgust (how we react to corpses, blood, anything that evokes our animal nature).

Disgust, they write, “serves to discourage us from ingesting noxious or dangerous substances,” but also plays a role in moral and social judgments. Those who feel more disgusted by unpleasant images, smells, or tastes judge more harshly that which violates their subjective moral code.

The team had respondents position themselves on a political scale from conservative to liberal. The respondents then stated how strongly they agreed or disagreed with statements like “I never let any part of my body touch the toilet seat in a public washroom,” and rated other hypotheticals according to the level of disgust they generated. Even when controlling for age, education, geography, and religious belief, individuals with higher “disgust sensitivity” were found to be more likely to tolerate wealth inequality, view homosexuality negatively, and place more belief in authoritarian leaders and systems.

Most strikingly, interpersonal disgust was an important predictor of anti-immigrant attitudes. (Source)

The Analog Brain [...]

The fundamental difference between analog and digital information is that analog information is continuous and digital information is made of discrete chunks. Digital computers work by manipulating bits: ones and zeroes. And operations on these bits occur in discrete steps. With each step, transistors representing bits switch on or off. Jiggle a particular atom on a transistor this way or that, and it will have no effect on the computation, because with each step the transistor’s status is rounded up or down to a one or a zero. Any drift is swiftly corrected.
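
The rounding-away of drift described above can be sketched in a few lines of plain Python (a toy model with made-up perturbation sizes, not a circuit simulation): a digital value is snapped back to 0 or 1 at every step, so tiny physical "jiggles" never accumulate, while an uncorrected analog quantity wanders.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

digital, analog = 1.0, 1.0
for _ in range(1000):
    jiggle = random.uniform(-0.05, 0.05)  # a tiny physical perturbation
    digital = round(digital + jiggle)     # snapped back to 0 or 1 every step
    analog += jiggle                      # drift is never corrected

print(digital)  # -> 1: every jiggle was rounded away
print(analog)   # has drifted away from 1.0 over the 1000 steps
```

The digital value survives a thousand perturbations exactly because each step discards everything but the nearest discrete state; the analog value, like a synapse, carries the full history of every jiggle.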

On a neuron, however, jiggle an atom this way or that, and the strength of a synapse might change. People like to describe the signals between neurons as digital, because a neuron either fires or it doesn’t, sending a one or a zero to its neighbors in the form of a sharp electrical spike or lack of one. But there may be meaningful variation in the size of these spikes and in the possibility that nearby neurons will spike in response. The particular arrangement of the chemical messengers in a synapse, or the exact positioning of the two neurons, or the precise timing between two spikes—these all can have an impact on how one neuron reacts to another and whether a message is passed along. (Source)

Howard Dean on Money in 2003 [...]

In the same way, his predecessors also framed their campaigns as movements—grassroots efforts to challenge the establishment and grab the reins of the Democratic Party from monied interests and other villains. “This is a campaign to unite and empower people,” said Howard Dean in Burlington, Vermont, where he announced his bid for the Democratic nomination in June 2003. “It is a call to every American, regardless of party, to join together in common purpose, and for the common good, to save and restore all that it means to be an American.” He transitioned into a familiar message about the threats to American democracy: “Companies leaving the country to avoid paying taxes, or avoid paying people a livable wage. And corporations doing this with the support of our own government and a political process in Washington that they rent—if not own. This, this is the fear of James Madison and Thomas Jefferson—the fear that economic power would one day try to seize political power.” (Source)

Jerry Brown on Political Money in 1992 [...]

Or, to reach back to an insurgent politician who faced a (different) Clinton, there was California’s Jerry Brown, running in the 1992 Democratic primary. “Our democratic system has been the object of a hostile takeover engineered by a confederacy of corruption, careerism, and campaign consulting,” said Brown in his announcement speech. “And money has been the lubricant greasing the deal. Incredible sums—literally hundreds of millions of dollars—from political action committees, lobbyists, and wealthy patrons have flooded into the campaign war chests of Washington’s entrenched political elite—Democrats and Republicans alike.” (Source)

The Pastrami Principle [...]

A couple of months ago, Jeb Bush (remember him?) posted a photo of his monogrammed handgun to Twitter, with the caption “America.” Bill de Blasio, New York’s mayor, responded with a picture of an immense pastrami sandwich, also captioned “America.” Advantage de Blasio, if you ask me.

Let me now somewhat ruin the joke by talking about the subtext. Mr. Bush’s post was an awkward attempt to tap into the common Republican theme that only certain people — white, gun-owning, rural or small-town citizens — embody the true spirit of the nation. It’s a theme most famously espoused by Sarah Palin, who told small-town Southerners that they represented the “real America.” You see the same thing when Ted Cruz sneers at “New York values.”

Mr. de Blasio’s riposte, celebrating a characteristically New York delicacy, was a declaration that we’re also Americans — that everyone counts. And that, surely, is the vision of America that should prevail. (Source)

Nonwhite Support for Crime Bill [...]

The 1994 Crime Bill enjoyed its strongest support from nonwhite voters.

Ms. Brock’s comments underscore a sometimes overlooked reality in today’s re-examination of the crime bill: The legislation was broadly embraced by nonwhite voters, more enthusiastically even than by white voters. About 58 percent of nonwhites supported it in 1994, according to a Gallup poll, compared with 49 percent of white voters. (Source)

For a list of what was in the 1994 crime bill see What Was in the 1994 Crime Bill?

The 1994 crime bill had little to do with the increase in mass incarceration. See The 1994 Crime Bill and Incarceration.

Black Silent Majority discusses the African-American community’s part in the anti-crime movements of the later twentieth century.

Lost Cause Mythology [...]

Southern partisans laid the foundation of the post-racial Confederacy shortly after its defeat. Knowing that the eyes of history would view the cause of slavery as, in the words of Ulysses Grant, “one of the worst for which a people ever fought,” the vanquished Confederates sought to deny that they had ever fought to preserve slavery, or that their society had been built on the idea that white men were superior to black men.
As recounted in Bruce Levine’s Confederate Emancipation, in 1865 former Confederate Vice President Alexander Stephens disowned his famous “Cornerstone Speech,” which declared that slavery was “the proper status of the negro in our form of civilization.” Stephens blamed the supposed misunderstanding on “reporter’s notes which were very imperfect.” Southern champion Robert E. Lee insisted in 1870 that “far from engaging in a war to perpetuate slavery,” he was “rejoiced that slavery is abolished.” Jefferson Davis, the first and only Confederate president, wrote in 1881, “The existence of African servitude was in no wise the cause of the conflict.”
Thus began the “Lost Cause” mythology, what historian David Blight has called “a propaganda assault on popular history and memory,” meant to exonerate the Confederacy of the most significant cause of its brief existence. A century of pro-Confederate historical agitprop has dismissed slavery as the central cause of the war, framing the Confederates’ treason in defense of slavery as a response to Northern aggression, and the nation’s brief experiment with interracial democracy as a tyrannical failure. Even now, many Americans still believe that the Confederacy was not fighting for slavery, but rather over tariffs or “states’ rights.” When the state of Mississippi declared April Confederate Heritage Month in February 2016, Gov. Phil Bryant’s proclamation, unlike Mississippi’s 1861 declaration of secession, did not even mention slavery. (Source)

Bernie’s Motorcycle [...]

Jay Newton-Small, author of Broad Influence: How Women Are Changing the Way America Works explains the underlying sexism of the 2016 primary race in a conversation with Dana Milbank:

Campaigning While Female also deprives Clinton of the ability to make lofty promises. Sanders, for example, has a $15 trillion non-starter of a health-care plan. If Clinton floated such a plan, the media would mock it as patently absurd. But Sanders gets a pass.

Why the double standard? “Men are the guys who want to go out and buy the motorcycle, and women are the purse-string holders,” Newton-Small said. “It’s a very traditional role we are putting women into by making them the one saying, no, we can’t do all these really fun things. This is a very stereotypical box she gets put into, which then makes it very hard for her to be inspirational.” (Source)

Now people may come back and say — well, these are not “fun things”: health care, free college, etc. But that’s not how stereotypes work. They work on an emotional level, forcing Clinton into the emotional space of the dour mom.

Borg Complex [...]

The Borg Complex describes a set of rhetorical moves in discussions of technological trends whereby the speaker claims that “the future is coming, whether you like it or not” in an attempt to shut off discussion of what future we want.

“Resistance is futile.” This is what the Borg, of Star Trek fame, announces to its victims before it proceeds to assimilate their biological and technological distinctiveness. It is also what many tech gurus and pundits announce to their audiences as they dispense their tech-guru-ish wisdom. They don’t quite use those words, of course, but they might as well. This is why I’ve taken to calling this sort of rhetoric a Borg Complex. (Source)

There are numerous elements you can use to identify the phenomenon. The speaker:

  1. Makes grandiose, but unsupported claims for technology

  2. Uses the term Luddite a-historically and as a casual slur

  3. Pays lip service to, but ultimately dismisses genuine concerns

  4. Equates resistance or caution to reactionary nostalgia

  5. Starkly and matter-of-factly frames the case for assimilation

  6. Announces the bleak future for those who refuse to assimilate

Completely Different and Exactly the Same [...]

Advocates of a technology claim that it is revolutionary and like nothing that has come before, until, of course, it comes to regulating it, at which point it's suddenly unclear why all these new rules are needed for something that is exactly the same as previous technology.

David Golumbia pulls the phrase from a comment on a Nick Carr post and uses it to talk about Google Glass — a technology that is going to “change the world forever”, but when people protest about privacy issues, the response from boosters like Jeff Jarvis is “How crazy! That would be like blaming cameras for privacy concerns!”

But to return to my main point: Jarvis’s response is exactly the one mentioned by CS Clark, very oddly and tellingly used to describe a technology that deserves the adjective “new” if anything does. Yes, Google Glass is in part a camera. But it’s not the kind of camera that requires you to sit for two hours to get any exposure, or the kind whose film needs professional developing, or the kind that makes one print that develops in your hand, or even the kind that other people notice when you use it to take their picture. It’s a new kind of camera, built in part out of old bits of camera technology. Yet to Jarvis it’s “fear-mongering” to consider these new features as new, even as he and others engage in ecstatic reverie over what this new stuff enables.

This deserves expansion and further reflection. The logic is evident everywhere today, and it’s mostly aligned with power. It connects to the “Borg Complex” described by L M Sacasas, with certain debates in Digital Humanities, with the advent of MOOCs, and much else. It’s not a phenomenon of which I’m aware there being a discussion in the critical literature. It deserves more attention. It deserves a name. Ideas? I’m working on it. (Source)

As Golumbia notes, this is an extremely common pattern, which can be seen at work in almost any technological discussion. GMOs are going to change the world, but they are exactly the same as normal hybridization. Uber has revolutionized the taxi business but is really just an outsourcing piecework app when it comes to regulation. Etc.

It’s not that “exactly the same” doesn’t have a point — for example, I think fears about GMOs are largely overblown, precisely because we have been doing variants of this for 10,000 years. But as Golumbia notes, the truth is nothing is completely different OR exactly the same. It’s the nuance around the differences that is in fact the most important part of both technical innovation and policy, and short-circuiting the conversation around it is bad for everyone.

Code is Action [...]

The demand that governments not regulate code becomes a demand that governments not regulate action—that is, that governments not regulate at all. That would be less troubling if there were not widespread, repeated, and effective demands for that. (Source)

PubPeer [...]

That was Brandon Stell’s thought when he started PubPeer, a website where faceless Internet critics can pick apart scientific papers in academic journals. Four years later, PubPeer has grown to be a proof-of-concept for a new way to vet scientific work — one that subjects authors to the scrutiny of hundreds or even thousands of experts, rather than just the handful who reviewed it before publication. As a result, Mr. Stell and his website have become key figures in the debate over how best to restore public confidence in academic science.

Mr. Stell believes that his own hypothesis — that open, anonymous commenting makes science better — can withstand even its most rational critics. His formula is simple: More anonymity means more scrutiny for published papers, and more scrutiny means more errors are caught. If that means more authors have to tolerate questions from indiscernible figures up in the cheap seats, then so be it. (Source)

How Much Has Autism Increased? [...]

From the Economist:

In America in 1970, one child in 14,000 was reckoned to be autistic. The current estimate is one in 68—or one in 42 among boys. Similarly high numbers can be found in other rich countries: a study in South Korea found that one in 38 children was affected. Autism is a brain condition associated with poor social skills. It has a wide spectrum of symptoms, from obsessive behaviour to hypersensitivity to sound, light or other sensory stimulation, the severity of which ranges from mild to life-blighting. The range of consequences is also wide. At one end, the autism of a computer scientist may be barely noticeable; at the other, a quarter of autistic children do not speak. (Source)
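
A quick back-of-envelope check of the figures quoted above. This says nothing about how much of the rise is real versus an artifact of broadened diagnostic criteria and better detection; it only puts a number on the change in reported prevalence.

```python
# Prevalence figures from the excerpt, expressed as rates.
rate_1970 = 1 / 14000   # one child in 14,000 (1970 estimate)
rate_now = 1 / 68       # one child in 68 (current estimate)

print(round(rate_now / rate_1970))  # -> 206: roughly a 200-fold rise in reported prevalence
```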

When Just-World Belief Kills [...]

According to the Journal of Behavioral Medicine, attributing success to personal characteristics instead of biased structural systems may negatively impact health outcomes for Black Americans. Nao Hagiwara and her colleagues at Virginia Commonwealth University were interested in seeing whether the “just world” belief—the belief that the world is a just place where people get what they deserve—would influence the relationship between perceived discrimination and health consequences for Black middle-aged adults. The psychologists found that Blacks who both strongly believed that the world was a just place and reported experiencing high levels of discrimination were more likely than other Blacks to suffer from a greater number of chronic illnesses and increased blood pressure. Why? Because respectability politics tells Black Americans that what is happening to them in this country is their (our) fault. (Source)

Extrinsic Rewards and Costly Sharing [...]

Extrinsic motivation is subject to the overjustification effect. People who receive a reward for good behavior often come to believe that they exhibited the behavior for the reward, rather than due to internal motivation. This understanding in turn undermines intrinsic motivations. This study examines the overjustification effect in the realm of prosocial sharing, and finds similar results.

Two studies investigated the influence of external rewards and social praise in young children’s fairness-related behavior. The motivation of ninety-six 3-year-olds’ to equalize unfair resource allocations was measured in three scenarios (collaboration, windfall, and dictator game) following three different treatments (material reward, verbal praise, and neutral response). In all scenarios, children were less willing to engage in costly sharing when they had received a reward for equal sharing during treatment than when they had received praise or no reward. The negative effect of material rewards was not due to subjects responding in kind to their partner’s termination of rewards. These results provide new evidence for the intrinsic motivation of prosociality—in this case, costly sharing behavior—in preschool children. (Source)

The End of Scale [...]

Time, attention, value, real tangible utility value to the daily lives of people. We all got fooled into thinking those could be replaced by tonnage of shares/views/interactions, forgetting there were humans on the other end, who at some point would get tired of the distraction and deception. We all got fooled by the startup ecosystem, by the investors drunk on dreams of unicorns (in media, of all places!), by the media who were covering all of this, desperate to look relevant and cool.

If you are the type that sees analogies everywhere – I am one of them – then you can see a lot of parallels between this rise and crash of the media scale-chasing era and the bundling and rebundling of crappy mortgages, passed onward to be rebundled and sold to the gullible, only to come crashing down a mere seven years ago. Chasing scale in finance, at any cost, same as chasing scale in media businesses, at any cost.

Only this time it is happening when the American economy is having its best run in a long time.

The cracks could be seen coming from years away, if only you would care to look for them. You could see it when Upworthy lost its worth back in 2014. You could see it when readers caught up to the scam that “Recommended Stories” on story pages weren’t really recommended, but payola. You could see it in the crapola churned out on “platform” since, well, forever, and when everyone else started copying it a few years ago. (Source)

The Horror of Perpetual Activism [...]

Activism and decision-making have always been two sides of the same coin. What happens in a world of perpetual activism?

I am writing this after the Brooklyn Debate between Hillary Clinton and Bernie Sanders. I’ve been a supporter of Clinton this cycle, but must admit she said something last night that unnerved me a bit — comparing Libya to Syria, she implied that a lot of the mess in Syria could be attributed to the fact that Syrian leader Assad was not removed.

This cuts to the heart of my worries about Clinton’s foreign policy — that she has a bit of hubris about what American power can accomplish, and puts her thumb on the scale for intervention more than I think is wise.

Does this change my support for Clinton? Not a bit. In terms of character alone, I believe Clinton to be the best change agent we have — everything I have seen indicates she is a quick learner, a careful listener, and a master coalition-builder that understands the nature of presidential power and how to get things done with it. I tend to believe that a Sanders presidency would be four wasted years for progressives, because he seems (again, in my opinion) to have none of the qualities or skills that this particular moment requires, and in fact seems to grossly misunderstand the nature and power of the current Democratic coalition (the dismissal of wins in more diverse states as wins in “conservative” states that “don’t matter” is particularly distressing to me, and makes me want to throw shoes at my TV every time I hear it).

That said, what do I do with this sour note of Clinton’s, this nagging feeling of an interventionist hubris not quite tamed? It’s important not to lose track of it, to keep our eyes open to it, to hold her accountable, to be ready to shift opinions on Clinton should this start to reveal itself as a major issue.

Recent theories of cognition suggest that there are really two types of logic we have, intertwined with one another. One developed for problem-solving — we look carefully at a situation, try to figure out how to mend that spear or bowl, how best to track down that bison or cook the meat. But many of our more abstract abilities come from another source — the need to convince people that our way is the one correct way so that people will assist us, and not someone else.

The Argumentative Theory of cognition, for example, suggests that our inability to see evidence that does not support our beliefs is not a “brain bug” but a result of an evolutionary process that favored people who could seem the most sure about their beliefs. People who are willing to display self-doubt do not tend to gain broad support, so evolution helpfully nuked self-doubt. As a result we’re actually horrible problem solvers, but boy are we sure of ourselves.

I tend to take a “better angels” view on the question — as humans we have multiple modes of operation — the problem-solving mode that does attempt to take in all inputs (right now I am watching our pet green cheek conure figure out how to nudge his door open — which is a sort of pure endeavor) and a rhetorical mode focused on proving our existing ideas right. And like a lot of things, cognition is mixed mode, with the particular mix triggered by the environment.

What I worry about in a world of context collapse is that we are becoming perpetual activists. Here’s why. In private conversations with people I’ve found it easier to express doubt about this thing or that thing. Why? Because very often I have deep relations with these people and these people have a deep commitment to the sorts of change I would like to see. …..

Negativity is a Voter Suppression Tactic [...]

Negativity doesn’t switch votes so much as demoralize people and make them less apt to vote, period.

Democratic political analyst Dan Cence says Bernie Sanders needs to ask himself how much longer he’s going to keep it up for the good of the Democratic Party. And he says candidates need to be careful about being too rude, too tough, or too negative.
Cence says, “when you attack really hard, it tends to lead to voter suppression more than it leads to someone changing their mind on a vote.”
Maybe so, but neither Clinton nor Sanders shows any signs of letting up. Dan Payne says the advent of social media plays heavily into the current tenor of the debate. (Source)

Mushroom Ego Death [...]

The potential therapeutic benefit of psychedelics revolves mostly around ego death, which allows people to transcend self-identity (with good and bad effects).

The most remarkable potential benefit of hallucinogens is what’s called “ego death,” an experience in which people lose their sense of self-identity and, as a result, are able to detach themselves from worldly concerns like a fear of death, addiction, and anxiety over temporary — perhaps exaggerated — life events.

When people take a potent dose of a psychedelic, they can experience spiritual, hallucinogenic trips that can make them feel like they’re transcending their own bodies and even time and space. This, in turn, gives people a lot of perspective — if they can see themselves as a small part of a much broader universe, it’s a lot easier for them to discard personal, relatively insignificant and inconsequential concerns about their own lives and death. (Source)

Forty People [...]

The Prom Queen of Instagram doesn’t know (or like) forty people offline:

As they got ready, she talked about a party she was planning at her mom’s apartment.

“I’m thinking like 40 people,” Hymowitz said.

“Do you even like 40 people?” Nussdorf said.

Hymowitz thought for a moment. “I’m not sure I know 40 people,” she said. “It would be ten people I actually like.” (Source)

Merits of an Education in Psychology [...]

Often, students take their first psychology course because they are interested in helping others and want to learn more about themselves and why they act the way they do. Sometimes, students take a psychology course because it either satisfies a general education requirement or is required for a program of study such as nursing or pre-med. Many of these students develop such an interest in the area that they go on to declare psychology as their major. As a result, psychology is one of the most popular majors on college campuses across the United States (Johnson & Lubin, 2011).

A number of well-known individuals were psychology majors. Just a few famous names on this list are Facebook’s creator Mark Zuckerberg, television personality and political satirist Jon Stewart, actress Natalie Portman, and filmmaker Wes Craven (Halonen, 2011). About 6 percent of all bachelor’s degrees granted in the United States are in the discipline of psychology (U.S. Department of Education, 2013).

An education in psychology is valuable for a number of reasons. Psychology students hone critical thinking skills and are trained in the use of the scientific method. Critical thinking is the active application of a set of skills to information for the understanding and evaluation of that information. The evaluation of information—assessing its reliability and usefulness— is an important skill in a world full of competing “facts,” many of which are designed to be misleading.

For example, critical thinking involves maintaining an attitude of skepticism, recognizing internal biases, making use of logical thinking, asking appropriate questions, and making observations. Psychology students also can develop better communication skills during the course of their undergraduate coursework (American Psychological Association, 2011). Together, these factors increase students’ scientific literacy and prepare students to critically evaluate the various sources of information they encounter.

In addition to these broad-based skills, psychology students come to understand the complex factors that shape one’s behavior. They appreciate the interaction of our biology, our environment, and our experiences in determining who we are and how we will behave. They learn about basic principles that guide how we think and behave, and they come to recognize the tremendous diversity that exists across individuals and across cultural boundaries (American Psychological Association, 2011).


The Cognitive Revolution [...]

Behaviorism’s emphasis on objectivity and focus on external behavior had pulled psychologists’ attention away from the mind for a prolonged period of time. The early work of the humanistic psychologists redirected attention to the individual human as a whole, and as a conscious and self-aware being. By the 1950s, new disciplinary perspectives in linguistics, neuroscience, and computer science were emerging, and these areas revived interest in the mind as a focus of scientific inquiry. This particular perspective has come to be known as the cognitive revolution (Miller, 2003). By 1967, Ulric Neisser published the first textbook entitled Cognitive Psychology, which served as a core text in cognitive psychology courses around the country (Thorne & Henley, 2005).

Although no one person is entirely responsible for starting the cognitive revolution, Noam Chomsky was very influential in the early days of this movement. Chomsky (1928–), an American linguist, was dissatisfied with the influence that behaviorism had had on psychology. He believed that psychology’s focus on behavior was short-sighted and that the field had to re-incorporate mental functioning into its purview if it were to offer any meaningful contributions to understanding behavior (Miller, 2003).

[Figure: a mural featuring Chomsky’s face alongside newspapers, televisions, and cleaning products.]

European psychology had never really been as influenced by behaviorism as had American psychology; and thus, the cognitive revolution helped reestablish lines of communication between European psychologists and their American counterparts. Furthermore, psychologists began to cooperate with scientists in other fields, like anthropology, linguistics, computer science, and neuroscience, among others. This interdisciplinary approach often was referred to as the cognitive sciences, and the influence and prominence of this particular perspective resonates in modern-day psychology (Miller, 2003).

Dig Deeper: Feminist Psychology

The science of psychology has had an impact on human wellbeing, both positive and negative. The dominant influence of Western, white, and male academics in the early history of psychology meant that psychology developed with the biases inherent in those individuals, which often had negative consequences for members of society that were not white or male. Women, members of ethnic minorities in both the United States and other countries, and individuals with sexual orientations other than heterosexual had difficulties entering the field of psychology and therefore influencing its development. They also suffered from the attitudes of white, male psychologists, who were not immune to the nonscientific attitudes prevalent in the society in which they developed and worked. Until the 1960s, the science of psychology was largely a “womanless” psychology (Crawford & Marecek, 1989), meaning that few women were able to practice psychology, so they had little influence on what was studied. In addition, the experimental subjects of psychology were mostly men, which resulted from underlying assumptions that gender had no influence on psychology and that women were not of sufficient interest to study.

An article by Naomi Weisstein, first published in 1968 (Weisstein, 1993), stimulated a feminist revolution in psychology by presenting a critique of psychology as a science. She also specifically criticized male psychologists for constructing the psychology of women entirely out of their own cultural biases and without careful experimental tests to verify any of their characterizations of women. Weisstein used, as examples, statements by prominent psychologists in the 1960s, such as this quote by Bruno Bettelheim: “. . . we must start with the realization that, as much as women want to be good scientists or engineers, they want first and foremost to be womanly companions of men and to be mothers.” Weisstein’s critique formed the foundation for the subsequent development of a feminist psychology that attempted to be free of the influence of male cultural biases on our knowledge of the psychology of women and, indeed, of both genders.

Crawford & Marecek (1989) identify several feminist approaches to psychology that can be described as feminist psychology. These include re-evaluating and discovering the contributions of women to the history of psychology, studying psychological gender differences, and questioning the male bias present across the practice of the scientific approach to knowledge.

Multicultural Psychology [...]

Culture has important impacts on individuals and social psychology, yet the effects of culture on psychology are under-studied. There is a risk that psychological theories and data derived from white, American settings could be assumed to apply to individuals and social groups from other cultures, and this is unlikely to be true (Betancourt & López, 1993). One weakness in the field of cross-cultural psychology is that in looking for differences in psychological attributes across cultures, there remains a need to go beyond simple descriptive statistics (Betancourt & López, 1993). In this sense, it has remained a descriptive science, rather than one seeking to determine cause and effect. For example, a study of characteristics of individuals seeking treatment for a binge eating disorder in Hispanic American, African American, and Caucasian American individuals found significant differences between groups (Franko et al., 2012). The study concluded that results from studying any one of the groups could not be extended to the other groups, and yet potential causes of the differences were not measured.

The history of multicultural psychology in the United States is a long one. The role of African American psychologists in researching the cultural differences between African American individual and social psychology is but one example. In 1920, Francis Cecil Sumner became the first African American to receive a PhD in psychology in the United States. Sumner established a psychology degree program at Howard University, leading to the education of a new generation of African American psychologists (Black, Spence, & Omari, 2004). Much of the work of early African American psychologists (and a general focus of much work in the first half of the 20th century in psychology in the United States) was dedicated to testing, and intelligence testing in particular (Black et al., 2004). That emphasis has continued, particularly because of the importance of testing in determining opportunities for children, but other areas of exploration in African American psychology research include learning style, sense of community and belonging, and spiritualism (Black et al., 2004).

The American Psychological Association has several ethnically based organizations for professional psychologists that facilitate interactions among members. Since psychologists belonging to specific ethnic groups or cultures have the most interest in studying the psychology of their communities, these organizations provide an opportunity for the growth of research on the impact of culture on individual and social psychology.

Link to Learning

Read a news story about the influence of an African American psychologist’s research on the historic Brown v. Board of Education civil rights case.

Freud and Psychoanalytic Theory [...]

Perhaps one of the most influential and well-known figures in psychology’s history was Sigmund Freud ([(link)]). Freud (1856–1939) was an Austrian neurologist who was fascinated by patients suffering from “hysteria” and neurosis. Hysteria was an ancient diagnosis for disorders, primarily of women with a wide variety of symptoms, including physical symptoms and emotional disturbances, none of which had an apparent physical cause. Freud theorized that many of his patients’ problems arose from the unconscious mind. In Freud’s view, the unconscious mind was a repository of feelings and urges of which we have no awareness. Gaining access to the unconscious, then, was crucial to the successful resolution of the patient’s problems. According to Freud, the unconscious mind could be accessed through dream analysis, by examinations of the first words that came to people’s minds, and through seemingly innocent slips of the tongue. Psychoanalytic theory focuses on the role of a person’s unconscious, as well as early childhood experiences, and this particular perspective dominated clinical psychology for several decades (Thorne & Henley, 2005).

Photograph A shows Sigmund Freud. Image B shows the title page of his book, A General Introduction to Psychoanalysis.

Freud’s ideas were influential, and you will learn more about them when you study lifespan development, personality, and therapy. For instance, many therapists believe strongly in the unconscious and the impact of early childhood experiences on the rest of a person’s life. The method of psychoanalysis, which involves the patient talking about their experiences and selves, while not invented by Freud, was certainly popularized by him and is still used today. Many of Freud’s other ideas, however, are controversial. Drew Westen (1998) argues that many of the criticisms of Freud’s ideas are misplaced, in that they attack his older ideas without taking into account later writings. Westen also argues that critics fail to consider the success of the broad ideas that Freud introduced or developed, such as the importance of childhood experiences in adult motivations, the role of unconscious versus conscious motivations in driving our behavior, the fact that motivations can cause conflicts that affect behavior, the effects of mental representations of ourselves and others in guiding our interactions, and the development of personality over time. Westen identifies subsequent research support for all of these ideas.

More modern iterations of Freud’s clinical approach have been empirically demonstrated to be effective (Knekt et al., 2008; Shedler, 2010). Some current practices in psychotherapy involve examining unconscious aspects of the self and relationships, often through the relationship between the therapist and the client. Freud’s historical significance and contributions to clinical practice merit his inclusion in a discussion of the historical movements within psychology.

James and Functionalism [...]

William James (1842–1910) was the first American psychologist who espoused a different perspective on how psychology should operate ([(link)]). James was introduced to Darwin’s theory of evolution by natural selection and accepted it as an explanation of an organism’s characteristics. Key to that theory is the idea that natural selection leads to organisms that are adapted to their environment, including their behavior. Adaptation means that a trait of an organism has a function for the survival and reproduction of the individual, because it has been naturally selected.

A drawing depicts William James.

As James saw it, psychology’s purpose was to study the function of behavior in the world, and as such, his perspective was known as functionalism. Functionalism focused on how mental activities helped an organism fit into its environment. Functionalism has a second, more subtle meaning in that functionalists were more interested in the operation of the whole mind rather than of its individual parts, which were the focus of structuralism. Like Wundt, James believed that introspection could serve as one means by which someone might study mental activities, but James also relied on more objective measures, including the use of various recording devices, and examinations of concrete products of mental activities and of anatomy and physiology (Gordon, 1995).

Wundt and Structuralism [...]

Wilhelm Wundt (1832–1920) was a German scientist who was the first person to be referred to as a psychologist. His famous book entitled Principles of Physiological Psychology was published in 1873. Wundt viewed psychology as a scientific study of conscious experience, and he believed that the goal of psychology was to identify components of consciousness and how those components combined to result in our conscious experience. Wundt used introspection (he called it “internal perception”), a process by which someone examines their own conscious experience as objectively as possible, thus treating the human mind like any other aspect of nature that a scientist observes. Wundt’s version of introspection used only very specific experimental conditions in which an external stimulus was designed to produce a scientifically observable (repeatable) experience of the mind (Danziger, 1980). The first stringent requirement was the use of “trained” or practiced observers, who could immediately observe and report a reaction. The second requirement was the use of repeatable stimuli that always produced the same experience in the subject and allowed the subject to expect and thus be fully attentive to the inner reaction. These experimental requirements were put in place to eliminate “interpretation” in the reporting of internal experiences and to counter the argument that there is no way to know that an individual is observing their mind or consciousness accurately, since it cannot be seen by any other person. This attempt to understand the structure or characteristics of the mind was known as structuralism.

Wundt established his psychology laboratory at the University of Leipzig in 1879. In this laboratory, Wundt and his students conducted experiments on, for example, reaction times. A subject, sometimes in a room isolated from the scientist, would receive a stimulus such as a light, image, or sound. The subject’s reaction to the stimulus would be to push a button, and an apparatus would record the time to reaction. Wundt could measure reaction time to one-thousandth of a second (Nicolas & Ferrand, 1999).

Photograph A shows Wilhelm Wundt. Photograph B shows Wundt and five other people gathered around a desk with equipment on top of it.

However, despite his efforts to train individuals in the process of introspection, this process remained highly subjective, and there was very little agreement between individuals. As a result, structuralism fell out of favor with the passing of Wundt’s student, Edward Titchener, in 1927 (Gordon, 1995).

Wertheimer, Koffka, Köhler, and Gestalt Psychology [...]

Max Wertheimer (1880–1943), Kurt Koffka (1886–1941), and Wolfgang Köhler (1887–1967) were three German psychologists who immigrated to the United States in the early 20th century to escape Nazi Germany. These men are credited with introducing psychologists in the United States to various Gestalt principles. The word Gestalt roughly translates to “whole;” a major emphasis of Gestalt psychology deals with the fact that although a sensory experience can be broken down into individual parts, how those parts relate to each other as a whole is often what the individual responds to in perception. For example, a song may be made up of individual notes played by different instruments, but the real nature of the song is perceived in the combinations of these notes as they form the melody, rhythm, and harmony. In many ways, this particular perspective would have directly contradicted Wundt’s ideas of structuralism (Thorne & Henley, 2005).

Unfortunately, in moving to the United States, these men were forced to abandon much of their work and were unable to continue to conduct research on a large scale. These factors along with the rise of behaviorism (described next) in the United States prevented principles of Gestalt psychology from being as influential in the United States as they had been in their native Germany (Thorne & Henley, 2005). Despite these issues, several Gestalt principles are still very influential today. Considering the human individual as a whole rather than as a sum of individually measured parts became an important foundation in humanistic theory late in the century. The ideas of Gestalt have continued to influence research on sensation and perception.

Structuralism, Freud, and the Gestalt psychologists were all concerned in one way or another with describing and understanding inner experience. But other researchers had concerns about whether inner experience could be a legitimate subject of scientific inquiry and chose instead to exclusively study behavior, the objectively observable outcome of mental processes.

Pavlov, Watson, Skinner, and Behaviorism [...]

Early work in the field of behavior was conducted by the Russian physiologist Ivan Pavlov (1849–1936). Pavlov studied a form of learning behavior called a conditioned reflex, in which an animal or human produced a reflex (unconscious) response to a stimulus and, over time, was conditioned to produce the response to a different stimulus that the experimenter associated with the original stimulus. The reflex Pavlov worked with was salivation in response to the presence of food. The salivation reflex could be elicited using a second stimulus, such as a specific sound, that was presented in association with the initial food stimulus several times. Once the response to the second stimulus was “learned,” the food stimulus could be omitted. Pavlov’s “classical conditioning” is only one form of learning behavior studied by behaviorists.

John B. Watson (1878–1958) was an influential American psychologist whose most famous work occurred during the early 20th century at Johns Hopkins University. While Wundt and James were concerned with understanding conscious experience, Watson thought that the study of consciousness was flawed. Because he believed that objective analysis of the mind was impossible, Watson preferred to focus directly on observable behavior and try to bring that behavior under control. Watson was a major proponent of shifting the focus of psychology from the mind to behavior, and this approach of observing and controlling behavior came to be known as behaviorism. A major object of study by behaviorists was learned behavior and its interaction with inborn qualities of the organism. Behaviorism commonly used animals in experiments under the assumption that what was learned using animal models could, to some degree, be applied to human behavior. Indeed, Tolman (1938) stated, “I believe that everything important in psychology (except … such matters as involve society and words) can be investigated in essence through the continued experimental and theoretical analysis of the determiners of rat behavior at a choice-point in a maze.”

A photograph shows John B. Watson.

Behaviorism dominated experimental psychology for several decades, and its influence can still be felt today (Thorne & Henley, 2005). Behaviorism is largely responsible for establishing psychology as a scientific discipline through its objective methods and especially experimentation. In addition, it is used in behavioral and cognitive-behavioral therapy. Behavior modification is commonly used in classroom settings. Behaviorism has also led to research on environmental influences on human behavior.

B. F. Skinner (1904–1990) was an American psychologist ([(link)]). Like Watson, Skinner was a behaviorist, and he concentrated on how behavior was affected by its consequences. Therefore, Skinner spoke of reinforcement and punishment as major factors in driving behavior. As a part of his research, Skinner developed a chamber that allowed the careful study of the principles of modifying behavior through reinforcement and punishment. This device, known as an operant conditioning chamber (or more familiarly, a Skinner box), has remained a crucial resource for researchers studying behavior (Thorne & Henley, 2005).

Photograph A shows B.F. Skinner. Illustration B shows a rat in a Skinner box: a chamber with a speaker, lights, a lever, and a food dispenser.

The Skinner box is a chamber that isolates the subject from the external environment and has a behavior indicator such as a lever or a button. When the animal pushes the button or lever, the box is able to deliver a positive reinforcement of the behavior (such as food) or a punishment (such as a noise) or a token conditioner (such as a light) that is correlated with either the positive reinforcement or punishment.

Skinner’s focus on positive and negative reinforcement of learned behaviors had a lasting influence in psychology that has waned somewhat since the growth of research in cognitive psychology. Despite this, conditioned learning is still used in human behavioral modification. Skinner’s two widely read and controversial popular science books about the value of operant conditioning for creating happier lives remain as thought-provoking arguments for his approach (Greengrass, 2004).

Maslow, Rogers, and Humanism [...]

During the early 20th century, American psychology was dominated by behaviorism and psychoanalysis. However, some psychologists were uncomfortable with what they viewed as limited perspectives being so influential to the field. They objected to the pessimism and determinism (all actions driven by the unconscious) of Freud. They also disliked the reductionism, or simplifying nature, of behaviorism. Behaviorism is also deterministic at its core, because it sees human behavior as entirely determined by a combination of genetics and environment. Some psychologists began to form their own ideas that emphasized personal control, intentionality, and a true predisposition for “good” as important for our self-concept and our behavior. Thus, humanism emerged. Humanism is a perspective within psychology that emphasizes the potential for good that is innate to all humans. Two of the most well-known proponents of humanistic psychology are Abraham Maslow and Carl Rogers (O’Hara, n.d.).

Abraham Maslow (1908–1970) was an American psychologist who is best known for proposing a hierarchy of human needs in motivating behavior ([(link)]). Although this concept will be discussed in more detail in a later chapter, a brief overview will be provided here. Maslow asserted that so long as basic needs necessary for survival were met (e.g., food, water, shelter), higher-level needs (e.g., social needs) would begin to motivate behavior. According to Maslow, the highest-level needs relate to self-actualization, a process by which we achieve our full potential. The focus on the positive aspects of human nature that is characteristic of the humanistic perspective is evident here (Thorne & Henley, 2005). Humanistic psychologists rejected, on principle, the research approach based on reductionist experimentation in the tradition of the physical and biological sciences, because it missed the “whole” human being. Beginning with Maslow and Rogers, there was an insistence on a humanistic research program. This program has been largely qualitative (not measurement-based), but there exist a number of quantitative research strains within humanistic psychology, including research on happiness, self-concept, meditation, and the outcomes of humanistic psychotherapy (Friedman, 2008).

A triangle is divided vertically into five sections with corresponding labels inside and outside of the triangle for each section. From top to bottom, the triangle’s sections are labeled: self-actualization, esteem, social, security, and physiological needs.

Carl Rogers (1902–1987) was also an American psychologist who, like Maslow, emphasized the potential for good that exists within all people ([(link)]). Rogers used a therapeutic technique known as client-centered therapy in helping his clients deal with problematic issues that resulted in their seeking psychotherapy. Unlike a psychoanalytic approach in which the therapist plays an important role in interpreting what conscious behavior reveals about the unconscious mind, client-centered therapy involves the patient taking a lead role in the therapy session. Rogers believed that a therapist needed to display three features to maximize the effectiveness of this particular approach: unconditional positive regard, genuineness, and empathy. Unconditional positive regard refers to the fact that the therapist accepts their client for who they are, no matter what they might say. Provided these factors, Rogers believed that people were more than capable of dealing with and working through their own issues (Thorne & Henley, 2005).

Humanism has been influential to psychology as a whole. Both Maslow and Rogers are well-known names among students of psychology (you will read more about both men later in this text), and their ideas have influenced many scholars. Furthermore, Rogers’ client-centered approach to therapy is still commonly used in psychotherapeutic settings today (O’Hara, n.d.).

Link to Learning

View a brief video of Carl Rogers describing his therapeutic approach.

The Psychodynamic Perspective [...]

Psychodynamic theory is an approach to psychology that studies the psychological forces underlying human behavior, feelings, and emotions, and how they may relate to early childhood experience. This theory is especially interested in the dynamic relations between conscious and unconscious motivation, and asserts that behavior is the product of underlying conflicts over which people often have little awareness.

Psychodynamic theory was born in 1874 with the works of German scientist Ernst von Brücke, who supposed that all living organisms are energy systems governed by the principle of the conservation of energy. During the same year, medical student Sigmund Freud adopted this new “dynamic” physiology and expanded it to create the original concept of “psychodynamics,” in which he suggested that psychological processes are flows of psychosexual energy (libido) in a complex brain. Freud also coined the term “psychoanalysis.” Later, these theories were developed further by Carl Jung, Alfred Adler, Melanie Klein, and others. By the mid-1940s and into the 1950s, the general application of the “psychodynamic theory” had been well established.

Photograph of Freud

The Role of the Unconscious

Freud’s theory of psychoanalysis holds two major assumptions: (1) that much of mental life is unconscious (i.e., outside of awareness), and (2) that past experiences, especially in early childhood, shape how a person feels and behaves throughout life. The concept of the unconscious was central: Freud postulated a cycle in which ideas are repressed but continue to operate unconsciously in the mind, and then reappear in consciousness under certain circumstances. Much of Freud’s theory was based on his investigations of patients suffering from “hysteria” and neurosis. Hysteria was an ancient diagnosis that was primarily used for women with a wide variety of symptoms, including physical symptoms and emotional disturbances with no apparent physical cause. The history of the term can be traced to ancient Greece, where the idea emerged that a woman’s uterus could float around her body and cause a variety of disturbances. Freud theorized instead that many of his patients’ problems arose from the unconscious mind. In Freud’s view, the unconscious mind was a repository of feelings and urges of which we have no awareness.

The treatment of a patient referred to as Anna O. is regarded as marking the beginning of psychoanalysis. Freud worked together with Austrian physician Josef Breuer to treat Anna O.’s “hysteria,” which Freud implied was a result of the resentment she felt over her father’s real and physical illness that later led to his death. Today many researchers believe that her illness was not psychological, as Freud suggested, but either neurological or organic.

The Id, Ego, and Superego

Freud’s structural model of personality divides the personality into three parts—the id, the ego, and the superego. The id is the unconscious part that is the cauldron of raw drives, such as for sex or aggression. The ego, which has conscious and unconscious elements, is the rational and reasonable part of personality. Its role is to maintain contact with the outside world to keep the individual in touch with society, and to do this it mediates between the conflicting tendencies of the id and the superego. The superego is a person’s conscience, which develops early in life and is learned from parents, teachers, and others. Like the ego, the superego has conscious and unconscious elements. When all three parts of the personality are in dynamic equilibrium, the individual is thought to be mentally healthy. However, if the ego is unable to mediate between the id and the superego, an imbalance is believed to occur in the form of psychological distress.

Image of a clip-art iceberg, with large portions of its superego and ego under the surface of the water, with the id at the bottom of the iceberg. The exposed portion is conscious experience.

Psychosexual Theory of Development

Freud’s theories also placed a great deal of emphasis on sexual development. Freud believed that each of us must pass through a series of stages during childhood, and that if we lack proper nurturing during a particular stage, we may become stuck or fixated in that stage. Freud’s psychosexual model of development includes five stages: oral, anal, phallic, latency, and genital. According to Freud, children’s pleasure-seeking urges are focused on a different area of the body, called an erogenous zone, at each of these five stages. Psychologists today dispute that Freud’s psychosexual stages provide a legitimate explanation for how personality develops, but what we can take away from Freud’s theory is that personality is shaped, in some part, by experiences we have in childhood.

Jungian Psychodynamics

Carl Jung was a Swiss psychotherapist who expanded upon Freud’s theories at the turn of the 20th century. A central concept of Jung’s analytical psychology is individuation: the psychological process of integrating opposites, including the conscious with the unconscious, while still maintaining their relative autonomy. Jung focused less on infantile development and conflict between the id and superego and instead focused more on integration between different parts of the person. Jung created some of the best-known psychological concepts, including the archetype, the collective unconscious, the complex, and synchronicity.

Psychodynamics Today

At present, psychodynamics is an evolving multidisciplinary field that analyzes and studies human thought processes, response patterns, and influences. Research in this field focuses on areas such as:

  • understanding and anticipating the range of conscious and unconscious responses to specific sensory inputs, such as images, colors, textures, sounds, etc.;
  • utilizing the communicative nature of movement and primal physiological gestures to affect and study specific mind-body states; and
  • examining the capacity of the mind and senses to directly affect physiological response and biological change.

Psychodynamic therapy, in which patients become increasingly aware of dynamic conflicts and tensions that are manifesting as a symptom or challenge in their lives, is an approach to therapy that is still commonly used today.



The Biological Perspective [...]

Biopsychology—also known as biological psychology or psychobiology—is the application of the principles of biology to the study of mental processes and behavior. The fields of behavioral neuroscience, cognitive neuroscience, and neuropsychology are all subfields of biological psychology.

Overview of Biopsychology

Biopsychologists are interested in measuring biological, physiological, and/or genetic variables and attempting to relate them to psychological or behavioral variables. Because all behavior is controlled by the central nervous system, biopsychologists seek to understand how the brain functions in order to understand behavior. Key areas of focus include sensation and perception, motivated behavior (such as hunger, thirst, and sex), control of movement, learning and memory, sleep and biological rhythms, and emotion. As technical sophistication leads to advancements in research methods, more advanced topics, such as language, reasoning, decision-making, and consciousness, are now being studied.

Brain scans

Behavioral neuroscience has a strong history of contributing to the understanding of medical disorders, including those that fall into the realm of clinical psychology. Neuropsychologists are often employed as scientists to advance scientific or medical knowledge, and neuropsychology is particularly concerned with understanding brain injuries in an attempt to learn about normal psychological functioning.

MRI of the brain

Neuroimaging tools, such as functional magnetic resonance imaging (fMRI) scans, are often used to observe which areas of the brain are active during particular tasks in order to help psychologists understand the link between brain and behavior.


Biopsychology as a scientific discipline emerged from a variety of scientific and philosophical traditions in the 18th and 19th centuries.

Image of parts of the brain, showing the pineal gland

Philosophers like René Descartes proposed physical models to explain animal and human behavior. Descartes suggested, for example, that the pineal gland, a midline unpaired structure in the brain of many organisms, was the point of contact between mind and body. In The Principles of Psychology (1890), William James argued that the scientific study of psychology should be grounded in an understanding of biology. The emergence of both psychology and behavioral neuroscience as legitimate sciences can be traced to the emergence of physiology during the 18th and 19th centuries; however, it was not until 1914 that the term “psychobiology” was first used in its modern sense by Knight Dunlap in An Outline of Psychobiology.


Cultural Psychology [...]

Cultural psychology is the study of how psychological and behavioral tendencies are rooted and embedded within culture. The main tenet of cultural psychology is that mind and culture are inseparable and mutually constitutive, meaning that people are shaped by their culture and their culture is also shaped by them.

A major goal of cultural psychology is to expand the number and variation of cultures that contribute to basic psychological theories, so that these theories become more relevant to the predictions, descriptions, and explanations of all human behaviors—not just Western ones. Populations that are Western, educated, and industrialized tend to be overrepresented in psychological research, yet findings from this research tend to be labeled “universal” and inaccurately applied to other cultures. The evidence that social values, logical reasoning, and basic cognitive and motivational processes vary across populations has become increasingly difficult to ignore. By studying only a narrow range of culture within human populations, psychologists fail to account for a substantial amount of diversity.

Collage of white pop culture icons.

Cultural psychology is often confused with cross-cultural psychology; however, it is distinct in that cross-cultural psychologists generally use culture as a means of testing the universality of psychological processes, rather than determining how local cultural practices shape psychological processes. So while a cross-cultural psychologist might ask whether Jean Piaget’s stages of development are universal across a variety of cultures, a cultural psychologist would be interested in how the social practices of a particular set of cultures shape the development of cognitive processes in different ways.

Vygotsky and Cultural-Historical Psychology

Cultural-historical psychology is a psychological theory formed by Lev Vygotsky in the late 1920s and further developed by his students and followers in Eastern Europe and worldwide. This theory focuses on how aspects of culture, such as values, beliefs, customs, and skills, are transmitted from one generation to the next. According to Vygotsky, social interaction—especially involvement with knowledgeable community or family members—helps children to acquire the thought processes and behaviors specific to their culture and/or society. The growth that children experience as a result of these interactions differs greatly between cultures; this variance allows children to become competent in tasks that are considered important or necessary in their particular society.

The Cognitive Perspective [...]

Cognitive psychology is the school of psychology that examines internal mental processes such as problem solving, memory, and language. “Cognition” refers to thinking and memory processes, and “cognitive development” refers to long-term changes in these processes. Much of the work derived from cognitive psychology has been integrated into various other modern disciplines of psychological study, including social psychology, personality psychology, abnormal psychology, developmental psychology, educational psychology, and behavioral economics.

Cognitive psychology is radically different from previous psychological approaches in that it is characterized by both of the following:

  1. It accepts the use of the scientific method and generally rejects introspection as a valid method of investigation, unlike phenomenological methods such as Freudian psychoanalysis.
  2. It explicitly acknowledges the existence of internal mental states (such as belief, desire, and motivation), unlike behaviorist psychology.

Cognitive theory contends that solutions to problems take the form of algorithms, heuristics, or insights. Major areas of research in cognitive psychology include perception, memory, categorization, knowledge representation, numerical cognition, language, and thinking.

History of Cognitive Psychology

Cognitive psychology is one of the more recent additions to psychological research. Though there are examples of cognitive approaches from earlier researchers, cognitive psychology really developed as a subfield within psychology in the late 1950s and early 1960s. The development of the field was heavily influenced by contemporary advancements in technology and computer science.

Early Roots

In 1958, Donald Broadbent integrated concepts from human-performance research and the recently developed information theory in his book Perception and Communication, which paved the way for the information-processing model of cognition. Ulric Neisser is credited with formally having coined the term “cognitive psychology” in his book of the same name, published in 1967. The perspective had its foundations in the Gestalt psychology of Max Wertheimer, Wolfgang Köhler, and Kurt Koffka, and in the work of Jean Piaget, who studied intellectual development in children.

Although no one person is entirely responsible for starting the cognitive revolution, Noam Chomsky was very influential in the early days of this movement. Chomsky (1928–), an American linguist, was dissatisfied with the influence that behaviorism had had on psychology. He believed that psychology’s focus on behavior was short-sighted and that the field had to reincorporate mental functioning into its purview if it were to offer any meaningful contributions to understanding behavior (Miller, 2003).

Jean Piaget’s Theory of Cognitive Development

Photograph of Jean Piaget

Instead of approaching development from a psychoanalytic or psychosocial perspective, Piaget focused on children’s cognitive growth. He is most widely known for his stage theory of cognitive development, which outlines how children become able to think logically and scientifically over time. As they progress to a new stage, there is a distinct shift in how they think and reason.

The Humanistic Perspective [...]

Humanistic psychology is a psychological perspective that rose to prominence in the mid-20th century, drawing on the philosophies of existentialism and phenomenology, as well as Eastern philosophy. It adopts a holistic approach to human existence through investigations of concepts such as meaning, values, freedom, tragedy, personal responsibility, human potential, spirituality, and self-actualization.

Basic Principles of the Humanistic Perspective

The humanistic perspective is a holistic psychological perspective that attributes human characteristics and actions to free will and an innate drive for self-actualization. This approach focuses on maximum human potential and achievement rather than psychoses and symptoms of disorder. It emphasizes that people are inherently good and pays special attention to personal experiences and creativity. This perspective has led to advances in positive, educational, and industrial psychology, and has been applauded for its successful application to psychotherapy and social issues. Despite its great influence, humanistic psychology has also been criticized for its subjectivity and lack of evidence.

Developments in Humanistic Psychology

In the late 1950s, a group of psychologists convened in Detroit, Michigan, to discuss their interest in a psychology that focused on uniquely human issues, such as the self, self-actualization, health, hope, love, creativity, nature, being, becoming, individuality, and meaning. These preliminary meetings eventually culminated in the description of humanistic psychology as a recognizable “third force” in psychology, along with behaviorism and psychoanalysis. Humanism’s major theorists were Abraham Maslow, Carl Rogers, Rollo May, and Clark Moustakas; it was also influenced by psychoanalytic theorists, including Wilhelm Reich, who discussed an essentially good, healthy core self, and Carl Gustav Jung, who emphasized the concept of archetypes.

Maslow’s Hierarchy of Needs

Abraham Maslow (1908–1970) is considered the founder of humanistic psychology, and is noted for his conceptualization of a hierarchy of human needs. He believed that every person has a strong desire to realize his or her full potential—or to reach what he called “self-actualization”. Unlike many of his predecessors, Maslow studied mentally healthy individuals instead of people with serious psychological issues. Through his research he coined the term “peak experiences,” which he defined as “high points” in which people feel at harmony with themselves and their surroundings. Self-actualized people, he believed, have more of these peak experiences throughout a given day than others.

To explain his theories, Maslow created a visual, which he termed the “hierarchy of needs.” This pyramid depicts various levels of physical and psychological needs that a person progresses through during their lifetime. At the bottom of the pyramid are the basic physiological needs of a human being, such as food and water. The next level is safety, which includes shelter and needs paramount to physical survival. The third level, love and belonging, is the psychological need to share oneself with others. The fourth level, esteem, focuses on success, status, and accomplishments. The top of the pyramid is self-actualization, in which a person is believed to have reached a state of harmony and understanding. Individuals progress from lower to higher stages throughout their lives, and cannot reach higher stages without first meeting the lower needs that come before them.

Pyramid showing the hierarchy of needs.

Rogers’ Person-Centered Therapy

Carl Rogers (1902–1987) is best known for his person-centered approach, in which the relationship between therapist and client is used to help the patient reach a state of realization, so that they can then help themselves. His non-directive approach focuses more on the present than the past and centers on clients’ capacity for self-direction and understanding of their own development. The therapist encourages the patient to express their feelings and does not suggest how the person might wish to change. Instead, the therapist uses the skills of active listening and mirroring to help patients explore and understand their feelings for themselves.

Photograph of Carl Rogers.

Rogers is also known for practicing “unconditional positive regard,” which is defined as accepting a person in their entirety with no negative judgment of their essential worth. He believed that those raised in an environment of unconditional positive regard have the opportunity to fully actualize themselves, while those raised in an environment of conditional positive regard only feel worthy if they match conditions that have been laid down by others.

May’s Existentialism

Rollo May (1909–1994) was the best known American existential psychologist, and differed from other humanistic psychologists by showing a sharper awareness of the tragic dimensions of human existence. May was influenced by American humanism, and emphasized the importance of human choice.

Advantages and Disadvantages

Humanistic psychology is holistic in nature: it takes whole persons into account rather than their separate traits or processes. In this way, people are not reduced to one particular attribute or set of characteristics, but instead are appreciated for the complex beings that they are. Humanistic psychology allows for a personality concept that is dynamic and fluid and accounts for much of the change a person experiences over a lifetime. It stresses the importance of free will and personal responsibility for decision-making; this view gives the conscious human being some necessary autonomy and frees them from deterministic principles. Perhaps most importantly, the humanistic perspective emphasizes the need to strive for positive goals and explains human potential in a way that other theories cannot.

However, critics have taken issue with many of the early tenets of humanism, such as its lack of empirical evidence (as was the case with most early psychological approaches). Because of the inherently subjective nature of the humanistic approach, psychologists worry that this perspective does not identify enough constant variables to be researched with consistency and accuracy. Psychologists also worry that such an extreme focus on the subjective experience of the individual does little to explain or appreciate the impact of external societal factors on personality development. In addition, the major tenet of humanistic personality psychology—namely, that people are innately good and intuitively seek positive goals—does not account for the presence of deviance in the world within normal, functioning personalities.

The Socio-Cultural Perspective [...]

Sociocultural factors are the larger-scale forces within cultures and societies that affect the thoughts, feelings, and behaviors of individuals. These include forces such as attitudes, child-rearing practices, discrimination and prejudice, ethnic and racial identity, gender roles and norms, family and kinship structures, power dynamics, regional differences, religious beliefs and practices, rituals, and taboos. Several subfields within psychology seek to examine these sociocultural factors that influence human mental states and behavior; among these are social psychology, cultural psychology, and cultural-historical psychology.

Datensparsamkeit [...]

Datensparsamkeit is a German word that “refers to collecting only the minimal amount of information necessary to complete a task.” It is opposed to the now common practice of gathering as much information from users’ actions as possible, in anticipation of future use.
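As a concrete sketch of the principle, a survey handler practicing datensparsamkeit would whitelist only the fields the task requires and discard everything else. The field names below are invented purely for illustration:

```python
# Hypothetical illustration of datensparsamkeit: keep only the minimal
# record needed to complete the task, and drop everything else.

REQUIRED_FIELDS = {"question_id", "answer"}

def store_response(submission: dict) -> dict:
    """Return only the fields the survey actually needs."""
    # Whitelist rather than blacklist: anything not explicitly required
    # (IP address, user agent, referrer, ...) is never stored at all.
    return {k: v for k, v in submission.items() if k in REQUIRED_FIELDS}

record = store_response({
    "question_id": 7,
    "answer": "yes",
    "ip_address": "203.0.113.9",   # discarded
    "user_agent": "Mozilla/5.0",   # discarded
})
```

The design choice is that data you never collect cannot later be leaked, subpoenaed, or sold.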

I learned about datensparsamkeit from a colleague who was creating a confidential survey tool. I thought about it again recently while considering what information news organizations may collect and store from users, either through themselves or third-party tools.

March brought a particularly egregious example of poor data collection. CNBC put up a tutorial to teach visitors about secure password creation, but the passwords were collected and then sent without encryption to a Google spreadsheet, meaning anyone on an individual user’s Wi-Fi network, or anyone on CNBC’s network with the correct permissions, could see them. The tutorial was also set up so that the 30 or so advertisers whose ads appeared on the page could see the passwords.

CNBC took down the tutorial, but they’re not the only news organization that routinely collects data about users through quizzes or survey results or apps. Some news organizations then transmit this collected information to advertisers or third-party tools. Should news organizations ponder the ethical considerations of transmitting personal information like IP addresses to third-party tools or advertisers without asking for explicit permission? (Source)

Grace Hopper warned that overly aggressive privacy law could result in Defensive Computing.

When questioned about regulation, Big Data people argue that what they are doing is Completely Different and Exactly the Same as what came before.

Challenges from the Right [...]

Some of the GOP’s biggest-name incumbents and front-runners still must beat back nagging challengers from the right who, even if almost certain to lose, promise to sap energy and money from the party’s efforts to fend off Democrats in the general election.

Sens. John McCain of Arizona and Kelly Ayotte of New Hampshire will spend their summers under attack by underfinanced tea party opponents. In Nevada, the reemergence of Sharron Angle — who infamously blew a chance to defeat Senate Democratic Leader Harry Reid in 2010 — means the GOP’s hand-picked candidate for the open seat, Rep. Joe Heck, has a fight on his hands to get to the general election. And Republicans also are fretting over primaries in Colorado, Indiana and Florida, as well as the fate of a former leader of their campaign arm. (Source)

Neo-Nazis in the Classroom [...]

There is a famous episode of the podcast Love + Radio called “The Silver Dollar” (if you’ve never listened to it, you should). The episode features the story of African-American musician and author Daryl Davis, who became friends with various members of the Ku Klux Klan and persuaded several prominent members to quit the organization. Davis shares a story from 10th grade, when his teacher brought the leader of the American Nazi Party, Matt Koehl, in to his Problems in the 20th Century class.
“You cannot do that today in high school. All that political correctness, which if you ask me, is a bunch of bullspit,” Davis says in the podcast.
During Koehl’s lecture, he pointed to Davis and another African-American student in the class and said they were going to be shipped back to Africa. Davis was stunned. “From that day forward – that was a turning point in my life – I began collecting everything I could get my hands on that dealt with white supremacy, black supremacy, anti-Semitism, the Nazis in Germany, the neo-Nazis over here, the Ku Klux Klan, things like that. Music is my profession, but learning more about racism on all sides of the tracks was my obsession.” (Source)

Posting While Female [...]

The Guardian looked at online abuse on its site over the past ten years, and found it to be mostly directed at women writers and racial minorities.

Although the majority of our regular opinion writers are white men, we found that those who experienced the highest levels of abuse and dismissive trolling were not. The 10 regular writers who got the most abuse were eight women (four white and four non-white) and two black men…And the 10 regular writers who got the least abuse? All men. (Source)

The internet taught a new Microsoft AI to be racist and sexist in under one day. See Internet Teaches AI Chatbot Racism

Future generations may look back on the web as Sexist Architecture, which exacts a higher price for participation by women than men.

Pew has a 2014 report on abuse on the net. (Link)

Transcripts not Equivalent [...]

Transcripts have been used at times as a substitute when videos could not be captioned for hearing-impaired students. However, transcripts are not seen as equivalent in most circumstances, because:

  1. The transcript does not capture the images in the video, and is often not understandable without them;

  2. It is impossible to read a transcript while watching the screen;

  3. Transcripts are often not provided at the same time as the video link.

Simultaneous Access [...]

An accessibility issue. As an example, a student might be provided a transcript or captioning of a video lecture. However, if this captioning is not provided at the same time the videos are provided to the students, this is not considered equal access.

Why Older People Are Happier [...]

The older you are, the less likely you are to be a maximizer—which helps explain why studies show people get happier as they get older.

“One of the things that life teaches you is that ‘good enough’ is almost always good enough,” Dr. Schwartz says. “You learn that you can get satisfaction out of perfectly wonderful but not perfect outcomes.” (Source)

Reasons Reuse Fails [...]

  • Embedded visual context
  • Embedded textual context
  • Wrong size for reuse
  • No cross-links

Embedded Visual Context

  • My first revelation. Brown University.
  • Far beyond this
    • PDFs
    • Backgrounds
    • Layouts
    • Tables
  • what would remix look like under this? How does it feel?

SOLUTION: Keep it simple. Reduce Layout.

Embedded Textual Context

  • Candela Psych
  • Learning Theory — talks about behavior modification and phobias, and I don’t care (is it on the test?)
  • As we talked about in chapter 4,
  • TSA Randomizer: who is this person speaking? What are they reacting to? What’s the agenda? (Link)

Amalie “Emmy” Noether [...]

Yet there is no doubt that Amalie “Emmy” Noether transformed how we think about the universe. Despite the hairy mathematics, her great first theorem can be described conceptually in just a short sentence: Symmetries give rise to conservation laws.

This simplicity masks a penetrating insight. It provided a unifying perspective on the physics known at the time – and laid the groundwork for nearly every major fundamental discovery since. (Source)

Aspiring Adults Adrift [...]

A follow-up to Academically Adrift:

What happens after they graduate? In their follow-up, “Aspiring Adults Adrift,” published in 2014, what they reported was as distressing as the findings in their first book, but not much of a surprise. Poor academic performers were more likely than other recent graduates to be unemployed, stuck in unskilled jobs, or to have been fired or laid off. Where students went to college didn’t matter two years out, the sociologists found, as much as what they did while on campus. “The most important choice students can make is whether they are on the party-social pathway through college,” Dr. Arum said, “or are investing sufficient attention and focus on academic pursuits.” (Source)

How to College: Recommendations [...]

Dr. Settersten asked how many of them knew their professors well enough to request a letter of recommendation for the application. Only a smattering of hands went up. He wondered aloud: Why hadn’t more of them visited him during office hours, an easy way to build a one-on-one relationship with a professor who teaches hundreds of students a semester?

“What shocked me is that they say, ‘No one has told me this before,’ ” he said to me later. “They’re seniors and they don’t know how to navigate the institution.” Fewer than half of college seniors in the annual National Survey of Student Engagement said they talked often with a faculty member about their career plans. (Source)

Cooperation vs. Collaboration [...]

Harold Jarche:

Working cooperatively requires a different mindset than merely collaborating on a defined project. Being cooperative means being open to others outside your group. It also requires the casting-off of business metaphors based on military models (target markets, chain of command, strategic plans, line & staff).

Cooperation, sharing with no direct benefit, is needed at work so that we can continuously develop emergent practices demanded by increased complexity. Collaborating on specific tasks is not enough. We have to be prepared for perpetual Beta. What worked yesterday may not work today. No one has the definitive answer any more but we can use the intelligence of our networks to make sense together and see how we can influence desired results. (Source)

Reposting Reduces Comprehension [...]

Reposting articles contaminates memory and reduces comprehension of the articles reposted. In particular, misremembering an article almost doubled under a “posting” condition.

In one experiment, 80 students were presented with 50 short messages. After reading each, half of them were given the option to press “repost” or “next.” The other half simply pressed “next” and moved on.

Afterwards, all took a 10-item, multiple-choice test designed to determine how much of the information contained in the posts was understood and retained. As expected, participants who had the option to re-post scored significantly lower than those in the read-only group.

The researchers then looked at which specific messages those in the first group re-posted, and how well they were able to retain the information contained in the messages. They found participants were almost twice as likely to misremember a message if they had re-posted it. (Source)

The study authors suggest a cognitive load theory to explain the results.

Memories become contaminated the more they are remembered. See The Least Contaminated Memory

The Engram Lifecycle may (or may not!) provide a partial model for understanding this phenomenon.

Another Weibo study found that Anger Spreads Fastest

Collaboration Doesn’t Rate [...]

True, this survey is a little biased: 400 elite scientists were asked to vote for the top three traits of a scientist. But it is interesting that collaboration barely rates at all:

Honesty and curiosity are the most important traits underlying excellent science, according to a survey of around 400 members of elite US scientific societies, such as the National Academy of Sciences. A pilot study led by survey co-organizer Robert Pennock, a philosopher at Michigan State University in East Lansing, had previously identified the ten most widely held values among scientists who have been honoured by their peers for being exemplary. Although honesty and curiosity dominated, these virtues also included perseverance, objectivity and the willingness to abandon a preferred hypothesis in the face of conflicting evidence (see ‘Core values’). (Source)

One problem with this survey is it is not clear how these “scientific values” differ from general human values. For insight about science, it sure would be nice to have a control.

Invention of the Self-Serve Grocery Store [...]

At the time, most grocery stores in the United States were a hand-holding experience. Every item was behind a counter, and you’d be “waited on” by a clerk in a white apron. If you were buying, say, a pound of sorghum wheat, you’d point to it, then the employee would reach up, measure it out for you, and give it to you in a pretty package. From a shop owner’s perspective, this business model was not ideal: it required a large, knowledgeable staff, and it moved goods at a rather slow pace. Goldman had long since recognized these shortcomings — and in California he saw an answer: the “self-serve” supermarket. As early as 1916, stores had experimented with a self-serve model, where goods were put out on shelves and customers helped themselves. By 1919, this concept was in full swing in California, and Goldman, an impressionable young lad, was determined to take it back to Oklahoma. (Source)

Rise of the One-Word Song Title [...]

Why might shorter song titles be better commercially? Mostly because they are easier to remember, particularly if they are repeated over and over in the song. The last thing the music industry wants is for you to love a song but be unable to remember its name when you go to stream or download the song. But it’s tough to forget “Hello” or “Happy” when Adele and Pharrell keep repeating the one-word title throughout the song.

In the course of an interview about the death of the chorusless pop song, the music critic and chart analyst Chris Molanphy considered the growing market orientation of the pop song title: “Branding has gotten harder and faster,” he explains. “We’re increasingly moving away from the days of the bizarre title that has nothing to do whatsoever with the refrain of the song.”

If our hypothesis that commodification led to shorter song titles is correct, then we would expect to see even shorter titles at the very top of the charts. And that is just what we found. (Source)

Gage Skidmore’s Celebrity Photos [...]

Student uses Creative Commons licensing to become one of the world’s most utilized photographers.

Over the past 6 years, Skidmore has posted close to 40,000 pictures of Presidential candidates and Hollywood celebrities to Flickr. All of his photos are filed under a Creative Commons attribution license, allowing anyone — including Donald Trump — to freely use them. As a result, he’s become the Internet’s go-to source for political photographs: His shots are used by thousands of outlets, including The Atlantic, The Washington Post, The Associated Press, and NPR. His Flickr account has been linked to 30 million times. A “Gage Skidmore” Google image search turns up close to 500,000 results.

Gage Skidmore is truly ubiquitous, and here’s why: all of his images are filed under Creative Commons. That is, they are 100% free to use — even commercially — so long as attribution is given. But considering all of the time and money he puts into obtaining his photos, why would he simply give his work away? Here’s how he explains it:

“The photography world is changing very rapidly. Anyone can go out and buy a semi-professional camera (or a cell phone with a camera) and upload their photos to the Internet for all to access. In years past, organizations like AP or Getty had a corner on the market, but as the Internet has become an integral part of our lives, photographers have had to adapt. Creative Commons is a vehicle that allows my photos to be received by a wide audience; it has also allowed me to get my name out there, and secure paid gigs.” (Source)

Type A and Tobacco [...]

The Type A personality was an invention of the Tobacco Industry.

The two men who coined the term ‘Type A’ in the 1960s were also goal-oriented, high achieving types. But they wanted to be Type B: easygoing, take every day as it comes type people. Because while many people now use the term Type A as a badge of honor, the two men were cardiologists, and they invented the term Type A to describe stressed patients who were at a higher risk of heart attack, stroke, and other cardiovascular conditions.

So why did this medical term become a cultural mainstay? One reason is that the idea proved very influential—the two cardiologists wrote a bestselling book about Type A behavior and their work guided future medical research.

The other reason is that the cardiologists and their peers had very generous patrons who funded their research and promoted their ideas: the executives of tobacco companies, who desperately wanted to pin the blame for the high rates of heart attack and cardiovascular disease among their customers on something other than smoking. (Source)

Edited by China [...]

Desire to sell American media into Chinese markets has subtle and not-so-subtle impact on what Hollywood makes.

It’s also a question of censorship. In China, the State Administration of Press, Publication, Radio, Film and Television must approve any movie before it can be screened. The 37-person committee, as noted in the U.S. government report “Directed by Hollywood, Edited by China”, has a mandate to ban all movies that are “anti-China.” It’s headed by a member of the Communist Party who worked for 10 years as the deputy director of Beijing’s propaganda department.

In the case of Looper, the filmmakers originally intended for the protagonist to go to Paris. But in order to access the Chinese market, the producers signed a co-production deal with a Chinese company that was contingent on switching the setting from Paris to Shanghai. The Chinese censors probably liked that the movie portrayed China, in the near future, as a more attractive place than France; they have balked at scenes that even slightly malign China—like a scene in Shanghai of clothes drying on clotheslines in Mission Impossible 3. (Source)

First Internet Ad [...]

The world’s first internet ad reached a public with no herd immunity, and had a click-through rate of 44%. It’s a good example of how behaviors adapt and shift.

On October 27, 1994, the world’s first Internet ad graced computer screens across America. Small, rectangular, and multi-colored, it bore an ominous message that had nothing to do with its creator, AT&T: “Have you ever clicked your mouse right here? YOU WILL.”

Miraculously, 44% of those who saw the ad clicked on it. (Source)

Why Use the Median? [...]

Many analysts believe that the unthinking use of the average damages our understanding of quantitative information. When people look at an average, they take it to be “the norm”; in reality, it can be heavily skewed by a single huge outlier.

Imagine an analyst who wanted to know the representative value for the cost of real estate on a block with five houses. Four of the houses are worth $100,000 and one is worth $900,000. Given these numbers, the average would be $260,000 and the median $100,000. In this case, and many others, the median gives you a better sense of what is “typical”.

Recognizing the huge impact a small number of extreme values – or a skewed distribution – can have on the average, the U.S. Census primarily uses the median to present changes in family incomes. This methodology deemphasizes the growth in income of people in the top 1%.

The median is also less sensitive to the dirty data that many analysts deal with today. As statisticians and data analysts increasingly collect data from surveys and scraping the web, user error when entering data can have a larger impact on results. When a user accidentally adds an extra zero that changes an answer from 100 to 1,000, it is much more likely to impact the average than the median. (Source)
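Both claims are easy to check directly with Python’s statistics module, using the numbers from the paragraphs above:

```python
from statistics import mean, median

# The five-house block: four houses at $100,000, one at $900,000.
prices = [100_000, 100_000, 100_000, 100_000, 900_000]
print(mean(prices))    # 260000 -- dragged upward by the single outlier
print(median(prices))  # 100000 -- the "typical" house on the block

# Dirty data: one respondent accidentally types an extra zero.
clean = [100, 100, 100, 100, 100]
dirty = [100, 100, 100, 100, 1_000]
print(mean(dirty) - mean(clean))      # 180 -- the average moves a lot
print(median(dirty) - median(clean))  # 0   -- the median doesn't move
```

The single mistyped value shifts the average by 180 while leaving the median untouched, which is exactly the robustness the passage describes.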

First Use of Average [...]

Eisenhart finds that until the late 16th century, most scientists who measured magnetic declination used an ad hoc methodology to determine the best value among their many measurements.

But in 1580, the scientist William Borough seems to have done something different. Borough took eight different measures of declination and, “conferring them altogether”, decided the best value was between 11 ⅓ and 11 ¼ degrees. He likely calculated the arithmetic mean, which was within this range. Yet Borough does not explicitly say this was his methodology.

It is not until 1635 that there is an absolutely clear case of a scientist using an average as a representative value. It was then that the English astronomer Henry Gellibrand took two measurements of declination, one in the morning (11 degrees) and one in the afternoon (11 degrees and 32 minutes), and, deciding on the most likely value, Gellibrand wrote:

So that if we take the arithmeticall (sic) mean, we may probably conclude the [correct measurement] to be about 11 gr. 16 min.

And there we have it. What may be the very first use of the average as an approximation of the truth! (Source)
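Gellibrand’s arithmetic is easy to reproduce: convert both readings to arcminutes, take the mean, and convert back. A minimal sketch:

```python
# Gellibrand's two declination readings, averaged as he describes.

def to_minutes(degrees: int, minutes: int) -> int:
    """Convert a degrees-and-arcminutes reading to total arcminutes."""
    return degrees * 60 + minutes

morning = to_minutes(11, 0)     # 11 degrees, measured in the morning
afternoon = to_minutes(11, 32)  # 11 degrees 32 minutes, in the afternoon

mean_reading = (morning + afternoon) // 2  # arithmetic mean, in arcminutes
deg, mins = divmod(mean_reading, 60)
print(deg, mins)  # 11 16 -- "about 11 gr. 16 min."
```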

Online Dating and Elo Scores [...]

The swipe-left, swipe-right dating app Tinder, for example, is known for making matches based on an internal attractiveness ranking it calculates for each of its users. As Sean Rad, the founder of Tinder, has explained to Fast Company, Tinder calls each user’s ranking his or her “Elo score.” The term comes from the world of professional chess, where Elo scores are used to rank players. If an average player beats a grandmaster, her score increases significantly. If a great player loses to an even better player, his Elo score drops only a few points.

On Tinder, the chess matches are users indicating whether they want to go on a date with each other, and users’ scores go up or down depending on how highly ranked they are. As one journalist put it, the system looks a lot like “a definitive scoring of our attractiveness, a supercharged Hot or Not-style algorithm.”

Rad stresses that the elo score rates “desirability,” which incorporates more than attractiveness. Yet the app clearly uses elo scores to match equally datable people. Rad has said that he can ballpark someone’s elo score just by looking at pictures of the matches served up by Tinder’s algorithm. (Source)

Death of the ‘Mixed-Attractiveness’ Couple [...]

Instead it’s well established among academics interested in dating that “opposites attract” is a myth. Study after study supports the idea of “assortative mating”: the hypothesis that people generally date and marry partners who are like them in terms of social class, educational background, race, personality, and, of course, attractiveness.
There is an exception, however, to this seeming rule that people always date equally attractive people: The longer two people know each other before they start dating, the more likely it is that a 3 will date a 6, or a 7 will marry a 10.
Which is interesting to think about as dating apps, which match strangers up for dates, take over the dating world. Because if more and more people meet their future spouse on a first date, the mixed-attractiveness couple might just go extinct. (Source)

Why You Need To Stop Blogging & Regain Your Writing Soul [...]

Blogging can be a great practice for writers. It forces you to write regularly and helps you discipline yourself in your craft. I’m a fan of it. Really.

But it can also be a disease — a people-pleasing addiction that saps you of your creative edge. The true writer must beware of, and take caution when using, this marvelous tool for one reason: Writing solely for others can cost you your (writing) soul. (Source)

Blogless System [...]

The Blogless publishing system is for web writers. Just publish, share your texts with friends and don’t think about the rest. (Source)

Paul’s Online Math Notes [...]

Welcome to my online math tutorials and notes. The intent of this site is to provide a complete set of free online (and downloadable) notes and/or tutorials for classes that I teach at Lamar University. I’ve tried to write the notes/tutorials in such a way that they should be accessible to anyone wanting to learn the subject, regardless of whether you are in my classes or not. In other words, they do not assume you’ve got any prior knowledge other than the standard set of prerequisite material needed for that class. That is, it is assumed that you know Algebra and Trig prior to reading the Calculus I notes, know Calculus I prior to reading the Calculus II notes, etc. The assumptions about your background that I’ve made are given with each description below. (Source)

The Qin Revision [...]

The newly discovered texts challenge long-held certainties about this era. Chinese political thought as exemplified by Confucius allowed for meritocracy among officials, eventually leading to the famous examination system on which China’s imperial bureaucracy was founded. But the texts show that some philosophers believed that rulers should also be chosen on merit, not birth—radically different from the hereditary dynasties that came to dominate Chinese history. The texts also show a world in which magic and divination, even in the supposedly secular world of Confucius, played a much larger part than has been realized. And instead of an age in which sages neatly espoused discrete schools of philosophy, we now see a more fluid, dynamic world of vigorously competing views—the sort of robust exchange of ideas rarely prominent in subsequent eras.

These competing ideas were lost after China was unified in 221 BCE under the Qin, China’s first dynasty. In one of the most traumatic episodes from China’s past, the first Qin emperor tried to stamp out ideological nonconformity by burning books (see illustration on this page). Modern historians question how many books really were burned. (More works probably were lost to imperial editing projects that recopied the bamboo texts onto newer technologies like silk and, later, paper in a newly standardized form of Chinese writing.) But the fact is that for over two millennia all our knowledge of China’s great philosophical schools was limited to texts revised after the Qin unification. Earlier versions and competing ideas were lost—until now. (Source)

Student Desire for Engagement [...]

The researchers’ paper on their results says they generally found that students’ propensity to engage in certain activities was more strongly correlated with various educational outcomes than their actual engagement. In other words, students who wanted to engage in an activity, but had not yet done so, had educational outcomes closer to those who had already engaged in the activity than those who had forgone it based on a lack of interest.

Bad Fan [...]

The “Bad Fan” is a fan who reads a TV show’s purpose wrongly — and in some cases gets it entirely reversed. Often bad fans mistake problematic antiheroes for admirable ones, or misread a critical take on a shallow world as celebratory.

For the Bad Fan, Tony Soprano is a tough and admirable protagonist, and Walter White is someone to be idolized. The conscienceless law firm on The Good Wife is thought to be a fun and vibrant workplace. Sometimes the problem is less ideological, as with the people who watch Game of Thrones for the sex scenes but complain about the labyrinthine plot.

Emily Nussbaum originated the term when writing about the “Problem of the Bad Fan”:

A few weeks ago, during a discussion of “Breaking Bad” on Twitter (my part-time volunteer gig), we all started yakking about the phenomenon of “bad fans.” All shows have them. They’re the “Sopranos” buffs who wanted a show made up of nothing but whackings (and who posted eagerly about how they fast-forwarded past anything else). They’re the “Girls” watchers who were aesthetically outraged by Hannah having sex with Josh(ua). They’re the ones who get furious whenever anyone tries to harsh Don Draper’s mellow. If you create a TV show, you’re probably required to say something in response to these viewers along the lines of, “Well, you know, whatever anyone gets out of the show is fine! It’s not my place to say. I’m just glad people are watching.”

Luckily, I have not created a show. So I will say it: some fans are watching wrong. (Source)

Some people see All in the Family as having one of the original Bad Fan problems. See Bunker’s Backfire

For a literary example, see Sociopathic Whuffie

Brain-Sided [...]

There is no such thing as a left- or right-brained person:

In a new two-year study published in the journal PLOS ONE, University of Utah neuroscientists scanned the brains of more than 1,000 people, ages 7 to 29, while they were lying quietly or reading, measuring their functional lateralization – the specific mental processes taking place on each side of the brain. They broke the brain into 7,000 regions, and while they did uncover patterns for why a brain connection might be strongly left or right-lateralized, they found no evidence that the study participants had a stronger left or right-sided brain network. (Source)

The myth can do a lot of harm — the idea that there are two types of people, the creative and the logical, is pernicious, and encourages people to ignore talents they may have. It is also dis-integrative, failing to recognize the creativity necessary to logical tasks and the logic necessary to productive creativity.

Quadratic Voting [...]

Applying economics to survey design provides a better sense of what voters find important.

The researchers, a mix of academics and private-sector experts from a range of disciplines, theorized that if they imposed artificial scarcity on survey respondents, they could force them to make tradeoffs that would reflect their real-world priorities better than traditional polling. To test that theory, they conducted a survey that asked more than 1,000 Americans their opinion on 10 hot-button issues like abortion, gun control and pay equity for women. But in this survey, respondents were given 100 credits that they could allocate as votes on the different issues. Someone who cared deeply about immigration could spend all 100 credits on that issue, but then she wouldn’t be able to weigh in on any of the other subjects.

There was an added twist: Each additional vote on an issue cost more credits than the one before it. Casting a single vote in favor of abortion rights cost just one credit, but casting four votes cost 16 credits. As a result, it was more expensive to take a more extreme position.
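The pricing rule just described (v votes on one issue cost v² credits, so each additional vote costs more than the last) can be sketched directly. The budget of 100 credits comes from the source; the sample allocation below is illustrative.

```python
def vote_cost(votes):
    """Under quadratic voting, v votes on one issue cost v**2 credits."""
    return votes ** 2

def marginal_cost(votes):
    """Cost of the next vote after already casting `votes` votes."""
    return vote_cost(votes + 1) - vote_cost(votes)

assert vote_cost(1) == 1     # one vote costs 1 credit
assert vote_cost(4) == 16    # four votes cost 16 credits, as in the survey
assert marginal_cost(3) == 7  # the 4th vote alone costs 7 credits

def total_cost(allocation):
    """Total credits spent on a dict of issue -> votes."""
    return sum(vote_cost(v) for v in allocation.values())

# A respondent spreading a 100-credit budget across three issues
ballot = {"immigration": 7, "gun control": 5, "pay equity": 5}
assert total_cost(ballot) == 99  # 49 + 25 + 25, within budget
```

The rising marginal cost is the whole point of the design: expressing an extreme preference on one issue forces you to give up influence on the others.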

The approach, which the authors call quadratic voting, is based on research from economist Glen Weyl, who first developed it as a way to improve group decision making. Weyl has co-founded a company, Collective Decision Engines, to apply the quadratic voting technique to market research, product design and other commercial purposes. (Source)

In-Market, Print Is King [...]

“It’s totally unsurprising that print readership has been shrinking, but it is extremely surprising that in-market online readership hasn’t been growing,” says Hsiang Iris Chyi, Ph.D., an associate professor at the School of Journalism at the University of Texas at Austin. She wrote the paper with Ori Tenenboim, a doctoral student at UT.

Bloody Popular Sovereignty [...]

Devolving power to the states to decide whether they would be free turned them into battlegrounds, and tore the country apart.

This compromise, while winning Douglas the support he wanted for his Illinois railroad plan, proceeded to backfire on the nation in spectacular fashion. Since the question of whether a territory would be slave or free now hinged on the number of voters within it who supported each, pro- and anti-slavery activists (including one anti-slavery fanatic who would pop up again later on: John Brown) rushed into Kansas and Nebraska, hoping by their presence to establish a majority for their side. Within a year open violence had broken out between the two factions, earning Kansas the sobriquet “Bleeding Kansas.” Rather than settling the question of whether Kansas should be free or slave, the legislation and the violence it created ended up leaving it wide open, as neither side would accept a vote that went the other way as legitimate.

Beyond the two territories that had prompted the debate, the effects were dramatic as well. Northerners, who had assumed that territories north of the old compromise line were safe from the expansion of slavery, suddenly found themselves confronted with a new reality in which any territory (or established state!) could switch from free to slave with a single vote. Many of those Northerners had previously been content to accept slavery in the South, so long as it never touched them directly; now they began to worry that someday they would have to confront the issue in their own communities. This caused the first major push that drove many of these previously apathetic citizens into supporting abolition. Anger at Douglas flared up across the North; as he himself put it, “I could travel from Boston to Chicago by the light of my own effigy.” (Source)

A Friend Back Home [...]

Because it requires such concentration, the process of taking notes itself can be distracting. Dr. Kiewra recalled that when he was still a student, one of his professors banned note-taking in class because he wanted students to pay full attention to the lesson. The teacher instead supplied prepared notes for the entire class.

Nonetheless, Dr. Kiewra recalled that he continued taking his own notes, cradling his head in his arms to shield his notebook as he wrote. One day, however, the professor caught him in the act.

“Mr. Kiewra, are you taking notes in my classroom?” he demanded. The flustered student dissembled. “I’m only writing a letter to a friend back home.”

“Oh thank goodness,” the professor said. “I thought you were taking notes.”

Note Taking Speed [...]

Generally, people who take class notes on a laptop do take more notes and can more easily keep up with the pace of a lecture than people scribbling with a pen or pencil, researchers have found. College students typically type lecture notes at a rate of about 33 words a minute. People trying to write it down manage about 22 words a minute.

Conceal to Control [...]

In talking about the Panama Papers, the author makes an interesting point — revelations are meant to destroy; concealment, however, is meant to control. What if the Panama Papers were a warning shot for people not yet mentioned?

In sum, my thinking is that this could have been a Russian intelligence operation, which orchestrated a high-profile leak and established total credibility by “implicating” (not really implicating) Russia and keeping the source hidden. Some documents would be used for anti-corruption campaigns in a few countries—topple some minor regimes, destroy a few careers and fortunes. By then blackmailing the real targets in the United States and elsewhere (individuals not in the current leak), the Russian puppet masters get “kontrol” and influence. 

If the Russians are behind the Panama Papers, we know two things and both come back to Putin personally: First, it is an operation run by RFM, which means it’s run by Putin; second, it’s ultimately about blackmail. That means the real story lies in the information being concealed, not revealed. You reveal secrets in order to destroy; conceal in order to control. Putin is not a destroyer. He’s a controller. (Source)

Learning Is Not Fun [...]

Learning is not always fun, and maybe it shouldn’t be.

Firstly, the premise is all wrong. Anybody who plays video games knows that they are not FUN. They are always engaging, but they are also often anxiety provoking, sometimes frustrating, occasionally anger-inducing. Secondly, the conclusion is absurd: learning is not, nor should it always be FUN. Learning is hard and learning can sometimes be excruciatingly painful. It forces you to sever ties to existing ways of understanding the world and replace them with more articulate ones. Almost three millennia ago, when Plato described the process of education in The Republic, he used the metaphor of walking out of a dark cave and staring into the sun. At first, the light burns your eyes, but slowly you adjust to a new way of seeing. There’s nothing FUN about that.

The truth is that games and digital interactive learning platforms can help students become as passionate about learning traditional academic content as they are about learning to play Assassin’s Creed. All of the ways that humans make sense of the world—poetry, literature, math, science, engineering, history, etc.—are systems. And playing games is essentially an immersive process through which we learn to navigate complex systems. In other words, learning is already a game, but learning is not fun. (Source)

Addiction Is a Form of Learning

Civil Wrongs [...]

Harlem business leaders supported stricter law enforcement and harsher punishments for criminals. In 1973, nearly three-quarters of blacks and Puerto Ricans favored life sentences for drug pushers, and the Rev. Oberia Dempsey, a Harlem pastor, said: “Take the junkies off the streets and put ’em in camps,” and added, “we’ve got to end this terror and restore New York to decent people. Instead of fighting all the time for civil rights we should be fighting civil wrongs.” (Source)

NAACP Citizens’ Mobilization Against Crime [...]

At the same time, many African Americans, the social group most adversely affected by crime and the drug trade, supported Rockefeller’s anti-drug efforts. Since the late 1960s, many black activists pushed the state to take a tougher stand against lawlessness in their communities. African Americans wanted the state to fulfill its responsibility and provide protection. Black residents wanted to ‘escape the reign of criminal terror’ (New York Times, 1969a). In the late 1960s, for example, the NAACP Citizens’ Mobilization Against Crime advocated stronger law enforcement presence in black neighborhoods and lobbied Governor Rockefeller for stiffer penalties against violent offenders. In ‘Harlem Likened to the Wild West’, the New York Times (1969a) reported that African-American activists sent Governor Rockefeller and the New York State Legislature telegrams supporting increased police presence and minimum prison terms, including five years for muggers. (Source)

Harlem Leader’s Reaction [...]

To impress an increasingly conservative Republican Party with his stance on crime, Gov. Nelson A. Rockefeller of New York used the deep-seated fears and earnest pleas of African-Americans to frame and justify his tough antidrug proposals. Black leaders wrote editorials, appeared on television, testified before legislative committees and stood with the governor at a news conference in support. The black activist Glester Hinds said: “I don’t think the governor went far enough.” He called for “capital punishment,” saying that drug dealers “need to be gotten rid of completely.”

Facebook’s Listening Is Just Shazam [...]

Facebook can get pretty creepy, but its “listening” feature is at present pretty mundane. From Snopes:

Myth: The feature listens to and stores your conversations.

Fact: Nope, no matter how interesting your conversation, this feature does not store sound or recordings. Facebook isn’t listening to or storing your conversations.

Here’s how it works: if you choose to turn the feature on, when you write a status update, the app converts any sound into an audio fingerprint on your phone. This fingerprint is sent to our servers to try and match it against our database of audio and TV fingerprints. By design, we do not store fingerprints from your device for any amount of time. And in any event, the fingerprints can’t be reversed into the original audio because they don’t contain enough information.

Myth: Facebook is always listening using your microphone.

Fact: Nope, if you choose to turn this feature on, it will only use your microphone (for 15 seconds) when you’re actually writing a status update to try and match music and TV.

That’s not to say we shouldn’t be vigilant. Never-ending changes to privacy policies, combined with a surveillance culture in government and business, could lead (and have led) to some pretty awful things. But for the moment the Facebook feature is just a post-based version of Shazam.
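The flow Facebook describes amounts to a hash-lookup protocol: the device derives a compact, irreversible fingerprint locally, and only that fingerprint travels to the server. A toy sketch of that shape follows; the hash here is a crude stand-in, since real acoustic fingerprints (Shazam-style) hash spectrogram peaks and tolerate noisy, partial matches, which an exact byte hash does not.

```python
import hashlib

def fingerprint(audio_bytes):
    """Toy stand-in for an acoustic fingerprint: a short, irreversible hash.

    Real systems hash spectrogram features, not raw bytes; this is
    illustrative only.
    """
    return hashlib.sha256(audio_bytes).hexdigest()[:16]

# Server-side database mapping known fingerprints to song/TV content
database = {fingerprint(b"...game of thrones theme..."): "Game of Thrones"}

def match(snippet):
    """Look up a snippet's fingerprint; the raw audio never leaves the device."""
    return database.get(fingerprint(snippet))

print(match(b"...game of thrones theme..."))  # "Game of Thrones"
print(match(b"private conversation"))          # None: no match, nothing stored
```

The key property the Snopes piece stresses is visible in the sketch: the server only ever sees a short digest, from which the original audio cannot be reconstructed.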

Singularity [...]

The idea was formally described as the “Singularity” in 1993 by Vernor Vinge, a computer scientist and science fiction writer, who posited that accelerating technological change would inevitably lead to machine intelligence that would match and then surpass human intelligence. In his original essay, Dr. Vinge suggested that the point in time at which machines attained superhuman intelligence would happen sometime between 2005 and 2030.

Ray Kurzweil, an artificial intelligence researcher, extended the idea in his 2005 book “The Singularity Is Near: When Humans Transcend Biology,” where he argues that machines will outstrip human capabilities in 2045. The idea was popularized in movies such as “Transcendence” and “Her.”

Two Fears [...]

A lot of folks quote the observation, attributed to Margaret Atwood, that men are afraid women will laugh at them, but women are afraid men will kill them. Its provenance is actually unclear, but the earliest version I have found is from Molly Ivins in 1991:

Margaret Atwood, the Canadian novelist, once asked a group of women at a university why they felt threatened by men. The women said they were afraid of being beaten, raped, or killed by men. She then asked a group of men why they felt threatened by women. They said they were afraid women would laugh at them. (Source)

Even this quote is only verifiable through GoodReads. It’d be nice to get visual confirmation.

The fear is not unfounded. See The Men They Know

The Men They Know [...]

When murdered, women are almost universally killed by men they know, and usually by an intimate partner.

Brown’s worries reflected a shocking truth: When women are murdered, they’re likely to be killed by men they know. According to a Violence Policy Center report, 94 percent of the 1,701 women murdered in 2011 were killed by men they knew, and 61 percent of the identified killers were intimate partners. (Source)

The Violence Against Women Act may have worked. See VAWA Success

Sexism in men may be more transmitted via a mother’s behavior than a father’s. See Mothers and Sexist Attitudes

Sadly related: Science’s Sexual Assault Problem

Locating and Evaluating [...]

The bane of many a professor’s existence is having to read shoddy undergraduate work. With research-based assignments in particular, professors need a lot of coffee and aspirin.
A massive new study out yesterday (April 4) reveals that more than half of US faculty members think their undergraduates “have poor skills related to locating and evaluating scholarly information.” Basically, they’re horrible at research. (Source)

1,000 Word Text Editor [...]

What if we had to write as simply as we talk to a six-year-old? What if our emails could be as easy and fun to read as Randall Munroe’s descriptions of the Red World space car and the Upgoer Five? To play with that idea, I made a simple text editor that doesn’t allow words that are not among the 1,000 most common ones. Do I expect myself and other people to only use this editor going forward? No. That would be outright barbarous.
I open sourced my new text editor and put it on Github for anyone to use and modify. Not much happened. Then, a few months later, Malthe wrote a tweet about it.
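The editor’s core check reduces to set membership: tokenize the text and flag any word outside the allowed list. A minimal sketch, with a deliberately truncated word list (the real editor ships the full 1,000 most common English words):

```python
import re

# Truncated stand-in for the editor's real 1,000-word allowed list
ALLOWED = {"what", "if", "we", "had", "to", "write", "as", "simple",
           "talk", "a", "six", "year", "old", "the", "and", "not"}

def disallowed_words(text):
    """Return every word in `text` that is not on the allowed list."""
    words = re.findall(r"[a-z]+", text.lower())
    return [w for w in words if w not in ALLOWED]

print(disallowed_words("What if we had to write as simple as we talk"))  # []
print(disallowed_words("That would be outright barbarous"))  # all flagged
```

An editor built on this would simply highlight whatever `disallowed_words` returns as the user types.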

VAWA Success [...]

After the Violence Against Women Act passed, violence against women fell — though it is hard to tell whether this was due to larger declines in violence overall. More importantly, the act improved the sensitivity and preparedness of police in dealing with these issues.

In the decades after the VAWA was first passed in 1993, the rate of intimate partner violence declined 67 percent, according to statistics from the White House. Furthermore, between 1993 and 2007 homicide rates from intimate partners of women have declined 35 percent. For men killed by their partners, the rate declined 46 percent.

Overall, more victims are reporting domestic and sexual violence, largely because the issue is considered a crime in the public view. Compared with 1994, there are more resources for survivors of domestic violence, police are better equipped to handle such reports, and there are more federal and local laws to prosecute abusers. (Source)

Women are most often killed by The Men They Know

Partisan Sorting [...]

Partisan sorting is an interesting phenomenon. In recent times we’ve increasingly seen voters taking cues from their party on what to believe, rather than parties being constructed from their voters’ views.

As an example, consider economic conservatism and gun control. There’s no reason these two issues need to go together — the only reason the Republican Party espouses both is historical: the coalition of the rich and the rural that it has put together. In previous times we could see this as a coalition, with rural voters continuing to espouse some economically liberal leanings and economic conservatives tolerating, but not agreeing with, anti-gun views.

But that’s not how it works today. In the age of the ideological party, a frame is adopted that ties these two things together — both economic regulation and gun control are Big Government telling you what to do. This intellectual underpinning is of course a fiction in some ways — a retroactively constructed argument to rationalize a pre-existing coalition — but it connects the two issues and allows one to reinforce the other. Ideological consistency then demands that you be passionate about both.

This is true on the Democratic side as well — there’s no reason why someone who supports more limited use of the military should also believe in gender-neutral bathrooms — at least from a purely logical perspective. But your mind is, undoubtedly, already constructing the ideological case. (Note: there are of course emotional dispositions that might connect the two, but this is a different matter).

What to make of this? Two things — first, voters and candidates are becoming more ideologically consistent, and this is driving polarization. But second, and more interestingly, a lot of what we’re seeing right now is a result of this. People see Sanders and Trump as parallels, but honestly Sanders functions far more like Cruz in terms of party dynamics — an ideologically consistent candidate who takes the party line at face value.

Trump is something different — Trump is, perhaps, a post-Cruz and Sanders event in a party’s lifecycle. Trump is the unwinding of the ideological underpinnings for people tired of the partisan sorting. It’s a faction driven by cultural issues (largely around white identity, but not solely) that has gotten tired of the gymnastics of tying that to Right-to-Life, Religious Freedom, and taxation schemes.

Anger Drives Voting and Donations [...]

The fact that anger caused conservative Republicans to believe the country was more polarized is important. In a different study conducted during the 2008 presidential campaign, we found that when Americans perceived other people as polarized in their support for candidates Barack Obama and John McCain, they were more likely to say they would vote. And we recently reported evidence that over the past four decades, the more divided people perceived Democrats and Republicans to be on various topics, the more likely they were to contribute to campaigns, to vote and to get involved in political campaigns.

So, anger causes people to be more polarized, to see more polarization, and, because they see more polarization, to take more political action. (Source)

Partisan Stereotyping [...]

One feature of lifestyle politics is a vast overestimate of how many of a party’s adherents actually fit its stereotypical lifestyles.

We document a consequential and heretofore undiscovered perceptual phenomenon in American politics and public opinion: Americans considerably overestimate the share of party-stereotypical groups in the mass-level parties. For instance, on average, people think that 32% of Democratic supporters are LGBT (6% in reality) and 38% of Republican supporters earn over $250,000 per year (2%). We experimentally demonstrate that these perceptions are genuine and party-specific, and not artifacts of expressive responding, innumeracy, or erroneous perceptions of group base rates. These misperceptions are relatively universal across partisan groups and positively associated with political interest. We experimentally document two political consequences of this perceptual bias: when provided information about the actual share of various party-stereotypical groups in the out-party, partisans come to see supporters of the out-party as less extreme and feel less socially distant from them. Thus, people’s skewed mental images of the parties appear to fuel contemporary pathologies of partisanship. (Source)

Ideologically Defined Parties Have Reduced Capacity to Govern [...]

 Polarization in Congress derives from both sincere ideological differences about policy means and ends and strategic behavior to exploit those differences to win elections.  The combination of high ideological stakes and intense competition for party control of the national government has all but eliminated the incentives for significant bipartisan cooperation on important national problems.  Consequently, polarization has reduced congressional capacity to govern.  Congress has been less productive in legislation, more prone to delays in appropriating funds, and increasingly slow in handling executive and judicial appointments.  While hard to quantify, there is considerable evidence for a decline in the quality of legislative deliberation and legislation.  Of significant concern is the extent to which this reduction in legislative capacity has contributed to a shift in the constitutional balance as it enhanced opportunities for executive and judicial encroachments on legislative prerogatives. (Source)

Gerrymandering is not a Primary Driver of Polarization [...]

Features of our electoral system such as political gerrymandering and partisan primaries are not likely to be important causes of polarization.  That the House and Senate have polarized in tandem suggests that partisan districting cannot be a primary cause and researchers have failed to find much of an incremental contribution.  Similarly, scholars have not identified any substantial impact of the primary system on polarization.  The relationship between our system of private campaign finance and polarization is more complex.  While there is little evidence that the origins of greater polarization lie in campaign finance, the growing participation of ideologically oriented donors appears to have exacerbated the problem. (Source)

Elite Polarization Drives Partisan Sorting [...]

 While significant disagreement persists as to how much voters have polarized by taking increasingly extreme views, there is a consensus that voters are much better sorted within the party system.  Conservative voters are much more likely to identify as Republican and liberals as Democrats than two generations ago.  Moreover, voters’ partisanship increasingly predicts their positions on issues.  Voters are primarily changing their issue positions to match the partisanship rather than switching parties.  Since voters seem to be responding to the positions of their party leaders, the causal arrow seems to run from elite polarization to partisan sorting.  Whether partisan sorting has an additional feedback effect on elite polarization is less clear. (Source)

Polarization is about the Right’s Movement [...]

Political polarization is asymmetrical, with the majority of the movement coming from the right.

The evidence points to a major partisan asymmetry in polarization.  Despite the widespread belief that both parties have moved to the extremes, the movement of the Republican Party to the right accounts for most of the divergence between the two parties.  Since the 1970s, each new cohort of Republican legislators has taken more conservative positions on legislation than the cohorts before them.  That is not true of Democratic legislators.  Any movement to the left by the Democrats can be accounted for by a decline in white representatives from the South and an increase in African-American and Latino representation. (Source)

Polarization Dates to the Mid-1970s [...]

Nolan McCarty points out that the timeline of polarization rules out many explanations for it.

Chart showing increasing polarization dating back to the 1970s

Based on both qualitative and quantitative evidence, the roots of our current polarization go back almost 40 years to the mid-1970s.  Indices of polarization based on roll call voting in Congress have been nearly monotonical in both chambers of Congress since around 1978.  This evidence is primarily important for the explanations of polarization that it rules out.  It casts doubt on explanations focused on more contemporary events such as the Clinton impeachment, the 2000 presidential election, the election of Barack Obama or the emergence of the tea party.  That both chambers have been affected suggests a limited role for explanations based on the institutional differences between the House and the Senate.  The timing is much more consistent with explanations based on large historical trends such as the post-Civil Rights realignment of Southern politics and increased levels of economic and social inequality. (Source)

Another important date: 1966. See The Sanitized Lexicon of 1966

Belief Communities [...]

“Americans … values and basic beliefs are more polarized along partisan lines than at any point in the past 25 years.” (Pew Research)

Pundits bemoan this polarization, and await a leader who’ll cut the Gordian knot of gridlock politics to deliver us to a bi-partisan nirvana. The pundits will have a long wait.


America’s current hyper-partisanship stems from a perfect storm of factors (described below). And it has created Belief Communities — where people who want to believe patently untrue things (e.g., that President Obama was born in Kenya) are never challenged in their beliefs, and may even be encouraged in their fantasies. (Source)

2.5 million votes [...]

Clinton said she has received 2.5 million more votes than Sanders. That claim can’t be completely verified because we don’t have the raw vote totals from the caucus states of Iowa, Nevada, Washington and Maine.

However, given the numbers we do have, it is likely that Clinton’s lead is at least in the neighborhood of 2.4 million votes.

That’s not much of a difference. We rate this claim Mostly True. (Source)

Racism is not Residential [...]

If appropriation turned #staywoke into a game, irony turned it into a punchline. What the competition and the joke share, however, is a belief in the frivolity of careful criticism. When racism is presumed to reside in persons and in objects rather than systems and institutions, the question “is this racist?” is the easiest one to answer and often the only one asked. But racism has had a long life, and keeps outliving those to whose names it is attached.

If hate is residential—dwelling in bodies and in objects—how can we account for the residual pain that lingers once those bodies and objects have disappeared? In her discussion of affective economies, Professor Sara Ahmed (also behind feministkilljoys) writes that hate “does not reside in a given subject or object. Hate is economic; it circulates between signifiers in relationships of difference and displacement.” Hate often exceeds the containers we allow it, seeping out like plasma and expanding to fill the space. The most pernicious racism is unrecognizable precisely because it is that which binds. It’s difficult to name a racist when you yourself are bound up in their racism—either because you are the object of their hate, or because their hate has a history with which you are aligned. (Source)

Woke Olympics [...]

The logic of the game is thus: Hate resides within the subject (Macklemore, Donald Trump, Grandfather), and justice within the copula (he is). To make racism disappear, attribute ignorance to someone else; name him publicly. Now you’re woke. But what masquerades as a feat of anti-racism is really just a poorly devised self-help regime, better designed to confirm the wokeness of its participants than to inspire any awakening. For Woke Olympians, resisting racism is as simple as bearing witness. The games rely on a theory of safety akin to the Department of Homeland Security’s If You See Something, Say Something™—when hate appears, report its apparition. Winning is nothing more than performing a vanishing act, or willing injustice away. If racism is wholly contained within malignant bodies, the easiest way to fight it is to make those people disappear. But this is the laziest kind of magic: sleight of hand that postures as sorcery.

Should competitors wish to test their critical ability, they can take Jezebel’s “How Woke are You?” quiz, whose questions include “How familiar are you with current events and social issues?” “What’s your favorite brand of coffee?” and “Who is bae?” Wokeness, as indicated by the quiz, is easy to gauge: “some people are woke” (here they offer Matt McGorry as an example), “some people aren’t woke” (hyperlinked to a video of Ted Cruz), and “other people don’t even know what woke means!” (with a link back to an oft-cited Yahoo Answers post: “My guess would be that [it means] ‘I stay awake’, but it is not good English.”). Everyone along this woke/non-woke spectrum, including the maker of the quiz, is white, but if a player scores high enough, her wokeness is represented by a stock image of a black man drinking coffee. (Source)