I had friends of mine who attended MIT back in the 1970s tell me they used to call themselves “tools,” which told us something about how they regarded themselves and were regarded. Technologists were clearly bright people whom others used to solve problems or make money. Divorced from any mystical value, their technical innovations, in the words of the French sociologist Marcel Mauss, constituted “a traditional action made effective.” Their skills could be applied to agriculture, metallurgy, commerce, and energy.
In recent years, like Skynet in The Terminator, the tools have achieved consciousness, imbuing themselves with something of a society-altering mission. To a large extent, they have created what the sociologist Alvin Gouldner called “the new class” of highly educated professionals who would remake society. Initially they made life better—making spaceflight possible, creating advanced medical devices and improving communications (the internet); they built machines that were more efficient and created great research tools for both business and individuals. Yet they did not seek to disrupt all industries—such as energy, food, automobiles—that still employed millions of people. They remained “tools” rather than rulers.
With the massive wealth they have now acquired, the tools at the top now aim to dominate those they used to serve. Netflix is gradually undermining Hollywood, just as iTunes essentially murdered the music industry. Uber is wiping out the old order of cabbies, and Google, Facebook, and the social media people are gradually supplanting newspapers. Amazon has already undermined the book industry and is seeking to do the same to apparel, supermarkets, and electronics. (Source)
According to Sturgeon’s Law, 90% of anything is crap. It was named after Theodore Sturgeon, who came to the defence of science fiction in the 1950s. “The claim (or fact) that 90% of science fiction is crap is ultimately uninformative, because science fiction conforms to the same trends of quality as all other artforms,” Sturgeon mused, confirming both the truth and the futility of his theory – ultimately, it comes down to how one defines “crap”. (Source)
Laws around who gets access and maintenance rights to telephone poles are key to preserving cable monopoly power.
Everyone understands that this is not really about poles at all. (AT&T says that cleaning up the Swamp Two process by allowing a single approved contractor to move things around on poles would “put service reliability and public safety at risk.” Which seems unlikely.) Pole shenanigans represent an exercise of raw, entrenched power. The incumbents don’t want fiber available at a cheap price with a city as a wholesale provider. They’d rather keep things as they are — so Tennessee, for example, remains a state in which more than three quarters of households (and two thirds of businesses) get download speeds that don’t meet the FCC’s requirement of 25 megabits per second. Suffering with slow (or no) internet is like being bashed with a big stick of wood.
Nashville is not alone in being pole-axed. AT&T filed a federal lawsuit over a similar Swamp Two city ordinance in Louisville; Connecticut has been stymied in rationalizing its pole regime; Google has had trouble with poles in Silicon Valley; a big fiber effort in Kentucky has run into a buzzsaw of pole issues; and any city thinking about alternative fiber has to now assume it will face a lawsuit if it tries to address its Swamp Two problems. (Source)
In 1934, a freshman matriculating at Dartmouth would spend $1,050 on tuition, room and board, and “incidentals.” The expected real cost of a year at Dartmouth, after fraternity fees and other expected expenses were accounted, was $1,700—by TIME’s reporting, “highest in the land.” (That’s about $30,500 in today’s dollars.) That year the U.S. Office of Education surveyed the nation’s colleges about the cost of attendance and found that the average cost for one academic year was $630 ($11,300 today). (Source)
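The “today’s dollars” conversions are simple price-level arithmetic. A minimal sketch, assuming rough annual-average CPI values (about 13.4 for 1934 and about 240 for recent years; the exact values and the function name are my own, not from the article):

```python
# Rough CPI-deflator arithmetic behind the "today's dollars" figures.
# Assumed approximate annual-average CPI values: 1934 ~ 13.4, recent ~ 240.
CPI_1934, CPI_NOW = 13.4, 240.0

def to_todays_dollars(amount_1934):
    """Scale a 1934 dollar amount by the ratio of price levels."""
    return amount_1934 * CPI_NOW / CPI_1934

print(round(to_todays_dollars(1700)))  # lands near the ~$30,500 figure cited
print(round(to_todays_dollars(630)))   # lands near the ~$11,300 figure cited
```

With these assumed CPI values, the $1,700 Dartmouth cost and the $630 national average come out close to the article’s $30,500 and $11,300, which suggests the piece used a conversion of roughly this form.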
As John D. Rockefeller Jr. explained it in 1927, there was once a time when it made sense for society not to expect students to pay much for college: most of the students were going into the ministry, or into some other low-paying but society-benefiting career, so it behooved the nation to keep costs low by supplementing funds with endowments and gifts from men like Rockefeller himself. That had changed by the early 20th century, when more men (and a few women) were going to college, many of them in preparation for their future high-earning careers, or simply because it was becoming more normal. Why, the reasoning went, shouldn’t they pay more? (Source)
If we accept that reputation is a natural outgrowth of social interaction (community model) and as mediated via reputation mechanisms such as points systems and star ratings (emergent model), it is apparent that there is some scope for policy, particularly in large institutions, to ensure appropriate mechanisms are in place to recognise OEP activities. What we are proposing here could be classed as a “commons thinking” approach (see Kenrick, 2009): drawing together the university’s wider role in stewardship of knowledge creation and the academic discourse which facilitates it, through recognising OEP and the value of workload considerations (which currently appear to be one of the biggest barriers for staff), in order to foster a culture of collaboration. Creating the space for mass OEP engagement through top-down policy will support the bottom-up formation of CoPs, and a collective OEP effort, and, in turn, more successful and sustainable outcomes. (Source)
Mismatches between our taste and the taste of others cause cognitive dissonance and anxiety, which many resolve by modifying their taste.
Liking of songs, based on teens’ initial ratings, was strongly related to activity in the head of the caudate nucleus. Although the exact nature of caudate activity remains a source of debate, most agree that activity in this area is highly related to reward and valuation. So it appears to reflect pleasure, not familiarity. The researchers also observed activity in a number of regions that have been previously associated with the pleasurable aspects of listening to music.
When adolescents changed their ratings, according to their brain activity, it had nothing to do with increased liking of the music. Instead, a very different picture emerged. The network of regions associated with changing a rating included bilateral insula, the anterior cingulate cortex and the supplementary motor cortex and frontal poles — regions previously associated with anxiety and pain. These regions all showed increased activity when teens were shown a popularity rating that did not match their own, meaning that they had this neural response prior to changing their own rating. Interestingly, individuals who demonstrated the greatest sensitivity to popularity, as determined by survey measures taken at the beginning of the study, manifested the strongest insula activity during the act of conforming. Simply put, the greater the insula activity, the higher the odds of conforming. The authors suggest that this pattern of activity can be explained by cognitive or emotional dissonance caused by the mismatch between one’s ratings and the ratings of others. (Source)
A related effect is the Recommender’s Paradox
Social media leads to being Surprised by Disagreement
More CC students means more university students down the road. At the same time, limited reductions in CC cost do not seem to negatively affect university enrollment.
Community college enrollment in the first year after high school increased by 5.1 percentage points for each $1,000 decrease in tuition, which implies an elasticity of -0.29. Lower tuition also increased transfers to universities. (Source)
My second kvetch with ‘self-paced’ is that it is operationally inaccurate. For students enrolled in CBE courses and programs that are credit bearing and eligible for financial aid — CBE is NOT a ‘self-paced’ option or experience.
While almost every CBE model is designed to give the student significant latitude and agency over their pace — the control does not rest unilaterally with the learner. It is a negotiated flexibility, with milestones, deadlines and absolute limits, arbitrated between course designers, faculty, learners — and regulators — to ensure there is ‘satisfactory academic progress’ to maintain financial aid eligibility.
In short, the variability of pace in CBE is not without boundaries. And the limits are not, ultimately, set by the learners themselves. Even without the regulatory requirement to ensure the student’s pace is sufficiently productive — experience tells us that when learners are left to their own initiative to engage with and progress through a course of study, most tend to lapse, languish and leave. Most recently the completion rates of MOOCs have provided a stark demonstration of this syndrome. (Source)
“We’ve seen very aggressive efforts to mischaracterize and mislead people about alternatives,” says Ethan Senack, of Student Public Interest Research Groups (PIRGs), who points to an embarrassingly unsuccessful #usedtextbookproblems Twitter campaign launched last year by McGraw-Hill. The half-baked attempt to goad college students into sharing pictures of the supposed woes of using secondhand course material immediately backfired. “I don’t have any #usedtextbookproblems but some douchebag charged me $200 for a new textbook (you were that douchebag),” one commenter groused. A mother wrote, “#usedtextbookproblems Thank goodness for used textbooks! With 3 kids in college at the same time how can we possibly afford new books??!” (Source)
All is not lost, however. The push toward open educational resources, or OER—textbooks that aren’t bogged down with steep licensing fees, marketing costs, and production expenses—is gaining momentum, and Massachusetts has led the charge. Consider UMass Amherst’s Open Education Initiative. Launched in 2011, the program incentivizes professors to use low-cost materials in their courses. In the first few years, the program has saved students $1.3 million.
Similar programs are catching on around the country. “Almost every school is under pressure to improve the affordability of their course material,” says Geoffrey Willison, CEO of Boston-based Valore, a company that created a digital marketplace for used textbooks and last year acquired Boundless, a fledgling OER publisher. “It’s on all of us to create a more sustainable model.” (Source)
Other higher education institutions in Pierce County also are using digital textbooks for some courses, among them Tacoma Community College, University of Washington Tacoma and Bates Technical College. But Pierce is out front in offering entire degrees.
Pierce faculty make use of the Open Course Library, a project funded by the state and the Bill and Melinda Gates Foundation. It went live in 2013 with textbooks, syllabuses, additional reading materials and assessments for 81 of the state’s most highly-enrolled courses. It is available to students at no or low cost.
Mark Jenkins, director of eLearning for the Washington State Board of Community and Technical Colleges, said the state’s early adoption of digital textbooks has made students and faculty receptive to using them.
“We don’t have to go out and beg people to be a part of (open textbooks),” he said. “That’s a level of maturity very few states have.” (Source)
It’s a movement that has taken hold at Pierce College, whose Joint Base Lewis-McChord campus in 2015 became the third community college in the country to offer a two-year degree students can complete without ever having to purchase a textbook. By Pierce College’s own estimate, the average student at one of its three campuses spends $1,100 a year on textbooks these days.
“When I heard about (open textbooks), I thought, ‘This makes perfect sense,’ ” said Denise Yochum, president of the Pierce College system. “This is the work we’ve been trying to do for a really long time.”
Now, with a $100,000 grant from Achieving the Dream, a national community college reform network, Pierce College will add a second open textbook degree pathway, this one in prenursing. The program will be ready for enrollment starting in winter 2017 and available at all three of the college’s campuses. (Source)
Earlier this year, Facebook denied criticisms that its Trending feature was surfacing news stories that were biased against conservatives. But in an abrupt reversal, the company fired all the human editors for Trending on Friday afternoon, replacing them with an algorithm that promotes stories based entirely on what Facebook users are talking about. Within 72 hours, according to the Washington Post, the top story on Trending was about how Fox News icon Megyn Kelly was a pro-Clinton “traitor” who had been fired (she wasn’t).
The original accusations of bias came from a disgruntled ex-editor at Facebook, who leaked internal Trending training materials to Gizmodo. The training package offered tips on, among other things, how to curate news from an RSS feed of reputable sources when the stories provided by Facebook users were false or repetitive. Though the human editors were always expendable—they were mostly there to train the Trending algorithm—they were still engaging in quality control to weed out blatant falsehoods and non-news like #lunch. And after Trending latched on to the fake Kelly scoop, it appears that human intervention might still be required to make Facebook’s algorithms a legitimate source of news after all. (Source)
To begin to answer that question, one must trace the disparate histories of predictive policing’s component parts through a series of crises and conjunctions. Actuarial techniques like Northpointe’s (or the older Level of Service Inventory–Revised, another recidivism-risk-assessment battery) emerge out of insurance companies’ demand for risk management during the late 19th and early 20th centuries’ chronic economic crises.
Two more pieces of the puzzle, biometrics and organized surveillance, emerge in the 18th and 19th centuries out of the shifting tactics for maintaining white supremacy in both southern slave plantations and northern cities. Simone Browne, for example, has shown that New York’s colonial “lantern laws,” which forbade unaccompanied black people from walking the streets at night without carrying a lit lantern, were originally instituted because of white fear of antislavery insurrection.
And lastly, statistical techniques of crime prediction come down to us through the early-20th century Chicago School of sociology, which swapped cruder theories of physically inherent racial difference for more refined spatio-cultural theories of industrial capitalist “social disorganization.” These shored up sexuality and the color line as the key arbiters of cultural degradation, as in studies positing a “culture of poverty” that generates criminality. This is Roderick Ferguson’s point in Aberrations in Black when he argues that “the Chicago School’s construction of African American neighborhoods as outside heteropatriarchal normalization underwrote municipal government’s regulation of the South Side, making African American neighborhoods the point at which both a will to knowledge and a will to exclude intersected.” (Source)
When we automate the status quo, we are automating white supremacy, whether we are “woke” or not.
But the problem with predictive policing goes beyond Northpointe or biased algorithms. Focusing on the algorithms relies on a delimited analysis of how power works: If only we could have woke programmers, then we would have woke systems. Swap out “programmers” for “cops” and you have a version of the “few bad apples” theory of policing, which ignores the way in which violence and repression are inherent and structural within law enforcement. The problem with predictive policing algorithms, and the fantasy of smart government it animates, is not that they can “become” racist, but that they were built on a law-enforcement strategy that was racist all along. (Source)
How much data anonymization is good enough? The problem is harder than you think
Consider this classic puzzle, called The Dot-Town Suicides:
Each resident of Dot-town carries a red or blue dot on his (or her) forehead, but if he ever figures out what color it is he kills himself. Each day the residents gather; one day a stranger comes and tells them something—anything—non-trivial about the number of blue dots. Prove that eventually every resident kills himself.
“Non-trivial” means here that there is some number of blue dots for which the statement would not have been true. Thus we have a frighteningly general version of classical problems involving knowledge about knowledge. (Source)
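The induction can be made concrete with a brute-force possible-worlds simulation (a sketch of my own, not part of the original puzzle; `dot_town_deaths` is an invented name). Common knowledge is modeled as the set of colorings consistent with the stranger’s statement and the observed deaths so far:

```python
from itertools import product

def dot_town_deaths(actual, statement):
    """Simulate Dot-Town. `actual` is the true coloring, e.g. ('B','B','R');
    `statement` is the stranger's non-trivial claim about the blue count,
    e.g. lambda b: b >= 1. Returns {resident index: day of death}."""
    n = len(actual)
    # Common knowledge starts as: all colorings consistent with the claim.
    worlds = [w for w in product('BR', repeat=n) if statement(w.count('B'))]
    death_day, day = {}, 0

    def doomed(w):
        # In world w, a surviving resident dies tonight iff every world
        # still possible that matches what they see (everyone else's dots)
        # assigns them the same color -- i.e. they have deduced their dot.
        out = set()
        for i in range(n):
            if i in death_day:
                continue
            own = {v[i] for v in worlds
                   if all(v[j] == w[j] for j in range(n) if j != i)}
            if len(own) == 1:
                out.add(i)
        return out

    while len(death_day) < n:
        day += 1
        tonight = doomed(actual)
        # Everyone sees who died; worlds predicting a different death
        # pattern are eliminated from common knowledge.
        worlds = [w for w in worlds if doomed(w) == tonight]
        for i in tonight:
            death_day[i] = day
    return death_day

# With the statement "at least one blue dot" and b blue dots, the blues
# deduce on day b and the reds the day after.
print(dot_town_deaths(('B', 'B', 'R'), lambda b: b >= 1))
```

With two blue dots and one red, both blues die on day 2 and the red on day 3, matching the classic blue-eyed-islanders special case; the loop terminates only because the statement is non-trivial, exactly as the theorem requires.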
“City” is a monumental architectonic work, with dimensions comparable to those of the National Mall, in Washington, D.C., and a layout informed by pre-Columbian ritual cities like Teotihuacan. Heizer started it in 1972, when he was in his late twenties and had already established himself as an instigator of the earthworks movement, a group of artists, including Robert Smithson and Walter De Maria, who made totemic outdoor sculptures, often in the majestic wastelands of the American West. “City” is made almost entirely from rocks, sand, and concrete that Heizer has mined and mixed on site. The use of valueless materials is strategic, a hedge against what he sees as inevitable future social unrest. “My good friend Richard Serra is building out of military-grade steel,” he says. “That stuff will all get melted down. Why do I think that? Incans, Olmecs, Aztecs—their finest works of art were all pillaged, razed, broken apart, and their gold was melted down. When they come out here to fuck my ‘City’ sculpture up, they’ll realize it takes more energy to wreck it than it’s worth.” (Source)
John Locke asserted that continuity of memory was what defined the me-ness of each of us, our sense of identity. A 1980s thought experiment by Derek Parfit puts an interesting twist on that and causes us to question whether this common-sense notion is sufficient. (Source)
The first question imagines a simple teleporter:
“Suppose that you enter a cubicle in which, when you press a button, a scanner records the states of all the cells in your brain and body, destroying both while doing so. This information is then transmitted at the speed of light to some other planet, where a replicator produces a perfect organic copy of you. Since the brain of your Replica is exactly like yours, it will seem to remember living your life up to the moment when you pressed the button, its character will be just like yours, and it will be in every other way psychologically continuous with you.”
The question is whether you’d press the button, and as Parfit observes elsewhere, the effect is in some ways not terribly different from waking up from a sleep.
If you manage to press the button, it gets more and more complex from there. We can imagine the machine messes up and doesn’t vaporize you at the source teleporter. The duplicate calls up the authorities and asks to have you vaporized immediately as there’s been a mistake with the teleporter. How do you feel about that? And who is the “you” here anyway?
Parfit draws some conclusions from this that are controversial, and from which some say he has retreated in recent years. (Link)
The problem is also dealt with on The Big Bang Theory
For a long time, physicists assumed quantum teleportation wasn’t possible. In order to teleport an object, like our pig lizard, we must scan it to obtain precise information about its atomic structure. However, the more accurately an object is scanned, the more it is disturbed by the process of being scanned. We can’t measure a particle without altering it in some way, never mind every single subatomic particle that makes up a full-sized pig lizard. So how could we extract all the information we would need to create an exact copy in another location via teleportation?
In 1993, an IBM physicist named Charles Bennett and his colleagues figured out a way to work around this fundamental limitation using quantum entanglement, a strange connection between particles that even Einstein called “spooky.” Their method involves three particles: the particle to be teleported (A) and an entangled pair of other particles (B and C). First, B and C are entangled and sent to separate locations. B then interacts with A, and A’s information is transferred to B. Since B is still entangled with C, any information transferred to B is also transferred automatically to C without any need to transmit that information across physical space-time. C essentially turns into A, in the new location. (Source)
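The three-particle protocol described above can be sketched as a small state-vector simulation (a hedged illustration of the standard textbook circuit, not Bennett’s own notation; `teleport` is an invented helper). A CNOT plus a Hadamard on (A, B) implements the Bell measurement, and the two resulting classical bits select Pauli corrections on C:

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)   # Pauli-X (bit flip)
Z = np.array([[1, 0], [0, -1]], dtype=complex)  # Pauli-Z (phase flip)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

def teleport(alpha, beta, rng=None):
    """Teleport the qubit alpha|0> + beta|1> from A to C.
    Qubit order is (A, B, C); basis index = 4a + 2b + c."""
    rng = rng or np.random.default_rng()
    psi_A = np.array([alpha, beta], dtype=complex)
    bell_BC = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)  # (|00>+|11>)/sqrt2
    state = np.kron(psi_A, bell_BC)  # 8-dim joint state of (A, B, C)

    # Bell measurement on (A, B): CNOT(A->B), Hadamard on A, then measure.
    P0, P1 = np.diag([1, 0]).astype(complex), np.diag([0, 1]).astype(complex)
    cnot_AB = np.kron(P0, I2) + np.kron(P1, X)
    state = np.kron(cnot_AB, I2) @ state
    state = np.kron(np.kron(H, I2), I2) @ state

    # Sample outcomes (a, b); these two classical bits are all that is sent.
    marginal = (np.abs(state) ** 2).reshape(2, 2, 2).sum(axis=2)
    a, b = divmod(rng.choice(4, p=marginal.ravel() / marginal.sum()), 2)

    # Collapse to C's post-measurement state and apply the corrections.
    psi_C = state.reshape(2, 2, 2)[a, b, :]
    psi_C = psi_C / np.linalg.norm(psi_C)
    if b: psi_C = X @ psi_C
    if a: psi_C = Z @ psi_C
    return psi_C  # equals (alpha, beta) for every measurement outcome
```

Note what the sketch makes visible: the state never crosses space as information about A; only the two classical measurement bits do, and C ends up as the original state regardless of which of the four outcomes occurs, while A is destroyed by the measurement.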
So if you now number yourself among the disenchanted, then you have no choice but to accept things as they are, or to seriously seek something else. But beware of looking for goals: look for a way of life. Decide how you want to live and then see what you can do to make a living WITHIN that way of life. But you say, “I don’t know where to look; I don’t know what to look for.”
And there’s the crux. Is it worth giving up what I have to look for something better? I don’t know — is it? Who can make that decision but you? But even by DECIDING TO LOOK, you go a long way toward making the choice. (Source)
Tanden knows from normal. Unlike most of Clinton’s top people, she comes from a childhood of hardship and turmoil unusual in the rarefied upper levels of Washington politics. When she was in elementary school, her father — who dabbled in real estate — sold the family home in a Boston suburb and skipped out without giving Tanden, her stunned mother and older brother any warning. Her mother, a tough but untested immigrant from Andhra Pradesh, took control — signing up for food stamps, welfare and Section 8 housing while taking a job in a local travel agency — so her kids could remain in Bedford’s top-shelf schools.
“I remember being in the lunch line, and I was the only kid using the voucher back then for food stamps or reduced lunch,” Tanden recalled. “I paid 10 cents; everyone was paying like $1.50. And I remember being at the Purity Supreme, which is our supermarket, and my mom was using the food stamps, and everyone else was paying with cash. And I asked her like, ‘Why do we have to use the funny money?’”
Her mother, she added, “is very strong, a strong-willed person… she was incredibly resilient.” (Source)
Here & Now’s Meghna Chakrabarti finds out about the approach from Linda Williams, who teaches a zero-textbook business administration course at Tidewater Community College in Virginia. (Source)
Cheating was relatively stable as a student behavior between 1964 and 1997, but the type of cheating (from isolated incidents to collaborative efforts and from implicit to explicit) may have shifted.
It’s notable that this shift occurred before the impact of the internet.
Understanding student cheating is particularly important given trends that show cheating is widespread and on the rise. In 1964, Bill Bowers published the first large-scale study of cheating in institutions of higher learning. Bowers surveyed more than 5,000 students in a diverse sample of 99 U.S. colleges and universities and found that three-fourths of the respondents had engaged in one or more incidents of academic dishonesty. This study was replicated some 30 years later by McCabe and Treviño (1997) at 9 of the schools that had participated in Bowers’s original survey. Although McCabe and Treviño observed only a modest increase in overall cheating, significant increases were found in the most explicit forms of test or exam cheating. (Source)
McCabe and Hanson agree that while students at all levels resort to cheating, it’s those at the top and at the bottom who tend to cheat more.
“The top’s cheating to thrive, the bottom’s cheating to survive,” McCabe says, “and those in the middle are content with their grades and just go along in life and are happy.” (Source)
Students are less likely to cheat in upper-level classes than in broad prerequisites.
It should come as no surprise that students are less likely to cheat in upper-level classes in their majors: These tend to be smaller, more personal, more focused on the students’ own interests, and in general far more intrinsically motivating to students. But we welcome students to campus with required (and often overstuffed) classes that are designed to foster extrinsic motivation—and that nudge them toward academic dishonesty from the beginning. (Source)
Structure of education encourages cheating.
At most American universities, it’s traditional to begin the educational process with a type of class that sets up those motivations exactly wrong. Consider your typical introductory college lecture course. A student enters Major University and enrolls in History of Western Civilization. She is told this class is a requirement she must fulfill before she can take the upper-level courses she wants to take in her major. Because of the large size of the class, it consists of weekly lectures, in which the professor covers the key events and trends of a thousand years of western civilization. Grades will be determined by three exams. The professor warns students that those exams are difficult, and that only the best will earn the coveted As.
For a student who loves history already, this class may work just fine. But for the average student in the lecture hall, a class like this one swims in extrinsic motivators. If you pass this class, you can take the classes you really want to take. If you find a way to do well on just three tests, you will earn the ultimate extrinsic motivator: a good grade. And if you earn a good enough grade, you will have the privilege of being considered one of “the best.” (Source)
Trump, the King of Shame, has covertly come to the rescue. He has shamed virtually every line-cutting group in the Deep Story—women, people of color, the disabled, immigrants, refugees. But he’s hardly uttered a single bad word about unemployment insurance, food stamps, or Medicaid, or what the tea party calls “big government handouts,” for anyone—including blue-collar white men.
In this feint, Trump solves a white male problem of pride. Benefits? If you need them, okay. He masculinizes it. You can be “high energy” macho—and yet may need to apply for a government benefit. As one auto mechanic told me, “Why not? Trump’s for that. If you use food stamps because you’re working a low-wage job, you don’t want someone looking down their nose at you.” A lady at an after-church lunch said, “If you have a young dad who’s working full time but can’t make it, if you’re an American-born worker, can’t make it, and not having a slew of kids, okay. For any conservative, that is fine.”
But in another stroke, Trump adds a key proviso: restrict government help to real Americans. White men are counted in, but undocumented Mexicans and Muslims and Syrian refugees are out. Thus, Trump offers the blue-collar white men relief from a taker’s shame: If you make America great again, how can you not be proud? Trump has put on his blue-collar cap, pumped his fist in the air, and left mainstream Republicans helpless. Not only does he speak to the white working class’ grievances; as they see it, he has finally stopped their story from being politically suppressed. We may never know if Trump has done this intentionally or instinctively, but in any case he’s created a movement much like the anti-immigrant but pro-welfare-state right-wing populism on the rise in Europe. For these are all based on variations of the same Deep Story of personal protectionism. (Source)
But something else seemed at play. Many blue-collar white men now face the same grim economic fate long endured by blacks. With jobs lost to automation or offshored to China, they have less security, lower wages, reduced benefits, more erratic work, and fewer jobs with full-time hours than before. Having been recruited to cheer on the contraction of government benefits and services—a trend that is particularly pronounced in Louisiana—many are unable to make ends meet without them. In Coming Apart: The State of White America, conservative political scientist Charles Murray traces the fate of working-age whites between 1960 and 2010. He compares the top 20 percent of them—those who have at least a bachelor’s degree and are employed as managers or professionals—with the bottom 30 percent, those who never graduated from college and are employed in blue-collar or low-level white-collar jobs. In 1960, the personal lives of the two groups were quite similar. Most were married and stayed married, went to church, worked full time (if they were men), joined community groups, and lived with their children.
A half-century later, the 2010 top looked much like their counterparts in 1960. But for the bottom 30 percent, family life had drastically changed. While more than 90 percent of children of blue-collar families lived with both parents in 1960, by 2010, 22 percent did not. Lower-class whites were also less likely to attend church, trust their neighbors, or say they were happy. White men worked shorter hours, and those who were unemployed tended to pass up the low-wage jobs available to them. Another study found that in 2005, men with low levels of education did two things substantially more than both their counterparts in 1985 and their better-educated contemporaries: They slept longer and watched more television. (Source)
Obama-as-Muslim may be something like the Roman belief in religion. To believers it’s neither true nor untrue in the traditional sense. Rather, the belief helps capture a “feeling” that believers identify with.
The most widespread of these suspicions, of course—shared by 66 percent of Trump supporters—is that Obama is Muslim.
What the people I interviewed were drawn to was not necessarily the particulars of these theories. It was the deep story underlying them—an account of life as it feels to them. Some such account underlies all beliefs, right or left, I think. The deep story of the right goes like this:
You are patiently standing in the middle of a long line stretching toward the horizon, where the American Dream awaits. But as you wait, you see people cutting in line ahead of you. Many of these line-cutters are black—beneficiaries of affirmative action or welfare. Some are career-driven women pushing into jobs they never had before. Then you see immigrants, Mexicans, Somalis, the Syrian refugees yet to come. As you wait in this unmoving line, you’re being asked to feel sorry for them all. You have a good heart. But who is deciding who you should feel compassion for? Then you see President Barack Hussein Obama waving the line-cutters forward. He’s on their side. In fact, isn’t he a line-cutter too? How did this fatherless black guy pay for Harvard? As you wait your turn, Obama is using the money in your pocket to help the line-cutters. He and his liberal backers have removed the shame from taking. The government has become an instrument for redistributing your money to the undeserving. It’s not your government anymore; it’s theirs.
I checked this distillation with those I interviewed to see if this version of the deep story rang true. Some altered it a bit (“the line-waiters form a new line”) or emphasized a particular point (those in back are paying for the line-cutters). But all of them agreed it was their story. One man said, “I live your analogy.” Another said, “You read my mind.” (Source)
The later, more successful communes, he said, were a result of lessons learned in the early movement: “that there has to be some leadership and decision-making, some control of membership, that you can’t sell drugs to people in town, go skinny-dipping in the town pond and offend your neighbors.”
Then, all these years later, Mr. Houriet’s eyes filled with tears and his voice choked up. “There was a brief, shining moment when we knew it could work,” he said, scanning the panel of his fellow communards. “We knew it could work, but we blew it.” (Source)
Marty Jezer, a co-founder of Packer Corners, blamed the failure of the movement on the profound conflict within the counterculture. “We had a big cider press operation at Packer Corners,” Mr. Jezer recalled, “and I remember being up on top of the press one day, feeding in apples to make cider. A bunch of hippies had come by to help, but instead they were dancing around the press, throwing apples at each other. It wasn’t much help. We had an impractical but noble vision that was constantly undermined by people who came just to play.”
In all, Mr. Houriet concluded, the important lesson of the commune movement was that “open-ended, anarchistic communities didn’t work because of problems with leadership, with land ownership, the role of drugs and booze, plus internal conflicts among the members. There was a lot of trauma involved, and not just from chemicals. The movement opened a Pandora’s box of the liberated self, and the trauma proceeded from the inability of people to deal with themselves.”
Johnson Pastures, one of the better-known Vermont communes, became a “slum” as participation broadened. The first arrivals shared a common philosophy and values; the busloads that arrived later did not.
“The Red Clover Collective was the educated, affluent kids,” Mr. Lieberman said. “The people at the Free Farm were middle-class kids, emulating the Red Clover hippies, and the J. P. was the Ellis Island of the commune movement, drawing people with nowhere to go and nothing else to do.”
In fact, Johnson’s Pastures and its membership policies, or lack thereof, became the epitome of the movement in its extreme. The former landowner, Michael Carpenter, a silent, bearded man who attended the weekend sessions, set an open-door policy, refusing to turn away anyone.
The result, said Mr. Light, was that “during the summer of 1969, somewhere between 800 and 1,000 people passed through the J. P. Lots of them would come in buses. The place became a slum. The class differences were very relevant; the first communards had shared values and education, but it quickly sank to the lowest common denominator — the criminal element. What happened at the J. P. was a colossal failure.” (Source)
The fire at the Johnson Pastures Commune became a symbol of the excesses and issues of communes more generally, at least in Vermont.
The quest for a utopia soon turned into a self-destructive orgy of excess, many participants concluded, culminating symbolically in the fire that razed the big house at the Johnson’s Pastures Commune, by far the largest of the communes. The blaze, on April 16, 1970, killed four people.
Chuck Light, a witness and member of the commune at the time, recalled during a panel discussion that the fire at the commune known as J. P. was precipitated when, after a night of drinking and drugs, someone tipped over a candle.
“I was living in a hovel in the back of a truck at the time,” Mr. Light said. “We heard screaming and came running; people were leaping out of second-story windows. The old wooden house went up in minutes.”
That blaze, he concluded, “became a central symbol of the movement, symbolic of the personal fires and conflicts that were going on around us and among us.” (Source)
The cozy mystery (sometimes simply called a cozy) is a subgenre of crime fiction that gives readers a chance to delight in vicariously solving a murder—without graphic violence or sex. Protagonists are typically amateur (and usually female) sleuths solving small-town crimes with old-fashioned detective work rather than forensics. These unlikely heroes are often small-business owners who find themselves drawn into detection by crimes impacting their work; sometimes their investigative efforts are aided by a significant other with police connections.
Natalee Rosenstein, senior executive editor of Penguin’s Berkley Books, traces the renewed interest in the genre to the early 1990s. “With the breakthrough of Lilian Jackson Braun’s Cat Who series, the market for cozies really opened up,” Rosenstein says. “There was a great untapped market for cozier mysteries that was really not being met.”
Cozies offer readers the kind of escapism that harder-boiled detective stories simply can’t. Marilyn Stasio, who has been the Crime columnist for The New York Times Book Review since the late 1980s, recently wrote: “The abiding appeal of the cozy mystery owes a lot to our collective memory, true or false, of simpler, sweeter times.” And the genre’s resurgence has opened up new opportunities for authors for whom success in other genres has been elusive. (Source)
Martin Freeman Clips
Short note: the “I feel like” hedge that is growing in popularity could be seen as a general trend towards making statements inarguable, or as one Cornell linguist put it, “relativism run amok”.
This is a small example of the (sometimes) false promise of bringing feelings into conversation. Ideally bringing feelings into a conversation should help us better communicate, and it can do that. But basing a conversation on feelings can also cocoon us, and protect us from self-examination. After all, who are you to dispute what I feel?
But of course we do care that what you feel has some relation to reality, as making a better reality is in fact our joint project. I may “feel like” it’s insulting to say that cops shoot black men more than white men, and you may “feel like” it’s obvious that they do; at the end of the day the first step in making a better society is to determine whether they are in fact shooting black men at a higher rate or not. (They are).
It’s probably also useful to investigate why people “feel like” things that are clearly happening are not happening, from a social change perspective. But without some reference to an objective reality, even feelings can’t be analyzed in useful ways. The interpretation of the feeling that “The CIA is out to get me” is going to be very different based on whether the CIA is actually out to get you or not.
The I Feel Like Paradox is that it offers the appearance of acceptance of other opinions while simultaneously putting one’s thoughts out of reach of others.
A bit on LANPAR, predecessor to VisiCalc.
“LANPAR” (LANguage for Programming Arrays at Random) is the world’s first electronic spreadsheet.
Co-invented and developed by Rene Pardo and Remy Landau, this software was created in 1969, and its use was sold to the Plant Budgeting Divisions of Bell Canada, AT&T, the 18 Operating Telephone Companies across the U.S., and Long Lines, in addition to General Motors in Warren, Michigan.
It was invented because Bell Canada and AT&T faced a problem: whenever the 2,000 cells in their budgeting forms changed, the lead time for the MIS groups to rewrite the software in Fortran was six months to two years.
While LANPAR was a predecessor, it was not an influence on VisiCalc. In some ways it was the personal computer that made the workflow assumptions of the spreadsheet make sense.
A note from a writer on a Wikipedia/Librarians meeting. Structured facilitation and discussion dominance.
Andromeda Yelton writes on the hidden bias of Wikipedia’s editing process:
The encyclopedia that anyone can edit is the encyclopedia where everyone has the right to the floor. And there’s a liberation in that — on the internet, no one knows you’re a dog and the things you say matter — but there’s also an oppression, in that it rewards everyone who’s never stopped to think that maybe they’re not the expert. It rewards an investment in being right, but not in noticing the emotional undercurrents of the room, in building relationships over time and ensuring stakeholders are identified and heard, which is very much how well-run libraries tend to operate. It rewards the quick and assertive, whereas I spent today watching participation be more equal and distributed when there was structured moderation, and slide into literally 90% male voices in the last ten minutes of the day, when people were feeling punchy and discussion was totally open. (Source)
WordPerfect, the application that truly launched the word-processing revolution, was a university project.
WordPerfect, the piece of software that turned word processing into big business, came to life much the same way Netscape did—it was based on work originally done at a university, Brigham Young University to be exact. (That’s right, the Mormons brought us word processing.) The app, released to the public in 1980 after a few years of incubation at BYU, differed from WordStar in an important way—it kept the key commands off-screen, putting the focus on the words instead. (Many editors are still trying for this ideal.) Gary North, the libertarian-type who I highlighted in my recent piece about Y2K hoarders, was an active WordPerfect user until 2011, and some lawyers still love it. (Source)
The folks at Xerox PARC invented a lot of stuff in the 70s that would later become incredibly common in modern computers. One of those things was copy-and-paste, which Larry Tesler added to the Gypsy editor. The editor, one of the first “modeless” text editors, also supported the mouse. “Gypsy was the very first thing outsiders in the company understood,” Tesler recalled for Fumbling the Future, a 1988 book about Xerox PARC. It later inspired a lot of word processors. (Source)
The 1975 creation of the Electric Pencil, the first commercially available word processor specifically written for a home computer, was so early to the game that the TRS-80 didn’t support lowercase characters—which meant the manual for the software had to explain how to modify the hardware for the machine to support lower-case text. Jerry Pournelle, a science fiction author and very early blogger, was among the first to write a book using Electric Pencil, authoring Oath of Fealty with his writing partner, Larry Niven. (Source)
IBM “Paperwork Explosion” commercial. Full of electronic MOOG-y goodness.
And when the company needed to promote the first typewriter that relied on reusable storage, the MT/ST (Magnetic Tape/Selectric Typewriter), it worked with a guy who at the time was best known for coffee commercials, and teamed him with another guy who was known for soundtracking Looney Tunes.
That meant Jim Henson was in charge of the vision of selling this paperwork-saving product to the masses—and he was teamed up with Raymond Scott, who was by this time already a living legend.
Together, they came up with something that could have only come out of 1967. Scott stepped up with a minimal, blip-heavy electronic soundtrack, and Henson followed suit with the visuals for “Paperwork Explosion,” a five minute clip that combines apparent stock footage of paper, clocks, space shuttles; Mad Men-era office drones, staring directly in the camera and repeating robotic lines; and soundtrack-emphasized quick cuts. (Source)
Commercial contains line “Machines should work. People should think.”
IBM’s best-known logo, by Rand, was actually its second. Its first, a set of blocky slab-serif letters, eventually went the way of the dodo, but the company stuck with Paul Rand—who eventually gave the logo some new life with a few stripes. “Rand’s series of IBM logos culminated in a 1972 version formed from stacked stripes, suggesting speed and dynamism, which made the company’s initials instantly recognizable worldwide. The ‘8-bar’ logo is still in use today,” according to the company. (Source)
Obama believes “national conversations” harden positions instead of softening them:
And then, finally, I think it’s going to be important for all of us to do some soul-searching. You know, there’s been talk about, Should we convene a conversation on race? I haven’t seen that be particularly productive when, you know, politicians try to organize conversations. They end up being stilted and politicized, and folks are locked into the positions they already have.
On the other hand, in families and churches and workplaces, there’s a possibility that people are a little bit more honest and at least you ask yourself your own questions about, Am I wringing as much bias out of myself as I can? Am I judging people as much as I can based on not the color of their skin, but the content of their character? That would, I think, be an appropriate exercise in the wake of this tragedy. (Source)
National conversations involve a sort of Context Collapse
Although Trump has been seemingly slow to realize it, the more than $2 billion in free media he rode to the GOP nomination was simultaneously hardening the broader country’s negative view of Trump just as it was endearing him to the conservative base. The cascade of Trump-created controversies following the conventions that precipitated Conway’s hiring appear to have irrevocably damaged his credibility as a plausible commander in chief and could prove to be the turning point in the general election itself.
“It was a terribly damaging period,” said Steve Schmidt, the GOP strategist who guided John McCain’s 2008 campaign. “It hit on his trust numbers, his fitness for office — and at a time when [Hillary Clinton]’s had some hard news cycles. In any normal cycle, she’s the de-facto incumbent and these stories would have her on defense; and she’s not on defense, so there’s an opportunity cost to all this.”
More than 60 percent of Americans have an unfavorable opinion of Trump, leaving Clinton, with a 54 percent unfavorable rating, as only the second most unpopular presidential candidate in history. Both candidates, in fact, have held unfavorable ratings above 50 percent since launching their respective campaigns, with Trump hovering around the 60 percent mark, only a few points above Clinton. Asked about a smell they might associate with this election, the participants in a focus group conducted by Peter Hart in Wisconsin this week gave the following responses: “sulfur,” “rotten eggs,” “garbage,” “manure” and a “skunk’s fart.” (Source)
Valium was initially marketed as a non-addictive drug of which it was impossible to take a lethal overdose.
It was initially believed that Valium was not addictive and that it was nearly impossible for a suicidal person to take a lethal dose. After about ten years on the market, Valium had been prescribed to 59.3 million patients. Accounting for 81% of the tranquilizer market in the U.S., Valium was, critics felt, over-marketed by Hoffman-LaRoche. Around 1975, Valium was being abused on the street as an illegal drug. Valium was still considered generally safe, but soon reports of dependency and withdrawal began to surface. Hoffman-LaRoche was accused of failing to adequately warn physicians and patients of the risk of dependency and of ignoring early warnings of serious Valium complications.
Elliot Valenstein sums up the history of the over-prescription of Valium during the 1970s: “…there is no doubt that, as in the Rolling Stones song ‘Mother’s Little Helper,’ far too many women had the habit of ‘running for the shelter’ of the pill that would help them get through their day.” (Source)
There’s a history of candidates concealing their medical risks. After former Massachusetts senator Paul Tsongas won the 1992 Democratic primary in New Hampshire, his personal doctors vouched for his health, implying that Tsongas’ cancer was cured when it was actually incurable. Tsongas would’ve died before his first term was over. (Source)
Much of the blame for the current opioid epidemic can be placed at the feet of the Stamford, Connecticut, company Purdue Pharma and the system that enabled it to thrive. A family-owned business, Purdue “struck gold” after getting FDA approval for OxyContin, a new opioid formulation.
They presented their formulation as a breakthrough, claiming it provided 12 hours of relief, which gave it a different profile than previous opioids, one that could conceivably be less addictive. Unfortunately the claim was not true, and Purdue’s erroneous marketing contention that OxyContin lasted 12 hours (when for many people it lasted only eight) may have been a direct cause of some addictions. See Twelve Hour Problem
They skillfully funded and directed new pain education programs for physicians and authored new pain treatment guidelines with groups such as JCAHO, in campaigns with names like Pain Is The Fifth Vital Sign. These efforts portrayed the concerns over opioid addiction to be overblown, and encouraged doctors to take pain management as a core measure of success. See JCAHO and the Opioid Epidemic, Pain Is The Fifth Vital Sign
The programs worked. In 1995, OxyContin sales were $45 million; by the end of the ’90s they had reached $1.4 billion. See A 2,000 Percent Increase
As signs of addiction, overdose, and illegal prescription practices mounted, Purdue Pharma turned a blind eye to the problem, even redoubling marketing efforts toward physicians who were clearly selling to doctor-shoppers and supplying illegal markets. See OxyContin Ring, Big Data and OxyContin
Many Oxy addicts (perhaps most) eventually found heroin a cheaper substitute and moved to it as a source. In fact, most heroin users today started with painkillers, a shift from older addiction models. See 80% of Heroin Users Started with Painkillers, Route to Heroin Abuse
As a result of this epidemic, we have seen the first reversals in all-cause mortality ever recorded in a major demographic, short of war and AIDS. See Opioids, Alcohol, Suicide
In 2007, Purdue pleaded guilty to misleading regulators, doctors, and others about the addictive effects of OxyContin, and paid a $600 million fine. See Purdue Pleads Guilty
Pill addicts got addicted by doctors, not dealers. See Addicts Used Doctors, Not Dealers
History repeats itself: Roche created similar problems with its marketing of Valium. See False Promise of Valium
In Hyperalgesia, patients who receive narcotics over long periods of time become more sensitive to pain.
The introduction of medical pot laws may have saved lives by reducing opioid addiction. See Medical Pot Laws and Opioid Abuse
The company that makes the narcotic painkiller OxyContin and three current and former executives pleaded guilty today in federal court here to criminal charges that they misled regulators, doctors and patients about the drug’s risk of addiction and its potential to be abused.
To resolve criminal and civil charges related to the drug’s “misbranding,” the parent of Purdue Pharma, the company that markets OxyContin, agreed to pay some $600 million in fines and other payments, one of the largest amounts ever paid by a drug company in such a case.
Also, in a rare move, three executives of Purdue Pharma, including its president and its top lawyer, pleaded guilty today as individuals to misbranding, a criminal violation. They agreed to pay a total of $34.5 million in fines. (Source)
He will almost certainly survive his August 30 primary to face Rep. Ann Kirkpatrick, a top Democratic recruit with a long record of winning tough races, in November. But he has had to endure a pesky primary battle against Kelli Ward, a little-known state senator who once held a public hearing on the conspiracy theory that the government has been deliberately poisoning citizens via “chemtrails.” His job approval rating is among the lowest in the Senate, and Hillary Clinton is neck and neck with Trump in a state that has voted Democratic once since 1948.
McCain, a camera-happy curmudgeon who years ago earned a reputation as a “maverick” fighting for campaign-finance reform, is an unpopular politician running with an unpopular nominee. Ask him if he thinks Trump can handle the nuclear arsenal, and he stammers uncomfortably. But even though Trump has suggested McCain is not a war hero because he was captured in Vietnam and once called him a “dummy” who should be defeated in his primary, McCain has promised to vote for Trump. McCain’s conundrum is easier stated than solved: To win, he needs the party—but his party has gone to the Mad House. (Source)
Interestingly, MTV got upset over Bjork’s animated breasts in the final seconds of this video, and would only play it at night.
Old video footage of Bjork.
BONUS: Exclaim noted that some footage of Björk’s pre-Sugarcubes teenage punk band, Tappi Tíkarrass, or, translated, “Cork the Bitch’s Ass,” had resurfaced this week. And though it seems like this footage has actually been online for quite a while (the YouTube video was posted in 2012), I personally feel that with around 100,000 views, it hasn’t been seen nearly enough:
Here’s a short seven-minute doc on the career of Björk:
Tomorrow’s university graduates will be taking a journey into the professional unknown guided by a single, mind-blowing statistic: 65% of today’s students will be doing jobs that don’t even exist yet.
Technological change, economic turbulence and societal transformation are disrupting old career certainties and it is increasingly difficult to judge which degrees and qualifications will be a passport to a well-paid and fulfilling job in the decades ahead.
A new wave of automation, with the advent of true artificial intelligence, robots and driverless cars, threatens the future of traditional jobs, from truck drivers to lawyers and bankers.
But, by 2025, this same technological revolution will open up inspiring and exciting new career opportunities in sectors that are only in their infancy today. (Source)
Stanford wants to instrument grit.
SCHMUTTE: Technology has tremendous potential to help students become more self-aware and intentional as learners. One of our concepts was something we called the Grit Bit — like a Fitbit, but for learning. Imagine a tool that tracks metrics related to students’ progress: stress and mood levels, physical activity, social activity and physical location. It then helps you make correlations among those factors and how you’re performing toward learning goals. Technology could help you collect a lot of valuable data that would help you be a smarter learner (Source)
The alt-right appeals to gamers because of the white male geek sense that they are the natural rulers of the world, held back by political correctness.
The affinity between gamers and right-wing politics makes sense. “It’s not hard to see why this ideology would catch on with white male geeks,” Klint Finley writes in his excellent explainer on neoreaction. “It tells them that they are the natural rulers of the world, but that they are simultaneously being oppressed by a secret religious order. And the more media attention is paid to workplace inequality, gentrification and the wealth gap, the more their bias is confirmed.” (Source)
See also Red Pill Politics
This is called denominalisation, which is the technical term for converting a noun to a verb. There are two ways to accomplish this conversion. You can either affix the noun with a suffix, like -ify, as in purify or clarify. Or you can do what we’ve been doing, and just steal a thing and do it. The name for the second option is zero derivation – because nothing is changed when the verb is derived from a noun in this way. (Source)
But this hand-wringing is a distraction from the very real problems with artificial intelligence today, which may already be exacerbating inequality in the workplace, at home and in our legal and judicial systems. Sexism, racism and other forms of discrimination are being built into the machine-learning algorithms that underlie the technology behind many “intelligent” systems that shape how we are categorized and advertised to.
Take a small example from last year: Users discovered that Google’s photo app, which applies automatic labels to pictures in digital photo albums, was classifying images of black people as gorillas. Google apologized; it was unintentional.
But similar errors have emerged in Nikon’s camera software, which misread images of Asian people as blinking, and in Hewlett-Packard’s web camera software, which had difficulty recognizing people with dark skin tones. (Source)
Then, of course, there’s the content, which, at a few dozen posts a day, Nicoloff is far too busy to produce himself. “I have two people in the Philippines who post for me,” Nicoloff said, “a husband-and-wife combo.” From 9 a.m. Eastern time to midnight, the contractors scour the internet for viral political stories, many explicitly pro-Trump. If something seems to be going viral elsewhere, it is copied to their site and promoted with an urgent headline. (The Khan story was posted at the end of the shift, near midnight Eastern time, or just before noon in Manila.) The resulting product is raw and frequently jarring, even by the standards of this campaign. “There’s No Way I’ll Send My Kids to Public School to Be Brainwashed by the LGBT Lobby,” read one headline, linking to an essay ripped from Glenn Beck’s The Blaze; “Alert: UN Backs Secret Obama Takeover of Police; Here’s What We Know …,” read another, copied from a site called The Federalist Papers Project. In the end, Nicoloff takes home what he jokingly described as a “doctor’s salary” — in a good month, more than $20,000. (Source)
Readers who clicked through to the story were led to an external website, called Make America Great Today, where they were presented with a brief write-up blended almost seamlessly into a solid wall of fleshy ads. Khan, the story said — between ads for “(1) Odd Trick to ‘Kill’ Herpes Virus for Good” and “22 Tank Tops That Aren’t Covering Anything” — is an agent of the Muslim Brotherhood and a “promoter of Islamic Shariah law.” His late son, the story suggests, could have been a “Muslim martyr” working as a double agent. A credit link beneath the story led to a similar-looking site called Conservative Post, from which the story’s text was pulled verbatim. Conservative Post had apparently sourced its story from a longer post on a right-wing site called Shoebat.com.
Within 24 hours, the post was shared more than 3,500 times, collecting a further 3,000 reactions — thumbs-up likes, frowning emoji, angry emoji — as well as 850 comments, many lengthy and virtually all impassioned. A modest success. Each day, according to Facebook’s analytics, posts from the Make America Great page are seen by 600,000 to 1.7 million people. In July, articles posted to the page, which has about 450,000 followers, were shared, commented on or liked more than four million times, edging out, for example, the Facebook page of USA Today. (Source)
Sanders followers repurposed frequent Sanders email blasts into graphic Facebook memes.
Rafael Rivero is an acquaintance of Provost’s who, with his twin brother, Omar, runs a page called Occupy Democrats, which passed three million followers in June. This accelerating growth is attributed by Rivero, and by nearly every left-leaning page operator I spoke with, not just to interest in the election but especially to one campaign in particular: “Bernie Sanders is the Facebook candidate,” Rivero says. The rise of Occupy Democrats essentially mirrored the rise of Sanders’s primary run. On his page, Rivero started quoting text from Sanders’s frequent email blasts, turning them into Facebook-ready memes with a consistent aesthetic: colors that pop, yellow on black. Rivero says that it’s clear what his audience wants. “I’ve probably made 10,000 graphics, and it’s like running 10,000 focus groups,” he said. (Clinton was and is, of course, widely discussed by Facebook users: According to the company, in the last month 40.8 million people “generated interactions” around the candidate. But Rivero says that in the especially engaged, largely oppositional left-wing-page ecosystem, Clinton’s message and cautious brand didn’t carry.) (Source)
Most retweeters and Facebook reposters aren’t informing, or even arguing. They are using headlines the way one might use a bumper sticker: to express who they are and bond with others.
From a user’s point of view, every share, like or comment is both an act of speech and an accretive piece of a public identity. Maybe some people want to be identified among their networks as news junkies, news curators or as some sort of objective and well-informed reader. Many more people simply want to share specific beliefs, to tell people what they think or, just as important, what they don’t. A newspaper-style story or a dry, matter-of-fact headline is adequate for this purpose. But even better is a headline, or meme, that skips straight to an ideological conclusion or rebuts an argument. (Source)
Much position taking is not substantive, but presentational. See Identity More Than Policy
In Partisan Sorting people first sort into parties based on belief, but then have beliefs molded by party.
Amber Case’s Templated Self is an attempt to understand how UI mediates expression of identity.
Identity Headlines combined with homogeneous groups can create a Filter Bubble
Holy Moly! So more than half of all the people that huddled with Clinton were donors to her family’s foundation? Grab the can of damage-control spray!
Or maybe not. Click through to the actual article and a key qualifier rears its head. The count doesn’t include anyone in the U.S. federal government or representatives of foreign governments. In other words, most of the people with whom Clinton met as secretary of state. The analysis drilled in on “154 people from private interests” who chatted by phone or met with Clinton in person. Eighty-five of them “donated to her family charity or pledged commitments to its international programs,” for a total of “as much as $156 million.”
Those numbers represent the fruit of worthwhile investigation; we ought to know everything about the overlaps between Clinton’s work as secretary of state and the operations of the Clinton Foundation. Yet the tweet promoting the story, which has more than 10,000 retweets and likes combined, is tendentious and misleading. A lamentable hyping on social media. (Source)
“Put That There” was a pioneering MIT effort studying more natural ways of communicating with computers through gesture, voice and collaboration. These sorts of interactions are commonplace now on home game consoles — here you see the foundations being laid a full thirty years ago. (Source)
The argument for simulating page-flips goes back a long way.
SDMS does not distinguish between icons and document windows, as the Xerox Star will do a few years later. The user can zoom into the minimized thumbnail world of material until an item becomes clear and legible. The monitor to the right will display control elements to edit the document in focus. For example, if the main screen displays a calculator in full size, the secondary monitor offers numerical and mathematical function buttons. If the user zooms in on a book, the monitor will display the table of contents. The book can be read on the main screen. To turn a page, the user performs a diagonal top-right to bottom-left finger stroke on the touch-sensitive pad built into the armrest. The turning of the page is accompanied by a page-flipping animation. Richard Bolt argues [Ibid., p. 15],
[…] This page-turning animation visually separates one page from the next and gives readers a sense of where they are in the material in a way that endlessly scrolled text would not.
On long jumps issued directly from the table of contents, the animation takes longer, to indicate more skipped pages. This artificial delay – computers could render the jump almost instantaneously – is the equivalent of the physical steps of opening a book to a specific page. It helps the reader later recall the position of a paragraph of interest. (Source)
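The delay logic Bolt describes can be sketched in a few lines. This is an illustrative reconstruction, not SDMS code; the function name and the timing constants (`base_ms`, `per_page_ms`, `cap_ms`) are invented for the example:

```python
def flip_duration(pages_skipped, base_ms=300, per_page_ms=15, cap_ms=2000):
    """Longer jumps get a longer page-flip animation, mimicking the
    physical effort of opening a book at a distant page."""
    return min(base_ms + per_page_ms * pages_skipped, cap_ms)

# A one-page turn is quick; a long jump from the table of contents is
# noticeably slower, but capped so the user is never left waiting.
print(flip_duration(1))    # 315
print(flip_duration(50))   # 1050
print(flip_duration(500))  # 2000 (capped)
```

The cap matters: the delay is a memory cue, not a punishment, so it should grow with distance only up to a point.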
The Filebox in Xerox’s NoteCards system was similar to the “playlists” of today.
In addition to the standard text and graphic cards, NoteCards has further card types built in: filebox and browser cards. They support the user in sorting and categorizing the content cards. A filebox is a simple way to collect cards that have some aspect in common. Fig. 2.6 shows two filebox cards in the lower left corner. Fileboxes can also contain other filebox cards and thus provide a hierarchical structure among the cards. A browser card shows a diagram based on the link structure between the cards; the large window in Fig. 2.6 is an example of this. (Source)
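The containment model is easy to picture in code. Here is a minimal sketch that assumes nothing about Xerox’s actual InterLisp implementation; all class and method names are hypothetical:

```python
class Card:
    """A content card: a title plus some body text."""
    def __init__(self, title, text=""):
        self.title = title
        self.text = text

class FileBox(Card):
    """A filebox collects cards, and may contain other fileboxes,
    yielding a hierarchy."""
    def __init__(self, title):
        super().__init__(title)
        self.children = []  # cards and/or nested fileboxes

    def file(self, item):
        self.children.append(item)

    def all_cards(self):
        """Walk the hierarchy, yielding every contained card."""
        for child in self.children:
            yield child
            if isinstance(child, FileBox):
                yield from child.all_cards()

notes = FileBox("Hypertext notes")
history = FileBox("History")
history.file(Card("Memex", "Bush, 1945"))
notes.file(history)
notes.file(Card("NoteCards", "Xerox PARC, early 1980s"))
print([c.title for c in notes.all_cards()])
# → ['History', 'Memex', 'NoteCards']
```

Because a filebox is itself a card, the same filing operation works at every level, which is all the “hierarchical structure” amounts to.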
NoteCards (cf. 2.1.6) originated as a research project at Xerox PARC in the early 1980s. It uses a physical card metaphor, i.e. each card displays its content in a separate window. The system is designed to support information-analysis tasks such as reading, interpretation, categorization, and technical writing [Shneiderman/Kearsley 89]. For that reason the focus lies on structuring and editing information, in contrast to the browsing and reading focus of Hyperties and Guide, which run on less powerful personal computers. Consequently a special browser card displays an overview graph to illustrate the connections between the cards.
NoteCards is fully integrated with the InterLisp environment for Xerox workstations. This means that it is highly customizable for the skilled user. (Source)
ENQUIRE ultimately proved a failure, in that it was never adopted within the research divisions of CERN, but it was vital in providing a conceptual basis for the future World Wide Web. It is believed that the original ENQUIRE program, stored on a diskette, was overwritten by either Brian Carpenter or Robert Cailliau during the mid-1980s. (Source)
In 1962 Douglas Engelbart started to develop computer tools to augment the human intellect. The project was called Augment. In contrast to the Memex, Augment was actually implemented, and it was successfully demonstrated in 1968. Although it was not developed as a hypertext system, NLS (the oN-Line System), part of the Augment project, had many hypertext features, such as cross-references to other work. In 1972 Augment lost its research support, and Engelbart had to stop developing his project. He is still pushing his original augmentation ideas [Nie95]. (Source)
ENQUIRE was a software project written in 1980 by Tim Berners-Lee at CERN, which was the predecessor to the World Wide Web. It was a simple hypertext program that had some of the same ideas as the Web and the Semantic Web but was different in several important ways.
According to Berners-Lee, the name was inspired by a book entitled Enquire Within Upon Everything. (Source)
The development of tools providing new capacities has been replaced by safe and obvious enhancements of comfortable procedures. Second, developers and authors have lost sight of the fact that there are two products in the electronic development of a document: the printout and the “source” file. Currently everything is directed toward producing the printout; the source is a mere by-product, wasting valuable disk space, useful for little more than producing another printout of the same document. Proprietary formats and the lack of semantic and pragmatic coding make these files useless for sharing with colleagues or processing by intelligent systems. Finally, scholars’ time and energy are diverted from researching and composing to formatting for final presentation. The authors of this article pay considerable attention to the quality of submissions and have typeset several books, but current systems tend to focus authors’ attention on appearance all the time, not just when the document is ready for submission. (Source)
Even in academia there is reduced impetus for more intelligent systems. Universities have their own business and administrative offices that make good use of business-oriented systems. Moreover, scholars often prefer these systems over the alternatives. Those who have access to more powerful systems rarely have the time to learn to exploit them fully, and many find them too complicated to use at all. This is quite understandable, since most text formatters on minicomputers and mainframes were developed under a model that is even more inappropriate than author-as-typist. Written by and for programmers, these systems often require quasi-programming skills, treating authors as programmer-typists. Most scholars experienced in computing are only too happy to escape such rough and poorly adapted systems for simple, handy little programs that help them get text onto paper quickly. Lacking the concepts necessary to recognize major recent improvements and unaware of the possibilities for new strategies for research and composition, they hail this movement backward as a major advance in scholarly computing. Because this response comes even from the most experienced scholars, it carries weight with potential systems developers as well as with those who are just beginning to use computers. More and more scholars call for computing facilities that simply enhance their capacity to type; they exert pressure pulling the industry away from significant development. The consequence is an industry that is clinging to the past; scholars seek to do what they have always done, only a little faster.3
This shift in dominant models creates three major problems: first, the incentive for significant research and development in computing systems is disappearing, and a major portion of resources has been diverted into the enhancement of a minor portion of the document development process. Lacking the time to train themselves in other disciplines, many of the scholars who are setting the trends in text processing do not understand or value the possibilities. Moreover, the resources required for the development of sophisticated systems have been severely underestimated throughout the industry, and people have become impatient with the lack of immediately useful products. Thus, we see far more attention paid to keyboards, printers, fonts, displays, graphics, colors, and similar features than to the retrieval and structuring of information or even to the verification of spelling and grammar.4 (Source)
Good presenters can lead to lousy learning, due to the overconfidence effect.
Eloquent and engaging scientific communicators in the mould of physicist Brian Cox make learning seem fun and easy. So much so that a new study says they risk breeding overconfidence. When a presenter is seen to handle complicated information effortlessly, students sense wrongly that they too have acquired a firm grasp of the material.
Shana Carpenter and her colleagues showed 42 undergrad students a one-minute video of a science lecture about calico cats. Half of them saw a version in which the female lecturer was confident, eloquent, made eye-contact and gestured with her hands. The other students saw a version in which the same lecturer communicated the same facts, but did so in a fumbling style, frequently checking her notes, making little eye contact and few gestures. (Source)
Social media content has grown exponentially in recent years, and the role of social media has evolved from narrating life events to actually shaping them. In this paper we explore how many resources shared on social media are still available on the live web or in public web archives. By analyzing six event-centric datasets of resources shared on social media between June 2009 and March 2012, we found about 11% lost and 20% archived after just a year, and an average of 27% lost and 41% archived after two and a half years. Furthermore, we found a nearly linear relationship between the time of sharing of a resource and the percentage lost, with a slightly less linear relationship between time of sharing and archival coverage. From this model we conclude that after the first year of publishing, nearly 11% of shared resources will be lost, and after that we will continue to lose 0.02% per day. (Source)
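The stated loss model turns into a quick back-of-the-envelope calculator. This is my reading of the abstract’s figures (roughly 11% lost over the first year, then about 0.02 percentage points more per day), not the authors’ code; the piecewise-linear form is an assumption:

```python
def pct_lost(days_since_sharing):
    """Estimated percentage of shared resources lost, per the
    abstract's figures: ~11% over year one, then ~0.02 pts/day."""
    if days_since_sharing <= 365:
        return 11.0 * days_since_sharing / 365  # linear ramp in year one
    return 11.0 + 0.02 * (days_since_sharing - 365)

print(round(pct_lost(365), 1))  # 11.0 after one year
print(round(pct_lost(730), 1))  # 18.3 after two years
```

Note the per-dataset averages quoted above (27% lost at two and a half years) run higher than this simple fit, since the individual datasets vary considerably around the trend.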
It’s a hard story for journalists to tell. Journalists are, despite their political reputation, fundamentally conservative. The only way to keep explaining what’s happening in the world, day after day, is to rely on some basic frames. Cause and effect have to unfold within stable institutions, according to accepted rules. A story that falls outside the everyday frames—The mayor is a crackhead who leaves a trail of violence wherever he goes, say, or This beloved entertainer is accused of being a serial rapist—requires a radical shift of perspective. Possibly the best and truest part of the movie Spotlight was how much of the Boston Globe’s investigation into the Catholic Church’s secret sexual abuse came out of the Globe’s own morgue. The paper had already written the story, piece by piece. It just hadn’t read it. (Source)
The alt-right, the online mini-movement that backs Trump while hurling anti-Semitic imprecations at everyone who might doubt his greatness, is characterized by a reverse nationalism, in which sometimes Russia, sometimes Hungary, sometimes the Hohenzollern monarchy becomes the object of perverted patriotism. Their own mongrel country and its flaccid Constitution receive only disdain. While the content of this ideology remains marginal in American life, its alienation from its own country comes ever closer to the center of politics.
Isn’t that the central story of the Trump campaign? Only some Americans qualify as full Americans—and loyalty is owed not to the America that is, but to a false memory of America as it was and a sinister vision of the purged and purified America that could be, if only we can exclude enough people who don’t truly belong. (Source)
Red Pill politics, the belief that there is an all-encompassing secret reality that others are not privy to, dominates the alt-right.
One of the favorite memes of the so-called alt-right is the “red pill,” from the movie The Matrix. To swallow the red pill is to be liberated from the pleasing illusion that one lives in a thriving, happy country—and to awaken to the hideous truth that one is deceived and exploited, a captive in a ruined land. Trump ran the first red-pill presidential campaign—a campaign whose central theme was the brokenness of the American experiment. It is a theme that will only grow more powerful to its believers should Trump lose.
“We are a country that doesn’t win anymore,” said Donald Trump in his speech opposite the January 28, 2016, Republican debate. “We don’t win anymore. When was the last time we won? We don’t win on trade. We don’t win on the military. We don’t beat ISIS. We don’t do anything. We’re not good.” (Source)
The left’s delusions are more moderate, but do exist. See The Youthful Cult of Corbyn
Red Pill politics is aided by the Filter Bubble
An amazing stat.
From 1988 to 2008, the number of felonies reported by New York City to the FBI dropped from 719,887 to 198,419 – a remarkable 72 percent reduction. Outside of New York City, the number of crimes declined by half as much, only 38 percent.
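The quoted reduction checks out. A one-line verification (my arithmetic, not the source’s):

```python
def pct_drop(before, after):
    """Percentage reduction from an earlier count to a later one."""
    return 100 * (before - after) / before

# New York City felonies reported to the FBI, 1988 vs. 2008.
print(round(pct_drop(719_887, 198_419)))  # 72
```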
New Jersey’s 1994 Truth in Sentencing Law was milder than most, aiming at judicial transparency rather than mandatory minimums. (It would not have met the 1994 federal standard for TiS).
Under the measure, judges will be required to “inform the public of the actual period of time” that a defendant is likely to spend in prison as a result of the sentence.
Judges will not only say how long the sentence is but also state when a defendant may be released for good behavior. The final determination of parole will still be made by the State Parole Board.
Court officials say about 60 percent of New Jersey inmates who are eligible for parole get full credit for good behavior and are released at the earliest estimated parole dates.
“Under the truth-in-sentencing policy, the public will not be left with the mistaken impression that the sentence imposed is what the defendant will actually serve,” the Supreme Court statement reads. (Source)
Jenna’s Law, which applied Truth-in-Sentencing provisions to first-time violent felony offenders in New York, was passed due to overwhelming public pressure.
Janice Grieshaber Geddes remembers how then-Assembly Speaker Sheldon Silver would not let Jenna’s Law come to a vote in 1998 even though the overwhelming majority of the Assembly’s members supported it.
Geddes’ daughter, Jenna Grieshaber Honis, had been murdered in 1997 in Albany by a convicted felon out on parole.
Silver initially prevented a vote on the bill named after Jenna – which would require most first-time violent felons to serve at least six-sevenths of their sentence before they were released and then undergo at least 18 months of parole supervision – even though 130 of his members were sponsoring the bill.
Six weeks later, facing public pressure, the speaker caved and Jenna’s Law was passed. (Source)
In 1995, the Governor ended indeterminate sentencing for second-time violent felony offenders and provided that if an offender maintains a good disciplinary and program record while in prison, he or she would be eligible for release only after serving 85 percent of his or her prison term. (Source)
Jenna’s Law applied this more broadly, to first-time violent felony offenders.
Indeterminate sentencing was replaced in Delaware by the Truth in Sentencing Act of 1990, which ushered in determinate sentencing.
In Delaware, for offenders convicted up to 1990, a parole release date can be established after a minimum of one-third of the sentence has been served. In addition to the one-third-term parole date, time served is further reduced by a complex system of meritorious good-time credits. With the merit and good-time credits combined, it is not uncommon for offenders under Delaware’s indeterminate system to serve no more than one-fifth or one-fourth of their sentence. Ironically, criminals incarcerated for the most serious offenses accrue the largest percentage decreases in actual time served. (Source)
Sometimes we have to stop projects a bit early. But that’s OK.
Here’s the original Mount Rushmore design.
Though recent sentencing reforms are linked to increasing time served, the average (or mean) sentence length imposed on offenders entering prison decreased, from 72 months in 1990 to 68 months in 1996 (table 5). Consistent with sentencing policy change, the projected minimum time expected to be served by persons entering prison increased slightly. If parole eligibility requirements, good-time credits, and early release policies are taken into account, persons entering State prisons in 1996 were expected to serve a minimum of 42 months in prison, up from 40 months in 1990.

For violent offenders the average imposed sentence decreased from 107 months in 1990 to 104 months in 1996, while the expected time to be served increased. On average, violent offenders admitted to prison in 1996 were expected to serve about 3 months longer than those admitted in 1990 (a minimum term of 70 months versus 67 months).

By offense, the average sentence length for murder (excluding offenders sentenced to life) showed the largest increase between 1990 and 1996, up from 233 months to 253 months. Offenders admitted to prison in 1996 for murder, without a life sentence, were expected to serve about 40 months longer (215 months) than offenders admitted in 1990 (176 months).
Indeterminate sentencing was the norm in the U.S. penal system through the 1970s. Under indeterminate sentencing, prisoners could be released at any point by parole boards, and the sentence handed down in court was meant only as a maximum.
Indeterminate sentencing put the focus of imprisonment on rehabilitation, but also resulted in vastly unequal sentences for similar crimes. Given what we know about the nature of racism, it’s reasonable to believe that many of these inequities fell along racial lines.
The first indeterminate sentencing system became law in Ohio in 1885. Following Ohio’s lead, indeterminate sentencing was gradually adopted by all states and remained the nation’s sentencing system through the 1970s. Indeterminate sentencing encourages “individualized” sentencing where rehabilitation of the offender is a key determinant for release from incarceration.
Generally, a judge issuing an indeterminate sentence sets the term of incarceration at the maximum allowed by law. The actual time served is later determined by the parole board which, in its own judgment of the offenders’ rehabilitation, determines the date of release and hence, the actual time served.
Uncertainty abounds under indeterminate sentencing. Similar offenders with similar offenses can have significant variations in time served, varying from very short to very long sentences.
Although indeterminate sentencing is the norm, mandatory sentences are frequently found sprinkled throughout the states’ criminal statutes and sentencing laws. In many cases mandatory sentences are passed in the legislatures for the specific purpose of curtailing the decision-making authority of the parole boards. Sometimes they are enacted as a response to a particularly heinous and highly publicized crime. (Source)
Fourteen states had abolished early release by discretion of a parole board for all offenders by January 1999. (Source)
Notably absent here are California and New York.
> if Breitbart’s success is from publishing articles that it knows will upset its readers, the key is to publish the stories that present the perfect combination of facts to fuel rightwing anger. For example, a story about homeland security secretary Jeh Johnson, who visited Louisiana following the recent floods, saying Obama could not also attend because he has a “very busy schedule”. It is true – he did say that. But the article is illustrated with a picture of a smiling Obama on a golf course, and the text notes the president is “on vacation”.
Elite sport is riddled with similar endowments.
Researchers have found associations between physical performance and more than 200 genetic variations. More than 20 of those relate to elite athleticism. These performance-enhancing variations can affect height, blood flow, metabolic efficiency, muscle mass, muscle fibres, bone structure, pain threshold, fatigue resistance, power, speed, endurance, susceptibility to injury, psychological aptitude, and respiratory and cardiac functions, to name just some.
We don’t disqualify athletes with these types of predispositions. We celebrate them.
With seven Olympic medals, Finland’s Eero Mäntyranta, for example, is among the all-time greats of Nordic skiing. It is a sport that requires incredible stamina – a trait assisted by an abundance of red blood cells, which carry oxygen to the muscles. That’s why so many endurance athletes try to boost their red blood cell count by training at high altitude, sleeping in altitude chambers, or through illegal measures like blood doping or taking a synthetic version of the hormone erythropoietin (EPO).
Mäntyranta, who died in 2013, didn’t need any of that. He had a condition called primary familial and congenital polycythaemia, associated with a variation in the EPOR gene, which caused his body to produce 65 per cent more red blood cells than the average male. David Epstein, author of The Sports Gene, calls Mäntyranta’s EPOR variant a “gold medal mutation”.
Which are exactly the kinds of messages Em Ford, 27, was receiving en masse last year on her YouTube tutorials on how to cover pimples with makeup. Men claimed to be furious about her physical “trickery,” forcing her to block hundreds of users each week. This year, Ford made a documentary for the BBC called Troll Hunters in which she interviewed online abusers and victims, including a soccer referee who had rape threats posted next to photos of his young daughter on her way home from school. What Ford learned was that the trolls didn’t really hate their victims. “It’s not about the target. If they get blocked, they say, ‘That’s cool,’ and move on to the next person,” she says. Trolls don’t hate people as much as they love the game of hating people. (Source)
The 2016 Summer Olympics was a high point for women in sports, but you wouldn’t know it from the coverage.
For some reason, there’s been a remarkable online effort to paint the Rio Olympics as a bottomless pit of sexist drivel. The evidence in favor of this is thin to the point of nonexistence, and today it reached comical proportions. Here is Emily Crockett at Vox:
It’s no wonder that this unfortunate Olympics headline, from the Colorado paper the Greeley Tribune, caught fire on social media this week. It seemed to be the perfect encapsulation of exactly how the coverage of this year’s games is going when it comes to women — and the way women are treated in society more generally:
Seriously? Our latest outrage is a headline at the Greeley Tribune, circulation 25,000? Given Phelps’ fame and his quest for six gold medals—along with the fact that Ledecky was breaking her own world record (for the fourth time), making it barely even news that she won—you could argue that the Tribune made the right call. But even if it didn’t, who cares? One small newspaper in one small town wrote one headline that was perhaps slightly misconceived. That’s what’s generating outrage today?
It’s the internet that’s made this kind of thing possible. If you dedicate yourself to trawling every bit of media in existence for arguably sexist coverage, you’re going to find something every day. When you have literally millions of items to choose from, it’s inevitable. But it’s also essentially meaningless. What’s actually remarkable is that the folks desperately looking for sexist coverage have found so little.
As I put it in The Better Angels of Our Nature, alluding to the famous Seinfeld monologue about team sports, “People root for clothing instead of blood and soil.” Psychological experiments going back to the famous 1950s Robbers Cave study show that hostilities can be tamped down when both sides have to work for a superordinate goal, such as pulling a bus out of the mud.
The Romans and many other cultures have often embraced a notion of honorable suicide. An honorable suicide is a suicide that is not done for personal reasons, but to achieve some greater good. Mark Antony’s death, for example, was seen as dishonorable not because it was a suicide, but because he killed himself over a love affair. Cato the Younger, who killed himself after a battle defeat to avoid prolonging civil strife, on the other hand, was seen as noble. In other words, the suicide is judged by its motives and by its intended impact.
Christian notions have been decidedly mixed. Most Christian doctrine forbids self-harm, but many Christians have long defended suicide as forgivable in cases of extended pain. Thomas More famously wrote, of his imagined world of Utopia, that it was permissible for someone to commit suicide if their life was burdened by torturous pain: “if they thus deliver themselves from torture, or allow others to do it, they shall be happy after death. Since they forfeit none of the pleasures, but only the troubles of life by this, they think they not only act reasonably, but consistently with religion.”
This focus on pain and suffering has formed much of the debate around euthanasia in Christian countries.
“You have to look at the broader perspective,” Clinton said. “He’s won some, and I’ve won some, but I have 2 and a half million more votes than he does. And I have a very significant lead in delegates.” (Separately, Clinton’s campaign manager Robby Mook made the same claim in an April 4 Medium post titled, “To Hillary Clinton supporters: The facts on where the race stands.”) (Source)
Page himself explained in his blog post that one big reason for the change was to empower the division heads, now CEOs. “Alphabet is about businesses prospering through strong leaders and independence,” he wrote. This seemed quite a switch — when Page himself became Google CEO in 2011, he took steps to make the leaders (known as the L-team) work together, even to the point of sharing a physical space in the afternoons. Speculation was that the switch came in part because Page was afraid that if the L-team members weren’t able to run their own operations somewhat autonomously, they would go elsewhere. (Source)