
Pap(ier)

Hans Hollein

1968

Mask, green gneiss stone, 1928 

Mask, cast concrete, 1929

Henry Moore

The War Cripples, 1920

Lady with Mink and Veil, 1920

Otto Dix

Mind Control & the Internet

JUNE 23, 2011

Sue Halpern

New York Review of Books

Early this April, when researchers at Washington University in St. Louis reported that a woman with a host of electrodes temporarily positioned over the speech center of her brain was able to move a computer cursor on a screen simply by thinking but not pronouncing certain sounds, it seemed like the Singularity—the long-standing science fiction dream of melding man and machine to create a better species—might have arrived. At Brown University around the same time, scientists successfully tested a different kind of brain–computer interface (BCI) called BrainGate, which allowed a paralyzed woman to move a cursor, again just by thinking. Meanwhile, at USC, a team of biomedical engineers announced that they had successfully used carbon nanotubes to build a functioning synapse—the junction at which signals pass from one nerve cell to another—which marked the first step in their long march to construct a synthetic brain. On the same campus, Dr. Theodore Berger, who has been on his own path to make a neural prosthetic for more than three decades, has begun to implant a device into rats that bypasses a damaged hippocampus in the brain and works in its place.

The hippocampus is crucial to memory formation, and Berger’s invention holds the promise of overcoming problems related to both normal memory loss that comes from aging and pathological memory loss associated with diseases like Alzheimer’s. Similarly, the work being done at Brown and Washington University suggests the possibility of restoring mobility to those who are paralyzed and giving voice to those who have been robbed by illness or injury of the ability to communicate. If this is the Singularity, it looks not just benign but beneficent.

Michael Chorost is a man who has benefited from a brain–computer interface, though the kind of BCI implanted in his head after he went deaf in 2001, a cochlear implant, was not inserted directly into his brain, but into each of his inner ears. The result, after a lifetime of first being hard of hearing and then shut in complete auditory solitude, as he recounted in his memoir, Rebuilt: How Becoming Part Computer Made Me More Human (2005), was dramatic and life-changing. As his new, oddly jejune book, World Wide Mind: The Coming Integration of Humanity, Machines, and the Internet, makes clear, he is now a cheerleader for the rest of us getting kitted out with our own, truly personal, in-brain computers. In Chorost’s ideal world, which he lays out with the unequivocal zeal of a convert, we will all be connected directly to the Internet via a neural implant, so that the Internet “would become seamlessly part of us, as natural and simple to use as our own hands.”

The debate between repair and enhancement is long-standing in medicine (and sports, and education, and genetics), though it gets louder and more complicated as technology advances. Typically, repair, like what those Brown, USC, and Washington University research teams are aiming to do for people who have suffered stroke, spinal cord and other injuries, neurodegeneration, dementia, or mental illness, is upheld as something good and necessary and worthy. Enhancement, on the other hand—as with performance drugs and stem cell line manipulation—is either reviled as a threat to our integrity and meaning as humans or conflated with repair until the distinction becomes meaningless.1

Chorost bounces over this debate altogether. While the computer in his head was put there to fix a deficit, the fact that it is there at all is what seems to convince him that the rest of us should become cyborgs. His assumption—it would be too generous to call it an argument—is that if that worked for him, this will work for us. “My two implants make me irreversibly computational, a living example of the integration of humans and computers,” he writes. “So for me the thought of implanting something like a BlackBerry in my head is not so strange. It would not be so strange for a lot of people, I think.”

More than a quarter-century ago, a science writer named David Ritchie published a book that I’ve kept on my bookshelf as a reminder of what the post-1984 world was supposed to bring. Called The Binary Brain, it extolled “the synthesis of human and artificial intelligence” via something he called a “biochip.” “The possibilities are marvelous to contemplate,” he wrote.

You could plug into a computer’s memory banks almost as easily as you put on your shoes. Suddenly, your mind would be full of all the information stored in the computer. You could instantly make yourself an expert in anything from Spanish literature to particle physics…. With biochips to hold the data, all the information in the MIT and Harvard libraries might be stuffed into a volume no greater than that of a sandwich. All of Shakespeare in a BB-sized module…. You may see devices like this before this century ends.

“Remember,” he says gravely, “we are talking here about a technology that is just around the corner, if not here already. Biochips would lead to the development of all manner of man-machine combinations….”

Twenty-six years later, in the second decade of the new millennium, here is Chorost saying almost the same thing, and for the same reason: our brains are too limited to sufficiently apprehend the world.2 “Some human attributes like IQ appear to have risen in the twentieth century,” he writes, “but the rate of increase is much slower than technology’s. There is no Moore’s Law for human beings.” (Moore’s Law is the much-invoked thesis, now elevated to metaphor, that says that the number of components that can be placed on an integrated circuit doubles every two years.) Leaving aside the flawed equivalences—that information is knowledge and facts are intelligence—Chorost’s “transmog” dream is rooted in a naive, and common, misperception of the Internet search engine, particularly Google’s, which is how most Internet users navigate through the fourteen billion pages of the World Wide Web.
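Taken at face value, the two-year doubling rule is a simple exponential (a back-of-the-envelope reading of the law as stated above, not a figure from Halpern or Chorost):

\[ N(t) = N_0 \cdot 2^{t/2} \]

By that arithmetic, the twenty-six years separating Ritchie's book from Chorost's amount to thirteen doublings, a factor of \(2^{13} = 8{,}192\); nothing measured about human minds has moved on anything like that curve, which is exactly the asymmetry Chorost is pointing at.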

Most of us, I think it’s safe to say, do not give much thought to the algorithm that produces the results of a Google search. Ask a question, get an answer—it’s a straightforward transaction. It seems not much different from consulting an encyclopedia, or a library card catalog, or even an index in a book. Books, those other repositories of facts, information, and ideas, are the template by which we understand the Web, which is like a random, messy, ever-expanding volume of every big and little thing. A search is our way into and through the mess, and when it’s made by using Google, it’s relying on the Google algorithm, a patented and closely guarded piece of intellectual property that the company calls PageRank, composed of “500 million variables and 2 billion terms.”

Those large numbers are comforting. They suggest an impermeable defense against bias, a scientific objectivity that allows the right response to the query to bubble up from the stew of so much stuff. To an extent it’s a self-perpetuating system, since it uses popularity (the number of links) as a proxy for importance, so that the more a particular link is clicked on, the higher its PageRank, and the more likely it is to appear near the top of the search results. (This is why companies have not necessarily minded bad reviews of their products.) Chorost likens this to Hebbian learning—the notion that neurons that fire together, wire together—since a highly ranked page will garner more page views, thus strengthening its ranking. [In this way] pages that link together “think” together. If many people visit a page over and over again, its PageRank will become so high that it effectively becomes stored in the collective human/electronic long-term memory.

Even if this turns out to be true, the process is anything but unbiased.
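To make that feedback loop concrete, here is a rough sketch in Python. It is the textbook power-iteration version of link ranking, not Google's guarded five-hundred-million-variable implementation, and the four-page "web" and its links are invented for illustration.

```python
# A toy, hypothetical four-page "web": each page maps to the pages it
# links out to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

damping = 0.85  # the damping factor used in the original PageRank paper
n = len(links)
rank = {page: 1.0 / n for page in links}  # start everyone equal

for _ in range(50):  # power iteration: repeat until the ranks settle
    new_rank = {}
    for page in links:
        # Rank flowing in: every page that links here passes along an
        # equal share of its own current rank.
        incoming = sum(
            rank[src] / len(outs)
            for src, outs in links.items()
            if page in outs
        )
        new_rank[page] = (1 - damping) / n + damping * incoming
    rank = new_rank

# Print pages from highest to lowest rank.
for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Run it and “c,” the page everyone links to, comes out on top. Nothing in the loop ever asks whether “c” deserves the attention; popularity is the whole signal.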

A Google search—which Chorost would have us doing in our own technologically modified heads—“curates” the Internet. The algorithm is, in essence, an editor, pulling up what it deems important, based on someone else’s understanding of what is important. This has spawned a whole industry of search engine optimization (SEO) consultants who game the system by reconfiguring a website’s code, content, and keywords to move it up in the rankings. Companies have also been known to pay for links in order to push themselves higher up in the rankings, a practice that Google opposes and sometimes cracks down on. Even so, results rise to the top of a search query because an invisible hand is shepherding them there.


Junya Watanabe FW 06/07 masks.

"Van Gogh" read by Joe Brainard at St. Marks Church on March 31st, 1971.

From Selected Writings (New York: The Kulchur Foundation, 1971).

via PennSound Poetry Archive.

NETHERLAND

Homeless in New York, a young gay woman learns to survive.

by Rachel Aviv

The New Yorker

DECEMBER 10, 2012

Samantha was sitting on a lawn chair in her parents’ garage, smoking a joint, when she decided to run away. She had just graduated from high school, where she had few friends, and felt invisible. She went to class stoned and wrote suicidal poems about the shame of being molested by a family friend: “why try when there is no hope / for my dirty soul there is no soap.” The thought of remaining in her home town, in central Florida, made her feel ill. Reclining in her chair in the brightly lit garage, she closed her eyes and thought, Is this going to be my life?

Samantha had got A’s in high school and had planned to escape to college, until she realized she couldn’t afford it. The only other option, she decided, was to flee. She wanted to go to Manhattan, which she’d never visited, because it seemed like a good place to meet other lesbians. Samantha enjoyed reading about botany and had long assumed that, like some plants, she was asexual, a self-sustaining organism. She found it trivial and unbecoming when girls at school pined over their crushes. Then, at fifteen, she watched “Lara Croft: Tomb Raider” and was uncomfortably captivated by Angelina Jolie. Her English teacher at the time had the students spend five minutes every day on an exercise called Vomit, in which they wrote down every phrase that occurred to them. Their pens could not stop moving. “In my fifty-millionth Vomit, I spaced out and wrote, ‘I’m a lesbian and no one knows,’ ” she told me. “It was this crazy voice that knew.”

Throughout the summer of 2009, Samantha researched the logistics of being homeless in New York, reading all the articles she could find online, no matter how outdated. She learned that if she went to a homeless shelter before she was eighteen social workers would be required to contact her family. She wanted nothing to do with her parents, who, she believed, hadn’t taken her complaints of sexual abuse seriously; her mother suggested it was a hallucination. Samantha planned to live on the streets for several weeks, until her eighteenth birthday. Then she would begin the rest of her life: getting a job, finding an apartment, and saving for college.

In a purple spiral-bound notebook, she created a guide for life on the streets. She listed the locations of soup kitchens, public libraries, bottle-return vending machines, thrift stores, and public sports clubs, where she could slip in for free showers. Under the heading “known homeless encampments,” she wrote down all the parks, boardwalks, and tunnels where she could sleep and the subway line she’d take to get there. Her most detailed entry was a description of an abandoned train tunnel in Harlem and the name of a photographer who had taken pictures of the homeless people who lived in it. She hoped that if she mentioned the photographer’s name she would be “accepted by the underground society.”

On September 5, 2009, she bought a Greyhound bus ticket using the name Samantha Green. (She has asked me not to use her legal name.) Her parents were away for the day, visiting friends, and she told her thirteen-year-old brother that she was leaving for New York. He expressed concern about her being homeless, but she reassured him. “It’s kind of like camping,” she said. Her brother, who had always treated her with reverence, agreed not to tell her parents where she was going. He helped her break into her father’s safe so that she could take her birth certificate. Then he drove her to Walmart, where she bought a durable backpack, a roll of duct tape, protein bars, multivitamins, a box of garbage bags, a canteen, and a jar of peanut butter.

Samantha’s parents came home six hours after she left and found a note on her bed: “I’m not coming back for a long time…. I am safe where I am.”

Samantha spent her first few nights in Central Park, sleeping under a pine tree. She wore the cargo pants, steel-toed Brahma work boots, and blue hoodie that she had left home in. She kept an open book by her side so that anyone passing by would assume she was a student who had drifted off. Using her backpack as a pillow, she slept lightly, alert to the sound of footsteps. More than any noise, she feared the buzz of police radios. She avoided thoughts of danger by embellishing them, imagining that her absence was of central concern to the police. She survived her first days in New York, she said, by “acting like I was in some sort of spy novel.”

For hours every day, she wandered around the city, memorizing street names and bus routes, observing how the neighborhoods changed depending on the time of day. Her favorite time was just before dawn, when the bars let out. She watched drunken tourists shout foolish things as they searched for cabs, and enjoyed knowing that, comparatively, she had her bearings. Rarely sleeping more than four hours a night, she was constantly looking for opportunities to close her eyes. One of her first discoveries was the Museum of Natural History, where the bathroom stalls were conveniently narrow. She could sit on the toilet, her head against the stall, until she was woken at the end of the day by the sound of the janitor’s mop.

By sharing cigarettes, she befriended other homeless kids, many of whom hung out at the Apple Store on Fifth Avenue. Their poverty wasn’t apparent—most of them had stolen at least one trendy outfit—but Samantha could spot them easily, because of their backpacks and the way they lingered near the least impressive computers. (The pictures in their Facebook profiles had shiny new laptops in the background.) On rainy nights, Samantha occasionally slept with them on the A, C, E subway line, which has the city’s longest route. They called it “Uncle Ace’s house.” One person would stay awake, on guard against cops or thieves; the rest napped until the end of the line.

Many of the kids knew each other from the youth shelters, a decentralized and temporary system that turns away far more people than it houses. The city has roughly two hundred and fifty shelter beds for some four thousand youth between the ages of thirteen and twenty-five who are homeless on any given night. This substratum of the homeless population has historically been overlooked. Until 1974, running away was a crime. The federal youth shelter system wasn’t established until the seventies, following an era in which homeless kids were seen as middle-class dropouts who would shortly return home. The media portrayed them as rebellious flower children in search of a countercultural utopia. According to a 1967 article in the Times, the crisis involved “thousands of young runaways, particularly girls, who are flooding the Village area to live as hippies.”

During the recent recession, the rate of unemployment for people between the ages of fifteen and twenty-four reached nearly twenty per cent, a record high. Samantha dropped off her résumé (which she printed at libraries) at dozens of fast-food restaurants, but having no job experience, and given her appearance—she had packed no change of clothes—she seldom got called for interviews. She tried to make money by recycling bottles, but older homeless people had cornered the market. Instead, she shoplifted. It was easy, because of her wholesome looks. Half Cherokee on her mother’s side, she had sharp cheekbones, high-arched eyebrows, and long, shiny hair. She targeted chain stores like 7-Eleven and Whole Foods: she’d steal a package of oatmeal from one and then use the microwave at the other.

In a journal stolen from Barnes & Noble, she kept a log of all the items she’d pocketed: Advil PM, beef jerky, “Practical Guide to Cherokee Sacred Ceremonies and Traditions,” four lesbian romances by Gerri Hill, Emergen-C, an exercise shirt, an onion bagel. “I started this log with the intention of paying all these stores back when I got back on my feet again,” she wrote on the second page. “I now know that’s impossible.”


Shoe designed by Pablo Picasso and fabricated by Perugia, c. 1955.

Gridirons from Studio International

David Noonan

Untitled, 2009

Untitled, 2009

Untitled, 2004

Untitled, 2007

Untitled, 2006

Untitled, 2004

David Noonan lives and works in London.

H.C. Westermann, Burning House, 1958

H.C. Westermann, Sun Coming Up Over New York, 1961

Robin Schwartz

Robin Schwartz photographs her daughter, Amelia. via the NYTimes and L.S.

LAST DAY FOR THE SAMPLE SALE WITH AN ADDITIONAL 10% OFF EXISTING SALE PRICES.