Culture Making is now archived. Enjoy five years of reflections on culture worth celebrating.
For more about the book and Andy Crouch, please visit andy-crouch.com.

Posts tagged technology

photo
"Der Bibelschreiber," by robotlab, from the installation bios [bible], 2007 :: via pietmondriann.com
Nate:
by Andy Crouch for Culture Making

Steve Jobs was a supreme example of a culture maker. 

He made cultural goods, in every sense of that word. “Real artists ship,” he famously told his engineers. Culture is only changed when you make more of it, and, boy, did Steve Jobs make more of it.

He pursued excellence, and in particular he pursued beauty. In every market Apple entered, it did things more cleanly, elegantly, and beautifully than its competitors. It’s not too much to credit Steve Jobs with the return of beauty to the center of our culture’s aspirations.

He built teams. Yes, by all accounts he could be an abrasive manager, to say the least (though one hears fewer stories of that from the last ten years, when Apple had been rescued from disaster and, perhaps, illness had chastened him in some ways). But he pulled together teams of 3, 12, and 120 that demonstrated tenacious loyalty and disciplined creativity in the otherwise fickle world of Silicon Valley. He was a celebrity, but he was not a rock star—he was a leader. That makes all the difference in the world.

But all this, and so much more, is fairly obvious. I think something less obvious will be Steve Jobs’s greatest legacy.

The most fundamental question of our technological age is this: Will technology make us more, or less, fully human?

Steve Jobs just may have decisively shifted the answer to that question. He embodied the hope that the answer is more. 

The Mac was launched with this brilliant promise: “1984 won’t be like 1984.” Apple’s products respected human beings—their embodiment, their quest for beauty and meaning and even joy—in a way that their competitors’ did not. And Steve himself, who exuded calm and confidence and vision even while he stirred consumers to frenzies of desire and competitors to distraction, envy, and imitation, represented our vision of ourselves as we hope we can be: not slaves to technology, but free and creative users of it.

In this broken, beautiful world, there are no pure icons—but neither are there any completely empty idols. Apple’s bitten apple is not an icon—like all idols, the more fervent the worship the more it will disappoint. And yet, it is, and Steve Jobs was, a sign of something true and worth seeking: a fully human life. For all of us who seek that life, the only proper response to Steve Jobs’s extraordinary culture making is: thank you.

Nate:
from "Why Teenagers Read Better Than You," by Joanne McNeil, Tomorrow Museum, 20 June 2009 :: first posted here 23 June 2009

Certainly, the increasing quality of young adult books is a draw. But there are exceptional videogames, there are exceptional websites and exceptional television programs to fight for a teenager’s attention. So why are they still reading?

I think there is another reason why young adult novels are doing well, and it is less easy to gauge. As of yet, there are no real studies determining this, but anecdotally, we all relate to it. A book is an opportunity to get “off the grid.” We read to break free of our digital tether. To experience what life was like before the net. To disconnect. To finally feel alone.

A book holds your hand in solitude and says, here you are alone in your room and everything is alright. You don’t need to call a friend or Twitter something. The world is still turning. If you go for a forty-minute walk without your mobile, don’t worry, you’re not going to miss anything.

Anyone who does not see the vanity of the world is very vain himself.

And so who does not see it, apart from the young who are preoccupied with bustle, distractions, and plans for the future?

But take away their distractions and you will see them wither from boredom.

Then they feel their hollowness without understanding it, because it is indeed depressing to be in a state of unbearable sadness as soon as you are reduced to contemplating yourself, and without distraction from doing so.

—Pascal, Pensées 70 (tr. Honor Levi)

Andy:
from "Why I Returned My iPad," by Peter Bregman, Harvard Business Review, 16 June 2010 :: via Ted Olsen

The brilliance of the iPad is that it’s the anytime-anywhere computer. On the subway. In the hall waiting for the elevator. In a car on the way to the airport. Any free moment becomes a potential iPad moment.

The iPhone can do roughly the same thing, but not exactly. Who wants to watch a movie in bed on an iPhone?

So why is this a problem? It sounds like I was super-productive. Every extra minute, I was either producing or consuming.

But something — more than just sleep, though that’s critical too — is lost in the busyness. Something too valuable to lose.

Boredom.

Being bored is a precious thing, a state of mind we should pursue. Once boredom sets in, our minds begin to wander, looking for something exciting, something interesting to land on. And that’s where creativity arises.

My best ideas come to me when I am unproductive. When I am running but not listening to my iPod. When I am sitting, doing nothing, waiting for someone. When I am lying in bed as my mind wanders before falling to sleep. These “wasted” moments, moments not filled with anything in particular, are vital.

They are the moments in which we, often unconsciously, organize our minds, make sense of our lives, and connect the dots. They’re the moments in which we talk to ourselves. And listen.

To lose those moments, to replace them with tasks and efficiency, is a mistake. What’s worse is that we don’t just lose them. We actively throw them away.

from "MIT Student Designs All-Terrain Wheelchair for the Poor," by Cliff Kuang, Fast Company, 23 February 2010 :: via The Morning News
Nate:

from "500 Fotos," by Adam Tyson
Christy:
Andy:
from "Luxury or Necessity? The Public Makes a U-Turn," by Rich Morin and Paul Taylor, Pew Research Center, 23 April 2009 :: via Floyd Norris

No longer do substantial majorities of the public say a microwave oven, a television set or even home air conditioning is a necessity. Instead, nearly half or more now see each of these items as a luxury. Similarly, the proportion that considers a dishwasher or a clothes dryer to be essential has dropped sharply since 2006.

These recession-era reevaluations are all the more striking because the public’s luxury-versus-necessity perceptual boundaries had been moving in the other direction for the previous decade. For example, the share of adults who consider a microwave a necessity was just 32% in 1996. By 2006, it had shot up to 68%. Now it has retreated to 47%. Similarly, just 52% of the public in the latest poll say a television set is a necessity—down 12 percentage points from 2006 and the smallest share to call a TV a necessity since this question was first asked more than 35 years ago.

Andy:

Q: You sound like you’re able to handle the ups and downs of this job pretty well.

A: I think the key to doing this job, in addition to multitasking and speed of movement, is to be able to handle the emotional components. I’m good at it; I’m empathetic and I don’t take it home at the end of the day. I can talk about things like domestic violence; it’s just a reality.

Q: How long have you been doing this job?

A: I’ve been doing it for nine years. My job now is training supervisor, so I manage ongoing training. New trainees go through a nine-month process; we have an academy. They learn call-taking, radio dispatching, the medical aspect, interpersonal skills.

And you have to know geography. Geography is so important, because people can call and have no clue where they are.

photo
"Jesus," Orlando, Florida, 2008, from the series Holy Land, by Lee Satkowski :: via Flak Photo
Nate:
Andy:
from "Pieties and Pixels," by Glenn Arbery, FIRST THINGS: On the Square, 27 February 2009

[Wedding guests who double as amateur photographers] do not see themselves as intruding upon the event, but as absenting themselves from it in order to bestow the gift of . . . precious memories (which always requires the foreboding ellipsis). They sacrifice their ordinary presence at the mere wedding to become a selfless, invisible recording eye, as though they occupied some interstitial space between the sacred, but still physical one of the church and—what, exactly? The not-yet-embodied future? It strikes me that they think they are made angels by the camera, observers unobserved.

But there they were, still in their bodies, perfectly visible to everyone.

And who in the world were they? My wife told me later that she stopped the one on the side aisle by catching his eye, shaking her head, and fiercely mouthing the word No. Crestfallen, he retreated. The other one, however, a young woman, angled across the front of the church in front of the pulpit, went through the entrance to the sacristy, and emerged behind a carved wooden grate where she stationed herself for the next half-hour intermittently flashing away like an expert sniper at the bride and groom.

Andy:
from "That Synching Feeling," by Eric Felten, WSJ.com, 5 February 2009

“There’s too many variables to go live. I would never recommend any artist go live because the slightest glitch would devastate the performance,” [explained Jennifer Hudson’s producer regarding her prerecorded Super Bowl anthem]. His justification echoed Itzhak Perlman explaining why the all-star classical quartet at the inauguration was prerecorded. “It would have been a disaster if we had done it any other way,” Mr. Perlman told the New York Times. “This occasion’s got to be perfect. You can’t have any slip-ups.”

My, what a standard of perfection is now demanded. No longer is a good or even a great performance good enough. Now we must have performances free from the “slightest glitch.” And since no one—not even a singer of Ms. Hudson’s manifest talent nor a violinist of Mr. Perlman’s virtuosity—can guarantee that a live performance will be 100% glitch-free, the solution has been to eliminate the live part. Once, synching to a recorded track was the refuge of the mediocre and inept; now it’s a practice taken up by even the best artists.

video Objectified

from "Objectified: A Documentary Film by Gary Hustwit," 5 January 2009 :: via Daring Fireball
Andy:
Andy:

Texting is the cheapest and most popular mode of cellphone communication in most of the world, and last year text messages topped voice calls even in the U.S. The world’s three billion cellphones far surpass the Internet as a universal communications medium, and they are vital to business development in less-developed economies.

But companies that develop predictive text say they have created cellphone software for fewer than 80 of the world’s 6,912 languages cataloged by SIL International, a Dallas organization that works to preserve languages.

One key to using the languages is the availability of a technology called predictive text, which reduces the number of key taps necessary to create a word when using a limited keypad. Market research shows that text messaging soars after predictive text becomes available. . . .

In Hindi, a language with 11 vowels and 34 consonants that is spoken by 40% of the Indian population, texting “Namaste,” which means “hello,” can take 21 key presses. . . . Typing “Namaste” with predictive text takes just six key presses. Nuance Corp. of Burlington, Mass., which dominates the predictive-text market, says that in 2006 cellphone users in India with predictive text in their handsets averaged 70 messages a week; those without it averaged 18.
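[To make the key-press arithmetic concrete, here is a minimal Python sketch of dictionary-based, T9-style predictive text. The article’s 21-versus-6 figures apply to Hindi keypad layouts, so the numbers below differ; this uses the standard Latin phone keypad, and the tiny word list is a hypothetical stand-in for a real language dictionary.]

# A sketch of T9-style predictive text: one key press per letter,
# with a dictionary resolving the ambiguous digit sequence.

KEYPAD = {
    '2': 'abc', '3': 'def', '4': 'ghi', '5': 'jkl',
    '6': 'mno', '7': 'pqrs', '8': 'tuv', '9': 'wxyz',
}
LETTER_TO_KEY = {ch: key for key, letters in KEYPAD.items() for ch in letters}

def key_sequence(word):
    """Predictive input: one press per letter ('namaste' -> '6262783', 7 presses)."""
    return ''.join(LETTER_TO_KEY[ch] for ch in word.lower())

def multitap_presses(word):
    """Without prediction: cycle to each letter's position ('namaste' takes 12)."""
    return sum(KEYPAD[LETTER_TO_KEY[ch]].index(ch) + 1 for ch in word.lower())

def build_index(dictionary):
    """Map each digit sequence to every word it could spell."""
    index = {}
    for word in dictionary:
        index.setdefault(key_sequence(word), []).append(word)
    return index

index = build_index(['namaste', 'hello', 'home', 'gone', 'good'])
print(index['6262783'])             # ['namaste']
print(multitap_presses('namaste'))  # 12
print(index[key_sequence('home')])  # ['home', 'gone', 'good'] -- same keys, ambiguous

[Real engines rank such ambiguous candidates by word frequency; the harder problem the article points to is that none of this works for a given language until someone has built its dictionary and keypad mapping in the first place.]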

Andy:
from "Dwelling in Possibilities," by Mark Edmundson, ChronicleReview.com, 14 March 2008 :: via Santiago Ramos at Good Letters

A Romantic, says Nietzsche, is someone who always wants to be elsewhere. If that’s so, then the children of the Internet are Romantics, for they perpetually wish to be someplace else, and the laptop reliably helps take them there — if only in imagination. The e-mailer, the instant messenger, the Web browser are all dispersing their energies and interests outward, away from the present, the here and now. The Internet user is constantly connecting with people and institutions far away, creating surrogate communities that displace the potential community at hand.

Then too, booking by computer has made travel easier and, by eliminating a certain number of middlemen, kept it reasonably cheap. So there’s an inducement to take off physically as well. The Internet is perhaps the most centrifugal technology ever devised. The classroom, where you sit down in one space at one time and ponder a text or an issue in slow motion, is coming to feel ever more antiquated. What’s at a premium now is movement, making connections, getting all the circuitry fizzing and popping.

For students now, life is elsewhere. Classes matter to them, but classes are just part of an ever-enlarging web of activities and diversions. Students now seek to master their work — not to be taken over by it and consumed. They want to dispatch it, do it well and quickly, then get on to the many other things that interest them. For my students live in the future and not the present; they live with their prospects for success and pleasure. They dwell in possibility.

excerpt Achtung!
Andy:
from "Work Ethic 2.0: Attention Control," by Mike Elgan, InternetNews Realtime IT News, 29 December 2008

In a world in which entire industries bet their businesses on gaining access to our attention, which value leads to better personal success: hard work or the ability to control attention?

A person who works six hours a day but with total focus has an enormous advantage over a 12-hour-per-day workaholic who’s “multi-tasking” all day, answering every phone call, constantly checking Facebook and Twitter, and indulging every interruption.

Nate:

Rather than infer that nanotechnology is safe, members of the public who learn about this novel science tend to become sharply polarized along cultural lines, according to a study conducted by the Cultural Cognition Project at Yale Law School in collaboration with the Project on Emerging Nanotechnologies. The report is published online in the journal Nature Nanotechnology.

These findings have important implications for garnering support of the new technology, say the researchers.

The experiment involved a diverse sample of 1,500 Americans, the vast majority of whom were unfamiliar with nanotechnology, a relatively new science that involves the manipulation of particles the size of atoms and that has numerous commercial applications. When shown balanced information about the risks and benefits of nanotechnology, study participants became highly divided on its safety compared to a group not shown such information.

The determining factor in how people responded was their cultural values, according to Dan Kahan, the Elizabeth K. Dollard Professor at Yale Law School and lead author of the study. “People who had more individualistic, pro-commerce values, tended to infer that nanotechnology is safe,” said Kahan, “while people who are more worried about economic inequality read the same information as implying that nanotechnology is likely to be dangerous.”

According to Kahan, this pattern is consistent with studies examining how people’s cultural values influence their perceptions of environmental and technological risks generally. “In sum, when they learned about a new technology, people formed reactions to it that matched their views of risks like climate change and nuclear waste disposal,” he said.

Andy:
from "Amazon Kindle, Just the Right Mix of Book and Nonbook," by Virginia Heffernan, NYTimes.com, 31 October 2008 :: via Alan Jacobs

Really, it’s terrible. How this prototype ever made it into production I don’t know. It’s as if its creators had never seen an iPhone. Or a Walkman, for that matter. Where have they been? And the Internet capability that the device offers (almost exclusively so you can download books and other reading material from Amazon) is so poor — its parameters so hard to determine, its browser so ungracious and inaccessible — that you’re discouraged from ever exploiting it.

At the same time, and you’d be justified in thinking I’m just seeking a silver lining to rationalize my homely new purchase (it cost $360, after all), there’s some way in which the Kindle’s weak Internet connection and elusive browser are the best parts of the machine. As I said, the Kindle feels insular and remote from the wild world of commerce and buzzing data swarms. But the fact that it’s connected to the Web sort of — it has to be, right? Or how else could I download all these books? — makes the Kindle somehow better than a book. Because while I like a few hours on an airplane, I can’t say I want to move into a locked library carrel and never visit the Internet again. And I like that the Kindle, which connects to the Web through some proprietary Amazon entity called a Whispernet, is not completely out of it. The Kindle acknowledges the Internet; it hears its clamorous demands. It just ignores those demands. For the user, this means the Kindle bestows on the contemporary reader the ultimate grace: it keeps the Internet at bay.

Andy:
from "Intuition + Money - An Aha Moment," by John Markoff, NYTimes.com, 11 October 2008

Black silicon was discovered because [Eric] Mazur started thinking outside the boundaries of the research he was doing in the late 1990s. His research group had been financed by the Army Research Office to explore catalytic reactions on metallic surfaces.

“I got tired of metals and was worrying that my Army funding would dry up,” he said. “I wrote the new direction into a research proposal without thinking much about it — I just wrote it in; I don’t know why.” And even though there wasn’t an immediate practical application, he received the financing.

It was several years before he directed a graduate student to pursue his idea, which involved shining an exceptionally powerful laser light — briefly matching the energy produced by the sun falling on the surface of the entire earth — on a silicon wafer. On a hunch, the researcher also applied sulfur hexafluoride, a gas used by the semiconductor industry to make etchings for circuits.

The silicon wafer looked black to the naked eye. But when Dr. Mazur and his researchers examined the material with an electron microscope, they discovered that the surface was covered with a forest of ultra-tiny spikes.

At first, the researchers had no idea what they had stumbled onto, and that is typical of the way many scientific discoveries emerge. Cellophane, Teflon, Scotchgard and aspartame are among the many inventions that have emerged through some form of fortunate accident or intuition.

“In science, the most exciting expression isn’t ‘Eureka!’ It’s ‘Huh?’” said Michael Hawley, a computer scientist based in Cambridge, Mass., and a board member and investor in SiOnyx.

Black silicon has since been found to have extreme sensitivity to light. It is now on the verge of commercialization, most likely first in night vision systems.

photo
from "Stunningly Intricate: Curta Mechanical Calculator," by Avi Abrams, Dark Roasted Blend, 6 September 2008
Andy: