Brain Pickings

Posts Tagged ‘software’

12 MAY, 2011

The Filter Bubble: Algorithm vs. Curator & the Value of Serendipity

By: Maria Popova

How the web gives us what we want to see, and that’s not necessarily a good thing.

Most of us are aware that our web experience is somewhat customized by our browsing history, social graph and other factors. But this sort of information-tailoring takes place on a much more sophisticated, deeper and far-reaching level than we dare suspect. (Did you know that Google takes into account 57 individual data points before serving you the results you searched for?) That’s exactly what Eli Pariser, founder of public policy advocacy group MoveOn.org, explores in his fascinating and, depending on where you fall on the privacy spectrum, potentially unsettling new book, The Filter Bubble — a compelling deep-dive into the invisible algorithmic editing on the web, a world where we’re being shown more of what algorithms think we want to see and less of what we should see.

I met Eli in March at TED, where he introduced the concepts from the book in one of this year’s best TED talks. Today, I sit down with him to chat about what exactly “the filter bubble” is, how much we should worry about Google, and what our responsibility is as content consumers and curators — exclusive Q&A follows his excellent TED talk:

“The primary purpose of an editor [is] to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.” ~ Eli Pariser


What, exactly, is “the filter bubble”?

EP: Your filter bubble is the personal universe of information that you live in online — unique and constructed just for you by the array of personalized filters that now power the web. Facebook contributes things to read and friends’ status updates, Google personally tailors your search results, and Yahoo News and Google News tailor your news. It’s a comfortable place, the filter bubble — by definition, it’s populated by the things that most compel you to click. But it’s also a real problem: the set of things we’re likely to click on (sex, gossip, things that are highly personally relevant) isn’t the same as the set of things we need to know.


How did you first get the idea of investigating this?

EP: I came across a Google blog post declaring that search was personalized for everyone, and it blew my mind. I had no idea that Google was tailoring its search results on an individual basis at all — the last I’d heard, it was showing everyone the same “authoritative” results. I got out my computer and tried it with a friend, and the results were almost entirely different. And then I discovered that Google was far from the only company that was doing this. In fact, nearly every major website is, in one way or another. (Wikipedia is a notable exception.)


In an age of information overload, algorithms are certainly better at efficiently finding the most relevant information about what we’re already interested in. But it’s human curators who point us to the kinds of things we didn’t know we were interested in until, well, until we are. How does the human element fit into the filter bubble, and what do you see as the future of striking this balance between algorithmic efficiency and curatorial serendipity?

EP: The great thing about algorithms is that, once you’ve got them rolling, they’re very cheap. Facebook doesn’t have to pay many people to edit the News Feed. But the News Feed also lacks any real curatorial values — what you’re willing to Like is a poor proxy for what you’d actually like to see or especially what you need to see. Human curators are way better at that, for now — knowing that even though we don’t click on Afghanistan much we need to hear about it because, well, there’s a war on. The sweet spot, at least for the near future, is probably a mix of both.

One interesting place this comes up is at Netflix — the basic math behind the Netflix code tends to be conservative. Netflix evaluates its recommendations by Root Mean Squared Error (RMSE, to geeks), a metric that basically measures the “distance” between the ratings it predicts and the ratings people actually give. The problem with optimizing for RMSE is that while it’s very good at predicting what movies you’ll like — generally it’s under one star off — it’s conservative. It would rather be right and show you a movie that you’ll rate a four, than show you a movie that has a 50% chance of being a five and a 50% chance of being a one. Human curators are often more likely to take these kinds of risks.
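To make the metric concrete, here is a toy sketch (not Netflix’s actual code — the ratings below are made up for illustration) of how RMSE scores a set of predicted star ratings against the ratings viewers actually gave:

```python
import math

def rmse(predicted, actual):
    """Root Mean Squared Error between predicted and actual ratings."""
    assert len(predicted) == len(actual)
    squared_errors = ((p - a) ** 2 for p, a in zip(predicted, actual))
    return math.sqrt(sum(squared_errors) / len(predicted))

# Hypothetical example: four predictions, each under one star off
predicted = [4.2, 3.5, 2.8, 4.9]
actual = [4, 4, 2, 5]
print(round(rmse(predicted, actual), 3))  # → 0.485
```

Because every error is squared, a single wildly wrong prediction (recommending a five that turns out to be a one) costs far more than several slightly wrong ones — which is exactly why a recommender tuned to minimize RMSE plays it safe.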


How much does Google really know about us, in practical terms, and — more importantly — how much should we care?

EP: That depends on how much you use Google — about me, it knows an awful lot. Just think: it’s got all of my email, so it not only has everything I’ve written to friends, it has a good sense of who I’m connected to. It knows everything I’ve searched for in the last few years, and probably how long I lingered between searching for something and clicking the link. There are 57 signals that Google tracks about each user, one engineer told me, even if you’re not logged in.

Most of the time, this doesn’t have much practical consequence. But one of the problems with this kind of massive consolidation is that what Google knows, any government that is friends with Google can know, too. And companies like Yahoo have turned over massive amounts of data to the US government without so much as a subpoena.

“Companies like Yahoo have turned over massive amounts of data to the US government without so much as a subpoena.” ~ Eli Pariser

I’d also argue there’s a basic problem with a system in which Google makes billions off of the data we give it without giving us much control over how it’s used or even what it is.


Do you think that we, as editors and curators, have a certain civic responsibility to expose audiences to viewpoints and information outside their comfort zones in an effort to counteract this algorithmically-driven confirmation bias, or are people better left unburdened by conflicting data points?

EP: In some ways, I think that’s the primary purpose of an editor — to extend the horizon of what people are interested in and what people know. Giving people what they think they want is easy, but it’s also not very satisfying: the same stuff, over and over again. Great editors are like great matchmakers: they introduce people to whole new ways of thinking, and they fall in love.


Is it possible to reconcile personalization and privacy? What are some things we could do in our digital lives to strike an optimal balance?

EP: Well, personalization is sort of privacy turned inside out: it’s not the problem of controlling what the world knows about you, it’s the problem of what you get to see of the world. We ought to have more control over that — one of the most pernicious things about the filter bubble is that mostly it’s happening invisibly — and we should demand it of the companies we use. (They tend to argue that consumers don’t care — we should let them know we do.)

On an individual level, I think it comes down to varying your information pathways. There was a great This American Life episode which included an interview with the guy who looks at new mousetrap designs at the biggest mousetrap supply company. As it turns out, there’s not much need for a better mousetrap, because the standard trap does incredibly well, killing mice 90% of the time.

The reason is simple: Mice always run the same route, often several times a day. Put a trap along that route, and it’s very likely that the mouse will find it and become ensnared.

So, the moral here is: don’t be a mouse. Vary your online routine, rather than returning to the same sites every day. It’s not just that experiencing different perspectives and ideas and views is better for you — serendipity can be a shortcut to joy.

Ed. note: The Filter Bubble is out today and one of the timeliest, most thought-provoking books I’ve read in a long time — required reading as we embrace our role as informed and empowered civic agents in the world of web citizenship.

Brain Pickings has a free weekly newsletter and people say it’s cool. It comes out on Sundays and offers the week’s best articles. Here’s an example. Like? Sign up.

06 APRIL, 2011

SubMap: Visualizing Subjective Urban Patterns

By: Maria Popova

What Twitter in Finland has to do with villages in Hungary and the solipsism of urbanity.

Maps, cities and data visualization are among our sharpest points of interest, so when the three converge, we’re swooning all over. SubMap, which we stumbled upon on the excellent new ArtsTech News aggregator, is a visualization project that flies in the face of the traditional conception of maps as static and objective representations of the public world, and instead maps the subjective personal experiences of a city’s residents.

From locals’ favorite places in Budapest to Finland’s real-time Twitter chatter to a subjective map of the city plotting the cartographers’ homes as the epicenter, the maps are living abstractions of civic sentiment, part Hitotoki, part ComplexCity, part We Feel Fine, part something else entirely.

The project’s latest iteration, SubCity 2.0: Ebullition, captures 12 years’ worth of data patterns from origo.hu, Hungary’s leading news site, not only visually but also through a sonic representation.

“In the 30 fps animation, each frame represents a single day, each second covers a month, starting from December 1998 until October 2010. Whenever a Hungarian city or village is mentioned in any domestic news on the origo.hu website, it is translated into a force that dynamically distorts the map of Hungary. The sound follows the visual outcome, creating a generative, ever-changing drone.”

SubMap is the work of Dániel Feles, Krisztián Gergely, Attila Bujdosó and Gáspár Hajdu from Hungarian new media lab Kitchen Budapest, a hub for young researchers and experimenteurs looking to explore the intersection of mobile communication, online communities and urban space.

via Creators Project


03 MARCH, 2011

TED 2011: The Rediscovery of Wonder, Day 3

By: Maria Popova

Embracing chaos, 57 things Google knows about you, and how to 3D-print a kidney.

This week, we’re reporting live from TED 2011: The Rediscovery of Wonder. So far, we warmed up with 5 must-read books by some of this year’s speakers, synthesized highlights from Day 1 and Day 2, and spotlighted an inspired urban intervention by designer and TED Fellow Candy Chang. Today, we’re back — on the brink of our sleep budget — with highlights, photos and notable soundbites from Day 3 — dig in.

Historian Edward Tenner

Culture and technology historian Edward Tenner showed statistical evidence that the greatest time for game-changing innovation in modern history was actually The Great Depression, which had a paradoxically stimulating effect on creativity. He argued that one of the grand questions of our time is how to close the gap between our capabilities and our foresight.

“Our ability to innovate is increasing geometrically but our capacity to model those innovations is linear.” ~ Edward Tenner

Tenner’s excellent 1997 book, Why Things Bite Back: Technology & the Revenge of Unintended Consequences, will change the way you think about adversity, opportunity and innovation.

Chris Anderson presenting the winners of the Ads Worth Spreading contest.

Image credit: James Duncan Davidson / TED

TED announced the 10 winners of the inaugural Ads Worth Spreading contest, seeking to reframe commercial communication from an interruption to inspiration.

Eli Pariser of MoveOn.org fame, author of the excellent forthcoming The Filter Bubble: What the Internet Is Hiding from You, delivered a stride-stopping and timely curtain-pull on our modern information diet and what we’re being force-fed by the powers of the Internet. Google, apparently, looks at 57 data points to serve us personally tailored search results.

“We’ve moved to an age where the Internet is showing us what it thinks we want to see, but not necessarily what we need to see.” ~ Eli Pariser

Which raises the question of responsibility: Is the responsibility of those who serve information to give us more of what we already like and believe, or to open our eyes to new perspectives? And if it’s all algorithmically driven, is there even a place for such responsibility? Our key takeaway from Pariser’s talk, one particularly relevant to our own credo, is that human information curators will have an increasingly important role as moral mitigators of algorithmic personalization efficiency.

Eli Pariser 'We need the new information gatekeepers to encode a sense of civic responsibility into algorithms.'

Image credit: James Duncan Davidson / TED

“We need the Internet to introduce us to different ideas and different perspectives.” ~ Eli Pariser

Virginia Tech’s Dennis Hong is building the world’s first vehicle for the visually impaired, and recently made history with the Blind Driver Challenge.

Dennis Hong presents the Blind Driver Challenge.

Image credit: James Duncan Davidson / TED

High-functioning autistic savant Daniel Tammet opened the door to his fascinating view of the world. He used synesthesia, the strange neurological crossing of the senses, as an example of how the world is often richer than we think it to be.

Daniel Tammet shows us the world through the eyes of an autistic savant.

Image credit: James Duncan Davidson / TED

Tammet’s Born On A Blue Day: Inside the Extraordinary Mind of an Autistic Savant is one of the most fascinating perspective shifts you’ll ever read.

Google's Sebastian Thrun 'We took a driverless car from San Francisco to LA, and no one even noticed there was no driver.'

Image credit: James Duncan Davidson / TED

“The idea behind the Stuxnet worm is quite simple: We don’t want Iran to get the bomb.” ~ Ralph Langner

Security consultant Ralph Langner 'Mossad is responsible for Stuxnet. But the real force behind that is not Israel, it is the only cyber force: The U.S.'

Image credit: James Duncan Davidson / TED

In one of the day’s most jaw-dropping demos, the kind that restores one’s faith in humanity, Berkeley Bionics’ Eythor Bender showcased the incredible eLEGS exoskeletons, which enable the paralyzed to walk again, and HULC, which enables ordinary people to carry up to 200 lbs. Bender was joined onstage by a soldier, who demoed HULC, and a paralyzed woman who walked for the first time in 18 years thanks to eLEGS.

Eythor Bender on stage with paraplegic Amanda Boxtel, ecstatic in her new non-invasive exoskeleton legs.

Image credit: James Duncan Davidson / TED

Biomedical engineer Fiorenzo Omenetto is developing amazing non-invasive implants made of silicon and silk.

Fiorenzo Omenetto shows a disposable cup made of silk, a biodegradable, biocompatible alternative to the highly unsustainable styrofoam.

Image credit: James Duncan Davidson / TED

There was no shortage of astounding demos today. Anthony Atala, whose work in 3D organ printing is an unbelievable next frontier in medicine, literally “printed” a kidney on the TED stage as 1,700 of the world’s smartest people gasped in awe, speechless.

Anthony Atala 'prints' a kidney to a collective gasp.

Image credit: James Duncan Davidson / TED

The remarkable papercut artist Béatrice Coron, whose stunning artwork we’ve spotted on the New York subway, echoed some of our own beliefs about combinatorial creativity:

“I’m influenced by everything I read, everything I see. In life and in paper cutting, everything is connected: One story leads to another.” ~ Béatrice Coron

Watch Coron’s creative process and swoon like we did:

Keep an eye on our live Twitter coverage and come back here tomorrow evening for highlights from the final day.
