Research Grants

2010 Singularity Research Challenge

Offering unusually good philanthropic returns — meaning greater odds of a positive Singularity and lesser odds of human extinction — the Singularity Institute has launched a new challenge campaign. The sponsors, Edwin Evans, Rolf Nelson, Henrik Jonsson, Jason Joachim, and Robert Lecnik, have generously put up $100,000 of matching funds, so that every donation you make until February 28th will be matched dollar for dollar. If the campaign is successful, it will raise a full $200,000 to fund SIAI's 2010 activities.

With the start of this campaign, we've put more details of our ongoing and potential work online than ever before, so you can get an overview of the projects we're pursuing and easily fund the proposals you support most. For more, see our grants below.

Our Goals

For almost a decade, the Singularity Institute has been asking questions about the future of human civilization: How can we benefit from increasingly powerful technology without succumbing to the risks, up to and including human extinction? What is the best way to handle artificial general intelligence (AGI): programs as smart as humans, or smarter?

Among SIAI's core aims is to continue studying "Friendly AI": AI that acts benevolently because it holds goals aligned with human values. This involves drawing on and contributing to fields like decision theory, computer science, cognitive and moral psychology, and technology forecasting.

Creating AI, especially the Friendly kind, is a difficult undertaking. We're in it for as long as it takes, but we've been doing more than laying the groundwork for Friendly AI. We've been raising the profile of AI risk and Singularity issues in academia and elsewhere, forming communities around enhancing human rationality, and researching other avenues that promise to reduce the most severe risks most effectively.

How Earmarked Donations Work

If you make a donation to the Singularity Institute, you can choose which grant proposal your donation should help to fill. Any time a grant proposal is fully funded, it goes into our "active projects" file: it becomes a project that we have enough money to fund, and that we are publicly committed to funding. (Some of the projects will go forward even without earmarked donations, with money from the general fund — but many won't, and since our work is limited by how much money we have available to support skilled staff and Visiting Fellows, more money allows more total projects to go forward.)

Any money remaining in partially funded grants on February 28 (at the close of the Challenge Campaign) will be returned to the general fund.

Recent Achievements

We have put together a document to inform supporters of our 2009 achievements. The bullet-point version:

  • Singularity Summit 2009, which received extensive media coverage and positive reviews.
  • The hiring of new employees: President Michael Vassar, Research Fellows Anna Salamon and Steve Rayhawk, Media Director Michael Anissimov, and Chief Compliance Officer Amy Willey.
  • Founding of the Visiting Fellows Program, which hosted 14 researchers during the Summer and is continuing to host Visiting Fellows on a rolling basis, including graduate students and degree-holders from Stanford, Yale, Harvard, Cambridge, and Carnegie Mellon.
  • Nine presentations and papers given by SIAI researchers across four conferences, including the European Conference on Computing and Philosophy, the Asia-Pacific Conference on Computing and Philosophy, a Santa Fe Institute conference on forecasting, and the Singularity Summit.
  • The founding of the Less Wrong web community, to "systematically improve on the art, craft, and science of human rationality" and provide a discussion forum for topics important to our mission. Some of the decision theory ideas generated by participants in this community are being written up for academic publication in 2010.
  • Research Fellow Eliezer Yudkowsky finished his posting sequences at Less Wrong. Yudkowsky used the blogging format to write the substantive content of a book on rationality and to communicate to non-experts the kinds of concepts needed to think about intelligence as a natural process. Yudkowsky is now converting his blog sequences into the planned rationality book, which he hopes will help attract and inspire talented new allies in the effort to reduce risk.
  • Throughout the Summer, Eliezer Yudkowsky engaged in Friendly AI research with Marcello Herreshoff, a Stanford mathematics student who previously spent his gap year as a Research Associate for the Singularity Institute.
  • In December, a subset of SIAI researchers and volunteers finished improving The Uncertain Future web application and officially announced it as a beta version. The Uncertain Future represents a kind of futurism rarely applied to Artificial Intelligence — futurism with heavy-tailed, high-dimensional probability distributions.
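
The page doesn't detail The Uncertain Future's internals, but the style of futurism the last bullet describes (propagating heavy-tailed uncertainty through a model rather than picking point estimates) can be sketched in a few lines of Python. Everything below is an illustrative assumption, not SIAI's actual model: the lognormal distributions, the parameters, and the toy effort-versus-rate structure are all hypothetical.

```python
import random

def simulate_agi_arrival(n=10_000, seed=0):
    """Toy Monte Carlo futurism with heavy-tailed uncertainty.

    Draws a lognormal (heavy-tailed) belief over how much research
    effort AGI requires and over how fast effort accumulates, then
    reports the median and 90th-percentile simulated arrival year.
    All parameters are made up for illustration only.
    """
    rng = random.Random(seed)
    years = []
    for _ in range(n):
        required_effort = rng.lognormvariate(10.0, 2.0)  # arbitrary units
        effort_per_year = rng.lognormvariate(5.0, 1.0)   # units per year
        years.append(2010 + required_effort / effort_per_year)
    years.sort()
    return years[n // 2], years[int(0.9 * n)]

median_year, p90_year = simulate_agi_arrival()
# With heavy tails, the 90th percentile lands far later than the median,
# which is exactly the structure a single point estimate throws away.
```

A single "best guess" year would hide the spread between the median and the tail; reporting the full distribution is the point of the approach.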

Grant Proposals

For our future work, we have prepared the following grant proposals. Each of these requires funding from individual donors — possibly you. Select a grant below to put your money toward:

Total raised so far: $100,000 ($0 of matching funds remaining)
General Fund: $78,455

Or, if you'd like to jump-start efforts to reduce human extinction risk other than the projects above, and if you're interested in donating $1,000 or more — email Anna Salamon, at annasalamon at singinst dot org. We'd love to hear your thoughts, and to work with you to create an effective new project.

Donor Comments

  • "Good stuff SIAI, keep up the great work." - Adam Ford

  • "We need to sort out the mess that is human morality before any attempt to make AGI is successful. The complexity and subtlety of human wishes should be common knowledge in the entire AGI community." - Johan Edström

  • "Many groups admirably aim to reduce suffering, but SIAI is one of the few that seriously examines the most cost-effective ways of doing so in light of physics, computation, and cognitive science." - Brian Tomasik

  • "Work related to the singularity is humanity's greatest hope to solve the unignorable problems of our world. I donate for our future." - Diane Parish

  • "Great to see the articles, conference talks, and new plans!" - Joshua Fox

  • "The SIAI may be the only thing that saves humanity." - Robin Powell

  • "Here's to the future. May it be better than the past." - Paul Gentemann

  • "If I had to donate to a single non-profit organization, this would be it! The work being done here will undoubtedly greatly benefit this and future generations." - L'Emir-Nader Chehab

  • "Working on Friendly AI is solving every human problem by other means." - Scott Dickey

  • "Of the many organizations trying to do good, SIAI has probably most rigorously considered how to really maximize good." - Alex Edelman

  • "A donation to the SIAI should be part of every investment portfolio." - Arthur Breitman

Donate now, and seize a better-than-usual chance to move our work forward. Credit card transactions are securely processed through PayPal. Select an amount, and your credit card number will be requested at the next screen.
