July 2012 Newsletter

This newsletter was sent out to Singularity Institute newsletter subscribers in July 2012

Greetings from the Executive Director

Luke Muehlhauser

Friends of the Singularity Institute,

Greetings! Our new monthly newsletter will bring you the latest updates from the Singularity Institute (SI). (You can read earlier monthly progress updates here.)

These are exciting times at SI. We just launched our new website, and also the website for the Center for Applied Rationality. We have several research papers under development, and after a long hiatus from AI research, researcher Eliezer Yudkowsky is planning a new sequence of articles on “Open Problems in Friendly AI.”

We have also secured $150,000 in matching funds for a new fundraising drive. To help support us in our work toward a positive Singularity, please donate today and have your gift doubled!

Luke Muehlhauser
Singularity Institute Executive Director

Donate Today to Double Your Impact!

Our summer 2012 matching drive is on! Now is your chance to double your impact while helping us raise up to $300,000 to help fund our research program and stage the upcoming Singularity Summit… which you can register for now!

Since we published our strategic plan in August 2011, we have achieved most of the near-term goals outlined therein.

In the coming year, the Singularity Institute plans to do the following:

  • Hold our annual Singularity Summit, this year in San Francisco! Speakers this year include Ray Kurzweil, Steven Pinker, Tyler Cowen, Temple Grandin, Peter Norvig, Robin Hanson, and Vernor Vinge.
  • Spin off the Center for Applied Rationality as a separate organization focused on rationality training, so that the Singularity Institute can be focused more exclusively on Singularity research and outreach.
  • Publish additional research on AI risk and Friendly AI.
  • Publish Eliezer’s “Open Problems in Friendly AI” sequence on Less Wrong.
  • Finish Facing the Singularity and publish ebook versions of Facing the Singularity and The Sequences, 2006-2009.
  • And much more! For details on what we might do with additional funding, see How to Purchase AI Risk Reduction.

We appreciate your support for our high-impact work! Donate now and seize a better-than-usual chance to move our work forward.

Eliezer to Write “Open Problems in Friendly AI”

After a long hiatus from AI research to help with movement-building, Eliezer Yudkowsky is now planning a sequence of articles on open problems in Friendly AI research. These articles will help to explain the technical research that can be done today to help ensure a positive Singularity. The articles will initially be published at Less Wrong, where his earlier sequences of articles were published.

What about Eliezer’s in-progress rationality books? They are on hold for now while Eliezer works on other projects. We have signed a retainer with a professional author who has written at least one best-selling science book, and he will work on Eliezer’s books once he finishes his current project, probably late this fall.

Visit Our New Website

SI Media Director Michael Anissimov and many others have worked hard to create the new look and feel of our web presence at Singularity.org. The site has been reorganized into six top-level pages: About, What We Do, Research, Media, Get Involved, and Donate.

The single greatest update is to our research page and our research papers, which are now organized into three categories. Most of our research papers have been ported to a clean new template, and every reference has been manually checked and updated.

We’ve also created a transparency page which features Q&As, policy documents, and our tax forms dating back to our founding in 2000.

For casual reading, there is also a new tech summaries page that features short articles on emerging technologies such as the Berkeley Brain-Computer Interface, regenerative medicine, and AI in automated science and discovery.

Be sure to subscribe to our blog for regular updates!

Center for Applied Rationality (CFAR)

Per our August 2011 strategic plan, SI is helping to launch a separate organization devoted to rationality skills training. That organization is the Center for Applied Rationality (CFAR), which was recently granted 501(c)(3) status and has a new website at AppliedRationality.org.

In an age when our economic, political, and technological choices can spark both amazing progress and unprecedented devastation, it is crucial that decision-makers are not only aware of the many near-universal cognitive biases, but also well trained in avoiding them, and in using thinking habits based in probability and logic that outperform our brains’ innate algorithms. That is the goal of CFAR: to turn decades of research in cognitive science into a set of practical techniques people can actually use to become better at reasoning and making decisions.

To think rationally about the future, we must be able to weigh different levels of risk and uncertainty, compare expected values, avoid over-weighting short-term outcomes at the expense of the long term, avoid the “narrative fallacy” (in which vividly imagined outcomes appear more likely), have well-calibrated levels of confidence in our judgments, and think clearly even about emotionally fraught decisions — and much, much more.
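To make the expected-value comparison mentioned above concrete, here is a small illustrative sketch; the two options and all the numbers are invented for this example and are not drawn from CFAR’s material:

```python
# Compare two hypothetical options by expected value: a vivid,
# near-certain small gain versus an unlikely large one.

def expected_value(outcomes):
    """Sum payoff * probability over (payoff, probability) pairs."""
    return sum(payoff * prob for payoff, prob in outcomes)

# Option A: 90% chance of gaining $100, otherwise nothing.
option_a = [(100, 0.90), (0, 0.10)]
# Option B: 5% chance of gaining $3000, otherwise nothing.
option_b = [(3000, 0.05), (0, 0.95)]

print(expected_value(option_a))  # 90.0
print(expected_value(option_b))  # 150.0
```

Although Option A is easier to vividly imagine paying off, Option B has the higher expected value; the narrative fallacy is exactly the pull of intuition toward A.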

CFAR is devoted to teaching those techniques, and the math and science behind them, to adults and exceptional youth. In the process, CFAR will be breaking new ground in studying the long-term effects of rationality on life outcomes using randomized controlled trials, to help us improve our material and to contribute to the body of knowledge in applied rationality. And we’ll be building a real-life community of tens of thousands of students, entrepreneurs, researchers, programmers, philanthropists, and other people who are passionate about using rationality to improve their decisions for themselves and for the rest of the world.

Read more about what CFAR does, and send your friends to the new What is Rationality? page, which may now be the best short introduction to rationality available.

Featured Donor: Jesse Liptrap

Each month, our newsletter will share the views of a featured donor.

Jesse Liptrap, a bioinformatician at the University of California, Berkeley, was led to SI’s broader community via a Less Wrong meetup in spring 2009. With a longstanding interest in transhumanism and the philosophy of moving beyond the frailty of human hardware, Jesse took quickly to the idea that applied rationality offers a set of tools for more accurately assessing the risks and opportunities of powerful emerging technologies.

Over the past few years, Jesse has climbed our Top Donors list. He also serves as SI’s non-staff Treasurer.

Featured Summit Video

This month we are featuring a video from the 2010 Singularity Summit: Anna Salamon’s “How Much it Matters to Know What Matters: A Back of the Envelope Calculation.” Anna’s talk shows that research about the Singularity has extremely high value of information, and should therefore be supported over and above many other kinds of research currently being conducted.
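For readers unfamiliar with the value-of-information framing, here is a generic back-of-envelope sketch of the idea; the setup and numbers are hypothetical and are not taken from Anna’s talk:

```python
# Value of information: how much better we expect to do after learning
# something, versus acting on our current beliefs. Hypothetical numbers.

# Suppose two research directions exist and only one will matter,
# but we currently have no idea which. Prior: 50/50.
p_correct_by_guessing = 0.5
payoff_if_correct = 1.0   # normalized value of backing the right one

value_without_info = p_correct_by_guessing * payoff_if_correct  # 0.5
value_with_info = payoff_if_correct  # knowing the answer, we always choose right

value_of_information = value_with_info - value_without_info
print(value_of_information)  # 0.5, i.e. half the total stakes
```

On this toy model, simply learning which direction matters is worth half the entire payoff, which is the sense in which prioritization research can carry very high value of information.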

$1000 Prize for Best AGI Safety Paper

This year’s AGI-12 conference will include a special track on the impacts of artificial general intelligence, called “AGI Impacts.”

The Singularity Institute is sponsoring a $1000 prize for the best AGI safety contribution to AGI-12 and AGI Impacts 2012. The winner will be decided by a jury from SI and announced by Louie Helm at the end of the AGI Impacts conference.

The award is given in honor of Alan Turing, who not only discovered some of the key ideas in machine intelligence, but also grasped its importance, writing that “…it seems probable that once [human-level machine thinking] has started, it would not take long to outstrip our feeble powers… At some stage therefore we should have to expect the machines to take control…”

The prize is awarded for work that not only increases awareness of this important problem, but also makes technical progress in addressing it.

The deadline for paper submission is August 31st. For details, see the AGI-12 Call for Papers.

Ioven Fables Hired

The Singularity Institute has hired Ioven Fables to help with organizational operations and development. Ioven is a businessperson with a long dedication to big-picture understanding, philosophy, and the future. He has worked for years at the business end of technology and holds a B.A. in Philosophy from Boston College.

How to Get Involved

Would you like to be more involved in our work at the Singularity Institute? There are so many opportunities available that there may be a role for everyone. Here’s what you could do:

  • Donate. There are projects we’d like to launch, and people we’d like to hire, as soon as we can raise the funds to do so. Financial support is perhaps the clearest, most direct way to contribute to our mission.
  • Volunteer. We have dozens of opportunities for skilled volunteering work. Visit our volunteering page to see how you can help.
  • Share your expertise. Do you have expertise in economics, math, computer science, cognitive science, physics, law, non-profit development, marketing, event planning, executive coaching, or publishing? Please sign up as a Singularity Institute Volunteer Advisor!
  • Intern with us. Apply to work with us in Berkeley as an intern.
  • Apply to be a visiting fellow. Visiting fellows work with us for short periods on research projects. Gain valuable experience and work directly with our researchers!
  • Apply for a job. We are currently seeking research fellows, a communications director, and a grants manager. You can also apply to be a remote researcher, a job you can do from anywhere in the world!

Thank you for your continued interest and support. Don’t hesitate to get in contact with me at louie@singularity.org.

Louie Helm
Singularity Institute Director of Development

