Centre for Study of Existential Risk

  • Centre for Study of Existential Risk

    ----- ahh, now Google grasps this before I can even finish editing! -----


    New Cambridge research centre for "extinction-level" risks

    A philosopher, a scientist and a software engineer have come together to propose a new centre at Cambridge to address developments in human technologies that ...




    Centre for Study of Existential Risk (CSER). Founders:
    Huw Price, Bertrand Russell Professor of Philosophy
    Jaan Tallinn, former software engineer, one of the founders of Skype
    Lord Martin Rees, former Master of Trinity College and President of
    the Royal Society




    What better place than Cambridge?
    cyberspace
    online forum: Less Wrong





    'Pandora's Box' Moment: Some Highly Advanced Technologies
    May Pose A Serious Threat To Our Species, Scientists Say
    17 September, 2013



    -----Co-founders
    Huw Price Bertrand Russell Professor of Philosophy, Cambridge
    Martin Rees Emeritus Professor of Cosmology & Astrophysics, Cambridge
    Jaan Tallinn Co-founder of Skype

    -----Cambridge advisors
    David Cleevely Founding Director, Centre for Science and Policy
    Tim Crane Knightbridge Professor of Philosophy
    Robert Doubleday Executive Director, Centre for Science and Policy
    Hermann Hauser Co-founder, Amadeus Capital Partners
    Jane Heal Emeritus Professor of Philosophy
    Sean Holden Senior Lecturer, Computing Laboratory; Fellow of Trinity College
    David Spiegelhalter Winton Professor of the Public Understanding of Risk

    -----External advisors
    Nick Bostrom Professor of Philosophy, Future of Humanity Institute, Oxford
    David Chalmers Professor of Philosophy, NYU & ANU
    George M Church Professor of Genetics, Harvard Medical School
    Dana Scott Emeritus Professor of Computer Science, Philosophy & Mathematical Logic,
    Carnegie Mellon University
    Murray Shanahan Professor of Cognitive Robotics, Imperial College, London
    Max Tegmark Professor of Physics, MIT
    Jonathan B Wiener Professor of Law, Environmental Policy & Public Policy, Duke University

    I'm interested in expert panflu damage estimates
    my current links: http://bit.ly/hFI7H ILI-charts: http://bit.ly/CcRgT

  • #2
    Re: Centre for Study of Existential Risk





    (Jan. 2012)
    Of course: FHI, FutureTech, the Singularity Institute, and Leverage Research.
    New: the Global Catastrophic Risk Institute (Seth Baum & Tony Barrett).
    I've also heard that the following people are working to set up x-risk departments/organizations:
    Huw Price at Cambridge
    Newton Howard at MIT
    Jeffrey Epstein

    > My guess is that you can purchase the most x-risk reduction by donating to
    > either SI or FHI; the other orgs either don't exist yet or don't have much of a
    > track record yet.

    FHI : http://www.fhi.ox.ac.uk/
    The Future of Humanity Institute is a multidisciplinary research institute
    at the University of Oxford. It enables a select set of leading intellects
    to bring the tools of mathematics, philosophy, and science to bear on
    big-picture questions about humanity and its prospects.
    The Institute belongs to the Faculty of Philosophy and is affiliated
    with the Oxford Martin School


    The Oxford Martin Programme on the Impacts of Future Technology,
    launched in September 2011, is an interdisciplinary horizontal Programme within
    the Oxford Martin School in collaboration with the Faculty of Philosophy at Oxford
    University. The Programme, which is directed by Professor Nick Bostrom, works
    closely with the Future of Humanity Institute; the Institute for the Future of Computing,
    the Oxford University Computing Laboratory (Professor Bill Roscoe) and the Oxford
    e-Research Centre (Professor Anne Trefethen); the Institute for Science and Ethics
    (Professor Julian Savulescu); and other Oxford Martin School Institutes.
    The Oxford Martin Programme on the Impacts of Future Technology analyzes
    possibilities related to long-range technological change and the potential social
    impacts of future transformative technologies. Research foci include issues
    related to the future of computing, existential risks, and methodology, including
    the following areas: Changing rates of change; Automation and complexity barriers;
    Machine intelligence capabilities and safety; Novel applications and unexpected
    societal impacts: Predictability horizons; and Existential risks and future
    technologies.

    MIRI's artificial intelligence research is focused on developing the mathematical theory of trustworthy reasoning for advanced autonomous AI systems.

    MIRI's mission is to ensure that the creation of smarter-than-human intelligence
    has a positive impact.

    Leverage conducts foundational research and supports inspired teams — applying our understanding of people, science, history, and research to identify and advance breakthrough areas in science and technology.

    Leverage Research develops high-impact technologies to create a
    better future. We conduct research, launch businesses, build
    student group networks, and more.
    Our Principles We test projects in parallel to discover and implement
    the highest-value ones. Our principles guide the selection of these projects.

    GCRI is a nonprofit and nonpartisan think tank that analyzes the risk of events that could significantly harm or even destroy human civilization at the glob ...

    (webpage produces strange delays in my browser)
    GCRI studies the breadth of major GCRs: nuclear warfare, climate change, pandemics,
    artificial intelligence, and more. We focus on big questions such as the most effective
    ways of reducing GCR.
    GCRI's research is nonpartisan, non-ideological, transdisciplinary, and held to
    the highest academic standards. The research covers the breadth of GCR topics
    and welcomes contributions from any perspective. Researchers or other
    professionals who are interested in getting involved should contact Executive
    Director Seth Baum (seth [at] gcrinstitute.org).


    Post Date: July 01, 2013
    U.S. Secretary of Health and Human Services Kathleen Sebelius speaking to
    the World Health Assembly image courtesy of US Mission Geneva/Eric Bridiers.
    ----------------------------
    On Tuesday 12 June, GCRI hosted an online lecture by Catherine Rhodes entitled
    "Sovereign Wrongs: Ethics in the Governance of Pathogenic Genetic Resources".


    This timely study will be of interest to students and academics concerned with the management of genetic resources and its connection to issues such as intellectual property rights, biodiversity conservation and food security. It will appeal strongly t ...


    Baum, Seth and Grant Wilson. The ethics of global catastrophic risk from dual-use
    bioengineering. Ethics in Biology, Engineering and Medicine, forthcoming, DOI:
    10.1615/EthicsBiologyEngMed.2013007629.
