Kendrea Beers

Organisation
Oregon State University
Biography

Why do you care about AI Existential Safety?

Progress in AI existential safety would enable us to deploy beneficial AI systems responsibly.

For better or worse, AI technologies affect the causes that you and I care about, from animal welfare to the simple desire not to die in an AI-exacerbated nuclear war or pandemic. Advanced AI systems could improve lives in countless ways: curing diseases, mitigating climate change, and so on. But I can all too easily imagine futures in which AI systems make money for a reckless few at the risk of civilization-scale catastrophes for everyone else. With safety guarantees in place, responsible actors, too, could benefit from advanced systems.

I'm always looking for ways to contribute to the interdisciplinary project of aligning AI to benefit the entire world, in all its complexity and diversity. I believe that it is crucial to draw upon the insights of many types of experts, including AI researchers, philosophers, scientists, policymakers, religious thinkers, engineers, and ordinary people, to robustly and holistically mitigate risks.

Please give at least one example of your research interests related to AI existential safety:

As of March 2023, I study inverse reinforcement learning and deontic logic. I'm working toward ensuring that a learned reward function will not incentivize forbidden patterns of behavior, even in adversarially generated scenarios.
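The goal described above can be made concrete with a toy sketch: given a reward function recovered by inverse reinforcement learning and a deontic constraint marking certain state-action pairs as forbidden, check whether the learned reward ever makes a forbidden action strictly more attractive than every permitted alternative. Everything here (`learned_reward`, `is_forbidden`, the states and actions) is a hypothetical illustration, not code from this research.

```python
# Hypothetical sketch: auditing a learned reward against a deontic constraint.
# All names and rules are illustrative assumptions, not an actual method.

def learned_reward(state, action):
    # Stand-in for a reward function recovered via inverse RL.
    # Toy rule: larger actions earn more reward, regardless of state.
    return action

def is_forbidden(state, action):
    # Toy deontic constraint: actions above a threshold are
    # forbidden in "risky" states.
    return state == "risky" and action > 5

def violates_constraint(states, actions):
    """Return (state, action) pairs where a forbidden action earns
    strictly more reward than every permitted alternative, i.e. the
    learned reward incentivizes forbidden behavior."""
    violations = []
    for s in states:
        permitted = [a for a in actions if not is_forbidden(s, a)]
        if not permitted:
            continue  # no permitted alternative to compare against
        best_permitted = max(learned_reward(s, a) for a in permitted)
        for a in actions:
            if is_forbidden(s, a) and learned_reward(s, a) > best_permitted:
                violations.append((s, a))
    return violations

print(violates_constraint(["safe", "risky"], range(10)))
```

In this toy setup the check flags the forbidden actions 6 through 9 in the "risky" state, since the learned reward ranks them above every permitted action. An adversarial version of this idea would search for such scenarios rather than enumerate them.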
