
Scott Niekum

Position: Associate Professor
Organisation: UMass Amherst
Biography

Why do you care about AI Existential Safety?

One of my concerns about the rapid pace of current AI development and deployment is that it is unlikely that all alignment issues, safety concerns, and societal harms can be mitigated with post-hoc solutions. Rather, I believe that many of these issues are best addressed while AI systems are being designed, for example through the use of risk-aware learning techniques or models that are explicitly constructed to support auditing. Thus, it is critical to begin research into such techniques and technological building blocks now, rather than react to problems as they arise; by the time many harms manifest, it will be too late to change the basic technologies they are built upon or the entrenched community practices that lead to them.

Please give at least one example of your research interests related to AI existential safety:

My work focuses on both theoretical and practical aspects of AI alignment and risk-aware learning. I'm particularly interested in reward inference from human preferences, probabilistic performance guarantees in (inverse) reinforcement learning settings, and efficient verification of agent alignment.
