- New at IAFF: The Ubiquitous Converse Lawvere Problem; Two Major Obstacles for Logical Inductor Decision Theory; Generalizing Foundations of Decision Theory II.
- New at AI Impacts: Guide to Pages on AI Timeline Predictions
- “Decisions Are For Making Bad Outcomes Inconsistent”: Nate Soares dialogues on some of the deeper issues raised by our “Cheating Death in Damascus” paper.
- We ran a machine learning workshop in early April.
- “Ensuring Smarter-Than-Human Intelligence Has a Positive Outcome”: Nate’s talk at Google (video) provides probably the best general introduction to MIRI’s work on AI alignment.
- Our strategy update discusses changes to our AI forecasts and research priorities, new outreach goals, a MIRI/DeepMind collaboration, and other news.
- MIRI is hiring software engineers! If you’re a programmer who’s passionate about MIRI’s mission and wants to directly support our research efforts, apply here to trial with us.
- MIRI Assistant Research Fellow Ryan Carey has taken on an additional affiliation with the Centre for the Study of Existential Risk, and is also helping edit an issue of Informatica on superintelligence.
News and links
- DeepMind researcher Victoria Krakovna lists safety highlights from ICLR.
- DeepMind is seeking applicants for a policy research position “to carry out research on the social and economic impacts of AI”.
- The Center for Human-Compatible AI is hiring an assistant director. Interested parties may also wish to apply for the event coordinator position at the new Berkeley Existential Risk Initiative, which will help support work at CHAI and elsewhere.
- 80,000 Hours lists other potentially high-impact openings, including ones at Stanford’s AI Index project, the White House OSTP, IARPA, and IVADO.
- New papers: “One-Shot Imitation Learning” and “Stochastic Gradient Descent as Approximate Bayesian Inference.”
- The Open Philanthropy Project summarizes its findings on early field growth.
- The Centre for Effective Altruism is collecting donations for the Effective Altruism Funds in a range of cause areas.