| Organization | Title | Start date | End date | Employment type | Source | Notes |
|---|---|---|---|---|---|---|
| Center for Human-Compatible AI | Spotlighted Student | 2019-01-19 | 2020-12-03 | graduate student | [1], [2] | |
| Center for Human-Compatible AI | Alumnus | 2022-01-22 | | graduate student | [3], [4] | |
| 80,000 Hours | External Advisor | 2023-09-25 | | advisor | [5], [6] | |
| Longview Philanthropy | Advisor | 2023-11-28 | | advisor | [7], [8], [9] | Research Scientist at Google DeepMind |

| Name | Creation date | Description |
|---|---|---|
| Clarifying some key hypotheses in AI alignment | 2019-08-15 | With Ben Cottier. A diagram collecting several hypotheses in AI alignment and their relationships to existing research agendas. |

| Title | Publication date | Author | Publisher | Affected organizations | Affected people | Document scope | Cause area | Notes |
|---|---|---|---|---|---|---|---|---|

| Title | Publication date | Author | Publisher | Affected organizations | Affected people | Affected agendas | Notes |
|---|---|---|---|---|---|---|---|
| AI Alignment Podcast: An Overview of Technical AI Alignment with Rohin Shah (Part 2) | 2019-04-25 | Lucas Perry | Future of Life Institute | | Rohin Shah, Dylan Hadfield-Menell, Gillian Hadfield | Embedded agency, Cooperative inverse reinforcement learning, inverse reinforcement learning, deep reinforcement learning from human preferences, recursive reward modeling, iterated amplification | Part two of a podcast episode that goes into detail about some technical approaches to AI alignment. |
| AI Alignment Podcast: An Overview of Technical AI Alignment with Rohin Shah (Part 1) | 2019-04-11 | Lucas Perry | Future of Life Institute | | Rohin Shah | iterated amplification | Part one of an interview with Rohin Shah that covers some technical agendas for AI alignment. |

The table below shows at most 20 people who are most similar in terms of the organizations they have worked at; a minimal sketch of how this similarity count can be computed appears after the table.

| Person | Number of organizations in common | List of organizations in common |
|---|---|---|
| Hilary Greaves | 2 | 80,000 Hours, Longview Philanthropy |
| Claire Zabel | 2 | 80,000 Hours, Longview Philanthropy |
| Neel Nanda | 2 | Center for Human-Compatible AI, Longview Philanthropy |
| Paul Christiano | 1 | 80,000 Hours |
| Owen Cotton-Barratt | 1 | 80,000 Hours |
| Niel Bowerman | 1 | 80,000 Hours |
| Alice De Gennaro | 1 | 80,000 Hours |
| Arden Koehler | 1 | 80,000 Hours |
| Aric Floyd | 1 | 80,000 Hours |
| Balázs Rapi | 1 | 80,000 Hours |
| Bella Forristal | 1 | 80,000 Hours |
| Brenton Mayer | 1 | 80,000 Hours |
| Callum Evans | 1 | 80,000 Hours |
| Chana Messinger | 1 | 80,000 Hours |
| Conor Barnes | 1 | 80,000 Hours |
| Daniel Dewey | 1 | 80,000 Hours |
| Eli Nathan | 1 | 80,000 Hours |
| Eve McCormick | 1 | 80,000 Hours |
| Huon Porteous | 1 | 80,000 Hours |
| Huw Thomas | 1 | 80,000 Hours |
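
The ranking above appears to be a simple count of shared organizations. The following is a minimal sketch, assuming that similarity is computed as set intersection over each person's list of organizations; the function name `rank_similar_people` and the sample inputs are illustrative assumptions, not the site's actual code.

```python
def rank_similar_people(target_orgs, people_to_orgs, limit=20):
    """Rank other people by how many organizations they share with the target person.

    target_orgs: set of organization names for the person this page is about.
    people_to_orgs: dict mapping each other person's name to their set of organizations.
    Returns up to `limit` (person, count, shared organizations) tuples, most similar first.
    """
    rankings = []
    for person, orgs in people_to_orgs.items():
        shared = sorted(target_orgs & orgs)
        if shared:  # only include people with at least one organization in common
            rankings.append((person, len(shared), shared))
    # Sort by number of shared organizations (descending), then by name to break ties.
    rankings.sort(key=lambda row: (-row[1], row[0]))
    return rankings[:limit]


# Illustrative inputs drawn from the tables above (not a complete dataset).
target = {"Center for Human-Compatible AI", "80,000 Hours", "Longview Philanthropy"}
others = {
    "Hilary Greaves": {"80,000 Hours", "Longview Philanthropy"},
    "Neel Nanda": {"Center for Human-Compatible AI", "Longview Philanthropy"},
    "Paul Christiano": {"80,000 Hours"},
}
for person, count, shared in rank_similar_people(target, others):
    print(f"{person}: {count} in common ({', '.join(shared)})")
```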