Over the past 200 years, humanity has made tremendous progress, and the last few decades have seen unimaginable technological developments. The earth may exist for another few billion years, and billions of people and other living beings can still be born. That is, if we take responsibility and care for their future.
The social philosopher Roman Krznaric urges us to be good ancestors. The way we live today, and its consequences for the climate, biodiversity, and the availability of food, energy sources and raw materials, reflects short-term thinking, as if it does not matter what we leave behind.
The future of our world requires long-term thinking. Only then can we do as much as possible to prevent potential disasters that threaten life on our planet.
Risks such as the abuse of AI are urgent and important, yet they still receive far too little attention because of short-term thinking. Take, for example, the risk of a pandemic: although scientists had been warning about this for a long time, almost the entire world was unprepared.
There are many ways to reduce the risk of humanity and other life becoming extinct:
- better policies on nuclear conflict;
- precautionary measures against pandemics;
- prevention of risks from artificial intelligence, biotechnology and bio-weapons;
- institutions and governments that take rational decisions with a view to the longer term;
- projects that make our society more resilient to major disasters.
In short, donating to organisations that work in this field is a positive step. The selected organisations are listed below.
| Logo | Information | Recommendation | ANBI / RSIN |
| --- | --- | --- | --- |
| | Effective Altruism Funds: Long-Term Future is mainly focused on potential risks from advanced artificial intelligence, nuclear conflict and contagious diseases, but is open to supporting projects which address other global catastrophic risks. | Manual | |
| | The Johns Hopkins Center for Health Security works to protect global public health. Its key focus points are infectious diseases, pandemic risk, natural disasters and deliberate biological threats. | Founders Pledge | |
| | The Forethought Foundation for Global Priorities Research aims to determine how individuals and institutions should spend their limited resources in order to improve the world as much as possible (through the Centre for Effective Altruism). | Manual | |
| | The Global Catastrophic Risk Institute is an impartial think tank working on reducing the risk of events that are capable of significantly damaging or even destroying human society. | Giving What We Can | |
| | The Machine Intelligence Research Institute carries out mathematical research to ensure that smarter-than-human artificial intelligence has a positive impact. | Open Philanthropy Project | |