11 June 2020

Saving Humanity

Saving Humanity – Norma


The article in The Independent, “Will the human race become extinct? Almost certainly”* by Nick Longrich, covers most of the emotional and technical information about our topic. A simpler approach, however, is the table in the Wikipedia article “Global catastrophic risk”,** which lists risks and estimated probabilities of human extinction before 2100 and originally appeared in a paper by the Future of Humanity Institute. It seems that humanity probably does not need saving in the short term, but as John Maynard Keynes said, “In the long run we are all dead” (WikiQuotes).


The question for us is: what has this topic got to do with philosophy? If we look at the risk table, seven of the nine risks mentioned involve human-made causes. Admittedly the list is a few years old, but some of the human-made risks are not well known even today. People still protest against nuclear weapons, yet I have never heard of protests to ban nanotechnology, even though I have read quite a few articles on its dangers. What is curious is that here we are, a dozen years later, in the midst of a natural pandemic. Thankfully, it does not seem that the coronavirus is about to wipe out the human race, although admittedly these things take their time.


So maybe we feel obliged to treat the question as legitimate philosophical ground by virtue of the human connection with a high proportion of extinction risks. Of course, just because a risk is high it does not follow that it will be the one to kill us all. Nor are the risks time sensitive, even if the list runs to 2100: indeed, a few years ago we were all scared of nuclear terrorism, whereas nuclear war was more a subject of the 1970s and 1980s.


The other thing you will notice is that this short list does not include the popular causes of extinction, such as an impact event from space, aliens, the moon crashing into the Earth, or an invasion of ants. As the Wikipedia article points out, though, there may have been some bias in preparing the list, mainly towards climate change; indeed the article is clear that this is not exactly an objective list. Be that as it may, the list was the product of expert opinion at a 2008 conference on global catastrophic risk.


We might be tempted to argue that what the list shows us, or any discussion of the topic, is that humans are more likely to cause our own extinction, and that there is, therefore, a moral or ethical duty to reconsider our actions. The problem is that moral or ethical pursuits are thankless and often futile tasks. If someone has a dogmatic grudge against humanity, it is going to be very hard to persuade them not to detonate a nuclear bomb.


The traditional idea in applied ethics of putting the onus on the actor to act morally has not been the most successful of projects. If people have to be told not to do something, such as explode the bomb, then the system has already failed at least that person. Happily, the majority of human beings on Earth do not want to explode a nuclear bomb for their beliefs.


Hence, I would argue that our morality should not be based on telling people not to do bad acts, but rather on society acting in such a way that people do not feel the need to act badly. But before we can reach that goal we have to overcome certain instincts, such as greed and envy.


A common mistake people make is what I might describe as a language-epistemological error. What I mean by this is that we hear a term or a concept, and have some information about it, but we do not know enough of the real story to ask the relevant questions. Even today we would be lucky to find someone at random who knows a decent argument against nanotechnology: most sunscreens today use nanotechnology, and hence carry a high risk of marine pollution. But if we asked someone for an argument about nuclear energy, the arguments would be quite varied: radiation, the bomb, pollution, cancer and so on. Not many people, however, would point to the lack of a financial budget to maintain a nuclear plant safely, or to whether employees are paid enough to be well qualified and responsible in their work. This is borne out by the nuclear accidents of the past few decades. A more serious objection is the storage of radioactive material.


Our number one enemy, therefore, is not necessarily the technology itself, but mainly that: we do not know what questions to ask to hold those in authority accountable; we are committed empiricists, more likely to act once an event has happened than when a reasonable argument has been formulated; and there is the perennial problem that we tend to confuse what we believe to be the case with what really is the case. This problem is the result of assuming that what is claimed to be true is indeed also a fact. Truth is a value judgement, whereas facts are empirical events. To put it another way, it is not the truth that makes facts objective, but rather facts that cause what is true.


However, as I have argued, before we can save humanity, we first need to know what we are saving humanity from.


Best Lawrence


*Will the human race become extinct? Almost certainly – Nick Longrich, The Independent, Monday 18 May 2020



**Global catastrophic risk


Source: Future of Humanity Institute, 2008 (reference leads to a PDF file)


telephone/WhatsApp: 606081813

Email: philomadrid@gmail.com

