Reply to Ormond:
Perhaps I ought to say at the outset that I believe we are quite capable of destroying ‘civilisation’ as we know it, and that we have several ways of doing so.
Furthermore, with the growth of technology, these ways are increasing at an increasing rate. I agree with you in this respect. There are no guarantees that we can manage these threats successfully, and so, it goes without saying, no assurance of protection can be given for every day, forever. But major threats have existed since the dawn of history: meteorites, supernovae, gamma-ray bursts, rapid climate change, super-volcanoes, methane turnovers and so on. We are in a better position now than ever before to be aware of such threats, and even to begin to assess their probabilities, but we have precious little chance of stopping them. We live with the possibility of extinction tomorrow, as we have always done.
No, it certainly doesn’t help that our own technologies are increasing the risk of mass catastrophes, and we must certainly do all we can to maximise their benefits whilst minimising their potential for harm. But what we can’t do is unlearn knowledge.
Technology has always brought benefits and problems. It brought about mass death and destruction during World War I, but the Spanish ‘flu epidemic of 1918 killed more people than four years of war had. The exponential growth of world population is due overwhelmingly to declining death rates as a result of improved water management, more effective hygiene, mass vaccination, perinatal health care, and a thousand other technical improvements even though, at the same time, birth rates have dropped, particularly as a result of the education of women. Should we stop the search for further improvements because they are too successful?
Here we come to the crux of the issue.
"How about a strategy of first things first? Nuclear weapons are the most pressing threat at the moment. We could focus our efforts on that for now. This doesn't solve all the challenges that will arise out of the knowledge explosion, but it gives us something concrete and specific to start with. Such a strategy pulls the rug out from under all those who will say there's nothing we can do. We can do this.

So, 50% of all science research redirected to the nuclear issue, now, today, right away, without further delay."
Do you have a factual basis for determining that nuclear weapons are the most pressing threat at the moment? I am certainly no expert and I am not, therefore, in a good position to judge, but I think I could argue a case that bioengineered pandemics, artificial superintelligence and nanotechnology, not to mention climate change, offer at least as much of a threat. And that takes no account of technologies in the offing which may become very significant in the next few years. We cannot know what threats are over the horizon. So, by directing half of all science research to ‘the nuclear issue’, you will be guaranteeing a reduction in funding to other areas which may have the potential to bring greater benefits.
In any case, what, exactly, is ‘the nuclear issue’? If you are worried about the proliferation of available fissile material, is this related to science funding or is it, rather, a matter of political will and organisation? I suggest that your concerns might be directed more profitably towards political systems and management rather than towards science research. It is not difficult to present a case that many of the most beneficial advances in technology were not foreseen at the outset of the research program. Scientific progress is not so readily directed.
"Let's take 50% of all science research funding and redirect it at finding some method of managing knowledge development so that we are in control of knowledge, instead of knowledge being in control of us."
I think that your sentiments are laudable but your practical suggestion would be catastrophic. It assumes that scientists are responsible for the management of knowledge, which they are not. It was not scientists who dropped the atom bomb on Hiroshima. Politicians made the decision, and they sought the advice of scientists as and when they considered it necessary. If you limit science funding in one country then you put that country at a relative disadvantage with respect to knowledge, compared to others, and you limit the options of the decision-makers. The atom bomb was developed in the knowledge that competing powers were also working along similar lines. Einstein contributed information which was of benefit to the construction of nuclear weapons even though he was, essentially, a pacifist. He had very little control over the uses others made of that information.
If your suggestion is directed at a particular country then, as I have explained, this automatically puts that country at a relative disadvantage. If your aim is directed more broadly – worldwide – then what systems exist to implement such a program? UNESCO springs to mind, but that has already been in existence for more than seventy years. As Fooloso4 has pointed out:
"It is a common but foolish, and in some cases dangerous, assumption that it is always better to do something than nothing".
If you ask me what type of action might help to ensure future security, then I would not advocate limiting knowledge. I would certainly encourage education. I would fight for a political system which was free from nepotism and corruption, where controls were in place to ensure probity and where leaders were answerable to the populace. At the most basic level, I wouldn't give control of nuclear codes to crazy people. Recent developments around the world do not encourage me.