Technological Superiority ≠ Overall Superiority

I continue to ponder what I consider the irrational sense of superiority that most humans seem to feel about their species. From games to concerns about climate change to international problem solving to fiction to religion, most people seem to assume the human species is the only one that matters.

We can do things that other animals on Earth cannot, such as write books, travel to the moon, develop advanced math and physics, and build weapons capable of destroying all multicellular life on Earth. But does that give us overall superiority? I think not. In many ways we are one of the least important life forms on the planet. In fact, if we were to suddenly disappear, life on Earth, with the exception of animals we have bred for our convenience such as dairy cows, would do just fine; better, in fact, than it is doing now. Bacteria and other micro-organisms are far more important to Earth than we are, because without them all animal life would cease to exist.

Many people are finally realizing that other animals have complex emotional and social lives, and that many species manage those lives better than we do. Elephants are a good example: highly intelligent animals who make great parents and form strong social bonds. Yet we slaughter them for their tusks or for “fun.” So who is the superior animal in that realm? Not us, I suggest.

If we continue to see ourselves as the only life form that matters, we will, ironically, go on destroying the ecosystem that sustains our lives. We may be the first species to engineer its own mass extinction, which to me is a sad prospect.

The Fundamental Problem Humans Refuse to Face

This is based on a post I made in a Nova discussion following their program on robotics:
I think the key problem with us humans is that we don’t stop to investigate our meta-problems. We don’t realize that the “problem” isn’t Russia, or Iran, or Korea, or ISIS. The problem is how we humans organize into groups and then engage in war/violence. That behavior is pretty universal, and I believe it could be solved if we recognized it as the problem and worked on solving it.

Already much is known about methods of constructive conflict resolution. But when someone suggests applying these methods, there are always voices that say “they” won’t cooperate, whoever the current “they” are, so we have to engage in war. But that’s the problem: there is always a we and a they. In our own country we have Republicans hating Democrats, another we-they situation. We-they thinking exists at every level, from the family to the neighborhood all the way up to the nation. It’s the way we currently think, and survival requires that we change that way of thinking, but I see no sign of widespread recognition of this. Thus my pessimism.

The Wolf

[Poster image: a wolf wanting to be free]

This poster says it all. I love dogs too, and it’s a mystery to me how one can love a wolf-like dog and hate a wolf. I wonder if our hostility to wolves is because they dare to live free and not bow to our command. We are an arrogant and vengeful race.

Thoughts on Human Self-Aggrandizement

I was following some discussion threads on Amazon about books dealing with the potential of human intelligence to understand the nature of things, and with the prospects for developing artificial intelligence that may or may not come back to bite us, as in the Terminator series of movies. The following is one of the posts I made there:
“Bostrom (the author of a book warning of the dangers of AI) seems enamored with human intelligence, assuming it to be superior to any other intelligence we know of. I believe that human self-aggrandizement is one of our key weaknesses as a species. We often assume we are the only species that matters; most proposed solutions to the problems we face address only their effects on humans, as if that were all that counted.

Our “superior” intelligence has resulted in gross overpopulation of the planet, a mass extinction projected to be as severe as the one that killed the dinosaurs, endless wars fought with increasingly deadly weapons obtainable by almost any group that wants them, roughly 20 percent of the human population living in abject poverty, an extraordinary failure to use the peaceful conflict-resolution methods that have already been developed, and so on.

Various other species behave more intelligently than we do in certain areas. We are the best at technological development and at the arts and sciences, but that’s about it. Many other species are better at handling conflict, at raising their young, and at fitting into their environment. In short, we are extraordinarily stupid in many areas that affect our survival and the survival of other life on the planet, and one of our most stupid ideas is that of human superiority in all things.

Ok, with that out of the way, I would be interested in a book that convincingly describes how superintelligence could be created and gives well-documented evidence for it. I have been in computers and math for fifty years (well, computers for only forty years), and I well remember the AI craze that consumed the industry in the ’80s. People then thought that the solution to AI was just around the corner. Then reality hit, and the difficulties of producing true AI became apparent. Now we have developed a computer system that can beat a human at chess and, in some cases, apparently fix a satellite without human intervention. There is still a huge gap between these feats and producing AI that can function like the machines in the Terminator movies.

The problems facing us now (the likelihood of our self-destruction through war, contamination and pollution, disease, including bio-warfare, and other such insane behavior) mean that I don’t lose much sleep over the dangers of AI, though if we could develop it, I’m sure we would, because we like to act first and think later, like the birds in Bostrom’s allegory.” Here is a link if you want to Look Inside for Bostrom’s allegory, which I liked, and the preface I refer to.