Chapter Six Self-Destruction (Part I)

The extinction of mankind refers to the destruction of all humanity as a whole. Previous analysis has yielded the conclusion that no external threat can destroy all of humanity in the billions of years before the sun evolves into a red giant. With external threats eliminated, the continued survival of mankind rests entirely with humans themselves. Whether or not humanity self-destructs depends completely on our use of science and technology; no other force is capable of destroying a species so intelligent and so vast in number.

 

SECTION ONE: THE MEANS FOR EXTINCTION WILL INEVITABLY APPEAR (PART I): PHILOSOPHICAL DEDUCTION

One: The Inconceivable Nature of Science and Technology

The history of scientific and technological development is intertwined with the history of mankind itself. Man's earliest technological achievements were the use and manufacture of stone tools and the mastery of fire; these also signaled the beginning of human history.

The ancestors of humanity experienced millions of years of development and faced countless obstacles. At every period in time, people encountered incomprehensible natural phenomena and felt both the desire to overcome and transform nature and a deep respect for it. Yet science and technology continue to amaze us with their incredible achievements, surmounting obstacle after obstacle that generations had thought insurmountable. Some scientific and technological achievements still amaze people long after their conception, and even those at the forefront of the field are constantly astonished. This is because the power of science and technology is simply too great; it consistently exceeds our subjective imagination and customary experience.

In order to fully illustrate the inconceivable nature of science and technology, we will focus on a few scientific and technological achievements that had a huge impact on humanity. Even from today’s point of view, these discoveries were truly marvelous.

1. Electricity, Electromagnetic Waves, and their Application

In 1844, the US Congress funded a telegraph line between Washington and Baltimore. This was the first time electricity was used as an information medium. On May 24, when the telegraph line opened, people were amazed to hear that information could be communicated between two places through a wire, and the telegraph room was packed with onlookers. People talked and placed bets, but most thought that the telegraph could not beat the speed of a good horse. At that time, Baltimore was holding the Democratic National Convention, and the list of presidential candidates was relayed to Washington via telegraph almost instantly. Onlookers and US politicians alike were shocked.

Today, debating the speed of the telegraph against that of a horse would seem laughable. Electrical signals travel at close to the speed of light, tens of millions of times faster than any horse. However, at the beginning of the nineteenth century, the only ways to communicate were through letters carried by boat, by horse, or on foot. Though beacon fires could pass information quickly, they could only transmit prearranged signals, not specific content.
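As a rough back-of-envelope check (assuming a fast horse sustains about fifteen meters per second and an electrical signal propagates at close to the speed of light):

```latex
\frac{v_{\text{signal}}}{v_{\text{horse}}} \approx \frac{3 \times 10^{8}\ \text{m/s}}{15\ \text{m/s}} \approx 2 \times 10^{7}
```

A factor of some twenty million, which is why the horse-versus-telegraph bets now seem so quaint.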

The telegraph was built upon the discovery of electricity; before this discovery, people had no concept of passing information by any carrier other than human or animal bodies. Let us try to imagine things from that point of view.

Electromagnetic waves travel at the same speed as electrical signals. Today, even an ordinary middle school student understands that electromagnetic waves are produced through electromagnetic induction. However, the discovery of this basic physical law was not easy. When electricity and magnetism were first discovered, they were thought to be two completely unrelated things. At the beginning of the nineteenth century, Oersted deduced that electricity and magnetism should be connected, and confirmed the electromagnetic effect in April 1820 after numerous experiments. Later, Michael Faraday demonstrated the reverse effect in his laboratory and put forth the law of electromagnetic induction. Building on this body of work, James Clerk Maxwell proposed in 1864 that changing electric and magnetic fields could propagate through space as electromagnetic waves. This series of deductions was confirmed by the young scientist Heinrich Rudolf Hertz in 1887; such was the early establishment of electromagnetic theory.
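In modern notation (which came long after Faraday's own formulation), the law of electromagnetic induction and Maxwell's predicted wave speed can be written as:

```latex
\mathcal{E} = -\frac{d\Phi_B}{dt}, \qquad c = \frac{1}{\sqrt{\mu_0 \varepsilon_0}} \approx 3 \times 10^{8}\ \text{m/s}
```

The first equation says that a changing magnetic flux induces a voltage; the second gives the propagation speed of electromagnetic waves in terms of the electric and magnetic constants. Its agreement with the measured speed of light was what convinced Maxwell that light itself is an electromagnetic wave.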

The discovery of electricity and the establishment of electromagnetic theory ushered humanity into the electric age. Generators and motors became widely used, and people came to accept the existence of invisible and intangible electromagnetic waves. Electromagnetic waves let us talk with friends thousands of miles away on our mobile phones; they bring us all kinds of entertainment and news on TV; and they allow us to control spacecraft billions of kilometers from Earth. Today we take such scientific marvels for granted. But would we have felt the same way two hundred years ago?

In science fiction movies, when people from the past travel to the present in time machines, they usually panic and marvel at the sight of a television. Two hundred years ago, most people would have had a hard time believing what is now everyday reality. And even though we fully understand the theories of electricity and magnetism today, their applications still surprise us with further advancements.

In 1883, Thomas Edison discovered an interesting phenomenon while studying light bulbs. When he sealed a metal plate and a filament together inside a bulb, a current would flow across the vacuum only when a positive voltage was applied to the plate. Based on this phenomenon, people invented vacuum tubes at the beginning of the twentieth century: diode tubes could detect (rectify) signals, while triode tubes could amplify them. The invention of vacuum tubes created the conditions for radio communication and broadcasting. People were able to receive radio signals and transmit music and news through electromagnetic waves, surpassing the technology of wired telephones and telegraphs.

With the help of vacuum tubes, scientists developed the first electronic computer in 1945, which exceeded the human brain in calculating speed. It had previously been inconceivable to use a machine as a replacement for the human brain; what was more amazing was that this man-made machine had computing power far exceeding the capabilities of even the smartest person. The first computer was enormous: it used eighteen thousand vacuum tubes, weighed thirty tons, and occupied more than 170 square meters of space. It could perform five thousand operations per second, and in its ten years of service it reportedly completed more calculations than humanity had performed in all prior history. This would all have seemed unbelievable before computers were invented, yet those early computers seem like mere "child's play" only seventy years later.

In the middle of the twentieth century, people invented transistors made of semiconductor materials. Transistors perform the same functions as vacuum tubes, but they are smaller, lighter, longer lasting, and cheaper; they consume less energy and require no preheating. The transistor soon replaced the vacuum tube in the radio and computer fields. The application of semiconductor materials has gone through three stages: from transistors to integrated circuits to large-scale integrated circuits. A vacuum tube is half the size of a fist; the earliest transistors could be made as small as a grain of rice. Later, large-scale integrated circuits could be made to fit on a tiny chip, and now a single integrated circuit contains millions or even hundreds of millions of transistors.

The replacement of vacuum tubes by transistors revolutionized the field of radio and electronics and greatly improved the performance of products. A vacuum tube radio used to be the size of a trunk, but now radios can be smaller than a matchbox. The earliest computer required an entire room. In 1996, to commemorate the fiftieth anniversary of the first electronic computer, the University of Pennsylvania replicated its computing functions on a 7.44 mm × 5.29 mm chip. This fingernail-sized chip held 174,569 transistors and was fully equipped with the capabilities of its thirty-ton ancestor.

After only seventy years of development, today's supercomputers can perform one quadrillion operations per second; in one second, such a machine computes more than human brains have in all of history. Could such an achievement have been conceived even a hundred years ago?
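A minimal sketch of the scale involved, using illustrative round figures (roughly 5,000 operations per second for the first computer and 10¹⁵ for a petaflop machine; exact numbers vary by source):

```python
# Rough comparison of the first electronic computer with a modern
# petaflop supercomputer. All figures are illustrative approximations.
import math

eniac_ops_per_s = 5_000        # ~5,000 operations per second (1945)
petaflop_ops_per_s = 1e15      # one quadrillion operations per second
years_between = 70

speedup = petaflop_ops_per_s / eniac_ops_per_s      # ~2e11
doublings = math.log2(speedup)                      # ~37.5 doublings
print(f"Speedup: {speedup:.1e}x "
      f"(a doubling every {years_between / doublings:.1f} years on average)")

# The first computer's entire ten-year output versus one petaflop second:
ten_years_s = 10 * 365.25 * 24 * 3600
lifetime_ops = eniac_ops_per_s * ten_years_s        # ~1.6e12 operations
print(f"Ten-year total of ~{lifetime_ops:.1e} ops is matched in "
      f"{lifetime_ops / petaflop_ops_per_s * 1e3:.1f} milliseconds")
```

On these figures, computing speed doubled roughly every two years across those seven decades, and a petaflop machine reproduces the first computer's entire ten-year output in under two milliseconds.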

2. The Understanding of Nuclear Energy

Nuclear energy is the most powerful force we can access today, and we have developed a considerable understanding of it. We can use nuclear energy to build highly destructive weapons, and we can also use it to generate power. The fact that a tiny bit of material can contain enormous energy no longer surprises us, but that was not the case seventy years ago.

Human understanding of nuclear energy fully reflects the inconceivable nature of science and technology, and it also shows how limited human understanding is when it comes to the power of science and technology. Einstein's formula E = mc² was proposed in 1905. At the time, even those who believed in the formula viewed it as a purely theoretical equation with no practical value. Although people recognized that the sun burned nuclear fuel, they still believed that the heavy door of nuclear energy could be opened only by the power of celestial bodies.

Human understanding of the basic particles of matter was long clouded by fallacy. Until the end of the nineteenth century, almost all scientists believed that the atom was an indivisible whole and that matter could not be subdivided into smaller particles. On November 8, 1895, the German physicist Wilhelm Röntgen accidentally discovered a new ray while conducting a cathode ray experiment. This ray was extremely penetrating and was called the X-ray because of its unknown origin. When Röntgen published his research results, the scientific community was shocked, and people began to re-examine whether the atom could be subdivided.

In 1896, Henri Becquerel discovered that uranium ore emitted a penetrating ray similar to the X-ray, even without exposure to sunlight or any other rays. This phenomenon was later named "radioactivity." In 1902, Marie Curie extracted 0.12 grams of pure radium in her laboratory. Radium's radioactivity was two million times that of uranium, and it produced heat without burning. Calculations showed that radium produced 250,000 times more heat than an equivalent weight of coal; however, this energy had little practical value because it was released so slowly. When radium finished decaying, it had become two other elements: helium and lead.

The physicist Ernest Rutherford identified two distinct rays emitted by radioactive substances, the α-ray and the β-ray, in 1899; the even more penetrating γ-ray was identified soon afterward. In 1902, Rutherford proposed that radioactivity was the atom's own process of transformation. The theory of atomic transformation further disproved the notion that atoms could not be subdivided; it was a revolutionary moment in the history of physics.

Prior to this, physicists had discovered that cathode rays were a stream of high-speed particles, each with a mass roughly 1/1,836 that of a hydrogen atom. This was the first time a particle smaller than an atom had been discovered. It was called the electron.

In 1911, Rutherford used α particles to bombard gold foil only one hundred-thousandth of a centimeter thick and found that, on average, about one in every eight thousand α particles bounced back. He deduced that these α particles had struck something extremely dense, and that this dense matter could occupy only a tiny portion of the atom: the nucleus. Rutherford further speculated that the nucleus contained not only positively charged particles but also uncharged ones. He named the positively charged particles protons and the uncharged particles neutrons; this speculation was later confirmed.

In 1919, Rutherford conducted an experiment that used α particles released from polonium to bombard nitrogen atoms. He found that the nitrogen nucleus, upon being struck, released a proton and transformed into an oxygen isotope. This was the first artificial transmutation of an element. Later, Rutherford used α particles to bombard boron, fluorine, sodium, phosphorus, and other elements, knocking protons out of those nuclei as well and demonstrating that the nucleus could be divided.
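In modern notation, the transmutation Rutherford observed can be written as:

```latex
{}^{14}_{7}\mathrm{N} + {}^{4}_{2}\mathrm{He} \;\longrightarrow\; {}^{17}_{8}\mathrm{O} + {}^{1}_{1}\mathrm{H}
```

An α particle (a helium nucleus) is absorbed by a nitrogen nucleus, which ejects a proton and becomes the oxygen-17 isotope.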

However, α particles have difficulty producing protons when bombarding heavier elements, because α particles are positively charged and the nuclei of heavy elements contain many positively charged protons. Owing to electrostatic repulsion, α particles can barely reach the nuclei of heavier elements at all.

In 1932, James Chadwick, a student of Rutherford's, bombarded boron and beryllium with α particles and found another component of the nucleus: the neutron. Since neutrons carry no charge, they are not repelled by the nucleus, and because they are far more massive than electrons, they make ideal projectiles for penetrating the nucleus. The discovery of the neutron brought us one step away from nuclear energy. And yet in 1933, shortly after the neutron was discovered by his own student, Rutherford frankly stated his view in a speech on atomic transformation to the British Association. He said, "We cannot expect to obtain energy this way; this method of producing energy is hopelessly inefficient, and the idea of atomic transformation as a source of power is purely theoretical speculation."

His prophecy was deeply pessimistic. Rutherford was one of the greatest scientists in the field of atomic physics; he is recognized as the father of modern experimental physics and nuclear physics, and only Einstein's contributions rival his. Yet he made this pessimistic statement when we stood on the very threshold of nuclear energy. Even more interesting, Einstein himself endorsed Rutherford's prediction. This shows that even the most outstanding scientists can seriously underestimate the power of science and technology.

In 1934, the Joliot-Curies bombarded aluminum with α particles and produced a phosphorus isotope that soon transformed into silicon, releasing positrons. This was the first time a radioactive element had been artificially generated. Encouraged by the Joliot-Curies' results, Enrico Fermi tried bombarding nuclei with neutrons instead of α particles. At that time only ninety-two elements were known, and Fermi conducted bombardment experiments on all of them. When he used slow neutrons to bombard the ninety-second element, uranium, a new substance with a completely different chemical signature was produced. Fermi could not explain this result; he thought that the uranium had absorbed the neutron and produced a transuranic element, but this analysis was wrong.

Otto Hahn, Lise Meitner, and Niels Bohr further verified and correctly interpreted this experimental result. Their conclusion was that when slow neutrons bombarded uranium, the nucleus split in two after capturing a neutron. The supposed transuranic element was actually barium, a much lighter known element, and when the nucleus split, it lost mass and released energy. This process of splitting the nucleus was later named "nuclear fission," a brand-new term in nuclear physics.

This revolutionary explanation was a major breakthrough in atomic physics. On this basis, Fermi further suggested that a uranium nucleus could emit one or several neutrons during fission; these newly generated neutrons would bombard other uranium nuclei, producing still more neutrons to bombard the remaining unsplit nuclei, thus forming a "chain reaction." The whole chain reaction would be completed in a flash, releasing a huge amount of energy. Fermi had worked out the method and principle of releasing nuclear energy; it became clear that nuclear energy could be harnessed and used.
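A back-of-envelope sketch shows why the reaction finishes "in a flash." The figures below are standard textbook approximations (about 2.5 secondary neutrons per fission and on the order of ten nanoseconds per generation), not a model of any real device:

```python
# Estimate how many neutron "generations" a chain reaction needs to
# consume 1 kg of uranium-235, and how long that takes.
import math

AVOGADRO = 6.022e23
nuclei_in_1kg = AVOGADRO * 1000 / 235    # ~2.6e24 U-235 nuclei per kilogram
neutrons_per_fission = 2.5               # average secondary neutrons (approx.)
generation_time = 1e-8                   # ~10 ns between generations (approx.)

# Each generation multiplies the number of fissions by ~2.5, so the
# number of generations n needed satisfies 2.5**n ≈ nuclei_in_1kg.
generations = math.log(nuclei_in_1kg) / math.log(neutrons_per_fission)
elapsed = generations * generation_time

print(f"Generations needed: {generations:.0f}")           # ~61
print(f"Elapsed time: {elapsed * 1e6:.1f} microseconds")  # well under 1 µs
```

Sixty-odd generations, all over in less than a microsecond: hence the "flash."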

However, the master scientist Bohr was still asserting at this time that nuclear fission could never be practically applied, and he listed fifteen reasons in support of his position. Bohr was backed by a great number of scientists, once again demonstrating how counterintuitive science and technology can be. It is no wonder people were disbelieving. According to calculations, one kilogram of uranium-235 loses about one gram of mass in undergoing complete fission. Applying the formula E = mc², that tiny gram of matter would release, in an instant, energy equivalent to twenty thousand tons of TNT; the enormity of such power could only be imagined. So great was this power that even the greatest scientists did not believe it could be harnessed.
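The arithmetic behind the twenty-thousand-ton figure is simple. One gram of mass converted entirely to energy gives:

```latex
E = mc^{2} = (10^{-3}\,\text{kg}) \times (3 \times 10^{8}\,\text{m/s})^{2} = 9 \times 10^{13}\,\text{J}
```

Dividing by the conventional energy content of TNT (about 4.184 × 10¹² joules per kiloton) yields roughly 21 kilotons, in line with the figure quoted above.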

Fermi's speculation was quickly confirmed in several laboratories, where it was shown that each uranium fission releases two to three neutrons. This provided substantive support for the chain reaction theory, which in turn proved that nuclear energy could be mobilized. The key to nuclear energy had been obtained.

The ensuing struggle to persuade American politicians to fund research into the atomic bomb was also lengthy and difficult. Politicians simply refused to believe in such a "whimsical, impossible" invention. Even when Einstein himself wrote to Roosevelt, the proposal initially met with little action.

3. Transgenesis

The inheritance of organisms has long been dominated by nature. The emergence of humans and all other animals and plants was the result of generations of natural evolution and mutation. Sphinxes and centaurs were all creatures of mythology; the idea that man could create a species the same way God could was utterly preposterous. But today, the power of science and technology has gifted mankind with the kind of power only gods of myth and legend could dream of. Humans now have the ability to create new species at will, change the characteristics of existing species, and even change the features of human beings themselves.

This incredible power came with the unraveling of the mechanism of heredity. In the mid-nineteenth century, the monk Gregor Mendel found stable genetic factors governing plant traits in his pea experiments. These genetic factors determined biological characteristics. In the 1950s, scientists decoded the double helix structure of DNA and confirmed that the genetic code was carried on DNA (only a few viruses pass on their inheritance through RNA). This discovery opened the door to experimentation, and a large number of scientists began to work toward the manipulation of biological traits.

DNA is a long, complex chain molecule; the genes that determine biological traits are fragments of the DNA chain, and each gene corresponds to a characteristic of the organism. Each human being has thirty thousand to thirty-five thousand genes. They are the factors that decide individual appearance, skin color, sex, body shape, personality, intelligence, and so on. All organisms display properties corresponding to their genetic code; modifying an organism's genes changes its characteristics.

The above findings confirmed theoretically that biological traits could be altered by “cutting and pasting” DNA. Following this train of thought, sphinxes and centaurs could be possible, fruits could grow to apple-size but taste like plums, and beans could grow as thick as cucumbers.

Realizing this, however, depended on technology. DNA molecules are only about two millionths of a millimeter (two nanometers) across, so changing their structure is no easy feat. In other words, the key to DNA recombination technology was finding the "scalpel" and "glue" with which to cut and paste DNA molecules.

The issue was resolved within a miraculously short span of years. Scientists discovered that restriction endonucleases could serve as the scalpel for cutting DNA at specific sites, while ligase enzymes proved capable of pasting and repairing DNA fragments. On this basis, scientists successfully cut a DNA molecule and pasted in new DNA in the early 1970s, achieving gene recombination for the first time.
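As a toy illustration of the "scalpel and glue" idea, the sketch below cuts a made-up sequence at the recognition site of the real restriction enzyme EcoRI and ligates in a hypothetical fragment; actual gene splicing involves sticky ends, vectors, and living host cells, not string edits:

```python
# Toy illustration of the "scalpel and glue" idea using plain strings.
# The plasmid and insert below are made up; this is only an analogy.
ECORI_SITE = "GAATTC"  # recognition site of the restriction enzyme EcoRI

def cut(dna: str) -> tuple[str, str]:
    """Split the sequence at the first EcoRI site (the 'scalpel')."""
    i = dna.index(ECORI_SITE) + 1  # EcoRI cuts just after the leading G
    return dna[:i], dna[i:]

def ligate(left: str, insert: str, right: str) -> str:
    """Join fragments end to end (the role of DNA ligase, the 'glue')."""
    return left + insert + right

plasmid = "ATGCCGAATTCGGTA"                       # hypothetical host sequence
left, right = cut(plasmid)
recombinant = ligate(left, "AATTCTAGCGG", right)  # hypothetical new fragment
print(recombinant)  # the host sequence now carries the inserted fragment
```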

A complex life-form takes hundreds of thousands or even hundreds of millions of years to evolve; humans are one such example. Yet genetic technology can create a new species in a matter of weeks or months. In the past, only deities were thought capable of creating life; today, man has the same capability. That is truly incredible.

 

Two: Reflecting on Human Understanding of Science and Technology’s Power

Humans have traditionally had a limited understanding of science and technology's true power, which constantly defies even our wildest imaginations. Because of this, the foresight of many scientists and philosophers has been met with ridicule and even persecution. We all recognize Einstein as one of the founding fathers of modern physics. He produced many scientific achievements in his life, the greatest of which was undoubtedly the theory of relativity. This theory reset the basic theoretical framework of physics and solved problems that Newtonian mechanics could not. It also successfully predicted many physical phenomena.

The series of achievements brought on by the theory of relativity shocked the world, and Einstein was awarded the Nobel Prize in Physics in 1921. However, the incredible nature of relativity caused great controversy, and many of the best physicists of the time refused to recognize it. Some previous Nobel Prize-winning German scientists opposed it so strongly that they threatened to return their prize money if the theory of relativity was ever awarded. As a compromise, the Nobel committee cited Einstein for his discovery of the law of the photoelectric effect, and the theory of relativity itself was never awarded a prize.

Previous discussion has touched upon similar controversies surrounding revolutionary ideas, like the continental drift theory, the heliocentric theory, and the theory of biological evolution. Why do such controversies keep happening? We can sum this up in two factors:

1. Serious Underestimation of Science and Technology’s Power

People's perceptions of technological achievements tend toward one of two extremes. The achievements of the past are often taken for granted because they have become commonplace and the theories behind them have been thoroughly revealed. With an understanding of geomagnetism, the compass becomes easy to understand; the principles of optics explain why a combination of lenses lets us see what the naked eye cannot; and mechanics makes the workings of cars apparent. Without these scientific foundations and related theories, all such phenomena would seem inconceivable.

Conversely, attitudes toward future scientific discoveries usually veer to the other extreme. We often overestimate the achievements we have already mastered and believe that the unknown holds no great surprises; we therefore analyze future technologies with limited vision and underestimate the power they might hold. Even the most outstanding scientists of an era can fall into this trap. In reality, most scientific findings are only provisional truths; sooner or later, they are usually superseded by higher-order truths.

After Newtonian mechanics was founded, it was treated as absolute truth; even the most outstanding scientists did not question it. In a celebrated 1900 address, the famous physicist Lord Kelvin claimed that the edifice of physics was essentially complete, save for two "clouds" on the horizon. These two clouds were the failure to detect the luminous ether (specifically, the null result of the Michelson-Morley experiment) and the blackbody radiation problem known as the ultraviolet catastrophe. Little did he know that these two clouds would revolutionize traditional physics and produce the theory of relativity and quantum mechanics, carrying physics from the age of Newton into the age of Einstein.

Practical experience has proven that Newtonian mechanics cannot explain many phenomena; it is at best a limited approximation of truth and must be amended by the theory of relativity and quantum mechanics. Relativity and quantum mechanics themselves are not the ultimate truth either, as they still leave many questions unanswered and may be amended or overturned in the future.

Many truths that were once considered absolute have been overturned in the course of science history, and mankind has continued to achieve previously unimaginable feats through science and technology. Despite these incredible achievements, we still persist in the underestimation of future scientific and technological breakthroughs.

By summing up the past, we can make this logical evaluation: human understanding of science and technology is still very superficial, and the power it holds far exceeds our imagination. The future of scientific and technological development is still very long, and many more miraculous things will be discovered. We absolutely cannot assess science and technology of the future from the standpoint of today.

2. Theoretical Breakthroughs are the Key to Cognitive Breakthroughs

For most of human history, science and technology were separate: science was theoretical, while technology focused on practical application. When the overall level of human civilization was still relatively low, there was no need for the two to combine. Even during the early stages of the Industrial Revolution, inventions were intuitive and brought limited surprise.

Older inventions, like gunpowder (a by-product of alchemy) and the compass, were not explained by chemical or geomagnetic theory; they were mostly accidental discoveries that lacked theoretical backing. The majority of innovations were intuitive leaps obtained through direct imagination and purposeful tinkering, and it was not hard for people to understand them.

After the Industrial Revolution successfully combined science with technology, major inventions became less intuitive and more complex and abstract. Science and technology merged into one joint term, and their breakthroughs were increasingly unimaginable to those who did not understand the theory behind them. Understanding today's science and technology is highly reliant on theoretical breakthroughs. The inventions of the generator, the motor, the telegraph, the telephone, and the internet were all facilitated by the discovery of electromagnetic induction.

Electromagnetic waves were likewise discovered under the guidance of this principle, and they in turn inspired the invention of radios, televisions, wireless telegraphs, and mobile phones, as scientists conceived of using electromagnetic waves to transmit information and control devices. None of these incredible inventions would have occurred without its corresponding theoretical breakthrough.

Scientific theory is a lighthouse that guides the invention of all technological innovations within its scope. Correspondingly, once a theoretical breakthrough is achieved, people naturally come to accept the previously incredible things that now fall within the scope of the new theory. Without theoretical backing, innovations are generally considered impossible.

Breakthroughs in scientific theory are the keys to cognitive breakthroughs in science and technology. Scientific theory both limits and propels the development of technology, and it promotes an objective understanding of the field and its power as well.

Three: Deduction: The Means for Extinction Will Inevitably Appear

Having examined the general character of science and technology, we can sum it up thus: the power of science and technology is inconceivable, and our understanding of it will always be limited; future discoveries will inevitably occur, and as long as human history endures, the development of science and technology will never end.

We must recognize the inconceivable nature of science and technology and face it with the correct attitude. Though we cannot determine the exact future of scientific and technological developments, we can conclude that they will be exponentially more powerful and may devastate as well as benefit humanity.

As long as science and technology continue to develop, there will be a day that they possess the power to exterminate mankind.