Introduction: Scientia
Of all the scientific terms or concepts that ought to be more widely known to help to clarify and inspire science-minded thinking in the general culture, none are more important than “science” itself.
Many people, even many scientists, have traditionally had a narrow view of science as controlled, replicated experiments performed in the laboratory—and as consisting quintessentially of physics, chemistry, and molecular biology. The essence of science is conveyed by its Latin etymology: scientia, meaning knowledge. The scientific method is simply that body of practices best suited for obtaining reliable knowledge. The practices vary among fields: the controlled laboratory experiment is possible in molecular biology, physics, and chemistry, but it is either impossible, immoral, or illegal in many other fields customarily considered sciences, including all of the historical sciences: astronomy, epidemiology, evolutionary biology, most of the earth sciences, and paleontology. If the scientific method can be defined as those practices best suited for obtaining knowledge in a particular field, then science itself is simply the body of knowledge obtained by those practices.
Science—that is, reliable methods for obtaining knowledge—is an essential part of psychology and the social sciences, especially economics, geography, history, and political science. Not just the broad observation-based and statistical methods of the historical sciences but also detailed techniques of the conventional sciences (such as genetics and molecular biology and animal behavior) are proving essential for tackling problems in the social sciences. Science is nothing more nor less than the most reliable way of gaining knowledge about anything, whether it be the human spirit, the role of great figures in history, or the structure of DNA.
Of course, not everyone likes the idea of spreading scientific understanding. Remember what the Bishop of Birmingham’s wife is reputed to have said about Darwin’s claim that human beings are descended from monkeys: "My dear, let us hope it is not true, but, if it is true, let us hope it will not become generally known."
It is in this spirit of Scientia that Edge, on the occasion of its 20th anniversary, is pleased to present the Edge Annual Question 2017. Happy New Year!
—John Brockman, Editor, January 1, 2017
*****
Richard H. Thaler
Father of Behavioral Economics; Director, Center for Decision Research, University of Chicago Graduate School of Business; Author, Misbehaving
The Premortem
Before a major decision is taken, say to launch a new line of business, write a book, or form a new alliance, those familiar with the details of the proposal are given an assignment. Assume we are at some time in the future when the plan has been implemented, and the outcome was a disaster. Write a brief history of that disaster.
Applied psychologist Gary Klein came up with “The Premortem,” which was later written about by Daniel Kahneman. Of course we are all too familiar with the more common postmortem that typically follows any disaster, along with the accompanying finger pointing. Such postmortems inevitably suffer from hindsight bias, also known as Monday-morning quarterbacking, in which everyone remembers thinking that the disaster was almost inevitable. As I often heard Amos Tversky say, “the handwriting may have been written on the wall all along. The question is: was the ink invisible?”
There are two reasons why premortems might help avert disasters. (I say might because I know of no systematic study of their use. Organizations rarely allow such internal decision making to be observed and recorded.) First, explicitly going through this exercise can overcome the natural organizational tendencies toward groupthink and overconfidence. A devil’s advocate is unpopular anywhere. The premortem procedure gives cover to a cowardly skeptic who otherwise might not speak up. After all, the entire point of the exercise is to think of reasons why the project failed. Who can be blamed for thinking of some unforeseen problem that would otherwise be overlooked in the excitement that usually accompanies any new venture?
The second reason a premortem can work is subtle. Starting the exercise by assuming the project has failed, and then thinking about why that happened, creates the illusion of certainty, at least hypothetically. Laboratory research shows that asking “Why did it fail?” rather than “Why might it fail?” gets the creative juices flowing. (The same principle can work in finding solutions to tough problems. Assume the problem has been solved, and then ask: how did it happen? Try it!)
An example illustrates how this can work. Suppose that, a couple of years ago, an airline CEO had invited top management to conduct a premortem on this hypothetical disaster: All of our airline’s flights around the world have been cancelled for two straight days. Why? Of course, many will immediately think of some act of terrorism. But real progress will be made by thinking of much more mundane explanations. Suppose someone timidly suggests that the reservation system crashed and the backup system did not work properly.
Had this exercise been conducted, it might have prevented a disaster for a major airline that cancelled nearly 2000 flights over a three-day period. During much of that time, passengers could not get any information because the reservation system was down. What caused this fiasco? A power surge blew a transformer and critical systems and network equipment didn’t switch over to backups properly. This havoc was all initiated by the equivalent of blowing a fuse.
This episode was bad, but many companies that were once household names and now no longer exist might still be thriving if they had conducted a premortem with the question: It is three years from now and we are on the verge of bankruptcy. How did this happen?
And, how many wars might not have been started if someone had first asked: We lost. How? (...)
*****
Joichi Ito
Director, MIT Media Lab; Coauthor (with Jeff Howe), Whiplash: How to Survive Our Faster Future
Neurodiversity
Humans have diversity in neurological conditions. While some, such as autism, are considered disabilities, many argue that they are the result of normal variations in the human genome. The neurodiversity movement is an international civil rights movement that argues that autism shouldn’t be “cured” and that it is an authentic form of human diversity that should be protected.
In the early 1900s, eugenics and the sterilization of people considered genetically inferior were scientifically sanctioned ideas, with outspoken advocates like Theodore Roosevelt, Margaret Sanger, Winston Churchill and US Supreme Court Justice Oliver Wendell Holmes Jr. The horror of the Holocaust, inspired by the eugenics movement, demonstrated the danger and devastation these programs can exact when put into practice.
Temple Grandin, an outspoken spokesperson for autism and neurodiversity, argues that Albert Einstein, Wolfgang Amadeus Mozart and Nikola Tesla would be diagnosed on the “autistic spectrum” if they were alive today. She also believes that autism has long contributed to human development and that “without autism traits we might still be living in caves.” Today, non-neurotypical children often suffer through remedial programs in the traditional educational system, only to be discovered to be geniuses later. Many of these kids end up at MIT and other research institutes.
With the invention of CRISPR, editing the human genome at scale has suddenly become feasible. The initial applications being developed involve “fixing” genetic mutations that cause debilitating diseases, but they are also taking us down a path with the potential to eliminate not only autism but much of the diversity that makes human society flourish. Our understanding of the human genome is still rudimentary enough that it will be some time before we are able to enact complex changes involving things like intelligence or personality, but it’s a slippery slope. I saw a business plan a few years ago that argued that autism was just “errors” in the genome that could be identified and “corrected” in the manner of “de-noising” a grainy photograph or audio recording.
Clearly, some children born with autism have debilitating issues and are in states that require intervention. However, our attempts to “cure” autism, whether through remediation or eventually through genetic engineering, could result in the eradication of a neurological diversity that drives scholarship, innovation, the arts and many of the essential elements of a healthy society.
We know that diversity is essential for healthy ecosystems. We see how agricultural monocultures have created fragile and unsustainable systems.
My concern is that even if we come to understand that neurological diversity is essential for our society, we will still develop the tools for designing away any risky traits that deviate from the norm, and that, given a choice, people will tend to opt for a neurotypical child.
As we march down the path of genetic engineering to eliminate disabilities and disease, it’s important to be aware that this path, while more scientifically sophisticated, has been followed before, with unintended and possibly irreversible consequences and side effects.
by Edge.org
Image: "Spiders 2013" by Katinka Matson