Our lives on this planet have improved in so many amazing ways over the last century. On average, we are now healthier, more affluent and literate, less violent and longer living. Despite these unprecedented positive changes, clear signs exist that we are in the midst of an emerging crisis — one that has not yet been recognized in its full breadth, even though it lurks just beneath the surface of our casual conversations and swims in the undercurrents of our news feeds. This is not the well-known crisis we have inflicted upon the earth’s climate, but one that is just as threatening to our future. This is a crisis of our minds. A cognition crisis.
A cognition crisis is not defined by a lack of information, knowledge or skills. We have done a fine job in accumulating those and passing them along across millennia. Rather, this is a crisis at the core of what makes us human: the dynamic interplay between our brain and our environment — the ever-present cycle between how we perceive our surroundings, integrate this information, and act upon it.
This ancient perception-action cycle ensured our earliest survival by allowing our primordial predecessors to seek nutrients and avoid toxins. It is from these humble beginnings that the human brain evolved to pursue more diverse resources and elude more inventive threats. It is from here that human cognition emerged to support our success in an increasingly complex and competitive environment: attention, memory, perception, creativity, imagination, reasoning, decision making, emotion and aggression regulation, empathy, compassion, and wisdom. And it is here that our crisis exists.
Today, hundreds of millions of people around the world seek medical assistance for serious impairments in their cognition: major depressive disorder, anxiety, schizophrenia, autism, post-traumatic stress disorder, dyslexia, obsessive-compulsive disorder, bipolar disorder, attention deficit hyperactivity disorder (ADHD), addiction, dementia, and more. In the United States alone, depression affects 16.2 million adults, anxiety 18.7 million, and dementia 5.7 million — a number that is expected to nearly triple in the coming decades.
American teens have experienced a 33% increase in depressive symptoms, with 31% more having died by suicide between 2010 and 2015.
The immense personal, societal and economic impact of cognitive dysfunction warrants heightened consideration because the crisis is growing, not receding. Despite substantial investment in research and treatments by governments, foundations, and companies around the world, the prevalence and impact of these conditions are escalating. Between 2005 and 2015, the number of people worldwide with depression and anxiety increased by 18.4% and 14.9% respectively, while individuals with dementia exhibited a 93% increase over those same years.
To some degree, these trends reflect the overall growth and aging of the world’s population. This will only continue to increase in the future: the global population of seniors is predicted to swell to 1.5 billion by 2050. Although there are clear benefits to living longer, an unfortunate negative consequence is the burden it places on many aspects of cognition.
There are signs something else is going on, too. Over the last several decades, worrying tears have appeared in the cognitive fabric of our youth, notably in terms of emotional regulation and attentional deployment. American teens have experienced a 33% increase in depressive symptoms, with 31% more having died by suicide in 2015 than in 2010. ADHD diagnoses have also increased dramatically. While a growing awareness of these conditions — and with it, more frequent diagnoses — are likely factors, it does not seem this is the whole story; the magnitude of this escalation points to a deeper problem. (...)
Neuroscientists and medical leaders now appreciate that much more unites seemingly disparate aspects of cognition than divides them. For example, attention deficits are now recognized to be a prominent feature of major depressive disorder, and are included in the most recent diagnostic criteria — the bible used by mental health experts — as a “diminished ability to concentrate.” The reality is that each of us has one mind, and embracing this will foster our ability to nurture it.
There is also, as I’ve said, a common, underlying aggravator that has exerted an impact across all domains of cognition: the dramatic plunge we’ve taken into the information age on the back of the digital revolution. Every way we interact with our environment, as well as with each other and ourselves, has been radically transformed by technology.
The old environment, where our cognition evolved, is long gone. The new environment, where multidimensional information flows like water (from a firehose!), challenges our brain and behavior at a fundamental level.
This has been shown in the laboratory, where scientists have documented the influence of information overload on attention, perception, memory, decision making, and emotional regulation. And it has also been shown in the real world, where we see strong associations between the use of technology and rising rates of depression, anxiety, suicide, and attention deficits, especially in children.
Although the exact mechanisms are still being explored, a complex story is emerging: accelerating reward cycles are associated with intolerance of delayed gratification and diminished sustained attention; excessive information exposure is connected with stress, depression, and anxiety (e.g., fear of missing out or of being unproductive); and multitasking is linked to safety risks (such as texting while driving) and a lack of focus (which affects our relationships, our studies, and our work).
What’s more, our constant engagement with technology interferes with the pursuit of other behaviors critical for maintaining a healthy mind, such as nature exposure, physical movement, face-to-face contact, and restorative sleep. Its negative influence on empathy, compassion, cooperation, and social bonding is just beginning to be understood.
by Adam Gazzaley MD, PhD, Medium | Read more:
Image: Maria Medem
[ed. It ain't just technology. Economic insecurity and inequality, corporate rapaciousness (in all its various forms), parasitic "healthcare" profiteering, environmental degradation, dysfunctional politics, militarized policing, constant bombardment by consumer marketing industries (see also: Speech Defects), pervasive surveillance, endless wars and more. If you don't have some form of cognitive impairment you're probably nuts.]