What is considered typical and usual is guided by the cultural framework a person is accustomed to. In the brain sciences, it can easily be forgotten that “normal” and “normality” are not rock-solid concepts. Simply acknowledging that “normal” does not have an objective existence is insufficient without also changing scientific practices accordingly. This chapter unpacks why normality has been such a persistent concept over the last two centuries. The concept of normality grew alongside the development of statistical methods and was instrumental in constructing the much-maligned concept of “degeneration.” Statistics are useful in a wide range of scientific contexts, but detrimental when used as a blunt instrument of measurement to legitimize labels that differentially sort people into subpopulations in ways that augment social inequalities. A rigorous questioning of normality and degeneration ensures an ethical engagement with both the hypotheses of neuroscience experiments and the implications of research findings. This chapter surveys some of the key historical developments at the origins of the brain sciences in order to understand some of the biases present today. The language used to classify the world can lead to blind spots that remain hidden for generations. Rather than searching for a direct localization of human behavior in biological etiology, this chapter advocates a complex localization through mapping distributed agency across intersecting neurobiological, cultural, and environmental processes. Normal may be a value-laden term that has no place in the brain sciences, but a value-free operational conceptualization of the processes of degeneracy may be central to understanding dynamic neuro-cultural systems.
Title of host publication: Handbook of Neuroethics
Editors: Jens Clausen, Neil Levy
Place of publication: Dordrecht
Publisher: Springer, Springer Nature
Number of pages: 21
Publication status: Published - 1 Jan 2015