During the 1830s there was a shift from the enlightened or rational man to the average man. The rise of statistics in the 19th century attests to a loss of faith in the power of individual reason with respect to the masses. The average man embodied a form of political activity that could no longer be understood or rationally determined with certainty. Adolphe Quetelet (1796-1874) coined the term 'the average man' to give individual representation to society, while recognizing that no real person would have all the characteristics of the average man.
Quetelet portrayed the average man as the center of a symmetrical distribution that we now know as the normal, or bell, curve. As with measurements in astronomy, the average stands for the true value of the quantity being measured, and the spread of measurements around the mean is simply a distribution of error. From a scientific or statistical point of view, the figure of the average man therefore represents society as a whole more closely than any flesh-and-blood individual can.
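Quetelet's astronomical analogy can be made concrete: if each observation is a true value plus normally distributed error, the mean of many observations converges on the true value. A minimal sketch, with an arbitrary true value and error spread chosen purely for illustration:

```python
import random
import statistics

random.seed(42)  # fixed seed so the sketch is reproducible

TRUE_VALUE = 10.0  # the quantity we are "really" trying to measure
ERROR_SD = 5.0     # spread of measurement error around the true value

# Each observation = true value + normally distributed error
observations = [TRUE_VALUE + random.gauss(0, ERROR_SD) for _ in range(10_000)]

estimate = statistics.mean(observations)
print(f"estimate of true value: {estimate:.3f}")  # close to 10.0
```

The individual observations scatter widely, yet their average recovers the underlying value; Quetelet's move was to read society's average the same way, as the stable "true" figure behind individual variation.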
The strength of the argument was the stability of the average. From Aristotle he adopted the idea of the extremes as vices and the mean as a virtuous balance. Instead of thinking of a continuum from low ability to high ability, he imagined a spectrum of passions or politics in which both extremes are dangerous while the average represents stability, the foundation of steady progress.
The study of averages supported a program of social amelioration based on treating legislation as a kind of social experiment. You try out some new law, such as making schooling available to populations who did not have it before, and you observe how this affects crime or illegitimate births. You can then plan further legislative changes based on that response. The concept of the average man provides a predictable measure of society in a way that no individual can. Quetelet was seeking to relocate the rationalism that had become problematic at the level of the individual and apply it to society as a whole.
Since the turn of the 20th century, there has been a belief that technology and reason could make us masters of our environment. By the end of the 20th century, individualism, happiness, and capitalism were among the core values of Western culture. Individualism is the belief that one's place in the societal hierarchy, including occupational class, income and wealth, power and prestige, and even outcomes such as health and disease status, comes through one's own efforts, together with the right to make free choices, which feeds consumer capitalism. Individualism fueled the American Dream: the hope that following the rules would bring a better quality of life and a higher standard of living than one's parents had. Meritocratic individualism creates a blind spot to the social supports provided by the community, allowing individuals to give themselves full credit for their successes.
The proponents of meritocracy argue that it is more just and productive, allowing distinctions to be made on the basis of performance. In fact, meritocracy serves to justify the status quo and perpetuate the existing upper class: merit can always be defined as whatever results in success, so whoever is successful can be portrayed as deserving success, rather than success being predicted by independent criteria of merit. If wealth accrued on the basis of merit, one would expect wealth to be distributed according to the bell-shaped curve, which it is not. Meritocracy inevitably becomes an oligarchy, as demonstrated by the growth in income inequality and the reduction in economic mobility.
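The distributional contrast the argument relies on can be sketched numerically: traits measured on a bell curve are symmetric, with mean and median nearly equal, while observed wealth is heavily right-skewed, with the mean pulled far above the median by a long upper tail. A small illustration using a lognormal distribution as a stand-in for skewed wealth (the parameters are illustrative, not calibrated to real economic data):

```python
import random
import statistics

random.seed(7)
N = 100_000

# Symmetric bell-curve trait, e.g. a score distributed as Normal(100, 15)
trait = [random.gauss(100, 15) for _ in range(N)]

# Right-skewed stand-in for wealth: a lognormal distribution
wealth = [random.lognormvariate(0, 1) for _ in range(N)]

def skew_gap(xs):
    """Mean minus median: near zero for symmetric data, positive for right skew."""
    return statistics.mean(xs) - statistics.median(xs)

print(f"trait  mean-median gap: {skew_gap(trait):.3f}")   # near zero
print(f"wealth mean-median gap: {skew_gap(wealth):.3f}")  # clearly positive
```

If rewards tracked a bell-shaped merit distribution, the two gaps would look alike; the skew of real wealth is the mismatch the passage points to.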
In the latter part of the 19th century, statistical thought shifted from an emphasis on normal populations to an emphasis on variation, or abnormal populations. Francis Galton (1822-1911) proposed to replace Quetelet's 'average man' by coupling statistics with biology to upgrade humanity to a better race, and in the process founded eugenics. Also in the late 19th century, August Weismann (1834-1914) promoted the theory that hereditary information was stored in the nucleus of cells and transmitted only through the germ line. Weismann believed that nothing that happened to somatic cells could be passed on to the germ cells; the somatic cells did not transfer information to future generations. He firmly opposed the inheritance of acquired characteristics, an idea Darwin had accepted in his theory of pangenesis. At the turn of the 20th century, statisticians using the bell curve to identify abnormal populations turned to genetic theory for scientific support: it was believed at the time that genetics accounted for the majority of the inheritance and control of IQ. (The Galton Chair of Eugenics at University College London was renamed the Galton Chair of Genetics in 1963.)
Cognitive function was analyzed for changes over 10 years in a 1970 United Kingdom birth cohort. In this study, cognitive performance was measured at 22 months of age and followed periodically. Children who ranked low at baseline rose through the ranks if their parents were affluent, and conversely, children who ranked high at baseline fell back if their parents were poor. These findings suggest that, for the majority of those born without congenital disease and regardless of access to health care, a poor socioeconomic environment is likely to undermine development and later health prospects. Head Start in the US and Sure Start in the UK are designed to interrupt this intergenerational cycle of deprivation. Fixation on the bell curve introduced the notion of 'abnormal bodies' and ushered in the tyranny of the bell curve. Under this reign, social commentators such as Charles Murray have argued persuasively that IQ is primarily genetic and that the difference between affluent and poor populations reflects the evolution of classes, thereby justifying inequity in the system. The British birth cohort study disarms this hypothesis.
With the completion of the Human Genome Project in 2003, it became known that genetics accounts for only about 10% of disease, with the remaining causes appearing to stem from environmental and occupational insults. In the 21st century, the epigenetics revolution is rewriting our understanding of genetics, disease, and inheritance. (Epigenetics is the study of changes in gene expression produced by mechanisms other than changes in the underlying DNA sequence.) Where we once believed that our biological fates were written in our genes, it is now recognized that the environment, and more specifically our perception of the environment, directly influences our behavior and genetic activity. Individuals are much more sensitive to exposures from their environment, diet, and lifestyle than previously thought. Epigenetics highlights the effects of inequality in living and working conditions, as well as a range of disparities in societal opportunity, including income, housing, employment, and access to health care.
Taking action on epigenetic harms to reduce health inequity requires a paradigm shift: from treating high-risk or diseased individuals, which has little impact on population health overall, to shifting a risk factor across a whole population by a small amount, which can greatly reduce the incidence of a disease or problem in the community. For example, reducing the salt in processed foods by a small proportion across a population (at a level individuals would not notice) would lower blood pressure levels and, in time, reduce death rates from cardiovascular disease.
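The arithmetic behind this population strategy can be sketched with a toy model. Assume systolic blood pressure in a population is roughly normally distributed, and count the share above a hypertension threshold before and after a small downward shift in the mean. All numbers here are illustrative assumptions, not clinical data:

```python
from statistics import NormalDist

def fraction_above(mean, sd, threshold):
    """Share of a normally distributed population above a cutoff."""
    return 1.0 - NormalDist(mean, sd).cdf(threshold)

MEAN_SBP = 130.0   # illustrative population mean systolic BP (mmHg)
SD_SBP = 15.0      # illustrative spread
THRESHOLD = 140.0  # a common hypertension cutoff
SHIFT = 2.0        # small whole-population reduction, e.g. from less salt

before = fraction_above(MEAN_SBP, SD_SBP, THRESHOLD)
after = fraction_above(MEAN_SBP - SHIFT, SD_SBP, THRESHOLD)

print(f"above threshold before shift: {before:.1%}")
print(f"above threshold after shift:  {after:.1%}")
print(f"relative reduction:           {1 - after / before:.1%}")
```

In this toy model, a 2 mmHg shift that no individual would notice produces a double-digit relative reduction in the hypertensive fraction, because the bulk of the distribution sits just below the threshold. This is the logic of the whole-population approach the paragraph describes.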
Stephen McNamee and Robert Miller of the University of North Carolina argue in their book The Meritocracy Myth that belief in meritocracy sustains a myth that disguises economic inequality in North America and prevents progressive government initiatives to address the issue.1 In such a society the average man no longer shares in the American Dream.
1 Williams, Ray (13 June 2010). "The Myths of the Self-made Man and Meritocracy." http://www.psychologytoday.com/blog/wired-success/201006/the-myths-the-self-made-man-and-meritocracy