
2019-03-01

Dunning-Kruger: Cognitive Bias, Illusory Superiority and the Knowledge Illusion


The Dunning-Kruger effect is a psychological phenomenon first identified in a 1999 study by the U.S. psychologists David Dunning and Justin Kruger, both then at Cornell University, who discovered that people with relatively low ability and little knowledge tend to systematically overestimate their own abilities, while also tending to underestimate the competence of others. It basically describes “ignorance about ignorance,” a very widespread human characteristic.

As it happened, Dunning and Kruger’s study earned them the satirical Ig Nobel Prize in 2000 – an award given for research that makes people laugh, then think – and the effect named after them went on to enjoy a career of its own in popular psychology. Dunning-Kruger has also been invoked in connection with climate change deniers who, owing to a particular cognitive blind spot regarding scientific evidence, would rather distrust all data on global warming.

Now, in a survey of 2,500 Europeans and U.S. citizens, researchers led by Philip Fernbach (Leeds School of Business at the University of Colorado Boulder) have encountered this phenomenon again, this time in connection with green genetic engineering. For their study in Nature Human Behaviour, the psychologists and marketing researchers first asked subjects to rate their own knowledge of the topic. In a second part, they then tested the subjects’ knowledge of genetics to determine how much they did, in fact, know.

In addition, study participants were asked about their attitude toward genetically modified organisms. Although there is broad consensus among researchers that genetically modified food is safe for human consumption and could in principle even provide some health benefits, over 90 percent of respondents rejected it. The issue has long since turned into a culture war.

But that was not the study’s most important result. That honor belongs to a paradoxical nexus: those most strongly opposed to genetically modified food also said they knew a great deal about the subject. At the same time, they performed worst on knowledge tests about genetics in particular and science in general – a poster case of the Dunning-Kruger effect: the knowledge illusion, or illusory superiority.

In the words of first author Philip Fernbach, "Extreme views often stem from people who feel they understand complex topics better than they do." One possible self-reinforcing consequence of the phenomenon is that those with the least knowledge of important scientific subjects are the most likely to remain ignorant – simply because they are “knowledge-resistant,” not open to new knowledge. And, of course, they will not look anything up, because they think they already know everything in the first place.

So the key question is how to make people "appreciate what they do not know," as study co-author Nicholas Light says. But that also means rethinking previous approaches to science communication: mere information and an appeal to trust science and its mode of knowledge production will no longer reach this group of radical opponents of genetic engineering.

The study’s authors also looked at other topics such as gene therapy and climate change. While attitudes toward gene therapy and knowledge about it show a pattern very similar to genetic engineering, the findings about climate change deniers differed: remarkably, unlike in the findings on genetics, the Dunning-Kruger effect does not seem to apply there. Rather, political polarization and group membership appear to shape people’s attitude to climate change considerably more than knowledge (or the lack thereof). Just as science itself is busy building resistance to climate change, human nature, in its political incarnation, may be inclined to favor resistance to science.

2019-02-11

IT’s Coming Power Famine

Ivona Brandić, one of the youngest members of any national academy of sciences (yes, there is an old felony on the books, “brilliant while female”; it goes back to Hypatia and probably long before her), recently debunked a myth carefully nourished by media priorities, namely a fundamentally Luddite phobia of AI, despite far more pressing worries concerning the energy supply fueling IT as it enters the age of Big Data:

Brandić points out that hardly a day goes by without new horror scenarios about breakthroughs in artificial intelligence (AI): AI taking over and making us all redundant, autonomous killer robots, or medical systems making independent decisions capable of extinguishing life. It will probably not come to all that, because long before we develop killer robots we will probably run out of global power supply. A few scientists and bitcoin miners aside, it is still poorly understood that the rising power consumption of IT can become a real problem. Society at large became aware of power-hungry IT only when early adopters realized that profit and loss did not depend solely on the price of cryptocurrencies but also on the price of electricity and the global computing power devoted to mining, the determinants of the ‘reward’, that is, the number of crypto units earned. Bitcoin activities consume as much energy as Singapore or Iraq.

The history of information processing has had several inflection points, such as the invention of the desktop computer or the smartphone. They all had one thing in common: an enormous increase in energy consumption. The biggest such event, the digital transformation, is yet to arrive.

Google alone consumes as much power as San Francisco. There are currently about 8.4 million data centers worldwide, which together consume about 416 terawatt-hours a year, as much electricity as nearly 30 nuclear power plants can generate, and the trend is rising exponentially. About 400 of them are hyperscale facilities. Several hundred wind turbines are needed to replace a single nuclear power plant (one nuclear power plant in Germany produces the equivalent of 3,000 wind turbines), and such turbines are not easy to build in many areas for political reasons. Even a coal plant takes some 600 wind turbines to replace. Green power generation is limited, restricted among other things by geography and natural resources. As digitization advances, soon every light bulb, every shopping cart, every jacket, indeed every item around us will be or contain a computer in its own right, continually producing data that needs to be processed and stored. In the near future, the IT sector will thus become one of the largest consumers of electricity globally. There is a substantial risk that societies will reach for quick fixes such as new nuclear power plants once IT’s energy consumption suddenly rises dramatically. The other risk is a resort to radical measures that reduce or limit access to IT, with devastating social consequences.
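A quick back-of-the-envelope check makes the scale tangible; the plant capacity and capacity factor below are illustrative assumptions, not figures from the article:

```python
# Back-of-the-envelope check of the data-center figures above.
# Assumed, not from the article: one large plant of ~1.75 GW net capacity
# running at a ~90% capacity factor.

PLANT_GW = 1.75
CAPACITY_FACTOR = 0.9
HOURS_PER_YEAR = 8760

twh_per_plant = PLANT_GW * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000  # TWh per year
plants_needed = 416 / twh_per_plant  # 416 TWh/year of data-center demand

print(f"One plant: ~{twh_per_plant:.1f} TWh/year")
print(f"416 TWh/year: ~{plants_needed:.0f} such plants")  # roughly 30
```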

Data increasingly needs to be processed where it is generated: at the first possible processing point in the network. If four autonomous self-driving cars are about to cross an intersection without traffic lights in Birmingham, AL, the server processing their data cannot sit in Atlanta, because the cars must make their decisions within small fractions of a second, and the latency, or response time, would simply be too great. That will not change much in the near future regardless of 5G, 6G or other technologies, simply because physics will not cooperate. For those four cars and their passengers to remain unharmed, data needs to be processed in their immediate vicinity. It is true that data is increasingly not just transported but also processed by the internet infrastructure itself, for example in routers or switches. But because these are not powerful enough, new small data centers, so-called "edge data centers", are being built in the immediate vicinity of data producers. Building an edge infrastructure along highways, in cities and in other urban areas, however, is very expensive, inefficient, and can rarely be powered with green energy.
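A rough sketch of the physics behind that claim, assuming a fiber-route distance between Birmingham and Atlanta and the usual rule of thumb that light moves through fiber at about two thirds of its vacuum speed:

```python
# Propagation-delay floor for the Birmingham/Atlanta example.
# Assumed values: ~240 km fiber route; ~200,000 km/s for light in fiber.

DISTANCE_KM = 240
FIBER_SPEED_KM_PER_S = 200_000

one_way_ms = DISTANCE_KM / FIBER_SPEED_KM_PER_S * 1000
round_trip_ms = 2 * one_way_ms

print(f"Round-trip propagation alone: ~{round_trip_ms:.1f} ms")
# Routing, queuing and server processing come on top of this physical floor,
# which no 5G or 6G upgrade can remove.
```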

The classic approach to this problem would be to outsource data processing to large data centers (for example, to clouds). Such data centers can be built wherever there is cheap, green power and, above all, plenty of space. Currently, many companies are building so-called "high-latitude" data centers beyond the sixtieth parallel north (Alaska, Canada, Scandinavia and northern Russia) because servers can be cooled there easily and cheaply; cooling often accounts for nearly 40 percent of a data center’s total power consumption. Still, these large, optimized, efficient and green data centers help us very little with the four self-driving cars at the crossing in downtown Birmingham. That is why building data centers increasingly requires imagination. Microsoft, for instance, builds data centers under water: half the world's population lives in coastal regions and most data cables already run under the sea, which provides short latencies in addition to low cooling costs. Alas, Birmingham, like most of the world, is far from the sixtieth parallel, with neither a seashore nor a lake in sight where a data center could be submerged. The big question is thus whether further technological breakthroughs will be needed to rein in IT’s voracious power consumption. The obvious answer is yes.

The first approach to dealing with rising power consumption will be to use completely new, super-efficient computational architectures. Quantum computers promise extreme computational efficiency at a fraction of classical computers’ energy consumption. But quantum computers are suitable primarily for highly specific tasks, say, in the financial sector, for simulations, or in the life sciences. Predicting where quantum computing research will go in the near future remains difficult, but it is unlikely that autonomous cars can be equipped with quantum computers in the medium term. It also seems unlikely that a quantum computer could be used for real-time evaluation of camera images right there on the highway.

Another approach is to store power across time and space, which is currently not possible with conventional technology. Hydrogen power plants can achieve precisely that. If, political risk aside, huge solar farms could be built in the Sahara and other deserts, we could generate enormous quantities of electricity, albeit in unpopulated areas devoid of power consumers. That power could, however, be used to produce hydrogen, which can be transported in containers or pipelines to areas of high demand, where hydrogen power plants could then be operated. Unfortunately, recent studies show that this intermediate step of producing and transporting hydrogen carries an enormous carbon footprint of its own.
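The scale of those losses can be sketched with a few ballpark conversion efficiencies; the numbers below are illustrative assumptions, not values from the studies mentioned:

```python
# Round-trip efficiency of the desert-solar-to-hydrogen detour.
# All efficiencies are assumed ballpark values for illustration only.

ELECTROLYSIS = 0.70    # electricity -> hydrogen
TRANSPORT    = 0.90    # compression / liquefaction / pipeline losses
RECONVERSION = 0.55    # hydrogen -> electricity at the destination plant

round_trip = ELECTROLYSIS * TRANSPORT * RECONVERSION
print(f"Usable share of the desert electricity: ~{round_trip:.0%}")
# Roughly a third survives the detour, before counting the carbon footprint
# of building the panels, electrolysers and pipelines.
```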

The future will likely bring a hybrid that combines classical and quantum computers, conventional and hydrogen power plants. But throughout much of the digital transformation we will largely have to make do with classical computers and classical power generation. Two assets can be relied upon to achieve that: a well-developed telecommunications infrastructure that will be expanded further, and a well-developed internet infrastructure. Their symbiosis requires distributing applications across data centers so that as little power as possible is consumed while user satisfaction is maintained. This depends on the spatio-temporal conditions of each edge or cloud data center; the geo-mobility of users and the ability to create application profiles will also play an important role.
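A minimal sketch of that placement logic, with invented data centers and latency budgets, simply picks the most power-frugal data center that still keeps users happy:

```python
# Sketch of the placement idea: for each application, pick the lowest-power
# data center that still meets its latency requirement. All names and
# numbers are invented for illustration.

from __future__ import annotations
from dataclasses import dataclass

@dataclass
class DataCenter:
    name: str
    latency_ms: float         # typical latency to the application's users
    watts_per_request: float  # power cost of serving one request

@dataclass
class Application:
    name: str
    max_latency_ms: float     # user-satisfaction constraint

def place(app: Application, centers: list[DataCenter]) -> DataCenter | None:
    feasible = [dc for dc in centers if dc.latency_ms <= app.max_latency_ms]
    # Among the centers that keep users happy, choose the most frugal one.
    return min(feasible, key=lambda dc: dc.watts_per_request, default=None)

centers = [
    DataCenter("edge-birmingham", latency_ms=3, watts_per_request=2.0),
    DataCenter("cloud-atlanta", latency_ms=15, watts_per_request=0.8),
    DataCenter("high-latitude-cloud", latency_ms=90, watts_per_request=0.4),
]

print(place(Application("autonomous-driving", max_latency_ms=10), centers).name)
print(place(Application("photo-archive", max_latency_ms=500), centers).name)
```

The self-driving cars end up on the nearby edge site, while the latency-tolerant archive lands in the cheap high-latitude cloud, which is exactly the division of labor described above.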

Unfortunately, the behavior of data-intensive applications can be predicted only very poorly, because data values may range over effectively unbounded domains. Statistical procedures promise important remedies, as sketched below. But the increasing mobility of users, devices and ultimately of the infrastructure itself creates new challenges for economically optimized workload distribution, requiring a systems approach, a virtual version of operations-research analytics, to arrive at substantially better decisions.
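As one illustration of such a statistical remedy, a simple exponential-smoothing forecast of request volume (trace and smoothing factor invented for the example) already gives a scheduler something to plan against:

```python
# Simple exponential smoothing as a minimal workload forecast.
# The request trace and smoothing factor are invented for illustration.

def smooth_forecast(history, alpha=0.3):
    """One-step-ahead forecast via exponential smoothing."""
    forecast = history[0]
    for observed in history[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

requests_per_minute = [120, 135, 128, 160, 170, 155, 180]
print(f"Expected load next minute: ~{smooth_forecast(requests_per_minute):.0f} req/min")
# Even a crude forecast lets a scheduler pre-place workloads or power servers
# up and down ahead of demand instead of reacting after the fact.
```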