Problems in understanding the evolution of honesty

There are several simple problems with any argument that honest communication (between unrelated individuals) requires costly signals.

First, all signals have some cost.   At the very least, any signal that serves no purpose for the signaler other than evoking a response from receivers must cost something to produce, and that description covers every signal that evolves for communication.   By the standard argument, the universality of costs would imply that all communication is honest.   Yet there are good examples of dishonest signals (dishonest alarm calls that allow subordinate animals to gain access to food, for instance).

Second, the converse of the argument does not hold ... there is no reason to conclude that costly signals must be honest.   Even dishonest signals incur some costs in terms of time, energy, or risk.   To make any sense, the argument that honesty results from costs must set some lower limit on the costs of honest signals, a limit higher than the costs of dishonest signals.   The Sir Philip Sidney game, however, sets no such lower limit on the costs of honest signals.

Third, a conclusion that cost-free signals (if there are any) might be dishonest seems like a no-brainer.   If there were a signal that had no costs, then signalers should produce it all the time, even if the chance of a response from a receiver were vanishingly small!   If costs are zero, then any chance of a benefit would result in a net gain.

So, it does not make sense to argue that costs make signals honest (because all signals, even dishonest ones, have costs).   Nor does it contribute much to argue that honest signals are always costly (because all signals are costly regardless of their honesty, and because a hypothetical signal with no costs would face no restrictions on its use).

The only thing that would make sense of this argument would be a quantitative conclusion that honest communication evolves if and only if signals have costs that exceed some minimum.   No one has produced this quantitative condition.   Zahavi does suggest that more costly signals provide greater assurance of honesty, but the Sir Philip Sidney game, on the contrary, places an upper limit on the costs of honest signals.

We need some new approaches!   First, we need a better way to understand the possibilities for honesty in communication.   Second, we need a better way to understand the costs of signals in relation to honesty.   For the first of these objectives, consider receivers instead of signalers.

Optimal conditions for receivers

Instead of the costs for signalers, focus on the benefits for receivers.   As Alan Grafen (and others) have recognized, communication would never occur (would never evolve) if receivers did not benefit from responding, at least on average.   Otherwise, receivers should evolve to stop responding, and communication would cease.   Receivers, not signalers, control the evolution of honesty!

Communication systems evolve only when receivers benefit on average.   All stable communication systems are thus honest.   If they were not, then selection would result in their disappearance.
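To make the phrase "on average" concrete, here is a minimal formalization (the symbols are illustrative assumptions, not part of any particular model discussed here).   Suppose a fraction p of the signals a receiver encounters are honest, responding to an honest signal yields a benefit b, responding to a dishonest one imposes a cost c, and not responding scores zero.   Selection favors receivers that keep responding only if the expected payoff of a response is positive:

    \[
      p\,b \;-\; (1 - p)\,c \;>\; 0
      \quad\Longleftrightarrow\quad
      p \;>\; \frac{c}{b + c}
    \]

On this sketch, a communication system can tolerate some dishonesty and still persist, provided honest signals are common enough (and valuable enough) to keep the receiver's average payoff above zero.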

Furthermore, by focusing on receivers instead of signalers, we can see that, although stable communication systems are always honest on average, no communication system is honest in every instance.   Errors by receivers, manipulation by signalers, and eavesdropping by unintended receivers cannot be entirely avoided.   Inevitably, both the signaler and the receiver are exploited some of the time.

To reach this conclusion, we need to start over again, with proper definitions of communication and signals.

  • Communication occurs when a receiver responds to a signal from a signaler.

The crucial concept is the signal.

  • A signal is any pattern of matter or energy emitted by one individual and evoking a response in another without providing all of the power for that response.

The response cannot result from one individual overpowering the other (for instance, a predator tackling prey).   Instead, the receiver's nervous system responds to some change in its sense organs (so the signal must provide some power, enough to affect the sense organs of the receiver).   It is the receiver's decision to respond.   When it does, it must provide most of the power for its response.   The receiver thus controls the outcome of communication.

Receivers have a difficult task, because signals degrade and become mixed with background energy before they reach the receiver.   Sounds, for instance, attenuate (lose power) overall, but high frequencies attenuate more than low frequencies (because high frequencies are absorbed by molecular vibrations in the atmosphere or water and are more likely to be scattered by objects like vegetation).   In forests, sounds also accumulate reverberations (continuous echoes from trunks of trees and layers of foliage), so closely spaced notes at the same frequency become difficult to distinguish.   Degradation of signals during propagation from the signaler to the receiver means that receivers have to deal with messy signals, not clean ones.   Most experiments in animal behavior use clear signals to evoke responses, but most receivers in natural situations deal with messy signals.

Furthermore, there are always background sounds in natural environments.   Other species produce their own signals (also degraded and thus messy), and nonbiological sources (wind, water, moving vegetation) make additional sounds.   All of these extraneous sounds create background energy against which a receiver must recognize the (messy) signals to which it wants to respond.

As a result, receivers make errors.   In most (or all) natural communication, a receiver's sense organs respond continually to background energy.   It is reasonable to suppose that the level of activity in a sense organ varies in time even in the absence of a signal of interest to the receiver.   When a signal does occur, the level of activity in the sense organ increases but still varies.   Often (perhaps always in natural conditions) the level of activity in the absence of a signal overlaps that in the presence of a signal.   This situation is the one that engineers study under the heading of Signal Detection Theory.   The difference between the mean activity of a sense organ with and without a signal, relative to the variation in that activity, is called the signal/noise ratio.
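As a rough numerical illustration of this overlap (the Gaussian model and all the numbers below are assumptions for the sketch, not measurements), sensory activity can be treated as one distribution when no signal is present and a second, slightly higher distribution when a signal is present; the separation of the two means, scaled by the variability, is the signal/noise ratio (often written d' in Signal Detection Theory).

    import random

    random.seed(1)

    # Assumed model: sensory activity is Gaussian, "noise alone" vs. "signal plus noise"
    NOISE_MEAN, SIGNAL_MEAN, SD = 10.0, 13.0, 2.0   # arbitrary illustrative units

    noise_only  = [random.gauss(NOISE_MEAN,  SD) for _ in range(10000)]
    with_signal = [random.gauss(SIGNAL_MEAN, SD) for _ in range(10000)]

    def mean(xs):
        return sum(xs) / len(xs)

    # d': difference of the mean activities in units of the common standard deviation
    d_prime = (mean(with_signal) - mean(noise_only)) / SD
    print(f"signal/noise ratio (d') ~ {d_prime:.2f}")

    # How often does activity without a signal exceed activity with a signal?
    overlap = mean([n > s for n, s in zip(noise_only, with_signal)])
    print(f"fraction of paired trials where noise alone exceeds signal: {overlap:.2f}")

Even with a respectable signal/noise ratio, the two distributions overlap, so no criterion the receiver adopts can separate them perfectly.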

When receivers cannot absolutely separate signals from noise, they face a double bind.   To deal with the situation, they must pick a threshold above which they respond and below which they do not.   Because the levels of activity with and without a signal overlap, any threshold can result in four possible outcomes each time a receiver decides whether or not to respond.

If a signal has occurred and the level of activity in the sense organs exceeds the threshold, the receiver responds (a correct detection).   If a signal has occurred but the level of activity does not reach the threshold, the receiver does not respond (a missed detection).   If the level of activity in the sense organs is below the threshold when no signal is present, the receiver does not respond (a correct rejection).   On occasion, the level of sensory activity exceeds the threshold even when no signal is present, and the receiver responds (a false alarm).

We can diagram these four outcomes ...


                                     RESPONSE OCCURS
                               +                         -

    SIGNAL      +    Correct detection (CD)    Missed detection (MD)
    PRESENT     -    False alarm (FA)          Correct rejection (CR)

Two of these outcomes are correct (CD and CR).   Both have net benefits for the receiver.   Two, however, are errors (MD and FA), both of which have costs for the receiver.   A missed detection means further time searching for a mate, for instance, with risks of predation and expense of time and energy.   A false alarm, in this case, means mating with a suboptimal male.
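The same four-way classification can be written as a tiny decision rule (a minimal sketch; the threshold and the activity values are arbitrary placeholders, not data).

    def classify(signal_present: bool, activity: float, threshold: float) -> str:
        """Sort one trial into the four outcomes of Signal Detection Theory."""
        responds = activity >= threshold          # the receiver's decision rule
        if signal_present:
            return "CD" if responds else "MD"     # correct detection / missed detection
        return "FA" if responds else "CR"         # false alarm / correct rejection

    # A few illustrative trials with an arbitrary threshold of 11.0
    for present, activity in [(True, 12.4), (True, 10.2), (False, 9.1), (False, 11.7)]:
        print(present, activity, "->", classify(present, activity, threshold=11.0))

Which outcome occurs depends jointly on the state of the world (signal or no signal) and on where the receiver has set its threshold.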

The receiver can of course greatly reduce the chances of a false alarm just by raising its threshold for response.   With a higher threshold, the receiver's sensory activity will exceed the threshold on many fewer of the occasions when no signal is present.   Raising the threshold, however, increases the chances of missed detections.

The receiver can reduce the chances of missed detections by lowering its threshold for response.   Lowering the threshold, however, increases the chances of a false alarm.   The receiver is in a double bind.   Any change in its threshold for response increases one kind of error as it decreases the other.   A receiver in a noisy (natural) situation cannot avoid errors.
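The double bind is easy to see numerically.   Assuming, purely for illustration, the same Gaussian model of sensory activity as in the sketch above, raising the threshold lowers the probability of a false alarm and raises the probability of a missed detection, and no threshold drives both to zero.

    from math import erf, sqrt

    def phi(x):
        # Standard normal cumulative distribution function
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    # Assumed (illustrative) distributions of sensory activity
    NOISE_MEAN, SIGNAL_MEAN, SD = 10.0, 13.0, 2.0

    print(f"{'threshold':>9} {'false alarm':>12} {'missed detection':>17}")
    for threshold in (9.0, 10.0, 11.0, 12.0, 13.0, 14.0):
        p_false_alarm = 1.0 - phi((threshold - NOISE_MEAN) / SD)   # responds, no signal present
        p_missed      = phi((threshold - SIGNAL_MEAN) / SD)        # no response, signal present
        print(f"{threshold:9.1f} {p_false_alarm:12.3f} {p_missed:17.3f}")

Every threshold in the resulting table trades one kind of error against the other; the only question is where along that continuum the receiver should sit.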

Receivers can adjust their thresholds to optimize their decisions ... by balancing the four possible payoffs, the probabilities of each outcome, and the probability that a signal is present ... to produce the greatest average benefit.   Recall that if receivers cannot realize some benefit, they should stop responding to signals altogether, and communication would cease.   They should keep trying whenever there is a benefit on average.   Nevertheless, they cannot avoid some errors.   The best they can do is to optimize the trade-off among the two kinds of errors and the two kinds of correct decisions.
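One way to see what optimizing this trade-off means is to sweep candidate thresholds and compute the receiver's expected payoff over the four outcomes.   The payoff values and the prior probability of a signal below are made-up numbers for the sketch; the point is only the shape of the calculation.

    from math import erf, sqrt

    def phi(x):
        # Standard normal cumulative distribution function
        return 0.5 * (1.0 + erf(x / sqrt(2.0)))

    # Illustrative assumptions: same Gaussian model of sensory activity as above
    NOISE_MEAN, SIGNAL_MEAN, SD = 10.0, 13.0, 2.0
    P_SIGNAL = 0.3                      # assumed prior probability that a signal is present

    # Assumed payoffs to the receiver for the four outcomes
    PAYOFF = {"CD": 5.0, "MD": -3.0, "FA": -2.0, "CR": 0.0}

    def expected_payoff(threshold):
        p_respond_signal = 1.0 - phi((threshold - SIGNAL_MEAN) / SD)  # P(respond | signal)
        p_respond_noise  = 1.0 - phi((threshold - NOISE_MEAN) / SD)   # P(respond | no signal)
        return (P_SIGNAL       * (p_respond_signal * PAYOFF["CD"] + (1 - p_respond_signal) * PAYOFF["MD"])
              + (1 - P_SIGNAL) * (p_respond_noise  * PAYOFF["FA"] + (1 - p_respond_noise)  * PAYOFF["CR"]))

    # Brute-force sweep over candidate thresholds
    best = max((t / 10.0 for t in range(50, 201)), key=expected_payoff)
    print(f"best threshold ~ {best:.1f}, expected payoff ~ {expected_payoff(best):.2f}")

With these made-up numbers the best threshold still leaves both false alarms and missed detections at nonzero rates; the receiver chooses the least costly mixture of errors rather than eliminating them.   And if the best achievable expected payoff were negative, the receiver would do better never to respond at all ... which is the receiver-side condition for communication to persist.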