12 Aug 2011

How does a multicellular organism fight cancer? Why is it that multicellular organisms lose vitality as they age, even in such relatively simple organisms as slime molds? How does a society fight sociopathy? Why did the Roman Empire fall? Why do people gossip, and is it a bad thing? Will this essay be just a long list of apparently unrelated questions?

The answer to the last question is 'no'. But once you see a principle at work, you can sometimes find it in many unrelated fields, and so it is in this case. Let's get to it.

First, a word about what cancer is. One definition (from Oxford's online dictionary) is: 'The disease caused by an uncontrolled division of abnormal cells in a part of the body.' Not a bad try, but we will need to unpack that a bit before it will do for our purposes here.

First off, 'disease'. This is important to discuss, because it's not the cancer cells themselves that are (by and large) suffering from the condition. In fact, the cancerous tumor may be undergoing quite robust growth. It is a disease because it is bad for the overall organism. Even if the tumor causes the overall organism's size to increase, and we generally consider growth a good thing, it's still a disease. For example, we want a child to grow up, but not via a bunch of cancerous tumors. It's a disease because the cells of the body are supposed to do certain things for the organism, and cancer gets in the way of that.

So second, 'uncontrolled division'. Taken literally, this is untrue; everything is controlled somehow. Cancerous cells aren't able to defy the laws of physics and increase in numbers without being brought nutrients from elsewhere. The implication is clearly that there is a system which SHOULD be controlling these cells' division, but isn't able to. This system would allow each kind of cell to divide only to the extent that doing so is good for the overall organism, and in cancer this has broken down. Note that, in the case of healthy tissue, there isn't a precise number of cells we're supposed to have; the process is somewhat unplanned or undirected. Not, however, uncontrolled. What is controlling it is something we will discuss more soon.

Third, 'abnormal cells' is true, but insufficient. It's not just that the cells are abnormal that is the problem. Many cells in the body (human or otherwise) are abnormal, in that they have important features that most cells don't; cell specialization is one of the basic features of multicellular organisms, and it's why they are a successful life form. So, what is it about these abnormal cells that makes them problematic? Primarily, it's that they are not contributing much of anything to the well-being of the overall organism. They take in resources, but they don't contribute anything back in the way that, for example, healthy muscle or lung or brain cells do.

So, at the core of the idea of cancer (which is a term used for a number of different conditions) is that it has short-circuited, evaded, or defeated a system (or perhaps several systems) in the body which ensures that every kind of cell is contributing back to the health of the overall organism (or, in the case of the reproductive organs, contributing to the survival of the genes that the body is a vessel for).

'Cancer' is an example of a more general phenomenon: we tend to notice how a part of the body works to keep us alive mostly when it starts having trouble. This is even more true here than in the case of the heart or lungs, though; the system(s) which prevent cancer from being the normal situation are only partially understood, because they don't reside in a single location in the body. When our heart has a problem, we most often notice the effects immediately, and the mechanisms by which the heart does its job are relatively straightforward, because we can see the heart as a distinct thing, and this makes it more obvious what it is and what it does.

As a mostly (but not entirely) self-contained community of cells, the multicellular organism is a little like a community of humans. We know that communities have discernible identities; they can often (but not always) repel invaders; they can work together for a common purpose; they can pass resources from the periphery to the center and waste products the other way; they can even reproduce (or fail to) by establishing colonies (the U.S. is the offspring of England, Carthage of Phoenicia). The same community can experience the gradual death of member after member, adding new members to take their place, and retain the same identity (or at least believe it does) even after enough time has passed that every individual has been replaced. There are doubtless real differences between communities and organisms, but there are enough similarities that we might expect to see some similar solutions to the problems both kinds of entities can face.

Can there be a cancer in a community? What would that look like?

Well, first of all, it would be a cluster of individuals who do not contribute to the health of the overall community. In other words, they would consume resources without contributing much back. Note that, if we measure 'productive' by growth or other economic measures, they might look very productive. The measure of what they contribute to the overall community needs to be something that looks not at how wealthy they are, but at what they do for others. Cancer cells are great accumulators of resources; the problem (what makes them a disease) is that they contribute nothing (or very little) back.

Second, it would tend to reproduce beyond what is good for the society. There are types of people who contribute nothing (or almost nothing) to society but are not analogous to cancer, because they remain a small minority. For example, most homeless people (for economic or physical or mental health reasons) don't contribute much to the well-being of the overall society, but it's not usually a problem because they don't become a bigger and bigger percentage of the population over time. In a society with other problems, the homeless population may go up, but this is a symptom, not a cause, and the fact of having many homeless people doesn't create much of a feedback loop to increase their numbers; in most cases, actually, the support network for each homeless person will shrink as their numbers increase, providing even more of an incentive to avoid being homeless. However, in some cases an antisocial group may become a bigger and bigger portion of either the population or the economy, in a process that is self-reinforcing or at least unchecked.

In some ways, the real mystery here is why this doesn't always happen. To look at some of the reasons why not, let's look for a bit at the Prisoner's Dilemma. A well-known thought experiment which has turned into a fair number of real experiments, it is now a term used for any situation in which there are two agents, each of whom has a choice of either cooperating or betraying, but both must make their choice before they know the other's choice. What makes it a dilemma is that both will be better off if they both cooperate, but if one of them is going to betray, then that one will do better than the other.

The payoff matrix could be something like:

both betray: each gets 1
one betrays: the betrayer gets a payoff of 3, the other gets nothing
neither betrays: both get 2
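
For concreteness, that table can be written as a small lookup in code. This is just an illustrative sketch (in Python, with names of my own choosing), not anything taken from the original simulation:

    # Payoff table above, keyed by (my move, their move) -> my payoff.
    # 'C' = cooperate, 'B' = betray.
    PAYOFF = {
        ('B', 'B'): 1,  # both betray: each gets 1
        ('B', 'C'): 3,  # I betray, they cooperate: I get 3
        ('C', 'B'): 0,  # I cooperate, they betray: I get nothing
        ('C', 'C'): 2,  # neither betrays: both get 2
    }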

Note that, crucially, the net benefit to society is greater in the case where neither betray (total gain of 2+2=4), but the best case for an individual is the one in which they betray but the other does not (gain of 3 rather than 2 for that individual). Many tournaments have been run, often using computer programs implementing various strategies.

In the simplest case, where people are matched up once only, the best strategy is to always betray. Whether the other person cooperates or betrays, your best outcome will be if you betray. However, once you are paired with the same other agent for a series, it becomes a better strategy to use an algorithm called Tit-for-tat, in which you cooperate on the first round, then every round thereafter do whatever the other agent did last round. In this way, the loss to a betrayer is small, but if you are paired with someone who cooperates (or, crucially, another Tit-for-tat agent), you will get round after round of cooperative payoff. Since the betrayer will rarely get more than one round of payoff, this tends to reinforce cooperative behavior.
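
As a sketch of how such a tournament might be coded (again just an illustration, using the PAYOFF table above; the function names here are mine, not those of any actual tournament program):

    def always_betray(opponent_last):
        # The one-shot logic: betray no matter what the opponent did.
        return 'B'

    def tit_for_tat(opponent_last):
        # Cooperate on the first round (no history yet), then copy whatever
        # the opponent did last round.
        return 'C' if opponent_last is None else opponent_last

    def play_series(strategy_a, strategy_b, rounds=200):
        # Play an iterated series and return each side's total score.
        last_a = last_b = None
        total_a = total_b = 0
        for _ in range(rounds):
            move_a = strategy_a(last_b)   # each side sees the other's last move
            move_b = strategy_b(last_a)
            total_a += PAYOFF[(move_a, move_b)]
            total_b += PAYOFF[(move_b, move_a)]
            last_a, last_b = move_a, move_b
        return total_a, total_b

Paired against tit_for_tat, always_betray collects its windfall of 3 only on the first round and then settles for 1 point per round, while two tit_for_tat agents cooperate round after round for 2 points each.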

In the even more complex (and in some ways more realistic) case, however, there may be errors. Let's say you have an error rate of 1/10, so that an agent which intends to cooperate will, one time in ten, betray (accidentally), and vice versa. Now, Tit-for-tat has a problem. Because it 'holds a grudge', a mistaken betrayal will (most likely) set it betraying for the rest of the series, getting much less payoff. Here, a 'turn the other cheek' strategy can do well. If most agents are Tit-for-tat, but one is Turn-the-other-cheek, then most agents will occasionally get very low payoff when they get stuck in a cycle of mutual retribution. Turn-the-other-cheek, on the other hand, will be able to recover from such a misstep, and will do better in the long run.
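
The same sketch can be extended with the 1-in-10 error rate and an always-cooperate strategy (still illustrative; the helper names are mine):

    import random

    def turn_the_other_cheek(opponent_last):
        # Always cooperate, even after being betrayed.
        return 'C'

    def noisy(move, error_rate=0.1):
        # One time in ten, the intended move is accidentally flipped.
        if random.random() < error_rate:
            return 'B' if move == 'C' else 'C'
        return move

    def play_noisy_series(strategy_a, strategy_b, rounds=200, error_rate=0.1):
        # Iterated play in which each intended move may be flipped by mistake.
        last_a = last_b = None
        total_a = total_b = 0
        for _ in range(rounds):
            move_a = noisy(strategy_a(last_b), error_rate)
            move_b = noisy(strategy_b(last_a), error_rate)
            total_a += PAYOFF[(move_a, move_b)]
            total_b += PAYOFF[(move_b, move_a)]
            last_a, last_b = move_a, move_b
        return total_a, total_b

Two turn_the_other_cheek agents shrug off these accidental betrayals; two tit_for_tat agents, as described above, get dragged into cycles of retaliation.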

You can experiment with this using the accompanying simulation. First, type into the three boxes how many of each agent you want (adding up to 20 total). Then click the button marked 'Click here to start the new society', and it will create a 'society' with that distribution. When you click the second button, marked 'Click here to evolve the existing society', it will replace the lowest-scoring agent with an agent of the same type as the previous generation's highest scorer (plus, occasionally, there will be a random mutation, so even if you start with no Sociopaths, eventually you will have one). Keep clicking this button, and you will see the distribution of types change over time. Eventually, the Sociopaths tend to win.
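
Based only on that description, the evolutionary step might be sketched like this (the real simulation's internals may differ; in this sketch always_betray plays the part of the Sociopath):

    STRATEGIES = [tit_for_tat, turn_the_other_cheek, always_betray]

    def evolve(population, rounds=100, mutation_rate=0.1):
        # One generation: every agent plays every other agent (with errors),
        # then the lowest scorer is replaced by a copy of the highest scorer's
        # strategy -- or, occasionally, by a randomly chosen strategy (the
        # 'mutation' that lets Sociopaths appear from nowhere).
        scores = [0] * len(population)
        for i in range(len(population)):
            for j in range(i + 1, len(population)):
                a, b = play_noisy_series(population[i], population[j], rounds)
                scores[i] += a
                scores[j] += b
        worst = scores.index(min(scores))
        best = scores.index(max(scores))
        if random.random() < mutation_rate:
            population[worst] = random.choice(STRATEGIES)
        else:
            population[worst] = population[best]
        return population

Starting from, say, population = [tit_for_tat] * 10 + [turn_the_other_cheek] * 10 and calling evolve repeatedly should show, at least qualitatively, the same drift described in the next sections.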

To understand why the Sociopaths tend to dominate in this simple simulation, we first need to look at what happens when there are no Sociopaths. Start with, for example, 10 Tit for Tat and 10 Turn the Other Cheek, and run it for a few dozen generations (you can hit the 'Click here to start the new society' button if you want to start over and try again). Eventually, Turn the Other Cheek will come to hold 12-14 of the 20 slots. This is because, when there are too many Tit for Tat agents, they lose too many points to rounds of mutual recrimination. If Tit for Tat faces another of its own kind, and either side accidentally betrays even once, then every round thereafter will be another in an endless cycle of betrayals. The average score for each side in this case will be around 1.5, since on one round they will get 3, and the next they will get 0. If one or both of them are Turn the Other Cheek, then these accidental betrayals have no great consequence, and both sides will average 2 instead of 1.5. If there are not too many Tit for Tat, then this disadvantage is not too great, so they are not driven to extinction, but they will never remain dominant. Even if you start with 15 Tit for Tat and 5 Turn the Other Cheek, the society will tend to evolve towards no more than 8 Tit for Tat.
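
If you want to check those per-round averages against the earlier sketch, a quick (and again purely illustrative) estimate would be:

    def average_per_round(strategy_a, strategy_b, rounds=1000, trials=50):
        # Monte Carlo estimate of strategy_a's average payoff per round
        # against strategy_b, under the 10% error rate.
        total = 0.0
        for _ in range(trials):
            a, _ = play_noisy_series(strategy_a, strategy_b, rounds)
            total += a / rounds
        return total / trials

    print(average_per_round(tit_for_tat, tit_for_tat))                    # roughly 1.5
    print(average_per_round(turn_the_other_cheek, turn_the_other_cheek))  # close to 2 (about 1.9, given the errors)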

This, however, leaves the society vulnerable to the Sociopath when one finally does arise (this simulation has a mutation about 1 generation in 10, in which, instead of imitating the best performer, the lowest-performing agent will just pick one of the three strategies at random). Worse yet, not only does Turn the Other Cheek tend to suffer badly at the hands of a Sociopath, getting 0 points round after round, it also delivers to the Sociopath a 3 points/round windfall. This means that the Sociopath will do better than non-Sociopathic agents who are cooperating with each other. If most of the Sociopath's interactions are with Tit for Tat, this is not a problem, since it will get only 1 point/round against them (after the first round, both Tit for Tat and Sociopath will practice mutual betrayal). But if most of the Sociopath's opponents are Turn the Other Cheek rather than Tit for Tat, it will shoot to the top, and more Sociopaths will appear (since the worst-performing agent in any round usually converts to the best-performing strategy). About half the time, Tit for Tat will be able to survive in sufficient numbers that, once all of the Turn the Other Cheek have died off, they can defeat the invasion of Sociopaths. However, since the rise of Sociopaths will happen again and again, and Tit for Tat can survive this only about half the time, eventually you will end up with a society of 20 Sociopaths and 0 of either of the other types.

Once you have mostly Sociopaths, the benefits which Tit for Tat and Turn the Other Cheek could get from mutual cooperation become insufficient to make up for the fact that they are constantly being taken advantage of by Sociopaths, and the downward spiral to an all-Sociopathic society continues. Unfortunately, while an all-Tit for Tat society is vulnerable to conversion by Turn the Other Cheek, and an all-Turn the Other Cheek society is vulnerable to conversion by Sociopaths, an all-Sociopath society is not convertible to either Tit for Tat or Turn the Other Cheek, and it will remain stable (and low-performing, since the normal payoff in an all-Sociopathic society is 1 point/round). There is some randomness built in, and if you run it several times you may get different results. Occasionally all of the Turn the Other Cheek agents will be killed off or converted while there are still enough Tit for Tat left to turn the tide, and the Sociopaths will spend too much time betraying each other while the Tit for Tat agents are (usually) able to cooperate. Run it long enough, though, and eventually the Sociopaths always take over.

So what does this mean for anything other than a simple simulation? It highlights the fundamental balancing act that any society (whether of cells, or humans, or maybe even species) has to try to perform. The best performance happens when different agents cooperate, but often this cooperation creates the possibility of betrayal. Our own immune system has the challenge of being able to attack any threat (e.g. bacteria, virus, cancerous tumor) that is consuming resources without contributing back. However, as anyone with an autoimmune disorder can attest, this is not something even our own immune system can always do correctly. Being too vigilant can result in a community at war with itself, too quick to interpret as betrayal what is actually just error. We have all known people of this type, and they're not good to live near. A society of 20 Tit-for-Tat agents will have a total score of around 80,000, while a society of 20 Turn the Other Cheek will have a total score of 140,000, nearly twice as good.

However, what Great Men (and Women) throughout history have been less than stellar at facing up to is that simply refusing to take offense at anything is not a viable strategy. Anyone with an immune system that is NOT sufficiently vigilant is well aware of this (for the short time before they die of one infection or another). To say that we as a society don't need an immune system is essentially to deny the existence of incurable sociopaths, and there is ample evidence that at least some people just don't care about others and will take everything they are allowed to. Empathy, morality, and self-restraint are hard, and there will always be those who cannot or at least do not make the effort. If the nature of the society at the time is that they shoot to the top, they will be imitated. Like cancerous cells that rely on other cells to keep building blood vessels to bring them new resources, even as they use those resources to create more cancer cells that will ultimately kill off the organism they are part of, there are individuals who take more and more, create little or nothing, and assume that there will always be someone out there producing more for them to take. For a long time, this will be true.

The first recorded simulation tournament of this type was run by Robert Axelrod around 1980 (and described in his 1984 book The Evolution of Cooperation), and the winning strategy was contributed by Anatol Rapoport (he was the creator of the Tit-for-Tat strategy, at least in this context). However, while Tit-for-Tat did well in that tournament, we find here that a society of all Tit-for-Tat is ultimately unstable, and in any event low-performing. Looking forward, we are going to look for a strategy (or mix of strategies) which allows for a high-performing society that is resistant to invasion. The trickiest part is that a society must be resistant both to a "mutation" which consumes too many resources (e.g. cancer, sociopath) and to a mutation which is insufficiently on guard against this (e.g. poor immune system, excessively obedient citizenry), without falling into a low-performing trap of excessive vigilance (e.g. autoimmune disorders, a rigidly conformist society that stifles innovation).