Modeling Opinion Flow in Humans Using Boids Algorithm & Social Network Analysis
September 28, 2006 Page 8 of 8
The astute reader will have noticed that the inherent ‘truth’ of an idea has not been mentioned in this paper. While having an idea be correct may be useful when trying to spread it, correctness is not the final word. People have believed many strange things that we now can see are incorrect, or that we disagree with on moral grounds. (Millions of people followed Hitler. Millions of people can be wrong.) The ‘truth’ may win out in the long run, but that long run may be a very long time.
Communism is a good example of this. Any economist could, in just a few minutes, explain why communism produces less than capitalism. But it took over 40 years of the Soviet Union for that lesson to come out, and it still hasn't come out in some parts of the world.
How can inherently wrong ideas have such a long shelf life? Some ideas are probably just sticky. Communism, for example, sounds great at first. There are probably plenty of other reasons. Here we offer two reasons that can be particularly powerful, and that open the door to further investigation.
When a group of people has a vested interest in maintaining an idea — no matter how silly it is — then that idea will continue to have some weight for a long time. Leaders by their very nature command respect, and the ideas that they espouse will generally gain some traction among their followers. Leaders can also try to limit the contact that a group has with the outside world, thus minimizing the effect of contacts outside of the group⁸.
People who find an idea expedient, because it justifies their claim to power or access to resources, will in general become great proponents of that idea. This can work for good as well as for bad. For example, someone who thinks he can win in a free and fair election will be a great advocate of democracy⁹.
Leaders are not the only people who can find an idea expedient. For example, many people find the idea that ‘our problems are not our fault, but the result of those people over there’ to be psychologically expedient. The sweet psychological candy of scapegoating is a powerful force: it can relieve one group of blame and responsibility, and help some of their politicians gain power.
- Feel free, while reading this paper, to interchange the names King John and King Richard with any rivalry of your choice. Some examples might be ‘Coke beats Pepsi’, ‘democracy vs. insurgency’, or ‘capitalism vs. communism.’
- Another way to look at this is to think of the ‘pool’ of people as a medium in which ideas and opinions (memes) move around. Memes move from person to person as people infect each other with them. This formulation is equivalent; it all depends on whom one thinks of as the intelligent agent: the people or the memes.
- The author believes that much of this transmittal of opinions happens without conscious intention. For example, if someone rolls their eyes when they hear the name of a particular candidate, they are sending a message to the people around them, and it is a signal that people pick up on. Since much opinion flow comes from such non-verbal signals, it stands to reason that logical, well-constructed verbal arguments have little to do with why people believe what they believe.
- Since this universe is so small, we will not include a ‘news source’ term reflecting a ‘crowd average.’
- This may lead to Person A changing their opinion in large part due to the force on them from Person B while, at the same instant, Person B is changing their opinion due to the influence of Person C; this is acceptable.
- In physical terms, this is like considering her to be immersed in a bath of public opinion.
- It doesn’t have to be. It depends on the culture. The elites may have contempt for the commoners, while the commoners simultaneously have respect for the elites.
- Many people believe that relationships are important when trying to change a society. Using the method presented here, and social network analysis, we can begin to quantify that importance.
- To minimize the psychological dissonance that comes from saying one thing and believing another, the democracy promoter may come to believe in democracy even if they did not at first.
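The simultaneous-update rule mentioned in the footnotes (Person A responding to Person B's current opinion at the same instant that B responds to C) can be sketched in a few lines of code. This is an illustrative assumption of how such an update might look, not the paper's actual model: opinions are scalars, the social network is a neighbor list, and each person is pulled a small fraction toward the mean opinion of their neighbors, with all new opinions computed from the old ones before any are replaced.

```python
# Minimal sketch of a synchronous opinion update. All names and the
# `pull` parameter are illustrative assumptions, not the paper's model.

def step(opinions, neighbors, pull=0.1):
    """One simultaneous update: every person's new opinion is computed
    from the *old* opinions of their neighbors, then all opinions are
    replaced at once."""
    new = {}
    for person, op in opinions.items():
        nbrs = neighbors.get(person, [])
        if not nbrs:
            new[person] = op  # no neighbors: opinion is unchanged
            continue
        avg = sum(opinions[n] for n in nbrs) / len(nbrs)
        new[person] = op + pull * (avg - op)  # move toward neighbor mean
    return new

# Tiny example: A listens to B, B listens to C, C listens to no one.
opinions = {"A": -1.0, "B": 0.0, "C": 1.0}
neighbors = {"A": ["B"], "B": ["C"], "C": []}
opinions = step(opinions, neighbors)
# A moves toward B's old opinion (0.0) even as B moves toward C's (1.0)
```

Because the update is synchronous, A responds to B's old opinion even while B is itself changing, which is exactly the situation the footnote describes as acceptable.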