http://globalguerrillas.typepad.com/globalguerrillas/2004/05/cascading_syste.html
This is a supplemental blog for a course which will cover how the social, technological, and natural worlds are connected, and how the study of networks sheds light on these connections.
‘Broken Windows’: Preventing an Information Cascade
Thursday, April 12th, 2007 9:21 pm
Contributed by: lmh88
http://www.citizen-times.com/apps/pbcs.dll/article?AID=200770411153
The above article, “Put ‘Broken Windows’ Theory to the Test on Litter,” is an editorial endorsement of a litter-reduction act in North Carolina. The author claims that the state’s excessive litter accumulation is due in part to an information cascade in which people are more apt to litter if there is already trash on the ground. In promoting the litter-reduction legislation, the author cites the apparent success of Rudolph Giuliani in cleaning up NYC through the ‘Broken Windows’ initiative.
As described on Wikipedia, “Broken Windows” is a theory of urban renewal and order-maintenance policing based on ideas related to information cascades. The theory was originally described in a 1982 article by James Q. Wilson and George Kelling, and has been implemented in cities throughout the country, most notably through Mayor Giuliani’s “zero tolerance”/“quality of life” reforms. Under Giuliani’s direction, NYC police arrested, and the city prosecuted, thousands of people for petty crimes that would otherwise have warranted only a fine (e.g., subway fare evasion and public drinking).
Social scientists have pointed to confounds in assessing the effectiveness of the Broken Windows policy in preventing major crimes like murder. It is less debated, however, that preventing an information cascade is the best way to combat petty offenses like littering in North Carolina.
Posted in Topics: social studies
Herding Behavior in Stock Trading
Wednesday, April 11th, 2007 11:37 pm
Contributed by: jaydogg
http://else.econ.ucl.ac.uk/newweb/research/blurb1.php
The field of behavioral finance draws heavily on the idea of information cascades. In particular, the herding phenomenon has caught the attention of Dr. Antonio Guarino and Marco Cipriani. In stock trading, herd behavior is the act of disregarding one’s own private information in favor of previous traders’ actions. The results of their experiments are very interesting. For instance, one of their conclusions was that herd behavior is limited when traders trade using information alone. However, this holds only in a frictionless market, because herding almost always emerges once transaction costs are introduced. More interestingly, this research has implications for the study of financial contagion, in which herding spreads between markets.
Guarino points to “social learning” as the rationale behind the herding behavior of traders. Much like the simple model introduced in class, traders look to learn from the actions of the traders who precede them. The first few traders may trade based on the signals from their private information, or perhaps for liquidity or other non-informational reasons. For example, a trader may have information that leads him to expect a firm to miss its earnings estimate, so he would be inclined to sell or short that stock. A second trader could arrive at the same conclusion based on his own information and sell as well. A third trader’s private information is now rendered useless: his signal is no longer a deciding factor, as it is eclipsed by the signals inferred from the first two traders’ actions. Of course, real financial markets, with all their volume and participants, cannot seriously be modeled in so simple a fashion, but this gives an idea of how it could work. Guarino states that informational reasons to trade erode in importance as non-informational reasons come to dominate, which generates herd behavior.
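To make the sequence concrete, here is a minimal Python sketch of the sequential model from class applied to buy/sell decisions: each trader gets one private signal of accuracy q and observes all earlier actions. The accuracy, the number of traders, and the random seed are illustrative choices of mine, not parameters from Guarino and Cipriani’s experiments.

```python
import random

def simulate_traders(true_value, q, n_traders, rng):
    """Sequential traders, each with one private signal of accuracy q.

    true_value is +1 ("the firm will beat estimates") or -1 ("it will miss").
    Each trader sees all earlier actions plus one private signal and buys
    (+1) or sells (-1). With the Bayesian rule from class (follow your own
    signal on a tie), every pre-cascade action reveals the trader's signal;
    once the revealed signals lean two or more to one side, a cascade starts
    and later private signals no longer matter.
    """
    revealed = 0                     # net count of signals inferred from past actions
    actions = []
    for _ in range(n_traders):
        signal = true_value if rng.random() < q else -true_value
        if revealed >= 2:            # buy cascade: own signal ignored
            action = +1
        elif revealed <= -2:         # sell cascade: own signal ignored
            action = -1
        else:                        # no cascade yet
            total = revealed + signal
            action = signal if total == 0 else (+1 if total > 0 else -1)
            revealed += action       # this action is informative to later traders
        actions.append(action)
    return actions

# How often does a market of 20 traders end up buying a stock that will
# actually miss its estimate (-1), despite 70%-accurate private signals?
rng = random.Random(3)
wrong = sum(simulate_traders(-1, 0.7, 20, rng)[-1] == +1 for _ in range(10_000))
print(f"final trader buys the bad stock in {wrong / 100:.1f}% of runs")
```

If the first two traders happen to draw optimistic signals, everyone after them buys regardless of their own information, which is exactly the point about the third trader above.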
This has important implications for stock fraud. Falsely propping a stock up and sending false signals can trigger an information cascade and cause a bubble to develop. For example, a wealthy investor can afford to buy enough shares of a stock over a short period of time to create the illusion of high demand. It is even possible, especially in emerging markets, to disguise oneself as several different entities, creating the illusion that several independent traders have bought the stock. In this fashion, the investor may set off an information cascade that causes the price of the stock to skyrocket. Of course, that investor will jettison his position at the peak of the bubble, leaving the rest of the herd to run off the cliff. Although the risk of such fraud is relatively low in the United States, investors should still be wary and keep the concepts of herding and information cascades in the back of their minds.
Information Cascades in the Motion Picture Industry modeled by Bose-Einstein distribution
Wednesday, April 11th, 2007 11:04 pm
Contributed by: rmw39
Bose-Einstein Dynamics and Adaptive Contracting in the Motion Picture Industry, Arthur De Vany, W. David Walls
Following an earlier post about information cascades and the movie industry that referenced Arthur De Vany’s example, I dug up a paper by Arthur De Vany and W. David Walls called “Bose-Einstein Dynamics and Adaptive Contracting in the Motion Picture Industry.” De Vany and Walls explain a very interesting phenomenon of the motion picture industry by modeling the information cascade in an audience, which can make a film either a hit or a flop, as a Bose-Einstein process that drives the dynamics of cinema screen revenue.
As a matter of background, the Bose-Einstein distribution comes from statistical mechanics and describes the statistical distribution of bosons. Einstein predicted that if a Bose gas were cooled to almost absolute zero, the bosons would suddenly crowd together into the lowest energy state; the single quantum state so achieved is called the Bose-Einstein condensate. It is this tendency of particles to “crowd together” into the same state under the Bose-Einstein distribution that is relevant to modeling audience behavior in the movie industry.
De Vany and Walls analyze how the movie industry’s revenue is distributed across the movies being shown by looking at audience behavior over a movie’s run at the cinema. We start with a Bayesian probability that some movie-goer will choose a certain movie out of the set of movies being shown. Since successive outcomes of this random variable are not independent, each trial is modeled as a Bayesian movie-goer who updates information about the quality of a movie by observing previous trials. The probability that a new movie-goer sees a movie therefore depends on who has already seen it (and reported positively on it) and on how the other viewers are distributed across the other movies. The unconditional distribution of cinema revenue generated by this succession of choices is then shown mathematically to follow a Bose-Einstein distribution, in which the probability that a customer selects a particular movie is proportional to the fraction of all previous movie-goers who selected that movie, so that past selections attract new selections of the same movie. This produces potential leverage for movies that undergo an information cascade of this sort, where differences between movie revenues can grow exponentially. It also reflects the property of the Bose-Einstein distribution that all possible allocations of viewers to movies are equally likely (so the outcome in which 20 people see one movie and only 1 sees another is just as likely as an even 10-11 split).
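For intuition, here is a tiny Polya-urn-style simulation of the selection rule described above: each new movie-goer picks a movie with probability proportional to the number of previous movie-goers who picked it. The number of movies, the audience size, and the one-viewer “seed” per movie are my own illustrative choices, not values from De Vany and Walls.

```python
import random

def simulate_audience(n_movies=10, n_viewers=5000, seed=1):
    """Polya-urn-style sketch of the proportional selection rule:
    each new movie-goer picks a movie with probability proportional to
    the number of previous movie-goers who picked it. Every movie starts
    with one 'seed' viewer (an assumption) so all have a nonzero chance."""
    rng = random.Random(seed)
    counts = [1] * n_movies
    for _ in range(n_viewers):
        r = rng.uniform(0, sum(counts))
        cum = 0
        for movie, c in enumerate(counts):
            cum += c
            if r <= cum:
                counts[movie] += 1
                break
    return sorted(counts, reverse=True)

shares = simulate_audience()
total = sum(shares)
# The final shares are typically far from an even 10% split,
# even though every movie starts out identical.
print([round(c / total, 3) for c in shares])
```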
The moral of the story is that Hollywood production companies cannot form accurate expectations of a movie’s revenue. Because of information cascades, modeled in this paper as a Bose-Einstein process, movies that are expected to do well can rapidly lose viewers, and low-budget movies can become hits when they receive a mass of positive reinforcement from early viewers and then attract new viewers exponentially. This is precisely what happened to the $175M production “Waterworld,” which flopped with $88M in revenue, while the low-budget, no-big-star film “Home Alone” grossed almost $300M. The stochasticity of how well Hollywood movies do, such that production companies can never accurately predict success, reminds me a great deal of the book A Random Walk Down Wall Street, which popularized the idea that a monkey throwing darts at a board of stock names can typically out-do the average professional institutional investor; that is the best analogy I can think of for production companies and the scripts they choose to produce.
Posted in Topics: Education
E85: Why Network Effects Hinder the Use of Ethanol-Based Fuel
Wednesday, April 11th, 2007 10:44 pm
Contributed by: R
http://72.14.205.104/search?q=cache:ia3K8y8XtsMJ:globalwarming-factorfiction.com/2007/04/02/pump-games-fill-up-with-ethanol-one-obstacle-is-big-oil/+Pump+Games:+Fill+Up+With+Ethanol%3F+One+Obstacle+Is+Big+Oil&hl=en&ct=clnk&cd=2&gl=us&client=firefox-a
Article in Wall Street Journal 2 April 2007
“Pump Games: Fill Up With Ethanol? One Obstacle is Big Oil”
The author of this article describes the current prospects of the ethanol-based fuel E85 (85% ethanol, 15% gasoline) as a replacement for gasoline in U.S. transportation. Many believe that ethanol fuel is one possible way to reduce our dependence on imported oil. Furthermore, E85 burns cleaner than gasoline and is therefore better for the environment. The one downside of E85 is that it is slightly more expensive in terms of price per mile driven. However, that alone does not explain why consumers have been so slow to adopt ethanol fuel.
To begin, only cars built to be E85-compatible can use this fuel. Although Ford and GM are currently developing “flex-fuel” cars that can run on either E85 or gasoline, very few E85-compatible cars are currently available. The second problem is the main obstacle: as a consumer, where can I buy E85? There are currently only about 1,000 gas stations in the US, roughly one percent of the total, that offer E85. Many point to big oil companies as the main obstacle to providing E85. For example, since gas stations are franchises, the owner of a franchise could in principle offer E85. However, the major oil companies prohibit this because they do not supply E85 and do not want their franchises selling products that aren’t theirs. I believe, however, that another major factor is a network effect.
Imagine yourself as the owner of a gas station. You happen to have permission from your franchise’s oil company to sell E85 at your pumps. However, to provide E85 you need to install a new tank and pump system, since E85 cannot be mixed with gasoline, so a large initial investment is required. Here is where the network effect takes hold. Why would you pay so much money to provide an ethanol-based fuel that won’t sell very well because of the limited number of customers? Likewise, as a customer, I would not want to spend money on a new ethanol-compatible car if the fuel is too hard to find. The fact that gasoline is so readily available and easy to sell deters both sellers and consumers from making the switch to E85.
Perhaps government subsidies and rising oil prices could help E85 sales. If enough consumers begin to adopt E85, this could trigger a network cascade in which more stations provide it, which in turn convinces more consumers to buy it. For the moment, however, it seems that subsidies from government and non-government organizations to help stations provide E85 are what keep this type of fuel alive.
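As a rough illustration of this chicken-and-egg dynamic, here is a toy two-sided adoption model: drivers adopt flex-fuel cars in proportion to station coverage, while a station installs an E85 tank only if enough flex-fuel drivers exist to justify the investment. All thresholds and sensitivities are made-up numbers chosen to exhibit a tipping point, not estimates of the real market.

```python
def e85_adoption(drivers, stations, rounds=60):
    """Toy two-sided network-effect model with made-up parameters.

    drivers:  fraction of car buyers choosing flex-fuel (E85-capable) cars.
    stations: fraction of gas stations offering an E85 pump.
    Drivers respond to how easy E85 is to find; a station installs the
    extra tank only if enough flex-fuel drivers exist to cover the cost
    (a hard threshold of 15% here, purely illustrative). Updates are
    damped so the two sides adjust gradually.
    """
    for _ in range(rounds):
        drivers = 0.5 * drivers + 0.5 * min(1.0, 1.5 * stations)
        stations = 0.5 * stations + 0.5 * min(1.0, 2.0 * max(0.0, drivers - 0.15))
    return round(drivers, 3), round(stations, 3)

print(e85_adoption(drivers=0.02, stations=0.01))   # below the tipping point: fades out
print(e85_adoption(drivers=0.25, stations=0.20))   # a subsidized seed above it: takes off
```

Below the tipping point the two sides talk each other down to zero; a subsidized seed above it is self-sustaining, which is the cascade the post hopes for.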
Posted in Topics: Education
Information Cascade and Peer Pressure
Wednesday, April 11th, 2007 10:41 pm
Contributed by: blusmells
New Study Looks at Peer Pressure and Implications for Preventing
The above article is from HealthNewsDigest.com. It points out that “friends’ substance use is one of the most powerful influences that lead adolescents to use themselves.” However, a three-year Weill Cornell Medical College study found that students with “competence skills had a long-term effect in reducing the impact of friends’ substance use.” For clarification, competence skills are defined as “high-refusal-assertiveness” and “good decision-making skills,” which the researchers measured in the students over the three years of the study.
When someone succumbs to peer pressure, he is making a decision based on other people’s decisions, so this article and study can be analyzed using our information cascade theory. Say the decision is either accepting or rejecting substances. Where this study fits in is the distribution of signals. Since doing drugs has a negative payoff, we can say there is a probability q > 1/2 that a student receives a low signal telling him to reject substance use. According to the study, if we can instill these competence skills in our children, then we can increase the value of q and thus increase the probability that a student will reject drugs. Still, as we see in real life, an information cascade is still a possible result, and hence student social groups remain susceptible to drug use.
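To see how raising q helps, here is a small simulation of the standard classroom cascade model with that signal structure: students decide in sequence, and a “bad” outcome is a group that locks into accepting substances despite mostly low signals. The group size and number of trials are arbitrary illustrative choices.

```python
import random

def fraction_bad_cascades(q, group_size=15, trials=20000, seed=7):
    """Estimate how often a group ends up in a cascade of *accepting*
    substances even though each student's private signal says 'reject'
    with probability q > 1/2. Students decide in sequence, counting the
    signals revealed by earlier decisions and following their own signal
    on a tie (the standard classroom cascade model)."""
    rng = random.Random(seed)
    bad = 0
    for _ in range(trials):
        net = 0  # (# revealed 'accept' signals) - (# revealed 'reject' signals)
        for _ in range(group_size):
            signal = +1 if rng.random() > q else -1   # +1 = accept, -1 = reject
            if abs(net) >= 2:     # a cascade has formed: signal is ignored
                continue
            net += signal         # no cascade yet: the decision reveals the signal
        bad += (net >= 2)         # group locked into accepting
    return bad / trials

for q in (0.55, 0.65, 0.75, 0.85):
    print(q, round(fraction_bad_cascades(q), 3))
```

The estimated fraction of groups locked into accepting falls as q rises, which is the study’s point about competence skills, though it never reaches zero.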
Posted in Topics: Education, Health, social studies
Recognition of Cascade Behavior
Wednesday, April 11th, 2007 9:54 pm
Contributed by: Maddie
In a paper titled “Do Individuals Recognize Cascade Behavior of Others? An Experimental Study,” authors Tim Grebe, Julia Schmid, and Andreas Stiehler discuss the information cascade phenomenon and how it affects the value people place on a choice, which in turn reveals whether people recognize where they stand in a cascade. I will go over the qualitative results of this experiment and some of the questions it opens.
Much like the urn game in the previous assignment, this experiment sets up a game with two urns, A and B, one containing 3 black balls and 2 white balls and the other the reverse. A ball is randomly drawn (the signal) and the player decides which urn the ball came from. In addition, artificial agents are in place to select urns according to the principles of information cascades described in the paper by Bikhchandani et al., to eliminate the possibility of cascade errors. The main difference between this experiment and many others on information cascades is that the experimenters are interested not in the pattern of signals, or even in the players’ choices of urn A or B, but in how much money players are willing to pay to participate in the game and make predictions. The reasoning is that players will pay more if they think they have a better chance of predicting correctly; they are also asked for their subjective probability of being correct. For a more detailed description, please see the paper.
Information cascade principles from the Bikhchandani paper would lead one to expect that players will keep bidding higher prices until a cascade starts, and that once the cascade starts, the maximum price should stabilize and be equal for each prediction afterwards. In contrast, the experiment showed players continuing to raise the maximum price even in positions where a cascade had already formed (p. 8). The increasing maximum prices show that players learn from the guesses of the people before them, but don’t realize that those people were themselves guessing from observations of prior participants and may not have known any better (p. 1). They are not consistently aware that cascade behavior has occurred.
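To see why a rational maximum price should flatten out, here is a short calculation (my own simplification, using the 3-ball/2-ball urns so a signal is correct with probability 3/5) of the posterior a Bayesian player should hold after observing k earlier choices of urn A plus their own signal. Once a cascade has formed, additional observed choices add nothing, so the posterior, and with it a rational willingness to pay, plateaus.

```python
from fractions import Fraction

P_SIGNAL = Fraction(3, 5)   # a drawn ball matches its urn's majority colour with prob. 3/5

def revealed_net(choices):
    """Net number of informative A-minus-B signals revealed by earlier
    players' choices ('A'/'B'), under the standard cascade model: a choice
    is informative only while no cascade (|net| < 2) has formed."""
    net = 0
    for c in choices:
        if abs(net) >= 2:
            break               # cascade: later choices reveal nothing new
        net += 1 if c == 'A' else -1
    return net

def posterior_correct(choices, own_signal):
    """Probability that the rational prediction is correct, given the
    signals revealed by earlier choices plus one's own signal."""
    net = revealed_net(choices) + (1 if own_signal == 'A' else -1)
    odds_A = (P_SIGNAL / (1 - P_SIGNAL)) ** net     # prior is 1/2
    p_A = odds_A / (1 + odds_A)
    return max(p_A, 1 - p_A)    # the prediction follows the more likely urn

# A player who sees k earlier 'A' choices and also draws an A-signal:
# the posterior rises for the first two observed choices, then plateaus
# once the cascade has formed.
for k in range(6):
    print(k, float(posterior_correct(['A'] * k, own_signal='A')))
```

The experiment’s finding is that real bids keep rising past that plateau, which is exactly the failure to recognize the cascade.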
One may wonder what kind of subjects the experimenters used and how this would affect the results. In this case, they used people with business and economics backgrounds. I started wondering whether these participants had learned about information cascades in their courses, much as we are now, and still could not identify when a cascade began. On one hand, one would imagine that in cases such as fashion fads or movie selections, one could tell when one was basing decisions on public or herd mentality. But it is interesting to see that people often don’t realize that even if many people before them have chosen something, those dozens, hundreds, or even thousands most likely based their decisions on many others before them who had no more information about the true value of the choice. The paper zeroes in on a subtle point: experimenters and outside “viewers” may know a cascade has started, but it can be difficult to realize when you are in one yourself.
Link to PDF file for the pages referenced
http://www2.wiwi.hu-berlin.de/institute/wt1/papers/2006/251006.pdf
Also see this paper for a classic description of information cascades.
Sushil Bikhchandani, David Hirshleifer, and Ivo Welch. A theory of fads, fashion, custom, and cultural change as informational cascades. Journal of Political Economy, Vol. 100, pp. 992-1026, 1992.
Posted in Topics: General
My Own (mini) IM Cascade…
Wednesday, April 11th, 2007 9:54 pm
Contributed by: andrewjchatham
The Technology Adoption Lifecycle
When Google Talk came out a few years ago, I jumped right on it. It had a clean user interface and easy-to-use voice chat. I coerced a few close friends and family members to download it and give it a try… and it stuck. Now, granted, that switch was fairly simple for us to make - A was better than B, AND we didn’t have to give up B to switch to A - we could all still use AIM to talk to our other friends, while using Google Talk to talk to each other. Per the article above, we would be innovators - enthusiasts who liked technology for its own sake, and were willing to use something unknown.
As time progressed, our more technologically inclined friends saw what we were doing and were intrigued - especially by the minimal interface and the chat logging. And, because they didn’t have to give up AIM to start using Google Talk, they “switched” too. This begins to address one aspect of “technology evangelism” - promoting technology to people who would be likely to switch, if only they knew about it. These people simply needed to see Google Talk’s advantages and they were convinced; but they didn’t really seek it out themselves. They would fit into the early adopter category - users who were willing to take a chance on a new technology because they posited that the benefits were worth the risks.
Interestingly enough, there was even a chasm period, as described in the article. The initial wave of early adopters had slowed to a trickle - but it was still a trickle.
Eventually, though, with the right technology innovations on Google’s part (such as embedding Talk into Gmail and logging chats in Gmail), and a change in technology-evangelism style on my part, a cascade began to form. For one thing, I began to realize that more and more of my friends were using Google Talk - without my even mentioning it to them. This was largely a function of them getting Gmail (at each other’s suggestion/encouragement) and beginning to use the built-in Talk function. On the other hand, I began to realize how much I benefited from my friends using Google Talk (1. I could log my chats with them in Gmail and 2. they could greatly benefit from Gmail’s improvement over Cornell Webmail, and Gmail includes Talk), so I began to very strongly encourage many of them to switch - and they did. These would be the pragmatists (the early majority) - unwilling to adopt early because of the risks, but very willing to take advantage of the new technology and features.
Eventually, a small cluster of people within this larger cluster of Google Talk users that had been created (including myself) stopped using AIM completely. The number of people we knew that only had AIM began dropping closer and closer to zero, to the point where it stopped being worth having. This, in turn, created a new reason for the rest of our friends to switch to Google Talk - they had to if they wanted to talk to us.
This model is an interesting twist on the one discussed in class (with protocols A and B). Perhaps we can represent those who use both A and B as a new protocol, C. Switching directly from A to B is still quite hard, following constraints similar to those discussed in class. Switching from either A or B to C is rather easy - a function of both quality and the number of people using either C or the opposite protocol. Switching from C to A or B is rather hard - also a function of both quality (one protocol has to be perceived as far better in order to give up the other) and the number of people using the new target protocol (which probably needs to be very high for the switch to be worthwhile).
And so you can stage a product incursion of A into the market of B as follows (a rough simulation of this sequence appears after the list):
- Start with all B nodes, rather completely and strongly interconnected
- Convert some most-strongly-connected set of nodes to C - have initial C cluster
- Adjacent nodes begin to convert to C - C cluster grows, and a cascade starts
- Eventually, some cluster of C nodes decides to switch completely to A - have initial A cluster
- As C cluster grows, A cluster will grow, and conversely, as A cluster grows, C cluster will grow, and you have another cascade
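Here is one rough way to simulate that staging; the graph, thresholds, and update rules are my own guesses at formalizing the steps above, not the exact model from class.

```python
def ring_graph(n, k=4):
    """Ring lattice: each node linked to its k nearest neighbours."""
    nbrs = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k // 2 + 1):
            nbrs[i].add((i + d) % n)
            nbrs[i].add((i - d) % n)
    return nbrs

def simulate(n=60, seed_size=6, rounds=40,
             adopt_c_threshold=0.3, drop_b_threshold=0.1):
    """Sketch of the A/B/C incursion staged in the list above.

    State 'B' = old IM network only, 'C' = runs both, 'A' = new network only.
    A 'B' node adopts C once >= 30% of its neighbours are reachable on the
    new network (C or A); a 'C' node drops B entirely once <= 10% of its
    neighbours still need B. Thresholds and graph are illustrative guesses.
    """
    nbrs = ring_graph(n)
    state = {i: 'B' for i in range(n)}
    for i in range(seed_size):          # initial tightly-knit C cluster
        state[i] = 'C'
    for _ in range(rounds):
        new_state = dict(state)
        for i in range(n):
            frac_new = sum(state[j] in ('C', 'A') for j in nbrs[i]) / len(nbrs[i])
            frac_b_only = sum(state[j] == 'B' for j in nbrs[i]) / len(nbrs[i])
            if state[i] == 'B' and frac_new >= adopt_c_threshold:
                new_state[i] = 'C'
            elif state[i] == 'C' and frac_b_only <= drop_b_threshold:
                new_state[i] = 'A'
        state = new_state
        counts = {s: sum(v == s for v in state.values()) for s in 'BCA'}
        if counts['B'] == 0 and counts['C'] == 0:
            break
    return counts

print(simulate())   # ends with everyone on A, with the C stage bridging the switch
```

The run plays out the list: the seeded C cluster grows outward, interior C nodes whose contacts no longer need B drop to A, and the whole network eventually converts.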
Posted in Topics: Technology
Linguistic Differences and Diffusion
Wednesday, April 11th, 2007 9:45 pm
Contributed by: matt
Presumably, many of us have traveled to other parts of the country and heard things that sounded strange to our ears, beyond just an accent. There are many familiar differences we are often aware of, such as the name for a sweet carbonated beverage. In fact, people often rally around this debate, claiming that one of “pop” or “soda” is clearly better than the other. The website popvssoda.com shows a relatively current map of what different parts of the country call the beverage. But there are also noticeable differences in grammar within the country that seem worth discussing.
“The car needs washed.” If that sounds right to you, then you’re most likely from a narrow band in the middle of the country extending from the east coast to Montana. You also might be from southern Georgia or a small area encompassing central New York and northeastern Pennsylvania. The notable thing here is that these small areas that are completely surrounded by people who wouldn’t use this grammatical structure continue to use it.
The same thing happens with the use of a positive ‘anymore,’ for a meaning somewhat like ‘nowadays.’ One might say, “a good job sure is hard to come by anymore.” This is said in approximately the same narrow band of land from the east coast to Montana, as well as an isolated area in Phoenix, Arizona. The fact that even people in an area close to Phoenix, who probably have a lot of contact with people in Phoenix, don’t use ‘anymore’ in this way says something important.
Why is the threshold for spreading of lexical items or grammatical structure so high? It’s possible that there just isn’t enough intermingling of people who say these terms, but then how did they spread to different separate areas in the first place? There is a good deal of space between the people who say “the car needs washed” in southern Georgia and the people in the main continental drag of people who say it, yet there must have been a good deal of intermingling at some time. It’s also possible that people have developed a high tendency to stay with what they’ve been saying for a long time. Personally, if I heard “the car needs washed,” I would think that the person had forgotten to add “to be” before “washed.” Likewise, I would not even understand the use of a positive “anymore.” Even if I move to an area where this is said, I am unlikely to pick it up for dozens of years.
If we really wanted to answer this question, we might want to look at the map here from popvssoda.com and look at the counties where the large concentrations of “pop”-sayers and “soda”-sayers meet. What motivates their choice and why doesn’t “pop” or “soda” take over? Would there really be two clusters of people, refusing to change their term? Are the areas where pop and soda clash constantly in oscillation? Is it possible that one term will eventually win out, and if so which? To me, “pop” seems less likely to win, but then again, I grew up saying “soda.” However, it is possible that if there ever exists a true “official dialect” of American English, these differences could disappear. Until such a thing happens, there is unlikely to be a winner in the cases of “the car needs washed,” “anymore,” “soda,” and the like.
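Both the surviving grammar pockets and the stable “pop”/“soda” boundary are what the threshold model of cascades from class would predict: a tightly knit cluster resists an invading variant whenever the adoption threshold exceeds the fraction of “outside” neighbors its members have. Here is a minimal sketch with made-up parameters, looking only at whether the majority form invades the pocket.

```python
def pocket_survives(threshold, n=60, pocket=range(20, 35), rounds=100):
    """Does a contiguous 'pocket' of speakers of a minority construction
    (say, 'the car needs washed') survive when surrounded by speakers of
    the majority construction? Each pocket speaker abandons the construction
    once at least `threshold` of their 4 nearest neighbours on a ring don't
    use it. One-directional dynamics: we only ask whether the majority form
    invades; all parameters are illustrative."""
    uses = [i in pocket for i in range(n)]
    for _ in range(rounds):
        new_uses = list(uses)
        for i in range(n):
            if not uses[i]:
                continue
            nbrs = [(i + d) % n for d in (-2, -1, 1, 2)]   # 4 nearest neighbours
            outside = sum(not uses[j] for j in nbrs) / len(nbrs)
            if outside >= threshold:
                new_uses[i] = False
        if new_uses == uses:
            break                # nothing changed: the pocket is stable
        uses = new_uses
    return any(uses)

print(pocket_survives(threshold=0.6))   # high threshold: the pocket persists
print(pocket_survives(threshold=0.5))   # lower threshold: it erodes from the edges
```

In this sketch the pocket’s edge speakers have exactly half their contacts on the outside, so a threshold above one half leaves the pocket intact indefinitely, which is one way the isolated “needs washed” and positive-“anymore” regions could persist.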
(Geographic distribution of positive “anymore” and “the car needs washed” from the Atlas of North American English: Phonetics, Phonology and Sound Change: A Multimedia Reference Tool, by William Labov, Sharon Ash, and Charles Boberg. 2006. Mouton de Gruyter.)
Posted in Topics: Education, social studies
Network Analysis of Open Source Communities
Wednesday, April 11th, 2007 2:28 pm
Contributed by: cs662
http://freesoftware.mit.edu/papers/crowstonhowison.pdf
In this paper, The Social Structure of Free and Open Source Software Development, by Kevin Crowston and James Howison of Syracuse University, the social organization of open source development communities is studied from the perspective of social network analysis. In doing so, the authors contest conventional notions of how the structures of open source projects differ from those of corporate development. It is often assumed that while the latter is organized top-down, the open source community is decentralized, and that this free and open flow of information is the major requirement for its functioning. The authors note that some developers, for instance those of the Apache project, have recognized the importance of particular individuals in the process of writing software and have taken precautions to ensure that these crucial links will not be lost in the future; still, most people in the community have yet to take into consideration the structure of the network in which they work.
Because the writing of code depends, at least to some extent, on communication between developers, Crowston and Howison argue that the structure of interaction between individuals must be considered. This is in contrast to the majority of studies of network centralization, which are based solely on who writes more of the code. Instead, the authors look at a broad range of projects and analyze the communication between project contributors during the process of reporting bugs. By investigating this communication centrality across a number of different open source projects, the authors hope to dispel preconceived notions about the quantitative differences between open source communities and corporations.
The conclusions reached by Crowston and Howison are very interesting. The level of centralization in the networks they analyzed is distributed roughly normally across projects, and the negative correlation between centralization and project size is especially interesting: the initiator of a project plays a large role at its beginning, but as the project grows it becomes more decentralized. One way this study could be extended would be to inspect how differently sized corporations have different structures of communication superimposed on their development networks. Thus, like much of what we have studied in class, the structure of the network is closely linked to the path dependence of the flow of information, and this can be correlated with other unsuspected properties of the network, in this case the number of contributors. Finally, this study implies that participants in networks like these could become aware of the requirements for their survival and growth, and could ultimately begin to manipulate the structure of their own communication network to their benefit.
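For a sense of the kind of measure involved, here is a sketch of degree-based group centralization (Freeman’s formula) computed on two made-up bug-report reply graphs. This is one common operationalization of centralization, not necessarily the exact statistic Crowston and Howison compute.

```python
from collections import defaultdict

def degree_centralization(edges):
    """Freeman degree centralization of an undirected interaction graph:
    0 for a perfectly even network, 1 for a pure star in which one person
    answers every bug report."""
    degree = defaultdict(int)
    for a, b in edges:
        degree[a] += 1
        degree[b] += 1
    n = len(degree)
    d_max = max(degree.values())
    return sum(d_max - d for d in degree.values()) / ((n - 1) * (n - 2))

# Made-up bug-report reply graphs for two hypothetical projects:
star = [("maintainer", f"reporter{i}") for i in range(1, 8)]     # one central fixer
mesh = [("dev1", "dev2"), ("dev2", "dev3"), ("dev3", "dev4"),
        ("dev4", "dev5"), ("dev5", "dev1"), ("dev1", "dev3")]    # replies spread around
print(round(degree_centralization(star), 2))   # 1.0: highly centralized
print(round(degree_centralization(mesh), 2))   # much lower: decentralized
```

Computing a statistic like this per project, then comparing it against project size, is the sort of analysis behind the negative correlation described above.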
Posted in Topics: Science, Technology, social studies