This is a supplemental blog for a course which will cover how the social, technological, and natural worlds are connected, and how the study of networks sheds light on these connections.


Multi-Agent Modeling: Games on Networks

In class we alluded to the fact that computer simulation is one method of showing that the results of Network Exchange Theory can be replicated from probabilistic behavior rules enacted by computer programs. Not surprisingly, this method of computer simulation can be extended to explore a variety of other social phenomena. In fact, there is an emerging field of multi-agent simulation that is allowing for interdisciplinary work between social, physical, and computer scientists, exploiting the growing speed of modern computers in the pursuit of understanding through the modeling of artificially connected agents (people, animals, physical particles) and their dynamical interactions.

The basic idea behind agent-based simulation is to use computer code to represent the simple behavioral rules of social agents. Then, exploiting the significant availability of computing power, these computer “agents” are allowed to “interact” in modeler-specified ways. The goal is to obtain a visual or statistical representation of how these interactions between agents evolve over time and to examine the macro-scale trends that are built up from the micro-level interactions. These notions extend directly from those expressed by Thomas Schelling in “Micromotives and Macrobehavior.” One benefit of this type of modeling is that it allows the modeler to experiment with a large number of agents and to collect data on their behaviors on a much smaller time scale than any social experiment involving humans would allow. Also, unlike classical economic and sociological experimentation, this form of study allows for a focal shift from the equilibrium setting to a more dynamic point of view. With the ability to perform literally hundreds of agent interactions in a fraction of a second, the total number of interactions can be huge over any extended simulation run. With such a large data set and a short run time, the modeler can extract information about the changing behavioral norms as an experiment approaches its equilibrium.
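Schelling's own segregation model gives a flavor of how little code such a simulation requires. The sketch below is my own toy version, not any published model: agents of two types live on a line, and any agent with too few like-typed neighbors relocates at random. All parameter values are arbitrary choices.

```python
import random

random.seed(0)

# One-dimensional Schelling-style model: agents of two types ("A"/"B")
# occupy cells on a line and move if too few neighbors share their type.
N, RADIUS, THRESHOLD = 60, 2, 0.4
line = [random.choice("AB") for _ in range(N)]

def fraction_alike(cells, i):
    """Fraction of agent i's neighbors (within RADIUS) sharing its type."""
    lo, hi = max(0, i - RADIUS), min(len(cells), i + RADIUS + 1)
    neighbors = [cells[j] for j in range(lo, hi) if j != i]
    return sum(c == cells[i] for c in neighbors) / len(neighbors)

def unhappy(cells):
    return [i for i in range(len(cells)) if fraction_alike(cells, i) < THRESHOLD]

# Micro-rule: each round, one unhappy agent jumps to a random new cell.
for _ in range(500):
    movers = unhappy(line)
    if not movers:
        break
    i = random.choice(movers)
    agent = line.pop(i)
    line.insert(random.randrange(len(line) + 1), agent)

# Macro-observation: clusters of like agents emerge from the micro-rule.
print("".join(line))
```

The interesting part is exactly Schelling's point: nobody in the code wants segregation, yet the printed line tends to show visible blocks of A's and B's.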

This method of social experimentation has already provided some very interesting results. Many examples are presented by one of the field's pioneers, Dr. Robert Axtell; a few include models that attempt to recreate an ancient civilization, explore the formation of agent coalitions based on game-theoretic agent interactions, and test the evolution of behaviors when agents interact according to the Nash bargaining game. The Nash bargaining model, in particular, relates to our discussion of Network Exchange Theory; here agents are coded to negotiate the division of a good according to the payoffs listed in a 3×3 matrix, with adaptive behavior enabled by giving each player a memory of their previous interactions. The ease with which heterogeneity is built into each agent makes each simulation run slightly different, but the hope is to identify some long-term social equilibria and to note the different dynamics by which these rest states are achieved. Analogous to our model in class, this computer model results in the emergence of social classes based on an unequal (but still efficient) distribution of property. Another interesting twist to this model is the importance of social “tags” that help identify the various types of agents playing the bargaining game. The tags aid the adaptive measures an agent uses by letting it recall the previous plays of other similarly tagged agents. (The summary above is taken from the article “The Emergence of Classes in a Multi-Agent Bargaining Model” by Robert Axtell, Joshua Epstein and H. Peyton Young. Unfortunately I was not able to find a link for this paper, and used a paper copy received from Dr. Axtell as my source.)

The following link takes you to the book by Axtell and Epstein describing their modeling efforts in the creation of artificial agent civilizations:

http://encompass.library.cornell.edu/cgi-bin/checkIP.cgi?access=gateway_standard%26url=http://cognet.mit.edu/library/books/view?isbn=0262550253

The overall relevance of this post to the course is the usefulness of game-theoretic ideas in governing interactions between agents that may live on some (social or physical) network. The key feature of this method is that it allows for repeated play between agents (not necessarily the same pair of agents, however) and therefore for an evolutionary perspective on the dynamics by which macro-level equilibria are achieved through micro-level behaviors. This notion makes available the idea of agents living on a grid (in which case they interact with their physical neighbors) or on a more abstract social network (where interactions are only possible between socially connected individuals). Here the variety of network topologies and the rules of the games that the agents engage in make it possible to model a large variety of social interactions and, hopefully, to obtain some very useful results.

Posted in Topics: Science, Technology, social studies

No Comments

Network Models in Civilization IV

In the PC game Civilization IV players take control of civilizations, and through a progression of time from 4000 BC to 2050 AD, the player aims to develop this civilization and win the game by any of the six pre-defined victory conditions. One aspect of the game is technologies, which act as prerequisites for actions the civilization can take. This article describes the calculation needed to determine the cost of acquiring a technology. Included in this calculation is a modifier that depends on the number of civilizations known to the player's civilization. How the modifier affects the overall calculation can be modeled with network theory.

According to the article, the speed at which the technology can be researched is dependent on the number of known civilizations who have the technology compared to the total number of civilizations that started in the game.

Tech Known by Civilizations modifier = 1 + RDDW (0.30 * # known Civs who have the tech / # of Civs who started the game)
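Assuming “RDDW” denotes a round-down operation (the article's abbreviation is not expanded, so that reading, the rounding placement, and the example numbers are my guesses), the modifier could be computed like this:

```python
import math

def tech_known_modifier(known_civs_with_tech: int, civs_at_start: int) -> float:
    """Research modifier from the number of known civs that already have
    the tech. Assumes RDDW means rounding the bonus down to a whole
    percent; the real game may round differently."""
    bonus = math.floor(100 * 0.30 * known_civs_with_tech / civs_at_start) / 100
    return 1 + bonus

# With 3 of 7 starting civs known to have the tech:
print(tech_known_modifier(3, 7))  # 1.12
print(tech_known_modifier(0, 7))  # 1.0 (no known civ has it yet)
```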

This can be reflected as a star network where the player's civilization is the center node. The power of the player's civilization in this network diagram can be reflected in the $1 game that was taught in class, with each node only capable of performing one successful trade (which is an accurate representation, since once a civilization acquires a technology, it cannot acquire it again). As the number of nodes added to the star network increases, the amount of money the center node is able to extract from the outlying nodes increases. Similarly, as the number of known civilizations who have the technology increases, the ability of the player's civilization to research the technology increases, since it is expected that this civilization would have an easier time learning the technology when it has more sources to draw from (though this dynamic is not explicitly shown in the game). Unlike the $1 game representation, the benefit each node gains is not an actual dollar amount. The value for the player's node (the center node) is an assessment of the civilization's ability to research a technology, while the value of the outlying nodes represents the ability of the other civilizations to prevent the player from acquiring the technology.

Posted in Topics: General

No Comments

Power in Al-Qaeda

Article link: http://www.guardian.co.uk/alqaida/story/0,,2031450,00.html

This article takes stock of the strength of Al-Qaeda on the anniversary of the 3/11 Madrid train bombings. It is interesting to look at the structure of Al-Qaeda, particularly after the war in Afghanistan, which was purported to have largely disrupted it. This British report finds that Al-Qaeda is experiencing a resurgence, with a plethora of cells across the Middle East (particularly in Iraq and Afghanistan, although the report states that the Afghan cells are fringe groups and that the Iraqi groups are largely home-grown rather than composed of foreign disruptive agents) and Western nations (particularly in Europe). These cells share an ideology and an overarching coordination but otherwise operate largely independently. This Al-Qaeda has its hub in western Pakistan and has, contrary to the official claims of the U.S. and U.K. governments, been strengthened by the war in Iraq.

 

The remaining Al-Qaeda organization, so far as it can be made out, can be described as a network of roughly four tiers: the familiar leaders such as Osama bin Laden and Ayman Al-Zawahiri; a new “middle-management” that coordinates the propaganda, attacks, and training of recruits; the local groups in various countries that concern themselves with the concrete, localized preparations for an attack; and individuals in Western and Middle Eastern countries who are willing to carry out the attacks.

 

The interactions between tiers are largely unidirectional top-bottom to make it harder for investigators to trace back from the bottom up, although interactions within a tier can be two-way.

 

Al-Qaeda’s power lies mainly in its “middle-management” tier: certainly, the many local groups depend upon it for resources, and this tier also wields the power of exclusion over which local groups will receive Al-Qaeda’s official support and resources. As well, the middle-management tier may experience satiation; this tier has its pick of offers from Al-Qaeda cells all over the world but will only endorse so many projects, so localized cells must offer more sophisticated and perhaps more sensational attack plans in order to get the attention and support of the middle-management tier.

Recognizing the unique power of the middle-management tier, which is stronger than that of the figurehead top tier according to the criteria of power that we discussed in class, U.S. forces are now targeting the middle-management tier. However, this strategy has so far yielded little success because the “middle-management” tier has no dearth of willing replacements despite the low life expectancy associated with the job; the tier is a very large and extensive network of its own, making it impossible to wipe out in one fell swoop.

 

So what effect would capturing Osama bin Laden have on Al-Qaeda’s power structure? Little, it appears. While this would be a key political move in shoring up support for the war on terror in the West, eliminating bin Laden would do almost nothing to disrupt the day-to-day functions of Al-Qaeda as the middle-management tier would continue to keep Al-Qaeda working.

Posted in Topics: social studies

No Comments

Start-Up Aims for Database to Automate Web Searching

Link - http://www.nytimes.com/2007/03/09/technology/09data.html

The contemporary notion that technology becomes obsolete as quickly as it's developed has been applicable to many things, from processors to mp3 players, but not to the core of internet usability and e-business commerce - web searching. Historically, web searching has been done by querying a search engine that keeps a repository of sites it has crawled; it returns the websites that match the query, ranked by advertisements, popularity, and relevance. The problem with this classical model is that there is very little relational measurement among the queries made, and little structure.
As Esther Dyson mentions, “Most search engines are about algorithms and statistics without structure, while databases have been solely about structure until now.” The opportunities from this movement would affect users minimally, but would aid automated searches and help with such possibilities as automatic localization of equipment, intelligent devices, and advanced integration of previously isolated ideas and technologies. The relevance to what we're studying in Networks has to do with taking hyperlinks and other passive links and transforming them into relational ties, as in a social network, and using these elements of bridges and closure to make searches more accurate and relevant. Although these ideas aren't mentioned explicitly, through the use of database ideas and technologies these outcomes would be expected: taking the unidirectional links discussed in class, with bigger websites being linked to by smaller ones, and making those ties bidirectional based on relevance within the query and on how the results are organized in the database.
Personally, I feel this is a terrific idea not only for programmers and those who deal with automatic retrieval of search data, but also for those who want smarter devices and are annoyed with seemingly inane problems with technology, such as setting one's location with a remote or having a device know what service it's using or should expect to use (an ISP, cable, etc.). It is also a noble effort, like Wikipedia, to make information this powerful freely accessible and public, for the most part; it should encourage the growth of metaverse-like worlds (as in Second Life), where many things that occur behind the scenes are automated and in need of such a relational database for the internet.

Posted in Topics: Technology

No Comments

John Edwards: Campaign of the Future?

Article Link: http://blogs.zdnet.com/micro-markets/?p=1022

In this article, Donna Bogatin questions whether online social networks have benefited the campaign of former Senator John Edwards, a Democratic candidate for the 2008 election. She suggests that Edwards is trying to adopt a “User Generated Politics Approach,” but has failed. Bogatin defines this approach as using social networking sites as third parties that will help generate interest and “connections” to his campaign. So far, the campaign has decided to leave no stone unturned with this approach and has set up profiles at 24 social networking sites, including flickr, facebook, partybuilder (the Democratic party’s networking site), youtube, orkut, xanga, and livejournal. The profiles offer visitors opportunities to “connect” or “friend” Edwards, upload images, or download political posters and memorabilia. The Edwards website encourages social networkers to access these profiles, post hyperlinks to Edwards’ site in their homepages, and include Edwards’ image in their daily email correspondence.

Despite the scale of the campaign’s effort to reach out to these online networks, Bogatin labels their efforts a failure. She cites as evidence the fact that Edwards has no friends on Bebo, Tagworld, or VSocial. She also includes a testimonial from an online networker who “connected” with Edwards just because he had only 2 online friends on his networking website. On one site, she explains the lack of information provided by Edwards’ profile, which merely lists a to-do list: “Become your president. Donate Blood.” Bogatin also explains that the campaign is not using innovative networking strategies, but instead relying on the “third-party, impersonal, business model” of the networking service sites.

What may be even more interesting is exactly whom the campaign wants to contact, whom it actually contacts, and why. As we know from the Gladwell readings, it can be beneficial to know a lot of people (Edwards' profile as a Connector). But to get ideas to stick and tip, you need a Maven (Edwards' profile as an excellent source of information) and a Salesman. The sales pitch may only be effective if the right people link up to Edwards' profile. Partybuilder, the Democratic party's networking site, might target the best proportion of activists. But those activists might not necessarily be connected to Facebook Stalkers with 500-odd friends and extremely high levels of betweenness. Stickiness slows if Edwards' profile does not convey his platform well. Therefore, the User Generated Politics Approach as applied by the Edwards campaign has some limitations.

On the other hand, Edwards' campaign may be more attuned to web searching. By branching out to 24 online social networks, Edwards has made a splash in the Internet-business-politics community that publishes articles online about his effort. If Edwards' name comes up in x articles for a query like Edwards AND Internet or Edwards AND Online, the campaign might anticipate growing interest in their candidate because of his exposure. So even if their efforts fail, there is no such thing as bad news.

Posted in Topics: Education

View Comment (1) »

Wikipedia-Based Search Engine

(http://www.techcrunch.com/2006/12/23/wikipedia-to-launch-searchengine-exclusive-screenshot/).

Recently Wikipedia founder Jimmy Wales announced that he is planning to launch a new search engine within the year. Wales justified his new project by expressing his qualms with the current methods that major search engines use for finding “good” pages. He says that computers are bad at making judgments on how good a page is, but that this is an easy task for humans: “It usually only takes a second to figure out if the page is good, so the key here is building a community of trust that can do that.” The new search engine’s returning the top three results as Wikipedia content, and the remaining results as those that are linked to by Wikipedia, clearly establishes the people who have developed these entries as the “community of trust”.

It is interesting to consider Wales’ assumptions of the link structure of the web. Since it is unlikely that such a search engine would be able to survive with such a limited (although a claimed more useful) index, we can assume that it will index other pages (though not as many as major search engines). Perhaps they would start crawling the web at Wikipedia, following external links, and possibly stopping if it reaches a link a certain depth from Wikipedia.

By returning the top three results from Wikipedia, Wales is essentially endowing the pages in Wikipedia's catalog as the “ultimate authorities” of the web, independent of how the incoming/outgoing link structure of these pages may compare to other, non-Wikipedia pages. Searching through the external links of the entries for results establishes those external pages as strong authorities. This, in addition to Wikipedia's highly intra-connected nature, makes the Wikipedia pages “ultimate hubs” of the web. Pages that are reached by following a series of external links would have comparatively weak scores, unless they linked back to Wikipedia or one of its external links.
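The hub/authority language above is that of the HITS algorithm from class, and a minimal sketch makes the mutual reinforcement concrete. The three-page link graph below is invented purely for illustration: a Wikipedia-like page links out to two external pages, one of which links back.

```python
# Minimal HITS (hubs and authorities) iteration on a toy link graph.
links = {
    "wiki": ["ext1", "ext2"],
    "ext1": ["wiki"],
    "ext2": [],
}

hub = {p: 1.0 for p in links}
auth = {p: 1.0 for p in links}

for _ in range(50):
    # Authority score: sum of hub scores of the pages linking to you.
    auth = {p: sum(hub[q] for q in links if p in links[q]) for p in links}
    # Hub score: sum of authority scores of the pages you link to.
    hub = {p: sum(auth[t] for t in links[p]) for p in links}
    # Normalize so the scores don't blow up.
    for scores in (auth, hub):
        norm = sum(v * v for v in scores.values()) ** 0.5
        for p in scores:
            scores[p] /= norm

print({p: round(hub[p], 2) for p in links})
print({p: round(auth[p], 2) for p in links})
```

On this toy graph the “wiki” page ends up as the dominant hub, and its two external links end up with equal authority - exactly the effect described above, where being pointed to by Wikipedia confers authority.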

Finally, I would like to consider the effect of the “anyone can edit” nature of Wikipedia and the issues that may arise if the search engine were to catch on. There have been a number of cases of information vandalism on Wikipedia over the past few years (http://www.time.com/time/business/article/0,8599,1595184,00.html). Furthermore, in an attempt to get pages ranked more highly, people wouldn’t need to go through the trouble of trying to get highly ranked pages to link to them – they could simply add a link in a Wikipedia entry.

The new search engine would need to be robust to such vandalism and attempts at page-rank inflation. Endowed with a way to determine the “goodness” of an external page, Wikipedia may be able to stop external link spamming. For example, if a page does not have links from other reputable sources, the engine may reject it or require a certain number of users to verify its validity. This would enforce Wales' idea of allowing a trusted group of people to control the “goodness” of information returned by his searches.

It would also allow for the structure of a web centralized at Wikipedia to evolve much differently than the way the web today can evolve. Today, in order for a new page to be ranked highly, it needs to obtain links from other highly ranked sites. A page with useful information may go unnoticed by such sites and unknown to the world. However, by proposing the page as an external link from Wikipedia, the creator can make the page known without requiring it to be known by “major pages” first.

Posted in Topics: Education

No Comments

Blogs and Business

Investor’s Business Daily: Managing for Success

Blogs have become a popular networking source for all types of businesses. Besides having the potential to create new customers and interest in one's own company, blogs are a way for companies to learn what others are saying about them so they can make improvements as well. Martin Schwimmer used this type of marketing pitch in 2001 to get his business up and running; he created The Trademark Blog, which is maintained by the Schwimmer Mitchell Law Firm. In a way this can be related to the article “The Strength of Weak Ties” by Mark S. Granovetter. Martin Schwimmer did not know the people who responded to his blog. They happened to find him through a search engine, which eventually benefited both parties involved. The ways in which people hear or read about this specific blog can be considered weak ties. In 2006, approximately 50% of his referrals were a result of clients with whom Martin Schwimmer had been in contact through his blog.

Andy Sernovitz, the Word of Mouth Marketing Association's founding CEO and a marketing expert, works from his idea that “people are already talking about you and your business right now on social networks and blogs” (Managing for Success, Gary M. Stern of Investor's Business Daily). One of his ventures was WOMMA, with the mission of promoting and improving marketing through the use of blogs. If there is a blog pertaining to a certain topic and a specific company is somehow involved in the post, it will drive people to the website of that business. Then, within one's own social network, news about the product spreads, and business grows. One example of the ideas businesses use to improve their sales is offering contests, where the winner gets some product (e.g., furniture, as in the case of Smartbargains); this generates a great amount of “buzz” and new business for the company. Other companies join together to promote sales of two different products, as in the case of Starbucks and Mitch Albom, the author of “The Five People You Meet in Heaven.”

Other social networking websites that follow the same idea as The Trademark Blog and WOMMA include MySpace, Friendster, Linked-In, and Gather, to name a few. There are multiple ways to promote business through blogs; it is a new and effective tool for businesses and customers alike.

On a side note, here is how AIM Fight computes its scores, out to three degrees:

The score of a person's screen name is the sum of the current number of people who have that screen name listed as a buddy, out to three degrees. The score isn't an exact count of third-degree connections, but a measure of how many connections there are relative to other AIM users.
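That “out to three degrees” count is just a breadth-first walk of the buddy graph. A sketch of the idea (the buddy lists here are made up, and the real AIM Fight scoring is surely more involved):

```python
from collections import deque

# buddy_of maps a screen name to the set of people who list that name
# as a buddy (invented example data).
buddy_of = {
    "alice": {"bob", "carol"},
    "bob": {"dave"},
    "carol": {"dave", "erin"},
    "dave": set(),
    "erin": set(),
}

def three_degree_score(name, max_depth=3):
    """Count distinct people who list `name` (directly or transitively)
    as a buddy, out to max_depth degrees, via breadth-first search."""
    seen = {name}
    frontier = deque([(name, 0)])
    score = 0
    while frontier:
        current, depth = frontier.popleft()
        if depth == max_depth:
            continue
        for listener in buddy_of.get(current, ()):
            if listener not in seen:
                seen.add(listener)
                score += 1
                frontier.append((listener, depth + 1))
    return score

print(three_degree_score("alice"))  # 4: bob, carol, dave, erin
```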

Posted in Topics: Education

No Comments

I can’t get onto MSN!!!

Optical fibers are the bedrock of modern communications. Since their invention in the 1970s, over 300 million kilometers of optical fiber have been produced. One fiber has the ability to carry 60 million simultaneous telephone calls. These amazing properties are called into service every day in our global information network. So when a magnitude-7.1 earthquake shook the sea bed off the coast of Taiwan, where several large cable bundles passed through, nearly all of Asia was put under a telecommunications blackout of some magnitude or other.

The quake damaged both cables that connected Asia to itself and cables that spanned the Pacific to America. The breaks cut local internet capacity to less than 60% while cutting off American sites entirely. On an Asia-based forum (Alibaba.com), a user asked:

 

“How Did the Earthquake Near Taiwan affect your business in Sourcing from China?”

And received responses like:

 

“….we could not open, MSN, Yahoo and other international websites…All the messangers [sic] stopped for the first day. Yahoo take 3 days to restart and MSN also two days…”

 

And,

“i canot [sic] enter into website abroad :’(   :’(   :’(”

 

I personally had two friends who were in China at the time of the quake, both of whom recounted to me the agony with which they endured life without AIM, Email, and Facebook.

“The communications cables are not divided up per country, so no country will find its situation improving sooner or later than the others,” said Ku A-Jong, a spokesman for KT, the largest South Korean mobile service provider.

Fiber links are built in loops. A link from Taiwan to Singapore to Hong Kong would circle back to Taiwan. If one link breaks, data will immediately begin flowing in the other direction. The Chinese state news reported that China's largest telecom company, China Telecom Corp., was asking to borrow bandwidth from telecom companies in Europe and the U.S. through satellites (iht.com).

We remember from class that these links are built in loops, just as the first internet connections were built in the 1960s between MIT, Harvard, Stanford, etc. When one link broke, servers tried to reroute data within the network through the other lines, and to reroute data from America through Europe. Also, the trans-Pacific cables can be considered analogous to weak ties among friends. They are the conduits by which information from a faraway place gets to where you are. There aren't many of them, and if one of them is broken, there will be a loss of large amounts of information. Normally, the weak links in a social network are not called into service frequently, yet on the internet they are in constant use; thus all of East Asia has to suffer from not being able to get on MSN.
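The loop idea can be sketched as a tiny graph computation: model the fiber loop as an undirected ring, break one segment, and check that traffic can still flow the long way around. The cities and links below are illustrative, not the actual cable map.

```python
from collections import deque

# A fiber loop as an undirected ring of three illustrative landing points.
ring = [("Taiwan", "Singapore"), ("Singapore", "HongKong"), ("HongKong", "Taiwan")]

def reachable(src, dst, links):
    """Breadth-first search: is there any path from src to dst?"""
    graph = {}
    for a, b in links:
        graph.setdefault(a, set()).add(b)
        graph.setdefault(b, set()).add(a)
    seen, queue = {src}, deque([src])
    while queue:
        node = queue.popleft()
        if node == dst:
            return True
        for nxt in graph.get(node, ()):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return False

# Break the direct Taiwan-Singapore segment.
broken = [link for link in ring if link != ("Taiwan", "Singapore")]
print(reachable("Taiwan", "Singapore", ring))    # True: direct link
print(reachable("Taiwan", "Singapore", broken))  # True: via HongKong
```

A second break anywhere on the remaining path would return False, which is the situation the quake created: the loop's redundancy only protects against a single cut.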

Posted in Topics: Technology, social studies

No Comments

Animal Contests as Evolutionary Games

Article: http://www.americanscientist.org/template/AssetDetail/assetid/15654/page/1

Evolution is the governing force in nature, driving animals toward strategies better suited to their environments. Evolutionary game theory is extremely useful when analyzing the strategies utilized by animals in animal contests - whereby two animals in an encounter can choose different strategies based on their ecotype (habitat), information structure, and pattern of interaction (determination of reward based on the players' strategies). The stable strategies are those that allow the animal to fare best in these contests and thus will be the strategies passed on to their offspring.

Most animal contests are viewed under the general lens of the players (animals) assessing their own, as well as their opponent's, strengths and weaknesses based on some measure of physical prowess, in addition to other factors such as whether a territory under dispute is owned or being intruded upon by that player. However, many paradoxes arise under this broad view. How does one explain a threat by an animal if that threat increases the vulnerability of the signaler, why some animals will actually elect to leave a territory they own when faced with an intruder, or why in some cases fight time does not correlate negatively with the size difference between fighting animals? One can assume these are simply what they appear to be - paradoxes that don't fit into the general game assigned to animal contests. More insightful, however, is the approach that perhaps there are many different, coexisting games that provide a reward structure under which the actions witnessed are actually the most evolutionarily stable strategies.

The paradoxes described above are explained in detail within the article. These specific examples were chosen because they represent evolutionarily stable strategies in games different from those generally assumed to cover the whole of animal contests. Each of these games is, of course, subject to environmental conditions. For example, when damselflies engage each other, an opponent's size does not enter into a damselfly's decision to engage in a contest. Rather, it seems that each fly assesses its own reserves and then values winning in proportion to its potential remaining reserves. The environmental conditions in this case involve a coefficient of variation, which defines the dispersion of energy reserves about their mean, and a cost/benefit ratio, which compares the “reproductive cost of a spent unit of fat reserves to the eventual winner's reproductive benefit from a saved unit.” Similar techniques for describing other “paradox” situations are applied in this manner within the article (some pretty explanatory graphs are even included).

In class we have discussed simple games, such as the prisoners' dilemma and penalty kicks in professional soccer. However, when analyzing games in the natural world, the governing dynamics are often not as clear. It seems that though one type of game may work in general, the exceptions may actually be examples of entirely different games played under different parameters. Much time and effort must go not only into defining the payoffs of a game, but into the actual conditions and parameters that define each game, so that an observer can be sure they are analyzing a contest in the correct manner (though testing these predictions is extremely difficult).
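None of the article's games are fully specified above, so as a stand-in, the classic hawk-dove contest shows how an evolutionarily stable mix of strategies can be computed numerically. The payoff values below are textbook choices of mine, not from the article; when the fight cost C exceeds the resource value V, the stable state is a mixed population with a fraction V/C of hawks.

```python
# Hawk-Dove contest: V is the value of the disputed resource, C the cost
# of an escalated fight (illustrative values with C > V).
V, C = 4.0, 6.0
payoff = {
    ("H", "H"): (V - C) / 2,  # two hawks split the value but pay the cost
    ("H", "D"): V,            # a hawk takes everything from a dove
    ("D", "H"): 0.0,          # a dove retreats against a hawk
    ("D", "D"): V / 2,        # two doves split peacefully
}

# Replicator dynamics: the hawk share p grows when hawks out-earn average.
p = 0.1
for _ in range(2000):
    f_hawk = p * payoff[("H", "H")] + (1 - p) * payoff[("H", "D")]
    f_dove = p * payoff[("D", "H")] + (1 - p) * payoff[("D", "D")]
    f_avg = p * f_hawk + (1 - p) * f_dove
    p += 0.01 * p * (f_hawk - f_avg)

# With C > V the population settles at the mixed ESS p = V / C = 2/3.
print(round(p, 3))
```

This is the sense in which the article's “paradoxical” behaviors can still be stable: change the payoff structure (the game being played), and a very different mix of behaviors becomes the resting point of the same dynamics.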

Posted in Topics: Education

View Comment (1) »

Digg Swarm

swarm-example-1.gif

Digg Swarm is an online visualization of which stories are popular on Digg.com. Digg.com is a user-generated news content site where users submit news stories that are interesting to them. Popular stories get “Dugg” to the top by other users of the site, and the most popular stories get pushed to the front page, where millions of viewers can see them. The exact algorithm for getting pushed to the front page is unknown, but it has to do with how many diggs the stories get, who diggs the stories (more active users get more push), and how fast the stories get diggs.
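Since the real promotion algorithm is unknown, here is only a toy score combining the three ingredients just mentioned - digg count, digger influence, and digg speed. Everything about the weighting is invented for illustration:

```python
from datetime import datetime, timedelta

def toy_promotion_score(diggs):
    """diggs: list of (digger_weight, time_of_digg) tuples.
    A made-up score: total digger weight x digg count / hours elapsed,
    so heavily dugg, fast-moving stories score highest."""
    if not diggs:
        return 0.0
    total_weight = sum(w for w, _ in diggs)
    span = max(t for _, t in diggs) - min(t for _, t in diggs)
    hours = max(span.total_seconds() / 3600, 1.0)  # avoid divide-by-zero
    return total_weight * len(diggs) / hours

now = datetime(2007, 3, 1, 12, 0)
fast = [(1.0, now + timedelta(minutes=i)) for i in range(30)]  # 30 diggs in 30 min
slow = [(1.0, now + timedelta(hours=i)) for i in range(30)]    # 30 diggs in 30 h
print(toy_promotion_score(fast) > toy_promotion_score(slow))  # True
```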

In the visualizations, the users are the little yellow circles and the stories are the big white-bordered circles. Lines connect users to stories, and stories appear when they are dugg and disappear if they are not dugg for a specific amount of time. The same goes for users. The bigger a story is, the more diggs it has, and the bigger a user is, the more friends and influence it has. Lines also connect stories that have been dugg by the same user.

Digg Swarm is a focal network that is similar to the focus network that we had to conceptualize in homework 1. I watched for a long time, and I am pretty sure that it does not currently represent triadic or focal closure. However, this would be a neat (and useful) thing for it to represent in the future. I think the most beneficial thing to see now is how big stories attract even more users faster. (For example, try to find the stories that are the most popular - easy - just look for the one with all the little yellow circles rushing to it at once!) Digg Swarm would be a very useful model for researchers to base network models on in future studies.

Check out the visualization here:

http://labs.digg.com/swarm/

Posted in Topics: Education

View Comment (1) »