This is a supplemental blog for a course which will cover how the social, technological, and natural worlds are connected, and how the study of networks sheds light on these connections.


Wikipedia to Writers: Prove Your Expertise

http://www.cnn.com/2007/TECH/internet/03/08/wikipedia.credentials.ap/index.html

            We all know that the internet is a large network that allows for easy dissemination of large amounts of information.  This is what has made Wikipedia such a powerful tool, as user submissions have resulted in 1.7 million articles in English alone.  What makes Wikipedia particularly useful is the anonymity of article submissions.  As Wikipedia founder Jimmy Wales explains, “anonymity puts a reader’s attention on the substance of what people have written rather than who they are.” 

            However, this anonymity also leads to abuses of the system, from fraudulent entries to juvenile vandalism of existing entries.  One of the cases that came to light recently is that of Ryan Jordan, a prominent poster who had claimed to be a theology professor.  Jordan had even recently been given administrative powers to block edits or users.  To rectify this problem, Wikipedia now requires those who claim expertise to prove it.  However, Wales maintains that users can still post anonymously if they choose not to claim expertise. 

            I think Wikipedia has achieved a delicate balance in keeping the information flow that makes the internet such an asset while putting measures in place to strengthen the validity of posts.  They faced a problem similar to what early search engines had – a need to refine the quality of information in their respective products.  Both were able to improve their services while maintaining the core purpose of their product.  For example, search engines refined searches without falling for website tricks (such as looking for “authority” type pages instead of pages that simply repeat a word over and over), and Wikipedia retains its anonymity feature while tightening controls on ‘expert’ posters.

Posted in Topics: Education

No Comments

Auctions and Networks to Affect Online Shopping

Every holiday season, I resort to doing the bulk of my shopping online because of the convenience and ease. And like any shopper, I always want to find the best price. I always find that I spend most of my time cross-referencing many online stores and checking the current auction prices on eBay. It’s a hassle that will hopefully soon meet its end.

Recently, I read about HammerTap’s acquisition of Auction Trust Network (ATN) [ article located at http://www.pr.com/press-release/31917 ]. HammerTap is a company that has long sought to help both buyers and sellers in online auctions by creating software for analyzing auction data. ATN is the producer of Reconnect™, a software tool used by eBay customers. With this technology, HammerTap now looks to “revolutionize the online shopping experience” by supplying online customers “with helpful purchasing information, such as average selling price, for specific items in the eBay marketplace.”

I think this acquisition is very relevant to our course. For some items, like baseball cards, price is determined by auction values, but for many items there is a discrepancy between an online merchant’s price and the average auctioned value. HammerTap is now going to make this discrepancy very public, which will increase competition and hopefully push merchants to minimize the gap between the current auction price and their own.

In class, we saw in the “Matching Markets with Intermediaries” example that two traders in perfect competition for one buyer and one seller will have to set prices such that their profit is zero, because the traders would keep price-cutting each other. In this example with HammerTap, we have one “trader” with a set price (the auction value), so the other trader would not have to price-cut much below the auction value.
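The price-cutting dynamic from that class example can be sketched in a few lines. This is only an illustration of the idea, not anything from the article; the starting margin and step size are arbitrary:

```python
# A toy run of the two-trader price-cutting dynamic: each intermediary
# repeatedly undercuts the other's margin (in cents) to win the single
# buyer-seller pair, until the margins are competed away.

def competing_margins(start_cents=30, step_cents=1):
    a, b = start_cents, start_cents   # each trader's profit margin, in cents
    while min(a, b) > 0:
        a = max(0, b - step_cents)    # trader A undercuts B
        b = max(0, a - step_cents)    # trader B responds in kind
    return a, b

# Undercutting stops only once one margin hits zero, with the other at
# most one step above it: profits are driven essentially to zero.
print(competing_margins())
```

With one "trader" pinned at the auction value, this undercutting race never has to run to completion, which is the point made above.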

Also, by merging in ATN’s technology, HammerTap will create a social network representing trust between different merchants and consumers. As the article explains, each person using the software will have a buddy list and share it with everyone on the list. This trust network adds additional pressure on online merchants to compete with online auctions and small-time sellers. The “auction versus online” price discrepancy discussed above could reflect paying more for a reliable source. If that is the case, HammerTap will allow a buyer to gain trust in a merchant indirectly, by seeing that someone the buyer trusts, trusts the merchant; unknown merchants with better prices can therefore gain customers exponentially (much like population growth), and online merchants will need prices that reflect this competition.

Posted in Topics: Education, Technology, social studies

No Comments

Mamma.com

http://mamma.com/info/about.html

http://www.googleguide.com/google_works.html

On March 8th, Mamma.com’s stock soared 27% after it reported earnings of $420,175 on revenues of $3.6 million. This is paltry compared to industry leader Google’s earnings of $997.3 million on over $3 billion in revenues. However, any profitability reported by an internet company is a good sign, as most tech companies have a hard time turning a profit. The appeal of Mamma.com is its self-touted “smart” metasearch engine. In essence, a query to Mamma.com is like searching multiple search engines at once, with the best results from all of them displayed together.

Google’s search engine works much like what was described in class. First, the internet is “crawled” by Googlebots and updated as needed. The results are then stored in Google’s index database. Last, when the user enters a query, the query processor searches the index and returns what it thinks are the best pages according to its ranking system. The exact system Google uses is not public information, but again it seems similar to what was discussed in class: a page’s rank, or level of relevance to the query topic, depends on factors like how many links point to it from other pages and the quality of those linking pages. In all likelihood, this process is iterated to give the best results.
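The iterated, link-based ranking described above can be sketched as a toy computation. Google’s actual algorithm is not public, so this is only a rough illustration of the idea; the four-page link graph and the damping value are invented:

```python
# Toy link-based ranking: a page's score depends on how many pages link
# to it and on the scores of those linking pages, iterated until stable.

links = {            # page -> pages it links to (hypothetical graph)
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}

def rank(links, damping=0.85, iterations=50):
    pages = list(links)
    score = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        # every page keeps a small base score...
        new = {p: (1 - damping) / len(pages) for p in pages}
        # ...and passes the rest of its score along its outgoing links
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * score[p] / len(outs)
        score = new
    return score

scores = rank(links)
print(max(scores, key=scores.get))  # "C" collects the most link weight
```

Note that C ranks highest not just because three pages link to it, but because one of those pages (A) is itself well linked, which is exactly the “quality of the linking pages” effect described above.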

Instead of taking Google’s approach of physically building up an index database by crawling the web, Mamma.com’s metasearch engine piggybacks on the likes of the Google and Yahoo search engines. In essence, each time a user sends a query to Mamma.com, the metasearch engine sends out queries to multiple search engines and content sites and returns the results after ranking them by relevance. According to Mamma.com, one of the chief ways it sorts results is by using duplicate results from different engines: the more search engines a page turns up in, the higher its rank. This sort of search engine has become very popular, and its results are good as well. The main problem, of course, is that metasearch engines rely on other search engines for their results. Without companies like Google and Yahoo, websites like Mamma.com would not exist.  For now, though, Mamma.com has a bright future ahead.
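The duplicate-counting idea can be sketched as follows. This is only a guess at the spirit of Mamma.com’s method, not its actual code; the engine names and result lists are invented:

```python
# Rank each URL by how many engines returned it ("votes"), breaking ties
# by the best position it achieved in any single engine's results.
from collections import defaultdict

def metasearch_rank(results_by_engine):
    votes = defaultdict(int)                       # engines returning the URL
    best_pos = defaultdict(lambda: float("inf"))   # best rank in any engine
    for results in results_by_engine.values():
        for pos, url in enumerate(results):
            votes[url] += 1
            best_pos[url] = min(best_pos[url], pos)
    # More engine "votes" first; among ties, the better single rank wins.
    return sorted(votes, key=lambda u: (-votes[u], best_pos[u]))

results = {
    "engine1": ["a.com", "b.com", "c.com"],
    "engine2": ["b.com", "d.com"],
    "engine3": ["b.com", "a.com"],
}
print(metasearch_rank(results))  # b.com (3 votes) ahead of a.com (2)
```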

Posted in Topics: Technology

No Comments

Baby Traffic

www.foreignaffairs.org/2003110faessay82611/ethan-b-kapstein/the-baby-trade.html

Well, I’m writing this blog entry about a slightly different topic than many may imagine; a more concerning one, I should think. The link above goes to a report on baby trafficking - a trade, as it were, that has grown in the last few years as more Westerners choose to adopt babies from poor countries. The notion of adopting these babies certainly does not sound all that bad - they would obviously have a better life in the West - yet the strict and often nonsensical restrictions that come with legally adopting a child from a poorer country often land Western families on rejection lists. It is here that black-market opportunists step in, offering to sell children. And though the end result is like an adoption, it is still an act that transforms human beings into tradable goods, i.e. slaves.

Posted in Topics: Education

No Comments

Price models in online advertising: CPA vs CPC

Google currently uses the Cost-Per-Click (CPC) pricing model, where advertisers pay based on the number of clicks on their advertisements. The value a business gets from such an advertising model ultimately comes down to the number of ‘actions’ the advertising triggers in consumers. In some cases, this is easy to measure: if a consumer buys a product after being directly referred by clicking on a Google advertisement, it is clear that the ad triggered an ‘action’.  In other cases, it is not as clear when the ad has caused an action, especially when an ad is being run for brand or product recognition. Regardless, it is often useful for a business to know its advertising cost per action generated. For example, if a retailer spends $2000 on advertising for 5000 clicks, and has determined that 500 of those clicks led to a purchase (an ‘action’), then the cost per action is $4, although the retailer is really paying $0.40/click.
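The retailer arithmetic above, worked out explicitly:

```python
# Cost per click vs. cost per action for the hypothetical retailer.
spend = 2000.0   # advertising spend in dollars
clicks = 5000    # clicks the ads received
actions = 500    # clicks that led to a purchase ('actions')

cost_per_click = spend / clicks
cost_per_action = spend / actions

print(cost_per_click)   # 0.4
print(cost_per_action)  # 4.0
```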

Some reports (http://internet.seekingalpha.com/article/12363) indicate that Google is considering the introduction of a Cost-Per-Action pricing model, where it charges not by the number of clicks but by the actual number of actions the ad generates. In the example above, the retailer could sign up and elect to pay $4/action instead of $0.40/click. One reason this is attractive is the concern over click fraud: when paying per click, the retailer risks paying for ‘fraudulent’ clicks, whereas when paying per action this is much less likely. Google also appears to have created an opportunity here through Google Checkout, a service where Google handles the monetary transaction of a purchase, akin to PayPal. This would allow Google to more easily track when an ‘action’ is generated from an ad click.

So what is the motivation for Google to introduce CPA advertising? It is not certain that CPA advertising will make its way out of the current testing phases; Google must see a way to increase its revenue by using it. If a company is currently paying $4/action under the CPC model and is not willing to pay more, then it does not benefit Google to move to a CPA model.

An interesting note about CPA relates to Google’s Quality Scores. In a CPC model, Google uses the ‘Click Through Rate’ as a metric in determining the relevance of an advertisement to a given keyword. In a CPA model, Google could use an “Action Rate” to determine the Quality Score. This is powerful because (perhaps not explicitly) a user now knows that prominent ads (high Quality Scores) have a higher chance of not only appearing relevant (likely for a user to click) but actually being relevant (likely for a user to act on).

It seems to me that this fact increases the value of the advertising to the user, which in turn should increase the valuations given by advertisers and the prices Google can charge for keyword slots, generating more revenue for Google. It also suggests why a company might be willing to pay a higher rate under a CPA model than under a CPC model. Additionally, if many companies start using Google Checkout in conjunction with CPA AdWords, that also increases revenue for Google.  Overall, it seems that the addition of a CPA model could increase revenue for Google, and time will tell if this is the case.

Posted in Topics: Education

Comments (2) »

Maximizing Survival in Duels, Truels and N-uels

In an absolute sense, it is fair to say that survival is the most important thing to a human being. When it comes down to it, even wealth is second to health: realistically, money is of no use when one can’t use it. As such, it is not surprising that research has been done on the game-theoretic aspects of dueling, which, even if it has no practical applications in our civilized world, is still interesting in its economic aspect. Historically, duels as ways of settling differences (swords over words) have been very important, with the Alexander Hamilton and Aaron Burr duel being perhaps the most famous for its tragic outcome.
Duels are formalized forms of combat, where two combatants armed with deadly weapons fight each other, often in the presence of their “seconds”. Duels have been around for a long time, and have often been used to settle points of honor or (imagined or real) affronts. According to an article in the Smithsonian Magazine on dueling, duels arise in many forms, including balloon dueling (with each combatant boarding a balloon and attempting to shoot down his opponent’s balloon) and billiard dueling (attempting to beat opponents senseless with billiard balls).
While game theory cannot do much to help in classical two-person duels (rigorous training, skill, bravery and luck will win most duels), there are certain precautions one can take to avoid fatal casualties. One is choosing appropriate dueling rules and conditions, such as halting the proceedings upon first blood and choosing proper weapons. Another (in the case of pistols) is to delope, deliberately firing into the air to imply the opponent is not worth shooting (and possibly averting a fatal conflict).
Duels involving more people, however, besides being more fun because of the richness of choices (whom to shoot, for one), are also mathematically more interesting. Truels (see an interesting paper from NYU), or duels involving three people, greatly change the rules of the game. Depending on the rules (order of shooting, time frame allowed per shot, number of shots…), individual marksmanship (usually represented mathematically as percentage accuracy) and the strategies chosen, the results can differ greatly.
In both An Introduction to Probability and Its Applications by Larsen & Marx and Fifty Challenging Problems in Probability by Mosteller there are problems involving truels. The rules are as follows: three duelists A, B and C with different firing accuracies (95%, 50% and 30%) face each other. To make it fair, firing is sequential, with C going first and then, assuming they survive, B and finally A. What is C’s best response? The standard textbook answer is that C should delope, as B will then engage in a shooting contest with A (A being the most dangerous), leaving C to fight the remaining, weakened opponent and maximizing his chances of survival.
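The textbook claim can be checked with a quick Monte Carlo simulation. The exact rules here are assumptions (one shot per turn, in the order C, B, A; every shooter aims at the most accurate remaining opponent; under the deloping strategy C deliberately misses only his very first shot):

```python
import random

# Accuracies from the textbook problem above.
ACCURACY = {"A": 0.95, "B": 0.50, "C": 0.30}

def truel(c_delopes_first, rng):
    alive = ["C", "B", "A"]
    c_turns = 0
    while len(alive) > 1:
        for shooter in ("C", "B", "A"):        # firing order each round
            if shooter not in alive or len(alive) == 1:
                continue
            if shooter == "C":
                c_turns += 1
                if c_delopes_first and c_turns == 1:
                    continue                   # C fires into the air
            # aim at the most accurate remaining opponent
            target = max((p for p in alive if p != shooter),
                         key=ACCURACY.get)
            if rng.random() < ACCURACY[shooter]:
                alive.remove(target)
    return alive[0]

def survival_rate(c_delopes_first, trials=20000, seed=1):
    rng = random.Random(seed)
    wins = sum(truel(c_delopes_first, rng) == "C" for _ in range(trials))
    return wins / trials

print("C survives (deloping):", survival_rate(True))
print("C survives (shooting):", survival_rate(False))
```

Under these rules, deloping on the first shot gives C a noticeably better survival rate than firing at A right away, matching the textbook answer.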
The math does not end here, however. According to this article, “Optimal play can be very sensitive to slight changes in the rules, such as the number of rounds of play allowed. At the same time, some findings for truels are quite robust: the weakness of being the best marksman, the fragility of pacts, the possibility that unlimited supplies of ammunition may stabilize rather than undermine cooperation, and the deterrent effect of an indefinite number of rounds of play (which can prevent players from trying to get the last shot). Some of these findings are counterintuitive, even paradoxical.”
There are many more in-depth research papers on truels and their variations, as well as on duels involving more than three combatants, called n-uels. While many of these involve advanced concepts like Markov chains, the essential ingredients remain the same (Nash equilibria and probability distributions), and they make for interesting reads. Who knows, they could save your life someday.

Posted in Topics: Education

No Comments

Yahoo’s New Advertising Platform

Yahoo Search Marketing (formerly known as Overture, and originally GoTo.com, the first pay-per-click advertising service) launched a new pay-per-click model on February 5th.

http://www.nytimes.com/2006/05/08/technology/08yahoo.html?ex=1304740800&en=658529acc1fc013a&ei=5090&partner=rssuserland&emc=rss

Yahoo’s new advertising platform, codenamed Panama, takes both quality and bid-price into account when determining ad ranks. This is nothing new - Google is already doing it. Google’s AdWords program uses a sophisticated algorithm with a grid of supercomputers to determine which ads should be placed on top of a particular search instead of choosing purely on price. Yahoo’s new platform will attempt to do just this, hoping to catch up to Google. Why bother taking page relevance into account, when you could just charge the top advertiser the most?

http://www.clickz.com/showPage.html?page=3625248

Say you have 3 advertisers with the following bids and CTRs (Click-Through Rates):

  • TaxACT bids $1.00 and has a 3.0% CTR
  • TurboTax bids $0.99 and has a 3.6% CTR
  • ExpressTaxRefund bids $1.02 and has a 2.9% CTR

Using Yahoo’s original platform, ExpressTaxRefund would rank first. However, under the new system, TurboTax would be much more likely to take the top spot. Even though Yahoo may not make as much per click with the new system, it is designed to increase the average CTR, which will in turn increase Yahoo’s total revenue from the system.
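The comparison is easy to verify with the numbers above: ranking by bid alone versus ranking by expected revenue per impression (bid × CTR), which is roughly what a Panama-style system optimizes. The exact formula Yahoo uses is not public, so bid × CTR is an illustrative stand-in:

```python
# (advertiser, bid in dollars, click-through rate)
ads = [
    ("TaxACT",           1.00, 0.030),
    ("TurboTax",         0.99, 0.036),
    ("ExpressTaxRefund", 1.02, 0.029),
]

# Old model: highest bid wins the top slot.
by_bid = sorted(ads, key=lambda a: a[1], reverse=True)

# Panama-style model: expected revenue per impression (bid * CTR) wins.
by_expected_revenue = sorted(ads, key=lambda a: a[1] * a[2], reverse=True)

print(by_bid[0][0])               # ExpressTaxRefund wins on bid alone
print(by_expected_revenue[0][0])  # TurboTax wins on bid * CTR
```

Per thousand impressions, TurboTax’s slot is expected to earn $35.64 versus ExpressTaxRefund’s $29.58, which is why the lower bidder can still be the more profitable choice for Yahoo.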

http://www.ecommerce-guide.com/news/article.php/3664101

After only a few weeks, Yahoo’s system has worked out well - click rates have gone up 5% and 9% in the first and second weeks respectively. It will be interesting to watch future developments with Google, Yahoo, and MSN as they borrow search engine concepts for use in their auction-based advertising systems.

Posted in Topics: General, Technology

View Comment (1) »

“Untruthfulness” in a Vickrey Auction

Testing Vickrey’s Revenue Equivalence Theory in Construction Auctions

Vickrey (“second-price”) auctions are often used because they induce truthful bidding. This inherent truth-telling creates an auction which, in the long run, operates on true valuations and procures socially optimal prices. However, the ideal properties Vickrey auctions hold can dissolve when more unusual attributes are applied to the auction.

In the construction industry, the form of the auction differs slightly from more common auctions. For instance, there is only one job (the buyer) for multiple contractors (the sellers). This mismatch, interestingly, shifts the auction onto the sellers rather than the buyers, meaning the sellers must compete with each other over how much compensation they will receive for their work. Obviously, the buyer is looking to complete the job at the least expense, meaning he is looking for the contractor with the lowest asking price. The final twist is that the auction is run over multiple rounds, allowing contractors to see each other’s bids after each round and readjust their bidding strategy for the next. The most common practice in the industry is to use sealed first-price (“lowest-price”) auctions to determine which contractor receives the job and at what price he will work. Though such auctions are commonly thought to produce relatively accurate valuations and prices, Drew and Skitmore saw cause for untruthfulness and uncertainty in bidding, leading them to wonder whether a Vickrey second-price auction would produce more socially beneficial results. They figured that in a first-price auction, prices were being driven up for buyers because the contractors would all be reluctant to bid their true value, instead raising their bids to safeguard their profit.

The best strategy in a first-price sealed low-bid auction (where the bidder submitting the lowest bid wins the contract at the value of that bid) is for a contractor to (1) assume his is the lowest bid, (2) determine that bid, and then (3) adjust it upward toward the second-lowest bid. However, in construction contract auctions, the “compactness” of the bids makes it virtually impossible to estimate the second-lowest bid with sufficient accuracy; the problem facing contractors is that increasing their bid too little leaves revenue on the table, while increasing it too much means losing the competition.

The hypothesis of the paper was that instituting a second-price auction would produce similar profit margins for the contractors while maintaining similar costs for the clients, though producing some external benefits as well:

[The revenue equivalence theory] implies that construction clients and contractors would be no worse off financially, irrespective of whether a FPA or SPA is used. There is likely to be a psychological difference, however, with contractors feeling they are getting a better deal. One outcome of this could be that contractors will be happier with their lot and less inclined to seek ways of extracting more money out of clients by cutting corners and/or claiming extras. In other words, a less disputatious industry may result—solving, at a stroke, what has been recognized over the years as the biggest problem facing the construction industry worldwide

They proceeded to run a simulation charting the differences between first- and second-price auctions for varying types of contracting jobs. Under the circumstances of these contracting auctions, the attributes were effectively a complete reversal of the normal Vickrey auction: it was the sellers, not the buyers, who bid on the price of the job, and instead of awarding the “item” to the highest bidder at the second-highest price, the lowest bidder gets the construction job and performs it at the second-lowest price.
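A stripped-down version of such a simulation is easy to run. Everything here is an illustrative assumption (four bidders, private costs uniform on $100 to $200, and the standard symmetric first-price markup for uniformly distributed costs), not the paper’s actual model:

```python
import random

def job_price(n, second_price, rng):
    # Each contractor privately draws a cost for doing the job.
    costs = [rng.uniform(100, 200) for _ in range(n)]
    if second_price:
        # Truthful bidding: winner is paid the second-lowest bid.
        return sorted(costs)[1]
    # First-price: each contractor shades upward toward the expected
    # second-lowest cost (symmetric equilibrium for uniform costs).
    bids = [c + (200 - c) / n for c in costs]
    return min(bids)

def average_price(second_price, trials=20000, seed=7):
    rng = random.Random(seed)
    total = sum(job_price(4, second_price, rng) for _ in range(trials))
    return total / trials

print("first-price: ", average_price(False))
print("second-price:", average_price(True))
# Both averages come out near $140: in this idealized model the client
# pays the same either way (revenue equivalence). The paper's finding
# that clients pay more under the second-price rule must therefore come
# from features this toy model omits, such as the "compactness" of real
# construction bids and the multi-round format.
```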

Would these characteristics significantly alter the usual ideal properties and practicality of second-price auctions, or would the same optimizing outcomes result? After a robust simulation and thorough statistical analysis, it turned out that buyers would actually be forced to pay more for the job (i.e. the lowest bids of the contractors went up, effectively raising the final price). The attributes of this type of auction caused a slight upward shift in bidding, and awarding the job at the second-lowest bid then raised the price further.

In class, we have recently been discussing the strategies behind keyword-based advertising, and though we have examined many methods of generating income, it has been concluded that even Google and the other search engines do not know the optimal strategy for producing the highest profits. This example shows how changing the attributes of an auction can cause it to deviate slightly, or in this case significantly, from its normal equilibrium properties. It should suggest to Google that, since it is dealing with a new and unusual form of advertising, experimenting with the characteristics of its ranking and bidding systems might well produce much better profit margins than the company currently enjoys.

Posted in Topics: General

No Comments

Google’s Holy Grail

Cracking Google’s Page Ranking

Lately it seems that anytime anyone talks about technology, all you ever hear about is “Google”. Google this, and Google that. Not only has Google worked its way into daily conversation; it dominates, and is king of, the Internet. A recent article discussed how websites continuously vie to rank high in a Google search. No one seems to know the secret behind what has been called Google’s “secret sauce” or the “Holy Grail of online marketing”: in plain terms, Google’s PageRank algorithm. Many have tried to crack it, mathematicians have tried to make sense of it, but no one has been able to uncover its mysteries. PageRank goes above and beyond simply assigning popular websites a high rank. A good analogy given in the article is that it’s like a popularity contest among all websites: everyone votes for who they think is the best page, and usually the one with the most votes wins. But Google goes beyond this. “Google looks at who voted for who, and assigns a value to the importance of the site that casts the vote and that site can pass on its popularity and importance to the site it linked to.” Getting a high page ranking from Google is not easy, though Google spokesmen claim that if one follows the guidelines, it is not that difficult either. Basically, the trick is to “pretend Google isn’t even there”. While Google has been accused of deliberate de-ranking, the company claims that every decision about which pages get what ranking is determined by its complex algorithm. Its number-one concerns are the quality of results and the prevention of spam.

As a frequent user of Google, I have never been left disappointed by what my searches return. Although it’s possible that Google can be a bit over-selective about which websites receive what page rank, Google does a good job of making sure that its users receive high-quality results and find exactly what they are looking for. It will be interesting to see how the recent lawsuits against Google pan out, and whether one day someone will crack the code of Google’s highly treasured PageRank algorithm.

Posted in Topics: Education

Comments (2) »

Problems with Online Social Networks

http://www.msnbc.msn.com/id/14325063/site/newsweek/

Picture this situation: You’ve just gone to an awesome dorm party and to celebrate, you decide to take pictures of the event and post them on Facebook to save the memories and share it with your friends. A few days later, you learn that these pictures were viewed by administrators and that you were getting punished for drinking in a dorm. This is exactly what happened to Brad Davis, a freshman at Emory, and this is a perfect example of the dangers of social networks.

“Web of Risks” discusses the negatives of social-networking websites. The purpose of these websites is to connect people from all over the world and to let people post interests, pictures, videos and other such info so that others can learn more about them and find people with common interests. However, the ability to share and find info so easily is also the bane of these networks. Generally, anyone can view anyone’s page and explore all its contents. This freedom makes websites such as Facebook risky for students. Facebook has become the main network for college students. Given the opportunity, many students, like Brad Davis above, tend to put up pictures of themselves doing unlawful activities that college students think are cool. However, administrators of a school can also sign up for Facebook if they are graduates of the college or student workers there. These administrators can easily get access to almost every picture on Facebook connected to their college, and those pictures can then be used as evidence to punish the students in them.

Although online social networks are great places to promote communication and find new friends, they are not all fun and games. It is still necessary to be careful about what is posted, for these websites are still subject to the same rules as the real world.

Posted in Topics: Education

No Comments