Wednesday, July 17, 2013

Metcalfe's Law: Recursing down the long tail of social networks


Guest Blogger Bob Metcalfe: Metcalfe’s Law Recurses Down the Long Tail of Social Networks


The blogosphere has started bubbling with some interesting discussion of how Metcalfe’s Law applies to current Web 2.0 dynamics like social networking. Some IEEE types, Brad Feld, Niel Robertson, a Ph.D. student named Fred Stutzman, my partner Sim Simeonov, myself, and a few others have posted on this in the last few weeks.
Bob Metcalfe, who invented the law in the first place and is my partner at Polaris (and who, along with Al Gore, invented the Internet…), offers his own view in a guest blog post below.

Metcalfe’s original insight was that the value of a communications network grows (quadratically, as it turns out) as the number of users grows.
All seem to agree that Metcalfe’s Law offers a good theoretical framework for thinking about Social Networks. Robertson argues that in addition to the number of users, the rank of a social network is another variable that should be considered when the law is applied to a social network as opposed to a communications network; Stutzman, on the other hand, suggests that one ought to add consideration of “the sum of actions and associations” enabled by a particular social network.
Not surprisingly, Metcalfe himself offers a more insightful and, I think, important contribution to the conversation — that to understand the value of a social network we need to consider not just the number of users but also the affinity between the members of the network.
Enjoy Bob’s post, and by all means please feel free to add your own comments…
Metcalfe’s Law Recurses Down the Long Tail of Social Networking
By Bob Metcalfe
Metcalfe’s Law is under attack again. This latest attack argues that the value of a network does not grow as the square of its number of users, V~N^2, like I’ve been saying for 26 years, but slower, V~N*Log(N). The new attack comes in a cover story by Briscoe, Odlyzko, and Tilly in a prestigious 385,000-member social network called IEEE SPECTRUM. And now they are saying that my law is not just wrong but also “dangerous.”
Below is the original slide (not PowerPoint, but 35mm) I used circa 1980 to convince early Ethernet adopters to try LANs large enough to exhibit network effects – networks larger than some “critical mass.”
[Image: metcalf.PNG, Metcalfe's original slide]
This slide was named “Metcalfe’s Law” by George Gilder in the September 1993 issue of FORBES and later in his book TELECOSM. Again, thank you, George.
Ethernet’s early adopters took this advice, and so my computer communication compatibility company, 3Com, prospered. Last year, according to IDC, 33 years after Ethernet’s invention at Xerox PARC, a quarter billion new Ethernet switch ports were shipped worldwide.
And now for some inconvenient truths. Al Gore famously claimed to have invented the Internet in the 1980s, which struck some of us as a little late. Like his father, Al Gore Senior, who claimed to have invented what is inexplicably called the Eisenhower Interstate Highway System, Al Gore Junior invented what he called the Information Superhighway. The actual Internet was invented, I think, either by BBN at UCLA in 1969 or at Stanford in 1973.
With his Information Superhighway, Gore invented not the Internet but the Internet … Bubble. I was present when Vice President Gore mentioned Metcalfe’s Law in an MIT commencement address, inflating his administration’s Internet Bubble. I helped Gore inflate the Internet Bubble by touting Metcalfe’s Law. I am not sorry.
There are people who think the Internet Bubble was the worst thing that ever happened, and I hope those people are satisfied now that Ken Lay is dead. To those people my law may be, as the SPECTRUM article says, dangerous. Because my law allegedly overestimates the values of networks, it might be used to inflate a second Internet Bubble, probably the imminent Social Networking Bubble, which will then inevitably burst. Can’t have that.
So, in IEEE SPECTRUM, Briscoe, Odlyzko, and Tilly debunk Metcalfe’s Law, again. It turns out that the value of a network does not grow as the square of the number of its users, V~N^2, but much more slowly, V~N*log(N), they figure. Cold water can now be thrown on the promoters of social networking. The bursting of a second Internet Bubble is thereby averted.
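The two competing formulas are easy to put side by side. A minimal sketch, with the constant of proportionality set to 1 purely for illustration (the function names are mine):

```python
import math

def value_metcalfe(n, a=1.0):
    """Metcalfe's original claim: V ~ A * N^2."""
    return a * n ** 2

def value_odlyzko(n, a=1.0):
    """The SPECTRUM article's alternative: V ~ A * N * log(N)."""
    return a * n * math.log(n)
```

At a million users, N^2 exceeds N*log(N) by nearly five orders of magnitude, which is the whole "cold water" dispute in one number.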
In renewed defense of Metcalfe’s Law, let me first point out that Al Gore has moved on to the invention of Global Warming. If a second Internet Bubble is to be inflated, I will have to do it without Gore’s hot air this time… Let’s get started.
Let me contrast Metcalfe’s Law with Moore’s Law. Moore’s and Metcalfe’s Laws are similar in that both begin with the letter M. They are different in that Moore’s Law is exponential in time while Metcalfe’s Law is quadratic in size.

Moore’s Law, which states that semiconductors double in complexity every two years, has been numerically accurate since 1965. Metcalfe’s Law, on the other hand, has never been evaluated numerically, certainly not by me.
Nobody, including Briscoe, Odlyzko, and Tilly in their SPECTRUM attack, has attempted to estimate what I hereby call A, network value’s constant of proportionality in my law, V=A*N^2. Nor has anyone tried to fit any resulting curve to actual network sizes and values.
As I wrote a decade ago, Metcalfe’s Law is a vision thing. It is applicable mostly to smaller networks approaching “critical mass.” And it is undone numerically by the difficulty in quantifying concepts like “connected” and “value.”
So, if the value of a network does grow as V~N*log(N), I challenge Briscoe, Odlyzko, and Tilly to prove it with some real network sizes and values. In the meantime, I’ll stick with V~N^2.
While they’re at it, my law’s critics should look at whether the value of a network actually starts going down after some size. Who hasn’t received way too much email or way too many hits from a Google search? There may be diseconomies of network scale that eventually drive values down with increasing size. So, if V=A*N^2, it could be that A (for “affinity,” value per connection) is also a function of N and heads down after some network size, overwhelming N^2. Somebody should look at that and take another crack at my poor old law.
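One way to take that crack, sketched with made-up numbers: let the affinity A decay with network size. The exponential decay form and the constant k below are pure assumptions for illustration, chosen only because they produce the peak-then-decline shape the paragraph speculates about:

```python
import math

def value_with_decaying_affinity(n, a0=1.0, k=1_000_000):
    """V = A(N) * N^2 with affinity A(N) = a0 * exp(-N/k).

    With this toy choice the value peaks at N = 2k and falls
    beyond it: the decaying A eventually overwhelms N^2.
    """
    return a0 * math.exp(-n / k) * n * n
```

Calculus confirms the turnover: dV/dN = 0 at N = 2k, so past two million users (with k = 1,000,000) this toy network's value heads down.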
But, if anybody wants to spend time on Metcalfe’s Law, let me suggest what are likely to be more fruitful paths. Accurate formulas for the static value of a network are fine, but it would be much more useful to understand the dynamics of network value over time. Also important would be linking Metcalfe’s Law to Moore’s Law and showing how that potent combination underlies what WIRED’s Editor-in-Chief Chris Anderson calls The Long Tail.
Metcalfe’s Law points to a critical mass of connectivity after which the benefits of a network grow larger than its costs. The number of users at which this critical mass is achieved can be calculated by solving C*N=A*N^2, where C is the cost per connection and A is the value per connection. The N at which critical mass is achieved is N=C/A. It is not much of a surprise that the lower the cost per connection, C, the lower the critical mass number of users, N. And the higher the value per connection, A, the lower the critical mass number of users, N.
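The critical-mass algebra above is simple enough to state directly; the numbers in the sanity checks are arbitrary:

```python
def critical_mass(c, a):
    """Break-even size from C*N = A*N^2: the N at which a network's
    value A*N^2 first exceeds its cost C*N is N = C/A."""
    return c / a

# Cheaper connections (smaller C) shrink the critical mass...
assert critical_mass(100.0, 2.0) > critical_mass(10.0, 2.0)
# ...and so do more valuable connections (larger A).
assert critical_mass(100.0, 2.0) > critical_mass(100.0, 20.0)
```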
Continuing to paint with a broad brush, I take Moore’s Law to mean that my law’s connectivity cost C — the cost of the computing and communication used to create connectivity — is halved every two years. Combining Moore’s and Metcalfe’s Laws, therefore, the number of users at which a network’s value exceeds its cost halves every two years. And that’s just considering C.
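Folding in Moore's Law, the same break-even formula acquires a time dimension. A sketch, assuming C halves every two years while A stays fixed (the function name and starting figures are mine):

```python
def critical_mass_after(years, c0, a0, moore_period=2):
    """If Moore's Law halves the per-connection cost C every
    `moore_period` years while the per-connection value A stays
    fixed, the critical-mass size N = C/A halves on the same
    schedule."""
    c = c0 * 0.5 ** (years / moore_period)
    return c / a0
```

So a network that needed 100 users to break even in year zero needs only 50 two years later, and 25 two years after that.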
I am reminded that the first Ethernet card I sold at 3Com in 1980 went for $5,000. By 1982, the cost was down to $1,000. Today, Ethernet connections cost under $100, perhaps as low as $5 per connection. Whatever the critical mass sizes of Ethernets were in 1980, they are a lot lower now.
But that’s not all. The denominator of C/A, the constant of value proportionality, A, has been going up. In the 1980s, Ethernet connectivity allowed users only to share printers, share disks, and exchange emails — a very low A indeed. But today, Internet connectivity brings users the World Wide Web, Amazon, eBay, Google, iTunes, blogs, … and social networking. The Internet’s value per connection, A, is a lot higher now, which means the critical mass size of the Internet, C/A, is a lot lower now, and for two reasons: cost and value.
Amazon connectivity among people and books allows my five-year-old book, INTERNET COLLAPSES, to be available still, with Amazon rank below 1,000,000. There’s eBay connectivity among people with ever more arcane things to buy and sell. There’s blogosphere connectivity among many more writers each with many fewer readers. Daily newspaper circulations have been going down since 1984, and there are now millions of active blogs, most of them very small. Blogs are an early form of social networking among growing numbers of smaller groups along ever more refined dimensions of affinity.
Social networks form around what might be called affinities. For each affinity, there is a critical mass size given by N=C/A, as above. If the number of people sharing an affinity is above this critical mass, then their social network may form, otherwise not. As Internet access gets cheaper and the tools for exploiting affinities get better, many more social networks will become viable.
Somebody should look at this. Somebody already has: Chris Anderson.
Moore’s and Metcalfe’s Laws bring us to Chris Anderson’s new book, THE LONG TAIL, which you should read immediately. (Actually, there’s no rush. Anderson’s book currently has double-digit rank at Amazon, like my book did five years ago. Take your time and you might even get THE LONG TAIL for next to nothing as it moves down Amazon’s Long Tail.)

Anderson’s Long Tail explains how, for example, more people are listening to music other than the Top 40 hits. Thanks to iTunes, even though there still is a Top 40, the fraction of music listening from down music’s Long Tail is increasing. It remains to be seen whether the growth of music’s Long Tail increases total music sales, which would be my guess, or whether it shifts revenues away from Britney Spears.
For another example of The Long Tail, millions of books like mine, which would otherwise be out of print, can still be found at Amazon.com and delivered in a day or two. Try buying INTERNET COLLAPSES, used if you must.
Let me leave as an exercise for the reader to develop the formulas for how Amazon’s Long Tail grows to the right as the combination of Moore’s and Metcalfe’s Laws biennially halves the critical-mass size of book audiences. Book buying generally shrinks with time, but I’m guessing that Amazon’s per book critical masses, its N=C/As, have been shrinking faster.
Similar formulas could quantify how Moore’s and Metcalfe’s Laws have also driven down the critical mass sizes (N=C/A) of Internet-enabled social networks and extended their Long Tail to the right. Looking more closely, I see that Metcalfe’s Law recurses. Just being on the Internet has some increasing value that may be described by my law. But then there’s the value of being in a particular social network through the Internet. It’s V~N^2 all over again. Down a level, N is now the number of people in a particular social network, which has its own C, A, V, and critical mass N.
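The recursion can be sketched in a few lines. Every number and affinity below is a placeholder; the point is only the shape: the same V ~ A*N^2 applied once at the Internet level and again inside each social network riding on it.

```python
def network_value(n, affinity):
    """Metcalfe's Law at one level: V = A * N^2."""
    return affinity * n * n

def recursive_value(internet_users, communities, internet_affinity=1e-6):
    """One level of the recursion. `communities` is a list of
    (members, affinity) pairs, each its own little Metcalfe network
    with its own C, A, and critical mass; all figures are placeholders."""
    total = network_value(internet_users, internet_affinity)
    total += sum(network_value(n_c, a_c) for n_c, a_c in communities)
    return total
```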
Of course the cost (C*N) of getting connected in a social network has been going down thanks to the proliferation of the Internet and its decreasing price. The value (A*N^2) of particular social networks has been growing with broadband and mobile Internet access. Emerging software tools expedite the viral growth and ease of communication among network members, also boosting the value of underlying connectivity.
So, if you want to spend time on V~N^2, and I hope you do, then forget minor refinements like V~N*log(N) and help inflate the next Internet Bubble by figuring out how Metcalfe’s Law recurses down The Long Tail of social networking.
Bob Metcalfe received the National Medal of Technology from President Bush in 2005 for his leadership in the invention, standardization, and commercialization of Ethernet. Bob is a general partner of Polaris Venture Partners, where he serves on the boards of Polaris-backed companies including Ember, GreenFuel, Infinite Power Solutions, Mintera, Narad, Paratek Microwave, and SiCortex.

Wednesday, July 10, 2013

Metcalfe's Law: More misunderstood than wrong

Metcalfe’s Law: more misunderstood than wrong?

The industry is at it again, trying to figure out what to make of Metcalfe’s Law. This time it’s IEEE Spectrum with a controversially titled “Metcalfe’s Law is Wrong”. The main thrust of the argument is that the value of a network grows O(n log n) as opposed to O(n^2). Unfortunately, the authors’ O(n log n) suggestion is no more accurate or insightful than the original proposal.
There are three issues to consider:
  • The difference between what Bob Metcalfe claimed and what ended up becoming Metcalfe’s Law
  • The units of measurement
  • What happens with large networks
The typical statement of the law is “the value of a network increases proportionately with the square of the number of its users.” That’s what you’ll find at the Wikipedia link above. It happens to not be what Bob Metcalfe claimed in the first place. These days I work with Bob at Polaris Venture Partners. I have seen a copy of the original (circa 1980) transparency that Bob created to communicate his idea. IEEE Spectrum has a good reproduction, shown here.

The unit of measurement along the X-axis is “compatibly communicating devices”, not users. The credit for the “users” formulation goes to George Gilder who wrote about Metcalfe’s Law in Forbes ASAP on September 13, 1993. However, Gilder’s article talks about machines and not users. Anyway, both the “users” and “machines” formulations miss the subtlety imposed by the “compatibly communicating” qualifier, which is the key to understanding the concept.
Bob, who invented Ethernet, was addressing small LANs where machines are visible to one another and share services such as discovery, email, etc. He recalls that his goal was to have companies install networks with at least three nodes. Now, that’s a far cry from the Internet, which is huge, where most machines cannot see one another and/or have nothing to communicate about… So, if you’re talking about a smallish network where indeed nodes are “compatibly communicating”, I’d argue that the original suggestion holds pretty well.
The authors of the IEEE article take the “users” formulation and suggest that the value of a network should grow on the order of O(n log n) as opposed to O(n^2). Are they correct? It depends. Is their proposal a meaningful improvement on the original idea? No.
To justify the log n factor, the authors apply Zipf’s Law to large networks. Again, the issue I have is with the unit of measurement. Zipf’s Law applies to homogeneous populations (the original research was on natural language). You can apply it to books, movies and songs. It’s meaningless to apply it to the population of books, movies and songs put together or, for that matter, to the Internet, which is perhaps the most heterogeneous collection of nodes, people, communities, interests, etc. one can point to. For the same reason, you cannot apply it to MySpace, which is a group of sub-communities hosted on the same online community infrastructure (OCI), or to the Cingular / AT&T Wireless merger.
The main point of Metcalfe’s Law is that the value of networks exhibits super-linear growth. If you measure the size of networks in users, the value definitely does not grow O(n^2), but I’m not sure O(n log n) is a significantly better approximation, especially for large networks. A better approximation of value would be something along the lines of O(Σc∈C mc log mc), where C is the set of homogeneous sub-networks/communities and mc is the size of the particular sub-community/network. Since the same user can be a member of multiple social networks, and since |C| is a function of n (there are more communities in larger networks), it’s not clear what the total value will end up being. That’s a Long Tail argument if you want one…
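For concreteness, the aggregation proposed here might be sketched as follows (the function name and the community-size list are mine, not from any measurement):

```python
import math

def aggregate_value(community_sizes):
    """Total value summed over homogeneous sub-communities:
    V ~ sum over c of m_c * log(m_c). Communities of size <= 1
    contribute nothing."""
    return sum(m * math.log(m) for m in community_sizes if m > 1)
```

Note that under this formulation, how value scales overall depends on how the number and sizes of communities grow with total users, which is exactly the open question in the paragraph above.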
Very large networks pose a further problem. Size introduces friction and complicates connectivity, discovery, identity management, trust provisioning, etc. Does this mean that at some point the value of a network starts going down (as another good illustration from the IEEE article shows)? It depends on infrastructure. Clients and servers play different roles in networks. (For more on this in the context of Metcalfe’s Law, see Integration is the Killer App, an article I wrote for XML Journal in 2003, back when I had spent less time thinking about the problem ;-) ). P2P sharing, search engines and portals, anti-spam tools and federated identity management schemes are just a few of the myriad technologies that have come about to address scaling problems on the Internet. MySpace and LinkedIn have very different rules of engagement and policing schemes. These communities will grow and increase in value very differently. That’s another argument for the value of a network aggregating across a myriad of sub-networks.
Bottom line, the article attacks Metcalfe’s Law but fails to propose a meaningful alternative.


Monday, July 8, 2013

Metcalfe's Law: Too optimistic

Researchers: Metcalfe's Law overshoots the mark

Summary: Rule that encouraged madcap expansion in the dot-com era turns out to have been, um, way too generous, scientists argue. ZDNet

Two University of Minnesota researchers have written a paper arguing that Metcalfe's Law, a rule of thumb that computes the value of communication networks, is overly optimistic.
Metcalfe's Law--a rule of thumb, really, that provided a rationale for aggressive expansion efforts during the dot-com boom--posits that the value of a network increases with the square of the number of devices in the network. But in a preliminary paper published March 2, Andrew Odlyzko and Benjamin Tilly of the university's Digital Technology Center concluded that the law "is a significant overestimate." In one example, where the law would find a network's value increased 100 percent, their calculations found only a 5 percent enhancement.
The two researchers proposed an alternative formula that heads in the same direction as Metcalfe's Law but doesn't go as far. The differences in the two laws explain why established network powers, such as AT&T, have resisted cooperation with smaller rivals.
Metcalfe's Law was a driving feature of the dot-com boom. Netscape co-founder Marc Andreessen, for example, argued that the law explained the surging amount of time people spent online using services from America Online, his employer at the time.
There is no shortage of laws in the computing realm. Moore's Law, by Intel co-founder Gordon Moore, describes the rate at which more transistor circuitry can be packed onto a single chip. And Amdahl's Law, by IBM mainframe designer Gene Amdahl, governs the performance boost gained by adding new processors to a computer. (Amdahl also is credited for another computing industry innovation, coining the term "FUD," the fear, uncertainty and doubt that one company's propagandists use to undermine a rival's product.)


'Fundamental fallacy'
Metcalfe's Law came from Bob Metcalfe, a founder of networking equipment supplier 3Com and coinventor of the now-ubiquitous Ethernet networking standard. According to the law, a network with 20 telephones--or alternatively, fax machines, instant-messaging teenagers or Internet-phone callers--is four times more valuable than a network with 10. A network with 30 nodes is nine times more valuable than one with 10.
Not so, Odlyzko and Tilly argue.
"The fundamental fallacy underlying Metcalfe's (Law) is in the assumption that all connections or all groups are equally valuable," the researchers report.
If Metcalfe's Law were true, there would have been tremendous economic incentives to accelerate network mergers that in practice take place slowly. "Metcalfe's Law provides irresistible incentives for all networks relying on the same technology to merge or at least interconnect."
The researchers propose a less dramatic rule of thumb: the value of a network with n members is not n squared, but rather n times the logarithm of n. That means, for example, that the total value of two networks with 1,048,576 members each is only 5 percent more valuable together compared to separate. Metcalfe's Law predicts a 100 percent increase in value by merging the networks.
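The researchers' 5-percent-versus-100-percent comparison checks out arithmetically. Here is the calculation for two networks of 2^20 = 1,048,576 members each (variable names are mine):

```python
import math

n = 1_048_576  # 2**20 members in each network

def nlogn(n):
    """The researchers' value rule: n times the logarithm of n."""
    return n * math.log2(n)

# Under n*log(n): two disjoint networks vs. one merged network.
separate = 2 * nlogn(n)
merged = nlogn(2 * n)
gain_odlyzko = merged / separate - 1   # exactly 0.05 for n = 2**20

# Under Metcalfe's n^2, merging doubles the total value.
gain_metcalfe = (2 * n) ** 2 / (2 * n ** 2) - 1   # exactly 1.0 (100%)
```

The 5 percent figure is exact at this size: merging changes n*log2(n) by the factor log2(2n)/log2(n) = 21/20.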
It's not a merely academic issue. "Historically there have been many cases of networks that resisted interconnection for a long time," the researchers say, pointing to incompatible telephone, e-mail and text messaging standards. Their network effect law, in contrast to Metcalfe's, shows that incumbent powers have a reason to shut out smaller new arrivals.

When two networks merge, "the smaller network gains considerably more than the larger one. This produces an incentive for larger networks to refuse to interconnect without payment, a very common phenomenon in the real economy," the researchers conclude.

Sunday, July 7, 2013

Saturday, July 6, 2013

Everything you need to know about the eReader price war

Everything you need to know about the great e-book price war

How the DOJ's antitrust lawsuit against Apple and the Big Six book publishers will affect the business of lit




Jeff Bezos (Credit: AP/Reed Saxon)
Closing arguments for the Department of Justice’s antitrust suit against Apple concluded last week, although U.S. District Judge Denise Cote is not expected to reach a decision for another couple of months. If you’ve found the case difficult to follow, you’re not alone. Still, it’s worth getting a handle on the basics, because the suit — or, more precisely, the business deals behind it — has changed book publishing in significant ways. Furthermore, Judge Cote’s decision could have impact well beyond the book industry.
Apple was charged with colluding with publishers to fix e-book prices. At the root of the dispute lie two different ways that publishers can sell books to retailers.
First, there’s the wholesale model, the way that book publishers have sold printed books to bookstores and other outlets for years. The publisher sets a cover price for a book, sells it to a retailer at a discount (typically 50 percent) and then the retailer can sell the book to consumers for whatever price it chooses.
The other method of selling books is via the agency model, which means, essentially, on commission. The retailer offers the book to consumers at a price the publisher sets and gets a percentage of whatever sales are made. It’s rare for print books to be sold in this way, but it’s the method Apple uses to sell content like music and apps in its iTunes store.
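The two models can be captured as a pair of toy functions. The 50 percent wholesale discount is the article's "typically" figure; the 30 percent agency commission is an assumed illustration, since the article only says the retailer "gets a percentage":

```python
def wholesale_split(list_price, retail_price, discount=0.50):
    """Wholesale model: the publisher sells at a discount off its own
    list price; the retailer then keeps whatever margin (or loss) its
    freely chosen retail price produces."""
    publisher_take = list_price * (1 - discount)
    retailer_take = retail_price - publisher_take  # can be negative
    return publisher_take, retailer_take

def agency_split(retail_price, commission=0.30):
    """Agency model: the publisher sets the consumer price and the
    retailer keeps a commission (30% here is an assumption)."""
    retailer_take = retail_price * commission
    return retail_price - retailer_take, retailer_take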
Until 2010 — as Andrew Albanese explains in his admirably lucid “The Battle of $9.99: How Apple, Amazon and the ‘Big Six’ Publishers Changed the E-Book Business Overnight,” a new “e-single” published by Publishers Weekly — book publishers had been selling e-books to Amazon using the wholesale model. They’d simply adapted the system they were already using to sell print books to the online retailer. This, they would soon realize, was a big mistake.
The wholesale model is widely seen as an odd way to sell e-books, since what the purchaser buys is “licensed access” to a digital file, rather than a physical object like a book. But what would torment publishers most about this arrangement was the freedom the wholesale model gave to Amazon to set the prices of e-books.
With the launch of the Kindle, Amazon promoted a low baseline price of $9.99 for most e-books. That meant that Amazon was selling virtually all newly published e-books at a loss. For example: A new book with a hardcover list price of $29.95 would be given an e-book price of $23.95 — 20 percent less to account for the publisher’s savings in printing, binding and distribution. The publisher would sell that e-book to Amazon for $12, and Amazon would retail it for $9.99, taking a $2 loss.
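The article's arithmetic, restated (small rounding differences aside: 20 percent off $29.95 is $23.96, and the per-copy loss is $2.01):

```python
# Figures from the article's worked example of a new hardcover.
hardcover_list = 29.95
ebook_list = round(hardcover_list * 0.80, 2)   # 20% off for print savings
wholesale_to_amazon = 12.00                    # roughly half the e-book list
amazon_retail = 9.99
loss_per_copy = round(wholesale_to_amazon - amazon_retail, 2)
```

So Amazon paid the publisher more for each newly published e-book than it charged the customer, which is what made the $9.99 baseline a loss-making price.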
Why would Amazon do this? Observers have proposed several motives. Perhaps Amazon aimed to entice heavy readers to the newfangled Kindle; the customer could tell herself she’d make up the cost of the device in savings on the books themselves. Others have suggested that cheap e-books were loss leaders that drew customers back to Amazon over and over again, presumably so they’d go on to purchase high-margin items like TVs.
The most popular theory by far holds that Amazon intended from the start to totally dominate the e-book marketplace. By using its wealth to subsidize the sale of e-books at a loss, it could drive any competitors out of the market. Bricks-and-mortar chains like Barnes and Noble and online start-ups like Kobo (both of which would introduce their own e-reader devices) or device-neutral rivals like Google would simply not be willing or able to bleed cash as long as Amazon could. And because the Kindle is a “closed platform” — Kindle e-books can only be read on Kindle devices or apps — the more Kindle e-books a customer owned, the more reluctant she’d be to switch to a different device.
Obviously, however deep its pockets, Amazon would not be able to go on selling e-books at a loss indefinitely. But once Amazon was cemented in place as the uncontested sovereign of e-book retail, it could do whatever it wanted: force publishers to reduce their own prices, and/or raise prices on consumers.
If this was the retailer’s strategy, it was initially an effective one. By the end of 2009, Amazon owned 90 percent of the robustly growing e-book market. Even though e-books still made up a small percentage of overall book sales, publishers finally saw the writing on the wall. Amazon had a near-monopoly and was furthermore devaluing books in the eyes of consumers — they began to think of books as worth $9.99, not $23.95. Book publishing is a low-margin business to begin with, and the mammoth retailer seemed poised to scrape even those minimal profits away.
At that point, Apple entered the scene with a hotly anticipated new device, the iPad, and plans to open its own e-book store. Needless to say, the nation’s largest book publishers looked upon this rich new Amazon competitor with keen interest. The trial at the U.S. District Court for the Southern District of New York this month has provided a record of what happened next.
As narrated by Albanese and other observers of the trial, Apple approached book publishers about making their titles available in the iBookstore. Apple felt that it needed at least four of the “Big Six” publishers to launch the store, and it entered into discussions with all six. Initially, Apple’s primary negotiator, Eddy Cue, assumed they’d purchase e-books via the wholesale model. A couple of the publishers he spoke with proposed the agency model for e-books, an idea that had been kicking around the book world for a few months.
Apple liked the idea. So did the publishers — they would make less money per e-book this way than they did by selling wholesale to Amazon, but they could live with that. What Amazon was doing wasn’t sustainable anyway. Under agency terms, publishers could control the pricing of their books and assert that $12.99 to $14.99 was a fair market value for most new titles. Although authors would also receive less in royalties from agency sales, the Authors’ Guild endorsed the move as the only alternative to watching “Amazon destroy the physical distribution chain” — that is, brick-and-mortar bookstores — in the words of Guild president Scott Turow. (If you want to know why bookstores are especially important to authors, read this.)
However, Apple needed a critical mass of publishers to participate. Otherwise their store would have too few desirable titles. And none of the publishers wanted to be the first to go out on a limb and risk being the only one selling their titles for three to five bucks more than everyone else. Last but not least, Apple knew it couldn’t make the iBookstore a success if it sold the most in-demand titles for dollars more than Amazon did.
By early April 2010, when the iPad and the iBookstore officially launched, five of the Big Six publishers had entered into agency deals with Apple. Each of those publishers had also informed Amazon that if the retailer wanted to continue selling their e-books, it would have to buy them on agency terms as well. As the publishers saw it, they’d stood up to a “bully.”
Amazon was so infuriated by this development that it punished the first publisher to demand agency terms, Macmillan, by removing buy buttons from all Macmillan books, digital and print, for a week. “We will have to capitulate and accept Macmillan’s terms because Macmillan has a monopoly over their own titles,” Amazon announced in a message to its customers after it finally restored the buttons — a curious statement, a little like complaining that George R.R. Martin has a “monopoly” on the writing of George R.R. Martin novels.
But Amazon had more than petulant retorts up its sleeve. Days after Macmillan delivered its new terms to Amazon’s Seattle offices, the retailer sent a white paper to the U.S. Department of Justice, accusing the publishers of violating antitrust regulations. By the next spring, Attorney General Eric Holder announced that the DOJ had filed a civil antitrust lawsuit against Apple and the five publishers who had agreed to sell books with the agency model through the iBookstore.
At issue was whether the publishers “colluded” together to set uniform prices or simply seized the opportunity presented by a new competitor to negotiate better terms with Amazon. DOJ argues that Apple is culpable because it deliberately served as a hub or conduit by which the publishers could reach an agreement among themselves. All five of the publishers, pleading financial constraints, have since settled with the government while admitting no wrongdoing. Apple is the only remaining defendant.
Initially, prospects looked dim for the tech giant. Before the trial started, Judge Cote said at a hearing, “I believe that the government will be able to show at trial direct evidence that Apple knowingly participated in and facilitated a conspiracy to raise prices of e-books, and that the circumstantial evidence in this case, including the terms of the agreements, will confirm that.”
However, many close watchers of the trial feel that Apple has made a strong case that if there was any colluding among the publishers, it did not enable it. By setting up agency-model terms with the publishers, Apple was merely making it possible to enter the e-book retailing market as a competitor to Amazon. The day before closing arguments, Judge Cote remarked that her views on the case “have somewhat shifted.” But we probably won’t find out just how much of a shift it’s been until the fall.
For Apple, the stakes remain high. If it is found to have violated antitrust law with the iBookstore, it will not be asked to pay damages. Instead, the DOJ will likely demand that Apple clean up its act and insist on overseeing its operations, perhaps as it did in settling the antitrust case against Microsoft in 2001. E-books make up a small sliver of iTunes sales, but the rest of the content offered there — music, apps, video — is obtained on very similar terms. Having the DOJ hovering over and meddling in future agreements would decidedly cramp Apple’s style. There are also pending and potential civil suits filed by states’ attorneys general and consumer groups seeking damages that would get a boost from a decision against Apple.
But for the Big Six publishers, this “defeat” has a surprising upside. They appear to have achieved much of what they wanted in the first place, which was not money but a more competitive e-book market and more control over the prices (and perceived value) of their books. Amazon has ceded 30 percent of the e-book market to competitors, and now buys most of the Big Six’s books on agency terms. The prices of popular Kindle titles, especially New York Times Best Sellers, are either identical to those in the iBookstore, or at most a dollar cheaper. And as Albanese pointed out recently in Publishers Weekly, even the payouts to consumers mandated by the settlements have a silver lining: The “monies — nearly $175 million in total … will be issued almost entirely as credits to e-book consumers’ accounts, meaning those funds will flow back to the publishers, almost like a court-ordered promotion.” Sometimes you can’t lose for winning.
Laura Miller
Laura Miller is a senior writer for Salon. She is the author of "The Magician's Book: A Skeptic's Adventures in Narnia" and has a Web site, magiciansbook.com.


Friday, July 5, 2013

Thursday, July 4, 2013

New pricing strategy for apps seems to be going well

New pricing strategy for apps going strong

Developers around the world adopt pricing model 'Fans in charge'


Pricing experiment app SnelTrein still ongoing. App developers worldwide start adopting new pricing strategy. http://pressdoc.com/p/00141u



On June 27th a Dutch app developer launched a radical new pricing strategy for apps. They call the pricing model 'Fans in charge'. It's free for anyone to adopt. Innovatio claims to be on a mission to heal the global app economy. A bit ambitious and crazy, but worth the risk. In a nutshell it comes down to this: an app gets more expensive every day until a day comes when the app is sold zero times. When that moment arrives, fans & early adopters get to decide on the final price tag of the app.


The motivation
Innovatio started the experiment because the company believes that the current app economy is sick and needs to be healed. Apps are pretty cheap, compared to other products in our daily lives, according to Innovatio. People happily pay $7 or more for a movie ticket, but find apps over $5 to be ‘unreal’.
The experiment
Last Thursday Innovatio launched a public transport app for the Dutch: SnelTrein. It’s a very cool and smart product that works really well. But that’s not the point. It’s the first app that uses this new pricing strategy. The app started at $0.99 and went up in price every day. It now costs $7.99 and is still selling. The developer is now waiting for a day with zero sales. When that moment arrives, fans & early adopters get to vote on the final price tag.
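The escalation rule is easy to sketch. The daily step size and the demand function below are assumptions, since the press release only says the price went up every day:

```python
def days_until_fan_vote(daily_sales, start=0.99, step=1.00):
    """'Fans in charge' escalation sketch: raise the price by `step`
    each day until a day records zero sales, then hand the final
    price decision to the fans. `daily_sales` maps price -> units
    sold that day (a stand-in for real demand)."""
    price, day = start, 0
    while daily_sales(price) > 0:
        price += step
        day += 1
    return day, price
```

For example, with a toy demand curve where daily sales fall off linearly with price, the loop stops on the first day demand hits zero, and that price would go to the fan vote.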
Early results
Although the number of sales and the revenue dropped as the price rose, it looks like revenue is bottoming out and starting to rise again. The big question is: how long will it take until a price is reached at which people stop buying the app? Early results are plotted in a chart (see attachments).
Pricing strategy gets adopted worldwide
Innovatio published the radical pricing strategy on TNW market. For free. So far 35 companies and developers have claimed the new pricing strategy for apps.
