
Thursday, August 14, 2008

Web firms acknowledge tracking behavior without consent

Several Internet and broadband companies have acknowledged using targeted-advertising technology without explicitly informing customers, according to letters released Monday by the House Energy and Commerce Committee.

And Google Inc., the leading online advertiser, stated that it had begun using Internet tracking technology that enabled it to more precisely follow Web-surfing behavior across affiliated sites.

The revelations came in response to a bipartisan inquiry into how more than 30 Internet companies might have gathered data to target customers. Some privacy advocates and lawmakers said the disclosures helped build a case for an overarching online-privacy law.

"Increasingly, there are no limits technologically as to what a company can do in terms of collecting information . . . and then selling it as a commodity to other providers," said committee member Edward J. Markey (D-Mass.), who created the Privacy Caucus 12 years ago. "Our responsibility is to make sure that we create a law that, regardless of the technology, includes a set of legal guarantees that consumers have with respect to their information."

Markey said he and his colleagues plan to introduce legislation next year, a sort of online-privacy Bill of Rights, that would require consumers to opt in to the tracking of their online behavior and to the collection and sharing of their personal data.

But some committee leaders cautioned that such legislation could damage the economy by preventing small companies from reaching customers. Rep. Cliff Stearns (R-Fla.) said self-regulation that focuses on transparency and choice might be the best approach.

Google, in its letter to committee Chairman John Dingell (D-Mich.), Markey, Stearns and Rep. Joe L. Barton (R-Texas), stressed that it did not engage in what is potentially the most invasive of these technologies -- deep packet inspection, which companies such as NebuAd have tested with some broadband providers. But Google did note that it had begun to use the "DoubleClick ad-serving cookie" across its network -- a small identifier stored in the user's browser that allows Web-surfing behavior to be tracked.

Alan Davidson, Google's director of public policy and government affairs, stated in the letter that users could opt out of both DoubleClick ad serving and the Google content network with a single opt-out cookie. He also said that Google was not yet focusing on "behavioral" advertising, which depends on tracking users across websites.
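To make the mechanism described above concrete, here is a minimal, purely illustrative Python sketch -- not Google's or DoubleClick's actual implementation -- of how a third-party ad-serving cookie can follow one browser across otherwise unrelated sites: the ad server assigns a random ID the first time it sees a browser, and every later ad request from any site in the network carries that ID along with the referring page.

    import uuid
    from collections import defaultdict

    class AdServer:
        """Hypothetical ad server shared by many publisher sites."""
        def __init__(self):
            self.profiles = defaultdict(list)   # cookie ID -> pages where ads were shown

        def serve_ad(self, cookies, referring_page):
            user_id = cookies.get("id")
            if user_id is None:                 # first visit: set a tracking cookie
                user_id = uuid.uuid4().hex
                cookies["id"] = user_id
            self.profiles[user_id].append(referring_page)
            return "<ad served to %s>" % user_id

    # One browser (one cookie jar) visits two unrelated sites that both embed
    # ads from the same network; the server links both visits into one profile.
    ad_server = AdServer()
    browser_cookies = {}
    ad_server.serve_ad(browser_cookies, "news-site.example/politics")
    ad_server.serve_ad(browser_cookies, "recipe-site.example/cupcakes")
    print(ad_server.profiles)

In this simplified picture, opting out amounts to replacing the unique ID with a generic value so that no individual profile accumulates.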

But on its official blog last week, Google touted how its recent $3.1-billion merger with DoubleClick provided advertisers "insight into the number of people who have seen an ad campaign," as well as "how many users visited their sites after seeing an ad."

"Google is slowly embracing a full-blown behavioral targeting over its vast network of services and sites," said Jeffrey Chester, executive director of the Center for Digital Democracy. He said that Google, through its vast data collection and sophisticated data analysis tools, "knows more about consumers than practically anyone."

Microsoft Corp. and Yahoo Inc. have disclosed that they engage in some form of behavioral targeting. Yahoo has said it will allow users to turn off targeted advertising on its websites; Microsoft has yet to respond to the committee.

More than a dozen of the 33 companies queried said they did not conduct targeted advertising based on consumers' Internet search or surfing activities. But, Chester said, a number of them engage in sophisticated interactive marketing. On its site promoting "interactive advertising," for instance, Comcast touts its monthly reach: "over 3 billion page views, 15 million unique users . . . and over 60 million video streams."

Comcast declined to comment.

Broadband providers Knology and Cable One, for instance, recently ran tests using deep-packet-inspection technology provided by NebuAd to see whether it could help them serve up more relevant ads, but their customers were not explicitly alerted to the test. Cable One is owned by Washington Post Co.

Both companies said that they have ended the trials. Cable One has no plans to adopt the technology, spokeswoman Melany Stroupe said. "However, if we do," she said, "we want people to be able to opt in."

Ari Schwartz, vice president of the Center for Democracy and Technology, said lawmakers were beginning to understand the convergence of platforms. "People are starting to see: 'Oh, we have these different industries that are collecting the same types of information to profile individuals and the devices they use on the network,' " he said. "Internet. Cellphones. Cable. Any way you tap into the network, concerns are raised."

Markey said Monday that any legislation would require that consumers be explicitly informed of the type of information being gathered and of any intent to use it for a different purpose, and that they be given the right to say 'no' to that collection or use.

The push for overarching legislation is bipartisan. "A broad approach to protecting people's online privacy seems both desirable and inevitable," Barton said. "Advertisers and data collectors who record where customers go and what they do want profit at the expense of privacy."

As of Monday evening, the committee had posted letters from 25 companies on its website.

IBM VP: Office OpenXML a dead end, Microsoft will back ODF

During the LinuxWorld Expo in San Francisco, I met with Bob Sutor, IBM's vice president of open source and standards. We discussed document standards and the implications of ISO's controversial decision to grant fast-track approval to Microsoft's Office Open XML (OOXML) format.

Allegations of procedural irregularities in the OOXML approval process have raised serious questions about the integrity of ISO. Some national standards bodies complained that their views were disregarded during the OOXML ballot resolution meeting because of unreasonable time constraints. Some critics fear that the problems that arose during the ISO evaluation of OOXML will contribute to disillusionment and apathy towards open standards.

I asked Sutor if he thinks that the murky OOXML process will lead to uncertainty about standards in general. He doesn't accept that possibility and argues instead that the widespread public scrutiny received by OOXML is a sign that people are beginning to care deeply about open standards. He says that the extensive mainstream press coverage and public discussion about the OOXML decision helped to boost the visibility of ODF and increase awareness of standards-based technology.

Although he doesn't believe that the OOXML controversy will slow down adoption of open standards, he suspects that ISO will lose some credibility. The national bodies that were not permitted to present their proposals during the review process clearly feel that they were marginalized by blanket voting and other procedural shortcuts. This has led them to question the inclusiveness of ISO. The organization's dismissive response to the allegations has seriously exacerbated the issue and is reinforcing the perception of exclusivity.

ISO asserts that the approval of OOXML inherently validates the process. It also argues that the procedural shortcuts that blemished the OOXML process were permissible because they were instituted by the will of the majority. The ISO standardization model clearly favors consensus over technical validity, which means that it can provide no substantive guarantees that the standards it produces are actually sound. Sutor says that this revelation will compel some to question whether ISO is really a suitable venue for technical standards. In practice, the consensus-driven approach may be well suited for standardizing things like paper sizes, but not document formats.

If ISO declines to acknowledge the problems with its standardization model and pursue needed reforms, then it could face irrelevance, Sutor said. He pointed out that ISO has no inherent authority beyond the trust vested in it by national standards bodies and implementors of standards. He argued that anyone can create a standards organization and that other organizations will emerge to displace ISO if it loses public trust.

I asked him if he thinks that ISO approval of OOXML will drive implementors and adopters away from ODF. He has seen no evidence of such a trend and argued that uptake of OOXML has been slow. He claimed that the complexity of the standard has deterred acceptance and said that Microsoft's next-generation office suite hasn't significantly accelerated usage of OOXML in the wild. The vast majority of existing documents are already in the old binary formats, and he contends that many users of Office 2007 still save new documents in those formats to maintain compatibility with older versions. He thinks that implementors want to tap into that massive legacy document base and don't see much value yet in supporting OOXML in their software.

He is convinced that the industry will regard OOXML as a dead end, and that will force Microsoft to accept ODF. We have already seen some evidence of this in Microsoft's recent decision to support ODF in Office—a change of heart that was precipitated by pressure from ODF adopters. Sutor suspects that this trend will continue and that Microsoft will eventually fully embrace ODF.

Sutor acknowledges that ODF lacks support for some of Office's functionality, but he is convinced that the gaps can be filled if Microsoft is willing to collaborate with OASIS and propose improvements to the format. His chief concern is that suspicion and distrust of Microsoft could undermine any collaboration, so he strongly encourages ODF advocates to keep an open mind and give Microsoft the benefit of the doubt if the company makes a bona fide effort to participate in the evolution of the standard.

If Microsoft isn't given the opportunity to participate in shaping the future of ODF or chooses not to, he says we will likely see the company embrace and extend the format with its own non-standard extensions. He hopes that the ODF community can collectively work with Microsoft to prevent the format from suffering that kind of fragmentation.

The road ahead for ODF is filled with both challenges and opportunities. Sutor sees a very promising future for open standards and is convinced that companies and governments are now capable of recognizing the value of interoperability.


Court: violating copyleft = copyright infringement

A federal appeals court has overruled a lower court ruling that, if sustained, would have severely hampered the enforceability of free software licenses. The lower court had found that redistributing software in violation of the terms of a free software license could constitute a breach of contract, but was not copyright infringement. The difference matters because copyright law affords much stronger remedies against infringement than does contract law. If allowed to stand, the decision could have neutered popular copyleft licenses such as the GPL and Creative Commons licenses. The district court decision was overturned on Wednesday by the United States Court of Appeals for the Federal Circuit.

The copyright holder in the case is Robert Jacobsen, the lead developer of the Java Model Railroad Interface, a software package used by model railroad enthusiasts. A firm called Kamind Associates downloaded parts of Jacobsen's project, stripped out the copyright notice and other identifying information, and began redistributing the modified version without Jacobsen's approval.

JMRI was released under version 1.0 of the Artistic License, which is also widely used in the Perl community. While not technically a copyleft license, it gives users broad freedom to use, modify, and redistribute the JMRI software provided that certain conditions are met. Jacobsen argued that Kamind's failure to comply with the terms of the license deprived it of any permission to redistribute the software and made it guilty of copyright infringement. The District Court for the Northern District of California disagreed, holding that the Artistic License granted an "intentionally broad" license to the JMRI software, and that violations of its terms should be viewed as mere breaches of contract rather than copyright infringement.



That matters because copyright holders can typically obtain injunctions against continued use of their works, whereas the winning party in a contract dispute more commonly receives monetary damages. That's problematic for free software because it's difficult to compute the monetary value of software that is given away free of charge. As a result, many violations of copyleft licenses could have been punished with small fines that would amount to little more than a slap on the wrist.

The decision outraged the free software community and the broader free culture movement. In December, a broad coalition of organizations that rely on copyleft licenses, including Creative Commons, the Software Freedom Law Center, and the Wikimedia Foundation, filed an amicus brief urging that the ruling be overturned. The brief, authored by lawyers at Stanford's Center for Internet and Society, pointed out that there are now hundreds of millions of works released under licenses similar to the Artistic License and suggested that the district court had failed to appreciate the potential consequences of neutering them. It made the case that the decision could "disrupt settled expectations on which literally millions of individuals, including award winning producers, firms such as IBM, educational institutions such as MIT and Harvard, and even governments have built businesses, educational initiatives, artistic collaborations, and public service projects."

The Federal Circuit appears to have been heavily influenced by the Stanford brief, as it specifically cited Creative Commons, MIT, Wikipedia, and various free software projects as examples of organizations that benefit from copyleft licenses. In a short, clearly-reasoned opinion, the Federal Circuit summarized the public benefits of public licensing and found that the district court had dismissed the license's terms too lightly. Unlike the lower court, the appeals court seemed to understand that reciprocity lay at the heart of free software licenses. Just as traditional software firms thrive on the exchange of code for money, free software projects thrive on the exchange of code for code. The Federal Circuit recognized that "there are substantial benefits, including economic benefits, to the creation and distribution of copyrighted works under public licenses that range far beyond traditional license royalties." Allowing those rules to be flouted undermines the free software model.

Larry Lessig, who founded both Creative Commons and Stanford's Center for Internet and Society, called the decision "huge and important." His reaction is likely to be shared by other advocates of free software, who have long worried about the enforceability of copyleft licenses.

It's important to distinguish between this case and the other cases on software licensing that Ars has covered recently. Like this week's decision, those cases were focused in part on whether copyright licenses would be enforced via contract law or copyright law. However, there was a crucial difference between those cases and this one: the first-sale doctrine, which says that selling a given copy of a work exhausts the copyright holder's rights with respect to that copy. In the previous cases, the dispute was over a single copy of the work—a promo CD in one case and a box of software in the other. The courts held that no license was needed in those cases because under the first sale doctrine, the lawful owners of those copies didn't need any further permission to use them. This week's case, on the other hand, involves a firm that was creating and distributing new copies of a work, a situation in which the first-sale doctrine simply doesn't apply.

EA Chooses BitTorrent for Warhammer Online Distribution

Developed by Mythic Entertainment, Warhammer Online: Age of Reckoning is a long-awaited WoW competitor, due for general release on September 18th this year. As with many games of this type, the developer is beta testing the software, helped by players in the US, Europe, New Zealand and Australia.

Many would-be testers were selected via applications, contests and prize draws. The first stage of testing wrapped up in the final quarter of 2007, and during February 2008 invites to test were sent out to lucky recipients. July this year saw the activation of so-called ‘Guild beta’ keys, which signaled the start of the next stage of testing, and on August 10th keys were sent out for Beta 3.

The Warhammer Online beta gaming client is quite a size, weighing in at over 9.3GB. Providing a download of this size for testers could turn into a needlessly expensive exercise for the publisher, Electronic Arts, and the developer, Mythic, so they have made the decision to utilize BitTorrent instead, dramatically reducing distribution costs.

There is no other way to download the client, indicating that EA/Mythic are completely confident that their target market will have no trouble using torrents. Just in case, though, they provide a comprehensive BitTorrent guide that covers everything from basics like client selection up to more advanced techniques such as port forwarding. An eagle-eyed reader who is involved in the beta testing pointed out that the guide also includes advice to follow a TorrentFreak guide to speed up transfers.

The Mythic tracker is located at http://torrent2.eamythic.com and currently reports around 1300 seeds and over 7000 leechers. A search on the hash value of the torrent reveals that the beta is (unsurprisingly) being tracked by other torrent sites too.
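For readers curious how the same torrent turns up on other sites by its hash value: the identifier is simply the SHA-1 digest of the bencoded 'info' dictionary inside the .torrent file, so any site that has the file can compute and index the same hash. The sketch below is a simplified illustration, not Mythic's or EA's tooling, and the filename is hypothetical.

    import hashlib

    def decode(data, i=0):
        """Decode one bencoded value starting at byte offset i; return (value, end_offset)."""
        c = data[i:i+1]
        if c == b"i":                                   # integer: i<digits>e
            end = data.index(b"e", i)
            return int(data[i+1:end]), end + 1
        if c in (b"l", b"d"):                           # list or dictionary
            items, i = [], i + 1
            while data[i:i+1] != b"e":
                value, i = decode(data, i)
                items.append(value)
            if c == b"l":
                return items, i + 1
            return {items[k]: items[k+1] for k in range(0, len(items), 2)}, i + 1
        colon = data.index(b":", i)                     # byte string: <length>:<bytes>
        length = int(data[i:colon])
        start = colon + 1
        return data[start:start+length], start + length

    def info_hash(torrent_path):
        raw = open(torrent_path, "rb").read()
        # Hash the raw bytes of the "info" dictionary; that digest is the
        # identifier trackers and torrent indexes search on. (Assumes "4:info"
        # first appears as the top-level key, as in ordinary .torrent files.)
        start = raw.index(b"4:info") + len(b"4:info")
        _, end = decode(raw, start)
        return hashlib.sha1(raw[start:end]).hexdigest()

    print(info_hash("warhammer_beta.torrent"))   # hypothetical filename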

The Warhammer Beta team was contacted for a response, but had not replied before publication.


First USB 3.0 demos at IDF next week?

Intel today sent out a press release stating that its “Extensible Host Controller Interface (xHCI) draft specification revision 0.9 in support of the USB 3.0 architecture, also known as SuperSpeed USB” is now available. This move not only clears some confusion over claims that Intel may be withholding USB 3.0 specifications, but also indicates that we should be able to see first USB 3.0 demonstrations at next week’s IDF in San Francisco.


Intel’s xHCI debuts with USB 3.0 and provides hardware component designers, system builders and device driver developers with a description of the hardware/software interface between system software and the host controller hardware for USB 3.0 and USB 2.0 (previous versions are not supported). According to Intel, the xHCI draft specification provides a standardized method for USB 3.0 host controllers to communicate with the USB 3.0 software stack and is being made available under RAND-Z (royalty free) licensing terms to all USB 3.0 Promoter Group and contributor companies that sign an xHCI contributor agreement.

A revised xHCI 0.95 specification is planned to be made available in the fourth quarter of this year.

The release of the spec follows claims that Intel could be engaging in unfair business practices by withholding the spec and the company’s subsequent confirmations that the host controller standards would be made available in the second half of 2008 royalty-free - “free, gratis, unpaid, zero dollars, free of charge, at no cost, on the house.”

Intel has kept its promise, and the announcement implies that the USB 3.0 technology is virtually finalized and that product development can get into full gear. The timing of the announcement is a sign that USB 3.0 demonstrations could take place at Intel’s fall developer forum, which will open its doors on August 19. Commercial products are not expected to be released until late 2009.

When maxed out, USB 3.0 will offer ten times the bandwidth of USB 2.0 – 4.8 Gb/s, which translates into a massive 600 MB/s.
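As a back-of-the-envelope check of those figures (raw signaling rates only; real-world throughput is lower once protocol and encoding overhead are counted), the conversion is just a division by eight bits per byte:

    def gbps_to_mbytes_per_sec(gbps):
        """Convert gigabits per second to megabytes per second (8 bits per byte)."""
        return gbps * 1000 / 8

    print(gbps_to_mbytes_per_sec(0.48))   # USB 2.0:        480 Mb/s ->  60 MB/s
    print(gbps_to_mbytes_per_sec(4.8))    # USB 3.0:        4.8 Gb/s -> 600 MB/s
    print(gbps_to_mbytes_per_sec(3.2))    # IEEE 1394-2008: 3.2 Gb/s -> 400 MB/s (see below)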

Also noteworthy about Intel’s announcement is the fact that it got AMD to supply a quote for its press release. “The future of computing and consumer devices is increasingly visual and bandwidth intensive,” said Phil Eisler, AMD corporate vice president and general manager of the Chipset Business Unit. “Lifestyles filled with HD media and digital audio demand quick and universal data transfer. USB 3.0 is an answer to the future bandwidth need of the PC platform. AMD believes strongly in open industry standards, and therefore is supporting a common xHCI specification.”

Yes, it is one of those quotes you can easily live without. But read between the lines: the simple fact that AMD is quoted in an Intel press release should be indication enough that USB 3.0 is off to a good start. We also heard that Nvidia has signed the USB 3.0 agreement.

Two weeks ago, the IEEE said that it had approved the IEEE 1394-2008 specification, which increases the bandwidth of IEEE 1394, also known as FireWire and i.Link, to 3.2 Gb/s.


What's Wrong With the 3G in iPhone 3G?


There's something in the air in the Apple community, and this time it's not the same old buzz about a next-gen iPod, Steve Jobs's health or a Mac tablet: It's the sour topic of the iPhone 3G's poor data reception.

Specifically, people aren't happy about how fast the iPhone downloads data over AT&T's wireless network. Many have reported that the phone frequently switches from the faster 3G network to the older, slower EDGE network without warning -- or drops the data signal altogether. And even on 3G, some users are disappointed by the performance, which seems far short of Apple's promise that it would be twice as fast as the old iPhone.

Apple and AT&T remain stoic about the issue, but there's no denying that these complaints have become more prominent since Wired.com last explored user reports of spotty iPhone 3G reception. Both companies have either stayed mum or offered bland, "everything's fine here, nothing to see" statements (see below). That's led some bloggers and journalists to crowdsource the issue: CNET has been collecting user comments in an attempt to find patterns, and an iPhone 3G user has started a blog to collect comments, too.

But what's the source of the problem? Is it a bad 3G chipset in the iPhone? Problems in AT&T's network? Or something else altogether? We contacted several wireless network experts, as well as Apple and AT&T, to see if they could shed any light on the problem. Here's a rundown of theories about the iPhone's problematic 3G reception:

  • In an e-mail interview, David Nowicki, vice president of product development at femtocell developer Airvana, laid the blame at the network's feet. He pointed out that AT&T's 3G network is new and will take several years to optimize, which is normal -- problems crop up in new networks all the time. Also, when AT&T deployed its 3G equipment, the company put it on its existing transmission towers. Those towers were spaced based on the requirements of earlier, 2G technology, which has a longer effective range than 3G. That means that on the edges of any given cell, 3G reception is going to be much worse than comparable 2G or 2.5G (EDGE) reception. In short, EDGE has an edge over 3G in reaching your phone (pardon the pun).
  • Nowicki added that capacity constraints could be an issue: The network towers provide both data and voice services, and they communicate with devices even when those devices are not in active use. That creates strain on the network. When a 3G tower gets overloaded with requests, it drops data packets; some users' phones get no signal at all while others fall back to the EDGE network on a less overloaded cell tower.
  • Sam Greenholtz, founder of Telecom Pragmatics, echoed Nowicki's explanation and added that AT&T and Apple simply were not prepared for this tremendous growth in the number of users. Apple was not even able to keep up with demand for iPhone 3G handsets, leaving many stores tapped dry. "AT&T may have had 10,000 users in downtown, and the cell site may have been engineered to handle that many calls, but with this phenomenal buying there are now 20,000 people out there that have AT&T service on the 3G iPhone," Greenholtz said in a phone interview. Greenholtz stressed that data traffic is the main cause of spotty reception -- especially in major metropolitan areas where 3G is being used the most, thus straining the network.
  • The inconsistent 3G connection is due to an "immature" 3G chip inside the iPhone, presumably manufactured by Infineon, says analyst Richard Windsor. This theory points the finger at the iPhone's hardware -- not the AT&T network.
  • FixMy3GiPhone.com blogger Matt Wakeling's report is essentially a combination of the above theories: He reports that Portland AT&T employees said Apple "went cheap on their chips," and that AT&T's 1900-MHz towers are not communicating well with the handset. AT&T told Wakeling it would be switching network towers to 850 MHz to improve reception for iPhone users; a rough path-loss sketch after this list illustrates why the lower frequency carries farther.
  • AT&T spokesperson Brad Mays told Wired.com, "The new iPhone is performing very well on our network." He explained that reception issues must be examined on a case-by-case basis: "Customer experience on the iPhone 3G or any device can vary based on a number of factors, including the proximity to the cell site, buildings, trees, terrain and the number of people on the network at any given time."
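Several of these theories come down to radio range: the point about 2G's longer effective reach and the report of AT&T moving towers from 1900 MHz to 850 MHz both reflect the fact that, all else being equal, lower frequencies travel farther. The sketch below is a rough free-space illustration only -- it assumes idealized propagation and a hypothetical link budget, and ignores the buildings, trees and terrain AT&T itself cites.

    import math

    def fspl_db(distance_km, freq_mhz):
        """Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_mhz) + 32.44

    def max_range_km(link_budget_db, freq_mhz):
        """Largest distance whose free-space loss still fits within the link budget."""
        return 10 ** ((link_budget_db - 32.44 - 20 * math.log10(freq_mhz)) / 20)

    BUDGET_DB = 110   # hypothetical link budget, chosen only for illustration
    print(max_range_km(BUDGET_DB, 850))    # roughly 8.9 km
    print(max_range_km(BUDGET_DB, 1900))   # roughly 4.0 km
    print(round(fspl_db(4.0, 1900)))       # ~110 dB: the 1900 MHz cell edge

A cell grid planned around the longer reach of older, lower-frequency signals therefore leaves its edges poorly covered once 3G is bolted onto the same towers, which is consistent with Nowicki's explanation above.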

Whatever the case may be, the widespread concern over 3G performance on the iPhone signifies a disconnect between Apple and AT&T, which doesn't come as a surprise: Each company already has a lot on its plate in addition to the iPhone. But considering how important the iPhone has become, that's not an excuse.

If you've got an iPhone 3G, what's your experience been? Is it faster, slower, more intermittent, or more consistent than with the previous iPhone or other phones?


Analyst Predicts September iPod 'Surprise'

An analyst's job is much like that of a fortune teller. Especially the job of an Apple analyst, who has nothing more to go on than some educated guesswork and the pattern of the tea-leaves left in Steve Jobs' cup of Lapsang Souchong.

Today the analyst in question is the venerable Gene Munster of Piper Jaffray, and he is predicting a September refresh of the iPod lineup. From where did he pull this information? Well, perhaps he cheated and took a look at last year's calendar. And the year before that. And yes, the year before that. Predicting a September iPod event is akin to predicting Christmas sometime in late December: It happens every single year.

The technical details are equally "surprising". Munster reckons that the iPod Shuffle will double in size to 2GB, and that the iPod Touch might be slightly tweaked. We'd agree with both. Memory prices drop, so why not bump the Shuffle a bit? And a new, lower-priced Touch, with coloured plastic replacing the shiny metal backplate, seems a certainty, too. Munster isn't so sure about the Zune Nano, though.

And according to Macworld, Munster is also betting on a new line of MacBooks at the same time. We say no. Apple's special events usually focus on one product or line. The company likes to send a clear message, and also enjoys the free publicity that comes with its simple, easy-to-digest product announcements. So while we might see the long-rumored aluminum MacBook later this year, it is very unlikely to show up at the same time as the new iPods.

Putting on my analyst hat for a moment, I would like to make a few tech predictions of my own. In the summer of 2009, Apple will release a new iPhone. Also, sometime in the next few weeks, I myself will become so incensed by a piece of perfectly good technology which has been covered in fake jewels that I will rant right here in the pages of Gadget Lab.

And, most predictable of all, in the next month Danny Dumas will turn up at his hairdresser with a page ripped from a teen magazine. Pointing at the hair of the latest boy-band heart-throb, he'll say simply "I want this one".