
Tuesday, September 30, 2008

Microsoft tries to put fear of God into scareware vendors

By Joel Hruska

Microsoft and Washington State officials announced a new partnership today aimed at fighting scareware in general and one specific vendor in particular. Today isn't such a good day for one James Reed McCreary IV, of The Woodlands, Texas. Mr. McCreary is the sole director of Branch Software, which created the Registry Cleaner XP program, and the CEO of hosting company AlphaRed. Scareware, it should be noted, isn't malware—at least, not technically. Instead of installing its own set of viruses, worms, or Trojans, a scareware program tricks the end user into believing he or she needs the program to correct a nonexistent error within the operating system. This type of falsified error was a common tactic in the days before Windows XP's SP1 (supposedly) closed the door that made the random pop-ups possible; I still remember seeing ads pop up on customers' desktops insisting that they needed to download Program X for $9.99 to fix the issue.

No one likes the badware industry, but scareware has to be one of Microsoft's least-favorite types. It's the equivalent of a mechanic who wants to change the oil in your serpentine belt, or who insists that the transmission fluid in your car needs to be swapped out every 10,000 miles. In a best-case scenario, scareware does no harm after the consumer has been tricked into installing it. Worst case, the stuff is as full of malware, exploits, and/or system-crashing instabilities as the problems it purports to solve. Malware exploits may give Microsoft a bad reputation in general, but scareware actually charges the user for her own infection, and that tends to make people a wee bit cranky. Ironically, Microsoft's reputation ends up tarnished, in this case, by a product that takes advantage of a consumer's gullibility rather than an OS-related design flaw.

Registry Cleaner XP is an example of exactly this sort of product. Web 0.5-styled website that appears to have been written by a kindergartner with an unactivated copy of FrontPage? Check. Product website consists of a single page, with no "About Us" information, list of press contacts, or even a list of other products? Check. Prominent endorsement from nonexistent rating service? Check. WebTronics, if such a service ever existed, doesn't appear to exist anymore, and there's no link to any sort of page promoting that recommendation. Read through what the program supposedly does, and it's an obvious fraud to those of us who know what to look for, but your average user confronted with bizarre pop-up errors and warnings may not be able to distinguish it from a legitimate piece of software. Running Registry Cleaner XP, by the way, always results in the program finding errors. Consumers are then told to pony up $39.95 if they want to clean their systems.


James Reed McCreary

AlphaRed's involvement is harder to quantify. McCreary is listed as a company employee (though not as CEO), but the firm appears to be a legitimate hosting business. Legitimate hosting companies can still host plenty of illegitimate activities—ask Atrivo if you don't believe us—but the links between AlphaRed and any sort of scareware or illicit business are at least slightly camouflaged. Digging around online, I found vague references to bad experiences with AlphaRed or its onetime affiliate OrangeWire, as well as references to a scammer that went by the name of Chris Gotzmann, aka Michael Sanduval, but most such discussions date back to 2004-2006. According to Microsoft, AlphaRed sold Registry Cleaner XP, though it's not apparent how/when such transactions occurred.

In its formal complaint (PDF), Microsoft details the means by which Registry Cleaner XP sought out and attempted to persuade victims to purchase its dubious services. Redmond charges that McCreary has committed five separate violations of the Consumer Protection Act and/or Washington State's Computer Spyware Act. The company then asks for a permanent injunction against McCreary's products, as well as relief and court costs. When Microsoft and Washington State announced this new initiative today, Washington Attorney General Rob McKenna had strong words for the companies and programmers that produce and market scareware. "The Attorney General's Office along with Microsoft has yanked the fear factor dial out of the hands of businesses that use scareware as a marketing tool and have spun it toward them," McKenna said. "We won’t tolerate the use of alarmist warnings or deceptive 'free scans' to trick consumers into buying software to fix a problem that doesn’t even exist. We've repeatedly proven that Internet companies that prey on consumers' anxieties are within our reach."

Original here

Visual Studio 2010 to come with 'black box'

Posted by Ina Fried

Airplanes are equipped with recorders that capture both cockpit audio and flight data, so in the event that something goes wrong, investigators can try to determine the source of the problem.

Microsoft is aiming to give software developers the same kind of access. In the next version of its developer tool suite, to be known as Visual Studio 2010, Microsoft plans to include the ability to record the full screens of what testers are seeing, as well as data about their machine. When a test application crashes, the technology will enable developers to see the bug as it occurred.

In an interview last week, Microsoft Developer Division Director Dave Mendlen said the feature is designed to avoid the all-too-frequent conflict that occurs when a software tester finds a bug that the developer says he or she can't reproduce. Internally, the feature has been called "TiVo for debuggers."

Visual Studio 2010 screenshot

Visual Studio Team System 2010 will offer tools for managing test cases and execution, and will boost support for filing actionable bugs.

(Credit: Microsoft)

Although the feature is initially only aimed at in-house testers, a similar feature could one day find its way into broader testing, potentially even into Microsoft beta products. "I wouldn't be surprised at all to see this become a way that we do beta management, going forward," Mendlen said.

Microsoft offered scant other details about Visual Studio 2010 and the .Net Framework 4.0. It's a safe bet that better support for cloud-based services will be included, though. "That is certainly an area that Visual Studio and the .Net Framework will have to address," Mendlen said. "As we enable service-based technologies, of course we will have to tool it."

The company is also talking about new modeling tools it says will make it easier for programmers new to a team to get a sense of how earlier versions of the software work. One of the other goals is to add more business intelligence tools--things like dashboards and cockpits--that enable the project managers to assess whether a development project is on track. "The guys that are paying the bills often get very little info," Mendlen said.

Microsoft wouldn't get too much into other features of the product, but it outlined a few broad areas where it is seeking to improve the product, including "enabling cloud computing" and "powering breakthrough departmental applications."

Mendlen said Visual Studio 2010 is expected to ship in Microsoft's fiscal year 2010 (which runs through June 2010).

"I can tell you it won't ship in 2011," he said.

The Redmond giant is not the only company looking to transfer the TiVo notion to software development. A company called Replay Solutions launched a product in June for enterprise Java applications.

Microsoft itself floated the notion of a "black box" feature back in 2005.

That year, Microsoft Chairman Bill Gates talked about adding a "black box" to Windows (without the video-recording ability, though). Microsoft later said it wasn't broadly expanding the "Watson" error-reporting capabilities beyond the kinds of data it already had been collecting, and it was never totally clear what Gates was referring to.

A Microsoft representative did say that "the two technologies are not related and that in Visual Studio Team System the 'black box' is only on testers' machines and only turned on when the tester decides it should be turned on."

Speaking of 2005, that same year a pair of Canadian developers created and kicked around a Visual Studio 2010 concept of their own. Since they were the first to mention Visual Studio 2010, I thought I would give them some link love.

Original here

Gentoo Linux Cancels Distribution

By Sean Michael Kerner

For some Linux distribution projects, new releases come twice a year. That had been the plan for Gentoo Linux this year, until it canceled its current planned release -- the second time it's done so in the past 12 months.

But the news doesn't necessarily mean a setback for the project.

Instead, Gentoo developers said they are pushing a new model for their distribution -- one that eschews the conventional release wisdom used by Red Hat, Novell, Debian and others. Rather than fixed releases, Gentoo is promoting its vision of a live, continuously updating distribution. In practice, that effort revolves around its weekly minimal images, which are then supplemented with customized installed packages.

"We need to work harder to communicate the relative irrelevance of releases in a live distribution like Gentoo," Gentoo developer Donnie Berkholz explained to InternetNews.com. Releases "have an overly large impact on what non-Gentoo users think of the health of the distribution, so problems with a small team within Gentoo are magnified in their effect on public opinion."

The news comes as Gentoo continues to face not only competition from rival Linux distributions but also persistent organizational issues. The Gentoo 2008.0 release came out in July, following the cancellation of the 2007.1 release. Gentoo developers had scrapped 2007.1, citing limited time and effort to devote to the release -- the same basic reason why 2008.1 has now been canceled.

Working to distance itself from the concept of regular releases may help the group save some face. It's also in keeping with the fact that the entire concept of releases is a bit different for the Gentoo crowd.

Its developers consider their distribution to be a "meta-distribution," since users customize their installations with the Gentoo Portage system of continuously updated packages. Berkholz added that official releases in Gentoo have generally only ever had two purposes: to provide new hardware support for installation, and to create some buzz around the distribution.

"The new hardware support should be covered by the weekly minimal CD images, and of course, it's possible to install Gentoo from nearly any CD that will boot any Linux distro," Berkholz said. "To make up for lack of buzz with less-frequent releases, we'll need to work harder to publicize the innovation happening in Gentoo on a daily basis."

Berkholz argued that community feedback toward the new approach is generally very positive. Still, not everyone is thrilled that Gentoo dropped its 2008.1 release.

"I was disappointed to find out that the 2008.1 release was canceled," Daniel Robbins, Gentoo's founder, told InternetNews.com. "I was hoping that the Gentoo project would turn the page after the cancellation of the 2007.1 release. However, you need to balance out this bad news with the good news of Gentoo's recent improvements in the way they interact and involve the larger Gentoo community."

Robbins also said he's concerned about the impact the cancellation has had on the perceptions about Gentoo Linux held by people outside of the project.

"Any time you cancel a scheduled release, it does not reflect well on the health of an open source project," he said. "But there is a silver lining to all this."

Robbins suggested the development could spur some changes among the project's developers.

For one thing, he said he believes Gentoo's development model is broken, and suggested that tools like git, the distributed version control system created by Linus Torvalds, should be used.

Additionally, Robbins suggested that Gentoo begin using a completely automated tool for building releases. In his opinion, the project suffers from using a tool called catalyst to build new versions of its distribution -- a liability, since he said it lacks official documentation and isn't up to snuff for Gentoo's use.

He added that he has developed his own streamlined release build tool, which he uses to build new "Funtoo" releases of Gentoo on his personal workstation once or twice a day, posting them to his own site.

"So I'd think that Gentoo could do it at least twice a year," he said. "It is not that hard."

"This tool will allow anyone to easily build a Gentoo release, and could also be used by the Gentoo project to build weekly or even daily releases if they wanted," Robbins added. "I am trying to help by giving the larger Gentoo community the tools they need to collaborate better, so that Gentoo development can happen in a more efficient and decentralized way. That will allow Gentoo to innovate and improve at a much more rapid pace."

Berkholz, meanwhile, also said that moving to a distributed version-control system like git is something that he'd like to see happen at Gentoo. He also said he'd like to see increased quality.

Original here

Scam sites vanish after owners' names, addresses posted


Norway gives Apple deadline to open up FairPlay DRM

By Chris Foresman

Apple is again in the crosshairs of Norway's Consumer Council over FairPlay DRM and music player compatibility. The company met with Norwegian officials in February, agreeing that it wants to sell DRM-free music and that it shares Consumer Ombudsman Bjørn Erik Thon's goal of interoperable systems. But, says Thon, "iTunes [sic] has now had two years to meet our demands regarding interoperability. No progress has been reported by iTunes since our meeting in February." Thon today announced that Apple has until November to comply with the Consumer Council's demands or face official sanctions.

Over two years ago, Apple got into hot water with the Norwegian government over iTunes Store DRM. The problem is that the terms of service for the iTunes Store covering DRM-managed content prohibit compromising Apple's copy protection, and this runs afoul of Norway's Marketing Control Act. Since then, Apple began selling at least some music in a DRM-free iTunes Plus format, including music from EMI and many independent labels. That move gained Apple some respect from the Norwegian Consumer Council's senior advisor, but since then no other major labels have agreed to sell music in the iTunes Plus format.

Under Norwegian law, users who buy digital media are allowed to use the media with whichever device they choose. However, Apple doesn't license FairPlay to any other device manufacturers, and so far it hasn't licensed Microsoft's Windows Media DRM to let iPods play protected Windows Media formats, either. So, Thon plans to demand that Apple make FairPlay-protected content playable on other portable music players.

Steve Jobs made his position on the matter known last year with his open letter entitled "Thoughts on Music." Apple wants to sell DRM-free music, but it's up to the record labels to agree to sell the music in this format, and so far only EMI and a group of independent labels have done so. Apple has also revised the iTunes Store's terms of use to include information on burning CDs to remove the FairPlay protection, but Thon says that all of this is not enough.


Norway tells Apple enough is enough: Make iTunes content work with other players, or face the consequences.

The predicament that faces Apple in the Norwegian market is complex, since both sides have fair points in the dispute. There are a number of ways to get DRM-free tracks to play on iPods: ripping your own CDs, buying iTunes Plus tracks, and buying music from stores like eMusic and the Amazon MP3 store (which, oddly enough, has DRM-free MP3s licensed from all major labels). Steve Jobs claims the company cannot license FairPlay without potentially compromising the licensing agreements that Apple has made with the record labels. And there is no mention of Microsoft's MSN Music (which may or may not "play for sure") or its Zune-only Zune Store. Nor is there any focus on movies or other video content.

What, exactly, Apple could do to appease all parties in the matter isn't clear at this time. But this case, according to Thon, will be a "test case" for the Consumer Council. Norwegian law allows the council to order companies to change practices it deems unfair, and it can issue fines if companies do not comply. Ultimately, Apple may have to shut down the Norwegian iTunes Store, or remove all DRM-protected content. Although Norway is not a member of the EU, its actions could have a serious domino effect there, as Finland, Denmark, France, Germany, and the Netherlands are all supporting Norway's actions.

Original here

Congress finally passes broadband data collection bill

By Nate Anderson

While spending a busy weekend trying to bail out the nation's troubled financial system, Congress also found time to tell the FCC that its current method for collecting broadband usage data is unacceptable. The Senate passed S. 1942, the Broadband Data Improvement Act, after the House passed a similar bill in late 2007. The bill, which has bipartisan support, directs the FCC to get better data on broadband and to report on it more often.

Rather than release "periodic" reports, for instance, the FCC would have to report yearly on US broadband, and these reports would have to list:

  • the types of technology used to provide the broadband service capability to which consumers subscribe;
  • the amounts consumers pay per month for such capability;
  • the actual data transmission speeds of such capability;
  • the types of applications and services consumers most frequently use in conjunction with such capability;
  • for consumers who have declined to subscribe to broadband service capability, the reasons given by such consumers for declining such capability.

The Census Bureau would also include questions on computer ownership and Internet connection methods as part of its work.

The FCC has historically collected quite limited data on broadband. Its definition of "broadband," for example, is famously low (200Kbps), and the agency only collects data at the ZIP code level. If a single customer in a ZIP code has broadband service, the entire ZIP code is counted as "served." The new bills will force the FCC to collect far more interesting data, and to generate more useful metrics for judging the success of broadband rollouts, including the average price for each megabit per second and the actual speeds that broadband users get.

Free Press, the group that recently pushed the FCC complaint against Comcast's P2P blocking practices, was pleased with the bill's passage.

"Our current broadband data collection system has had serious problems for years," said Ben Scott, the group's policy director. "The absence of accurate information about the price, speed, and availability of high-speed broadband has crippled our government's ability to advance innovative technology policies. In the last year, the FCC has taken some very important steps toward solving these problems. This bill gives more momentum to that progress."

Sen. Daniel Inouye of Hawaii, the Democratic chair of the Senate Commerce Committee, called the bill an important step forward. "The federal government has a responsibility to ensure the continued rollout of broadband access, as well as the successful deployment of the next generation of broadband technology," he said. "But as I have said before, we cannot manage what we do not measure. This bill will give us the baseline statistics we need in order to eventually achieve the successful deployment of broadband access and services to all Americans."

The two bills have plenty of similarities, but also some key differences; the Senate version, for instance, tacks on a "child pornography enforcement" section and an Internet safety campaign. Changes need to be hammered out this week before Congress recesses.

Original here

Fujifilm's Real 3-D Camera Is Just the Beginning

By Jose Fermoso


Camera makers are jumping into the 3-D photo market more than 20 years after the format was laughed out of town and ended up as Michael Jackson's sidekick in Captain Eo.

Recently, Fujifilm announced a two-lensed camera that takes images and movies in 3-D and captures wide-angle photos of single scenes simultaneously. As a result, we've heard some rumbling in the wires about other camera manufacturers coming out with their own version in the next year or so.

Since the lenses sit 6-7 cm apart (about the same distance as between most people’s eyes), the Real 3-D camera processes the two images in real time to produce the stereoscopic 'trick' effect that makes them look as if they're floating in air. This is where the processing update of Fujifilm's 'Real Photo Processor 3-D' chip comes in.

The chip blends the dual images and all the important metrics (focus, zoom range and exposure) at once and pushes them out to the LCD, which is itself a 3-D display. The company promises that by the time the camera is released next year, it will be able to shoot HD video as well.


Of course, there are several custom rigs out there producing 3-D images, but they're usually very expensive. Some people have even built 3-D rigs out of two mainstream digital cameras. In the case of the 3D Advantage, one company has attached a tri-delta beam splitter to a single Casio point-and-shoot camera, creating a 3-D image with a single shutter. But a camera that's ready to shoot 3-D images out of the box is better than a custom rig, for obvious user-friendliness reasons.

So while the majority of the renewed excitement about 3-D imaging has come from the display side and a recent focus from the movie industry, don't expect Fujifilm to be the only one coming out with a consumer 3-D cam.

A few years ago, Olympus created a 3-D camera system for the da Vinci Surgical System robot that works in real time and is dependable enough to be used during open surgery. They only have to figure out how to transfer the tech to a more affordable, consumer-friendly chassis, and they'll be in the 3-D game. Don't be surprised if they've already figured out how to do that.


Original here

Apple threatens to shutter iTunes over proposed royalty hikes

By Sam Oliver

Apple has threatened to close down its iTunes Store should regulators approve a royalty hike that would grant artists a 66 percent increase in commission for each song sold through online download services.

According to Fortune, the Copyright Royalty Board (CRB) in Washington, D.C. is expected to rule Thursday on a proposal from the National Music Publishers' Association to raise the rates paid to its members on songs purchased from digital services like iTunes from 9 cents to 15 cents a track.

The three-judge panel oversees statutory licenses granted under federal copyright law, which includes music sales, according to the report. The board's previous ruling covering physical CD sales was made in 1997 and expired last year, making the impending decision the first to affect digital music sales. It will reportedly span the next five years.

In a statement submitted to the CRB last year regarding the matter, Apple iTunes chief Eddy Cue suggested that the company might decide to shutter iTunes rather than raise prices above 99 cents or eat the cost of the fee hikes.

"If the [iTunes music store] was forced to absorb any increase in the ... royalty rate, the result would be to significantly increase the likelihood of the store operating at a financial loss - which is no alternative at all," Cue wrote. "Apple has repeatedly made it clear that it is in this business to make money, and most likely would not continue to operate [the iTunes music store] if it were no longer possible to do so profitably."

Apple, which has leveraged the iTunes Store to help sell over 160 million iPods, typically collects 99 cents each time a customer downloads a song, of which 70 cents is turned over to the record labels. The record labels, in turn, then typically pay 9.1 cents to the music artists who own the copyrights to the songs. Most of Apple's remaining 29 cents is used for maintenance rather than profit.

The record labels, Fortune reports, are likewise "in no mood to pay the proposed royalty increase" out of their own pockets; CD sales have dropped by 20 percent in the past year, putting pressure on labels to recover the difference through downloads. Online sales surged 46 percent over the same period and are poised to overtake physical albums, as iTunes has pushed past Wal-Mart to become the largest individual music retailer in the US.

The labels have instead asked the CRB to do away with fixed royalties in favor of an 8 percent commission to artists, which would translate to about 5.6 cents on the wholesale cost of each 99-cent track. The Digital Media Association, which represents Apple and other digital retailers, is seeking an even lower rate of 6 percent, or 4.8 cents per track, according to the report.

Musicians charge that these attempts by Apple and the DMA to hold down or even reduce royalty rates effectively take unfair advantage of their market positions to push electronics. By keeping iTunes music below the dollar mark, Apple knows it can use its online store as an incentive for device buyers, according to NMPA president David Israelite.

"Apple may want to sell songs cheaply to sell iPods," he notes, "[but] we don't make a penny on the sale of an iPod."

Original here

Top Draw Generates Eye-Catching Wallpaper

By Gina Trapani

Mac OS X only: Today Google releases Top Draw, a nifty image generation application that rotates its creations on your desktop. Top Draw uses scripts to create colorful psychedelic images, and sets them as your desktop wallpaper with an option to automatically refresh at an interval you set. A Google Mac developer writes:

The Top Draw scripting language leverages Apple's Quartz and CoreImage rendering engines for graphical muscle. In addition to the drawing commands that are supported by the HTML canvas tag, there is support for particle systems, plasma clouds, random noise, multi-layer compositing and much more.

After just a few minutes, Top Draw's given me some beautiful images. Top Draw is a free download for Mac only.

Original here

Firefox Minefield: Faster Than Chrome

By Pavs

Quick tip.

If you're not happy that you can't yet try a native Chrome, arguably the fastest browser out there (Chromium is not so stable), give Firefox Minefield a shot. Even though it's an early alpha build, under the hood it has the fastest JavaScript engine shipping in any browser. There are faster JavaScript engines out there, but none of them is bundled with a browser yet (not that I know of, anyway).

- Download and install the latest nightly build.

- Enable the TraceMonkey JavaScript engine by going to about:config and setting the option javascript.options.jit.content to true.

- Browse websites blazingly fast!
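If you'd rather not click through about:config every time, the same pref can be set by appending one line to your profile's user.js, which Firefox reads at startup. A minimal sketch, assuming Linux and a single default profile (close Minefield first, and substitute your own profile directory name for the placeholder):

$ echo 'user_pref("javascript.options.jit.content", true);' >> ~/.mozilla/firefox/xxxxxxxx.default/user.js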

According to some tests it is 10% faster than Google Chrome.

Original here

VLC goes to nine point two

By Nathan Willis

Sometimes I get the feeling that I only know about 5% of what VLC can do. Everyone knows it's a dependable, free, cross-platform media player. But read through the release notes for the latest version, 0.9.2, and you will see a dizzying array of server, network streaming, and conversion functions, most of which I have never even touched -- and the new VLC exposes a lot more of that functionality.

You can grab source code and Windows and Mac OS X binaries from videolan.org. Ready-made builds for Linux are not as easy to come by; VideoLAN prefers to work through distros' packaging systems, and the hot-off-the-presses 0.9.2 is not yet available in most of them.

There is a good chance that an official or amateur build has already been compiled for your distribution; start by looking on your distro's mailing lists and forums, and check the VideoLAN forum as well. If none is available, you can always compile your own. The VLC wiki has instructions.

It's what's on the outside that counts

The first changes you will see are to the interface. Gone is the old wxWidgets-based look, replaced with Qt by default. As with previous releases, you can also start up VLC with a range of other interfaces, including command line, ncurses, mouse gestures, keyboard-only, a Web app running on port 8080, and (should you be so inclined) bitmapped "skins" a la Winamp.
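Each alternate interface is selected with VLC's -I flag. A quick sketch, under the assumption that your build includes these interface modules:

$ vlc -I ncurses     # text-mode interface in a terminal
$ vlc -I http        # Web interface, served on port 8080
$ vlc -I skins2      # bitmapped, Winamp-style skins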

The menus and tools have been reorganized, consolidating items from the View and Settings menus into Tools, aptly renaming File to Media, and introducing top-level Playback and Playlist menus. Playback contains navigation controls, including bookmarks and title/chapter seeking.

The new Playlist module functions like a traditional audio-only player (think Amarok or Rhythmbox), with music library browsing, searching, metadata and album art retrieval, and so on. It supports Shoutcast and Last.fm directly, and has a powerful Lua-based scripting engine with which you can write handlers for other network services. Example scripts supplied in the release open media from YouTube, Google Video, Metacafe, and DailyMotion.

Okay, fine -- the inside matters, too

The list of transport protocols and codecs supported in the new release is dizzying; check the project wiki for a complete list. The highlights include VP6, Monkey's Audio, ATRAC3, video4linux2, JACK audio, MIDI synthesis, and TiVo transfers. Support is improved for many other formats, with added features like subtitles, closed captioning, chapters, and tags.

VLC can pipe audio and video through built-in effects filters as it plays, regardless of the underlying codec, container, or source location. This release adds nine new video filters and two new audio filters. Some are standard image adjustments, such as saturation or gamma, some are special effects like motion blur or rotation, and some are just for fun, like the "puzzle game" that breaks the video into tiles and shuffles them on the plane.

It's an impressive collection, and some of the effects -- such as audio replay gain and watermark logo removal -- make VLC a useful tool for folks who edit video.

Conversion experience

Such usage of VLC is possible because, in addition to playing back media, the app can stream audio and video over a variety of network protocols, and can convert content between codecs and container formats.

While all major OSes offer dedicated video conversion utilities, VLC's built-in conversion tool is easier to use than any competitor I have tried. You do not have to look up a comprehensive series of command-line arguments and flags in order to construct a conversion filter; the app takes care of that for you. And since playback is built in, you can examine the input file's settings in detail before you start, something the drag-and-drop converters cannot do.
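For a sense of what the GUI spares you, here is roughly what a one-file conversion looks like from the command line. This is a sketch only: the file names are made up, and the exact codec strings available depend on your build.

$ vlc -I dummy input.avi --sout "#transcode{vcodec=mp4v,vb=1024,acodec=mp4a,ab=128}:std{access=file,mux=mp4,dst=output.mp4}" vlc://quit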

Both conversion and streaming use the same media selector for playback, and this selection tool is another of VLC's strengths. From the Media menu, all of the options, from Open File, Open Disc, and Open Network, to Convert/Save and Streaming, use a unified interface that exposes the relevant options for each alternative, and makes its best guess at the appropriate settings. After all, if all you want to do is save a copy of a video to local storage, you shouldn't have to learn how RTSP works.

Pros and cons

The new playlist features and reorganized user interface go a long way toward making VLC 0.9.2 the easiest open source media player to use. There are still quirks, however. For example, to activate audio and video filters, you must bring up the Adjustments and Effects window, which you do by selecting the Tools -> Extended Settings menu item. And there are, inexplicably, two menu items that bring up the playlist -- Playlist -> Show Playlist, and Tools -> Playlist.... Finally, I would prefer to have VLC function entirely within a single window, but instead it opens separate windows for the playback display, playlist, effects, bookmark browser, and media information screen. That gets cluttered.

But even with those quibbles, VLC is easier to use and does a more consistent job of exposing its functionality than either MPlayer or Xine. VLC's interface does not get in your way, and you do not have to read a tutorial to get started with it. It just works. Add to that the fact that VLC is robust on all three major desktop OSes, and you have a winner.

Original here

WebKit becomes first browser engine to fully pass Acid3 test

By Prince McLean

Maciej Stachowiak of the WebKit team has announced that the browser engine behind Safari is the first to fully pass the Acid3 test, including the test's condition of smooth animation rendering.

Acid3 is a test page from the Web Standards Project that scores how well a rendering engine follows defined web standards, particularly DOM and JavaScript. The test provided a metric for standards compliance that has resulted in rapid advancement among various rendering engines as each works to earn the top score.

In March, the teams behind Safari's WebKit and Opera's Presto both announced 100/100 scores in developer builds of their browsers. In addition to the numbered score, the test also requires that the browser render a test page with pixel-perfect accuracy using its default settings, and that it render a smooth test animation.

Today, the development build of WebKit passed that last hurdle, which Stachowiak reported was due to "recent speedups in JavaScript, DOM and rendering."

Actual shipping builds of the world's various web browsers haven't yet reached 100%. According to figures in Wikipedia, the latest Safari 3.1.2 has a score of 75, while Firefox 3.0.2 has reached 71, Opera 9.52 has reached 84, and Internet Explorer 7 is at 14.

In internal builds, the Safari 4.0 Developer Preview has reached 100, while the latest build of Firefox's Gecko engine has reached 87, the latest build of Opera earns 99, Google's new Chrome beta has reached 79, and Internet Explorer 8 Beta 2 scores 21.

Among mobile browsers, the shipping version of Safari in iPhone 2.1 reaches 74, while the Netfront browser hits 11, Opera Mobile reaches 2, Opera Mini has hit 79. Pocket Internet Explorer can't run the test due to a lack of JavaScript support.

Original here

What Does "Beta" Mean to You?

Software or webapps in a "beta" phase should, according to tradition, not pick up any major features, and should be going through community testing mostly to work out kinks and bugs. But one astute writer at the Pingdom blog found that 22 of Google's 49 public products—a good 45 percent—were listed as "beta," despite going through significant feature changes and even entire version changes. They're hardly alone, as other webapp companies, like Jott, have taken to developing entire applications under the beta flag. Some suggest Google may be avoiding providing tech support or owning up to any major flaws found in their products, but we're wondering: Do you see the definition of "beta" changing elsewhere? Are you happy with the idea of getting early access to potentially flawed products, or would you rather just get a working package when it's ready? Tell us your take in the comments.

Original here

Mozilla chief promises mobile Firefox before 2010

By Christian Ziberg, Wolfgang Gruener

Chicago (IL) - Although several browser vendors have had mobile versions of their products for some time, Mozilla has been strangely absent from this space. We recently reported on a blog post of Mozilla’s “chief lizard wrangler” Mitchell Baker, who stated that the organization wants to have an “effective [Firefox] product in the mobile market”. Mozilla replied to our article and said that the statement in the post should not be interpreted as Mozilla planning to launch a mobile browser in 2010 – in fact, the organization aims to release a browser before that.

In a new blog post, Mozilla chairperson Mitchell Baker noted that the goal of having an "effective product in the mobile market" by 2010 could be read to mean that Mozilla won't ship a mobile browser until then. She added that "that is not the case at all" and that Mozilla "will ship well before then."

“The intent of this goal was to say: in 2010 when we look at where we are, it should be screamingly obvious that we’ve done this. That means releasing a good product much sooner, seeing good results and acceptance, and seeing those results grow over time,” she wrote.

Obviously, the problem was the interpretation of "effective product." Baker gave a few more hints on the meaning of an "effective product in the mobile market" in the recent post. According to the CLW, an effective product will have a certain mindshare, marketshare, and momentum, serve as a reference implementation for the generative Web, and address ongoing platform issues.

Of course, this brief explanation does not hint at any particular launch date. But here is our guess. Work on certain mobile platforms has just started and we are nearing the end of 2008, which means we may not see the browser this year. And if it is not 2010, then only 2009 is left. We leave it up to you to estimate when in 2009 that might be, but since we haven't seen an alpha, beta, or RC version yet, we tend to believe that we are at least half a year away from a product.

Original here

Hole in Adobe software allows free movie downloads

By Daisuke Wakabayashi

NEW YORK (Reuters) - A security hole in Adobe Systems Inc software, used to distribute movies and TV shows over the Internet, is giving users free access to record and copy from Amazon.com Inc's video streaming service.

The problem exposes online video content to the rampant piracy that plagued the music industry during the Napster era and is undermining efforts by retailers, movie studios and television networks to cash in on a huge Web audience.

"It's a fundamental flaw in the Adobe design. This was designed stupidly," said Bruce Schneier, a security expert who is also the chief security technology officer at British Telecom.

The flaw rests in Adobe's Flash video servers, which connect to the company's player software installed on nearly all of the world's Web-connected computers.

The software doesn't encrypt the video content itself, only the commands sent to the player, such as start and stop. To boost download speeds, Adobe dropped a stringent security feature that protects the connection between the Adobe software and its players.

"Adobe is committed to the security of all of our products, from our players to our server software. Adobe invests a considerable amount of ongoing effort to help protect users from potential vulnerabilities," it said in a statement.

Adobe said it issued a security bulletin earlier this month about how best to protect online content and called on its customers to couple its software security with a feature that verifies the validity of its video player.

An Amazon spokesman said content on the company's Video On Demand service, which offers as many as 40,000 movies and TV shows on its Web site, cannot be pirated using video stream catching software.

However, in tests by Reuters, at least one program to record online video, the Replay Media Catcher from Applian Technologies, recorded movies from Amazon and other sites that use Adobe's encryption technology together with its video player verification.

"Adobe's (stream) is not really encrypted," said Applian CEO Bill Dettering. "One of the downfalls with how they have architected the software is that people can capture the streams. I fully expect them to do something more robust in the near future."

HOW IT WORKS

The free demo version of Replay Media Catcher allows anyone to watch 75 percent of anything recorded and 100 percent of YouTube videos. For $39, a user can watch everything recorded.

One Web site -- www.tvadfree.com -- explains step-by-step how to use the video stream catching software.

Amazon.com's Adobe-powered Video On Demand service allows viewers to watch the first two minutes of a movie or TV show for free. It charges up to $3.99 to rent a movie for 24 hours and up to $14.99 to download a movie permanently.

Amazon starts to stream the entire movie during the free preview -- even though it pauses the video on the Web browser after the first two minutes -- so that users can start watching the rest of the video right away once they pay.

"It's the traditional trade-off, convenience on the one hand and security on the other," said Ray Valdes, analyst at research group Gartner.

However, even if a user doesn't pay, the stream still sends the movie to the video-catching software, just not to the browser.

Amazon's Video On Demand is the Web retailer's answer to declining sales of packaged movies and TV shows and the growth in demand for digital content that can be viewed and stored on the Internet.

Unlike Amazon, videos from Hulu.com, NBC.com and CBS.com are already free although the TV programs are interrupted by commercials. However, the stream catching software separates the commercials and the program into two separate folders, so people can keep the programs without the advertising.

Hulu.com, a video Web site owned by News Corp's Fox network and General Electric's NBC Universal, was the big networks' answer to YouTube, the popular video-sharing Web site where many users began uploading TV shows and other content owned by media companies.

The networks scrambled to post videos on their own sites in a bid to capture another stream of advertising revenue from a growing audience, but they have struggled with how best to show the commercials that fund the programming when it is played on the Web.

YouTube, which started the online video boom before being bought by Google Inc for $1.65 billion in November 2006, has also struggled to cash in on its popularity even though its user base continues to mushroom.

DESTROYING BUSINESS MODELS

One possible solution would be to protect the video with a digital rights management (DRM) system. A Seattle-based company called Widevine Technologies has a DRM system that can encrypt online videos using Flash.

"The fundamental problem here is that Adobe's lack of technology is not allowing the business models to be preserved," said Widevine Chief Executive Brian Baker.

The lack of content protection, according to Baker, threatens all the business models used today to fund video on the Web.

Apple Inc, which sells movies and television shows at its online iTunes store, uses its own DRM technology called FairPlay, but it only works for video bought on iTunes.

Forrester analyst James McQuivey said he doesn't believe the video stream catching technology will entirely derail the advertising-supported business model used by the networks for online video.

"It's too complicated for most users," said McQuivey, noting that file-sharing services like BitTorrent already exist but only a small percentage of people use them.

"People want something easy to find and easy to use."

Original here

New clickjacking affects all browsers; cause remains unknown

By Joel Hruska

Jeremiah Grossman and Robert "Rsnake" Hansen initially planned to reveal details on a new browser-agnostic clickjacking exploit at the Open Web Application Security Project (OWASP) in New York City this week, but voluntarily pulled the presentation after discovering that the 0-day flaw affected an Adobe product. The term "clickjacking" refers to a process by which a user is forced to click on a link without his or her knowledge—the link itself may be nearly invisible or visible for only a fraction of a second.

Clickjacking isn't a new attack vector, but according to Grossman and Hansen, it's one that is "severely underappreciated and largely undefended." What makes the attack noteworthy, in this case, is that it appears to be completely browser-agnostic: it affects both Firefox 2 and 3, all versions of IE (including 8), and presumably all versions of Opera, Konqueror, Safari, and whatever other extremely marginalized and/or FailCat type of browser one might use to surf the web. The only browsers currently immune to whatever it is the two men discovered are text-based products, such as Lynx.

In this case, "whatever it is," actually is the only appropriate label for this new attack method; Grossman and Hansen have released virtually no information on how one would actually exploit the vulnerability. Grossman and his teammate appear to have held off publishing after Adobe requested they do so, rather than as a favor to the browser market. In his blog, Grossman writes: "At the time, we believed our discoveries were more in line with generic Web browsers behavior, not traditional “exploits,” and that guarding against clickjacking was largely the browser vendors' responsibility."


Yeah, it's kinda like that

Grossman and Hansen have, however, released a bit of information on what won't protect a user from the exploit. Turning JavaScript off is apparently useless—the attack doesn't use it. Instead, it takes advantage of what the two call a "fundamental flaw" inherent to all modern browsers, and an issue that cannot be fixed with a quick patch. Using a frame-buster script will protect a person from assaults that utilize cross-domain scripting, but it will not prevent the attack from operating normally if it lives on a page the user is visiting directly.

As exploits go, this particular one seems a tempest in a teapot. The vulnerability in question may affect all web browsers, but the total dearth of publicly available data means anyone wanting to utilize it has their work cut out for them. Grossman states that this particular attack is capable of some "pretty spooky" things, but that's all the detail we get. I'm not a fan of security through obscurity, but that's not what anyone is advocating—Adobe has acknowledged the problem, and the dev teams on both Firefox and IE are undoubtedly aware of the flaw's existence. Hopefully they also received a bit more information than the public did.

Original here

Judge: Microsoft documentation unfit for US consumption

By John Timmer

Microsoft may have made a big push to settle many of the antitrust actions facing it around the globe, but those efforts have run up against a major stumbling block: the company's inability to document the protocols needed to interoperate with its own software. Documentation problems got Microsoft in hot water with the EU, and they're now the only reason it continues to be under court supervision in the aftermath of its antitrust settlement. But despite interoperability having become a corporate strategy, the company's documentation efforts came under fire in a court hearing earlier today.

In the wake of antitrust actions, documentation of Microsoft technologies has become a method of allaying the concerns of legal authorities in both the US and EU. By providing documentation of the APIs and protocols used by its products, Microsoft would not only allow third-party and open-source software to interact better with Windows and other software, but potentially enable them to write replacements, in whole or in part, for Microsoft products. This, in theory, would enable more software companies to compete on equal terms with Redmond.

Unfortunately, the company has consistently had trouble with producing complete and useful documentation. As noted above, the company struggled to satisfy EU authorities that it was complying with the agreement—that was 2006. By 2008, documentation was rearing its ugly head in the US court system. Microsoft's consent decree with the federal and state attorneys general was set to expire, and most of the conditions were allowed to. But Judge Colleen Kollar-Kotelly, who is overseeing the consent decree, ruled that Microsoft still hadn't sufficiently documented some protocols, despite those documents having been due in 2003. As a result, the consent decree will remain in place at least until November of 2009.

Today saw Kollar-Kotelly hold the latest hearing on the status of the consent agreements, and a number of reports suggest that there are still problems (no legal documents arising from the hearing have been filed yet). The Dow Jones Newswire seems to think everything is fine, but other reports contradict this. Reuters indicates that the judge was a bit annoyed that Microsoft filed a document suggesting it viewed itself as being in full compliance with the agreement, given that the documentation wasn't ready. Referring to the 2009 date for the lifting of the consent agreement, she said, "That's not going to happen unless these things get done."

Meanwhile, CNet has quotes from the New York Attorney General's office that suggest it is getting antsy. After complaining that Microsoft appears to act as if it's doing everyone a favor by complying with its legal obligations, Jay Hines is quoted as saying, "What we have today is the [technical committee] and its staff spoon-feeding the world's biggest PC company. Something about that just isn't right."

Given that this is the last sword hanging over its metaphorical head in the US, it's not clear why Microsoft isn't bending over backwards to make it go away. Most developers find Microsoft's API documentation to be pretty good, so it's clear that the company can produce similar documents when it is determined to.

Original here

Ubuntu up and running on Pandora

by Joseph L. Flatley

All kinds of exciting things are happening in the Pandora universe, and now one enterprising individual has succeeded in getting Ubuntu 7.04 up and running on his development model. Things move pretty slowly, and no luck yet with Firefox, but the thrilling video does catch him playing with GIMP and the Xfce desktop environment. See for yourself after the break.

[Thanks, Stern]

Original here

Linux for Older PCs: From Ubuntu to Vector Linux


By Anoj

My Linux journey began with Red Hat and Corel Linux in the '90s. For a long time, I just couldn't convince my dad to install Linux on his Windows laptop, the only computer we had then. Then came the 21st century, with Linux distros getting more user-friendly and installations getting easier and manual-free. I shifted from being a long-time Red Hat/Fedora fanboy to an interim PCLinuxOS fan to, finally, an Ubuntu believer.

Finally, after two long years, I decided to move on a bit and try something new. My PC is getting older and constantly struggles to keep up with the processing demands of the latest KDE 4 or GNOME desktops and the delicious Compiz, which has now become an integral part of the entire Linux experience.

This week, I tried Vector Linux, a Slackware-based distro known to be fast and stable, ideal for older machines like mine, yet never compromising on features. Read on to see if it delivered what it promised.


Before we start, here is my PC configuration. It's an old Compaq nx7010 business laptop with a Pentium M (Centrino) 1.6GHz processor, 1.5GB of RAM, and a 64MB ATI Radeon 9200 graphics card. It's pretty fancy for a five-year-old laptop, but it cost a bomb back then.

I downloaded the VL 5.9 Standard Edition ISO from the Vector Linux homepage and gave it a try.


1. Installation
VL has a decently friendly text-based installer. I found it easy most of the time, but the partition tool still needs some rework. If you are not comfortable using fdisk or similar text-based partition utilities, I would suggest creating the swap and root partitions beforehand using GParted or something else, and just selecting them during installation. It is a lot easier this way.
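For example, from any live CD you can carve out the two partitions first and then simply point the VL installer at them. A sketch, under the assumption that the target disk is /dev/sda and the new partitions come out as sda2 and sda3 (adjust the device names for your system):

$ cfdisk /dev/sda      # create one Linux swap and one Linux partition
$ mkswap /dev/sda2     # initialize the swap partition
$ mkfs.ext3 /dev/sda3  # format the root partition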
The installation itself was really fast, finishing in less than 20 minutes. LILO autodetects other OSes on your system and configures easily. VL also prompts you to configure Xorg and suggests drivers for your card. In my case, fglrx doesn't work for older ATI cards (pre-Radeon 9550), so I selected the open-source radeon driver. You can also configure your network settings. The system then reboots into your new VL environment.

2. Interface, design and usability
VL uses Xfce as its default desktop/window manager. It also comes with JWM, Fluxbox, and other light alternatives. The default Xfce environment looks really polished, with Thunar as the default file manager.


It's nice to log in as root for a change, though it is never recommended. On top of the standard Xfce application set, VL offers its own control center, called VASM, where you can configure the display, network, boot settings, and more.


Another nice addition is the package manager Gslapt, which looks like a skinnier half-brother of Synaptic but has everything you will ever need to install extra packages.


Overall, it's a pretty friendly experience for newbies, and seasoned Xfce users will be delighted with VASM and Gslapt.

3. Quirks, Pains and Woes
I am really tempted to recommend it to everyone, but no OS is perfect, including VL. Firstly, I wasn't really impressed with the partition tool in the installer. Secondly, newbies who select the wrong graphics driver will be left wondering why Xorg crashes with "no screen found" each and every time they run "startx". Thirdly, the built-in wifi configuration in VASM refused to obtain an IP address from my D-Link router, and Wifi-Radar, which VL also ships, gave the same result; I finally had to install the KDE network manager just to get my wifi connected. Users coming from other distros -- Ubuntu, SUSE, Fedora, etc. -- will definitely miss their extensive package support, as Vector Linux currently has a limited package repository. There are some articles online about using Slackware packages directly on VL, which I still need to read. Overall, some hits and some misses.
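If you do get stuck at "no screen found", the usual escape hatch on cards like mine is to point Xorg at the open-source driver by hand. A minimal sketch of the relevant section of /etc/X11/xorg.conf (the identifier string is arbitrary); save it and run startx again:

Section "Device"
    Identifier "Card0"
    Driver     "radeon"
EndSection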

4. Final verdict
Once all the initial quirks are resolved (especially the Xorg issue), VL is a really solid distro. It is fast, as you'd expect from Xfce.
For older machines, Fluxbox is a really nice alternative. VL also packs wbar, a Mac-like dock, which can be activated for Xfce with the following command:

$ wbar -above-desk -pos bottom -isize 40 -nanim 5



The best way to really speed things up is to choose a lower desktop resolution. It works, but would you want to give up your crisp high resolution for a faded 16-bit low-res? The choice is yours: speed vs. looks. And yeah, forget Compiz. That last one really hurt... right? Enjoy your Linux experience, and please do give VL a try. I am off to try Mandriva 2009 RC2, but I have a gut feeling I will be back to VL. Have a nice weekend, folks!

Original here

Roll custom social networking sites with Elgg 1.0

By Mayank Sharma

Elgg is an open source application for rolling out a social network. It installs like any Web-based software, but instead of a blog or a wiki, it gives you all the components of a social networking site -- your own MySpace! It's popular with educational institutions and used by several universities across the world, in addition to powering the social networks of companies such as Swatch. The new Elgg 1.0, released last month, is modular in design, making it easier for developers to build social networks around the platform.

In a blog post on the new release, Ben Werdmuller, CTO of Curverider, the company that develops Elgg, explains that the software now incorporates features central to social networking, such as granular access permissions, cross-site tagging, and emphasis on personal ownership (tracing posts and other activity back to the author), right into the core. He says that after four years in development, the Elgg developers decided to rewrite the software from scratch. "While many applications take a simple beginning and try and duct tape social networking and next-gen features over the top, we started again. And as a result, Elgg is fast, flexible, extensible, and ready to power the next evolution of social technology."

Elgg 1.0 is available in two flavors, core and complete. The core version is for developers who want to roll their own social networks and decide what features they need. "As the Elgg 1.0+ community expands," Werdmuller says, "more and more plugins will become available, and who's to say you should have a blog, or forums, or photos?"

The complete version weighs in at around 1.4MB and bundles social networking features such as user profiles, blogs, file repository, forum, social bookmarking, and a dashboard. Many of the features are bundled as plugins and included in the complete version. The company provides detailed documentation on installing Elgg and configuring plugins.
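The broad strokes of an install are the familiar LAMP routine. A hypothetical sketch -- the archive name, paths, and database name below are illustrative, so follow the official install docs for the real steps:

$ tar xzf elgg-1.0.tar.gz -C /var/www/
$ mysql -u root -p -e "CREATE DATABASE elgg"

Then browse to the site (for example, http://yourserver/elgg/) and walk through the Web-based installer.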

Easier to customize and import/export data

You can customize Elgg by selecting plugins, tweaking style sheets, and, soon, by modifying themes. But depending on how you want to use the social network, you may want to customize it a lot more. To ease the customization process, the Elgg developers have built in several APIs and methodologies that let developers rolling out their own social networks easily modify all user-facing services.

"The major differences with the new Elgg are a simplified data model and a separation of logic from view," Werdmuller says. "Every entity in Elgg now inherits a single ElggEntity class, and any entity can have an arbitrary relationship established between them. That means a user (with an appropriate plugin) could link together a blog post, file, and a user profile, and then the system could traverse those connections in a generic way to enhance search results and provide new kinds of functionality. RDF fans will grok the importance of this immediately."

Behind the scenes, these entities have arbitrary metadata attached to them, which is searchable as tags. Entities also have access permissions associated with them. To see how all this manifests in code, take a look at the tutorial to build a simple blog in Elgg 1.0.
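Elgg itself is written in PHP, but the shape of that model is easy to sketch in a few lines of Python. This is a conceptual stand-in only (the class, verbs, and fields below are invented for illustration, not Elgg's real API):

class Entity:
    """Toy stand-in for a single base entity class."""
    def __init__(self, kind, **metadata):
        self.kind = kind
        self.metadata = metadata      # arbitrary key/value metadata, searchable as tags
        self.relationships = []       # (verb, other_entity) pairs between any two entities

    def relate(self, verb, other):
        self.relationships.append((verb, other))

profile = Entity("user", name="alice")
post = Entity("blog_post", title="First post", tags=["elgg"])
attachment = Entity("file", filename="diagram.png")

post.relate("written_by", profile)
post.relate("attaches", attachment)

# Generic traversal: follow any relationship without knowing the concrete
# types -- the kind of walk a search feature could use to enrich results.
for verb, other in post.relationships:
    print(verb, "->", other.kind, other.metadata)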

Elgg 1.0 also supports multiple viewtypes, each of which renders Elgg pages for a particular kind of interface. For example, an Elgg-powered site can have a standard HTML viewtype for normal browsers and a mobile viewtype for mobile devices.

Werdmuller says that the separation of logic from view means that it's easier to provide and add new viewtypes. For example, the RSS feed in Elgg 1.0 is literally just an RSS view on the same logic as the page you want the feed for. In addition to RSS, Elgg 1.0 also supports other viewtypes, including JavaScript Object Notation (JSON), Friend of a Friend (FOAF), and Open Data Definition (OpenDD). "The JSON view allows for AJAX fun, and we've also got extensible RESTful and XML-RPC API architectures so that plugins can add this kind of functionality easily. All this means that you could easily build a J2ME front end for Elgg, for example, and never bother with the default Web interface."
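The separation is easy to picture in miniature. Here is a rough Python sketch (not Elgg's actual PHP code; the data and renderers are invented) of one piece of content rendered through interchangeable viewtypes while the logic that produced it stays untouched:

import json

post = {"title": "Hello Elgg", "author": "mayank", "body": "First post."}

# Each viewtype is just a different renderer over the same data.
views = {
    "html": lambda p: "<article><h1>%s</h1><p>%s</p></article>" % (p["title"], p["body"]),
    "rss": lambda p: "<item><title>%s</title><description>%s</description></item>" % (p["title"], p["body"]),
    "json": lambda p: json.dumps(p),
}

def render(item, viewtype="html"):
    return views[viewtype](item)

for vt in ("html", "rss", "json"):
    print(render(post, vt))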

OpenDD, which is developed by the same team as Elgg, allows you to copy your data from one social network to another without losing track of your network of friends. Werdmuller says that the company is working on federating networks and working with other vendors to provide true data portability. "We think that's important for the future of the Web."

Talking about data portability in a blog post, Marcus Povey, senior developer at Curverider, illustrates Elgg 1.0's data export and import features via the views and actions interfaces.

Povey says that OpenDD will allow administrators to migrate between the previous Elgg releases (now known as Elgg Classic) and the new codebase with a "minimum amount of effort." But since Elgg Classic and Elgg 1.0 are two different codebases, upgrading will require an intermediary script, which Werdmuller says the company will release separately, though he gave no timeframe.

Original here

The Pirate Bay Clashes with Book Publishers

Written by Ernesto

Swedish book publishers have presented a study in which they show how widespread book piracy is in Sweden. The publishers think that this copyright infringement has a disastrous effect on their income, while The Pirate Bay is surprised to see that the publishers used their torrent database illegally.

The Swedish book publishers’ organization recently issued a report revealing that 85% of the best-selling books in Sweden are available on The Pirate Bay. Not really shocking news; Pirate Bay co-founder Peter Sunde told TorrentFreak he’s actually “a bit sad that it’s not 100%.”

Perhaps of more interest is the technique the publishers’ organization used to gather its data. In the report they write that they had to code a specialized tool to scrape the Pirate Bay database for book titles, since no ready-made tools were available.
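Such a tool needn’t be elaborate. As a minimal sketch of the approach in Python (the search URL pattern and the /torrent/ link check are assumptions for illustration; the publishers’ actual tool has not been published):

import requests
from bs4 import BeautifulSoup

def title_is_listed(title):
    # Hypothetical search URL; the real site layout may differ.
    url = "https://thepiratebay.org/search/" + requests.utils.quote(title)
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    # Assume result rows link to /torrent/ detail pages.
    return any("/torrent/" in a.get("href", "") for a in soup.find_all("a"))

bestsellers = ["Example Bestseller One", "Example Bestseller Two"]
found = [t for t in bestsellers if title_is_listed(t)]
print("%d of %d titles found" % (len(found), len(bestsellers)))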

Peter Sunde is now arguing that the organization broke the law by scraping the site multiple times without permission. “The Pirate Bay actually owns the copyright to its own database of torrents,” Sunde writes on his blog. Sunde further points to The Pirate Bay’s Usage Policy, which he says the book publishers’ organization violated.

In true MPAA style, Sunde is determined to fight for his intellectual property. “I called them up and asked them to present more information about the technical things, so we can send them an invoice if they don’t want to be dragged into court,” Sunde told TorrentFreak in a comment.

Meanwhile, the true motive behind the book publishers’ study remains vague. As we’ve pointed out many times before, piracy can actually boost book sales. One of the prime examples is best-selling author Paulo Coelho, who said he sold thousands of extra copies because he pirated his own books. Coelho’s success later inspired the publisher of author Leander Kahney’s two books to do the same, with several others following this example. Particularly for book authors, piracy seems to be a useful promotional tool rather than a threat. For now, that is.

We can’t foresee what will happen if someone launches a Kindle-ready pirate site. More on this later, for sure.

Original here

Making Social Networks Profitable

by Heather Green

Imagine there was one number that could sum up how influential you are. It would take into account all manner of things, from how many people you know to how frequently you talk with them to how strongly they value your opinion. Your score could be compared with that of pretty much anyone in the world.

Maybe it'll be called your Google number. Google (GOOG) has a patent pending on technology for ranking the most influential people on social networking sites like MySpace (NWS) and Facebook. In a creative twist, Google is applying the same approach to social networks it has used to dominate the online search business. If this works, it may finally make ads on social networks relevant—and profitable.

Google declined to discuss its idea with BusinessWeek. But it is based on the same principle as PageRank, Google's algorithm for determining which Web sites appear in a list of search results. The new technology could track not just how many friends you have on Facebook but how many friends your friends have. Well-connected chums make you particularly influential. The tracking system also would follow how frequently people post things on each other's sites. It could even rate how successful somebody is in getting friends to read a news story or watch a video clip, according to people familiar with the patent filing. "[Google] search displays Web pages with the highest influence—it makes complete sense for them to extend this to online communities and people," says Jeremiah Owyang, an analyst at Forrester Research (FORR).
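The filing's details aren't public, but the PageRank principle it borrows is simple to demonstrate. In this toy Python power-iteration sketch (the graph, names, and damping value are invented for illustration, not Google's actual method), people score higher when well-connected people interact with them:

def influence(graph, damping=0.85, iterations=50):
    """PageRank-style score over a graph mapping each person to
    the friends whose pages they interact with."""
    people = list(graph)
    n = len(people)
    score = {p: 1.0 / n for p in people}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in people}
        for p, friends in graph.items():
            if friends:
                share = damping * score[p] / len(friends)
                for f in friends:
                    new[f] += share
            else:
                # No outgoing interactions: spread this person's weight evenly.
                for q in people:
                    new[q] += damping * score[p] / n
        score = new
    return score

friends = {
    "alice": ["bob", "carol"],
    "bob": ["alice"],
    "carol": ["alice", "bob"],
    "dave": ["alice"],  # dave interacts with alice; nobody interacts with dave
}

for person, s in sorted(influence(friends).items(), key=lambda kv: -kv[1]):
    print(person, round(s, 3))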

How would this improve advertising on social networks? Say there's a group of basketball fans who spend a lot of time checking out each other's pages. Their profiles probably indicate that they enjoy the sport. In addition, some might sign up for a Kobe Bryant fan group or leave remarks on each other's pages about recent games they played or watched. Using today's standard advertising methods, a company such as Nike (NKE) would pay Google to place a display ad on a fan's page or show a "sponsored link" when somebody searches for basketball-related news. With influence-tracking, Google could follow this group of fans' shared interests more closely, see which other fan communities they interact with, and—most important—learn which members get the most attention when they update profiles or post pictures.

The added information would let Nike both sharpen and expand its targeting while allowing Google to charge a premium for its ad services. If Nike wanted to advertise a new basketball shoe, for example, it could work with Google to plop an interactive free-throw game only on the profile pages of the community influencers, knowing the game would be likely to draw the most attention in these locations. And because the new technique ranks links among groups, Google could also target the ads to broader communities. "I would pay a premium to get a particular video in front of someone who [shares] with others, and an even bigger premium for a lot of people who would share," says Ian Schafer, CEO of online ad firm Deep Focus, whose clients include Sean John and Universal Music Group.

Influence-ranking is no academic exercise for Google. So far the search giant has failed to earn much profit from social networking ventures. In 2006, Google promised to pay News Corp.'s (NWS) MySpace $900 million over three years for the right to put ads on the site. Google executives have expressed disappointment in that project, which is shaving 1.5% off Google's gross margins, according to Jeffrey Lindsay, an analyst at Sanford C. Bernstein. In its patent filing, Google acknowledged that some of its old approaches didn't work. With the new techniques, says Deep Focus' Schafer, "Google could be the Google of social media."

Original here

Net neutrality: An American problem?

This story was written by Brett Winterford and Julian Hill.

The leaders of three of Australia's largest ISPs have declared the Net neutrality debate solely a U.S. problem--and further suggested that the nation that pioneered the Internet might want to study the Australian market for clues as to how to solve the dilemma.

Net neutrality is a term coined by Internet users who oppose the increasing tendency among network owners (telecommunications companies) to tier or prioritize certain content on the network.

The debate was sparked after several American and British service providers proposed charging a premium to prioritize traffic to some sites over others. These service providers claim the Internet is "running out of capacity" due to heavy use of rich content like video and file-sharing traffic. The only way to fund expanded capacity, they argue, is to charge large media companies to prioritize traffic to and from their sites.

But Simon Hackett, the managing director of Adelaide-based ISP Internode, argues that it is ridiculous to suggest bandwidth is "running out."

"I don't subscribe to the view that network capacity is finite at all... Optical fiber basically doesn't run out of capacity, it's just a question of how fast you blink the bits at each end," he said in a recent interview with ZDNet.com.au.


"The (Net neutrality) problem isn't about running out of capacity. It's a business model that's about to explode due to stress. The problem, in my opinion, is the U.S. business model," said Hackett.

"The U.S. have got a problem," weighed in Justin Milne, group managing director for Telstra Media and former chief of Australia's largest ISP, BigPond. "Their problem is that unlike Australia, they (offer) truly unlimited plans."

The problem with an unlimited-access plan, explains Hackett, is that it "devalues what a megabyte is worth." American customers have never been able to put much of a dollar value on traffic, as historically, U.S. ISPs have "had it very easy" in terms of bandwidth costs. The United States invented the Internet and developed the first content for it, and the rest of the world essentially subsidized the U.S. to connect to that content.

"It was quite rational to charge (users) a fixed amount of money for access (in the U.S) because the actual downloads per month were trivial," said Hackett.

Today, there is as much local traffic floating around the rest of the world as there is in the United States, and America is as much a consumer of the world's content as it is a distributor of content to the world. In addition, the traffic being carried is far richer in terms of content, so the cost of feeding capacity to the YouTube generation is considerably higher.

"Now everybody file-shares and sends video all around the place," said Milne, "and the problem for the telcos in the U.S. is they are having to expand their networks as they go, but they are not getting paid any more money."

Who pays?
American ISPs are thus faced with a choice as to whom to charge in order to build out their networks to accommodate the increased traffic.

The first choice is to absorb the costs themselves, the status quo to date, which is less than desirable as a business model. The second choice is to cease to offer unlimited plans, which passes the cost of excessive bandwidth use onto those users that consume the most.

The final choice, says Michael Malone, CEO of ASX-listed ISP iiNet, is to charge content providers, the model that has stirred up controversy.

"The attempt is being made certainly in the U.K. but also in the U.S. to push that cost onto the content owner by saying, you pay, and we'll prioritize your traffic," he said. "(And) if you don't pay, your traffic will be really crap."

American ISPs are hesitant to take the option of charging customers for excessive use, Milne says, because they will "probably all knick off and go to my competitors who are not charging them." Instead, they plan to "charge the guys who are putting big gobs of video traffic into my network--which would be people like Microsoft and YouTube and Google etc."

"Those guys say, you're kidding, what about Net neutrality? The Net is supposed to be free, man! You can't charge us for putting traffic in there because that's denying the natural rights of Americans! I think the argument is thin but nevertheless Congress seems to be picking it up."

Lessons from Down Under
The right choice, all three Australian ISP leaders agree, is to put the onus on the user, a model that has worked well in Australia.

For an Australian ISP, around 60 to 70 percent of traffic comes from overseas. "You've got to haul the traffic," explains Milne. "All of that traffic is volumetrically charged--the more traffic you haul from overseas, the more you pay.

"So all ISPs in Australia, because of our unique geography, have got used to pay-as-you-go and have handed those pay-as-you-go principles on to their customers."

Malone says that when users are offered truly unlimited access to download as much as they want, 3 percent of customers account for over 50 percent of all downloads. Download quotas can eradicate that problem if they are set at a level that affects this 3 percent while having zero effect on the majority.
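To put rough numbers on that idea, here is a toy Python sketch (the usage figures are invented for illustration) that sets the cap near the 95th percentile of monthly usage, so only the heaviest users are affected:

# Monthly usage per customer, in GB, sorted low to high.
monthly_gb = sorted([3, 5, 8, 10, 12, 15, 18, 20, 25, 30,
                     35, 40, 45, 60, 80, 100, 150, 400, 900, 1500])

def percentile_cap(usage, pct=0.95):
    # Index of the value below which roughly pct of customers fall.
    usage = sorted(usage)
    index = max(0, int(len(usage) * pct) - 1)
    return usage[index]

cap = percentile_cap(monthly_gb)
over = [u for u in monthly_gb if u > cap]
print("quota: %d GB/month; customers affected: %d of %d"
      % (cap, len(over), len(monthly_gb)))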

Quotas, Malone says, aren't designed to be punitive.

"Quotas are meant to be able to say that for 95 percent of customers, this (much data) is enough...This is an effectively unlimited connection for most people.

"From my point of view, (Net neutrality is) an artificial problem created out of fear of modifying the business model," says Hackett. "The idea that the entire population can subsidize a minority with an extremely high download quantity actually isn't necessarily the only way to live," said Malone.

The Australian model gives ISPs predictability about income and network costs, explains Hackett.

"If a user uses much more stuff, they wind up on higher plans, so we can actually afford to bring in more (network equipment and capacity)," he said. "So it's kind of self-correcting. In the U.S., an ISP is visibly afraid of the idea of customers pulling video 24/7. (Whereas) if our users use more traffic, it doesn't actually scare us. You get the sense that it actually does scare (U.S. ISP) Comcast."

Milne says a number of U.S. cable companies have taken the hint and started charging "volumetrically."

"I think that's actually where things will finish up," he says. "Be it electricity, travel, petrol, we as humans have got used to the idea that the more you use the more you pay, albeit with a discount. The Net in the U.S. just magically decided to avoid that, and now I think they'll have to come back to reality."

"You can't just keep on building these networks forever for free. You can build them bigger and bigger and bigger, but somebody has to pay for it. There has to be a business model by which the network is paid for," added Milne.

Brett Winterford and Julian Hill of ZDNet Australia reported from London.

Original here