Tuesday, February 10, 2009

New Opera JavaScript engine supports native code generation

By Ryan Paul

The Opera web browser could soon get a script execution performance boost from a new JavaScript engine that supports native code generation. The new engine, which is called Carakan, was unveiled through a technical overview published on the official Opera developer blog on Wednesday.

The growing relevance of script-intensive web applications has compelled browser makers to search for ways to improve the performance of their JavaScript implementations. Renewed competition in the browser space has helped to accelerate this process, as the primary contenders battle to be the fastest. Apple's immensely impressive SquirrelFish Extreme engine currently leads the pack, but Mozilla and Google are also moving forward with new high-performance JavaScript engines. These all use JIT compilation and native code generators to achieve their performance gains.

The new Carakan engine aims to deliver similar functionality and help Opera stay competitive. According to developer Jens Lindström, Opera has had a small team working on the new engine for several months. Carakan will replace Futhark, the engine used in the latest stable version of Opera. Futhark, which was introduced in Opera 9.5, is a lightweight, stack-based bytecode interpreter that was designed with an emphasis on low memory consumption rather than optimal execution speed. Carakan brings a new register-based virtual machine and a nascent native code generator that leverages static type analysis.
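The difference between the two virtual machine designs can be sketched in a few lines of JavaScript. This toy interpreter pair is purely illustrative; the opcode names and encodings are invented here and bear no relation to Opera's actual bytecode.

```javascript
// Stack machine (Futhark-style): operands are pushed, popped, and
// pushed again, so even "x = a + b" costs several stack operations.
function runStack(program, vars) {
  const stack = [];
  for (const [op, arg] of program) {
    if (op === "PUSH") stack.push(vars[arg]);
    else if (op === "ADD") stack.push(stack.pop() + stack.pop());
    else if (op === "STORE") vars[arg] = stack.pop();
  }
  return vars;
}

// Register machine (Carakan-style): each instruction names its
// operands directly, avoiding the push/pop traffic entirely.
function runRegister(program, regs) {
  for (const [op, dst, a, b] of program) {
    if (op === "ADD") regs[dst] = regs[a] + regs[b];
  }
  return regs;
}
```

Computing `x = a + b` takes four instructions on the stack machine but a single three-operand instruction on the register machine, which is one reason register-based VMs tend to dispatch fewer instructions per statement.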

Although the native code generator isn't ready yet, the new virtual machine is already much faster than the previous one. Builds aren't available to the public yet, but Opera says that its internal benchmarking shows Carakan running roughly 2.5 times faster on the SunSpider benchmark. With native code generation, the performance difference will be much more significant.

"The native code generation in Carakan is not yet ready for full-scale testing, but the few individual benchmark tests that it is already compatible with runs between 5 and 50 times faster, so it is looking promising so far," Lindström wrote. "On ECMAScript code that is particularly well-suited for native code conversion, our generated native code looks more or less like assembly code someone could have written by hand, trying to keep everything in registers."

The conventional x86 and 64-bit native code generator backends have already been mostly implemented, but the ARM backend is still only at a very early stage of development. ARM support could potentially enable a whole new class of JavaScript-heavy web applications to work on handsets that run Opera's mobile browser.

Opera has also released some additional technical details about Vega, a hardware-accelerated vector graphics library that is used by the browser. Vega was originally developed to facilitate the implementation of SVG support in Opera, and it has since been adapted to power the HTML5 Canvas element. Opera developer Tim Johansson says that the browser could soon use Vega for all HTML rendering. This could potentially simplify the codebase by eliminating the need for platform-specific rendering code paths. It will also simplify implementation of some advanced CSS3 features and make it possible to leverage hardware acceleration for complex drawing.

Opera's move towards high-performance JavaScript and hardware accelerated graphics is another indication of how the reinvigorated browser market is leaving even the smaller players with a stark choice: keep pace or face irrelevance. The winners are clearly users, who can use the latest and greatest on the web with the browser of their choice.


Best Firefox Extensions: 13 Money- and Time-Saving Firefox Add-ons!

Image Source: Mouserunner

Posted by Abbas

Tens and hundreds of Firefox plugins are released by individual developers every day. I looked through them and found a few that I think are the best: extensions capable of replacing a standalone application and saving money, or of simplifying a task and saving some time.

Do leave a comment and let us know about your favorite plugin, and whether you see it on the list we have created. These extensions can come in really handy and can improve your productivity.

Let's start; here goes the list…

This is a good replacement for a regular download manager, offering all the typical functionality of one. The plugin currently has 1.3 million active users, which is a testament to its popularity and usefulness. This certainly is a money saver.

“The first and only download manager/accelerator built inside Firefox!”


DejaClick plugin records all the events on a browser and plays it back for you in the same sequence whenever required. For example, if you open certain sites the moment you switch your computer on, this activity can be recorded and played every time you switch the computer on instead of manually opening one site after another.

“DéjàClick is a web recorder and Super Bookmark utility designed exclusively for Firefox. You can record and bookmark your browser activities, then with a single click, replay the entire sequence all over again.”


Video DownloadHelper

This plugin lets you download videos from a LOT of sites. Unlike the RealPlayer plugin, it does not offer a video download on mouse-over in the context menu. The best thing about this plugin is that it shows a list of 200+ sites from which videos can be downloaded.
"The easy way to download and convert Web videos from hundreds of YouTube-like sites. This works also for audio and picture galleries."


This has got to be one of the best extensions ever made for Firefox. I use it day in and day out and have “almost” never had an issue with it. I highly recommend this plugin for bloggers and webmasters; I love it for its simplicity and for the fact that it is free.

“FireFTP is a free, secure, cross-platform FTP client for Mozilla Firefox which provides easy and intuitive access to FTP servers.”


This extension simply connects to your Gmail account and lets you upload files to, and download files from, your Gmail storage space. Every time a file is uploaded, you will see an e-mail in your account with the file attached. If the file is deleted using Gspace, the e-mail disappears too.

“This extension allows you to use your Gmail Space (4.1 GB and growing) for file storage. It acts as an online drive, so you can upload files from your hard drive and access them from every Internet capable system. The interface will make your…”


Firebug is a MUST HAVE plugin for web designers or anyone who does any type of development online. Firebug lets you easily analyze and edit code, or view CSS and JS, right on the site.

“Firebug integrates with Firefox to put a wealth of development tools at your fingertips while you browse. You can edit, debug, and monitor CSS, HTML, and JavaScript live in any web page… Firebug 1.2 requires Firefox 3. Firefox 2 users should install the older 1.05 version of Firebug.”


Screengrab saves time and machine resources. Unlike most other screen-capturing programs, it does not run as a separate program. As it is a Firefox plugin, it only works within the webpage area.

“Screengrab saves entire webpages as images…”


SQLite Manager
Three words for this one… WOW, WOW and WOW! Creating tables has never been so easy. Install it right now.

“Manage any SQLite database on your computer.”


The author's description says it all. It's the closest you can get to a professional web-editing interface in a browser.

“Get the feel of Dreamweaver in a Firefox extension. Edit your documents right next to your web pages as you surf.”


Gmail Manager
This is an ideal time-saving plugin for people who maintain multiple Gmail accounts. It lets you configure multiple Gmail accounts, sits in the status bar, and alerts you to new mail in any of the configured accounts.

“Allows you to manage multiple Gmail accounts and receive new mail notifications. Displays your account details including unread messages, saved drafts, spam messages, labels with new mail, space used, and new mail snippets.”


This is the best plugin if you want to surf the net undercover. It disables certain tracking features in the browser, making it possible for users to browse the internet anonymously.

“If there are times you want to surf the web without leaving a trace in your local computer, then this is the right extension for you”


Menu Editor
Menu Editor lets you choose what you want to see in your Firefox menus and the options within each menu. All it takes is a click to hide an entire menu in the browser.

“Menu Editor is an extension that allows you to customize the menus in Firefox and Thunderbird.”


Greasemonkey is a plugin that adds extra functionality to webpages based on custom JavaScript code fed into it.

“Allows you to customize the way a webpage displays using small bits of JavaScript.

Hundreds of scripts, for a wide variety of popular sites, are already available at

You can write your own scripts, too. Check out to get started.”
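To give a sense of what such a script looks like, here is a minimal userscript sketch; the metadata values and the highlighting behavior are invented for illustration, not taken from any published script.

```javascript
// ==UserScript==
// @name        Highlight external links
// @namespace   example
// @include     *
// ==/UserScript==

// Pure helper: decide whether a link points off the current host.
function isExternal(href, host) {
  try {
    return new URL(href).host !== host;
  } catch (e) {
    return false; // relative or malformed URLs stay untouched
  }
}

// Greasemonkey runs the script against every matched page's DOM.
if (typeof document !== "undefined") {
  for (const a of document.querySelectorAll("a[href]")) {
    if (isExternal(a.href, location.host)) {
      a.style.backgroundColor = "#ffff99"; // tint off-site links
    }
  }
}
```

Once installed, the script runs automatically on each page load; the metadata block at the top tells Greasemonkey which pages it applies to.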


MeasureIt
MeasureIt comes in really handy if you want to know the dimensions of an image or anything else on a webpage. The plugin sits in the status bar, and at the click of a button it helps you measure the area of a given object.

“Draw out a ruler to get the pixel width and height of any elements on a webpage.”


Most Popular Twitter Clients Revealed

by Pete Cashmore

The micro-messaging service Twitter became popular in part due to the thousands of different ways to post and view Tweets…but what are the most popular ways to share your Tweets with the world?

The Twitter analytics service TwitStat is continually running the numbers and provides us with the following, fascinating list. Unsurprisingly, most users post from the Twitter website, but the desktop app TweetDeck is a very popular way to post - more than 10% of the users surveyed were posting from it (see our TweetDeck vs Twhirl review for more info on TweetDeck).

When the Twitter search service Summize became popular, Twitter acquired it. Would it make sense, then, for Twitter to buy TweetDeck, considering its huge lead in the desktop Twitter software market?

Top 10 Twitter Clients

TwitStat’s current top 10 list is below, showing the ranking and the percentage of users posting from each service today. Note that where users made use of multiple methods, all were counted. The list is constantly changing, and the Top 100 can be found here.

1 Web - 29.57%

2 TweetDeck - 10.58%

3 Twitterfeed - 5.54%

4 Twhirl - 4.92%

5 Twitterrific - 3.79%

6 Tweetie - 3.65%

7 Text - 3.40%

8 Mobile web - 3.00%

9 TwitterFon - 2.96%

10 TwitPic - 2.94%


Who's Messing with Wikipedia?

By Erica Naone

Credit: Technology Review

Despite warnings from many high-school teachers and college professors, Wikipedia is one of the most-visited websites in the world (not to mention the biggest encyclopedia ever created). But even as Wikipedia's popularity has grown, so has the debate over its trustworthiness. One of the most serious concerns remains the fact that its articles are written and edited by a hidden army of people with unknown interests and biases.

Ed Chi, a senior research scientist for augmented social cognition at the Palo Alto Research Center (PARC), and his colleagues have now created a tool, called WikiDashboard, that aims to reveal much of the normally hidden back-and-forth behind Wikipedia's most controversial pages in order to help readers judge for themselves how suspect its contents might be.

Wikipedia already has procedures in place designed to alert readers to potential problems with an entry. For example, one of Wikipedia's volunteer editors can review an article and tag it as "controversial" or warn that it "needs sources." But in practice, Chi says, relatively few articles actually receive these tags. WikiDashboard instead offers a snapshot of the edits and re-edits, as well as the arguments and counterarguments that went into building each of Wikipedia's many million pages.

The researchers began by investigating pages already tagged as "controversial" on Wikipedia: they found that these pages were far more likely to have been edited and re-edited repeatedly. Based on this observation, they developed WikiDashboard, a website that serves up Wikipedia entries but adds a chart to the top of each page revealing its recent edit history.

WikiDashboard shows which users have contributed most edits to a page, what percentage of the edits each person is responsible for, and when editors have been most active. A WikiDashboard user can explore further by clicking on a particular editor's name to see, for example, how involved he or she has been with other articles. Chi says that the goal is to show the social interaction going on around the entry. For instance, the chart should make it clear when a single user has been dominating a page, or when a flurry of activity has exploded around a particularly contentious article. The timeline on the chart can also show how long a page has been neglected.
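The per-editor breakdown described above is straightforward to picture in code. This sketch computes edit counts and percentage shares from a list of revisions; the data shape is invented for illustration and is not PARC's actual format.

```javascript
// Summarize an article's edit history the way WikiDashboard's chart
// does: per-editor edit counts and percentage shares, most active first.
function editShares(edits) {
  const counts = {};
  for (const { editor } of edits) {
    counts[editor] = (counts[editor] || 0) + 1;
  }
  const total = edits.length;
  return Object.entries(counts)
    .map(([editor, n]) => ({ editor, n, pct: Math.round((100 * n) / total) }))
    .sort((a, b) => b.n - a.n);
}
```

A history dominated by one editor shows up immediately as a single entry with a large percentage share, which is exactly the "single user dominating a page" signal Chi describes.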

Courting controversy: WikiDashboard gathers information about the social interactions underlying Wikipedia entries and displays it to a user. The entry for former U.S. president George W. Bush, shown above, stood out as the most controversial. The researchers discovered that certain statistics, such as the number of total revisions made to an article, could accurately predict controversy.
Credit: WikiDashboard/PARC

The page on Hillary Clinton, for example, shows that the main contributor has put in about 20 percent of the edits. Chi says this suggests that this individual has guided a lot of the article's direction. In contrast, an entry on windburn shows a much less heated scene: more even collaboration among the contributors.

The researchers released an early version of the tool in 2007 using data released a few times a year by Wikipedia. But Chi says that this version of WikiDashboard was limited, since it didn't show the speed of change online. His team spent much of 2008 getting access to live data, which Chi says was difficult because of Wikipedia's limited resources.

Daniel Tunkelang, chief scientist at Endeca, an information analysis firm based in Cambridge, MA, says that the tool is a step toward exploring the social context of Wikipedia entries, but he adds, "There's some room for compressing this into something more consumable." By this, Tunkelang means that the software could be more useful to the casual user if it summarized data more effectively. For example, he says that the list of articles that each editor has worked on could be shown as just a handful of easy-to-read tags.

At a talk given by Chi this week, Rob Miller, an associate professor at MIT's Computer Science and Artificial Intelligence Lab, noted that some Wikipedia editors try to rack up a high number of edits just to gain more kudos. He wondered how that tendency might affect WikiDashboard's measurements should the tool catch on.

Chi's group is still working on the WikiDashboard, and on Wikipedia data more generally. He says that he'd like to see a system that measures not just simple statistics such as the number of edits made, but also the quality of those contributions.


New Skype 4.0 software adds full-screen video calls


Skype has called the new software, which is available as a free download for all Windows users, “the most distinctive new release in the company’s five-year history”.

It brings full-screen video calling to Skype’s estimated 35 million active daily users, as well as “crystal clear call quality”, and a streamlined interface that’s more intuitive and easier to use.

Skype 4.0 also has a built-in bandwidth manager to ensure the performance of video calls even over a slow internet connection, in an effort to keep judder and lag to a minimum.

“Video calling has emerged as a very popular way for people to communicate online,” said Mike Bartlett, director of product strategy for Skype. “Whether it’s saying hello to your daughter at college or reading a book to your child while you’re away on business, video calling lets you have the conversations that make a difference and it really feels almost as good as being there.”

Users can also switch more easily between calls, and can keep track of multiple conversations, as well as flit between voice and video calls to friends and family, or instant-messaging and text chats.

The changes have been in development for two years, said Skype, and many of the improvements were based on user feedback.

Skype’s service is increasingly being built into other devices, as well as computers. Owners of many of 3’s range of mobile phones have a Skype program pre-installed on their handset, which allows them to make free calls via Skype whenever they are in a Wi-Fi internet hot spot.

It is thought that eBay, which bought Skype for $2.6 billion in 2005, is looking to sell off the service. The online auction company's chief executive, John Donahoe, said recently that Skype was “a great stand-alone business”.


Data breach incidents are increasing, study shows

by Jon Oltsik

My official title may be "analyst," but market research is the part of my job that appeals to the geek in me. Good thing I work at ESG, where we do market research around information assurance all the time.

Given an IT security landscape highlighted by regulatory compliance, publicly-disclosed data breaches, and increasingly sophisticated threats, we often ask survey respondents whether their organization suffered a data breach in the last 12 months. ESG has probably asked this very question in several research projects over the past few years. In the past, about 30 percent of large organizations (i.e. 1,000 employees or more) claimed that their organization had suffered a data breach within the last year.

This pattern was fairly consistent from 2005 through 2007, so I expected to see similar results when we conducted another research survey focused on application and database security at the end of 2008. I was shocked to see that things have actually grown much worse. In a November 2008 survey of 179 North American-based security professionals, 56 percent claimed that their organization had suffered a data breach within the past 12 months. In further analysis, 61 percent of organizations with 1,000 to 5,000 employees suffered a data breach in that time frame. It's easy to assume that these smaller firms are more at risk since they are likely to have fewer security technologies in place and smaller security staffs. Perhaps this is true, but even bigger companies are suffering data breaches--49 percent of organizations with 5,000 employees or more endured at least one data breach of their own.

Armed with data from several years of surveys, I think it is safe to assume that things are getting worse, not better. Sensitive data continues to flow throughout the enterprise, ending up in e-mails and IMs, laptops, and thumb drives, and into the hands of malicious or careless employees--an uphill battle indeed.

We all realize that the economy stinks and CIOs absolutely must cut IT spending. That said, the ESG data suggests that they take a prudent approach to security spending cuts. Remember that one publicly-disclosed breach can cost a lot more than a security staffer, technology safeguard, or additional training. Just ask TJX, Heartland Payment Systems, Monster, or the 56 percent of large organizations represented in the ESG Research data.

Jon Oltsik is a senior analyst at the Enterprise Strategy Group. He is not an employee of CNET.


Binary in 60 seconds


Hey guys, welcome to Math in 60 seconds. Today we are gonna talk about binary numbers in 60 seconds. So, let’s go for it. Alright, binary is what is called a base-two system. There are two things to it. One is that each of the places is a power of two, and we will talk about that shortly. The other is that we only have two choices: everything is going to be either a zero or a one. So, let’s look at it.
So, first, I want to put the number 22 in binary. What we are going to do is think about how many of each of these places are in 22. So, I am going to start with my biggest. How many 16s are in 22? Well, there is one 16 in 22. I have used up one 16 and there are still 6 left, because from 16 to 22 there are 6 more. That means eight is too much; I only have six left over. So, there are no 8s. I do need a four though. I have 6, and I just used up four of them. So, there is a two left, so there is a two there as well. That gives me 22. So, check it: 16 plus 4 is 20, plus 2 is 22. So, I don’t need any of the ones. Now, let’s try it backwards. Let’s make up a number, say 01101, and see what that is. Well, there are no 16s, there is one 8, one four and one 1. So, that’s 8, a 4 makes 12, plus one more gives us 13. Hey guys, that’s binary numbers in 60 seconds. Hope that helped. See you again soon.
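The walkthrough above maps directly onto a small pair of functions. This sketch handles only the five places used in the video (16 down to 1), so it covers the values 0 through 31.

```javascript
// Convert decimal to binary by testing each power of two in turn,
// just as in the walkthrough: 22 -> one 16, no 8, one 4, one 2, no 1.
function toBinary(n) {
  let bits = "";
  for (let p = 16; p >= 1; p /= 2) {   // places: 16, 8, 4, 2, 1
    if (n >= p) { bits += "1"; n -= p; }
    else        { bits += "0"; }
  }
  return bits;
}

// And back again: each bit doubles the running total and adds itself,
// so 01101 -> 8 + 4 + 1 = 13.
function fromBinary(bits) {
  return [...bits].reduce((total, b) => total * 2 + Number(b), 0);
}
```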


Why Bill Gates Is My Worst Nightmare

Posted By: Jim Goldman

Bill Gates

By now you've heard the wacky way Microsoft Chairman Bill Gates tried to illustrate his point about impoverished people around the world trying to deal with the lethal problem of malaria, and what the Bill and Melinda Gates Foundation is trying to do about it: Gates opened a jar filled with mosquitoes on stage at the TED2009 Conference in Long Beach, California last night and subjected the audience to instant infestation.

And while the foundation itself declared that the insects were malaria-free (but no word on West Nile or encephalitis), that was little solace to the hundreds in the audience hearing that awful humming noise in their ears as the mosquitoes moved from victim to victim.

Microsoft, of course, is no stranger to bugs and their effects on helpless, unsuspecting victims. Think Vista and you get my point.

For me personally, I have a mosquito phobia. I'll never forget, as an 8-year-old boy, suffering a particularly awful, hot, humid summer when I was being eaten alive daily by mosquitoes. One night, I was lying in bed and one of the blood-suckers was humming around my ear. It was driving me crazy. I lay there, in the dark, helpless. It got so bad that I crawled to my bathroom, shoved a towel under the crack of the door, curled up and slept in there. Ever since, nothing drives me crazier than that dreaded mosquito hum. It's awful.

Pierre Omidyar, eBay's founder, was at the event and Twittered, "That's it. I am not sitting up front anymore."


Gates was trying to raise awareness of the global malaria problem, with as many as a half-billion victims infected every year, according to the Centers for Disease Control and Prevention, leading to 1 million deaths. Gates' foundation will pump nearly $170 million into the PATH Malaria Vaccine Initiative working to create a malaria vaccine. Good thing he's not focused on Ebola (monkey-meat sandwiches anyone?) or the Plague (release the flea-infested rats!) His stunt worked, however. Google Gates and mosquitoes and you'll get a boatload of results.

Just goes to show that creative marketing can still generate results. And the mosquitoes were infinitely better than, say, trotting out comedian Jerry Seinfeld who made those appearances with Gates in those bizarre Microsoft ads from last year to try to help deliver the new Microsoft message. Heck, Microsoft used to use butterflies to get the message out about its MSN service in 2000. Maybe the whole company will adopt mosquitoes for Windows 7?

I've reached out to some of you in attendance last night, but haven't heard back. If you were there, got bitten or not, and want to share your experiences and thoughts from the event, feel free to write in. I'll post your comments when I get them.


What's so bad about XP?

by Don Reisinger

Windows XP

I prefer XP to Vista. Am I alone?

(Credit: Microsoft)

In an interview with Computerworld UK, Microsoft CEO Steve Ballmer had some interesting things to say about the enterprise and its desire to keep Windows XP instead of switching to Vista.

"If you deploy a 4- or 5-year-old operating system today, most people will ask their boss why the heck they don't have the stuff they have at home," Ballmer told the publication.

Ballmer went on to say that it's incumbent upon the business world to make employees happy and comfortable and the best way to do that is to deploy Vista as soon as possible.

According to market research firm Forrester, Windows Vista can be found on less than 10 percent of the computers that companies in North America and Europe are running. Windows XP can be found on over 71 percent of enterprise computers. Ballmer wants that to change.

But his premise that an increasing number of people will be asking for Vista doesn't really make sense. According to research firm Net Applications, Windows XP still controls almost 64 percent of the worldwide consumer market. It's trailed by Vista and Mac OS X 10.5 with 27 percent market share and 5 percent market share, respectively.

Based on those figures, I'm not convinced that there are that many people walking into their supervisor's office wondering when Vista will be deployed at the office. In fact, it's far more likely that they would rather use something they know--XP.

But Ballmer's desire to get enterprises to switch to Vista has me wondering what's so bad about Windows XP. Is it really such an awful operating system that every company should switch?

No way.

I realize Ballmer has a vested financial interest in seeing more companies switch to Vista, but I'm a firm believer that they should stick to XP until Windows 7 is released and all the kinks are worked out. That's why I only use Vista when I must.

It's not that I hate Vista; I just think it suffers from too many issues to justify using it. It's too resource-intensive, and I don't want to buy a new computer to optimize its performance. But I can (and do) run XP on my Asus Eee PC, as well as on an older machine that isn't even capable of running Vista.

And although security is always mentioned when comparing Vista to XP, I don't take the bait. I've installed Service Pack 3 on XP and you know what? It's just as secure, based on the way I use the OS, as Vista with SP1 installed.

But it goes beyond security. I prefer XP because, unlike Vista, I don't need to worry about the quality of my GPU or how much RAM my computer has. It just works with what I have. More importantly, I find that Vista is much slower, even with better components, than a comparably equipped XP machine. For a newer OS, that's unacceptable.

From a business perspective, it makes perfect sense to keep XP for now. The enterprise is still upset about Vista's compatibility issues when it first launched and because it's so resource-intensive, many companies would need to update their machines just to deploy the OS. Windows 7 is also right around the corner and it only makes sense, especially in uncertain economic times, to wait and save money for now.

Maybe Ballmer is right and he really does have his finger on the pulse of computer users across the world, but I prefer XP and think it's a better operating system than Vista. I know he has to say that companies should switch because his company spent all that money on developing Vista, but I think his logic is flawed (do employees really complain about Vista vs. XP?) and I think he's being too hard on the elder OS.

And it looks like many companies agree.

"IT decision-makers don't have an entirely rosy outlook for Windows Vista," Forrester analysts said in a recent report. "We found that 15 percent plan on skipping Windows Vista entirely and going straight to Windows 7 soon after its release in 2010. And another 22 percent still have no definitive plans for deploying Windows Vista, and 6 percent simply don't know yet what their plans are."


Check out Don's Digital Home podcast, Twitter feed, and FriendFeed.

Don Reisinger is a technology columnist who has written about everything from HDTVs to computers to Flowbee Haircut Systems. Don is a member of the CNET Blog Network, and posts at The Digital Home. He is not an employee of CNET. Disclosure.


Windows 7 UAC flaws and how to fix them

By Peter Bright

The Windows 7 UAC Slider

Unlike many, I'm a big fan of Vista's User Account Control. Truth is, I don't get a lot of prompts asking me to elevate, and those that I do get are legitimate. Sure, the implementation isn't perfect; there are some scenarios that cause a rapid proliferation of prompts that are a little annoying (such as creating a folder in a protected location in Vista RTM), and there are even a few places where it forces elevation unnecessarily, but on the whole I think it's a good feature.

The basic purpose of UAC is to annoy you when your software needs Admin privileges. The reason for this is simple: a lot of Windows software demands Admin privileges not because it needs to be privileged for everything it does, but rather because it was the quickest, easiest way for the developer to do some minor task. For example, games with the PunkBuster anti-cheat system used to demand Administrator privileges so that PunkBuster could update itself and monitor certain system activity. This was bad design because it meant that the game was then running with Administrator privileges the whole time—so if an exploit for the game's network code was developed, for example, that exploit would be able to do whatever it liked.

The solution to this kind of problem is to split the application up in one way or another. In the PunkBuster case, the privileged parts were split into a Windows service (which has elevated privileges all the time), leaving the game itself running as a regular user. There are a number of other approaches to tackling the same problem, but in general they all require an application to be restructured somewhat so that privileged operations can be separated from non-privileged ones.

As well as this "annoyance" role, UAC also provides a warning when software unexpectedly tries to elevate its privileges. UAC has heuristics to detect applications that "look like" installers, and it also traps important system utilities like Registry Editor. Though Microsoft has cited this kind of behavior as a benefit of UAC, the company has also said that UAC is not a "security boundary." That is to say, if a malicious program figures out a way of elevating without a UAC prompt (or by tricking the user into agreeing to the UAC prompt) then that's not a security vulnerability. If you want real security with UAC you have to run as a regular user and enter a username and password to elevate—the Admin approval click-through mode (the mode that's the default for the first user account created on any Vista system) is not intended to be secure.

The winds of change are blowing

Why bring this up? Well, first of all, Windows 7 brings some changes to UAC to try to reduce the number of prompts that Administrators see. The basic idea is that if you make a change through one of the normal Windows configuration mechanisms—Control Panel, Registry Editor, MMC—then you don't get a prompt asking you to elevate. Instead, the system silently and automatically elevates for you. Third party software will still trigger a prompt (to provide the warning/notification that it's raising its privileges), but the built-in tools won't. In this way, you don't get asked to confirm deliberate modifications to the system; the prompts are only for unexpected changes.

In my naïveté, I initially assumed that the differentiation was made according to where the action originated: keyboard and mouse input (i.e., genuine user actions) rather than something more simplistic, like trusting particular applications. After all, the computer knows that a keystroke or mouse click originated in the hardware (because a driver gets to handle it), so it can easily tell what's real and what's not. A trusted application, however, could be started up by a malicious program and made to do bad things. So surely that wasn't the route Redmond chose?

It turns out that is indeed the route Redmond chose. For a number of years now, Microsoft has attached digital signatures to the programs and libraries that make up Windows; these signatures allow you to verify that a program did indeed come from Microsoft just by looking at the program's properties. In Windows 7, most programs with Microsoft signatures are trusted by UAC and won't cause a prompt. Instead, they just silently elevate. Unfortunately, Microsoft hasn't done anything to resolve the problem with this approach—trusted applications can be tricked into doing bad things. A few programs such as cmd.exe, PowerShell, and Windows Scripting Host don't auto-elevate (because they're designed to run user code, rather than operating system code), but they're the exception. Everything else elevates, and is vulnerable to being abused.
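The decision rule described above is simple enough to sketch in a few lines of Python. This is purely an illustrative model of the behavior the article describes, not Microsoft's actual implementation; the function name and the exception list are mine.

```python
# Toy model of the Windows 7 auto-elevation policy described above.
# Programs designed to run user-supplied code are excluded from silent
# elevation even though they carry Microsoft signatures.
AUTO_ELEVATE_EXCEPTIONS = {"cmd.exe", "powershell.exe", "wscript.exe", "cscript.exe"}

def prompts_for_elevation(exe_name, microsoft_signed):
    """Return True if UAC would show a consent prompt for this elevation."""
    if microsoft_signed and exe_name.lower() not in AUTO_ELEVATE_EXCEPTIONS:
        return False  # trusted: elevates silently, no prompt
    return True       # third-party code, or a designated exception: prompt

# Explorer and rundll32 are Microsoft-signed and not excepted, so they
# elevate without any prompt at all:
print(prompts_for_elevation("explorer.exe", True))       # False
print(prompts_for_elevation("notepad-clone.exe", False)) # True
```

The weakness is visible right in the model: the decision depends only on what the binary *is*, not on what it is being asked to *do*.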

This was noticed last week by Long Zheng at I Started Something. Together with Rafael Rivera, he put together an exploit for this silent elevation. The exploit programmatically passed keystrokes to an Explorer window, navigating to the UAC control panel and setting the slider to disabled. Because Explorer is trusted, changing the setting doesn't cause a prompt. Instead, UAC is silently disabled.

Sending keystrokes is a bit crude, so a second attack was developed. This second attack was more flexible; instead of merely disabling UAC, it allowed any code to run elevated without prompting the user. It does this by using a Windows program called rundll32. rundll32 has been part of Windows for a long time; its purpose is, as the name might imply, to allow DLLs to be run, almost as if they were normal programs. The exploit simply puts the malicious code into a DLL and tells rundll32 to run it. rundll32 is trusted, so it elevates automatically.

Together, these attacks mean that Windows 7's default UAC configuration is virtually worthless. Silently bypassing the prompts and acquiring Administrator privileges is as easy as putting code into a DLL. Windows Vista doesn't have a problem, because it doesn't trust any programs; the problems are purely due to the changes Microsoft has made to UAC in the name of convenience in Windows 7.

Dismissing instead of fixing


Given the importance of security and UAC, one might expect Microsoft to take note of this problem and do something to fix it. Unfortunately, the company's first response was to dismiss the behavior as happening "by design." Redmond says that, because UAC isn't a security boundary, it doesn't matter if silent elevation occurs; it's not a vulnerability. UAC is only there to keep legitimate software authors honest, not to stop malware. After the second exploit was disclosed, on Thursday a company representative made a lengthy blog post reiterating that UAC is not a security boundary and that the behavior is by design—it's awfully convenient, you see, so it doesn't matter if it's actually useful as a security measure.

In essence, the argument Microsoft has made is that if a user runs malicious programs as an Administrator and those programs do malicious things, that's not a security flaw, because the user ran the programs as an Administrator, and an Administrator is allowed—by design—to do things that can break the system. What this argument misses is that, until elevated, the malicious program can't do all the nasty things that malicious programs tend to do; it can't modify system files, make itself run on startup, disable anti-virus, or anything like that. Choosing to run a program without elevation is not consent to running it elevated.

Maybe this needs to be fixed after all

Things then took a turn for the weird. A second post was made admitting that, well, the company had "messed up" with the first post, in two ways. First and foremost, the new UAC behavior is badly designed; second, the whole issue was badly communicated by the company. The Windows 7 team will change the UAC behavior from that currently seen in the beta to address the first flaw. This fix won't be released for the current beta, though, and we'll have to wait until the Release Candidate or even RTM before we can see it in action.

When fixed, the UAC control panel will be different in two important ways. It will be a high integrity process—which will prevent normal processes from sending simulated keystrokes to it—and changes to the UAC setting will all require a UAC confirmation, even if the current setting does not otherwise require it. Though this will resolve the first exploit, it looks like it will have no impact on the second, and since the second exploit was the more useful anyway (as it can be used to do anything, not just change the UAC setting), this fix doesn't seem extensive enough.


In short, trusting executables is a poor policy, because so many executables can be coaxed into running arbitrary code. There is some irony in Microsoft's decision to use a trusted-executable model; the company knows damn well that trusted executables aren't safe, and uses this very argument to justify the UAC behavior in Vista. A system using trusted executables will only be secure if all of those executables are unable to run arbitrary code (either deliberately or through exploitation).

That clearly isn't the case in Windows 7; rundll32's express purpose is to run arbitrary code! Removing the auto-elevation from rundll32 may be unpalatable, too. While non-elevating programs like Windows Scripting Host and PowerShell are used predominantly for user code, rundll32 is used mainly for operating system code. Removing its ability to elevate would, therefore, reintroduce some of the prompts that Windows 7 is trying to avoid. And even if rundll32 lost its ability to elevate automatically, there are almost certain to be other trusted programs that can be abused in a similar way. So, in spite of the most recent blog post, this remains a poorly-designed feature. UAC is now only as strong as the weakest auto-elevating program.

It equally remains poorly communicated. Fundamentally, the defense that UAC is not a security boundary just doesn't cut the mustard. Microsoft sells UAC as providing "heightened security", as a way of limiting the "potential damage" that malware can do. To then argue that users should not, in fact, expect UAC to keep them secure is insulting. Moreover, even if the purpose of UAC is just to keep application writers honest, these exploits mean it fails to achieve even that. The simple fact is that it's a lot easier to restructure an application to make it use rundll32 to automatically elevate than it is to do things the Right Way. The unscrupulous or lazy software vendor who just wants to do the simplest thing possible to make the prompts go away will surely prefer that option to actually fixing their application.

As someone who thinks that UAC is a good idea, I find these efforts to undermine it terribly disappointing. As things currently stand, Windows 7's default UAC settings render it pointless in Admin approval mode, as it's so trivially bypassed. It might as well be turned off completely for all the good it does. To break a security feature—boundary or no boundary, it's sold as a security feature and it acts like a security feature, so I'm certainly going to treat it as a security feature—for the sake of convenience is a grave mistake.


Windows XP's Days are Really Numbered Now

Gregg Keizer, Computerworld

In 10 weeks, Microsoft Corp. will begin to retire Windows XP by shifting the seven-year-old OS into a more limited support plan.

Windows XP, Microsoft's most successful operating system ever, will leave what the company calls "mainstream support" on April 14, and enter "extended support." Typically, Microsoft keeps a product in the former for five years, then moves it into the latter for another five, for a total of 10 years. However, the long span between the releases of XP and its successor, Windows Vista, forced the company to push out the support deadline to 13 years altogether.

Also, two years ago Microsoft bumped support for Windows XP Home and XP Media Center to the 2009 and 2014 dates, matching the dates that had previously been set for Windows XP Professional, the designated business edition of the operating system.

By Microsoft policy, mainstream support delivers free fixes, security patches and other bug fixes alike, to everyone. During extended support, all users still receive all security updates, but non-security hotfixes are provided only to companies that have signed support contracts with Microsoft.
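The lifecycle policy described here reduces to a small decision table, sketched below as Python. The function and phase names are illustrative, not Microsoft's terminology.

```python
# Toy model of the support-lifecycle policy described above.
def fix_is_available(phase, fix_type, has_support_contract=False):
    """Can a given customer receive a given kind of fix?"""
    if fix_type == "security":
        return True        # security updates flow in both phases
    if phase == "mainstream":
        return True        # non-security hotfixes are free for everyone
    # Extended support: non-security hotfixes require a paid contract.
    return has_support_contract

print(fix_is_available("extended", "security"))      # True
print(fix_is_available("extended", "hotfix"))        # False
print(fix_is_available("extended", "hotfix", True))  # True
```

In other words, the only customers who lose anything on April 14 are those who want non-security fixes without paying for a support contract.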

Several Microsoft spokespeople confirmed that Tuesday. "Customers will have access to extended support for paid support, security support updates at no additional cost and paid hotfix support," a company spokeswoman said in an e-mail. Firms must purchase an extended support contract within 90 days of XP's mainstream support retirement in April.

"All security updates are provided through both mainstream and extended support," added Frank Fellows, another Microsoft spokesman.

Although it's not unusual for a version of Windows to be still in widespread use when it moves into extended support, XP is a unique case, said Michael Cherry, an analyst with Directions on Microsoft. "This is the first time I can remember that we have a situation where people will be continuing to buy devices with an operating system no longer in mainstream [support]," said Cherry.

The devices he was referring to are netbooks, the loose category of low-priced, small-sized laptops that accounted for a significant portion of PC sales in the last few months of 2008. About 80% of all netbooks sold in the last quarter shipped with a copy of Windows, Microsoft claimed last month. The bulk of those netbooks shipped with Windows XP; with its bigger footprint and heftier system requirements, Vista can't be squeezed into most low-end laptops.

Last year, Microsoft extended XP's sales lifespan specifically to account for netbooks, pushing the drop-dead date out to mid-2010.

"If you're buying a netbook with XP, you have to accept that XP is not in mainstream support," Cherry added. Not that that should matter much. "XP is well known by this point," Cherry argued. "A significant number of its problems have been identified and resolved, so the chances aren't great that there would be some new major issue."

Microsoft has, in fact, issued a total of three service packs for the aged operating system, the most recent, Windows XP SP3, in May 2008.

"But you also have to look at the reality of the marketplace," Cherry cautioned. "Once XP is not in mainstream support, Microsoft is not going to make any functional enhancements to XP. If there's a functionality bug with no security issue, it probably won't get fixed."

For its part, Microsoft downplayed the impact on users who purchase new systems powered by XP after the operating system leaves extended support. "For any copy of Windows XP that you buy pre-installed, the OEM will provide the support," the company's spokeswoman said. "This support is not tied to the Microsoft Support Lifecycle policy, but rather to the OEM's support policy. So, if a consumer purchases a netbook today with Windows XP Home pre-installed, their primary support would be through the OEM."


Compiz community shakeup could bring big improvements

By Ryan Paul

Compiz community shakeup could bring big improvements

The development community behind the open source Compiz window manager is undergoing a major reorganization effort that will converge disparate branches of the project and help it overcome its recent lack of direction.

Compiz is responsible for bringing rich visual effects such as cube rotation, transparency, shadows, and wobbly windows to the Linux desktop. It includes a powerful compositing engine that leverages hardware-accelerated 3D graphics and the latest features of Xorg. It is shipped with several popular Linux distributions and is extremely popular among Linux enthusiasts.

The project originally began at Novell as the work of David Reveman, but it was forked by an independent community of developers who were dissatisfied with the lack of inclusiveness and the manner in which it was being managed by Novell. The fork, which was called Beryl, was eventually reunited with upstream Compiz when it merged with Compiz Extras and became Compiz Fusion.

Reveman's declining participation in the effort left Compiz without any leadership or direction. Several active contributors began making massive architectural changes to Compiz in their own branches outside of the project. This fragmentation weakened the project and created a lot of uncertainty about how it would move forward. In response to these issues, developer Kristian Lyngstol posted a message on the Compiz mailing list in December urging the community to come together and build a consensus around a new direction, consolidate its development tools, and improve technical documentation so that Compiz would be more accessible to new contributors.

Compiz developers participated in several conference calls in order to address the issues raised by Lyngstol. They made several important decisions which were announced in a mailing list post. The project will be led collectively by members of its new community council, which consists of five key members of the Compiz community. They have laid out a roadmap that will allow the project to move forward.

The somewhat arbitrary distinction between Compiz and Compiz Fusion will be dropped entirely. The project will simply be called Compiz, and all of the development infrastructure—such as bug-tracking systems and mailing lists—will be converged. The developers also plan to begin preparing for an official 0.8 release, which will include the latest stable components.

Following the stable release, they intend to adopt the major architectural changes that have been prototyped by Dennis Kasprzyk in his Compiz++ branch. These changes include migrating the entire code base to C++ with an object-oriented design. His work will also provide support for tiled textures, improved ABI stability, support for pluggable rendering backends, and reparented window decorations. A 0.9 release is planned after Compiz++ has been fully integrated.

According to the new roadmap, Novell's Nomad enhancements could also potentially be merged into Compiz after the 0.9 release. Nomad is a project that aims to bring improved remote desktop capabilities to the Linux platform, including support for client-side compositing. It involves some significant modifications to Compiz that aren't yet fully mature. There are also plans to conduct a complete code review and improve documentation throughout Compiz.

The new plan is ambitious but very promising. Compiz is becoming an important part of the Linux desktop stack, and it has the potential to bring a lot of very rich aesthetic and usability improvements to the platform. The lack of momentum behind the project in recent months has been troubling, and the renewed efforts to break out of the current stagnation are a good sign that the developers are still committed to making Compiz shine.


Top 10 Responses to Why Should I Use Linux? - A Linux Evangelists’ Reference


If you’re a Linux enthusiast like me, you’ve probably tried to convert a few people over to Linux from another operating system. Even though you succeed many times, there are always a few ‘geniuses’ out there who need some real persuading to switch over to Linux.

So here are some quick and simple things about Linux you can point out to your potential convert.

1. Linux helps you get rid of viruses, worms, and other computer infections.

Although it is possible to get infected even on Linux (malware is mainly written for Windows), its multi-user architecture and strict separation of user and system privileges make it difficult for a virus to do any real damage. See this post.

2. Linux is fast and will stay fast

Vista is a huge resource hog. XP isn’t too humble either. If you use Windows for a long time and install lots of different programs, your computer becomes bloated with junk data, consequently becoming slower. With Linux, even a major distro won’t demand more than 256 MB of RAM to run passably with all the bling, and a maximum of 2 GB of disk space. And it won’t get slower.

3. Linux is easier to use than Windows. Using the Terminal is not necessary in most cases.

What could be easier than opening a package manager, selecting the software you want, and letting Linux download it for you? I hate it when people complain about how Linux’s hardware support bites. They probably aren’t aware of the fact that Linux supports more devices than Vista. I should know; I had my share of problems with three different versions of Windows and lots of pieces of hardware. Ndiswrapper-gui is fool-proof and you know it! You can’t expect a new user to know how to solve every issue.

4. Linux is free as in free beer AND as in free speech. Cost does NOT define the value of Free Software!

All of the major Linux distributions are free of cost. Yes, that’s zero dollars. Ain’t it great to have a whole operating system and not be afraid you’ll get fined? You can also choose from a multitude of great open source apps released under the GPL license.

5. Your porn collection is safe with Linux

Don’t laugh. I’m not a big fan of porn, but it has a big market share on the Internet. On Linux (which is free of trojans, viruses, and the like), every user gets a password, and you can see the source code, so you know the FBI isn’t checking you out, like on Windows.

6. Thousands of Windows-only applications (even Photoshop CS3) and games work with Linux through WINE just as fast. You can also find good alternative open source software.

The Linux experience may be a bit different for a user who needs some Windows-only software for which there is no good equivalent on Linux. There’s always WINE, though. It’s a compatibility layer which lets you install and run Windows programs on your Linux box, even games like Call of Duty, Unreal Tournament, Half-Life 2…

Search open source alternatives

Search the Wine application DB

7. Linux looks better than Windows Vista or Mac OS X

It’s as simple as this: Linux has Compiz Fusion and, normally, a very nice default theme. You can imitate ANY effect other operating systems boast. If a Mac user still thinks his Mac OS X or Vista looks better, don’t worry. You can make Linux look exactly like it in under 15 minutes.

8. Linux is fun

People always tell me this. And why? Linux users have the power to expand their system for free, they are free to modify the system exactly to their needs. You learn something new this way every day with Linux and if you do things right it can reward you with the desired results.

9. You can help improve Linux

There are lots of jokes made about ‘teh 1337 powerz of communityz’!!! Sadly, they exist mostly because the jokers don’t know any good arguments against community. In open source, you can actually speak to the actual developers, and help translate, develop, and improve the software you are using right now, without making wads of cash for someone else.

10. Linux makes you give your computer a name

It’s possible on every OS, basically, but Linux demands you do it! Sheer coolness.
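For what it's worth, reading that name back from a program is a one-liner in most languages. A minimal Python sketch:

```python
import socket

# Any program can read the machine name through the standard sockets
# API; on Linux this is the name you picked at install time (stored in
# /etc/hostname on most distributions).
name = socket.gethostname()
print("This box is called", name)
```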

Additional links:

Get Linux

Things you can do with Linux and can’t do on Windows

Posted by Greg


Top 5 Linux Games for 2009

As we go about “realizing” our New Year’s resolutions were maybe just a bit too stringent, I’m going to review the top five games on Linux. Once the great downfall of the platform, gaming can now only be considered a strength. In the hopes that you take up this guilty pleasure and wait until 2010 before you give up on gaming, may I present the premier Linux gaming software, with the best from each genre.

Tremulous (Shooter)
This FPS (first-person shooter) game is portable on all main operating systems (Linux, Windows, Mac) and is built using the Quake engine.
Players choose between two races: aliens and humans. Both have their own unique strengths and weaknesses, and both are opposing teams on the same map. Whilst uncommon for an FPS, Tremulous allows you to build working structures that serve many functions, the most important being “respawning,” whereby if a player is killed, he reappears at a respawn site. Kills for your race earn you credits. For humans, this means better weapons or upgrades; for aliens, kills enable them to evolve into more powerful beings, the most powerful being the “Tyrant”. The objective of the game is not only to kill all players of the opposing team (i.e. race) but also to destroy their “respawning” site(s), so that they can’t reappear. With an average of 400 users online at a time, there won’t be a moment left in the day to regret the amount of time you spent playing.

According to SourceForge statistics, the game has been downloaded over 1,000,000 times as of October 16, 2008. It was also voted Player’s Choice Standalone Game Of The Year in the Mod Of The Year 2006 competition.
Like all great software, it’s open source and freely available for download.

Sauerbraten (Shooter)
Meaning “roast meat” in German, this C++-written FPS runs on the main operating systems (Windows, Mac, and Linux) and is built on the Cube 2 rendering engine, for those of you who aren’t keen followers of the Quake movement (as with Tremulous). The main distinction between Tremulous and Sauerbraten is the ability to edit the geometry of the map in-game. Coupled with an emphasis on six-directional gameplay, this dynamic is going to keep you hooked. It supports both singleplayer and multiplayer modes, and the latter offers three possible gameplay types: Deathmatch, Last Man Standing, and Capture (whereby teams fight over certain areas of the map). For the singleplayer mode, there is plenty to keep you busy, unlike in Tremulous. You have the option to play scenarios split into episodes, Deathmatches with bots ganging up on you, and the game even goes so far as to provide levels where you can fight in slow-motion.

MacWorld UK gave it four out of five stars, whereas Games For Windows: The Official Magazine mentioned it in Issue 3 as “perfect for both stingy and creative gamers alike.” But now for an organization whose opinion matters… Phoronix, a purely Linux-oriented hardware and software review site, gave it a positive rating due to “several enhancements to its underlying Cube 2 engine”.

Like all great software, it’s open source and freely available for download.

Warzone 2100 (Strategy)
If you liked StarCraft, you’ll love this. The “3-D cross-platform real-time strategy” denotation doesn’t do justice to this once-proprietary program. The game is highly customizable, allowing everything from a wide array of camera angles to the ability to customize the drive systems (e.g. wheels/tracks) of your units. Warzone 2100 follows an episodic gameplay structure, a sequence of scenarios in which you have a time limit to complete the stated objectives, using construction, upgrading, and recruitment to build up the manpower required for the task. The latest stable version was released January 12, 2009.

Warzone 2100, once developed for the PlayStation (rating of 76%) and Windows, is now praised by the likes of IGN and Gamespot, which had the following to say about the game:
“Warzone 2100’s highly navigatable 3D engine, unique campaign structure, and multiplayer gameplay should please most real-time strategy fans”.

Like all great software, it’s open source and freely available for download.

Glest (Strategy)
This Spanish game, developed using the Glest Advanced Engine, is basically a cross between Tremulous and Warzone 2100. It imitates the 3-D, real-time strategy idea of Warzone 2100, but with a medieval theme. It mimics Tremulous in that there are two opposing factions, Magic and Tech, both with their own strengths and weaknesses, fighting each other on the same map. The Tech team is composed of conventional warriors with medieval weapons at their disposal, with their own unique set of units, buildings, and upgrades. The Magic team is targeted at more experienced players, as most of its army is “morphed” or “summoned.” Whilst lacking close-combat skill, it makes up for it in brute power and versatility. For those of you who loved StarCraft on Windows: this is the game for you.

Like all great software, it’s open source and freely available for download.

SuperTux (Retro)
For those among us who miss hearing the upbeat music of Level 1 of Super Mario, may I present SuperTux. It’s the classic side-scrolling adventure game we all played in our childhood, only now, instead of Mario, you have “Tux”, the penguin mascot of Linux. With “Penny” captured by bad guys, it’s up to Tux to rescue her.

Receiving the Game Of The Month award from HappyPenguin.org when it first came out, SuperTux went on to celebrate eight version updates, and the SuperTux Development Team is eagerly working to bring you SuperTux 2. The beta release of SuperTux really brings back memories of Super Mario, with multiple “Worlds”, a variety of monsters, and a complementary, childlike plot.

I hope these recommendations introduce more users to the variety of games on the Linux platform. While much remains to be accomplished, we can at least revel in the progress made up to 2009, and look forward to what this year will bring.

By Mihai Marcas


Is it Windows 7 or KDE 4?

Is it Windows 7 or KDE 4? In this video, we take to Sydney's streets to find out what people think of what they think is a Windows 7 demonstration.


The net result? Mainly, people just didn't like Vista.

What do you think? Do Windows 7 and KDE 4 just look way too alike to tell the difference? Which one looks better? Post your comments below.


Yahoo Might Buy Tumblr, New York's Cutest Startup

By Owen Thomas

We hear Yahoo is in talks to buy Tumblr, a blogging startup run by 22-year-old David Karp for "low-to-mid eight figures" — which would translate to a small fortune for the New York entrepreneur.

And a quick one, too, without the troubles of figuring out how to make money off of Internet hipsters' self-indulgent ramblings. Karp has toyed with charging users for extra features, but it's not clear that adding fees would draw much revenue. Nevertheless, Tumblr was able to raise $4.5 million in December, an investment which reportedly valued the company at $15 million.

An incredible amount for such a young startup with such fuzzy hopes of making money. But it's a bargain compared to Twitter, a startup similarly unburdened by the depressing reality of actual revenues. Which is why Yahoo might, just might, be willing to part with as much as $50 million for it. (In a sad recognition of how late Yahoo is to the whole Twitter phenomenon, its PR department set up a Twitter account today.)

We hear the talks are serious, led by Tapan Bhat, a fast-rising executive in charge of Yahoo's homepage and other key properties — but as with any acquisition talks, they could fall apart. Fred Wilson, a partner at Tumblr investor Union Square Ventures, and a Yahoo spokeswoman did not respond to inquiries about the talks. In a text message, Karp, confirming his reputation for adorably juvenile sarcasm, wrote, "You got it backwards."

What could kill the deal: Already, Yahoos are grumbling at the idea of spending tens of millions of dollars on a revenue-free startup. The company's spending spree on Web 2.0 startups like Flickr has yielded few visible financial results. Some grumble that this has more to do with Yahoo's mismanagement of the acquisitions, but the point is the same: why should Yahoo spend more on startups, having failed to profit from the ones it already bought?

And there's also new CEO Carol Bartz, who is waging a pointless jihad on leakers. She may be angry enough about word of the talks escaping Sunnyvale to kill the deal for that reason alone.

Update: Awww, Karp is adorably denying the rumor of Yahoo's interest in his company! Then again, he also claimed Tumblr was buying Yahoo, so who knows what to make of anything that comes out of his so-cute-you-could-pinch-'em cheeks? His lead programmer, Marco Arment, is also perkily insinuating that he would quit if Yahoo bought the company:

I hope they let me work on some of the many exciting projects at Yahoo! Who needs a high rank at a small company in New York? I want to move to California and get stuck in traffic every day on the way to my midlevel engineering job where I sit in a cubicle all day and can't make any product decisions while working on something nobody will ever see to manage regional ad clickthrough stats tracking.


Into the cloud: a conversation with Russ Daniels, Part II

By Jon Stokes

Into the cloud: a conversation with Russ Daniels, Part II

If you asked ten random techies to define "cloud computing," you might get twelve or thirteen different answers, but if instead you asked those same ten folks to identify the most overused buzzword of the last year, they'd probably all agree that "cloud computing" was it. Truly, "the cloud" is aptly named, because everyone who stares at the concept sees something a little different. So imagine my surprise when, on attending a session at this past summer's AlwaysOn conference, I heard someone on the stage talk intelligently, coherently, and technically about a topic that I had written off as so much noise.

That person was HP's Russ Daniels, CTO and VP of Cloud Services Strategy, and by the time his panel was done, I knew that I had to talk to him in more detail about cloud computing. I managed to land an interview with him a few weeks ago, and it was so good that I've reproduced it (in slightly edited form) in this article. The interview is extremely substantial, so if you're at all interested in the cloud—especially if you're a skeptic—I urge you to dive in.

This interview (see Part I for the first half) actually altered the way I thought about the cloud and about software delivery in a networked world. I hope that you find it as illuminating as I did.

Mobile devices as "sensors" for the cloud

RD: Let me give you another example that describes the expressiveness of the cloud and the role that devices play. We tend to think of devices too narrowly. I do a fair amount of business travel, and every now and then I'm lucky enough to be on a plane where I have a screen and I can watch a movie. But, a common occurrence is that the flight crew comes on the PA and announces that we're landing, so they shut down the entertainment system with ten or fifteen minutes left in the movie. Consequently, I have a surprising number of movies that I've seen most, but not all of.

Think about that problem, and then imagine that you go into your hotel room, turn on your entertainment system, and it asks if you'd like to continue the movie that was interrupted on your flight. To do that, it's just a matter of propagating a small amount of state—the airline knows who was in the seat, what channel was being watched on the entertainment system, and what frame the movie was interrupted on. That little bit of state can be propagated up to a profile that's associated with me, the passenger.

When I check into my hotel, I can provide access to that profile for the aspects of the profile that I think are relevant to the hotel, and that provides them with the opportunity to offer me that surprise of being able to finish watching the movie.
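The handoff Daniels describes amounts to propagating a small record of state up to a profile, then selectively sharing it. A minimal sketch of that flow follows; the `Profile` class and every field name are hypothetical, purely for illustration, and no real airline or hotel system is implied.

```python
class Profile:
    """A passenger profile that accumulates small pieces of state."""

    def __init__(self, passenger_id):
        self.passenger_id = passenger_id
        self.state = {}

    def propagate(self, key, value):
        # The in-flight system pushes a tiny record up to the profile.
        self.state[key] = value

    def share(self, *keys):
        # The passenger exposes only the aspects relevant to the hotel.
        return {k: self.state[k] for k in keys if k in self.state}


profile = Profile(passenger_id="passenger-123")

# The airline knows the seat, the channel, and the frame at which
# the movie was interrupted -- that is all the state that's needed.
profile.propagate("interrupted_movie", {
    "title": "Some Movie",
    "channel": 7,
    "frame": 123456,
})

# At check-in, the hotel's entertainment system is granted access to
# just this slice of the profile and can offer to resume playback.
resume = profile.share("interrupted_movie")["interrupted_movie"]
print(f"Resume '{resume['title']}' at frame {resume['frame']}")
```

The point of `share` is that the profile owner, not the provider, decides which aspects are exposed to which party.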

I didn't own the device in the airplane; I don't own the device in the hotel. By expanding our thinking about what Internet-capable devices can be, beyond notebooks and phones, we can include anything that has the ability to be technology-enabled. These cloud-enabled devices can play a role in understanding what you're doing, offering you assistance, and improving the experience you have doing it.

Devices increasingly become important not only as user experiences, but also as sensors. One of the great things about a cell phone is that it has the ability to generate event streams relevant to what I'm doing—the hotspots that I go by, all of that kind of stuff. When you accumulate those streams, you can then do analytics to start to identify my defaults, my preferences. You can notice patterns of behavior that suggest when I do one thing, there's a pretty high probability that I'm going to do this other thing.
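The pattern-finding Daniels mentions—"when I do one thing, there's a pretty high probability that I'm going to do this other thing"—can be illustrated with a toy follow-probability calculation over an event stream. The events and numbers below are invented for illustration only.

```python
from collections import Counter

def follow_probabilities(events):
    """Estimate P(next == b | current == a) for each observed pair."""
    pair_counts = Counter(zip(events, events[1:]))
    first_counts = Counter(events[:-1])
    return {
        (a, b): n / first_counts[a]
        for (a, b), n in pair_counts.items()
    }

# A made-up stream of sensed events from a phone.
stream = ["leave_office", "coffee_shop", "train",
          "leave_office", "coffee_shop", "train",
          "leave_office", "train"]

probs = follow_probabilities(stream)
# After "leave_office", "coffee_shop" follows 2 out of 3 times.
print(probs[("leave_office", "coffee_shop")])
```

Accumulate enough of these streams and the high-probability pairs become the "defaults" and "preferences" he's describing.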

All this means that technologies can start to identify your intentions, rather than you having to map between what you want to do and how technology can help you—not to mention the time you spend coaxing the technology to go along.

Ultimately, the cloud creates a fundamental opportunity to approach user experience in a much different way.

Selling the cloud vs. selling clouds

JS: I want to go back to the four components of cloud computing that you mentioned. First, I have more of an observation than a question: this flexibility and granularity that cloud computing gives you in remapping resources seems to be related to #4 in your list—"the variable cost model related to delivery"—in the sense that they're almost the same thing. You can vary the cost of delivery precisely because you have this flexibility to do a fine-grained pairing of a set of resources with demand.

So my question is, you guys want to rent this out as a business, but you also sell the hardware that people can use to do this for themselves. So on the one hand, you're renting out this capacity... In your announcement for your HP Adaptive Infrastructure as a Service product, you said: "With HP AIaaS, customers can realize improved service levels and convert traditional capital investment into an ongoing operating expense because all assets are owned and managed by HP."

But you guys are selling that capital that enterprise customers would've invested in, so do you guys see yourselves moving in this services direction? Is there a tension with enterprise customers between leasing the service vs. running it in-house, or are you trying to sell both?

RD: The IT function in an enterprise has the responsibility for sourcing and delivering services that a business needs. Gartner describes this as an "enterprise-class problem," which means that IT only has to solve this for its own enterprise, not for everybody. But if you think about a service provider, they're providing services to many customers; Gartner calls that a "global-class problem." The problems are different, but that doesn't mean that they're simpler on one side or the other.

The IT function has to deliver what the business needs, so they can't simply say, "No, that can't be done, we're not going to do it. Do without." The service provider can say "no." They can say, "Here's what we do—if you want something else, then you can go to a different service provider." So that's different, and the cost structures are different too.

The enterprise IT function tends to have to deal with more complex environments and legacy systems, but again, they only have to solve the problem for one customer. Think about something like an infrastructure utility—how do I take advantage of virtualization and automation to deliver IT capacity at a lower cost? If you're the IT function, you want to build an infrastructure utility that you can use for your own purposes. We've been helping our customers with that for years.

HP has a lot of intellectual property in our software, enterprise server and storage businesses. We have servers, storage, and networking technologies that have been designed for this style of delivery. We have consulting services to help our customers with that. So, there's a lot of work that we do to help our customers build an infrastructure utility today. The direction that we're driving in that space is to make it more and more of a turnkey solution because, while you can do it today, there's some assembly required. We want to make it easier for customers to have their own infrastructure utility.

A service provider, similarly, wants to build an infrastructure utility, but they want to deliver infrastructure as a service to IT customers, and they need to support multiple customers. Consequently, they have different needs. Service providers have to deal with issues of multi-tenancy: how do they make the tradeoffs between isolating workloads and the cost savings and integration they can get across their full infrastructure? The key thing to keep in mind is that if the IT function builds its own infrastructure utility, it has fixed costs: it owns the assets, it owns the data center, it has to do the power and cooling, it has to do the operations, and those are all fixed costs. It might be able to reflect those costs variably to the business, but for the IT function they're fixed.

Similarly, the service provider can offer variable costs to their customers while, from their internal perspective, costs are fixed—they own the gear, they own the data centers, etc. Whenever you think about these things, you have to make the distinction between whether you're playing the provider role or the consumer role, because the dynamic changes in each case.

RD: HP does offer various infrastructure-as-a-service offerings today. These are focused on the enterprise, so we spend more time addressing key requirements that enterprises have around security and giving them more control of the deployed architecture, and that all comes at a cost: it's a little less automated than it could be, and customers pay more than they would for an hour of a virtualized Linux container.

The types of infrastructural services that can be delivered vary based on how much control you have, what the basis for the billing is, the type of relationship you have and whether you can customize it.

Ultimately, we believe that long-term customers will operate in a hybrid environment. There will be a small number of customers that will have their entire infrastructure delivered as a service, but the vast majority of our customers will continue using data centers, servers, storage, and networks. These customers have essential operations running in those environments that are critical to them, and they need to be able to deliver it to the business.

So this idea that the end of IT is just around the corner—we think it's unlikely.

What's also true is that we have a lot of business selling to the service providers; we have specialized hardware products, servers, storage, and networking technologies designed to address the particular needs of a service provider who's trying to operate in a global-class environment. We also have more innovations in the pipeline.

Again, a lot of things that we do in our research group are focused on improving the results that each of these potential customers realizes. This research determines how we help service providers meet the much larger scale they have to deal with, and how we provide the higher levels of automation they need. It also helps us approach issues of multi-tenancy and how providers can offer the right kind of security isolation. For example, we have a project called Cells as a Service, focused on creating secure isolation in a virtualized space that spans all the way from the client to the back-room systems.

We also have lots of research taking place in areas such as exascale computing, where we want to build systems that scale much larger than what's delivered today: lower power utilization, faster interconnects, photonics, etc.

Another thing to point out is that HP offers Internet services today. We have sites such as Snapfish, for photo sharing, Upline, a consumer backup cloud service, and other Internet-facing services that need that same global-class kind of infrastructure.

JS: Yeah, that answers my question. I was thinking of things in overly simplistic terms, as in, if a company needs X amount of capacity, but they buy X + 5 from you to overprovision, then that difference is more money for you guys. But if you're giving that up by giving them the ability to buy exactly X when they need it, and X + 5 when they have a spike in demand... Then, well, it's in your interest when they overprovision.

RD: We don't really think so. It turns out that one of the worst ways to build a business is to sell people stuff that they don't need, because the repeat business isn't all that great.

The reality is that systems tend to be overprovisioned not because we have business models based on selling extra capacity, but as a deeper reflection of the architectures that have evolved in the traditional IT space. We've created some very complex applications that are very sensitive to the configuration in which they run. Because of the role those applications play in the business, if they aren't running, the business loses money.

Those kinds of issues cause you to be quite willing to have extra headroom, because it's a risk arbitrage.

It isn't that we're making money because we sold them too much. They bought the amount they needed because the risk of it not working was so great that it makes sense for them to have excess capacity.

I will say that when you think about the difference between building infrastructure and running it yourself vs. having someone else run it for you, it's not always a matter of it being cheaper. This idea of having an infrastructure utility or buying infrastructure as a service doesn't always translate into lower cost. That's one of the key things you might be concerned about, but there are others.

It can be things like knowing that you have a way to handle periodic workloads that meets the business demand. If you think about the price you pay per computation, it might be higher than if you had bought that capacity and used it regularly as a steady state; it's just that that's not what you need. So you're effectively willing to pay a premium for using it for a shorter period of time.

It's like renting a car. I commute back and forth from San Francisco on a train, and I used to keep a car here so that I could drive to Cupertino when I needed to, but it's really quite expensive, so I started using ZipCar. Now if I need to go to Cupertino, I can arrange to rent a car for three hours and pay $8 an hour, and if you compare that to what it would cost me to have a car full-time, it's cheap.
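The car analogy reduces to simple break-even arithmetic. Only the $8-an-hour, three-hour rental comes from the interview; the monthly cost of keeping a car full-time is an assumed placeholder for illustration.

```python
RENTAL_RATE = 8.0   # dollars per hour (from the interview)
TRIP_HOURS = 3      # hours per trip (from the interview)

OWNERSHIP_PER_MONTH = 600.0  # assumed: parking, insurance, depreciation

def rental_cost(trips_per_month):
    # Variable cost: you pay only for the hours you actually use.
    return trips_per_month * TRIP_HOURS * RENTAL_RATE

# Under these assumed numbers, renting stays cheaper until about
# 25 trips a month, when the variable cost catches the fixed cost.
for trips in (2, 10, 25):
    print(trips, rental_cost(trips))
```

The per-hour price is higher than the amortized cost of heavy steady-state use, which is exactly the premium-for-flexibility tradeoff described above.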

Ultimately, you need to understand the relationship between the consumer and the provider as one of trying to balance risk. There are different ways that the provider can charge you for what you use, and the differences end up balancing who takes what kinds of risks. If I charge you for the hours that you have a system accessible to you, then I'm not taking any responsibility for whether or not those systems are being utilized—it's just that you had access for six hours, so you pay me six hours times the fee.

This is a different model than if I charge you based on the number of compute cycles you consumed (or how much storage, or how much bandwidth), which is a pure utilization model. When I think about how to price this, certain kinds of pricing models mean that I have to start bundling more costs with those things than just the direct cost, in order to decrease the risk that I have with other parts of my business. So it's just balance back and forth.
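The two charging models just contrasted can be sketched side by side. The rates and units below are invented for illustration; the point is only who bears the utilization risk under each model.

```python
def access_billing(hours_accessible, rate_per_hour):
    # Access model: the provider takes no utilization risk. Six
    # hours of access costs six hours times the fee, used or not.
    return hours_accessible * rate_per_hour

def utilization_billing(units_consumed, rate_per_unit):
    # Pure utilization model: the provider bears the risk of idle
    # capacity, so the per-unit rate typically bundles more cost.
    return units_consumed * rate_per_unit

# The same workload under both models: six hours of access, but
# only the equivalent of two hours of capacity actually consumed.
print(access_billing(6, 10.0))        # 60.0
print(utilization_billing(96, 0.25))  # 24.0
```

Under access billing the consumer pays for idle time; under utilization billing the provider compensates for that risk by charging more per unit.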

If you buy a perpetual software license, you're taking the risk that that license will pay off to you over time. If you pay a subscription fee, you don't take that risk, but you might pay more per seat (per month per user) as a result, because you've shifted the risk to the provider—you can leave at any time, so the provider has to think about how they manage the cost that they have associated with a license.

JS: And in today's environment, you have to think about what happens if the provider goes under and you all of a sudden don't have access to the services, which is a discussion that lots of people are having right now.

RD: It's a huge one, and I spend a lot of time talking to enterprises, and the reality is that the IT function needs to source and deliver services appropriately, so they have to think about things like data portability: can I get the information back out if I need to move? If I get it out, is it in a form that I can reuse?

It's the same as when you use a particular application and your data gets stored in a form that's usable by that application: if you want to change applications, your data might have to go through a transformation, which might be expensive. If you're using an external service provider, those same kinds of concerns are there.

That's why, if you're going to be a service provider to the enterprise, you have to think of many classes of services that might make sense to meet the varying needs of the market you intend to serve.
