Saturday, December 20, 2008

7 Things Google Chrome Needs - Now That It’s Out of Beta

I want to start out by saying, I LOVE Google Chrome. I recently moved away from Flock and made Chrome my primary browser. Going out and saying it’s the “best browser I’ve ever used” would be a bit much, but it’s definitely in the top tier. This article isn’t about how great Chrome is, though.

A little while ago, Google announced the removal of the “Beta” tag from their brand new browser. Now, I have heard speculation that this is due to the fact that computer manufacturers won’t bundle software with “Beta” hanging around its neck. Normally Google applications stay in Beta for a very long time (see: Gmail), but after only a few months, Chrome is out in the wild with no reservations.

While this may be more of a marketing stunt than anything else, a “Gold” version of Chrome (that’s ironic!) must now assume the same responsibilities of the other browsers that have taken off their training wheels. Note that in this list I’m excluding most (but not all) of the things Google has explicitly promised to fix. These are in NO particular order.

1. More Extensive Options

To be honest, Chrome has fewer options than most of the freeware I review regularly. Essentially, Google has eliminated all but those options that are absolutely necessary, probably in order to see if any of the common options are actually unnecessary. Here are the areas I’d like to see enhanced or simply restored:

  • Allow users to set preferred media associations. In other words, restore the functionality of the “Applications” tab in Firefox options which allows the user to choose which plug-ins or programs handle certain types of media.
  • Add in options for better tab management (discussed later in this article).
  • Basic history controls (like how long to keep user history) would be nice. All the history controls have been made separate in the form of a “History Tab,” but it’s been oversimplified.

I will note that Chrome made it very easy for me to compare Firefox’s options to its own, because the Chrome options box does not disable use of the main browser window. Definitely a nice touch.

2. Compatibility Mode

For a very long time, Firefox was plagued by websites that would only allow Internet Explorer (IE) usage. While this is still an issue, many secure or proprietary websites have begun to develop their sites so that they work with Firefox too.

Chrome is, in this sense, the new Firefox. There are already a large number of users who are running Chrome as their primary browser (and more will follow if Google begins making deals to bundle it into new PCs), but there is no easy way to work around browser incompatibility.

Firefox users fixed incompatibility by emulating IE with an extension called IE Tab. Essentially, IE Tab ran Internet Explorer inside of Firefox through a feature called “Chrome” (no relation to the subject of today’s article, as far as I’m aware). While this worked in almost every instance, it seems that a superior solution could be built into Chrome (the browser) itself.

Why not give users the option to render the page using either their installed copies of IE or Firefox, in addition to Chrome? For the less experienced users who choose Chrome for its simplicity, this will remove one more headache whenever they try to access arcanely designed websites. This is the kind of thing my mom needs, even though she might not understand why.

3. Please Fix Flash!

Seriously! Flash is one of the most pervasive web components out there. While several browsers seem to be having problems with Flash at the moment, my experience with Chrome has been the worst. Separating the plug-in from the browser’s primary process is quite a blessing sometimes, but it also crashes all the time. When Flash crashes on Chrome it kills all the Flash elements on every page. It can be very aggravating to lose idle Flash games or buffering videos.

There are some proposed fixes out there, but most of them are of the “uninstall, reinstall” kind and none have worked for me so far.

4. Better Tab Management

If you’re at all like me, you have no less than 15 tabs open at any one time. This is because there are just so many gosh-darn cool things on the internet and tabs are a great way to store all the things you want to read later. Unfortunately, keeping tabs open requires memory (especially if they have a lot of Flash or Java elements), so it’s terrible for your performance. Chrome has done an excellent job of minimizing RAM leaks and separating each tab into its own process. If one tab freezes, the others will survive.

My grievance with Chrome’s tabs is the layout. While I love how the tabs ARE the browser’s upper boundary, the interface breaks when too many tabs are open at once. The above image was taken with only 17 tabs open. If as many as 30 are opened simultaneously, the icons disappear altogether and each tab is identical to the others. A simple solution to this would be to allow some of the tabs to fall off the edge of the screen and be accessible through an arrow button or mouse scrolling (as is done with Firefox). This might break Google’s aesthetic, but I’d be happier as a user.

5. Extension Support

Google has promised in the already mentioned blog post that an extension platform is on its way. Awesome. I have just three requests for Google:

  • Call them “Extensions” - not Add-ons, Plug-ins, Widgets, or Tools.
  • Cater to the people who are already making high quality extensions for Firefox - make it easy for them to port their projects over to Chrome.
  • Take extensions to the next level. Whether it’s a great developer pack or better integration with Google Docs, I feel like there is still new ground to plow in this field.

6. Built-in RSS Support

Almost all browsers now offer some kind of auto-detection for a site’s RSS feeds. Flock did the best job in my opinion. For some reason, Chrome has NO features related to RSS.

Image from Vox Daily

What needs to happen is for Chrome to “notice” whenever a site has a general RSS feed and notify the user, either with a distinct icon or a simple option to open that feed. If they wanted to include Google Reader integration, I wouldn’t complain.
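
For what it's worth, feed auto-detection isn't magic: most sites advertise their feeds with a <link rel="alternate"> tag in the page's head, which is what Firefox and Flock already look for. Here is a rough sketch of that detection step in Python — the URL at the bottom is only a placeholder:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

FEED_TYPES = {"application/rss+xml", "application/atom+xml"}

class FeedFinder(HTMLParser):
    """Collects <link rel="alternate"> tags that point to RSS/Atom feeds."""
    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        a = dict(attrs)
        if "alternate" in (a.get("rel") or "") and a.get("type") in FEED_TYPES:
            self.feeds.append((a.get("title") or "feed", a.get("href")))

def discover_feeds(url):
    """Return (title, href) pairs for every feed advertised by the page."""
    html = urlopen(url).read().decode("utf-8", errors="replace")
    finder = FeedFinder()
    finder.feed(html)
    return finder.feeds

if __name__ == "__main__":
    for title, href in discover_feeds("http://example.com/"):  # placeholder URL
        print(title, "->", href)
```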

7. Little Things and Beta Bugginess

Aside from these major topics, there are quite a few “little things” that have been bothering me in the last month or so. Some of these are things I really liked about Flock/Firefox and wish they were also in Chrome. Some are bothersome things that can be chalked up to the fact that this is still basically Beta software, no matter what Google calls it.

  • Switching from tab to tab has been slow for me. Often the tab I switch to is blank for quite some time before the page loads (this is after the page has already loaded previously).
  • There is no “View image properties” option in the right-click contextual menu. There is an “Inspect element” option, but it doesn’t really serve the same purpose.
  • An extension called CoLT allows you to choose between copying the text of a link, the location (or URL) of a link, or both in a specific format. This is perfect for inserting links into blogs, microblogs, and emails. I’d love to see this added natively to Chrome (see the small sketch after this list).
  • I’d love to see spullsheck spellcheck built-in.
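
To make the CoLT idea above concrete, here's a tiny, hypothetical sketch of what "copy a link in a specific format" boils down to — the style names and templates are invented for illustration:

```python
def format_link(text, url, style="markdown"):
    """Return a copy-ready string for a link, in the requested style.

    The styles here are illustrative; a real extension would let the user
    define their own templates.
    """
    templates = {
        "text": "{text}",
        "url": "{url}",
        "text+url": "{text} - {url}",
        "html": '<a href="{url}">{text}</a>',
        "markdown": "[{text}]({url})",
    }
    return templates[style].format(text=text, url=url)

# Example: what would land on the clipboard for a blog post
print(format_link("Chrome leaves beta", "http://example.com/chrome", "html"))
```
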
There are other little bugs that are slowly being worked out of Chrome, but it will clearly take some time for it to work as well and as consistently as Firefox. Even so, it’s impressive to see just how far Chrome has come in such a short time. It will be interesting to see just how many of these issues Google decides to fix and what new concepts they will come up with.

Got Chrome gripes of your own? Post your own issues in the comments.

Original here

Source Code For Twitter-Like App, Trillr, Now Available

by Lisa Hoover

Developers of Trillr, a microblogging project similar to Twitter, announced this week that its source code is now available to anyone who wants it. The idea for Trillr was conceived in 2007 as a peer group experiment among team members who wanted to learn more about Python and Django, and was created as an enterprise tool with enhanced features like group discussion and a user directory.
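
I haven't read Trillr's code, but to give a feel for why Django was attractive for a learning project like this, here is a purely illustrative sketch of the kind of models a Twitter-style status app is built on — not Trillr's actual schema:

```python
# Hypothetical Django models for a Twitter-style status app -- not Trillr's
# real schema, just an illustration of how little code the core needs.
from django.contrib.auth.models import User
from django.db import models

class Status(models.Model):
    author = models.ForeignKey(User, related_name="statuses",
                               on_delete=models.CASCADE)
    text = models.CharField(max_length=140)
    created = models.DateTimeField(auto_now_add=True)

    class Meta:
        ordering = ["-created"]  # newest first, like a timeline

    def __str__(self):
        return f"{self.author.username}: {self.text[:40]}"

class Follow(models.Model):
    """Who follows whom; the basis for building each user's timeline."""
    follower = models.ForeignKey(User, related_name="following",
                                 on_delete=models.CASCADE)
    followed = models.ForeignKey(User, related_name="followers",
                                 on_delete=models.CASCADE)
```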

Trillr project member Stefan Aust admits the code base is "kind of crappy" as it stands now, but that's to be expected since it was part of a learning process. He says that, looking back, he would have done some things differently but notes, "perfect source code does not create communities. Our source code can."

Trillr is a standalone Web app that uses JavaScript, but can also run without it. With some tweaking, it can be set up to run in about a half an hour, and is already capable of handling 100 users or more. Aust also says to be on the lookout for a similar contribution in the future -- a "more ambitious project called Trillr1" which has an API that's compatible with Twitter.

Trillr was released under the BSD license, and Aust says now that the project has ended " this is our gift to the world in general and our operations department in particular which gratefully volunteered to continue to operate CoreMedia's Trillr instance. You'll keep the flame, guys."

Original here

Gates Foundation to help libraries be better free 'net cafes

By David Chartier

Public libraries haven't been just about books for some time now, but they are finding it increasingly difficult to keep up with the costs of infrastructure, faster Internet access, and new computers. To help struggling libraries get on their 21st century feet, the Bill & Melinda Gates Foundation today announced a grant program of $6.9 million that will go toward launching a pilot broadband initiative in a handful of US states.

The seven states included in the Gates Foundation's pilot grant program include Arkansas, California, Kansas, Massachusetts, New York, Texas, and Virginia, and the money has been awarded to two separate organizations. $6.1 million goes to Connected Nation, a non-profit broadband Internet advocacy group that will help these states to gather and activate various public library leaders and officials who can support broadband Internet in each state's libraries.

The rest of the funds, a hair over $850,000, will go to the American Library Association's Office for Information Technology Policy (OITP), which will help state library agencies implement sustainable broadband strategies. The organization will also perform and distribute a series of case studies that demonstrate how other public libraries can successfully manage broadband services for their patrons.

A bull market in library usage

The Gates Foundation's grant comes at a crucial time when libraries across the US are reporting spikes in patron traffic due to the economic crisis. Students, the unemployed, and those without home Internet access are increasingly making use of the fact that local libraries double as free Internet cafes. In fact, a recent 2007-2008 study by the American Library Association (ALA) shows that 73 percent of public libraries are the only source of free, public Internet access in their respective communities. Despite this demand, however, only 38.9 percent of all libraries have a T1 (1.5Mbps) connection, and among those, 51.6 percent are urban libraries, 32.1 percent are rural.

These libraries are feeling the squeeze, too. Over 57 percent of libraries (up from 52 percent in the ALA's 2006-2007 study) report that their connectivity is too slow some or all of the time, and over 82 percent report that they don't have enough workstations some or all of the time. Because of these and other constraints, over 90 percent of libraries impose time limits on public Internet workstations, with 45.7 percent using a 60-minute limit, and 35.2 percent cutting users off at just 30 minutes; hardly enough time to finish registering at Monster.com or complete that web-based art history exam.

The Gates Foundation picked the seven states for this pilot program based on a variety of factors such as their high concentrations of public libraries with Internet speeds below 1.5Mbps and public policy support to improve public library broadband access. The foundation has already invested $325 million in grants and other support for computers and staff training in libraries across all 50 US states. If this pilot grant program goes well with these first seven candidates, the Gates Foundation may expand its support to a limited number of other states.

Original here

Job posting reveals Zune-Xbox integration

Posted by Matt Rosoff

Zune speculation is an armchair sport here in the tech sector of the Pacific Northwest (especially when we're all housebound because of a few inches of snow), and today Todd Bishop at TechFlash posted some interesting excerpts from the Zune team's job listings. Based on his post, it looks like the Zune Marketplace will begin to use the back-end from Musiwave, the European provider of music for mobile phones that Microsoft acquired a little more than a year ago--and if that doesn't point to a Zune service for mobile phones, nothing does--and will continue to feature DRM (couched in ever-so-reassuring phrases like "to let consumers enjoy music in new and interesting ways").

Next peripheral: great headphones?

(Credit: CNET Networks)

But here's something else: Zune is coming to Xbox. There's a job listing for a user experience designer to work on the Zune Device UI, Zune PC Client and--hang on a second--Zune Xbox.

Zune Xbox? Of course. Both Xbox Live and the Zune Marketplace use Microsoft's own pseudo-currency, Microsoft Points. Xbox Live already lets you download video content and stream movies-on-demand from Netflix (if you're a subscriber). You can plug any MP3 player into the Xbox 360 and listen to a mix of your music as you play. So why not take all these pieces to their logical conclusion and let you access the Zune Marketplace from Xbox Live? Integration would be particularly useful for Zune Pass subscribers, giving them another device on which to use their unlimited monthly listens and 10 permanent downloads.

Matt Rosoff is an analyst with Directions on Microsoft, where he covers Microsoft's consumer products and corporate news. He's written about the technology industry since 1995, and reviewed the first Rio MP3 player for CNET.com in 1998. He is a member of the CNET Blog Network. Disclosure.

Original here

Installing Linux applications on OS X with Fink

By Pavs

Even though OS X is a Unix-based operating system, you cannot just take a Unix/Linux application and run it on OS X; it has to be ported to OS X before you can use it. Since most of these applications are open source, this can be achieved fairly easily. Fink is a project that recompiles existing open source software for OS X and makes it available through a package manager, also called “Fink”, which is similar to the Synaptic Package Manager for Ubuntu. Fink has a huge collection of some 2,500+ packages.

Configuring and installing packages with Fink:

  • Download and install Fink.
  • Copy FinkCommander.app from FinkCommander folder to Application folder.

  • Now you will have to update sources and binaries. Update sources by going to Source –> Scanpackages, then index the source list with Source –> Utilities –> Index. Update binaries by going to Binary –> Update descriptions.
  • Finally, to install an application, select the one you want and go to Binary –> Install.
Fink works by downloading the original source, then patching, configuring and building the packages automatically; its dependency system makes sure that the required libraries are present during the installation.
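
If you'd rather skip the FinkCommander GUI, Fink also ships a command-line tool: fink selfupdate refreshes the package descriptions and fink install <package> builds and installs one. Below is a small, hypothetical Python helper that drives those commands for a list of packages, assuming fink is already on your PATH:

```python
#!/usr/bin/env python
# Hypothetical helper: refresh Fink's package descriptions, then build and
# install each requested package via the "fink" command-line tool.
import subprocess
import sys

def fink(*args):
    """Run one fink subcommand and stop on the first failure."""
    cmd = ["fink"] + list(args)
    print("+", " ".join(cmd))
    subprocess.run(cmd, check=True)

def install(packages):
    fink("selfupdate")           # update Fink and its package descriptions
    for name in packages:
        fink("install", name)    # download, patch, configure, build, install

if __name__ == "__main__":
    # e.g.: python fink_install.py gimp wget
    install(sys.argv[1:] or ["wget"])
```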

Original here

Will the Recession Kill Web 2.0?


The last few years have seen a resurgence in Internet companies not seen since the bubble years of the late 90s. The growth of these advertising-supported "Web 2.0" companies has propelled online advertising sales to $21 billion from $6 billion between 2002 and 2007. But the last recession pricked the bubble in 2001. What will happen to this crop of Internet companies?

The broader economic recession has not spared the Internet sector. RBC Capital projects online display advertising to be flat to down in 2009. It is hard enough for internet startups at the best of times. Which companies will come out of this recession the best?

About the Author

Jeremy Liew is a Managing Director at Lightspeed Venture Partners, and an active blogger on the consumer internet industry.

I predict that media buyers will focus on both a flight to quality and a flight to surety. This will benefit three types of startups: companies with large audiences, companies that sell direct-response advertising, and companies that offer valuable niche content.

In a period of uncertainty, media buyers will look to trim lower quality sites from their ad buys and focus on a smaller number of higher quality placements. Quality will likely be measured by both familiarity and scale. Sites with well known brands and meaningful traffic will remain in the ad plan. Just as no one ever got fired for buying computers from IBM in the 80s, so too no one will get fired for buying advertising from Google and Yahoo! today.

But big companies at scale and monetizing well, like Yahoo! and Google, are feeling the bite of the recession. Both companies have guided expectations downward recently. This is because they are already at full monetization scale, so there is no escaping the economy.

However, many smaller companies have built huge audiences that are just starting to be monetized. These are the companies that will see the most growth through the recession. Today they are not fulfilling all the potential demand for their inventory because they don't have enough sales people to respond to all the possible advertisers. The simple act of adding more salespeople will benefit them. Examples of such companies include the social networking site Facebook, the news aggregator Digg and the widget-maker Rockyou. (Full disclosure: I am an investor, and a believer in Rockyou).

In an uncertain environment, advertisers will also want to shift more of their advertising budgets to direct response (the online equivalent of 1-800-number advertisements) and away from brand advertising. Much of the direct response advertising is ultimately sold on a CPC (cost per click) or CPA (cost per action) model, rather than the CPM model based solely on the number of ad impressions shown. This makes direct response more of a "sure thing" for advertisers.
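
A quick back-of-the-envelope comparison shows why CPC feels like the surer bet: under CPM the advertiser pays for impressions whether or not anyone responds, while under CPC the spend scales with actual clicks. The numbers below are invented purely for illustration:

```python
# Hypothetical campaign: one million impressions, a $2.00 CPM rate,
# a $0.50 CPC rate, and two possible click-through rates.
impressions = 1_000_000
cpm_rate = 2.00     # dollars per 1,000 impressions
cpc_rate = 0.50     # dollars per click

for ctr in (0.002, 0.0005):                    # 0.2% vs 0.05% click-through
    clicks = impressions * ctr
    cpm_cost = impressions / 1000 * cpm_rate   # fixed, regardless of clicks
    cpc_cost = clicks * cpc_rate               # scales with actual response
    print(f"CTR {ctr:.2%}: CPM spend ${cpm_cost:,.0f} for {clicks:,.0f} clicks"
          f" vs CPC spend ${cpc_cost:,.0f}")
```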

Media companies who can apply behavioral targeting and other forms of targeting will be able to improve click-through rates and conversion rates. Companies like Revenue Science, Azoogle and Tatto can better identify which customers to show each ad to, and will benefit disproportionately.

In a similar vein, ad buyers will want to stick with the surety of their established methods for targeting brand advertising. This is and has always been to buy content adjacencies. The demographic and behavioral targeting that may work for direct response will be harder to sell to brand advertisers in the recession. It will be very hard to prove that these forms of targeting are more effective and justify a premium price. As a result, vertically focused content sites will have an advantage. Examples include the Health Central Network, Flixster and Like.com, all companies that have aggregated large audiences with a singular content focus attractive to endemic advertisers. (Full disclosure: I am an investor and a believer in Flixster).

So there is a silver lining to even the dark clouds of this recession. Certainly there will be many companies that suffer over the next few quarters, and some that will fail. But Web 2.0 companies with large audiences and the right advertising model will benefit from a flight to quality and a flight to surety. They may exit the recession even stronger than today. However, since many of these companies are not yet profitable, they will only benefit if – and this is a big if – they have enough funding to ride out the storm.

Write to Jeremy Liew at jliew@lightspeedvp.com

Original here

How many atoms to build a computer?

Posted by Roland Piquepaille

Because transistors will inevitably stop shrinking in size in the future, European researchers are studying atomic-scale computing. According to ICT Results, this would allow computer processes to be carried out in a single molecule. ‘In theory, atomic-scale computing could put computers more powerful than today’s supercomputers in everyone’s pocket.’ So far, the EU-funded team has already designed a simple logic gate with 30 atoms that performs the same task as 14 transistors. The project coordinator said: ‘Atomic-scale computing researchers today are in much the same position as transistor inventors were before 1947. No one knows where this will lead.’ So don’t expect to use a computer based on molecular components anytime soon. …

Interconnections studied by the Pico-Inside project

You can see above a diagram showing the various interconnections studied within the Pico-Inside project, going from the atomic level to the millimeter scale: “1) molecule to atomic wire (atomic level) 2) atomic wire to a metallic island of a few monolayers high which links the atomic level and the nano level 3) metallic island to a thin metallic ribbon which links the nano level and the meso level 4) thin metallic ribbon to micro-electrode which links the meso level and the micro level 5) microelectrode to macroscopic wiring which links the micro level and the macro level.” (Credit: C. Joachim, CEMES-CNRS) Here is a link to a larger version of this picture and another one to an image gallery related to the project.

The Pico-Inside project started in September 2005 with a budget of €5 million. Christian Joachim of the French National Scientific Research Center (CNRS)’s Center for Material Elaboration & Structural Studies (CEMES) in Toulouse, France, was the coordinator of the project. Joachim, who is the head of the CEMES Nanoscience and Picotechnology Group (GNS), is currently coordinating a team of researchers from 15 academic and industrial research institutes in Europe.

Now, why are these researchers trying to compute with molecules? Here is why. “Transistors have continued to shrink in size since Intel co-founder Gordon E. Moore famously predicted in 1965 that the number that can be placed on a processor would double roughly every two years. But there will inevitably come a time when the laws of quantum physics prevent any further shrinkage using conventional methods. That is where atomic-scale computing comes into play with a fundamentally different approach to the problem.”

As I said above, the team designed a logic gate with 30 atoms. This is interesting, but what’s next? [The researchers] “are focusing on two architectures: one that mimics the classical design of a logic gate but in atomic form, including nodes, loops, meshes etc., and another, more complex, process that relies on changes to the molecule’s conformation to carry out the logic gate inputs and quantum mechanics to perform the computation. The logic gates are interconnected using scanning-tunnelling microscopes and atomic-force microscopes — devices that can measure and move individual atoms with resolutions down to 1/100 of a nanometre (that is one hundred millionth of a millimetre!). As a side project, partly for fun but partly to stimulate new lines of research, Joachim and his team have used the technique to build tiny nano-machines, such as wheels, gears, motors and nano-vehicles each consisting of a single molecule.”
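
The article doesn't say which logic function the 30-atom gate implements, but to give a rough feel for what 14 transistors buys you in conventional static CMOS, a half adder (an XOR for the sum bit plus an AND for the carry) lands in roughly that transistor range. The snippet below is only an illustration of that kind of target function, not of the molecular design:

```python
def half_adder(a, b):
    """Return (sum, carry) for one-bit inputs a and b.

    In conventional static CMOS an XOR plus an AND costs on the order of
    14-18 transistors; the article credits the 30-atom gate with doing the
    work of 14 transistors, without saying which function it computes.
    """
    return a ^ b, a & b

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"a={a} b={b} -> sum={s} carry={c}")
```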

As you can deduce from the short excerpts above, commercial applications will not emerge for a long time.

Original here

It Costs Digg $5 Million a Year to Run the Internet

By Owen Thomas

Perhaps Digg really is the future of the news business. The headline-discussion site, once an icon of the Web 2.0 movement, is losing millions of dollars a year.

BusinessWeek's Spencer Ante got ahold of Digg's financial statements. They are frightful, even for a startup. Last year, the company took in $4.8 million and spent $7.6 million, for a loss of $2.8 million. In the first nine months of this year, losses grew almost as fast as revenues: Digg took in $6.4 million and spent $10.4 million, resulting in a $4 million loss. At an annual clip, that's more than $5 million out the door a year.
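
The "more than $5 million a year" figure is simply the nine-month loss annualized; here's the one-line check using the numbers reported above:

```python
# Digg's reported figures for the first nine months of 2008, in $ millions.
revenue = 6.4
spend = 10.4
loss = spend - revenue        # $4.0M over nine months
annualized = loss * 12 / 9    # same pace over a full year: ~$5.3M
print(f"Nine-month loss: ${loss:.1f}M, annualized: ~${annualized:.1f}M")
```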

Keep in mind that Digg has a lucrative three-year advertising deal with Microsoft that pays the site a guaranteed rate for its inventory. Without that arrangement, struck last year — driven, most believe, by Microsoft executives' desperation to get in on the Web 2.0 craze — Digg's losses would likely be far worse.

Now it all makes sense: Digg CEO Jay Adelson's repeated attempts to sell the company to News Corp., Current Media, and Google, at a valuation of $300 million or more, came to naught because there's no real business there. Those sales talks, while they were still under discussion, prompted entirely unfounded speculation that founder Kevin Rose was personally worth $60 million on paper. Instead, Digg took $28.7 million in venture capital at a valuation of almost half what the company hoped to sell for.

To be fair, that will last the company years, even at its current rate of red-ink spilling. But it's worth thinking about Digg's numbers amidst the litany of complaints about the ink-on-newsprint business: newspapers coast to coast are seeing devastating declines in advertising revenue. The New York Times has mortgaged its headquarters. The Tribune Company has declared bankruptcy. And yet, even in their decline, newspapers remain prodigious generators of cash. This moribund industry generated $13.7 billion in profit in 2007.

The same cannot be said of Digg, a site conceived by television host Kevin Rose as a replacement for the editors who pick headlines for readers. On Digg, readers vote headlines up by "digging" them, or down by "burying" them.

For now, Digg is safe, insulated from the marketplace as a well-funded private company. But if Adelson no longer plans to sell the company, he will have to take it public. And when the day comes that investors can vote the company's shares up or down, unless he can engineer a dramatic improvement in its finances, he and Rose will know what it feels like to be buried.

Original here

RIAA Stops Lawsuits, But Not the Threats

Written by Ernesto

For years the RIAA has been filing lawsuits against thousands of individuals who allegedly shared copyrighted music. Following recent court setbacks, the lobby group has announced it will stop mass lawsuits. Instead, it will focus on cutting deals with ISPs to disconnect ‘IP-addresses’ that repeatedly share copyrighted music.

Ironically, the decision by the RIAA to stop their mass lawsuits is followed by a proposal to target an even larger group of Internet users. The music industry lobbyists state that they are in the process of cutting deals with ISPs to target Internet subscribers that repeatedly infringe on the copyright of the major record labels - the so called three-strikes approach.

This means that millions of people will receive warning emails from their Internet service provider, based on ‘evidence’ gathered by a third party with a vested interest in the outcome. This will also mean, however, that thousands of individuals will receive emails in error, as the evidence gathering techniques are not as solid as the anti-piracy outfits say. There have been a lot of false accusations already, and this was recently confirmed in mainstream media by the BBC show Watchdog.

The move from individual lawsuits to controlling piracy at the ISP level seems to be the new trend this year. Many countries have looked into the possibility of disconnecting file-sharers from the Internet, often gently pushed by anti-piracy lobbyists. France was the first to present their “three-strikes” law earlier this year, which would allow anti-piracy outfits to police the Internet. The IFPI now plans to implement this worldwide, with or without legislation.

It won't stop there, though. If the RIAA gets its way, ISPs will also have to proactively check for copyrighted content on their networks. In their list of suggestions for the controversial ACTA proposal, the RIAA wants ISPs to spy on the files that are transferred by their customers, and check them against a reference database of “copyrighted files”.

ISPs worldwide are not looking forward to policing their networks, but they might find themselves with no other option. Adding further pressure, the RIAA wants ISPs to be held liable for the copyright infringement that takes place on their network, as their proposal suggests “…in the absence of proof to the contrary, an Internet service provider shall be considered as knowing that the content it stores is infringing or illegal, and thus subject to liability for copyright infringement…”

So, while dropping the mass-lawsuits might be considered to be a step forward by some, the change in tactics might very well result in a virtual police state where consumers (and ISPs) are guilty until proven innocent. The RIAA has lost some major battles in court, but if they gain control over ISPs, the future might be even darker than the past.

Original here

Google Shutters Its Science Data Service

By Alexis Madrigal

Google will shutter its highly-anticipated scientific data service in January without even officially launching the product, the company said in an e-mail to its beta testers.

Once nicknamed Palimpsests, but more recently going by the staid name, Google Research Datasets, the service was going to offer scientists a way to store the massive amounts of data generated in an increasing number of fields. About 30 datasets — mostly tests — had already been uploaded to the site.

The dream appears to have fallen prey to belt-tightening at Silicon Valley's most innovative company.

"As you know, Google is a company that promotes experimentation with innovative new products and services. At the same time, we have to carefully balance that with ensuring that our resources are used in the most effective possible way to bring maximum value to our users," wrote Robert Tansley of Google on behalf of the Google Research Datasets team to its internal testers.

"It has been a difficult decision, but we have decided not to continue work on Google Research Datasets, but to instead focus our efforts on other activities such as Google Scholar, our Research Programs, and publishing papers about research here at Google," he wrote.

Axing this scientific project could be another sign of incipient frugality at Google. Just a couple weeks ago, Google CEO Eric Schmidt told the Wall Street Journal that his company would be cutting back on experimental projects. First described in detail by Google engineer Jon Trowbridge at SciFoo 2007 — the slides from a later version of the talk are archived on the Partial Immortalization blog — the project was going to store, for free, some of the world's largest scientific datasets. In Trowbridge's slides, he points out the 120 terabyte Hubble Legacy Archive and the one terabyte Archimedes palimpsest.

"'It's a sad story if it's true," wrote Attila Csordas, a stem cell biologist and author of Partial Immortalization who recently moved to Hungary from Tulane University, in an email to Wired.com. "Assuming it is true that might mean that Google is still a couple years away from directly helping the life sciences (on an infrastructural level)."

Other scientists remained hopeful that the service might return in better times.

"The Space Telescope Science Institute has had a long positive relationship with Google that started with our partnership in GoogleSky in early 2006," said astrophysicist Alberto Conti of STSI. "We were looking forward to Google's commitment to helping the astronomical community with the data deluge, and we are sure Google will reconsider this decision in the future. While perhaps understandable in this economic climate, it's sad to see Google leave the field."

And Conti noted, other companies may step up to help scientists manage their information.

"Amazon is doing exactly the opposite and they might actually fill the void," he said.

Google representatives did not immediately respond to a request for comment.

Image: flickr/DannySullivan

Original here

One in Six Use Only Cell Phones at Home

The portion of homes with cell phones but no landlines has grown to 18 percent, led by adults living with unrelated roommates, renters and young people, according to federal figures released Wednesday.

An additional 13 percent of households have landlines but get all or nearly all calls on their cells, the survey showed. Taken together, that means about three in 10 households are essentially reachable only on their wireless phones.

The figures, covering the first half of 2008, underscore how consumers have been steadily abandoning traditional landline phones in favor of cells. The 18 percent in cell-only households compares with 16 percent in the second half of 2007, and just 7 percent in the first half of 2005.

Leading the way are households comprised of unrelated adults, such as roommates or unmarried couples. Sixty-three percent of such households only have cell phones.

About one-third of renters and about the same number of people under age 30 live in homes with only cells. About a quarter of low-income people also have only wireless phones, nearly double the proportion of higher-earning people.

Stephen Blumberg, senior scientist at the federal Centers for Disease Control and Prevention and an author of the report, said there is no evidence the trend is slowing. He said the recession may fuel it further, especially as cell phone prices drop and their coverage and features improve.

"There's clearly a reason to give up a landline phone if budgets are tight," he said in an interview. "Given the current economic environment, I'd not be surprised to see more and more people give up their landline phones for economic reasons."

The findings have major implications for pollsters. In recent months researchers have concluded that people who have only cell phones have slightly different political views than those who do not.

Growing numbers of pollsters now include cell-only users in their samples, which is more expensive in part due to legal restraints against using computers to call them.

The survey also found:

*Just 9 percent of homeowners are cell-only, compared with 34 percent of renters.

*Older people are less likely to have only cell phones, with just 9 percent of those 45-64 and 3 percent of those 65 and up living in such households.

*By race, 22 percent of Hispanic adults, 19 percent of blacks and 15 percent of whites live in cell-only homes.

*The South and Midwest have more cell-only households than the Northeast or West.

People with landlines who seldom take calls on them largely have those phones hooked into computers, or rely so exclusively on their cell phones that they assume anyone calling the landline is a solicitor and seldom answer it.

The data is compiled by the National Health Interview Survey, conducted by the CDC. The latest survey involved in-person interviews with members of 16,070 households conducted from last January through June.

Original here

‘Wanted’ P2P Pre-Releaser Gets 2 Year Jail Sentence

Written by enigmax

A man who added custom subtitles to a pirated copy of the movie ‘Wanted’ and uploaded it to a file-sharing network has been sentenced. Kazushi Hirata, who uploaded the movie in advance of its Japanese theatrical release, received a 2 year suspended jail sentence.

All around the world, people who pre-release media onto the Internet face the prospect of harsh treatment if caught. The crew at EliteTorrents felt the full force of the DOJ for their uploading of Star Wars: Episode III, the uploaders on OiNK face uncertainty as their criminal trial is delayed again, and Kevin Cogill, the Chinese Democracy uploader, faces a year of confinement.

In September we reported that a Japanese man had been caught uploading the movie ‘Wanted’ before its Japanese theatrical release. Kazushi Hirata, a 33 year old from the city of Sendai, had painstakingly added Japanese subtitling to the movie, before uploading it to the Winny network. Following a complaint from Japan’s answer to the MPAA (Japan and International Motion Picture Copyright Association), Hirata was tracked down by the Kyoto Prefectural Police, the same department responsible for the 2004 arrest of Isamu Kaneko, the creator of the Winny software.

Less than two months after his September 20th arrest, on November 11th Hirata pleaded guilty to violating Japan’s copyright laws, facing the prospect of up to 10 years in jail and a $95,000 fine. Yesterday the court came back with its sentencing decision.

Hirata was sentenced to two years in prison, suspended for three years.

“The conviction sends an important message about the illegality of movie piracy,” said Jimca executive director Yasutaka Iiyama adding, “Respect for intellectual property rights is critical to Japan’s economy and cultural identity.”

The arrest of Mr Hirata is believed to be the first in Japan relating to the uploading of a pre-release movie.

Original here

RIAA finds its soul, will stop suing individuals downloading music

by Thomas Ricker

When you retard fair use with pointless DRM and then sue anonymous children for illegally downloading music while ignoring those of the execs at the top of the music industry, well, you're asking for a public relations nightmare. Now, with more than 35,000 lawsuits to its credit, the RIAA says it will finally end the legal assault against consumers that began back in 2003. The Recording Industry Association of America will instead focus its anti-piracy efforts on working with ISPs. Under the new plan, the RIAA will contact ISPs when illegal uploading is detected. The ISP will then contact the customer with a notice that would ultimately be followed by a reduction or cessation of service. As you'd expect, the RIAA is not commenting on which ISPs they are in cahoots with. The RIAA also says that it won't require ISPs to reveal the identities of individuals but could, of course, go after individuals who are heavy uploaders or repeat offenders. For the moment though, it appears that single mothers are in the clear.

Original here

Open Source Torrents Forced Offline by Anti-Piracy Outfit

Written by Ernesto

The Internet can prove complex to some, especially copyright holders. Recently, the Entertainment Software Association (ESA) notified the webhosting company of a BitTorrent tracker dedicated to Open Source software that the tracker was infringing the copyright of one of their clients. Without any notice, the webhosting company pulled the tracker offline, not realizing that the tracker had done nothing wrong.

For those new to BitTorrent, terms like trackers and .torrent files might be confusing. When someone decides to share a file with others, they make a .torrent file, and add a tracker url that tells the downloader where it can find the other peers sharing the same file. A tracker doesn’t have to host the .torrent file, and is only a means of communication between BitTorrent users. They are no more liable than any ISP in this respect.
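
To see why a tracker can be content-neutral, it helps to look at what a .torrent actually contains: a bencoded dictionary whose "announce" key is just a tracker URL, and whose "info" section holds piece hashes rather than the data itself. Here is a minimal sketch — the bencoder is hand-rolled and all the values are placeholders:

```python
def bencode(value):
    """Encode ints, strings, lists and dicts in BitTorrent's bencoding."""
    if isinstance(value, int):
        return b"i%de" % value
    if isinstance(value, str):
        value = value.encode()
    if isinstance(value, bytes):
        return b"%d:%s" % (len(value), value)
    if isinstance(value, list):
        return b"l" + b"".join(bencode(v) for v in value) + b"e"
    if isinstance(value, dict):
        items = sorted(value.items())
        return b"d" + b"".join(bencode(k) + bencode(v) for k, v in items) + b"e"
    raise TypeError(type(value))

# A toy metainfo dict: the tracker URL is all the tracker is ever named in;
# the actual file travels peer-to-peer, identified only by its piece hashes.
metainfo = {
    "announce": "http://tracker.example.org:6969/announce",  # placeholder URL
    "info": {
        "name": "some-open-source-release.iso",
        "piece length": 262144,
        "pieces": "<concatenated SHA-1 hashes of each piece>",
        "length": 734003200,
    },
}
open("example.torrent", "wb").write(bencode(metainfo))
```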

There are thousands of open BitTorrent trackers on the Internet, and most of these don’t actually host all the files on their website. However, since they are open, anyone can add the tracker to their torrent file. The tracker owner has no knowledge of the files being spread via his or her tracker. Recently the popular Internet TV network Revision3 was attacked by MediaDefender because they were running an open tracker, and they are not alone.

This week, ‘Open Source Torrents’ had to deal with some remarkable consequences of hosting an open BitTorrent tracker. The tracker, dedicated to sharing Open Source Software through BitTorrent, was taken offline because it allegedly infringed the copyright of the game ‘Command & Conquer: Red Alert 3’. There was never a .torrent file for this material stored on the server though, only hashes of data.

The webhost, Liandra Tech, took the tracker offline after it received a copyright infringement notice from ESA. “We have to terminate your webhosting account with us, due to complaint about copyright material infringement on ostorr.org,” they wrote to the founder of the site, as they forwarded the email they got from ESA.

Akash, the founder of the tracker, was very surprised by his webhosting company’s decision to shut down his site without even consulting him first. “These folks shut me down for “Command and Conquer” supposedly going through my tracker. I’ve never even played the game,” Akash told TorrentFreak. “We did host some actual files, but only mirrors of the open source software we track, which is definitely perfectly legal.”

Although OStorr.org is a relatively small tracker, it has helped to spread more than 50,000 copies of The Open CD, and thousands of copies of other free and Open Source software. So, the only torrent files listed on his website were of Open Source software. The tracker was also open to anyone else, like many others, but Akash has no way of telling what files are tracked.

Akash is forced to find a more understanding host now, but he assured us that the tracker will return. “Open source software is about as legal as it gets. Apparently not. Time to find a new host,” he said.

Update: Looks like the webhost made the right decision after all, Akash just wrote us: “After a lengthy apology from Liandra due to a misunderstanding that I had uploaded the C&C torrent (I’m told C&C is actually a pretty dodgy game…), I’ve been offered three months of free hosting. The site’s up and running and I’m going to block all non-authorised torrents.”

Original here

Military Technologies You Won’t Believe

By Tech Mog

If there’s anything cooler than technology, it’s warfare, and if there’s anything cooler than that, it’s the combination of the two.

Some may flap their hands in outrage at this statement, decrying the obscene cost of military R&D while we’re faced with “more pressing” issues than defense.

I’d urge such types to settle back onto their plushly-upholstered divans, take another puff on their extra-length, ivory cigarette holders and consider that nothing educates better than a rubber bullet. And nothing reduces a carbon footprint like a landmine.

Now, having worked the doves into a frothing frenzy, here are some cool, ridiculously expensive technologies to push them over the edge into berserker rage.

The XOS Robotic Exoskeleton

Robotics company Sarcos have developed a particularly intimidating, serious-business suit. Looking like a hollowed-out Terminator droid, the XOS is the pack-humping, long-marching, body-armoring ally of the future infantryman.

Well, assuming we don’t simply splice rhino-DNA into our clone armies, but let’s not give anyone ideas.

Donning this superhero-suit is as easy as fitting a webbing pack and locking on a pair of skis, if not quite as easy as slipping into a coat held up by a butler. From there, the suit mirrors the actions of every limb, endowing them with a brute strength to shame The Gubernator himself.

In one demonstration, the suit performed 500 reps on a pull-down bar weighted at 200 pounds, before intolerable boredom set in for its wearer and he presumably set to punching down walls instead.

With all that power, you’d expect movement to be clumsy and limited. Wrong. Even though it weighs 150 pounds, the XOS is agile enough to conceivably wear to your next Grand Ball.

They’re even talking of a model soldiers can disengage from, which will then continue with soldierly duties all by itself… Ah, DARPA contracts, hastening the inevitable robot apocalypse in style.

Non-military adaptations of the XOS, for firemen and cripples, are scheduled to appear in due course and we’re hoping the future everyman gets one too. The issue of parallel parking by females could finally be laid to rest, gingerly and neatly from an overhead deadlift.

Small Unit Space Transport and Insertion, Sustain

Small unit insertion may sound familiar to you but don’t be fooled - Sustain is an aerospace project being developed for the U.S. Marine Corps. Its aim is to transport a squad of over a dozen marines anywhere on the globe. Within two hours, max.

It’ll be a two-stage process. A carrier craft, perhaps a B-52 or similar, will power the lander to the upper atmosphere. The landing craft, probably Boeing’s X-51A currently in testing, will break away, fire up its scramjet engines and cruise at Mach 7 to its destination.

That it’ll do so just over the 50 mile limit to national airspace is perhaps something that’s occurred to the brass. They’re canny that way.

Now, while the thought of doing a flight that currently takes ¾ of a day, Los Angeles to Singapore say, in just two hours is cool if you’re a marine, what we’re really waiting for is the passenger version.
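
The two-hour claim holds up on the back of an envelope, assuming roughly 8,800 miles for the LA-Singapore great circle and something like 660 mph for Mach 1 at cruise altitude:

```python
# Rough sanity check of the "anywhere in two hours" claim.
distance_miles = 8_800          # approx. LA -> Singapore great-circle distance
mach_1_mph = 660                # approx. speed of sound at cruise altitude
cruise_mph = 7 * mach_1_mph     # Mach 7 ~= 4,620 mph
hours = distance_miles / cruise_mph
print(f"~{hours:.1f} hours at Mach 7")   # about 1.9 hours
```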

Or, for Singapore to declare war. And though it’s still a ways off for the average (GI) Joe, this new technology may soon eclipse the $50 M Gulfstream private jet as the transportation method of choice for the global super-class.

We hope they remember to pack their jetlag pills and industrial-strength barf-bags.

The 600 Ton Robot Lorry

Industrial machinery manufacturer Caterpillar and Carnegie Mellon University scientists have teamed up to create the ultimate redneck fantasy.

This 600 ton robot lorry can drive right over conventional monster trucks, which are more like Tonka trucks in comparison. It can crush cars, tanks and even houses beneath its merciless treads.

It can transport loads of up to 380 tons, making it handy for pyramid construction. And best, or perhaps most disturbingly of all, Caterpillar’s new beast can operate without a driver. Yeah, you read right.

Following a successful series of DARPA trials to develop a self-driven car, it was only a matter of time before a self-driven, mechanical Godzilla reared its head.

From there, it’s a short but inevitable step to someone covering it in guns and missiles. From there, a short quick step to all of us learning to speak Binary.

Weapons Against Robots

If the proliferation of vehicular kill-bots and humanoid death-droids sets off your “Danger, Will Robinson” alarm, you’ll be pleased to learn of WAR Defence.

This start-up was founded by dot com millionaire, Ben Way, who’s well aware of the increasing militarization of robotics technology. His company exists to find ways of defending against enemy robots, or any of ours made sullen and uncooperative by robot puberty.

WAR, “the world’s first defense company solely dedicated to weaponry against robotic entities,” is working on several means to this end. Most interesting is circuitry-frying, non-lethal (to humans anyway) technology like Electric Storm-AX1.

This high energy microwave device, along with similar directed EMP devices, offers humanity’s best hope of neutralizing hostile machinery to survive the inevitable robot apocalypse. Not to mention dealing with your neighbor’s blaring car alarm and rap music.

It’s all still in the early stages, of course, but that’s not going to stop us from buying shares in WAR Defense and posting several thousand “up yours, sentient artificial intelligence,” messages to the internet.

While it may be a few years before we see rocket-pack police jumping from spaceships, fighting off giant bulldozer robots with energy guns, that future scenario is closer to becoming a reality than ever before. Let’s just hope that before we all kick the bucket from some nuclear-robot holocaust, we also develop the time machine portal to send someone back to stop it all from ever happening in the first place.

Original here

What Gmail does better than its competitors

Posted by Don Reisinger


As someone who spends an inordinate amount of time wading through e-mails, finding the best e-mail service is paramount in my life.

Realizing that, I've done my fair share of shuffling from one e-mail program to the next--trying to find the best service that not only offers speed and stability, but also reliability and spam control. And although e-mail services are getting better, it's abundantly clear that few offer the kind of experience I'm really looking for in an e-mail client. But Google's Gmail app is different. It's better than its competition on a number of levels and provides the kind of e-mail experience that's simply unrivaled online.

Spam, Spam, Spam

I've used practically every e-mail service on the Web and I can say, without a doubt, that Gmail blocks the most spam. To those who open a new account, spam may not be a serious concern. Your spam folder will likely remain empty for a while until your new e-mail address makes its way into the wild. But for my e-mail address, which is widely available and easily attainable, spam is a constant headache.

On services like Yahoo Mail, Windows Live Hotmail, and AOL Mail, the spam blocker tried but failed on too many occasions. In fact, dealing with spam in my already bloated in-box was a daily occurrence that got worse as more messages piled up. But Gmail is different. Right now, I have thousands of messages sitting in my spam folder that never made their way to my in-box. Even better, I can say with all honesty that I only see about two or three spam messages per day in my in-box--not perfect, but much better than anything the competition is offering.

Google Apps

Maybe it's not fair to compare e-mail clients on the basis of additional apps, but I'll do it anyway. After all, Google is competing with the likes of Yahoo and AOL--two major Web companies--and I don't see why these two can't release apps that provide an even greater value proposition to users.

There's something so appealing about receiving an e-mail from someone who attached a Word document or Excel spreadsheet and being given the option to open that attachment in Google Docs. And being able to switch to Google Calendar and Reader from Gmail cuts down on time spent on managing my day. Maybe that functionality appeals to me because I prefer using apps like Google Calendar and Reader to keep me organized and "in the know", but I honestly can't see myself using another e-mail client knowing how invested I am in other Google apps. Suffice to say that my affinity for Gmail stretches beyond e-mail.

Filters

Gmail's filter feature is the best in the business. Period. Unlike its competitors, which try to provide a filter tool that simply re-routes incoming messages, Gmail delivers a power user's dream. In a matter of seconds, you can create a filter that searches through all incoming mail looking for specific people or keywords and once found, immediately categorizes it into a specific folder, forwards it on to someone else, or moves it to the trash, to name just a few functions.

With the help of Filters, using Gmail becomes an even more rewarding experience. Gone are the days of spending big chunks of your time attempting to find just one e-mail that's lost in a collection of thousands. Other e-mail services try desperately to provide the same kind of filter features, but they fall flat. In my experience, messages are either missed, the filter performs the wrong function, or it simply ends up not working at all. In fact, Yahoo Mail's filter feature works only in its Classic e-mail app and, according to the company, won't be available in the new interface until it's done "tweaking the Yahoo! Mail Filters option." Yikes.
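
Conceptually a filter is nothing more than a predicate plus an action, evaluated against every incoming message. Here's a toy sketch of that idea in Python — the field names and actions are made up for illustration, not Gmail's internals:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    subject: str
    body: str
    labels: list = field(default_factory=list)

@dataclass
class Filter:
    """One rule: if the message matches, apply the action."""
    from_contains: str = ""
    keyword: str = ""
    action: str = "label"       # "label", "trash", or "forward"
    argument: str = ""

    def matches(self, msg):
        ok = True
        if self.from_contains:
            ok &= self.from_contains in msg.sender
        if self.keyword:
            ok &= self.keyword.lower() in (msg.subject + msg.body).lower()
        return ok

def apply_filters(msg, filters):
    for f in filters:
        if f.matches(msg):
            if f.action == "label":
                msg.labels.append(f.argument)
            elif f.action == "trash":
                msg.labels.append("Trash")
            elif f.action == "forward":
                print(f"would forward to {f.argument}")
    return msg

rules = [Filter(from_contains="newsletter@", action="label", argument="Reading"),
         Filter(keyword="invoice", action="forward", argument="books@example.com")]
msg = Message("newsletter@site.example", "Weekly digest", "...")
print(apply_filters(msg, rules).labels)
```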

Annoying ads

Anyone who has used Yahoo Mail, AOL Mail, or Windows Live Hotmail knows all too well that annoying ads are in abundance. But when you load up Gmail, it's an entirely different story.

Sure, there are ads on Gmail, but unlike the other services, they're not intrusive in any way. I never notice them when I'm working with the program, but when I load up Yahoo Mail or try out Hotmail, I'm inundated with ugly display ads that reduce the service's screen real estate and generally take away from the experience. Granted, ads don't have any impact on the viability of an e-mail service, but doesn't it stand to reason that if you're not forced to look at blinking ads while working in your e-mail, you'll be a happier user?

I certainly think so.

Conversation Displays

I realize there are many people out there who enjoy the "classic" style of displaying e-mails based on their arrival, but I'm not one of them. I like that Gmail groups an entire e-mail conversation into one and forgoes the use of individual strands. The latter strikes me as outdated and useless today in a world of constant e-mail communication.

That said, I realize my opinion isn't the most popular. Yahoo and AOL Mail are more popular than Gmail and each employs the "old" display style, suggesting that users prefer that over Gmail's style. But I think that's more of a reaction to what users know than to what they would like. In fact, I'm willing to bet that if those people were forced to use Gmail for a week, the vast majority would dump Yahoo or AOL in favor of Google's client as soon as a flurry of e-mails between two parties broke out and they needed to go back to find a particular message. Finding that message couldn't be easier in Gmail.

Don Reisinger is a technology columnist who has written about everything from HDTVs to computers to Flowbee Haircut Systems. Don is a member of the CNET Blog Network, and posts at The Digital Home. He is not an employee of CNET. Disclosure.

Original here

waterfigure_9892

waterfigure_9892 by fotoopa.
Colored water figures. The tools used are a homemade hardware controller with an Altera FPGA chip (a Terasic DE1 board). Two lasers are used for the detection, and multiple flashes are used. The controller calculates the delay timings, the power for the waves on the speaker, the camera control and the flashes.

Update:
How to make this picture:

The lower part is the membrane on top of a speaker. Before the special waveform is applied, a soap bubble is placed above a few colored water drops. At this point you have the membrane, the colored liquids and the soap bubble. The "ball" is a marble that falls and passes through a laser beam. A photodiode gives a signal to the controller to start all the timings needed. The waveform signal is applied after a delay, and that forms the water figures. At the same time the marble falls into the soap bubble, and at that moment it's time to fire all the flashes. Of course the camera needs to be active at the right time to capture the correct picture. Getting all these settings, timings and delays right gives nice figures.
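
fotoopa's controller is an FPGA, but the sequence it runs is easy to describe in ordinary code: wait for the photodiode to report the beam being broken, then trigger the speaker waveform, the camera and the flashes, each after its own tuned delay. Here's a hypothetical sketch of that schedule — the delay values are invented:

```python
import time

# Invented delay values, in milliseconds after the marble breaks the laser beam.
SCHEDULE = [
    (5.0,  "start speaker waveform"),   # shapes the water figure
    (38.0, "open camera shutter"),
    (40.0, "fire flashes"),             # freezes marble and splash together
    (60.0, "stop speaker waveform"),
]

def wait_for_laser_break():
    """Stand-in for the photodiode interrupt on the real FPGA controller."""
    input("Press Enter to simulate the marble crossing the laser beam...")

def run_sequence():
    wait_for_laser_break()
    t0 = time.monotonic()
    for delay_ms, action in SCHEDULE:
        remaining = delay_ms / 1000 - (time.monotonic() - t0)
        if remaining > 0:
            time.sleep(remaining)
        print(f"{delay_ms:5.1f} ms: {action}")

if __name__ == "__main__":
    run_sequence()
```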

Original here

New iMacs and Mac minis confirmed to use NVIDIA chipsets

By Slash Lane

Apple's next-generation iMacs and Mac minis will adopt the same NVIDIA chipset platform found at the heart of the company's most recent notebook overhaul, new findings once again confirm.

A member of the InsanelyMac forums was recently rifling through the extension files that ship with the latest MacBooks and MacBook Pros and discovered references to a "Macmini3,1" and "iMac9,1."

Running System Profiler on Apple's most current iMacs and Mac minis reveals the model numbers of those systems to be "Macmini2,1" and "iMac8,1," meaning the configuration files included with the company's latest notebooks are for still-unannounced models.

Specifically, the extension file of interest pertains to a Mac's System Management Controller and Advanced Configuration and Power Interface (ACPI_SMC_PlatformPlugin.kext). It includes a variety of information, including strings that identify the supporting chipset of each Mac.

The entries for the unannounced iMac and Mac minis list their chipset as the "CFG_MCP79," which is the same exact NVIDIA MCP79 platform employed by unibody MacBooks, MacBook Pros, and MacBook Airs, which are similarly identified in the same file as the MacBook5,1, MacBookPro5,1, and MacBookAir2,1.

Also of interest is that the entries for the new iMac and Mac mini are dated 2008, which may provide evidence to support claims that these systems were originally targeted for a release in the November time frame but were pushed into the first quarter of the year due to unexpected delays.

While announcing its new notebook offerings in October, Apple had indicated that it would be using more of NVIDIA's technology in its Mac computer line going forward.

Original here

The Sharp X68000: The Retro Japanese Sister of the Mac and Amiga

Posted by Michael Pinto

In my final year of art school (which was 1987) I had a friend from Japan who owned the Sharp X68000; in fact, the computer was only ever sold in Japan. As you can see in the commercial above, this machine was very friendly for folks who liked to work with video and graphics. That wasn't by accident: the box was powered by a Motorola 68000 CPU, the same family of chips that powered the other artist-friendly machines of that era, the Macintosh and the Amiga. The first model of this system ran at 10 MHz, had 1 MB of RAM, and had no built-in hard drive, so it's sort of amazing just how much this system could do. It's also interesting to note that many game designers in Japan created arcade games using the X68000, and today you can find emulators for the system.

In the video below, the second commercial is for the Sharp X68000; notice how they're pushing the "high resolution graphics" and music capabilities of the machine:

And here’s a photo of a later model, which reminds me a great deal of the NeXT thanks to its sleek black look, something that was unusual at the time:

The Sharp X68000

Original here

Evidence that Next iMacs and Mac Minis to use NVIDIA Chipsets

Written by Arnold Kim

Mac Rumors

Configuration files buried within some versions of Mac OS X show evidence that the next iMac and Mac Mini will indeed be based on the NVIDIA MCP79 chipset. Apple switched its MacBook, MacBook Pro, and MacBook Air models to the NVIDIA chipsets in October. Amongst other benefits, the new notebooks have much improved graphics capabilities, which makes them better able to take advantage of the OpenCL technologies coming in Snow Leopard.

Apple will apparently be bringing these improvements to both the new iMac and Mac mini. While many have expected the iMac to receive these upgrades, the fate of the Mac mini has been less certain.

A configuration file found in the Mac OS X version that ships with the new MacBook and MacBook Pros reveals entries referencing unreleased "MacMini3,1" and "iMac9,1" models. The relevant lines have been excerpted here:

The CFG_MCP79 appears to refer to the NVIDIA MCP79 chipset, which suggests these unannounced models will use this new chipset. The currently shipping iMac and Macmini carry model numbers of iMac8,1 and Macmini2,1. The findings were originally described in a forum post that we've subsequently verified. The references are found in System -> Library -> Extensions -> IOPlatformPluginFamily.kext -> Contents -> PlugIns -> ACPI_SMC_PlatformPlugin.kext -> Contents -> Info.plist in the new MacBook or MacBook Pro.
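If you want to poke at the same file yourself, a short Python script can dump the interesting strings out of that Info.plist. This is only a convenience sketch, not how the original poster found the entries; it assumes you have copied the Info.plist from the path above into your working directory:

```python
import plistlib

# Assumed local copy of the Info.plist from ACPI_SMC_PlatformPlugin.kext;
# the real file lives at the path described in the article.
PLIST_PATH = "Info.plist"

with open(PLIST_PATH, "rb") as f:
    info = plistlib.load(f)

def walk(node, path=""):
    """Recursively print any string values mentioning a model or chipset ID."""
    if isinstance(node, dict):
        for key, value in node.items():
            walk(value, f"{path}/{key}")
    elif isinstance(node, list):
        for i, value in enumerate(node):
            walk(value, f"{path}[{i}]")
    elif isinstance(node, str):
        if "MCP79" in node or "iMac" in node or "Macmini" in node:
            print(f"{path}: {node}")

walk(info)
```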

Original here

6 Ways To Turn Your iPhone Into An External Hard Disk

By Damien Oh


If you own an iPod, you will know that other than storing music, you can also use it as an external hard disk. You simply connect your iPod to your computer and you can easily transfer files over via drag and drop. When it comes to the iPhone, however, Apple sells you a larger hard disk and gives you more functionality, yet does not allow you to use it as an external hard disk. Come to think of it, the iPhone is the gadget you are more likely to carry with you wherever you go, and it is the more likely candidate to become a portable hard disk, yet you can't do anything about it.

If you are looking to transform your iPhone/iPod Touch into an external hard disk, here are some ways that you can get it done.

App Store

1. Discover

Discover is the only free app in the whole App Store that gives you the full functionality of a wireless hard disk. With just a wireless network, you can freely upload and download files to your iPhone.

You can either use the WebDAV protocol to connect to your iPhone via Finder, Windows Explorer or Nautilus (Gnome) or use the Web interface and connect to it via your browser. If you are a Windows user, there is even a Discover Windows client that you can use to connect to your iPhone.
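For the curious, WebDAV is essentially HTTP with a few extra verbs, so you can even list the files Discover shares from a short script. This is a minimal sketch assuming Discover is reachable at an example address and doesn't require authentication; substitute whatever IP address and port the app shows on screen:

```python
import requests
import xml.etree.ElementTree as ET

# Hypothetical address -- use the IP and port shown in the Discover app.
BASE_URL = "http://192.168.1.50:8080/"

# PROPFIND with Depth: 1 asks a WebDAV server to list a folder's contents.
response = requests.request(
    "PROPFIND",
    BASE_URL,
    headers={"Depth": "1"},
    timeout=10,
)
response.raise_for_status()

# The reply is a WebDAV multistatus XML document; print each listed path.
tree = ET.fromstring(response.content)
for href in tree.iter("{DAV:}href"):
    print(href.text)
```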

Discover comes with several useful features. There is an iPhone-discovery mode where you can discover other iPhones in your network and share files with them. There is also a viewer within the app that enables you to view your files while on the move.

2. Briefcase Lite

Briefcase Lite is the free alternative to the paid app Briefcase. This free app allows you to connect your iPhone to a Mac or Linux machine in a local area network environment (Windows is not supported). Similar to Discover, you can transfer files to and from your iPhone, and there is a viewer to view various types of files while on the move.

There are several limitations to Briefcase Lite compared to Briefcase. Firstly, you can only upload files, but not directories. This means that if you have plenty of files, you have to upload them one by one, rather than upload them as one folder. Secondly, Briefcase Lite does not come with the remote login feature that is available in Briefcase. Thirdly, Briefcase Lite does not allow you to send files to another iPhone, but you can receive files from a Briefcase user.

In general, if you want basic file transfer and viewing functions, Briefcase Lite is sufficient. If you are looking for more sophisticated features like remote login, remote mounting of disk images, and installing of packages, then Briefcase is the one to go for.

Free Third Party Software

Other than the apps in the App Store, which work on the iPhone side, there are also several third-party programs that focus mainly on the computer side. You only need to install them on your Windows machine and you will be able to browse, transfer, and manage files on your iPhone via the USB cable. There are similar programs for Mac as well, but they are paid apps, so I will not mention them here.

3. iPhoneBrowser


iPhoneBrowser is a Windows-based file browser for your iPhone. Once you have it installed on your Windows machine, you can navigate the whole iPhone filesystem and drag and drop files to it. You can also perform backups in case you accidentally delete some important files.

4. DiskAid


Like iPhoneBrowser, DiskAid is another program that allows you to browse and transfer files to your iPhone. It has fewer options, but it offers a far simpler interface and easy-to-understand icons.

Other alternatives

If you have jailbroken your iPhone, there are some useful applications available through Cydia that you can use.

5. Netatalk

Installing Netatalk via the Cydia app gives you the ability to connect your iPhone to your Mac using the AFP protocol. You can access your iPhone via the Finder and drag and drop any files to it.

6. OpenSSH

OpenSSH may not be the most elegant way to connect to your iPhone, but it is the best and easiest way to tweak and hack your iPhone. You will need an SSH client on your computer to make use of OpenSSH. Cyberduck (Mac), WinSCP (Windows), and FileZilla (Linux) are a few great SSH clients that you can use.

With OpenSSH, not only can you transfer files to/from your iPhone, you can also move the internal files around (do it at your own risk) or change the file permissions. This is useful if you are looking to hack your phone and install any cracked applications.
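If you'd rather script your transfers than click around in an SSH client, here's a minimal sketch using Python's paramiko library on the computer side. The IP address and destination path are just examples, and "alpine" is the widely known default root password on jailbroken iPhones (which you should change):

```python
import paramiko

# Example values: use your iPhone's Wi-Fi IP address and your own password.
IPHONE_IP = "192.168.1.50"
USERNAME = "root"
PASSWORD = "alpine"  # default on jailbroken iPhones -- change it!

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect(IPHONE_IP, username=USERNAME, password=PASSWORD)

# Open an SFTP session and copy a file onto the phone's media partition.
sftp = client.open_sftp()
sftp.put("backup.zip", "/var/mobile/Media/backup.zip")
sftp.close()
client.close()
```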

The above-mentioned applications are some of the free ways to turn your iPhone into an external hard disk. There are surely many more ways and hacks you can use to access your iPhone; some of them are paid apps and are not mentioned here. If you use any of those, do share them with us.

Original here

One More Thing: Apple’s New Multi-touch Mighty Mouse

Posted by Dave

Let’s face it, the Mighty Mouse is flawed. Luckily, Apple has filed for numerous patents directly hinting at a multi-touch Mighty Mouse, which could prove to bring us an unexpected treat sometime in the near future.

The Mighty Mouse made its public debut on August 2, 2005, becoming Apple’s first multi-button mouse ever. Its wireless counterpart was released the following year, and aside from a minor aesthetic change in 2007, the device has remained practically untouched since its inception. While the Mighty Mouse did indeed bring a well-received touch-sensitive top shell, a 360-degree clickable scroll ball, and pressure-sensitive side “squeeze” buttons, Apple’s current touch technology and manufacturing methods leave the little old Mighty Mouse in the dust. Time for a change? Perhaps the design below can offer up what may be in store.


The Mighty Mouse is currently made of white plastic, a material that is becoming rare amongst Apple products (iPhone 3G aside). We have seen the white plastic iMac morph into an aluminum gem, and more recently have witnessed the older MacBooks lose their plastic cases in favor of an aluminum unibody design. Simple logic suggests that a shift from the current white plastic shell to an aluminum casing would not be out of the question. Anyone who owns or has used a Mighty Mouse for any extended period of time is not only familiar with the gunk that accumulates along the edges but has probably also had a problem with dirt building up on the scroll ball, rendering it useless until thoroughly cleaned. What would be more amazing than gripping an aluminum Mighty Mouse that uses touch-action scrolling?


Apple’s touch technology has grown by leaps and bounds since the debut of the Mighty Mouse. A 2007 patent application details an “arbitrary shaped grippable member” (don’t get any filthy ideas here) that incorporates positioning and multitouch detection to determine a user’s action. In essence, it would act as one seamless area from which a user could scroll and pan by dragging a finger over the device’s surface. The mouse would also have the ability to process various movements and gestures assigned to different functions on the computer. “For example, gestures can be created to detect and effect a user command to resize a window, scroll a display, rotate an object, zoom in or out of a displayed view, delete or insert text or other objects, etc.” The patent goes on to detail that “gestures can also be used to invoke and manipulate virtual control interfaces, such as volume knobs, switches, sliders, handles, knobs, doors, and other widgets that may be created to facilitate human interaction with the computing system.”
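To give a feel for what "detecting a gesture" actually means, here's a toy Python sketch, in no way Apple's patented method, that looks at how two finger positions change between readings and decides whether the user is zooming or scrolling:

```python
import math

def classify_gesture(prev, curr, threshold=5.0):
    """prev/curr are [(x, y), (x, y)] positions of two touches, in pixels.
    Fingers spreading apart -> zoom; moving together in the same way -> scroll."""
    prev_spread = math.dist(prev[0], prev[1])
    curr_spread = math.dist(curr[0], curr[1])

    if curr_spread - prev_spread > threshold:
        return "zoom_in"
    if prev_spread - curr_spread > threshold:
        return "zoom_out"
    return "scroll"

print(classify_gesture([(0, 0), (10, 0)], [(0, 0), (30, 0)]))  # zoom_in
print(classify_gesture([(0, 0), (10, 0)], [(0, 5), (10, 5)]))  # scroll
```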


As the design indicates, an updated aluminum Mighty Mouse could boast sleeker dimensions that are better contoured to the hand. A lower-profile body would make touch usability more versatile for complex gestures while still retaining enough height to fit the batteries needed to power it. Outfitted with a glass front surface to capture finger movements, this could be the be-all and end-all Mighty Mouse of the future.

Original here