Tuesday, September 23, 2008

8 hacks to make Firefox ridiculously fast

Set your fox on fire with these eight handy speed hacks

Firefox has been outperforming IE in every department for years, and version 3 is speedier than ever.

But tweak the right settings and you could make it faster still, more than doubling your speed in some situations, all for about five minutes' work and at no cost whatsoever. Here's what you need to do.

1. Enable pipelining

Browsers are normally very polite, sending a request to a server then waiting for a response before continuing. Pipelining is a more aggressive technique that lets them send multiple requests before any responses are received, often reducing page download times. To enable it, type about:config in the address bar, double-click network.http.pipelining and network.http.proxy.pipelining so their values are set to true, then double-click network.http.pipelining.maxrequests and set this to 8.

Keep in mind that some servers don't support pipelining, though, and if you regularly visit a lot of these then the tweak can actually reduce performance. Set network.http.pipelining and network.http.proxy.pipelining to false again if you have any problems.

2. Render quickly

Large, complex web pages can take a while to download. Firefox doesn't want to keep you waiting, so by default it will display what it's received so far every 0.12 seconds (the "content notify interval"). While this helps the browser feel snappy, frequent redraws increase the total page load time, so a longer content notify interval will improve performance.

Type about:config and press [Enter], then right-click (Apple users ctrl-click) somewhere in the window and select New > Integer. Type content.notify.interval as your preference name, click OK, enter 500000 (that's five hundred thousand, not fifty thousand) and click OK again.

Right-click again in the window and select New > Boolean. This time create a value called content.notify.ontimer and set it to True to finish the job.

3. Faster loading

If you haven't moved your mouse or touched the keyboard for 0.75 seconds (the content switch threshold) then Firefox enters a low frequency interrupt mode, which means its interface becomes less responsive but your page loads more quickly. Reducing the content switch threshold can improve performance, then, and it only takes a moment.

Type about:config and press [Enter], right-click in the window and select New > Integer. Type content.switch.threshold, click OK, enter 250000 (a quarter of a second) and click OK to finish.

4. No interruptions

You can take the last step even further by telling Firefox to ignore user interface events altogether until the current page has been downloaded. This is a little drastic as Firefox could remain unresponsive for quite some time, but try this and see how it works for you.

Type about:config, press [Enter], right-click in the window and select New > Boolean. Type content.interrupt.parsing, click OK, set the value to False and click OK.

5. Block Flash

Intrusive Flash animations are everywhere, popping up over the content you actually want to read and slowing down your browsing. Fortunately there's a very easy solution. Install the Flashblock extension (flashblock.mozdev.org) and it'll block all Flash applets from loading, so web pages will display much more quickly. And if you discover some Flash content that isn't entirely useless, just click its placeholder to download and view the applet as normal.

6. Increase the cache size

As you browse the web, Firefox stores site images and scripts in a local memory cache, where they can be speedily retrieved if you revisit the same page. If you have plenty of RAM (2 GB or more), leave Firefox running all the time and regularly return to the same pages, then you can improve performance by increasing this cache size. Type about:config and press [Enter], then right-click anywhere in the window and select New > Integer. Type browser.cache.memory.capacity, click OK, enter 65536 and click OK, then restart your browser to get the new, larger cache.
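
If you'd rather not click through about:config for every preference, all of these tweaks can be collected in a user.js file in your Firefox profile folder, which Firefox reads each time it starts. A sketch of the equivalent file, covering tips 1 to 4 and 6 (delete any line that causes you problems):

```js
// user.js -- place in your Firefox profile folder; applied at every startup.
user_pref("network.http.pipelining", true);            // tip 1: enable pipelining
user_pref("network.http.proxy.pipelining", true);
user_pref("network.http.pipelining.maxrequests", 8);
user_pref("content.notify.interval", 500000);          // tip 2: redraw every 0.5s (microseconds)
user_pref("content.notify.ontimer", true);
user_pref("content.switch.threshold", 250000);         // tip 3: low-frequency mode after 0.25s
user_pref("content.interrupt.parsing", false);         // tip 4: don't interrupt page parsing
user_pref("browser.cache.memory.capacity", 65536);     // tip 6: 64 MB memory cache (KB)
```

Note that user.js values override any changes you later make in about:config, so remove the line from the file if you want to tune a setting by hand.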

7. Enable TraceMonkey

TraceMonkey is a new Firefox feature that compiles slow JavaScript into super-speedy x86 code, letting some functions run up to 20 times faster than in the current version. It's still buggy, so it isn't available in the regular Firefox download yet, but if you're willing to risk the odd crash or two then there's an easy way to try it out.

Install the latest nightly build (ftp://ftp.mozilla.org/pub/firefox/nightly/latest-trunk/), launch it, type about:config in the address bar and press Enter. Type JIT in the filter box, then double-click javascript.options.jit.chrome and javascript.options.jit.content to change their values to true, and that's it - you're running the fastest Firefox Javascript engine ever.

8. Compress data

If you've a slow internet connection then it may feel like you'll never get Firefox to perform properly, but that's not necessarily true. Install toonel.net (toonel.net) and this clever Java applet will re-route your web traffic through its own server, compressing it at the same time, so there's much less to download. And it can even compress JPEGs by allowing you to reduce their quality. This all helps to cut your data transfer, useful if you're on a limited 1 GB-per-month account, and can at best double your browsing performance.

How Linux lost the battle for your desktop

It may not be on your PC, but chances are Linux is somewhere in your home

A few years ago, it looked like Linux might – just might – take over the world. Companies like Lindows/Linspire were going to make it easy enough for your mother to use. Bright coloured boxes of SUSE and Red Hat and plenty of others were piled high in every computer store. The letters MS rarely went without an ironic '$', and oh, how we laughed. It was going to be a whole new era.

Except it didn't really happen, did it? The promise of Linux becoming a dominant player on the desktop was always just over the horizon, and over the years, the visible excitement waned.

What went wrong?

The most important thing is that desktop Linux was only ever 'easy' for two sets of people: hardcore types, and people whose needs stretched no further than a specific set of tools that could be installed and locked down for them. If all you want from your PC is the ability to edit documents, send emails and browse the web, Linux is indeed easy to use. The trouble comes when you want to install a new printer, or play a new game, or the screen fills up with babble.

Tools like Linspire's Click'N'Run Warehouse went some distance towards fixing this, but could only go so far without the support of hardware manufacturers and software developers. It's the chicken-and-egg problem: without a big market, companies are reluctant to spend money supporting it; without support, the market can't grow.

Making it tougher was the fact that few people had a reason to move. Far from being the centre of the world, to most people Windows is just the thing that came with the PC and lets their real programs run. A solid reason was required to get people to switch, and none really presented itself.

Don't get us wrong – many of the reasons provided are valid. You'll find a list of the most common ones at makethemove.net/why.php. However, there's a massive gulf between technical improvements and compelling benefits for the average PC user, who's never even going to run into that kind of page, never mind pull the trigger on a distro. Viruses and spyware are covered by tools like Norton and McAfee to the level that most people care about: feeling protected. Free software? Most of the best stuff, like OpenOffice.org and Firefox, is available on Windows.

The OS X coffin nail

Another nail in the coffin came in the form of OS X, which fulfilled most of the criteria that the average user actually cared about, and did so with style on its side. It's easy to use. The technobabble is hidden behind the scenes. Its apps are slick and well integrated. The hardware support quickly became relatively ubiquitous.

Where Linux was offering a technological advantage, Apple was in place to offer a whole new lifestyle to people wanting to step away from the Windows hegemony. Linux could offer many of the same features – indeed, OS X is based on a UNIX core – but the stigma of being the geeky OS never left it. Even if the user never had to compile anything or go to the terminal, they'd be hit in the face with that side of maintaining their system every time they went hunting for advice. Online comedy group Three Dead Trolls In A Baggie summed up the general sentiment in a verse of its snappily named 2001 song Every OS Sucks:

"It's free, they say, if you can get it to run
The geeks say 'Hey, that's half the fun!'
Yeah, well I got a girlfriend and things to get done.
The Linux OS sucks!"

Invisible Linux

The irony of all this negativity is that Linux has never been closer to the victory its supporters have dreamed of all these years. It's simply coming from the other direction – encircling the enemy instead of stampeding towards its stronghold. Just because most of us don't see Linux on our screens when we boot up doesn't mean we don't use it every day. Linux-based netbooks like the Asus Eee are not only affordable, but beat their Windows equivalents on price and performance any day. Google's Android, whose first handset is due out this month, is based on Linux too, as are many portable devices, PVRs like TiVo, and the web servers that run so many of our important services.

We're never likely to see Microsoft bow its head in defeat and step back, but that was always an impossible idea. Just because we increasingly won't be aware of Linux in no way means that it's given up the fight. The difference is that this time, it's in a battle it might eventually win.

10 Best Hacking and Security Software Tools for Linux

Linux is a hacker’s dream computer operating system. It supports tons of tools and utilities for cracking passwords, scanning network vulnerabilities, and detecting possible intrusions. I have here a collection of 10 of the best hacking and security software tools for Linux. Please always keep in mind that these tools are not meant to harm, but to protect.

1. John the Ripper

John the Ripper is a free password cracking software tool initially developed for the UNIX operating system. It is one of the most popular password testing/breaking programs as it combines a number of password crackers into one package, autodetects password hash types, and includes a customizable cracker. It can be run against various encrypted password formats including several crypt password hash types most commonly found on various Unix flavors (based on DES, MD5, or Blowfish), Kerberos AFS, and Windows NT/2000/XP/2003 LM hash. Additional modules have extended its ability to include MD4-based password hashes and passwords stored in LDAP, MySQL and others.
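
At its core, the dictionary attack John automates is a simple loop: hash each candidate word and compare it with the stolen hash. A minimal sketch in Python, not John itself; the wordlist and the raw, unsalted MD5 "hash" are simplified for illustration (John also handles salts, many hash formats, and candidate mangling rules):

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Return the word whose MD5 digest matches target_hash, or None.

    This is only the core comparison loop; real crackers add salting,
    multiple hash algorithms and rules that mutate each candidate.
    """
    for word in wordlist:
        if hashlib.md5(word.encode()).hexdigest() == target_hash:
            return word
    return None

# Hypothetical example: the "stolen" hash is simply MD5 of "letmein".
stolen = hashlib.md5(b"letmein").hexdigest()
guesses = ["password", "123456", "letmein", "qwerty"]
print(dictionary_attack(stolen, guesses))  # prints: letmein
```

The weak link this exploits is predictable passwords: if the password isn't in the wordlist (or derivable from it by mangling rules), the attack finds nothing.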


2. Nmap

Nmap is my favorite network security scanner. It is used to discover computers and services on a computer network, thus creating a "map" of the network. Just like many simple port scanners, Nmap is capable of discovering passive services on a network despite the fact that such services aren't advertising themselves with a service discovery protocol. In addition Nmap may be able to determine various details about the remote computers. These include operating system, device type, uptime, software product used to run a service, exact version number of that product, presence of some firewall techniques and, on a local area network, even vendor of the remote network card.

Nmap runs on Linux, Microsoft Windows, Solaris, and BSD (including Mac OS X), and also on AmigaOS. Linux is the most popular nmap platform and Windows the second most popular.
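
The simplest of the techniques Nmap offers, a TCP connect() scan, is easy to sketch in a few lines of Python. This illustrates the idea only; Nmap adds SYN scanning, OS and service fingerprinting, timing control, and much more:

```python
import socket

def scan_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`.

    A bare-bones connect() scan: try a full TCP handshake on each port
    and record the ones that succeed. Noisy and slow compared with
    Nmap's half-open SYN scan, but it needs no special privileges.
    """
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports
```

For example, `scan_ports("127.0.0.1", range(1, 1025))` checks the well-known ports on your own machine. Only scan hosts you are authorized to test.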


3. Nessus

Nessus is a comprehensive vulnerability scanner. Its goal is to detect potential vulnerabilities on the tested systems, such as:

- Vulnerabilities that allow a remote cracker to control or access sensitive data on a system.
- Misconfigurations (e.g. an open mail relay, missing patches, etc.).
- Default passwords, a few common passwords, and blank/absent passwords on some system accounts. Nessus can also call Hydra (an external tool) to launch a dictionary attack.
- Denials of service against the TCP/IP stack by using mangled packets.

Nessus is the world's most popular vulnerability scanner, estimated to be used by over 75,000 organizations worldwide. It took first place in the 2000, 2003, and 2006 security tools survey from SecTools.Org.


4. chkrootkit

chkrootkit (Check Rootkit) is a common Unix-based program intended to help system administrators check their system for known rootkits. It is a shell script that uses common UNIX/Linux tools like the strings and grep commands to search core system programs for rootkit signatures, and compares a traversal of the /proc filesystem with the output of the ps (process status) command to look for discrepancies.

It can be used from a "rescue disc" (typically a Live CD), or it can optionally use an alternative directory from which to run all of its own commands. These techniques allow chkrootkit to place a little more trust in the commands upon which it depends.

There are inherent limitations to the reliability of any program that attempts to detect compromises (such as rootkits and computer viruses). Newer rootkits may specifically attempt to detect and compromise copies of the chkrootkit programs or take other measures to evade detection by them.
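
The /proc-versus-ps comparison described above can be sketched in Python. This is an illustration of the idea, not chkrootkit's actual code; note that processes starting or exiting between the two snapshots can produce harmless one-off differences:

```python
import os
import subprocess

def hidden_pids():
    """Return PIDs visible in /proc but missing from ps output.

    A rootkit that hooks ps to hide a process often cannot also hide
    the process's /proc entry, so such a discrepancy is suspicious.
    Returns an empty set on systems without a /proc filesystem.
    """
    if not os.path.isdir("/proc"):
        return set()
    # Snapshot 1: numeric directories under /proc are process IDs.
    proc_pids = {int(d) for d in os.listdir("/proc") if d.isdigit()}
    # Snapshot 2: what ps is willing to admit exists.
    ps_out = subprocess.run(["ps", "-e", "-o", "pid="],
                            capture_output=True, text=True).stdout
    ps_pids = {int(tok) for tok in ps_out.split()}
    # A real checker would retry to rule out processes that started or
    # exited between the two snapshots before raising an alarm.
    return proc_pids - ps_pids
```

On a clean system the returned set should be empty or contain only short-lived processes that exited between the two snapshots.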


5. Wireshark

Wireshark is a free packet sniffer computer application used for network troubleshooting, analysis, software and communications protocol development, and education. In June 2006, the project was renamed from Ethereal due to trademark issues.

The functionality Wireshark provides is very similar to tcpdump, but it has a GUI front-end, and many more information sorting and filtering options. It allows the user to see all traffic being passed over the network (usually an Ethernet network but support is being added for others) by putting the network interface into promiscuous mode.

Wireshark uses the cross-platform GTK+ widget toolkit, and is cross-platform, running on various computer operating systems including Linux, Mac OS X, and Microsoft Windows. Released under the terms of the GNU General Public License, Wireshark is free software.


6. netcat

netcat is a computer networking utility for reading from and writing to network connections on either TCP or UDP.

Netcat was voted the second most useful network security tool in a 2000 poll conducted by insecure.org on the nmap users mailing list. In 2003, it gained fourth place, a position it also held in the 2006 poll.

The original version of netcat is a UNIX program. Its author is known as *Hobbit*. He released version 1.1 in March of 1996.

Netcat is fully POSIX-compatible, and several implementations exist, including a rewrite from scratch known as GNU netcat.
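
netcat's whole job, reading and writing raw bytes over a socket, takes only a few lines to imitate. A sketch in Python of a one-shot echo listener and a sender, roughly `nc -l` paired with `echo data | nc host port`; the function names are ours, not netcat's:

```python
import socket

def nc_send(host, port, data):
    """Open a TCP connection, send `data`, and return whatever comes back.

    Roughly `echo data | nc host port`: write our bytes, signal EOF,
    then read the peer's reply until it closes the connection.
    """
    with socket.create_connection((host, port)) as s:
        s.sendall(data)
        s.shutdown(socket.SHUT_WR)       # tell the peer we're done sending
        chunks = []
        while chunk := s.recv(4096):
            chunks.append(chunk)
        return b"".join(chunks)

def nc_listen_once(server_sock):
    """Accept one connection on a listening socket and echo its data back."""
    conn, _ = server_sock.accept()
    with conn:
        data = b""
        while chunk := conn.recv(4096):  # read until the client's EOF
            data += chunk
        conn.sendall(data)
```

The same pattern, pointed at port 80 with an HTTP request as the payload, is how netcat is often used to talk to servers by hand.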


7. Kismet

Kismet is a network detector, packet sniffer, and intrusion detection system for 802.11 wireless LANs. Kismet will work with any wireless card which supports raw monitoring mode, and can sniff 802.11a, 802.11b and 802.11g traffic.

Kismet is unlike most other wireless network detectors in that it works passively. This means that without sending any loggable packets, it is able to detect the presence of both wireless access points and wireless clients, and associate them with each other.

Kismet also includes basic wireless IDS features such as detecting active wireless sniffing programs including NetStumbler, as well as a number of wireless network attacks.


8. hping

hping is a free packet generator and analyzer for the TCP/IP protocol, and one of the de facto tools for security auditing and testing of firewalls and networks. It was used to develop the idle scan technique (invented by the hping author), now implemented in the Nmap Security Scanner. The latest version, hping3, is scriptable using the Tcl language and implements an engine for string-based, human-readable descriptions of TCP/IP packets, so that the programmer can write scripts for low-level TCP/IP packet manipulation and analysis in very little time.

Like most tools used in computer security, hping is useful to both system administrators and crackers (or script kiddies).
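
The low-level packet construction hping performs can be illustrated by hand-packing a TCP header. A sketch in Python (illustration only: actually transmitting it requires a raw socket, root privileges, and a checksum computed over a pseudo-header):

```python
import struct

def build_tcp_header(src_port, dst_port, seq=0, flags=0x02, window=5840):
    """Pack a minimal 20-byte TCP header (no options, checksum left zero).

    flags=0x02 is SYN, the flag an idle scan plays games with. The data
    offset is 5 32-bit words because we include no TCP options.
    """
    offset_flags = (5 << 12) | flags     # data offset in high nibble, flag bits low
    return struct.pack("!HHIIHHHH",
                       src_port, dst_port,
                       seq,              # sequence number
                       0,               # acknowledgment number
                       offset_flags,
                       window,
                       0,               # checksum (left for the sender to fill in)
                       0)               # urgent pointer

hdr = build_tcp_header(40000, 80)
print(len(hdr))  # prints: 20
```

Tools like hping build headers this way field by field, which is what lets them forge arbitrary flag combinations that no normal TCP stack would send.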


9. Snort

Snort is a free and open source network intrusion prevention system (NIPS) and network intrusion detection system (NIDS) capable of performing packet logging and real-time traffic analysis on IP networks.

Snort performs protocol analysis and content searching/matching, and is commonly used to actively block or passively detect a variety of attacks and probes, such as buffer overflows, stealth port scans, web application attacks, SMB probes, and OS fingerprinting attempts, amongst other things. The software is mostly used for intrusion prevention, dropping attacks as they take place. Snort can be combined with other software such as SnortSnarf, sguil, OSSIM, and the Basic Analysis and Security Engine (BASE) to provide a visual representation of intrusion data. With patches for the Snort source from Bleeding Edge Threats, packet-stream antivirus scanning with ClamAV and network anomaly detection with SPADE at layers 3 and 4 (based on historical observation) are also possible.


10. tcpdump

tcpdump is a common computer network debugging tool that runs under the command line. It allows the user to intercept and display TCP/IP and other packets being transmitted or received over a network to which the computer is attached.

In some Unix-like operating systems, a user must have superuser privileges to use tcpdump because the packet capturing mechanisms on those systems require elevated privileges. However, the -Z option may be used to drop privileges to a specific unprivileged user after capturing has been set up. In other Unix-like operating systems, the packet capturing mechanism can be configured to allow non-privileged users to use it; if that is done, superuser privileges are not required.

The user may optionally apply a BPF-based filter to limit the number of packets seen by tcpdump; this renders the output more usable on networks with a high volume of traffic.


Do you have a favorite security software tool for Linux? Feel free to comment and tell us about it.

Fake popup study sadly confirms most users are idiots

By John Timmer

For most of us, security issues happen to "other people"—we block popup ads, we carefully examine dialog boxes and, for those of us on the Mac platform, we snicker when confronted with something that attempts to mimic a Windows system warning. But everyone knows that they are exceptional—what's the behavior of a more typical user like? Some researchers have tested how college students respond to fake dialog boxes in browser popup windows and found that the students are so anxious to get the dialog out of the way, they click right through obvious warning signs.

The authors, who work in the Psychology Department of North Carolina State University, crafted a set of four fake dialog boxes. All of them contained the following warning: "The instruction at '0x77f41d24 referenced memory at '0x595c2a4c.' The memory could not be 'read.' Click OK to terminate program." One of the warnings was indistinguishable from the standard Windows XP system dialog, but the remaining three had a number of warning signs that should tip off users to potential malware.

In all cases, mousing over the "OK" button would cause the cursor to turn into a hand icon, behavior more typical of a browser control; all dialogs also had minimize and maximize buttons, while a second fake added a browser status bar to the bottom of the window. Finally, the most blatant one alternated between black text on a white background and a white-on-black theme. All of these should metaphorically scream, "This is not safe!"

The researchers then created a Flash application that would mimic loading a series of medical websites. They told a panel of 42 college students that they were supposed to watch as the sites loaded and then would be asked a series of questions about the sites. Instead, the fake dialogs were loaded in a random order and the users' responses were tracked.

Of the 42 students, 26 clicked the OK button for the "real" dialog. But 25 clicked the same button for two of the fakes, and 23 hit OK on the third (the one with the status bar showing). Only nine of them closed the window—two fewer than had closed the real dialog. In all cases, a few of the users simply minimized the window or dragged it out of the way, presumably leaving the machine's next user at risk.


Even this warning might not have helped

The response time, which tracked how long it took the users to perform any action, was not significantly different among the different dialogs, indicating that there wasn't even any thought expended on evaluating the fakes.

Follow-up questions revealed that the students seemed to find any dialog box a distraction from their assigned task; nearly half said that all they cared about was getting rid of these dialogs. The results suggest that familiarity with Windows dialogs has bred a degree of contempt, and that users simply don't care what the boxes say anymore.

The authors suggest that user training might help more people recognize the risks involved with fake popups and the diagnostic signs of genuine Windows dialogs, but the fact that the students didn't appear to spend any more time evaluating the fake dialogs raises questions as to whether education is enough.

The study will appear in the Proceedings of the Human Factors and Ergonomics Society.

ISP: It’s Impossible For Us to Stop Illegal P2P

Written by enigmax

An ISP which was ordered by a court to stop illegal file-sharing on its network says it simply cannot. The Belgian ISP Scarlet says the court's verdict is unworkable: after trying to slow P2P traffic and then to filter it, it says it's not possible to stop the flow of illicit files, since Audible Magic doesn't work.

In mid-2007, after a battle with copyright group SABAM, a court in Belgium ruled that Internet Service Providers can be forced to block and/or filter copyright infringing files on P2P networks. Although most people familiar with the technical hurdles recognized that this was a massive if not impossible task, the judge in the case ruled that ISPs are indeed capable of blocking infringing content and gave Scarlet six months to comply.

Scarlet said right from the start that it believed that if it complied with the court order it would be breaking the law. The ISP claimed that Belgian law forbids it from spying on its customers so it lodged an appeal against the ruling, with managing director Gert Post saying: “This measure is nothing else than playing Big Brother on the Internet. If we don’t challenge it today, we leave the door open to permanent, and invisible and illegal, checks of personal data.”

Now, over a year later, Scarlet’s lawyers argued in court that the company simply cannot stop the flow of illicit files, which is a serious situation since the ISP has to pay compensation of 2,500 Euros for each day it fails to do so. According to a report, Scarlet has tried different techniques to try to comply with the ruling but has had no success.

First of all, Scarlet slowed down P2P traffic with the help of some Cisco technology. All this achieved was complaints from customers, and it did nothing to stop the availability of the illicit files. A lawyer for Scarlet, Christoph Preter, said: “We have actually received complaints that P2P traffic was slower, but it remained possible. It is only a deterrent measure.”

The ISP quite rightly refused to block all P2P traffic, since it said it would be blocking legitimate traffic too. However, copyright group SABAM said this was not a valid excuse. “The argument put forward by Scarlet,” said SABAM’s lawyer, “is not about the impossibility of blocking, but about the consequences.” SABAM clearly doesn’t care who is affected, as long as it gets its way, stating that Scarlet simply hasn’t tried hard enough to comply with the court.

The second solution, the filtering of illicit files, was a solution put forward last year by SABAM itself. On the advice of an appointed P2P ‘expert’, the court ruled that Scarlet must use the content filtering technology offered by Audible Magic. However, Scarlet tried this system and it didn’t work when scanning for files on their network. During last year’s court case it was claimed that Audible Magic had experience with filtering in the US with Verizon and in Asia with another ISP. However, Scarlet made inquiries with Verizon about the partnership but was told that no such deal exists and Audible Magic refused to reveal who the Asian ISP is.

“We have misled the court,” said SABAM’s lawyer. “But SABAM followed the expert in the choice of Audible Magic, so we were acting in good faith.”

A ruling in the case is not expected until 2010.

Americans text more than they talk

Posted by Marguerite Reardon

American cell phone users are sending more text messages than they are making phone calls, according to a Nielsen Mobile survey released Monday.

For the second quarter of 2008, U.S. mobile subscribers sent and received on average 357 text messages per month, compared with making and receiving 204 phone calls a month, according to Nielsen. The new statistic is a clear indication that Americans have jumped onto the SMS text bandwagon.

On average, American teens send and receive 1,742 text messages a month.

(Credit: Marguerite Reardon/CBS Interactive)

In the first quarter of 2006, Americans sent and received 65 text messages per month. The number of messages sent and received today has increased 450 percent. But even though people are texting more, it doesn't mean that they've stopped talking on the phone. According to Nielsen, the number of phone calls that people make and receive each month has remained relatively flat over the past two years.

The wireless industry's trade association, CTIA, recently noted the explosion in texting in its own report. It recently reported that for the month of June, American cell phone subscribers sent about 75 billion SMS text messages, averaging about 2.5 billion messages per day. This represents an increase of 160 percent over the 28.8 billion messages reported in June 2007.
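
The CTIA figures are easy to sanity-check; a quick calculation using the article's own numbers:

```python
june_2008 = 75e9    # SMS messages sent in June 2008 (CTIA)
june_2007 = 28.8e9  # SMS messages sent in June 2007

per_day = june_2008 / 30                             # June has 30 days
growth = (june_2008 - june_2007) / june_2007 * 100   # year-over-year increase

print(f"{per_day / 1e9:.1f} billion messages per day")  # prints: 2.5 billion messages per day
print(f"{growth:.0f}% year-over-year growth")           # prints: 160% year-over-year growth
```

Both headline numbers in the CTIA report check out against the raw monthly totals.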

Short Message Service, or SMS, text messaging first became popular in Europe and Asia, because it was much cheaper to send short text messages than make an actual phone call. In countries such as the Philippines, the cost of sending one text is less than a penny. And in Europe where cell phone users are still penalized with high roaming charges between countries, texting is still a more economical form of communication.

But in the U.S. texting is proving to be a cash cow for carriers. Over the past two years, the cost of sending and receiving individual text messages without a special text message package has gone up 100 percent, to 20 cents per message. Carriers now offer unlimited texting plans for an additional $20 a month, which makes texting more affordable for heavy texters.

The surge in text messaging is being driven by teens 13 to 17 years old, who on average send and receive about 1,742 text messages a month. Teens also talk on the phone, but at a much lower rate, only making and receiving about 231 calls per month. The report even suggests that tweens or kids under the age of 12 are also heavy text users, averaging about 428 messages per month.

Judge: School can suspend students over fake MySpace profile

By Jacqui Cheng

A federal judge has ruled that a Pennsylvania school can suspend two eighth-graders who created a fake MySpace profile of their principal depicting him as a pedophile and a sex addict, among other things. The September 11 ruling said the students' civil rights were not violated despite their actions taking place off school grounds because the language used on the profile was "lewd and vulgar," and because it was akin to speech that promoted illegal actions. Given the recent prevalence of fake MySpace profiles meant to taunt or harass others at school, this ruling could help decide future cases related to student speech online.

The story goes back to March 2007, when a profile representing the principal of Blue Mountain Middle School, James McGonigle, popped up on MySpace. The profile didn't explicitly identify McGonigle by name, but used a photo of him taken from the school district's website and labeled him as principal. Among McGonigle's alleged interests listed on MySpace were "f****** in my office" and "hitting on students and their parents." The profile also had a statement with the headline "HELLO CHILDREN," that read (in part), "yes. It's your oh so wonderful, hairy, expressionless, sex addict, fagass, put on this world with a small d*** PRINCIPAL I have come to myspace so I can pervert the minds of other principals to be just like me."

The two students who created the profile did so at home on one of their parents' computers, and according to court documents, a fair number of other students were already talking about the profile at school the next day. The students behind the profile, referred to as J.S. and K.L., claimed they set the profile to private that day, but McGonigle himself was still able to access it from a public computer days later. After speaking with J.S. and K.L. in his office along with a guidance counselor, they admitted to creating the profile. They were suspended for 10 days for violating the school discipline code, which prohibits making false accusations against school staff members, as well as copyright infringement for using his photo without permission. The students had the opportunity to appeal their discipline to the school board, but declined to do so.

It was at this point J.S. and her parents filed a lawsuit against the school district, the school, the superintendent, and McGonigle. The attorneys for J.S. argued that the school violated her First Amendment rights to free speech and that the Constitution prohibits the school district from disciplining a student's out-of-school conduct that does not cause a disruption of classes or school administration. As an example, they presented the US Supreme Court case of Tinker v. Des Moines from 1969, which ruled that students do not shed their constitutional rights to free speech when they enter the schoolhouse.

In this case, however, US District Judge James Munley ruled that Tinker v. Des Moines wasn't applicable. In the Tinker case, high school students engaged in silent protest of the Vietnam war by wearing black armbands at school, which did not interfere with schoolwork or with the rights of others. "In the instant case, the speech is not political; rather, it was vulgar and offensive statement [sic] ascribed to the school principal," wrote Munley in his 20-page opinion.

Munley cited another case, Bethel School Dist v. Fraser, where a high school student was suspended for using elaborate sexual metaphors during a school assembly to refer to another student; the suspended student filed a civil rights case but lost due to the vulgarity of the speech. "The profile contains words such as 'fucking,' 'bitch,' 'fagass,' 'dick,' 'tight ass,' and 'dick head,'" wrote Munley. "The speech does not make any type of political statement. It is merely an attack on the school's principal. It makes him out to be a pedophile and sex addict. This speech is not the Tinker silent political protest. It is more akin to the lewd and vulgar speech addressed in Fraser."

One of the attorneys arguing the case for J.S., Mary Catherine Roper of the American Civil Liberties Union of Pennsylvania, said in an interview she was disappointed with the outcome of the case, according to Law.com. She said that Munley failed to recognize that a school cannot restrict a student's speech "anywhere it is uttered" simply because it's vulgar and targets a school official. In the precedents cited by both sides, the students' conduct occurred in school, not at home behind a computer screen. Munley wrote in his opinion, however, that the intended audience was clearly other students at the school and that the profile was being discussed there, which helped transform the case from an off-campus matter into an on-campus issue.

Roper said that an appeal was still up in the air, but this issue isn't likely to go away anytime soon, regardless of whether J.S. and her parents choose to appeal. The combination of social networking sites, disliked school administrators, and perceived anonymity may be too tempting for some disgruntled students to resist, but this case demonstrates that there are limits to what students can say online.

Original here

RIAA rejects damage award, forces trial, looks hypocritical

By Eric Bangeman

What price innocent infringement? That's the question a San Antonio jury will have to address in mid-November, as the RIAA and 20-year-old Whitney Harper battle in court over the amount of damages Harper will have to pay to the record labels after being found liable for copyright infringement by a federal judge.

Harper was 16 when MediaSentry discovered and downloaded a number of tracks from what proved to be her shared folder on KaZaA. The record labels sued her father, but he was dropped from the suit once Whitney admitted to using KaZaA for downloading and sharing music. Her admission was enough to convince Judge Xavier Rodriguez to hand the RIAA a summary judgment this past August, but he ruled that damages be capped at $200 per song, not the $750 minimum sought by the RIAA.

The reason for the $200 limit was Harper's innocent infringement defense. She admitted to using KaZaA, but said that she didn't know that what she was doing was wrong due to her age at the time and general lack of knowledge about how computers and P2P systems work. There was no warning from KaZaA that the music on the network was "stolen or abused copyrighted material," noted Judge Rodriguez in his opinion. He also agreed with Harper's assertion that she had "no knowledge or understanding of file trading, online distribution networks or copyright infringement" and that she believed there was nothing illegal about her activities.

Ignorance of the law may be no defense against being held liable for infringement, but it can put a serious cap on damages. Under the Copyright Act, infringement is normally punishable by statutory damages of $750 to $30,000 per act, and the upper limit can be raised to $150,000 if the infringement is deemed willful. But for cases of innocent infringement, the judge can reduce the damages below the $750 floor. In the case of Maverick v. Harper, the judge told the record labels they could accept damages of $200 per song or have a jury decide what the total damages should be; the RIAA has chosen a jury trial over the damages.

The RIAA's decision to reject the judge's award appears a bit hypocritical on a couple of levels. The group has said on numerous occasions that its legal campaign against P2P users isn't about making money—indeed, an industry executive testified during the Jammie Thomas trial that the lawsuits are a money-losing proposition. Instead, the suits are meant, among other things, as a deterrent to copyright infringement and to teach P2P users a lesson.

Here, it seems painfully clear that the lesson has been taught. Harper has been found liable for 37 counts of copyright infringement and the judge is willing to award the RIAA $7,400 in damages—almost double what her father would have had to pay had he accepted the terms of the RIAA's prelitigation settlement letter back in 2005. Furthermore, in the 28,000-plus copyright infringement lawsuits filed by the record labels, the RIAA has never once asked for a specific monetary damages figure, saying instead that it would be content with whatever the court deemed appropriate (former RIAA head litigator Richard Gabriel made this very statement during the Thomas trial). Judge Rodriguez did exactly that, and the labels have decided that it wasn't enough. Now, the RIAA will tie up a federal courtroom, a judge, and a jury for a few days before Thanksgiving in hopes of extracting an additional pound of flesh from someone who says she didn't know she was doing anything wrong back when she was just 16 years old.
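For a sense of the stakes, the figures reported above are simple enough to check; here's a quick Python sketch (the 37-song count and dollar amounts are taken from the article, not independently verified):

```python
# Sanity-check of the damages figures reported above. The 37-song
# count and all dollar amounts come from the article itself.

songs = 37               # counts of infringement found against Harper
innocent_rate = 200      # per-song award under the innocent-infringement cap
statutory_min = 750      # normal statutory floor per work
statutory_max = 30_000   # normal statutory ceiling per work

innocent_total = songs * innocent_rate   # the award Judge Rodriguez offered
floor_total = songs * statutory_min      # what the $750 statutory floor would yield
ceiling_total = songs * statutory_max    # theoretical maximum at the normal ceiling

print(f"${innocent_total:,}")   # $7,400
print(f"${floor_total:,}")      # $27,750
print(f"${ceiling_total:,}")    # $1,110,000
```

The $20,350 gap between the judge's award and what the $750 statutory floor would yield is, presumably, what the RIAA hopes a jury will close.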

Original here

How Businesses Can Benefit From Social Networking

For all the talk of businesses embracing Web 2.0 and social software tools, most companies are still at the very early stages of adoption, says Jonathan Yarmis, an analyst at AMR Research who focuses on emerging technologies. In his latest research note on companies taking their first step into social media, he says that companies must avoid the "Kumbaya Zone" - the place where social media is ultimately a time-waster and has little business value.

Yarmis talked with CIO.com about where companies have been missing the boat with Web 2.0 and social media, and what they must remember when they get started.

CIO.com: Social networking and related technologies have been popular in the consumer space for years now, but businesses have sometimes had difficulty finding value. What are they doing wrong?

Jonathan Yarmis: The conventional wisdom for companies looking to use social media, and I get sick of hearing it, is that "you need to engage in the conversation" on a Facebook or Twitter. Well, you need to ask yourself, why are you engaging in the conversation? What are you trying to accomplish by it? Merely engaging in the conversation doesn't make some kind of magical outcome occur.

In fact, there are some conversations that aren't a good use of your time and that will lead you down counterproductive paths. You want to get involved in conversations that are meaningful and impact your business.

What Comcast has done on Twitter has been brilliant. Cable companies, in general, do not do well with customer service. But there has been a lot of buzz about how Comcast has been using Twitter to foster customer service and answer questions about their service. That is a case of knowing which conversation to pick. They haven't changed much about their customer service, but they've changed the perception of it with some of their customers who use Twitter.

CIO.com: You noted that many companies start with the intention of getting involved with social media externally, perhaps by joining a social network or setting up social technologies on their own websites. But more often than not, they end up doing their most meaningful implementations internally first. Why is this?

Yarmis: External collaboration is a very complicated dynamic. Internally, you can control a culture and things might spread virally pretty quickly. The internal things will also occur naturally. Users will do these things themselves and always have. Users will utilize Web 2.0 or collaborative tools on their own. Just look at how they use SharePoint.

But once you get outside the boundaries of the company, you lose that kind of control and more complex relationships come into play. There may be legal requirements. All the health care or financial organizations I have conversations with who want to do something with social media have this specter of regulatory issues or European data privacy laws. While the younger generation seems happy to share everything with everyone, those up the food chain will say the information we share outside the organization has to be more carefully controlled than the stuff inside.

There are companies that are saying collaborative innovation on the Web is the future. I've got to learn new methods of sharing and get out of my comfort zone. There are other companies, though, that say, that sounds great and very "kumbayah," but I'm not going to share everything. I want to manage my intellectual property very carefully.

CIO.com: How would you describe the method for implementing social software to start? How is it different than traditional software?

Yarmis: In some ways it's epitomized by Google: throw something up there and see what sticks. The days of these multiyear, multimillion-dollar contracts and software deployments - I won't say they'll be gone completely, but they basically just don't make sense anymore. If you start with some lofty business objective like "we want to improve customer satisfaction," you still need to just take a first step [with social media] and see what happens.

I see so many companies come up with 137 ideas about what they want to do with social media, but they should just start with one. Get comfortable taking that first step and see what works along the way. That is way different than the software model we're used to.

Original here

T-Mobile G1 first hands-on (updated)

by Joshua Topolsky

Yep -- there it is. We finally, finally got our mitts all over the very first Android device, the T-Mobile G1 -- hanging out in the crowd, waiting for the official announcement, naturally -- and so far we like what we see. The phone is thinner than we expected, and it feels pretty solid in your hand (though they've opted for an almost all-plastic device, no metal here). The keyboard seems usable and reasonably well thought-out, and the slider action is like butter, with a nice little swoop for good effect. But really, the pictures tell the whole story, so check out the gallery below!


Update: We're adding another gallery as we speak. Here are some initial observations: the browser is much choppier than the iPhone's, there seem to be two separate mail apps, one for Gmail and a separate IMAP app, and there seems to be no multitouch functionality. Check out the gallery below for a lot more views, and we'll be updating this as necessary!

Meet the T-Mobile G1

Uh, c'mon guys -- this is a little ridiculous. Now typically, you want to keep this sort of thing under wraps until the day of your event, but it seems like the cats and kittens at T-Mobile can't contain themselves. Yes, you're looking at the first official product shot of the G1 Android phone ever. Enjoy it.

Update:
TmoNews has just uncovered new specs and info on the phone. Here's what they've got so far: the phone is 4.6 x 2.16 x 0.63 inches, weighs 5.6 ounces, features a 480 x 320 HVGA display, sports 3G (obviously), GPS, has a 3.1-megapixel camera, supports up to 8GB of memory (though no format is mentioned), and will feature 5 hours of talk time and 130 hours of standby. Strangely, the phone won't do video capture (what?), won't have stereo Bluetooth, will require a Gmail account, and won't be sold at stores outside of a 2-5 mile radius of T-Mobile's 3G coverage areas. That last bit sounds a little odd to us, but we're guessing a lot of the functionality of this device will be shot in non-3G regions.

Original here

Adobe releases Creative Suite 4

Posted by Elsa Wenzel

Adobe released details Monday about Creative Suite 4, its first update to more than a dozen design and editing tools since Adobe CS3 some 17 months ago.

The costs of the applications, set to reach consumers in October, haven't changed since CS3, but remain hefty. Should longtime users upgrade?


Of course that depends on the specific tools you need. However, we suspect that only the most well-heeled will jump at the chance, as CS4 shares the majority of tools with its predecessor. Perhaps more dramatic, life-changing alterations will come with the next Creative Suite. That said, time-saving tweaks to Illustrator and Flash in particular could lure professionals immersed in them to upgrade.

With CS4, Adobe aimed to unify the interfaces of more than a dozen applications, including Flash and other former properties of Macromedia. You'll see similar pull-down menus for toggling among workspaces that you can customize, as well as Flash-based panels that nicely snap open and shut. Corporate design departments will find plenty of enhancements for their teams to share work more quickly.

Adobe continues to improve integration among the applications. After Effects, as only one example, can import Photoshop 3D layers and export content directly into Flash.

Options for working with high-definition video and mobile content expand too, with support for the latest formats as well as for making Adobe AIR applications. Among other highlights:

Photoshop CS4 will use your computer's graphics chip for the first time, while offering support for 64-bit Windows.

At long last, you can handle more than one project at a time in Illustrator, thanks to the new multiple Artboards feature.

Flash CS4 has a rebuilt animation model, so you can make objects move on the stage in two quick steps. And Flash introduces a new, XML-based file format.

Dreamweaver provides plenty of shortcuts to CSS coding, including within the Properties panel.

We've been toying with the beta code of CS4 for several weeks. Check out our first take reviews and videos of the six suites and their individual applications for more details. We'll report back with rated reviews after working with the final code.

Original here

iPhone Developer: I Just Made $250K From App Store In Two Months (AAPL)

Steve Demeter developed the iPhone puzzle game Trism as a side project, but now he's quitting his day job. Why? Because he says he's generated $250,000 in profits since he started selling the $4.99 game on iTunes this summer. That's after Apple (AAPL) has taken its 30% cut of total sales, and after subtracting his initial investment of about $5,000.

So while we've heard plenty of griping from developers who complain about Apple's restrictive grip on its store, you're certainly not going to hear Steve joining that chorus. In fact, he says, he's so pleased with Apple that he's going to work exclusively with them, and will pass on the chance to work on other platforms, like Google's Android.

Why cut himself off from other markets? In part, because he's doing just fine with Apple. But Steve also says that Google's strategy of distributing its OS to multiple manufacturers who will create multiple handset models will actually cause him more headaches than it's worth.

“Do I want to be spending 6 months to write the game, and another 6 months making it compatible? If I had Trism available for Android, and there are 50 Android devices and every time one of them crashes (the users) contact me, do I want that?”

So if he’s not expanding to the other mobile platforms, what is Steve going to do with his newfound wealth? He says he’s actively looking to hire more people – engineers and artists specifically. While he started off on his own, he now has four more people working for him in San Francisco, working on 5 more iPhone games.

Original here

uTorrent’s Mac Client Leaked

Written by Ernesto

An early Alpha release of the long awaited Mac version of the popular BitTorrent client uTorrent has leaked to the public. The application is still in development, but most features seem to work just fine. As expected, the application looks very Mac-like, and better than its Windows counterpart.

Thus far, only Windows users have had the pleasure of running uTorrent. The client saw its first public release in September 2005, and soon became the most widely used BitTorrent application. In 2006, uTorrent was acquired by BitTorrent Inc., which continued to develop the application, and promised a Mac version too.

The Mac version came later than expected. One of the initial developers was taken off the project, and the others were focusing more on the Windows release. This August, however, uTorrent developer Greg Hazel told TorrentFreak that the first public Alpha version of the Mac release would be ready in a few weeks.

It now seems that someone has beaten the uTorrent developers to it, as an early release was posted on The Pirate Bay a few hours ago. A leak of the BitTorrent client, developed in Cocoa, seemed to be inevitable. As mentioned before, it is an Alpha version, and not all the features seem to work like they should (search is broken), but it's definitely a good start.

Simon Morris, BitTorrent’s VP of Product Management told TorrentFreak in a response to the leak: “Apparently an internal development build of uTorrent for Mac has been leaked publicly. It has been referred to as an “alpha” quality build. The unfortunate part is that we did not intentionally release this build and would strongly recommend folks not to use it as it isn’t yet complete or stable enough to be released to the public.”

“The good part is that this is a testament to the fact that we’re serious about releasing uTorrent for Mac in the near future. (And counter to recent rumors, this is indeed the uTorrent code-base ported onto OSX, not just Libtorrent with a Mac UI). Hopefully more news coming soon. We have a sign-up page on the uTorrent website.”

Most of the people who have tried the application are reporting that the application is fully functional, but that it’s clearly an Alpha release. Nevertheless, the first reviews are quite positive. “It seems like the uTorrent every Mac-owner has been waiting for is coming,” an early user told TorrentFreak.

We posted some screenshots of the leaked Alpha release below.

Main Window

Settings

Torrent Details

Original here

Google's Android: It's not just for phones

Posted by Stephen Shankland

The first phone using Google's Android operating system will debut Tuesday, a model from T-Mobile, and more are set to come. But some Android partners say the software will be used more broadly than just in phones.

"We're starting to see Android get designed in on devices that extend way beyond the phone--things that might go in the automobile or things that might go in the home," said John Bruggeman, chief marketing officer at Wind River Systems, a Google ally that helps phone makers build and customize Android for their phone hardware.

It's not clear yet whether Google shares this broader Android ambition--the emphasis today is on mobile phones--but extending into new areas could increase both the prominence and competitive threat of the project. However, projects that spread wider also can be stretched thinner, and advantages such as broader developer interest could be offset by incompatibilities and other drawbacks.

Bruggeman declined to share specifics about which Internet-connected devices might employ the operating system, but he did mention TVs and set-top boxes as well as cars. And he was confident some will arrive next year.

"I don't want to pre-announce any design wins," he said. "I think you'll see them in 2009. I would be shocked if you didn't."

Google didn't immediately respond to a request for comment.

Of course, Android is mostly open-source software, so there's nothing stopping people from doing anything they want with it. But Wind River is a notable member of the 34-company Open Handset Alliance that Google gathered to build, support, and use Android.

Wind River has years of experience with so-called embedded operating systems, starting with its own VxWorks and eventually extending to include Linux, which underlies Android. It's also got a lot of customers, and to beef up its Android support services, Wind River acquired mobile Linux firm Mizi Research in August for a price it said could reach $16 million.

The Android business "was significant enough for us that we acquired a company so we had additional resources," Bruggeman said. Mizi is based in Korea, as are LG Electronics and Samsung, two notable phone makers in the Open Handset Alliance.

Much depends on how Google sees the effort. It's got a lot of engineering resources, of course, but perhaps more important, it has a powerful brand, some pull with the programmers it's enlisting to write Android applications, and a strong will to spread Internet access far and wide.

But embedded computing is a tough nut to crack. Wind River, MontaVista Software, and many others have tried to spread Linux to embedded devices. And while they've had significant success, there's a lot of fragmentation, with nothing as universal as Windows or as standardized as the iPhone.

Google has come up with a prominent brand and strong developer program for Android, which brings compatibility issues to the fore. With a brand comes an implicit promise that everything sporting the brand works well together. The broader the Android brand spreads, the more complicated it gets. For example, what if a programmer wants to take advantage of the considerable computing horsepower in an Intel mobile Internet device (MID) for a game--would that work on a comparatively feeble feature phone with a smaller screen, no keyboard, a low-capacity battery, and inferior graphics?

One convenient element Android brings to the compatibility challenge is that software doesn't run on the Linux component of the operating system. Instead, it runs on a Java layer from Google called Dalvik. That means programmers writing applications for Android need not concern themselves with the underlying hardware, such as whether a device is running an ARM-based processor or an x86.

Here again, though, there are some compatibility issues. Sun Microsystems' Java, already used widely in mobile phones, is a slightly different foundation. Software may transfer more easily from one domain to another, but there won't be any guarantees of compatibility.

One of the big elements of the Android sales pitch is openness, though, and that could have appeal in other markets.

Perhaps a developer might want to sell an application that shows nearby geotagged Flickr photos on an in-dash navigation device without having to obtain General Motors' permission first. Perhaps a user might want to download a TV game from Google's Android Market without requiring clearance from Sony.

And of course, people might want to use Google to perform an online search. Which is why Yahoo, Microsoft, and Nokia probably shouldn't be too complacent about the possibility that Android might spread beyond phones.

Original here

Did Apple steal ideas for the iPhone? Here’s proof that it didn’t.

There have been some recent rumblings about Apple supposedly stealing an iPhone idea for a program that would allow users to review information from the iPhone screen while the phone is still locked. In other words, without unlocking the phone, a user would be able to check his/her missed calls, recent text messages, stock prices, and so on.

Here’s where the controversy comes in. Recent Apple patent filings have been unearthed showing an iPhone screen that is extremely similar to an already available iPhone application that was rejected from the App store. On the surface, this seems to be highly suspicious, but digging a bit deeper reveals that all finger pointing is misplaced.

The iPhone patents from Apple date back to June of 2007, but the application in question, known as Intelliscreen, has only ever been available on jailbroken iPhones. iPhones weren't jailbroken until July 2007, so it was impossible for Apple's patents to copy Intelliscreen's program. Moreover, if the patents were filed in June of 2007, it's likely that the idea was conceived weeks, if not months, beforehand. And to put the final nail in the coffin, it seems that Intelliscreen was first released as a beta in May of 2008, a full 11 months after the initial patent filings were made.
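The timeline argument can be laid out explicitly; here's a small Python sketch (the article gives only months, so the first-of-month days below are placeholders chosen purely for ordering):

```python
from datetime import date

# Timeline as reported in the article. Only the months are given;
# the day-of-month values are placeholders used purely for ordering.
apple_patent_filed = date(2007, 6, 1)   # Apple's patent filings
first_jailbreak    = date(2007, 7, 1)   # first iPhone jailbreaks
intelliscreen_beta = date(2008, 5, 1)   # Intelliscreen's first beta release

# Apple's filings predate both the first jailbreak and Intelliscreen,
# so the patents could not have copied the app.
assert apple_patent_filed < first_jailbreak < intelliscreen_beta

# Gap between the filings and the Intelliscreen beta, in whole months.
months_gap = (intelliscreen_beta.year - apple_patent_filed.year) * 12 + \
             (intelliscreen_beta.month - apple_patent_filed.month)
print(months_gap)  # 11 -- matching the article's "full 11 months"
```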

Nevertheless, Intelliscreen is a kick-ass app, as evidenced by the screenshots below, and hopefully a similar program will be available for non-jailbroken iPhones sometime in the near future.

Original here