Monday, February 28, 2011

IT News HeadLines (Techradar) 27/02/2011



Techradar
Explained: A day in the life of an email
Email is now so ubiquitous, we no longer even consider how it all works. Billions of emails are sent each day (the majority of which are spam, admittedly), and even with the rise of social networking, we haven't abandoned email yet.
Some good, some bad; some work-related, some personal: it's the communications medium of the 21st century. But what exactly is an email? How does it get from me to you? What processes and servers have to be running in order to ensure all this magic works to the point where we don't need to worry about it?
Back in the very early days, messages could only be sent from computer to computer on the same network. For this to happen, both computers had to be running and online (that is, both endpoints had to have users logged in) since the originating computer made a direct connection to the destination computer in order to transfer the message.
This worked in essentially the same way that phone switches work to route a call: the originating and the destination phone must be connected directly for the length of the phone call. For computers on the same network, this method worked pretty well, but it didn't scale at all once we started to link local networks together.
The birth of email
In 1969, the precursor of the internet, ARPANet, was created by a research team at MIT and at DARPA (Defense Advanced Research Projects Agency). It was the first packet-switched network, so named because all data traffic was split up into packets. The packets were numbered sequentially and put into digital envelopes, with destination addresses encoded into the envelopes.
ARPANet was a collection of servers, each able to receive and pass packets onto other servers on the network. This meant that a large message would be split into different packets, and each packet might be routed a different way through the network to the destination. Each node on the network knew only enough to pass on packets that weren't destined for itself, and it was the receiving computer that was responsible for collecting all the packets that made up a message and checking that none were missing.
This methodology meant that packets from many different messages from many sources could be interleaved and sent on a link, without the need to tie up the link to send a single message.
A couple of years later in 1971, Ray Tomlinson implemented the first system that we would recognise as email. His system was based on a program that copied files across a network and allowed users of different networks to send messages (as files) to each other.
To help with the addressing of the email, he came up with a simple solution: separate the username from the remote network domain name by use of the '@' sign - a convention we still use today.
The earliest emails sent were text files, usually seven-bit ASCII. Although emails are no longer physical files, they remain as text.
An email consists of two main parts: the header and the message section, separated by a single blank line (that is, a line comprising only a carriage return/line feed).
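As a minimal sketch of that structure (the addresses and message are invented), splitting a raw email at the first blank line cleanly separates header from body:

```python
# A raw email: header lines, a blank line, then the body.
raw = (
    "From: alice@example.com\r\n"
    "To: bob@example.com\r\n"
    "Subject: Lunch?\r\n"
    "\r\n"
    "Fancy a sandwich at noon?\r\n"
)

# The blank line (a CRLF on its own) separates the two sections.
header_text, body = raw.split("\r\n\r\n", 1)

# Each header line is a "Name: value" pair.
headers = dict(line.split(": ", 1) for line in header_text.split("\r\n"))

print(headers["Subject"])  # Lunch?
print(body.strip())        # Fancy a sandwich at noon?
```

Real clients use a proper parser (header values can span lines, for instance), but the blank-line rule is exactly what they rely on.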
Modern messages
Nowadays, the message section can - and usually does - have a lot more structure associated with it thanks to the MIME (Multipurpose Internet Mail Extensions) standard. This standard extends the original seven-bit ASCII-only messages to incorporate other character sets including Unicode, attachments (usually encoded with something like Uuencode or base64) and multiple parts (where a message is encoded as pure text, HTML and rich text within the same email).
The header section remains resolutely ASCII (although MIME does allow for addressing with other character sets). It consists of various header information about the email, such as the subject, the recipient address(es), who sent it, a unique message ID, where replies should go to, and so on.
Figure 1
Email clients usually suppress most of this information when displaying an email, although there's usually a way to show them. Figure 1 shows an example header section from a recent email from the Association for Computing Machinery (ACM). Reading this you can see who sent it (and where to send the response to if I wanted to reply) and when it was sent.
The message itself is in a multipart MIME format (the line that defines the boundary between the parts is shown) - as it happens, the message is represented in both straight text format and in HTML within parts of the email and it's up to the email client as to which is actually displayed to the user.
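Python's standard email module can build exactly this kind of two-part message; a small sketch (addresses made up) showing the plain-text and HTML alternatives side by side:

```python
from email.message import EmailMessage

msg = EmailMessage()
msg["From"] = "alice@example.com"
msg["To"] = "bob@example.com"
msg["Subject"] = "Multipart demo"

# Plain-text part first, then an HTML alternative; the receiving
# client chooses which of the two it displays.
msg.set_content("Hello in plain text.")
msg.add_alternative("<p>Hello in <b>HTML</b>.</p>", subtype="html")

wire = msg.as_string()
print(msg.get_content_type())  # multipart/alternative
# 'wire' now contains both parts, separated by the boundary line
# declared in the Content-Type header.
```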
Routing for emails
What also generally happens when an email is sent across the internet is that intermediary servers add extra routing information to the header section. For simplicity, this information is prepended to the header section, so the server doesn't have to hunt for the end of the section to add it.
The routing information generally details which email systems looked after and rerouted the email on its way to the inbox. For example, I've set up my personal email so that all messages are rerouted to Gmail, which means I can access my email easily using a browser or my phone.
The routing information included on the example email from Figure 1 shows (reading from the bottom upwards) the originating server name, the receipt by my email server at my personal domain, its sending on of the email to Google, the receipt by Gmail, and the final delivery to my inbox (see Figure 2).
Figure 2
By tracing the times shown on the routing information, I can see that the email appeared in my domain's inbox in a matter of seconds, whereas the automated Gmail fetch process took about 30 minutes. Although legitimate email servers will provide valid information as they prepend routing information, many others won't.
Spam emails especially tend to contain fake routing information, so you can't rely on this header information until the point when it reaches your email server.
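The route can be recovered programmatically. In this sketch the 'Received' headers are invented, but the structure mirrors a real email: each server prepends its own line, so reading from the bottom up gives chronological order:

```python
import email

# Hypothetical trace headers: the newest hop sits at the top
# because each server prepends rather than appends.
raw = (
    "Received: by mx.gmail.example with POP3; Mon, 28 Feb 2011 10:31:00 +0000\r\n"
    "Received: from smtp.mydomain.example by mx.gmail.example; Mon, 28 Feb 2011 10:01:00 +0000\r\n"
    "Received: from sender.example by smtp.mydomain.example; Mon, 28 Feb 2011 10:00:55 +0000\r\n"
    "From: acm@example.org\r\n"
    "Subject: hello\r\n"
    "\r\n"
    "body\r\n"
)

msg = email.message_from_string(raw)
hops = msg.get_all("Received")

# Reverse to walk the route origin-first.
for hop in reversed(hops):
    print(hop.split(";")[0])
```

The timestamps after each semicolon are what let you measure the delay at each hop, as described above.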
Having touched on routing for emails, we should take a look at what goes on when you hit 'Send' on an email message until the point when the recipient reads it in their email client.
The vast majority of email uses two types of server to send an email from A to B: the outgoing mail server and the incoming mail server. The outgoing email server is almost certainly an SMTP (Simple Mail Transfer Protocol) server, while the incoming server can be a POP3 (Post Office Protocol) or IMAP (Internet Message Access Protocol) server.
When you set up your email client (let's say this is Microsoft Outlook, since that's what I use), you specify the address of your SMTP server. You also give it the user ID and password assigned to you for using the server (without a properly protected SMTP server, your email account could be hijacked for spam broadcasts).
You write an email in Outlook, specify the recipient and press 'Send'. Outlook formats the message according to the email standards (since 2008 these have been defined in RFC 5322, which superseded RFC 2822 from 2001, which in turn superseded RFC 822 from 1982). It then connects to the SMTP server on port 25, passes the user ID and password for authentication, and sends the email.
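A sketch of that handover using Python's standard library (server name, addresses and credentials are all hypothetical, and the actual network calls are shown commented out so nothing is sent here):

```python
from email.message import EmailMessage

# Build an RFC 5322 message programmatically.
msg = EmailMessage()
msg["From"] = "me@mydomain.example"
msg["To"] = "you@yourdomain.example"
msg["Subject"] = "Test"
msg.set_content("Hello!")

# as_string() yields the wire format the SMTP server receives:
# headers, a blank line, then the body.
wire = msg.as_string()
print(wire)

# The send itself (not run here) would look like:
#   import smtplib
#   with smtplib.SMTP("smtp.mydomain.example", 25) as server:
#       server.login("me", "password")
#       server.send_message(msg)
```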
Once the SMTP server gets the email (and adds its routing information), it looks for the address to send it to within the header section. It strips off the username and the @ sign, leaving the domain name that the email must be sent to. The SMTP server queries the Domain Name System (DNS) for the MX (Mail eXchange) records for that domain name.
The DNS entry for a domain name consists of a set of records defining the addresses of servers that process various types of connection (there are A records, AAAA records, CNAME records, and so on), and the MX record defines the server that can receive emails for the domain. For example, with my personal domain, the A record currently points to 97.74.144.79. This is the IP address of the server that hosts my domain and my website.
My highest-priority MX record (you can have several MX records, ordered by priority - the order in which SMTP servers try to connect to them) points to smtp.secureserver.net, the GoDaddy server that deals with my email. And, yes, your SMTP server then has to resolve smtp.secureserver.net to an IP address in order to continue.
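The domain-splitting and priority-ordering steps are simple enough to sketch. The MX records below are invented (a real server would fetch them from DNS, for example with the third-party dnspython library, which isn't shown here):

```python
# Strip the username and '@' sign to get the delivery domain.
addr = "someone@mydomain.example"
user, _, domain = addr.partition("@")

# Hypothetical MX records for that domain: (priority, hostname).
# Lower priority values are tried first.
mx_records = [
    (10, "smtp.secureserver.net"),
    (20, "backup.example"),
]
ordered = [host for _, host in sorted(mx_records)]

print(domain, "->", ordered[0])  # mydomain.example -> smtp.secureserver.net
```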
You've got mail
Figure 3
Your SMTP server then sends your email to the recipient's SMTP server using the Simple Mail Transfer Protocol. Of course, it may be that, due to unforeseen circumstances, my SMTP server is offline or down.
In this case, your SMTP server will put your email in a queue and try to send it again later. If the server finds that after several tries it can't send the email at all, it wraps the email in a 'cannot deliver' message and posts that to your email inbox. But let's assume that all goes well and my SMTP server receives your email (and adds its routing information). It in turn reads the recipient email address, works out the user name, and puts the email in my inbox.
By 'inbox', I don't mean the inbox in Outlook or whatever email client you use. I mean the inbox on the email server for my email address. In the old days, the inbox was very simple: it was a set of text files, one per email, in a folder named after my email address (or maybe a single text file and new emails were appended).
These days it's more integrated - the inbox is in a database, with the usual failsafe guarantees that provides.
Incoming mail servers
We now come to the opposite end of the email trail: the incoming mail server. Ignoring the heavy-duty corporate email systems such as Microsoft Exchange, Lotus Notes or BlackBerry Server, there are two main ones in use today: the POP3 server and the IMAP server.
POP3 is the older and less sophisticated of the two, but they both have roughly the same features. The main difference between them is what happens to the emails. With POP3, although you can leave emails on the server, there's no provision for marking any as read/unread - the assumption is that emails are downloaded to your client and deleted from the server.
Of course, this presents a problem if you want to use a variety of clients to access your email, because you may find that a particular email that you want to read is on a different PC to the one you're currently using.
With IMAP, the assumption is the opposite: emails are left on the server and can be marked as read/unread. This means that you can access your emails through a variety of email clients (desktop, phone, web) and all clients will agree on the current state of the emails.
With IMAP you can also do things like set up an inbox folder tree on the server or move emails around the tree, and again the clients will all agree on the current state of the inbox.
Let's assume that I'm using POP3. Again, I will have configured Outlook so that my incoming server is at such and such address and has a particular user ID and password (I can't have all and sundry reading my emails after all).
When I ask Outlook to retrieve all my emails, it will log in to the POP3 server with the credentials I gave, ask for a list of emails, and then download and delete them one by one. It will read the header information from each email in order to ascertain how the message is structured, how the constituent parts are encoded, from whom the email came, the delivery date/time, and so on.
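A sketch of that retrieval step using Python's standard library. The server name and credentials in the comments are hypothetical, so the live poplib calls are shown commented out, and the header parsing runs on a canned message instead:

```python
from email.parser import Parser

# With a live server the retrieval loop would look like (not run here):
#   import poplib
#   pop = poplib.POP3("pop.mydomain.example")
#   pop.user("me"); pop.pass_("password")
#   count, _size = pop.stat()
#   for i in range(1, count + 1):
#       lines = pop.retr(i)[1]          # message as a list of byte lines
#       raw = b"\r\n".join(lines).decode()
#       ...parse as below...
#       pop.dele(i)                     # POP3 style: delete after download
#   pop.quit()

raw = (
    "From: acm@example.org\r\n"
    "Date: Mon, 28 Feb 2011 10:00:55 +0000\r\n"
    "Subject: Hello\r\n"
    "\r\n"
    "The body.\r\n"
)
msg = Parser().parsestr(raw)
print(msg["From"], "|", msg["Date"], "|", msg["Subject"])
```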
Outlook will then decode and display the emails for me to peruse and read, and with that we come to the end of the journey for that email, from your PC to mine.




Review: LG W2363V
You would have no trouble believing that the LG W2363V could have come from a design team such as Bang & Olufsen. The shiny-white oval base matches the smart white bezel, with a black bar across the bottom of the screen providing some visual contrast.
The ports and connectors are arranged vertically on the rear with DVI-D, VGA, RCA audio and Component, plus a mini jack for audio in, along with a mains power connector. On the left side of the screen there are two HDMI connectors above a headphone jack, so you have plenty of scope for connecting a DVD player, a TV set-top box and a games console.
While we hoped that the black bar might house a pair of punchy speakers, this is not the case. In fact, the headphone jack merely acts as a pass through for HDMI audio, while the black bar accommodates a series of lights for a gadget called Tru-light which is triggered by the audio signal that is fed to the display.
There's a rocker control to the right-hand side of the screen which controls the pattern of the lights 'to maximise the sensation of the entertainment experience'; however, the three settings are all much of a muchness. Thankfully, one of the options allows you to disable this affront to good taste.
There are other annoyances. The auto-detect seems reluctant to select the picture input, so you may have to fiddle with the OSD to make the screen come to life. This was especially vexing because the touch controls are tricky to use and the functions of each button unclear.
The LG display follows the current trend and combines basic TN panel technology with LED backlights. Despite this apparent lack of originality, we were impressed by the quality of the picture and found that the image really packed a visual punch.
LG has delivered a decent display at a low price, but it could have been so much better if they had dispensed with all the gimmicks.




Review: Asus ML248H
The ML248H is part of the Asus Designo series. As the name suggests, the designers have been hard at work on this model and the panel is slender with a shiny-white casing on the rear.
At first there appeared to be no stand, but further investigation revealed that the package contains two strange metal hoops. You attach one hoop to the back of the panel with a single screw and then attach a second ring that forms the base with a second screw.
The screen is solidly supported and has swivel as well as tilt adjustment, but in terms of design it is unlike anything we have seen before.
The rear of the panel has a small opening to connect the external power adapter and also for the headphone jack, HDMI and VGA inputs.
Although the lower bezel of the display measures a beefy 75mm and looks as though it should house stereo speakers, this is just an illusion, as the ML248H is silent when you use a VGA cable. The headphone jack becomes active when you switch to an HDMI connection.
Asus includes VGA and HDMI-to-DVI-D cables in the box and both worked perfectly. On the other hand, when we fed the Asus with an HDMI-to-HDMI signal, the picture gained a 1cm black border and looked poor, so the blame seems to lie with the Windows signal that is fed to the display, rather than the HDMI input itself.
The picture is perfectly decent, but after we had moved through the Theatre, Game, Scenery, Night View and RGB modes, we left the display in Standard mode as we found that suited us best.
Changing brightness or contrast is easy enough, as there are shortcuts that reduce the button pressing, but diving into the menus, for example to change the headphone volume, is an arduous task.
The Asus people describe the touch controls as a Marmite feature. They may love the controls, but we certainly do not.
The Asus ML248H looks very appealing, but the size of the lower bezel is ridiculous, we didn't get on with the touch controls and the price is too high.




Tutorial: How to detect unknown malware with WinPatrol
Host-based intrusion detection is a serious consideration for people wishing to stay safe online from as-yet unknown threats.

Knowing exactly what's happening under the hood is also the first step in controlling what your computer does and when. Linux has enjoyed the protection of major open source intrusion detection systems (IDS) for some time.
Windows users have fewer options, but that doesn't mean the threats facing them are any less dangerous. The landscape is now changing so fast that it takes a large and growing online security industry to keep up.
To help gain and keep the upper hand, it's becoming necessary to counter unknown threats as well as trying to spot and stop the known ones. To help, a new class of anti-malware has emerged.
Combining the advantages of an intrusion detection system (IDS) with other software can help detect and block malicious activity, and even clean up after a successful attack.
Detecting intrusions
There are two main types of IDS, which differ in the scope of their protection. A network IDS (NIDS) sits at a strategic point on the network – such as between the internet router and the internal network – where it can see all the data packets as they flow by. It inspects all traffic flowing across, into and out of the network, looking for activity indicating a remote attack.
By contrast, a host-based IDS (HIDS) is installed on each networked computer, and monitors traffic flowing in and out of just that machine. This second type of IDS can be quite specialised, and can monitor individual aspects of the system and its behaviour – such as changes to the Registry.
A protocol-based IDS (PIDS) is an even more dedicated IDS. It's installed on a server (or somewhere it can see all the traffic flowing in and out of the server) and monitors use of the server's specific network protocols. It might sit in front of a web server to watch HTTP traffic, for example.
The detection techniques employed by an IDS fall into several categories. The simplest of these is signature-based. Like most antivirus packages, this tests a huge number of traffic patterns against a large database of profiles generated by known attack types. As with antivirus software, this database must be updated regularly, as new attack signatures become available.
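A toy illustration of the signature idea (both the patterns and their names here are invented, and real signatures also encode packet order and timing, as noted below):

```python
# Toy signature database: byte patterns seen in known attack traffic.
signatures = {
    b"/etc/passwd": "path traversal attempt",
    b"' OR '1'='1": "SQL injection probe",
}

def match_signatures(payload: bytes) -> list:
    """Return the names of any known attack patterns found in a payload."""
    return [name for pattern, name in signatures.items() if pattern in payload]

print(match_signatures(b"GET /../../etc/passwd HTTP/1.0"))
# ['path traversal attempt']
```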
Unlike static virus signatures however, an IDS attack signature has a distinct time element because it needs to understand the order, sequence and possibly even the delays between the packets involved in the attack as they arrive.
Anomaly detection
Anomaly-based intrusion detection is more sophisticated and intelligent. It first establishes a baseline of 'normal' network activity by monitoring network traffic for a while, including the general amounts of bandwidth used, the protocols used, the associated ports, the number of connections and which devices generally connect to each other.
Once in detection mode, the system will compare this baseline to subsequent network traffic patterns. Anything out of the ordinary is considered suspicious.
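The principle can be sketched with invented traffic figures: establish a baseline mean and spread from the training window, then flag anything far outside it.

```python
from statistics import mean, stdev

# Baseline: bytes-per-minute observed during a 'normal' training window.
baseline = [1200, 1100, 1250, 1180, 1220, 1150, 1300, 1210]
mu, sigma = mean(baseline), stdev(baseline)

def is_anomalous(observed: float, threshold: float = 3.0) -> bool:
    """Flag traffic more than `threshold` standard deviations from the baseline mean."""
    return abs(observed - mu) > threshold * sigma

print(is_anomalous(1230))   # within normal variation
print(is_anomalous(25000))  # a flood: flagged
```

Real anomaly engines model many dimensions at once (protocols, ports, connection counts), but each dimension follows this same compare-to-baseline logic.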
POSSIBLE THREATS: If you find something potentially dodgy on your system, you can view its details and even add a note for future reference so you don't forget
In an application protocol IDS (APIDS), the baseline is even more specific and has to be far more detailed. To be effective, the APIDS monitors the traffic received and transmitted by the network protocol, so it has to understand in depth the way the protocol is being used in order to look for anything that deviates from the way it's normally used.
Regardless of the detection technique used, once an IDS identifies suspicious activity, it can take two courses of action: active or passive. A passive IDS simply detects and logs anomalies in system behaviour and reports them to the user or system administrator.
An active IDS (intrusion prevention system) can respond automatically to the perceived threat by blocking incoming IP addresses, blocking specific applications from transmitting data, blocking potentially malicious changes to the system, and even by preventing code from running.
WinPatrol
WinPatrol has been protecting computers for over a decade, and has just received an overhaul for Windows 7. Although the commercial version has some very useful facilities, the free one is perfectly good for protecting computers on a home network.
After downloading, run the installation executable and click 'Next'. That's all there is to it. At the end, click 'Finish' to run the application and the user interface will appear. If you have audio enabled you'll hear a 'woof' sound.
The main user interface is packed with three rows of tabs, though some are only accessible in the Plus (paid) version of the software. Click the 'Startup programs' tab and you'll see a list of all the programs that start when Windows does.
Although Windows 7 is blindingly fast to boot up compared to earlier versions of the operating system, it can be slowed by this extra load. By selecting a program and clicking 'Remove' or 'Disable', you can temporarily suspend auto-startup of that program, or if it proves to be the one increasing your system boot up time, remove it from the list.
Removal doesn't uninstall the program. If there's anything in this list that you don't recognise, select it and press the 'Info' button. If you're still not convinced that it's benign based on the information, disable it and reboot. If nothing untoward happens, remove it from the list.
The next tab, 'Delayed start', enables you to stagger the startup times of different applications. If you always use a browser first when you boot up and log in, you can add it to the 'Delayed start' tab to make sure that there are no resource contentions, and that the rest of the operating system is up and running before the browser tries to connect to the internet.
Click 'Add', then navigate to the executable for the application. Select it and click 'OK'. Choose whether you want the application to start for all users or just you, then click 'OK'. Now click the 'Delay options' button. Enter a title for the startup job and a time to wait from boot-up to running the application. If the program needs any command-line options passed to it, enter these in the 'Parameters' box.
Finally, select the way you want the program to appear – maximised, in a window or minimised to the task bar. Click 'OK' and the name of the delayed startup job changes to the one you entered. Reboot and WinPatrol should implement your changes.
Many people refuse to upgrade to the latest version of Internet Explorer, which means it's the target of all kinds of malicious and potentially malicious browser helper objects (BHOs). These extend the functionality of IE and are loaded when you run the browser. They can also increase the browser's startup time.
They often can't be uninstalled or even seen by normal users – perfect for installing adware and spyware.
Cleaning up IE
Click on the 'IE Helpers' tab in WinPatrol and you'll see a long list of these, plus the browser's toolbar add-ons. If you're irritated by installation programs insisting that you install the Yahoo Search bar, for example, you can remove it here.
The amount of on-screen space taken up by IE's normal toolbars is substantial, without having it further reduced by something you don't want. Select a BHO or toolbar from the list and click 'Info' to learn more. If you don't like what you see, click 'Remove' to delete it from IE and the system.
You'll be asked to confirm your choice before deletion takes place. Malware can also pose as or hijack legitimate scheduled tasks. To inspect these, click the 'Scheduled tasks' tab. Again, click 'Remove' to take any unnecessary or dodgy tasks out of the list.
This and the other two startup tabs are also a great way to clean up a new PC that annoys you with nagware. Now we can move on to the meat of host-based intrusion detection: detecting changes to the system that may indicate the presence of malware, spyware, or adware.
Click the 'Options' tab to configure WinPatrol for detection. Homepage hijacking plays an increasingly sophisticated role in online crime. With 'Detect Changes to Internet Explorer home and search pages' selected, you'll be notified of any changes to the browser or its configuration.
Detecting changes
The HOSTS file is a throwback to the days before DNS, but it's also the first port of call for any internet-aware program trying to resolve domain names into IP addresses. These programs will use the domain/IP address mappings in the HOSTS file without question, so if this file is changed it can make you believe you're accessing legitimate websites when in fact you're being redirected to malicious ones.
HOSTS FILE: If malware makes changes to the HOSTS file on your computer, it can redirect you to anywhere on the internet without your knowledge
The 'Warn if changes are made to my internet HOSTS and critical system files' option will keep you safe from this form of attack. You can also view the 'HOSTS' file with the appropriate button; Notepad pops up to display it.
The 'HOSTS' file contains a few example mappings between DNS names and their associated IP addresses. A hash ('#') symbol at the start of a line comments that line out; if you see an active mapping without one that you didn't put there, place a hash at the start of the line, save the file and reboot to see if anything breaks. If nothing does, malware may well have been trying to redirect you to a malicious page.
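Parsing the file for active mappings is straightforward; a sketch on canned HOSTS-file text (the bank entry is an invented example of the kind of redirection malware adds):

```python
# Sample HOSTS-file text: the first mapping is commented out,
# the last is the sort of entry malware might plant.
hosts_text = """\
# 102.54.94.97  rhino.acme.com   # source server
127.0.0.1       localhost
5.6.7.8         www.mybank.example
"""

active = {}
for line in hosts_text.splitlines():
    line = line.split("#", 1)[0].strip()   # drop comments and whitespace
    if line:
        ip, *names = line.split()          # one IP can map several names
        for name in names:
            active[name] = ip

print(active)
```

Anything in `active` beyond the standard `localhost` entry deserves a closer look.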
As WinPatrol runs, it creates a log file of events that you can view with the 'WinPatrol log' button. The resulting HTML page gives information about everything that happens on your PC. Pressing the 'Spreadsheet report' button will create a spreadsheet containing the same data. This is written to 'BillP\WinPatrol' in the 'Program Files' folder of your C:\ drive.
One last useful option on this tab is 'Lock file types'. If you've ever been frustrated by legitimate programs changing your carefully modified file associations even when you asked them not to, this option is for you. It prevents such changes from happening.





Review: Acer S243HLAbmii
Acer has gone to town with the styling of the S243HLAbmii and the result is unlike anything we have seen before.
The headline figure is the claimed panel thickness of just 15mm and if you look at the screen from the side, you'll see the panel is indeed very thin. This tiny form factor is thanks in part to the use of LED backlight technology, but there's no avoiding the fact that a fair amount of hardware has been shifted from the main panel to the stand.
The stand is a bulky affair that houses the dual HDMI and single VGA connectors, but no DVI input. The OSD control buttons are arranged vertically up the front of the stand. They work well enough, but are overshadowed by the panel, so you will be hard-pressed to see the legends next to the controls in low light conditions.
Acer has included profiles called Text, Graphics and Movies as alternatives to the standard profile. Text is darker than standard, while Graphics and Movies boost the intensity of the colours, but frankly we found they all look much the same as each other.
For some reason Acer has offset the stand to one side in an L-shape with the main support shifted to the right-hand side. If you give the display a knock it shakes around alarmingly and this arrangement offers no obvious benefit.
We were mystified to see that Acer has chosen a mono 2-Watt speaker rather than stereo audio.
The Acer has tilt adjustment, but the pivoting joint between the panel and the base has been turned into a piece of industrial art and the function suffers as a result. In use we found the panel doesn't tilt back far enough to give a comfortable viewing angle.
The Acer is a triumph of style over substance and, while the panel does a competent job, you would have to be rather shallow to be lured by its cosmetic appeal.




Review: HP EliteBook 8540w
HP's EliteBook range targets those in need of power and corporate features and the EliteBook 8540w packs both in equal measures.
With its semi-rugged design, it is a highly resilient machine and ideal for demanding use. It has been tested to the MIL-STD 810G standards for resilience to vibration, dust, humidity and high temperatures, so is tough enough for busy work outdoors and at home.
With its magnesium-alloy chassis and brushed-metal finish, the chassis is tough and has bags of style. The sleek design and gun-metal finish look fantastic.
At 3kg, it is light enough for basic travel, but the 224-minute battery life significantly trails the Panasonic Toughbook CF-31.
Thanks to its large 15.6-inch screen, the HP has a spacious keyboard. All keys are firm, responsive and move near silently, making it great to work with. The board is spill-resistant and a tiny LED light in the screen panel illuminates the keys when working in dark conditions.
The sharp screen features a Full HD resolution and is ideal for demanding multimedia use. Images are sharp, with strong colour and contrast bringing photos to life. The matt-TFT panel also suppresses reflections well.
Designed as a mobile workstation, it's no surprise that this laptop offers ample power. With its high-end Nvidia Quadro FX 1800M GPU, the HP provides ample graphics power, making it ideal for design and multimedia tasks.
Office performance is also extremely capable and falls in line with the Getac V200 and Panasonic. The cutting-edge Intel Core i7 620M processor and staggering 8192MB of memory run complex office software with ease and make light work of the most demanding daily multi-tasking.
Benchmarks
Battery life: 224 minutes
MobileMark 2007: 259
3D Mark 2003: 21,786
Ample storage
Where the HP also excels is storage. The 500GB hard drive is generous and is backed up by a Blu-ray drive for watching high-definition movie discs.
A 6-in-1 card reader is also in place for accessing the most popular multimedia storage cards. Adding to the security features of this machine, the hard drive is shockmounted and features an accelerometer. This detects when the laptop has been dropped and parks the disk heads, in order to prevent the disk being scratched and causing damage to your data.
While the semi-rugged design of the EliteBook 8540w won't suit the most taxing outdoor use, its tough chassis, great usability and stunning performance make it ideal for busy urban professionals. Packed with style and features, it makes a fantastic rugged workstation.




Tutorial: Windows Event Viewer tips and tricks
The Event Viewer doesn't look like a very exciting Windows component. If your PC is unstable you might use it to check for error messages, but otherwise, well, that's about all. Or is it?

Look a little closer and you'll discover that the tool has all kinds of useful additional capabilities. It can sometimes be hard to find important events using the default settings, but creating a custom view will help you zoom in quickly on the data that really matters, which can be an essential troubleshooting aid.
If you have a network, then you can set up one copy of the Event Viewer to collect events from several PCs, and manage them all centrally.
One excellent feature gives you the ability to run a particular program or task when a given event occurs. If a program crashes you could restart it, for example. If you're short on hard drive space, you could delete your temporary files – whatever you like.
Then there are the secret Event Logs that you may not even know exist, the leftover logs that need to be deleted, the hidden management features and a whole lot more.
Please note, while we're focusing on the Windows 7 Event Viewer here, much of what we're saying also applies to Vista and even XP. Whichever version of Windows you're using, the Event Viewer deserves a much closer look.
The basics
The prime purpose of Event Viewer is to act as a log for various applications and Windows components. Many of these issues don't have an interface, or don't report all their problems and status issues via alert messages, so if you want to find out what's really going on with your PC then it's essential to take a look at the Event Viewer on a regular basis.
You can access the viewer via the Control Panel (go to 'System and security | Administrative tools | View event logs' if you're using Windows 7), but we find it easier to launch the tool directly: click 'Start', type eventvwr.msc, click the 'Event Viewer' link and it will pop up in a second or two.
If you just want to find out more about your PC, then you can expand the 'Windows Logs' section of the tree and browse the Application, Security, Setup and System logs for any interesting looking events.
These logs are presented in reverse chronological order, so the most recent events are at the top and as you scroll down you'll move back in time.
What will you see here? It depends entirely on the setup of your system, but we checked a test PC and came up with many interesting entries. There were detailed error messages for application and system crashes, for instance. If you come home and someone tells you the PC crashed an hour ago, but they can't remember the error message, the Event Log may tell you more.
We found performance-related information, including an Outlook message that said its launch was delayed because of a particular add-on. There were also warnings about four boot drivers that had failed to load. That's information we wouldn't have found anywhere else, and could explain all kinds of odd system behaviour.
Other issues
There were also events relating to the PC startup and shutdown process, installed programs, hardware problems, and many other issues. You wouldn't want to browse the Event Viewer for fun, but if you're having any kind of computer issues then it's wise to give it a closer look – you just might find the clues you need to uncover their real cause.
The problem with scrolling through the main Windows logs is that there are only a few interesting events, and they're masked by a great deal of irrelevant junk. Fortunately the Event Viewer provides several alternatives that will help you zoom in on the data that matters.
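The filtering idea is easy to sketch in code. Event Viewer can export a log as XML ('Save All Events As...'), and a short script can then pull out just the Critical, Error and Warning entries. The snippet below uses an invented, namespace-free sample in the spirit of the Windows Event XML format (levels 1-3 are Critical, Error and Warning; 4 is Information), so treat it as an illustrative sketch rather than a ready-made parser.

```python
# Sketch: filter an exported event log down to the events that matter.
# The sample mimics the shape of Windows Event XML, minus namespaces.
import xml.etree.ElementTree as ET

SAMPLE = """<Events>
  <Event><System><Level>2</Level><Provider Name='disk'/>
    <TimeCreated SystemTime='2011-02-20T09:15:00'/></System></Event>
  <Event><System><Level>4</Level><Provider Name='Winlogon'/>
    <TimeCreated SystemTime='2011-02-20T09:16:00'/></System></Event>
  <Event><System><Level>3</Level><Provider Name='Disk'/>
    <TimeCreated SystemTime='2011-02-20T09:17:00'/></System></Event>
</Events>"""

def interesting_events(xml_text, max_level=3):
    """Return (time, provider, level) for Critical/Error/Warning events."""
    root = ET.fromstring(xml_text)
    found = []
    for event in root.iter('Event'):
        system = event.find('System')
        level = int(system.findtext('Level'))
        if level <= max_level:  # 1=Critical, 2=Error, 3=Warning
            provider = system.find('Provider').get('Name')
            when = system.find('TimeCreated').get('SystemTime')
            found.append((when, provider, level))
    return found

print(interesting_events(SAMPLE))
```

Here the Information-level Winlogon entry is dropped and only the two disk events survive – exactly the kind of noise reduction a custom view performs inside the tool itself.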
The Windows 7 Event Viewer, for instance, opens with a useful 'Summary of Administrative Events'. Particularly important event types, such as 'Critical', 'Error' and 'Warning', are listed right at the top and you can expand these to find out more.
Trying this on our test system revealed seven disk errors in the past week. Double-clicking the entry revealed the details, and it turned out one of our drives was experiencing controller errors. Could the drive be about to fail? We're not sure, but at least the Event Viewer has given us a warning so we can back it up.
Another possible option is to expand the 'Applications and Services Logs' section of the viewer. This area contains logs dedicated to applications and areas of your system, such as hardware events, Internet Explorer and Media Center.
Perhaps the most important log here is a little buried, though. Browse to 'Applications and services logs | Microsoft | Windows | Diagnostics-Performance | Operational' and you'll find information about your PC's boot and shutdown processes. Again, everyone will see different things, but when we checked this log on our PC we found a wealth of essential data.
There were events warning us that the Bonjour Service, Function Discovery Resource Publication Service and Orbit Downloader were all causing delays in the system shutdown process. Other events pointed fingers at particular programs for delaying our PC's boot, too – if we were to remove anything non-essential, our system would speed up.
There were general warning events too, such as 'Video memory resources are over-utilised and there is thrashing happening as a result'. If your PC seems slow, or unstable, then this could be a clue. Simply closing some windows could make all the difference, as might updating the video drivers.
As usual, these logs are packed with clues to all sorts of problems, many of which you may not even realise you have. Take a look – it's surprising what you can learn.
Subscriptions
The Event Viewer isn't only able to reveal issues with your own PC. It can also collect information on Vista or Windows 7 systems all across your network, so you can troubleshoot many problems from the comfort of your own desktop.
To set this up you must prepare the remote computers to forward events. First launch an elevated command prompt on each of these (do this by right-clicking the link 'cmd.exe' and selecting 'Run as administrator'), then enter the command winrm quickconfig.
Next, go to the central PC where you'll be collecting these events, launch another elevated command prompt and enter the command wecutil qc.
You can then launch the Event Viewer on the collecting computer, click 'Subscriptions | Create subscription' and tell the system exactly which events you'd like to collect from which computers. These will then appear in the log you specify, and you'll be able to view and filter them just as you can events on your own computer. Well, that's the basic principle at least.
In practice, there are usually some complications. You might have to specifically allow the Remote Event Log Management process to connect through your firewall, for instance, and you'll need to add an account with administrator privileges to the Event Log Readers group on each of the remote PCs. Check the 'Event viewer help' file under 'Manage subscriptions' for more details.
Run a task
So far we've only used Event Viewer in a passive way, allowing it to record what various apps are doing, but the best part of the tool is that it can also be active and dynamic, responding to events with the specific action that you choose.
Suppose one of your favourite apps has its own event log, for instance. It might only add one event a week, but that event might be very important and you may want to know about it right away. Is this a problem? Not at all. In a few clicks you can be alerted whenever a new event appears.
To make this happen, launch Event Viewer, expand the 'Applications and services logs' section of the tree, right-click your log of choice and select 'Attach a task to this log'. Click 'Next' twice, choose the 'Display a message' option, and click 'Next' again. Enter a title for your message, then the message itself, and click 'Next'. Click 'Finish' and that's it – Windows will now display a pop-up alert with your selected message whenever an event is placed in this particular log.
You can also attach a task to a specific event. If you see something that might be really important, like a message that a hard drive is returning controller errors, then right-click it, select 'Attach a task to this event' and the wizard will appear. With a few clicks, you can ensure that you're informed directly about important events, rather than just hoping you'll catch them later.
Perhaps most usefully, the Event Viewer can also launch a task in response to a particular event. If your system is regularly displaying some low-level drive error, for example, you could automatically launch Windows chkdsk or some other drive error checker to confirm that all is well.
If you're running short of hard drive space and related events are appearing, you could have these launch something like CCleaner to quickly free up a little space.
The principle is the same: right-click an event and select 'Attach a task to this event' to launch the Create Basic Task Wizard. This time, when you get to the 'Action' point, select 'Start a program'. Click 'Next', choose your program or script and any optional command line arguments, then click 'Next', finish the wizard and your configuration is complete.
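To give a concrete flavour of the 'Start a program' option, here's a minimal sketch of the kind of cleanup script such a task might launch when a low-disk-space event fires. The folder path and the one-week age threshold are our own illustrative choices, not anything Windows prescribes.

```python
# Sketch: a cleanup script a low-disk-space event could trigger.
# Deletes files older than max_age_days from the given folder.
import os
import time

def purge_old_files(folder, max_age_days=7):
    """Remove files older than max_age_days; return the names removed."""
    cutoff = time.time() - max_age_days * 86400
    removed = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

You would point the wizard's 'Start a program' step at the Python interpreter with a script like this as its argument; the script name and location are entirely up to you.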
Windows will now respond automatically to events as they occur, which could mean your PC problems are fixed before you realise they've occurred.




Read More ...

In Depth: The future of the internet revealed
Technology changes so quickly, it's hard to remember how bad we used to have it. UK internet access didn't really take off until Freeserve launched in 1998, few of us had broadband before 2001 and the UK didn't even have a 3G mobile phone network until 2003.
Google was founded in 1998, Facebook and Flickr in 2004, YouTube in 2005 and Twitter in 2006. It's impossible to imagine life without them now, and the pace shows no sign of slowing down.
The internet we'll have in 2020 will look almost nothing like the one we have in 2011, from the information we access to the devices we use to connect. Can we predict exactly what it's going to look like?
Almost certainly not, but we can see the seeds of it even now, and work out a few of the directions the industry will have to travel down to make it happen.
At the back end
If you think the internet is busy now, think again. The current internet population of 1.7 billion is expected to exceed five billion by 2020 - and we're not talking about people. Everything from televisions to old favourite the internet fridge will be hooked up.
At the moment that's impossible, simply because we need many, many more IP (internet protocol) addresses than the current IPv4 system allows. IPv4 has room for four billion IP addresses, and according to internet co-creator and Google evangelist Vint Cerf we'll use up the last ones in the spring of 2011.
The new IPv6 standard has capacity for "340 trillion, trillion, trillion" unique IP addresses, Cerf says, "so the theory is we won't run out, at least until after I'm dead".
The move to IPv6 is crucial for several reasons. In addition to freeing up lots of internet addresses, it also improves network security and makes routers' lives easier. Unfortunately, it isn't backwards compatible with IPv4, so networks running IPv6 won't be able to talk to networks running the older protocol. Desktops, smartphones, laptops and routers generally support IPv6, but many ISPs and business networks haven't switched to it yet.
To address the issue, 6UK is raising awareness of the looming crisis and urging businesses to act. "The biggest set of changes in the history of the internet [is] happening today," Cerf explains. "The change in the address space, the change in domain name languages, the introduction of digital signatures in the domain name system, the introduction of end-to-end cryptography in accessing internet-based services. This is a huge change in the net."
The arrival of cloud computing has enabled us to outsource storage and applications to distant servers, and the trend won't just continue, but accelerate: Gartner Research predicts that by 2012, cloud computing will have become so pervasive that one in five businesses won't own a single IT asset.
Moving to the cloud
Our email, images and our work documents are often in the cloud already, and entertainment will follow in their footsteps.
Video on demand services are ten-a-penny online, but streaming, not downloading, seems to be the technology of the future: it's the solution used by Netflix in the US and iPlayer here, and Apple is widely expected to unveil a streaming version of iTunes soon (which would explain why it's building a billion-dollar data centre in North Carolina). Buying something online will increasingly mean buying access to it, with no direct ownership at all.
Gaming may move to the cloud too. A service called OnLive promises console quality games with minimal hardware by doing the processing in its data centres and streaming the results to a tiny 'micro-console'.
STREAMING GAMES: OnLive promises to deliver console-quality gaming with the processing performed remotely
OnLive is a serious company - it boasts 200 employees, and its investors include Warner Bros and BT. It's available now in the US and looks set to grow quickly.
Cloud computing will be particularly important as smartphones and other mobile devices become the platforms of choice for most of our online activities. Phones don't yet have the power or storage necessary for desktop-calibre applications, so the emerging model is what Microsoft calls 'three screens'.
Three screens
As Steve Ballmer explains it, this is "an experience that spans the PC, the phone, the TV and the cloud". Rather than store your entire computing world on a desktop PC, you store it in the cloud and then access it on whatever device happens to be handy.
There are some things that a big desktop will almost certainly always do better than a smartphone, including data input, but there's no reason why the app has to be installed or data isolated to its own hard drive. With smartphones expected to outsell PCs by 2013 and Google's cloud-based OS Chrome on the horizon, cloud computing is going to be very important in the coming decade.
According to the Pew Internet and American Life Project, by 2020 most people can expect to "access software applications online and share and access information through the use of remote server networks, rather than depending primarily on tools and information housed on their individual, personal computers." It's all very exciting, unless you're an ISP.
Our appetite for online video is enormous and it's growing: the BBC's iPlayer delivers seven petabytes (7,000 terabytes) of video a month, while YouTube's bandwidth is estimated at 126 petabytes per month. Networking firm Cisco predicts that video will account for 90 per cent of consumer internet traffic and 64 per cent of mobile internet traffic by 2013.
Microsoft thinks online video isn't smart enough, and its solution is adaptive streaming, which it calls Smooth Streaming. Unlike traditional streaming, where your connection speed is checked once (if at all), adaptive streaming monitors your internet connection constantly.
If it becomes congested, the bitrate drops to something your connection can handle. When the congestion clears, the bitrate goes up. It works well, even on large-scale events, and you can see it in action at www.smoothhd.com.
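The core logic is a 'bitrate ladder': pick the highest quality the measured throughput can sustain, with some headroom to spare. This is a sketch of the general idea, not Microsoft's actual Smooth Streaming implementation; the ladder values and the 20 per cent safety margin are illustrative assumptions.

```python
# Sketch of adaptive-streaming bitrate selection.
# Available encodings of the same video, in kilobits per second.
BITRATE_LADDER_KBPS = [300, 700, 1500, 3000, 6000]

def choose_bitrate(measured_kbps, headroom=0.8):
    """Pick the highest ladder rung that fits within 80% of throughput."""
    usable = measured_kbps * headroom
    candidates = [b for b in BITRATE_LADDER_KBPS if b <= usable]
    # Fall back to the lowest quality if even that doesn't fit.
    return candidates[-1] if candidates else BITRATE_LADDER_KBPS[0]

print(choose_bitrate(5000))    # congested link: drops to 3000
print(choose_bitrate(10000))   # congestion cleared: back up to 6000
```

The player re-measures throughput continuously and re-runs this decision for each chunk of video, which is why the picture quality shifts rather than the stream stalling.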
The problem with adaptive streaming is that it still uses the old client/server model, where the server transmits data to you directly across the entire internet. BitTorrent creator Bram Cohen has an alternative idea, dubbed Project Pheon, which uses peer-to-peer networking to deliver streaming video.
Speaking at the 2010 NewTeeVee conference, Cohen promised "around five-second latency from when the content goes out to when it's actually displayed on people's machines".
Join the swarm
Pheon - like BitTorrent - uses swarming rather than traditional downloading. As you download a file, the bits you've downloaded are shared with other downloaders, so in theory you should get faster downloads by connecting to somebody near you rather than a distant server.
It's a technology that works best for popular files, and if you're a regular torrent user you'll know that new, popular torrents download like lightning while obscure ones crawl. This means swarming is best suited to big events, such as newly released films, live sports and concerts.
Of course, to actually access such high bandwidth services, we'll need fast broadband. Will we have the super-fast service the government is promising by 2017?
Trefor Davies is CTO and co-founder of business ISP Timico. "The problem facing the government is that the task is a huge one, and it would be very easy for them to decide that the only way they can realistically get to the end game is by roping in BT to help," he says, pointing out that while BT has offered to match the government's £830m funding to deliver 90 per cent super-fast broadband coverage by 2017, "coming from a company that claims to have 99 per cent broadband coverage, this makes us wonder what is meant by '90 per cent high speed broadband'."
Davies believes the only way to get fast broadband throughout the UK is to involve communities. "There are things that communities can do to make it easier and cheaper to roll out fibre networks," he explains.
"For example, companies like BT are charged anything up to £10 per metre for wayleaves to run cables across private land. [That's] a nice little earner for landowners: the average length of fibre in the Eden Valley is around 20km per community. That's a lot of wayleave charges that BT has to build into its costs." Landowners might waive those costs for community organisations, making fibre roll-out cost-effective.
Going mobile
SMART CAMERAS: Google Goggles' visual search is a taste of things to come
Our 3G phone networks weren't built with data in mind, and if you've ever struggled to download an email on a five-bar 3G signal, you'll know that networks are already struggling to cope. "I suspect that network operators have been caught by surprise by the increase in demand,"
Davies says, pointing out that operators "work on a two-year planning horizon, so if they do come across unexpected capacity problems it isn't always possible to do a quick fix."
Two developments should help: freeing up the 800MHz frequency band when analogue TV signals are switched off in 2012, and network operators upgrading to LTE (Long Term Evolution), often dubbed 4G. LTE won't be widespread for several years though, so we're stuck with 3G until at least 2015.
There is another option: HSPA+ (High Speed Packet Access Plus) - an upgrade to existing 3G networks that can, in theory, deliver 42Mbps download speeds. It isn't as solid or as fast as LTE, but several UK operators are considering it.
With demand for mobile data soaring, a crunch is coming. Network planners already report congestion issues, but these aren't spread equally: some congestion is tied to specific locations, some to particular times of day. In a survey commissioned by telecoms billing firm Amdocs, 20 per cent of operators reported 'severe overload' at times, and just 37 per cent said their networks are running fine. 40 per cent of operators said 'bandwidth hogs' were contributing to problems. Expect to start paying more for mobile access, or to face more limits on what you can and can't do.
"O2 is forecasting a hundredfold increase in bandwidth requirements over the next few years… the figures don't stack up," Davies says. "Mobile operators have to raise the cash to pay for the new infrastructure and so they are looking at innovative pricing mechanisms. This hasn't arrived in the public domain yet, but we are being warmed up for it."
One such mechanism could be Dynamic Spectrum Access, a kind of electronic auction where your phone bids for bandwidth it needs. The good news is that the crunch isn't imminent, because the real bandwidth hogs are 3G dongles rather than smartphones.
"I don't think that mobile apps will drive the need for the same bandwidth as fixed line in this timeframe, although we should never say never," Davies says. "My HTC Desire HD already supports 720p HD video, but the industry needs to sort out battery life before we will see serious high speed usage from handsets."
Once that happens, mobile bandwidth is likely to become a premium product. If you want a fast, lag-free connection at peak times or in congested areas, you'll have to pay more for it.
Bye bye browser?
It seems backwards, but just as software moves online, the browser itself is becoming less important. Increasingly, data is delivered to and received from a range of apps on a myriad of devices, which all have a bit of browser built in.
You can see that happening in social networking. Much of Twitter, Facebook and Flickr's traffic comes from applications: mobile phone apps, desktop software with social media export options, stand-alone photo uploaders, desktop widgets, browser-based aggregators that combine multiple networks in a single browser window and so on.
It's a similar story with YouTube, which delivers video to the web, mobile phone browsers, televisions, and mobile phone and iPad apps, and which accepts uploads not just from the YouTube website, but from cameras, camcorders and games.
The key to these apps is the API, short for application programming interface. APIs are hooks that developers can use to get data from or put data into online services. For example, Twitter's API enables third-party applications to post to and get data from Twitter.
But the API is just part of the picture: the data also needs to be delivered in a format that means it's easy to use and easy to combine with other sources. Increasingly that means open standards such as HTML5.
HTML5 and Flash
Designing for the web used to be simple. If it was static, you built it in HTML and CSS. If it moved, you built it in Flash. Not any more. The emerging HTML5 standard does animation, video and effects too, and Apple for one believes that it's going to make Flash obsolete. Apple may be right.
HTML5 does many things that once required plugins or stand-alone software, including video, local storage and vector drawing - and unlike Flash, it's an open standard that produces search engine-friendly content.
HTML5 is far from finished and it'll be some time before it's as attractive to developers as Flash. It lacks the write-once run-anywhere appeal of Flash, because different browsers have implemented bits of HTML5 in different ways, and it lacks the excellent authoring tools that Adobe has spent years refining.
In the long term, however, we'd expect more and more content to be created in HTML5 rather than Flash, much of it using Adobe tools.
HTML5 has one particularly interesting trick up its sleeve: microdata. This enables designers to label content, and it's something Google already supports. Its Rich Snippets uses microdata to pull the relevant information about a website and display it in the search results. That information could be details of a restaurant's cooking style, the verdict of a review, contact details or anything else that can be expressed as text or a link.
This metadata is machine readable, and machine-readable metadata gets people like Tim Berners-Lee very excited. In 1999, he said: "I have a dream for the web [in which computers] become capable of analysing all the data on the web - the content, links, and transactions between people and computers… the day-to-day mechanisms of trade, bureaucracy and our daily lives will be handled by machines talking to machines."
Visions of the future
At Intel's Visual Computing Institute (VCI) in Saarbrücken, Germany, researchers are exploring what could be the next generation of interfaces by combining motion capture, photorealistic computer graphics and 3D navigation. Can we expect 3D interfaces in the near future?
FUTURE INTERFACES: Intel's Visual Computing Institute is experimenting with interfaces that mimic the real world
"The answer is a qualified yes," says Hans-Christian Hoppe, Co-director of Operations at the VCI, who admits that "in the late 1990s, VR arguably fell into the category of 'a solution looking for a problem', and unfortunately not a very elegant solution. One could draw a parallel with tablet devices," Hoppe continues.
"They have been around for many years, in many guises, but it was only when the user experience met expectations that the devices became successful." Hoppe says we're reaching that point with immersive interfaces.
"Hardware has indeed advanced, in particular for mobile devices. That isn't the key issue, however. Social networking has evolved from a fad to addressing people's needs… immersive virtual worlds are now able to look like a natural extension of the likes of Facebook - suddenly, all this technology might be useful."
"Pervasive network connectivity and performance are important, processing power and energy efficiency likewise, particularly on mobile devices," Hoppe says. But the technology needs to be matched with innovative thinking.
"All we have today are extensions of tried and trusted 2D interfaces," Hoppe says. "What is needed is a uniform way of interacting with a mixed 2D/3D environment that's easy to understand, convenient to use and that doesn't strain the human perception and motion senses."
We've been promised 3D internet before, but Intel's vision isn't blocky avatars but realistic, ray-traced images delivered in real time, and we're approaching the point where our hardware and networks are fast enough to deliver it.
Whether we'll want it remains to be seen, but when it comes to technology, having extra options to explore is rarely a bad thing.



Read More ...

Review: Getac V200
Getac sells a wide range of ruggedised mobile products, from laptops and tablets to compact handhelds. The V200 is a convertible tablet, much like the Dell Latitude XT2 XFR, and packs impressive power and features into its hard-wearing chassis.
The V200 is tested to military standards for resilience. The magnesium-alloy chassis is staggeringly tough and its impact-resistant design will effortlessly take everything you can throw at it.
Flawed covers
With an ingress protection rating of IP65, the chassis is also well protected against infiltration from dust and water. Unlike the lockable port covers on the Dell and Panasonic, however, the rubber plugs used here can snag on external objects and be pulled open unexpectedly, which isn't ideal.
At 3.3kg, this is one of the heavier ruggedised laptops, but while it lacks the built-in handle of the Panasonic, its textured finish makes it comfortable to carry. Battery life isn't that impressive, however, delivering less than four hours of use.
Usability is excellent, as with the Dell, but the keyboard features an indented Control key, with the Function key in its place, which can take some getting used to. Nevertheless, the board is otherwise impressively smooth and responsive.
As with the Dell and Panasonic, the Getac's 12.1-inch display is multi-touch compatible, letting you open and manipulate files using your fingers or the included stylus. Unlike the Dell, however, the screen uses resistive technology as standard, allowing use even when you're wearing gloves.
While it lacks the responsiveness of the Dell's capacitive panel, interaction is suitably smooth. The screen is also incredibly bright, making it readable in even the brightest conditions. Unlike the Dell, however, the screen features a hazy quality often seen on resistive panels.
Where the Getac stands out is its impressive power for such a rugged laptop. Although bettered by the HP and Panasonic, the use of a high-powered Intel Core i7 CPU means office software and tools for fieldwork are dispatched with effortless ease, even when multi-tasking.
Benchmarks
Battery life: 219 minutes
MobileMark 2007: 239
3D Mark 2003: 1916
Extra features are equally pleasing. The 320GB hard drive provides capacious storage for this type of machine, while a rotatable camera above the screen can be used to record front or rear-facing video in the office or when working out and about.
The tough chassis and striking power of the V200 make it a great machine for extreme outdoor use. With its resistive screen suiting use in even the coldest conditions, it makes a great alternative to the Panasonic Toughbook CF-31.
Related Links



Read More ...

Review: Dell Latitude XT2 XFR
The Latitude XT2 XFR adds to Dell's corporate range and is billed as being the world's first multi-touch compatible rugged tablet. Combining a stunningly hard-wearing design with average performance, it makes a tough and unique, albeit slightly flawed, choice.
The fully rugged chassis is incredibly tough and has been tested to MIL-STD-810G standards. You can drop it from three feet, use it at altitudes of up to 15,000 feet and work in sub-zero temperatures without any risk of significant damage. All the ports are sealed beneath lockable panels, providing protection against dust and water.
The 12.1-inch screen has been tested to withstand impact, ensuring even this traditionally fragile component is well protected.
Despite its tough build quality, the 2.6kg chassis is surprisingly portable. It is one of the thickest laptops around, but the rubber padding on its edges makes it comfortable to hold.
Battery life of 254 minutes is average, however, and bettered by the Panasonic Toughbook CF-31.
Usability is excellent. The keyboard is spacious, responsive and a pleasure to work with. A rubber, backlit keyboard can be added for £286 (ex. VAT) to improve accessibility and protection. The tiny touchpad is less easy to use, but the Dell compensates with its touchscreen panel.
Multi-touch panel
The multi-touch display lets you easily use your fingers to scroll, rotate and zoom documents. A stylus is also included, letting you write on the screen to take notes. While the capacitive screen cannot be used with gloves on, a resistive option will be available soon to allow such use.
Image quality and visibility are excellent. The matt finish prevents reflections and it is an extremely bright screen, so is easy to view in direct sunlight. Colour, contrast and sharpness are equally strong, producing a detailed and vibrant picture.
Where this machine falls short of its rivals is performance. Rather than a cutting-edge Intel Core processor, it uses last-generation Core 2 Duo technology. There is ample power for running office applications, but think carefully if you need maximum power in the field.
Benchmarks
Battery life: 254 minutes
MobileMark 2007: 175
3D Mark 2003: 2050
Storage also falls short with just 128GB of space, so bear this in mind if you have many large files to carry with you. The Solid State Drive is more stable than the mechanical disks of its rivals, however, so you may consider the improved resilience it provides to be a worthwhile compensation.
While there's no denying the quality and build of the Latitude XT2 XFR, its limited specification, power and battery life lag a little too far behind the competition. It's a fantastic fully rugged laptop though and well worth considering.
Related Links



Read More ...

Explained: Powerline networking: what you need to know
Sending data over power lines has been around long enough now for us to forget how amazing the idea was when the first HomePlug standard emerged in 2001. Imagine: you could use the power wires in your house as network cables – a fully wired house in an instant!

The simplest use is to connect your broadband box of tricks hanging off your phone jack, which is often somewhere useful like the downstairs hall, to your PC, which is likely to be anywhere but the hallway.
The widespread adoption of wireless modems has reduced the impact somewhat, but for networking your house it's a neat and simple solution. You've no problems with thick walls either: it'll run up to 300 metres and in theory you can add 254 devices (although more than 10 starts to slow things down).
Plus it all happens outside your PC (let's just say that the lack of Windows drivers is no bad thing). It's about as simple as networking gets. The technology goes under a lot of names, but we'll go with HomePlug.
Actually the basic idea is an old one: the power companies have been sending control signals over the mains since the 1920s - that's how electricity meters know to switch to the off-peak rate. And it's about much more than just networking computers and their peripherals: sending data over the power lines can be used to monitor and control everything connected to the mains.
Enter the internet fridge, and, we trust, more useful appliances.
The same, only faster
So why are we here again? New faster standards, that's why! The original HomePlug 1.0 standard accelerated out of the blocks at a somewhat pedestrian 14Mb/s, peaking at 85Mb/s a few releases down the line.
This was followed in 2005 by HomePlug AV, which could manage a peak rate of 200Mb/s and traded on the ability to cope with video and voice, hence 'AV'. The Technical Working Group of the HomePlug Alliance (both are just what they sound like they are) has now rustled up the HomePlug AV2 standard, which it claims "is expected to deliver a five times increase in performance". Big talk, as that'll be 600 Mb/s.
AV2 is fully interoperable with the older HomePlug AV standards and two other new formats: Green PHY and IEEE P1901. The Institute of Electrical and Electronics Engineers is the world's largest professional association dedicated to advancing technological innovation and excellence for the benefit of humanity (that's what it humbly says about itself anyway). These are the people behind the 802.3 Ethernet and 802.11 Wireless standards and they've outlined the P1901 standard, which offers 500Mb/s.
P1901 is based on HomePlug AV and is in the final stage of certification. There's also Green PHY, or HomePlug GP, which isn't aimed at PCs as such - it's a sort of green thing for the remote control of devices and part of the Smart Grid idea. It's basically a trimmed-down, low-bandwidth and low-power version of HomePlug, enabling you to monitor and turn things on and off remotely and whatnot.
Eventually you'll be able to sit on the sofa and control all the electrical devices in the house. But is the emergence of two new higher-speed standards a problem?
Not at all: they will work with each other and P1901 is really just another layer of standardisation and ratification for HomePlug, a base standard to work from. If things go to plan, just about all the versions should talk to each other and you'll be able to mix and match, buying on price and performance rather than being tied to any one standard (backward compatibility for version 1.0 aside - that's technically possible, but economically unfeasible).
This is important for the whole industry, so we're sure they won't muck it all up by splitting the market. We don't want any more incompatibilities such as that between the 200Mb/s NetGear and Devolo kit. Right, guys?
Real-world speeds
HomePlug AV2's top speed of 600Mb/s is a theoretical maximum, of course. Add encryption, fluctuations in the mains, interference, distance, various overheads and so forth, and, well, we shall see.
The 200Mb/s AV version returns about 190Mb/s under ideal conditions in a simple speed test across about 10 feet. In the real world it'll be more like 150Mb/s or so, dropping to under 100Mb/s if you really push the parameters. We won't get 600Mb/s, sigh.
We should easily see over 400Mb/s or so though, which is fast enough for lots of HD video, which is what everybody seems to insist on pushing across links now. It's no Gigabit LAN, sure, but it has to do much more work in a hostile environment.
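As a back-of-the-envelope sanity check on those figures, here's a quick sketch of how long a film transfer would take at each generation's headline rate. The 4GB file size and 25 per cent overhead are assumptions for illustration, not measurements:

```python
# Rough transfer-time estimate for a 4GB HD film over HomePlug-style links.
# The 25% protocol/encryption overhead figure is an assumed round number.

def transfer_minutes(file_gb, link_mbps, overhead=0.25):
    """Minutes to move file_gb gigabytes at link_mbps, less overhead."""
    effective_mbps = link_mbps * (1 - overhead)   # usable throughput
    file_megabits = file_gb * 8 * 1000            # GB -> megabits (decimal)
    return file_megabits / effective_mbps / 60

for label, rate in [("HomePlug 1.0", 14),
                    ("HomePlug AV", 200),
                    ("HomePlug AV2", 600)]:
    print(f"{label:13s} {rate:4d}Mb/s -> {transfer_minutes(4, rate):6.1f} min")
```

Knock the effective rates down further for a noisy mains and the numbers worsen, but the gap between the generations stays just as wide.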
Sounds great, but there's a good reason why not everybody uses HomePlug: it's not cheap. A basic two-plug starter system goes for around £50 for version 1.0 and £80-odd for HomePlug AV. Adding another device costs another £25 or more.
When you start to explore what's going on inside a HomePlug unit, it becomes clear why they cost what they do: there's some serious electrical trickery involved. A lot of the current kit is beautifully built: Devolo in particular knows how to produce good looking, family friendly kit.
We recently tested its dLAN 200 AVSmart+ Starter Kit and were most impressed - it was built like a tank. The other big player is Netgear: its kit isn't quite as accomplished, but it's cheaper. Then there are Belkin, Cisco, Sharp and a few other big names: not a lengthy list yet, but growing.
MULTI-ROOM AUDIO: The Collage system from Russound features natty remote controls with which you can access all areas
HomePlug isn't the only network you'll need. Wireless is still, well, wireless, and if you want the fastest possible connection then an Ethernet cable is still the way to go, and always will be; you can't beat a direct wired connection. But HomePlug has grown from a curious approach into a solid standard for houses that aren't expensive, fully-cabled new builds.
There's also the technology to go the whole hog and use the mains to deliver your broadband too, no phone socket required. With Broadband over Power Lines (BPL), your modem can be plugged into any socket in the house. This one does require a spot of infrastructure investment in the mains wiring, though - repeaters and filters and whatnot. It's one of those 'it's technically possible, should we give it a proper go commercially?' ideas currently floating around.
You'll be seeing a lot more of HomePlug and similar technologies in the future. Electrical devices are going to start talking to each other through the power lines - a network of wires that makes the internet look meagre in comparison.
Computers started life in isolation. Networking and then the internet brought a whole new dimension. Now every electrical device you plug into the wall can join in. Hyperbole? Yes, but you get the picture.
Meanwhile, HomePlug is a very workable and neat system. At the moment it's expensive, but it will soon get fast enough to be very interesting. What we need now is volume to lower the cost of the kit.


Review: Toshiba Satellite Pro C660-171
We've long been fans of Toshiba laptops, but the Satellite Pro C660-171 falls below the usual high standards of the Japanese manufacturer, due to a sub-standard design and a lack of features.
Whereas other manufacturers, including Acer and Dell, have created laptops that are a pleasure to own thanks to excellent designs, this laptop looks considerably cheaper than it is. This is down to its very plastic build, which you could only call functional; the only positive is that it's very hard to mark and will prove suitably durable on the road.
Usability suffers slightly as a result. The plastic keyboard is surprisingly flexible and the action of the keys is a little too stiff for our liking. You might not mind this, but we would recommend trying the keyboard at your local superstore before you buy.
The laptop's screen is decent enough, and the 1366 x 768-pixel resolution provides a detailed enough picture for viewing photos and films. It's worth mentioning, though, that the shiny Super-TFT screen coating is very reflective in bright light.
Features-wise, the laptop falls short. There's no HDMI port for outputting video to an external high-definition (HD) display, such as a TV or monitor, and you only get 10/100 Ethernet. This is the slowest wired connection currently available, and those who regularly use Ethernet for home or office networks will prefer the Gigabit connections offered on similar laptops at this price.
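To put the Ethernet difference in perspective, here's a rough illustration of how long a local file copy takes at each speed. The 20GB backup size and 70 per cent efficiency figure are assumptions for the sake of example, not benchmark results:

```python
# Illustration of 10/100 vs Gigabit Ethernet for a local file copy.
# Assumes roughly 70% of line rate as real-world throughput (a rough
# rule of thumb, not a measurement).

def copy_seconds(size_gb, line_rate_mbps, efficiency=0.7):
    """Seconds to copy size_gb gigabytes over a link at line_rate_mbps."""
    return size_gb * 8 * 1000 / (line_rate_mbps * efficiency)

backup_gb = 20  # hypothetical photo/video backup
print(f"100Mb/s:  {copy_seconds(backup_gb, 100) / 60:.1f} min")
print(f"1000Mb/s: {copy_seconds(backup_gb, 1000) / 60:.1f} min")
```

A tenfold line-rate difference means a tenfold difference in waiting time, which is why Gigabit matters to anyone shuffling large files around a home network.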
Finally, there are also only two USB ports supplied. Those who are used to plugging in a wealth of peripherals, from smartphones to external mice, will find this restricting to say the least.
Decent performance
Nevertheless, the Intel Core i3 370M chip provides decent performance. Everyday users will have no problem running all their regular software, even concurrently, although the 3072MB of memory is a bit low for the price, making the laptop less suited to multi-tasking than the Dell Inspiron 15R.
Similarly, the integrated Intel graphics card provides limited multimedia power, although there's enough performance on offer here for entry-level photo editing, as well as watching films and streaming video from the internet.
Benchmarks
Battery life: 209 minutes
MobileMark 2007: 228
3D Mark 2003: 3807
The 500GB of hard drive storage is excellent for the money, and will provide a great deal of future-proofing for all but the most data-hungry users.
Portability is also just about acceptable, with 209 minutes of battery life available and a chassis weight of just 2.4kg.
It's hard to like the Satellite Pro C660-171 when it offers so little for its asking price. Splashing out a little more on the Acer Aspire 5741G or Dell Inspiron 15R will pay greater dividends.

Review: MSI CX623-258UK
MSI enjoyed a successful year in 2010 producing a succession of impressive machines that ranged from excellent netbooks to high-end gaming kit. The CX623-258UK comes from the manufacturer's classic range and, despite a few design quirks, makes a competitive choice for the whole family.
The screen is one of the best we have seen around this price and is on a par with the Dell Inspiron 15R. Brightness is exceptional, as is colour reproduction, but it's the contrast and deep black levels that really impress. As a result, this laptop would certainly suit those for whom viewing videos and photos is a priority, as it makes them look great.
The keyboard is very spacious and uses isolation-style keys. The keys are also large and, therefore, hard to mis-hit, but the board is quite spongy, giving a slightly rattly feel to proceedings. Ultimately, however, we found the laptop very comfortable to type on for long periods.
Irritating touchpad
Unfortunately, the touchpad is very easy to brush when typing, meaning you're likely to spend a lot of time rearranging bits of text which you've inadvertently launched across the page. This is a big flaw and quickly becomes hugely infuriating.
Despite the flexible keyboard, the laptop is very well put together, featuring tough plastics. However, some potential purchasers may be put off by the unusual paint scheme.
Another strange design choice is placing the majority of ports at the back of the laptop's chassis, which makes attaching peripherals a bit fiddly at times.
Internally, an Intel Core i3 380M processor and 4096MB of memory provide average performance, judging by the benchmarking scores, but in reality we found the laptop to perform well, running everyday applications, such as email clients, with little issue.
Benchmarks
Battery life: 246 minutes
MobileMark 2007: 116
3D Mark 2003: 8911
However, if you're after the most for your money in terms of performance, you would do better considering the Dell Inspiron 15R.
Graphically the laptop is surprisingly powerful, thanks to its Nvidia GeForce 310M chip, and those wanting to edit their holiday videos and photos with high-end software will be well served here. The laptop will struggle to run the latest games, however.
The 640GB hard drive is exceptionally generous for the price, and will take the average user many years to fill, saving you from having to constantly move your excess data to an external drive to free up space. A standard CD and DVD rewriter is also in place for creating your own discs.
A few design quirks aside, we were quite impressed by the CX623-258UK. It may not be the most comprehensive option here, but it is certainly worth your consideration at this price.
