Electronic books go 3D
3D is beginning to make an impact on the TV and home gaming industries, and now it looks like new electronic reading tech from South Korea could see us reading 3D eBooks in the near future.
South Korean scientists have developed 3D tech for books that makes characters appear to jump right off the page in front of your eyes.
3D reading specs
The tech arrives hot on the heels of the runaway successes of recent 3D blockbusters coming out of Hollywood, including Avatar and Tim Burton's Alice in Wonderland.
3D TVs start to go on sale in April and the computer games industry looks set to lead the march of 3D into our homes, with a firmware update for PS3 promised soon that will enable 3D games on your new TV.
South Korea's Gwangju Institute of Science and Technology has used new 3D technology to animate two children's books of Korean folk tales, featuring traditional dragons and heroes jumping over mountains.
Pictures in the books have cues that trigger the 3D animations, with readers having to wear 3D glasses.
"It took us about three years to develop the software for this," said Kim Sang-cheol, the team leader of the project.
"It will take a while to market this technology to the general public," Kim added, who is sure that the tech will be affordable enough to be mass market.
Opinion: Why 3D TV is just a pointless gimmick
According to the industry, 3D is the next big thing. TV manufacturers are chomping at the bit to sell us new sets.
Sky is planning a huge 3D blitz this year, including the launch of a dedicated 3D TV channel. Games are an obvious contender for the 3D treatment, thanks to the fact that they have all the data they need to produce an effective world already built in.
Verily, the planet is on the cusp of an incredible 3D revolution and we should all be excited! But I'm not. To hell with 3D. If I could change one thing about the cinema-going experience – other than shooting bloody Pearl & Dean into the sun – it would be to watch every blockbuster in IMAX. That would be a genuine improvement.
3D is just another gimmick, right down there with Smell-O-Vision, electric shocks coming through the seat, vibrating cinema chairs and, of course, the last 17 times that the industry has tried to make 3D into the Next Big Thing. And we still don't need it. I've never, ever seen a 3D movie that so much as breathed softly on my socks, never mind blew them off.
Realistically, the technology offers exactly two tricks of note. There's the annoying one, as demonstrated in Monsters vs Aliens, which opens with a guy batting a ball at the screen just to go, 'Ooooh! 3D in your face!' If I never see that trick again, it'll be too soon.
The other one, which is largely pushing the 3D revolution, is all about adding depth to scenes. This trick can work, I'll admit, and it can be genuinely effective. You definitely notice it – especially in a film such as Avatar – but, more importantly, you can actively not notice it and still get some benefit, which is what really matters. At least, in theory.
Expensive headaches
The problem is that, for all the potential benefits, 3D just seems to be Hollywood's most expensive way to give me a headache, even including the Bourne movies and the continued acting career of Shia LaBeouf.
Yes, this is probably just a question of my rubbish eyes, all maggoty with astigmatism and myopia as they are, but I don't care.
By the end of Avatar's seven-hour running time, my whole face felt as though someone had just opened the Ark of the Covenant over on the next row. My eyes oozed blood and gooey eyeball juice into my popcorn. Still, at least it stopped anyone else from stealing any.
Even before that point, though, Avatar only gave me about five minutes of genuine 3D 'Oooh!' before the effect faded, as any effect inevitably does.
From that point on, the glasses, the popping tricks and the background shimmer – in fact, all the pieces of technology that were meant to be immersing me in the action – served only as a constant lingering reminder that I wasn't in fact on a distant jungle planet with lots of sexy blue people, but in a cinema and in need of some aspirin.
The trade-off simply wasn't worth it, especially when coupled with the dark tint that the obnoxious 3D glasses put over all the film's beautiful bright colours. Also, the film was a bit rubbish.
Even watching great 3D movies, such as Pixar's Up, I've never been able to settle in and just enjoy the film or get completely lost in the action, not with every background shimmering away like a desert mirage and each character popping into the screen.
I quite often lift up the glasses just to compare the two images and every time it's the same: any power that the 3D version of the film has ultimately comes from the 2D version being exquisitely made. I've never wanted that extra half a dimension as much as I've craved brighter colours and a lack of intense eye-trauma after leaving the cinema.
The industry wants 3D
Of course, it's no wonder that the industry desperately wants 3D technology to be a big deal. Right now, it's the only real benefit cinemas can offer over home theatre systems, aside from ever-more obnoxious advertising and snot-smeared pick 'n' mix.
Looking ahead, hardware companies see it as the next big reason to make us all upgrade our kit. And good for them. It's still not an upgrade I can see myself rushing out to make, or one I can imagine recommending to anyone else.
When we finally get TVs that can add that illusion of depth without needing glasses, we'll have a genuine step forward. Until then, it's just a gimmick – an effect we'll all get accustomed to and subsequently bored of in a couple of weeks.
If anything, the best thing for 3D would be for it to stay as popular as it is now – an occasional treat for people who like it, something that's to be savoured and allowed to maintain what power it has.
Taking it mainstream will ruin the effect in ways that my astigmatism and quick-drying contact lenses can only dream of – and you can bet that losing the magic won't come cheap.
iPad apps set to cost more than iPhone apps
The first thing many have noticed from the leaked iPad App Store screens is that the iPad apps shown seem to cost considerably more than their iPhone equivalents.
iPad app pricing will most probably fluctuate considerably for the first few months the device is on sale, until the market finds an equilibrium between what consumers are willing to pay and what developers want to charge.
£10 Twitter apps anybody?
There was a smattering of £10 Twitter apps when the original App Store launched, which gives you an idea of how some developers will chance their arm when any new opportunity to make some cash presents itself.
Until consumers tell them politely to flip off by simply not buying their wares, that is.
The iPad is, of course, a far more powerful and bigger-screened device than the iPhone, so we can only hope that some of the dedicated iPad apps arriving in the coming months offer genuinely useful productivity services and some groundbreaking, fun new gaming experiences. That, after all, is what will persuade us to part with our hard-earned cash and start buying apps for our shiny new toys.
Gizmodo has published a handy graph comparing iPad apps with their iPhone iterations. We should see what the real differential is on 3 April, when the iPad App Store launches in the US.
iPad App Store details leak online
Apple's iPad App Store is set to launch on 3 April, but the first screens and details of the new software store for Apple's slate computer have started leaking out onto the internet.
The iPad App Store features the same bottom tab bar layout as the iPhone's App Store.
Hot or not?
The What's Hot and Featured pages are currently populated with iPhone apps, though we expect to see them populated with dedicated iPad apps at launch on 3 April.
The Top Paid and Top Free iPad-specific Apps lists are displayed side-by-side in groups of 10, with the Top Revenue iPad Apps list below them.
When searching for apps, results will show both iPhone and iPad apps.
Apple iPad is jailbroken prior to launch
It looks like the Apple iPad has been hacked, or 'jailbroken', prior to its 3 April US launch next week, if infamous hacker George Hotz is to be believed.
Hotz is the famed hacker behind recent iPhone and PS3 exploits and claims to have already cracked the Apple iPad prior to its US launch.
Hacks for technophobes
Hotz is best known for hacking Apple's iPhone, allowing users to use the phone on any network and to install unauthorised, cracked apps.
Hotz has also claimed to have hacked the PS3, though his hack still does not allow gamers to run cracked games on Sony's console.
Hotz's iPhone-cracking software blackra1n offers the most technically-inept iPhone user a one-click jailbreaking solution, much to Apple's dismay. Hotz now says that blackra1n "will probably work on iPad too".
We should see at the 3 April launch whether he is right, or whether Apple has managed to work some sort of block to Hotz's software into the iPad before it arrives.
Is Apple's 'iAd' Steve Jobs' 'next big thing'?
While all of the current wave of Apple pre-launch hype is focused on next week's iPad launch, it is rumoured that Steve Jobs' "next big thing" is going to be a personalised mobile advertising platform, possibly called 'iAd'.
Online Media Daily reports that Apple's new mobile ad platform will be officially unveiled to the ad execs on New York's Madison Avenue on April 7 and has been described as "revolutionary" and "our next big thing" by Steve Jobs.
Jobs on Madison Avenue
As ever, Apple is not responding to requests from press for further information, but the fact that Apple bought mobile advertising developer Quattro back in January for nearly $300 million surely means that the Cupertino computing giant has some plans in the mobile ad arena.
"The war has been mounting ever since Google introduced its Android mobile operating system to compete with Apple's iPhone, and agreed to acquire mobile ad firm AdMob for $750 million, but it is expected to reach ballistic proportions following Apple's April 7th announcement, which insiders say will be every bit as important as other recent marketplace introductions, including the iPod, iTunes, iPhone and iPad launches," writes MediaPost's Joe Mandese.
"Apple appears to have been more successful in its revenue diversification, developing substantial software and service businesses, including iTunes downloads, iPhone wireless subscriptions, and App Store downloads. And while advertising has always loomed as a huge possibility for Apple, it had essentially been unexploited as a business model until Apple acquired Quattro."
Jobs and Schmidt spotted in coffee shop
Is this why Steve Jobs was spotted chatting over lattes with Google chief and former Apple board member Eric Schmidt the other day?
The smart money is on Apple offering advertisers a hypertargeting capability for location-based advertising so that consumers get ads based on their locality at any given time.
"Everyone will be following this very closely," says Josh Lovison, the mobile lead at Interpublic's Emerging Media Lab, adding: "Given the way that Apple is able to package things up, with very slick presentations, it will be interesting to see what they do with that advertising."
Via MediaPost.com
Review: Apple Aperture 3
Any Windows user who is serious about photography has one good option for software to manage their collection and edit their photos: Adobe Photoshop Lightroom. On the Mac, though, we also have a solid competitor, and it's from Apple itself: Aperture 3.
The timing of the release of Aperture 3 is awkward. The next version of Lightroom is in public beta, but we think it's unfair to compare a beta release to a shipping product, so throughout this review, we'll be referring to the currently shipping version, Lightroom 2. (Lightroom 3 promises, among other features, watermarking, richer printing and slideshow capabilities, a sophisticated noise reduction engine, and the ability to add film grain.)
Aperture 3's big thing is its iPhoto heritage and integration. While it was possible to import iPhoto libraries before, it was a particularly imperfect system; now, everything is correctly ported across, though the process can take some time.
And it's not just the import; throughout, Apple has tried hard to make iPhoto users feel at home.
Aperture improvements
There's little such help given to Lightroom users, and indeed, the process of switching from Lightroom to Aperture (or in the other direction) is a painfully manual one.
Metadata – EXIF, IPTC, GPS – should be retained, but you can kiss goodbye to your lovingly made adjustments. This alone means that few pros will switch wholesale from one to the other – though there's nothing stopping you referencing the same masters from both apps – and frankly, we get the feeling Apple's cool with that.
There are few Mac-using semi-pro and pro photographers in the world compared to the millions of potential sales from the existing iPhoto install base. It's a shame that so little help is given to Lightroom switchers, however, as Aperture 3 has a huge amount to offer.
It inherits Faces and Places from iPhoto. The former mixes face detection and face recognition to tag people in your photos automatically, which is useful for enthusiasts and commercial photographers alike, not only when browsing their libraries but also when building, say, Smart Albums.
Places is the geotagging engine from iPhoto '09 that uses information on where photos were taken (embedded at capture by only a very few cameras, none of which, currently, are pro-level) to pin them to a map.
Here, though, as well as letting you drop untagged photos onto the map manually (reverse-geocoding them so that you can search for your pictures of Big Ben, for example, without having to add the keyword yourself), it can use a series of waypoints from outdoor GPS devices (or iPhone apps such as Trails), matching up timestamps to geotag your photos for you.
It's smart, too: knowing that cameras' clocks aren't always set correctly, Apple lets you place your GPX trail on Aperture's map and then drag a single photo whose location you know onto that trail; it then offers to geotag all the others.
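The workflow boils down to a fairly simple matching problem. Here's a minimal sketch of the idea – not Aperture's actual code, and the trail and photo data are hypothetical (a real GPX file would need proper parsing) – using one photo with a known location to work out the camera's clock offset, then geotagging the rest against the nearest waypoint in time:

```python
from datetime import datetime

# Hypothetical GPS trail as (timestamp, lat, lon) tuples, and photos
# carrying the camera's (possibly wrong) timestamps.
trail = [
    (datetime(2010, 3, 27, 14, 0, 0), 51.5007, -0.1246),   # Big Ben
    (datetime(2010, 3, 27, 14, 30, 0), 51.5033, -0.1196),  # London Eye
]
photos = [
    {"name": "IMG_0001.jpg", "camera_time": datetime(2010, 3, 27, 13, 2, 10)},
    {"name": "IMG_0002.jpg", "camera_time": datetime(2010, 3, 27, 13, 31, 40)},
]

# The user drags one photo whose location they know onto the trail; the gap
# between its camera timestamp and the matching waypoint gives the clock offset.
known_photo = photos[0]
clock_offset = trail[0][0] - known_photo["camera_time"]

def nearest_waypoint(t, trail):
    """Return the (lat, lon) of the waypoint closest in time to t."""
    return min(trail, key=lambda wp: abs(wp[0] - t))[1:]

# Geotag every photo using its offset-corrected timestamp.
for photo in photos:
    corrected = photo["camera_time"] + clock_offset
    photo["lat"], photo["lon"] = nearest_waypoint(corrected, trail)
    print(photo["name"], photo["lat"], photo["lon"])
```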
The third big new feature is Brushes, similar to Lightroom 2's Adjustment Brush. Rather than applying corrections globally, you can paint them on selectively.
Brushes support pressure-sensitive graphics tablets and offer a range of modes, including a Photoshop Quick Mask-like red overpainting that makes it easy to identify your retouching.
All edits are non-destructive, and while it is possible to apply more than one instance of each adjustment, that feature is a bit hidden away. There's no way to paint on exposure adjustments (though you can do dodging and burning) and we miss Lightroom's graduated filter.
A suitable Mac
Aperture 3 now has a range of presets, such as cross-process effects, filtered black and whites, and white balance nudges. You can save and share your own, and import others; some pros are already selling preset packs.
Given sufficient hardware, performance is good. Our 2.66GHz MacBook Pro did stutter a bit with 4GB of RAM when dealing with big raw files, but when Crucial helped us stuff it with 8GB, everything went buttery-smooth.
Still, dealing with raw files is something that takes a lot of grunt, so if you're going to be spending a lot of time working with photos from an SLR or high-end compact, you'll need a capable Mac.
As 64-bit apps, both Aperture 3 and Lightroom can address more than 4GB RAM when running on 64-bit processors. We experienced some early stability issues with Aperture, but the 3.0.1 update seemed to nix them completely.
Aperture is better than Lightroom at getting your pictures out. As well as direct uploading to Flickr and Facebook, its printing and slideshow features are richer, and you can also create book layouts and then send them to Apple – or even to a few vetted third parties – to have them printed and bound.
Aperture feels predictably more Mac-like than Lightroom, and iLife/iWork apps' media panels can grab pics from Aperture much more easily. And as well as the workflow and cataloguing benefits of Faces and Places, we appreciate other pro-level features such as the ability to split a project off into a separate Library for an assistant, say, to work on, before merging it back into the master Library.
But if you've already built a Lightroom library, you'll waste hours trying to migrate to Aperture; for many, it won't be worth it. If you haven't yet implemented a photo workflow, or if you're an iPhoto user bumping your head against the limits of its abilities, Aperture 3 is wonderful.
In Depth: How to choose the best home networking option
Wired, wireless, you name it: from USB-based solutions through to powerline options, there's a networking solution for everyone. In this article we'll look at just how fast Gigabit LAN can get, whether you can stream HD movies over powerline gear and just what difference a Wireless-N connection could make to your existing network. It's time to discover whether you should upgrade...
Maybe you're running an ageing Wireless-G network or perhaps you've got Gigabit onboard, but are wondering if it's worth investing in a new switch or router to take full advantage of it? Perhaps you're eyeing up powerline as an alternative but don't know how well it'll perform? Or perhaps you're wondering if Wireless-N is right for you? What are the options for those planning on extending, upgrading or building a network?
Wired networks are still the world's favourite method of connecting PCs. Ethernet networks have been around for decades; they're robust, reliable, easy to work with, standardised, extendable, cheap and fast. Those are all words we like.
Ethernet is so ubiquitous that every motherboard produced has a port integrated on it – and this from an industry that tries to shave every penny from production costs. The same goes for laptops, so when standard shipping products offer it effectively for free, it's something you have to consider. Particularly when a 30m cable costs as little as £7, and that's long enough to reach any corner of most houses.
The beauty of Ethernet is that it's fast and reliable, largely because it has been around for so long. All equipment should be at least 100BaseT compliant; the older, slower 10BaseT was superseded by 'Fast Ethernet' in the mid-1990s. The even faster Gigabit 1000BaseT standard has been widely integrated as standard since 2004, having been an option since the early 2000s.
In theoretical terms a 100BaseT network offers 12.5MB/s of bandwidth; take into account TCP and Ethernet protocol overheads and in practice file transfers tend to hover around 11MB/s.
Gigabit offers the potential of 125MB/s transfers, though real-world performance can fall far short of this for a number of reasons; still, if you're the sort who likes to throw large files around, any reduction in transfer times is a good thing. We say far short because the largest bottleneck will be the hard drives.
For our testing this is an irrelevance, but in the real world it can limit the potential maximum to 20, 40 or, at best, around 60MB/s for fast, up-to-date drives. Our test circumvents this issue and measures raw throughput, in this instance of the Sitecom 300N-XR. We were pleased with the 101MB/s peaks and an average transfer of 95MB/s, at an average CPU usage of 36 per cent.
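As a rough rule of thumb, converting nominal link rates into usable transfer speeds looks something like the sketch below. The 10 per cent protocol-overhead figure is an assumption for illustration only; real Gigabit transfers, as our 95MB/s average shows, fall further short thanks to drives and other bottlenecks.

```python
def usable_throughput(link_mbps, overhead_fraction=0.1):
    """Convert a nominal link rate in megabits per second into an estimated
    file-transfer rate in megabytes per second, knocking off a rough
    (assumed) allowance for Ethernet framing and TCP/IP overhead."""
    raw_mb_per_s = link_mbps / 8.0  # 8 bits to the byte
    return raw_mb_per_s * (1 - overhead_fraction)

for name, rate in [("100BaseT", 100), ("Gigabit", 1000)]:
    print(f"{name}: {rate / 8:.1f}MB/s theoretical, "
          f"~{usable_throughput(rate):.0f}MB/s after protocol overhead")
```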
The final issue if you want to roll out Gigabit in your home is that you have to have an end-to-end solution. It's no good connecting your Gigabit laptop to a 100BaseT router and then that to your Gigabit-capable home server. There are few to no options for infrastructure upgrades at the moment; it seems all ADSL/cable routers still only offer 100BaseT ports.
You either need to invest in a Gigabit switch – five-port models can be picked up for around £30, such as the Netgear GS605 – or opt for a suitable wireless router that happens to sport Gigabit ports, such as the Sitecom 300N-XR Gigabit Wireless router for around £90.
One final mysterious area we want to clear up is the type of cables you need to use to enjoy Gigabit speeds. Simply put: even CAT5 will work as well as CAT5e or even CAT6. Technically yes, CAT5e and CAT6 are superior but generally the extra expense isn't necessary even over long runs.
Just to test this out we re-ran our Gigabit tests with some 15-year-old CAT5 cables that we had lying around. There wasn't a jot of difference in performance – well, perhaps a 5MB/s drop in the minimum recorded speed, but the average remained the same.
There's no arguing that Gigabit is fast, very fast. But wiring your house up with CAT5e isn't the most spouse-friendly – or indeed house-friendly – activity. You can drop cables under floorboards and behind the skirting, but you're going to need Ethernet faceplates at some point, or end up tacking cables to walls for a less desirable finish.
Powerline adaptors seductively promise both the solid reliability of a wired network, combined with the cable-minimising side effect of using your home's existing power sockets as network ports.
It's a clever trick: the mains power in your home works on a 50Hz sinusoidal wave. Powerline adaptors 'imprint' on this low-frequency wave an ultra-high-frequency signal in the high MHz range, which is carried around the home piggybacking on the standard 50Hz power signal. As it reaches the other connected powerline adaptors it can be decoded and the networking data extracted. How high you make the modulated carrier frequency determines how much data you can transmit.
As with wireless networking, powerline adaptors started off slow – around 14Mbps – but have steadily progressed in speed from 85Mbps to 200Mbps and even, supposedly, 1Gbps from the latest Belkin devices.
WHY ADD MORE WIRES: Powerline networking piggybacks on your existing electrical network so you don't have to lay extra cable
They offer a blindingly convenient solution: you can have your ADSL router wherever you need it, plugged into its power socket, with a powerline network adaptor and its corresponding network cable right beside it. Cut to the living room, where another powerline adaptor is found, again with a network cable connected to your media PC. If need be, a third connects to a server and a fourth to the gaming rig upstairs.
As PCs need to be near power sockets anyway, you're guaranteed to only need a short network cable to seal the deal.
So what's the downside? All of that extra modulation and demodulation, plus riding on the back of the mains signal, introduces additional latency – at which point every gamer just did a nice vampire-seeing-a-cross impression. Let's not overreact, though: in testing we saw added latencies of around 4 to 6ms, which at worst is less than 10 per cent of your typical total latency.
Potentially more annoying is the prospect of powerline networks being affected by mains interference. Appliances can create spikes on the mains when switched on and off, and devices such as washing machines and hair dryers can generate high-frequency interference.
High-speed HomePlug AV adaptors, such as Devolo's, are designed to plug directly into the wall socket, with any extension leads then plugged into the adaptor itself. This helps eliminate interference, as it can be filtered out. Additionally, frequency-hopping technology helps reduce interference effects.
There's also the issue of security: potentially, anyone within the same building – or even in the direct vicinity – could receive the network traffic. Again this is catered for, in the form of 128-bit AES encryption. The Devolo adaptors provide software that enables you to issue a password to each connected adaptor from a single location, though you'll need the security code – similar to a MAC address – of each adaptor plugged in.
A more annoying issue is future compatibility. Currently the HomePlug Powerline Alliance maintains the HomePlug standard, which is the most widely used. An alternative international standard, called G.hn, is due to roll out in 2010. It's an idealistically pure standard, devised to carry network signals over any existing home wiring, be it power, telephone or coaxial.
In a way it's a non-issue, as G.hn isn't out at the time of writing, but it does mean that if anything new and shiny appears it won't work with your existing HomePlug adaptors – though, being a happy and understanding lot, they should all coexist together.
Let's face it, wires are for losers. All the cool kids are down Starbucks with their Macs updating their MySpace pages, while slurping a decaf fat-free mocha choca latté with extra sprinkles, just to make sure you had to wait in line for as long as possible. Little does the loafing wireless user know it, but behind that connection sits a vast array of complex protocols and signalling technology.
If you look into the 802.11 standard, it's a hideous mish-mash of compromises that, if anything, has only just been sorted out with the ratification of Wireless-N. Why do we say that?
Take 802.11a: this ran at 5GHz and used a modulation system called OFDM (Orthogonal Frequency-Division Multiplexing). 5GHz was the preferred frequency range, but back in 1997 it was only available for this use in the US and Japan. The compromise was that the rest of the world used 2.4GHz for 802.11b.
WI-FI ROUTER: Switching to the 5GHz band can mean transmitting twice as much data each time
The problem is that this is a horribly noisy band, competing with microwaves, DECT phones, Bluetooth and military 'airborne devices'. So it had to use a less efficient modulation system: DSSS for the 1Mbps and 2Mbps speeds, with CCK (Complementary Code Keying) for the 5.5Mbps and 11Mbps speeds.
802.11g upped speeds to 54Mbps by using the OFDM system, but this slaughters the maximum usable distance, offering no range improvement over 'B'.
To muddy the water even more a raft of manufacturers jumped on the 'turbo G' bandwagon. This makeshift standard used a number of techniques to improve the speed and distance of communication over standard 'G'.
The first, from Atheros, dubbed 'Super-G' and RangeMAX, used channel bonding. As the name suggests, this makes use of two channels to broadcast data, doubling throughput to 108Mbps, but at the expense of potentially interfering with other wireless devices. Broadcom released a competing standard with a claimed throughput of 125Mbps that used a combination of aggressive compression and frame-bursting to boost bandwidth over a single channel.
The final crowning glory was the introduction of MIMO. At its simplest, MIMO uses multiple spatial data streams, sent via multiple antennas, to transmit more data in one go and with better reliability.
With the ridiculously elongated ratification process of 'N' the industry released its own MIMO-G options, then Pre-N kit, then Draft-1 N, Draft-2 N and now the dual-band capable 2.4/5GHz 'N' products. Thankfully, Draft-2 products are highly mature and the speed they're capable of is impressive.
WIRELESS GADGETS: You might have some trouble connecting this up with Ethernet cable so you might want to consider going wireless
The idealistic 300Mbps throughput is entirely fantastical – devised only in a laboratory – but we did record peaks as high as 17MB/s in a same-room scenario, and average transfers of 14MB/s are entirely possible. For day-to-day performance, expect around the 8-10MB/s mark.
What you plan to do with your new connection plays a massive part in which one to opt for. If, for whatever reason, you want to throw around a lot of heavy-duty file transfers, nothing can touch Gigabit for speed. The downside is that you have to have hardwired CAT5 or CAT6 cabling in place between every machine. If you can meet that criterion then welcome to 100MB/s transfer city; it's a lovely place to stay.
More likely is that you're going to opt for the powerline or wireless solutions. With both of these you're going to want to keep bandwidth requirements at the back of your mind. But just how much do you need?
The short answer for most applications is: not that much. Take 'super-fast' 20Mbps broadband; that's a mere 2.5MB/s, a full load that every powerline adaptor can handle without breaking a sweat.
Equally, online gaming demands are very low; for instance, World of Warcraft uses a minuscule 2KB/s, while the more demanding Call of Duty 4 averages around 75KB/s, peaking at 150KB/s. A bigger issue with online play is latency, but even here the addition of a 4 to 6ms delay isn't too awful in the grand scheme of things.
The biggest strain on a wireless or powerline connection is streaming media around the home. We can discount audio: even high-bitrate audio at around 320Kbps is only going to consume 40KB/s. The big issue is video; not only does the connection need plenty of bandwidth for the stream, it also has to be reliable and have enough overhead to absorb any drop-outs.
At the low end, a good SD DivX runs at about 1,200Kbps, which is no more than 150KB/s – again, no problem. Move up to a good quality 720p HD stream with DTS audio, though, and this can be 6,000Kbps, around 750KB/s. Jump to 1080p with DTS and it's more along the lines of 9,000Kbps, or 1.1MB/s. At the far high end you could stream a raw Blu-ray rip running at 36,000Kbps, which will consume around 4.5MB/s.
At this point we're starting to hit the limits of powerline technology and 802.11n under certain circumstances. Even at this extreme end all the solutions on test are up to the job, if that's the only stream on the network.
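To budget your own setup, a quick sketch like the one below does the sums. The stream bitrates are the figures quoted above; the real-world link speeds and the 25 per cent headroom allowance for drop-outs are illustrative assumptions, not measurements.

```python
# Approximate real-world link speeds in KB/s - illustrative assumptions.
LINKS = {
    "Wireless-N (day-to-day)": 9_000,   # the 8-10MB/s mark quoted above
    "200Mbps powerline": 5_000,         # assumed real-world figure
    "Gigabit Ethernet": 95_000,         # our measured average
}

# Stream bitrates in Kbps from the article, converted to KB/s.
STREAMS = {
    "SD DivX": 1_200 / 8,
    "720p HD with DTS": 6_000 / 8,
    "1080p HD with DTS": 9_000 / 8,
    "Raw Blu-ray rip": 36_000 / 8,
}

def can_carry(link_kb_per_s, demands_kb_per_s, headroom=0.25):
    """True if the link covers the combined streams plus some spare
    capacity to absorb drop-outs."""
    return sum(demands_kb_per_s) * (1 + headroom) <= link_kb_per_s

wanted = [STREAMS["1080p HD with DTS"], STREAMS["SD DivX"]]
for name, speed in LINKS.items():
    verdict = "fine" if can_carry(speed, wanted) else "too slow"
    print(f"{name}: {verdict} for a 1080p stream plus an SD stream")
```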
Considering the price drops in both powerline and wireless kit it's possible to upgrade your existing network for about £100 to handle such uses. Of course, a Gigabit network will handle multiple HD streams, if you can get your better half to agree to a little wire play.
Review: Buffalo Dualie 500GB
The Buffalo Dualie is a portable hard drive and a docking station for your iPhone or iPod.
It connects to your Mac via USB, enabling you to sync or otherwise manage either device while it's in the dock. But as it's also mains-powered, you can recharge your phone or audio player even when your computer's switched off.
Behind the dock sits a 500GB HDD (supplied). Connecting through the dock's rear USB outlet, the drive can be removed and carried around just like any other portable drive.
When used outside the dock, USB and FireWire 800 connections are available, with no power supply required. The Dualie even has two extra USB ports on the back, giving you a hub to connect further USB peripherals.
At the time of going to press it was launching as an Apple Store exclusive, but it might be on sale elsewhere by the time you read this.
The Dualie looks the part, with both the dock and the hard drive sporting a brushed aluminium and rubber finish. Your iPod touch or iPhone looks right at home there. It's perfectly compact too, taking up only 10cm by 10cm on your desk.
In our QuickBench tests, the drive proved admirably speedy with the front dock bare, but docking an iPod or (especially) an iPhone slows it down by up to 25%. It's best to make sure only the hard drive is docked when transferring large files.
It's a little expensive too, but if you wait for the inevitable price drop, it's an excellent addition to any Mac owner's desktop.
Review: IDAPT I3
Rather than wrestling with a spaghetti of cables, trying to find a spare plug socket, and plain losing chargers, buy the IDAPT I3.
The idea is simple: plug it into one power socket, and click your gadgets on top to charge them. There's space for three, and the genius of the system is that each charging socket can be pulled out and replaced with a different 'tip' for a given gadget.
You get six in the box – iPhone/iPod (works fine with the 3GS), Micro-USB and Mini-USB, and tips for the latest types of Sony Ericsson, Nokia and Samsung mobile phones. And there's more to buy online.
Each costs around £6, with options for rechargeable AA/AAA batteries and the iPod shuffle, and there's more to come. Plus, it works.
However, some gadgets might sit at an odd angle – our Kindle felt precarious wobbling on its Micro-USB connection – and you might have to shuffle the tips around to get everything to fit. But it's still a great, convenient system, with all gadgets charging simultaneously. There's a power switch on the back too.
It might not suit all combinations. There's no straight USB-A socket for charging gadgets such as Jawbone's series of Bluetooth headsets, and some things, thanks to bulk or where their charging points are, just won't fit.
It's supplied only with a British power plug, but since it's a standard C8 (figure-eight) connector on the back of the I3, there's nothing to stop you plugging in a different power cable when abroad; it supports 85-240V.
Review: Nvidia GeForce GTX 480
The Nvidia GeForce GTX 480 is indeed the fastest single GPU graphics card in the world.
The air of relief is palpable as the great and the good of Nvidia gather beside this latest graphical opus in its downtown Paris office.
The relief is not just ours at having finally got hold of a working sample of the GeForce GTX 480; the card also represents the culmination of a lot of hard work, a lot of missed launch slots and a lot of rumour-mongering in the world's tech press.
It's one hell of a relief for everyone at Nvidia associated with the GeForce brand.
The opening slide of the inevitable PowerPoint-a-thon is simply one word standing clear on a black background: finally.
So yes, finally it is here. The GF100 GPU - known as Fermi - exists outside of the rumour mill and has its first derivative card: the GTX 480.
GeForce 400-series
There will be others along very soon, most notably the cheaper GTX 470, but this graphical behemoth represents the top end of Nvidia's Fermi launch.
Originally pencilled in for a pre-Christmas debut, the DX11 riposte to AMD's HD 5xxx series of graphics cards has seen innumerable delays, sparking fears that something had gone badly wrong with Nvidia's brand new silicon.
We had expected the fastest derivative of the new Fermi architecture to be around the £600 mark - putting it in direct competition with AMD's fastest graphics card, the ATI Radeon HD 5970.
But with a recommended launch price of just over £400, it's immediately obvious where this card sits without even having to look at the benchmarks.
It sits slap-bang in between the two top AMD cards - the Radeon HD 5870, still topping £300, and the HD 5970, knocking on the door of £600.
With a completely redesigned architecture and a half-year delay, there's no way Nvidia could afford to undercut the competition with this part, so we'd already expected the performance to reflect the pricing.
The GeForce GTX 480, representing the top-end of the Fermi spins, does hint at some trouble in the yields from Nvidia's 40nm wafers.
Problems with production?
The full GF100 chip sports 512 CUDA cores (previously known as stream processors, or shader units) across its 16 streaming multiprocessors (SMs). However, the GTX 480 is only sporting 480 of those CUDA cores, supporting the rumour that in order to improve the yields of the 40nm Fermi wafers Nvidia had to sacrifice some of the shader units.
As Tom Petersen, Nvidia's Technical Marketing Director, told us, "it's not the perfect [GF100] chip."
So it's wholly possible that yields of fully functional chips, running all 512 cores, were simply too low to be commercially viable. Switching off 32 of those cores is thought to have upped the number of usable chips from each 40nm wafer enough to make the part worth producing.
Of course, Nvidia doesn't agree this is what was responsible for the many delays to the launch of a Fermi-based card. Nvidia puts the blame for the delays down to DirectX 11 itself. The big, sexy buzzword surrounding Microsoft's latest graphics API is tessellation, and it's this feature Nvidia claims was responsible for the card's tardiness.
Nvidia sees tessellation as the great hope for PC gaming going forward and was so desperate to make sure it had the best architecture to take advantage of Microsoft's new API that it was willing to wait half a year before releasing a graphics card capable of using all that tessellated goodness.
While initially this seemed like a risky strategy, it looks like it could pay off well for the green side of the graphics divide.
AMD made sure it had DX11 cards ready and waiting (though not necessarily on the shelves, thanks to its own yield issues) for the launch of DX11, but in architectural terms Nvidia claims AMD's design only really pays lip-service to tessellation.
With the single tessellation unit of AMD's Cypress chips compared to the 16 wrapped up in Fermi's SMs (one per SM), it's immediately obvious where Nvidia sees the future of PC graphics.
Despite the fact the GF100 GPU is a big, blazing hot, power-hungry beast of a chip, it is also rather neatly designed. In terms of layout it's very much like a traditional CPU, with independent sections all emptying into a shared Level 2 cache.
This means that it's designed from the ground up to be well versed in the sort of parallelism that Nvidia is aiming for from its CUDA-loving graphics cards.
This parallelism was initially taken as proof positive that Nvidia was abandoning its gaming roots, making more concessions architecturally to the sort of high performance computing (HPC) that gets used to solve advanced computing problems.
The GTX 480, for example, is the first GPU to be able to achieve simulation speeds of 1ms/day when plugged into the Folding@home network.
But speak with Nvidia's people and it becomes clear that this architecture is also perfect for taking advantage of today's demanding 3D graphics too. The GTX 480 has 15 active streaming multiprocessors, each housing 32 of Nvidia's CUDA cores.
Each of these SMs can tackle a different instruction to the others, spitting the results out into the shared cache, before starting another independent instruction.
This means the GTX 480 is able to switch from computationally heavy work - physics, for example - to graphics far quicker than previous generations could. Each of these SMs has its own tessellation unit too, so for particularly intensely tessellated scenes the chip is able to keep running other instructions at the same time.
AMD's solution outputs its data directly into the DRAM, compared with Nvidia's solution that keeps everything on the chip. This means the GTX 480 isn't overloading the valuable graphics memory quite as much.
With this focus Nvidia's hoping that, unlike when the first DX10 cards came out, this top-end card will still be a worthy part when full DirectX 11 games hit the shelves with heavier and heavier reliance on tessellation.
We've already seen the DX11-lite titles such as STALKER: Call of Pripyat and DiRT 2, which bolted on some effects to buddy up with the launch window and possibly some shared budgets with AMD, but it's only really the likes of Metro 2033 that are developed with DX11 more in mind.
In our tests it was Metro 2033 that brought the GTX 480 to its knees, especially when we ramped up all the effects at Nvidia's target resolution of 2560x1600.
So it will run the DX11 games of the future, but maybe not at such eye-bleedingly high resolutions. Still, AMD's HD 5870 struggled to even manage a single frame per second on Metro 2033's top settings compared to the, still judderingly slow, GTX 480 numbers of 18fps.
So there it is then in black and white; the GTX 480 is definitely faster than AMD's HD 5870. But then we all knew it would be; Nvidia could in no way justify coming out with a card six months later than its main competition and not be faster.
But how much faster? Contrary to popular tech press rumours, it is by a fair margin across the board.
Impressive performance
The first DX11 benchmark, Unigine's Heaven 1.0, is a test that heavily incorporates, and ably demonstrates, the power of tessellation. At the standard HD res of 1920x1080 the GTX 480 is over 35 per cent faster than the HD 5870 and in the mutated DX11 goodness of STALKER: CoP it's more than twice as fast.
Call up AMD's DX11 flag-bearer, DiRT 2, and the GTX 480 comes in at 52fps. In the more modest surrounds of the DX10 stalwarts, Far Cry 2 and World in Conflict, the numbers still favour the Nvidia card.
But the GTX 480 is definitely not the fastest graphics card on the planet. Nvidia will have to make do with the runners-up prize of the "the fastest single-GPU card on the planet" tagline as AMD's dual-GPU HD 5970 still rules that roost, dropping frames to the GTX 480 only in DiRT 2 and STALKER: CoP.
In the only tessellation-heavy benchmarks we've got outside of Nvidia's own tech demos, Heaven and Metro 2033, the AMD card still shows a fair lead. Of course that ought to be expected from a multi-GPU solution, but it shows that even with only two of its Cypress tessellation units it can best the 15 in the GTX 480.
The HD 5970, though, is still, at its cheapest, a £560 card - and only then if you can find it in stock anywhere. And this brings us to something of a surprise feature of the first Fermi-based cards: the price.
Now, at £429, the GTX 480 is by no means a bargain GPU, but it's definitely far cheaper than the £600 price tag that was being bandied around only a few months ago by people reportedly in the know. Considering the HD 5870 is currently sitting around the £299 mark, that gives the GTX 480 a nice little price point to fall into.
This is obviously how Nvidia has come up with the price, knowing how the GTX 480 compares to the competition. The £429 price is the most it felt it could charge before gamers would figure they might as well go for an HD 5970 instead.
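Put those numbers side by side and the pricing logic becomes clear. The sketch below uses the prices and the rough 35 per cent Heaven lead quoted in this review; treat it as a crude illustration rather than a definitive value metric.

```python
# Figures quoted in this review: launch/street prices and the ~35 per cent
# lead the GTX 480 holds over the HD 5870 in Unigine Heaven 1.0.
hd5870 = {"price_gbp": 299, "relative_perf": 1.00}
gtx480 = {"price_gbp": 429, "relative_perf": 1.35}

price_premium = gtx480["price_gbp"] / hd5870["price_gbp"]
perf_premium = gtx480["relative_perf"] / hd5870["relative_perf"]

print(f"GTX 480 costs {price_premium:.2f}x the HD 5870's price")
print(f"...and is roughly {perf_premium:.2f}x as fast in Heaven 1.0")
print(f"Performance per pound vs the HD 5870: {perf_premium / price_premium:.2f}x")
```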
The extent to which the GTX 480 bests the HD 5870 may be a bit of a surprise to those who expected the card to flop due to its reputed yield issues, but we're still surprised by how much of a lead the HD 5970 keeps.
Nvidia is sure its approach to the key DirectX 11 feature of tessellation is far superior to that of AMD's HD 5xxx series, yet the HD 5970 manages to beat the GTX 480 in all the independent, tessellation-heavy benchmarks we used.
That card isn't even the equivalent of two HD 5870s either, being more akin to twin HD 5850s. Still, it proves the GTX 480 is the fastest single-GPU card available right now.
Addendum: the DiRT 2 performance shown here is for the demo, which runs in DX9 mode. In the full release of the game (which does run in DX11 on Nvidia cards), the GTX 480 gets 52fps instead of the 67.5fps shown here.
So, in essence, the GTX 480 isn't actually competing directly with any other card on the market, sitting as it does slap-bang in between the two top AMD cards in both price and performance terms. The baby brother, the GTX 470, is deliberately priced up against the HD 5870 at £299.
We may well see the first retail boards being priced above that by virtue of the typical price premium manufacturers enforce on new graphics cards.
The relative scarcity of cards may also add to the price, depending on how many units Nvidia can actually get to market. It was incredibly cagey about supply at launch, stating that there will be many customers out there who want boards but won't be able to get them.
Speaking with Asus recently, though, we learned it was surprised by the volumes it was receiving. It remains to be seen whether the yields allow for the sort of production a new board needs.
We liked:
The GTX 480 is definitely a good card in terms of both its performance and the micro-architecture. It's faster than the HD 5870 by a fair amount, which will only increase as tessellation becomes more prevalent.
With the DX11 tessellation feature touted to propel PC gaming far beyond the realms of what its console siblings can offer, the power of the GTX 480's DX11 pipelines should see it staying the course far longer than AMD's top single-GPU card. The red company's dual-GPU solution is a bigger fly in the ointment, keeping Nvidia's top card in its quite monstrous shadow.
We disliked:
Problems loom on the horizon, though - AMD is bound to take the butcher's knife to its current pricing scheme. But that's not the extent of it; there are also the spectres of the oft-rumoured 5890 and overclocked 5970s gradually coalescing. Quite what Nvidia can do to best the 5970 is tough to see.
Obviously the full complement of 512 CUDA cores may help once yields improve (GTX 485, anyone?), but there looks to be precious little extra to be got from this current GPU. It's already far thirstier in terms of power consumption than AMD's multi-GPU card, and in testing our GTX 480 was regularly hitting a finger-blistering (yes, we touched the heat-pipe...) 94 degrees centigrade under load.
If you ramp up the fan to take care of this (in the release drivers the fan speed was locked at around 60 per cent) it quickly becomes a jet-turbine of a card, with a whine you'll hear throughout your domicile.
Verdict:
We may well see Nvidia fighting harder at the lower end of the market then, taking a leaf out of modern AMD's book. The revamped architecture of the Fermi GPU means that it is far more modular than the outgoing GT200 chips so mid-range cards need not have quite so much of their hearts ripped out to hit a certain price point.
But the GTX 480 is here to stay and it looks like it's in for the long-haul thanks to a little architectural revamp to favour DX11's tessellation bent. It does blow the HD 5870 out of the water, but then only really to the extent the extra £150 ought to get you.
So it's the fastest commercial single-GPU card on the planet at the moment, beating the competition into a pulp. Unfortunately the dual-GPU joy of AMD's 5970 still wins hands-down. If Nvidia can get units onto the shelves it should do well - AMD 5xxx cards are still as rare as dog's eggs.