
In Depth: Will we ever see the iPad Air's processor in a Mac?

The short answer? No, there won't be a Mac that runs on the A7. It just doesn't have the performance. But an A9 or A10? That's less clear.
Rumours that Apple is looking to replace Intel's processors in its Mac range have circulated ever since the company first revealed that it was making its own chips for the iPhone and iPad. Apple's success has been built on integration: the idea of controlling everything from the development of the hardware to the optimisation of the software to make it the best possible experience.
Macs, however, use a lot of third-party hardware in their construction, with Intel's chips being one of the most prominent. The partnership has offered great advantages, though, because Intel's and Apple's priorities are largely aligned at the moment on one thing: power management.
Apple's best-selling Macs are its laptops, and with Intel having previously focussed on computing power over energy usage, it's now putting its considerable effort into making its mobile chips as electrically frugal as possible - a process that has resulted in the excellent battery life on the latest MacBook Airs.
Yet Intel is still miles behind the technology that underlies Apple's A-series chips in one respect. The architecture of Apple's chips has two advantages over Intel's when it comes to power use: its ARM-based design processes instructions in a way that's inherently more power-efficient, and the chips are less complex. The flipside of being simpler, though, is that they're less powerful, even if performance is improving significantly every year - the dual-core A7 in the iPhone 5s appears to be slightly more than a third as powerful as the Intel Core i5-4250U processor in the latest MacBook Air (based on Geekbench scores).
Both of these chips are dual-core, with a clock speed (meaning the frequency of operation) of 1.3GHz. That there's such a difference in performance at the same clock speed tells you a lot about the more advanced design of Intel's chips, but if Apple were to look at putting its chips in a Mac instead of a mobile device, it could make some changes to the design, particularly since it would have to worry less about the amount of power used and heat produced.
In the first instance, Apple could increase the clock speed, though this is no magic solution to better performance - increase too far and you start to get steep increases in the power needed for even modest additional gains. Some architectures are also optimised for lower-clocked operations, including Apple's.
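A rough sketch of why raising the clock is no magic solution: dynamic CPU power is commonly approximated as P ≈ C × V² × f, and higher frequencies usually demand higher voltages too. The capacitance and voltage figures below are invented purely for illustration - they are not Apple's or Intel's real numbers.

```python
# Rough dynamic-power approximation: P ~ C * V^2 * f.
# All figures here are invented for illustration only.
def dynamic_power(capacitance, voltage, freq_ghz):
    """Dynamic power in arbitrary units: capacitance * voltage^2 * frequency."""
    return capacitance * voltage**2 * freq_ghz

base = dynamic_power(1.0, 0.9, 1.3)     # 1.3GHz at an assumed 0.9V
boosted = dynamic_power(1.0, 1.2, 3.0)  # 3.0GHz at an assumed 1.2V

# Frequency rises about 2.3x, but power rises about 4.1x under these
# assumptions - the steep increase described above.
print(round(boosted / base, 1))  # 4.1
```

Under these made-up numbers, a 2.3x frequency gain costs roughly 4.1x the power, which is why chip designers prefer short bursts of high clock speed to a permanently higher clock.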

"Apple has made some very specific design choices that will make it incredibly difficult, if not impossible, to take the A7 and run it at anything like 3GHz," says David Kanter, principal analyst at Real World Tech. He gives the example of the small memory cache on the CPU - Apple uses a cache twice the size of Intel's that's accessed more efficiently. It's a great design, but it doesn't scale up to desktop chips.
Doing it this way at something like 3GHz would hugely increase the power needed for that one task - Apple has optimised for low-power, low-speed chips.
Turbo Boost
However, a more modest increase to something like 1.8GHz is feasible, especially if combined with something akin to Intel's Turbo Boost mode, which lets CPU cores run at much higher clock speeds for short periods under heavy loads.
Apple could also switch to a quad-core design, doubling the number of CPU cores available, but again, this isn't quite the fantastic solution it might sound: it only produces significant speed increases in apps that can split their processes to take advantage of more cores.
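The limits of adding cores are captured by Amdahl's law: overall speedup is bounded by the fraction of a program that can actually run in parallel. A quick sketch, using illustrative fractions rather than measured ones:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the parallel
# fraction of the work and n is the number of cores.
def speedup(parallel_fraction, cores):
    """Overall speedup when only part of the work benefits from extra cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / cores)

# A half-parallel app gains little from extra cores...
print(speedup(0.5, 2))   # ~1.33x
print(speedup(0.5, 4))   # ~1.6x
# ...while a heavily parallel app benefits far more.
print(speedup(0.95, 4))  # ~3.5x
```

Going from two to four cores here only lifts a half-parallel app from 1.33x to 1.6x, which is why quad-core alone isn't the fantastic solution it might sound.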
These two improvements both carry energy use and size penalties, but these could certainly be accommodated in a laptop form factor. Of course, Apple will also be working to improve the more fundamental design elements of its CPUs, making each core more powerful even at the same clock speed. However, while Apple is improving the performance of its chips to get near the current MacBook Air range, Intel will be bringing the energy usage of those same chips down to get that kind of performance into tablets - the opposite approach to Apple.
The current MacBook Air chips use up to 15 Watts of power, while the kind of chips in the iPad typically use more like 4 Watts, though this can rise under heavy loads. That's a big advantage to Apple, but Intel's next generation of chips, codenamed Broadwell, looks set to drop energy use by at least 30% while still offering good performance boosts.
Intel has also dabbled in versions of the current generation Core i5 chips that can use just 6W of power under lighter loads, and 11.5W when under stress. Interestingly, though, this shows that Intel's performance currently doesn't scale down as well as its power use, with that chip only outscoring the A7 in Geekbench benchmarks by 30%.
Processor performance
When it comes to processor performance, it currently looks like Apple could take the advantage in the balance of performance and energy use in the gap between the iPhone/iPad and the MacBook Air, but when you get into the flexibility needed to make chips that scale from phones and tablets up to laptops, Intel has huge advantages. The fabrication plants it uses to build its processors are years ahead of those Apple uses in terms of creating smaller, less energy-hungry components, and can produce components capable of running at lower voltages. By the time Apple can make its tablet CPUs compete with Intel's current laptop range, Intel will have moved on, and may be putting chips of the current level of performance in tablets.
But does the balance of power shift if the processor becomes less important? The Heterogeneous System Architecture (HSA) standard is a guide for building the kind of system-on-a-chip that Apple makes, but it places a much larger emphasis on using the graphics processing unit (GPU) for general computing, instead of relying on the CPU for almost everything. The reason is that, though the CPU is good at many tasks, it processes things serially, working through a problem one step at a time.
The GPU is designed to work in parallel, computing large amounts of data simultaneously in intensive tasks. It can not only do these kinds of tasks faster than a CPU, but also much more efficiently, using less power to do the same job.
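The serial-versus-parallel distinction can be sketched in plain Python: a CPU-style loop walks through the data one element at a time, while a GPU runs the same per-element 'kernel' across many elements at once. The image-brightening example is invented for illustration; in real code the kernel would be dispatched to the GPU via an API such as OpenCL.

```python
# CPU-style serial processing: one element at a time.
def brighten_serial(pixels, amount):
    out = []
    for p in pixels:
        out.append(min(p + amount, 255))  # clamp to the 8-bit maximum
    return out

# GPU-style 'kernel': the per-element operation. On real parallel hardware
# this function would run for every element simultaneously.
def brighten_kernel(p, amount):
    return min(p + amount, 255)

pixels = [10, 200, 250, 90]
# Both approaches compute the same result; the difference is how many
# elements are processed at once.
assert brighten_serial(pixels, 20) == [brighten_kernel(p, 20) for p in pixels]
```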
Not all computing tasks benefit from running on the GPU, but the idea is to make sure that any task that can be made faster by moving there does so, rather than having the CPU do everything out of convenience - even small tasks such as gesture recognition or face detection. It's unlikely that Apple will adopt the HSA standard exactly, but it may implement many of the same ideas in a similar way.
Intel, unsurprisingly for a company stronger in processors than in graphics, isn't convinced that HSA is the future. Like its processors, Intel's GPUs are more powerful than those Apple uses, but the PowerVR Series 6 GPU in the iPhone 5s is the most powerful on the mobile market, and is designed to scale up easily. It's currently about eight times slower than Intel's GPUs, but a faster version of the PowerVR Series 6 with 20 of its 'clusters' instead of the iPhone 5s's four could match the raw computing performance of Intel's current GPUs.
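That scaling arithmetic works out, at least on paper, if you assume GPU throughput grows linearly with cluster count - an idealisation real chips rarely achieve:

```python
# Back-of-envelope check of the cluster-scaling claim above.
# Assumes throughput scales linearly with cluster count (an idealisation).
current_clusters = 4     # clusters in the iPhone 5s's PowerVR Series 6
scaled_clusters = 20     # clusters in the hypothetical scaled-up part
performance_gap = 8.0    # Intel's GPUs are currently roughly 8x faster

cluster_gain = scaled_clusters / current_clusters  # 5x from extra clusters
# The remaining gap would have to come from a faster Series 6 variant:
print(performance_gap / cluster_gain)  # 1.6
```

So five times the clusters closes most of the eightfold gap, leaving a 1.6x per-cluster speed increase to be found elsewhere.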

It's interesting to note that Apple is already taking a GPU-heavy route with the new Mac Pro, which features two extremely high-end GPUs but only one CPU. However, while GPU-optimised computing is a great ideal for changing the way we think of computers, there are practical problems in introducing it to operating systems with long legacies, and that can hold it back as a replacement for the CPU in more general-purpose use.
"The truth is that an awful lot of code cannot be moved over to the GPU," says Kanter. "The point of the CPU is that you spend a lot of resources compensating for poor programming. A lot of the things that architects spend a lot of time creating is just there to tolerate bad code, and the kicker is that if you run bad code on a GPU, it will just run terribly."
Of course, there's the question of which operating system would run on an Apple-chip-based laptop. If it ran OS X, it would have to be a new version adapted for the completely different architecture of Apple chips compared to Intel, and this would make the current range of OS X apps unavailable on it - Apple would have to supply a way for developers to recompile their apps for the new type of machine, though there's no guarantee that all developers will take advantage. It would also mean a third platform for Apple to support, effectively - iOS, OS X for Intel, and OS X for Apple chips.
An alternative might be that instead of adapting OS X to run on the Apple chips, Apple could evolve iOS to include features we've come to expect and rely on, such as mouse support, true multitasking and the ability for apps to pass information to each other. But aside from these changes, there's also the problem that iOS apps wouldn't fit the widescreen format of laptops, so it would need either some form of windowing or more flexibility in apps' layout and shape, which again would mean more work for developers.
Cash in your chips
With all of the issues of developer support and technical capabilities, and the fact that Intel will continue to create more powerful chips, you might wonder why Apple would bother doing any of this at all. There is another factor, though: cost. Intel's laptop chips cost manufacturers nearly $300 apiece. Apple's A-series chips are estimated to cost around $30 to produce. Now, a more complex Apple chip would need to be significantly larger, and costs increase hugely with size. But let's say that Apple were able to create a chip as powerful as the one in the current MacBook Air for around $150 - that would still make it around $150 cheaper than an Intel one. That could allow Apple to create a new lower-priced line of MacBooks or an even smaller Mac mini starting at $500/£400.
That said, Apple doesn't tend to introduce lower-priced options without strong reason, so perhaps it's more likely that Apple would keep the MacBook Airs at their current price and include features such as Retina screens or 4G as standard, giving it a huge feature lead on the competition for the price.
All of the above, though, assumes that an Apple-powered Mac would work roughly the same as current computers do. Things may change by then. Across the industry, 'the cloud' is starting to be used for actual computations, rather than just for storage. Apple's iCloud version of iWork does a lot of work on the server side, YouTube offers video editing through your web browser, and Autodesk already offers cloud rendering for some 3D modelling tasks.
We might see the return of the 'thin client', where your computer only needs a processor powerful enough to act as the interface for these apps, with all the hard work done elsewhere. In that case, even the current Apple chips might be suitable - you wouldn't need a fast computer, you'd just need fast internet. But even that still assumes a fairly traditional form factor for the Mac.
What if even the concept of what a computer is made up of has changed over the next few years? Apple has been working on technologies that make wireless connections utterly configuration-free, and that make wireless video smooth and fast. Its iBeacons technology uses Bluetooth to let devices see when they're close to each other, and pass information back and forth appropriately, while Wi-Fi Direct is being used to establish AirPlay Wi-Fi video connections without a router.
These technologies could form the basis of a system of flexible computing - your desktop computer could be simply a large-screen display with Wi-Fi capabilities, with a wireless mouse and keyboard (or whatever we use to control PCs in the future).
When you sit at your desk, the iPhone in your pocket detects the setup and gives you a custom desktop display on the screen using wireless video, letting you control it with the mouse and keyboard - the iPhone becomes your computer hardware, capable of performing light tasks itself, with the heavy lifting done by servers in the cloud. In that case, there certainly would be an Apple chip in your future Apple computer, but the Mac may be long gone.
- Now why not read 20 OS X Mavericks tips and tricks?
Read More ...
BLIP: Paws for thought? Dog-to-English translator reaches funding goal

With the untimely demise of Family Guy's Brian Griffin, there's a distinct gap in the market for talking dogs, and one group of ambitious Scandinavian inventors have convinced Indiegogo backers they can fill it.
The No More Woof headset is powered by a Raspberry Pi and an EEG sensor, and claims to be able to translate a range of canine thoughts, such as hunger, tiredness and curiosity, into the Queen's spoken English.
The Nordic Society for Invention and Discovery reached its modest $10,000 funding goal with 50 days to spare and will ship the first devices out to backers in April.
We've been dreaming of this day since bawling our eyes out watching Homeward Bound as kids (*cough* last week). Check out No More Woof in the video below.
More blips!
Woof, woof, woof, woof! That means check out these other blips...
- Rdio killd the Vdio star as spin-off service canned
- The Ministry of Magic version of Gov.uk is way better
- Star Wars gets a Tumblr, promises no cat .gifs
Read More ...
LG's intuitive display-waking Knock tech coming to all future phones
LG has decided the popular Knock feature on its LG G2 handset shouldn't be kept from the masses and is rolling out the tool to all current L Series II handsets and including it as standard in all future devices.
The neat UI tool (previously known as KnockON) allows users to turn the display on and off simply by double tapping the screen.
It made its debut on the G2 as a means of getting around the rear-facing power button, ensuring users wouldn't need to pick up the device every time they wished to wake the display.
Knock graduated to the recent LG G Pad 8.3 and is also featured on the new LG G Flex with the curved display.
Taking the Knock
Now it's coming to everyone, thanks to a software update and inclusion in the company's next wave of devices.
"Knock is distinctively an LG UX and a great example of what happens when you marry the latest in mobile technology with consumer-centric insights," said Dr. Jong-seok Park, President and CEO of LG Mobile.
"No one ever thought that a power button needed to be improved until our engineers wondered why they couldn't turn the entire screen into a power button."
The Knock tech is also LG's answer to Samsung's 'Quick Glance', which lets the user see missed calls, notifications, the time and more simply by waving a hand over the sensor.
Read More ...
Inflame: Tech rage: the angriest internet comments of 2013

The Global Politeness Index has been in free fall. Despite some moves to get us all posting under our real names, in the hope that the fear our mums might read what we say would encourage us to be excellent to each other, internet commenters remain as resolutely rude as ever.
We see it every week. Every day. It used to be that posting "First!" was the most predictable comment you'd seen under an article, but nowadays it's a race to be first to furiously type out the most aggressive and cynical deconstruction of the words and ideas above.
This year we've seen threats to shoot down Amazon's drones and to beat up wearers of Google Glass on sight, as we continue our depressing mission to collate all manner of rude fury in the role of baffled impartial observer.
But what were the biggest tech stories of 2013 and how were they best deconstructed by the cynical online commenterati?
Crash report
The most commonplace and oft-used headline of the year related to the value of Bitcoin crashing. It crashed in April, it crashed in June, it crashed again in December. If you'd bought some on all those low days you'd be laughing now, of course, such is the benefit of dealing in currencies with hindsight.
In April of 2013, you could've bought a Bitcoin for just $120 after crash #1, leading Ars Technica commenter Chuckstar to suggest bitterly: "It's a good thing Bitcoin has a fixed monetary base. It's really been helping keep the volatility down.[/sarcasm]"
And amid the flood of comments likening Bitcoin to a Ponzi or pyramid scheme/scam, IrishScott explained the future limitations of it in a perfectly simple fashion, with: "You forget that bitcoins only have any monetary value at all because of the merchants who accept them as currency. If said merchants didn't exist, bitcoins would be absolutely worthless pieces of digital cryptography."
C the light
Another thing that fluctuated wildly this year was the expected price of Apple's new "cheap" iPhone. Eventually, it turned out that the iPhone 5C wasn't the rumoured budget Apple model we were hoping for, with the plastic phone selling for just a few quid less than the 5S.
This disappointment led Trusted Reviews reader MattMe to claim of the 5C: "I don't understand why every review seems to simply overlook, or just not care that this is an iPhone 5 in a cheaper case, selling for the same, or maybe even more than the 5 would have, had they continued production. Shameless customer-raping pricing by Apple, yet again."
In response, Chris Winter raved about the unsung heroics of plastic, saying: "Some of the sturdiest, most reliable mobile phones I've used in my life have been the cheap-end Nokia phones, particularly the plastic 3000-series of the early 2000s. I even dropped one from a third story balcony, walked down, assembled the battery again, and presto."
He shoots! He [PAUSE FOR ANALYSIS] scores!
The tech world had one big crossover win in 2013, thanks to the Premier League finally giving goal-line technology the go-ahead for use in England's top league matches. You'd think people would be grateful, but no. Goal reader !999! summarised opinion, and mistrust of football's rich men, well with his comment: "Yes but how many times do we get goal line related incidents? Maybe 6-7 times in the entire season. What about players diving and winning penalties, genuine penalties not given, offside goals allowed etc?"
They're still allowed, so there's something to talk about.
Taking a tumble
Even today's super-rich footballers would've raised their tweezed eyebrows at the amount of money Yahoo threw at blogging portal Tumblr, which it acquired back in May for a whopping $1.1 billion in cash. It's OK, though, as The Verge reader Stewsburntmonkey has the answer to monetising it and making all that money back: "It isn't making much money, so buying it for $1+ billion doesn't make much sense. It seems like all Yahoo want to do is stuff a bunch of ads on Tumblr and hope it eventually pays off."
But DesignerFX has seen it before from serial tech-thing-acquirer Yahoo, suggesting it'll unfold like this: "I give it 6-7 months (post acquisition) before a goodwill writedown and 1 year + 3 months (post acquisition) before an acknowledged screwup."
Nobody's watching
And if Tumblr and all the animated GIFs it holds was the internet growth story of 2013, the hardware world was dominated by the somewhat baffling appearance of the smartwatch. After a few disastrous attempts by Sony and other hardware makers to launch smart wearable devices, we finally saw the Pebble go on sale - and Samsung quickly attempted to trump it with its Galaxy Gear.
No one seemed particularly thrilled at the prospect of using a wrist-thing to control a mobile phone, though. Beneath our very own review, NeuTroll got to the heart of the matter in just 26 words and one ironic emoticon, saying: "It's going to be a flop dear Samfans. Someone I know has one and the battery lasts just 20 hours. My G-shock has lasted 9 years :)."
And a passive-aggressive New Year to you, too.
Read More ...