Monday, September 2, 2013

IT News Head Lines (Techradar) 9/3/2013

Techradar



Optus adds a My Plan option for the low-cost user
Optus has added a $35 plan to its new My Plan range, giving the budget-conscious another option.
While Optus already offers $50, $60, $80 and $100 My Plan options, the new $35 plan will give you 200 minutes of voice calls, free SMS and MMS, and 300MB of data per month ($840 minimum total cost).
Optus CEO Kevin Russell said one of the drivers behind the new budget option was the telco's research showing that more customers on entry-level plans than on higher plans were cancelling their contracts due to excess charges.
"We estimate that an average light user is hit with close to $100 in excess usage charges over 12 months," he said.

Budget choice a good choice?

While options are good, we can't help but think that Optus may be trying to take advantage of the uncertainty surrounding MVNOs after the debacle with ISPOne, Telstra and Kogan.
However, Optus still doesn't offer the same value for money as other MVNOs. Amaysim, for example, uses the Optus network and offers unlimited calls and SMS plus 4GB of data for $39.95 over 30 days, though it doesn't offer 4G.
Woolworths Mobile, also on the Optus network, offers a $29 prepaid deal over 45 days, including 5GB of data and $500 worth of talk and text.
Interestingly, the $35 My Plan has the same inclusions as its $25 SIM-only, month-to-month plan, which means with a $35 a month My Plan contract, you're basically paying $10 more a month for a handset and will be locked into a 24-month contract.
It does mean you'll have access to decent handsets without having to pair up a My SIM plan with a My Mobile plan or bring your own phone, as well as the ability to tier-jump for a month, starting from $10 for an extra 250 minutes and $5 for an extra 300MB of data - which still isn't much data at all.
However, the automatic tier-jumping nature of Optus' My Plan will reduce high excess charges and will probably be the biggest lure for the budget-conscious, especially for those considering moving from Vodafone and Telstra.


Google's Project Loon balloons will 'flock' to maintain constant coverage
Google has explained a little more about how its Project Loon concept will be able to defy the elements and allow for consistent internet coverage to be delivered to those below.
The pie-in-the-sky initiative, which involves suspending armies of Wi-Fi-enabled balloons high above the ground, aims to provide connectivity for those in remote areas.
However, as there are no plans for the balloons to be tethered to each other, what's to stop them just floating away on the breeze? Well, Google has a plan.
The company has performed simulations showing how each balloon will be able to sense its proximity to the balloons around it and move accordingly, so that the group remains evenly spaced.
Kind of like a flock of birds...

Spread out nicely

"They [the balloons] looked at their near-neighbors and tried to spread each other out nicely," says a member of the Rapid Evaluation team at Google called 'Dan' in a YouTube video posted by the company.
"But as we move forward, we may use methods that take into account everything. So every balloon essentially will have information about what every other balloon is doing. In future, it will probably be a much more sophisticated simulation."
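The rule 'Dan' describes is essentially the separation behaviour from classic flocking ("boids") simulations. As a rough illustration - this is not Google's actual code, and the radius and step parameters are made up - each balloon can nudge itself away from any neighbour closer than a target spacing:

```python
import math

def spread_step(positions, neighbor_radius=2.0, step=0.1):
    """One update: each balloon pushes away from near-neighbours,
    so the group drifts toward even spacing (a boids-style separation rule)."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        push_x = push_y = 0.0
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            dx, dy = x - ox, y - oy
            dist = math.hypot(dx, dy)
            if 0 < dist < neighbor_radius:
                # Push away, more strongly the closer the neighbour is.
                push_x += dx / dist * (neighbor_radius - dist)
                push_y += dy / dist * (neighbor_radius - dist)
        new_positions.append((x + step * push_x, y + step * push_y))
    return new_positions

# Two balloons starting too close drift apart over repeated steps,
# settling at roughly the target spacing.
balloons = [(0.0, 0.0), (0.5, 0.0)]
for _ in range(50):
    balloons = spread_step(balloons)
```

Each balloon only reacts to its near-neighbours, which matches the local, decentralised behaviour Google describes for the first-generation simulations.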
So far Google has tested its theory in New Zealand, which it says proved 'Loon' was "a feasible project not just some crazy science project."
Check out the Google video below to take a look at the company's latest simulations.
YouTube : http://www.youtube.com/watch?v=mjyLynnQuC4

Opinion: Why the death of Moore's Law wouldn't be such a bad thing
Moore's Law is dead. Long live Moore's Law.
Rinse and repeat using the latest techno-detergents, including extreme UV lithography, and the cycle of impending doom followed by rebirth looks like going on forever.
Except it can't go on forever. So what happens when it stops? Actually, it doesn't mean an end to the progress of computing. Not immediately, anyway. In fact, it could help focus attention on areas of computing performance that currently tend to be ignored.

Sorry, what's Moore's Law again?

In the extremely unlikely event that you're unaware of what Moore's Law is, well, it's simply the observation that transistor densities in integrated circuits double every two years.
Put another way, Moore's Law essentially says computer chips either double in complexity or halve in cost - or some mix of the two, depending on what you're after - every couple of years.
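As a back-of-the-envelope illustration of how that compounds (using the commonly quoted two-year doubling period):

```python
def density_factor(years, doubling_period=2):
    """Relative transistor density after `years` under Moore's Law,
    i.e. one doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

# Five doublings in a decade: a 32x increase in density,
# or equivalently roughly a 32x drop in cost per transistor.
decade = density_factor(10)  # 32.0
```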
Processor architecture
Anyway, I've been at this technology journalism lark for almost exactly a decade. And if there's a single theme that sums it all up, it's Moore's Law.
It seemingly underpins the progress of computing of all kinds and constantly looks under threat of coming to an end. It's the opposite of nuclear fusion or a cure for cancer. Both of those seem to remain 30-odd years away but never arrive. Moore's Law looks constantly doomed to end but never actually dies.

When the music stops

Of course, one day Moore's Law will surely grind to a halt. There are lots of reasons why it might happen and some why it must happen. And not all of them are technological.
Air travel is a handy proxy here. Forty years ago it seemed inevitable that supersonic flight would become commonplace. Today's jet liners are generally little faster than most of those in the 70s, and notably much slower than one aircraft from that era, Concorde.
For air travel, then, costs and practicalities have prevented progress, not the laws of physics or the limits of scientific endeavour. We can do supersonic transport, but we choose not to.
The same thing may happen with computer chips. As transistors shrink, the new technologies required to keep the show going are getting prohibitively expensive. So, we may choose not to pursue them.

You cannae break the laws of physics

Then there's the aforementioned laws of physics. In very simple terms, you can't make devices that are smaller than their component parts, those component parts being atoms. So, that's a physical limit.
There are plenty of other physical limits before you even get down to the singular atomic scale, but you get the point.
Yes, you can shake things up with new paradigms like quantum computing. But even if that turns out to be practical, it's more of a one-time deal or a big bang event than a process of refinement that will keep things going for decades as per Moore's Law.
The question is, then, what happens when Moore's music stops? Well, in the medium term, an end to Moore's Law might actually invigorate the computing industry.

Fast chips make for lazy coders

That's because the fact of ever faster chips allows coders and programmers to be lazy. Perhaps 'lazy' is unfair. But the point is that with ever more performance on offer, there's often little to no pressure on coding efficiency much of the time.
Take that hardware-provided performance progress away and suddenly there'd be every reason to make the code more efficient. Indeed, competitions to run advanced code like graphics on seriously old chips happen for fun. What can be achieved is sometimes staggering.
Process shrinking has led to millions more transistors on a die
So, there's almost definitely the equivalent of quite a few Moore's Law cycles on offer courtesy of more efficient code.
That should apply to power efficiency as much as it does to performance. And power efficiency is beginning to take over from pure performance as the most critical metric for consumer computing devices.

Better use of what we've already got

Similarly, existing transistor budgets could be used more intelligently and more efficiently to produce more performance or use less power courtesy of improved chip design. Again, the promise of ever more transistors doesn't exactly encourage a thrifty attitude to their usage.
All of which means that for the foreseeable future, an end to Moore's Law probably wouldn't make much difference to the end-user's experience of ever-improving computing.
That said, Moore's Law does actually look good for at least the next decade of process shrinks - and that's before you even take into account possible major shake-ups like optical or quantum computing, or chips based on graphene rather than traditional semiconductors.
As it stands, then, there are problems if you look beyond ten years. But it was ever thus. And like I said, if we hit the wall in terms of the physical size of chip components, there are plenty of other opportunities.


PS4 may get headstart as Xbox One release tipped for 'late November'
While the world waits with bated breath for Microsoft to confirm when the Xbox One console will go on sale, any little hint is welcome. This weekend expectant fans were given another straw to clutch.
The good folks at Pepsi are giving away Xbox One consoles to lucky competition winners, but are informing entrants in the small print that "Xbox One has targeted launch date of late November 2013."
Microsoft has so far only committed to a 'November' release date, so late in the month would certainly be at the later end of expectations.
It could also give Sony an all-important head start on Microsoft as the PS4 goes on sale on November 15 in the United States.

Guesswork

Alternatively, Sony's planned European launch on November 29 would be in line with the latest guesswork for the Xbox One.
These new reports are contrary to recent rumours suggesting the long-awaited next-gen device would arrive on November 8.
The saga continues.


New Asus Transformer Pad with Tegra 4 processor teased for IFA launch
Teaser season is in full swing, with the IFA tech show getting under way in just four days, and this weekend ASUS showed part of its hand, suggesting a new Transformer Pad tablet is on the way.
In a series of photos posted on its Facebook page, the company promised to 'transform your mobility' while displaying glimpses of a new 10-inch tablet, perhaps a revamp of its excellent Infinity device.
"We transform your mobility! Yes, a new generation is coming, but of which product?" the company posted along with the images.
Recent rumours have suggested that the company has been plotting an upgrade of its popular Android tablet line, complete with a new Nvidia Tegra 4 processor.

Evolution

In further posts on Sunday, the company urged its Facebook fans to "get ready for the evolution of an iconic notebook design."
That could mean a new iteration of the company's Zenbook Ultrabook line may also be on the way.
All will be revealed at, or before, IFA, which kicks off in Berlin later this week.


In Depth: Looking beyond Google Glass: the future of wearable tech

Beyond Google Glass

What if we told you that Google Glass and Galaxy Gear were just the beginning? That the impending arrival of Google's super-futuristic wearable computer and Samsung's wrist-based wonder were simply the commencement of our ascent into the realms of science-fiction cyborg-dom?
Beyond the AR specs spearheaded by Google and the smartwatches in the works from Apple, Samsung and others, there are countless minds creating wearable solutions that will revolutionise health and fitness, the workplace and everything in between - from our socks to our sex lives.
TechRadar has been chatting to some of the people at the forefront of the wearable revolution, some of whom will have devices coming to market sooner than you think.
"Right now, everything in this first phase is quite bulky to wear and less than powerful," Davide Vigano, a former Microsoft executive and now CEO of Heapsylon, told TechRadar.
"After this first generation of devices, we'll see computing merge with what we wear.
"Advances in microelectronics mean more powerful tech and batteries will fit into packages as small as a shirt button. They'll be completely transparent to the wearer and those around them."

Glorified pedometers

Vigano's contribution to this wearable tech revolution is a pair of socks. No really. More specifically, the Sensoria Fitness Tracker, which also features a Bluetooth-enabled ankle bracelet. He says the product makes current devices like the Jawbone Up and Nike Fuelband look like "glorified pedometers".
Sensoria sock
The Sensoria sock is made from smart textiles, which behave like traditional sensors and communicate data to the bracelet; that data is then relayed to a smartphone app. The project reached its $100,000 Indiegogo funding goal with ease this month and will now go into production.
YouTube : http://www.youtube.com/watch?v=8WJtEY-gl4M
The idea is to prevent injuries in runners by monitoring how they land. 75 per cent of runners heel strike, which generates more impact and can cause more injuries, and 30 per cent of runners suffer injuries every year. Sensoria will allow runners to see in real time how they're performing and modify their technique accordingly to match best practice.
It's all part of what Vigano calls the 'quantified self movement'.
"These devices will allow us to gather data from our bodies and will respect our privacy," he added. "The data is carried to our personal devices and will allow us to make more informed decisions, especially when that conversation is extended to healthcare. Wouldn't it be nice to have these devices woven into what we wear rather than visiting the doctor?"
These health and fitness monitors are just one area of this fast-moving sector. Wearable tech is also permeating the fashion world. Wearable Solar has a jacket that can charge a smartphone up to 50 per cent for every hour the wearer is in the sun. Dutch design team Studio Roosegart has developed the Intimacy 2.0 dress. The leather and smart-oil dress is stimulated by the user's heartbeat. When she (or he) becomes aroused, the dress will become translucent. Wearable tech as a sex aid. Other concepts give existing, everyday items a whole new lease of life.
Intimacy dress
One example is the AirWaves pollution mask, envisioned by Frog Design after the high design company challenged its global teams to come up with wearable tech solutions. The mask, imagined by the Shanghai team, senses the level of pollution in the air, feeds that data back to a smartphone and creates a map of a city, advising residents on good and bad areas at any given time.
"I think the fantastic thing about that is, the object itself is a symbol," Brandon Edwards, executive creative director in the Shanghai studio, told us. "The team did a good job of giving it a more positive relevance in everyone's life, but the software around it is where everyone gets really excited. When you connect it to everyone who has one or connect it to other kinds of pollution monitors, you all of a sudden have maps of what's going on at any given time. From a data perspective you can unlock new ways of seeing a city."
AirWaves pollution mask
The company is now in talks with manufacturers about making the concept a reality.
It's devices like the AirWaves that'll help to reduce the levels of public scepticism surrounding wearable devices and the further invasiveness of tech into our social and private lives.
"Look at the rate at which the smartphone revolution has permeated all sectors of society - in some cases you'd never expect to see some people so comfortable with smartphones, using them every day," Edwards added. "Behaviourally, that leads to a better comfort level with technology and when you look at how that branches out, it's a natural evolution."
Socially, wearable tech presents different challenges. Users are unlikely to be tapping on touchscreens, in order to interface with these devices. They'll be speaking to them in public places, or gesticulating in thin air. How's that going to work?
"You're going to have some people experimenting and coming up with all different ways of interacting with an object, space or an environment," Edwards added. "I think that's a good thing. Let it germinate for a while and see where we land. I don't think any kind of interaction model will win out, I think it'll be a mixture, unless you get to some kind of tipping point with voice and a social norm where people are ok with talking to objects. To date that hasn't happened. I still think that's quite a ways off.
"This idea that today we can't walk about talking to thin air, because people will think we're loons, we might actually get over that when everyone starts to do it with increasing density.
"I'm not sure we'll get there, but I'm pretty sure it'll happen. If you look at the cellphone, it would be quite uncomfortable to have it out at a dinner or to not leave it in your pocket or in your purse. That was only 15-20 years ago. When you look at the last 5-8 years everyone has it out all the time and it has become its own character in a narrative of people being held together, almost like you have a second self. I think you'll see that with other types of tech as well. What feels very awkward at first will become very common."
pollution

Beyond Google Glass: revolutionary wristbands

While some of these wearable devices will undoubtedly be controlled by our voices, pioneered of course by Google Glass, bodily gestures will certainly represent another key input method for devices on our person. One company representing that approach is Thalmic Labs in Waterloo, Ontario, Canada - home of BlackBerry.
The Myo wristband, which is currently available to pre-order (25,000 people already have, for $149), responds to the electrical activity in the muscles, enabling users to control secondary devices like smartphones, PCs or games consoles through gestures.
YouTube : https://www.youtube.com/watch?v=oV0PbKZyqg8
Co-founder and CEO Stephen Lake told us: "We believe that as we move forward, the coupling between humans and technology is going to become tighter and tighter, and the line between the two will continue to be blurred. The exciting part for us is that we'll be able to use the technology to enhance our abilities and enhance our everyday lives in various ways.
"The Myo is a missing link right now. It's a wearable human computer interface, connecting the real and the digital world. Our experience with voice is that it's not really a viable input mechanism. There are certain limited cases where it works really well, but it's really challenging to be that crazy guy on the subway, talking to yourself with your funny glasses on. We're really excited about how better interface methods can improve the experience."
Myo wristband
The benefits are clearest in the work and gaming space. Controlling presentations sans a tethered clicker, or playing first person shooters with the lightweight Myo communicating your physical pull of the trigger back to the console. Healthcare and the creative industries are a big possibility too, Edwards said.
"We have a long tail of developers that have ideas all over the place. Someone has been talking to us about using this for image-guided surgery, while others are interested in creating musical content, 3D modelling or manipulations; there are thousands of different ideas. A big part of what we've been doing is trying to enable those developers to come to us with those applications and really get some more insight into which ones of those thousands of ideas are going to be the salient ones."
Eventually, according to Frog Design's Brandon Edwards, wearable tech might see a devolution of the smartphone into dedicated devices, carried on our person, stitched within our clothes, hugging our skin, performing dedicated tasks, creating the fabled bionic man.
"Personally, I compare the iPhone to a Swiss army knife," he said. "There was a time when everyone carried one, but those days have gone away. The smartphone works and it's not going to go away when these single purpose, single task objects come about, but they'll be much more intuitive and much more in tune with what someone wants to do with them.
"I get very excited about seeing the miniaturisation of so many different sorts of tasks and pulling them out of that Swiss army knife into discrete areas where they could be very, very helpful. We've only just started to scratch the surface."
Are you ready to become Human 2.0?


LG announces G Pad 8.3 tablet to tackle Galaxy Note and iPad mini
Ahead of next week's IFA tech show, LG has officially unveiled its rumoured G Pad 8.3 tablet, which it hopes will rival similarly-sized options from Apple and Samsung.
As the name suggests, the new G Pad has an 8.3-inch screen, which packs a 1920 x 1200 resolution. LG claims it as the first Full HD tablet in its size category.
The device bears an extremely strong resemblance to the company's recent G2 smartphone, with a slimline, minimalist design.
Internally, the G Pad 8.3 comes loaded with Android 4.2.2 Jelly Bean (no 4.3 out of the box), 16GB of storage, a 1.7GHz Qualcomm Snapdragon 600 processor and 2GB of RAM.
The company is also promising a new app called QPair which will enable users to manage calls and messages by pairing up with a smartphone.

Stiff competition

LG has said it will launch the device in the US and Europe in the fourth quarter of the year, but is yet to confirm pricing. Perhaps we'll find out more from the show floor at IFA in Berlin.
Just as it is in the smartphone world, LG is up against stiff competition to break the Apple and Samsung stranglehold in the 8-inch category.
Apple is rumoured to be launching a second-gen iPad mini before the year is out, while Samsung's Galaxy Note 8.0 is a highly-rated option.


In Depth: Pixar: We choose characters based on story, not technology
Disney-Pixar may be one of the most successful studios in the world, but it has never felt that it has put profit over pixels. Peel back the layers of its computer animations and what you will find are movies saturated in, well, soul.
The look of genuine fear on Woody's face near the climax of Toy Story 3. The feeling of love and loss that emanates from Carl Fredricksen during the opening scenes of Up. The unparalleled joy felt when Eve plays with a cockroach in Wall-E. These are moments swathed in raw emotion that are up there with anything any real actor can convey.
The reason for this, Bill Polson, Disney-Pixar's director of industry strategy, tells TechRadar, is simply down to how the stories are chosen: "We choose our characters based on the story, not on technology."

Monster hit

Behind the scenes, though, technology is vastly improving to make sure that the characters in these movies actually jump off the screen way before they hit cinema and during the exhaustive creation process.
This is so that when scenes from a movie are previewed they are seen not as barebones wireframes but a near-final render.
Helping to achieve this is a tech collaboration with Nvidia. Disney-Pixar used the new Quadro K6000 card ahead of its public release while creating its current box-office success Monsters University.
Monster's University
"We have been collaborating with Nvidia over the past year on real-time raytracing using their Optix technology," explained Polson.
"The K6000 is the newest most powerful card capable of running this technology, so we were grateful for the opportunity to try the project on this card."
Although Pixar continued to use its own Renderman software for final renders, according to Polson, coupling this with the K6000 meant previews were vastly improved.
"The Kepler features allow us to get interactive previews that look much closer to final render," he explained.
"Prior to this technology we were mostly looking at wireframes. Now we are looking at images that are getting close to final render. This enhances the artist's experience and lets us iterate much more quickly."
monster's university
Nvidia may have missed out on equipping the PS4 and Xbox One with GPUs to AMD, but of late it has had a nice sideline in offering professional cards to studios such as Disney-Pixar.
The K6000 has been dubbed as a pro graphics card and is being pushed, naturally by Nvidia, as 'the fastest GPU ever'.

Kepler technology

In real money, this means that the card is based on Kepler architecture and offers 12GB of GDDR5 memory, along with 2,880 streaming multiprocessor cores.
The sheer grunt the K6000 offers, according to Polson, polished the preview process and took the strain away from the mountains of memory used to make Monsters University.
Monster's University
"The added memory allows larger scenes to be represented, so we can get these interactive previews on more of the final shot," said Polson.
"We don't have to work in pieces or layers quite so much, and we don't have to do as much optimising to get the scene into memory."
Nvidia's card is just one part of progress that Disney-Pixar has seen since it released Toy Story back in 1995.
Computational power has changed drastically since. Back then it took four hours to render one frame; in 2009 when Toy Story was re-rendered for 3D it took around four minutes a frame.
Monster's University
If Disney-Pixar had used just one CPU to render Toy Story (it uses hundreds) then for an 81 minute movie in 1995 it would have taken 53 'CPU' years. In 2009 this drastically shrunk to 324 'CPU' days, proving what a difference 14 years makes.
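Those figures are easy to sanity-check. Assuming a standard 24 frames per second (the article gives only the 81-minute runtime, so the frame rate is an assumption here), the arithmetic lands very close to the quoted numbers:

```python
# Sanity-check the single-CPU render-time figures for Toy Story.
# Frame rate of 24fps is an assumption; the article only states the runtime.
RUNTIME_MINUTES = 81
FPS = 24
frames = RUNTIME_MINUTES * 60 * FPS       # 116,640 frames

# 1995: four hours per frame.
hours_1995 = frames * 4
cpu_years_1995 = hours_1995 / (24 * 365)  # roughly 53 CPU-years

# 2009: roughly four minutes per frame.
minutes_2009 = frames * 4
cpu_days_2009 = minutes_2009 / 60 / 24    # roughly 324 CPU-days
```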
And don't expect this speed of innovation to change anytime soon. Steve May, chief technology officer for Disney-Pixar told TechRadar: "We are always looking to push the visual complexity of our films to support our stories, so we are always excited about higher computational capacity."
Monster's University
This speed of computer upgrade is enough to make the 5 million hairs rendered on Sulley in Monsters University stand on end.
More so when you realise that this is up from a mere 1 million rendered for Monsters, Inc.
Monsters University is in cinemas now; all images copyright Disney-Pixar.


INFLAME: Ballmer is out. Anyone, even a trained chimp, in?
Steve Ballmer's had a mixed time at Microsoft. He took over as CEO when it was completely dominant in the desktop PC market, and he leaves with it... still really quite dominant.
Apple's had a bit of a go at it of late, and there was that year when people thought Linux might spread out of the niche DIY self-build banana crate enthusiast computer market, but when Mr Average buys a PC in 2013 it's still got Windows on it. And announcing his retirement lifted the share price seven per cent, so that's... nice.
But if you were judging Ballmer on the weight of public opinion alone, you'd think he was the worst company boss in history, presiding over deaths, switching the core of the business to manufacturing a variety of toxic gases and insisting everyone eats 20 Brussels sprouts before being allowed to check their email.
He's hated, most likely only because he's a bit mouthy, stroppy, sweaty and tends to shout. But now he's on his way out, everyone on the internet is slapping him on the back and wishing him a relaxing and long retirement, right? Not exactly.

Nerd do well

On Business Insider, someone called Someone claims Ballmer's problem was that he simply didn't have a nerdy enough background to thrive at MS or present himself to the public as a visionary figure, pointing out that: "Bill Gates at least is half a programmer who knows how to motivate talented people to outdo other talented people. One key point in this plan is the leader has to genuinely sound smart. Bill Gates sounds very smart even when he appears with other heavyweights like Steve Jobs. Steve Ballmer could only make fun of himself being dumb."
Or, as vendetta247 puts it a bit more strongly: "MS would have been better using a chimp and a multiple choice selection to make decisions."

Blue shirt of doom

Or has the big man just been a victim of poor timing and a harsh, fashion-led conspiracy against boring old Microsoft? AverageJoe on Entrepreneur reckons it's a coolness issue: "Over the past 5 years, somehow, it has become 'cool' to hate on Microsoft. A lot of this has been due to Mac fan boys mindlessly repeating marketing material, and to a lesser extent to the hipster crowd toting their Apple products around everywhere they go."
In response, Vipre416 claims it's more to do with the "decades of abuse" heaped on users by Ballmer and his various tech teams, adding that Microsoft was guilty of "almost single-handedly creating the anti-virus market with the crappy OS products."

Device and rule

Gamers invaded the discussion over at ZDNet, where talk turned to the Entertainment and Devices Division, which encompasses Xbox and is blamed for recent flops like Zune and Kin.
Poster Matthew_Maurice reckons Ballmer shouldn't have bothered with Xbox at all and left the money in the building society, saying: "Microsoft would have been much better off buying Apple stock or oil futures than developing and selling the Xbox. Hell, you could even make the point that with the Surface fiasco, they didn't even get valuable hardware development experience from it."
Someone rather keener on Microsoft replied, probably from his Surface RT, saying: "...like the Xbox before, the Surface tablets only need time before they can become successes. The Xbox took time to become the best, and the best seller, and Surface tablets are going through the same growing pains."
Yeah, sure, and it's only a matter of time before those new Zunes come along to obliterate the iPod touch range.

Control panellists

As for speculation over his replacement, Guardian commenter Undecidable isn't on the side of potential candidate Stephen Elop from Nokia, saying: "Yeah, he's done a stellar job at Nokia. Microsoft should definitely consider taking him back if they're looking to liquidate in the next 10 to 15 years."
But Ballmer has some friends on the internet he can rely on to always thumbs-up and five-star his comments, such as Oppo Fanboy on The Verge, who says: "Ballmer doesn't get enough credit, for most of his tenure he ran a company that had its hands tied by anti-trust measures, which none of its competitors had.
"It would have been easy for Microsoft to fade away and become a footnote in history. He has kept the company going and kept it going very well."
Which sounds like Oppo is impressed by people who manage to remember to feed their pet dog every day for a decade.
Inflame is TechRadar's weekly round-up of the best comment from around the web. Want to read more? Then check out:
