Monday, April 20, 2015

IT News Head Lines (Techradar) 4/21/2015

Techradar



Opinion: Binge-watching is ruining the magic of TV
Watching five episodes of a new thing in a row isn't a good idea, no matter how enticing Netflix and opinion columns about Game of Thrones in the Guardian make the concept seem.
Yes, it gives you something to talk about on the internet the next day, but it devalues the experience of watching a series unfold and grow.
Good TV used to be associated with a time in your life. A few years. A period in which you, and the characters, changed. Blackadder, for example, sees me go from being 10 and not really getting it to being 16 and finding it the funniest thing ever to have existed.
Watching everything ever made all at once in a blur lessens the impact. I have stronger memories of big telly events from 30 years ago than of things I watched last week.
Admittedly this might be because I didn't have a mobile phone and a child waving Lego at me as I tried to watch Blake's 7's climactic destruction of the Liberator at the end of series three in 1980, and there were only three channels and one of them was ITV, but still.
Modern landmark television events are ruined by binges. You watch things on your own, out of sync with the world. It's isolating and weird.
Recently I started powering through Sherlock, seeing as the internet wouldn't shut up about it, trudging through the first three moderately decent episodes before hitting the first episode of series two. It was a stonker.
If I'd been watching it "live" back in 2012, I'd have had a week to ruminate on its events and cleverness, and it would have permeated my soul. I'd remember it.
But as the next one's right there on my computer, I'll probably watch that one tomorrow and move on, consuming at a pace that simply isn't sensible for the forming of memories and attachments.
Plus, seeing as bingers are always at different viewing points, a witty observation from me about Sherlock S02E02 isn't really going to work on Twitter, so binges end up leaving you feeling oddly disconnected from the rest of the viewers. Instead of "Did you see it last night?" the conversation has become "Oh, which one are you on?" and the talking points are lessened.
You can't preface every tweet with the episode number and original broadcast date, as then you don't have enough letters to make your little joke. So my comparison between Sherlock's 'The Woman' and Alex Polizzi will have to stay untweeted.

We were grateful for Newsround

The thrill in watching Star Trek TNG as it aired on BBC2 in the early 1990s was that you really, really appreciated it. You didn't know if the BBC would show the next series, or even the next episode, plus there was no internet to spoil everything by revealing that, yes, Patrick Stewart has signed on to be in series 4, so he's unlikely to stay Borg-ified for longer than the 42 minutes of the next episode.
It was television on a knife-edge. As such, I was even grateful for the turgid episodes about Worf's family history.
Take the gradual mental deterioration of a character, as seen in modern binge-classic and all-in-one-go watching innovator Breaking Bad. In real-time, one hour a week style, four or five weeks seems like a reasonable amount of time for someone to go completely bonkers.
But when you binge-watch it, episode after episode in the same day, or even one a day, it's immediately noticeable that the make-up department has made the eyes of said mentally deteriorating character a bit darker for the next episode.
It's jarring. You need time for memories to fade a bit, so your brain can paper over the joins. It's more realistic that way, plus it becomes part of your life when something's there, once a week, for several months or years.
You also know how much is left when powering through with abandon, which is another massive tension killer. Series four of Breaking Bad's not as exciting as it could be, as you know there's series five queued up already, so it's unlikely Walt's going to get shot in the face and die during S4's finale.
The whole idea of promoting binge watching feels like a crappy solution to the modern marginalisation of TV.
It solves the short-term problem that everyone would rather be looking at their tablets all the time than gathering around the TV to watch landmark episodes as one nation, together, experiencing live the moment Del Boy fell through the bar for the first time, but it devalues the experience.
It makes TV more forgettable and less anchored to key ages and moments in our lives. Which is only going to further lessen its impact and hasten its demise.

Read More ...




Running Man of Tech: Can you turn a runner into a triathlete in 8 weeks?

Swimming: harder than I remember

I'm clinging to slippery tiles, fingers rapidly wrinkled, while small children play happily on foam tubes near me. In between the waves of sickness I'm wondering when these spaghetti-like buoyancy aids were invented.
Did they have them when I was young? Was I deprived of the joy they're bringing nearly every child in this pool? Would I be able to pass as a swimmer now if I'd had one?
These are the thoughts running through my mind as I attempt to come to terms with the fact I've got to run the Windsor Triathlon (do you run one? Do one? Try to tri? I'm not even sure of the terminology) in less than 2 months.
It's day 1 of 62, I've got into the pool for the first time in five years and after 50 metres of what I think is front crawl, but probably more akin to a dying dolphin, I'm coughing up chlorine and panicking. I'm never going to be ready to race a triathlon (OH, RACE. That's it) by June.
It's technology's fault. I've had the Garmin 920XT strapped to my wrist for a couple of months now, and every time I skip past the 'Swim' or 'Triathlon' settings to start a run I feel a pang of guilt. This watch wants to be swimming and cycling too.
So I make a promise to this inanimate object last Sunday as I stand waiting for the start of the Hyde Park 10K race. If I can post a decent time here (the distance of the final leg of an Olympic triathlon), I figure it won't be too hard to add in splashing and spinning to running. The race goes well, I post my third best time ever, and I'm confident I'm fit enough to easily tame this triathlon.
Garmin 920XT
24 hours later I'm certain I won't. What's worse: the Garmin, which I was sure was going to help me supercharge my swimming, has let me down. It's telling me things about SWOLF and asking me if I want to start drills – I have no idea what's happening. And the pool I'm in is so small there's not even a mode for it on the watch, so the length counting is all wrong.
I flop up and down the posh pond for another 700m over the next hour before calling it a day, my great hopes that the Garmin would somehow give me swimming superpowers having sadly failed to come to fruition.

Help me, magic swimming man

The next day I call in the reinforcements. I head to the London YMCA (where I hear it's fun to stay), which runs dedicated triathlon swimming sessions, praying this will help me pick myself up off the ground.
But as I stand on the side of the pool, shivering while impossibly well-built men laugh confidently with one another next to me before plunging into the pool and racing off into the distance, I realise something: my internal equation of 'me being sort of fit from running + fancy gadgetry = easy triathlon' is miles out.
I'm going to need training. Lots of it. Possibly more than I can cram into the time left. So after a harrowing 90 minutes, where it was kindly suggested a few times that I sit out a couple of drills, I slink back home and back to one of my greatest skills: Googling stuff until I find something that makes me feel better and I can forget I was worried in the first place.
I learn that SWOLF isn't an enemy from Doctor Who - it's the efficiency metric Garmin uses (inexplicably a mash-up of swimming and golf) to score each length: your time for the length in seconds plus the number of strokes you took, where lower is better.
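If you're curious, the arithmetic behind the score is trivial. A minimal sketch in Python, with made-up numbers for the length time and stroke count:

```python
def swolf(length_seconds, strokes):
    # SWOLF for one pool length: time in seconds plus stroke count.
    # Lower is better - hence, presumably, the golf half of the name.
    return length_seconds + strokes

# Hypothetical length: 25m covered in 30 seconds at 22 strokes
print(swolf(30, 22))  # 52
```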
I find out how to turn drills off. I learn that I'll need SO MUCH STUFF to do a triathlon (two different kinds of shoes, lubricant to get the wetsuit off more quickly, … there's even a company that specifically makes towels to help you transition between segments) that it's either going to be brilliant fun to try all this stuff out or a horrendous mess that sees me line up dressed as Left Shark for the 1.5KM swim because I've got confused.
And I find someone who might agree to coach me for this event. When we speak on the phone, you can hear the pause when I tell him what's ahead of me. Yes, eight weeks to go. No, I can't do a 400m swim yet. Well, I cycle about a mile each day to the station.
So he sends me a few tests to see where I'm at, meaning my entire weekend is taken up with a mini triathlon to find out if I can do this, leading to me sprinting for 8 minutes in my local Parkrun before hopping on the bike for another 16 minutes of hell.
While pedalling I realise another two things: I'm going to need a helmet (because you can go really fast on a bike if you push really hard, which is a recipe for tree crashing) and the Garmin Fenix 3, which I've been testing since Friday, needs a brighter screen because I keep looking down to squint at it and forgetting about the road (more on that next week).
Today comes the swim test, followed by a long hill session for the legs. The fact I'm looking forward to the latter, my usually most-dreaded drill, tells me this is going to suck.

Read More ...




Moore's Law at 50: how his predictions have shaped the future

Introduction and impact of Moore's Law

It's 50 years since Gordon E Moore, co-founder of the Intel Corporation, made the observation that became known as Moore's Law. In 1965 Electronics magazine had asked him to write an article predicting what would happen in the semiconductor component industry in the subsequent 10 years. Moore was, at that time, director of R&D at Fairchild Semiconductor, and this made him something of an expert in the field.
Moore looked at the elements – transistors, resistors, capacitors, and diodes – being used in chips at the time (approximately 60), and based on their use in the preceding years, came to the conclusion that the industry would double these elements every year for 10 years until they hit 60,000 per chip.
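The compounding behind that figure is easy to check. A quick back-of-the-envelope sketch (the loop is mine; the starting count and the annual doubling are Moore's):

```python
# Moore's 1965 extrapolation: roughly 60 components per chip,
# doubling every year for a decade
components = 60
for year in range(1965, 1976):
    print(year, components)
    components *= 2
# 60 * 2**10 = 61,440 - the roughly 60,000 components per chip cited above
```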

Moore lays down the law

Ten years later, Moore's prediction had proved remarkably accurate, leading a colleague (Caltech's Carver Mead) to coin the term 'Moore's Law' – though at this point Moore revised his prediction to a doubling every two years. Ultimately transistors came to be the dominant element in chips, becoming the most useful measure of an integrated circuit's complexity.
But Moore's Law wasn't just about the quantity of elements and a chip's resulting performance – Moore was also concerned with economics. His original prediction was based upon the number of elements within each chip where cost per component was at a minimum. Interestingly, in the past ten years, increases in transistor numbers have come to be more about cost than performance, with transistors being made smaller in order to keep costs down – although this further miniaturisation has resulted in performance gains in any case. Moore's Law economics in action!
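To make that 'minimum cost point' concrete, here's a toy model (the numbers are invented for illustration, not taken from Moore's paper): fixed chip costs are shared across more components, but yield falls as complexity rises, so cost per component bottoms out at a particular level of integration.

```python
# Toy model: every extra component adds a small chance the chip fails,
# so cramming more on eventually raises the cost per working component
def cost_per_component(n, chip_cost=100.0, defect_rate=0.0005):
    yield_fraction = (1 - defect_rate) ** n  # crude yield model
    return chip_cost / (n * yield_fraction)

for n in (100, 1000, 2000, 5000, 10000):
    print(n, round(cost_per_component(n), 4))
# With these made-up numbers, cost per component bottoms out around n = 2000
```

Moore's insight was that this sweet spot itself moves up every year as manufacturing improves.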
Although Moore's first and second predictions were, initially, a means of chronicling the industry's progress, over time Moore's Law became something of a driving force, encouraging semiconductor manufacturers to keep pace with the Law. Today, there are billions of transistors on chips, and this magnitude has a great deal to do with the existence of Moore's Law. It is said that the semiconductor industry still uses it to guide its planning and to set targets for R&D.

Major impact

This being the case, Moore's Law's impact on our lives and the progress of business and industry cannot be overstated. The way we communicate has changed irrevocably over the past few decades. If Moore's Law hadn't been adopted by the semiconductor industry as a call to arms, would we be working on our own individual computers, making business calls on smartphones or travelling to meetings in today's computer-controlled cars (if we bother to travel at all – videoconferencing has never been more sophisticated)? Unlikely.
And there'd almost certainly be no internet without Moore's Law. Gordon Moore has helped to determine our technological reality and is arguably even more (no pun intended) influential in indirectly shaping our futures than Arthur C Clarke – and that's some achievement.
So, what helped Moore's Law to gather momentum in the early years following the 1965 publication of Moore's article in Electronics magazine? Of course, the invention of the integrated circuit, which instigated and influenced Moore's article, has a huge part to play – without it there would be no Moore's Law, I'd be typing this piece on a typewriter and TechRadar Pro would be a print magazine. We have Jack Kilby at Texas Instruments and Robert Noyce at Fairchild Semiconductor to thank for the 'birth' of Moore's Law. But there were other contributions to the Law's early development.

Super shrinkage

The invention of DRAM (Dynamic Random Access Memory) in 1967, by Robert Dennard at IBM, made it possible to fabricate single-transistor memory cells. The invention of the excimer laser at the Lebedev Physical Institute in 1970, and the subsequent invention of deep UV excimer laser photolithography by Kanti Jain at IBM in the early 80s, ultimately enabled the smallest components in integrated circuits to shrink from 800 nanometers in 1990 to a low of 22 nanometers in 2012.
The invention of flash memory, also in the early 80s, by Fujio Masuoka at Toshiba opened the door for high-capacity, low-cost memory. And manufacturing costs for chips were driven down thanks to developments in CMP (chemical mechanical planarisation) by IBM and Motorola throughout the 90s. CMP smooths the surface of chips, making them easier and cheaper to manufacture.
More recently, various announcements have made the future of Moore's Law – which Moore himself reckons has a limited shelf life – look pretty secure. Researchers at the Tyndall National Institute in Ireland announced that they had fabricated the junctionless transistor in 2010; researchers at the University of Pittsburgh announced the development of a single-electron transistor 1.5 nanometres in diameter in 2011; and in 2012 a team at the University of New South Wales announced the development of a transistor comprising a single atom placed in a silicon crystal.
Intel Wafer

Will it last?

How long can Moore's Law go on for? As mentioned above, Moore reckons there'll be an end to it. "It can't continue forever," he told TechWorld.com in 2010. Moore says it'll be "two or three generations" before transistors are the size of atoms and that we'll reach a "fundamental limit" with existing processes before 2030.
Also in 2010, the International Technology Roadmap for Semiconductors predicted that transistor-per-chip growth would slow by 2014, projecting that the Moore's Law doubling would shift from every two years to every three. Of course, the growth of nanotechnology could restore Moore's Law to its original doubling-every-two-years cadence.
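The difference that one-year shift makes compounds quickly. A quick sketch of the gap it opens over a dozen years (pure arithmetic, not an industry forecast):

```python
# Growth multiple over 12 years under the two doubling cadences
years = 12
print("doubling every 2 years:", 2 ** (years // 2), "x")  # 64 x
print("doubling every 3 years:", 2 ** (years // 3), "x")  # 16 x
```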
However it is sustained, Moore's Law will have to evolve to survive, as it has before. As we've seen, developments in semiconductor technology will continue to occur, but this won't necessarily mean that costs will continue to fall – Moore's original economics.
Other measures of the Law will have to come into play, meaning that it will morph into what some in the semiconductor industry call "more than Moore" and 'Gentleman Scientist' Chris Mack calls Moore's Law 3.0. Mack cites the cell phone camera as an example of 3.0 in action. These cameras incorporate image sensors directly onto digital signal processors using large vertical lines of copper wiring called through-silicon vias, so uniting non-logic functions that would previously have been kept separate from the chips themselves.
Assuming Moore's Law continues to have an impact on technology, what might the future look like? However you want it to look, basically, especially if nanotechnology becomes more influential. Cyber body parts, brain implants, wearable tech that interacts with your body biologically – our lives are set to change again and again thanks to Moore's Law as we keep living in what feels like the future. And as Moore's Law evolves to keep up with the technological evolution, there are no limits to where the semiconductor will take us. Here's to the next 50 years.

Read More ...
