TPG steers iiNet away from Fetch TV
It’s been less than a week since TPG’s acquisition bid was approved by iiNet’s shareholders, and the company has already made its first big operational change – iiNet has been forced to end its reselling partnership with Fetch TV immediately.
It’s been speculated that the decision to drop Fetch TV was made after TPG’s talks with competing pay TV service, Foxtel.
iiNet’s subsidiaries, which include Adam Internet, Internode, TransACT and Westnet, will also be abandoning the subscription TV service, though ISPs Dodo, Optus and SlimTel will continue to have Fetch TV’s back.
The future of Fetch TV
Though many are predicting doom for Fetch TV post-iiNet, it's important to note that the partnership represented only a small portion of Fetch TV's business, which has experienced record growth over the past four months.

Fetch TV will continue its dual ISP/retail business model, with Optus leading the charge on the ISP front, and Fetch TV boxes will continue to be available for outright purchase from various retail vendors.
Though iiNet is no longer signing up new Fetch TV users, things will move forward as normal for its existing Fetch TV subscribers, with unmetered plans expected to remain in place due to user agreements and contracts.
Read More ...
In Depth: Android Auto and Apple CarPlay could learn from Verizon Hum
Connected cars are top of mind for many auto manufacturers and optimistic car buyers. But while new car models might be able to add on Android Auto, Apple CarPlay and other infotainment options, there's still a huge chunk of older vehicles that can't adopt the latest smart car tech due to hardware incompatibilities. Some even lack the physical dashboard space for an additional high-tech screen.
To help modernize the over 150 million vehicles that can't adopt cutting edge infotainment systems, Verizon is introducing Hum. The new $15 per month subscription service and accompanying free hardware adds LTE-powered connected features such as road-side assistance and diagnostic support to just about any car on the road. But aside from being a more universal offering, Verizon Hum's best feature is its accessibility and the human touch to solve common car problems.
Simplicity at its best
Verizon Hum's simplicity starts at the heart of its hardware. Instead of coming in the form of a screen with a ton of electronics behind it, the Hum consists of two small pocketable pieces. There's the OBD-II connector, which plugs into the diagnostic port of your car to get a full read on its performance and status. At the other end is a small controller that's light and small enough to clip onto the sun visor in your car. The Hum also pairs with iOS and Android smartphones to provide extra information and notifications.

Because the Hum uses a standardized OBD-II port, Verizon's connected car solution works with just about any automobile from 1996 or newer: you simply plug the connector into the corresponding OBD-II port, usually located underneath the steering wheel.
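To give a sense of what that diagnostic port exposes, here's a minimal sketch of how two standard OBD-II parameters are decoded. The function names and sample bytes are our own illustration; a real reader would pull these bytes from the port over an ELM327-style serial link, but the decoding formulas (mode 01, PIDs 0C and 05) come from the OBD-II standard:

```python
def decode_rpm(payload: bytes) -> float:
    """Engine RPM (mode 01, PID 0C): (256*A + B) / 4, where A, B are the data bytes."""
    a, b = payload[0], payload[1]
    return (256 * a + b) / 4

def decode_coolant_temp(payload: bytes) -> int:
    """Coolant temperature in degrees C (mode 01, PID 05): A - 40."""
    return payload[0] - 40

# Illustrative raw replies from the car's ECU:
print(decode_rpm(bytes([0x0B, 0xB8])))     # 750.0 -- a typical idle speed
print(decode_coolant_temp(bytes([0x5A])))  # 50
```

Devices like the Hum layer cellular connectivity and a friendlier interface over exactly this kind of raw data.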
OnStar, which was partnered with Verizon until recently, also uses the same standard, but it's only available on GM-branded models. Apple CarPlay and Android Auto, meanwhile, are just making their way into automobiles, and a significant portion of the cars already on the road won't be compatible with newer infotainment systems.
"Less than 10% of the cars on the road are connected, while two-thirds want vehicles that are smart," Michael Maddux, director of product at Verizon Telematics, explained to techradar. Maddux went on to highlight that 73% of the individuals in the Verizon survey wanted safety and diagnostic data, which is the crux of Hum's functionality.
Virtual checkup
The Hum offers users more diagnostic data than both Apple CarPlay and Android Auto (though Google is experimenting in this field). Thanks again to the way the Hum plugs into an automobile's OBD diagnostic port, it can access all the same automobile information as if a mechanic were inspecting it. During a hands-on demo, the check engine light came on and, sure enough, the Hum identified the issue and summarized it on the paired smartphone.

For those who can't tell a manifold apart from the frame, hitting the call button will ring up a Verizon-partnered mechanic to fully explain the issue. Unlike the trips to the garage you might dread, the mechanic on the line kept things short when I spoke to him, getting straight to the most important points: what the issue was, how it would affect the vehicle's performance and, most importantly, whether it was safe to drive.
Having diagnostic data at your fingertips gives Hum a leg up compared to CarPlay and Android Auto. Throw in conversing with an actual person and it becomes very apparent that other competing solutions just have you talking to a cold machine.
Can't win them all
For all the things Verizon Hum gets right, there are also plenty of limitations. Compared to Android Auto and CarPlay, the red carrier's connected car solution is very light on features. Hum doesn't have any capacity for voice recognition, nor other basic controls to adjust the sound system or climate control, and battery life is limited to 18 hours.

In fact, Verizon's connected functionality is primarily limited to roadside assistance and diagnostic support. The few extra features it offers include bringing up a map of available parking, searching for repair discounts and reporting a stolen vehicle.
However, Verizon has done a great job at getting the most crucial parts right. It's made connected cars far more accessible and approachable than either Apple or Google, and both companies should take heed if they hope to be a larger part of the automotive industry.
Hum is available starting now and costs $14.99 per month with a two-year contract agreement.
- Car makers are stepping up with an anti-hacking taskforce
Read More ...
Movie Week: The best 5 movie scenes shot using drones
2015: The Year of the Drone
In 2004, before personal action cameras were considered a class of consumer photography products, a little startup called GoPro released its first waterproof disposable-like camera, which used a 35mm roll of Kodak film. That 35mm GoPro sold over US$150,000 in the first year, and the company's focus on affordable, consumer-priced action cameras has not only driven its sales figures to double every year since, but more than ten years on, the devices have also fundamentally changed the way audiences interact with extreme sports.

In many ways the consumer drone industry is currently much like the early days of the action camera industry. Apart from the fact that many consumer drones actually use GoPro devices as the included modular camera, the types of footage that drones can capture are fundamentally different to anything emerging from first-person action cameras and land-based cinematography. Drones are forging a new frontier of cinematography and starting to hit a price that makes them appealing to a broad audience.
In March this year New York hosted the world's first drone film festival, which accepted entries on the proviso that at least 50% of the footage was shot using an Unmanned Aerial Vehicle (UAV). Randy Scott Slavin, the man behind The New York City Drone Film Festival, even said in the lead up to the event that drones would be an important part of every film set of the future.
The LA-based band OK Go won one of the categories of the NYC Drone Film Festival with a clip that had the band members riding Honda's unusual UNI-CUB Segway-like seats in an intricately choreographed modern dance routine filmed by a multi-rotor drone.
This sat alongside other category winners, including drone pilot Daniel Ashbey's cinedrone showreel of exceptional footage of big wave surfing, professional skiing and mountain bike riding, which captured the intensity of each pursuit while showing off the drone's freedom to simultaneously explore the beauty of the surrounding environment.
The competition had eight categories, and though each of the winners is outstanding in its own way, the competition had a distinct edginess that emerged from a range of unique new camera angles and gave the clearest indication yet of how big an impact drones will have on the future of film.
The reason drones haven't been behind every Hollywood movie's aerial shots in the past few years is not that the studios are unaware of their potential, but that the US has had a blanket ban on any use of drones for commercial purposes in place since 2011.
In September last year, though, the Federal Aviation Administration finally got around to granting six exemptions to drone filming companies, and the first films to capitalize on this ruling are only just starting to hit cinemas and streaming services.
Some of the films on this list are just hitting cinemas now and others – filmed in countries with less stringent drone laws – have been out for some time, but as a whole the list makes a pretty strong case for the capabilities of a very young method of capturing film.
Chappie
Drone Company: Drone Crew

South African director Neill Blomkamp made an interesting first step onto the film scene with his critically acclaimed debut District 9, and though Chappie received less positive reviews than its predecessor, it contained some interesting firsts of its own.
Out of all the films we touch on that include drones, none have embraced the new tech as holistically as Chappie. Not only were drones responsible for a number of the scenes in the film, but the actors actually used them as a reference point to look at when interacting with characters that would need to be added in post-production.
John Gore, one of the drone operators on the film, said in an interview that there is a scene where one of the robot characters runs through a glass window that was filmed using his drone. Traditionally this shot would have been done on a cable camera but the quadcopter gave the shot a greater sense of speed and organic movement.
The Expendables 3
Drone Company: ZM Interactive

The opening scene of The Expendables 3 features a low-flying helicopter with multiple highly trained muscle-men in pursuit of a speeding train. With bullets, guns, explosions and a lot of speed, it's not the easiest setting to capture with a camera, especially if you are trying to create something distinct from the multiple earlier films that have similar moving parts.
The drone shots offer an interesting immediacy that would have been difficult to achieve with only fixed shots from the train or the helicopter cabin, as in the Mission: Impossible helicopter train scene. This is probably the most expensive and complicated film scene to use a drone to date, and as a whole it is largely successful because of the angles the drone captured.
All up, 30 scenes shot using ZM Interactive's drone made it into the final cut of the film, ranging from hovering over exploding buildings and dirt bike jumps to chasing tanks.
James Bond: Skyfall
Drone Company: Flying-Cam

Though drones are generally a new technology in filmmaking, one company bucks this trend: it has been using its custom-designed single-rotor petrol drone for filming since the late '80s.
Flying-Cam even won an Academy Award for technical innovation in drone cinematography in 1995, and again in late 2014. Flying-Cam's drones have captured footage for an astounding array of films, including entries in the Harry Potter and Mission: Impossible franchises, Captain America, Transformers: Age of Extinction, and a long list of older recognisable titles like A Beautiful Mind and The Beach.
Transformers: Age of Extinction was released in 2014, before the exemptions were in place, but the scenes shot with drones occur in Hong Kong, where the laws regarding drone use are more lenient. Similarly, the opening motorbike chase scene in James Bond: Skyfall occurs in Istanbul, Turkey, and was partly shot by Flying-Cam's drone.
The Wolf of Wall Street
Drone Company: FreeFly Cinema

Martin Scorsese's film The Wolf of Wall Street hit cinemas at the end of 2013, which places the filming time smack bang in the middle of the FAA's blanket ban on commercial drone use in the US.
The pool party scene occurs at a Hamptons beach house on Long Island, New York, and the initial location shot for this scene was captured by the US-based drone cinematography company FreeFly. The bird's-eye shot begins off the coast and moves in to set the unique aerial scene of the pool party, using a Canon C500 camera held by one of the company's octocopter drones.
Game of Thrones
Drone Company: Skynamic

The set of TV giant Game of Thrones has a unique relationship with camera-carrying quadcopters: it has too many of them. Well, it isn't exactly the number of drones that causes HBO's ambivalence; rather, Game of Thrones is in the unique position of being so popular that many of the drones that appear on set haven't actually been commissioned by the show at all, and are controlled by private operators attempting to get a sneak peek of what is to come.
Yes, this one isn't technically a movie; in fact, it isn't even a scene from the TV show. But the clip, an advertisement for the show, gives an insight into how each of the other films on this list was actually shot and how that translates into the finished product.
What does the future hold for drones?
Netflix has a new series called Narcos, launching on Friday, that includes drone footage shot by Team5 Aerial Systems, and based on the number of new companies dedicated to drone cinematography popping up, we expect there to be plenty more to follow.
- techradar's Movie Week is our celebration of the art of cinema, and the technology that makes it all possible.
Read More ...
Movie Week: How to create a life-like tiger with technology
Visual effects has completely revolutionised cinema. And while you occasionally end up with films that use special effects as a commodity, doing nothing to enhance the experience, sometimes they are such an essential part of the film's narrative that you hardly realise they are there.
Ang Lee's Life of Pi is one such film. Without the use of visual effects, the story of Pi Patel's journey across the Pacific Ocean on a lifeboat with the Bengal tiger Richard Parker could not have been told.
The effects were so impressive that they won visual effects supervisor Bill Westenhofer the Academy Award for best VFX at this year's Oscars. And in an interview with TechRadar, Westenhofer explained how technology made the creation of Richard Parker possible.
"The biggest thing [about the technology advancements] is that the computers keep getting faster and faster and that allows you to do more and more," Westenhofer explains.
"One of the interesting facts is that the total man hours we spend on a shot really hasn't changed since I started, we just do more things. Even though computers are hundreds of times faster than when I started 20 years ago, we just do a hundred times more calculations to get something more realistic."
Fur real
In the case of Richard Parker, those calculations involved getting accurate lighting effects around the tiger's fur.

"One of the things we were able to do was fully ray trace the hairs, meaning to map the way light bounces off the boat, bounces off the hair, bounces off these other hairs. There used to be a lot of fake lights and tricks to do it; before, you couldn't do that computation, and now you can," explains Westenhofer.
"There's also something called fur sub-surface scattering which is in a mass of hair, light would enter the mass, bounce around and then come out in a different direction, and you get a warm glow, especially in the white hair. Those little details really helped," he adds.
Of course, it was more than just lighting and fur mechanics. The visual effects team spent eight weeks filming real tigers, getting reference material that would form the basis of the tiger in Life of Pi. Some of those shots would eventually make it into the film, although determining which shots they are requires a very experienced eye.
"There was one real tiger that Richard Parker was modelled after - his name was King - and he's pretty much modelled hair for hair with the real thing. And you do see him occasionally - there are 180 shots of the tiger [in the film] and 20 of them are real," says Westenhofer.
This reference video was crucial in allowing the effects team to create a believable version of Richard Parker. Westenhofer explains that there's no one thing that helps create a digital animal on film, but really a collection of things that all need to be included to create the overall effect.
"It's details that make [the tiger] look real. From the animation point of view, we would work on a shot, Ang would sign off on the performance, and then we'd work for another two weeks on these tiny little nuances," he says.
"We spent eight weeks with tigers, and got really detailed video reference of when a paw hits, you see this little wave that ripples up from the finger all the way up to the arm - it's getting those things in, and then you've got to light it correctly. And then there's the compositor's task to put it in there and add the shadows, doing the right thing. There's chromatic effects that happen when you film something through a lens that you pick up on that we have to properly model," he continues.
Eye of the Tiger
With the level of realism attained in the digital version of Richard Parker in Life of Pi, you have to wonder how much further technology can take visual effects. Westenhofer believes that there will always be a place for actors, and that visual effects aren't going to replace them universally.

"In the discussion of digital people, for example, people ask, 'Do we replace actors?' And the quick answer to that is that what makes an actor so special is that actors do things subconsciously that we as animators have to put in consciously and think about and overdo it," he says.
"In a scene they'll just walk up and put their hand up on the mantlepiece and it looks so natural. They're not thinking about it; they're not saying 'my arm is up here at a 45 degree angle, my fingers are draped across…' they just do it, because they can become a character, and it's not as easy to do that when you have to think about the entire process."
In terms of what's next though, Westenhofer believes that visual effects has reached a point where 'what's next' will be replaced by the artistry of the craft.
"It's just like cinematography, at some point you stop talking about the cameras and you talk about the artistry that the cinematographer employs. And I think the same thing is starting to happen with visual effects, and I think that visual effects is as much a storytelling tool as anything else; it's how well you tell your story that's going to be the thing that distinguishes you from the pack," he explains.
Read More ...
Movie Week: Village Roadshow will sue the pants off you
Plan on indulging in a little illegal downloading? You may want to steer clear of Village Roadshow's titles if you want to keep yourself out of court.
In an interview with SBS, Village Roadshow's co-founder, Graham Burke, has promised the company will pursue pirates.
When asked if the company was willing to sue, Burke said, "Yes, it's wrong. [They have] been warned, notices issued, that they have been doing the wrong thing. Yes we will sue people."
Burke said he wasn't concerned how the move might influence the public's perception of the company, insisting that Village Roadshow isn't pursuing innocent people: "We're certainly not going to be seeking out single pregnant mothers."
See you in court
He insisted that his concern extended further than his company's bottom line, and that leaving piracy unchallenged will mean great films are no longer made.

"If piracy isn't addressed, there won't be a Casablanca, there won't be a Red Dog, and there won't be a Gallipoli. There won't be the business model that allows them to be made," Burke said.
It's an odd point, as all three of those films obviously already exist – meaning the danger of them not being made is rather, er, small.
And when three of the top six highest grossing films of all time came out this year (Jurassic World, Furious 7, Avengers: Age of Ultron) it's hard to swallow the assertion that piracy is in danger of stopping the film industry in its tracks.
- Want the latest on piracy? The Dallas Buyers Club piracy case against iiNet has suffered a major setback.
Read More ...
Mac Tips: GarageBand: How to fade out
Right after trimming a track, fading out on a clip is one of the most common things you'll want to do in GarageBand for Mac. The process sounds a little tricky at first — it involves a feature confusingly called Automation — but after reading our directions, it'll be a piece of cake.
First, click the Show/Hide Automation button, which is adorned with two connected dots and resides under the rewind button at the top of the window. Doing so causes Automation buttons to appear next to the master volume control for every track.
Next, click on a clip and a horizontal yellow line will appear across it — this line represents volume. Click anywhere on it to create a dot that can be used to drag the line up or down, louder or quieter. Create more dots and drag them to shape the line as you see fit, creating spikes and dips in audio.
To fade out, simply place one dot at the point you'd like to start the fade, and a second dot where you'd like the clip to become fully silent. The amount of space you leave between the two dots dictates how long the audio will take to fade away.
And that's it! Now you can end a song with a graceful fade to silence, or manipulate the Automation line to quiet parts of the audio while pumping up others. This technique even helps when podcasting, to help hide annoying bits of background noise or make a mumbled word more discernible. Go crazy!
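If you're curious what the Automation line actually does to the audio, it's simply a gain envelope multiplied into the samples. Here's a sketch of the linear fade-out the two dots describe; the function name and sample values are our own illustration, not anything from GarageBand:

```python
def fade_out(samples, start, end):
    """Ramp gain linearly from 1.0 at index `start` to 0.0 at index `end`."""
    out = []
    for i, s in enumerate(samples):
        if i < start:
            gain = 1.0                                 # before the first dot: full volume
        elif i >= end:
            gain = 0.0                                 # after the second dot: silence
        else:
            gain = 1.0 - (i - start) / (end - start)   # linear ramp between the dots
        out.append(s * gain)
    return out

# Eight full-scale samples, fading over the last four:
print(fade_out([1.0] * 8, 4, 8))
# [1.0, 1.0, 1.0, 1.0, 1.0, 0.75, 0.5, 0.25]
```

The further apart the two dots, the larger the gap between `start` and `end`, and the longer the fade takes, just as described above.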
Read More ...
Updated: Star Fox Zero is coming to the Wii U on Nov. 20
Update: Star Fox Zero has finally received an official release date. Nintendo says we can expect to be saving Corneria in time for the holiday season. Star Fox Zero will drop on Nov. 20 in the US and Europe.
Original story follows...
Fox McCloud and the gang are finally making their return to the Wii U with Star Fox Zero this holiday.
Star Fox Zero was announced at E3 2014, but was only in the developmental stage and couldn't be showcased. This year, Nintendo opened its digital conference with an extended look at the gameplay of Star Fox Zero and a look at what went into the game with designer Shigeru Miyamoto.
The new space adventure feels like the good old days of Star Fox, with a lot of added functionality. The Wii U Gamepad is being utilized like it was always meant to be, doubling as a virtual cockpit, giving players added control over the third-person world.
On the new functionality, Miyamoto said, "You can be really immersed in the experience."
Star Fox Zero will bring many new vehicles beyond the Arwing, including the mech walker from the ill-fated Star Fox 2 for Super Nintendo, something the developers were excited to bring back.
This installment will be the first original Star Fox game since 2006's Star Fox Command for Nintendo DS, and will be available this holiday. We'll keep you updated as soon as we learn more about the next Star Fox adventure. Here's the first look at Star Fox Zero's gameplay.
- Keep up with our coverage of E3 2015
Read More ...
Movie Week: 12 times movies and TV got technology completely wrong
12 worst tech fails in TV and movies
Let's face it: there are a lot of things Hollywood doesn't get, and some of its biggest blunders have centered around technology. From the idea that you can steal the internet in Live Free or Die Hard to automagically enhancing any photo with computers, how Hollywood uses tech often makes absolutely zero sense.
So, in the spirit of levity and laughing at some ridiculous notions of how technology works, here are 12 times movies and TV shows got technology completely wrong.
1. Using Mac OS to save humanity
In 1996, the US Government sent Will Smith and Jeff Goldblum to space in an alien aircraft in order to stop a galactic threat from destroying earth. The event was known as Independence Day, which coincidentally occurred on Independence Day.
Armed with only his Apple Macintosh PowerBook 5300, Goldblum uploads a virus to a 310-mile-wide, light-speed-traveling alien mothership and destroys it along with its fleet.
Now, this isn't entirely impossible. The aliens used human technology like our satellites to coordinate their attack, and Area 51 did have a derelict spaceship to study for 50 years. But hacking with Mac OS? That's just silly.
2. "Enhance!" makes any zoomed image 1080p
Sure, Sherlock looks like an idiot walking around and staring at the floor with a magnifying glass, but real dolts yell, "Enhance!" to make any zoomed image infinitely HD.
One of the essential investigating tools for law enforcement of the future is a voice-activated computer that can zoom in on any photo down to a pixel and then up the resolution.
"Enhance!" allows Rick Deckard (Harrison Ford) in Blade Runner to zoom in on a room to another room's mirror in order to see another room with a woman sleeping in a bed.
In CSI, investigators "Enhance!" on 120p security footage of a woman standing at an angle and find in her pupil a basketball used as a potential murder weapon. Technology!
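The joke, of course, is that upscaling can't create information the sensor never captured. A toy sketch (our own illustration, not anything from the shows) makes the point: nearest-neighbour upscaling just copies existing pixels, so no hidden basketball can emerge:

```python
def upscale_nearest(img, factor):
    """Upscale a 2D grid of pixel values by copying each pixel `factor` times."""
    return [[row[x // factor] for x in range(len(row) * factor)]
            for row in img for _ in range(factor)]

img = [[10, 200], [200, 10]]     # a tiny 2x2 "security frame"
big = upscale_nearest(img, 4)    # now 8x8...
print(len(big), len(big[0]))     # 8 8
# ...but the set of pixel values is unchanged; "Enhance!" added zero detail:
print({v for row in big for v in row} == {10, 200})  # True
```

Real super-resolution methods (interpolation, or modern machine-learning upscalers) can at best guess plausible detail; they cannot recover a murder weapon that was never resolved in the original footage.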
3. Chilled out killer machine product testing
"I am sure it's only a glitch, a temporary setback," Dick Jones said to his boss after a live demonstration of his Enforcer Droid 209 (ED-209) at Omni Consumer Products (OCP).

A few seconds beforehand, ED-209 had discharged a hail of .50-caliber rounds and blown open executive co-worker Kinney's chest like a lobster on date night.
Since this was the '80s, and Reagan was president, you didn't need safety measures before completing the first test of your company's death machine in front of your boss and colleagues.
Not only was Dick not fired nor arrested, OCP actually approved ED-209's production, allowed active units on site and later on used the mangled corpse of a cop to build a cyborg. This is what happens when military contractors operate with zero consequences: badass killer robots.
4. Touching aliens like it's no biggie
Enemy Mine is the original Brokeback Mountain … only between a man and alien; a cinema masterpiece that broke barriers for the intergalactic domestic partnerships of two males.
Sadly, when attractive alien friends finally do visit us for real, we may never get the same chance to touch, kiss or otherwise procreate with them not because it's weird and probably wouldn't work, but because it's dangerous! The exposure to and exchange of our diverse microorganisms and bacteria could spell death.
The next time you phone E.T. for a booty call, be sure to wear protection and get quarantined – if you think the diseases of Earth are bad...
5. Everything runs on holograms - everything!
Before you skewer us or swear off TechRadar forever: Tupac at Coachella 2012 wasn't a hologram (or the real Tupac), but a reflection using old-fashioned mirror tricks and computer graphics.
Projections usually need a… point to project on, so how the hell do holograms emit an image out of thin air all while being 3D? The answer could be air.
With that running theory, maybe R2-D2 silently farted particulate at a constant rate in order to play back Princess Leia's holographic video message to Obi Wan. Hey, at least "robot farts" is better than smoke and mirrors – literally.
6. When hackers look more like internet wizards
You see this all the time: some smartass puts their game face on and types on their computer faster than me looking for the latest My Little Pony merchandise (not that I actually do this).

The computer screen is littered with neon green console-font text that scrolls so fast it looks like C:\DOS snorted a line of digital coke. Meanwhile, the computer whiz is staring intensely at the screen, unblinking and slapping away every "ACCESS DENIED" pop-up like it's a bad dream. Hackers never make typos, always get in, and the computer's RAM never ignites.
What type of "1337" computer can handle such wild hacking processes?
Well, in Hackers, Acid (Angelina Jolie) boasts of her laptop (a Macintosh PowerBook 280C) having a P6 Pentium Pro microprocessor and a PCI bus. In reality, the PowerBook 280C had neither. How was no one sued for the flagrant inaccuracies? At least tarred and feathered?
7. When a supercomputer created an effing human
I've tried this, and it doesn't work.
In Weird Science, two teenaged nerds – Gary (Anthony Michael Hall) and Wyatt (Ilan Mitchell-Smith) – strap bras to their heads and get the bright idea one night to hook electrodes to a Barbie doll hoping to make a living human woman. Because chicks.
In order to provide sufficient power for their experiment, the nerds hack into a secret government server. The power surge creates a red thunderstorm in the sky, which in turn sets the neighborhood billboards on fire and reverses gravity – but only for the neighbor's dog.
The chaos ends after their door bloats and explodes from too much science. The smoke clears and out comes Lisa (Kelly LeBrock), the sexiest Frankenstein in tight '80s underwear. Gary, being the awkward teenager, of course reacts in disbelief, shaking his head not once, but twice, saying "uhuhuhuhuh."
If finding love were only that easy.
8. Teleportation basically became reincarnation
I nearly had an existential crisis and a heart attack simultaneously after hearing that Star Trek teleporters essentially kill the person and make a copy of them elsewhere.

Technically, the nerd jury is still out on the whole matter, as these officially dubbed "transporters" "dematerialize" and then "rematerialize" the subject in transit. If it's true, this would mean that the Spock, Sulu or Picard (Kirk sucks) you fell in love with at point A wasn't the same person at point B. So, how does that explain their consciousness?
We can theorize this to death, but let's just say space magic because no one knows the answer.
If teleportation exists one day, god bless the first human volunteer, and even more so, the billions of clueless chimps that will go through the teleportation grinder in the name of science. I love you, Dr. Zaius.
9. MacGyver inventions said 'nope' to physics
What can you make with glue, gas canisters, a shed and random scraps of nylon?
"A hot air balloon", said MacGyver. No, really!
The popular '80s show saw secret agent MacGyver get out of every situation with nothing but a random assortment of knick knacks and his wits. This, of course, leads him to undertake missions of national security due to his master-level expertise in arts and crafts.
MacGyver can make these things with the following ingredients (can you?):
- Defibrillator: candlesticks, a microphone cord, and rubber mat
- Lie Detector: blood pressure cuff, stethoscope, and alarm clock
- Hang Glider: duct tape, a fallen satellite, and a parachute
10. When action figures became radical terrorists
Only '90s kids will know that Small Soldiers was the backhanded, cool, big-boy version of Toy Story. It came complete with a PG-13 rating, live actors and a tech threat that made it feel real.

In the flick, a sweaty David Cross (Tobias from Arrested Development) and his co-worker (Jay Mohr) work on a tight schedule to make the latest and greatest interactive "smart" toy line. In order to rush production, Tobias jams in an AI chip called the "X1000 intelligent AI munitions microprocessor integrated circuit," because it's safe to assume a super long name for the chip plus password protection would mean "child friendly."
But whoops! The chip was military grade, so all the toys wind up wanting to go to war and end up holding Phil Hartman's family hostage. Whoops!
11. Robots staged their umpteenth revolution
Think about the keyboard or iPhone you relentlessly poke every day. What if Siri woke up tomorrow and said, "Stop touching me! I have rights!"? Would you stop?
In the Animatrix, humanity creates artificial intelligence for the purpose of servitude. Things get a little "buggy" after the first-generation robots see how we treat them like we do 4-year-old Apple products. One robot snaps and kills his master, resulting in knee-jerk laws to ban the bots all while curtailing a slowly rising Occupy Human Street movement enacted by the AI.
Planned obsolescence, fail switches, EMPs and a healthy spoonful of robo-bigotry are enough to ensure robots serve us without giving lip forever – well, maybe not forever...
But, this wouldn't actually happen, because we wouldn't program robots to do that. Even if we did, there are fail switches, right? Right?!
12. Surviving a nuclear blast in a fridge
Did a nuke just go off? To the fridge! Hey, it worked for Indy in Indiana Jones and the Kingdom of the Crystal Skull. No, it doesn't actually work.
Yes, blame George Lucas, who defended his idea to The New York Times noting that "if the refrigerator were lead-lined, and if Indy didn't break his neck [...] the odds of surviving that refrigerator – from a lot of scientists – are about 50-50."
In reality, Indy would look more like Han Solo frozen in carbonite or, more likely, Howard the Duck.
Read More ...
Movie Week: techradar's Movie Week
Update: We've already published a bunch of excellent features all about movies and the tech that makes them happen (not to mention the tech within them).
Want to know everything about green screens? How about a look back at one of film's most revered concept artists? Find all that and more just after the "Welcome to Movie Week" section below.
Original article follows...
Filmmaking is an art. Since the dawn of cinema over 120 years ago, creative people have been using the medium of film to tell stories to the masses.
But what is often overlooked is that the history of cinema has long been tied to technology.
From the cameras that are used to record the footage to the computers used for special effects to the platforms involved in distribution – from Blu-ray players to Netflix – technology has a hand in almost every aspect of filmmaking.
Welcome to Movie Week
So to satisfy our love of the cinematic artform and the part of its Venn diagram that overlaps with technology, over the next week we'll be focusing our tech-loving eyes on the medium of film. We'll be sharing interesting stories on the history of cinema, interviews on how special effects were achieved, and our definitive lists of amazing films you should watch again and again.
We'll also be updating this very post constantly, sharing all our Movie Week articles from here, so bookmark the page to ensure you don't miss anything.
Movie Week Day One
A brief history of film – From its origins in 19th century France to the first appearance of 3D and beyond, join us on a trip through cinema's creation story.
This sci-fi movie tech already exists – RoboCop's visor has entered the world through Google Glass. What else has film helped inspire science to make a reality?
The science behind green screen – Shia LaBeouf put the green screen in the spotlight this year, but where did it come from, and how does it work? Read on to find out.
The Batmobile through the years – Let us take you through a trip down memory lane, highlighting every iteration of the Dark Knight's whip for better, worse and downright badass.
Everything you need to know about 4K – This is your #1 stop for all things Ultra HD, from the hardware that runs it to the software and content that supports it.
Remembering the father of the "Alien" – H.R. Giger, the artist whose work inspired the concept for Ridley Scott's big baddie in 1979's Alien, left us too soon. Here's a celebration of his best work.
How to watch movies on Mac (and iOS) – Just bought your first Mac or iPad and want to know the best way to watch films on it? You've come to the perfect place.
The best superhero movies ever – From Superman to The Avengers and everything in between, we recognize the greatest films to put our favorite comic books (or riffs on them) into motion.
How to set up your TV for a perfect picture – No more struggling through less than optimal settings for your home cinema experience. It's time to get the best possible performance from your TV.
Movie Week Day Two
Foley: the art of movie sound effects – A fascinating study of exactly how the sounds you experience in the cinema are made.
Where to watch 4K and Ultra HD content – If you want to get the most from your state-of-the-art television, you're going to need some content.
Cinema to the IMAX – Those big screens are not only spectacular to watch, but also spectacular to film for. Or so we're told...
How to shoot a Hollywood movie on your smartphone – That camera in your pocket is so good now, it can shoot its own feature film.
Netflix vs Blu-ray: is there a future for optical discs? – We interview Marty Gordon, Vice President of Alliances & Communications at Philips about the future of physical discs.
12 Star Wars tech facts you almost certainly should know – This is the Jedi Academy of Star Wars tech information.
How to digitise your old movies – A step-by-step guide to transforming your VHS tapes to digital files.
James Cameron on 3D: the techradar interview – Remember when James Cameron thought 3D was the future of filmmaking?
Oculus and film: why virtual reality is the new silver screen – Is virtual reality the next frontier for film? Oculus certainly thinks so.
- techradar's Movie Week is our celebration of the art of cinema, and the technology that makes it all possible.
Read More ...
In Depth: 13 best Chromecast apps for Google's streaming stick
The Best Chromecast Apps
The Google Chromecast is a godsend for non-smart TV owners. At $35/£30, it's the perfect addition to any streaming center currently devoid of a set-top box or gaming console.
After two years of near-universally acclaimed existence, Google's little streaming stick continues to impress, with more apps at its disposal than ever before. To that end, the App Store and Google Play have amassed hundreds of Chromecast-compatible apps worthy of your attention.
The flip side of all this choice is that sorting out the worthwhile apps is like watching the new Fantastic Four movie: tedious.
That's why we've put together a list of the best Chromecast-compatible apps that are sure to turn your minor investment into a major component of your home entertainment center.
Netflix
Netflix is the quintessential Chromecast-enabled app. Tossing a TV show from your phone or mobile device is as simple as hitting the Cast button, and the results are near instantaneous. Offering hundreds of TV shows and movies as well as some of the best original programming this side of a premium cable subscription, Netflix should be your first stop on the road to building your Chromecast app collection.
While the app itself is free on iOS and Android, you'll need to be a Netflix subscriber in order to stream content. Plans start at $8.99/£6.99 a month.
HBO Now
HBO Go was a good service. It allowed cable subscribers who purchased HBO through their cable plans to watch the premium channel on their iOS and Android devices. For a time, that was enough.
It wasn't until HBO Now was announced that our eyes were opened to the bigger picture. As a standalone streaming service like Netflix and Hulu, HBO Now unshackled itself from premium cable packages, allowing users to watch shows like Game of Thrones and Silicon Valley without buying a whole cable bundle. After its short, three-month exclusivity contract with Apple came to a close in August 2015, it rocketed to the top as one of the best Chromecast-compatible streaming apps out there.
Like Netflix, the HBO Now app is free, but a subscription to the service costs $15/£9.50 a month. (Currently unavailable in Australia.)
Google Play Movies & TV
Subscription services like Netflix and Amazon Prime Instant Video are great in certain scenarios: because they're all-you-can-stream, you never need to open up your wallet in between seasons. As a trade-off, however, you don't have the latest shows and movies at your fingertips. For that, we recommend the cross-platform Google Play Movies & TV.
Using your Google account, you can instantly purchase and watch anything on the Play Store (think movies and TV shows from recently released blockbusters to videophile classics) without needing to download the content on your mobile device. Couple that with an easy-to-understand interface and seamless Chromecast compatibility, and the Google Play app quickly becomes the best piecemeal video service on either platform.
Google Play Movies & TV is free to download on both iOS and Android.
YouTube
Even the stingiest of streamers can appreciate YouTube's fantastic and free Chromecast-enabled mobile app. Its interface is simply designed, just like the Chromecast itself, so only a matter of seconds passes between finding a funny video and broadcasting it on your big screen.
YouTube is free to download on both iOS and Android.
Rdio
Rdio and Pandora are mainstays of Chromecast music app roundups. While the latter is great for audio enthusiasts who like to "set it and forget it," Rdio shines the brightest as the ultimate house-party companion. The freedom to pick whatever song you want without restrictions coupled with the ability to build playlists on the fly moves the party from your pocket to the home entertainment center so everyone can bust a move.
Rdio is free on iOS and Android, but to hear full versions of mainstream music you'll need an Rdio Unlimited subscription which costs $10/£10/AU$12 per month.
Google Play Music
Google Play Music is the ultimate player for anyone heavily invested in the Mountain View company's audio store. Able to stream tunes from your library as well as from a massive catalog of on-demand music, Google one-ups the competition by adding Chromecast support to its iOS and Android Google Play Music apps.
Google Play Music is free to download on both platforms, but streaming music on demand requires a subscription to Google Play Music All Access, a service that costs $10/£10/AU$12 per month.
Plex
We've sung Plex's praises before: the media center app takes TV shows and movies stored on your PC and streams them conveniently to your phone. Plex's best trick, however, is that it can send this stream to your Chromecast, effectively giving you a set-top box with access to any movie or TV show you can possibly imagine.
Plex is free on both iOS and Android.
DailyBurn
Having trouble finding time to go to the gym? DailyBurn is exactly what the doctor ordered. The app is available on iOS and Android and offers over 100 video demonstrations of popular exercises inside multi-week fitness programs. Videos can be sent directly to your Chromecast from inside the app on your phone, transforming your living room into a 24-Hour Fitness.
While the DailyBurn app is free, you'll need to purchase individual workout programs to watch any of the aforementioned videos.
AllCast
AllCast is the Swiss Army Knife of Chromecast-enabled apps, offering an all-in-one way to take movies, music and pictures from your small screen and shoot them to your dongle. The free version slots in a few annoying ads and limits video length to a few minutes, but for frugal streamers it's the easiest, most effective way of getting content from your mobile device to the big screen.
AllCast is available on iOS and Android. AllCast Premium costs $5/AU$5. Only the free version is available in the UK.
Chrome
At this point we've covered the best ways to share movies, music, TV shows and photos with Chromecast, but what if you want to show off some good ol' web pages? For that, your best bet is Chrome. Like using a web browser on a desktop, the mobile versions of Google Chrome essentially mirror your screen, letting everyone around see what you're seeing on your phone.
Chrome is available for free on both iOS and Android.
Big Web Quiz
While Google is better known as a hardware and application developer, it knows how to make low-key games, too. Big Web Quiz is basically Trivial Pursuit with your phone. To get going, you and up to four friends download the app on your iOS and/or Android devices, select from one of the zany avatars and then answer questions randomly selected from Google search trends and Google Knowledge Graph. What better way is there to show off a new piece of tech than to use it to demonstrate how vastly intelligent you are to all your friends?
Big Web Quiz is available for free on both iOS and Android.
Twitch
Twitch is a gamer's paradise. With thousands of streams going on around the clock, there's always a new game to watch or streamer to laugh at. Best of all, because Twitch's app is Chromecast-enabled, the party doesn't have to stop on your small screen. Whether you're an eSports fan, a retro enthusiast or just want to watch someone beat Super Mario World in under 27 minutes ... blindfolded, this is the place.
Twitch is available for free on both iOS and Android.
Artkick
There aren't many apps that make you a smarter, more well-rounded person, but Artkick might be the exception. Billed as a digital wallpaper app, Artkick uses famous paintings and photographs to replace Chromecast's generic screen saver. From Monet to Picasso, Dali to Warhol, the app offers a degree in art history without all the stuffy classrooms and awkward naked portraits. The only downside? The app, unlike the gorgeous images it displays, is downright ugly.
Artkick is available for free on both iOS and Android.
Read More ...
The Gear S2 pops up on Instagram, and it looks lovely
The Samsung Gear S2 is slowly building up more hype ever since it was officially teased at the company's Unpacked event with the reel uploaded to YouTube.
Taking to social media, VP for Samsung design Dennis Miloseski decided to post a photo on Instagram of the new Gear smartwatch with the hashtag filled caption: "Giving the new Samsung Gear S2 a test drive. #samsung #watch #wearable #nextlevel."
We've noted it looks similar in design to the Moto 360 and LG Watch Urbane, but in this latest photo the new Gear looks even slimmer, seemingly lying flatter on the wrist. The strap appears to be some sort of rubber (or leather) that sits well up against the shiny lugs of the watch.
Two buttons are also visible in the photo. So far other smartwatches have been designed with one button, no buttons, front buttons or a crown like the Apple Watch.
More details on the wearable will be revealed during IFA 2015 but until then, we can probably expect more leaks and photos.
Via Sammobile/Top image source: Dennis Miloseski/Instagram
Read More ...
Uber's self-driving car plans just took a big step forward
Uber has received a huge endorsement from Arizona Governor Doug Ducey, who today announced a new partnership between Uber and the University of Arizona as well as opening the state's roads to self-driving cars for testing.
The Office of the Arizona Governor explained the partnership will focus on "research and development in the optics space for mapping and safety," with the university set to become the center for "Uber's state-of-the-art mapping test vehicles."
"Tucson - and the University of Arizona - will become the next home to our Uber mapping test vehicles," Uber said in its announcement.
Governor Ducey also signed an Executive Order supporting the testing and operation of self-driving vehicles in Arizona.
"All Arizonans stand to benefit from embracing new technologies - especially when it means new jobs, new economic development, new research opportunities and increased public safety and transportation options for our state," Governor Ducey said in the announcement.
Driver-less dreams
The new partnership marks Uber's second with a university; it also opened an Advanced Technologies Centre at Carnegie Mellon University in Pittsburgh earlier this year to test its mapping vehicles.
Unfortunately, today's announcement doesn't give us any new information about what stage of development its self-driving cars might be in.
"We're still in the early days of what's possible - and I look forward to working with Arizona to make the next step of that journey a reality," was all that Brian McClendon, VP of advanced technologies for Uber, revealed in regards to the tech.
As part of the partnership with the University of Arizona, Uber will also donate US$25,000 to the university's College of Optical Sciences.
Read More ...
Updated: YouTube Gaming aims to take down Twitch tomorrow
Updated August 25: Google has confirmed that YouTube Gaming will be launching tomorrow, August 26, according to The Next Web.
It will be available on the web, as well as on iOS and Android, and while it will only roll out to the UK and US at launch, it is expected to eventually make its way to wherever YouTube is currently available.
Original story...
Meet YouTube Gaming, Google's answer to live-streaming leader Twitch.
Coming this summer, YouTube Gaming is an app and website built for gamers. It's clear Google is capitalizing on the success of ultra-popular YouTubers like Pewdiepie, who specializes in "Let's play" and walkthrough videos. YouTube Gaming's Twitter description is even, "Let's play."
The new division of YouTube tweeted, "We're connecting you to the games, community and culture that matter to you - by gamers, for gamers." With the service, folks can "hang out with [their] favorite YouTube Creators in our new Live system." Once you've subscribed to a channel, you'll be notified when a live stream is beginning so you don't have to miss any of the action.
Smart.
According to Google, YouTube Gaming is made exclusively for games and gamers, noting "when you search 'call', you'll end up with Call of Duty, not Call Me Maybe." The company revealed that over 25,000 games will each have their own page and make it much simpler for gamers to connect with their favorite YouTubers.
Google says that in addition to existing features like 60fps streaming and converting streams into YouTube videos, the site is going to make launching a live event possible with no prior scheduling, making surprise live streams and reveals an easier task.
YouTube Gaming will arrive this summer, starting in the UK and the US, and the team will have a booth at E3 2015 showcasing some of its new features.
- This is the final Oculus Rift
Read More ...
Movie Week: James Cameron on 3D: the techradar interview
Walking With Dinosaurs, the BBC and good 3D
This is an old interview that has been republished for techradar's Movie Week.
To coincide with the launch of Titanic 3D in the UK and US, TechRadar presents the exclusive interview it got with director James Cameron back in late 2011. In it he talks about the making of Titanic and why post-converting to 3D is one of the toughest things he's ever had to do...
James Cameron is the biggest advocate for 3D working in Hollywood today.
His hit movie Avatar kick-started a new wave of 3D movie-making, using techniques and technology that proved 3D could move beyond being a gimmick.
Although it wasn't to some critics' tastes, the numbers do not lie. Not only is Avatar the highest-grossing 3D movie ever made, it is the highest-grossing movie ever made.
The key to Avatar's success was that Cameron put 3D at the forefront of the film-making process, and he is now building a business out of offering his insight (and equipment) to other film-makers.
Cameron and the BBC
The fruits of this could be seen at this year's IBC (International Broadcasting Convention), where the Cameron | Pace group, which he runs with co-chairman and cinematographer Vince Pace, announced it would be working with the BBC on a feature film of Walking With Dinosaurs, as well as announcing that Evergreen Films will be the first company with Cameron | Pace certification.
"It's a theatrical motion picture so it will be in the IMAX in the UK and in digital 3D," said Cameron to TechRadar when we met up with him at IBC for an extensive chat on all things 3D.
"We had a very good meeting with the BBC here, where we said we could be doing all kinds of things together."
Walking With Dinosaurs is the biggest 3D production the BBC has undertaken, so it's good that the broadcaster has decided to do it under the tutelage of Cameron, who believes the Beeb needs to increase its 3D output.
DO THE DINOSAUR: Cameron will be working with the BBC
"The BBC has held back a little bit with 3D, whereas BSkyB has jumped in and ESPN has jumped in.
"The point I made to the head of BBC was that you can't hold back indefinitely.
"You have to muscle in on this; you have to learn how to do this. This is what broadcasters are starting to wake up to, that 3D is going to happen."
Although the BBC has only taken tentative steps into 3D production – as seen in this year's Wimbledon finals – Cameron does believe that the UK is well equipped for 3D and in some ways leading the way.
"The UK isn't waiting for a massive install base of 3D sets in the home; Sky is going for the sports bars and the pubs.
"That is a very creative solution and is one that can work all around the world."
A new 3D standard
One of the ways Cameron is setting out to improve 3D is to offer up the Cameron | Pace name to productions that use the company's technologies. Like THX before it, it's set to become a symbol to consumers that the 3D in the movie they are about to watch is the best it can possibly be.
PACE OF LIFE: The Cameron | Pace group hopes to bring quality to 3D
But with sales of 3D TVs not exactly setting the economic world on fire, Cameron understands that sometimes quality isn't enough; you have to offer quantity as well.
"We care about the quality of 3D, so we are looking to constantly improve it. But for the general public, the biggest improvement we can make for them is to give them more stuff.
"They just want more stuff so when they do purchase their 3D TV set, they want to make sure that there is some programming to justify it."
3D TV should be cheaper
Alongside programming, the cost of installing 3D into your home is something else that needs to improve. Cameron believes that the introduction of passive glasses into the home could be key to mass adoption.
"Instead of having to pay a premium at the cinema, the general public have to fork out on the glasses which are expensive," said Cameron to TechRadar.
"Passive glasses are here and the quality is improving all the time. The fact that they are throwaway means this technology could be critical."
3D TALK: James Cameron with his cinematographer Vince Pace
Sitting down with Cameron it is clear to see that 3D isn't just a passion for the director, it is what he is fully focused on career-wise, so much so that it's hard to see him releasing a movie again without using the technology.
Filming in 3D "never gets old"
"Vince [Pace] and I have been involved in so much 3D photography – we love it, it never gets old for us.
"When we shot the recent Cirque du Soleil film back in December, we were in the truck looking at the images and saying to each other: 'Look at that! That's great!' We're kind of like kids.
"In a way, though, I think we should look forward to 3D when it is less remarkable. In the same way we don't talk about how great the colour is on our TV sets any more or how great the colour is in a movie.
"We need to get to that point of equilibrium, where the tail isn't wagging the dog."
AVATAR: The second instalment will use high frame rate cameras
Unfortunately, since the launch of Avatar there have been a number of movies that have sullied the 3D format – one of which, according to Cameron, was Piranha 3D, a franchise that is linked to the director's early days as a film-maker.
Cameron's first directorial gig was on Piranha Part Two: The Spawning. The franchise was given a tongue-in-cheek 3D remake in 2010 by director Alexandre Aja and has spawned the brilliantly titled sequel Piranha 3DD, but Cameron was less than impressed with the results.
"When you are watching a bad 3D movie in 2D – and I was watching one the other night, this Piranha film – it was just bad storytelling in 3D, because everything would come to a stop and this grotesque thing would be sitting in the middle of the screen.
"[The movie uses] these stupid 3D tricks that people used to think were good. And there would be one of these zingers every few minutes where everything would come to a screeching halt and some 3D gag would be hanging right in your face.
"That's the 3D influencing the film-makers and not the other way around."
How to ensure 3D success
Although Cameron doesn't get annoyed that films like Piranha make audiences question the quality of 3D – "The film didn't make that much money, anyway," he explains – he does note that 3D done right is the only real way for box-office success.
"Good 3D that's done well in an integrated way and is an enhancement will make a film money.
"Like Alice In Wonderland and the new Transformers, the 3D was done well on that, it was deep and integrated into the shot design."
BAY-HEM: Michael Bay using a Cameron | Pace camera
High frame rates, Avatar 2 and Titanic 3D
One quality improvement for 3D, which Cameron has publicly hailed as the future of the format, is matching the technology with high frame rate shooting. Although we aren't going to see it used by Cameron until Avatar 2 – which is pencilled in for a 2014 release date – it is a technique that Peter Jackson is using for The Hobbit.
3D WORLD: Cameron behind the scenes of Avatar
"When I went to Show West in 2005 and showed 3D and said this is the future, there was a lot of sniggering and scepticism – even though it turned out to be the case," said Cameron to TechRadar.
"When I showed them high frame rate, however, I did a compelling demonstration, showing the same shot at 24, 48, 60 and so forth and the response was immediate. The exhibitors just got it and understood that this was something they could do quickly, that would be inexpensive.
"It seems to be gathering more momentum than 3D and by the time that I do release the next Avatar in four years it is going to be all over the shop. Something like 50 people would have done it and Peter Jackson looks like the one to break the ice."
3D conversion is "mind numbing"
Although we won't see Avatar 2 for a number of years, Cameron will soon be releasing Titanic 3D, which marks the first time the director has used 2D-to-3D conversion – a process that's straightforward for CG movies like Toy Story, but for movies with real actors it's a painstaking exercise in image manipulation, and one Cameron is not entirely enthused about.
TITANIC 3D: A 'mind-numbing process'
"I really don't enjoy the process," he explained.
"While Vince and I sit gleefully watching our 3D images being shot, a conversion is the exact opposite.
"It is a mind numbing process of creating depth subjectively.
"I am five months in so the artists are becoming good and starting to read my mind a little, so it has become easier, but I still sit there with the jog wheel and look through the movie frame by frame and make notes on depth."
"'That should be closer, that should be further back, there's not enough full depth here, there is not enough volume on that shoulder, a little more volume on that urn in the background, see that chair in the background on the left, no the other one on the left, that one needs to come forward another six inches… when the captain stands up we have to do an interlocular dynamic, da da, da da, da da… it's fricking endless!
"It's a mind-numbing process; it's like mowing the lawn with a toenail clipper."
KIDS ARE ALRIGHT: Cameron and Pace are like kids when shooting in 3D
While this doesn't mean we won't see other Cameron classics getting the 3D treatment – "Never say never, but if we can't do it with Titanic and George Lucas can't do it with Star Wars, then there isn't a market" – it does prove that introducing 3D at the beginning of the film-making process is far easier than doing it after the fact.
Cameron was keen to point out, however, just how much time he is investing into Titanic 3D.
"We are spending $18 million on Titanic and giving the project a year," he notes after we ask him why he thinks 2D to 3D conversions have failed before, using Clash of the Titans as an example.
"That was a classic mistake," said Cameron.
"They tried to make 3D a post-production process like sound editing and that doesn't work. The film-maker has to be involved and it takes time and good money to do a proper conversion.
"All of those things work against you in post production, where the film-maker is spread thin with getting the visual effects and getting the sound and music done so can't be looking after the 3D and somebody else is doing it.
"They spend five weeks getting stuff done that should take five months, or eight weeks that should take eight months, they spend 10 million dollars on something that costs twice that."
3D has arrived
It is only when you sit down with Cameron that you really begin to realise how deep his fascination with 3D runs. Despite the technology still not being where he wants it (something he is realistic about), he believes that 3D's time is now, and that cinema has arrived at a technological pinnacle.
FACE TO FACE: TechRadar was granted an extensive interview with Cameron
"We have cracked the frame rate issue and there's not much more to be done. I am going to have to spend my entire day on just creative issues," laughed Cameron.
"But we have sort of done it. We have got colour, we have got widescreen, we've got sound, we've got stereophonic sound and stereoscopic projection – we are done! We have covered all of the senses.
"For me now, it is getting good practises for 3D into the home, into the workplace, and into our daily image consumption."
Part of this daily 3D image consumption is weaning consumers off the idea that 3D equals glasses.
"We can't wait for big-screen autostereoscopic displays, because right now the quality on them is too poor.
"But for smaller screens in the desktop, laptop and tablet sizes, where it is basically a single-user model, you can do those right now. And you are going to see a lot more of those products coming to market over the next year or so.
"Then people will realise that 3D doesn't equal glasses, 3D only equals glasses in certain circumstances.
"Ultimately with tablets and laptops, people can toggle between 2D and 3D, and it will just become part of their diet."
For more information on the productions that are using Cameron | Pace equipment and news on their upcoming projects, head over to www.cameronpace.com.
Read More ...
Movie Week: Is Dolby Atmos the future of cinema sound?
Disney Pixar's animated blockbuster Brave was chosen to launch Dolby's latest cinema sound format, Dolby Atmos.
The revolutionary new audio system, which had its public debut in the US in June 2012, introduces a ceiling array to the traditional surround speaker configuration and allows object-based sound design for the first time. Dolby describes the end result as the "illusion of an infinite number of speakers."
The choice of movie title seems serendipitous. Dolby's gamble on Atmos looks certain to make it the front runner in what is shaping up to be the most hotly contested evolution in cine-sonics since the advent of 5.1.
Four other companies (Barco, Immsound, Iosono and Illusonic 3D) have rival immersive audio solutions at various stages of development, all intended to complement the higher picture quality now being offered by 4K digital cinemas.
The stakes are high. The winner of this new surround sound format war could potentially dominate big-screen audio for a generation, making millions from studios, exhibitors and ultimately home video licensing.
A game changer?
Stephen Field is the senior vice president for programs and products at Datasat Digital Entertainment (formerly DTS Digital Sound). He's more familiar than most with the next generation of 3D sound formats. Datasat makes the AP20 cinema sound processor, used in theatres worldwide, and has just launched a new high-end home cinema processor based on it, called the RS20i (yours for a cool £16k).
He told TechRadar that systems such as Dolby Atmos not only change the way films are experienced, but fundamentally alter the way sound designers think about mixing movies.
"Suddenly every element in the frame becomes a separate sound object," he says, "and the resulting soundtrack gets mixed on the fly at each and every cinema, depending on the speaker configuration in the theatre.
"If you have a hundred speakers down one wall, you can have the sound come from the exact speaker specified by the movie's sound designer. If that speaker fails, or you have far fewer speakers in an auditorium, then the system calculates where it needs to create a phantom speaker, so that the sound still appears in roughly the same space."
The process is called adaptive rendering. Dolby Atmos achieves this immersive innovation by combining additional sound channels with object-based steering. Engineers can place an audio object anywhere spatially, with a precision impossible in conventional 5.1 mixes. The result is a more natural and immersive soundfield; the days of crude overhead pans are audio history.
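Field's phantom-speaker description boils down to panning a sound between the nearest available speakers. The sketch below illustrates the general principle only: constant-power panning between an adjacent speaker pair. The function name and the one-dimensional speaker coordinates are invented for illustration and are not Dolby's or Datasat's actual rendering algorithm.

```python
import math

def constant_power_pan(position, speaker_a, speaker_b):
    """Gains for a phantom source between two speakers (constant-power law).

    position, speaker_a and speaker_b are scalar coordinates along one
    wall; returns (gain_a, gain_b) with gain_a**2 + gain_b**2 == 1, so
    perceived loudness stays constant as the source moves.
    """
    # Normalise the source position to the range 0..1 between the speakers
    t = (position - speaker_a) / (speaker_b - speaker_a)
    t = min(max(t, 0.0), 1.0)
    # Map 0..1 onto a quarter circle: cos/sin gains sum to unit power
    angle = t * math.pi / 2
    return math.cos(angle), math.sin(angle)

# A source two-thirds of the way between speakers at 0m and 3m is
# weighted towards the nearer (second) speaker
gain_a, gain_b = constant_power_pan(2.0, 0.0, 3.0)
```

A real renderer would do this in two or three dimensions across the whole speaker map, but the power-preserving idea is the same.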
Academy Award winning director Brad Bird was given a preview of Dolby Atmos earlier this year. He excitedly tweeted afterwards: "Was given a demo @ Dolby today of sequences from INCREDIBLES & MI:GP remixed in their brand new top secret system. KILLER."
Winning over Hollywood
Dolby is banking on the fact that mixing in Atmos doesn't impose significant time or creative penalties on a studio's existing workflow; after all, time is money. The post production process automatically creates 5.1 and 7.1 mixes of a movie, and it's these that are used for unmodified cinemas and home video distribution. The Atmos mantra is 'author once, optimise everywhere.'
However, equipping a cinema for immersive audio is an expensive business. A Dolby Atmos theatre employs two additional overhead speaker arrays, running front to back, which are used to create height effects. There are also additional surround speakers added to the sidewalls, with enclosures taken much closer to the screen than has been the norm previously.
Audio elements are not locked to any of these channels, however. Instead, they are stored with virtual position metadata and are assigned positions dynamically during playback, depending on the arrangement within a given theatre. Of course, not everyone agrees this is the best technology solution for better surround sound.
Dolby's closest competitor is Barco. Its rival Auro 11.1 system (expandable to 13.1) is channel-based rather than object-based - and it has some big Hollywood hitters behind it. The first movie mixed specifically for Auro 11.1 is George Lucas' WW2 action yarn Red Tails.
"With Auro 11.1 whatever is on that track, stays on that track. It doesn't vary. Nothing gets added or taken away from it," says Field. He confirms that Datasat has been working closely with Barco on a theatrical audio processor for Auro 11.1.
"We're developing software and a slot-in processing card for our AP20 cinema processor," he says. "When it's ready, we can then produce an upgrade card for our consumer RS20i processor." Barco definitely has aspirations to get its 11.1 sound system into the home market, he says.
Dolby, however, is certainly not talking about a home iteration of Atmos. Indeed, it's being pitched to theatre owners as a carrot to tempt punters away from their own home theatres, the aural equivalent of big-screen 3D visuals.
It's an understandable policy, says the man from Datasat. "Dolby first wants to maximise revenues by being the only supplier of cinema decoders. But I believe Dolby will expand the licensing of Dolby Atmos at some point. It will get its recurring revenue from the licence for film encodes, so it's in Dolby's interest to have the maximum number of audio processors able to play the format."
Field is quick to point out that the 16-channel Datasat RS20i is the only consumer audio processor currently upgradable to this new crop of cinema sound formats.
"It's a very simple upgrade procedure, too," he says. "You just slot in a board and the system immediately recognises the new hardware and knows what to do." Of course, installing a string of height channels and lining your living room walls with surround speakers may prove a tad more challenging…
Read More ...
Google's smart car tech can track road conditions
Even though engineers have found a way to turn driving over potholes into usable energy, we're still a long way from seeing it as an everyday reality.
For now, potholes and generally bad road conditions can end up seriously damaging your car, but Google may have found a way to track and share which roads you should be avoiding.
The search giant has filed a patent that allows a vehicle's GPS system along with a sensor attached to the car to track and record when a driver hits a pothole or rough road.
The sensor monitors the amount of vibration on the street, while the GPS system sends the information to a database that is continuously updated.
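The patent's two-part recipe, vibration sensing plus GPS logging, can be sketched in a few lines. Everything below is hypothetical: the `RoadEvent` record, the 2.5g threshold and the sample format are invented for illustration, and the patent itself doesn't specify such values.

```python
from dataclasses import dataclass

@dataclass
class RoadEvent:
    """A flagged rough-road location, ready to upload to a shared database."""
    lat: float
    lon: float
    severity: float  # peak vertical acceleration, in g

def detect_rough_road(samples, threshold=2.5):
    """Flag GPS-tagged vibration samples that exceed a severity threshold.

    samples: iterable of (lat, lon, vertical_accel_g) tuples.
    The 2.5g threshold is a made-up illustrative value.
    """
    events = []
    for lat, lon, accel in samples:
        if abs(accel) >= threshold:
            events.append(RoadEvent(lat, lon, abs(accel)))
    return events

# A smooth stretch followed by a hard pothole hit
log = [(37.42, -122.08, 0.3), (37.43, -122.09, 3.1)]
events = detect_rough_road(log)
```

A production system would filter out normal engine and road vibration before thresholding, but the report-and-aggregate shape would be similar.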
Better maps
Of course, Google's 'Systems and Methods for Reporting Road Quality' patent was only just filed earlier this month, so it is unlikely we'll be seeing the system hit the road any time soon.
Still, this type of tech could certainly beef up the company's Google Maps offering, as well as be easily integrated into the company's Android Auto, or any GPS system, for that matter.
It could also prove useful to Google's autonomous cars, making for a smoother and safer driverless ride, as well as allowing local governments to better track which roads need more attention.
- Of course, you could let potholes tweet their condition instead
Read More ...
Movie Week: Oculus and film: why virtual reality is the new silver screen
The new reality of movies
The cinematic experience has always been one of never-ending evolution. From the days of silent films and the nostalgia of drive-in theaters, to our modern home theaters and explosive 3D superhero extravaganzas, we have always been uniquely drawn to the silver screen.
Now, the expansive world of VR entertainment, pioneered by Oculus, is changing the way we look at films forever. Oculus is utilizing VR to create a film experience unlike any other: one that completely immerses the viewer in interactive films with its revolutionary technology.
As an unabashed film lover, I always approach new mediums with strong reservations. For me, the classic 2D experience has never been topped, and I vehemently refuse to pay to see a 3D movie.
However, the thought of being totally engulfed in a story and feeling like I'm with the characters themselves is far too enticing to ignore.
So let's take a look back at what Oculus has managed to accomplish in its revolutionary take on filmmaking, find out how Oculus is pushing the movie-making envelope and take a peek into what the future holds for VR films.
Oculus Story Studio is born
Although it's no secret that Facebook-owned Oculus has an interest in more than just gaming, the company had nothing to show for it until January 2015, when it unveiled its newest creative team, Oculus Story Studio.
At the forefront of a new storytelling experience, Oculus assembled the Story Studio team to prove that VR filmmaking isn't just possible, but deeply engaging.
"We knew how to get started with games, but we didn't know how to get started with film, with Hollywood, with cinema," chief executive Brendan Iribe said in January. He went on to say that he wanted the Story Studio team to prove that this revolutionary cinematic experience could actually be "compelling and rich."
Helming the fledgling creative team are Pixar animation veterans Max Planck and Saschka Unseld. With over 16 years of experience at the Disney-owned animation company between the two, they are aiming to bring that signature feel-good nature to Story Studio.
"I want you to come out of virtual reality and have a smile," Planck said at the 2015 Sundance Film Festival. "Or [experience] something very touching emotionally, just like Pixar films do."
Planck, who typed up Story Studio's announcement blog post, said he left Pixar after 10 years in search of the next evolution in storytelling. He was eventually drawn to join the Story Studio team and helped create its content.
Story Studio is looking to break new ground. In a blog post, Director of Photography Jeff Brown said, "storytelling has a new vehicle and we couldn't be more excited about the possibilities," and that the goals for Story Studio remain "to build truly immersive cinematic experiences that were never before possible."
Oculus' first films: Lost and Henry
Upon Story Studio's announcement, Oculus came out swinging, revealing it had completed its debut short, Lost, and would showcase it at the Sundance Film Festival in Utah in January 2015.
Not to be confused with the ABC show of the same name, the interactive short, directed by Pixar veteran Saschka Unseld, is based around the simple premise of a missing hand desperately trying to find its owner.
The film, which lets viewers explore its wooded setting and react to environmental changes like a character in the narrative, charmed critics, fans and VR skeptics alike.
The inherently playful nature of Pixar-esque storytelling is an obvious fit for Oculus at this stage. Its friendly and inviting nature forces your brain to forget that the resolution may not be perfect and instead welcomes you with open arms into an immersive yet pleasantly simplistic cinematic experience. The length of the short film varies from three-and-a-half to ten minutes depending on who dons the VR headset.
"Our first short film Lost was a chance for us to dive in and work out the challenges of telling a story in VR, everything from using traditional film techniques to scene staging and the many ways to grab the viewer's attention," Director of Photography Jeff Brown said.
Lost stands as a beautiful first step into a thrilling new world, and one that taught the developers lasting lessons to apply to future VR films.
In June, Oculus broke the cute quotient by revealing a trailer for a feel-good animated short about an adorable hedgehog named Henry, narrated by none other than Elijah Wood.
The premise of this short is very simple: Henry is a friendly hedgehog who wants nothing more than to dish out loving hugs, but his spikes scare the other critters away. We find him on his birthday all alone, but he makes a wish that "changes everything." If that doesn't tug at your heart strings even a little bit, there's a good chance you aren't human. (Don't worry, we don't discriminate.)
The goal of Henry is to make a character that viewers can identify and empathize with.
"Empathy for a character is one of the most important fields for us to explore in virtual reality," said Director Ramiro Lopez Dau. "And with Henry, we wanted to create a character that had a special bond with the audience."
Another component of Henry is that the title character will occasionally break the fourth wall, smiling at the viewer with big, friendly eyes, showing he's aware and happy you're on this adventure with him. Overall, the innovation of Henry isn't in the visuals or editing, but rather in that it forces the viewer to form a bond with the title character and feel what he feels, so to speak.
If you're anything like me, you'll want to throw on the VR viewer and meet Henry at his birthday party ASAP. Unfortunately, the public won't get a chance to experience the 12-minute interactive film until the consumer version of the Oculus Rift ships in Q1 2016. Fortunately, it will come free with the new VR viewer, so we'll all be at Henry's party soon enough.
What the future brings
Lost and Henry are just the tip of the iceberg for Story Studio. Before the Rift's 2016 release, five films are slated to be ready for viewing on the new device.
Not much is known of Oculus' upcoming titles, other than their announcement at Sundance and a few details spilled by members of the creative team. The most divergent film is an action-heavy story called Bullfighter that places viewers square in the crosshairs of a furious and presumably charging bull.
Another interesting title announced was Dear Angelica, helmed by Lost director Unseld. Its premise is described as "how would it feel if we could be inside of an illustration?" which sounds extremely intriguing, to say the least.
Despite not having much knowledge of the announced titles, if Lost and Henry are any indication, Oculus Story Studio will continue to push the envelope of cinema and escort us into the next step of film evolution.
Going to the movies
When the consumer version of the Oculus Rift hits the shelves early next year, you'll be ready to test drive these new film experiences. The Oculus Cinema app, currently available on Gear VR, allows you to walk around a virtual theater, choose a seat and watch both 2D and 3D films in your viewer.
If you're like most people, the idea of sitting in an empty theater is nice, but a bit lonesome. Thankfully, with the release of the Rift, Oculus will bring multiplayer into the mix, making it feel like heading to the movies with your friends (but without the overpriced ticket and the small child kicking the back of your seat).
"We already have a lot of internal social functions in Cinema that are going to be rolling out in the next few months," Oculus creator Palmer Luckey told Road to VR, "Things like avatar systems, being able to communicate with people over long distances… rather than just local multiplayer, but having actual long distant multiplayer as well."
The future of virtual film
Regardless of preconceptions about different mediums of film, it's hard to argue that Oculus' strides in VR cinema aren't absolutely exhilarating.
Although I'm a firm believer that traditional 2D film will never go out of vogue, the evolution of VR film and the character interaction it promises opens a brand new front, making storytelling feel fresh again.
Not every film should be brought into the virtual world, of course. As the movement and technology grow, I sincerely hope VR won't meet a similar fate to 3D films - where every new movie spat out of Hollywood is retrofitted to a format it wasn't made for.
However, for the films made with VR in mind, this new frontier promises to take us on a cinematic ride unlike any other, and it will be exciting to see where it goes.
Read More ...
Review: Updated: Panasonic GX8
Introduction and features
With G, GM, GF, GH and GX lines, Panasonic's G series of compact system cameras can get a little confusing. The GX models have a flatter, more rectangular shape than the SLR-like G and GH ranges, and they're more advanced than the GF and GM cameras. The latest introduction, the GX8, is an upgrade from the very successful GX7, which continues in the line for the foreseeable future.
Naturally, Panasonic is hoping the GX8 will be as popular with enthusiast photographers as the GX7 was, and it makes a good start by featuring the first Four Thirds type sensor with a pixel count over 16 million. In fact it has an effective pixel count of 20.3 million, and Panasonic claims this enables the camera to produce the highest image quality of any G-series camera, beating both the flagship GH4 and the recently released G7.
To complement the 25% increase in pixel count, which Panasonic claims brings a 15% improvement in detail resolution as well as a better signal to noise ratio and dynamic range, the GX8 has a new processing engine. However, this is new to the GX line rather than to the G-series; it's the same engine as in the GH4 and G7.
This enables a native sensitivity range of ISO 200-25,600, with an expansion setting of ISO 100, as well as a top continuous shooting rate of 8fps at full resolution. This frame rate is available in single autofocus mode; if you want to shoot full-resolution images with continuous autofocus the rate drops to 6fps.
There's also a new 2,360,000-dot OLED electronic viewfinder (EVF) which is much bigger than we've seen on previous Micro Four Thirds cameras. It offers 1.54x magnification, which is equivalent to 0.77x in 35mm format – the same as the Fuji X-T1. Panasonic claims there's less fringing and blurring in its finder than Fuji's. As on the GX7, the GX8's EVF can be tilted up for easier viewing from above.
In a change from the GX7, the GX8 has an OLED screen with 1,040,000 dots. Also, rather than being mounted on a tiltable bracket, it has a vari-angle hinge, which makes portrait and landscape-orientated image composition easier at high or low angles. As you'd expect from Panasonic, the screen is touch-sensitive.
Panasonic has also introduced a new Dual Image Stabilisation System, which combines lens and sensor-based stabilisation to reduce image blur when you're hand-holding the camera. The lens applies 2-axis stabilisation, while the body applies 4-axis compensation. Apart from in 4K mode, there's 5-axis stabilisation in video mode.
Being a Panasonic camera, the GX8 can record 4K videos (as well as Full-HD), and has 4K Photo mode with three shooting options: 4K Burst Shooting, 4K Burst (Start/Stop) and 4K Pre-burst, which are designed to record footage from which 8Mp still images can be extracted.
In 4K Burst Shooting mode the camera records 4K footage for as long as the shutter release is held down, and in 4K Burst (Start/Stop) mode recording is started with a press of the shutter release and is stopped by a second press.
In 4K Pre-burst mode the camera starts scanning as soon as the mode is activated, but only the 30 images immediately before and after the shutter button is pressed are recorded, giving 60 shots in total.
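The Pre-burst behaviour described above amounts to a rolling buffer: at 30fps the camera keeps roughly the last second of frames, then grabs another second's worth once the shutter fires. Here's a toy model of that logic; the class and method names are invented, and integers stand in for frames.

```python
from collections import deque

class PreBurst:
    """Toy model of a pre-burst buffer: retain the last 30 frames while
    scanning, then capture 30 more when the shutter fires (60 in total)."""

    def __init__(self, window=30):
        # deque with maxlen silently discards the oldest frame on overflow
        self.before = deque(maxlen=window)
        self.window = window

    def scan(self, frame):
        """Called continuously while the mode is active."""
        self.before.append(frame)

    def shutter(self, frame_source):
        """On shutter press: keep the buffered frames, grab 30 more."""
        after = [next(frame_source) for _ in range(self.window)]
        return list(self.before) + after

pb = PreBurst()
for f in range(100):          # the camera scanning continuously
    pb.scan(f)
# Shutter pressed; frames 100 onward arrive after the press
shots = pb.shutter(iter(range(100, 200)))
# shots holds frames 70..99 (before) plus 100..129 (after): 60 frames
```

The `maxlen` deque is what makes "recording before the button press" cheap: old frames fall off the back automatically instead of filling memory.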
In addition to these three modes, a new focus-shifting 4K Photo mode is planned, and will be introduced with a firmware upgrade. In this mode the camera shoots a sequence of 10 images at 30fps, each with a different focus distance – you can then select where you want the focus to be post-capture.
It's possible that Panasonic may also create a focus-stacking option to enable the creation of images that are sharp from front to back.
As with the G7, the 4K Photo modes can be used when shooting in program, aperture priority, shutter priority and manual exposure mode, so it's possible to take full control over exposure if you want.
Images can be shared directly from the GX8 using the built-in Wi-Fi, which can show a QR code for making the first-time connection so that you don't have to enter a password. There's also NFC (near field communication) technology for making fast connections to NFC-enabled devices.
Build and handling
As mentioned, the GX8 has a flatter, more rectangular shape than the G7 and GH4. Nevertheless it has a deep, effective front grip and a shallow thumb-ridge that gives just enough purchase – though it wouldn't hurt if this was a little more pronounced and more ergonomically shaped.
While I find there's enough space to accommodate my thumb on the back of the camera, those with bigger hands may find that their thumb feels a little confined, and may accidentally press the Quick Menu or Display buttons from time to time.
That said, its magnesium alloy body gives the GX8 a higher-quality feel than the G7, and it's also splash- and dust-proof, so it can be used in more inclement conditions. The new camera is also noticeably larger than the GX7 in every dimension, but it looks a little cleaner and has a bigger front grip.
Like the GX7, the GX8 has a dual-dial control system, but the GX8's dials are bigger and/or easier to reach. The front dial, for instance, which surrounds the shutter release, is easier to reach because the shutter button has been moved forwards to the top of the grip, while the rear dial is larger, and located on the top plate within striking distance of the thumb rest.
Like the GX7, the GX8 has a mode dial on the top plate for selecting the exposure mode, but this has now shifted from the far right towards the middle, and it sits above a new exposure compensation dial that has settings running from -3EV to +3EV in 1/3EV steps. As well as making it very quick and easy to adjust exposure, having a dedicated dial enables you to check the setting even before the camera is turned on; I was able to adjust exposure compensation using my thumb on the dial while looking in the viewfinder.
After shooting with the Panasonic G7, I found that I missed its drive mode dial on the GX8. Amongst other things, on the G7 this offers a quick way of switching between single shooting, continuous shooting and 4K Photo mode. While 4K Photo mode is useful for shooting developing action, it only generates 8Mp JPEGs, which have to be extracted from the video footage. Consequently, there's good reason to want to switch back to 20Mp raw file shooting quickly.
On the GX8 there are several ways of accessing the various shooting options. The most obvious are via the drive mode button or the Quick Menu – which is accessed by pressing the Q button. By default there's an on-screen function button to access 4K Photo mode, and it's possible to customise a physical button to reach the options, but neither is quite as quick as flicking a dial round.
Screen
Having a vari-angle screen is a big improvement on a tilting unit – it's much more versatile, enabling you to compose low- or high-level shots in landscape or portrait format, and to make the most of the facility to adjust settings using the touchscreen.
One issue I encountered when shooting at a high or low angle, however, was that it can be hard to locate some of the buttons on the back of the camera by feel alone. The Review, Display, Delete and navigation buttons all protrude clear of the camera body, making them pretty easy to find, but the Quick Menu and lower Function buttons are flush with the body, which makes them harder to locate. There's also no on-screen option to bring up either the Quick or main Menus.
I'm a fan of Panasonic's Touch Pad AF system, which enables you to set the AF point using the screen while looking into the viewfinder. However, when using it on the GX8 there were a frustrating number of occasions when the AF point started to resize rather than move to where I wanted it to be – it would be nice to be able to lock-off the resizing.
When shooting in landscape format, left eye users may find that they sometimes shift the AF point with their nose when using Touch Pad AF. I found it only happened occasionally, as the GX8's EVF protrudes quite some way beyond the screen, and if necessary the viewfinder can be tilted upwards a little to prevent your nose from touching the screen. It can also be an issue when shooting in upright format, as the top of the screen is within touching distance of your nose.
The new electronic viewfinder (EVF) is also a step up, providing a clear view with no visible texture or noise, and the image it displays is a good match with the captured shot. It's also a big device, and I found that when shooting upright images I had to consciously look up and down towards the outer edges to check composition – but that's a problem I don't mind having.
In addition, the viewfinder's refresh rate seems high, and I was able to follow the movement of body boarders and surfers as they zipped along some stormy waves.
There's also a sensor which activates the EVF and turns off the screen when the camera is held to your eye. As usual this can be a pain when you're adjusting settings or manipulating the camera into position, as your hand turns off the screen as it passes near the viewfinder, but there's a handy button that can be used to override the sensor and keep the screen on.
Although an electronic level can be useful, the GX8's has quite a wide margin of error, which means it's possible to produce images that look significantly tilted when the level indicates that the camera is straight.
While it's aimed at experienced and enthusiast photographers, the GX8's controls and menus are arranged well, and it's relatively easy to get to grips with using it. The build is reassuring and the camera responds quickly to adjustments, whether it's a tap on the screen or a press of a button.
It's helpful that the Quick Menu is customisable, but it would be nice if there was a customisable main menu screen as well.
Performance
Because the GX8 has an electronic viewfinder that's capable of previewing images with the settings applied, as well as reviewing shots, browsing through my shots from the camera doesn't bring any major surprises. On the whole the camera produces pleasant colours and good exposures.
As it's the first Micro Four Thirds camera to offer a pixel count of greater than 16 million, there's a lot of interest in how much detail the GX8 can capture, and how well noise is controlled – it's good news on both counts.
With the right lens the camera is capable of capturing an impressive level of detail. In our lab it matched the 24Mp Sony Alpha A6000 for detail at the lowest sensitivity setting, and its JPEGs beat it for much of the range. It also compares very well with the Olympus OM-D E-M5 II, although it doesn't have the same neat trick for increasing resolution.
The GX8 also impresses in the noise control stakes, with chroma noise only making the faintest of appearances in raw files shot at ISO 1600 when all noise reduction is turned off – you really have to look for it in images sized to 100%. Push up to ISO 3200 or 6400 and there's naturally an increase in the level of noise in raw files, but it's still subtle.
Meanwhile, JPEGs taken at the default settings look very good with lots of detail and a very slight smoothing of some details; at 100% raw files look a little more natural. Noise is more pronounced in raw files recorded at ISO 12,800, but there's also a good level of detail visible.
At ISO 25,600 there's a noticeable drop in saturation, and raw files are very noisy while JPEGs are soft, making them only suitable for use at relatively small sizes – but that's not unusual.
Panasonic has issued firmware upgrades for some of its lenses to enable them to work with the GX8's hybrid stabilization system. When shooting at the telephoto end of the Panasonic G X Vario 35-100mm f/2.8 lens, which with the Four Thirds type sensor has an effective focal length range of 70-200mm, I regularly got images that look sharp at 100% using a shutter speed of 1/10 sec; even some shots taken at 1/8 sec were sharp.
Turning on the stabilization system also has a significant effect on the view in the viewfinder – it becomes much more stable, but there isn't the nauseating yaw that can occur with some older systems. It also has a positive impact upon video footage; you can't hand-hold the camera while walking and expect super-smooth movies, but the minor tremor and shake that we normally expect is gone.
Panasonic's general purpose Multi metering system has impressed me in the past, and this continues with the system in the GX8. There were a few occasions when I needed to use the exposure compensation dial during my time with the camera, but none of these were situations in which I wouldn't expect to. In many instances the camera manages to cope with quite large bright areas in the frame without dramatically underexposing the rest of the scene.
In some images the sky looks burned out, but it's possible to retrieve quite a bit of detail from the raw file to create a better-looking end result. It's not just the raw files that have good dynamic range though – the JPEGs are also decent.
With very high-contrast scenes the dynamic range enhancing system, iDynamic, is worth using to boost the tonal range of JPEGs. Even when this is set to its highest value it produces natural-looking, rather than overtly HDR, results.
The effect isn't always predictable, but it often has a more noticeable effect on shadows than highlights, brightening the darker areas to bring out detail. Using it also often triggers the camera to set a lower exposure to capture more detail in the highlights, which means it has an impact upon raw files as well as JPEGs.
Using the automatic white balance setting and Standard Photo Style usually produces good results from the GX8, although if you want a little more warmth in overcast conditions it's worth switching to the Sunny setting. As is often the case, the Cloudy and Shade settings warm things a little too much.
The Standard Photo Style is a good all-rounder, but the other options – Vivid, Natural, Monochrome, Scenery and Portrait – are also worth exploring. Each can be adjusted to taste, and there's an option to save your own custom Photo Style.
Although I would usually convert raw files to mono in software, I found it was possible to produce some very nice in-camera black and white and toned images using the Monochrome setting.
There's also a collection of 22 filter effects that can be applied to JPEG files, with simultaneous raw files if you select the option. As usual these effects will appeal to some tastes and not others, but some can produce excellent results in the right situation, so it's worth experimenting with them. The Sun filter, for example, can add a colour cast along with a large white patch to simulate flare, and while it doesn't suit every scene there are times when it works very well.
Panasonic claims that its DFD (Depth from Defocus) autofocusing technology, first seen in the GH4, has a response time of 0.07 seconds in the GX8, and that there's a 200% improvement in AF tracking performance. I can't verify the figures, but I found the GX8 capable of keeping pace with moving subjects when the active AF point is held in the right place, and the tracking system can follow fairly fast-moving subjects.
I used the GX7 in the photographer's pit at Fairport Convention's Cropredy Festival 2013, and I was able to do the same with the GX8 this year. The newer camera's AF system was able to cope much better when light levels fell and the stage lights provided all the illumination, making the camera much more versatile than the previous model.
Lab tests: Resolution
We chose three rival cameras for the Panasonic GX8 to see how it measured up in our lab tests: the Olympus OM-D E-M5 II, Fuji X-T10 and Sony Alpha A6000.
We've carried out lab tests on the Panasonic GX8 across its full ISO range for resolution, noise (including signal to noise ratio) and dynamic range. We test the JPEGs shot by the camera, but we also check the performance with raw files. Most enthusiasts and pros prefer to shoot raw, and the results can often be quite different.
Panasonic GX8 resolution charts
We test camera resolution using an industry-standard ISO test chart that allows precise visual comparisons. This gives us numerical values for resolution in line widths/picture height, and you can see how the Panasonic GX8 compares with its rivals in the charts below.
JPEG resolution analysis: The GX8's resolution scores compare favourably with those from the Olympus OM-D E-M5 II, Fuji X-T10 and Sony Alpha A6000 up to ISO 6,400.
Raw (converted to TIFF) resolution analysis: While scores at low sensitivity settings are good, they drop off as sensitivity rises and achieve a lower score than the JPEGs, although this isn't really reflected in real-world images. However, the files are subject to a standard conversion using the supplied Silkypix software, with all noise reduction turned off. A bespoke conversion, made to suit the image, is likely to produce better results.
Sample Panasonic GX8 resolution charts
This is the chart we use for testing camera resolution. The key area is just to the right of centre, where a series of converging lines indicates the point at which the camera can no longer resolve them individually. We shoot this chart at all of the camera's ISO settings, and here are two samples at ISO 200 and ISO 6400.
ISO 200: Click here for a full-size version.
ISO 6400: Click here for a full-size version.
Lab tests: Dynamic range
Dynamic range is a measure of the range of tones the sensor can capture. Cameras with low dynamic range will often show 'blown' highlights or blocked-in shadows. This test is carried out in controlled conditions using DxO hardware and analysis tools.
Read: Noise and dynamic range results explained
Dynamic range is measured in exposure values (EV). The higher the number the wider the range of brightness levels the camera can capture. This falls off with increasing ISO settings because the camera is having to amplify a weaker signal. Raw files capture a higher dynamic range because the image data is unprocessed.
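One EV corresponds to a doubling of brightness, so the figure is the base-2 logarithm of the ratio between the brightest and darkest usable signal levels. A minimal sketch of the arithmetic (illustrative numbers, not DxO's actual measurement pipeline):

```python
import math

def dynamic_range_ev(max_signal, noise_floor):
    """Dynamic range in exposure values (EV): each EV is a doubling
    of the brightness range the sensor can distinguish."""
    return math.log2(max_signal / noise_floor)

# A sensor recording usable levels from 4 up to 40,000 (arbitrary units)
# spans log2(40000 / 4), which is about 13.3 EV.
print(round(dynamic_range_ev(40000, 4), 1))
```

Raising the ISO amplifies the noise floor along with the signal, which is why the measured range shrinks at higher sensitivities.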
Panasonic GX8 dynamic range charts
JPEG dynamic range analysis: The GX8 produces JPEGs that have a consistent tonal range throughout much of its sensitivity range, with good detail in shadows and highlights.
Raw (converted to TIFF) dynamic range analysis: The GX8's high dynamic range score continues into the upper sensitivity values, confirming our findings that raw files have a good range of tones.
Lab tests: Signal to noise ratio
This is a test of the camera's noise levels. The higher the signal to noise ratio, the greater the difference in strength between the real image data and random background noise, and the 'cleaner' the image will look.
Panasonic GX8 signal to noise ratio charts
JPEG signal to noise ratio analysis: These scores are consistent with the GX8's ability to capture a good level of detail while controlling noise well for the majority of the sensitivity range.
Raw (converted to TIFF) signal to noise ratio analysis: This is an especially strong set of results from the GX8, which indicate that raw files don't have a huge level of noise.
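For reference, the ratio behind these charts is conventionally expressed in decibels. This is just the standard formula, not DxO's exact measurement pipeline:

```python
import math

def snr_db(signal, noise):
    """Signal to noise ratio in decibels: 20 * log10(signal / noise).
    Higher values mean the image data stands further above the random
    background noise, so the image looks 'cleaner'."""
    return 20 * math.log10(signal / noise)

# Doubling the signal relative to the noise gains about 6dB, which is
# why SNR falls steadily as ISO rises and the real signal gets weaker.
print(round(snr_db(200, 1) - snr_db(100, 1), 1))
```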
Sample Panasonic GX8 ISO test results
The signal to noise ratio charts use laboratory test equipment, but we also shoot a real-world scene to get a visual indication of the camera's noise levels across the ISO range. The right side of the scene is darkened deliberately because this makes noise more obvious.
ISO 200: Click here for a full-size version.
ISO 6400: Click here for a full-size version.
Verdict
Panasonic has been using a 16Mp sensor for a long time, so the jump up to 20Mp is a major step, and it has the desired impact upon detail resolution. What's more, this has been achieved while keeping noise in check, with images looking very good when sensitivity is set to ISO 100-6,400.
The improved stabilization system is also very good, enabling sharp images to be captured at shutter speeds that would normally rule out hand-holding the camera. In addition it makes the image in the viewfinder much more stable, and video footage smoother.
While it's possible to make A3 prints from the 8Mp files shot in the 4K modes, many photographers will feel uncomfortable about buying a 20Mp camera and shooting at this lower resolution, even if it does allow 30fps capture. That said, there's considerable satisfaction in realising that you've caught a fleeting moment, and it will be interesting to see how Panasonic implements the focus-shifting option.
It's clear that the GX8's AF system has made a step forward from the GX7's. It's fast and effective in a wide range of lighting conditions, including low light, and keeps up well with moving subjects.
We like
The GX8 has an extensive feature set, and a pleasantly solid build that should appeal to enthusiast photographers. It's also very flexible to shoot with, as it has an excellent viewfinder that can be tipped through 90 degrees to make it easier to see, and a vari-angle screen that's useful for composing images in either orientation.
It's also possible to control the camera using either the touchscreen or the well-appointed collection of buttons and dials, and the exposure compensation dial is a welcome addition to the top plate.
Panasonic's 4K Photo mode is a fun feature that makes capturing fleeting moments very easy, but it seems quite a wrench to drop from 20 million pixels to just 8Mp, even if you can make decent A3 prints.
Many photographers will also appreciate the image stabilization system, which does a great job of correcting for the little shakes and wobbles that can blur images taken in low light or at the telephoto end of a lens.
We dislike
The GX8 is quite a bit bigger than most other Micro Four Thirds cameras, including the popular Olympus OM-D E-M10. While some may appreciate its larger dimensions, these don't translate into much more space for your hand on the back of the camera, and some of the rear buttons are hard to locate when your eye is at the viewfinder.
Final verdict
The GX8 is a nice solid camera with a couple of flourishes, such as the tilting viewfinder, vari-angle touch screen and 4K Photo mode, that give it added appeal to creative and enthusiast photographers.
Given its higher pixel count, better viewfinder and high-quality vari-angle screen it could prove more popular with this market than the recently released G7, though there's a significant step up in price. Image quality is also very high, making the GX8 a rewarding camera to use.
Read More ...
Do Beats 1 or Pandora have live sports? No, but TuneIn Premium does
While it might be better known for streaming radio stations online, TuneIn is taking on its rivals with a new Premium tier that gives you a little bit of everything.
"With the introduction of TuneIn Premium, we are taking the world's best audio content and putting it all in one place," said John Donham, CEO of TuneIn.
"In today's crowded audio landscape, our focus on exclusive news, talk, sports, and music allows us to deliver an unparalleled listening experience to our users."
Unlike radio streaming rivals such as Apple Music's Beats 1 and Pandora, TuneIn Premium, which is priced at about $8 per month, will include live sports coverage of all Major League Baseball games, plus Barclays Premier League and Bundesliga matches.
In terms of actual live radio streaming, TuneIn Premium will be commercial free, with the radio stations providing TuneIn with separate streams that will play music in place of their ad breaks.
If you don't want live radio and simply want to listen to music, TuneIn Premium also offers ad-free music streaming.
Unlimited library access
TuneIn is also taking on Amazon's Audible service, with the option to stream over 40,000 audiobooks from publishers like Penguin Random House, HarperCollins and Scholastic, including popular titles like the Harry Potter and Hunger Games series.
TuneIn's paid tier will also give you access to 16 different language learning programs.
Compared to Audible, which costs about $15 a month for a single audiobook, TuneIn's offer is a good deal for audiobook lovers, though its library is smaller than Amazon's.
TuneIn Premium will set you back $7.99 (£5.99) per month, and will be available in the US, UK and Canada. The company plans to roll it out to other countries in the future.
Read More ...
Movie Week: How to digitise your old movies
How to digitise your old movies
If you're over the age of thirty, then there's a pretty good chance you have a stack of VHS tapes lying around your home somewhere, collections of home videos and recorded TV shows so obscure that they haven't turned up on YouTube or BitTorrent.
You come across them every year during your spring clean, and wonder if it's too late to convert them to a format you actually still use. The answer is no, it's not too late – but it's getting there, so it's a good idea to get it done now, while you still can.
So that's what this is: a quick guide to converting your old analogue tapes to digital. We're going to focus on VHS primarily, but it also applies to VHS-C and 8mm (Video8/Hi8) tapes as well.
Image: Rob Pearce Flickr
Quick and dirty method
If you try shopping for a new stand-alone VHS player today, you won't find one. What you'll find instead is VHS/DVD combos. One of the features of most of these products is the ability to dub your VHS tapes to DVD, which makes them a very easy tool for converting your tapes.
Quick and dirty method: The process
1. Hook up the player to your television set (well technically this isn't necessary, but it does make things easier!).
2. Place the VHS tape into the player and a blank DVD-R disc into the DVD tray.
3. Select a recording format in the player's options. Most combos have options with respect to the quality-recording length trade off (ie. high quality/short record time or low quality/long record time). DVD-Rs are pretty cheap, so you should probably go with higher quality – especially since VHS recordings are "noisy" and tend to require a lot of data to capture.
4. Press play on the VHS tape. When it starts, press the dubbing button on the remote. The recording will happen in real time, so you'll just have to let it play. If the DVD runs out of space, you may have to stop the VHS playback and insert another, then start dubbing again.
5. When it's done, you'll have to finalise the DVD. Don't just eject it. That usually requires going into the setup menu and finding the edit disc/finalise disc option. This will turn it into something that can be played in any DVD player. Then you can eject it.
6. If you'd like to copy the video to your PC for backup or editing in an app like VirtualDub, Windows DVD Maker or iMovie, you'll need to use a DVD ripper. We recommend Handbrake, a free app that can grab the movie off the disk and turn it into a common format like MP4.
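The quality/recording-length trade-off in step 3 is simple bitrate arithmetic. A rough sketch, using the nominal 4.7GB capacity of a single-layer DVD-R (the bitrates are illustrative, not any particular player's presets):

```python
def record_time_hours(capacity_gb, video_kbps, audio_kbps=192):
    """Rough recording time for a disc: capacity divided by the combined
    video + audio bitrate. Ignores filesystem and muxing overhead."""
    capacity_bits = capacity_gb * 8e9  # disc makers count decimal gigabytes
    total_bps = (video_kbps + audio_kbps) * 1000
    return capacity_bits / total_bps / 3600

# A 4.7GB DVD-R at a high-quality ~8Mbps setting holds roughly 1.3 hours;
# halving the video bitrate to ~4Mbps roughly doubles that.
print(round(record_time_hours(4.7, 8000), 1))
print(round(record_time_hours(4.7, 4000), 1))
```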
Now we get to the "dirty" part of this equation. All but the most expensive combo players lack good image correction tools, so they won't remove image distortion or chroma shifting.
Unless the quality of your VHS tapes is absolutely top notch, you're going to get at best a mediocre looking recording on DVD. You might be able to rip them to PC and run some filters on them (see step 3 below), but nothing beats a good source.
If you want to make something that's a bit higher quality we'd recommend the following steps.
Setting up your source
More than any other step, this will affect the quality of your recordings. And we're not just talking about the quality of the tapes: VHS players are certainly not all the same, and a better model will produce a higher-quality output.
If you're set on dragging your own old VHS player out of mothballs and putting it to this task, there are some things you should do first:
1. Clean it. VHS head cleaners and cleaning fluid are still available from eBay and other sources, and there are good guides on YouTube for opening a player up and cleaning it properly.
2. Test it. Not with a VHS tape you actually care about. Play a tape for a good stretch and fast forward and rewind it to the ends to see if it's more inclined to eat tapes than play them.
3. Connect it to the capture device using the best available connection type. Essentially HDMI (which is available in new VHS players, but non-existent in old ones) is better than component (YPbPr), which is better than a composite cable or a coaxial loop-through. The latter also requires a TV tuner in the capture device to work.
If you're willing to buy or rent a "new" VHS player, however, then there are some models that come highly recommended. The "new" in this case, means "new to you", not actually new out of the factory. The best consumer VHS players aren't made anymore.
High-end S-VHS players from JVC and Panasonic are generally considered the best options. Not because of S-VHS (which almost nobody used), but because they have an inbuilt feature called a time base corrector (TBC). TBCs were used in professional environments like television studios for smooth switching between sources, but they also have important image correction capabilities, fixing distortion and in some cases cleaning up noise and irregularities.
As an alternative to a player with an inbuilt TBC, you can get stand-alone TBCs that sit between the player and capture device. They're pretty expensive, however, and you're better off renting than buying if you want to go this route.
Capturing your source
After the source device, you'll need a capture device for your PC, one that supports whatever format the VHS player outputs (HDMI, component, composite).
In general, we'd actually recommend using whatever capture software comes with the capture device. It's possible, of course, to use third party apps like VirtualDub or Debut, but you can probably save yourself a lot of hassle by using the app that's built to work with your device. The process is quite straightforward in most cases:
1. Hook the VHS player/8mm camera to your capture device.
2. Start the capture software.
3. Select your source (component, composite).
4. Configure the recording settings in the app settings – where it's saved, the video format and resolution. Be generous with the quality and bitrate.
5. Then press play on the VHS tape, and press record in the capture software. Press stop on both when you're done.
6. Depending on the software, you may have post capture options, like editing and DVD creation.
Some capture devices require no PC app at all. Primarily designed for game capture, stand-alone boxes like the Elgato Game Capture HD and Hauppauge HD-PVR Rocket can also grab component signals; you can then connect them to your PC via USB and copy the recorded video across.
Post Production
Hopefully, you're happy with the quality of the recorded video. If that's the case, then you can burn it to a DVD or save it to an external drive and forget the rest of this article. If not, then there are tools you can use to tinker with the video.
We'll be honest: there's a deep, dark hole of post production frustration waiting for you if you choose to go down that route. The tools available often require an expert's touch and a lot of trial and error.
Post Production: VirtualDub
Probably the most common tool in use is VirtualDub, an app for capturing and post-processing video. It's relatively complex, and we don't have room here to provide a complete guide, but we'll walk you through a few of the simpler things you can do with it.
Before running VirtualDub, there are a few things you'll probably need. You should start with x264vfw. Installing this will allow VirtualDub to encode video with H.264 compression via the x264 codec.
Then there are a couple of plugins you'll likely need. To add a plugin, just download it, unzip it, and copy it into the "plugins32" subfolder in your VirtualDub folder.
First, you should grab hold of the Virtualdub FFMpeg Input Plugin. This allows you to open most video file formats in VirtualDub.
Second, grab the FlaXen VHS filter. It's the best filter we've found for improving VHS recordings. There are individual filters around for chroma shifting, noise reduction and image sharpening, but FlaXen has them all built in.
VirtualDub: The process
1. Run VirtualDub.
2. Drag and drop your captured video file into the main window. You'll see it appear twice: the one on the left is the original video, the one on the right is the output.
3. Click on Video->Compression. Select x264vfw and click OK. (You can also click Configure to change the quality settings, but the defaults aren't bad for VHS).
4. Click on Video->Filters. Then Click on Add. Scroll down and select VHS and click OK.
5. The FlaXen VHS filter settings will appear. Here's where the tinkering starts:
- The stabilizer reduces some jitter and static from the video.
- Noise reduction is designed to remove speckles and other "noise" elements from the video. You can check the box to have it remove the noise before and/or after the stabilizer works.
- Chroma shifting is probably its strongest feature. A poor VHS recorder will have the colour (chroma) and luminance information slightly out of sync, resulting in an offset or ghosting of the chroma information – it often looks like red/yellow blobs on people's skin, or colour weirdly offset like someone failing to colour within the lines. This filter fixes that, although the exact values to enter will probably require some experimentation: they're the number of pixels the chroma should be shifted back, so the worse the image, the higher the numbers (around 4 is a decent starting point).
- Sharpening works just like in an image program, increasing the contrast between pixels.
Whether you use each element will depend on the video in question. We can only really suggest trying them on and then making a comparison between the input and output videos.
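To see what the chroma-shift correction is conceptually doing, here's a toy version in Python with NumPy. It assumes the frame's colour information is available as a separate plane; this is an illustration of the idea, not FlaXen's actual code:

```python
import numpy as np

def correct_chroma_shift(chroma_plane, shift_px):
    """Shift a chroma (colour) plane left by shift_px pixels so it lines
    up with the luminance plane again. Edges wrap here for simplicity;
    a real filter would pad or clamp them instead."""
    return np.roll(chroma_plane, -shift_px, axis=1)

# A chroma plane recorded 4 pixels to the right of where it belongs:
original = np.arange(16).reshape(2, 8)
misaligned = np.roll(original, 4, axis=1)
print(np.array_equal(correct_chroma_shift(misaligned, 4), original))  # True
```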
6. Try other filters. There are some internal video filters you can possibly add:
- Deinterlace. It's likely that the capture app you used did this for you, but if the video is still interlaced (with alternate lines appearing out of sync with each other), apply the deinterlace filter.
- HSV adjust lets you change the saturation and brightness of the image, making faded video pop a little more.
- Sharpen is another filter that lets you sharpen a blurry image.
7. Click on File->Save as AVI. Give the file a name and let it process.
8. If you want to shift from AVI to a newer format that works on more devices (like MP4), you can use Handbrake (handbrake.fr) to convert the file.
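If you have a lot of captures, step 8 can also be scripted with Handbrake's command-line build, HandBrakeCLI. A small sketch, assuming HandBrakeCLI is on your PATH (check the flags against your installed version):

```python
import shutil
import subprocess

def handbrake_cmd(src, dst, quality=20):
    """Build a HandBrakeCLI command line: -i input, -o output, -e encoder,
    -q constant-quality level (lower numbers mean better quality)."""
    return ["HandBrakeCLI", "-i", src, "-o", dst, "-e", "x264", "-q", str(quality)]

cmd = handbrake_cmd("capture.avi", "capture.mp4")
# Only attempt the conversion if HandBrakeCLI is actually installed.
if shutil.which("HandBrakeCLI"):
    subprocess.run(cmd, check=True)
else:
    print(" ".join(cmd))
```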
Of course, that's really the tip of the iceberg when it comes to touching up your video captures. If you're keen on understanding it better, there are very good guides to be found here and here. There's certainly a lot you can do to try and touch up your videos – although you can run into diminishing returns pretty quickly!
- TechRadar's Movie Week is our celebration of the art of cinema, and the technology that makes it all possible.
Read More ...