Monday, October 12, 2015

IT News Headlines (AnandTech) 13/10/2015

AnandTech



ASUS Announces the ROG Maximus VIII Impact Z170 Motherboard
At the initial Skylake launch, we provided a rundown of all the Z170 motherboards we could get information on from the major motherboard manufacturers. Even with details on over 55 motherboards, there were some noticeable parts missing from that list as we were told to expect more as the platform developed. The Maximus VIII Impact from ASUS is one of those to come a couple of months after the initial launch, and builds on the previous Impact products from the Republic of Gamers brand.


As with previous Impact motherboards, the major difference between it and other mini-ITX motherboards is the power delivery, which sits on a right-angled daughterboard at the top. This allows ASUS to provide a larger power sub-system and cool it appropriately, and we are told that over the multi-generational iterative design, the power losses involved with this method have been continually minimised. It is difficult to make out from the images, but the Z170 generation of the Impact comes with a rear metal guard (look on the right hand side and over the motherboard screw hole in the top right corner), presumably either for rigidity or for extra PWM cooling. Also as with other Impact designs, the power connectors are on the right hand side of the board, outside the DRAM slots, to make them easier to access for cable management. Underneath this is the front IO panel, a fan header and a USB 3.0 header from the chipset.

Being a mini-ITX board, there is really only space for two DRAM slots, and the Impact uses dual DDR4 slots with single-sided latches, supporting up to 32GB at speeds up to DDR4-4133. The latches for the slots are on the PCIe side due to the power delivery daughterboard, which might make DRAM removal with large GPUs installed more difficult. Inside the DRAM slots are four SATA 6 Gbps ports, but storage is aided by a U.2 connector near the rear panel. This placement is down to the size of the U.2 connector, and it's great to have the functionality, though it does mean that SSD 750 owners will in most cases be routing the cable over the motherboard to reach it. Back at the launch of the Z170 chipset, after the SSD 750 had been out for a month, I spoke to motherboard manufacturers about actually replacing SATA Express with U.2 onboard - here's one of the first examples (there are a couple of others).

The last generation Impact came with an additional daughterboard for extra fan headers, to aid the two already on the motherboard. Here, instead of that daughterboard, we get an EXT_FAN header in the top right, which is a breakout connector to a separate PCB containing fan headers and power. When we have talked to ASUS about this on previous designs, the reasoning has been that Impact owners tend to build highly customized systems, so having this option allows for a more configurable system design.


While we don't have HDMI 2.0 here, ASUS seems to be using Intel's Alpine Ridge controller to provide two USB 3.1 ports at 10 Gbps with Type-A and Type-C support. There are two Intel ICs next to the USB 3.1 ports, the smaller of which is the network controller, while the larger seems to be the Alpine Ridge and is similar to other implementations - there isn't an ASMedia controller nearby at any rate. This would be one of the first times we've seen the Alpine Ridge controller outside of GIGABYTE boards, perhaps suggesting that their launch day exclusivity has now come to an end. ASUS also equips the board with an Intel I219-V network controller, a 2x2 MU-MIMO 802.11ac Wi-Fi Go module, and their SupremeFX Impact III daughterboard audio solution, now encased in a full EM shield and with LED-illuminated audio jacks and headphone impedance detection/adjustment up to 600 ohms. The rear panel also supports BIOS flashback, four USB 3.0 ports, and their Impact Control III information/control panel.

Because this information comes from the ROG team and not an ASUS regional office, pricing and availability are not yet known. That said, ROG team announcements have typically preceded US availability by 2-4 weeks, and previous Impact motherboards have been priced in the $230-$280 range.


Source: ASUS ROG

Update 10/11: We've been told that the Max VIII Impact will retail for around $249 and be available next week in North America.




Read More ...




Micron acquires SSD Controller Designer Tidal Systems, Inc.
A year ago, several veterans of SSD controller design firms SandForce and Link_A_Media Devices formed a new startup called Tidal Systems, Inc. to focus on developing NVMe SSD controllers. Tidal has spent the past year operating in stealth mode, and their website has almost no information about the company's work.

Over this past week, during a financial call to investors, it was announced that Tidal has been acquired by Micron. There are no details available regarding the terms of the acquisition, and since Micron reported their quarterly results just before the acquisition, we won't get any more information soon. On the technical side, Micron plans to fold Tidal into their Advanced Controller Group and use Tidal controllers in client PCIe SSDs.

While the news of the acquisition has been floating around for a few days, we wanted more, and so contacted Micron's PR to get further information. Ultimately we were told very little, as the details are being kept under wraps for the time being. We were told that the acquisition includes Tidal's "inventory, equipment and intellectual property rights"; not mentioned in our discussions with Micron were the key people involved in developing Tidal's technology. Some amount of staff turnover during an acquisition is normal, and we'll have to wait and see who stays with Micron and who moves on. Neither company has given an indication of how close to market Tidal's controller or controllers may be (we don't even know how many are involved), so it is difficult to gauge how much of an impact, both in terms of technology and personnel, this acquisition will have.

This acquisition is motivated by Micron's desire to develop high-end client SSDs without being dependent on third-party controllers from Marvell or others. This would give Micron more opportunity for product differentiation and keep more of the design in-house. That is becoming important as the market consolidates - vertical integration of the SSD business has been working out very well for Samsung and Intel, and the industry has seen a great deal of consolidation in recent years. Micron's acquisition leaves SanDisk as the only NAND manufacturer that doesn't do in-house controller design for the client SSD market, so it's likely that they are sizing up the remaining independent SSD controller vendors.


Read More ...




ASUS Announces ROG PG279Q & PG27AQ Gaming Monitors
Today ASUS announced two new gaming monitors at their Republic of Gamers Unleashed event in San Francisco. Both displays are 27" IPS panels, although there are some significant differences that make each one better suited to certain genres of games. Below you can find all the relevant specifications for both of ASUS's new monitors.

                      ASUS PG279Q                    ASUS PG27AQ
Resolution            2560x1440                      3840x2160
Panel Size            27"                            27"
Panel Type            WLED + IPS                     WLED + IPS
Refresh Rate          144Hz (OC 165Hz)               60Hz
Contrast Ratio        1000:1                         1000:1
Peak Brightness       350 nits                       300 nits
Response Time (GtG)   4ms                            4ms
Viewing Angle (H/V)   178° / 178°                    178° / 178°
Inputs and Outputs    DisplayPort 1.2, HDMI 1.4, 2x USB 3.0, 3.5mm audio (both models)
Color Depth           16.7 million (8-bit)           1.07 billion (10-bit)
Speakers              2x 2W stereo                   2x 2W stereo
Other Features        NVIDIA G-Sync,                 NVIDIA G-Sync
                      NVIDIA Ultra Low Motion Blur
Price                 $799                           N/A

Starting with the PG279Q, we see that it's a 27" WQHD IPS panel with a refresh rate of 144Hz. This monitor is definitely targeted more toward gamers who play games like first person shooters, where a high refresh rate is a greater asset than a higher resolution. When paired with a GTX 960 or faster NVIDIA GPU, the display's refresh rate can be boosted up to 165Hz, and ASUS has even included a button on the monitor to switch between the two refresh rates on command. This is actually more useful than it sounds, because a user can easily move to 165Hz while gaming and stick with 144Hz in typical use; since 144Hz is an even multiple of 24 (each frame is displayed for exactly six refreshes), this also eliminates telecine judder in 24fps video content.


As for the PG27AQ, I would imagine that gamers who play RTS and simulation games would choose it over the PG279Q for its higher resolution. It's a 27" UHD panel with a refresh rate of 60Hz, and a greater 10-bit color depth compared to the PG279Q's 8-bit color. It shares most of the remaining specifications with the PG279Q, including a contrast ratio of 1000:1, a 4ms grey to grey response time, 178 degree viewing angles on both axes, and the inputs and outputs listed above.

Both of these new monitors feature NVIDIA's G-Sync adaptive refresh rate technology. Even on the PG27AQ this can be useful despite it only being a 60Hz panel, as it will produce a much more fluid image if a game's frame rate drops below 60fps than a non G-Sync / FreeSync display. However, only the PG279Q has NVIDIA's Ultra Low Motion Blur technology which uses a strobing backlight to reduce motion blur. It's worth noting that G-Sync and ULMB are mutually exclusive and you need to choose which one you want a game to use based on whether or not you can maintain the PG279Q's native refresh rate of 144/165Hz.

The ASUS PG279Q will be available in November, with a starting price of $799 in the United States. Pricing and availability for the PG27AQ is currently unknown, but I would imagine that the price will be in the same realm as the PG279Q.


Read More ...




Western Digital My Cloud Mirror Gen 2 Review
Western Digital is well known to the average consumer as a hard drive manufacturer. That reputation, by extension, also opens up the network-attached storage (NAS) market to them. In 2014, the company unified their embedded Linux-based offerings under the My Cloud tag. The My Cloud Mirror units targeted home users, while the EX 2-bay and 4-bay units targeted prosumers and small office / home office (SOHO) installations. For business users, the DL series was introduced earlier this year. A few weeks back, Western Digital announced an updated operating system for the My Cloud units - My Cloud OS 3. Along with that, the My Cloud Mirror Gen 2 was also introduced. WD sent us the 4TB version for review. Read on to see how the unit stacks up against the competitors in this space.


Read More ...




Analyzing Apple's Statement on TSMC and Samsung A9 SoCs
Since we first learned that the A9 SoC in Apple's iPhone 6s lineup is dual sourced - that is, it's being made by two different vendors with two distinct manufacturing processes - one major question has hung over the process of reviewing these two phones. The main issue under question here is whether the TSMC A9 or Samsung A9 have any difference in performance and power consumption. If there is a difference, the question then becomes whether the difference is significant.

In an atypical move for the normally tight-lipped manufacturer, Apple issued a statement this afternoon in response to these questions and some rudimentary end-user benchmarking showing that there may be a difference:

With the Apple-designed A9 chip in your iPhone 6s or iPhone 6s Plus, you are getting the most advanced smartphone chip in the world. Every chip we ship meets Apple's highest standards for providing incredible performance and deliver great battery life, regardless of iPhone 6s capacity, color, or model.

Certain manufactured lab tests which run the processors with a continuous heavy workload until the battery depletes are not representative of real-world usage, since they spend an unrealistic amount of time at the highest CPU performance state. It's a misleading way to measure real-world battery life. Our testing and customer data show the actual battery life of the iPhone 6s and iPhone 6s Plus, even taking into account variable component differences, vary within just 2-3% of each other.

It is interesting to see this response, as Apple normally doesn't comment on anything like this, which in turn is likely a good indicator of how seriously Apple is taking any concerns. However, this statement is also of interest because it's revealing in terms of what internal data Apple has collected on the issue. Apple has in recent years been one of the better companies in accurately promoting the battery life of their products, and that kind of accuracy comes not only from taking a conservative (safe) stance in marketing, but also from collecting massive amounts of data to understand their products and their capabilities.

The Test


To get to the meat of matters then, let's talk about battery life, tests, chips, and statistics. In terms of the testing that has seemingly spurred on this Apple response, it’s likely that the “manufactured lab tests” in the statement refer directly to Primate Labs’ GeekBench battery life benchmark. The GeekBench test runs parts of the GeekBench CPU benchmark in a loop, making sure to do a fixed amount of work per time interval while idling the rest of the time, and using the score result as a modifier for the runtime score. This makes the GeekBench battery life benchmark primarily a SoC/CPU/Memory benchmark, and that in turn has repercussions for interpreting the data.
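For readers curious about the mechanics, below is a minimal Python sketch of a fixed-work-per-interval loop of this kind. The work size, period, and cycle count are arbitrary stand-ins for illustration, not Primate Labs' actual parameters:

    import time

    def fixed_work_benchmark(work_items=200_000, period=1.0, cycles=10):
        # Each period: finish a fixed slice of work, then idle the remainder.
        # A faster chip finishes its slice sooner and idles longer, so the
        # resulting duty cycle reflects SoC/CPU efficiency.
        busy_total = 0.0
        for _ in range(cycles):
            start = time.monotonic()
            _ = sum(i * i for i in range(work_items))  # the fixed work slice
            busy = time.monotonic() - start
            busy_total += busy
            time.sleep(max(0.0, period - busy))        # idle out the period
        print(f"observed duty cycle: {busy_total / (cycles * period):.0%}")

    fixed_work_benchmark()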


In the case of our own web browsing battery life test, for example, this is a test that attempts to simulate light reading of web pages, meaning that the SoC is only working hard a fraction of the time. The vast majority of the time the SoC is idling, making the display the biggest power consumer; and this is especially the case on these latest generations of high performance flagship smartphones. A heavy test on the other hand would be a test that keeps a sustained and significant load on the CPU and GPU, which shifts the power consumption of a test from the display to the SoC.

Battery Life: Light vs. Heavy

An Example of the Wide Gulf Between Light and Heavy Battery Life Testing

Due to the nature of its use of fixed size workloads, the GeekBench battery life benchmark lies somewhere in between a heavy load and a light load (Primate Labs states it's around a 30% duty cycle on the 6s). And this is notable because it means that GeekBench is in fact highlighting the difference in power consumption between the TSMC and Samsung A9s. However as Apple points out in their statement, a sustained workload is not necessarily representative of what real world usage is like, with the real world consisting of bursts of different types of workloads. This doesn't mean GeekBench doesn't return valuable data; rather, it means we're looking at a slice of a bigger picture. Ultimately, if there is a difference between the TSMC and Samsung A9s, then GeekBench is likely exaggerating that difference versus what a real world mixed use case test would see.
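A toy power budget helps make the light/heavy distinction concrete. In the sketch below, all power figures are illustrative guesses rather than measured iPhone 6s numbers, with the battery capacity set near the 6s's reported ~6.5 Wh:

    def battery_life_hours(duty, battery_wh=6.5, display_w=1.0,
                           soc_idle_w=0.1, soc_active_w=4.0):
        # Display draw is constant; SoC draw scales with the duty cycle.
        soc_w = soc_idle_w + duty * (soc_active_w - soc_idle_w)
        return battery_wh / (display_w + soc_w)

    for label, duty in [("light (web browsing)", 0.05),
                        ("GeekBench-style (~30%)", 0.30),
                        ("sustained heavy load", 1.00)]:
        print(f"{label:24s} {battery_life_hours(duty):4.1f} h")

In this model, a difference in SoC active power barely moves the light result, where the display dominates, but shows up clearly as the duty cycle rises - which is why a mid-duty-cycle test can highlight a TSMC/Samsung gap that mixed real-world use would mute.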

The Chips


As for the data itself, due to the fact that GeekBench is a heavier workload, it means that there are a number of factors that could explain why battery life in this test shows such a large difference, and not all of these factors are easy to account for. With the chips themselves, it could be that the Samsung A9 variant is simply reaching higher average temperatures due to its smaller die size (same heat over a smaller area), which then accordingly affects power draw due to the nature of semiconductor physics (increased leakage). It doesn’t have to be at the point where the workload is causing thermal throttling, but even a sustained load for significant periods of time could be enough to cause this effect, as the CPU will reach higher temperatures even if the phone is cool to the touch.



An example of the temperature versus power consumption principle on an Intel Core i7-2600K. Image Credit: AT Forums User "Idontcare"

Given the nature of chip manufacturing, it’s also hard to say one way or another whether an individual chip and phone pair will have better or worse battery life than another chip and phone pair. This is a pretty complicated subject, but it basically boils down to the difficulty of injecting exactly a certain number of ions into a small part of the silicon wafer or depositing a layer of insulator that meets an exact thickness. This results in chip manufacturing quality being distribution based - a wafer will come out of production with individual chips of varying quality, with some chips operating at lower voltages or lower leakages than others, and other chips being altogether defective. This is colloquially known as the "silicon lottery."

It's then from these chips that a customer (e.g. Apple) needs to make tradeoffs between how "poor" of a chip they are willing to accept and how many chips they'd like to be able to use from each wafer (the yield). In doing so, a customer will set minimum tolerances, certain parameters that a chip needs to meet to be qualified. However as these are minimums, it means that a customer will also receive chips that exceed them, as we can see in our completely fictitious chart below.



A completely fictitious processor quality distribution example. Chips in the white area are used, outlying chips in red areas are rejected
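To make the idea concrete, here is a small simulation in the same spirit as the fictitious chart above; the distribution, cutoffs, and units are all invented for illustration:

    import random

    def silicon_lottery(n_chips=100_000, mean_v=1.00, sigma_v=0.03,
                        v_min=0.90, v_max=1.08):
        # Sample a per-chip operating voltage and apply pass/fail cutoffs.
        chips = [random.gauss(mean_v, sigma_v) for _ in range(n_chips)]
        passing = [v for v in chips if v_min <= v <= v_max]
        print(f"yield: {len(passing) / n_chips:.1%}")
        # Every passing chip "meets spec", yet the best and worst passing
        # chips still differ noticeably in voltage (and hence power).
        print(f"best: {min(passing):.3f} V, worst: {max(passing):.3f} V")

    silicon_lottery()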

In the case of someone like Intel, they will bin these passing chips as different products (Core i3/i5/i7) and different clockspeeds to charge the most for the best chips. However since Apple currently only uses at most two bins of chips (those suitable for 6s and those suitable for 6s Plus), this means there is a wider variation in the chips used in each phone. As a result, even if there isn't a true and consistent difference between TSMC and Samsung for A9 SoCs, you could easily have a pair of phones where due to the silicon lottery there is a notable - though not extreme - difference in power consumption.

This variation is an expected part of chip manufacturing, and while Apple could disqualify more chips and thereby reduce the yield, there will always be a certain degree of variation. The key here is to set rigorous minimums, advertise a phone based on those minimums (e.g. battery life), and should a customer end up with a phone that does better than those minimums, then they have simply won the silicon lottery.

The Statistics


So how does one compare phones when there's a natural variance in chip quality? Ideally it would be done just like Apple does, testing a large number of phones (chips) for power consumption and battery life to determine the distribution. Otherwise if we only test one TSMC A9 and one Samsung A9, we don't know where in the distribution each A9 lies, and consequently whether each phone is a representative "average" sample or not.

Of course this is easier said than done, as the greater the accuracy desired the greater the number of iPhones required, a bill that at $650/$750 a unit adds up quickly. This makes it incredibly impractical for any one group short of a large corporate competitive analysis team to get enough samples, especially since at the press level Apple only distributes one phone of each type (for a total of 2) to each press organization. In lieu of that, typically the best one can do is look at a small number of samples, which offers some data to account for variance, but not much.
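A quick simulation shows just how treacherous 1-vs-1 comparisons are. Suppose, purely hypothetically, that one vendor's phones really average 10.0 hours and the other's 9.7 hours of battery life, with a per-unit spread of 0.6 hours:

    import random

    def one_vs_one(trials=10_000, mu_a=10.0, mu_b=9.7, sigma=0.6):
        # Draw one phone from each (hypothetical) distribution and compare.
        wrong = sum(random.gauss(mu_b, sigma) > random.gauss(mu_a, sigma)
                    for _ in range(trials))
        print(f"ranking reversed in {wrong / trials:.0%} of 1-vs-1 tests")

    one_vs_one()

With those invented numbers, a single-sample comparison picks the wrong winner roughly a third of the time, which is exactly why distribution-level data of the sort Apple holds is needed to settle the question.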


This brings us back to where we started with GeekBench. As part of the GeekBench benchmark, results are uploaded to Primate Labs' servers, where they are available for browsing. I've had a couple of people ask whether these results, as a collective whole - in essence crowdsourced benchmarking - can answer anything, and the answer to that is a "yes, but" kind of scenario.

The big problem right now is that GeekBench doesn't know what model processor is being tested, so there's no way to sort out TSMC versus Samsung. Update: And as if right on cue, the same day we publish this GeekBench 3.4 makes it through the iOS app store approval process, adding the ability for GeekBench to tell which processor is in a phone.

Even if there was, we'd get to matters of testing rigor. How bright is the screen on each phone? Are there any background tasks running? Is the phone in airplane mode, or spending power talking over WiFi/cellular? With GeekBench running at just a ~30% duty cycle on average, these are all potential power consumers that can significantly impact the resulting battery life. In turn, these are all things that are accounted for in formal testing, but they cannot reliably be accounted for in benchmarks run by the wider public. This adds even more variance to the equation, which makes individual or even small groups of results potentially very inaccurate.

The Conclusion


Wrapping things up then, where do we stand? The short answer is that all we know is that we don't know. There isn't enough information currently out there to accurately determine whether the TSMC or Samsung A9 SoC has better power consumption, and more importantly just how large any difference might be. 1-on-1 comparisons under controlled conditions can provide us with some insight into how the TSMC and Samsung A9s compare, but due to the natural variation in chip quality, it's possible to end up testing two atypical phones and never know it.


To that end I suspect that Apple's statement is not all that far off. They are of course one of the few parties able to actually analyze a large number of phones, and perhaps more to the point, having a wide variation in battery life on phones - even if every phone meets the minimum specifications - is not a great thing for Apple. It can cause buyers to start hunting down phones with "golden" A9s, and make other buyers feel like they've been swindled by not receiving an A9 with power consumption as low as someone else's. To be clear, there will always be some variance; this is normal and expected, but if Apple has done their homework they should have it well understood and reasonably narrow. The big risk to Apple is that dual sourcing A9s in this fashion makes that task all the harder, which is one of the reasons why SoCs are rarely dual sourced.

As for AnandTech, we'll continue digging into the matter. Unfortunately all of the iPhones we've received and purchased so far have used TSMC A9s - it's a silicon lottery, after all - but whether there is a real and consistent difference between the TSMC and Samsung A9s is a very interesting question and one we're still looking to ultimately be able to address.


Read More ...




Dell XPS Lineup Is Reinvigorated With Skylake On The New XPS 12, XPS 13, And XPS 15
Dell really put their stamp on the 2015 Ultrabook lineup with the XPS 13 and its amazing Infinity Display. They packed a 13.3-inch display into a notebook that would normally house something closer to an 11-inch panel, and no other manufacturer has come close to it this year. Performance was great, battery life was the new benchmark, and other than a couple of foibles, such as a camera that points up at your chin and some aggressive use of Content Adaptive Backlight Control (CABC), there was very little to complain about on the XPS 13. Today, Dell is refreshing the XPS 13 with Skylake, and trying to bring the same amazing design to the XPS 15 and the new XPS 12 2-in-1.

XPS 13



Let's start with the XPS 13, which should be familiar to anyone who read my review of it. Dell has tried to take the XPS 13 and push it to the next level, starting with Skylake. Dell will offer everything from the Core i3-6100U all the way up to the Core i7-6600U. This means the GPU will be the Intel HD 520 model, so unfortunately no Iris options on the XPS 13 range. Memory options start at 4 GB and go up to 16 GB of LPDDR3, and storage gets a bump too. The base 128 GB model is still a SATA based SSD, but the 256 GB, 512 GB, and 1 TB models are all PCIe versions. These are evolutionary updates, but the move to Skylake has also given Dell the opportunity to add Thunderbolt 3 to the XPS 13 through a USB Type-C port, which also supports 10 Gbps USB, VGA, HDMI, Ethernet, and charging.

Battery life was a pretty big part of the Broadwell based XPS 13, and on the 1080p model we got over 15 hours on our light workload. The move to Skylake looks to move that bar even further out with Dell saying the new model is rated at up to 18 hours.

Yes, it is an evolutionary update, but it is an evolutionary update of one of the best notebooks of 2015 so far.

XPS 15



For those that prefer a larger notebook, the XPS 15 has been around for a while now, but when we saw it refreshed back at CES, it was still in the 2014 chassis. Today Dell has brought the look and feel of the Infinity Display to the XPS 15, squeezing a 15.6-inch display into the body of a 14-inch notebook. Let's talk about that display too. It is an UltraSharp 4K Ultra HD model, which comes in at 282 pixels per inch and has a 350-nit brightness rating. The 4K model also covers 100% of the Adobe RGB color space, which is a wider gamut than the typical sRGB space of most notebooks. Dell offers its PremierColor software as well, which maps the smaller sRGB space onto the wider Adobe RGB panel so that colors are not portrayed incorrectly when viewing sRGB content.
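As a quick sanity check on that pixel density figure, PPI is just the pixel count along the panel diagonal divided by the diagonal size in inches:

    import math

    def ppi(h_px, v_px, diagonal_inches):
        # Pixels along the diagonal divided by the diagonal in inches.
        return math.hypot(h_px, v_px) / diagonal_inches

    print(f"{ppi(3840, 2160, 15.6):.0f} ppi")  # ~282, matching Dell's figure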

The XPS 15 has been the model where Dell steps up the performance to a quad-core version, and despite the smaller chassis due to the Infinity Display that has not changed. The base model comes with the Intel Core i3-6100H which is a dual-core 35 Watt part, but you can also get the i5-6300HQ and i7-6700HQ which are both quad-core 45 Watt CPUs. Up to 32 GB of DDR4 is available through two SODIMM slots, and the XPS 15 offers a discrete NVIDIA GTX 960M GPU as well. You can get the XPS 15 with a spinning drive if you want (I wouldn’t recommend it) or, like the XPS 13, Dell offers PCIe SSDs up to 1 TB. Dell is offering either a 56 Wh or 84 Wh battery, and the larger battery on the 1080p version of the XPS 15 is rated for up to 17 hours. For those that need faster wireless connectivity, Dell also offers a 3x3 802.11ac wireless card. Like the XPS 13, it also features Thunderbolt 3 through Type-C. One thing you do lose by moving to a 14-inch chassis is the room for a dedicated number pad.

XPS 12



The final XPS model announced today is the XPS 12, which is a 2-in-1 tablet with a docking keyboard. Think of this as Dell's take on the Surface Pro, but Dell has taken a couple of different paths than Microsoft did on their 2-in-1 tablet. First, the display is a 3840x2160 UHD panel with a smartphone-level 352 pixels per inch, 400 nits of brightness, and 100% Adobe RGB coverage. There is also a 1080p model with sRGB coverage, which should help with battery life. Dell has gone with the latest Skylake version of Core m to power this tablet, with the m5-6Y54 processor, which turbos up to 2.7 GHz. 8 GB of LPDDR3 memory is available, and you can get either 128 GB or 256 GB of SATA SSD storage. Keeping all of this powered is a 30 Wh battery.

We tested the Dell Venue 11 7000, which was similar to this in that it had a docking keyboard, but that keyboard added a lot of weight due to the extra battery inside. Dell has taken a different approach here, and the keyboard base adds just under a pound to the 1.75 lb tablet. The keyboard on the Venue 11 7000 was also not great, whereas the XPS 12 offers a full size backlit keyboard with 1.3 mm of key travel, which should be a lot better. The trackpad is a glass precision trackpad.

The device itself is made of a magnesium alloy and covered in soft touch paint, and the display has Corning Gorilla Glass NBT.

Dell seems to have gone all-in on Thunderbolt, with this tablet featuring not one but two Type-C connectors with Thunderbolt 3. The 45 Watt A/C adapter also connects over the Type-C, and it is great to see Dell embracing this to get rid of the myriad of proprietary charging connectors that have plagued PCs for decades.

Dell XPS

Dell XPS 12
  CPU: Intel Core m5-6Y54 (1.1-2.7 GHz dual-core 4.5W Skylake)
  GPU: Intel HD 515
  Memory: 8 GB dual-channel LPDDR3-1600
  Display: 12.5" 1920x1080 sRGB, or 12.5" 3840x2160 Adobe RGB
  Storage: 128 GB or 256 GB SATA SSD
  Battery: 30 Wh
  Ports: 2x Thunderbolt 3 (Type-C), headset, SD card reader
  Dimensions (tablet): 291 x 193 x 8 mm (11.46 x 7.6 x 0.31 inches)
  Dimensions (tablet plus keyboard): 291 x 198 x 16-25 mm (11.46 x 7.8 x 0.63-0.99 inches)
  Weight (tablet): 790 g (1.75 lbs)
  Weight (tablet plus keyboard): 1.27 kg (2.8 lbs)
  Price: $999+

Dell XPS 13
  CPU: Intel Core i3-6100U (2.3 GHz dual-core 15W Skylake)
       Intel Core i5-6200U (2.3-2.8 GHz dual-core 15W Skylake)
       Intel Core i5-6300U (2.4-3.0 GHz dual-core 15W Skylake, January)
       Intel Core i7-6500U (2.5-3.1 GHz dual-core 15W Skylake)
       Intel Core i7-6600U (2.6-3.4 GHz dual-core 15W Skylake, January)
  GPU: Intel HD 520
  Memory: 4-16 GB dual-channel LPDDR3-1866
  Display: 13.3" 1920x1080 sRGB, or 13.3" 3200x1800 sRGB
  Storage: 128 GB SATA SSD; 256 GB, 512 GB, or 1 TB PCIe SSD
  Battery: 56 Wh
  Ports: 1x Thunderbolt 3 (Type-C), 2x USB 3.0, headset, SD card reader
  Dimensions: 304 x 200 x 9-15 mm (11.98 x 7.88 x 0.33-0.6 inches)
  Weight: 1.2-1.29 kg (2.7-2.9 lbs)
  Price: $799+

Dell XPS 15
  CPU: Intel Core i3-6100H (2.7 GHz dual-core 35W Skylake)
       Intel Core i5-6300HQ (2.3-3.2 GHz quad-core 45W Skylake)
       Intel Core i7-6700HQ (2.6-3.5 GHz quad-core 45W Skylake)
  GPU: Intel HD 530, NVIDIA GTX 960M
  Memory: 8-32 GB dual-channel DDR4-2133
  Display: 15.6" 1920x1080 sRGB, or 15.6" 3840x2160 Adobe RGB
  Storage: 500 GB - 1 TB HDD; 256 GB, 512 GB, or 1 TB PCIe SSD
  Battery: 56 Wh or 84 Wh
  Ports: 1x Thunderbolt 3 (Type-C), 2x USB 3.0, HDMI, headset, SD card reader
  Dimensions: 357 x 235 x 11-17 mm (14.06 x 9.27 x 0.45-0.66 inches)
  Weight: 1.78-2.0 kg (3.9-4.4 lbs)
  Price: $999+

Dell was already at the forefront of notebook design this year, so it's great to see them take that same design and apply it to the XPS 15. The XPS 12 looks to be a decent tablet with a good looking keyboard dock; poor keyboards have been one of the biggest issues with convertible tablets with attachable docks, so I am excited to see this in person and give it a try. The notebooks will be available on Dell.com starting today, and the tablet will be coming in November.

Source: Dell


Read More ...




Netflix, Windows 10 and Metered Network Connections
Readers of our mini-PC reviews would have noticed that our routine involves detailed power consumption tests. Ensuring a level playing field for all the units involves turning off automatic Windows updates (so that we don't have unnecessary processes taking up CPU cycles or downloading of updates consuming network bandwidth and driving up the idle power consumption). On Windows 8.1 and earlier versions, turning off automatic updates was trivial, but Windows 10 presents some challenges.

Our review of the ECS LIVA Core (published yesterday) was the first to make use of Windows 10. Turning off updates in Windows 10 Professional is not too difficult using the Group Policy editor, but Windows 10 Home has only one way to prevent updates from getting downloaded - by setting the network interface as a metered connection. (Update: It is also possible to turn off the Windows Update Service itself in Windows 10 Home) After having configured the WLAN connection on the ECS LIVA Core to be metered, we set about running our benchmarks.
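For anyone scripting this kind of test setup, a saved Wi-Fi profile can also be marked as metered from an elevated command prompt using netsh's cost parameter, rather than clicking through the Settings app. Below is a small Python wrapper as a sketch; the profile name is a hypothetical placeholder, and this assumes netsh's documented cost=Fixed behavior maps to a metered connection:

    import subprocess

    def set_wlan_metered(profile, metered=True):
        # cost=Fixed marks the profile as metered; Unrestricted reverts it.
        cost = "Fixed" if metered else "Unrestricted"
        subprocess.run(["netsh", "wlan", "set", "profileparameter",
                        f"name={profile}", f"cost={cost}"], check=True)

    # set_wlan_metered("TestLabSSID")  # hypothetical profile name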

Testing out Netflix streaming is a part of our evaluation of the HTPC credentials of a system. Firing up our test stream in the Windows 10 Netflix app and bringing up the debug information gave us a nasty shock. Despite being on a 75 Mbps Internet connection, the app was streaming the test title at a measly 235 kbps. Trying to manually set a higher buffering bitrate in the Stream Manager (accessible via Ctrl-Shift-Alt-S) ended up force-quitting the app without any warning.


Windows 10 Netflix App

I immediately searched online and found a Reddit thread dealing with a similar issue. Initially, I thought that the Netflix app was buggy (as a previous version on another Windows 10 PC was able to stream at the maximum bitrate without any issues). In order to rule out the PC's WLAN connection as the culprit, I fired up Microsoft Edge and streamed the test title using the HTML 5 interface for Netflix.


Windows 10 - Microsoft Edge - Netflix HTML 5 Streaming

The title streamed with the maximum video bitrate (as the app does on a non-metered connection), but the audio was only 2.0 at 96 kbps (no Dolby Digital Plus 5.1 stream at 192 kbps).

A few days after finishing up the benchmarking, the OS threw up a message indicating that it had been a long time since it had been able to check for updates. That is when it struck me that the metered connection could have been the culprit for the behavior of the Netflix app. I immediately tested with the network connection set to metered, and also with the setting at its default (non-metered).



As I had guessed, the metered connection indeed turned out to be the issue. I also confirmed that the metered connection setting had no effect on the HTML 5 streaming case using Microsoft Edge.



All in all, the PSA here is that if you set the network connection to metered for any purpose, make sure to turn the setting off when Netflix streaming at the highest bitrate is required through the Windows 10 app.


Read More ...




The Andyson N500 Titanium PSU Review: High Efficiency For The Common PC
80Plus Titanium certified PSUs are very rare, and most are high output units that are not meant for normal PCs. Andyson is making a surprise move with the release of their N Titanium series, which is targeted towards mainstream users and comes with reasonable price tags. Today we are having a look at the N500 Titanium, the 500W variant of the series, a unit ideal for typical home and gaming PCs.


Read More ...




The Seagate Game Drive For Xbox: Do You Need A Green Drive?
When Seagate first announced the Game Drive for Xbox, they chose Gamescom as the venue to launch the new product. With the prominent Xbox branding, this is not a drive they would expect you to purchase for use with your MacBook. Not that you couldn't of course, but they seem to have a pretty clearly defined target market for the Game Drive.

When Microsoft first launched the Xbox One, it came with a 500 GB internal drive, of which about 360 GB is available for the end user. Since that time, they have also released models with 1 TB of internal storage. It is unfortunate that the Xbox One does not feature an easily replaceable drive, but a few months after release they added support for external USB drives. The Seagate Game Drive for Xbox is specifically designed for this role. It has been certified by Microsoft for use with the Xbox, but mostly it has been branded with Xbox and colored Xbox green.


The package comes with just the drive and an 18-inch micro USB 3.0 cable. The drive is pre-formatted with NTFS, and if you want you can of course use it as a USB 3.0 storage drive on a PC as well - maybe you are an Xbox fan. The drive is powered over the USB cable too, so there are no other connections necessary.

Configuring the drive for use with the Xbox is as easy as plugging it in. The Xbox will detect it and ask if you want to format it for use with games and content, and you just say yes. Easy as that.

Once installed on the Xbox it will ask you if you want to use it as the default save location for new downloads and game installs. The Xbox does not pool this drive with the internal storage, which makes it slightly more complicated to use, since if you want your existing games on the new drive you have to move them over; pooling would make this seamless for the end user. The advantage of keeping the drive separate, though, is that you can bring your Game Drive with you to another Xbox and be guaranteed that your games are there, so you can just start playing.


As someone who has been around the field for a while, it continues to amaze me how much storage they can fit into a magnetic platter. The Game Drive for Xbox is 2 TB (calculated in base 10 for reasons only marketing would be able to explain) and adds a significant amount of storage to the Xbox One without being bulky. It is very small and quiet, being a laptop based drive. There are no cooling fans to fail, or ramp up in volume.


Performance of the Game Drive for Xbox is quite good for a small spinning drive. Read and write speeds for sequential files are over 130 MB/s, and being a spinning disk, random workloads fall well below that maximum.
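A back-of-the-envelope calculation puts filling the entire drive at that sequential speed within an evening's reach (using the marketed base-10 capacity):

    def hours_to_fill(capacity_tb=2.0, mb_per_s=130.0):
        # 2 TB marketed = 2e12 bytes; ~130 MB/s sustained sequential writes.
        return capacity_tb * 1e12 / (mb_per_s * 1e6) / 3600

    print(f"{hours_to_fill():.1f} hours")  # roughly 4.3 hours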


The physical device measures 117mm x 80mm x 14.8mm and weighs 170 grams. It is actually smaller in person than I expected from the product shots. The green case is only green on the top, and black underneath. I like the green, but considering the Xbox One is generally black, it kind of stands out and doesn’t really match any home theatre equipment.

So do you need a green USB hard drive for the Xbox? No, of course not. But considering the package is only a couple of dollars more than Seagate's standard 2.5-inch USB 3.0 drive, I appreciate them trying to reach out to a different audience than those who just want a basic black hard drive. And in case you were wondering, it also works with the Xbox 360.


Read More ...




Legere Blasts Microsoft for "Bull***t" Snub of T-Mobile and Verizon
Microsoft's pitch to carriers has been apathetic if reports are to be believed; AT&T favoritism allegedly lives on

Read More ...






