Thursday, May 14, 2015

IT News Headlines (AnandTech) 5/15/2015

AnandTech



Microsoft Details Windows 10 Editions
In a blog post on the Windows Blog today, Tony Prophet outlined the various editions of Windows 10 that will be available this summer. And while there are a lot of different versions for various markets, on the consumer side it is not too bad. Once you step into the more esoteric Stock Keeping Units (SKUs) it does get a bit crazy though.

The version most consumers will be seeing is Windows 10 Home. This will be the main version sold on all consumer desktops, laptops, and larger tablet form factors. It will include Continuum to transform from a desktop operating system into the touch-friendly tablet experience, and of course include things like Cortana. Although not specified, if it continues the tradition of Windows, Home will not be able to join a domain and will be missing the business features.


Is the date on this image a hint as to launch date? Likely not...

The other version that consumers will see is Windows 10 Mobile, which is the version that will be installed on phones and smaller tablets. In some ways this is Windows RT’s successor, since it will be restricted to Universal Windows Apps, which must be installed from the store. However, it will certainly run on x86 as well as ARM, so it is not a direct successor to Windows RT. The version of Continuum available for this SKU will allow a phone or small tablet to be hooked up to a keyboard, mouse, and display, and used like a desktop PC. Although it will be restricted to Universal Windows Apps, these apps will be able to provide the desktop experience, so, as an example, Mail will expand out to the same version you would see on a PC.


That is really it for the consumer side. Windows Mobile has been resurrected, and therefore Windows Phone is no more. Since this version will be available on more than phones though, it does make sense, and it is not like Windows Phone as a brand has been very successful anyway (don’t yell at me – I use one).

On the business side, there are more choices, but these again mirror what is already available today. Windows 10 Pro is the next step up from Home; it will be similar to Home but add features like the ability to join an Active Directory domain. Also coming is Windows Update for Business, a new feature announced at Microsoft’s Ignite conference last week. This will expand upon what is already available through Windows Server Update Services (WSUS) and add some new capabilities that we have already seen in the Technical Preview for Windows 10. By that I mean it will support distribution rings, so that updates can be rolled out easily in waves, and the most critical infrastructure can be saved for last to ensure there are no quirks. Maintenance windows can also be specified, and just like with Windows 10, updates can be delivered peer to peer, which should save a lot of bandwidth at remote locations.

The next version available is strictly for volume license customers, and that is Windows 10 Enterprise. This will have access to a Long Term Servicing Branch, as well as the aforementioned Windows Update for Business. There will be additional deployment options for this version as well to assist large companies with imaging and other methods of deployment.


The final business SKU is actually Windows 10 Mobile Enterprise, which is certainly a new branch. This will also be available to volume license customers, and will offer additional security and mobile device management options.

There will also be an education version, aptly called Windows 10 Education, which is designed to “meet the needs of schools, staff, administrators, teachers, and students” and it will be available through the academic volume licensing program. What this version offers that the others do not is unclear at this time, but assuming the price is right it may be something to combat Chromebooks’ popularity in schools. But we will have to wait and see.

Finally, there will be a version of Windows 10 for embedded devices like ATMs and such, as well as the Windows 10 IoT Core which will be available for devices like Raspberry Pi.

There is a pile of SKUs here, but this is actually not that unusual for Windows. There are certainly some new versions available, and although it is a bit confusing, it’s not too bad. As far as upgrades go, Microsoft has stated again that Windows 10 will be a free upgrade for qualifying Windows 7, Windows 8.1, and Windows Phone 8.1 devices for the first year after Windows 10 launches. There will be no fees later to keep Windows 10 going on that device. It will be upgraded and then kept current with whatever Windows 10 brings in future updates.

If you have any questions sound off in the comments and I’ll do my best to provide answers.

Source: Windows Blog



Read More ...




Blu-ray Disc Association Completes Ultra HD Blu-ray Specification
Yesterday the Blu-ray Disc Association formally completed the Ultra HD Blu-ray specification. The specification has been under development for some time, with the first information about it being released in September of last year. The new specification allows for higher resolutions, a greater range of colors, and larger capacity discs in order to store a new generation of Ultra HD content.

The biggest point of the new Ultra HD Blu-ray specification lies in its name. Ultra HD Blu-ray will support the 3840x2160 Ultra HD resolution that has become standard across so-called "4K" or Ultra HD televisions. That being said, an increase in resolution is not the only important part of the Ultra HD Blu-ray spec. The Ultra HD content standard, more formally known as BT.2020, defines various aspects that go beyond resolution, including color gamut, color bit depth, and frame rate.



In my view, the most important aspect of the BT.2020 standard is the use of the Rec. 2020 color gamut. The color gamut that has been used for basically all picture and video content for quite some time now is called Rec. 709 or sRGB. sRGB is actually quite a narrow gamut, and covers a lower overall number of colors than even the NTSC (1953) gamut that was used for video content before it. The Ultra HD specification uses the much larger Rec. 2020 color gamut, which will allow colors of greater saturation to be reproduced. You can see this in the image above, with sRGB being the smaller triangle, and Rec. 2020 being the larger triangle that surrounds it.

In order to support the larger Rec. 2020 color gamut without introducing color banding, a higher bit depth is required. This is because a greater number of discrete colors is needed to display gradations that span a greater range of saturations. Ultra HD Blu-ray supports 10-bit per channel color depth for content that uses Rec. 2020 for its color encoding. This moves the number of possible colors that can be displayed from approximately 16.7 million to 1.07 billion. I think it would have been better to use 10-bit color for sRGB content and 12-bit color for Rec. 2020 content, as current 8-bit sRGB content can already experience noticeable color banding, but it looks like the additional space and hardware support required have not been deemed worth it.
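The jump from 16.7 million to 1.07 billion colors is simply the per-channel value count cubed across the three RGB channels; a quick sketch of the arithmetic:

```python
# Total displayable colors for an RGB signal at a given per-channel
# bit depth: each of the three channels has 2^bits possible values.
def color_count(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(f"{color_count(8):,}")   # 8-bit:  16,777,216 (~16.7 million)
print(f"{color_count(10):,}")  # 10-bit: 1,073,741,824 (~1.07 billion)
```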

While the new Ultra HD Blu-ray standard supports the existing 50GB capacity for Blu-ray discs, there will be discs of greater capacity for content that requires higher bitrates. 50GB discs will have video encoded at up to 82Mbps, while 66GB discs can support up to 108Mbps, and 100GB discs support 128Mbps. In order to encode videos with these high resolutions, bitrates, and greater color depth, Ultra HD Blu-ray will make use of HEVC video encoding.
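For a rough sense of how those capacity and bitrate tiers line up, here is a back-of-the-envelope calculation (my own illustration, not something from the spec) of the maximum playtime if video ran at each tier's peak bitrate continuously, ignoring audio tracks and disc overhead:

```python
# Approximate max playtime per disc tier at its peak video bitrate.
# Uses decimal units (1 GB = 10^9 bytes), as disc capacities are marketed.
def playtime_minutes(capacity_gb, bitrate_mbps):
    capacity_bits = capacity_gb * 1e9 * 8          # capacity in bits
    seconds = capacity_bits / (bitrate_mbps * 1e6)  # drain at peak bitrate
    return seconds / 60

for gb, mbps in [(50, 82), (66, 108), (100, 128)]:
    print(f"{gb}GB @ {mbps}Mbps: ~{playtime_minutes(gb, mbps):.0f} minutes")
```

Interestingly, the 50GB and 66GB tiers both land around 80 minutes at their peak bitrates, while the 100GB tier stretches past 100 minutes, which suggests why the larger discs exist for feature-length films at the highest quality.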

While the appeal of physical media such as Blu-ray is in decline due to the rise of streaming media, it's still the go-to for users who care about having the highest possible visual quality. It will definitely take time for Ultra HD Blu-ray to be adopted in the market, and possibly longer for Ultra HD TVs that actually support the Rec. 2020 color space. It will be interesting to see where the market for movies and TV shows moves in the future, and what position physical media will be in at that time.


Read More ...




The Truth About SSD Data Retention
In the past week, quite a few media outlets have posted articles claiming that SSDs will lose data in a matter of days if left unpowered. While there is some (read: very, very little) truth to that, it has created a lot of chatter and confusion in forums, and even I have received a few questions about the validity of the claims. Rather than responding to individual emails/tweets from people who want to know more, I thought I would explain the matter in depth to everyone at once.

First of all, the presentation everyone is talking about can be found here. Contrary to what some sites reported, it's not a presentation from Seagate -- it's an official JEDEC presentation from Alvin Cox, the Chairman of the JC-64.8 subcommittee (i.e. the SSD committee) at the time, meaning that it's supposed to act as an objective source of information for all SSD vendors. It is, however, correct that Mr. Cox works as a Senior Staff Engineer at Seagate, but that is irrelevant because the whole purpose of JEDEC is to bring manufacturers together to develop open standards. The committee members and chairmen all work for some company, and currently the JC-64.8 subcommittee is led by Frank Chu from HGST.


Before we go into the actual data retention topic, let's outline the situation by focusing on the conditions that must be met when the manufacturer is determining the endurance rating for an SSD. First off, the drive must maintain its capacity, meaning that it cannot retire so many blocks that the user capacity would decrease. Secondly, the drive must meet the required UBER (number of data errors per number of bits read) spec as well as be within the functional failure requirement. Finally, the drive must retain data without power for a set amount of time to meet the JEDEC spec. Note that all these conditions must be met after the maximum amount of data has been written, i.e. if a drive is rated at 100TB, it must meet these specs after 100TB of writes.


The table above summarizes the requirements for both client and enterprise SSDs. As we can see, the data retention requirement for a client SSD is one year at 30°C, which is above typical room temperature. The retention does depend on the temperature, so let's take a closer look at how the retention scales with temperature.


EDIT: Note that the data in the table above is based on material sent by Intel, not Seagate.

At a 40°C active and 30°C power-off temperature, a client SSD is specified to retain data for 52 weeks, i.e. one year. As the table shows, the data retention is proportional to active temperature and inversely proportional to power-off temperature, meaning that a higher power-off temperature will result in decreased retention. In a worst case scenario where the active temperature is only 25-30°C and the power-off temperature is 55°C, the data retention can be as short as one week, which is what many sites have touted with their "data loss in a matter of days" claims. Yes, it can technically happen, but not in a typical client environment.
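The scaling in that table follows an Arrhenius-style temperature dependence. As a rough sketch (my own illustration, not taken from the JEDEC material), assuming an activation energy of about 1.1 eV, a figure commonly cited for charge loss in NAND, the relative speed of charge loss between two storage temperatures can be estimated like this:

```python
import math

K_B = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_ref_c, t_hot_c, ea_ev=1.1):
    """How many times faster charge leaks at t_hot_c than at t_ref_c,
    under a simple Arrhenius model (Ea is an assumed figure)."""
    t_ref = t_ref_c + 273.15  # convert to Kelvin
    t_hot = t_hot_c + 273.15
    return math.exp((ea_ev / K_B) * (1 / t_ref - 1 / t_hot))

# Charge loss at a 55°C power-off temperature vs. 30°C:
print(f"~{acceleration_factor(30, 55):.0f}x faster")
```

A factor in the ballpark of 25x is consistent with the table's drop from 52 weeks of retention at a 30°C power-off temperature to just a few weeks at 55°C.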

In reality, a power-off temperature of 55°C is not realistic at all for a client user because the drive will most likely be stored somewhere in the house (closet, basement, garage, etc.) at room temperature, which tends to be below 30°C. The active temperature, on the other hand, is usually at least 40°C because the drive and other components in the system generate heat that puts the temperature above room temperature.


As always, there is a technical explanation for the data retention scaling. The conductivity of a semiconductor scales with temperature, which is bad news for NAND because when it's unpowered the electrons are not supposed to move, as that would change the charge of the cell. In other words, as the temperature increases, the electrons escape the floating gate faster, which ultimately changes the voltage state of the cell and renders the data unreadable (i.e. the drive no longer retains data).

For active use, the temperature has the opposite effect. Because higher temperature makes the silicon more conductive, the flow of current is higher during program/erase operations and causes less stress on the tunnel oxide, improving the endurance of the cell, because endurance is practically limited by the tunnel oxide's ability to hold the electrons inside the floating gate.

All in all, there is absolutely zero reason to worry about SSD data retention in a typical client environment. Remember that the figures presented here are for a drive that has already passed its endurance rating, so for new drives the data retention is considerably higher, typically over ten years for MLC NAND based SSDs. If you buy a drive today and stash it away, the drive itself will become totally obsolete quicker than it will lose its data. Besides, given the cost of SSDs, it's not cost efficient to use them for cold storage anyway, so if you're looking to archive data I would recommend going with hard drives for cost reasons alone.


Read More ...




Buzzfeed, NatGeo, NBC, and NYT Pay to Push Stories to Your Facebook Feed
Facebook will now automatically post news from its partners, in exchange for a cut of the proceeds

Read More ...




Seagate Senior Researcher: Heat Can Kill Data on Stored SSDs
Average "shelf life" in climate control is around 2 years; but drops to just 6 months if your storage site hits 95° F / 35° C

Read More ...






Available Tags: Microsoft, Windows, Blu-ray, SSD, Facebook, Seagate
