Wednesday, February 25, 2009

IT News Headlines (InfoWorld) 25/02/2009



VMware shows virtualization on the mobile phone

We've had virtual storage, virtual machines, and virtual datacenters. Now VMware is looking to the next frontier: the virtual mobile phone.

The company has announced the Mobile Virtualization Platform (MVP), aimed at handset vendors, which would let mobile users choose between two different platforms or phone numbers on a single handset.

[ In other news from VMworld, VMware added to its cloud strategy | Track the latest trends in virtualization in InfoWorld's Virtualization Report blog. ]

Speaking at the VMworld conference, VMware chief technology officer Steve Herrod said the platform would let users who carry separate phones for work and for home use both from a single device.

"The way that I envisage it working would be that IT departments would outline a set of policies and a particular software build," Herrod said. In essence, this means that users would be able to bring any device into the workplace and the IT team could apply any particular workplace policies to it. "It could even be the case that one device could have multiple operating systems running, so that you see Symbian and Android in a single view," he said.

The software for MVP is a result of VMware's acquisition of Trango Software. Herrod said that although MVP has sprouted from Trango's original work, a VMware team had been working with the Trango engineers to fully incorporate it into VMware's product line.

He said that the software was now available for handset vendors to evaluate, but he couldn't specify when products might be available, as that was in the hands of the vendors.

Herrod warned that there were still some issues to be tackled. "Power management is a big issue. We've done power management on the desktop, but nothing like the amount we'll have to do on the mobile phone."

There are other issues to be tackled, however. Julia Austin, senior director for R&D, said that one of the biggest challenges was working with carriers. "I think it depends who you're talking to: some welcome the opportunity to simplify the software process, some are more defensive." She said that billing in particular could prove to be a hurdle to overcome, although again she said that some carriers were more responsive than others.

One application where MVP could prove to be useful is for users who have multiple SIM cards from different countries' operators. "Think of it as a way of putting a new SIM card into your phone when you change country -- without needing a SIM card," she said.

Techworld is an InfoWorld affiliate.

Read More ...
Adobe to patch Flash vulnerabilities for three platforms

Adobe Systems has updated its Flash multimedia software to eliminate five flaws affecting Windows, OS X, and Linux systems.

The update fixes a critical flaw which could cause a PC to be hacked merely by viewing a malicious SWF (Shockwave Flash) file, according to Adobe's advisory.

[ Learn how to secure your systems with Roger Grimes' Security Adviser blog and newsletter, both from InfoWorld. ]

Flash vulnerabilities are particularly dangerous due to the widespread use of the graphics format across the Internet for rich Web pages and banner advertisements. Most Web browsers have the Flash player plugin installed, which makes it an attractive target for hackers.

Online advertising networks have struggled to keep malicious Flash advertisements off their networks, as they are often difficult to detect.

Victims of a Flash attack are usually duped via a social engineering trick or by viewing malicious content injected into a trusted site, according to a warning from iDefense, the security branch of VeriSign.

Two of the other Adobe updates address potential problems with "clickjacking," a difficult but powerful hack that lures a victim into clicking on a certain place on a Web page in order to enable an attack.

The other two updates fix a potential denial-of-service condition caused by an input validation problem, and the remaining one fixes an information disclosure problem on Linux systems.

The most up-to-date Flash player for most users is 10.0.22.87. Other versions are available for users of AIR or Flash CS3 Professional: Adobe has published a chart in its advisory listing the upgrades.

Adobe has a Web page that will automatically display what Flash version a computer is using. Flash also has an auto-update system that will prompt a user that it's time to upgrade.

The latest Flash problems come as Adobe is grappling with another serious vulnerability in its Acrobat and Reader products, used for reading PDF (Portable Document Format) files, that affects both Apple and Windows users.

The flaw can allow an attacker to take over a computer if someone opens a malicious PDF file. Adobe has said it will issue a patch by March 11, but security experts have cautioned it leaves a wide time window for attacks.

Security vendor Sourcefire has said it has traced PDF attacks going back to Jan. 9. The company has issued an unsupported temporary patch.




Read More ...

Microsoft tweaks Experience Index for Windows 7

Microsoft has changed the PC performance rating tool Windows Experience Index for Windows 7 to better measure faster graphics cards, multicore processors, and drives.

The index's top score will go up from 5.9 to 7.9, and several new tests will be added to more accurately measure the performance of hard-disk and solid-state drives, according to a mid-January post at the Engineering Windows 7 blog.

[ From InfoWorld's Test Center: Windows 7 benchmarks unmasked | Special report: Early looks at Windows 7. ]

Windows Experience Index, first introduced with Windows Vista, is intended to help users discover which parts of their system need to be upgraded for Windows and applications to run well, or whether the PC needs to be replaced.

Reviews of the first public beta of Windows 7 indicate that it generally runs faster and more smoothly than Vista, despite the two sharing a very similar codebase.

But critics have already begun questioning the revamped index's usefulness and accuracy. One beta tester, going by the handle 'Hurricane Andrew' on Microsoft's MSDN developer Web site, complained that an older hard drive using the slower IDE interface was awarded a much higher rating than his newer, larger hard drive using the faster SATA-II interface.

"I hardly believe that's accurate," he wrote.

Others complained that the new scale, from 1.0 to 7.9, was counterintuitive, or that the criteria for drive performance should not have changed between Vista and Windows 7 for consistency's sake.

Michael Cherry, an analyst with the independent firm Directions on Microsoft, said he "doesn't put much stock" in the index's scores.

A Microsoft representative said in an e-mail that the company was "closely monitoring" input from beta testers about Windows 7, including on the new index, but would not say if changes would result from the feedback.

Windows 7 apps may not run faster on quad-cores

The Windows Experience Index, found under the System icon in Vista or Windows 7's Control Panel, quickly scans hardware before delivering five results: processor, memory (RAM), graphics for general desktop work, gaming graphics performance, and the primary hard drive's performance. The results are based on the rated specifications of each component, not on its actual performance history in the scanned PC.

Because PC performance is often determined by the speed of the slowest-performing component, the index's "base score" is defined by the lowest of the five scores, rather than an average of all five.
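The lowest-score rule described above is simple to express; the sketch below illustrates it (the component names and sample values are illustrative, not taken from Microsoft's actual scoring code):

```python
# Illustrative sketch of the Windows Experience Index "base score" rule:
# the base score is the LOWEST of the five subscores, not their average.
def base_score(subscores):
    """Return the base score given a dict of component subscores."""
    return min(subscores.values())

scores = {
    "processor": 7.1,
    "memory": 5.9,
    "desktop_graphics": 6.8,
    "gaming_graphics": 6.5,
    "primary_disk": 5.3,  # the slowest component sets the base score
}

print(base_score(scores))  # -> 5.3
```

So a PC with one slow component (here the disk) reports a 5.3 base score even though every other subscore is higher.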

PCs with a base score of between 1.0 and 2.9 can run Office applications and surf the Web, but not play games and videos or use Vista and Windows 7's Aero graphical user interface, Microsoft says.

Computers with a base score in the 3.0-range should be able to run Aero and most of Vista and Windows 7's new features, while those with scores in the 4.0 to 5.0-range should be able to enjoy high-definition (HD) video and 3-D gaming.

An analysis by the Malaysian technology blog TechARP found that dual-core CPUs that were scored a maximum 5.9 in Vista's Windows Experience Index will now be scored between 6.0 and 6.5.

"Well-performing" triple-core CPUs, which were not available during Vista's release, will be scored between 6.3 and 6.9, while quad-core CPUs will score somewhere above 7.0, TechARP said.

Microsoft is setting 7.9 as the performance benchmark for an 8-core CPU with simultaneous multi-threading (SMT). Eight-core CPUs, such as future versions of Intel's Nehalem chips, aren't even available for PCs yet.

TechARP said that Windows 7 PCs sporting triple and quad-core CPUs will excel at multi-core enabled applications such as very large Excel spreadsheets, sophisticated graphics rendering, software compiling and scientific applications.

However, it said such apps remain in the definite minority today, meaning that most software won't feel any zippier to users when run on higher-scoring 3- to 4-core systems.

In another change, Windows 7's Experience Index will not only measure how fast disk drives are at reading data, but how fast they are at writing large and small blocks of data. This will be especially helpful for measuring the true performance of solid-state disks (SSDs), which currently read data much more quickly than they write it.

On memory, Windows 7 will continue to base the score primarily on the RAM's speed, while limiting the score for lower amounts of RAM. PCs will need more than 3GB of memory to gain scores greater than 5.5.

To gain a gaming graphics score in the 6.0- to 6.9-range, PCs need to support DirectX 10 graphics and deliver between 40 and 50 frames per second of gaming or video at 1,280x1,024 resolution, says Microsoft's Engineering Windows 7 blog.

Computerworld is an InfoWorld affiliate.

No news on Windows 7 Upgrade Advisor

The Windows Experience Index is different from the Windows Vista Upgrade Advisor, a free application that is downloadable from Microsoft's site. With it, users can determine which version of Vista their Windows XP PCs can support, and whether components or software need to be upgraded.

Microsoft would not say exactly whether it will release a Windows 7 Upgrade Advisor. A representative said in an e-mail: "Microsoft is investing in tools like the Windows Upgrade Advisor to help customers assess application compatibility. However, we have no additional information to share at this time."

Customers eager to find out today if their current PC is Windows 7-capable can use the Vista Upgrade Advisor, as "Windows 7 is being designed to run as well as or better than Windows Vista," she said. "As a result, there aren't new requirements that machines must meet in order to run Windows 7 well and deliver a truly great experience."

Consumers interested in finding out the Windows Experience Index score of various PC models and components can consult sites such as WindowsScores.com for laptops or DriverMax.com for video cards and CPUs. They can also upload their index score to ShareYourScore.com.


Read More ...

After CERT warning, Microsoft delivers AutoRun fix

Microsoft is pushing out a software update to some Windows users that fixes a bug in the Windows AutoRun software, used to automatically launch programs when DVDs or USB devices are introduced to the PC.

The bug fix, delivered through Microsoft's standard automatic update systems, comes one month after the U.S. Computer Emergency Readiness Team (US-CERT) issued a security alert warning that Windows did not properly disable AutoRun on Windows 2000, XP, and Server 2003.

[ Learn how to secure your systems with Roger Grimes' Security Adviser blog and newsletter, both from InfoWorld. ]

"Disabling AutoRun on Microsoft Windows systems can help prevent the spread of malicious code," CERT said in its advisory. "However, Microsoft's guidelines for disabling AutoRun are not fully effective, which could be considered a vulnerability."

Microsoft had said that technical users could disable AutoRun by setting a Windows Registry value called NoDriveTypeAutoRun to 0xFF. The problem was that, even with this value set, some versions of Windows would launch AutoRun programs whenever the user clicked on a device's icon using Windows Explorer.
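The workaround Microsoft described corresponds to a registry change along these lines (a sketch of the documented setting in .reg export format, applied machine-wide; as the article notes, before the patch this setting was not honored in every code path on some Windows versions):

```
Windows Registry Editor Version 5.00

; 0xFF disables AutoRun for all drive types
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\Explorer]
"NoDriveTypeAutoRun"=dword:000000ff
```

Each bit in the value disables AutoRun for one drive type; 0xFF sets them all.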

That could mean big trouble for some users, as the widespread Conficker worm uses AutoRun to spread from USB devices to PCs.

There has been some internal debate within Microsoft as to whether Windows should enable AutoRun by default, since the software can be misused. AutoRun helped install the notorious Sony rootkit copy-protection software on users' PCs four years ago.

Although Microsoft describes its fix as a nonsecurity update, the patch "certainly does have security implications," said Ben Greenbaum, a senior research manager with Symantec Security Response. "It allows users who were expecting -- with good reason -- a certain level of protection out of the feature to actually get that level of protection."

It turns out that Microsoft had actually produced a patch for the issue, which users could download themselves, as far back as May 2008. It had also pushed out a July update that fixed the problem for Vista and Server 2008; but this fix was not automatically updated for Windows 2000, XP and Server 2003 users until Tuesday.


Read More ...

The Windows-versus-Linux server face-off

Linux certainly has established itself as a prominent server OS these days, pushing Unix into the background. But the open source OS shares the stage with commercial software giant Microsoft, which remains a dominant player with Windows Server.

Gartner research published this month found the server OS market shaping up as a battle between Windows Server and Linux. Gartner in other research also has found both OSes on a growth track in terms of revenue. "There still seems to be plenty of robust interest in deploying on Windows, but Linux is still very key," says Gartner analyst George Weiss.

[ The InfoWorld Test Center rates Windows Server 2008. | Why Linux is greener than Windows Server. | Has Linux killed OpenSolaris? ]

A lot of Linux usage is in Web server applications, but it's become increasingly common in mission-critical applications, Weiss notes. But "I don't have an indicator that says Linux is chewing up the market for Windows," he adds.

Other forms of Unix continue to fade away in what is becoming a two-OS choice for IT. "The key here is that really Linux and Windows are moving away from the pack here and it's becoming a two-horse race," says Jim Zemlin, executive director of the Linux Foundation.

Both Linux and Windows Server see datacenter growth
Regarding migration of current workloads, 43 percent of respondents in a Gartner survey at a Linux-oriented conference anticipated migrating mostly from Unix to Linux, 13 percent said they would migrate mostly from Windows to Linux, and only 4 percent said they would switch off Linux to go to Windows. Twenty-one percent had no plans to migrate workloads.

Gartner expects IT organizations to shift their focus to more-complex Linux deployments and continue a trend of migration from Unix. Gartner found that 52 percent of respondents anticipate that the total workload of their Linux server environment will increase moderately in 2008; another 25 percent said there would be a substantial increase. Only 5 percent anticipated moderate decrease, while 4 percent expected a substantial decrease in Linux workloads for this year. Respondents were three times more likely to migrate workloads from Unix to Linux than from Windows to Linux.

Although Linux growth is strong, so is that of Windows Server, according to Gartner's research. Linux was ranked by 39 percent of respondents as the OS expected to have the most growth in their datacenters during the next five years. Windows was a close second, ranked as the OS with the most growth potential by 35 percent of respondents at the Linux-oriented conference.

Based on Gartner's annual estimates for worldwide server shipment revenues, both Windows Server and Linux will increase. Windows Server sales will move from about $20 billion last year to roughly $22 billion in 2012; Linux will grow from about $9 billion last year to $12 billion in three years. But because Linux is often provided at no cost (with vendors making revenues from support contracts and other services), those numbers may not be comparable.

Roy Schestowitz, an ardent supporter of Linux and ardent opponent of Microsoft who runs the Boycott Novell Web site, argues that Linux gets shortchanged in surveys on market share because only "sold" OSes are counted -- and often just those sold as part of server hardware by major companies such as Dell Computer, Hewlett-Packard, and IBM.

Is it really a race?
With both OSes growing, should IT be thinking of Linux and Windows Server as either/or propositions? No.

Linux provider Red Hat sees heterogeneity ruling the day, with users deploying both Linux and Windows Server. Linux already has a large base in Web deployments but is expected to move into high-end database and enterprise application deployments, says Nick Carr, Red Hat's marketing director. Windows, meanwhile, has a large base as a server for Exchange Server, SQL Server, and file and print deployments, he notes.

"Nobody has a sort of homogeneous world anymore," Carr notes. "People tend to think that one grows at the expense of the other but that's not what's happening at the moment." That's why Red Hat and Microsoft recently agreed to let Red Hat Linux users run Windows servers in virtual machines and let Windows Server users run Red Hat Linux in VMs. "Increasingly, such servers that run in mixed environments rely on virtualization," notes Linux proponent Schestowitz.

CRIS Camera Services is an example of the mixed Linux-Windows Server environment that will keep both OSes in demand. At CRIS, Linux gets the nod for running PHP, MySQL, and Apache software, says Josh Treadwell, the company's IT director. But it relies on Windows Server for its Microsoft Dynamics and SharePoint applications. And its use of Windows Server benefits from the wide availability of Windows training certifications. "We have found college education circles around Microsoft languages," he says, noting there is no central certification for Linux.

The cost question
Possible reasons for moving to Linux include antipathy toward Microsoft and the perception that Linux is cheaper in terms of license fees, says Gartner's Weiss. Linux has an inside track with startups as well as with larger ventures such as Google, he notes -- two environments where cost or "anyone but Microsoft" concerns are key.

But the Linux financial advantage may not be real, Weiss says. When adding up the numbers for Linux deployments in a larger scalable environment, he does not see much difference among Linux, Unix, and Windows once you factor in the costs to achieve high availability, implement a global file system, and get technical support. Also, equipment expenses are a wash between Linux and Windows, he says: "Windows and Linux can run on the same hardware."

The Linux Foundation's Zemlin argues that Linux is in fact cheaper to use than Windows. One reason is the lack of licensing costs for Linux. Another is that Linux runs across a much wider variety of systems than the predominantly Intel x86-based Windows platform, he says, so you get an economy of scale across a mix of hardware. In addition to x86 servers and blades, both of which run Windows, Zemlin notes that Linux runs on mainframes, IBM Power systems, and other Unix-oriented hardware. "Linux can be a very cost-effective common denominator among these systems," he contends.

Some organizations may see cost savings from running free, unsupported Linux distros, but Gartner's Weiss says that is foolishly dangerous. A major outage or security breach without immediate access to a Linux support provider can easily wipe out any money saved from relying only on yourself. (Windows Server support is also needed, and it too requires paying a support provider.)


Read More ...

Update: Attackers targeting unpatched vulnerability in Excel 2007

Microsoft's Excel spreadsheet program has a zero-day vulnerability that attackers are exploiting on the Internet.

A zero-day vulnerability is one that does not have a patch and is actively being used to attack computers when it is publicly revealed. Microsoft said Tuesday that it plans to patch the issue, but did not say when. The company's next set of security patches is scheduled for release March 9.

[ Learn how to secure your systems with Roger Grimes' Security Adviser blog and newsletter, both from InfoWorld. ]

"At this time, we are aware only of limited and targeted attacks that attempt to use this vulnerability," wrote Microsoft spokesman Bill Sisk in a blog posting. "We are developing a security update for Microsoft Office that addresses this vulnerability."

The vulnerability affects Microsoft Office 2007, Microsoft Office 2003, Microsoft Office 2002, and Microsoft Office 2000. It also affects the following Mac products: Microsoft Office 2008, Microsoft Office 2004, and the Mac's Open XML File Format Converter.

It was first disclosed Monday in an advisory posted on SecurityFocus, a Symantec-run Web site that tracks software flaws.

The vulnerability can be exploited if a user opens a maliciously crafted Excel file, allowing a hacker to run unauthorized code. Symantec has found that the exploit can leave a Trojan horse, which it calls "Trojan.Mdropper.AC," on the infected system.

That Trojan, which works on PCs running the Vista and XP operating systems, is capable of downloading other malware to the computer.

Like another zero-day bug in Adobe's Reader and Acrobat software, this flaw is being exploited in dozens of targeted attacks, where victims are sent specially crafted messages tailored to make them open the malicious document, according to Vincent Weafer, vice president of Symantec Security Response. "We're seeing a lot of them come around different Asian government or industries or defense contractors," he said.

Hackers have increasingly sought out vulnerabilities in applications as Microsoft has put much effort into making its Vista OS more secure.




Read More ...

Enterprise architecture groups merge

Two industry organizations focused on enterprise architecture have merged.

The Association of Open Group Enterprise Architects (AOGEA), an affiliate of The Open Group, has merged with the Global Enterprise Architecture Organization (GEAO), the two groups announced this week.

The merged organization now has more than 9,000 members in 72 countries and will focus on advancing enterprise architecture, such as SOA, by helping to develop standards of "excellence and ethics" for members, according to a statement from the groups.

[ Earlier this month, The Open Group upgraded its enterprise architecture offering. ]

"From my point of view, there are a lot of complementary strengths in both organizations," said Allan Brown, president and CEO of The Open Group.

While GEAO has more businesspeople who use IT, AOGEA members tend to be enterprise architects who are more IT-focused. The Open Group has sponsored TOGAF (The Open Group Architecture Framework), which has been used by more than 60 percent of Fortune 50 and 80 percent of global Forbes 50 companies, according to The Open Group.

"I see [the merger] as the maturing of the profession and starting to add much more of the business goals to the profession," Brown said.

GEAO will get access to a huge base of members from The Open Group, said Ben Ponne, GEAO president.

Although GEAO becomes part of AOGEA, both organizations will retain their names and identities for the foreseeable future.

Read More ...
Juniper's datacenter answer to Cisco: Stratus Project

Juniper said Tuesday it is partnering with server, storage, and software companies to develop a converged datacenter fabric under a multiyear project that will compete with Cisco's largely solo effort.

Juniper's Stratus Project is a year old and comprises six elements: a datacenter manager, storage, compute, layer 4-7 switching, appliances, and networking. It is intended to be a flat, non-blocking, lossless fabric supporting tens of thousands of Gigabit Ethernet ports, an order of magnitude reduction in latency, no single point of failure, and with security tightly integrated and virtualized.

[ Today, Juniper's main target, Cisco, joined with Accenture to launch a collaboration and datacenter virtualization effort. ]

Stratus is expected to support the CEE (Converged Enhanced Ethernet) datacenter fabric specifications being defined and endorsed by several vendors.

Juniper isn't naming names yet as to which companies it is partnering with on the Stratus Project but did recently outline its cloud computing plans with IBM.

Stratus will be managed like a large JUNOS-based switch, said David Yen, Juniper executive vice president for emerging technologies, at the company's annual analyst conference here this week. (New CEO Kevin Johnson shared a grim IT spending outlook at the event.)

Stratus is designed to relieve datacenter scaling "pain" due to latency, power, space, cost, and complexity, Yen said. It is also intended to support datacenter virtualization for "elasticity and efficiency," he said.

Yen would not provide details into Stratus products, configurations, pricing, or availability. He indicated, though, that it will not have a material impact on Juniper's 2009 revenue.

Juniper is announcing Stratus now to allow customers to plan their long-term datacenter migration strategies, Yen said.

"Stratus extends Juniper's high-performance networking core competencies into the datacenter," Yen said. "It allows Juniper to enter a new addressable market space. We have no vested interest in prolonging suboptimal legacy architectures. We are in a unique position to revolutionize the datacenter."

Cisco is looking to transform datacenters by developing its own blade servers that integrate networking, compute power, and virtualization, and that utilize the company's networking incumbency. Cisco is believed to be partnering with VMware and BMC Software for this project, which is code-named California, but is testing its longtime relationships with datacenter server stalwarts IBM and HP with the project.

"With one stroke, Juniper is devaluing Cisco's incumbency," says Tom Nolle, president of consultancy CIMI Corp. "They are positioning away from current technologies [such as FibreChannel and Infiniband] that have no accommodation to the fabric as a network backplane, or as the basis for future virtualization support.

"Cisco has a lot of collisions with incumbents," Nolle adds. "Juniper cannot hope to match Cisco in breadth so it is making that an asset instead of a liability. Juniper is timing its success with Stratus to the economy's recovery and to developing symbioses with partners."

Separately, UBS analyst Niko Theodosopoulos states in a bulletin: "The product targets large scale datacenters, offering an improvement of 10x versus current technologies, at least. Juniper is partnering with IBM on R&D for a total next-gen datacenter solution.

"We expect Stratus to be available in late 2010 or early 2011," Theodosopoulos adds. "While this is [about] two years away, the recession and standard delays for Fiber Channel over Ethernet make the plan seem reasonable in our view. We note Cisco will likely have an integrated blade server and Nexus datacenter offering by mid 2009 vs. the Juniper/IBM offering."

Network World is an InfoWorld affiliate.

Correction: This article as originally posted contained incorrect information about the Stratus project and the wrong surname of Juniper Networks' CEO. The story has been amended.




Read More ...

Netezza scoops up data auditing vendor

Data warehousing appliance vendor Netezza said Tuesday it has bought Tizor Systems, maker of data auditing and protection software. Terms were not disclosed.

[ Learn how to secure your systems with Roger Grimes' Security Adviser blog and newsletter, both from InfoWorld. ]

Tizor's centerpiece product is Mantra, which, like Netezza's database, is delivered in appliance form.

Mantra automatically discovers and examines data network traffic, scanning for suspect behavior and alerting administrators to problematic activities, according to Tizor's Web site. It can simultaneously audit Oracle, SQL Server, IBM DB/2 and Sybase databases as well as Windows file servers.

In a statement, Netezza said the deal will give customers the ability to meet regulatory requirements and perform deep forensic analyses of their data. Tizor, which is based in Maynard, Massachusetts, will be run as a Netezza subsidiary.

This is just the latest acquisition for Netezza, which also bought the specialized analytics vendor NuTech Solutions last year.




Read More ...

Microsoft offers language-agnostic coding technology

In a boost for developers, Microsoft Research has developed Code Contracts, which offers a language-agnostic way to express coding assumptions in .Net.

The technology was released on the Microsoft DevLabs site this week. Code Contracts provide the innovations of "design by contract" programming to .Net programming languages. Contracts take the form of pre-conditions, post-conditions, and object invariants, and they act as checked documentation of APIs. Also, contracts improve testing via runtime checking and enable static contract verification and documentation generation.

[ At the VSLive conference today, Microsoft showed off the new Visual Studio UI. ]

"Today's release, Code Contracts for .Net, is a general design-by-contract mechanism that all .Net programmers can now take advantage of," said S. "Soma" Somasegar, senior vice president of the Microsoft developer division, in his blog on Monday. "Using it, programmers provide method preconditions and post-conditions that enrich existing APIs with information that is not expressible in the type systems of .Net languages. Additionally, contracts specify object invariants, which define what allowable states an instance of a class may be in (i.e., its internal consistency)."

Contracts, Somasegar said, are used for runtime checking, static verification, and documentation generation. Additionally, contracts allow automatic documentation checking and improved testing. Code Contracts for .Net consists of three components: the static library methods used for expressing the contracts, a binary rewriter, and a static checker.

The binary rewriter modifies a program by injecting contracts, which are checked as part of a program execution. Rewritten programs improve testability, and each contract offers a pass/fail indication for test runs. The static checker decides if there have been any contract violations, checking for implicit contracts, such as null dereferences and array bounds, as well as checking for explicit contracts.
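Code Contracts itself is a .Net library, but the design-by-contract idea it implements -- preconditions checked on entry, postconditions checked against the result on exit -- can be sketched in a few lines of Python. The `contract` decorator below is a hypothetical illustration of the concept, not Microsoft's API:

```python
import functools

def contract(pre=None, post=None):
    """Design-by-contract sketch: check a precondition on the arguments
    before the call and a postcondition on the return value after it."""
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            if pre is not None:
                assert pre(*args, **kwargs), f"precondition failed: {fn.__name__}"
            result = fn(*args, **kwargs)
            if post is not None:
                assert post(result), f"postcondition failed: {fn.__name__}"
            return result
        return wrapper
    return decorator

# Precondition: input is non-negative; postcondition: so is the result.
@contract(pre=lambda x: x >= 0, post=lambda r: r >= 0)
def square_root(x):
    return x ** 0.5

print(square_root(9.0))   # -> 3.0
# square_root(-1.0) would fail the precondition and raise AssertionError
```

A real contract checker, as the article describes, goes further: a binary rewriter injects the checks into compiled code, and a static checker tries to prove contracts without running the program at all.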

Code Contracts can be used within Visual Studio 2008 Team System.

The design-by-contract mechanism was pioneered by Eiffel, Somasegar said.

Read More ...
VMware CTO: New strategy is all about management

More details about VMware's Virtual Datacenter Operating System, including a collection of management tools, will be provided in coming weeks, but company executives at the VMworld conference Tuesday continued to outline the broad initiative that is aimed at helping customers build internal clouds.

Using Virtual Datacenter Operating System (VDC-OS), or vSphere as it is known, enterprises will be able to develop internal cloud computing systems, with the vCenter suite key to managing cloud infrastructures and applications, the company said. VMware's strategy and the products that it will soon roll out hinge on the management of the cloud.

[ In earlier news from VMworld, VMware announced plans to add to its cloud strategy | Also, competition is heating up as Microsoft and Citrix join forces against VMware | Track the latest trends in virtualization in InfoWorld's Virtualization Report blog. ]

"I think it's everything. We have a major investment in the platform layer, but it's not worth anything if you can't expose those features in a useful way," said Stephen Herrod, chief technology officer at VMware, in an interview at the event.

Using the vCenter suite, companies will be able to manage more parts of the infrastructure compared to current management tools from VMware, according to Herrod.

"Today, we don't have any application level management, capacity or configuration management. So we are basically plugging a very large number of important holes on that front," he said.

The tools in vCenter Suite will have a common look and feel within a user interface, share data, and work with one another. The most common way of accessing the different features will be a set of tabs, according to Herrod.

On the application side, AppSpeed, which is based on technology VMware got when it acquired Israeli company Beehive, is one of the most important additions. "For us it's a very big deal because we have typically been thought of as an infrastructure company, but this is an application-level monitoring and discovery tool," said Herrod.

It uses network packet inspection to map out the environment and help companies figure out what is causing problems for end-users, according to Herrod. It will also span the physical and the virtual worlds, which is another first for VMware, Herrod said.

As with the rest of VMware's push into the cloud space, it is keeping quiet on the details of vCenter Suite, at least for now. "We haven't announced any packaging or pricing, but the overall message is that it's a suite of interoperating components that some will ship together, and others will come in over time across 2009," said Herrod, who said more detailed announcements will start in the coming weeks.

But VMware isn't the only virtualization vendor with management on its mind. Microsoft and Citrix announced they will cooperate on the management of their respective virtualization products, and Red Hat announced management tools for servers and desktops.

Both announcements were aimed at VMware, and made the day before VMworld started, which didn't come as a surprise to Herrod.

"A lot of people announce things the night before VMworld at this point. This has become the event for virtualization," he said.

But he doesn't seem too worried about the competition, at least not from Citrix and Red Hat. The virtualization industry has moved on from the kind of product announcements that VMware's competitors are making, according to Herrod. VMware has 2,500 engineers who do nothing but virtualization, he said, and reaching the level VMware is aiming for with vSphere is a massive undertaking.




Read More ...

Google sets billing rates for App Engine

Google on Tuesday will institute new billing services for developers using its App Engine hosting service for Web applications, but developers will be able to go beyond the capacity quotas that had been in place, a Google product manager said.

Until now, App Engine storage has been free but developers could only access around 40 hours of CPU time per day along with 500MB of storage and 10GB of bandwidth, said Pete Koomen, product manager for Google App Engine.

[ Related: "Google App Engine preview is now open" | Keep up with developers' issues by reading InfoWorld's Strategic Developer blog. ]

"What we're enabling [Tuesday] is for developers who want to go beyond these limits, we're now giving them the ability to pay for more resource usage," Koomen said. On top of the free services, developers can pay 10 cents per CPU core hour for application processing, 10 cents per gigabyte of data transferred into the application, 12 cents per gigabyte of data transferred out of the application, and 15 cents per gigabyte per month of storage. The storage capabilities cover static files served by the application as well as structured data using the Google Datastore API.

In addition to the 2,000 e-mails a day that applications can send for free, Google will let applications send 10,000 e-mails per day for $1.
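The per-unit rates quoted above make a back-of-envelope bill easy to estimate. The rates below come from the article; the usage figures in the example are hypothetical and represent consumption beyond the free quotas:

```python
# Per-unit rates quoted in the article (USD)
CPU_HOUR = 0.10          # per CPU core hour
GB_IN = 0.10             # per GB transferred into the application
GB_OUT = 0.12            # per GB transferred out of the application
GB_STORED_MONTH = 0.15   # per GB stored, per month


def monthly_bill(cpu_hours, gb_in, gb_out, gb_stored):
    """Estimated monthly charge for usage beyond the free quotas."""
    return round(cpu_hours * CPU_HOUR
                 + gb_in * GB_IN
                 + gb_out * GB_OUT
                 + gb_stored * GB_STORED_MONTH, 2)


# e.g. 100 extra CPU-hours, 20 GB in, 50 GB out, 10 GB stored:
print(monthly_bill(100, 20, 50, 10))  # → 19.5
```

At these rates, a modestly busy application that outgrows the free tier would run to a few dollars a month, which is presumably the point of the pay-as-you-go model.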

"[The effort is] a way to allow developers to grow beyond the free quotas," Koomen said.

"App Engine will always be free to get started," the company stated in a planned blog post. "However, we've made many performance improvements over the past 10 months and we've also learned that we were pretty conservative with our initial estimates on what our free resource quotas should be. Therefore, in 90 days we will be reducing the free quota resources. We believe these new levels will continue to serve a reasonably efficient application around 5 million page views per month, completely free."

The company, however, will double its free storage quota.

"Data stored in the datastore incurs additional overhead, depending on the number of indexes, as well as the number (and size) of associated properties," Google said. "This overhead can be significant in some cases and it's something that we've been underreporting to date. So you may notice adjustments in the amount of data stored that's listed in the Admin Console. To decrease the impact of these adjustments, we've doubled the free storage quota to 1GB," the company said.

An estimated 45,000 applications have been built on App Engine, Koomen said. Among these applications are BuddyPoke, which is an OpenSocial application with more than 30 million users, and Lingospot, for content discovery.


Read More ...

Microsoft trying out netbook processors in datacenters

It sounds crazy to use processors designed for tiny netbooks in datacenters, with their potentially massive computing requirements, but Microsoft is experimenting with doing just that -- and finding it may lead to cost savings.

A netbook processor uses one-fifth to one-tenth the power of a typical server processor, offers about one-third the performance, and is cheaper than server processors, said Jim Larus, director of software architecture for datacenter futures, a group within Microsoft Research. That means that even though a datacenter would need three times as many netbook processors to match the throughput, its total power draw would still be lower than with typical server processors.
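The trade-off Larus describes is a quick multiplication. The ratios below are the illustrative figures from the article (one-third the performance, one-fifth the power); the 100 W server-chip figure is a hypothetical baseline for the sake of the arithmetic:

```python
def cluster_power(server_watts, perf_ratio, power_ratio):
    """Relative power draw of a netbook-chip cluster matched to one
    server chip's throughput.

    perf_ratio: netbook performance as a fraction of a server chip's.
    power_ratio: netbook power draw as a fraction of a server chip's.
    """
    chips_needed = 1 / perf_ratio              # e.g. 3 netbook chips per server chip
    return round(chips_needed * power_ratio * server_watts, 1)


# Server chip at a hypothetical 100 W vs. netbook chips with
# 1/3 the performance and 1/5 the power:
print(cluster_power(100, 1/3, 1/5))  # → 60.0 W for equivalent throughput
```

Even at the conservative one-fifth power ratio, the matched cluster draws about 60 percent of the server chip's power; at one-tenth, it drops to about 30 percent, which is the saving Microsoft is chasing.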

[ Microsoft is making more moves into the netbook space: "Chip vendors bring Windows 7 closer to netbooks" | Keep up on green IT trends with InfoWorld's Sustainable IT blog and Green Tech newsletter. ]

At TechFest, an annual demonstration of some of the technologies that Microsoft researchers are cooking up, Larus showed off a typical datacenter cabinet stocked with off-the-shelf netbook processors.

The cabinet doesn't require the large fans that are typically built into the containers. The processors produce less heat because they consume less power, in part because they offer less performance but also because they were designed to require as little energy as possible so as not to quickly drain the battery of a laptop, Larus noted. The cabinet, which contained 50 dual-core processors, can be plugged into a standard electrical outlet, he said.

The setup in its current configuration isn't really usable -- because the processors came off the shelf, they are built into boards that include unnecessary ports for add-ons like monitors. That means they take up more space in a cabinet than most datacenter operators would like.

But Microsoft, which is sharing its experiment with server vendors, hopes that vendors will take interest and begin building products for the datacenter based on the smaller processors. Ideally, the processors would be designed so that they can be packed into a tighter configuration in a cabinet, he said.

In addition, the netbook processors aren't as fast as datacenter operators would like, he said. There's probably a "sweet spot" in between netbook processors and today's server processors that's just right, he said.

Microsoft is also experimenting with typical features built into the netbook processors that might be useful in the datacenter. One is the ability for the processor to sleep when it's not in use. Larus showed a demonstration of the netbook processors running a theoretical search program. While keeping the response time to a search query to three-tenths of a second, the processors not in use powered down.

Larus works in a new group within Microsoft Research that was started about 10 months ago to focus on improving datacenter efficiencies. Running efficient datacenters will be increasingly important in the future, he noted. In the past, Microsoft sold software to people and the end-user bore the costs associated with running the software, including buying the hardware and consuming the electricity. But Microsoft increasingly expects to host software for people, meaning it will be running that software in energy-consuming datacenters.




Read More ...

Research will offer recession boost, Microsoft's Mundie says

Microsoft's research efforts will help the company emerge strong from the economic downturn, predicted Craig Mundie, Microsoft's chief research and strategy officer, at the company's annual TechFest event.

Research is even more important during times of economic troubles than during prosperity, he said. "If you look back at economic downturns including the Great Depression, the companies that fared best in almost every case were those that actually did two things," Mundie said. "They responded in terms of cost containment and continued to invest in product development."

[ Special report: IT and the financial crisis. ]

Often, the products in development during economic downturns help the company get going again, he said. "We think it will be a key part in how Microsoft performs well as the world starts to pick up and perform better economically," he said.

Research will help the company in the long term too, he said. "I think of research as one of the things that we have to do and elect to do in order to ensure we survive over the long term," he said. Companies that cut research in the face of short-term pressure or never start pure research tend not to last very long, he said. "My belief is the company would struggle to survive and prosper if we didn't have research investment."

One particular area of development that might help Microsoft is new ideas around the user interface, Mundie said, echoing a favorite theme of Microsoft founder Bill Gates.

Computing currently is controlled mainly with the keyboard and mouse. "The reality is that many people in the world aren't really computer literate in that sense," Mundie said.

The way that people interact with computers will become increasingly critical, he said. As economies worldwide falter, governments are interested in using stimulus programs to invest in health care and education, he noted. Key to implementing those programs may be letting people access them through nontraditional means, he said.

Gates also often talked about new kinds of user interfaces, like pen-based computing and touch.

Using natural voice commands might also be a part of the way that people use computers, he said. Mundie envisions a future where a person can search for information in a far less manual way than today. Rather than surfing online for information about buying a house or a car, a person will simply ask the computer to do it for them, he said.




Read More ...

Accenture and Cisco launch joint services drive

Accenture and Cisco have jointly launched a major initiative to deliver collaboration and datacenter virtualization services to large companies.

The two suppliers have created a virtual group that will "design, build, and run business solutions to help companies integrate unified communication and collaboration tools into multiple applications across companies' IT infrastructures."

[ Track the latest trends in virtualization in InfoWorld's Virtualization Report blog. ]

The alliance will target key, IT-intensive verticals, ranging from telecommunications, health care, and government to financial services, energy, and utilities.

"Our clients face a range of challenges, from reducing operational costs and managing complex consolidations to competing globally and facing competitors with new cost-efficient business models," said Andre Hughes, managing director for Accenture, who leads the virtual group. "We formed the Accenture & Cisco Business Group to provide a seamless services team that can help clients improve their performance."

The initial offerings focus on customer contacts, unified communications, and collaboration and infrastructure transformation.

The Oracle-Cisco Customer Contact Transformation Solution promises to bring together customer relationship management, analytics, workflow performance, and hiring software applications with collaboration products.

The Unified Communications and Collaboration Solution promises to combine "an understanding of the client's business processes, IT architecture and how users actually adopt technology," to help improve collaboration and operational performance and to accelerate decision-making.

The Infrastructure Transformation Solution uses Accenture's business process experts and Cisco's networking experts to optimize networks and datacenters.

Computerworld UK is an InfoWorld affiliate.




Read More ...
