Friday, February 27, 2009

IT News HeadLines (InfoWorld) 27/02/2009



Obama's budget blueprint increases tech spending

U.S. President Barack Obama's new budget blueprint includes millions of new dollars for health IT and for technology research, according to the budget document.

The $3.6 trillion budget blueprint, released Thursday, includes a $25 billion increase in the budget of the U.S. Department of Veterans Affairs over the next five years, with some of that money going to updated health IT systems. This first Obama budget document includes broad goals, but generally doesn't include details about how agencies will spend the money allocated to them.

The budget documents also note that the recently passed economic stimulus package includes $19 billion to drive adoption of electronic health records.

In addition to the budget increases, the blueprint stresses the importance of technology improvements.

"To create a platform for our entrepreneurs and workers to build an economy that can lead this future, we will begin to rebuild America for the demands of the 21st Century," the budget says. "We will repair crumbling roads, bridges, and schools as well as expand broadband lines across America, so that a small business in a rural town can connect and compete with its counterparts anywhere in the world. And we will invest in the science, research, and technology that will lead to new medical breakthroughs, new discoveries, and entire new industries."

The budget refers to the $7.2 billion in broadband grants and loans approved in the economic stimulus package.

The budget includes $70 million for the National Institute of Standards and Technology's Technology Innovation Program, which invests in tech research. Obama's budget also includes funding increases for cybersecurity efforts at the Department of Homeland Security, new money for technology at the Securities and Exchange Commission and new money for the Government Accountability Office to make information on government contracts available to the public.

In addition, the National Science Foundation's budget would increase from $6.1 billion in fiscal year 2008 to $6.9 billion in 2009. Part of the budget increase will go into the NSF's Advanced Technological Education program, which works with two-year colleges to improve science and engineering technician education.

Several Republican lawmakers ripped into Obama's budget, saying it was fiscally irresponsible. The blueprint projects that the U.S. budget deficit will rise to $1.75 trillion in 2009, but Obama's goal is to cut the deficit by more than two-thirds by 2013.

"Unfortunately, this budget plan is once again a missed opportunity for American taxpayers -- it raises taxes on all Americans, implements massive new spending, and fails to make any tough choices to control the deficit and long-term fiscal crisis posed by the huge entitlement programs," said Senator Judd Gregg, a New Hampshire Republican. "Where is the spending restraint? Instead, government spending continues to grow and expand, while the economy continues to suffer."


Read More ...

Borland tool counters sloppy coding

Borland unveiled a 'release readiness' system called TeamInspector earlier this week. It gathers and reveals key metrics used in software development, in order to counter the 'appalling reputation' of IT projects and to make sure new applications work properly and are fit for purpose.

The company says that TeamInspector can analyse code, test coverage, and build trends, and make sure that a new software application is standards compliant. All this is designed to give development managers factual evidence that the software they deliver is ready for customer use.

The product presents metrics in a dashboard displaying real-time and trend information across projects. This information is apparently gathered by TeamInspector's automated inspectors, which aggregate key readiness metrics from an array of developer test utilities, static code analysis tools, and build tools.

Borland said that TeamInspector will be part of Borland Management Solutions, the software delivery management platform that enables companies to track, measure, predict and improve delivery performance and quality.

But isn't this just a typical software testing application then? Well, it is part of the software testing remit, Borland admitted.

"Borland views 'release readiness' as one element within the larger scope of software testing and our differentiation lies in our open, management level and metric driven approach," the company told Techworld via email.

"TeamInspector has the capability to inspect various third party tools related to code analysis, standards compliance, test coverage and build trends; the results (risk indicators and trends) are then revealed to management in a single cross-project dashboard ensuring visibility across your entire software project portfolio."

"TeamInspector works both in Java and .Net shops," the company added. "Specifically, TeamInspector has inspectors for several developer test coverage, test success, code analysis and build tools, including: Ant, NAnt, Checkstyle, Emma, JUnit and NUnit. Borland has plans to continue to add to this inspector library and to provide interfaces for generic inspector connections."

Borland said that "IT projects in general have an appalling reputation for being over budget, delayed and failing to fulfil the requirements of the business. However, it is very rare that software is given specifically as the reason for these problems. One notable exception was the Heathrow Terminal 5 fiasco, where a software systems failure was blamed on a lack of testing."

A 10-user pack costs $19,266 and a 100-user pack costs $142,700.

Techworld is an InfoWorld affiliate.


Read More ...

Sun's McNealy: Some feds see open source as anti-capitalist

Sun Microsystems Chairman Scott McNealy wants President Barack Obama's administration to do what the United Kingdom, Denmark, and other countries have done: Encourage, as a matter of policy, open-source software adoption.

Although open-source platforms are widely used today in the federal government -- particularly Linux and Sun's own products, Solaris and Java -- McNealy believes many government officials don't understand it, fear it and even oppose it for ideological reasons.

McNealy cited an open-source development project Sun worked on with the Health and Human Services Department, during which a federal official said that "open source was anti-capitalist." That sentiment, McNealy fears, is not unusual or isolated.

"If you think about it, proprietary software is the software equivalent of a planned economy led by a dictators, whereas open source is all about choice, the market economy, and multiple competitive players," said McNealy.

That's the message McNealy and Bill Vass, the president and chief operating officer of Sun's federal division, are now delivering. They have already met with Obama Administration officials to offer a paper on open source that has since grown into a discussion about the merits of having a federal CIO. The new administration has plans to appoint a chief technology officer, but not a CIO.

"There is not a corporation, a Fortune 1000 company, around that doesn't have a CIO," said McNealy. "Yet, the federal government dwarfs all those organizations and they really have an empowered, cabinet kind of position."

The Obama administration has not yet spelled out its federal technology plan, nor has it appointed all the key people it needs to run it. Meanwhile, outside observers are trying to glean broad policy directions from tactical moves, such as the use of the open-source content management system Drupal for the Recovery.gov Web site.

The Fiscal Year 2010 budget released Thursday reaffirms plans to appoint "the nation's first [CTO] to ensure that our Government and all its agencies have the right infrastructure, policies and services for the 21st Century. The CTO will work with each of the federal agencies to ensure that they use best-in-class technologies and share best practices."

That's little different from what the president said during the 2008 campaign, and it doesn't provide specifics about what those best practices may be. But the Obama Administration has been soliciting advice -- and McNealy's arguments may be particularly well timed.

The open-source push is growing. Just this week, the U.K.'s Chief Information Council updated its policy on open-source software. That government has long encouraged its use, and the updated policy notes that during the last five years "many government departments have shown that open source can be the best for the taxpayer." The policy also says government agencies need to speed up open-source adoption and look to re-use software when possible.

One advocate of open source is Rick Dietz, director of IT for Bloomington, Ind. On its Web site, the city spells out its views clearly. "The city is committed to using and creating open-source software whenever possible."

Dietz sees limits, however, particularly for specialized applications where open-source alternatives don't exist. But he believes the Obama Administration could play a big role by encouraging its use.

"It would be an interesting project to look at what are the software needs across these government entities and then work on some uniformed, collaborative solutions," said Dietz, " so that governments aren't spending tens of thousands of dollars to support a myriad of systems, all of which are essentially doing the same thing."

The U.K. government's position may grow more appealing to the U.S., especially as the deficit soars.

According to Vass, open source has won adoption among U.S. intelligence agencies because they believe it is inherently more secure in the development process. Even so, he said, one of the arguments used by other federal agencies in rejecting it concerns security issues.

Said McNealy: "You would be astounded to know how many people are scared to death [of open source] or have mandated out open source."

And McNealy believes a federal CIO, with budget power similar to a corporate CIO, will be needed to bring about the changes needed to move toward software built on collaboration and re-use.

A long list of vendors has argued for open source, and the likely counterattack by proprietary vendors was summed up last week in response to a study about how virtualization, cloud computing, and open source could save governments big money: Susie Adams, the CTO of Microsoft Federal, said that open source is "just another business model."

But it is a business model that has won specific endorsements from other nations and places such as Bloomington, and it's a business model McNealy intends to keep pushing at the Obama Administration.

Computerworld is an InfoWorld affiliate.


Read More ...

Dell sees opening for Atom-based PCs in small business

Could computers based on Intel's Atom processors, already popular among consumers, catch on with cash-strapped businesses? Dell seems to think so -- at least in some markets.

"We're exploring that, and there is some potential," said Steve Felice, president of Dell's small and medium business group, during a conference call with reporters.

[ Related: "Intel's Atom grows up, moves out of netbooks" and "Netbooks in the business: Do they make sense?" ]

Dell is already offering Atom-based computers to businesses in some markets. However, demand for Atom-based computers remains relatively small compared to demand for systems that use mainstream Intel processors, especially the low-end Celeron line, Felice said.

Atom-based laptops have been a rare bright spot for computer makers, which have taken a beating in recent months. However, popular laptops like Asustek Computer's Eee PC and Acer's Aspire One are primarily aimed at consumers, not at businesses.

Dell, which was late responding to demand for Atom-powered computers, eventually released its own Atom laptops for consumers, as well as the Vostro A100 desktop, a business PC that runs Ubuntu Linux and uses Intel's 1.6GHz Atom 230 processor.

The relative popularity of Dell's Celeron-based desktops compared to the Atom systems may be partly due to availability. For example, the Atom-based A100 does not appear among the Vostro desktops offered for sale on Dell's U.S. Web site, while Celeron-based systems do.

Performance doesn't appear to be an issue for using Atom-based machines in an office environment.

While Intel executives frequently deride the inexpensive Atom processors for lacking the performance of their multi-core cousins, Dell's Web site for Middle East sales says the chip "delivers the performance required for basic office computing."


Read More ...

MS wants MVPs in Windows, cloud computing, virtualization

Want to become a Microsoft Most Valuable Professional (MVP), one of the company's elite volunteer army of tech experts?

Then now's the time to brush up on Windows Vista and Windows 7, the upcoming cloud-computing platform Windows Azure, and virtualization software such as Hyper-V, according to a Microsoft executive in charge of the MVP program. And plan on sharing that knowledge as widely as possible on developer and user forums run by Microsoft and others.

Microsoft is heavily recruiting MVPs for these areas, Toby Richards, general manager for community and online support, said this week. It is also actively looking for MVP candidates in important overseas Microsoft markets such as China, Russia, India, and Brazil.

Microsoft, which holds its annual MVP Summit March 1-4 in Seattle, added several hundred MVPs this year and now has 4,200 worldwide. "We would like to keep expanding our pool of influencers," said Richards, adding that despite recent layoffs and belt-tightening at Microsoft, the MVP program has seen "no de-investment."

Microsoft started the MVP program in 1993 with 38 initial MVPs.

MVPs are chosen primarily for the amount and quality of free technical advice they dispense in Web forums and blogs. Contrary to some myths and jokes, there is no set formula that determines whether someone is MVP material, said Richards.

But a willingness to evangelize and, if necessary, defend Microsoft products from haters helps. "I have not met one MVP who lacked the conviction and courage to challenge" attacks on Microsoft software, said Richards.

MVPs are unpaid, and the title does not connote any minimum level of technical knowledge, as a Microsoft certification does. However, the awards carry prestige in the large Microsoft technical community, and often open up job opportunities for awardees at Microsoft and its partner firms.

Another perk is the opportunity to attend the MVP Summit. Some 1,500 MVPs and/or their employers are expected, according to Richards.

Microsoft is trotting out a lineup of executives for speeches and Q&A sessions worthy of its much larger conferences. They include CEO Steve Ballmer; Antoine Leblond, senior vice president for Office productivity applications; Soma Somasegar, corporate vice president for developer tools; Mike Nash, corporate vice president for Windows product management; and Bob Kelly, corporate vice president for infrastructure server marketing.

Microsoft wants to give MVPs, 75 percent of whom are IT professionals or developers in their own right, early access to beta products and more say on those products, something they continue to clamor for.

At the MVP Summit, Microsoft plans to hold 700 sessions, bringing together MVPs with product and engineering teams.

That outside feedback helps supplement the vast amount of quantitative research that Microsoft collects, Richards said. The bigger value for Microsoft is the timely technical help that MVPs provide. For instance, 92 percent of the questions about Windows 7 posted on Microsoft's TechNet forum for IT pros are answered by non-Microsoft employees, such as MVPs.

Over on the Microsoft Developer Network forum, every answer provided by an identified MVP is viewed on average 875 times, said Richards. Each view potentially saves Microsoft the several-hundred-dollar cost of providing a technical support phone call, he said.
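
To put Richards' figures in perspective, a back-of-envelope calculation shows the scale of the claimed savings. This is a sketch only: the $300 per-call cost is an assumed stand-in for the article's "several hundred dollars," and it treats every view as an avoided call, so it is an upper bound.

```python
# Back-of-envelope check on the claimed support savings.
# Assumption: $300 per avoided support call -- the article says only
# "several hundred dollars," so this figure is illustrative.
views_per_mvp_answer = 875
assumed_cost_per_call = 300  # USD, hypothetical

# Upper bound: if every single view replaced a support phone call.
potential_saving = views_per_mvp_answer * assumed_cost_per_call
print(f"Up to ${potential_saving:,} per MVP answer")  # Up to $262,500 per MVP answer
```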

Computerworld is an InfoWorld affiliate.


Read More ...

Microsoft cuts contractor pay

In an effort to reduce costs, Microsoft is cutting current contractor pay by 10 percent and future contractor pay by 15 percent.

"We held discussions with some of the impacted agencies and settled on the 10 percent reduction based on the economic climate and the need to achieve greater cost reductions," Microsoft said in a statement.

According to a letter reportedly sent to contract agencies that was posted on the TechFlash blog, Microsoft also plans to cut future contract pay 15 percent. Microsoft did not reply to a query asking to confirm the legitimacy of the letter.

The contractor pay cuts follow Microsoft's announcement in late January that it plans to cut 5,000 jobs. Microsoft employs nearly 96,000 people directly. It does not disclose how many contractors work at the company, although estimates run to the tens of thousands.

In a conference call with analysts and reporters earlier this week, Microsoft CEO Steve Ballmer said he'll continue to look for ways that the company can reduce costs. He also emphasized the importance of continued investment during the global recession.


Read More ...

Dell records 48 percent drop in net income

Dell's net income dropped 48 percent in the fourth quarter, the company said Thursday. It also announced that it is raising its cost-cutting goal to $4 billion by the end of fiscal 2011 as it tries to come to terms with the recession.

The company recorded net income of $351 million for the fourth quarter ended Jan. 30, a 48 percent drop from the $679 million it recorded in last year's fourth quarter. Net income per share was $0.18. Analysts polled by Thomson Reuters expected net income of $496 million.

Revenue fell to $13.4 billion, a 16 percent drop from a year ago, and short of analyst estimates of $14.2 billion.

Reduced IT spending has taken its toll on the company, with spending deferred until the economy improves, CEO Michael Dell said in a statement. However, the company is taking other steps to reduce costs to adapt to the recession.

The company now aims to reduce costs by $4 billion by the end of fiscal 2011, a change from the original target of $3 billion announced in May.

The steps Dell has taken in the past to cut costs include compensation reduction, staff cuts, restructuring its product design and distribution, and realigning its manufacturing strategy by shutting down factories.

"Within our business, we're being very disciplined in managing costs, generating profitability and cash flow, and investing in ways that separate Dell from others today and when the economy inevitably improves," Dell said.

Predicting how long the recession will last is difficult, Dell said during a conference call. The company can't fight the recession, so it will focus on elements it can control, such as cutting costs, Dell said.

In the fourth quarter, the company cut operating costs by shutting down factories and by outsourcing hardware manufacturing, he said. The company has reduced the manufacturing cost of each PC by about 5 percent, Dell said.

Similar measures will probably be taken to achieve the additional $1 billion in cost cuts by the end of fiscal 2011, wrote John Spooner, senior analyst at Technology Business Research, in a research note.

"We expect Dell's belt-tightening to involve additional layoffs and plant closings within the company's North America organization," Spooner wrote.

The company employed about 78,800 people at the end of the fourth quarter of 2008, a headcount reduction of 9,400 from a year earlier. Dell officials declined to comment on any pending additional layoffs.

Dell may pursue additional cost-cutting opportunities if demand for products and services remains volatile, said Dell Chief Financial Officer Brian Gladden during the call.

The company will also selectively focus investments in higher-margin products, such as enterprise servers, storage and software, Dell said. It is also increasing its focus on cloud computing opportunities, such as online services from its Dell.com Web site.

Dell also has aggressive plans to push Microsoft's upcoming Windows 7 OS in enterprises. Hardware purchases may be delayed as customers wait for Windows 7, but in preparation for that OS, Dell is readying hardware that takes advantage of specific features planned for it, Dell said.

Server revenue for the quarter was $1.35 billion, down 16 percent year-over-year, while unit shipments declined 18 percent. Revenue from storage products increased by 7 percent to $692 million. Services revenue declined by 3 percent to $1.36 billion.

On the consumer side, laptop shipments were flat and revenue declined by 17 percent to $4.01 billion. Desktop shipments declined 21 percent, while revenue declined 27 percent to $3.5 billion. The company doesn't disclose the numbers of laptops or desktops shipped, a company spokesman said. The company is now selling its consumer products in 24,000 stores worldwide.

Netbooks aren't a large part of the company's laptop shipments, Dell said. Consumers are demanding the larger screens offered in traditional laptops, but Dell said it will continue to offer netbooks, with a variety of screen sizes, as part of its product mix.


Read More ...

Microsoft sues Linux-based vendor over patents

Microsoft's lawsuit against Linux-based technology vendor TomTom over alleged patent violations could signal a more aggressive stance by the software giant over intellectual property issues -- or it could be just an isolated case involving a dispute with one vendor.

The Linux Foundation is monitoring the situation, and an intellectual property attorney suggested the case might crimp open source usage. A TomTom representative said the company rejected Microsoft's claims and will vigorously defend itself.

Microsoft has filed a lawsuit against TomTom, a maker of automobile-based navigation systems, saying the company has violated eight Microsoft patents. TomTom's devices run a version of the Linux OS. Microsoft charges that TomTom's Linux implementation violates three of its patents.

Microsoft has sought to negotiate a paid license of its technology with TomTom but has been unable to reach an agreement, said Horacio Gutierrez, Microsoft's corporate vice president of intellectual property. Citing other intellectual property licensing agreements, such as a controversial one with Novell in 2006, Gutierrez said the company wants to license its intellectual property on reasonable terms. But some cases will arise in which a "pragmatic business solution is not attainable. In those cases, we will have no choice but to pursue litigation," he said.

The lawsuit, filed Wednesday in U.S. District Court for the Western District of Washington, follows what Gutierrez described as a good faith effort that went on for more than a year to resolve the matter. "Frankly, our hope is to be able to resolve this through a licensing agreement that makes sense for both companies," he said.

"Microsoft respects and appreciates the important role that open source software plays in our industry, and we respect and appreciate the passion and the great contributions that open source developers make in our industry," Gutierrez said. "This approach and respect is not inconsistent with our respect for intellectual property rights."

This particular case relates to TomTom's specific implementation of the Linux kernel, said Gutierrez. Asked if the lawsuit would signal other similar litigation to follow, he responded, "We can't speculate about that. We have a strong track record of licensing, which evidences our commitment to that approach and that will continue to be the focus of efforts going forward."

Microsoft has filed only three patent lawsuits in its history, and this is the first involving Linux, Gutierrez said. He stressed that open source "is not the focal point of this action." Rather, the litigation is over patents Microsoft said TomTom is using in proprietary software. TomTom, said Gutierrez, develops products based on a mix of proprietary and open source code.

The Linux Foundation, meanwhile, emphasized a readiness for any claims against Linux. "The Linux Foundation is working closely with our partner the Open Invention Network and our members, and is well prepared for any claims against Linux," said foundation Executive Director Jim Zemlin in a statement. "We have great confidence in the foundation they have laid. Unfortunately, claims like these are a by-product of our business and legal system today. For now, we are closely watching the situation and will remain ready to mount a Linux defense, should the need arise."

Zemlin described the case as a private dispute between Microsoft and TomTom. "We do not feel assumptions should be made about the scope or facts of this case and its inclusion, if any, of Linux related technology," he said. "It is our sincere hope that Microsoft will realize that cases like these only burden the software industry and do not serve their customers' best interests. Instead of litigating, we believe customers prefer software companies to focus on building innovative products," said Zemlin.

Recently, Microsoft has made overtures to the open source community, such as becoming a sponsor of the Apache Software Foundation and offering its Web Sandbox project for securing Web content via open source. But in the past, the company has irked open source proponents by claiming that open source technologies, including some in Linux, violate 235 Microsoft patents.

Microsoft's move against TomTom could put a damper on commercial use of open source software, an intellectual property attorney said. "I think it certainly has the potential to do so, and whether that has any long-lasting effect is another question," said Jason Haislmaier, of Holme Roberts & Owen in Boulder, Colo.

"You might have a strong reaction based on fear," initially, he said. Over time, there still could be some effect but not as much of the shock effect, said Haislmaier. Linux, he said, is just as susceptible to a patent infringement lawsuit as any other OS, he said.

Whether Microsoft takes more action remains to be seen, Haislmaier noted. He acknowledged the company previously has complained about its patents being allegedly violated by Linux. "The proof will happen over time whether this is the opening salvo [of] Microsoft putting patents where its mouth has been," said Haislmaier.

He advised managing open source risks by knowing what open source software is being used and complying with applicable licenses. There also are indemnification services that cover multiple open source projects, Haislmaier said. He has done work for OpenLogic, which has offered this type of service, he added.

A critic of Microsoft, Roy Schestowitz, editor of the Boycott Novell Web site, emphasized Microsoft's pursuit of royalties as a new development. "My stance is that TomTom is likely to be one company among several more that were quietly pressured to pay Microsoft for software patents," Schestowitz said. Microsoft declined to respond to Schestowitz's comment.

The three U.S. patents Microsoft says are violated by TomTom's Linux kernel are:

* Patents 5,579,517 and 5,758,352, providing a common name space for long and short file names.
* Patent 6,256,642, for a method and system for file system management using Flash-EPROM.

The other five patents are:

* Patent 6,175,789, pertaining to a vehicle computer system running multiple applications.
* Patent 7,054,745, offering a method and system for generating driving directions.
* Patent 6,704,032, for interacting with a controllable object in a graphical user interface environment.
* Patent 7,117,286, providing for a portable computing device-integrated appliance.
* Patent 6,202,008, for a vehicle computer system with wireless Internet connectivity.


Read More ...

Oracle prepping broad-based social-networking suite

Oracle is developing an enterprise social-networking suite that employs technologies initially created for internal use by the Oracle Asia Research and Development Center (OARDC), according to a pair of official company documents.

The technology, titled Oracle Social Suite, has apparently not been formally announced by Oracle. It combines a wide range of social-networking features, according to an internal case study produced in September.

They range from the basics -- such as a blog system that uses Movable Type as a front end; bookmarking; tagging; and aggregated information feeds -- to more conceptual ideas, like Oracle Social Graph, which provides a visual map of the connections between users and content.

The suite also includes OpenSocial Container, which enables users to plug in applets that meet Google's OpenSocial standard.

An architectural rendering of the suite depicts it as a layer that sits on top of Oracle's database, middleware, and search technologies, culminating in a top layer of "social enabled" enterprise applications.

Oracle has already moved in this direction through its Social CRM applications, but the company is apparently hoping to make social technologies more pervasive.

The Social Suite project dates back several years at OARDC, according to the documents.

OARDC is a distributed division, with hundreds of employees working in nine cities and seven countries, across multiple time zones. Growing frustrated with sprawling e-mail volumes, repetitive meetings and high travel expenses, OARDC began adopting social tools, according to the documents.

The effort eventually became a beta project code-named "Shiji." Over time, the technology was pushed out to other Oracle business units, the documents state.

It is not clear when or if the Social Suite will become a commercially available product. Neither document provides a release date, although one states that Oracle is "actively recruiting" proof-of-concept customers in Japan.

Oracle may be wise to test the commercial waters slowly, given the saturation level in the social-networking market. Scores of companies are selling platforms for enterprise and outward-facing use, and not all are making it. An Intel-backed product called SuiteTwo, announced in 2006 to great fanfare, is being phased out.

Oracle's suite may have the most appeal for Oracle-centric shops, since it is built on top of many Oracle products.

Social Suite will also have to find a comfortable role alongside technologies like Beehive, Oracle's secure messaging and collaboration platform.

An Oracle spokeswoman could not immediately provide additional comment.


Read More ...

Report: Microsoft to bring full SQL Server database to cloud

Microsoft, which last year announced plans to bring a limited version of its SQL Server database online, now plans to offer a fully capable cloud version.

Microsoft told The Register earlier this week that it plans to incorporate as many features as possible from its flagship database into SDS (SQL Data Services) before the cloud-computing platform upon which it will run, Windows Azure, is released.

Microsoft CEO Steve Ballmer told Wall Street analysts earlier this week that Azure, currently in beta form, would be finalized by the time of Microsoft's PDC (Professional Developers Conference) in November. Microsoft could not immediately be reached for comment.

In a follow-up posting at its SDS blog, Microsoft confirmed that at its MIX Web developer show next month, it will be "unveiling some new [SDS] features that are going to knock your socks off."

SDS was announced at MIX last spring. Still in beta, SDS was to offer a fraction of SQL Server's feature set and was aimed at developers at startups and smaller Web-focused companies with limited database experience.

This would be easier and cheaper to set up than a full, on-premises SQL Server, or even a hosted version of SQL Server, which Microsoft already allows its partners to sell.

In that scenario, users often still need to manage -- albeit remotely -- a full SQL Server database, and usually also buy SQL Server licenses and the underlying hardware.

Microsoft has acknowledged that pressure from partners and customers motivated it to enhance its upcoming Web 2.0-compliant SDS.

It may also have been motivated by its cross-town cloud computing rival, Amazon.com, which offers full versions of SQL Server, Oracle 11g, and MySQL via its Elastic Compute Cloud app-hosting service.

Other vendors announcing cloud versions of their databases in the past year include Aster Data Systems, Vertica Systems, and even IBM.

Computerworld is an InfoWorld affiliate.




Read More ...

HP adds to apps performance management pack

HP boosted its software for managing application performance this week with upgrades to its performance testing suite and load-testing software.

With the HP Performance Center 9.5 suite, customers can validate application performance against business requirements during the testing cycle to mitigate risks associated with deployment and upgrades, HP said. Featured in the suite is a new version of LoadRunner, which is HP's load-testing package. HP acquired these application performance management technologies when it bought Mercury Interactive in 2006.

The company also updated its ALM (application lifecycle management) services to enable IT organizations to build "Centers of Excellence" to increase application quality and reduce costs, HP said. HP defines a Center of Excellence as a platform for performance-optimization processes, combined with consulting and support.

The newly launched products are intended to provide a business-centric view of the application lifecycle to better meet service expectations.

Performance Center 9.5 consolidates performance-testing capabilities in a Center of Excellence, offering features for sharing tests and for trending; with trending, users can see changes in performance between iterations. Version 9.5 also addresses Web 2.0 applications with support for such technologies as Adobe AMF (Action Message Format), which businesses can use to build rich Internet applications.

"Web 2.0 is driving some very interesting new applications on the whole," said Mark Sarbiewski, senior director of products for HP solutions. But Web 2.0 also is "creating all kinds of performance problems," he said.

With support for AMF and externalized objects, such as streaming and video in a Web application, HP hopes to help users understand performance better.

HP also is offering the ability to identify protocols in LoadRunner and Performance Center. "We now can identify and advise on the protocols that we're finding, which helps to really reduce the time to set up the right tests," Sarbiewski said.

With the Centers of Excellence Model, HP Performance Center users can increase resource efficiency and overall quality, according to HP. Centers of Excellence also help companies share skills across an organization and retain talent, according to HP.


Read More ...

ManageEngine moves network management tools to the cloud

ManageEngine is beginning a transformation this week to offer network management tools SaaS-style.

The company took its first step down the software-as-a-service path with a free beta of OpManager On-Demand, a service that enables customers to manage servers, desktops, and applications just as they would with the company's on-premises application.

ManageEngine also owns the Zoho office, collaboration, and CRM services, and the company is hoping to tap into similar success with its array of network management products. Girish Mathrubootham, a vice president at ManageEngine, says that Zoho passed the 1 million users milestone last year and "is still seeing more and more people come onboard every day."

To that end, Mathrubootham says that throughout 2009, and likely into the future, the company will continue making more of its management applications available via SaaS. Within two months or so, ManageEngine will launch a service desk offering that will provide asset tracking and lifecycle management, he adds.

"We also have plans to offer customers an option to move data from the cloud to on-premise, or vice-versa, if they want," Mathrubootham explains. "That could be a next logical step."

Indeed, analysts at IDC's Cloud Computing Forum this week said that among the chief concerns IT shops express about the model are security, interoperability, and choosing a host that won't go out of business during this recession.




Read More ...

Lawmaker questions gov't money for broadband rollout

Even though she has constituents in her congressional district who want broadband but can't get it, Representative Marsha Blackburn suggested Thursday that government should have little to no role in stimulating broadband deployment.

Blackburn, speaking at a communications policy forum, held a telephone town meeting Wednesday evening, and one woman called in to complain that broadband service stopped a mile from her house. The constituent, living in a rural area, complained that she was "on the dial-up," and her continuing efforts to convince a broadband provider to offer service have been rebuffed, said Blackburn, a Tennessee Republican.

[ Related: "Obama includes broadband, smart grid in stimulus package" and "Does the U.S. need a new broadband policy?" | Your source for the latest in government IT news and issues: Subscribe to InfoWorld's Government IT newsletter. ]

"I need high-speed Internet delivered to my home, and I'm tired of waiting," Blackburn quoted the woman as saying.

Asked if the U.S. government should provide money to reach people like her constituents, Blackburn said no. A $787 billion economic stimulus package that was passed by Congress earlier this month included $7.2 billion for broadband deployment to rural and other underserved areas, but Blackburn was critical of the legislation.

If more people in the constituent's area demand broadband, a provider will bring it to them, Blackburn said. "That is where I think we do let the market handle the job," she said. "I fully believe that the market can work this out."

Blackburn criticized the broadband money in the stimulus bill, saying it came with too many strings attached. More than half of the money is subject to net-neutrality conditions prohibiting companies receiving broadband grants from discriminating against some Internet traffic and from refusing to connect with other providers.

Net-neutrality regulations, supported by President Barack Obama, could slow deployment and inhibit broadband competition in the long run, she said. She called the policy "short-sighted."

Competition among providers will work out any problems with some providers blocking or slowing Web content, she said. "There is more diversity of opinion, diversity of content, and media platforms to distribute that content than at any other time in history," she said.

Blackburn also noted that the federal government hasn't determined what areas of the country are not covered by broadband. Money for broadband mapping was included in the stimulus package.

Blackburn found little disagreement with her net-neutrality views at a panel discussion following her speech at a communications conference hosted by the Free State Foundation, a conservative think tank. The panel included representatives of four large broadband and wireless providers and two conservative professors, but no strong net-neutrality advocates.

Advocates of net-neutrality rules say they are necessary to preserve an open Internet where customers can find the content of their choice. Broadband providers may be tempted to give higher priority to content provided by themselves or partners, net-neutrality advocates say, and the U.S. Federal Communications Commission has already sanctioned Comcast for slowing P-to-P (peer-to-peer) traffic in the name of network management.

But representatives of Comcast and Verizon seemed to disagree with Blackburn about the role of government in broadband deployment. Many of the remaining areas of the country without broadband are "very expensive to reach," said Thomas Tauke, executive vice president for policy at Verizon. In those cases, there is a role for government subsidies, he said.

A year ago, broadband providers would have been "ecstatic" to hear that Congress was planning to provide a "couple hundred million" dollars for broadband deployment to rural areas, Tauke said.

It was "very heartening" to see Obama's first move on broadband policy was to provide funding for grants to areas unserved, added Joseph Waz, senior vice president for external affairs at Comcast.



Read More ...

How to achieve more 'Agile' application security

Application security has become a critical component of all software development efforts. It includes all measures taken throughout the software development lifecycle to prevent programming flaws from being exploited. The flaws that creep in during the requirements, design, development, deployment, upgrades, or maintenance stages of applications become the basis of cyberattacks.

A lack of security built into applications, coupled with poor programming techniques, is routinely exploited by hackers and has contributed to massive monetary losses attributed to data breaches and theft of intellectual property. That is why security must be an integral part of the application development methodology, and the Agile process is no exception.

The Agile development process is generally said to have been born in February 2001, with the Agile Manifesto, and this approach to software development has been gaining ground in recent years. The Agile process is focused on iterative discovery and development that aligns development with customer needs and company goals. That said, I have found that the basic characteristics of Agile tend to push security off until after the business functionality has been built. Many times security is not included in the initial release of functionality, making Agile security bolted on rather than built in.

In examining application security issues, I tend to group them into two categories: business logic flaws and technical vulnerabilities. It is widely recognized that security must be built into applications rather than bolted on at the end, and this presents a challenge when using the Agile methodology. It has also caused great debate about the suitability of Agile for applications that involve sensitive security information or that could provide a covert pathway (aka a backdoor) into other systems.

I, like many, believe Agile processes are unsuitable for security-sensitive software applications. This is primarily due to the lightweight, informal, build-as-you-go nature of Agile processes. Security must be built in, not bolted on; therefore, it must be integrated throughout the Agile process.

Security starts before the project. The security strategy and objectives must be established at the very beginning. After that and during the project-definition step, high-level security requirements and objectives must be established, documented, and communicated to the development team. Once we have these cornerstones of security set, we must assess what is necessary to meet these objectives. In the scoping and estimation step, time must be allocated to review the evolving requirements and refine the security requirements and objectives. After the scope of security is defined and the level of effort estimated, high-level security plans are developed. When you are developing these high-level security plans, you must establish security coordination activities that evaluate the security measures at each point along the iterative development effort and make sure those measures are considered. Now the foundation has been established and we can move on to security within the Sprints.

One of the first steps for iterative development is to establish a theme for each Sprint that defines what type of capabilities will be addressed during this segment of development. As the theme of each Sprint is identified, forecasted security implications are discovered and documented and possibly stubbed in during development.

Stubbing is a technique that allows portions of the application to be developed without the entire functionality being coded at that time. Now it is important to capture security-related scenarios and details that coincide with the capabilities being defined throughout the iterative discovery and development process of the specific Sprint.
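
To make the stubbing idea concrete, here is a minimal Python sketch; the names are hypothetical and this is one illustration rather than a prescribed Agile practice. The key design choice is that the stub fails closed, denying access until the real check is built in a later Sprint:

```python
class StubbedSecurityControlError(Exception):
    """Raised whenever a stubbed security control is exercised."""

def check_permission(user, resource):
    """Hypothetical authorization check, stubbed in for this Sprint.

    Failing closed keeps the unfinished control from silently granting
    access during demos and low-level testing.
    """
    # TODO: replace with the real authorization logic in a later Sprint.
    raise StubbedSecurityControlError(
        f"authorization for {resource!r} is stubbed; denying by default"
    )
```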

As the security requirements are uncovered, a decision must be made on whether to include the security capabilities, stub them in, or defer them to later Sprints. Two points really frame this decision: the most significant is whether or not the software will be deployed and actively used; the other is the security risks and the sensitivity of the information involved.

If it will be deployed, security must be built in and tested as part of the Sprint. If not, stubbing and deferral are both viable options. At this point, iterative development begins. The software is developed and typically undergoes low-level testing as well as demos and code walkthroughs with the customer. These test cases and scenarios must exercise the security measures; that said, it has been my experience that this often does not happen. All security-related functions and features that have been included must be exercised and demonstrated. Increased attention in terms of testing and code reviews must be given to race conditions, cross-site scripting, information leakage, and SQL injection. These four coding problems have proven to be the most common software vulnerabilities in Web applications. Once basic testing has been completed, the software can be moved to a simulated deployment environment.
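
As one way to make a Sprint's test cases exercise security measures, consider a minimal sketch using only the Python standard library. The application under test is faked with an in-memory database, and the test names are hypothetical stand-ins, not the author's process:

```python
import html
import sqlite3
import unittest

class SprintSecurityTests(unittest.TestCase):
    """Sketch of Sprint-level tests that exercise security measures."""

    def test_sql_injection_is_neutralized(self):
        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE users (name TEXT)")
        conn.execute("INSERT INTO users VALUES ('alice')")
        hostile = "alice' OR '1'='1"
        # Parameter binding treats the payload as data, not as SQL,
        # so the injection attempt must match no rows.
        rows = conn.execute(
            "SELECT name FROM users WHERE name = ?", (hostile,)
        ).fetchall()
        self.assertEqual(rows, [])

    def test_output_is_escaped_against_cross_site_scripting(self):
        payload = "<script>alert(1)</script>"
        # Any page that echoes user input should escape it first.
        rendered = html.escape(payload)
        self.assertNotIn("<script>", rendered)

if __name__ == "__main__":
    unittest.main()
```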

The software delivered as part of the Sprint is installed into a representative operational environment. The vast majority of the time, the development environment is far different from the operational environment, which can cause, and has caused, operational problems and security issues. Deployment in an environment that mimics production is required so these problems and issues can be resolved. All the test cases and scenarios used previously form the basis for regression testing: the automated test script is replayed to validate proper operation in the new environment. In addition, new test scripts and scenarios are created to fully exercise the software from end to end, as well as to examine and test for security vulnerabilities. Once all of these tests are successful, the software moves on to the next stage, which is acceptance by the customer.
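
Replaying the automated script can be as simple as re-running the same suite with the environment swapped out. A minimal sketch, assuming the Sprint's test cases live in a tests/ directory and read a hypothetical TARGET_BASE_URL setting (both names are illustrative):

```python
import os
import sys
import unittest

# Point the existing test cases at the simulated deployment environment.
# TARGET_BASE_URL and the staging host are hypothetical names.
os.environ.setdefault("TARGET_BASE_URL", "https://staging.example.internal")

# Replay the same suite used during Sprint-level testing.
suite = unittest.TestLoader().discover("tests")
result = unittest.TextTestRunner(verbosity=2).run(suite)
sys.exit(0 if result.wasSuccessful() else 1)
```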

At this stage the software delivered as part of the Sprint or Sprints is demonstrated, evaluated, and verified, and is accepted or rejected by the business customer. This must include an examination of security: just as the customer has business acceptance criteria, they should also create security acceptance criteria. While acceptance seems to be black and white, it is not; conditional acceptance is the norm. Often the sponsor will accept the Sprint delivery only if this, that, and the other thing are changed. Once the conditions of acceptance are identified, the rework required to meet them is scheduled and carried out. When the rework is done and tested, it is reviewed and demonstrated to the business customer to ensure the intent of the conditional acceptance has been met.

Capturing lessons learned is the process of gathering, documenting, and analyzing feedback received during the Sprint, so that subsequent Sprints can benefit from the experience. This is critical because capturing security lessons learned gives the team members a chance to reflect on tasks, events, and activities during the Sprint that contributed to security shortcomings. It also requires a retrospective examination of the risk management implementation, documenting successes and shortcomings.

Putting it all together, all the software developed to date must be integrated and validated to ensure proper function. Monitoring and tracking are required in the short term to make sure all the software and system components are working together properly and do not create security vulnerabilities. Interactions between systems must be tested from a security standpoint, and new security test scenarios must be developed and executed within this step.

In addition, as discussed earlier, increased attention in terms of testing must be given to cross-site scripting, information leakage, and SQL injection. Once everything is working together properly and is secure, it is ready to be released into production. Release management (RM) is a relatively new part of software development methodologies and projects; RM becomes the liaison between all the business units and IT to guarantee a smooth transition to the new system. Finally, the time has come to move to the next Sprint. The wrap-up step officially closes the Sprint. It also creates the project artifacts and ensures proper documentation and the backup and archiving of the code. This step often requires changes to the security monitoring processes and capabilities in place at the business.

Conclusion
The security industry has given 2008 the dubious honor of being the "data loss year," due to the significant number of sensitive-information security breaches that occurred. In congressional testimony, Director of National Intelligence Dennis C. Blair stated that global companies may have lost more than $1 trillion worth of intellectual property to data theft in 2008. Software vulnerabilities represent one of the leading causes. This should concern everyone involved in application development, regardless of methodology. It should also be seen as a warning to Agile development projects that they need to ensure proper security is built in, not bolted on.

CSO Online is an InfoWorld affiliate. Kevin G. Coleman is a 16-year veteran of the computer industry. A Kellogg School of Management Executive Scholar, he is a former chief strategist of Netscape. He is now a Senior Fellow and International Strategic Management Consultant with the Technolytics Institute, an executive think tank. For six years he served on the Science and Technology advisory board for the Johns Hopkins University Applied Physics Lab, one of the leading research institutions in the United States, and for four years he served on the University of Pittsburgh Medical Center's Limbaugh Entrepreneurial Center's Advisory Board. He has published over sixty articles covering security and defense-related matters, including UnRestricted Warfare and Cyber Warfare & Weapons. In addition, he has testified before the U.S. Congress on cybersecurity and is a regular speaker at security industry events and the Global Intelligence Summit.


Read More ...
