French, Tekrati –
Bloor Research crosses the aisle and declares Linux enterprise-ready. META Group sees signs of changing tides in data center operating system dominance. The Linux-mainframe marriage makes sense with planning according to D.H. Brown and Giga, while Illuminata suggests maybe you shouldn’t care. Aberdeen Group finds that infosec attacks leveled the playing field among operating system targets in 2002 and urges suppliers and users to replace outdated defense strategies. Gartner says lack of user demand has slowed distro adoption of available Linux security enhancements.
The Enterprise Penguin
Linux is winning the favor of more industry analysts. Following the recent Forrester Research brief Linux Finds Footing in the Enterprise, Bloor Research will declare Linux enterprise-ready in December, reversing the firm’s three-year position against Linux in enterprise IT. The upcoming report, Linux – Enterprise Ready?, examines Linux on six fronts: scalability/consolidation, availability, reliability, security, manageability and flexibility.
“Linux now scales well on Intel hardware, and by taking advantage of failover extensions from Linux distributors and Grid suppliers, high availability can be achieved,” said Joe Clabby, president of Bloor Research North America. “Linux is proven to be reliable, especially for dedicated applications, and its open source nature ensures that it is at least as secure as its rivals.”
Bloor Research findings include:
- At present, Linux scales extremely well horizontally in distributed “grid” computing configurations and scales well vertically to 6-way SMP on Intel hardware. Over the next three months, Bloor analysts expect Linux 2.5 scaling work to be backported into Linux 2.4 kernel distributions (allowing 8-way scaling), with Linux 2.6 providing up to 16-way scaling in about a year. Bloor also judges IBM zSeries Linux scalability using virtual machine technology “ideal for server consolidation purposes.”
- Grid vendors offer sophisticated Linux management capabilities in the form of distributed resource management tools, utilities, and applications. In addition, some vendors with major commitments to systems management have ported their Unix suites to Linux, producing rich collections of Linux management tools, utilities, and applications.
- Linux has been highly successful in achieving its flexibility goals: it runs on small, mobile chipsets such as ARM; various embedded chipsets; popular but somewhat obscure chipsets such as Saturn, Hitachi’s H8, Atmel AVR and the Motorola 68K family; and powerful enterprise server chips such as HP’s Alpha, Sun’s UltraSparc, Intel’s Pentium and Itanium series, and IBM’s PowerPC series.
Data Center Darwinism
META Group research indicates that the data center operating system capacity mix will change dramatically by 2012. According to research by Rob Schafer, program director for META Group’s Enterprise Data Center Strategies, Unix will maintain dominance through 2007 (declining from 45% to 40% of capacity), Windows will grow from 20% to 38%, and Linux will move from 3% to 11%. z/OS capacity will continue growing at a slow 14% annual rate, shrinking its share from 32% to 11%. By 2012, Windows will dominate with 51%, followed by Linux at 26%, Unix at 20%, and z/OS at 3%. In addition, net annual data center server capacity growth is slowing from the historical 45%-50% to 40%. Schafer attributes the deceleration to budget constraints, improved server utilization and better demand management. META Group’s bottom line: data center users should focus on optimizing Windows operational processes and management tools while aligning Unix and Linux support.
Always a Bigger Fish
For IT shops looking to consolidate smaller servers onto a mainframe, D.H. Brown finds the IBM eServer zSeries delivers economies of scale. The zSeries offers a variety of extensions that allow workloads from Linux, UNIX and Windows servers to be consolidated effectively. Typical workloads for Linux consolidation include file and print, messaging, web serving, infrastructure (firewall, DNS) and line-of-business applications. Analysts report that strong zSeries scalability, reliability, security and workload management capabilities translate into business benefits, including physical resource savings and lower costs.
Mainframe Linux also came under Giga Information Group scrutiny. Linux analyst Stacy Quandt cautions that selecting a mainframe Linux distribution certified for specific workloads is a non-trivial task. She offers Giga clients advice in Criteria for Selection: Mainframe Linux Distribution Providers.
Illuminata’s Gordon Haff says to stop worrying about whether or when Linux can successfully attack Big Iron’s stronghold. That, he says, is looking through the old lens, as is measuring an operating system by how many processors it can support in a single box. The new lens is distributed computing, many aspects of which Linux already handles extraordinarily well.
Two Views: Kicking Black Hats
Aberdeen Group incited heated debate with its perspective, Open Source and Linux: 2002 Poster Children for Security Problems, as evidenced in linux.com postings. Intended to draw attention to misconceptions and outdated infosec practices, the paper points to CERT advisories as an indicator that operating systems are equally vulnerable to increasing attacks and that proportionate changes in vendor, user and industry defense strategies are needed. “The threat from traditional viruses is largely dead, replaced by active Internet content attacks,” said Jim Hurley, vice president and managing director of Aberdeen’s security and privacy research. “These attacks range from simple annoyances, like spam, to more nefarious activities, such as taking over port 0 and taking over a computer.”
He said that no operating system is inherently secure against these invisible and pervasive software microbes, a fact easily verified through any reputable bug tracking source. “No matter which way you slice it, the data points to the same direction: no one operating system is more susceptible than the others.
“What we’re really pointing to is the need for software suppliers to do a better job of delivering more secure software in the first place,” said Hurley. “However, it will likely take suppliers years to get there. As a result, IT buyers may be forced to look to new ways of maintaining security currency to avoid being compromised.
“Whether you are using computers to run a business or deliver mission critical services, you need to look at protecting your systems in new ways. You need to know what vulnerabilities exist and where, the impacts of applying a fix or patch and how to deploy the fixes.”
Based on research and analysis over the last two years, Hurley’s team advocates:
- Users: Automating customized security processes through new-generation active electronic infrastructure management (Active eIRM)
- Vendors: Exercising greater rigor in testing for security glitches before releasing software, while addressing long-term quality assurance
- Industry/Community: Creating a more active, global security vulnerability notification and response system
Gartner, Inc. says that several initiatives aim to create a secure Linux, but the enhancements have yet to appear in popular distributions. What are the obstacles? “I don’t think there is a problem with the distros on a technical level,” said Ant Allan, Gartner research director. “There is not yet a demand that would justify the effort.” His report, Linux Security Capabilities: Perspective, includes Gartner’s picks for leaders in mainstream Linux distributions and in secure Linux distributions and enhancements, as well as technology alternatives. Gartner positions on winners and losers usually directly affect the purchasing preferences of IT organizations worldwide.
For more on Linux infosec, read the Linux Advisory Watch column.
Catch ’em Live
21st Annual Data Center Conference
December 9-11: Gartner Conference, Las Vegas, Nevada
Cost: US$1,495 to $1,795
John Phelps, a Gartner vice president and research director, presents the firm’s data center scenario in his keynote address. As data center platforms evolve, the IS department is asked to do more work with a smaller staff and budget. Phelps will examine ways to continue managing traditional workloads while adapting to handle new application environments. Key issues include:
- How are Unix, Linux and Windows evolving to meet the more stringent needs of the mission-critical data center?
- Will the mainframe become a more or less significant player during the next five years?
- What issues are causing sleepless nights for data center management?