Sunday, January 9, 2011

Fastest Computer

Cloud computing holds plenty of promise—one key to success is planning ahead.

Cloud computing is an umbrella term used loosely to describe the ability to connect to software and data via the Internet (the cloud) instead of your hard drive or local network. The following story is the second in a three-part series—aimed at helping IT decision-makers break through the hype to better understand cloud computing and its potential business benefits.

As more and more businesses consider the benefits of cloud computing, IT leaders suggest that the best strategy for this emerging technology model is a thoughtful plan that weighs its impact on all corners of the business.

Because cloud computing is still a fledgling model, vendors and standards bodies are busy sorting through definitions and interoperability issues. That aside, businesses can be certain that cloud computing, at its core, is an outsourced service, and outsourcing by its very nature implies potential risks.

Cloud computing can offer real benefits, including lower data center and overall IT costs, streamlined operational efficiency, and a pathway to the latest technology. But that doesn't mean it's the best path for every company or every application.

Below, Lori MacVittie, technical marketing manager at Seattle-based application delivery provider F5 Networks, and William Penn, chief architect at Detroit-based on-demand platform provider Covisint, weigh in on the key risks companies should consider before making the move to cloud computing.

1. Lack of planning. The biggest risk is not having a roadmap, says Penn. Companies need to understand how external services fit into their enterprise as a whole.

"It can be a struggle for some people to have the vision to incorporate outside services into their business plan," says Penn. "But the outside network, the cloud, needs to be part of that roadmap."

2. Integration challenges. Most businesses aren't moving all of their applications to the cloud, and probably never will, and this creates data integration challenges. Penn says it's important to remember that it isn't just hardware or software that needs integration, but also processes, problem resolution, and employee interaction with data and systems.

3. Security concerns. Security is top of mind for IT executives—both the physical security of the data center, as well as the intervening network and security of the data itself. Data in the cloud is housed and accessed via an offsite server owned by a third party. Companies need to carefully consider the security and liability implications for proprietary data and overall business models.

4. Compliance guidelines needed. Cloud providers haven't yet addressed various industry standards such as HIPAA or Sarbanes-Oxley, so companies with strict compliance or audit constraints are less likely to be able to use external applications.

5. Lack of technology standards. For now, there are no technology industry standards for coordination within and among data centers or vendors. Technology industry leaders are still debating the definition of cloud computing itself, so it will take some time before any standards are set.

"We'll need them, though," says MacVittie. She cautions that in the meantime, it's possible for early adopters to select a vendor now that may not be compliant with future standards. And, she says, vendor lock-in is common right now.

"That's okay for a company testing the waters with smaller applications or departments," says MacVittie. "But what if you're locked in and that application becomes critical?"

Both Penn and MacVittie agree that, at the moment, there are enough uncertainties and risks that IT decision-makers considering cloud computing should first step back and weigh those risks against the potential benefits to their businesses.


The world's fastest computers are Linux computers

There are fast computers, and then there are Linux fast computers. Every six months, the Top 500 organization announces "its ranked list of general purpose systems that are in common use for high end applications." In other words, supercomputers. And, as has been the case for years now, the fastest of the fast are Linux computers.


As Jay Lyman, an analyst at The 451 Group, points out, Linux is only growing stronger in supercomputing. "When considered as the primary OS or part of a mixed-OS supersystem, Linux is now present in 469 of the supercomputer sites, 93.8% of the Top500 list. This represents about 10 more sites than in November 2007, when Linux had presence in 91.8% of the systems. In fact, Linux is the only operating system that managed gains in the November 2008 list. A year ago, Linux was the OS for 84.6% of the top supercomputers. In November 2008, the open source OS was used in 87.8% of the systems. Compare this to Unix, which dropped from 6% to 4.6%, mixed-OS use which dropped from 7.2% to 6.2% and other operating systems, including BSD, Mac OS X and Windows, which were all down this year from the November 2007 list."

Microsoft is proud that a system running Windows HPC Server 2008 took 10th place... behind nine supercomputers running Linux. Even then, this was really more of a stunt than a demonstration that the HPC Server system is ready to compete with the big boys.

You see, there are no Microsoft programming tools for writing supercomputer-compatible applications. Those won't arrive for years, with Visual Studio 2010 and whenever Microsoft's F# becomes more than a research-project language. In short, Windows HPC isn't ready for prime time.

In the meantime, the real work is being done on the Linux computers. The number one supercomputer? Once more, it's IBM's Linux-powered Roadrunner. That's the same supercomputer that this summer broke supercomputing's sound barrier: a sustained run of more than one petaflop, or 1.026 quadrillion calculations per second. Beat that, Microsoft!
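For anyone doing the unit math behind that headline number, the conversion is straightforward:

    1 petaflop = 10^15 floating-point operations per second
    1.026 petaflops = 1.026 x 10^15 operations per second, i.e. about 1.026 quadrillion calculations per second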

The Roadrunner does have competition now, though. The Cray XT Jaguar also recently busted through the petaflop wall. The Cray also, of course, runs Linux. In the XT's case, it's running CNL (Compute Node Linux), which is based on SUSE Linux.



Needless to say, all the Linux systems do have working parallel-programming toolchains, such as the GCC, PGI and PathScale compilers. For now, and for the foreseeable future, Linux systems will not only remain the fastest computers, they'll also be the most useful fast computers.
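To give a rough idea of what those toolchains look like in practice, here is a minimal OpenMP sketch in C that GCC can build on any Linux box with its -fopenmp flag. It's purely illustrative (the file name, array size and summing task are arbitrary choices, not anything from the Top500 systems); real supercomputer codes lean on MPI and tuned vendor libraries on top of compilers like these.

    /* sum.c -- minimal OpenMP example; build with: gcc -fopenmp sum.c -o sum */
    #include <omp.h>
    #include <stdio.h>

    int main(void) {
        const int n = 1000000;
        static double a[1000000];   /* static to keep the array off the stack */
        double sum = 0.0;

        for (int i = 0; i < n; i++)
            a[i] = 1.0;

        /* Split the loop across threads; each thread keeps a private
           partial sum, and the reduction combines them at the end. */
        #pragma omp parallel for reduction(+:sum)
        for (int i = 0; i < n; i++)
            sum += a[i];

        printf("sum = %f (max threads: %d)\n", sum, omp_get_max_threads());
        return 0;
    }

Trivial as it is, the same pattern (compile with an open toolchain, spread the work across cores or nodes) is what the Linux-based machines on the list do at vastly larger scale.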


A supercomputer is a computer that is at the frontline of current processing capacity, particularly speed of calculation. Supercomputers introduced in the 1960s were designed primarily by Seymour Cray at Control Data Corporation (CDC), and led the market into the 1970s until Cray left to form his own company, Cray Research. He then took over the supercomputer market with his new designs, holding the top spot in supercomputing for five years (1985–1990). In the 1980s a large number of smaller competitors entered the market, in parallel to the creation of the minicomputer market a decade earlier, but many of these disappeared in the mid-1990s "supercomputer market crash".

Today, supercomputers are typically one-of-a-kind custom designs produced by "traditional" companies such as Cray, IBM and Hewlett-Packard, who had purchased many of the 1980s companies to gain their experience. The IBM Roadrunner, located at Los Alamos National Laboratory, is currently the fastest supercomputer in the world.