Magazine Article | October 1, 2002

What's In Your Storage Future?

Source: Field Technologies Magazine

As new technologies emerge to handle data growth, one storage factor won't change - the need for management tools.

We've entered the new millennium, an "information age" greatly anticipated and sometimes feared. The power of information will mold and guide our thinking, our working, our living - truly every aspect of human endeavor. The Information Age actually started with Johannes Gutenberg's 15th-century invention of movable type, which led to the printing press. By making it possible to mass-produce books and other documents, the printing press caused the cost of information to plummet and exponentially increased its availability.

Centuries later, the Internet has accelerated information to the speed of light. Digital data is currently increasing at a rate of more than 1.5 EB (exabytes) per year. The amount of data under management globally has already surpassed 21 EB.

As a result of this rapidly escalating data, applications for storage are now being planned at levels never before imagined. The entertainment and broadcasting industries alone are estimated to be using more than 6 zettabytes (10²¹ bytes). How much data is that? Well, a stack of 3.5-inch floppies holding one zettabyte of data would reach from Earth to Saturn. One European broadcasting company has already defined a requirement for 60 EB of storage. That's the equivalent of 1.5 quadrillion pages of text.
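Those comparisons hold up as rough, order-of-magnitude arithmetic, and the short Python sketch below reruns the numbers. The floppy capacity and thickness, the Earth-Saturn distance, and the roughly 40 KB per stored page are our own illustrative assumptions, not figures from the original estimates.

```python
# Back-of-the-envelope check of the storage comparisons above.
ZB = 10**21   # one zettabyte, in bytes
EB = 10**18   # one exabyte, in bytes

FLOPPY_BYTES = 1.44e6         # assumption: 1.44 MB per 3.5-inch floppy
FLOPPY_THICKNESS_M = 0.0033   # assumption: ~3.3 mm per disk
EARTH_SATURN_M = 1.3e12       # assumption: ~8.7 AU average separation

# Height of a stack of floppies holding one zettabyte
floppies = ZB / FLOPPY_BYTES
stack_m = floppies * FLOPPY_THICKNESS_M
print(f"stack: {stack_m:.1e} m, {stack_m / EARTH_SATURN_M:.1f}x Earth-Saturn")
# -> about 2.3e12 m: the same order as the Earth-Saturn distance

# Pages of text in 60 EB, assuming ~40 KB per stored page
BYTES_PER_PAGE = 40_000
print(f"pages in 60 EB: {60 * EB / BYTES_PER_PAGE:.1e}")
# -> 1.5e15, i.e. 1.5 quadrillion pages
```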

In terms of storage technologies, are we prepared for the future - near and far - of storage growth? Of course! Let's take a look at the road map.

Disk, Tape, And Beyond
An anticipated slowdown in magnetic disk progress is beginning to show. While magnetic recording continues to scale to increasingly dense operating points at ever-decreasing costs per unit of storage, physical limitations are in sight.

While magnetic disk progress is likely to slow, magnetic tape development should continue. Thus far, recording to the flat, spinning surface of a disk has been an easier engineering challenge than recording to the wavy surface of tape, which has kept tape some 10 years behind disk in density. However, magnetic tape has largely held onto its volume and cost advantages over other technologies by using a larger media area to offset lower data densities. Because tape hasn't yet reached its density peak, it will develop through several more generations.

New data storage technologies are evolving with the potential to rival existing magnetic disk and tape technologies. MEMS (micro-electromechanical systems) probes will someday compete with both solid-state and disk memory. MEMS devices use integrated circuit-like technology to create tiny disk arrays on chips. They offer data densities beyond disk, as well as transaction rates 10 times those of performance-class disk drives. They also consume much less power. Multiple developers continue to work toward releasing MEMS products.

Another alternative, holography, promises higher storage densities than disk and tape in low-cost media (less than $1 per TB), with data transfer rates of hundreds of MB per second. With capacities reaching as high as 10 TB per cubic inch (at that density, a petabyte would occupy only about 100 cubic inches), holographic recording could translate to petabytes of storage in future subsystems. To date, however, only one proof-of-concept for holographic storage has been reported.

From Magnetics To Molecules
So, information storage has a future beyond what we get from magnetics. But, what about our ability to manipulate data? What will happen with computing power?

In terms of chip and circuit design, there are limitations associated with the physics of traditional photolithography (the use of light to etch patterns on silicon chips). One key issue is the ability to position circuit features within a few nanometers (billionths of a meter) of one another. Eventually, the spacing between the photo-etched paths on a chip will become so small that it will be impossible to keep the metal in each path from touching. The result is a short circuit, which renders the chip unusable.

Roughly in the same time frame that photolithography reaches its limits, molecular computing will become commonplace. Credible research organizations and corporations are developing ways of manipulating proteins to create synthetic DNA structures. These structures form new molecules that are the computing equivalent of a transistor: DNA and synthetic DNA strands are assembled into nanomachine lattices for molecular circuits. Molecules sandwiched between two layers of electrodes act as switches or memory units when voltage is applied across the electrodes. Remember, these molecular transistors can be used for computing or for data storage.

While molecular computing may seem like the stuff of science fiction, it is important to remember how rapidly information-processing power has progressed in the computer age. In 1954, when IBM created the first mainframe computer, the processor required 2,000 transistors. Today's microprocessors can have as many as 42 million transistors. In the next 15 years, we will see microprocessors with 42 billion transistors running at speeds in the terahertz range - a thousandfold increase, or about 10 doublings, which is just what Moore's Law predicts at one doubling every 18 months. Intel has already announced a microprocessor for 2007, billed as a terahertz processor, that will have the equivalent processing power of 10 GHz.

Some neuroscientists have estimated that, within the next 15 to 20 years, a microprocessor will have the equivalent computing power of a human brain. Other scientists have estimated that, by 2055, $1,000 worth of computing will equal the combined processing power of all human brains on Earth.

Autonomic Storage Ensures Your IT Survival
What does all of this mean for those of you charged with managing central data repositories and corporate data centers? One thing is clear: the need for policy management is acute. It is easy to argue that managing a data center today already exceeds what human intelligence alone can handle. Take away all of the network and operations management tools you already have in place and see how long your data center would continue to run. Not long. The massive technology shifts on the horizon will only amplify that management complexity.

Therefore, there will continue to be two major thrusts in data management. The first is information management: the extraction of disparate data into real-time, managed information. Database technologies will reign supreme here. Organizations that can't perform sophisticated, real-time data extraction simply won't be able to compete.
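As a minimal sketch of what that extraction looks like, the Python example below consolidates two hypothetical, differently formatted feeds into a single queryable store. The feed formats, field names, and use of an in-memory SQLite database are illustrative assumptions, not any particular vendor's product.

```python
import csv
import io
import json
import sqlite3

# Two hypothetical "disparate" feeds (formats assumed for illustration):
# a CSV export and a JSON event stream describing the same kind of record.
csv_feed = "order_id,region,amount\n1001,EMEA,250.00\n1002,APAC,75.50\n"
json_feed = '[{"id": 1003, "region": "AMER", "amount": 410.25}]'

# Consolidate both feeds into one managed, queryable table.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL)")

for row in csv.DictReader(io.StringIO(csv_feed)):
    db.execute("INSERT INTO orders VALUES (?, ?, ?)",
               (int(row["order_id"]), row["region"], float(row["amount"])))

for rec in json.loads(json_feed):
    db.execute("INSERT INTO orders VALUES (?, ?, ?)",
               (rec["id"], rec["region"], rec["amount"]))

# The payoff: an up-to-the-moment summary over formerly disparate data.
for region, total in db.execute(
        "SELECT region, SUM(amount) FROM orders GROUP BY region"):
    print(region, total)
```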

The other thrust will be infrastructure management. While infrastructure management involves the physical assets in the enterprise, it also includes managing storage resources for sharing, archiving, migration, backup, and hierarchical management. In response, vendors will consolidate in order to provide broader, more comprehensive solutions, and automated management capabilities will be top priorities as they develop products.
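To make hierarchical management concrete, here is a minimal sketch of the kind of age-based migration policy an automated tool might apply. The two-tier layout, the paths, and the 90-day threshold are our own assumptions for illustration, not a real product's behavior.

```python
import os
import shutil
import time

# Hypothetical two-tier hierarchy: fast primary disk, cheap archive tier.
PRIMARY = "/data/primary"   # assumed path
ARCHIVE = "/data/archive"   # assumed path
MAX_AGE_DAYS = 90           # assumed policy: untouched for 90 days -> archive

def migrate_cold_files():
    """Move files not accessed within MAX_AGE_DAYS down the hierarchy."""
    cutoff = time.time() - MAX_AGE_DAYS * 86400
    for name in os.listdir(PRIMARY):
        path = os.path.join(PRIMARY, name)
        if os.path.isfile(path) and os.stat(path).st_atime < cutoff:
            shutil.move(path, os.path.join(ARCHIVE, name))
            print(f"migrated {name} to archive tier")

if __name__ == "__main__":
    migrate_cold_files()
```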

What you should do now is be sure that you partner only with providers that can deliver policy-based, hierarchical storage solutions. When you talk to vendors, make sure they aren't just talking about how they incorporate "what's cool" in terms of the latest technologies. If they don't begin the conversation by describing specific business benefits their products can bring, be wary.

Managing data growth on flat or declining budgets while sustaining 24/7 availability is difficult and challenging, to be sure. Fortunately, policy management can take that complexity out of the minds of humans and put it into the minds of computers. Things will get more complex and challenging, but within the next 10 to 15 years, storage will become autonomic - that is, self-managing, self-administering, and self-healing. You can rest assured that the stress of trying to add ever more sufficiently skilled IT staff to manage your data will join the Gutenberg press as just another subject for the history books.
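What might self-managing storage look like under the hood? The sketch below shows the basic autonomic control loop - monitor, compare against policy, remediate - with a watched path, threshold, and remediation action that are purely illustrative assumptions rather than any vendor's implementation.

```python
import shutil
import time

# Illustrative autonomic loop: monitor, decide against policy, act.
WATCHED_PATH = "/data/primary"   # assumed mount point
FULL_THRESHOLD = 0.85            # assumed policy: act above 85% utilization

def remediate():
    # Placeholder for a self-healing action, e.g. triggering the
    # age-based migration policy sketched in the previous example.
    print("capacity policy violated: migrating cold data")

while True:
    usage = shutil.disk_usage(WATCHED_PATH)
    if usage.used / usage.total > FULL_THRESHOLD:
        remediate()
    time.sleep(60)  # re-evaluate every minute
```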