Organizations are also gathering more data now than ever before. This ‘big data’ is a rich source of competitive advantage, but many organizations are still grappling with more fundamental concerns: managing growth affordably, ensuring constant access, and driving up productivity.

Exponential growth

Traditionally, businesses relied on structured databases that grew at predictable rates. Now, organizations see spikes in unstructured data like audio, video, and photographs.

Ninety percent of the data stored worldwide is less than two years old, explains Corey Dyer, Vice President of Storage Sales at HP Canada. Yet, despite this exponential increase, organizations are still relying on storage architectures designed 15 or 20 years ago. These old designs simply cannot keep up.

"Research indicates that, in most enterprises, one minute of downtime could cost as much as $5,600."

“Businesses are likely to be behind the times if they don’t take a hard look at the systems that they’re using to store, serve, and protect their information,” says Dyer. “That’s why it’s important for companies to understand what they’re doing with their data, and how it’s stored.”

Technologies like thin provisioning, deduplication, and other storage efficiency features are critical to bridging the gap between capacity demand and cost.
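Deduplication, for example, closes that gap by storing each unique block of data only once and replacing repeats with references. The sketch below is a minimal illustration of the idea, not any vendor's implementation; the 4 KB fixed-size chunking and SHA-256 content hashing are assumptions chosen for clarity.

```python
import hashlib

# Minimal block-level deduplication sketch: each fixed-size chunk is
# stored once, keyed by its content hash; duplicate chunks add only a
# reference, not more physical capacity.
chunk_store: dict[str, bytes] = {}  # content hash -> chunk data

def write_file(data: bytes, chunk_size: int = 4096) -> list[str]:
    """Split data into chunks; return the list of chunk hashes (the 'recipe')."""
    recipe = []
    for i in range(0, len(data), chunk_size):
        chunk = data[i:i + chunk_size]
        digest = hashlib.sha256(chunk).hexdigest()
        chunk_store.setdefault(digest, chunk)  # store only if unseen
        recipe.append(digest)
    return recipe

def read_file(recipe: list[str]) -> bytes:
    """Reassemble a file from its chunk recipe."""
    return b"".join(chunk_store[h] for h in recipe)

# Two copies of the same 1 MB payload consume the physical space of one.
payload = b"x" * 1_000_000
r1 = write_file(payload)
r2 = write_file(payload)
assert read_file(r1) == payload
print(f"logical: {2 * len(payload)} bytes, physical: "
      f"{sum(len(c) for c in chunk_store.values())} bytes")
```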

Keep your data protected

Failing to modernize storage or to use the right protection technologies could put a business in serious financial danger. “Storage is the one aspect of the corporate data centre that is persistent,” says Dyer. “All the other pieces move from point A to point B. So without the right storage, human error or a major event could have a drastic impact on the business’s survival.”

Research indicates that, in most enterprises, one minute of downtime could cost as much as $5,600. Despite this risk, most companies are using traditional dual-controller storage with limited resiliency and haven’t done a major refresh of their backup infrastructure in over five years. “They are likely to face major issues when they do a recovery and get back up and running,” says Dyer.
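Some back-of-the-envelope arithmetic puts that figure in perspective. Only the $5,600-per-minute cost comes from the research cited above; the recovery times below are illustrative assumptions, not measured values.

```python
COST_PER_MINUTE = 5_600  # downtime cost per minute cited above, in dollars

# Hypothetical recovery scenarios: a fast failover versus a lengthy
# restore from an aging backup infrastructure.
for scenario, minutes in [("controller failover", 5),
                          ("restore from backup", 4 * 60)]:
    print(f"{scenario}: {minutes} min -> ${COST_PER_MINUTE * minutes:,}")
```

At that rate, even a single four-hour recovery window would cost well over a million dollars.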

Technologies like multi-controller scale-out designs, replication and federation, as well as snapshots and disk-based backup, are all modern innovations that customers should be integrating into their data centre plans.
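Snapshots, for instance, are commonly built on copy-on-write: taking one is effectively instant because the snapshot shares blocks with the live volume, and a block is copied only when it is first overwritten. The following is a simplified sketch of that mechanism under those assumptions, not a description of any particular array.

```python
# Copy-on-write snapshot sketch: a snapshot shares the live volume's
# blocks until one is overwritten, so snapshots are instant and consume
# space only as data changes.
class Volume:
    def __init__(self, num_blocks: int):
        self.blocks = {i: b"\x00" for i in range(num_blocks)}  # index -> data
        self.snapshots: list[dict] = []  # one overlay of preserved blocks each

    def snapshot(self) -> int:
        """Instant snapshot: record an empty overlay; nothing is copied yet."""
        self.snapshots.append({})
        return len(self.snapshots) - 1

    def write(self, index: int, data: bytes) -> None:
        # Preserve the old block in each snapshot that hasn't captured it yet.
        for overlay in self.snapshots:
            overlay.setdefault(index, self.blocks[index])
        self.blocks[index] = data

    def read_snapshot(self, snap_id: int, index: int) -> bytes:
        """Return the preserved copy if one exists, else the shared live block."""
        return self.snapshots[snap_id].get(index, self.blocks[index])

vol = Volume(4)
vol.write(0, b"v1")
snap = vol.snapshot()            # point-in-time view, created instantly
vol.write(0, b"v2")              # triggers copy-on-write of block 0
assert vol.read_snapshot(snap, 0) == b"v1"
assert vol.blocks[0] == b"v2"
```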

Turbo-charge application performance

Many companies that use virtual machines fail to realize that storage performance is the most important factor in maximizing ROI from server virtualization.

“One of the biggest innovations in data storage is the utilization of flash technology,” says Dyer. “People now expect the speed of access that they get from their smartphone or tablet in their business applications, and flash technology allows that to happen.” At the same time, the right flash storage can allow companies to reduce power consumption and store data more efficiently, which keeps costs down and frees up resources for critical projects.

All-flash storage has reached something of an inflection point: everybody wants more speed, but nobody wants to compromise on reliability or scale, and organizations are not willing to pay a premium for it. Second-generation all-flash arrays from mainstream storage vendors are stepping up to meet that demand.

“Businesses now have the option to purchase high-performance storage with robust tier-1 features and guaranteed resiliency, and it now costs the same as the spinning hard disk you were purchasing,” says Dyer.