HDS enabling "continuous cloud"

Time was, storage value was measured in lowest cost per stored bit, and pricing trend lines spiralled relentlessly downward as technology innovation led to storage commoditization. In today's enterprise environments, the information explosion means this equation is more likely to be calculated in cost per terabyte or even petabyte of data. But an even more momentous shift is underway, as the value of storage systems is increasingly assessed by their ability to align with business requirements. Hitachi Data Systems (HDS) defines those requirements as ‘availability’, ‘agility’ and ‘automation’: always-on IT infrastructure that enables companies to operate in a technology-dependent and hyper-competitive business environment. To help customers achieve ‘business-defined IT’, the company has introduced a number of storage product and platform enhancements, launched on a recent seven-stop “Follow the Rising Sun” global city tour that included a Toronto stop. From the Skyservice Business Aviation hangar in Mississauga, HDS outlined its “Continuous Cloud Infrastructure” and the software-defined architectures that enable it, and introduced integration partnerships with SAP and VMware aimed at helping the company position itself better in solution sales.

Marcel Escorcio, RVP and GM, HDS Canada

As Marcel Escorcio, RVP and GM of Hitachi Data Systems Canada, explained, this round of HDS innovation covers a range of hardware, software and partner initiatives designed to help customers manage challenges introduced by cloud and Big Data. Hardware announcements include the Hitachi Virtual Storage Platform (VSP) G1000, the latest version of the company's flagship enterprise storage platform, featuring new levels of scale and flexibility. According to HDS, the platform can start small and scale to block-storage throughput of well over 3 million IOPS, more than 48 GB/sec of usable bandwidth, and NFS performance of over 1.2 million ops/sec in unified configurations; and because it is offered at a range of performance and price points, customers can ‘right size’ their deployments. The platform acts as a virtualization controller, serves as a unified storage system for up to 8-node Hitachi NAS Platform clusters, and offers high availability for Hitachi Content Platform environments.

On the software front, the company launched a new operating system, the Hitachi Storage Virtualization Operating System (SVOS), which represents the final stage in the HDS abstraction of data and information assets that began four and a half years ago. With storage virtualization complete, heterogeneous hardware environments, including legacy systems, can be treated as a single management pool in which content is “fluid”; customers can reduce the complexity (and the skills burden) of managing different vendors’ hardware and operating systems, and dispense with the need for separate systems to address the varying service-level requirements of different applications. In this latest SVOS, HDS has added flash optimization, advanced storage virtualization, automated tiering, non-disruptive data migration and a new native global active device feature that provides multi-system and multi-datacenter active-active capabilities without an appliance (an “industry first,” according to HDS), as well as the automation that is key to the HDS vision of Continuous Cloud.

For customers, the new features mean that Hitachi technology has a longer shelf life, since it can adapt to changing business needs without technology replacement, and a greater ability to future-proof an implementation by reducing the complexity of provisioning, data and system migration, and ongoing management. Escorcio noted the cost savings this enables: “It’s a financial and risk proposition,” he stated. “From a financial perspective, the longer I can keep a piece of hardware, the longer the span of time before I have to do a refresh, the better. There’s also the risk of change: if I can do non-disruptive migration of storage or any infrastructure, I don’t have to take down my applications or data for that period of time. If I can do this over a ten-year, rather than three-year, period, there is a saving for the enterprise.”
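To put rough numbers on that argument, the minimal sketch below compares total spend over a ten-year horizon under a three-year refresh cycle versus a single long-lived platform. The dollar figures and the migration-cost assumption are purely illustrative, not HDS pricing; only the shape of the comparison comes from Escorcio's point above.

```python
# Illustrative refresh-cycle comparison; all dollar figures are assumptions.
HORIZON_YEARS = 10
ARRAY_COST = 500_000       # assumed purchase price per refresh
MIGRATION_COST = 150_000   # assumed cost of each disruptive migration (downtime, services)

def total_cost(refresh_every_years: int) -> int:
    """Purchases plus migrations needed to cover the horizon."""
    refreshes = -(-HORIZON_YEARS // refresh_every_years)  # ceiling division
    migrations = refreshes - 1                            # no migration for the first install
    return refreshes * ARRAY_COST + migrations * MIGRATION_COST

print("3-year cycle: ", total_cost(3))   # 4 purchases + 3 migrations = 2,450,000
print("10-year cycle:", total_cost(10))  # 1 purchase  + 0 migrations =   500,000
```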

The company also introduced enhancements to its Hitachi Unified Compute Platform converged computing offerings to ease integration with partner and HDS products, as follows:

Hitachi Unified Compute Platform (UCP) and Unified Compute Platform Director 3.5, with support for VSP G1000 and SVOS, and new entry-level configurations of UCP for VMware vSphere, as well as increased capabilities in UCP Director, such as server profiling for simplified provisioning and enhanced disaster recovery integration; and

Hitachi Command Suite, the latest version of the company’s integrated management platform, which supports the new global storage virtualization features in SVOS and offers a common REST API across the platform, as well as an updated, streamlined user interface.
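HDS did not publish endpoint details as part of the announcement, so the following is only a hedged sketch of the kind of workflow a common REST management API typically enables: the host name, paths and JSON fields are hypothetical, and only the general pattern (one HTTP interface for discovery and provisioning across the suite) is drawn from the item above.

```python
# Hypothetical illustration of a unified REST management API; the endpoint
# paths and JSON fields below are assumptions, not documented Hitachi URLs.
import requests

BASE = "https://command-suite.example.local/api/v1"  # hypothetical host and path
AUTH = ("admin", "password")                         # placeholder credentials

# Discover registered storage systems through the same interface used for provisioning.
systems = requests.get(f"{BASE}/storage-systems", auth=AUTH, verify=False).json()
for s in systems:
    print(s["name"], s["model"], s["freeCapacityGB"])

# Provision a volume from a pool with a single POST, regardless of the array behind it.
new_volume = {"poolId": systems[0]["defaultPoolId"], "sizeGB": 512, "label": "app01-data"}
resp = requests.post(f"{BASE}/volumes", json=new_volume, auth=AUTH, verify=False)
resp.raise_for_status()
print("created volume", resp.json()["volumeId"])
```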

Miki Sandorfi, VP solutions and cloud, HDS

For HDS, “deep ecosystem integration” is a key component of its solution strategy. Through development with strategic partners such as Microsoft, SAP and VMware, HDS has worked to ensure that SVOS and the VSP G1000 are certified in key initiatives like Microsoft Private Cloud deployments and SAP HANA Tailored Data Center Integration, and that integration within VMware ecosystems is extended. The goal with VMware integration is to simplify storage provisioning and management for clients by controlling Hitachi storage systems through the VMware vCenter console and VMware tool sets, and, for HDS, to leverage the platform ubiquity of this strategic partnership. According to Miki Sandorfi, HDS VP solutions and cloud, a first integration with Microsoft three and a half years ago had a “Hitachi look and feel about the work flow” with Microsoft management “under the covers,” an approach that proved less successful since customers prefer tools that are familiar to them. In the v2 integration with VMware, HDS took the opposite approach with Hitachi snap-ins, and it has now followed suit with Microsoft, enabling a familiar user experience through System Center for provisioning on the front end, while more complex performance management and scale tasks are addressed through the HDS UCP.

Similarly, through a focus on the business value proposition, HDS storage innovation, combined with “SAP thought processes” on data management in the HANA product, is designed to help customers transition from an “infrastructure cloud” to a “content cloud” (defined by Sandorfi as encompassing everything that is “non-database,” such as unstructured data) on the way to an “information cloud,” with SAP providing the missing application link and HDS providing Hitachi servers and tiered storage scale to complement SAP in-memory resources. “VMware, SAP, Microsoft and Oracle are the key environments in the businesses that we engage with. These are very complex, need to be simplified and we need to have better management for these ecosystems,” Sandorfi explained, as there are literally hundreds of steps that have to be taken to connect HDS technology into these environments. Through deep integration in the UCP platform, HDS has automated all of the steps that are needed before the environment can be managed and resources provisioned by a partner such as VMware. “Where we’re going with the UCP Director is to allow customers to focus on what they’re doing with the infrastructure,” Sandorfi added, “not how they have to set up the infrastructure. We’re also building more intelligence into the software, using Big Data analytics to understand the natural and expected behaviours of the infrastructure and anomalies” to enable predictive monitoring of infrastructure.
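Sandorfi did not describe the analytics in any detail, so the fragment below is only a generic illustration of the idea of learning the “natural and expected behaviours” of infrastructure and flagging anomalies, using a simple rolling-baseline threshold rather than anything HDS has disclosed.

```python
# Generic baseline-and-threshold anomaly check on an infrastructure metric
# (e.g. array latency samples); purely illustrative, not HDS's analytics.
from statistics import mean, stdev

def find_anomalies(samples, window=20, sigmas=3.0):
    """Flag points that sit more than `sigmas` standard deviations
    above the rolling baseline built from the preceding `window` samples."""
    anomalies = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sd = mean(baseline), stdev(baseline)
        if sd > 0 and samples[i] > mu + sigmas * sd:
            anomalies.append((i, samples[i]))
    return anomalies

latency_ms = [2.1, 2.0, 2.2, 2.1, 2.3] * 5 + [9.8]  # synthetic data with one spike
print(find_anomalies(latency_ms))                    # -> [(25, 9.8)]
```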

Rather than seeing it as a branding challenge, HDS views staying ‘under the covers’ as the means to developing its solutions orientation. Sandorfi asked: “Why would you use the VSP G1000 in a scale-out HANA implementation? It’s because it gives you multi-centre data tolerance, data sharing and, on top of that, we’re helping you use those HANA tools to solve bigger problems. So it’s really no longer about the Hitachi brand wrappers around just the hardware infrastructure. It’s really the services and the solution that are on top.”

Michael Cremen, EVP global accounts, HDS

According to Michael Cremen, HDS EVP global accounts, the announcements spell “huge opportunity” for HDS, since in combination they provide a “heterogeneous hardware solution that hits small, medium and large enterprises, all managed with one software suite.” In a general sense, the company’s strategy relies on the continued importance of private, as opposed to public, cloud deployment in most organizations. As Sandorfi explained, “we’re not downplaying the validity of [public] cloud or the fact that cloud is changing how businesses are dealing with IT. Clearly there was a boiling point, where it [traditional IT] did take too long and was too expensive, and IT was a siloed, walled off organization with fantastic budgets. Businesses were looking at increasing budgets, but not seeing increased value…” As opposed to the public option, however, which can be much more complex than people recognize, HDS is proposing a private solution that can deliver business value: one that provides transparency into what the business needs, the ability to charge or pay for services, and the agility to stand those services up quickly.

“It’s not a binary decision [public or private cloud],” Cremen noted; there is a “pendulum that swings between private, public and hybrid strategy,” and the company will also play in the public sphere through relationships with the partners listed above and with other telco and service provider partners who are building their own branded offerings. Public clouds powered by Hitachi, such as those built by Verizon or CGI, which may ultimately involve joint marketing, and programs like Hitachi’s Cloud Provider Program, which can leverage a growing channel and SI ecosystem for quick delivery, will be the route for HDS technology into public cloud markets.

 
