Senior Storage Solutions Architect
In the past, we have viewed service level agreements (SLAs) as the guiding light for designing backup and disaster recovery (DR) systems that meet or exceed recovery objectives in the event of an operational or data center site failure.
These SLAs are the insurance policy organizations hope they never have to use. Outside of periodic testing, or the rare instance of a declared data center disaster, they are exercised infrequently.
The challenge of keeping pace with marketplace change has pushed organizations to adopt microservices architectures. Before microservices, most organizations were unable to deliver agile IT services, which often drove them to Cloud-based IT so business units could get their applications tested and deployed in a timely manner. But the Public Cloud is not for every organization. Delivering the same agility as the Public Cloud on premises, also known as a Private Cloud, has only recently gained momentum.
A key ingredient of a Private Cloud is a microservices architecture, which enables frequent testing of applications. Gartner has characterized this IT development as “Bimodal IT.” Just as important, testing should be done with production data to ensure accurate results. This requires coordination between application and IT operations teams, the practice known as DevOps. In this context, DevOps can be thought of as consuming Infrastructure as a Service (IaaS), where self-service is a key construct.
The result has been a new kind of SLA within organizations, one based not on recovering data after a failure but on continuously presenting production data to development environments for testing on a weekly, daily, or hourly schedule. If presenting the data takes too long, it delays the application development process, which erodes an organization’s agility, lengthens time to market, and leaves software developers idle. Furthermore, the data must be accessible through self-service, making production data available on demand to application development teams without routing every request through IT operations.
This new IT service requirement is called Copy Data Management (CDM). Several storage companies offer CDM technology, including IBM, whose offering is called “IBM Spectrum Copy Data Management.” Most advanced storage subsystems have copy functions, such as snapshots and clones, but lack robust management features. This is especially true when it comes to having a catalog that tracks storage-based copies and the management policies associated with them.
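To make the catalog idea concrete, here is a minimal, purely illustrative sketch in Python of the kind of metadata a copy catalog needs to track. The class and field names are hypothetical and do not reflect IBM Spectrum CDM’s actual schema.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class CopyRecord:
    """One cataloged storage-based copy of a production dataset (hypothetical)."""
    copy_id: str                   # unique identifier for the copy
    source_volume: str             # production volume the copy was taken from
    storage_system: str            # subsystem that holds the copy
    created_at: datetime           # when the copy was taken
    expires_at: datetime           # lifecycle policy: when to reclaim the space
    masked: bool                   # whether sensitive records were masked
    owner: str                     # development team that requested the copy
    purpose: Optional[str] = None  # e.g., "nightly regression test"
```

Without a record like this for every copy, storage administrators have no systematic way to know which copies exist, who owns them, or when they can be reclaimed.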
IBM Spectrum CDM is unique in that it orchestrates copies of production data through an interface with role-based access, extending self-service capabilities to application teams. By automating the delivery of data copies, it gives development teams on-demand access to production data, with the added capability of data masking. Masking is important because production data often contains sensitive records that must not be seen by unauthorized personnel. Another important attribute of IBM CDM is its ability to apply life-cycle policies to self-provisioned storage, which helps storage administrators reclaim space after a development cycle has concluded.
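As an illustration of what role-based, self-service provisioning can look like through an API, the sketch below requests a masked copy of a production volume with an expiry date. The endpoint URL, payload fields, and token handling are assumptions for the sake of the example, not IBM Spectrum CDM’s documented REST interface.

```python
import requests

# Hypothetical CDM endpoint and per-user API token (enforcing role-based access).
CDM_URL = "https://cdm.example.com/api/copies"
API_TOKEN = "REPLACE_WITH_TOKEN"

def request_dev_copy(source_volume: str, team: str, ttl_days: int = 7) -> str:
    """Ask the CDM service for a masked, time-limited copy of a production volume."""
    payload = {
        "source_volume": source_volume,
        "owner": team,
        "mask_sensitive_data": True,   # masking applied before the copy is exposed
        "expires_in_days": ttl_days,   # lifecycle policy so storage is reclaimed
    }
    resp = requests.post(
        CDM_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["copy_id"]
```

The point of the sketch is the workflow: a developer requests a copy under their own credentials, masking and expiry are applied as policy, and IT operations never has to handle the ticket.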
In summary, the value of IBM’s CDM does not stop at DevOps. Application teams often go directly to the Public Cloud because they perceive it as a faster way to access compute and storage, which has fueled the spread of shadow IT. In some cases, IT is simply overwhelmed with administering monolithic systems and cannot pivot toward adopting a microservices architecture. Automation gives IT a force multiplier to offset personnel constraints. IBM CDM integrates with other automation tools through REST APIs, so Continuous Integration and Continuous Delivery (CI/CD) can be leveraged to improve the efficiency of the testing pipeline. In turn, this can help keep critical IT functions within the walls of the data center. That said, if the Public Cloud is also part of the IT landscape, IBM CDM can work in a hybrid configuration, using its automation and catalog management to orchestrate Public Cloud-based storage copies.
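For example, a CI/CD pipeline step might refresh a test copy from production and wait for it to become available before the test stage runs. Again, the endpoints and job states below are assumed for illustration; a real pipeline would use the vendor’s documented REST API.

```python
import time
import requests

# Hypothetical base URL for the CDM REST API.
CDM_URL = "https://cdm.example.com/api"

def refresh_and_wait(copy_id: str, token: str, poll_secs: int = 15) -> None:
    """CI step: refresh a cataloged copy from production, then block until ready."""
    headers = {"Authorization": f"Bearer {token}"}
    resp = requests.post(
        f"{CDM_URL}/copies/{copy_id}/refresh", headers=headers, timeout=30
    )
    resp.raise_for_status()
    job_id = resp.json()["job_id"]
    while True:
        state = requests.get(
            f"{CDM_URL}/jobs/{job_id}", headers=headers, timeout=30
        ).json()["state"]
        if state == "completed":
            return                 # refreshed copy is ready; tests can start
        if state == "failed":
            raise RuntimeError(f"Copy refresh job {job_id} failed")
        time.sleep(poll_secs)      # poll until the refresh job finishes
```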
Please contact your Mainline Account Executive directly, or click here to contact us with any questions.