Data Company Central


A 7-Step Checklist to Bring DataOps to Life

By Michael Torok posted 08-19-2019 11:44:26 AM

  
Don’t get left behind. Here are 7 ways to enable the flow of secure, high-quality data across your enterprise through a platform-based approach to DataOps.
This article was originally published on the Delphix website on November 16, 2018.

As companies make significant headway with DevOps to meet the needs of modern app development and deployment, DataOps is an emerging practice that can further enhance individual and team outcomes by bringing people, technology and process together to create an iterative agile flow.

Recognized by Gartner as a new entrant in its 2018 hype cycle report, DataOps is “a collaborative data management practice focused on improving the communication, integration and automation of data flows between data managers and consumers across an organization.”

With the exponential growth of data, companies are demanding data be made available in more places. Thus, the notion of DataOps is expanding as more organizations take on initiatives that depend on the free flow of data. 

Here are seven ways to enable the flow of secure, high-quality data across your enterprise through a platform-based approach to DataOps, according to the latest TDWI report.

1. Accelerate data delivery 

Legacy approaches to test data management can be a huge blocker for modern QA practices that promise higher quality and faster releases. Too often, developer requests are pushed to the end of the queue by slow request-and-fulfill processes, and long refresh times leave QA working with stale, less relevant data on tight turnarounds.

Adopting a platform-based approach is critical: it lets enterprises automate the rapid provisioning of different test data based on developer needs while observing modern data security practices, such as masking non-production data, to accelerate data delivery and reduce delays in the DevOps lifecycle.
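
To make that concrete, here is a minimal Python sketch of what on-demand provisioning can look like from the developer’s side. Every name in it (the TestDataCopy structure, provision_test_copy, the connection string) is hypothetical and simply stands in for a platform API call that would clone and mask a source.

    # A toy stand-in for a platform call that returns a masked test copy
    # instead of a ticket in a queue. All names here are illustrative.
    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass
    class TestDataCopy:
        source: str              # production source the copy is derived from
        masked: bool             # non-production data should always be masked
        connection_string: str   # where the developer points the test suite
        created_at: datetime

    def provision_test_copy(source: str) -> TestDataCopy:
        """Hypothetical API: a real platform would clone and mask here."""
        return TestDataCopy(
            source=source,
            masked=True,
            connection_string=f"postgresql://qa@test-host/{source}_masked",
            created_at=datetime.now(timezone.utc),
        )

    if __name__ == "__main__":
        copy = provision_test_copy("orders_db")
        print(f"Run the QA suite against {copy.connection_string}")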

2. Reduce data friction 

As companies demand more data to be made available in more places, data friction emerges because the demands of data consumers aren’t met by data operators.

DataOps brings those two key audiences together as one team. It’s a key enabler of data flow, giving data consumers access to and control over the right data in the right place while providing data managers with the efficiency, oversight and confidence to support the business at scale.

Implementing a data platform that reduces or eliminates the friction and automates the provisioning of high-quality data sets can help overcome inherent bottlenecks by compressing source data, creating virtual replicated copies and rapidly transmitting them to development, QA and data analysis teams. 
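
The “virtual replicated copies” idea is essentially copy-on-write: every consumer sees a full data set, but only the rows or blocks they change are stored separately, so a new copy is nearly instant and nearly free. Here is a self-contained sketch of the principle, with rows standing in for storage blocks; it is not any vendor’s implementation.

    # Copy-on-write in miniature: each virtual copy reads through to a shared
    # base snapshot and stores only its own changes, so provisioning a new
    # copy costs almost nothing in time or storage.
    class VirtualCopy:
        def __init__(self, base: dict):
            self._base = base    # shared, read-only source snapshot
            self._delta = {}     # rows changed by this copy only

        def read(self, key):
            return self._delta.get(key, self._base.get(key))

        def write(self, key, value):
            self._delta[key] = value   # the shared base is never modified

    production = {"cust_1": "Alice", "cust_2": "Bob"}  # stand-in for a source snapshot
    qa_copy = VirtualCopy(production)
    dev_copy = VirtualCopy(production)

    dev_copy.write("cust_1", "test-value")
    print(qa_copy.read("cust_1"))    # Alice -- unaffected by the dev copy's change
    print(dev_copy.read("cust_1"))   # test-value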

3. Eliminate manual work 

As data sprawls across a variety of platforms, keeping downstream environments in sync with production data becomes more challenging. Automating provisioning allows teams to identify where the bottlenecks are and reduce manual tasks.

A self-service data platform can eliminate ticket-driven IT requests; automate data copying, encryption/masking and compression as a prelude to data replication; reduce dependence on DBAs by automating database configuration and mounting; and, lastly, version and bookmark provisioned test data to reduce duplicate requests.
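
The last point, versioning and bookmarking, is easy to picture with a small sketch. The class and method names below are hypothetical; the idea is that a tester captures a provisioned data set’s state under a friendly name, and a teammate re-provisions from that bookmark instead of filing another request.

    import uuid

    class TestDataSet:
        """Toy model of a provisioned data set with named, shareable bookmarks."""
        def __init__(self, source: str):
            self.source = source
            self._snapshots = {}     # snapshot id -> state label
            self._bookmarks = {}     # friendly name -> snapshot id

        def snapshot(self, label: str) -> str:
            snap_id = uuid.uuid4().hex[:8]
            self._snapshots[snap_id] = label
            return snap_id

        def bookmark(self, name: str, snap_id: str) -> None:
            self._bookmarks[name] = snap_id

        def provision_from(self, name: str) -> str:
            snap_id = self._bookmarks[name]
            return f"new copy of {self.source} at snapshot {snap_id} ({self._snapshots[snap_id]})"

    ds = TestDataSet("billing_db")
    snap = ds.snapshot("after nightly load, defect #123 reproducible")
    ds.bookmark("defect-123", snap)
    # A teammate reproduces exactly the same state without a new ticket:
    print(ds.provision_from("defect-123"))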

4. Simplify data collaboration with data protection

Designing and establishing key security practices is integral in today’s data-sharing economy, for both internal and external purposes. By adopting a data platform that takes a comprehensive approach to data security, like Delphix, enterprise teams can identify sensitive data; continuously mask it in a simple, repeatable manner by replacing confidential information with fictitious yet realistic values; apply governance measures to control data access; and provision secure data copies to any target environment while staying in compliance with privacy regulations.

Incorporating practices such as data masking into data-sharing procedures simplifies collaboration across teams without compromising security.
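
As a small illustration of “fictitious yet realistic” values, the sketch below uses the open-source Faker library (an assumption about tooling on our part, not something the article prescribes) and seeds it from the original value so the same input always masks to the same output, keeping the masking repeatable across refreshes.

    from faker import Faker

    fake = Faker()

    def mask_value(original: str, kind: str) -> str:
        """Replace a confidential value with a realistic fake, deterministically."""
        # Seeding from the original value makes masking repeatable: the same
        # customer always maps to the same fictitious value across refreshes.
        fake.seed_instance(original)
        generators = {"name": fake.name, "email": fake.email, "phone": fake.phone_number}
        return generators[kind]()

    row = {"name": "Jane Smith", "email": "jane.smith@example.com", "phone": "555-0100"}
    masked = {col: mask_value(val, col) for col, val in row.items()}
    print(masked)   # realistic but fictitious values, safe for non-production use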

5. Provide self-service

At a large enterprise, the scale of managing data can be overwhelming and can cause tremendous data friction in a company’s digital operations. Data friction emerges when the demands of DBAs, developers, data analysts and business decision makers are not met by data management and IT professionals.

By enabling and adopting a self-service data platform, data consumers no longer need to wait for database administrators to complete their requests. Your platform solution should be able to identify data sources, configure a self-contained environment (a data container) designed for a specific data consumer, and manage those containers as well as the flow of data from the sources to the containers to ensure synchronization and consistency with the source.
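
A hedged sketch of those moving parts, with hypothetical names throughout: a data container is declared by (or for) a specific consumer, tied back to its sources, and refreshed to stay consistent with them, with no ticket in the loop.

    # Illustrative only: a "data container" as a self-contained environment a
    # consumer requests directly, plus the sync point that keeps it consistent.
    from dataclasses import dataclass
    from datetime import datetime, timezone
    from typing import List, Optional

    @dataclass
    class DataContainer:
        owner: str                          # the data consumer, not an IT queue
        sources: List[str]                  # e.g. ["crm_db", "orders_db"]
        last_synced: Optional[datetime] = None

        def refresh(self) -> None:
            # A real platform would pull changed data from each source here;
            # this stub only records the point in time the container matches.
            self.last_synced = datetime.now(timezone.utc)

    analyst_env = DataContainer(owner="analytics-team", sources=["crm_db", "orders_db"])
    analyst_env.refresh()
    print(f"{analyst_env.owner} container in sync as of {analyst_env.last_synced}")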

6. Manage heterogeneity through a single point of control

The modern enterprise depends on a heterogeneous set of sources rather than a single type of data source, and provisioning heterogeneous data for all the use cases, including development, testing and reporting, often requires a complex process to allow data sets to flow to where they’re needed.

To address this challenge, enterprises should consider implementing a standardized approach to managing, securing and moving data across those sources by adopting a platform solution that works across all the data sources that various teams depend on.
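
One way to picture that single point of control is a single declarative policy applied uniformly to very different sources. In the sketch below (source names and policy fields are invented for illustration), one loop drives masking, compression and distribution for every engine, instead of a bespoke script per platform.

    # One policy, many source types: teams stop maintaining a separate
    # provisioning/masking script per database engine.
    SOURCES = [
        {"name": "erp",        "engine": "oracle"},
        {"name": "billing",    "engine": "sqlserver"},
        {"name": "web_orders", "engine": "postgresql"},
    ]

    POLICY = {"mask_pii": True, "compress": True, "targets": ["dev", "qa", "analytics"]}

    def apply_policy(source: dict, policy: dict) -> str:
        """Stand-in for the platform step that masks, compresses and distributes."""
        steps = [k for k, enabled in policy.items() if enabled and k != "targets"]
        return (f"{source['name']} ({source['engine']}): "
                f"{', '.join(steps)} -> {', '.join(policy['targets'])}")

    for src in SOURCES:
        print(apply_policy(src, POLICY))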

7. Simplify migration 

Organizations today recognize that cloud is a key enabler of digital transformation, but migration to the cloud requires a clear strategy. Whether it has to do with accelerating software delivery, providing self-service models to developers, automating workflows or enhancing IT productivity, data must flow - securely and rapidly - across teams within an organization. 

That’s why it’s important for enterprise teams to leverage a data platform that provides access to data for those who need it and replicates data in a way that maintains consistency and synchrony with the sources. Specifically, synchronizing virtual data sets ensures there is fresh test data to create an environment, and automating that synchronization eliminates the need to transmit physical media. Finally, migrating data to the cloud carries security risks, so establish a uniform approach to masking your data and secure it ahead of the migration.
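
A minimal sketch of that ordering, with stub functions standing in for real platform operations: sensitive data is masked before it leaves the data center, and only the de-identified copy is replicated to the cloud target.

    def mask_on_premises(dataset: str) -> str:
        """Mask sensitive fields before anything leaves the data center."""
        return f"{dataset} (masked)"

    def replicate_to_cloud(dataset: str, target: str) -> str:
        """Ship virtual copies to the cloud target; no physical media involved."""
        return f"{dataset} replicated to {target}"

    def migrate(dataset: str, target: str) -> str:
        # Order matters: masking happens before replication so only
        # de-identified data ever reaches the cloud environment.
        safe_copy = mask_on_premises(dataset)
        return replicate_to_cloud(safe_copy, target)

    print(migrate("customer_db", "cloud-region-east"))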

Download “Seven Ways to Liberate Enterprise Data with a Platform for DataOps” for a deep dive into these seven principles.

 
“SEVEN WAYS TO LIBERATE ENTERPRISE DATA WITH A PLATFORM FOR DATAOPS” COPYRIGHT © 2018 BY TDWI, A DIVISION OF 1105 MEDIA, INC. EXCERPT REPRINTED BY PERMISSION OF TDWI. VISIT TDWI.ORG FOR MORE INFORMATION.
