
Big data and DevOps: No longer separate silos, and that’s a good thing

The pandemic has caused major shifts in the way IT and big data work. Now they may be working together for better outcomes.


The world has changed a lot since March 2020, and the coronavirus pandemic has affected nearly every aspect of our lives. We've already seen massive changes in technology, and another is happening right now in big data and its role in DevOps.

“The COVID-19 pandemic has accelerated the blending of data analytics and DevOps, meaning developers, data scientists, and product managers will need to work more closely together than ever before,” said Bill Detwiler, editor in chief of TechRepublic. 

SEE: TechRepublic Premium editorial calendar: IT policies, checklists, toolkits, and research for download  (TechRepublic Premium)

Detwiler was interviewing managers at Tibco, a leader in big data integration and analytics. They said the coronavirus pandemic had caused organizations to rethink how they were using big data and analytics, generating what appears to be a movement toward merging IT DevOps methodologies with big data analytics.

For IT organizations, this is more than just a story about how the pandemic has altered the way companies think about big data and analytics. The COVID-19 emergency has put a premium on getting analytics insights and results to market quickly. That has redefined analytics reporting as mission-critical, not just an ancillary tool companies use to operate and strategize.

SEE: Return to work: What the new normal will look like post-pandemic (free PDF) (TechRepublic)

The change is also driving revisions to IT operations and culture. Here are some we've seen.

A move from waterfall to DevOps development

Developing, testing, and deploying big data applications is an iterative process. Because the process is iterative (develop, test, and deploy until you get what you want), it doesn't follow the linear, assembly-line methodology of traditional waterfall development, which is a serial sequence of handoffs from development to QA (test) to implementation staff.
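
To make the contrast concrete, here is a minimal Python sketch of that iterative cycle. Every name in it (build_candidate, evaluate_candidate, deploy) is a hypothetical placeholder, not any particular framework's API:

    def iterate_until_accepted(build_candidate, evaluate_candidate, deploy,
                               max_rounds=10):
        """Develop, test, and redevelop until a result is accepted, then deploy."""
        for round_number in range(1, max_rounds + 1):
            candidate = build_candidate(round_number)           # develop
            accepted, feedback = evaluate_candidate(candidate)  # test
            if accepted:
                deploy(candidate)                               # deploy only when accepted
                return candidate
            print(f"Round {round_number} rejected: {feedback}")  # feedback drives the next round
        raise RuntimeError("no acceptable candidate within max_rounds")

    # Example usage with trivial stand-ins:
    iterate_until_accepted(
        build_candidate=lambda n: {"version": n},
        evaluate_candidate=lambda c: (c["version"] >= 3, "needs another iteration"),
        deploy=lambda c: print(f"deployed {c}"),
    )

The point is the shape of the loop: evaluation feedback flows straight back into development, with no one-way handoff to a separate silo.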

SEE: Are you a big data laggard? Here’s how to catch up (TechRepublic)

A majority of IT departments are still organized around the waterfall development paradigm, with separate silos for development, testing, and deployment. Those functions have to come together, along with end users, in the more collaborative and iterative process of big data application development. For that to happen, the functional silos of expertise have to dissipate.

Culturally (and perhaps organizationally) this changes the orientation of IT. The culture shift is likely to entail the creation of interdisciplinary functional teams instead of work handoffs from functional silo to functional silo. End users also become active participants on these interdisciplinary teams.

Fewer absolutes for quality

The testing of big data applications becomes more relative and less absolute. This is a tough adjustment for IT because in traditional transaction systems, you either correctly move a data field from one place to another, or you obtain a value based on data and logic that absolutely conforms to what the test script dictates. If you don’t attain absolute conformance, you retest until you do. 

SEE: Big data: How wide should your lens be? It depends on your use (TechRepublic)

Not so with big data, where results might start off only 80% accurate, yet the business deems them close enough to indicate an actionable trend.

Working in a context where less-than-perfect precision is acceptable is a challenging adjustment for IT pros, who are used to seeing an entire system blow up if a single character in a program or script is miskeyed.
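
The difference shows up even at the level of a single test. Here is an illustrative Python sketch: the exact-match assertion reflects traditional transaction testing, while the threshold assertion reflects the "close enough to act on" standard described above. The 0.80 threshold and the sample data are hypothetical:

    # Traditional transaction logic: exact conformance or the test fails.
    def move_balance(source, target):
        target["balance"] = source["balance"]
        return target

    def test_move_balance_exact():
        target = move_balance({"balance": 250.00}, {})
        assert target["balance"] == 250.00  # any deviation at all is a failure

    # Big data analytics: a business-agreed threshold is the bar.
    ACCEPTABLE_ACCURACY = 0.80

    def test_trend_model_close_enough():
        predictions = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
        actuals     = [1, 1, 0, 0, 0, 1, 1, 1, 1, 1]
        accuracy = sum(p == a for p, a in zip(predictions, actuals)) / len(actuals)
        assert accuracy >= ACCEPTABLE_ACCURACY  # 80% accurate passes here

    test_move_balance_exact()
    test_trend_model_close_enough()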

The shift of big data into mission-critical systems

If you're a transportation company, the ability to track your loads on the road, and the health and safety of the cargo they carry, becomes mission-critical. If you're in the armed forces using battlefield drones to conduct and report reconnaissance in real-time flyovers, that data becomes mission-critical.

SEE: Big data success: Why desktop integration is key (TechRepublic)

This means that organizations must begin to attach the label of mission-critical to big data and analytics applications that formerly were classified as experimental. 

IT culture must shift to support mission-critical big data applications for failover, priority maintenance, and continuous development. This could shift IT personnel from traditional transaction support to big data support, requiring retraining to facilitate the change.
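
As one illustration of what that operational support might look like in practice, here is a minimal Python sketch of failover with retries for an analytics job. The cluster endpoints and the run_job callable are hypothetical placeholders:

    import time

    CLUSTERS = ["analytics-primary.example.com", "analytics-standby.example.com"]

    def run_with_failover(run_job, clusters=CLUSTERS, retries_per_cluster=2):
        """Attempt a mission-critical job on each cluster in priority order."""
        for cluster in clusters:
            for attempt in range(1, retries_per_cluster + 1):
                try:
                    return run_job(cluster)  # success: return the job's result
                except Exception as error:
                    print(f"{cluster} attempt {attempt} failed: {error}")
                    time.sleep(2 ** attempt)  # back off before retrying
        raise RuntimeError("all clusters failed; escalate to the on-call team")

Behavior like this was optional when analytics applications were experimental; it becomes table stakes once the same reports are labeled mission-critical.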

Source: TechRepublic