ANALYSIS

Putz: We need to be able to analyze and react to data where it is created

Make data 'the centre of IT infrastructure strategy'

DUBAI, June 4, 2018

By Christian Putz

We’ve been talking about data growth for the past two decades, but data creation is accelerating like never before. In fact, 90 per cent of the world’s data was created in the past two years alone. One of the biggest changes is that most of the data being created today is generated by machines, not by humans using applications.

Compounding this problem is that data has gravity: if it is created at the edge of your network or in your factory, you may not be able to move it to a central data centre or the cloud to process it in the real time your business needs.

So, data gravity matters, and increasingly we need to be able to analyze and react to data where it is created. Artificial intelligence (AI) and deep learning (DL) are taking what is possible with analytics to a new level, and the impact is being felt in every industry. In fact, Gartner predicts that by 2020 AI will be pervasive in almost every software-driven product and service.

The intelligence opportunity is particularly exciting. For one, it helps us analyze big data at a scale and speed beyond human capability. Secondly, it gives us the ability to build exciting new services for our customers. Finally, it allows us to automate infrastructure, which in turn allows us to be far more efficient and reliable in our operations.

The challenge is that today’s infrastructure wasn’t built from scratch to implement a cohesive data strategy. It was built over time, probably project by project, and is most likely fragmented and not very cloud-like (simple, scalable and agile). Given the importance and value of data, it’s time to rethink IT infrastructure from the bottom up and put data at the centre of the design. You have to invest in building a truly data-centric architecture, built on five key principles.

Consolidated & simplified

To drive efficiency and achieve the potential of data, it is essential to consolidate and move away from data islands. This is where all-flash makes the difference. Flash enables consolidating many applications into large storage pools, where what used to be tiers of storage can all be simplified into one. This drives efficiency, agility, and security. Management can also be converged, ensuring that storage plugs in neatly to your infrastructure orchestration strategy.
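To make the consolidation idea concrete, here is a minimal sketch, in plain Python, of how tier-based placement collapses into a single pool with per-workload policies. The pool and policy names are invented for illustration and do not reflect any vendor's product.

    # Hypothetical sketch: instead of mapping each workload to a storage
    # tier (tier-1 FC, tier-2 SAS, tier-3 archive), every workload lands
    # in one all-flash pool and differs only by software policy.
    TIERED = {               # the old model: an island per tier
        "oltp-db":   "tier1-fc-array",
        "analytics": "tier2-sas-array",
        "archive":   "tier3-nl-sas",
    }

    CONSOLIDATED_POOL = "all-flash-pool-01"   # illustrative name
    POLICIES = {             # differentiation moves into policy, not hardware
        "oltp-db":   {"iops_limit": None,   "snapshots": "hourly"},
        "analytics": {"iops_limit": 50_000, "snapshots": "daily"},
        "archive":   {"iops_limit": 5_000,  "snapshots": "weekly"},
    }

    def placement(workload):
        # Every workload gets the same pool; only the policy varies.
        return CONSOLIDATED_POOL, POLICIES[workload]

    for workload in TIERED:
        pool, policy = placement(workload)
        print(f"{workload}: {pool} with policy {policy}")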

Real-time

The second dimension is that you have to build for real time, as slow data just isn’t an option anymore. Real-time data makes your applications faster, your customer experiences more instant and immersive, and your employees more productive. It’s also worth pointing out that real-time not only means real-time data; it also means real-time copies: the ability to take copies of data and share them easily between multiple consumers, e.g. fresh copies of production data shared with test and development.
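As a rough illustration of real-time copies, the sketch below models snapshot-and-clone sharing in plain Python. The StoragePool class and its snapshot/clone methods are hypothetical stand-ins, not any vendor's API; real arrays expose equivalents through their own SDKs.

    from copy import deepcopy
    from datetime import datetime, timezone

    # Hypothetical sketch: a pool that hands out point-in-time copies of
    # production data to test/dev consumers without touching production.
    class StoragePool:
        def __init__(self):
            self.volumes = {}    # name -> data
            self.snapshots = {}  # name -> (source, timestamp, frozen data)

        def write(self, volume, data):
            self.volumes[volume] = data

        def snapshot(self, volume, snap_name):
            # Freeze a point-in-time view of the volume.
            self.snapshots[snap_name] = (
                volume, datetime.now(timezone.utc), deepcopy(self.volumes[volume])
            )

        def clone(self, snap_name, new_volume):
            # Hand a writable copy to a consumer, e.g. a dev team.
            _, _, frozen = self.snapshots[snap_name]
            self.volumes[new_volume] = deepcopy(frozen)

    pool = StoragePool()
    pool.write("prod-db", {"rows": [1, 2, 3]})
    pool.snapshot("prod-db", "prod-db@nightly")
    pool.clone("prod-db@nightly", "dev-db")    # fresh copy for developers
    pool.volumes["dev-db"]["rows"].append(99)  # dev changes stay isolated
    assert pool.volumes["prod-db"]["rows"] == [1, 2, 3]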

On demand & self-driving

This third pillar, on-demand and self-driving, represents a paradigm shift in how we think about operating storage for the business. What if your storage team could stop being a storage operations team, and instead think of its mission as running an in-house storage service provider for the business? What if it could deliver data-as-a-service, on demand, to each of the development teams, just as those developers could get from the public cloud? Instead of the endless cycle of reactive troubleshooting, what if the storage team could spend its time automating and orchestrating the infrastructure to make it self-driving and ultra-agile?

For this to become a reality, though, it will require some significant changes in how you operate. You have to get ahead of the curve, anticipate business needs and design a set of elastically scalable storage services that allow you to build ahead of consumption. On the front end, this is all about standard services and standard APIs; on the back end, it is about automation over administration, and the tools to make that happen are becoming very accessible and very sophisticated.
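To make "standard services and standard APIs" concrete, here is a minimal sketch of what a self-service request to an in-house storage service might look like. The endpoint URL, payload fields and service tiers are all assumptions made for illustration; a real implementation would follow whatever API your platform actually exposes.

    import requests

    # Hypothetical self-service endpoint for an in-house storage service.
    # The URL, payload schema and tier names are illustrative assumptions.
    STORAGE_API = "https://storage.internal.example.com/v1/volumes"

    def request_volume(team, size_gb, tier="standard"):
        """Ask the storage service for a volume, cloud-style: declare what
        you need and get an endpoint back, with no ticket and no human."""
        resp = requests.post(
            STORAGE_API,
            json={"owner": team, "size_gb": size_gb, "tier": tier},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()  # e.g. {"id": "...", "endpoint": "...", "status": "ready"}

    if __name__ == "__main__":
        # Would run against the (hypothetical) service above.
        vol = request_volume(team="web-dev", size_gb=500, tier="performance")
        print(f"Provisioned volume {vol['id']} at {vol['endpoint']}")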

Multi-cloud

I believe the future architecture will be multi-cloud, even if you run everything on-premises. Think about your environment today: you have a production cloud, you probably support multiple development environments, you likely run a cloud for analytics, and you operate a global backup network. Each of these environments increasingly wants to run in the cloud model, and they expect the same cloud attributes: simple, on-demand, elastic, and a driver for innovation.

At the same time, each of these has its own requirements, which means you have to design a next-gen data strategy that delivers the cloud data experience to each of these environments yet also delivers the unique capabilities they demand.

This is why your data-centric architecture should be designed with multi-cloud in mind, on the assumption that you will need to manage data across multiple clouds and achieve the data portability and openness to make this possible. If you don’t design for this, you face a real danger of lock-in, because data is the foundation of infrastructure lock-in.
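One way to read the lock-in warning at the code level: if applications talk to a thin, common interface rather than directly to one cloud's storage API, the data layer stays portable. A minimal sketch, with invented class names:

    from abc import ABC, abstractmethod

    # Hypothetical portability seam: applications depend on this
    # interface, not on any single cloud's storage API directly.
    class ObjectStore(ABC):
        @abstractmethod
        def put(self, key: str, data: bytes) -> None: ...
        @abstractmethod
        def get(self, key: str) -> bytes: ...

    class InMemoryStore(ObjectStore):
        """Stand-in backend; real ones would wrap S3, Azure Blob, or an
        on-premises array behind the same two methods."""
        def __init__(self):
            self._data = {}
        def put(self, key, data):
            self._data[key] = data
        def get(self, key):
            return self._data[key]

    def archive_report(store: ObjectStore, name: str, body: bytes):
        # Application code stays identical whichever cloud is underneath.
        store.put(f"reports/{name}", body)

    store = InMemoryStore()
    archive_report(store, "q2.pdf", b"...")
    print(store.get("reports/q2.pdf"))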

Ready for tomorrow

Finally, you have to embrace the reality of how fast data is moving. About eight years ago, 1PB of flash required six racks of space, and AI was a research project. Fast forward to today and we can store 1PB in less than 3U, and AI and automation are becoming mainstream. Data moves fast, and you have to design an architecture that has the performance for tomorrow but is built to evolve and innovate.
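The density claim is worth a quick back-of-the-envelope check. Assuming standard 42U racks (an assumption; the article does not specify), the improvement works out to roughly two orders of magnitude:

    # Back-of-the-envelope density check; 42U racks are an assumption.
    racks_then, units_per_rack = 6, 42
    units_then = racks_then * units_per_rack   # 252U for 1PB ~eight years ago
    units_now = 3                              # under 3U for 1PB today
    print(f"Density improvement: ~{units_then / units_now:.0f}x")  # ~84x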

So, what does this all mean? What if you could implement a data-centric architecture, one that was consolidated on flash, one that enabled real-time across your business, and one that delivered data-as-a-service to your various internal clouds? A data-centric architecture will allow you to simplify and accelerate core applications while reducing the cost of IT, and to empower developers with on-demand data, making builds faster and enabling the agility required for DevOps and the continuous integration/continuous delivery (CI/CD) pipeline.

It will also deliver next-gen analytics as well as act as a data hub for the modern data pipeline, including powering AI initiatives. In short, a data-centric architecture will give you the platform to accelerate your business and stay one step ahead of the competition.

About the author

Christian Putz is director, Emerging, EMEA at Pure Storage, a leading global provider of enterprise, all-flash data storage solutions.



