2019 Trends In Cloud Computing

November 27, 2018

by Jelani Harper

The cloud has long operated at the juncture of the most meaningful technologies to dominate the data management landscape, facilitating everything from big data deployments and predictive and prescriptive analytics to cognitive computing and edge computing applications. Gartner predicts the global public cloud services market will exceed $200 billion in 2019.

Although the cloud’s most recent alteration to data-centric practices isn’t quite as celebrated, it’s certainly as influential as any of the foregoing ones. Amid the more discernible advancements in cloud data stores, security measures, data fabrics, hybridization, containers and more is an almost imperceptible trend that will flourish in 2019, overshadowing these others.

The cloud isn’t just solidifying Service-Oriented Architecture, but rather revamping data-driven practices into an entire service-oriented economy. It’s changing the collective market from one in which vendors simply sell products to becoming “a service provider of people, of applications, of workloads, of data centers,” NetApp VP of Americas Partner Sales Jeff McCullough revealed. “It’s a very different world today than it was five years ago.”

That world offers a multiplicity of service options, pricing flexibility, and overall nimbleness of which contemporary organizations can avail themselves largely due to the cloud’s ubiquity. “The public clouds have created this service-based model, a model that really allows organizations to focus on the top part of the problem: the application and the data,” NetApp GM and SVP of Cloud Data Services Anthony Lye commented. “These services we can choose to run independently of one another, or you can combine them.”

By offering those services in some of the most vital facets of the data space, the cloud will herald a service-oriented economy in 2019.

Cloud Databases, Warehouses and Data Lakes

Cloud data stores are pivotal to the emerging service-oriented economy, partially because they rectify the longest-standing problems with data warehousing and Business Intelligence. According to Looker CEO Frank Bien, disproportionately slow relational methods and their ponderous schema calibrations resulted in a lack of business “access” to data, while conventional self-service platforms were characterized by a dearth of “reliability” due to disparate data management. Today there are several options “built not on the old way but on this new architecture, this new approach, this new cloud-oriented data store that scales so much better,” Bien reflected.

The scalability of these stores is not only suitable for data quantities characteristic of Artificial Intelligence and Internet of Things applications, but also for the newfound rapidity with which data and analytics become available to business users—which is perhaps the cloud’s cardinal benefit. “As soon as data is in the cloud, it can start to be accessible to other services that are in the cloud,” Bien said. “You think about Amazon Athena or the BigQuery Transfer Service. The fact that I’m running my applications in the cloud can almost instantly make that data available through their search interfaces.” The cloud’s other chief value proposition is delivering these benefits in distributed settings. “Having a service which is cloud and global to start with, where data minimization is one of the primary design principles, really makes it easy to…be in tune with the pace of innovation,” UJET CEO Anand Janefalkar said.
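
To make that accessibility concrete, here is a minimal sketch of querying data in place with Amazon Athena through boto3, assuming the data already sits in S3. The database, table, and results-bucket names are placeholders, not anything from the article; the boto3 calls themselves are the standard Athena API.

```python
import time
import boto3

athena = boto3.client("athena")

# Kick off a query against data already sitting in S3 (names are placeholders).
run = athena.start_query_execution(
    QueryString="SELECT * FROM events LIMIT 10",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://my-results-bucket/"},
)
query_id = run["QueryExecutionId"]

# Poll until the query finishes, then fetch rows through the same service API.
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)[
        "QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    for row in athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]:
        print(row)
```

Note what is absent: no cluster to stand up, no ETL pipeline to run first—the data becomes queryable roughly as soon as it lands, which is the point Bien is making.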

The Data Fabric Proposition

Accessibility is critical to cloud deployments and cloud services, a challenge that may be exacerbated by “the rapid expansion of the volume of data in the cloud and the many different applications of an organization,” propounded Looker Chief Privacy and Data Ethics Officer Barbara Lawler. Frequently, those applications and their data are part of hybrid clouds utilizing on-premise deployments. Data fabrics are gaining ground as an increasingly viable means of unifying data—and their access—wherever they are, from “the edge to the core to the cloud,” NetApp CEO George Kurian said. This uniform accessibility impacts the cloud’s service-oriented economy by ensuring users have a homogeneous means of getting to data, regardless of whether data are in the cloud or on-premise.

Such utility is important for international deployments. “It is extremely difficult to stand up in new countries or geographies; cloud is kind of the primary necessity [for doing so],” Janefalkar said. Data fabrics not only support hybrid on-premise/cloud use cases, but also multi-cloud ones. “To access a fabric, we would of course ensure that fabric was available…as a service—not something you would have to install, not something you would have to maintain, not something you would have to configure,” Lye mentioned. This approach to data fabrics consists of endpoints across clouds (public and private, multi-tenant and otherwise), on-premise settings, and the cloud’s edge. It circumscribes data sprawl by empowering users to position resources “all from a single pane of glass, [to] select from the world’s biggest public clouds,” Lye said.
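
As a rough illustration of the “single pane of glass” idea, the sketch below routes one uniform read call to whichever endpoint happens to hold the data. The class and method names here are hypothetical, invented for illustration—this is not NetApp’s API.

```python
from abc import ABC, abstractmethod
import boto3


class FabricEndpoint(ABC):
    """One location in the fabric: a cloud store, an on-premise filer, etc."""
    @abstractmethod
    def read(self, path: str) -> bytes: ...


class S3Endpoint(FabricEndpoint):
    def read(self, path: str) -> bytes:
        bucket, key = path.split("/", 1)
        return boto3.client("s3").get_object(Bucket=bucket, Key=key)["Body"].read()


class OnPremEndpoint(FabricEndpoint):
    def read(self, path: str) -> bytes:
        with open(path, "rb") as f:
            return f.read()


class DataFabric:
    """Route the same read() call to whichever endpoint holds the data."""
    def __init__(self, endpoints: dict[str, FabricEndpoint]):
        self.endpoints = endpoints

    def read(self, location: str, path: str) -> bytes:
        return self.endpoints[location].read(path)


fabric = DataFabric({"cloud": S3Endpoint(), "core": OnPremEndpoint()})
# Same call shape regardless of where the data lives:
# fabric.read("cloud", "my-bucket/events/2018-11-27.json")
# fabric.read("core", "/data/events/2018-11-27.json")
```

The consumer never changes its call when data moves between the core and the cloud, which is the homogeneous access Kurian describes.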

Layered Security and Virtualization

Security and risk are still the most persistent inhibitors to cloud adoption. “What’s interesting to me is although security of the cloud’s data is the top concern for a cloud implementation, the security benefits of a properly installed cloud solution are actually the strongest point in favor of a cloud solution,” reasoned Tori Ballantine, Product Marketing Lead for Hyland Cloud & Cloud Applications. The traditional merit of cloud security is its multi-layered method in which there’s “your perimeter, your network, you’ve got your hosts inside there, then you’ve got your applications and then in the very middle is where your data needs to be,” Hyland Senior Solutions Engineer Steven Wyant explained.

"Security and risk are still the most persistent inhibitors to cloud adoption"

One way organizations can ensure these layers are constantly updated for new attacks is by utilizing virtualization. “In the cloud environments, we can build a virtual machine so when I need something new, like if a new operating system comes out, I just build a new virtual machine to keep things refreshed,” Wyant said. With the cloud’s global accessibility, however, organizations must remain vigilant about how “security protocols, privacy protocols” (as Janefalkar termed them) and governance requirements vary according to where the data are actually located.
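
A minimal sketch of the refresh pattern Wyant describes, using the standard boto3 EC2 calls: launch a replacement VM from a newer image, then retire the stale one. The AMI and instance IDs below are placeholders.

```python
import boto3

ec2 = boto3.client("ec2")

# Launch a fresh instance from the latest patched OS image (placeholder AMI).
new = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t3.medium",
    MinCount=1,
    MaxCount=1,
)
new_id = new["Instances"][0]["InstanceId"]

# Once the new instance passes health checks, terminate the stale one
# (placeholder instance ID) rather than patching it in place.
ec2.terminate_instances(InstanceIds=["i-0fedcba9876543210"])
print(f"Replaced stale VM with {new_id}")
```

The design choice is to treat machines as disposable: instead of accumulating patches on a long-lived host, each OS release produces a clean replacement.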

Preventing Breaches

Software defined perimeters are perhaps the most cogent security mechanism to buttress hybrid and multi-cloud deployments largely because, as Wyant noted, they facilitate security at the application layer. According to DH2i CEO Don Boxley, this methodology allows organizations “to dynamically deploy perimeter security wherever they need it in order to isolate specific services for fine-grained user access.” Competitive software defined perimeters randomly generate micro-tunnels between applications in distributed environments, connect them, then close their ports so they’re essentially invisible. Consequently, “nobody would know this is happening,” Boxley said about the ensuing communication between these remote applications. This approach decreases the attack surface of organizations’ networks, encrypts the data so not even the software facilitating the connections understands them, and is transferable wherever connections are necessary. “The basic goal is to significantly reduce attack surfaces, which will lead to vastly reduced numbers of security breaches,” Boxley observed.
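
The sketch below illustrates the micro-tunnel concept in deliberately simplified form: a listener appears on a random ephemeral port, serves exactly one pre-authorized peer, then closes so the port is no longer scannable. This is a conceptual toy, not DH2i’s implementation, and it omits the mutual-TLS encryption a real software defined perimeter would layer on.

```python
import secrets
import socket
import threading

def open_micro_tunnel(token: str) -> int:
    """Open a one-shot listener on a random ephemeral port; return the port."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))   # port 0: the OS picks an ephemeral port
    port = server.getsockname()[1]
    server.listen(1)

    def serve_once():
        conn, _ = server.accept()
        with conn:
            # Only a peer presenting the shared token may use the tunnel.
            if conn.recv(64).decode() == token:
                conn.sendall(b"payload-from-remote-app")
        server.close()              # port closes: effectively invisible again

    threading.Thread(target=serve_once, daemon=True).start()
    return port

# Usage: both sides hold an out-of-band token; the port exists only briefly.
token = secrets.token_hex(16)
port = open_micro_tunnel(token)
client = socket.create_connection(("127.0.0.1", port))
client.sendall(token.encode())
print(client.recv(64))              # b'payload-from-remote-app'
client.close()
```

Because the port is random, short-lived, and gated by a shared secret, a scanner has almost nothing to find—the attack-surface reduction Boxley describes.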

Containers

Containers frequently accompany hybrid and multi-cloud use cases. According to SAS Director of Product Management for Fraud and Security Intelligence Carl Suplee, REST APIs and containers “allow us to get to third-party data faster, allow us to bring that data in and allow that data to be consumed by your processes faster.” Containers are influential developer tools because they’re lightweight and scale swiftly. These attributes also prime them for application or software deployment, so developers “take that package, extract it and push it up to their production systems and make it easier to deploy when they’re ready,” Suplee said. Containers can also optimize ROI in multi-cloud use cases for purposes other than high availability.

NetApp Chief Product Architect Adam Carter described scenarios in which containers deploy applications in multiple clouds to “play the two off of each other in pricing like, is it cheaper for me to run this month more in Amazon rather than in Google?”
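
A toy version of that arbitrage might look like the following; the per-hour rates here are invented for illustration, not real price quotes from either provider.

```python
# Compare this month's (hypothetical) per-hour compute rates and schedule
# the containerized workload on whichever cloud is currently cheaper.
MONTHLY_RATES_USD_PER_HOUR = {"aws": 0.096, "gcp": 0.089}  # hypothetical

def cheapest_cloud(rates: dict[str, float]) -> str:
    return min(rates, key=rates.get)

target = cheapest_cloud(MONTHLY_RATES_USD_PER_HOUR)
print(f"Deploy containers to {target} this month")
```

The arbitrage only works because the container image runs unchanged on either provider; without that portability, switching costs would swallow the price difference.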

Containers are also critical to the cloud’s service-oriented economy because they enable resources to run in multiple settings, which enhances data fabrics by addressing factors other than the location of data. Carter remarked containers “make it super easy to define mobility. The things you need to move [applications] between those clouds, you can standardize them within containers so that you can pick up this container and set it in this container environment over here, and it helps make that much easier than if the application was just running on an OS.”

A Hybrid and Multi-Cloud Tomorrow

Cloud computing advancements are steering the economy of data assets from a product-focused one to a service-based one. New organizations, for example, could facilitate most data management needs in the cloud, if they were so inclined. “If you look at some legacy providers and they have to stand up a contact center in multiple locations, it takes a good four to six months,” Janefalkar acknowledged. “That can take three to four weeks if it’s designed on cutting-edge technologies, cloud based.”

However, for those with legacy systems and uncertainties about cloud security, the reality is an on-premise, hybrid cloud approach that frequently involves multiple clouds for disaster recovery and competitive pricing. Thus, organizations should anticipate the majority of tomorrow’s data management offerings will be, as Bien emphasized, “built as services, not as a monolithic stack.”

Jelani Harper is an editorial consultant servicing the information technology market, specializing in data-driven applications focused on semantic technologies, data governance and analytics.
