Data by itself is not very useful. Data only becomes useful when it is understood and infused into app experiences. This desire to put data to work has fueled a boom in cloud-based analytics. Although a relatively small share of IT spending currently goes to the cloud (roughly 6%, according to IDC in 2020), the momentum is shifting away from legacy on-premises business intelligence tools toward modern cloud-native options such as Google BigQuery, Amazon Redshift, Databricks, and Snowflake. The popularity of bridging data and cloud shows in Snowflake’s skyrocketing rise in DB-Engines’ database popularity rankings, from number 170 in November 2016 to number 11 in January 2023. Part of Snowflake’s success certainly boils down to performance, scalability, separation of storage and compute, and other benefits.
But arguably an even bigger benefit is simply the cloud itself. Snowflake was born in the cloud and offers a natural path for companies looking to migrate there. That same cloud keeps powering new databases past legacy alternatives, and it promises to continue revolutionizing the world of data in 2023.
All cloud, all the time?
While I don’t fully agree with my InfoWorld colleague David Linthicum that “2023 could be the year of public cloud repatriation,” I can agree that we shouldn’t fall blindly in love with a technology, wielding it like a hammer and treating every business problem like a nail. The cloud solves many problems, but not all of them. In areas related to advanced data-driven applications, however, the cloud is indispensable, as Linthicum acknowledges: “When it comes to advanced IT services (AI, deep analytics, massive scaling, quantum computing, etc.), public clouds are usually cheaper.”
Not only cheaper, but also more practical.
Years ago, AWS executive Matt Wood made this case to me, and it is as persuasive today as it was in 2015. “Those who go out and buy expensive infrastructure find that the scope and domain of the problem change very quickly,” he said. “By the time they manage to answer the original question, the business has moved on.” As he continued, if you “dump a lot of change into a data center that’s frozen in time,” the questions you can ask about your data get stuck in a time warp. Even in tough economic times, the wrong way to think about the cloud is through a narrow cost lens. Elastic infrastructure creates the flexibility to make sense of data: making dollars, so to speak, rather than merely counting dollars and cents. That is what cloud-based analytics tools offer.
Companies seem to understand this. At a recent analyst conference, Snowflake CFO Mike Scarpelli discussed the competitive dynamics in the data warehousing market: “We are never competing with Teradata [an incumbent data analytics company founded in the on-premises software era]. When a customer has made the decision to move off premises, we are never up against Teradata. They have made the decision to leave.” If a company undertaking a digital transformation exercise is already looking to the cloud, where is it looking? According to Scarpelli, “When we are competing for an on-premises migration, it is always [against] Google, Microsoft, [and] AWS, [but AWS] tends to partner more with us [out of] the gate.”
In other words, the customer may have spent years with their on-premises data warehouse or BI solution, but that’s not where they’re betting their future. Their future is the cloud. When they consider the next step, it’s not likely to be Oracle unless they’re so invested in Oracle that introducing a new system seems difficult. Most of the time, companies will be looking for a cloud-based database, data warehouse/lakehouse, or machine learning/artificial intelligence system. More Google BigQuery, in other words, and less SAP BusinessObjects.
Another reason for the cloud’s success is simplicity, or at least it can be. The cloud, of course, is not inherently easier to use, but many cloud systems have emphasized a SaaS approach that puts great weight on user experience. Take, for example, this comment from a Reddit thread describing one user’s experience with Snowflake: “If you need a PhD in physics to use your SaaS tool, your tool is useless. MySQL users (analysts) love it, the C-suite loves it; the only people who have a hard time are nerdy engineers like me who had enough hubris to think they could do it all themselves and that everyone in the world would learn PySpark one day.”
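The Reddit poster’s point can be made concrete with a toy comparison: the same business question answered as one declarative SQL statement versus hand-rolled aggregation code. This is only a sketch, using Python’s built-in sqlite3 as a stand-in for a cloud warehouse; the `orders` table and its numbers are invented for illustration.

```python
import sqlite3

# In-memory database standing in for a cloud data warehouse (illustration only).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("east", 120.0), ("west", 80.0), ("east", 30.0)],
)

# The analyst's version: one declarative SQL statement.
sql_totals = dict(
    conn.execute("SELECT region, SUM(amount) FROM orders GROUP BY region")
)

# The "do it all yourself" version: imperative aggregation in application code.
manual_totals = {}
for region, amount in conn.execute("SELECT region, amount FROM orders"):
    manual_totals[region] = manual_totals.get(region, 0.0) + amount

print(sql_totals)     # {'east': 150.0, 'west': 80.0}
print(manual_totals)  # same result, several times the code
```

Both paths produce the same answer; the difference is who can write them. The SQL version is within reach of the analysts and executives the Reddit poster describes, while the imperative version is the “learn PySpark” path writ small.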
I recently wrote about the democratization of data, and how companies are trying to give more employees access to, and the ability to work with, more and different data. I made the point that if companies really want to democratize data, they will need to teach employees how to effectively use cloud-based tools to query cloud-based data.
Fortunately, the cloud also allows machine learning systems to shoulder some of the heavy lifting. As my MongoDB colleague Adam Hughes writes, “The combination of real-time, operational, and embedded analytics (what some call translytics, HTAP, or augmented transaction databases) now allows data-driven applications to help determine, influence, and automate decision making for the application and provide real-time insights for the user.” This doesn’t mean that machines do the thinking for us, but rather that they take on the undifferentiated heavy lifting of computationally heavy data processing, leaving the user with the more thoughtful job of understanding what that data means to an application and, ultimately, to the business.
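The translytics/HTAP idea Hughes describes can be sketched in miniature: one store serves both the operational write and an immediate analytic read, with no ETL hop in between. This is a minimal, hypothetical sketch using Python’s sqlite3; the `payments` table, `record_payment` function, and fraud-check scenario are invented, and real HTAP systems do this at far greater scale.

```python
import sqlite3

# One store serving both transactions and analytics (the HTAP/translytics
# pattern, sketched in miniature; names and numbers are invented).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE payments (user_id INTEGER, amount REAL)")

def record_payment(user_id: int, amount: float) -> float:
    """Operational write followed immediately by an analytic read the
    application can use to influence its next decision (e.g. a fraud check)."""
    conn.execute("INSERT INTO payments VALUES (?, ?)", (user_id, amount))
    conn.commit()
    # Embedded analytics: aggregate over live transactional data, no ETL hop.
    (avg_amount,) = conn.execute(
        "SELECT AVG(amount) FROM payments WHERE user_id = ?", (user_id,)
    ).fetchone()
    return avg_amount

record_payment(1, 100.0)
running_avg = record_payment(1, 50.0)
print(running_avg)  # 75.0
```

The point of the sketch is the shape of the loop: the analytic answer reflects the transaction that just happened, so the application can act on it in the same request, which is exactly the decision-automation Hughes describes.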
All of this isn’t completely cloud-powered, but it’s absolutely cloud-enhanced and cloud-accelerated. Data has never been more important, and accessing and understanding it has never been easier, thanks to cloud computing. If I had to pick a near-certain prediction for 2023, it is that this trend will continue and accelerate.
Copyright © 2023 IDG Communications, Inc.