Joey on SQL Server

Microsoft Build 2024: AI Dominates the Database

Let's take a look at some of the bigger product and feature announcements made at the show, and what they mean for your enterprise database.

Microsoft's annual Build conference kicked off in Seattle today with the extensive slate of product announcements you've come to expect. While there was no news about a new version of SQL Server, there were many announcements about new features and services across the data platform and the broader Azure platform.

As you might expect, many of the announcements are about AI, including a new laptop chip that can run Copilot workloads locally, detached from the cloud. Let's take a deeper look at some of the announcements and how they affect both large and small IT organizations.

It's worth noting that while Build is primarily a developer-focused conference, it also brings a wealth of infrastructure-related announcements. Given the prevailing IT trends, you should not be surprised that the spotlight was on AI. The discussions revolved around practical implications, such as reference architectures for the Azure OpenAI Service, landing zone accelerators for better AI application support, and more guidance on building and deploying your own AI applications. In other words, the emphasis was on the real-world work of building and deploying AI applications, not just on the models themselves.

Among the data platform integrations, one that stood out was Azure AI Search. Microsoft has been conducting numerous demos showcasing retrieval-augmented generation (RAG), a pattern that supplements a model's answers with relevant data retrieved at query time rather than retraining the model. The announcements around RAG, particularly those related to Azure AI Search, were noteworthy. However, what caught my attention was the platform integration with Microsoft Fabric's OneLake via a connector. This integration allows Azure AI Search to index data stored in OneLake, so the organizational data feeding your AI applications can stay current, a feature that could significantly enhance Fabric's AI capabilities.
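
To make the RAG pattern concrete, here is a minimal sketch in Python. It assumes you already have an Azure AI Search index (called "org-docs" here, with a "content" field) and an Azure OpenAI chat deployment; every resource name, field name and deployment name below is a placeholder of mine, not something Microsoft announced.

from azure.core.credentials import AzureKeyCredential
from azure.search.documents import SearchClient
from openai import AzureOpenAI

# Placeholder endpoints, keys and names -- substitute your own resources.
search_client = SearchClient(
    endpoint="https://<your-search-service>.search.windows.net",
    index_name="org-docs",
    credential=AzureKeyCredential("<search-api-key>"),
)
openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)

question = "What is our retention policy for telemetry data?"

# Retrieve: pull the top-matching documents from the search index.
hits = search_client.search(search_text=question, top=3)
context = "\n\n".join(hit["content"] for hit in hits)

# Augment and generate: ground the model's answer in the retrieved text.
response = openai_client.chat.completions.create(
    model="<chat-deployment-name>",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)

The OneLake connector slots into the "retrieve" half of that pattern, keeping the search index populated from data that already lands in Fabric.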

The other big Fabric announcement is that Real-Time Intelligence is launching within Microsoft Fabric. This offering builds on Data Activator, which was introduced shortly after the initial launch of Microsoft Fabric, and aims to let customers make better business decisions more quickly using data streams in either low-code or code-rich scenarios. While this sounds a lot like Power Automate or Azure Logic Apps, Fabric introduces a real-time hub, a single place to ingest and process event data, and a more straightforward approach to integrating streaming data into the workflow.

While the ability to query streaming data using technologies like Apache Kafka and Azure Stream Analytics has been available for the better part of a decade, those solutions have been very challenging to integrate into a broader analytics ecosystem, particularly for enterprises with limited experience working with streaming data. The real-time hub, connectors and the newly introduced Eventhouse, which allows for data exploration with KQL, aim to make this process much easier. Given the inclusion of Data Activator and KQL, it feels like Microsoft is building this solution on its experience managing Azure telemetry in Log Analytics, a mature service.
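
If you have never worked with KQL, here is a small sketch of what exploring a stream in an Eventhouse could look like from Python, assuming the Eventhouse exposes a Kusto-compatible query URI you can copy from its details page; the URI, database and table names are all placeholders of mine.

from azure.kusto.data import KustoClient, KustoConnectionStringBuilder

# Placeholder query URI -- copy the real one from the Eventhouse's details page.
kcsb = KustoConnectionStringBuilder.with_az_cli_authentication(
    "https://<your-eventhouse>.kusto.fabric.microsoft.com"
)
client = KustoClient(kcsb)

# KQL: count events per device over the last 15 minutes of ingested data.
query = """
DeviceTelemetry
| where ingestion_time() > ago(15m)
| summarize events = count() by DeviceId
| top 10 by events
"""

response = client.execute("SensorDB", query)
for row in response.primary_results[0]:
    print(row["DeviceId"], row["events"])

If you have written Log Analytics queries, this will feel very familiar, which is part of why the Log Analytics lineage shows through.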

Another announcement around real-time analytics in Fabric was the mention of "Event-Driven Fabric," which allows users to respond to system events within Fabric and trigger Fabric actions. These "triggers" sound a lot like the way I use Azure Logic Apps to manage resources within the Azure platform, which could be promising.

There were a couple of other Fabric announcements of note. The first was Data Sharing, which allows for real-time data sharing across users and apps. Data sharing has long been an area where Microsoft lagged Snowflake, so it will be interesting to see the feature develop over time. The other compelling preview features are a GraphQL API and Fabric's support for user data functions. GraphQL is a potent query language that has become very popular in recent years and should be a good fit for the distributed nature of Fabric (a generic example of what a GraphQL call looks like is sketched below). User data functions can reduce repetitive effort around ETL (extract, transform and load) processes, which you still need to do in Fabric, despite what any marketing material might tell you.
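
For readers who have not used GraphQL, the following is a generic sketch of what calling a GraphQL endpoint looks like from Python. The endpoint URL, query shape and field names are hypothetical placeholders, not Fabric's actual generated schema.

import requests

# Hypothetical query against a hypothetical schema -- Fabric generates its
# schema from the data items you expose, so your field names will differ.
query = """
query HighProteinRecipes($minProtein: Int!) {
  recipes(filter: { proteinGrams: { gte: $minProtein } }, first: 5) {
    items { name proteinGrams calories }
  }
}
"""

response = requests.post(
    "https://<your-fabric-graphql-endpoint>/graphql",  # placeholder URL
    json={"query": query, "variables": {"minProtein": 30}},
    headers={"Authorization": "Bearer <aad-access-token>"},
    timeout=30,
)
response.raise_for_status()
print(response.json()["data"]["recipes"]["items"])

The appeal is that the client asks for exactly the fields it needs in one request, which maps nicely onto data spread across many Fabric items.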

In relational database news, Microsoft made some announcements related to AI capabilities in PostgreSQL. At recent conferences, Microsoft has shown several demos using the database as the data store for AI models. Azure Database for PostgreSQL can now call into the Azure OpenAI service to generate vector embeddings. One of the demos Microsoft has shown around this is querying a recipe database for "high protein recipe" -- obviously, that query wouldn't work against a traditional table structure, but when the data is converted to vector embeddings, the query can return relevant results. Azure Database for PostgreSQL also now includes real-time predictions using pre-trained machine learning models, a feature SQL Server has had for a while that works well for suitable data sets.
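
As a rough sketch of how that recipe demo works under the covers, here is the client-side version in Python, assuming a recipes table that already has a pgvector embedding column populated with Azure OpenAI embeddings; the table, column, deployment and connection details are all illustrative.

import psycopg
from openai import AzureOpenAI

openai_client = AzureOpenAI(
    azure_endpoint="https://<your-openai-resource>.openai.azure.com",
    api_key="<azure-openai-key>",
    api_version="2024-02-01",
)

# Embed the search phrase with the same model that embedded the recipe rows.
embedding = openai_client.embeddings.create(
    model="<embedding-deployment-name>", input="high protein recipe"
).data[0].embedding

with psycopg.connect("postgresql://<user>:<password>@<server>:5432/recipes") as conn:
    rows = conn.execute(
        """
        SELECT name, description
        FROM recipes
        ORDER BY embedding <=> %s::vector  -- cosine distance, nearest matches first
        LIMIT 5
        """,
        (str(embedding),),
    ).fetchall()

for name, description in rows:
    print(name, "-", description)

The new integration means the embedding call can also happen inside the database itself via the azure_ai extension, rather than in client code as shown here.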

Likewise, the Cosmos DB service announced many AI-related features, including vector support. My prediction is that over the next 12 months you will see nearly every commercial and open-source database announce support and enhancements for storing vector data, which is simply a symptom of the rise of AI. Cosmos DB also announced cross-region disaster recovery for vCore-based MongoDB accounts. I would like to see Microsoft move to this vCore pricing model for all of the Cosmos DB offerings -- customers want to know their total cost of ownership at the beginning of the month, not the end.
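
For context on what vector support looks like on the vCore-based MongoDB side, here is a sketch assuming a collection that already has a vector index on an "embedding" field; the connection string, names and query vector are placeholders, and the $search/cosmosSearch aggregation stage is the vCore vector search syntax as I understand it, so verify it against the current documentation.

from pymongo import MongoClient

# Placeholder connection string for a vCore-based Cosmos DB for MongoDB cluster.
client = MongoClient(
    "mongodb+srv://<user>:<password>@<cluster>.mongocluster.cosmos.azure.com/?tls=true"
)
products = client["retail"]["products"]

# Embedding of the shopper's question, truncated for brevity; in practice you
# would generate it with the same model used to embed the documents.
query_vector = [0.012, -0.034, 0.056]

results = products.aggregate([
    {
        "$search": {
            "cosmosSearch": {"vector": query_vector, "path": "embedding", "k": 5},
            "returnStoredSource": True,
        }
    },
    {"$project": {"name": 1, "score": {"$meta": "searchScore"}}},
])

for doc in results:
    print(doc["name"], doc["score"])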

One of the more exciting announcements out of Build was the availability of Cobalt 100 Arm VMs, built on Microsoft's own Cobalt chips. While this is an early offering, you will continue to see Arm deployments expand in Azure over time. Related to this announcement, SQL Server Data Tools, which runs in Visual Studio, now supports Arm64. Over time, you will see more and more Microsoft offerings on Arm, because Arm can reduce both power and compute costs for Microsoft in Azure, improving Microsoft's cloud margins. The other SQL-related announcement was that the Azure SQL Copilot features I wrote about in March are now available in preview in all Azure subscriptions.

Microsoft, through both its investment in OpenAI and its own service offerings, continues to make big investments in integrating AI across all of its products. As Build shows, the company is also investing in letting organizations build and integrate their own AI models into their applications and data workflows, and in building the tools to support those efforts. The other trend you see is that Microsoft continues to invest in Arm, including building its own chips and deploying them into Azure.

About the Author

Joseph D'Antoni is an Architect and SQL Server MVP with over two decades of experience working in both Fortune 500 and smaller firms. He holds a BS in Computer Information Systems from Louisiana Tech University and an MBA from North Carolina State University. He is a Microsoft Data Platform MVP and VMware vExpert. He is a frequent speaker at PASS Summit, Ignite, Code Camps, and SQL Saturday events around the world.
