Databricks Data + AI Summit
Curious where enterprise AI is headed? I just returned from the Databricks Data + AI Summit, where I got to see a pretty compelling roadmap. And I think it’s fair to say Databricks is reshaping how enterprises think about data and AI strategy.

The week was packed with bold ideas, real-world urgency and enough hybrid cloud talk to make your head spin, in the best possible way, of course! I came away energized by the pace of innovation and the seriousness with which leaders are approaching responsible AI at scale.

So, let’s dive into my biggest takeaways, from the keynote stage to some fantastic side chats throughout the summit.

Day 1: The lakehouse went agentic and Postgres joined the party

Ali Ghodsi, CEO of Databricks, wasted no time setting the tone by stating, “Data, open formats, governance and innovation are key to unlocking AI’s potential.”

That message resonated through every major announcement, starting with Lakebase, Databricks’ new managed Postgres engine built specifically for AI-native workloads. Think of it this way: If you’re building intelligent apps, you don’t just need a fast database. You need one that plays nicely with agents, lakehouses and your existing governance model. Lakebase aims to do all that by making Postgres a first-class citizen in the Databricks ecosystem. It’s in public preview, and from the buzz I heard, customers are intrigued.
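For those of us who like to tinker, the practical upside is that Lakebase speaks the standard Postgres wire protocol, so existing drivers and tooling should connect to it unchanged. Here’s a minimal sketch of the kind of agent-adjacent workload people were describing, using plain psycopg2; the endpoint, credentials and table below are hypothetical placeholders, not anything Databricks ships.

```python
# Minimal sketch: an agent persisting its outputs to a
# Postgres-compatible store such as Lakebase. The hostname,
# credentials and schema below are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="my-lakebase-instance.example.com",  # hypothetical endpoint
    dbname="agent_app",
    user="app_user",
    password="***",
    sslmode="require",
)

with conn, conn.cursor() as cur:
    # A table an agent-driven app might use for its results;
    # JSONB keeps the output flexible while staying queryable.
    cur.execute("""
        CREATE TABLE IF NOT EXISTS agent_runs (
            run_id      UUID PRIMARY KEY DEFAULT gen_random_uuid(),
            agent_name  TEXT NOT NULL,
            output      JSONB,
            created_at  TIMESTAMPTZ DEFAULT now()
        )
    """)
    cur.execute(
        "INSERT INTO agent_runs (agent_name, output) VALUES (%s, %s)",
        ("order-triage-agent", '{"status": "ok"}'),
    )

conn.close()
```

The point isn’t the snippet itself; it’s that your usual Postgres muscle memory carries over, which is exactly why pairing it with lakehouse governance matters.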

Also worth noting: the energy Jensen Huang of NVIDIA brought to the stage was electric. Huang stated, “The future of AI depends on powerful compute and open data platforms like Databricks to accelerate innovation.” It wasn’t just a throwaway soundbite. The convergence of open data platforms and serious GPU horsepower is real, and customers are watching it closely.

That focus on real-world architecture choices came up often in conversations throughout the event. More than half of the Databricks users I met were also using or evaluating Snowflake, and almost all of them were trying to reconcile the trade-offs. Databricks’ open-source DNA and emphasis on ownership clearly gave it an edge with teams that want flexibility. Meanwhile, cost transparency is becoming the new battleground.

Day 2: Governance took a front seat

If day one was about innovation, day two was about responsibility. And let me tell you, the governance conversations were everywhere. From Unity Catalog to ethical AI, leaders were treating this as mission-critical, not optional.

Ali Ghodsi summed it up perfectly when he said, “People underestimate how difficult it is to fully automate tasks and take humans out of the loop.”

That resonated deeply. It’s easy to talk about automation and agents. It’s a lot harder to build systems you can trust, especially when those systems are making high-stakes decisions.

One of the strongest moments came from Dario Amodei of Anthropic, who stressed the need for “AI that is safe and interpretable.” Jamie Dimon of JPMorgan Chase drove it home: “To scale AI responsibly, we need strong governance frameworks to ensure compliance and trust.” And that’s exactly where I saw Quest customers leaning in.

Key themes throughout the summit

Whether attendees were in finance, healthcare or retail, their concerns were the same.

I heard many versions of the same questions:

  • How do I govern AI workloads across hybrid environments?
  • How do I manage data without vendor lock-in?
  • How do I keep my costs under control as AI scales?

At our booth, we had great conversations around how we solve these issues. You could feel the enthusiasm among attendees who were eager to tackle three main challenges:

Governance that spans platforms. Many teams were excited about how our metadata and glossary capabilities can enhance governance across Databricks, Snowflake, Oracle and Postgres. With Unity Catalog updates, we’re now even better aligned to support secure, fine-grained access control and multi-cloud strategies.

Hybrid readiness. As Databricks integrates more closely with AWS, Azure, GCP, Tableau and Power BI, the appetite for interoperable governance and modeling tools is growing. Our platform-agnostic approach lets customers manage data lineage and policy across stacks, without being boxed in.

Cost visibility and control. As more workloads move to GPU-heavy environments, customers are watching every dollar. Our solutions track usage, optimize queries and spot anomalies before they spiral. AI can deliver massive ROI, but only if you can see and manage what you’re spending.
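To make that concrete, here’s a rough sketch of the kind of usage tracking those conversations kept circling back to, built on the Databricks SQL connector and the system billing tables. Treat it as an illustration under assumptions: the hostname, HTTP path and token are placeholders, and the system.billing.usage columns reflect my reading of the docs, so verify them against your own workspace.

```python
# Rough sketch: day-by-day DBU consumption by SKU from Databricks
# system tables. Hostname, HTTP path and token are placeholders;
# confirm the system.billing.usage schema in your own workspace.
from databricks import sql  # pip install databricks-sql-connector

QUERY = """
    SELECT usage_date, sku_name, SUM(usage_quantity) AS dbus
    FROM system.billing.usage
    WHERE usage_date >= current_date() - INTERVAL 30 DAYS
    GROUP BY usage_date, sku_name
    ORDER BY usage_date, dbus DESC
"""

with sql.connect(
    server_hostname="adb-000000000000.0.azuredatabricks.net",  # placeholder
    http_path="/sql/1.0/warehouses/abc123",                    # placeholder
    access_token="***",
) as conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for usage_date, sku_name, dbus in cur.fetchall():
        # Crude anomaly flag: any SKU burning an unusual daily total.
        flag = "  <-- worth a look" if dbus > 500 else ""
        print(f"{usage_date}  {sku_name:<45} {dbus:>10.1f}{flag}")
```

Nothing fancy, but even a report this simple tends to surface the runaway cluster or forgotten warehouse before the invoice does.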

One use case that sparked a lot of interest was enhancing Lakebase governance for AI workloads by integrating modeling, cataloging and lineage, so teams can trust the outputs of agent-driven applications. As Ghodsi put it, “Agents are now creating databases.” That’s amazing… or terrifying if your governance isn’t in place.

Beyond the keynotes: What really stuck with me

The most telling conversations happened in between sessions. Here’s what kept coming up behind the scenes.

Everyone’s rethinking data strategy. The differentiator is having clean, well-governed, context-rich data to feed your models. If your foundation is fragile or siloed, AI just magnifies the mess.

Dual-platform adoption is popular. Snowflake and Databricks aren’t an “either/or” for many teams; they’re a “both/and.” That makes vendor-agnostic, integrated governance tools more important than ever. It’s also why I love how we support Unity Catalog alongside other platforms.

Open formats are winning hearts and minds. Companies want choices, and they don’t want to refactor every time they switch vendors. Databricks’ commitment to open formats and open data resonated strongly, especially with enterprise architects.

Sustainability matters. Optimizing compute not just for performance but for energy efficiency came up more than once. Databricks is clearly thinking about how to reduce AI’s carbon footprint, and so are their customers.

The community vibe was fantastic. Between the open-source contributions, hackathons and packed hallway chats, I could feel how deeply the Databricks community is invested in solving real-world problems together. You can’t fake that energy, and it was inspiring to be a part of it.

Conclusion

If I had to sum up the week with one takeaway, it would be this: AI multiplies whatever foundation you have. That’s exciting if your data strategy is strong. It’s dangerous if it’s fragmented or poorly governed.

That’s where companies like Quest play a critical role. We’re not here to compete with platforms like Databricks. We’re here to help users maximize them. Whether it’s managing lineage, integrating glossaries, modeling new workloads or keeping costs in check, we’re helping enterprises build a foundation for AI they can trust.

And that’s the whole point. AI isn’t something you add on at the end. It’s something you build toward by ensuring your data stack is ready. And it was cool to hear firsthand how Databricks users are relying on Quest to lead that charge.

Is your data ready for AI?

Avoid costly failures in AI implementation by ensuring your data foundation is strong.


About the Author

Randy Rouse

Randy Rouse is Field CTO at Quest Software, where he empowers enterprise leaders to achieve secure, AI-ready transformation by aligning data strategy, governance and technology with business outcomes. With a career spanning presales engineering, solution architecture and executive leadership, including roles at EMC and Dell, Randy is recognized for driving operational resilience and data trust in complex environments. A passionate advocate for cyber resilience and modern IT, he regularly creates and shares engaging LinkedIn video content on AI adoption, identity security, endpoint protection, and Microsoft and database modernization. Randy is known for turning complexity into clarity, helping organizations scale securely and unlock the full value of their technology investments.