10 Databricks Financial Services Use Cases That Give You an Unfair Advantage

Blog · Databricks · 2 Jun 2025

Dan Williams

I checked every Financial Services use case session at the Databricks Data & AI Summit and I picked my favourites.

A lot of sessions are dedicated to client use cases. I analysed 42 of them, but I’m sure there are many more that offer real-world insights.

Whenever I speak to clients, their main interest is identifying Databricks use cases they can apply in their own organisations. Since serving that need is my job, I’m always ideating on applied use cases that we can help them with.

Seeing how banks, insurers and other financial services firms are using Databricks will open up a lot of avenues that both you and I can explore long after the Summit has ended.

Therefore, my hope for the Databricks Data + AI Summit is that we’ll get a sneak peek into what their use cases are and also how they’re shipping Databricks workloads to production.

Businesses aren’t buying the tech…

…they’re buying outcomes. As engineers, analysts or managers, real-world use cases drive the work that we do.

Think of a credit risk Lakehouse, a Delta Sharing pipeline or a live trading dashboard. These demos make you think about your own data estate and your own infrastructure. That’s why use-case-driven sessions are important: they provide shortcuts for you to explore those opportunities yourself.

There are almost 700 sessions and I can’t attend all of them, no matter how much I’d want to; there’s simply not enough time. A summit is also a great networking opportunity: you get to meet people you’ve only seen on social media and get a feel for where the industry is going. So in this article I’ll try to help you by identifying 10 sessions that I believe can help you on your Databricks journey.

How To Plan Your Summit

Use the Databricks Scheduler. The web tool and the phone app make it easy to avoid double-booking.

The Top Five Rule. Pick the 5 sessions most relevant to your needs, then add “nice-to-haves” around them.

Leave Hallway Gaps. The best intel often comes from chatting between talks. Keep that in mind and don’t overschedule yourself.

As recommended above, I used the Databricks scheduler to plan my sessions, and it worked well: I could include as many as possible without double-booking. I’ve booked a lot of sessions, as you’ll see below, but I don’t expect to attend all of them. Just keep your top 5 in mind and work around those.

So here are 10 use-case-focused Financial Services sessions that I’m recommending:

1

Transforming Credit Analytics With a Compliant Lakehouse at Rabobank

This session covers Rabobank’s transition to a secure, audit-ready data architecture built on Unity Catalog. Most likely they retired some data marts and migrated everything to a single Unity Catalog-governed Lakehouse.

I’m really excited about this one: they say they’re covering a framework for a phased migration to Unity Catalog, as well as how they meet regulatory compliance using UC’s lineage tracking.
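We won’t know Rabobank’s actual setup until the session, but for context: Unity Catalog records lineage in system tables you can query like any other dataset, which is exactly the kind of evidence auditors ask for. A minimal sketch, assuming a workspace with system tables enabled (the table name is hypothetical, not theirs):

```python
# Minimal sketch: pulling the lineage trail for one table out of Unity
# Catalog's system tables. Runs in a Databricks notebook where `spark` is
# predefined; 'credit.risk.exposures' is a hypothetical table name.
lineage = spark.sql("""
    SELECT source_table_full_name,
           target_table_full_name,
           entity_type,
           event_time
    FROM system.access.table_lineage
    WHERE target_table_full_name = 'credit.risk.exposures'
    ORDER BY event_time DESC
    LIMIT 100
""")
display(lineage)
```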

Companies face plenty of challenges when migrating, so this should be a great session.

2

Leveraging Databricks Unity Catalog for Enhanced Data Governance in Unipol

Although it’s similar to the Rabobank Unity Catalog use case, I think it’s worth attending if you deal with complex corporate structures. Unipol runs 7 subsidiaries, each on its own AWS account.

They will probably show how they map catalogs and schemas to each subsidiary, how they apply masking to sensitive claims data, and how they automate permissions with Terraform so teams can get access quickly. I hope this will give a clear blueprint for governed sharing across multiple business units.
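Purely as a sketch of that pattern (every name below is mine, not Unipol’s): Unity Catalog lets you express a subsidiary boundary as a catalog, hide sensitive claims columns behind a masking function, and grant access per team.

```python
# Sketch of a catalog-per-subsidiary layout with column masking.
# Runs in a Databricks notebook where `spark` is predefined; all catalog,
# table and group names are illustrative.
spark.sql("CREATE CATALOG IF NOT EXISTS subsidiary_a")
spark.sql("CREATE SCHEMA IF NOT EXISTS subsidiary_a.claims")

# Masking function: only members of `claims_admins` see raw policyholder IDs.
spark.sql("""
    CREATE OR REPLACE FUNCTION subsidiary_a.claims.mask_id(id STRING)
    RETURNS STRING
    RETURN CASE WHEN is_account_group_member('claims_admins') THEN id
                ELSE '***REDACTED***' END
""")

# Attach the mask to a sensitive column and grant read access to one team.
spark.sql("""
    ALTER TABLE subsidiary_a.claims.open_claims
    ALTER COLUMN policyholder_id SET MASK subsidiary_a.claims.mask_id
""")
spark.sql("GRANT SELECT ON SCHEMA subsidiary_a.claims TO `subsidiary_a_analysts`")
```

In practice you’d run statements like these from Terraform or CI rather than by hand, which is presumably where their automation story comes in.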

3

Transforming Financial Intelligence with FactSet Structured and Unstructured Data and Delta Sharing

This session matters because FactSet delivers some of the most widely used market-data feeds in capital markets.

But ingesting that data can be painful: it can mean nightly file drops, custom connectors or a long onboarding. I hope the team will show how Delta Sharing removes, or at least streamlines, those steps.

In this session we’ll hear from both sides: FactSet will explain how they publish the data, and their customer will explain how they consume and govern that data in production.
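To make the consumer side concrete before the session: with the open-source delta-sharing client, reading a shared table is only a few lines. A minimal sketch; the profile file comes from the data provider, and the share, schema and table names here are placeholders rather than FactSet’s real ones.

```python
# pip install delta-sharing
import delta_sharing

# Credentials file issued by the data provider (placeholder path).
profile = "config.share"

# Fully qualified name: <share>.<schema>.<table> (hypothetical names).
table_url = profile + "#factset_share.prices.end_of_day"

# Load the shared table into pandas; on a cluster you could use
# delta_sharing.load_as_spark(table_url) instead.
df = delta_sharing.load_as_pandas(table_url)
print(df.head())
```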

4

How Nubank improves Governance, Security and User Experience with Unity Catalog

This one is again similar to the Rabobank and Unipol sessions, but I could listen to 10 migration and upgrade stories and not get bored, so I’m looking forward to it as well.

It’s relevant because Nubank is one of the world’s largest digital banks: they run a Databricks estate that serves 3,000 active users and more than 4,000 production notebooks and jobs. They will show how they moved that scale onto Unity Catalog, and I’m sure it will be interesting to see.

5

Real-Time Market Insights — Powering Optiver’s Live Trading Dashboard with Databricks Apps and Dash

This is in my current top 5. High-frequency trading isn’t a traditional Databricks use case, because the platform doesn’t offer the latency that trade execution demands. But powering trading dashboards is a perfect fit for the Databricks Intelligence Platform.

In this session we’ll see how Structured Streaming and Databricks Apps can serve continuously updating market views. I’m looking forward to hearing about their architecture and, hopefully, some of the data processing techniques they use.
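I have no idea what Optiver’s actual pipeline looks like, but the general shape of a continuously updating market view in Structured Streaming is roughly the sketch below; the Kafka broker, topic, payload schema and output table are all my assumptions.

```python
# Sketch: rolling one-minute market view per symbol, fed from Kafka.
# Runs on a Databricks cluster where `spark` is predefined; broker, topic
# and table names are placeholders.
from pyspark.sql import functions as F

ticks = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
    .option("subscribe", "market-ticks")               # placeholder topic
    .load()
)

quotes = (
    ticks.selectExpr("CAST(value AS STRING) AS raw", "timestamp")
    .select(
        F.get_json_object("raw", "$.symbol").alias("symbol"),
        F.get_json_object("raw", "$.price").cast("double").alias("price"),
        "timestamp",
    )
    .withWatermark("timestamp", "1 minute")
    .groupBy(F.window("timestamp", "1 minute"), "symbol")
    .agg(F.avg("price").alias("avg_price"), F.max("price").alias("high"))
)

# A Databricks App (e.g. a Dash front end) can then poll this table.
query = (
    quotes.writeStream.outputMode("append")
    .option("checkpointLocation", "/tmp/market_view_ckpt")
    .toTable("live.market_view")
)
```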

6

Building Real-Time Trading Dashboards With DLT and Databricks Apps

This is also in my top 5 as Databricks is perfectly suited for post-trade monitoring use cases.

Seeing how Barclays builds real-time trading dashboards should be well worth the time.
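Again, purely a sketch of the DLT shape rather than anything Barclays will show; the source path, columns and expectation below are invented.

```python
# Sketch of a Delta Live Tables pipeline for post-trade monitoring.
# Only runs inside a DLT pipeline, where `dlt` and `spark` are available;
# the volume path and column names are illustrative.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw trade events landed from the upstream feed.")
def trades_raw():
    return (
        spark.readStream.format("cloudFiles")        # Auto Loader
        .option("cloudFiles.format", "json")
        .load("/Volumes/trading/raw/trades")         # placeholder path
    )

@dlt.table(comment="Validated, enriched trades for the dashboard.")
@dlt.expect_or_drop("valid_quantity", "quantity > 0")
def trades_clean():
    return (
        dlt.read_stream("trades_raw")
        .withColumn("notional", F.col("price") * F.col("quantity"))
    )
```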

7

Enterprise Financial Crime Detection: A Lakehouse Framework for FATF, Basel III, and BSA Compliance

FinCrime detection is another use case that’s a perfect fit for the Databricks platform. As AI and technology evolve, I believe FinCrime will put more and more organisations on the spot: global rules are getting stricter, data volumes keep growing, and regulators will keep adding to their list of demands.

Barclays will show how a Databricks Lakehouse can meet regulatory demands while cutting false positives and surfacing suspicious activity faster.

There’s a lot I’m looking forward to seeing in this session: for example, implementing policy-based access control for investigators, data scientists and auditors, or how workflows and jobs package alert data and attach lineage evidence for regulators.
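To make the access-control idea concrete, here’s a hedged sketch of how a Unity Catalog row filter could split one alerts table between those personas; the function, table and group names are all mine, not Barclays’.

```python
# Sketch: row-level access policy on a FinCrime alerts table.
# Runs in a Databricks notebook where `spark` is predefined; all names
# are illustrative.

# Investigators see every alert; auditors only see closed cases.
spark.sql("""
    CREATE OR REPLACE FUNCTION fincrime.policies.alert_filter(status STRING)
    RETURNS BOOLEAN
    RETURN is_account_group_member('investigators')
           OR (is_account_group_member('auditors') AND status = 'CLOSED')
""")

spark.sql("""
    ALTER TABLE fincrime.alerts.suspicious_activity
    SET ROW FILTER fincrime.policies.alert_filter ON (status)
""")
```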

8

Enhancing Efficiency With Security: How Morgan Stanley is Adopting a Fully-Managed Lakehouse

Morgan Stanley has always run tightly controlled cloud accounts because they need to satisfy global banking rules. That self-managed posture keeps data secure, but it costs time and money and takes a lot of engineers to maintain.

In this session they’re going to show how a fully-managed Lakehouse can meet those encryption, isolation and audit demands.

9

Learning from Goldman Sachs' Legend Lakehouse for Data Governance

You can see the pattern here in terms of use cases: Data Governance, Unity Catalog and modernising data estates.

This session is about how Goldman Sachs merged their data contracts layer with a Databricks Lakehouse. They will talk about how a highly regulated firm can keep fine-grained entitlements, open formats and real-time performance in one Lakehouse architecture.

10

Crypto at Scale: Building a Cost-Efficient, High-Performance Platform for Real-Time Blockchain Data

Blockchains never sleep: new blocks land every second on Bitcoin, Ethereum, Solana and hundreds of other chains. The presenters’ platform ingests that data, tags addresses and creates risk scores, and it does all of that in real time for exchanges, banks and other financial institutions.

In this session they’ll cover how they keep cloud spend in check when dealing with that much data, and how they use Structured Streaming and advanced analytics to generate alerts and dig deep into on-chain analysis.
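They’ll have far more sophisticated machinery than this, but one classic cost lever in Structured Streaming at that volume is bounding state with watermarks, so Spark can drop old aggregation buffers instead of holding them forever. A toy sketch; it assumes a streaming DataFrame `transfers` with `event_time`, `address` and `amount` columns, and the threshold is invented.

```python
# Toy sketch: watermarked streaming aggregation that flags high-volume
# addresses while keeping state (and therefore cost) bounded.
# Assumes `transfers` is an existing streaming DataFrame with columns
# event_time, address and amount; names and threshold are invented.
from pyspark.sql import functions as F

alerts = (
    transfers
    .withWatermark("event_time", "10 minutes")   # lets Spark expire old state
    .groupBy(F.window("event_time", "5 minutes"), "address")
    .agg(F.sum("amount").alias("total_amount"))
    .where(F.col("total_amount") > 1_000_000)    # illustrative risk threshold
)

(
    alerts.writeStream.outputMode("append")
    .option("checkpointLocation", "/tmp/chain_alerts_ckpt")
    .toTable("risk.chain_alerts")
)
```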

What sessions are in your top 5? What are you looking forward to seeing at the Databricks Data & AI Summit?

If you enjoyed this article, feel free to connect with me on LinkedIn, as I’d love to know how you’re taking advantage of Databricks in your organisation.
