Have you ever tried to create an Azure Key Vault-backed secret scope in Databricks using the UI, only to run into permission issues? I recently ran into this problem and wanted to share my experience.
When setting up the secret scope, I made sure to grant the managed identity in the Databricks managed resource group access to the Key Vault. However, I still couldn't retrieve the secret. After some digging, I realized that the service principal Databricks uses by default to reach the Key Vault is not the identity sitting in the managed resource group, which is why I kept getting an insufficient-permissions error.
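For anyone hitting the same wall, here is a minimal sketch of the flow I was attempting, using the Databricks REST API instead of the UI. Every name in it (workspace URL, scope name, vault resource ID, secret key) is a placeholder, not my actual setup:

```python
# Sketch: create an Azure Key Vault-backed secret scope via the Databricks REST API,
# then read a secret from a notebook. All names below are placeholders.
import requests

DATABRICKS_HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder workspace URL
# Key Vault-backed scopes are typically created with an Azure AD token for the
# Databricks resource rather than a workspace personal access token.
AAD_TOKEN = "<azure-ad-token>"

payload = {
    "scope": "kv-backed-scope",  # placeholder scope name
    "scope_backend_type": "AZURE_KEYVAULT",
    "backend_azure_keyvault": {
        "resource_id": (
            "/subscriptions/<sub-id>/resourceGroups/<rg>/"
            "providers/Microsoft.KeyVault/vaults/<vault-name>"
        ),
        "dns_name": "https://<vault-name>.vault.azure.net/",
    },
}

resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.0/secrets/scopes/create",
    headers={"Authorization": f"Bearer {AAD_TOKEN}"},
    json=payload,
    timeout=30,
)
resp.raise_for_status()

# Inside a notebook on that workspace, the secret would then be read with:
#   dbutils.secrets.get(scope="kv-backed-scope", key="db-password")
# That call is where the insufficient-permissions error surfaces when the principal
# Databricks actually uses to reach the vault has no Get/List rights on secrets.
```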
I've seen videos where the solution is to pick 'Databricks' as a managed identity when adding an Azure role assignment on the Key Vault, which effectively grants access for every workspace. But when I checked my role-assignment window, that option wasn't there. It's possible it isn't offered for Premium-tier workspaces, which come with finer-grained access control.
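Since the option never showed up for me in the portal, one workaround, assuming the videos are right that the Databricks enterprise application is the principal that needs vault access, would be to create the role assignment programmatically. This is only a sketch: the subscription ID, resource group, vault name, and the service principal's object ID are placeholders you would look up in your own tenant, and the role may need adjusting to match your vault's permission model:

```python
# Sketch: grant the Databricks service principal read access to the vault by
# creating the role assignment in code instead of through the portal UI.
# All IDs below are placeholders; verify the role GUID against Azure's
# built-in role list for your environment.
import uuid

from azure.identity import DefaultAzureCredential
from azure.mgmt.authorization import AuthorizationManagementClient

subscription_id = "<subscription-id>"
vault_scope = (
    f"/subscriptions/{subscription_id}/resourceGroups/<rg>/"
    "providers/Microsoft.KeyVault/vaults/<vault-name>"
)
databricks_sp_object_id = "<object-id-of-the-databricks-service-principal>"

# Built-in "Key Vault Secrets User" role, which allows reading secret contents.
role_definition_id = (
    f"/subscriptions/{subscription_id}/providers/Microsoft.Authorization/"
    "roleDefinitions/4633458b-17de-408a-b874-0445c86b69e6"
)

client = AuthorizationManagementClient(DefaultAzureCredential(), subscription_id)
client.role_assignments.create(
    scope=vault_scope,
    role_assignment_name=str(uuid.uuid4()),  # each assignment is keyed by a new GUID
    parameters={
        "role_definition_id": role_definition_id,
        "principal_id": databricks_sp_object_id,
        "principal_type": "ServicePrincipal",
    },
)
```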
If you're also working with a Premium Databricks workspace on an Azure free trial, you might run into the same issue. I'd love to hear about your experience and any solutions you've found.