Databricks offers two types of tokens for authenticating against its REST APIs:
1) Personal access tokens (PATs)
2) Azure Active Directory tokens

A PAT is used as an alternative to a password when authenticating against the Databricks REST APIs. In this post, I am going to show you how to use a PAT to access, with Power BI, some Spark managed tables that I created specifically for this blog post.

A good thing to know: PATs are enabled by default for every Databricks workspace created in or after 2018. If your workspace is older than that, have an administrator enable them.

As a user, you can create a PAT and use it in REST API requests. Tokens have optional expiration dates, and they can be revoked if needed.
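To give you a feel for how a PAT is actually used, here is a minimal Python sketch (standard library only) that builds a REST API request against the Clusters API with the token passed as a Bearer header. The workspace URL and token below are placeholders, not real values:

```python
import urllib.request


def build_list_clusters_request(workspace_url: str, token: str) -> urllib.request.Request:
    """Build a GET request for the Databricks Clusters API, authenticating with a PAT."""
    return urllib.request.Request(
        url=f"{workspace_url}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {token}"},  # the PAT goes here
    )


# Placeholder values -- substitute your own workspace URL and generated token.
req = build_list_clusters_request(
    "https://adb-1234567890123456.7.azuredatabricks.net", "dapi..."
)
# Calling urllib.request.urlopen(req) would return the JSON list of clusters.
```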

Token Creation

  • First things first, go ahead and log in to your Databricks workspace.
  • Once logged in, click on your username in the upper-right corner, then choose “User Settings”:
  • On the left-hand side under the User blade, choose Developer:
  • One of the first options you should see on the next page is “Access Tokens”; go ahead and click on Manage:
  • Click on “Generate Token”:
  • Fill out the info. I like to be specific about what the token will be used for, and I also choose an expiration date. Once filled out, click on Generate:
  • Once you hit Generate, a popup will display your key. COPY THIS KEY SOMEWHERE! It will only be displayed this one time. If you do not copy it, I don’t know of a way (right now) that it can be retrieved for you.
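As a side note, tokens can also be created programmatically via the Token API (POST /api/2.0/token/create), which takes a comment and a lifetime in seconds. Here is a sketch of just the request body; the 90-day lifetime is an example value, not a recommendation:

```python
import json


def build_token_request_body(comment: str, lifetime_days: int) -> str:
    """JSON body for POST /api/2.0/token/create: a comment and a lifetime in seconds."""
    return json.dumps({
        "comment": comment,
        "lifetime_seconds": lifetime_days * 24 * 60 * 60,
    })


# Example: a token for this blog post's demo, expiring in 90 days.
body = build_token_request_body("Power BI blog post demo", 90)
```

The same Bearer-token header shown earlier would be used to authorize this POST.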

Configuring Power BI with the Token

Now that we have our token created, the next step is to grab our Databricks compute cluster information and put it into the Power BI connection window. Follow along below to accomplish this.

  • On the left side of your Databricks webpage, click on “Compute”:
  • Click on the cluster you will be connecting to:
  • Click on the “Advanced Options” section to expand it, and copy down the “Server Hostname” and “HTTP Path” fields:
  • Now, let’s open up Power BI Desktop, click on “Get Data”, then choose More:

  • Then in the popup menu, choose Azure on the left side, choose Azure Databricks on the right, and click Connect:

  • Fill out the Server Hostname and HTTP Path fields with the information you copied from your Databricks cluster, then click on “OK”:

  • Choose the Personal Access Token option, paste in the token you generated and copied earlier, then click Connect:



If all goes correctly, you should end up with a Navigator screen where you can browse the folder/file structure in your Databricks cluster/workspace.

One thing to note: make sure your cluster is running! If it is not running, you won’t be able to connect.
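If you want to check that programmatically before connecting, the Clusters API (GET /api/2.0/clusters/get) returns a `state` field, and RUNNING is the state you want. A small sketch of that check; the response dictionary below is a mocked example, not a live API call, and the cluster ID is a made-up placeholder:

```python
def cluster_is_running(cluster_info: dict) -> bool:
    """True when a Clusters API response reports the cluster as RUNNING."""
    return cluster_info.get("state") == "RUNNING"


# Trimmed example of a response from GET /api/2.0/clusters/get?cluster_id=...
mock_response = {"cluster_id": "0000-000000-abcdefgh", "state": "TERMINATED"}
cluster_is_running(mock_response)  # False -- start the cluster first
```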

Leave a comment below and let me know what you think!

By admin
