Antriksh Sharma

Move Power BI Datasets to a different workspace with Python

As the title suggests, let's see how to move (technically, recreate) a Power BI dataset in the Power BI Service into a different workspace. The gist of the blog is to download the BIM file, a JSON file that contains the data model in textual form: it describes the objects of the Tabular Object Model (TOM) hierarchy, i.e. what you see in Power BI.
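For reference, the top of a BIM file looks roughly like this (an illustrative, heavily trimmed example; real files contain the full table, column and measure definitions):

{
    "name": "Sales",
    "compatibilityLevel": 1550,
    "model": {
        "culture": "en-US",
        "tables": [ ... ],
        "relationships": [ ... ]
    }
}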


We will use the TOM DLLs with Python, so you first need to follow this blog for the setup: https://www.antmanbi.com/post/installing-tom-library-to-connect-python-to-power-bi-ssas


First, we start by loading the required libraries:

import json
import clr # From pythonnet, remove any existing clr -- pip uninstall clr
import os
import msal
import requests


# Tabular Object Model DLLs that are required to interact with SSAS Dataset

folder = r"C:\Windows\Microsoft.NET\assembly\GAC_MSIL"

clr.AddReference(folder + 
    r"\Microsoft.AnalysisServices\v4.0_15.0.0.0__89845dcd8080cc91\Microsoft.AnalysisServices.dll")

clr.AddReference(folder +
    r"\Microsoft.AnalysisServices.Tabular\v4.0_15.0.0.0__89845dcd8080cc91\Microsoft.AnalysisServices.Tabular.dll")

clr.AddReference(folder +
    r"\Microsoft.AnalysisServices.Tabular.Json\v4.0_15.0.0.0__89845dcd8080cc91\Microsoft.AnalysisServices.Tabular.Json.dll")

import Microsoft.AnalysisServices as AS
import Microsoft.AnalysisServices.Tabular as Tabular

Then we need to declare variables that correspond to the application that you need to create here: https://app.powerbi.com/embedsetup (you can do this in the Azure portal as well).


You need to grant this application access to the workspaces with appropriate rights, either individually or by adding the application to a security group that is a member of the workspaces.


Once that is done, the next piece of code declares the Application ID, the Power BI Tenant ID, and the Application Secret that you get at creation time (the secret is shown only once, so if you lose it you need to create another one):

app_id = '*********************************'
pbi_tenant_id = '*********************************'
app_secret = '*********************************'


def get_access_token():
    
    authority_url = f'https://login.microsoftonline.com/{pbi_tenant_id}'
    scopes = [r'https://analysis.windows.net/powerbi/api/.default']

    client = msal.ConfidentialClientApplication(
        app_id, 
        authority=authority_url, 
        client_credential=app_secret
    )
    
    response = client.acquire_token_for_client(scopes)
    token = response.get('access_token')
    
    return token


token = get_access_token()
headers = {
    'Content-Type': 'application/json',
    'Authorization': f'Bearer {token}'
}

To make sure your application is able to authenticate, you can print(token), which should return the access token.
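For example, a minimal sanity check (a sketch; it prints only a prefix of the token so the full credential does not end up in your console history):

if token:
    print(token[:40] + '...')  # token acquired successfully
else:
    print('Authentication failed, check app_id, pbi_tenant_id and app_secret')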


Next, we get a list of workspaces that are on a dedicated capacity (Premium or Premium Per User):

# Get list of Premium/PPU workspaces, as TOM communication can happen only through the XMLA endpoint

def get_workspaces():
    
    workspaces = 'https://api.powerbi.com/v1.0/myorg/groups?$filter=(isOnDedicatedCapacity eq true)'
    response_request = requests.get(workspaces, headers=headers)
    result = response_request.json()
    
    workspace_names = [workspace['name'] for workspace in result['value']]
    return workspace_names
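A quick way to inspect the result (assuming the application has been added to at least one Premium/PPU workspace):

for workspace_name in get_workspaces():
    print(workspace_name)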

Next, we create a function that iterates over all databases on a connected server and exports each data model as a Model.bim file:

bim_folder = r"C:\Users\antsharma\Downloads\Power BI\\"
if not os.path.exists(bim_folder):
    os.makedirs(bim_folder)

# Downloads Model.bim file to a specific folder on system
def export_model_json(server: Tabular.Server):
    
    for db in server.Databases:

        # ScriptCreate returns a TMSL "create" command; the actual model
        # definition sits under create -> database
        script = Tabular.JsonScripter.ScriptCreate(db)
        json_file = json.loads(script)['create']['database']
        raw_json = json.dumps(json_file, indent=4)
        
        # Write the model definition as <DatasetName>.bim
        with open(bim_folder + db.Name + '.bim', 'w') as model_bim:
            model_bim.write(raw_json)

Next, we iterate over the result of the get_workspaces function, connect to each workspace, and call export_model_json to download a BIM file for every Power BI dataset in it:

# Iterate each workspace and download Model.bim file
def connect_workspace_and_export_bim():
    
    workspaces = get_workspaces()
    
    for name in workspaces:
        # XMLA endpoint of the workspace; the service principal authenticates
        # with the app:<AppId>@<TenantId> user format
        workspace_xmla = f"powerbi://api.powerbi.com/v1.0/myorg/{name}"
        conn_string = f"DataSource={workspace_xmla};User ID=app:{app_id}@{pbi_tenant_id};Password={app_secret};"
        
        server = Tabular.Server()
        server.Connect(conn_string)
        
        export_model_json(server)
        server.Disconnect()
        

connect_workspace_and_export_bim()
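At this point the folder should contain one .bim file per dataset, which you can confirm with:

for bim_file in os.listdir(bim_folder):
    print(bim_file)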

After this, we connect to the single workspace in which we want to recreate the datasets:

# Once all files are downloaded publish them to a workspace:
server = Tabular.Server()
workspace_xmla = "powerbi://api.powerbi.com/v1.0/myorg/Created w Python"
conn_string = f"DataSource={workspace_xmla};User ID=app:{app_id}@{pbi_tenant_id};Password={app_secret};"
server.Connect(conn_string)


def publish_model_bim(bim_folder_path, server: Tabular.Server):
    
    for filename in os.listdir(bim_folder_path):
        f = os.path.join(bim_folder_path, filename)
        
        if os.path.isfile(f):
            # Use the file name (without extension) as the dataset name;
            # GetNewName returns a unique name if that one is already taken
            file_name = os.path.splitext(os.path.basename(f))[0]
            new_dataset_name = server.Databases.GetNewName(file_name)
            
            with open(f) as bim:
                json_file = json.load(bim)
                # Rename the database and force the V3 (enhanced) metadata format
                json_file['id'] = new_dataset_name
                json_file['name'] = new_dataset_name
                json_file['model']['defaultPowerBIDataSourceVersion'] = "powerBI_V3"
            
            raw_json = json.dumps(json_file, indent = 4)

            # Turn the JSON back into a TOM Database object...
            db = AS.JsonSerializer.DeserializeDatabase(
                raw_json, 
                DeserializeOptions = 'default', 
                CompatibilityMode = 'PowerBI'
            )

            # ...and deploy it with a createOrReplace TMSL command
            script = Tabular.JsonScripter.ScriptCreateOrReplace(db)
            server.Execute(script)


publish_model_bim(bim_folder, server)
server.Disconnect()

The end result is that every downloaded Model.bim file now exists as a dataset in the target workspace.
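A quick way to verify (a minimal sketch, reusing the conn_string of the target workspace from above):

server = Tabular.Server()
server.Connect(conn_string)

for db in server.Databases:
    print(db.Name)  # every recreated dataset should show up here

server.Disconnect()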


The next step is to take over the datasets and bind them to a gateway, which I will add to this post shortly.
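In the meantime, here is a rough sketch of what that could look like with the Power BI REST API (a sketch under assumptions: Default.TakeOver and Default.BindToGateway are the relevant endpoints, and workspace_id, dataset_id and gateway_id are placeholder GUIDs you would look up in your own tenant):

# Sketch only: take over a dataset as the service principal, then bind it to a gateway
# workspace_id, dataset_id and gateway_id are placeholders, not values from this post
workspace_id = '<workspace guid>'
dataset_id = '<dataset guid>'
gateway_id = '<gateway guid>'

base = f'https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/datasets/{dataset_id}'

requests.post(f'{base}/Default.TakeOver', headers=headers)
requests.post(f'{base}/Default.BindToGateway', headers=headers, json={'gatewayObjectId': gateway_id})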

