Microsoft Fabric
Satori streamlines and simplifies the process of controlling access to data in Microsoft Fabric, and reduces the risk of data leakage caused by misconfigured users or permissions. When using Satori to deliver access to data in Microsoft Fabric, organizations provision a single set of credentials that are used only by the Satori DAC to access Microsoft Fabric.
Satori generates temporary credentials for data consumers instead of using their existing Microsoft Fabric credentials.
Prerequisites
Perform the following steps to grant Satori access to Microsoft Fabric:
- In your Microsoft Fabric account, click the settings button on the toolbar and open the Admin portal.
- Go to the Admin API Settings and enable the Service principal can access read-only admin APIs toggle.
- Apply this setting to a specific security group (provide the name of the group in the input field).
- Enable the Enhance admin APIs responses with detailed metadata toggle.
Create an Azure service account with the following permissions:
NOTE: Ensure that the service account is associated with the previously created security group.
- Access to the Power BI API
- Write access to an Azure Data Lake Storage Gen2 account used to hold the tables and the DAX files
- Alternatively, a Fabric Lakehouse that uses Data Lake Storage Gen2
- Make sure that the service account has access to all of the relevant workspaces (a verification sketch follows these prerequisites).
- Note the workspace ID, tenant ID, and the client key and secret of the Azure service account that you created; you will need them when adding the data store.
- Now, create the data store as described in the next section.
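Before adding the data store, you can optionally confirm that the service account works end to end. The following Python sketch is illustrative only: the tenant, client, storage account, and filesystem values are placeholders, and it assumes the azure-identity, azure-storage-file-datalake, and requests packages are installed. It checks that the service principal can call the Power BI read-only admin API (which requires the admin portal toggle enabled above) and can write to the Data Lake Storage Gen2 filesystem.

```python
# Illustrative check of the service account created in the prerequisites.
# All <placeholder> values are hypothetical; replace them with your own.
import requests
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")

# 1. The read-only admin API only succeeds if the admin portal toggle is enabled
#    and the service principal is in the allowed security group.
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default")
resp = requests.get(
    "https://api.powerbi.com/v1.0/myorg/admin/groups?$top=1",
    headers={"Authorization": f"Bearer {token.token}"},
)
print("Admin API status:", resp.status_code)  # 200 means the toggle and group are set up

# 2. Confirm write access to the ADLS Gen2 account that will hold the DAX and entitlement files.
dls = DataLakeServiceClient(
    account_url="https://<storage-account-name>.dfs.core.windows.net",
    credential=credential,
)
fs = dls.get_file_system_client("<filesystem>")
fs.get_file_client("satori-access-check.txt").upload_data(b"ok", overwrite=True)
print("ADLS Gen2 write access confirmed")
```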
Adding a Microsoft Fabric Data Store to Satori
- Log in to the Satori Management Console.
- Go to the Data Stores view and click the Add Data Store button.
- Select the Microsoft Fabric option.
- Provide an informative name for the data store, for example: Sales Data Warehouse.
- Select the region where your data is hosted, for example: us-east-1.
- Select the Data Access Controller (DAC) to use for accessing the data store.
- Enter the workspace ID and tenant ID for your Microsoft Fabric environment.
- Enter the client key and secret of the Azure service account that you created in the prerequisites.
- Click the Create button. Satori will then validate the credentials.
- Finally, you will be redirected to the Data Stores list view.
Note: Once the data store is successfully configured, the scan will start automatically and the location should appear in the inventory.
Setting Up the Microsoft Fabric Data Store
Once the data store has been added to Satori, perform the following tasks:
- Go to the Data Stores view, select your new Fabric data store instance, and select the Data Store Attributes tab.
- Choose Fabric attributes from the Data Store Specific Attributes drop-down menu.
- Select Azure Storage V2 from the storage location drop-down menu.
- Finally, configure the storage path for your Azure Data Lake Storage Gen2 location using the following URL format: https://{accountName}.{dnsSuffix}/{filesystem}
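For example, assuming a storage account named contosodata and a filesystem (container) named satori (both names are illustrative), and given that the DNS suffix for Data Lake Storage Gen2 in public Azure is dfs.core.windows.net, the storage path would be:
https://contosodata.dfs.core.windows.net/satori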
Optional - Using Fabric Lakehouse as your Storage Account
If you would like to use a Fabric Lakehouse as your storage account, fetch the storage account path by performing the following steps:
- Log in to your Fabric Lakehouse.
- In the Explorer panel, click the folder that you created for Satori; Satori will use it to store your DAX and entitlement files.
- Click the three dots icon that appears and select the Properties list item.
- Copy the Properties URL.
Creating an RLS Security Policy in Satori
Once you have created an RLS security policy for your Fabric data store in Satori, go to your storage instance and verify that three new file types have been created by Satori (a listing sketch follows this list). These file types are as follows:
- One dax.text file for each of the tables that you defined in the Satori security policy.
- One CSV and one Parquet file that contain the entitlements for the semantic models.
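If you prefer to check from a script rather than the portal, the following Python sketch lists the files Satori wrote to the configured storage location. It is illustrative only: the folder, account, and credential values are placeholders, and it assumes the azure-identity and azure-storage-file-datalake packages are installed.

```python
# Illustrative listing of the files Satori created in the configured folder.
# All <placeholder> values are hypothetical; replace them with your own.
from azure.identity import ClientSecretCredential
from azure.storage.filedatalake import DataLakeServiceClient

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
dls = DataLakeServiceClient(
    account_url="https://<storage-account-name>.dfs.core.windows.net",
    credential=credential,
)
fs = dls.get_file_system_client("<filesystem>")

# Expect one .dax.text file per table plus the CSV and Parquet entitlement files.
for item in fs.get_paths(path="<satori-folder>"):
    if not item.is_directory:
        print(item.name)
```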
Applying the RLS Policy to your Fabric Semantic Model
To apply the RLS security policy to the Fabric semantic model, perform the following tasks:
- Open your semantic model in Power BI Desktop.
- Create a table called satori_user_filter from the CSV or Parquet file that Satori uploaded to your Azure Data Lake Storage Gen2 account. The name of the file is the same as the semantic model.
Note: To learn how to load tables from Azure storage into Power BI, refer to the following tutorial: https://learn.microsoft.com/en-us/power-query/connectors/analyze-data-in-adls-gen2
Applying the DAX Filter to a Table
To apply a DAX filter to a table, perform the following tasks:
- Copy the content of the DAX file ([semantic model].[table].dax.text) that Satori created in your storage location.
- Click the Manage Roles button in the toolbar and paste the content into the Table Filter DAX expression input in the Manage Roles security panel.
- Ensure that the Table Filter DAX expression is applied to the user role for all Fabric service users.
Note: To understand how to assign DAX filters to tables, refer to the following tutorial: https://learn.microsoft.com/en-us/fabric/security/service-admin-row-level-security
Applying the RLS to the Fabric Service
To apply the RLS to the Fabric service, perform the following tasks:
- Publish the semantic model from Power BI Desktop.
- Go to the semantic model security configuration and apply the role that you defined in the Power BI Desktop application to all users.
- Go to the semantic model and click the Refresh button to validate that you have correctly configured the credentials.
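If you prefer to trigger the refresh programmatically, a request like the following can queue a refresh through the Power BI REST API. This is a sketch only: the workspace and semantic model IDs are placeholders, and the service principal must have access to the workspace.

```python
# Illustrative refresh trigger via the Power BI REST API.
# The <placeholder> values are hypothetical; replace them with your own.
import requests
from azure.identity import ClientSecretCredential

credential = ClientSecretCredential("<tenant-id>", "<client-id>", "<client-secret>")
token = credential.get_token("https://analysis.windows.net/powerbi/api/.default")

resp = requests.post(
    "https://api.powerbi.com/v1.0/myorg/groups/<workspace-id>/datasets/<semantic-model-id>/refreshes",
    headers={"Authorization": f"Bearer {token.token}"},
)
print(resp.status_code)  # 202 means the refresh request was accepted
```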
Security Validation
To validate that everything is working correctly, perform the following steps:
- Go to the Workspace view, select your semantic model row in the table, click the three dots icon, and select Security from the list menu.
- Go to the role you just created, click the three dots icon, and select Test as Role.
- Click the Now Viewing As: tab and select the Select Person tab.
- Enter the name of the user that you wish to test and ensure that the report is filtered correctly.
- You're Finished :)
Updating the Fabric Security Policy
To update or change the Fabric security policy, perform the following tasks:
- Open the Fabric Security Policy and make the relevant changes to the policy.
- If you made a change to the location selector, a new DAX file is created. You must reapply this newly created DAX file as you did in the previous section, Applying the DAX Filter to a Table.
- If you made a change to the filters, a user attribute, or a group, the entitlement table is automatically updated and the change is applied at the next semantic model refresh.