Connect to Power BI dataflows and datasets using Power BI Desktop with multiple accounts


Power BI datasets and dataflows are the two native data sources for Power BI reports. Connecting to a dataset allows a report to be built against an existing Power BI dataset in place, while a dataflow represents a source of data that has already had transformations applied to it. When connecting to a Power BI dataflow, data is imported into a data model, but a connection to a Power BI dataset is a direct connection.

The two sources handle identity in drastically different ways, and this can lead to confusion when dealing with multiple accounts and tenants. This post is an attempt to clear up that confusion.

Connecting to a dataset

To connect to a Power BI dataset, select the “Get Data” button from the ribbon, select the “Power BI” tab, select Power BI dataset, and finally the “Connect” button.

Connecting to a Power BI dataset

Next, if a user is signed in to Power BI Desktop, a list of workspaces and their datasets is presented. If not, the user is prompted to sign in. A dataset can then be selected and connected to.

Selecting a Power BI dataset

The important thing to notice here is the list of workspaces itself. The list presented is a list of the workspaces available in the tenant belonging to the currently signed in user. It is the same list of workspaces that can be chosen as a publishing target. It should also be noted that the identity of the user is displayed in the upper right corner of the dialog box, and the identity can be changed directly from there.

Connecting to a dataset in a different tenant

The signed in user can be changed by selecting “Sign in” at the upper right of the Power BI Desktop client, or within the connection dialog itself. If the user signs into a different account (in a different tenant), a different list of workspaces and datasets will then be offered. The dataset source is hard linked to the currently signed in user. In this way, the Power BI dataset source behaves differently than all other data sources, which maintain connection credentials separately.
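For workspaces that are backed by Premium capacity (where XMLA endpoints are available), the same kind of connection can also be sketched in Power Query M via the Analysis Services connector. The workspace and dataset names below are hypothetical, and this is a sketch rather than the dialog's own generated code; the key point is that the identity used to authenticate determines which tenant's workspaces are reachable:

let
    // XMLA endpoint of a (hypothetical) Premium workspace "Sales Analytics";
    // the account used to authenticate determines the tenant being accessed
    Source = AnalysisServices.Database(
        "powerbi://api.powerbi.com/v1.0/myorg/Sales Analytics",
        "Sales Model"
    )
in
    Source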

Connecting to a Dataflow

Connecting to a dataflow follows the same steps as a dataset, with the exception that the “Power BI dataflows” option is chosen.

Connecting to a Power BI dataflow

At this point, a Power BI data connection dialog will be shown.

Signing in to a Power BI dataflow

Only one authentication option is presented, because an organizational account is the only authentication method that Power BI dataflows support.

Unlike datasets, dataflows are NOT linked to the currently signed in user; it is the connection that is authenticated, not the current user. The “Sign In” button must be selected, and authentication completed, in order to connect to a Power BI dataflow.

Once signed in, selecting the “Connect” button will display a list of workspaces that contain dataflows. Expanding the workspace and then the dataflow will expose a list of entities that can be imported into the Power BI data model.

Selecting a Power BI dataflow
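Behind the scenes, the import connection that these steps create is expressed in Power Query M. A minimal sketch of the resulting query is shown below; the workspace, dataflow, and entity names are hypothetical, and the exact record fields used for navigation can vary by Power BI Desktop version:

let
    // Connect to the dataflows visible to the authenticated account
    Source = PowerBI.Dataflows(null),
    // Navigate workspace -> dataflow -> entity (names are illustrative)
    Workspace = Source{[workspaceName = "Sales Analytics"]}[Data],
    Dataflow = Workspace{[dataflowName = "Cleansed Sales"]}[Data],
    Customers = Dataflow{[entity = "Customers"]}[Data]
in
    Customers

Note that the query itself contains no credentials; authentication is resolved from the single cached “Power BI dataflows” credential.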

The connection information for the dataflow is cached by Power BI Desktop, and subsequent connections to dataflows will not require the user to sign in; the same authentication credentials will be used.

It should be noted that unlike the dataset connection dialog, this one neither shows the current credentials nor allows them to be changed. This makes the process of changing credentials in order to use dataflows in multiple tenants somewhat less than intuitive.

Connecting to a dataflow in a different tenant

With datasets, changing the currently signed in user will result in a different set of datasets being presented when the dataset option is chosen. This is different with dataflows. No matter which user is currently signed in, the cached credentials will be used.

This behaviour can be confusing when multiple tenants need to be accessed. With most other data sources, the cached credentials are linked to the specific data source. For example, when connections are made to two different SQL databases, Power BI caches two different sets of credentials.

To connect to dataflows in a different tenant, the current connection information needs to be cleared. This can be done for any data source, but it is particularly important for dataflows, as it is the only way to switch connection credentials.

To clear the credentials for the dataflow source, select “File”, then “Options and Settings”, and then “Data Source Settings”. The Data source settings dialog will then be presented.

Power BI Data source settings
Clearing the credentials for a Power BI dataflow

Unlike most other data sources, which can have multiple entries in the list (one for each unique data source), there will only ever be one entry for dataflows, named “Power BI dataflows”. For example, if the current instance of Power BI Desktop has authenticated to three different SQL servers, there will be three SQL Server connections in this list, but only one for dataflows, no matter how many tenants have been connected.

To switch tenants, the current credentials must be either cleared, or edited. The cached credentials can be fully removed by selecting “Clear Permissions” or they can be changed by selecting “Edit Permissions”. If cleared, the user will be prompted for credentials the next time the dataflow option is selected. If edited, the new credentials will be stored.

Conclusions, and recommendations

It is possible to work with multiple tenants for both connected datasets and dataflows. However, the methods for doing so are completely different for the two options, which can obviously lead to some confusion.

It is my opinion that this behaviour should be changed, and that the behaviour of connected datasets is the more intuitive of the two. If the credentials of the currently signed-in user were used for both types of connection, the experience would be much more intuitive, and also easier for report designers to use.


What Domain Accounts to Use For SharePoint 2010

When setting up a SharePoint farm, whether 2007 or 2010, you have the option of providing various service identities throughout the process. Indeed, every application can run with its own identity. Far too often, administrators pick a single account, and use it for everything. While this is certainly the easiest approach, it is far from the most secure, and it can be very limiting down the road if you need to get granular with your permissions. The trouble is that there are a lot of intricacies as to what account does what, and getting it right requires a pretty comprehensive understanding of the product.

We now have enough work under our belt with SharePoint 2010 that I feel comfortable sharing some of our best practices around account creation for SharePoint 2010. The product itself has gotten more complex, and so therefore have the configuration options. There is no “one size fits all” approach for all scenarios, but the list that I am providing below should work as a good starting point. There is often a trade-off between ease of manageability and good security, and the approach below, I feel, finds a good balance.

The chart below describes each account, its purpose, the rights it needs on the local machines in the farm (including the SQL Server machine(s)), the rights it needs in SQL Server directly, and the rights it needs in the Active Directory domain.

Base Set of SharePoint 2010 Service Accounts

spSetup
  • Purpose: used to log in to the farm servers; used to install bits on the farm servers
  • Local Rights: Administrator; Remote Desktop Login
  • SQL Rights: DB Creator; Security Admin
  • Domain Rights: Member

spFarm
  • Purpose: identity for all Windows services; identity for all SQL services (optional); identity for the Profile Synchronization service; identity for all code running with elevated permissions (web parts)
  • Local Rights: None (1)(3)
  • SQL Rights: DB Creator; Security Admin
  • Domain Rights: Member

spApps
  • Purpose: identity for all SP application app pools (4)
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member

spServices
  • Purpose: identity for all SP service applications (4)
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member

spUPS
  • Purpose: identity for the User Profile service
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member; Replicating Directory Changes (2)

spCrawl
  • Purpose: used by the indexer to crawl content
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member (5)

spBI
  • Purpose: trusted account for Reporting Services and PerformancePoint when not using Kerberos
  • Local Rights: None
  • SQL Rights: DB access as appropriate
  • Domain Rights: Member

spSuperUser
  • Purpose: used for object caching
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member

spSuperReader
  • Purpose: used for object caching
  • Local Rights: None
  • SQL Rights: None
  • Domain Rights: Member

    (1) Needs to be a part of the Local Administrators group while the User Profile service is being created. See my previous post for more details. Once created, this account can be removed.

    (2) AD Permission required by the User Profile service

    (3) Required for a specific AD container when using the incoming email service. See this post for details on how.

    (4) There may be a large number of these, one per entity

    (5) Appropriate rights will need to be granted to this account for any EXTERNAL content being crawled (file system, shared folder, Lotus Notes, etc)

    Hopefully this will help a few of you get started with a little less head scratching.