
Tag: Power Query

Exceed the 500,000 row limit in Application Insights and Log Analytics with Power BI

The combination of Power BI and Application Insights (AI)/Log Analytics (LA) is a powerful one. These tools provide a quick, convenient, and relatively cheap way to collect and analyze telemetry on a wide variety of applications. One drawback of AI/LA is that any data query will return a maximum of 500,000 rows, which can be quite constraining in some cases. This article describes a way to work around this limit.

In this example, we’ll be working with an Application Insights instance that is being populated by the WordPress Application Insights plugin – in fact, it’s the one used on this very blog. There are a couple of ways to connect Power BI Desktop to AI data. The Power Query code is downloadable directly from Application Insights, and you can also use the Azure Data Explorer proxy address as outlined in my post on the topic here. This approach will work for both methods, and for our purposes, we’ll be using the generated Power Query code approach.

To begin, access your Application Insights instance, and open the Logs window. If necessary, dismiss the “Queries” window that pops up. Next, form your query using Kusto Query Language (KQL). In our case, we want a simple dump of all rows in the “pageViews” table, so the query is simple – just pageViews.

Once we have the query the way that we want it, we select the Export button and choose “Export to Power BI (M query)”. M is the name of the language that Power Query uses. Once chosen, a text file is downloaded that contains the Power Query code that we will need in Power BI Desktop.

At this point, we launch Power BI Desktop, and choose “Get Data”. Since we already have the query that we need, we will choose “Blank Query”.

Next, we name our query “Page Views”, and select the Advanced Editor. This is where we can paste in the query generated by Application Insights. We open the downloaded file, copy its contents, and paste them into this window (the top comments can be excluded).

Of note here is the value that is automatically set for timespan. By default, this is P1D, which means that data will be retrieved only for the previous day. In our example, we have changed it to P365D to retrieve data for the past 365 days.
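For illustration, here is a minimal sketch of the relevant part of the exported code, assuming the Application Insights REST endpoint that the export calls; the application ID placeholder and the type-mapping steps of the real export are omitted:

let
    // <app-id> is a placeholder; the downloaded file contains the real value
    Source = Json.Document(Web.Contents(
        "https://api.applicationinsights.io/v1/apps/<app-id>/query",
        [
            // timespan changed from the default P1D to P365D (the past 365 days)
            Query = [#"query" = "pageViews", #"timespan" = "P365D"],
            Timeout = #duration(0, 0, 4, 0)
        ]))
in
    Source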

Selecting “Done” loads a preview of our data into Power Query. However, if we then load it into the data model, it will do so in a single pull, and we will be subject to the 500,000 row limit. What we need to do is break the query up into multiple queries, and Power Query lets us do this through the use of functions.

The first thing that we’ll need to do is decide how to segment the AI data. In our case, it is unlikely that we will have more than 500,000 page views per month, so if we perform one query per month, we should be able to retrieve all of our data. In order to do this, we go back to Application Insights and form a query that returns the list of year/month combinations in our data. In our case, this query is:

pageViews
| where timestamp > now(-365d)
| summarize by 
    Year = datetime_part('Year',timestamp), 
    Month = datetime_part('Month',timestamp)

Note that the number of days in the where clause above should match the number of days in the larger query. Next, export this query to Power BI, and create another query in Power Query, leaving the name as default for now. Selecting Done should return a list of years and months for your data. These values are all numbers, and Power Query recognizes them as such. However, we need to work with them as text later on, so we change their types to text.
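The type change is a single generated step; in the Advanced Editor, it looks roughly like this (the step and source names are approximate):

#"Changed Type" = Table.TransformColumnTypes(Source, {{"Year", type text}, {"Month", type text}})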

Now we return to our original query and modify it so that it only returns data for a single month. Reopen the Advanced Editor and replace the query “pageViews” with:

pageViews
| where datetime_part('Month',timestamp) == 10 and datetime_part('Year',timestamp) == 2020

The values chosen don’t matter, but they should return data. In the end, the edited code looks as follows:
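A trimmed-down sketch of the edited query, again omitting the app ID and the type-mapping steps of the full export:

let
    Source = Json.Document(Web.Contents(
        "https://api.applicationinsights.io/v1/apps/<app-id>/query",
        [
            Query = [
                #"query" = "pageViews | where datetime_part('Month',timestamp) == 10 and datetime_part('Year',timestamp) == 2020",
                #"timespan" = "P365D"
            ],
            Timeout = #duration(0, 0, 4, 0)
        ]))
in
    Source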

Selecting Done, we verify that we have data restricted to the specified month. This is where the fun begins. We are now going to turn this query into a function. To do so, we right-click on our pageViews query and select “Create Function”.

We are then presented with a dialog box that asks if we want to create the function without parameters. We can go ahead and select “Create”. We are then prompted to name the function, and we’ll call it “GetViewsByMonthAndYear”. We now need to edit the function. To do so, with the function selected in the query pane, we select the Advanced Editor once again. We dismiss the warning that follows, and then we edit the function in two places. First, we define two variables to be passed to the function, Month and Year, and then we add them to our query.

In the function declaration, we add “Month as text” and “Year as text”. We then replace the explicit month and year that we originally queried for with these new variables, Month and Year. Our function code now appears as below:
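Building on the sketches above, the function would look roughly like this:

(Month as text, Year as text) =>
let
    // Month and Year are concatenated into the KQL query text
    Source = Json.Document(Web.Contents(
        "https://api.applicationinsights.io/v1/apps/<app-id>/query",
        [
            Query = [
                #"query" = "pageViews | where datetime_part('Month',timestamp) == " & Month & " and datetime_part('Year',timestamp) == " & Year,
                #"timespan" = "P365D"
            ],
            Timeout = #duration(0, 0, 4, 0)
        ]))
in
    Source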

Now we are ready to use our function. We select our query that contains the list of years and months, select the “Add Column” tab from the ribbon, and choose “Invoke Custom Function”. We give the new column a name “Views”, select our function from the dropdown, and then we select our column containing years and the column containing months to be passed to the function.
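Behind the scenes, Invoke Custom Function adds a step along these lines (the name of the preceding step will reflect your own query):

#"Invoked Custom Function" = Table.AddColumn(#"Changed Type", "Views", each GetViewsByMonthAndYear([Month], [Year]))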

At this point, selecting “OK” will cause the function to be executed for each of the listed months. These are individual queries to AI, not one large one. Each query is still subject to the 500,000 row limit, but provided that no specific month exceeds that limit, all of the data will be returned.

Initially, the data is returned as a single table per month, but selecting the expand icon at the right of the column header allows us to retrieve the row values. It’s also a good idea to turn off the “Use original column name as prefix” option.

Selecting OK at this point displays all of the appropriate column values. We can then remove the “Year” and “Month” columns, as well as the original Page Views table that we used to create the function. We also need to set the data types for all of our columns because Power Query is unable to detect them using this approach.

Renaming our combined query to Views gives us the following result:

We still have a single table, but there is no longer a 500,000 row limit. At this point, we can load the data into the model and build our report.


Connect to Application Insights and Log Analytics with Direct Query in Power BI

Application Insights (AI) and Log Analytics (LA) from Microsoft Azure provide easy and inexpensive ways to instrument applications. Using just an instrumentation key, any application can send operational data to AI which can then provide a rich array of tools to monitor the operation of the application. In fact, the blog that you are reading uses an Application Insights plugin for WordPress that registers each view of a page into an instance of AI in my Azure tenant.

Application Insights data can be queried directly in the Azure portal to provide rich insights. In addition, the data can be exported to Excel for further analysis, or it can be queried using Power Query in either Excel or Power BI. The procedure for using Power Query can be found in this article. That approach uses the Web connector in Power Query, which can be refreshed automatically on a regular basis. The Web connector does not, however, support Direct Query, so the latency of the data in this scenario is limited by the refresh schedule configured in Power BI. Any features that depend on Direct Query (Aggregations, Automatic Page Refresh) will also not work.

If you’ve worked with AI or LA and dropped down to the query editor, you’ve been exposed to KQL – the Kusto Query Language. This is the language used by Azure Data Explorer (ADX), code-named “Kusto”. This is of course not a coincidence, as the Kusto engine powers both AI and LA.

Power BI contains a native connector for ADX: you can configure an ADX cluster for yourself, populate it, and work with it in Power BI for both imported and Direct Query datasets. Given that ADX is what powers AI and LA, it should be possible to use this connector to query the data for AI and LA. It turns out that the introduction of a new feature known as the ADX proxy allows us to do just that.

The ADX proxy is designed to allow the ADX user interface to connect to instances of AI and LA and run queries from the same screens as native ADX clusters. The entire process is described in the document Query data in Azure Monitor using Azure Data Explorer. The document explains the process, but what we are particularly interested in is the syntax used to express an AI or LA instance as an ADX cluster. Multiple variations are described in the document, but the ones that we are most interested in are here:

For LA: https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>

For AI: https://ade.applicationinsights.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.insights/components/<ai-app-name>

By substituting in your subscription ID, resource group name, and resource name, you can treat these resources as if they were ADX clusters, and query them in Power BI using Direct Query. As an example, a simple query on this blog can be formed using the ADX connector:
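As a sketch, such a query can also be expressed directly in M with the Azure Data Explorer connector, using the proxy address as the cluster and the resource name as the database (the KQL shown here is illustrative, not the exact query used for the screenshot):

let
    // Placeholders: substitute your subscription ID, resource group, and AI resource name
    Source = AzureDataExplorer.Contents(
        "https://ade.applicationinsights.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.insights/components/<ai-app-name>",
        "<ai-app-name>",
        "pageViews | summarize Views = count() by bin(timestamp, 1d)")
in
    Source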

And the result will appear as:

The precise query is provided in the query section of the connector above.

Once the report is built, it can be deployed to the Power BI service and refreshed using AAD credentials.

It is important to note that this method does NOT require you to configure an ADX cluster of your own. We are simply utilizing the cluster provided to all instances of AI and LA. We therefore do not have any control over performance levels, as we would in a full ADX cluster. However, if the performance is adequate (and the queries are designed appropriately), this can be a good approach to working with AI and LA data that has low-latency (near real time) requirements.


Get the most out of Power Query in Power BI Dataflows


Power BI dataflows are the first place that many users will encounter the new Power Query web-based interface. Until now, Power Query has been restricted to Power BI Desktop and Excel. This new web interface is, well, new, and it doesn’t contain all of the capabilities of the more mature client-based interface. The good news is that you can take advantage of both the reusability of dataflows and the maturity of the Power BI Desktop interface.

Two Interfaces

A quick glance at the user interfaces for Power Query on the web and in Power BI Desktop reveals the feature difference.

Power Query editor in dataflows

Power Query editor in Power BI Desktop

The Desktop editor has a full ribbon interface with a wide array of capabilities, while the web interface has a simple button bar with a subset of features. In the images above (which show the exact same set of queries in the two interfaces), it is easy to see that the combine binaries (or expand tables) feature is not there for the “Content” column.

Advanced editor

The key to bringing all of these Power BI Desktop capabilities to Power Query in dataflows is the Advanced editor. Power Query is, at its essence, an interface that is used to construct queries using the M language. This core code is available to you in both Desktop and dataflows.

In Desktop, the Advanced editor is available from the ribbon both in the Home tab and in the View tab. In the web based editor it is available by right clicking on an entity (query) and selecting Advanced editor.

 

The code revealed by this editor can appear rather daunting for a complex query, but all you really need to understand is how to copy and paste it. Build up whatever query you need using Desktop (or Excel!), open the Advanced editor, and copy it to your clipboard. Then, either create a new dataflow or add an entity to an existing dataflow using the Blank Query data source. Once the editor is open, right click on the query, open the Advanced editor, and paste the query from your clipboard.

Done. Well, almost. While both environments execute M code, there are a few differences to be aware of.

Some functions may not work

There are subtle differences between the M engine in Excel and the one in Power BI Desktop. This approach works very well between those two products, but occasionally an incompatibility can crop up. It is no different with the M engine for dataflows. If you do encounter an incompatibility, try achieving the same result a different way in Desktop and try again.

Data Sources

Dataflows do not support all of the data sources that Power BI Desktop and Excel do. This will of course change over time, but as of this writing, dataflows are in preview and currently support 24 data sources, compared to almost 100 in Power BI Desktop.

Data sources supported in dataflows – Nov 2018

Queries pasted into a dataflow that use an unsupported data source will therefore likely not work. However, there’s nothing stopping you from trying; I’ve been pleasantly surprised by a few.

Functions

Functions ARE supported in dataflows. They can be created using a blank query (and copying the function code from Power BI Desktop). However, if that’s all that you do, you may receive an error like “This dataflow contains computed entities, which require Premium to refresh” or “We cannot convert a value of type Table to Function”.

You do NOT need Premium to use functions, but a function must have its “Enable Load” option disabled. This is done by right clicking on the function and toggling the Enable Load item to off.

Turn off Enable Load for a function

Computed Entities

Computed entities (or calculated queries) are supported by dataflows but because the type of calculation can’t be predicted, they require the isolation that dedicated capacity (Premium) provides.

Referenced tables are an example of computed entities. If you are in the habit of designing a base query that does not load data, and then creating variants of that table that do load in your reports, you will need to change that design in dataflows in order to avoid the Premium requirement.

In Power BI Desktop, this is the difference between Duplicate and Reference when creating a new query from a base query. Duplicate simply creates a new query with the same steps, while Reference creates a computed entity. If you want to avoid Premium, you’ll need to use Duplicate, as the sketch below illustrates.

New query create options in Power BI Desktop
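To illustrate with a sketch, assume a base query named BaseQuery (a hypothetical name, as is the source). The two options produce M along these lines:

// Duplicate: the new query repeats the base query's own steps
let
    Source = Sql.Database("myserver", "mydb"),  // hypothetical source
    Filtered = Table.SelectRows(Source, each [Year] = 2018)
in
    Filtered

// Reference: the new query points at the base query itself,
// which dataflows treat as a computed entity
let
    Source = BaseQuery
in
    Source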

The Power Query capabilities in dataflows are more powerful than they might appear at first glance. Power BI Desktop is the key to unlocking them, unless you’re already a total wizard at writing M code. Even then, the new editing features in Power BI Desktop likely put it over the top as an editor.

For now we need to cut and paste, but I would love to see a day when Power BI Desktop could connect directly to a dataflow and edit it in place.


Use Power BI dataflows to warehouse SharePoint list data


Reporting on SharePoint data has been a requirement for a long time, and there have been many approaches to fulfilling this need. Custom web parts, Data View web parts, and SSRS direct-connected reports have historically been some of the solutions, but they all suffer from the same problem. If you have any serious amount of SharePoint data, you’ll quickly begin to bump into capacity limits and performance limitations, and in some cases, you can impact the performance of the overall system. In order to avoid this problem, it is necessary to warehouse SharePoint data first, as I argued in this post from 2012.

Once your list-based data is in a relational database, the performance issue is taken care of. However, the means of getting it moved there have traditionally been problematic. For a long time, there was a CodePlex project called the SharePoint List Source and Destination. This solution provided read and write access to SharePoint lists from SQL Server Integration Services (SSIS). Unfortunately, it was last updated in 2012, it was unsupported by Microsoft, and it did not support authentication for Office 365. This of course rendered it useless for use with SharePoint Online. In 2015, SQL Server Integration Services got an OData source, and given that SharePoint lists have OData endpoints, this became a viable option, particularly given that it did support Office 365 authentication. The OData connection from SharePoint did however have some limitations as well.

For cloud scenarios, Power BI has emerged as a very competent way of reporting against SharePoint data. It has native connectors for SharePoint list data, both on premises and in the cloud, and Power BI reports can be hosted in the cloud through the SharePoint Power BI web part. On premises, the same can be done with Power BI Report Server. The structure of Power BI reports means that the data is cached in a data model, so reports are not run directly against the list data source. This avoids the performance issues listed earlier.

Earlier this year I published a series of articles detailing how to do exactly this. The only issue with this approach is that the data shaping and preparation is always specific to a single report. If I have 5 different reports that use one list, I must query and shape that data 5 different times – once for each report. This is where Power BI dataflows come in.

In this context, dataflows are essentially a data warehousing layer with transformation capability. Instead of each report connecting back to a source list, the dataflow connects to the list, shapes the data with Power Query online and stores it in a data lake. The Power BI reports then connect to the dataflow as their data source. Transformation and storage only need to happen once.

As of this writing, dataflows are in public preview, so be warned – some things could change.

Creating a dataflow

Creating a dataflow from a SharePoint list is relatively straightforward. In our examples below, we will work with the same sample list from the series of articles on SharePoint data earlier this year. To begin, open Power BI and navigate to a workspace (your personal workspace will not have dataflows). Click on the workspace name in the navigation pane, and the dataflows tab should be available.

To create a new dataflow, Select the Create button, and click dataflow.

Select the Add new entities button, and the data source selection will appear. SharePoint list and SharePoint Online list are both options. SharePoint list is for on-premises list data, which works with the On-Premises Data Gateway. In our case, we are working with SharePoint Online, so we select the SharePoint Online source.

At this point, you enter the URL for the site that you want to connect to (NOT the URL for the list) and select the Next button. Power BI will connect to the site, and you can then select which list you want to work with. In our case, we need our Listings data, so we select that list and click Next.

Finally, we’re in the Power Query editing screen. This should be quite familiar to those used to working with Power Query in either Power BI Desktop or in Excel. From here you can select the columns that you want to include in the dataflow.

Although this experience is similar to building queries in Power BI Desktop, there are a few noticeable differences. Queries in a PBIX file are referred to as queries, but within a dataflow they are referred to as entities. These entities can be custom, or they can be mapped to Common Data Model object types. The Power Query web editor also does not include the full-featured editing ribbon found in Power BI Desktop, but instead has a button bar. Many of the editing options available in Power BI Desktop are not available in the Power Query web experience.

If you have read through some of my earlier articles on working with SharePoint data in Power BI, you will notice that there are fewer columns available than we see in the Desktop Power Query editor. Most notable for those of us working with SharePoint data is the FieldValuesAsText column, which is the convenient way of retrieving the text representation of complex SharePoint list column types. At first glance, this would appear to be quite limiting.

However, by right-clicking on the entity name, we can access the Advanced Editor.

This Advanced editor allows you to write queries by hand using the M language. The side benefit of the Advanced editor is that it makes queries portable between platforms – Desktop, Excel, and now dataflows. You can therefore build your queries in Power BI Desktop using its fully functional editor and then copy and paste them into a new blank query in the dataflow editor. Using this approach allows you to take advantage of the SharePoint helpers built into Power BI Desktop, as the FieldValuesAsText column and other columns are available. Using this technique, the Listings example can be transformed into several normalized tables in the dataflow.
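As a sketch, a hand-written entity query built in Power BI Desktop and pasted into a dataflow might look like the following (the site URL, list name, and expanded field names are hypothetical):

let
    Source = SharePoint.Tables("https://contoso.sharepoint.com/sites/demo", [ApiVersion = 15]),
    Listings = Source{[Title = "Listings"]}[Items],
    // FieldValuesAsText holds the text rendering of complex column types
    Expanded = Table.ExpandRecordColumn(
        Listings,
        "FieldValuesAsText",
        {"Author", "Status"},
        {"Created By", "Status"})
in
    Expanded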

Click on Done to save your entities, and then the Save button to save your dataflow. You will be prompted to Refresh Now, which is a good idea because, by default, the dataflow has no data contained within it. To keep the data up to date, you need to set a refresh schedule by clicking the schedule refresh icon under actions for the dataflow in question. From here, you schedule data refresh in the same manner as you would with a Power BI report.

Using the dataflow

Once data is loaded into the dataflow, it becomes a source for a Power BI report. You must use Power BI Desktop to create this report; there is no way to connect a report to a dataflow in the pure web interface. Start Power BI Desktop and select “Get Data”. Choose the Power BI blade and then Power BI dataflows.

After clicking Connect, you will be presented with a set of Power BI workspaces that contain dataflows. Opening the workspace will allow you to open the dataflow and select the desired entities.

Once loaded, the report can be built just like any other. When it is refreshed, it will be refreshed from the data stored in the dataflow, NOT directly from the SharePoint list. It is therefore important to keep the dataflow itself up to date.

Any number of reports can be created from the dataflow. Instead of having all the transformation logic tied up within a single report, dataflows allow it to be centralized and consistent. With a little work, these transformations allow you to work with your SharePoint data just as though it were relational. Power BI dataflows really are the best way to perform data warehousing with your SharePoint data, whether your SharePoint is online or on-premises.


Using Excel Files with Power BI Desktop and SharePoint

As discussed in a previous post, Working with Excel Files in the Power BI Service, Excel and Power BI have a rich, complex relationship. Power BI Desktop is the primary design tool for Power BI, and it has many feature overlaps with Excel as an analytic tool. Excel can be used both as an analytic tool and a data source, and the structure of the Excel file will dictate the way that Power BI Desktop can be used with it. If Excel is being used as an analytic tool (i.e. connected to data), the appropriate items in the file can be imported into Power BI Desktop. If it is being used as a data source (data in worksheets), Power BI Desktop will connect to it, and use its data to build a model. This post attempts to articulate the nuances of both scenarios.

Importing from Excel

Power BI Desktop has an unfortunate name, in my opinion. It is a design tool and is not meant to replicate the capabilities of the Power BI service on the desktop, as the name might suggest. A better name for the product would, I believe, be Power BI Designer. Its purpose is to connect to and transform data (Power Query), build data models for analysis (Power Pivot), and build reports (report designer).

Used as an analysis tool, Excel has all of these capabilities as well. In fact, the first two (Power Query and Power Pivot) are identical to what is already in Power BI Desktop. Excel also has Power View for analytic reporting. Power View is very similar to the type of reporting in Power BI Desktop, but it uses a different technology and has been deprecated for some time. As a result, Excel charts and pivot tables are the primary means of visualizing data in Excel.

So why would you need Power BI Desktop if you are using Excel? As explained in my previous post, the Power BI service can fully interact with Excel as an analysis tool, right from the service itself. If Excel is meeting all your analytics needs, then there may be no need to introduce Power BI Desktop at all. However, if you wish to take advantage of Power BI’s analytic reporting capabilities and you have existing Excel assets, you may wish to convert them to the native Power BI format. Whatever the reason, moving from Excel to Power BI is relatively straightforward with Power BI Desktop.

From the File menu in Power BI Desktop, select Import, and then Excel Workbook Contents.

Importing Excel Files contents

You are then prompted to select an Excel file. Once selected, you are then presented with a warning dialog.

Excel files contents warning dialog

The dialog does a very good job of explaining what will happen, specifically the fact that data from worksheets will not be brought into this new file. Any Power Pivot data models or Power Queries will be brought in. If the workbook contains legacy Power View sheets, they will be converted to native Power BI visuals. In addition, any legacy (non-Power Query) data connections used by the source file’s Power Pivot data model will be converted to Power Query and imported.

Legacy Power View sheet converted to Power BI visuals

A complete list of workbook content and what is/isn’t converted is below:

Excel content                  Import to Power BI Desktop
Data in sheets                 Not imported
Data model (Power Pivot)       Imported
Data connections               Converted to Power Query and imported
Power Queries                  Imported
Power View sheets              Converted to PBI visuals and imported
Pivot charts/tables            Not imported
Excel charts                   Not imported
Macros                         Not imported

Once imported, the new Power BI file (PBIX) lives on its own and contains no connection or any other type of relationship to the original source Excel file. If the source Excel file is changed, there is no way to update the PBIX file. Any imported data connections are between the PBIX file and the original data source. The new PBIX file can be published to the Power BI service like any other.

Connecting to Excel

Connecting to Excel as a data source is a very different thing from importing from it. In this scenario, the data in the worksheets, and only the data in the worksheets, is brought into the data model. This is very different behaviour from connecting Excel files to the Power BI service, where both the model and the worksheet data are brought in.

Using the Excel Connector

The easiest and most obvious way to connect to Excel worksheet data is by using the Excel connector. From the ribbon in Power BI Desktop, select Get Data. The Excel connector is right at the top of the list.

Connecting to Excel files on the file system

Selecting it allows you to select your source file, then the worksheets and tables within it, and then build out the data model.

This approach works well but carries with it an important limitation. The new queries are connected to the file using a local file system path. This means that to be refreshed, an On-Premises Data Gateway is required. In order to eliminate the gateway requirement, you can connect to the file in SharePoint using the SharePoint folder connector.

Using the SharePoint Folder Connector

The SharePoint folder connector connects to all the files stored in the libraries of a SharePoint site. It allows you to report on file metadata, but it also allows you to drill into file contents.

From Power BI Desktop, select Get Data, but instead of selecting Excel, search for SharePoint and select SharePoint folder.

Using the SharePoint folder connector

Once selected, enter the URL of the SharePoint site (NOT the URL of a library or folder) in the dialog box.

Next, you will be presented with a preview of all the files in your site. Unless you are only interested in file metadata, click on the Edit button to bring up the Power Query editor.

The initial view will contain all the files in the site, but we are interested in the content of just one of those files. Every file in this view will contain the hyperlinked value “Binary” in the Content column. Clicking that link for the file that you want to connect to will drill down into the contents.

Site contents using the SharePoint folder connector

From this level, you can build your Power Query, data model, and report as needed just as if you had used the Excel connector. The difference is that now when you publish your report to Power BI, it will know the file is stored in SharePoint and will connect directly to it. It will not require a gateway for refresh purposes. Once credentials are registered, the report will refresh itself directly from the workbook stored in SharePoint.
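As a sketch, the resulting query might look like the following (the site URL, file name, and sheet name are hypothetical):

let
    Source = SharePoint.Files("https://contoso.sharepoint.com/sites/demo", [ApiVersion = 15]),
    // Drill into the Binary content of the one file we care about
    Book = Table.SelectRows(Source, each [Name] = "Sales.xlsx"){0}[Content],
    Workbook = Excel.Workbook(Book),
    Sheet = Workbook{[Item = "Sheet1", Kind = "Sheet"]}[Data],
    Promoted = Table.PromoteHeaders(Sheet)
in
    Promoted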

XLS vs XLSX

A note of caution: the above SharePoint folder approach only works for XLSX files. Power BI Desktop and the Power BI service both support both Excel file formats (XLS and XLSX); refresh, however, does not. If the source file format is XLS and a refresh is attempted, you will receive the classic “microsoft.ace.oledb.12.0 provider” error in the Power BI service.

Excel files refresh error with XLS file types

The older Excel file format (XLS) requires an Access driver to refresh, and that driver is not a part of the Power BI service. The newer XLSX format does not require this driver. As a result, if the source file is XLS, refreshing it requires going through an On-Premises Data Gateway, and the gateway machine must also have the ACE components installed.

To recap, you can bring Excel assets into Power BI Desktop by using the import function, and you can load data from Excel files through Power Query. The two operations have very different results, and they can be combined if a source workbook contains both analyses and data.
