This article discusses some common issues you may encounter when using the On-premises Data Gateway.
Note: If you encounter an issue that is not listed below, you can ask for further assistance in the following locations.
- For Power BI, you can use the communities site or you can create a support ticket.
- For PowerApps, you can use the communities site or you can create a support ticket.
- For Microsoft Flow, you can use the communities site or you can create a support ticket.
- For Logic Apps, you can submit a support ticket through the Azure portal.
A lot of issues can surface when the gateway version is out of date. It is a good general practice to make sure you are on the latest version. If you haven't updated the gateway for a month or longer, consider installing the latest version of the gateway and seeing if you can reproduce the issue.
Here are a few common issues and resolutions that have helped a number of customers in environments that restrict internet access.
Your proxy may require authentication from a domain user account. By default, the gateway uses a service SID for the Windows service log-on user. Changing the log-on user to a domain user can help with this. For more information, see Changing the gateway service account to a domain user.
Some proxies restrict traffic to only ports 80 and 443. By default, communication to Azure Service Bus will occur on ports other than 443.
You can force the gateway to communicate with Azure Service Bus using HTTPS instead of direct TCP, although this will greatly reduce performance. To do so, modify the Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config file and set the ServiceBusSystemConnectivityModeString value to Https. This file is located, by default, at C:\Program Files\On-premises data gateway.
```
<setting name="ServiceBusSystemConnectivityModeString" serializeAs="String">
    <value>Https</value>
</setting>
```
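If you need to make this change on several machines, the edit can be scripted. The following is a minimal sketch, not an official tool: it assumes the default install path shown above and uses Python's standard XML library to rewrite the setting in place.

```python
# Sketch: set ServiceBusSystemConnectivityModeString to Https in the
# gateway core config. The path below is the default install location
# (an assumption); adjust it for your environment.
import xml.etree.ElementTree as ET

CONFIG_PATH = (r"C:\Program Files\On-premises data gateway"
               r"\Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config")

def set_connectivity_mode(path, mode="Https"):
    tree = ET.parse(path)
    # Find the <setting> whose name attribute matches, then its <value> child.
    for setting in tree.getroot().iter("setting"):
        if setting.get("name") == "ServiceBusSystemConnectivityModeString":
            setting.find("value").text = mode
            tree.write(path, encoding="utf-8", xml_declaration=True)
            return True
    return False  # setting not found; config left untouched
```

Stop the gateway Windows service before editing the file and start it again afterwards so the new value is picked up.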
You may receive this error if you are trying to install the gateway on a domain controller. Deploying on a domain controller is not supported. You will need to deploy the gateway on a machine that is not a domain controller.
The gateway runs as a Windows service, so you can start and stop it in multiple ways. For example, you can open a command prompt with elevated permissions on the machine where the gateway is running and then run either of these commands:
To stop the service, run this command:
```
net stop PBIEgwService
```
To start the service, run this command:
```
net start PBIEgwService
```
All of the details are available, but the call to the Power BI service returned an error. The error, along with an activity id, will be displayed. This could happen for different reasons. You can collect and review the logs, as mentioned below, to get more details.

This could also be due to proxy configuration issues. The user interface does not allow for proxy configuration. You can learn more about making proxy configuration changes in Configuring proxy settings for the Power BI Gateways.

Information was received from the Power BI service by the gateway. The information was passed on to the local Windows service, but it failed to return. Or, a symmetric key generation failed. The inner exception will be displayed under Show details. You can collect and review the logs, as mentioned below, to get more details.

At the end of configuration, the Power BI service is called again to validate the gateway. The Power BI service does not report the gateway as live. Restarting the Windows service may allow the communication to succeed. You can collect and review the logs, as mentioned below, to get more details.
You may receive a script error when signing into Power BI as part of the on-premises data gateway configuration. Installing the following security update should resolve the issue. This can be installed via Windows Update.
You may encounter an error similar to the following.
Failed to update gateway details. Please try again. Error updating gateway configuration.
This will include a stack trace, and that stack trace may include the following.
```
Microsoft.PowerBI.DataMovement.Pipeline.Diagnostics.CouldNotUpdateGatewayConfigurationException: Error updating gateway configuration. ----> System.ArgumentNullException: Value cannot be null.
Parameter name: serviceSection
```
If you are upgrading from an older gateway, the config file is preserved. A section may be missing, and when the gateway tries to read it, the null reference exception above is thrown.
To correct this, do the following.

1. Uninstall the gateway.
2. Delete the folder c:\Program Files\on-premises data gateway.
3. Reinstall the gateway.
4. Optionally, apply the recovery key to restore an existing gateway.
Within Show details, it should display the error message received from the data source. For SQL Server, you should see something like the following.
Login failed for user 'username'.
Verify that you have the correct username and password. Also verify that those credentials can successfully connect to the data source. Make sure the account being used matches the Authentication Method.
We were able to connect to the server, but not to the database supplied. Verify the name of the database, and that the user credential has the proper permission to access that database.
Within Show details, it should display the error message received from the data source. For SQL Server, you should see something like the following.
Cannot open database "AdventureWorks" requested by the login. The login failed. Login failed for user 'username'.
This error could occur for different reasons. Be sure to validate that you can connect to the data source from the machine hosting the gateway. This could be the result of the server not being accessible.
Within Show details, you will see an error code of DM_GWPipeline_UnknownError.
You can also look in the Event Logs > Applications and Services Logs > On-premises Data Gateway Service for more details.
We were unable to connect to the specified data source. Be sure to validate the information provided for that data source.
Within Show details, you will see an error code of DM_GWPipeline_Gateway_DataSourceAccessError.
If the underlying error message is similar to the following, this means that the account you are using for the data source is not a server admin for that Analysis Services instance. Learn more
The 'CONTOSO\account' value of the 'EffectiveUserName' XML for Analysis property is not valid.
If the underlying error message is similar to the following, it could mean that the service account for Analysis Services is missing the token-groups-global-and-universal (TGGAU) directory attribute.
The user name or password is incorrect.
Domains with Pre-Windows 2000 compatibility access will have the TGGAU attribute enabled. However, most newly created domains will not enable this attribute by default. You can read more about this here.
You can confirm this by doing the following.
Connect to the Analysis Services machine within SQL Server Management Studio. Within the Advanced connection properties, include EffectiveUserName for the user in question and see if this reproduces the error.
You can use the dsacls Active Directory tool to validate whether the attribute is listed. This tool is normally found on a domain controller. You will need to know the distinguished name for the account and pass it to the tool.
dsacls "CN=John Doe,CN=UserAccounts,DC=contoso,DC=com"
You want to see something similar to the following in the results.
Allow BUILTIN\Windows Authorization Access Group SPECIAL ACCESS for tokenGroupsGlobalAndUniversal READ PROPERTY
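If you are checking several accounts, scanning the captured dsacls output can be scripted. This is a minimal sketch (the helper name is illustrative; dsacls itself must still be run on a machine that has the tool, and its output passed in as text):

```python
# Sketch: decide whether captured dsacls output contains the TGGAU
# read grant, i.e. a SPECIAL ACCESS entry for the
# tokenGroupsGlobalAndUniversal attribute with READ PROPERTY.
def has_tggau_access(dsacls_output: str) -> bool:
    text = dsacls_output.upper()
    return ("TOKENGROUPSGLOBALANDUNIVERSAL" in text
            and "READ PROPERTY" in text)
```

For example, feed it the stdout of `dsacls "CN=John Doe,CN=UserAccounts,DC=contoso,DC=com"` captured to a file or a pipe.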
To correct this issue, you will need to enable TGGAU on the account used for the Analysis Services Windows service.
Another possibility for user name or password incorrect
This error could also be caused if the Analysis Services server is in a different domain than the users and there is not a two-way trust established.
You will need to work with your domain administrators to verify the trust relationship between domains.
Unable to see the data gateway data sources in the 'Get Data' experience for Analysis Services from the Power BI service
Make sure that your account is listed in the Users tab of the data source within the gateway configuration. If you don't have access to the gateway, check with the administrator of the gateway and ask them to verify. Only accounts in the Users list will see the data source listed in the Analysis Services list.
This will occur if you have a single row greater than 4 MB in size. You will need to determine what the row is from your data source and attempt to filter it out or reduce the size for that row.
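One way to hunt down the offending row is to estimate per-row sizes before the data reaches the gateway. The sketch below is a rough heuristic, not the gateway's actual size accounting: it sums the UTF-8 length of each field rendered as text.

```python
# Sketch: flag rows whose rough serialized size exceeds the 4 MB limit.
# The size estimate is approximate (UTF-8 length of each field as text);
# the service's real accounting may differ, so treat this as a heuristic.
MAX_ROW_BYTES = 4 * 1024 * 1024  # 4 MB

def oversized_rows(rows):
    """Yield (index, size_in_bytes) for rows over the limit.

    rows: iterable of dicts mapping column name -> value.
    """
    for i, row in enumerate(rows):
        size = sum(len(str(v).encode("utf-8")) for v in row.values())
        if size > MAX_ROW_BYTES:
            yield i, size
```

Run it over an export of the suspect table to find candidate rows to filter out or trim.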
This can occur when the certificate CN is for the server's fully qualified domain name (FQDN), but you supplied only the NetBIOS name for the server, causing a mismatch for the certificate. To resolve this issue, change the server name within the gateway data source, and within the PBIX file, to use the FQDN of the server.
This could be because of a few different scenarios.
The server and database name don't match between what was entered in Power BI Desktop and the data source configured for the gateway. These need to be the same values. They are not case sensitive.
Your account is not listed in the Users tab of the data source within the gateway configuration. You will need to ask the administrator of the gateway to add you to that list.
Your Power BI Desktop file has multiple data sources within it and not all of those data sources are configured with the gateway. You will need to have each data source defined with the gateway for the gateway to show up within Scheduled Refresh.
If one of your data sources requires OAuth authentication, you will not be able to configure it with the On-Premises Data Gateway. OAuth authentication is not currently supported with the On-Premises Data Gateway. You will need to remove the data source that requires OAuth authentication from Power BI Desktop in order to configure scheduled refresh.
The exact limitation is 10 GB of uncompressed data per table. If you are hitting this issue, there are good options to optimize and avoid the issue. In particular, reducing the use of highly repetitive, long string values and instead using a normalized key or removing the column (if not in use) will help.
This is usually caused by one of the following.
The data source information does not match what is in the underlying dataset. The server and database name need to match between the data source defined for the on-premises data gateway and what you supply within Power BI Desktop. If you use an IP Address in Power BI Desktop, the data source, for the on-premises data gateway, needs to use an IP Address as well.
There is not a data source available on any gateway within your organization. You can configure the data source on a new, or existing, on-premises data gateway.
If this report is making use of a live Analysis Services connection, you could be encountering an issue with a value being passed to EffectiveUserName that is either not valid, or doesn't have permissions on the Analysis Services machine. Typically, an authentication issue is due to the fact that the value being passed for EffectiveUserName doesn't match a local user principal name (UPN).
To confirm this, you can do the following.
1. Find the effective username within the gateway logs.
2. Once you have the value being passed, validate that it is correct. If it is your user, you can run whoami /upn from a command prompt to see what the UPN should be. The UPN looks like an email address.
Optionally, you can see what Power BI gets from Azure Active Directory.
1. Browse to https://graphexplorer.cloudapp.net.
2. Select Sign in in the upper right.
3. Run the following query. You will see a rather large JSON response.
4. Look for userPrincipalName.
If your Azure Active Directory UPN doesn't match your local Active Directory UPN, you can use the Map user names feature to replace it with a valid value. Or you can work with either your tenant admin, or local Active Directory admin, to get your UPN changed.
For information on providing proxy information for your gateway, see Configuring proxy settings for the Power BI Gateways.
You can test to see if your firewall, or proxy, may be blocking connections by running Test-NetConnection from a PowerShell prompt. This tests connectivity to Azure Service Bus. It only tests network connectivity and doesn't have anything to do with the cloud server service or the gateway; it helps to determine whether your machine can actually get out to the internet.
Test-NetConnection -ComputerName watchdog.servicebus.windows.net -Port 9350
Test-NetConnection is only available on Windows Server 2012 R2 and later. It is also available on Windows 8.1 and later. On earlier OS versions, you can use Telnet to test port connectivity.
The results should look similar to the following. The difference will be with TcpTestSucceeded. If TcpTestSucceeded is not true, then you may be blocked by a firewall.
```
ComputerName           : watchdog.servicebus.windows.net
RemoteAddress          : 184.108.40.206
RemotePort             : 5672
InterfaceAlias         : vEthernet (Broadcom NetXtreme Gigabit Ethernet - Virtual Switch)
SourceAddress          : 10.120.60.105
PingSucceeded          : False
PingReplyDetails (RTT) : 0 ms
TcpTestSucceeded       : True
```
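On machines without Test-NetConnection (and as a cross-platform alternative to Telnet), the same TCP check can be sketched in a few lines of Python; the host and port mirror the example above.

```python
# Sketch: test whether a TCP connection to a host/port can be opened.
# This mirrors what TcpTestSucceeded reports; it says nothing about the
# gateway or the cloud service, only raw network reachability.
import socket

def tcp_test(host: str, port: int, timeout: float = 5.0) -> bool:
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example (requires outbound access to Azure Service Bus):
# tcp_test("watchdog.servicebus.windows.net", 9350)
```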
If you want to be exhaustive, substitute the ComputerName and Port values with those listed for ports
The firewall may also be blocking the connections that the Azure Service Bus makes to the Azure data centers. If that is the case, you will want to whitelist (unblock) the IP addresses for your region for those data centers. You can get a list of Azure IP addresses here.
You can find the data center region you are in by doing the following:
1. Select the ? in the upper right of the Power BI service.
2. Select About Power BI.
3. Your data region is listed in Your data is stored in.
If you are still not getting anywhere, you could try getting a network trace using a tool like Fiddler or netsh, although these are advanced collection methods and you may need assistance in analyzing the collected data. You can contact support for assistance.
There are a number of performance counters that can be used to gauge the activities of the gateway. These can be helpful for understanding whether there is a large load of activity and whether a new gateway may be needed. These counters will not reflect how long something takes. These counters can be accessed through the Windows Performance Monitor tool.
There are general groupings of these counters.
|Counter group|Description|
|---|---|
|ADO.NET|This is used for any DirectQuery connection.|
|ADOMD|This is used for Analysis Services 2014 and earlier.|
|OLEDB|This is used by certain data sources. This includes SAP HANA and Analysis Services 2016 or later.|
|Mashup|This includes any imported data source. If you are scheduling refresh or doing an on-demand refresh, it will go through the mashup engine.|
Here is a listing of the available performance counters.
|Counter|Description|
|---|---|
|# of ADO.NET open connection executed / sec|Number of ADO.NET open connection actions executed per second (succeeded or failed).|
|# of ADO.NET open connection failed / sec|Number of ADO.NET open connection actions failed per second.|
|# of ADO.NET queries executed / sec|Number of ADO.NET queries executed per second (succeeded or failed).|
|# of ADO.NET queries failed / sec|Number of ADO.NET failed queries executed per second.|
|# of ADOMD open connection executed / sec|Number of ADOMD open connection actions executed per second (succeeded or failed).|
|# of ADOMD open connection failed / sec|Number of ADOMD open connection actions failed per second.|
|# of ADOMD queries executed / sec|Number of ADOMD queries executed per second (succeeded or failed).|
|# of ADOMD queries failed / sec|Number of ADOMD failed queries executed per second.|
|# of all open connection executed / sec|Number of open connection actions executed per second (succeeded or failed).|
|# of all open connection failed / sec|Number of failed open connection actions executed per second.|
|# of all queries executed / sec|Number of queries executed per second (succeeded or failed).|
|# of items in the ADO.NET connection pool|Number of items in the ADO.NET connection pool.|
|# of items in the OLEDB connection pool|Number of items in the OLEDB connection pool.|
|# of items in the Service Bus pool|Number of items in the Service Bus pool.|
|# of Mashup open connection executed / sec|Number of Mashup open connection actions executed per second (succeeded or failed).|
|# of Mashup open connection failed / sec|Number of Mashup open connection actions failed per second.|
|# of Mashup queries executed / sec|Number of Mashup queries executed per second (succeeded or failed).|
|# of Mashup queries failed / sec|Number of Mashup failed queries executed per second.|
|# of multiple result set OLEDB queries failed / sec|Number of multiple resultset OLEDB failed queries executed per second.|
|# of OLEDB multiple resultset queries executed / sec|Number of OLEDB multiple resultset queries executed per second (succeeded or failed).|
|# of OLEDB open connection executed / sec|Number of OLEDB open connection actions executed per second (succeeded or failed).|
|# of OLEDB open connection failed / sec|Number of OLEDB open connection actions failed per second.|
|# of OLEDB queries executed / sec|Number of OLEDB queries executed per second (succeeded or failed).|
|# of OLEDB queries failed / sec|Number of OLEDB failed queries executed per second.|
|# of OLEDB single resultset queries executed / sec|Number of OLEDB single resultset queries executed per second (succeeded or failed).|
|# of queries failed / sec|Number of failed queries executed per second.|
|# of single result set OLEDB queries failed / sec|Number of single resultset OLEDB failed queries executed per second.|
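If you want to capture these counters outside Performance Monitor, the Windows typeperf tool can log them to CSV over time. A small parser can then summarize the capture; the sketch below assumes typeperf's usual CSV shape (a header row of quoted counter paths, then one row per sample) and reports the peak value per counter.

```python
# Sketch: parse typeperf-style CSV output and report the peak value of
# each counter. typeperf emits a header row of quoted counter paths,
# then one row per sample: "timestamp","value1","value2",...
import csv
import io

def peak_counters(csv_text: str) -> dict:
    reader = csv.reader(io.StringIO(csv_text))
    header = next(reader)            # first column is the timestamp
    peaks = {name: 0.0 for name in header[1:]}
    for row in reader:
        for name, value in zip(header[1:], row[1:]):
            try:
                peaks[name] = max(peaks[name], float(value))
            except ValueError:
                pass                 # typeperf writes blanks for missed samples
    return peaks
```

A sustained high peak on the "executed / sec" counters with a matching rise in the "failed / sec" counters is a hint that the gateway is overloaded or the data source is struggling.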
You may find that response through the gateway is slow. This could be for DirectQuery queries or when refreshing your imported dataset. You can enable additional logging to output queries and their timings to help understand what is performing slowly. When you find a long-running query, it may require additional modification on your data source to tune query performance, for example, adjusting indexes for a SQL Server query.
You will need to modify two configuration files to determine the duration of a query.
Within the Microsoft.PowerBI.DataMovement.Pipeline.GatewayCore.dll.config file, change the EmitQueryTraces value to True. This file is located, by default, at C:\Program Files\On-premises data gateway. Enabling EmitQueryTraces will begin to log queries that are sent from the gateway to a data source.
Enabling EmitQueryTraces could increase the log size significantly depending on gateway usage. Once you are done reviewing the logs, you will want to set EmitQueryTraces to False. It is not recommended to leave this setting enabled long term.
```
<setting name="EmitQueryTraces" serializeAs="String">
    <value>True</value>
</setting>
```
Example query entry
```
DM.EnterpriseGateway Information: 0 : 2016-09-15T16:09:27.2664967Z DM.EnterpriseGateway 4af2c279-1f91-4c33-ae5e-b3c863946c41 d1c77e9e-3858-4b21-3e62-1b6eaf28b176 MGEQ c32f15e3-699c-4360-9e61-2cc03e8c8f4c FF59BC20 [DM.GatewayCore] Executing query (timeout=224) "<pi> SELECT TOP (1000001) [t0].[ProductCategoryName],[t0].[FiscalYear],SUM([t0].[Amount]) AS [a0] FROM ( (select [$Table].[ProductCategoryName] as [ProductCategoryName], [$Table].[ProductSubcategory] as [ProductSubcategory], [$Table].[Product] as [Product], [$Table].[CustomerKey] as [CustomerKey], [$Table].[Region] as [Region], [$Table].[Age] as [Age], [$Table].[IncomeGroup] as [IncomeGroup], [$Table].[CalendarYear] as [CalendarYear], [$Table].[FiscalYear] as [FiscalYear], [$Table].[Month] as [Month], [$Table].[OrderNumber] as [OrderNumber], [$Table].[LineNumber] as [LineNumber], [$Table].[Quantity] as [Quantity], [$Table].[Amount] as [Amount] from [dbo].[V_CustomerOrders] as [$Table]) ) AS [t0] GROUP BY [t0].[ProductCategoryName],[t0].[FiscalYear] </pi>"
```
Within the Microsoft.PowerBI.DataMovement.Pipeline.Diagnostics.dll.config file, change the TracingVerbosity value to 5. This file is located, by default, at C:\Program Files\On-premises data gateway. Changing this setting will begin to log verbose entries to the gateway log, including entries that show duration.

Setting TracingVerbosity to 5 could increase the log size significantly depending on gateway usage. Once you are done reviewing the logs, set TracingVerbosity back to 4. It is not recommended to leave this setting enabled long term.
```
<setting name="TracingVerbosity" serializeAs="String">
    <value>5</value>
</setting>
```
|Activity type|Description|
|---|---|
|MGEQ|Queries executed over ADO.NET. This includes DirectQuery data sources.|
|MGEO|Queries executed over OLEDB. This includes SAP HANA and Analysis Services 2016.|
|MGEM|Queries executed from the Mashup engine. This is used with imported datasets that use scheduled refresh or refresh on-demand.|
To determine the time it took to query the data source, you can do the following.

1. Open the gateway log.
2. Search for an Activity Type, such as MGEQ, to find the query.
3. Make note of the second GUID, as this is the request id.
4. Continue to search for MGEQ until you find the FireActivityCompletedSuccessfullyEvent entry with the duration. Verify the entry has the same request id. Duration is in milliseconds.
```
DM.EnterpriseGateway Verbose: 0 : 2016-09-26T23:08:56.7940067Z DM.EnterpriseGateway baf40f21-2eb4-4af1-9c59-0950ef11ec4a 5f99f566-106d-c8ac-c864-c0808c41a606 MGEQ 21f96cc4-7496-bfdd-748c-b4915cb4b70c B8DFCF12 [DM.Pipeline.Common.TracingTelemetryService] Event: FireActivityCompletedSuccessfullyEvent (duration=5004)
```
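Hunting for matching request ids by hand gets tedious in a large log, so the pairing described above can be sketched as a small parser. The field positions are taken from the sample entries in this article and may shift between gateway versions, so treat this as a starting point rather than a supported log format.

```python
# Sketch: pull (request id, duration) pairs out of a gateway log for a
# given activity type. The line layout (GUID, GUID, activity type, GUID,
# then the completion event) follows the samples in this article.
import re

GUID = r"[0-9a-fA-F]{8}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{4}-[0-9a-fA-F]{12}"

def completed_durations(log_text: str, activity="MGEQ"):
    """Yield (request_id, duration_ms) for completed activity entries."""
    pattern = re.compile(
        rf"({GUID})\s+({GUID})\s+{activity}\s+({GUID}).*?"
        r"FireActivityCompletedSuccessfullyEvent\s+\(duration=(\d+)\)")
    for line in log_text.splitlines():
        m = pattern.search(line)
        if m:
            # The second GUID on the line is the request id.
            yield m.group(2), int(m.group(4))
```

Running it over the sample entry above yields the request id 5f99f566-106d-c8ac-c864-c0808c41a606 with a duration of 5004 ms.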
There are several logs you can collect for the gateway. Always start with the logs!
On-premises data gateway configurator logs
%localappdata%\Microsoft\on-premises data gateway\GatewayConfigurator*.log
On-premises data gateway service logs
C:\Users\PBIEgwService\AppData\Local\Microsoft\on-premises data gateway\Gateway*.log
The On-premises data gateway service event logs are present under Application and Services Logs.
Fiddler is a free tool from Telerik that monitors HTTP traffic. You can see the back-and-forth communication with the Power BI service from the client machine. This may show errors and other related information.
When using the gateway for scheduled refresh, Refresh History can help you see what errors have occurred, as well as provide useful data if you should need to create a support request. You can view both scheduled and on-demand refreshes. Here is how to get to Refresh History.
1. In the Power BI navigation pane, in Datasets, select a dataset > Open Menu > Schedule Refresh.
2. In Settings for... > Schedule Refresh, select Refresh History.
For additional information about troubleshooting refresh scenarios, take a look at the Troubleshooting Refresh Scenarios article.
Configuring proxy settings for the Power BI Gateways
On-premises Data Gateway
On-premises Data Gateway - in-depth
Manage your data source - Analysis Services
Manage your data source - SAP HANA
Manage your data source - SQL Server
Manage your data source - Import/Scheduled refresh
More questions? Try the Power BI Community