
Imagine you're the lead data architect at a SaaS company that provides financial analytics to mid-market retailers. Your product managers are demanding that Power BI dashboards showing inventory turnover, profit margins, and seasonal trends be seamlessly integrated into your existing React-based customer portal. They don't want users jumping between systems, and they certainly don't want to pay for hundreds of Power BI Pro licenses. Your users should see personalized dashboards that respect row-level security, all while maintaining your application's look and feel.
This is precisely the scenario Power BI Embedded was designed to solve. Unlike Power BI Premium or Pro, which are designed for organizational business intelligence, Power BI Embedded lets you programmatically embed reports, dashboards, and visuals directly into your custom applications. You control authentication, you manage capacity scaling, and your users never know they're looking at Power BI—they just see your application with incredibly rich analytics.
By the end of this lesson, you'll have the expertise to architect and implement a complete Power BI Embedded solution that scales to thousands of users while maintaining security boundaries and optimal performance.
What you'll learn:
This lesson assumes you have:
Power BI Embedded operates on a fundamentally different model than traditional Power BI. Instead of users having individual licenses, you purchase dedicated capacity in Azure that serves embedded content to your application users. This capacity-based model makes economic sense when you have many users who need analytics but don't require the full Power BI service.
The architecture involves several key components working together. Your custom application authenticates with Azure Active Directory using a service principal (app registration) that has appropriate permissions to your Power BI workspace. This service principal generates embed tokens that allow your application to display specific reports to specific users. The Power BI service validates these tokens and serves the embedded content through the JavaScript SDK.
The capacity you purchase in Azure (measured in "A" SKUs like A1, A2, etc.) determines how much concurrent load your embedded reports can handle. Unlike Power BI Premium's "P" SKUs, "A" SKUs are designed specifically for embedding scenarios and can be paused when not in use, making them cost-effective for applications with variable usage patterns.
Security boundaries are maintained through a combination of app-only authentication, embed tokens, and row-level security (RLS) rules defined in your datasets. When a user requests a report in your application, your backend generates an embed token that specifies exactly which report they can see and which data they're allowed to access.
Let's start by provisioning the Azure infrastructure you'll need. Power BI Embedded requires an Azure subscription and a properly configured Power BI workspace associated with your embedded capacity.
First, create a Power BI Embedded capacity in Azure. Navigate to the Azure portal and create a new resource of type "Power BI Embedded." Choose your resource group and region carefully—the region affects both latency and compliance requirements for your users.
For capacity sizing, start conservatively. An A1 capacity (roughly $750/month—check current Azure pricing for your region) provides 1GB of memory and one virtual core, which can typically handle 50-100 concurrent report views depending on report complexity. A2 doubles this capacity, and so on. You can always scale up, and more importantly, you can pause capacity during non-business hours to save costs.
# Create Resource Group
az group create --name "powerbi-embedded-rg" --location "East US"
# Create Power BI Embedded Capacity
# (requires the Power BI CLI extension: az extension add --name powerbidedicated)
az powerbi embedded-capacity create \
--resource-group "powerbi-embedded-rg" \
--name "retailanalytics-pbie" \
--location "East US" \
--sku-name "A1" \
--administration-members "admin@yourcompany.com"
Next, configure your Power BI workspace. In the Power BI service, create a new workspace specifically for your embedded content. This workspace should be separate from any workspaces used for internal business intelligence to maintain clear boundaries and easier management.
Assign your newly created capacity to this workspace. In the workspace settings, you'll see an option to assign it to Premium capacity—select your embedded capacity here. This assignment is crucial because only content in capacity-assigned workspaces can be embedded.
Now create an Azure AD app registration that will serve as your service principal. This app needs specific permissions to interact with Power BI on behalf of your application:
# Create Azure AD App Registration
# (recent Azure CLI versions replaced --available-to-other-tenants and
#  --reply-urls with --sign-in-audience and --web-redirect-uris)
az ad app create \
--display-name "RetailAnalytics-PowerBI-Embed" \
--sign-in-audience "AzureADMyOrg" \
--web-home-page-url "https://yourapp.com" \
--web-redirect-uris "https://yourapp.com/auth/callback"
In the Azure portal, navigate to your app registration and configure API permissions. Add the following Power BI Service permissions:
- Dataset.ReadWrite.All (Application permission)
- Report.ReadWrite.All (Application permission)
- Workspace.ReadWrite.All (Application permission)

Generate a client secret for this app registration and store it securely—you'll need it for authentication. Also note the Application (client) ID and Directory (tenant) ID, as these are required for the authentication flow.
The final step is adding your service principal to your Power BI workspace as an Admin. In the workspace settings, add the service principal using its Application ID. This grants your application the rights to generate embed tokens for content in this workspace.
Authentication in Power BI Embedded follows the "app owns data" pattern, where your application authenticates with Azure AD on behalf of all users. This is different from user-based authentication where individual users sign in with their credentials.
Your backend service needs to obtain access tokens from Azure AD using the client credentials flow. Here's a robust implementation in C# that handles token caching and renewal:
public class PowerBITokenService
{
private readonly string _clientId;
private readonly string _clientSecret;
private readonly string _tenantId;
private readonly IMemoryCache _cache;
public PowerBITokenService(IConfiguration config, IMemoryCache cache)
{
_clientId = config["PowerBI:ClientId"];
_clientSecret = config["PowerBI:ClientSecret"];
_tenantId = config["PowerBI:TenantId"];
_cache = cache;
}
public async Task<string> GetAccessTokenAsync()
{
const string cacheKey = "powerbi_access_token";
if (_cache.TryGetValue(cacheKey, out string cachedToken))
{
return cachedToken;
}
var app = ConfidentialClientApplicationBuilder
.Create(_clientId)
.WithClientSecret(_clientSecret)
.WithAuthority($"https://login.microsoftonline.com/{_tenantId}")
.Build();
var scopes = new[] { "https://analysis.windows.net/powerbi/api/.default" };
var result = await app.AcquireTokenForClient(scopes).ExecuteAsync();
// Cache token for 50 minutes (tokens expire after 60 minutes)
_cache.Set(cacheKey, result.AccessToken, TimeSpan.FromMinutes(50));
return result.AccessToken;
}
}
For Node.js applications, you can achieve the same result using the Microsoft Authentication Library:
const { ConfidentialClientApplication } = require('@azure/msal-node');
class PowerBITokenService {
constructor(clientId, clientSecret, tenantId) {
this.clientApp = new ConfidentialClientApplication({
auth: {
clientId: clientId,
clientSecret: clientSecret,
authority: `https://login.microsoftonline.com/${tenantId}`
}
});
this.tokenCache = new Map();
}
async getAccessToken() {
const cacheKey = 'powerbi_access_token';
const cached = this.tokenCache.get(cacheKey);
if (cached && cached.expiresAt > Date.now()) {
return cached.token;
}
const clientCredentialRequest = {
scopes: ['https://analysis.windows.net/powerbi/api/.default']
};
const response = await this.clientApp.acquireTokenByClientCredential(clientCredentialRequest);
// Cache token
this.tokenCache.set(cacheKey, {
token: response.accessToken,
expiresAt: Date.now() + (50 * 60 * 1000) // 50 minutes
});
return response.accessToken;
}
}
The key consideration here is token management. Access tokens expire after one hour, so implement proper caching and renewal logic. In production systems, consider using distributed caching (Redis) instead of in-memory caching if you have multiple application instances.
Also implement proper error handling for authentication failures. Network issues, service principal permission problems, or capacity being paused can all cause authentication to fail. Your application should gracefully handle these scenarios and provide meaningful error messages to users.
Power BI's REST APIs provide programmatic access to all embedding operations. Understanding these APIs deeply is crucial for building robust embedded solutions, as the JavaScript SDK ultimately makes these API calls on your behalf.
The most important endpoints for embedding scenarios are:
- /groups/{groupId}/reports - List and manage reports
- /groups/{groupId}/datasets - Manage datasets and refresh operations
- /groups/{groupId}/reports/{reportId}/GenerateToken - Create embed tokens

Let's build a comprehensive API service that handles the most common embedding operations:
public class PowerBIApiService
{
private readonly HttpClient _httpClient;
private readonly PowerBITokenService _tokenService;
private const string BaseUrl = "https://api.powerbi.com/v1.0/myorg";
public PowerBIApiService(HttpClient httpClient, PowerBITokenService tokenService)
{
_httpClient = httpClient;
_tokenService = tokenService;
}
public async Task<IEnumerable<Report>> GetReportsAsync(Guid workspaceId)
{
var token = await _tokenService.GetAccessTokenAsync();
_httpClient.DefaultRequestHeaders.Authorization =
new AuthenticationHeaderValue("Bearer", token);
var response = await _httpClient.GetAsync($"{BaseUrl}/groups/{workspaceId}/reports");
response.EnsureSuccessStatusCode();
var content = await response.Content.ReadAsStringAsync();
var reportsResponse = JsonSerializer.Deserialize<ReportsResponse>(content);
return reportsResponse.Reports;
}
public async Task<EmbedToken> GenerateEmbedTokenAsync(
Guid workspaceId,
Guid reportId,
string username,
IEnumerable<string> roles = null,
IEnumerable<EffectiveIdentity> identities = null)
{
var token = await _tokenService.GetAccessTokenAsync();
_httpClient.DefaultRequestHeaders.Authorization =
new AuthenticationHeaderValue("Bearer", token);
var tokenRequest = new GenerateTokenRequest
{
AccessLevel = "View",
AllowSaveAs = false,
Identities = identities?.ToList() ?? new List<EffectiveIdentity>()
};
// Add effective identity for RLS if roles are specified
if (roles?.Any() == true)
{
tokenRequest.Identities.Add(new EffectiveIdentity
{
Username = username,
Roles = roles.ToList(),
Datasets = new List<string> { await GetDatasetIdForReport(workspaceId, reportId) }
});
}
var requestJson = JsonSerializer.Serialize(tokenRequest);
var requestContent = new StringContent(requestJson, Encoding.UTF8, "application/json");
var response = await _httpClient.PostAsync(
$"{BaseUrl}/groups/{workspaceId}/reports/{reportId}/GenerateToken",
requestContent);
response.EnsureSuccessStatusCode();
var responseContent = await response.Content.ReadAsStringAsync();
return JsonSerializer.Deserialize<EmbedToken>(responseContent);
}
private async Task<string> GetDatasetIdForReport(Guid workspaceId, Guid reportId)
{
var token = await _tokenService.GetAccessTokenAsync();
_httpClient.DefaultRequestHeaders.Authorization =
new AuthenticationHeaderValue("Bearer", token);
var response = await _httpClient.GetAsync(
$"{BaseUrl}/groups/{workspaceId}/reports/{reportId}");
response.EnsureSuccessStatusCode();
var content = await response.Content.ReadAsStringAsync();
var report = JsonSerializer.Deserialize<Report>(content);
return report.DatasetId;
}
}
The GenerateToken endpoint is particularly important because it controls what users can see and do with your embedded reports. In the request above, the access level, the AllowSaveAs flag, and the effective identities collection together determine the token's scope.
For multi-tenant scenarios, you'll often need to generate tokens with specific effective identities that map to your application's user context:
public async Task<EmbedConfig> GetEmbedConfigAsync(
string userId,
Guid reportId,
Dictionary<string, object> userContext)
{
var workspaceId = await GetWorkspaceForUser(userId);
var report = await GetReportAsync(workspaceId, reportId);
// Build effective identity based on user context
var effectiveIdentity = new EffectiveIdentity
{
Username = userId,
Roles = await GetUserRoles(userId),
Datasets = new List<string> { report.DatasetId }
};
// Add custom data for RLS
if (userContext.ContainsKey("TenantId"))
{
effectiveIdentity.CustomData = userContext["TenantId"].ToString();
}
var embedToken = await GenerateEmbedTokenAsync(
workspaceId,
reportId,
userId,
effectiveIdentity.Roles,
new[] { effectiveIdentity });
return new EmbedConfig
{
Type = "report",
Id = reportId.ToString(),
EmbedUrl = report.EmbedUrl,
AccessToken = embedToken.Token,
TokenId = embedToken.TokenId,
Expiration = embedToken.Expiration
};
}
Implement proper retry logic for API calls, as Power BI APIs can occasionally return transient errors. Use exponential backoff with jitter to avoid thundering herd problems:
public async Task<T> ExecuteWithRetry<T>(Func<Task<T>> operation, int maxRetries = 3)
{
    var delay = TimeSpan.FromSeconds(1);
    for (int i = 0; i < maxRetries - 1; i++)
    {
        try
        {
            return await operation();
        }
        catch (HttpRequestException ex) when (IsTransientError(ex))
        {
            await Task.Delay(delay);
            // Exponential backoff with up to one second of random jitter
            delay = TimeSpan.FromMilliseconds(delay.TotalMilliseconds * 2 + Random.Shared.Next(1000));
        }
    }
    return await operation(); // Final attempt; any exception propagates
}

private bool IsTransientError(HttpRequestException ex)
{
    // HttpRequestException.StatusCode is available from .NET 5 onward;
    // checking it is far more reliable than parsing the exception message
    return ex.StatusCode is HttpStatusCode.TooManyRequests // 429
        or HttpStatusCode.BadGateway                       // 502
        or HttpStatusCode.ServiceUnavailable               // 503
        or HttpStatusCode.GatewayTimeout;                  // 504
}
Embed tokens are the security cornerstone of Power BI Embedded. These tokens control not just which reports users can access, but also what data they see within those reports through row-level security (RLS) and effective identity mapping.
Understanding token lifetime management is crucial for production deployments. Embed tokens expire, and your application must handle token refresh gracefully. An embed token's lifetime is bounded by the Azure AD access token used to generate it (roughly one hour), and the GenerateToken API's lifetimeInMinutes parameter can only shorten it further—so plan for automatic refresh rather than long-lived tokens:
class EmbedTokenManager {
constructor(apiClient) {
this.apiClient = apiClient;
this.tokenRefreshBuffer = 5 * 60 * 1000; // Refresh 5 minutes before expiry
this.activeTokens = new Map();
}
async getValidToken(reportId, userId, userContext) {
// Use '|' as the cache-key delimiter: report IDs are GUIDs and contain '-',
// so splitting on '-' later would corrupt the key
const cacheKey = `${reportId}|${userId}`;
const cached = this.activeTokens.get(cacheKey);
if (cached && this.isTokenValid(cached)) {
return cached;
}
const newToken = await this.apiClient.generateEmbedToken(
reportId,
userId,
userContext
);
// Schedule refresh before expiration
this.scheduleTokenRefresh(cacheKey, newToken);
this.activeTokens.set(cacheKey, newToken);
return newToken;
}
isTokenValid(tokenConfig) {
const now = Date.now();
const expiry = new Date(tokenConfig.expiration).getTime();
return (expiry - now) > this.tokenRefreshBuffer;
}
scheduleTokenRefresh(cacheKey, tokenConfig) {
const now = Date.now();
const expiry = new Date(tokenConfig.expiration).getTime();
const refreshTime = expiry - now - this.tokenRefreshBuffer;
if (refreshTime > 0) {
setTimeout(() => {
this.refreshToken(cacheKey);
}, refreshTime);
}
}
async refreshToken(cacheKey) {
const [reportId, userId] = cacheKey.split('|');
// getUserContext is assumed to load the user's RLS context from your app
const userContext = await this.getUserContext(userId);
try {
const newToken = await this.apiClient.generateEmbedToken(
reportId,
userId,
userContext
);
this.activeTokens.set(cacheKey, newToken);
// Update any active embed instances
this.notifyEmbedInstances(cacheKey, newToken);
} catch (error) {
console.error(`Failed to refresh token for ${cacheKey}:`, error);
}
}
}
For row-level security implementation, the effective identity mechanism allows you to pass user context that your dataset's RLS rules can use to filter data. This is where the real power of Power BI Embedded shines for multi-tenant applications.
Consider a retail analytics scenario where different franchisees should only see their store's data. Your dataset would include RLS rules like:
[StoreId] = USERPRINCIPALNAME()
But since you're using app-only authentication, USERPRINCIPALNAME() would return your service principal's name, not the actual user. This is where effective identity comes in:
public class MultiTenantEmbedService
{
public async Task<EmbedConfig> GenerateEmbedConfigAsync(
string userId,
Guid reportId,
string storeId,
List<string> allowedStores = null)
{
var effectiveIdentity = new EffectiveIdentity
{
Username = userId, // This becomes USERPRINCIPALNAME() in DAX
CustomData = storeId, // This becomes CUSTOMDATA() in DAX
Datasets = new List<string> { await GetDatasetId(reportId) }
};
// For users with access to multiple stores, pass the list
if (allowedStores?.Any() == true)
{
effectiveIdentity.CustomData = string.Join(",", allowedStores);
}
var embedToken = await _apiService.GenerateEmbedTokenAsync(
_workspaceId,
reportId,
userId,
identities: new[] { effectiveIdentity }
);
return new EmbedConfig
{
Type = "report",
Id = reportId.ToString(),
EmbedUrl = await GetReportEmbedUrl(reportId),
AccessToken = embedToken.Token,
Settings = new EmbedSettings
{
FilterPaneEnabled = false,
NavContentPaneEnabled = false
}
};
}
}
Your dataset's RLS rules would then use:
// For single store access
[StoreId] = USERPRINCIPALNAME()
// For multi-store access, encode the store list as a pipe-delimited path
// and test membership with PATHCONTAINS
PATHCONTAINS(SUBSTITUTE(CUSTOMDATA(), ",", "|"), [StoreId])
Security considerations extend beyond just data filtering. Implement proper HTTPS everywhere, validate all user inputs, and never expose embed tokens in client-side code where they could be extracted. Always generate tokens on your backend and pass them securely to your frontend:
// BAD: Token generation in frontend
const embedToken = await powerbi.generateToken(reportId);
// GOOD: Token from secure backend endpoint
const embedConfig = await fetch('/api/embed/report/' + reportId, {
method: 'POST',
headers: {
'Authorization': `Bearer ${userToken}`,
'Content-Type': 'application/json'
},
body: JSON.stringify({ userContext: currentUserContext })
}).then(r => r.json());
For highly sensitive applications, consider implementing additional security layers like IP allowlisting, request signing, or short-lived tokens with aggressive refresh policies.
The Power BI JavaScript SDK provides the client-side embedding functionality, but using it effectively requires understanding its event model, configuration options, and integration patterns with modern frontend frameworks.
First, install and initialize the SDK properly:
npm install powerbi-client
For React applications, create a reusable PowerBI component that handles the embedding lifecycle:
import React, { useEffect, useRef, useState, useCallback } from 'react';
import { models, service as pbiService, factories } from 'powerbi-client';

// powerbi-client exports the Service class, not a ready-made instance;
// create one here so the component can call service.embed()/service.reset()
const service = new pbiService.Service(
factories.hpmFactory,
factories.wpmpFactory,
factories.routerFactory
);
const PowerBIReport = ({
embedConfig,
onLoaded,
onError,
onDataSelected,
style = { height: '600px', width: '100%' }
}) => {
const embedContainer = useRef(null);
const [report, setReport] = useState(null);
const [isLoading, setIsLoading] = useState(true);
const embedReport = useCallback(async () => {
if (!embedConfig || !embedContainer.current) return;
try {
setIsLoading(true);
const config = {
type: 'report',
id: embedConfig.id,
embedUrl: embedConfig.embedUrl,
accessToken: embedConfig.accessToken,
tokenType: models.TokenType.Embed,
settings: {
panes: {
filters: {
expanded: false,
visible: false
},
pageNavigation: {
visible: true
}
},
background: models.BackgroundType.Transparent,
layoutType: models.LayoutType.Custom,
customLayout: {
displayOption: models.DisplayOption.FitToPage
},
bars: {
statusBar: {
visible: false
}
},
...embedConfig.settings
}
};
// Embed the report
const embeddedReport = service.embed(embedContainer.current, config);
// Set up event handlers
embeddedReport.off('loaded');
embeddedReport.on('loaded', () => {
setIsLoading(false);
onLoaded && onLoaded(embeddedReport);
});
embeddedReport.off('error');
embeddedReport.on('error', (event) => {
const errorDetail = event.detail;
console.error('Power BI embed error:', errorDetail);
setIsLoading(false);
onError && onError(errorDetail);
});
embeddedReport.off('dataSelected');
embeddedReport.on('dataSelected', (event) => {
const dataPoints = event.detail.dataPoints;
onDataSelected && onDataSelected(dataPoints);
});
// Handle token expiration
embeddedReport.off('tokenExpired');
embeddedReport.on('tokenExpired', async () => {
try {
const newConfig = await refreshEmbedConfig(embedConfig.id);
await embeddedReport.setAccessToken(newConfig.accessToken);
} catch (error) {
console.error('Failed to refresh token:', error);
onError && onError(error);
}
});
setReport(embeddedReport);
} catch (error) {
console.error('Failed to embed report:', error);
setIsLoading(false);
onError && onError(error);
}
}, [embedConfig, onLoaded, onError, onDataSelected]);
// Refresh embed config (token refresh)
// Note: userToken is your application's own auth token, assumed to be
// available from your auth context (e.g., a hook or module-level store)
const refreshEmbedConfig = async (reportId) => {
const response = await fetch(`/api/embed/report/${reportId}/refresh`, {
method: 'POST',
headers: {
'Authorization': `Bearer ${userToken}`
}
});
return response.json();
};
useEffect(() => {
embedReport();
// Cleanup on unmount: reset the container directly; the `report` state
// captured here would be a stale closure value
const container = embedContainer.current;
return () => {
if (container) {
service.reset(container);
}
};
}, [embedReport]);
// Handle config changes (e.g., filter updates)
useEffect(() => {
if (report && embedConfig) {
// Apply any new settings or filters
if (embedConfig.filters) {
report.setFilters(embedConfig.filters);
}
}
}, [report, embedConfig]);
return (
<div style={style}>
{isLoading && (
<div style={{
display: 'flex',
justifyContent: 'center',
alignItems: 'center',
height: '100%',
background: '#f5f5f5'
}}>
<div>Loading report...</div>
</div>
)}
<div
ref={embedContainer}
style={{
width: '100%',
height: '100%',
display: isLoading ? 'none' : 'block'
}}
/>
</div>
);
};
export default PowerBIReport;
For advanced scenarios, implement programmatic filtering and interaction:
const AdvancedPowerBIReport = ({ embedConfig, userFilters }) => {
const [report, setReport] = useState(null);
// Apply filters when user context changes
useEffect(() => {
if (report && userFilters) {
const filters = userFilters.map(filter => ({
$schema: "http://powerbi.com/product/schema#basic",
target: {
table: filter.table,
column: filter.column
},
operator: filter.operator || "In",
values: Array.isArray(filter.values) ? filter.values : [filter.values],
filterType: models.FilterType.BasicFilter
}));
report.setFilters(filters)
.then(() => {
console.log('Filters applied successfully');
})
.catch(error => {
console.error('Failed to apply filters:', error);
});
}
}, [report, userFilters]);
// Handle visual interactions
const handleDataSelected = useCallback(async (dataPoints) => {
if (!dataPoints || dataPoints.length === 0) return;
// Extract selected data for cross-filtering other components
const selectedValues = dataPoints.map(point => point.identity);
// You could trigger actions in your application based on selections
await updateRelatedDashboards(selectedValues);
}, []);
// Export report data
const exportReportData = useCallback(async () => {
if (!report) return;
try {
const pages = await report.getPages();
const activePage = pages.find(page => page.isActive);
if (activePage) {
const visuals = await activePage.getVisuals();
for (const visual of visuals) {
const data = await visual.exportData(models.ExportDataType.Summarized);
console.log(`Visual ${visual.name} data:`, data);
}
}
} catch (error) {
console.error('Failed to export data:', error);
}
}, [report]);
return (
<div>
<PowerBIReport
embedConfig={embedConfig}
onLoaded={setReport}
onDataSelected={handleDataSelected}
/>
<button onClick={exportReportData}>
Export Data
</button>
</div>
);
};
For Vue.js applications, the pattern is similar but uses Vue's reactivity system:
// Vue component for Power BI embedding (Vue 2 options API)
import { models, service as pbiService, factories } from 'powerbi-client';

// Create the Power BI service instance used in embedReport() and cleanup
const service = new pbiService.Service(
factories.hpmFactory,
factories.wpmpFactory,
factories.routerFactory
);

export default {
name: 'PowerBIReport',
props: {
embedConfig: Object,
filters: Array
},
data() {
return {
report: null,
isLoading: true,
error: null
};
},
mounted() {
this.embedReport();
},
beforeDestroy() {
if (this.report) {
service.reset(this.$refs.embedContainer);
}
},
watch: {
filters: {
deep: true,
handler(newFilters) {
if (this.report && newFilters) {
this.applyFilters(newFilters);
}
}
}
},
methods: {
async embedReport() {
if (!this.embedConfig) return;
const config = {
type: 'report',
id: this.embedConfig.id,
embedUrl: this.embedConfig.embedUrl,
accessToken: this.embedConfig.accessToken,
tokenType: models.TokenType.Embed,
settings: this.embedConfig.settings || {}
};
this.report = service.embed(this.$refs.embedContainer, config);
this.report.on('loaded', () => {
this.isLoading = false;
this.$emit('loaded', this.report);
});
this.report.on('error', (event) => {
this.error = event.detail;
this.isLoading = false;
this.$emit('error', event.detail);
});
},
async applyFilters(filters) {
const powerBIFilters = filters.map(filter => ({
$schema: "http://powerbi.com/product/schema#basic",
target: {
table: filter.table,
column: filter.column
},
operator: filter.operator || "In",
values: filter.values,
filterType: models.FilterType.BasicFilter
}));
await this.report.setFilters(powerBIFilters);
}
},
template: `
<div>
<div v-if="isLoading" class="loading">
Loading report...
</div>
<div v-if="error" class="error">
Error loading report: {{ error.message }}
</div>
<div ref="embedContainer" style="width: 100%; height: 600px;"></div>
</div>
`
};
Performance considerations for frontend integration include lazy loading reports, implementing virtual scrolling for dashboards with many reports, and optimizing re-renders when filters change. Always debounce filter applications to avoid overwhelming the Power BI service with rapid successive calls.
Row-level security in Power BI Embedded is where multi-tenant applications truly shine. RLS allows you to show different data to different users from the same dataset, eliminating the need for separate datasets per tenant and dramatically simplifying your architecture.
The foundation of RLS lies in creating roles within your Power BI dataset and defining DAX filter expressions that restrict data access. These expressions are evaluated when embed tokens are generated with effective identities, ensuring users only see data they're authorized to access.
Let's design a comprehensive RLS strategy for a multi-tenant retail analytics platform. Our dataset contains sales data for multiple franchisees, and we need to ensure franchisees only see their own data while allowing regional managers to see multiple franchisees.
First, create roles in Power BI Desktop. In the "Modeling" tab, select "Manage Roles" and define roles that match your application's user hierarchy:
-- Role: Franchisee
-- Description: Single store access
[StoreId] = USERPRINCIPALNAME()
-- Role: Regional Manager
-- Description: Multiple stores in region
PATHCONTAINS(SUBSTITUTE(CUSTOMDATA(), ",", "|"), [RegionId])
-- Role: Corporate User
-- Description: All data access (no filter)
1 = 1
The real power comes from combining these roles with effective identity in your embed token generation:
public class TenantAwareEmbedService
{
private readonly IUserService _userService;
private readonly PowerBIApiService _powerBIService;
public async Task<EmbedConfig> GenerateSecureEmbedConfigAsync(
string userId,
Guid reportId)
{
var user = await _userService.GetUserAsync(userId);
var effectiveIdentity = await BuildEffectiveIdentityAsync(user);
var embedToken = await _powerBIService.GenerateEmbedTokenAsync(
_workspaceId,
reportId,
user.Email,
roles: effectiveIdentity.Roles,
identities: new[] { effectiveIdentity }
);
return new EmbedConfig
{
Type = "report",
Id = reportId.ToString(),
EmbedUrl = await GetReportEmbedUrl(reportId),
AccessToken = embedToken.Token,
TokenId = embedToken.TokenId
};
}
private async Task<EffectiveIdentity> BuildEffectiveIdentityAsync(User user)
{
var identity = new EffectiveIdentity
{
Username = user.Email,
Datasets = await GetUserAccessibleDatasets(user.Id)
};
switch (user.Role.ToLower())
{
case "franchisee":
identity.Roles = new List<string> { "Franchisee" };
// Username becomes the filter value in USERPRINCIPALNAME()
identity.Username = user.StoreId;
break;
case "regionalmanager":
identity.Roles = new List<string> { "Regional Manager" };
var managedStores = await _userService.GetManagedStoresAsync(user.Id);
identity.CustomData = string.Join(",", managedStores.Select(s => s.RegionId));
break;
case "corporateuser":
identity.Roles = new List<string> { "Corporate User" };
break;
default:
throw new UnauthorizedAccessException($"Unknown user role: {user.Role}");
}
return identity;
}
}
For complex multi-dimensional security, you might need dynamic RLS based on time periods, product categories, or customer segments. Here's an advanced example that combines multiple security dimensions:
-- Advanced RLS with multiple dimensions
-- Table: Sales
-- CUSTOMDATA() is assumed to look like:
--   "STORE001,STORE002|LIMITED_TIME|CATEGORY:Toys;Games"
-- (pipe-separated segments; the first is a comma-separated store list)
VAR StorePath =
    -- PATHITEM treats "|" as the separator, so item 1 is the store list;
    -- convert its commas to pipes so PATHCONTAINS can test membership
    SUBSTITUTE(PATHITEM(CUSTOMDATA(), 1), ",", "|")
VAR UserTimeFrame =
    IF(
        CONTAINSSTRING(CUSTOMDATA(), "LIMITED_TIME"),
        DATE(YEAR(TODAY()), MONTH(TODAY()), 1), -- First day of current month
        DATE(2020, 1, 1) -- Full historical access
    )
VAR CategorySegment =
    IF(
        CONTAINSSTRING(CUSTOMDATA(), "CATEGORY:"),
        MID(
            CUSTOMDATA(),
            FIND("CATEGORY:", CUSTOMDATA(), 1, 0) + 9,
            LEN(CUSTOMDATA())
        ),
        ""
    )
VAR CategoryPath =
    SUBSTITUTE(CategorySegment, ";", "|")
RETURN
    PATHCONTAINS(StorePath, [StoreId])
        && [Date] >= UserTimeFrame
        && (CategoryPath = "" || PATHCONTAINS(CategoryPath, RELATED(Products[Category])))
The corresponding effective identity generation becomes more sophisticated:
private async Task<EffectiveIdentity> BuildAdvancedEffectiveIdentityAsync(
User user,
EmbedRequest request)
{
var identity = new EffectiveIdentity
{
Username = user.Email,
Datasets = await GetUserAccessibleDatasets(user.Id)
};
var customDataParts = new List<string>();
// Add store access
var userStores = await GetUserStores(user.Id);
customDataParts.Add(string.Join(",", userStores.Select(s => s.Id)));
// Add time-based restrictions
if (user.HasLimitedTimeAccess)
{
customDataParts.Add("LIMITED_TIME");
}
// Add product category restrictions
if (request.ProductCategories?.Any() == true)
{
customDataParts.Add($"CATEGORY:{string.Join(";", request.ProductCategories)}");
}
identity.CustomData = string.Join("|", customDataParts);
identity.Roles = await DetermineUserRoles(user, request);
return identity;
}
Testing RLS is crucial but often overlooked. Power BI Desktop allows you to test roles locally, but you should also implement automated testing for your RLS logic:
[Test]
public async Task RLS_FranchiseeUser_ShouldOnlySeeOwnStoreData()
{
// Arrange
var franchiseeUser = new User
{
Id = "user1",
Email = "franchisee@store001.com",
Role = "Franchisee",
StoreId = "STORE001"
};
// Act
var embedConfig = await _embedService.GenerateSecureEmbedConfigAsync(
franchiseeUser.Id,
TestConstants.SalesReportId
);
// Assert - Token should contain proper effective identity
var tokenPayload = DecodeEmbedToken(embedConfig.AccessToken);
Assert.That(tokenPayload.Identities[0].Username, Is.EqualTo("STORE001"));
Assert.That(tokenPayload.Identities[0].Roles, Contains.Item("Franchisee"));
// Integration test - Verify actual data filtering
var reportData = await GetReportDataWithToken(embedConfig.AccessToken);
Assert.That(reportData.All(row => row.StoreId == "STORE001"), Is.True);
}
Performance considerations for RLS include understanding that complex DAX filter expressions can impact query performance. Monitor your dataset's refresh and query performance in the Power BI Admin Portal, and consider optimizing your data model with proper relationships and calculated columns for commonly filtered dimensions.
Also be aware that RLS doesn't apply to dataset refresh operations—it only filters data during query time. Ensure your underlying data source already has appropriate access controls to prevent unauthorized data from being imported into your dataset.
Performance in Power BI Embedded involves multiple dimensions: capacity sizing, query optimization, report design, and embed configuration. Understanding these interdependencies is essential for building solutions that scale to hundreds or thousands of concurrent users.
Capacity management is your first line of defense against performance issues. Power BI Embedded capacities are measured in virtual cores and memory, but the relationship between capacity size and performance isn't linear. A single complex report with many visuals can consume more resources than ten simple reports.
Monitor capacity utilization through Azure Monitor and the Power BI Admin Portal. Set up alerts for key metrics:
public class CapacityMonitoringService
{
private readonly ILogger<CapacityMonitoringService> _logger;
private readonly IMetricsCollector _metrics;
private readonly PowerBIApiService _powerBIService;
public async Task MonitorCapacityHealthAsync()
{
var capacities = await _powerBIService.GetCapacitiesAsync();
foreach (var capacity in capacities)
{
var utilization = await GetCapacityUtilization(capacity.Id);
// Log metrics for alerting
_metrics.Gauge("powerbi.capacity.cpu_percentage", utilization.CpuPercentage,
new[] { $"capacity:{capacity.DisplayName}" });
_metrics.Gauge("powerbi.capacity.memory_percentage", utilization.MemoryPercentage,
new[] { $"capacity:{capacity.DisplayName}" });
// Check for concerning trends
if (utilization.CpuPercentage > 80)
{
_logger.LogWarning(
"Capacity {CapacityName} CPU utilization is {CpuPercentage}%",
capacity.DisplayName,
utilization.CpuPercentage);
await ConsiderCapacityScaling(capacity);
}
// Monitor query duration trends
var queryMetrics = await GetQueryPerformanceMetrics(capacity.Id);
var averageQueryDuration = queryMetrics.AverageQueryDurationMs;
if (averageQueryDuration > 5000) // 5 seconds
{
await AnalyzeSlowQueries(capacity.Id, queryMetrics.SlowQueries);
}
}
}
private async Task ConsiderCapacityScaling(Capacity capacity)
{
// Check if scaling is beneficial
var currentSku = capacity.Sku;
var recommendedSku = CalculateRecommendedSku(capacity.Utilization);
if (recommendedSku != currentSku)
{
_logger.LogInformation(
"Recommending capacity {CapacityName} scaling from {CurrentSku} to {RecommendedSku}",
capacity.DisplayName,
currentSku,
recommendedSku);
// In production, you might want to auto-scale or alert administrators
await NotifyAdministrators(capacity, currentSku, recommendedSku);
}
}
}
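The `CalculateRecommendedSku` helper above is referenced but left abstract. One simple approach is a fixed SKU ladder with separated up/down thresholds — sketched here in JavaScript for brevity. The A-series names are real Power BI Embedded SKUs, but the thresholds are illustrative assumptions, not Microsoft guidance:

```javascript
// Power BI Embedded A-series SKUs, smallest to largest.
const SKU_LADDER = ['A1', 'A2', 'A3', 'A4', 'A5', 'A6'];

// Recommend one step up under sustained high CPU, one step down when idle.
// The 80%/30% thresholds are assumptions; tune them against your own
// capacity metrics.
function calculateRecommendedSku(currentSku, cpuPercentage) {
  const i = SKU_LADDER.indexOf(currentSku);
  if (i === -1) throw new Error(`Unknown SKU: ${currentSku}`);
  if (cpuPercentage > 80 && i < SKU_LADDER.length - 1) return SKU_LADDER[i + 1];
  if (cpuPercentage < 30 && i > 0) return SKU_LADDER[i - 1];
  return currentSku;
}
```

Keeping the scale-down threshold well below the scale-up threshold gives you hysteresis, so the capacity doesn't oscillate between two SKUs under fluctuating load.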
Implement intelligent capacity scaling based on usage patterns:
public class IntelligentCapacityScaler
{
private readonly IAzureResourceManager _azureRM;
private readonly ICapacityUsagePredictor _usagePredictor;
private readonly ILogger<IntelligentCapacityScaler> _logger;
public async Task<bool> ShouldScaleCapacity(string capacityName)
{
var currentUtilization = await GetCurrentUtilization(capacityName);
var predictedUtilization = await _usagePredictor.PredictNextHourUtilization(capacityName);
var historicalPatterns = await GetHistoricalUsagePatterns(capacityName);
// Scale up if:
// 1. Current utilization > 80%
// 2. Predicted utilization > 85%
// 3. Historical pattern shows sustained high usage
return currentUtilization.CpuPercentage > 80 ||
predictedUtilization.CpuPercentage > 85 ||
historicalPatterns.SustainedHighUsage;
}
public async Task ScaleCapacityAsync(string capacityName, string targetSku)
{
_logger.LogInformation(
"Starting capacity scale operation for {CapacityName} to {TargetSku}",
capacityName,
targetSku);
// Pause capacity during scaling to prevent disruption
await _azureRM.PauseCapacityAsync(capacityName);
try
{
await _azureRM.ScaleCapacityAsync(capacityName, targetSku);
// Wait for scaling to complete
await WaitForCapacityReady(capacityName);
}
finally
{
// Resume capacity
await _azureRM.ResumeCapacityAsync(capacityName);
}
_logger.LogInformation(
"Completed capacity scale operation for {CapacityName}",
capacityName);
}
}
Report-level performance optimization requires understanding how Power BI processes queries. Implement report performance monitoring:
class ReportPerformanceMonitor {
constructor() {
this.performanceMetrics = new Map();
}
startReportLoad(reportId) {
this.performanceMetrics.set(reportId, {
startTime: performance.now(),
embedStartTime: performance.now()
});
}
onReportLoaded(reportId, report) {
const metrics = this.performanceMetrics.get(reportId);
if (metrics) {
metrics.loadTime = performance.now() - metrics.startTime;
metrics.report = report;
// Start monitoring ongoing performance
this.monitorReportQueries(reportId, report);
}
}
async monitorReportQueries(reportId, report) {
report.on('dataSelected', () => {
this.recordInteractionTime(reportId, 'dataSelected');
});
report.on('pageChanged', (event) => {
this.recordInteractionTime(reportId, 'pageChanged');
});
// Monitor render performance
const pages = await report.getPages();
for (const page of pages) {
const visuals = await page.getVisuals();
for (const visual of visuals) {
this.monitorVisualPerformance(reportId, visual);
}
}
}
recordInteractionTime(reportId, interactionType) {
const startTime = performance.now();
// Defer with setTimeout(0) so the measurement captures the synchronous
// work this interaction triggered in the current task (a rough proxy,
// not an exact render time)
setTimeout(() => {
const duration = performance.now() - startTime;
this.sendPerformanceMetric({
reportId,
interactionType,
duration,
timestamp: new Date().toISOString()
});
}, 0);
}
async sendPerformanceMetric(metric) {
// Send to your analytics platform
await fetch('/api/metrics/report-performance', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify(metric)
});
}
getReportMetrics(reportId) {
return this.performanceMetrics.get(reportId);
}
}
// Usage in your React component
import React, { useCallback, useEffect } from 'react';
import { models } from 'powerbi-client';

const performanceMonitor = new ReportPerformanceMonitor();
const OptimizedPowerBIReport = ({ embedConfig }) => {
const handleReportLoad = useCallback((report) => {
performanceMonitor.onReportLoaded(embedConfig.id, report);
// Optimize report settings for performance
report.updateSettings({
layoutType: models.LayoutType.Custom,
customLayout: {
displayOption: models.DisplayOption.FitToPage,
pagesLayout: {
ReportSection1: { // pagesLayout is keyed by report page name, not index
defaultLayout: {
displayState: {
mode: models.VisualContainerDisplayMode.Visible
}
}
}
}
}
});
}, [embedConfig.id]);
useEffect(() => {
if (embedConfig) {
performanceMonitor.startReportLoad(embedConfig.id);
}
}, [embedConfig]);
return (
<PowerBIReport
embedConfig={embedConfig}
onLoaded={handleReportLoad}
/>
);
};
Caching strategies can dramatically improve performance for frequently accessed reports:
public class EmbedConfigCacheService
{
private readonly IDistributedCache _cache;
private readonly PowerBIApiService _powerBIService;
private readonly ILogger<EmbedConfigCacheService> _logger;
public async Task<EmbedConfig> GetCachedEmbedConfigAsync(
string userId,
Guid reportId,
Dictionary<string, object> userContext)
{
var cacheKey = GenerateCacheKey(userId, reportId, userContext);
var cachedConfig = await _cache.GetAsync(cacheKey);
if (cachedConfig != null)
{
var config = JsonSerializer.Deserialize<EmbedConfig>(cachedConfig);
// Check if token is still valid
if (IsTokenValid(config))
{
_logger.LogDebug("Returning cached embed config for {UserId}, {ReportId}", userId, reportId);
return config;
}
}
// Generate new config
var newConfig = await _powerBIService.GenerateEmbedConfigAsync(userId, reportId, userContext);
// Cache for 45 minutes (tokens expire after 60 minutes)
var cacheOptions = new DistributedCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(45)
};
var serializedConfig = JsonSerializer.SerializeToUtf8Bytes(newConfig);
await _cache.SetAsync(cacheKey, serializedConfig, cacheOptions);
return newConfig;
}
private string GenerateCacheKey(string userId, Guid reportId, Dictionary<string, object> userContext)
{
// Include user context in cache key for proper multi-tenant isolation
var contextHash = ComputeHash(JsonSerializer.Serialize(userContext));
return $"embed_config:{userId}:{reportId}:{contextHash}";
}
private bool IsTokenValid(EmbedConfig config)
{
// Parse as DateTimeOffset so the round-trip ("O") UTC timestamp isn't
// misinterpreted as local time
if (DateTimeOffset.TryParse(config.Expiration, out DateTimeOffset expiration))
{
// Consider token valid if it expires more than 5 minutes from now
return expiration > DateTimeOffset.UtcNow.AddMinutes(5);
}
return false;
}
}
For optimal performance, also consider report design best practices: minimize the number of visuals per page, use efficient DAX measures, implement proper data model relationships, and avoid circular dependencies. Monitor dataset refresh times and optimize your data pipeline to ensure fresh data doesn't come at the cost of performance.
Let's build a complete multi-tenant retail analytics application that demonstrates all the concepts we've covered. This exercise will create a customer-facing dashboard system where retail franchisees can view their store's performance data embedded seamlessly within a custom portal.
Our application architecture combines an ASP.NET Core Web API backend, a React frontend, a SQL Server database for tenant and user data, a Redis-backed distributed cache for embed configurations, and a dedicated Power BI Embedded workspace secured with row-level security.
Start by setting up the backend project structure:
mkdir RetailAnalyticsPortal
cd RetailAnalyticsPortal
dotnet new webapi -n RetailAnalyticsPortal.Api
dotnet new react -n RetailAnalyticsPortal.Web
cd RetailAnalyticsPortal.Api
dotnet add package Microsoft.PowerBI.Api
dotnet add package Microsoft.Extensions.Caching.StackExchangeRedis
dotnet add package Microsoft.EntityFrameworkCore.SqlServer
dotnet add package Microsoft.AspNetCore.Authentication.JwtBearer
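The Power BI service we build below reads its settings from configuration (the `WorkspaceId`, `ClientId`, `ClientSecret`, and `TenantId` properties). A hypothetical `appsettings.json` fragment with placeholder values — in production the client secret belongs in Azure Key Vault, not in this file:

```json
{
  "PowerBI": {
    "WorkspaceId": "00000000-0000-0000-0000-000000000000",
    "ClientId": "<app-registration-client-id>",
    "ClientSecret": "<from-azure-key-vault-in-production>",
    "TenantId": "<azure-ad-tenant-id>"
  }
}
```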
Create the domain models for our multi-tenant system:
// Models/User.cs
public class User
{
public int Id { get; set; }
public string Email { get; set; }
public string FirstName { get; set; }
public string LastName { get; set; }
public UserRole Role { get; set; }
public int? StoreId { get; set; }
public Store Store { get; set; }
public List<int> ManagedStoreIds { get; set; } = new List<int>();
}
public enum UserRole
{
Franchisee = 1,
RegionalManager = 2,
CorporateUser = 3
}
// Models/Store.cs
public class Store
{
public int Id { get; set; }
public string Name { get; set; }
public string StoreCode { get; set; }
public int RegionId { get; set; }
public Region Region { get; set; }
public List<User> Users { get; set; } = new List<User>();
}
// Models/Region.cs
public class Region
{
public int Id { get; set; }
public string Name { get; set; }
public List<Store> Stores { get; set; } = new List<Store>();
}
// Models/EmbedConfig.cs
public class EmbedConfig
{
public string Type { get; set; }
public string Id { get; set; }
public string EmbedUrl { get; set; }
public string AccessToken { get; set; }
public string TokenId { get; set; }
public string Expiration { get; set; }
public Dictionary<string, object> Settings { get; set; } = new Dictionary<string, object>();
}
Implement the complete Power BI service with all the patterns we've discussed:
// Services/PowerBIEmbedService.cs
public class PowerBIEmbedService : IPowerBIEmbedService
{
private readonly IConfiguration _configuration;
private readonly IUserService _userService;
private readonly IDistributedCache _cache;
private readonly ILogger<PowerBIEmbedService> _logger;
private readonly HttpClient _httpClient;
private string WorkspaceId => _configuration["PowerBI:WorkspaceId"];
private string ClientId => _configuration["PowerBI:ClientId"];
private string ClientSecret => _configuration["PowerBI:ClientSecret"];
private string TenantId => _configuration["PowerBI:TenantId"];
public async Task<EmbedConfig> GenerateEmbedConfigAsync(
int userId,
Guid reportId,
EmbedRequest request = null)
{
var user = await _userService.GetUserByIdAsync(userId);
if (user == null)
{
throw new UnauthorizedAccessException("User not found");
}
// Check cache first
var cacheKey = $"embed_config:{userId}:{reportId}:{ComputeRequestHash(request)}";
var cachedConfig = await _cache.GetStringAsync(cacheKey);
if (!string.IsNullOrEmpty(cachedConfig))
{
var config = JsonSerializer.Deserialize<EmbedConfig>(cachedConfig);
if (IsTokenValid(config))
{
return config;
}
}
// Generate new embed config
var accessToken = await GetAccessTokenAsync();
var report = await GetReportAsync(reportId, accessToken);
var effectiveIdentity = await BuildEffectiveIdentityAsync(user, request);
var embedToken = await GenerateEmbedTokenAsync(
reportId,
accessToken,
effectiveIdentity);
var embedConfig = new EmbedConfig
{
Type = "report",
Id = reportId.ToString(),
EmbedUrl = report.EmbedUrl,
AccessToken = embedToken.Token,
TokenId = embedToken.TokenId,
Expiration = embedToken.Expiration.ToString("O"),
Settings = BuildEmbedSettings(user, request)
};
// Cache for 45 minutes
var cacheOptions = new DistributedCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(45)
};
await _cache.SetStringAsync(
cacheKey,
JsonSerializer.Serialize(embedConfig),
cacheOptions);
return embedConfig;
}
private async Task<EffectiveIdentity> BuildEffectiveIdentityAsync(
User user,
EmbedRequest request)
{
var identity = new EffectiveIdentity
{
Username = user.Email,
Datasets = new List<string> { await GetDefaultDatasetId() }
};
var customDataParts = new List<string>();
switch (user.Role)
{
case UserRole.Franchisee:
identity.Roles = new List<string> { "Franchisee" };
identity.Username = user.Store.StoreCode; // Used in RLS filter
break;
case UserRole.RegionalManager:
identity.Roles = new List<string> { "RegionalManager" };
var managedStores = await _userService.GetManagedStoresAsync(user.Id);
customDataParts.Add(string.Join(",", managedStores.Select(s => s.StoreCode)));
break;
case UserRole.CorporateUser:
identity.Roles = new List<string> { "CorporateUser" };
// No additional filters - see all data
break;
}
// Add time-based filters if requested
if (request?.DateRange != null)
{
customDataParts.Add($"DATE_START:{request.DateRange.StartDate:yyyy-MM-dd}");
customDataParts.Add($"DATE_END:{request.DateRange.EndDate:yyyy-MM-dd}");
}
// Add product category filters
if (request?.ProductCategories?.Any() == true)
{
customDataParts.Add($"CATEGORIES:{string.Join(";", request.ProductCategories)}");
}
if (customDataParts.Any())
{
identity.CustomData = string.Join("|", customDataParts);
}
return identity;
}
private Dictionary<string, object> BuildEmbedSettings(User user, EmbedRequest request)
{
return new Dictionary<string, object>
{
["panes"] = new
{
filters = new { expanded = false, visible = user.Role == UserRole.CorporateUser },
pageNavigation = new { visible = true }
},
["background"] = "transparent",
["layoutType"] = "custom",
["bars"] = new
{
statusBar = new { visible = false }
}
};
}
}
Create the API controllers that expose the embedding functionality:
// Controllers/EmbedController.cs
[ApiController]
[Route("api/[controller]")]
[Authorize]
public class EmbedController : ControllerBase
{
private readonly IPowerBIEmbedService _embedService;
private readonly IUserService _userService;
private readonly ILogger<EmbedController> _logger;
[HttpPost("reports/{reportId}")]
public async Task<IActionResult> GetEmbedConfig(
Guid reportId,
[FromBody] EmbedRequest request = null)
{
try
{
var userId = GetCurrentUserId();
var embedConfig = await _embedService.GenerateEmbedConfigAsync(
userId,
reportId,
request);
return Ok(embedConfig);
}
catch (UnauthorizedAccessException ex)
{
_logger.LogWarning(ex, "Unauthorized embed request for report {ReportId}", reportId);
return Forbid();
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to generate embed config for report {ReportId}", reportId);
return StatusCode(500, "Failed to generate embed configuration");
}
}
[HttpPost("reports/{reportId}/refresh-token")]
public async Task<IActionResult> RefreshEmbedToken(Guid reportId)
{
try
{
var userId = GetCurrentUserId();
// Clear cache to force new token generation
await _embedService.InvalidateCacheAsync(userId, reportId);
var embedConfig = await _embedService.GenerateEmbedConfigAsync(userId, reportId);
return Ok(new { accessToken = embedConfig.AccessToken });
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to refresh embed token for report {ReportId}", reportId);
return StatusCode(500, "Failed to refresh token");
}
}
[HttpGet("available-reports")]
public async Task<IActionResult> GetAvailableReports()
{
try
{
var userId = GetCurrentUserId();
var user = await _userService.GetUserByIdAsync(userId);
var availableReports = await _embedService.GetUserAccessibleReportsAsync(user);
return Ok(availableReports);
}
catch (Exception ex)
{
_logger.LogError(ex, "Failed to get available reports");
return StatusCode(500, "Failed to retrieve available reports");
}
}
private int GetCurrentUserId()
{
var userIdClaim = User.FindFirst("sub")?.Value ?? User.FindFirst("user_id")?.Value;
if (!int.TryParse(userIdClaim, out int userId))
{
throw new UnauthorizedAccessException("Invalid user token");
}
return userId;
}
}
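The refresh-token endpoint above only helps if the client calls it before the embed token lapses. A small scheduling helper makes that explicit — this is a sketch, assuming the `expiration` field is the ISO-8601 string the backend returns:

```javascript
// Milliseconds until the token should be refreshed: a safety margin
// before the actual expiration, clamped at zero for already-stale tokens.
function msUntilRefresh(expirationIso, marginMinutes = 5, now = Date.now()) {
  const expiresAt = Date.parse(expirationIso);
  return Math.max(0, expiresAt - marginMinutes * 60 * 1000 - now);
}

// Schedule refreshFn to run shortly before the token expires; returns the
// timer handle so the caller can cancel it on unmount.
function scheduleTokenRefresh(embedConfig, refreshFn) {
  return setTimeout(refreshFn, msUntilRefresh(embedConfig.expiration));
}
```

In a React component, run `scheduleTokenRefresh` inside a `useEffect` and clear the timer in its cleanup function; the refreshed token can then be applied without re-embedding via the embedded report's `setAccessToken` method.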
Now build the React frontend components. First, create a robust PowerBI component:
// components/PowerBIReport.jsx
import React, { useEffect, useRef, useState, useCallback } from 'react';
import { models, service } from 'powerbi-client';
import { useAuth } from '../contexts/AuthContext';
import LoadingSpinner from './LoadingSpinner';
import ErrorBoundary from './ErrorBoundary';
const PowerBIReport = ({
reportId,
embedRequest = null,
onLoaded,
onError,
onDataSelected,
height = '600px',
className = ''
}) => {
const { apiClient } = useAuth();
const embedContainer = useRef(null);
const [report, setReport] = useState(null);
const [isLoading, setIsLoading] = useState(true);
const [error, setError] = useState(null);
const [embedConfig, setEmbedConfig] = useState(null);
const loadEmbedConfig = useCallback(async () => {
try {
setIsLoading(true);
setError(null);
const config = await apiClient.post(`/api/embed/reports/${reportId}`, embedRequest);
setEmbedConfig(config);
} catch (err) {
setError(err.message);
setIsLoading(false);
onError?.(err);
}
}, [reportId, embedRequest, apiClient, onError]);
const embedReport = useCallback(async () => {
if (!embedConfig || !embedContainer.current) return;
try {
// Clean up existing embed
if (report) {
service