Power BI Bible


Power BI

Bible
Transforming Insights with Power
BI’s Magic; Learn the Secrets of
Data Visualization and Analysis

Carnon J. Martinez
Copyright © 2023 Carnon J. Martinez
All Rights Reserved
This book or parts thereof may not be reproduced in
any form, stored in any retrieval system, or
transmitted in any form by any means—electronic,
mechanical, photocopy, recording, or otherwise—
without prior written permission of the publisher,
except as provided by United States of America
copyright law and fair use.
Disclaimer and Terms of Use
The author and publisher of this book and the
accompanying materials have used their best efforts
in preparing this book. The author and publisher
make no representation or warranties with respect to
the accuracy, applicability, fitness, or completeness
of the contents of this book. The information
contained in this book is strictly for informational
purposes. Therefore, if you wish to apply the ideas
contained in this book, you are taking full
responsibility for your actions.
Printed in the United States of America
TABLE OF CONTENTS
CHAPTER 1
INTRO TO POWER BI
What Is Power BI?
Power BI Components
Power BI Desktop
Its components
The Power BI Service
The Power Platform
How Did We Get to Power BI?
SQL Server: Microsoft’s Relational Database
SQL Server Analysis Services Multidimensional: One Small Step into BI
SQL Server Reporting Services: Pixel-Perfect Reporting, Automated Reports,
and More
Excel: A Self-Service BI Tool
Power Pivot
Important New Functionality That Leads to Power BI
Power BI Desktop is born
Power BI Desktop under the Hood
VertiPaq: The Storage Engine
DAX: The Formula Engine
What Makes Power BI Different from Its Competitors?
Conclusion
CHAPTER 2
THE REPORT AND DATA VIEWS
Use the ribbon in Power BI Desktop
Report View: Home Section of the Ribbon
The Clipboard Subsection
The Data Subsection
The Queries Subsection
The Insert Subsection
The Calculation Subsection
Report View: The Insert Tab
The Pages Subsection
The Visuals Subsection
The AI Visuals Subsection
The Power Platform Subsection
The Elements Subsection
Report View: The Modeling Tab
The Relationships Subsection
The Calculations Subsection
The Page Refresh Subsection
The Security Subsection
The Q&A Subsection
Report View: The View Tab
The Themes Subsection
The Scale to Fit Subsection
The Page Options Subsection
The Show Panes Subsection
Report View: Help Section
Report View: External Tools Section
The Pane Interface of the Report View
Visualizations Pane
Fields and Filters Panes
A Quick Rundown of the Other Panes
Data View
Conclusion
CHAPTER 3
IMPORTING AND MODELING OUR DATA
Getting Our Data
Importing the Data
The Power Query Ribbon
The Home Tab
Transform Tab
The Add Column Tab
The Model-View
What Is a Relationship?
Types of Relationships in Power BI
Creating and Managing Relationships in Power BI
Benefits and Applications of Relationships in Power BI
The Properties Pane
Conclusion
CHAPTER 4
LET’S MAKE SOME PICTURES (VISUALIZING DATA 101)
Why Visualize Data?
The Visualizations pane
Fields
Format
Analytics
Visual Interactivity
Enable the visual interaction controls
Change the interaction behavior
Column and Bar Charts
Stacked Bar and Column Charts
Clustered Bar and Column Charts
100% Stacked Bar and Column Charts
Small Multiples
Waterfall Chart
Benefits of Using Waterfall Charts in Power BI
Creating a Waterfall Chart in Power BI
Line and Area Charts
Line Chart
Creating a Line Chart
Area Chart
Stacked Area Chart
Line and Stacked Column Chart/Clustered Column Chart
Ribbon Chart
Donuts, Dots, and Maps
Funnel Chart
Scatter Chart
Pie and Donut Chart
Treemap
Map Visuals
The “Flat Visuals”
Gauge
Card/Multi-Row Card
KPI
Table/Matrix
Slicer
Conclusion
CHAPTER 5
AGGREGATIONS, MEASURES, AND DAX
A Primer on the DAX Language
Measures
Calculated Columns
Calculated Tables
Types of Functions
Aggregations, More than Some Sums
Sum
Average
Minimum and Maximum
Standard Deviation, Variance, and Median
Standard Deviation
Variance
Median
Count and Count (Distinct)
First, Last, Earliest, and Latest
Measures and DAX Fundamentals
Implicit and Explicit Measures
Implicit Measures
Characteristics of Implicit Measures in Power BI
Applications of Implicit Measures in Power BI
Explicit Measures
Characteristics of Explicit Measures in Power BI
Applications of Explicit Measures in Power BI
DAX Syntax Fundamentals
CALCULATE
Examples of CALCULATE Function Usage
We Heard You like DAX, So We Put Some DAX in Your DAX
Nested DAX Functions
Examples of Nested DAX Expressions
Row and Filter Context
Row Context
Filter Context
Interaction between Row and Filter Context
One Final DAX Example
Conclusion
CHAPTER 6
PUTTING THE PUZZLE PIECES TOGETHER: FROM RAW DATA TO REPORT
Your First Data Import
Choose and Transform the Data When You Import
Consolidating Tables with Append
Considerations for Consolidating Tables with Append
Using Merge to Get Columns from Other Tables
Building Relationships
Autodetect during load
Create a relationship with Autodetect
Create a relationship manually
Edit a relationship
Editing relationships using different methods
Important
Configure more options
Cardinality
Cross filter direction
Make this relationship active
Understanding additional options
Automatic relationship updates
Identifying Our Relationship Columns
Time to Get Building
Let’s Get Reporting
We Need a Name...
Cards Help Identify Important Data Points
Bars, Columns, and Lines
Conclusion
CHAPTER 7
ADVANCED REPORTING TOPICS IN POWER BI
AI-Powered Visuals
Key Influencers
Using the Key Influencers Visual in Power BI
LET’S WORK WITH AN EXAMPLE.
Decomposition Tree
Key Benefits of the Decomposition Tree
Using the Decomposition Tree Visual in Power BI
Q&A
Smart Narrative
What-If Analysis
Benefits of What-If Analysis in Power BI
Techniques for What-If Analysis in Power BI
Best Practices for What-If Analysis in Power BI
Parameter Setup
DAX Integration of the Parameter
Parameter Modification
R and Python Integration
Benefits of R and Python Integration
Practical Applications
Limitations of Using R and Python
Enabling R and Python for Power BI
R and Python in Power Query
R and Python Visuals
Conclusion
CHAPTER 8
INTRODUCTION TO THE POWER BI SERVICE
The Basics of the Service: What You Need to Know
The Navigation Menu
Home and Browse
Create
Data Hub
Settings
Metrics
Apps
Deployment Pipelines
Learn
Publishing Your Work
What Is a Workspace?
My Workspace
Shared Capacity Workspaces
Dataflows in Shared Workspaces
Putting Your Data in Front of Others
Adding Users to a Workspace
Sharing via a Link or Teams
Sharing via a Link
Sharing via Teams
Sharing via SharePoint
Creating an App
Conclusion
CHAPTER 9
LICENSING AND DEPLOYMENT TIPS
Licensing
Power BI service licenses
Free per-user license
Pro license
Premium per user (PPU) license
Premium capacity
Workspace and App Management
Workspace Generation and Access Control
Managing Users in a Workspace
Remove a user or change their role in a workspace
Adding Users to Roles for RLS Implementation
The Golden Dataset(s)
On-Premises vs. Cloud Deployment for Power BI
On-Premises Deployment
Cloud Deployment
Scaling Power BI Deployments for Enterprise Use
Best Practices for Power BI Deployment
Troubleshooting Licensing Issues in Power BI
Common Licensing Issues
Troubleshooting Licensing Issues
Tips for Optimizing Power BI License Costs
Conclusion
CHAPTER 10
THIRD-PARTY TOOLS
Get to Know Business Ops
Add External Tools, Remove External Tools, and Modify Display Order
Learning, Theme Generation, Visual Generation
Additional DAX Resources
DAX Studio
What can you do with DAX Studio in Power BI?
Download, Install, and Setup DAX Studio Power BI
DAX Studio UI Basics
Metadata Panel
The Ribbon
Query Pane
Output Pane
How to Write Queries in DAX Studio?
Tabular Editor
Creating Roles
Table and Measure Management
The ALM Toolkit for Power BI
Bravo
Analyze Model
DAX Formatting
Manage Dates
Export Data
Understanding Bravo's Data Export Options
Preparing Data in Bravo
Exporting Data from Bravo
Conclusion
Commonly Used DAX Expressions
Aggregation Functions
Date and Time Functions
Time Intelligence Functions
Filter Functions
Logical Functions
DAX Operators
Some Favorite Custom Visuals
SandDance
Smart Filter Pro
Chiclet Slicer
Timeline Storyteller
Synoptic Panel
Word Cloud
Card with States
Radar Chart
Hexbin Scatterplot
Hierarchy Slicer
Gantt chart by MAQ Software
Bullet chart
Sunburst Chart
Conclusion
INDEX
CHAPTER 1
INTRO TO POWER BI
What Is Power BI?

Imagine for a moment that you are a data analyst working in the
hectic atmosphere of a large corporation. Because you are
continually flooded with a vast volume of data coming from a variety
of sources, it can be difficult to glean significant insights from the
data and present them in a way that is both clear and succinct.
However, there is a solution that can completely transform your data
analysis experience, and that solution is called Power BI. You will
have the ability to transform raw data into visually engaging reports
and interactive dashboards with the help of the sophisticated
technology known as Power BI. You can simply connect to many
data sources thanks to its intuitive interface, regardless of whether
those data sources are spreadsheets, databases, or cloud services
where the data is kept. Excel, SharePoint, and SQL Server are
examples of well-known tools that can be easily integrated with
Power BI. This integration makes data retrieval and aggregation a
snap.
After your data has been connected, Power BI gives you the ability
to manipulate it and model it following your requirements. To build a
foundation for your study that is both structured and well-organized,
you can easily combine tables, define associations between various
datasets, and run calculations. You can acquire deeper insights
because of its flexible data modeling capabilities, which enable you
to explore the hidden patterns and correlations included within your
data. But the real power of Power BI rests in the features it offers for
visualizing data. The days of using boring spreadsheets and
unchanging charts are long gone. Power BI provides users with
access to a comprehensive collection of dynamic visualizations,
including bar charts, line graphs, maps, and gauges. Stunning
visualizations that bring your data to life can be crafted quickly and
easily by dragging and dropping data fields onto a canvas. These
visually striking, interactive reports let you deliver
complicated information clearly and engagingly, which
enables stakeholders to understand insights at a glance and
improves your ability to communicate effectively. Power BI is not only
about how it looks; it is built on a powerful and efficient engine that
can handle massive datasets with surprising speed and agility. Your
reports will always be up to date thanks to real-time updates and the
ability to refresh data, which will also provide you with the most
recent information to help you make decisions promptly.
To acquire a more in-depth comprehension of your data, you can
obtain new insights by using the advanced analytics tools of Power
BI. These features allow you to discover patterns, locate data
anomalies, and carry out complex calculations. You can simply share
your reports and dashboards with coworkers, allowing them to
examine the data, ask questions, and offer their insights.
Collaboration is another strength of Power BI. Its collaboration
tools allow for seamless teamwork, which in turn
helps to cultivate a culture of data-driven decision-making throughout
your firm. Lastly, Power BI is the most reliable partner you could ask
for in the field of data analysis. It gives you the ability to connect,
model, display, and share your data in ways that were previously
impossible to conceive of. You'll be able to turn raw data into insights
that can be put into action with Power BI, which will help your
business make more informed decisions and accelerate its path to
success. Consequently, if you want to realize the full potential of your
efforts to analyze data, you should leverage the power of Power BI.

Power BI Components
Today, Power BI is a family of products that enables users to
produce and consume reports based on their data.
The following is a list of all the components that are included in
the Microsoft Power BI family of products, as provided by
Microsoft:
● Power BI Desktop
● Power BI service
● Power BI Mobile
● Power BI Report Builder
● Power BI Report Server on-premises
● Power BI Embedded
● Power BI Desktop: This is a Windows application that acts as
the primary development tool for the creation and design of
interactive reports and dashboards. Power BI Desktop was
developed by Microsoft. It provides a broad selection of data
connecting options, data transformation capabilities, modeling
tools, and visualization capabilities, among other things. Users of
Power BI Desktop can develop aesthetically attractive reports and
dashboards, as well as create complicated data models,
implement custom calculations with DAX (Data Analysis
Expressions), and more. When it comes to authoring and revising
Power BI content before uploading it to the Power BI Service, this
is the tool that everyone uses.
● Power BI Service: The Power BI Service is a web-based
platform that is also known as Power BI Online and Power BI
Cloud. Users are granted the ability to
publish, share, and collaborate on Power BI reports and
dashboards through the use of this feature. Users can store and
manage their data securely in the cloud using Power BI Service.
Users are also able to generate data-driven alerts, schedule data
refreshes, and access reports and dashboards from any location
using a web browser or the Power BI mobile app. It offers extra
features such as the capacity for content sharing and
collaboration, in addition to powerful analytics capabilities.
● Power BI Mobile: This is a collection of mobile applications
available for iOS and Android devices. It gives users the
ability to access and interact with
Power BI dashboards and reports when they are on the move.
The mobile apps deliver an experience that is responsive and
optimized, enabling users to access and explore data, receive
data-driven notifications, and maintain a connection with the data
insights of their company even when they are away from their
desks. Enhancing portability and adaptability in data consumption,
Power BI Mobile assures users that they will always have access
to their most important data, regardless of where they are.
● Power BI Report Builder: This is a standalone product that
enables users to generate paginated reports in Power BI.
Invoices, statements, and operational reports are examples of the
traditional, fixed-layout reporting needs these reports are
designed to serve. Table-based layouts,
flexible formatting options, and complex parameters are some of
the capabilities that are included in Power BI Report Builder's
extensive feature set, which can be used for designing and
authoring pixel-perfect reports. It is especially helpful for
businesses that largely rely on structured reports that are printed
out and require exact control over the report's style and content.
● Power BI Report Server (on-premises): Power BI Report
Server is a reporting solution that allows businesses to host,
manage, and distribute Power BI reports within their
infrastructure. It is an on-premises version of the Power BI Report
Server. It ensures compliance with regulatory standards our data
governance policies by providing a method to store sensitive data
and reports on local servers. The Power BI Report Server
provides functionality that is analogous to that of the Power BI
Service. These capabilities include interactive reports, the
refreshing of data, and the secure sharing of information inside an
organization's network.
● Power BI Embedded: This is a service that is geared toward
developers and allows Power BI reports and dashboards to be
embedded directly into other applications or websites. End users
will have access to real-time analytics and interactive reports
without having to leave the context of their primary application
thanks to this feature, which enables developers to integrate
Power BI's extensive data visualization capabilities into their apps
in a seamless manner. Power BI Embedded is frequently utilized
by software vendors, SaaS providers, and businesses that
wish to provide embedded analytics to their external clients or
their employees.
These Power BI components cater to various areas of data analysis,
reporting, and visualization. As a result, Power BI offers a
comprehensive array of tools and services to address a wide variety
of business objectives and scenarios.
Power BI Desktop
In the data-driven world of today, companies in a wide variety of
fields are struggling to make sense of massive amounts of data. The
ability to draw meaningful conclusions from this data has become
necessary to make intelligent business decisions. Users are given
the ability to connect, transform, and efficiently visualize data
through the utilization of Microsoft's Power BI Desktop, which is a
highly effective business intelligence application.
Power BI Desktop allows for seamless integration to a wide variety
of data sources, such as databases, spreadsheets, web sources,
and cloud services. Because of its extensive data connectivity,
customers can extract data from a wide variety of platforms,
including SQL Server, Excel, SharePoint, Salesforce, Azure, and
many others. The capability to merge data from several sources
makes the process of data integration simpler. As a result, users can
obtain a more comprehensive perspective of the information of their
firm. Once your data is connected, Power BI Desktop gives you access to
a wide variety of features for transforming and modeling it. Users
have the option of using an intuitive user interface or the more
advanced Power Query Editor to clean, reshape, and alter their
data. With Power Query, users can perform data-cleaning tasks
such as removing duplicates, filtering rows, and merging
tables. Data modeling is another feature that
is supported by Power BI Desktop. This feature enables users to
construct relationships between tables, generate calculated columns
and measures, and apply business rules to improve data analysis.
The capacity of Power BI Desktop to generate data representations
that are both interactive and aesthetically pleasing is one of its most
notable advantages. Charts, graphs, tables, maps, and personalized
visualizations are just some of the options for data visualization that
can be generated by the program. Users can quickly construct
engaging reports and dashboards by dragging and dropping fields
onto the canvas and customizing the associated visual
characteristics. Users can present data in a manner that is both
meaningful and engaging because Power BI Desktop provides a
broad variety of formatting options, animations, and interactive
features.
Power BI Desktop goes beyond traditional methods of data
visualization by providing tools for more advanced analysis. It is
compatible with the data analysis and modeling tools developed by
Microsoft, such as the DAX (Data Analysis Expressions) and R
scripting languages. Users can make use of these functionalities to
build prediction models, perform complicated calculations, and
create sophisticated metrics. The Q&A feature of Power BI Desktop
allows users to ask questions about their data and obtain fast
insights. This is made possible by the fact that Power BI Desktop
supports natural language queries.
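As a small taste of what DAX looks like (it is covered in depth in Chapter 5), here are two sketched measures; the Sales table and its column names are hypothetical:

```dax
-- An explicit measure: total revenue across the current filter context
Total Revenue = SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )

-- A measure built on another measure, using CALCULATE to change the filter context
Revenue West = CALCULATE ( [Total Revenue], Sales[Region] = "West" )
```

Once defined, either measure can be dropped onto any visual, and Power BI recalculates it for whatever slice of the data the visual shows.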
Power BI Desktop makes it easier for individuals, groups, and
organizations to work together and share insights gained from data.
Users can publish their reports and dashboards to either the Power
BI service or SharePoint, making them available to stakeholders
located around the enterprise. The Power BI service is hosted in the
cloud and offers real-time data refreshment and sharing capabilities.
These features make it possible for users to work together and make
decisions based on the collected data. In addition, Power BI
Desktop files (.pbix) can be readily shared with other users and
viewed by them, which ensures that working together on analysis
and reporting is a smooth process. It is crucial to be able to obtain
data insights while on the move in today's world, which is dominated
by mobile devices. Users can view and interact with reports and
dashboards on their mobile devices such as smartphones and
tablets thanks to the Power BI Desktop application's seamless
integration with the Power BI mobile app. This flexibility across
platforms ensures that users can access key data insights whenever
and wherever they need them, regardless of the device they use or
the operating system they employ.

Its components
The Power BI Desktop application is made up of several distinct
components. The Power BI canvas and Power Query are the two
features that are going to receive the majority of our attention
because of how vital they are to someone just starting with Power BI.
The canvas in Power BI is where you construct your visualizations
using the software. Imagine the canvas as a slide in PowerPoint that
has all of your data. To explore your data and get new insights, this
section will guide you through the process of dragging and dropping
information into various types of visualizations. Here you'll also be
able to add images and text boxes, format the visuals, and perform a
variety of other tasks.
Importing data and then manipulating it through the use of Power
Query is the essence of data modeling. In Power BI, in contrast to
Microsoft Excel, for example, you do not edit individual data cells;
rather, you alter entire columns of data by making use of the
program's built-in functions, wizards, and formulas. Power Query
gives you the flexibility to construct individualized columns based on
the rules that you choose. It gives you the ability to combine
numerous data tables into one, as well as add values from one table
to another.
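Behind the scenes, Power Query records each of these steps in its formula language, M. The following is a minimal sketch of a query that adds a rule-based custom column; the file path and column names are invented for illustration:

```m
let
    // Load a hypothetical CSV file of orders
    Source = Csv.Document(
        File.Contents("C:\Data\Orders.csv"),
        [Delimiter = ",", Encoding = 65001]
    ),
    // Use the first row as column headers
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Add a custom column computed from two existing columns
    WithTotal = Table.AddColumn(
        Promoted,
        "Line Total",
        each Number.From([Quantity]) * Number.From([Unit Price]),
        type number
    )
in
    WithTotal
```

You rarely have to write this by hand; clicking through the Power Query Editor generates these steps for you, and the step list stays editable.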
Getting data from your sources is the starting point for everything in
Power Query, and Power Query supports a very large number of
different sources of data. Want to connect to a database? SQL
Server? Oracle? Teradata? Power Query has you covered. Want to
pull a table from an Excel worksheet? No issue. Comma-separated
values (CSV)? Easy. Cloud sources? Also not an issue.
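In Power Query's formula language, M, each of these sources is just a different first step. The following is a minimal sketch; the server, database, and table names are invented for illustration:

```m
let
    // Connect to a hypothetical SQL Server database
    Source = Sql.Database("myserver.example.com", "SalesDB"),
    // Navigate to one table in that database
    Orders = Source{[Schema = "dbo", Item = "Orders"]}[Data]
in
    Orders
```

Swapping the `Source` step for `Excel.Workbook`, `Csv.Document`, or a cloud connector changes where the data comes from, while the rest of the query stays the same.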

The Power BI Service


The Power BI Service, also known as Power BI Online and Power BI
Cloud, is a cloud-based business intelligence platform from
Microsoft. It extends the
capabilities of the Power BI Desktop program in the areas of data
management, sharing, and collaboration, and it works in tandem with
that application. The Power BI Service provides users with a diverse
selection of possible data connectivity paths. Users can connect to a
wide variety of data sources, such as databases, cloud services,
web services, and data sources located on the user's premises.
Direct connections to widely used platforms such as SQL Server,
Azure SQL Database, Excel, SharePoint, Salesforce, and Google
Analytics, amongst many others, are supported by the Power BI
Service. Users are granted the ability to build connections, import
data, and construct datasets for analysis and visualization.
In Microsoft's Power BI Service, dashboards offer a consolidated
view of significant metrics and key performance indicators (KPIs),
which are compiled from a variety of reports and datasets. Users can
"pin" visualizations from a variety of reports to a dashboard, thereby
producing a customized and interactive summary of pertinent
insights. Users can dynamically monitor and measure their company
indicators while using Power BI Dashboards because these
dashboards offer real-time data updates.
Users can generate and publish interactive reports with the help of
Power BI Service. Users can develop reports using Power BI
Desktop, and then publish those reports to the Power BI Service
where they can be shared and used. Through the use of a web
browser or a mobile application, users of the Service can read,
explore, and engage with reports. Users can apply filters to the
published reports, drill down into the data, and perform ad-hoc
analysis of the information. Real-time dashboards, in which the
visualizations are automatically updated whenever there is new data,
are another feature that can be utilized with Power BI Service.
The Power BI Service provides users the opportunity to update the
data contained in their reports and datasets. Users can plan
automatic data refreshes, which ensures that reports and
dashboards always display the most recent information available.
There are a few different types of refreshes available to choose from,
including incremental refresh, scheduled refresh, and on-demand
refresh. Several different data refresh scenarios are supported by
Power BI Service. These possibilities include direct query, imported
data, and live connectivity to data sources. The capabilities of the
Power BI Service in terms of collaboration and sharing are among its
most prominent advantages. Users can collaborate with colleagues
within and outside of their organization by sharing datasets,
dashboards, and reports. Some of the possibilities for sharing
include restricting access to viewing only, letting users explore the
data, and enabling collaborative capabilities like commenting and
sharing insights with others. The Power BI Service is equipped with
role-based access management, which makes it possible for users
to have the privileges necessary to view and engage with content
that has been provided.
The Content Packs and Apps that are available through the Power
BI Service give users pre-built templates, dashboards, and reports
that are adapted for use with particular data sources or industries.
Users are given the ability to import pre-built dashboards and reports
from well-known services such as Salesforce, Dynamics 365,
Google Analytics, and many others through the use of Content
Packs. Apps are collections of dashboards, reports, and datasets
that have been packaged together to make their distribution and
consumption more straightforward. Users have the option of
downloading apps from AppSource or developing their apps to
distribute to other users.
The Power BI Service offers features that are helpful in data
governance and security. Through the use of the Power BI Admin
Portal, organizations can monitor data usage, implement data
security standards, and manage user access and permissions.
Integration with Azure Active Directory (AAD)
is supported by Power BI Service. This integration makes single
sign-on (SSO) and centralized user management possible. Row-level
security (RLS) can be implemented to restrict data access based on
user roles or attributes.
The Power BI Service can interface without any complications with
the various services and platforms offered by Microsoft. Users will be
able to incorporate Power BI reports in SharePoint Online or Teams
as a result of its integration with Office 365. Power BI Service also
connects with Azure services such as Azure Data Factory, Azure
Machine Learning, and Azure Synapse Analytics. This integration
gives users the ability to take advantage of powerful analytical and
data engineering capabilities. Users now have the opportunity to set
goals thanks to a recently added feature in the Power BI service.
The objectives are monitored with the help of data from the Power BI
service. Relevant users can then be given access to information
regarding the goals to gain quick, actionable insight.
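Row-level security, mentioned above, is worth a concrete sketch. An RLS rule is a DAX filter expression defined on a table in a role; the table and column names below are hypothetical:

```dax
-- Static rule on a Sales table: members of this role see only West-region rows
[Region] = "West"

-- Dynamic rule on an assumed UserRegions mapping table:
-- each signed-in user sees only the rows tied to their own login
[UserEmail] = USERPRINCIPALNAME ()
```

In the dynamic pattern, a relationship from the mapping table to the fact tables propagates the filter, so one rule serves every user.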

The Power Platform


Users can build custom business apps, automate processes,
analyze data, and create chatbots with Microsoft's Power
Platform, a set of low-code and no-code technologies. Power Apps,
Power Automate, Power BI, and Power Virtual Agents are the four
primary components that make up this offering.
These individual elements, when combined, constitute a
complete answer to the problems of app development,
automation, data visualization, and client interaction. Let's
explore each component in more detail:

1. Power Apps: Power Apps is a platform that allows users
to construct bespoke software for corporate use without
having to write conventional computer code. It provides
users with a visual interface and pre-built templates to
make it possible for them to construct online and mobile
applications. Users can construct interactive applications
that can link to a variety of data sources, such as
SharePoint, Dynamics 365, Excel, and SQL Server. These
applications can automate operations, streamline
processes, and capture data. The responsive design is
supported by Power Apps, which enables applications to
adjust to a variety of screen sizes and devices.
2. Power Automate: This is a workflow automation tool that
was once known as Microsoft Flow. It enables users to
construct automated processes that span numerous
applications and services. Users are given the ability to
construct workflows by utilizing a visual designer to link
together a variety of actions and triggers. Users can
automate repetitive processes, synchronize data, and
trigger actions depending on certain events or situations
thanks to Power Automate's integration with a wide variety
of apps and services, including those developed by
Microsoft as well as those developed by third parties.
3. Power Virtual Agents: This is a chatbot development
platform that enables users to construct intelligent chatbots
without requiring them to have any prior knowledge or
experience with coding. It provides a visual interface for
creating and building chatbot dialogues, in which users can
define questions, replies, and dialog flows for the chatbots
they are building. The integration of Power Virtual Agents
with Microsoft's artificial intelligence services, such as
Azure Cognitive Services, expands the chatbot's
capabilities in the areas of natural language
comprehension and sentiment analysis. Chatbots created
using Power Virtual Agents can be deployed on a variety
of channels, including websites, Microsoft Teams, and
Facebook Messenger, among others.

The Power Platform's components are meant to be compatible with
one another and can be combined in a variety of ways to produce
comprehensive solutions. Power Apps, for instance, can be used to
build a custom application; Power Automate can automate the
workflows and processes within that application; Power BI can
visualize and analyze the data the application collects; and Power
Virtual Agents can provide a chatbot interface for customer
engagement and support. Users with no
experience in programming are given the ability to swiftly and
efficiently design and deploy bespoke solutions thanks to the low-
code/no-code approach used by the Power Platform. It makes it
possible for businesses to speed up the process of digital
transformation, automate operations that were previously done
manually, acquire insights from data, and provide customers with
engaging experiences.

How Did We Get to Power BI?


Microsoft's journey toward Power BI began with its recognition of
the ever-increasing significance of data analytics and business
intelligence in today's information era. Microsoft already had a
foothold in the market for
business intelligence thanks to an earlier product it had released
called Microsoft SQL Server Reporting Services (SSRS), which was
able to perform fundamental reporting tasks. Microsoft, however,
understood the need for a business intelligence tool that was more
comprehensive and user-friendly as a result of the introduction of
new technologies, higher data quantities, and the requirement for
more powerful analytical and visualization capabilities. As a direct
result of this, Microsoft decided to build Power BI as a component of
their bigger Power Platform strategy. The Power BI project was first
introduced to the public in 2013 in the form of a cloud-based service
that was referred to as Power BI for Office 365. It offered options for
self-service data exploration and visualization within Excel and
SharePoint. Microsoft has, over time, come to appreciate the
potential of Power BI as a product that can stand on its own and has
increased the capabilities of Power BI beyond those of Excel and
SharePoint. Microsoft introduced a stand-alone application known as
Power BI Desktop in the year 2015. This program gave users the
ability to connect to a variety of data sources, convert and model
data, as well as create beautiful visuals and reports. Because it
provides more complex capabilities and functionalities, Power BI
Desktop has emerged as the key tool for data modeling and the
development of reports.
In addition to Power BI Desktop, Microsoft has also released Power
BI service, which was formerly known as Power BI Online or Power
BI Cloud. Power BI service is a cloud-based platform that allows for
the sharing of Power BI content, as well as collaboration and
management of that content. The Power BI service offers a variety of
capabilities, including data refreshment, dashboards, content packs,
the ability to collaborate with others, and sharing options. Microsoft
has continued to improve and expand Power BI's capabilities, taking
into account input from users as well as developments in the
industry. There have been consistent upgrades as well as the
addition of brand-new capabilities. Some examples of these
additions include support for natural language queries, sophisticated
analytics using R and Python, paginated reports, and AI integration.
Mobile apps have also been developed by Microsoft for use with
Power BI. These apps enable users to access and interact with
reports and dashboards on their mobile devices, such as
smartphones and tablets.
Because of its user-friendly interface, wide data connectivity options,
comprehensive visualization capabilities, and interaction with the
larger Microsoft ecosystem, Power BI has emerged as one of the
most powerful business intelligence tools available on the market
today. It has become a comprehensive platform for data analysis,
visualization, and decision-making as it has acquired significant
acceptance across many different industries. The development of
Power BI is the outcome of Microsoft's commitment to providing
users with effective and easily available tools for data analytics. This
commitment was made in light of the growing significance of data-
driven insights in the contemporary business environment.
This section provides valuable context by answering three questions:
why was Power BI developed, why is it important, and which products
are interrelated? Having this information at the outset helps in the
same way that researching a firm before a job interview does, and it
will continue to serve you in the years to come.
SQL Server: Microsoft’s Relational Database
Microsoft first released SQL Server, its relational database
management system, in 1989. Early versions focused on standard
database management functions: storing and managing data and
transactions, and processing queries. As the product matured,
Microsoft added a steady stream of features to improve its overall
performance, scalability, and security.

SQL Server Analysis Services Multidimensional: One Small Step into BI
New approaches to processing data, such as data cubes, gained
popularity as computer processing power increased. Microsoft
published the initial version of its online analytical processing
(OLAP) engine in 1998 under the name OLAP Services; the product
would later be renamed SQL Server Analysis Services. OLAP Services
offered a cube-based way of interacting with data for analysis, and
for well over a decade the cube model was the standard in many
enterprise business intelligence (BI) environments.
SQL Server Reporting Services: Pixel-Perfect
Reporting, Automated Reports, and More
Eventually, Microsoft needed to add a pixel-perfect reporting option
to SQL Server. As the number of data use cases grew, so did the need
for reusable assets that adhered to very stringent formatting
guidelines; for instance, every invoice you print must look exactly
like the one before it. The first edition of Microsoft SQL Server
Reporting Services was published in 2004 as an add-on to SQL Server
2000, and the second version followed in 2005 alongside SQL Server
2005. In enterprise deployments, SQL Server Reporting Services
offered several helpful features: pixel-perfect report generation,
automated report distribution, and, in many deployments, a user
interface through which end users could issue queries against the
backend SQL Server database.
Excel: A Self-Service BI Tool
The history of Microsoft's self-service business intelligence offerings
can be boiled down to one basic product, Microsoft Excel, which
almost everyone has encountered or interacted with at some point.
In 1985, Microsoft released the very first version of Excel for the
Macintosh. Excel is a program that, at its most fundamental level,
gives users the ability to take data, pull it into a "flat" extract, and
then alter the data or perform ad hoc calculations on it as required.
Excel gives users the ability to gain insights from their data by
empowering them to analyze it. This is the fundamental idea behind
the concept of self-service business intelligence. For a very long
time, Microsoft Excel has been an indispensable tool for
professionals working in a wide variety of fields who are engaged in
data analysis and business intelligence. Excel has become a go-to
piece of software for jobs ranging from straightforward calculations to
in-depth data modeling as a result of its flexibility, user-friendliness,
and powerful capabilities. Excel has progressed into much more than
merely a program for creating spreadsheets as a result of the vast
range of functions and capabilities it possesses. It has evolved into
self-service business intelligence (BI) tool, which grants users the
power to explore and examine data in an approach that is both user-
friendly and effective.
Here are some key aspects of Excel as a self-service BI tool:

1. Data Import and Transformation: Excel users can import
data from a wide variety of sources, such as databases,
text files, web services, and more. Excel also enables
users to transform the data after it has been imported. It
simplifies the process of retrieving data by providing user-
friendly wizards and connections to work with the data.
Utilizing Excel's functions, formulae, and Power Query,
users can conduct operations such as data purification,
transformation, and shaping within the spreadsheet
program.
2. Data Modeling: Excel's data modeling capabilities enable
users to properly organize and structure their data. Users can create
calculated columns and measures, form links between
tables, and construct hierarchies. Using the DAX (Data
Analysis Expressions) programming language, users of
Excel's Power Pivot function can construct data models
that can manage massive amounts of data and carry out
complex calculations.
3. Visualization and Analysis: Excel provides users with a
broad variety of options for visually representing data, and
the program also includes an analysis module. Excel users
can produce interactive dashboards, pivot tables, charts,
and graphs. It offers a variety of formatting options,
including conditional formatting and charting capabilities, to
improve the data's overall aesthetic appeal. Excel's built-in
formulae and functions make it possible for users to carry
out complicated calculations, explore trends, and compile
data.
4. Advanced Analytics: Excel includes several advanced
analytics tools that enable users to carry out statistical
analysis, predictive modeling, and what-if scenario
simulations. Users can draw on built-in functions and tools
such as Solver (for optimization problems), the Data Analysis
Toolpak (for statistical analysis), and Power View (for
interactive data exploration). Excel can also be interfaced
with programming languages such as R and Python to extend
its advanced analytics capabilities.
5. Sharing and Collaboration: Excel includes capabilities
that allow users to share and collaborate on data and
results. These functions can be accessed through the File
tab. Excel files can be easily shared between users
through email, cloud storage platforms, or SharePoint. A
shared Excel worksheet allows for simultaneous
collaboration from multiple users, each of whom can make
adjustments in real-time. Users can keep track of changes
and revert to earlier versions because of Excel's built-in
version control. Excel also connects with Microsoft Teams
and Power BI, which makes it possible for teams to
seamlessly collaborate and share information.
6. Integration with External Data Sources: Excel integrates
with a wide variety of external data sources, including SQL Server,
Oracle, SharePoint lists, Azure services, and many others.
Excel users can make connections to the aforementioned
data sources, create live connections, or import data into
Excel to do analysis and generate reports. Users can
make use of Excel's well-known interface while
simultaneously accessing and analyzing data from a
variety of sources thanks to this integration feature.
7. Extensibility with Add-Ins: Excel's capabilities can be
expanded via add-ins and extensions. Users can download
and install specialist add-ins for specific data analysis
purposes, such as data visualization, data mining, or
financial modeling. These add-ins give users access to
additional tools and features to meet their self-service
business intelligence needs.

Because of its broad use and familiarity, Excel is frequently selected
as a self-service business intelligence tool. With its large feature
set, data manipulation capabilities, visualization options, and
integration options, it is a flexible tool that individuals and small
teams can use to analyze data and generate insights without heavy
programming or expert business intelligence skills.

Power Pivot
Microsoft first made PowerPivot available in 2010. The name was later
changed to two words, Power Pivot, to conform to the naming
conventions of the other products in the new Power BI suite of tools.
Power Pivot was initially developed as an add-in for Excel. It
enabled end users to collect data from a wide variety of sources and
store that data in an in-memory tabular model within the workbook.
Power Query, a companion tool, handles extract, transform, and load
(ETL) work and enables data manipulation through the M programming
language. Today, Power Pivot is a powerful data modeling and analysis
feature available in both Microsoft Excel and Power BI. It enables
users to create advanced data models by importing and linking large
volumes of data from a variety of sources, and to perform complex
calculations and analyses using the Data Analysis Expressions (DAX)
language.
The following are some of the most important characteristics of
Power Pivot:

1. Data Import and Linking: Users of Power Pivot can
import data from a variety of sources, including relational
databases, text files, Excel tables, SharePoint lists, and
more. Users can build links to different data sources and
import significant amounts of data into the Power Pivot
data model. To improve efficiency and be able to deal with
massive data scenarios, Power Pivot supports various
data compression techniques.
2. Data modeling and Relationships: Power Pivot gives
users the ability to build relationships between tables
based on fields that are shared between the tables. These
connections lay the groundwork for the construction of
solid data models. Users can construct associations that
can either be one-to-one, one-to-many, or many-to-many,
which enables quick data exploration and analysis. In
addition, Power Pivot allows for the construction of
calculated columns, measures, and hierarchies, which can
be used for the modification and aggregation of data.
3. Data Analysis Expressions: DAX is a formula language
that is utilized in Power Pivot to develop individualized
calculations and measures. Users can do advanced
calculations, create custom aggregations, and generate
calculated columns because of the comprehensive
collection of functions and operators that are provided by
this platform. Calculations that are based on connected
tables, time intelligence, filtering, and other features can be
created with the help of DAX formulae.
4. Advanced Data Analysis: Power Pivot allows users to
execute complex data analysis and modeling within Excel
or Power BI. Users can design interactive dashboards,
create pivot tables, and produce reports using the data
model created in Power Pivot. Users of Power Pivot are
given the capacity to explore and analyze data from
various dimensions, discover trends, and acquire useful
insights as a result of the software's efficient ability to
manage enormous volumes of data.
5. Integration with Power Query and Power View:
Power Pivot integrates smoothly with the other Power BI
components. Power Query gives users the
ability to reformat and shape data before importing it into
Power Pivot, which significantly improves the capabilities
of data cleansing and transformation. Users can generate
interactive data visualizations and reports with the help of
Power View, which is based directly on the Power Pivot
data model.
6. Collaboration & Sharing: Power Pivot models can be
shared and collaborated on within Excel workbooks or
published to the Power BI service for broader sharing and
collaboration. Multiple users can work on the same model
simultaneously, making adjustments in real-
time. In addition, Power Pivot models can be used as data
sources for other Excel workbooks or Power BI reports,
making it possible to reuse data and maintain its integrity
across a variety of different types of analyses.
7. Scalability and Performance: Power Pivot makes use of
in-memory technology to store and process data within the
Excel or Power BI file. This enables Power Pivot to store
and handle large amounts of data. Even when working
with big datasets, this makes it possible to do calculations
and query response times more quickly. The compression
methods utilized by Power Pivot result in an optimization of
memory utilization, which enables users to work with huge
datasets without a reduction in performance.
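
To make the modeling concepts above concrete, here is a short, hypothetical sketch of the kind of DAX that items 2 and 3 describe. The table and column names (Sales, Product, and so on) are invented for illustration and are not part of any real model:

```dax
-- Calculated column: evaluated row by row and stored in the model at load time
Sales[Line Total] = Sales[Quantity] * Sales[Unit Price]

-- Measure: evaluated at query time, in the current filter context
Total Sales := SUM ( Sales[Line Total] )

-- Measure that relies on a relationship between Sales and Product
Accessory Sales :=
CALCULATE ( [Total Sales], Product[Category] = "Accessories" )
```

Dropped into a PivotTable or a Power BI visual, [Total Sales] recomputes for every row, column, and filter combination the user selects.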

Important New Functionality That Leads to Power BI
The tabular model is a feature Microsoft introduced as part of
Analysis Services in SQL Server 2012. Instead of a cube structure,
which becomes harder to manage over time and tends to confuse end
users, Analysis Services could now organize data as tables and
relationships, much like a traditional data warehouse. To make this
tabular paradigm fast, Microsoft built its first columnar
(column-based) data store technology. This would evolve into what we
know today as VertiPaq, the in-memory columnar data store behind the
tabular architecture of Analysis Services. As a result of these
enhancements, performance became extremely quick.
In parallel, a new formula language known as DAX was developed to
support these tabular models. DAX enabled calculations to be
performed across columns of data, helping turn that data into
something actionable. This engine served as the foundation for the
next edition of Power Pivot, which was introduced alongside Excel
2013.

Power BI Desktop is born


The first publicly available version of Power BI Desktop was released
on July 24, 2015. It included a complete enterprise-level semantic
modeling tool, with the VertiPaq engine and the DAX formula language.
A semantic model is designed to be understood by people. Power BI
Desktop used Power Query to obtain data from a broad number of
sources and pull it into the engine, and it enabled transformations
that could prepare the data for subsequent analysis.

Power BI Desktop under the Hood


Power BI Desktop owes much of its success to the two robust engines
it conceals under the hood. On a technical level, these are the
things that make the whole product operate. The formula engine
accepts data requests, processes them, and produces an execution plan
for each query. The storage engine stores the data that makes up the
data model and retrieves the data the formula engine needs to satisfy
a query.
To look at it from a different angle, you could consider the formula
engine to be the brain. It analyzes the situation and determines the
most effective strategy for dealing with it, then communicates this
strategy to the relevant areas of the body so that they can carry it
out. The body that is responsible for receiving those commands and
carrying out the work necessary to bring together all of the data is
known as the storage engine.

VertiPaq: The Storage Engine


In Power BI Desktop, the in-memory storage engine known as
VertiPaq is responsible for optimizing data storage and retrieval to
facilitate effective analysis. It is a columnar database engine that
was developed especially for the demands associated with analytical
work.
The following is a list of important aspects of VertiPaq:

1. Columnar Storage: VertiPaq saves data in a columnar
format, storing each column of a table in its own distinct
location. Columnar storage offers several benefits for
analytical processing: it allows faster data retrieval for
particular columns, improves compression rates, and
minimizes the amount of RAM used. It also enables efficient
operations such as aggregation and filtering, which are
frequently used in analytical queries.
2. In-Memory Technology: VertiPaq keeps data in RAM
(Random Access Memory) rather than retrieving it from disk
every time it is required. Because reading from RAM is far
faster than reading from disk, in-memory storage offers
considerable performance benefits. It enables real-time data
analysis and speedier query response times, which in turn
make interactive data exploration and visualization
possible.
3. Compression: VertiPaq implements powerful compression
techniques to optimize memory utilization. To shrink the
data, it uses several encoding and compression techniques,
including dictionary encoding and run-length encoding.
These methods were developed expressly for columnar
storage, and they exploit the similarity and repetition of
values within a column. As a result, VertiPaq can manage
massive amounts of data while using memory efficiently.
4. Query Execution: VertiPaq's columnar storage and
in-memory design make it possible to execute queries
quickly and effectively. While a query is being executed,
VertiPaq accesses only the columns the calculation
requires, keeping data movement and disk I/O to a minimum.
It performs filtering, aggregation, and joins at the column
level, taking advantage of the columnar structure to
process data as efficiently as possible. Together, these
characteristics allow much faster interactive analysis and
a significant improvement in query performance.
5. Data Compression and Encoding: VertiPaq combines
compression and encoding strategies to further improve
performance and reduce its memory footprint. It encodes
data values so that they take up less space. One such
method is dictionary encoding, in which each distinct value
is stored once in a dictionary and its occurrences in the
data are replaced with the corresponding dictionary IDs.
This improves compression ratios further and speeds up data
retrieval considerably.
6. Advanced Calculations: In addition to storing data,
VertiPaq supports advanced calculations expressed in the
Data Analysis Expressions (DAX) language. Using DAX, users
can define calculated columns, measures, and sophisticated
calculations within the data model. These calculations are
carried out efficiently on the fly, which enables
interactive analysis and real-time insights.
The columnar storage, in-memory technology, and powerful
compression techniques that VertiPaq possesses make it an
essential component of Power BI Desktop's ability to operate
efficiently and scale effectively. Users are given the ability to
successfully explore and display their data by utilizing Power BI
Desktop's quick data retrieval, interactive analysis, and efficient
memory use, all of which are made possible through the exploitation
of these features.
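
To make the dictionary and run-length encoding ideas above concrete, here is a toy Python sketch of the two techniques applied to a single column. It is illustrative only, not VertiPaq's actual implementation, which is proprietary and far more sophisticated:

```python
def dictionary_encode(column):
    """Map each distinct value to a small integer ID (dictionary encoding)."""
    dictionary = {}
    ids = []
    for value in column:
        if value not in dictionary:
            dictionary[value] = len(dictionary)
        ids.append(dictionary[value])
    return dictionary, ids

def run_length_encode(ids):
    """Collapse consecutive repeats into (id, run_length) pairs."""
    runs = []
    for id_ in ids:
        if runs and runs[-1][0] == id_:
            runs[-1] = (id_, runs[-1][1] + 1)
        else:
            runs.append((id_, 1))
    return runs

column = ["Red", "Red", "Red", "Blue", "Blue", "Red", "Green", "Green"]
dictionary, ids = dictionary_encode(column)
runs = run_length_encode(ids)

print(dictionary)  # {'Red': 0, 'Blue': 1, 'Green': 2}
print(ids)         # [0, 0, 0, 1, 1, 0, 2, 2]
print(runs)        # [(0, 3), (1, 2), (0, 1), (2, 2)]
```

Eight string values shrink to a three-entry dictionary plus four short runs; on real columns with millions of rows and few distinct values, the savings are dramatic.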

DAX: The Formula Engine


DAX, which stands for Data Analysis Expressions, is the formula
language and calculation engine used in Power BI Desktop to specify
calculations, construct custom measures, and carry out advanced data
analysis. Within the context of the VertiPaq in-memory storage
engine, it fulfills the role of the formula engine. DAX was developed
primarily for analytical and business intelligence contexts. It
offers a comprehensive library of functions and operators for
building expressions and calculations that manipulate and examine
data. DAX formulas can produce calculated columns, measures, tables,
and more complicated calculations, all while taking into account the
relationships present within the data model.
With DAX, users can define calculated columns within the tables they
are working with. A calculated column produces its values from an
expression or formula the user defines; the resulting column becomes
part of the data model and can be used for further analysis and
calculations. Calculated columns are computed during the data loading
procedure and stored in the VertiPaq engine. Measures are equally
central to DAX. A measure is a calculation that aggregates data
according to specified needs, such as a sum, average, count, or a
more complicated aggregation. Users define measures with functions
and expressions that operate on the columns of the data model. Unlike
calculated columns, measures are computed dynamically at query time,
drawing on the VertiPaq engine to perform the calculations
efficiently.
A robust feature of DAX is its context awareness, commonly known as
evaluation context. Because of this capability, DAX formulas
automatically adapt their calculations to the environment in which
they are evaluated, which includes active filters and slicers as well
as the structure of the data model. This enables dynamic, adaptable
calculations based on the user's interactions and selections,
producing results that are correct and relevant to the situation. DAX
also supports a wide variety of complex calculation and analysis
methods. It offers time intelligence capabilities, such as computing
year-to-date totals, comparing data across multiple periods, and
managing rolling averages.
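
As a sketch of what those time-intelligence capabilities look like in practice, here are a few hypothetical measures. They assume a model with a [Total Sales] measure and a marked date table named 'Date', both invented for illustration:

```dax
-- Year-to-date total for the current filter context
Sales YTD := TOTALYTD ( [Total Sales], 'Date'[Date] )

-- The same measure shifted back one year
Sales PY := CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )

-- Year-over-year growth; DIVIDE guards against division by zero
Sales YoY % := DIVIDE ( [Total Sales] - [Sales PY], [Sales PY] )
```

Because each measure inherits the evaluation context of the visual it appears in, the same three formulas work at the year, quarter, month, or day level without modification.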
The DAX language also supports calculations based on hierarchies,
statistical functions, text functions, and logical functions, letting
users carry out difficult calculations and develop advanced analysis
scenarios within Power BI Desktop. DAX integrates effortlessly with
Power BI Desktop's visuals: users can establish calculations in DAX
and then represent them in charts, tables, and other visual elements.
Because these calculations react to the user's interactions with the
report, filters, and slicers, it is possible to create dynamic,
interactive visualizations that reflect the outcomes of the
calculations.
As the formula engine within Power BI Desktop, DAX is an extremely
important component of both data analysis and data manipulation. It
gives users the ability to establish sophisticated calculations,
develop unique measures, and carry out advanced analysis within their
data models. DAX leverages the strength of the VertiPaq engine to
enable rapid, efficient calculations, which provides users with
accurate insights and makes it easier to engage with and explore
data.
What Makes Power BI Different from Its
Competitors?
Imagine that you are searching through the huge landscape of
business intelligence solutions to find the one that will revolutionize
the way that you analyze and visualize the data you collect. Among
the available options, Power BI stands out as the most noteworthy
option because it provides a unique collection of features and
capabilities that distinguish it from its rivals. When you first
start using Power BI, one of the things that immediately stands out
is its seamless connectivity with the rest of the Microsoft
ecosystem. If you already use applications like Excel, SQL Server,
Azure, SharePoint, or Teams, you'll find that integrating Power BI
into your current workflow is a breeze, because it is designed to
operate with these platforms. This connectivity lets you work within
a Microsoft environment you already know, which simplifies the data
analysis process.
The intuitive design of Power BI's user interface immediately grabs
your attention and holds it from the moment you begin exploring. It
adheres to Microsoft's design principles, which results in an interface
that is natural and simple to navigate. You will find that you can
easily create attractive reports and dashboards with the help of its
drag-and-drop capabilities, pre-built templates, and interactive
visualizations; all of this is accomplished without the need for
extensive technical skills. Power BI's exceptional connection to a
wide variety of data sources is what differentiates it from other
business intelligence tools. Power BI provides you with an
abundance of alternatives to connect and integrate your data,
regardless of the location of that data. Your data could be stored in
databases, cloud-based platforms, files, or internet services.
Because of this versatility, you can compile information from a
variety of sources, giving you a full perspective of your data and
allowing you to make more informed decisions.
With the data transformation options at your disposal in Power BI,
you can mold and shape your data however you see fit.
You will have an easier time cleaning, transforming, and enriching
your data when you make use of Power Query. You can connect to a
wide variety of data sources, carry out the necessary
transformations, and import the refined data into Power BI thanks to
the user-friendly interface. This will save you a significant amount of
time and work as well as ensure that your data is in the appropriate
format for analysis and reporting. You will find that as you progress
further in your data analysis journey, Power BI's advanced data
modeling and analysis features reveal their full potential. You can
develop complex data models with the help of Power Pivot, an
in-memory data modeling engine; these models can include
relationships, calculations, hierarchies, and measures. The
integration with the VertiPaq engine provides efficient processing of
massive datasets, enabling you to execute sophisticated calculations
and derive deeper insights from your data. The extensive sharing and
collaboration options offered
by Power BI make the process of working together a snap. You can
publish your reports and dashboards to the Power BI service, where
they can be safely shared with coworkers both inside and outside of
your organization.
Your team will be able to make data-driven decisions jointly, which
will increase your overall analytical capabilities, foster
collaboration, stimulate knowledge sharing, and empower team members.
Your road toward better data analysis is also made easier by Power
BI's dedication to enterprise-level security and governance.
Features such as powerful encryption, access controls, and data
loss prevention work together to keep your data secure and intact at
all times. You can implement security standards, limit user access,
and regulate data sharing, all of which ensure compliance with
regulatory requirements and provide you with peace of mind. When
you think back on your time spent using Power BI, you realize that
its enterprise-grade security and governance, seamless integration,
user-friendly interface, connectivity to a wide variety of data
sources, powerful data transformation capabilities, advanced modeling
and analysis features, and collaboration and sharing capabilities
combine to make it an exceptional business intelligence tool.
You and your team will be able to make decisions that are based on
data with confidence and clarity if you use Power BI because it gives
you the power to unleash the full potential of your data.

Conclusion
The journey through various aspects of Power BI and related
technologies has shed light on the remarkable features and
capabilities that make Power BI a standout choice in the realm of
business intelligence and data visualization. From its integration with
the Microsoft ecosystem to its user-friendly interface, Power BI offers
a seamless and familiar experience for users.
Its broad connectivity options enable users to bring data from diverse
sources, empowering comprehensive analysis and reporting. When
considering the evolution of Power BI, it is evident that it has been
shaped by Microsoft's commitment to providing powerful tools for
data analysis and visualization. From its origins in SQL Server and
Excel to the development of Power Pivot and Power Query,
Microsoft has continually refined and expanded the capabilities of
Power BI to meet the evolving needs of users.
Overall, Power BI offers a comprehensive and powerful solution for
organizations seeking to unlock the value of their data. Its integration
with the Microsoft ecosystem, user-friendly interface, extensive
connectivity options, advanced modeling and analysis capabilities,
collaboration features, and enterprise-grade security combine to
provide a unique and compelling offering in the business intelligence
landscape. With Power BI, organizations can derive meaningful
insights, make informed decisions, and drive success in today's
data-driven world.
CHAPTER 2
THE REPORT AND DATA VIEWS
If you have any experience with Power BI, you are aware of how
simple it is to generate reports that provide various views and
insights into the data you have. Power BI Desktop offers an even
more extensive set of functionality. Using Power BI
Desktop, you can design sophisticated queries, combine data from
many different sources, establish relationships across tables, and do
much more.
Power BI Desktop includes a "Report view," in which users can
generate an unlimited number of report pages that include visuals.
The editing view of the report in the Power BI service is quite similar
to the experience provided by the report view in the Power BI
Desktop application. You can relocate visualizations, copy and paste
them, combine them, and do many other operations. You will be able
to work with your queries and model your data using Power BI
Desktop. This will allow you to ensure that your data will offer the
greatest insights possible in your reports. After that, you will have the
option to store your Power BI Desktop file either on your local disk or
in the cloud, depending on your preference.
When you load data for the first time in Power BI Desktop, the
Report view will appear before you. It will include a blank canvas and
links that will assist you in adding data to your report.
Use the ribbon in Power BI Desktop
The ribbon is designed to make the experience of using Power BI
Desktop, like other Microsoft applications, simple and familiar.
Its benefits can be grouped into the following categories:
● Search bar - The ribbon has a search experience that is
comparable to the search that is offered in other Microsoft
applications. Power BI will provide recommendations for
activities to take depending on the current condition of your
report whenever you pick the search box. While you are typing,
the search results will automatically change, and buttons will
appear to give assistance or guide you to the next step.
● Improved look, feel, and organization - Icons and
functionality in the updated Power BI Desktop ribbon are
aligned to the look, feel, and organization of ribbon items found
in Office applications.

● An intuitive Themes gallery: The Themes gallery, which can
be seen in the View ribbon, has a similar appearance and feel
to the themes gallery in PowerPoint. If you make changes to
the theme and apply them to your report, the graphics on the
ribbon will show you what those changes will look like,
including the color combinations and typefaces.

● Dynamic ribbon content based on your view: In the
previous version of Power BI Desktop's ribbon, icons or actions
that were unavailable were grayed out, which resulted in a
less-than-ideal user experience. The new ribbon displays icons
dynamically and in an ordered fashion, so you will always be
aware of the options available to you based on the context.
● A single-line ribbon, when collapsed, saves you space.
Another advantage of the updated ribbon is the option to
collapse it into a single line, dynamically displaying ribbon
components depending on your context.

● Keytips to traverse and choose buttons: You can enable
keytips by pressing the Alt key, which will let you navigate
the ribbon more easily. Once the mode has been engaged, you
can navigate by pressing the keys that are shown on your
keyboard.

● Custom format strings - You can set custom format strings
not only in the Properties pane but also in the ribbon.
Depending on whether you select a measure or a column, a
contextual tab labeled Measure tools or Column tools will
appear. On that tab, you can simply type your format string
into the selection box to apply your custom format.
● Accessibility - The title bar, the ribbon, and the file menu may
all be accessed without any difficulty. To get to the ribbon area,
press Ctrl and F6 on your keyboard. You can go between the
top and bottom bars by using the Tab key, and you can use the
arrow keys to navigate between the items.
In addition to those visible changes, an updated ribbon also
allows for future updates to Power BI Desktop, and its ribbon,
such as the following:
● The creation of more flexible and intuitive controls in the
ribbon, such as the visuals gallery
● The addition of black and dark gray Office themes to the
Power BI Desktop
● An improved accessibility
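To make the custom format strings feature mentioned above concrete, here are a few hypothetical format strings of the kind you might type into the format string box on the Measure tools or Column tools tab; the values and strings below are illustrative examples, not taken from any particular report:

```dax
-- Hypothetical custom format strings and how a sample value displays.
-- Power BI uses VBA-style format string syntax for these.
--   $#,##0       →  12345      displays as  $12,345
--   0.0%         →  0.1234     displays as  12.3%
--   #,0,, "M"    →  12300000   displays as  12 M
```

In the last string, each trailing comma after the digit placeholders scales the value by a factor of one thousand, which is how it displays millions.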
Report View: Home Section of the Ribbon
When you first launch Power BI, you will be brought to the Report
view by default. At the very top, you'll see a traditional ribbon
interface that lets you search for different tasks you may want to
perform. You'll find the pane section of the user interface over on
the right; this section is highly reminiscent of Microsoft's work on
the first iteration of the Xbox 360 user interface. The View section
of the ribbon gives you the ability to display many panes
simultaneously, and you can also minimize panes that are presently
visible, such as the Filters and Fields panes in the picture below,
which shows the Visualizations pane open.

You can find the options for the Report Page Navigation towards
the bottom of the page. Much as you would move between worksheets in
Excel, you can navigate through the pages of a report, either by
clicking on specific report pages or by using the arrows to scroll
through the list of pages, depending on how long your report is. The
view selector, which consists of three icons, can be found on the
left side of the screen: Report, Data, and Model, in order from top
to bottom. Each view has its own version of the ribbon menu. The
Home tab is the starting point for the default view when using the
ribbon interface, as can be seen in the figure below. The Home tab
goes by several names, including Home view, Home ribbon, and Home
section. From this point on, I'll refer to the ribbon's primary
parts (Home, Insert, and Modeling) as tabs.

Every one of the tabs, or primary parts of the ribbon, is broken up
into subsections that are denoted by a slender vertical line. On the
row at the bottom of the ribbon, you will find the names of the
subsections. The subsections shown in the picture above are
Clipboard, Data, Queries, Insert, Calculations, Sensitivity, and
Share.

The Clipboard Subsection


The Clipboard is a helpful part of Power BI that enables users to
manage and organize copied items, such as visuals, measures, or
queries, while working inside the Power BI Desktop program. It
functions as a temporary storage area in which you can save things
and reuse them across a variety of reports and pages.
When you copy an item in Power BI, such as a visual, a measure, or
a query, that object is saved to the Clipboard. You can get to the
Clipboard by going to the "Home" tab in the Power BI Desktop
ribbon and clicking the "Clipboard" button there. The content you
have copied will be shown in a window, called the Clipboard, that
appears on the right side of the screen.
The Clipboard provides several helpful functions:
1. Cutting and Pasting: The Clipboard's principal function is
to let you cut objects from one report or page in Power BI
and paste them elsewhere inside the application. This
eliminates the need to recreate visualizations, measures, or
queries from scratch, allowing effective reuse of these
elements.
2. Organizing Copied Items: The Clipboard gives you the
ability to create folders and subfolders, which enables you
to arrange the objects you have copied in a hierarchical
structure. You can simplify the process of locating and
managing your items later by categorizing them according to
their type or any other relevant criteria.
3. Removing and Clearing Items: If there are objects in the
Clipboard that you no longer need, you can either delete
them one by one or clear out the whole Clipboard to start
again with a clean slate. This contributes to the Clipboard
being less cluttered and more structured overall.
4. Exporting and Importing: You can also export and import
the contents of the clipboard using the appropriate buttons.
This feature makes it easier to collaborate with others and
automates the process of sharing reusable components by
allowing you to export the objects on the Clipboard to a
file, from which they can be imported into another instance
of Power BI Desktop or shared with coworkers.

Users who routinely work on many reports or who need to reuse
components across a variety of projects will find the Clipboard to be
a very useful tool. It streamlines the process of copying and pasting
objects, assists with organizing and managing copied content, and
fosters cooperation by facilitating the sharing of copied items.

The Data Subsection


In the next subsection, titled "Data," you will find multiple quick
connection options that allow you to rapidly connect to various
data sources.
You will see a button labeled "Get data" on the left side of the
screen. This button consists of an icon and an arrow that points
down. If you choose the icon, Power BI will present a new menu
with the whole catalog of available data connections.

If you choose the "drop-down" option, you will be presented with a
shortened and more concise list of the data sources that are most
often used. When you don't need access to the whole list of
connections, this can be a very useful feature.
When you click the "Excel workbook" button, an Explorer window
will instantly appear so that you can go to your Excel file. When you
pick the "Power BI datasets" option, a box will pop up allowing you
to choose a dataset that has already been published in the Power BI
service so that you can connect to it via a "live connection."

When you choose the SQL Server button from the toolbar, a pop-up
box will display, allowing you to input the server's name or location,
as well as an optional database name. When you attempt to connect
to the server, you will be prompted for your credentials.
You will also have the option to pick DirectQuery or Import when you
get to this page. When you switch to the Import mode, the data will
be downloaded into your local data model. When the DirectQuery
option is selected, Power BI will construct queries to run against the
database and then, after the results of those queries are obtained, it
will execute whatever processing is necessary with the data.
There is also a button labeled "Advanced options". Clicking it
gives you the ability to specify a command timeout in minutes, pass
a custom SQL statement, include relationship columns, navigate
using full hierarchies, and enable SQL Server failover support.

When you click the "Enter data" button, you will be presented with
an interface that, if you are acquainted with Excel, should feel
rather natural. The button creates a structure that resembles a
table, into which you can insert columns, give them names, and
enter data into cells. It is essential to note that this user
interface does not include a feature for formulas; it's only for
entering basic data. You can also copy and paste information into
this window, but you should exercise caution before going
copy-paste crazy. This window can be helpful for testing, or if
you want to construct a lookup table for your model. However,
because this kind of table structure can only store a maximum of
3,000 cells' worth of data and requires manual management, it is
not a good idea to put big quantities of data in it. I strongly
advise against doing so.

When you click the Dataverse button, a pop-up window will prompt
you to input the necessary information for your environment domain.
If you have experience with Microsoft Dynamics, you have probably
come across this concept in the past as the Common Data Model.
When you click the "Recent sources" option, a drop-down menu
containing your most recent sources will appear, so that if you
ever need to connect to the same data source again, it will be
ready and waiting for you. You can get an even more comprehensive
list of recent data sources by selecting "More" from the drop-down
menu.

The Queries Subsection


The Queries Subsection of the Power BI Desktop application is where
you manage and alter your data using the Power Query Editor. Before
loading data into your Power BI model, you can connect to a variety
of data sources, apply data transformations, and shape your data
using this feature.
The following actions need to be taken to access the Queries
Subsection in Power BI Desktop:
1. Open Power BI Desktop.
2. Simply navigate your mouse to the "Home" option on the
ribbon that is located at the very top of the window.
3. You will locate the "Transform data" button in the
"Queries" group under the "Home" tab of the ribbon.
Click on it.

You can do two different things with the "Transform data" option
depending on whether you select the icon or the drop-down arrow. If
you click the icon, you will be taken directly to the Power Query
interface, where an intuitive user interface lets you carry out a
variety of data manipulation operations, including filtering,
sorting, merging, appending, and transforming data.
If you click on the arrow that points downward, a drop-down menu
with several different options will appear. When you click
"Data source settings," a pop-up window will appear that allows
you to alter various parameters, such as credentials or the locations
of files and other similar options. If you have parameters and
variables, you can also change them in this section.

The Insert Subsection


The Insert tab comes next, and it includes three different options:
you can add a new visual to your report page, add a text box, or
add more visuals to your collection in the Visualizations pane. If
you choose "New visual," a new visual will be added to the canvas
you are working on; the stacked column chart is the default visual
placed on the canvas. When you click the "Text box" button, a text
box will be added to your canvas in the same manner as it would
appear in PowerPoint.
After clicking the "More visuals" option, the page will appear as
seen in the following picture. If you choose "From my files," you'll
be sent to an Explorer window where you can pick a visual to add
from a file that's already stored on your local system. If you
choose "From a URL," you'll be taken to a box where you can enter a
URL. The standard format for these files is .pbiviz; however, in
recent years Microsoft has been putting a lot of emphasis on
AppSource as the recommended way to get custom visuals.

The Calculation Subsection


Under Calculations, you have the "New measure" and "Quick measure"
options. Measures are calculations performed over your data using
DAX. If you choose "New measure," a formula bar appears below the
ribbon; this is where you enter the DAX for your measure, and it
operates similarly to the formula bar available in Excel.
When you click the "Quick measure" button, a pop-up window appears
that guides you through creating a measure using a wizard with
various prepared calculations. A useful hint: Microsoft adds new
quick measures in certain releases, so it is always worthwhile to
check back after an update to see whether new quick measures have
been included.
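As a minimal, hypothetical sketch of what a measure entered in the formula bar looks like (the Sales table, Amount column, and Date table are assumed names for illustration only, not part of any specific model):

```dax
-- A simple measure that sums a column:
Total Sales = SUM ( Sales[Amount] )

-- A measure similar to what a quick measure might generate,
-- comparing the measure above against the same period last year:
Sales Last Year =
CALCULATE ( [Total Sales], SAMEPERIODLASTYEAR ( 'Date'[Date] ) )
```

Note how the second measure references the first by its name in square brackets, which is the usual way measures build on each other in DAX.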
Report View: The Insert Tab

The Pages Subsection


In the Pages area, the "New page" button provides a drop-down
menu with two options: "Duplicate page" and "Blank page." The
term "Blank page" refers
to a new page that is generated in your report and is placed to the
right of all of the other pages in your report. A copy of your current
report page will be created when you click the "Duplicate page"
button. This copy will include all of the visuals that are now on the
canvas and will be located to the right of all of the other pages in the
report.

The Visuals Subsection


The "New visual" and "More visuals" buttons can also be found in
the Visuals area of the Home tab's toolbar, and they function in
precisely the same way. They are repeated here for your
convenience: since this section of the ribbon is about placing
things onto the canvas, you can do that from here as well.

The AI Visuals Subsection


The following subsection under the Insert tab is dedicated to
artificial intelligence visuals. There are four of these AI-driven
visuals available to the general public. If you click any of them,
a blank version of that visual is added to the canvas, as far
toward the upper left-hand corner as possible. It is crucial to
emphasize that these visuals provide one-of-a-kind analytical
capabilities. Microsoft is eager to show off its expertise in
artificial intelligence (AI) in as many Power Platform products as
possible, and Power BI is no exception. In the not-too-distant
future, I anticipate an increase in the number of AI-driven
visuals.
The Power Platform Subsection
The next section is titled "Power Platform," and it discusses
elements that are technically considered visuals, but which
fundamentally vary from all of the other visuals that are included in
the platform. When you integrate Power Platform graphics into your
Power BI report, you have the option to interact and engage with the
other components of the Power Platform that we covered before.
They are very effective and contribute to the process of
demonstrating the Power Platform's worth to a business on a holistic
level.
The Elements Subsection
Elements are the last subsection of the Insert tab, and it deals
with the elements of the report. These are things that aren't
necessarily interactive in the same way that Power BI
visualizations are, but they can help improve the clarity of your
report and make it more useful. Element options include text boxes,
buttons, shapes, and images. The "Text box" feature in Power BI
performs the same function as its PowerPoint counterpart. It adds
an editable text box to the canvas and provides some fundamental
formatting options for your content. Using this tool, you can alter
the font, font size, text color, bolding, italicizing, underlining,
and alignment of the text, and you can add hyperlinks to websites
outside your report. Other aspects of the text box can be edited in
the "Format text box" pane that appears when a text box is
selected.
When you click the Buttons element, a variety of alternatives that
can assist with navigating the report or give further information will
appear before you. There are a good many of them. The whole list is
included in the picture that follows; thus, you can immediately begin
considering how you can incorporate them into any future reports
that you write.

It is essential to keep in mind that, except for the Q&A,
Bookmark, and Navigator options, placing a button on your canvas
has no effect unless you provide Power BI with the context for how
that button should be used. When you place one of these buttons on
your canvas, a "Format button" window appears on the right,
providing visual formatting options and highlighting what action
the button should perform. The Q&A button and the Bookmark button
already have their default actions configured to their respective
functionalities. However, if you believe that your report may
employ a button in a different context, you can still adjust the
action it performs.
When you click the Shapes option, you will be presented with a
comprehensive collection of shapes that can be used in a report.
When you pick the shape, it will appear on your canvas already filled
with a color that is determined by your theme's default settings. From
that point on, you can edit it simply by adjusting the size of the
graphic on the canvas. Since shapes have their own dedicated formatting pane,
the shape can also be changed when the formatting option is
chosen. If you want to add a picture to your Power BI report, you can
do so by clicking the Image button, which will bring up an Explorer
window from which you can make your selection. Pictures can be
imported into Power BI in a variety of formats. (As a side note, there
is a proper way to say GIF; make sure you're not pronouncing it
incorrectly.)

Once your picture is uploaded and placed on the report page, you
will see that it, along with every other item that can be found in the
Elements part of the ribbon, has its unique formatting window that
appears on the right side of the page when it is chosen. You can
change its size, just like that of any other Power BI element, by
clicking and dragging the box to the appropriate size.

Report View: The Modeling Tab


The third part of the ribbon in the Report view is the Modeling
tab. Here you have more data management options from the Report
view.
The Relationships Subsection
First, there is a button labeled "Manage relationships." When you
click this button, a window appears in which you can see all of the
relationships already present in your model. From this window, you
can also add new relationships or edit existing ones. By default,
Power BI attempts to auto-detect relationships based on column
names. This can help set up an initial set of relationships for
your data model, but it won't always build the model the way you
want it. Note that this button is disabled if your data model has
fewer than two tables.
The Calculations Subsection
The next area involves Calculations. The "New measure" and "Quick
measure" buttons work as described earlier. The other two items
bring up the inline DAX editor: the "New column" button creates
DAX calculated columns, and the "New table" button creates DAX
calculated tables. Please be advised that the "New measure" button
has a peculiar quirk that can be annoying at times: if you do not
have a table selected in the Fields pane before choosing this
option, Power BI will presume that the measure belongs in the first
table it sees in that pane and assign it as the destination.
Theoretically, these items can be moved by dragging and dropping
them in the Fields pane to another table; however, you should be
careful of the behavior associated with this action.
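As a hedged sketch, the kind of DAX these two buttons accept looks like this; the table and column names are hypothetical, chosen only for illustration:

```dax
-- "New column": a calculated column, evaluated row by row in the
-- table it is added to (Amount and Cost are assumed columns):
Margin = Sales[Amount] - Sales[Cost]

-- "New table": a calculated table, here a simple date table:
Date Table = CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2025, 12, 31 ) )
```

Unlike measures, both of these are materialized in the model: the column is stored per row, and the table is computed and stored when the model refreshes.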
The Page Refresh Subsection
A "Change detection" button can be found in the "Page refresh"
area, although it is only applicable in DirectQuery scenarios. You
can set a specific refresh frequency for your pages, or have them
automatically update whenever a change is detected in the data.
This functionality lets you simulate automatic page refresh inside
Power BI Desktop.

The Security Subsection


The Security section has two buttons, both of critical
significance. In Power BI, row-level security (RLS) is implemented
via the "Manage roles" and "View as" buttons. RLS gives us the
ability to govern, based on roles that are defined with DAX, how
individuals see the data in our model. Consider this a means of
ensuring that individuals can only see the data in a form that has
been filtered according to the context and security limitations
applicable to them. For example, if we had a report published for
different college courses, we could write simple DAX statements
describing how we wanted the data filtered, and then assign users
to those roles so that the data would be filtered appropriately
for each of them.
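A role's filter in Manage roles is a DAX expression that evaluates to true or false for each row of the table it is applied to. Two hypothetical sketches, continuing the college-courses example; the tables and columns are assumed for illustration:

```dax
-- Static RLS: a role that only sees one course's rows
-- (assumes a Courses table with a CourseName column):
[CourseName] = "Statistics 101"

-- Dynamic RLS: filter rows to the signed-in user
-- (assumes a Users table with an Email column):
[Email] = USERPRINCIPALNAME ()
```

The second pattern scales better, since one role can serve every user: the filter resolves to whoever is viewing the report.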
The Q&A Subsection
Finally, we have the Q&A subsection. When you click the "Q&A
setup" button, a wizard appears that explains how you can fine-tune
the Q&A engine to better respond to the natural language used by
your report consumers. You can use this to teach the engine
synonyms so that when you ask a question containing a given term,
the Power BI engine understands what that word means in the context
of your data model.
● Click the Language button to get a list of languages that
are supported for the Question and Answer section.
● Lastly, the "Linguistic schema" button includes a drop-
down menu that gives you the option to either import or
export a file containing a linguistic schema.
Report View: The View Tab
The View tab is the next major section of the ribbon. This tab is
crucial because it gives you the ability to choose themes,
configure multiple page views to test how different people might
read your report, and more. Want your Power BI report to feel a bit
like PowerPoint? This is where that starts to happen.

The Themes Subsection


The Themes section works much as it does in PowerPoint. You can
use the pre-installed themes, or click the drop-down box on the
right to choose other themes, browse for additional themes, view
the theme gallery, or alter your existing theme and save your
changes. There are many themes, so feel free to experiment!

The Scale to Fit Subsection


When you choose "Scale to fit," the "Page view" button offers the
choice to have your canvas fit the page, fit to width, or display
at actual size. This works hand in hand with the "Mobile layout"
option found in the Mobile part. When you click that button, your
report canvas changes to a size and shape appropriate for reports
that will be viewed on a mobile device. Building mobile reports
calls for significantly different considerations than building
normal reports, since there is less display space available on
mobile devices.
The Page Options Subsection
You have three different buttons in the "Page options" area: you
can show gridlines, make objects snap to the grid, and lock
objects. Gridlines are of great use since they give you guiding
references for where your objects are located in relation to other
items. The "Snap to grid" option causes visuals to align cleanly
to the grid when they are moved or resized. When "Lock objects" is
selected, elements on a page cannot be rearranged.

The Show Panes Subsection


The "Show panes" part is the last one on the View tab. This
subsection addresses the many panes that can be added to or removed
from the UI at any moment. You can make changes to the filters at a
visual, page, or report level using the Filters pane. Just as you
can bookmark a page in your web browser, you can "bookmark" a
report page so that it opens to a certain state and returns to that
state at a later time. The Selection pane provides a visual
representation of all the elements present on the canvas of each
report page and makes it possible to select an individual element;
you can also use it to group things and establish a layering order
among them, which is useful when objects overlap. The Performance
analyzer shows you what factors are affecting the performance of
your report page, and you can extract machine-generated DAX from
it. You can choose which slicers, if any, should be synchronized
between report pages by using the "Sync slicers" pane. This
provides a means of maintaining consistency in comparisons across
multiple report pages.
Report View: Help Section
You will be able to determine the version of Power BI Desktop you
are working with by clicking the Help button, which is located on the
ribbon. This tab will also include connections to guided learning,
training videos, documentation, and support links.

You will discover access to the Power BI blog, opportunities to
connect with the larger Power Platform community, documentation
specifically for developers (named "Power BI for developers"), the
option to submit an idea for Power BI Desktop, and a link to widely
used external tools inside the Community section of the Power BI
website. You can submit ideas and vote on others' suggestions, and
this can be of real use to you: the Microsoft Power BI development
team has said on many occasions that it pays attention to what is
being voted on and listens to the ideas proposed by users. The
program is updated roughly every month, and after each release the
team reviews how popular a particular suggestion was on the Power
BI Ideas platform. Therefore, if you have a suggestion, you should
post it there.
Report View: External Tools Section

The Pane Interface of the Report View


Each component of the user interface has its own distinct set of
panes, which enables that segment to perform a variety of functions.
You can access seven panes in the report view.

When all of the panes are shown at full width, as you can see, they
do not leave you with a lot of real estate to work with. However,
only the Visualizations and Fields panes are consistently shown at
any one time. In addition, the Filters pane is distinct from the other
panes in that it is more closely tied to the canvas than they are.
This is because the Filters pane determines how objects on the
canvas behave.

Visualizations Pane
You can choose visualizations to add to a report, add columns and
measures to show in those visualizations in the Values tab, and do
other tasks in the Visualizations pane.
First, let's look over the list of visuals. The visuals are listed in
order from left to right, top to bottom:
● Stacked bar chart
● Stacked column chart
● Clustered bar chart
● Clustered column chart
● 100% stacked bar chart
● 100% stacked column chart
● Line chart
● Area chart
● Stacked area chart
● Line and stacked column chart
● Line and clustered column chart
● Ribbon chart
● Waterfall chart
● Funnel chart
● Scatter chart
● Pie chart
● Donut chart
● Treemap
● Map
● Filled map
● Shape map
● Azure map
● Gauge
● Card
● Multi-row card
● KPI
● Slicer
● Table
● Matrix
● R script visual
● Python script visual
● Key influencers
● Decomposition tree
● Q&A
● Smart narrative
● Paginated report
● ArcGIS Maps for Power BI
● Power Apps for Power BI
● Power Automate for Power BI
That's a lot of visuals right off the bat, and it can be overwhelming.
However, it is not as daunting as it seems, and some of those
visuals may be things that you never use anyhow. Be aware that if
you have any custom or imported visuals in your Power BI report,
they will display under this list, above the Fields and Format
buttons. Two buttons can be found in the space between the
visualization list and the Values section: Fields on the left and
Formatting on the right. Because each visual accepts a different set
of inputs, the subsections for each visual will look somewhat
different. A column chart and a line chart may seem comparable to
one another; nonetheless, the two chart types will have different
regions in which fields can be placed.
Similarly, the Format pane is going to contain a varied set of settings
depending on the visual. This is because every visual has something
distinct that has to be formatted. Despite this, there are a few things
that remain the same. You'll also notice that when you have a visual
chosen, a third option will show up in this section as well. This is
something that you should keep in mind. That's the button for
Analytics. It is essential to have a solid understanding of the fact that
this feature is only applicable to the visualizations that come pre-
installed with Power BI, and even then, it is not accessible for every
single visual. You can, however, do things like establish trend lines,
constant lines, averages, and so forth. The most essential thing to
remember about this part is that you should not be afraid to
experiment with the plethora of different options that are at your
disposal. If you aren't sure what to do, you can always let Power BI
handle everything for you by using its default settings. The Format
and Analytics subsections provide you with the capabilities you'll
need when you're ready to take a little bit more control over your
report.
The "Drill through" portion is the last part of the visualization
window that can be accessed. You can quickly develop a story with
data that enables users to find specific examples that are relevant to
their analysis thanks to this powerful feature, which enables you to
drill down from one subsection of your report to another while
keeping all of the data elements filtered as you had them previously.
Fields and Filters Panes
A list of all the tables, columns, and measures that are included in
your report can be seen in the Fields pane. In addition to that, it will
include any folders that you can establish in addition to any
hierarchies or groups that you might build in your data model. Simply
clicking and dragging the column or measure to the right location in
the Visualizations pane or onto the visual itself on the canvas is all
that is required to include these components in your existing visuals.
You can also tick the checkbox next to an item to tell Power BI to
move it into the visualization of your choice. Or, if you don't have a
visualization selected already, Power BI will generate one for you
and add it to the canvas along with the data element you chose.
When you right-click on a table inside your model, you will be
presented with a context menu that has options for adding a new
measure, a new calculated column, a new quick measure,
refreshing the data, modifying the query for that table within Power
Query, and more.
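For instance, choosing "New measure" from that context menu opens the formula bar, where you can define a measure in DAX. A minimal sketch follows; the Sales table and Amount column names are hypothetical, not from any dataset in this book:

```dax
-- Hypothetical measure; assumes a Sales table with an Amount column
Total Sales = SUM ( Sales[Amount] )

-- Measures can build on one another
Average Sale = DIVIDE ( [Total Sales], COUNTROWS ( Sales ) )
```

Once created, either measure can be dragged into a visual exactly like a column from the Fields pane.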
If you right-click on a column in this section, you will be presented
with a context menu that allows you to add the column to your
canvas, create a hierarchy beginning with that column, add a new
measure or column, add the column to your filters list, or add it to
the drill-through option. If you do not want that particular data piece to be
utilized at that moment, you will also have the option to conceal it
from view. There are three subsections in the Filters pane. You have
probably previously used filters, but in case you haven't, a filter is a
function that establishes a condition for the data to be presented.
This condition must be met for the data to be displayed. For
instance, if I have a filter for an age bracket and pick the grouping
from 12 to 18 years old, then all of the data that I see will be related
to that age range. Every other piece of data would be disregarded.
The Filters pane offers three scopes:
● “Filters on this visual” applies filters only to the currently
selected visual.
● “Filters on this page” filters all visuals on that specific
report page.
● “Filters on all pages” sets a filter condition for the entire
report.
Any fields that are included inside a visual will, by default, also be
shown in the Filters pane under the part titled "Filters on this visual"
and can be interacted with in that area. In the Filters pane, you can
configure three different kinds of common filters, each of which has a
unique function. Basic filtering presents all of the possible values for
the specified column and lets you select any number of those values
to include. It's not hard to understand at all.
Advanced filtering gives you the ability to combine two distinct
criteria, each of which has logic that depends on the kind of
data the column represents:
● Contains (text)
● Does not contain (text)
● Starts with (text)
● Does not start with (text)
● Is (all)
● Is not (all)
● Is blank (all)
● Is not blank (all)
● Is empty (text)
● Is not empty (text)
● Is after (date)
● Is on or after (date)
● Is before (date)
● Is on or before (date)
● Is less than (number)
● Is less than or equal to (number)
● Is greater than (number)
● Is greater than or equal to (number)
Even though the majority of these criteria are self-explanatory, keep
in mind that, unlike basic filtering, which presents you with a list of
possible values to choose from, you must type the value in yourself.
After that, you can build an "and/or" condition by selecting the
appropriate button and establishing a second logic set; a second
logic condition is optional when using advanced filtering. The third
kind of common filter is the Top N filter. You can specify the number
of top or bottom values to show, ranked by a specific value in your
model that may or may not be in that visual. An excellent illustration
would be looking at your top 10 customers ranked by revenue or
volume: you decide how many of the field's values to show as well
as whether the visual should show the highest or lowest values for
that particular field. When filtering for a
whole page or report, you will see that the Top N option is not
accessible. Only when used at the visual level is the Top N filtering
method effective. Dates also contain two other sorts of filters that
can be used, which are called relative date and relative time,
respectively. Only in cases when the date field also contains a time
component will the relative time be shown. These filter options
provide you the ability to define precise periods for a visual or the
report, such as the most recent 10 days, the next week, or the most
recent year. The use of hours and minutes in conjunction with
relative time gives you the ability to delve into this level of detail.
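Because the Top N filter works only at the visual level, a common workaround for report-wide "top N" logic is a DAX measure. The sketch below is one such pattern, under the assumption of a Customer table and a [Total Sales] measure that do not come from this book's examples:

```dax
-- Hypothetical: blanks out all but the top 10 customers by sales,
-- so they drop out of any visual that uses this measure
Top 10 Customer Sales =
VAR TopCustomers =
    TOPN ( 10, ALL ( Customer[Customer Name] ), [Total Sales], DESC )
RETURN
    CALCULATE ( [Total Sales], KEEPFILTERS ( TopCustomers ) )
```

Unlike the visual-level filter, a measure like this behaves the same on every page where it is used.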
A Quick Rundown of the Other Panes
You will not see the first four panes by default; however, you can
make them visible by selecting the View tab, and they will display in
the order that you add them. These panels are the Bookmarks pane,
the Selection pane, the Performance Analyzer pane, and the
Sync Slicers pane.
You can preserve filters, drill states, and any other criteria that you
set for a particular report page by using the Bookmarks pane. This
saves the page's state so that you can return to it later. Let's imagine, for
example, that we have a sample of our report page that displays all
of the data; however, we would want it to be possible for end users
to rapidly convert the report to a certain filtered view. To change the
page's conditions from one state to another, we can utilize
bookmarks. You can easily modify the visibility of items in the report
as well as the stacking order of those elements thanks to the
Selection pane. Suppose you have a logo that you want to always
display in a certain spot on the report.
However, when you add visuals to the canvas, the logo gets covered
up by the box associated with the visual, even if the section of the
box that is overlying the logo is blank. You will need to move the logo
to the forefront to correct it, and you can do so and change the
layering order of the whole document from inside this panel. You can
also group things so that they can be handled and moved as a unit.
When you are ready, the Performance analyzer helps you
troubleshoot why something in a report takes longer to display than
you believe it should, or why it loads more slowly than it did in the
past. You can obtain
information on how many milliseconds it took to build the DAX query,
visualize the results, and other elements by using the Performance
analyzer. This includes the DAX that Power BI created to create the
visual. When you have some DAX knowledge under your belt, it can
be helpful to know the entire filter and row contexts to understand
how Power BI develops the proper code to power the visualization.
This is because machine-generated DAX isn't always the simplest to
learn from as a novice. In a similar vein, if your report is experiencing
problems, you can try out various settings to see if they affect the
report's performance and whether or not it improves.
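To give a flavor of what the copied query looks like, machine-generated DAX typically wraps a visual's fields in SUMMARIZECOLUMNS. The following is a simplified, hypothetical sketch, not an exact transcript of what Power BI emits; the Product table and [Total Sales] measure are assumptions:

```dax
// Simplified sketch of a query Power BI might generate for a bar chart
EVALUATE
TOPN (
    501,
    SUMMARIZECOLUMNS (
        'Product'[Category],
        "Total_Sales", [Total Sales]
    ),
    [Total_Sales], DESC
)
```

The TOPN cap reflects the fact that visuals only request as many rows as they can display.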
The "Sync slicers" pane is the last pane. When you add slicer
visuals to a report, the visuals will only work on that particular page
of the report. You can, however, choose a slicer, add it to additional
pages, and sync them using the "Sync slicers" window. This will
ensure that any changes made to the slicer on one page will be
reflected on all of the other pages where it is used. When someone
is looking at the data and discovers, for instance, a certain firm that
they want to zero in on, this can be quite helpful for exploratory
research. If they set that filter in one section of the report, then the
slicer decision that they make there will be carried over to all of the
other pages in the report that also contain that slicer. You even have
the power to declare that the synchronization should take place on
Page A, but not on Page B, for example.
Data View
You will be able to see the data at the table level inside your report
by using the Data view. Instead of displaying the columns in
alphabetical order, it shows them in their ordinal order, which is the
order in which the columns appear in the data source. In our
example, the ordinal order is the order in which the columns appear
in Power Query before being imported into our data model. To
begin, you will see that there are two new areas of the ribbon that
you can interact with. When you have a table or column selected in
the Fields pane, the "Table tools" tab will become visible on the
ribbon. Only when a specific column is selected will the "Column
tools" tab become visible. These tabs are entirely context-sensitive.
Here you can alter the table name, designate the table as a date
table, or view its relationships, in the same manner as from the
ribbon, and the Calculations group lets you create new tables,
columns, and measures. There isn't a whole lot here that's new,
and you can also reach these options from the Report view if you
have a table or column selected. The "Mark as date table" feature
is the only item distinctive enough to warrant discussion.
Power BI will automatically generate, behind the scenes,
a date table for each occurrence of a date field in your data model.
This table will include all of the dates that fall between the first and
final date associated with that field. It accomplishes this so that,
behind the scenes, it can put together functionality for the date
hierarchy as well as some other essential tasks.
In a model of this scale, this is not a significant issue at all. On the
other hand, if you start working with huge models that include
dozens or even hundreds of hidden date tables, the size of your
model can begin to expand extremely rapidly. You can avoid going
through this step altogether with Power BI by configuring your model
to include a date table that will fulfill this function for all of the other
dates. As can be seen in the picture that follows, the "Column tools"
tab is visible to me whenever I have a column selected for editing.
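One minimal way to build such a shared date table is with DAX. The sketch below is an assumption-laden starting point; the date range and the extra columns are choices you would adapt to your own data:

```dax
-- Hypothetical shared date table covering 2020-2025;
-- afterwards, mark it via Table tools > "Mark as date table"
Date =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2020, 1, 1 ), DATE ( 2025, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "MMM" )
)
```

Once the table is marked, Power BI stops creating its hidden per-field date tables, which keeps large models lean.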

I can see the name of the column as well as the data type it contains
if I start on the left. In this particular instance, I'm inspecting the Date
column, and the data type that it employs, which is simply referred
to as Date, is distinct from the Date/Time data type. When the data is
presented in a visual, I can choose how I want it to be formatted.
Since it is a date, I can use a drop-down menu to access a selection
of different possibilities. When I pick a column that contains
numbers, I have the option to display the value in a variety of
formats, including money, a decimal number, a whole number, a
percentage, or scientific notation. If you choose the General option,
the default setting will be the way the data model stores the
information. Because the data that was chosen in the picture that
was just shown is not of the numeric data type, the options to choose
the currency, percentage, display commas in numbers, and set the
number of decimal places for a value are all disabled and shown as
grayed-out text.
Next, inside the section titled "Properties," you will find a section
where you can configure the settings of that column, including its
default summarization and how Power BI should classify the data.
By default, non-numeric columns are not summarized, but you can
set a default summarization of count or count (distinct) for any
column you choose. For numeric columns, sum is used as the
default summarization; however, this is not necessarily the most
appropriate choice. The available summarization options are sum,
average, minimum, maximum, count, and count (distinct). If you
want a certain column to have no default summarization at all, you
can always set that column to "Don't summarize."
The Data category setting tells Power BI what kind of real-world
information a column contains. The majority of columns do not
have a data category assigned to them and are left uncategorized
by default. For numbers there aren't many possibilities, unless you
want a numerical value to be recognized as something else
entirely, like a barcode value or a postal code. There are no data
category options for dates. When dealing with text, however,
choosing the right category for the relevant columns can make
working with that data considerably simpler in certain
visualizations, especially maps. The following
categories are available for selection: Address, Place, City,
County, State or Province, Postal code, Country, Continent,
Latitude, Longitude, Web URL, Image URL, and Barcode.
The next section is titled "Sort by column," and while it is tucked
away, it is quietly powerful. It lets you define a rule so that a
column automatically sorts by another column whenever it is
included in a visual; the classic example is sorting month names by
a month-number column. Data groups enable you to quickly put
together combinations of data, often known as bins, for analytical
purposes. Age brackets and demographic categories are common
examples, but you can undoubtedly think of a great many other
ways to organize values into groups. Have a lengthy and involved
product list? Sort it into groups that are easier to handle. Do you
want to isolate a certain patient condition from that of other patients
and use it as a control group? Categorize each of them.
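Bins like these can also be built by hand as a calculated column. A hedged sketch follows; the Patients table and Age column are hypothetical names, not fields from this book's model:

```dax
-- Hypothetical calculated column assigning each row to an age bracket
Age Group =
SWITCH (
    TRUE (),
    Patients[Age] <= 11, "Child",
    Patients[Age] <= 18, "12-18",
    Patients[Age] <= 64, "Adult",
    "Senior"
)
```

The SWITCH ( TRUE (), ... ) pattern evaluates the conditions top to bottom and returns the label for the first one that matches.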

Conclusion
Users can build visually engaging reports and obtain deeper insights
into their data by using the Report and Data Views parts of Power BI,
which are both key components of the software. Users are given the
ability to successfully analyze, display, and present their data thanks
to the provision of a broad variety of features and functions by these
parts. They provide users with access to a full collection of tools that
enable them to turn data into meaningful visualizations, analyze
insights, and share reports with stakeholders. Users can unleash the
full potential of their data and make data-driven choices, which is
what drives the success of their organizations when they harness the
power of these areas.
CHAPTER 3
IMPORTING AND MODELING OUR
DATA
Getting Our Data
Obtaining the appropriate data for analysis is an essential phase in
the process of importing and modeling data in Power BI. This step is
one of the most important steps. This section focuses on the
numerous approaches and considerations that are involved in
getting data from various sources and preparing it for modeling
inside Power BI.
These techniques and concerns are discussed in further detail
below.

1. Identifying Data Sources: Before importing data into
Power BI, it is essential to first identify the relevant data
sources. Databases, spreadsheets, online services, and
cloud storage are all potential examples. It is of the utmost
importance to have a crystal-clear grasp of where the data
lives and how it can be accessed.
2. Extraction of Data: The next stage is to extract data from
the many sources that have been identified. Power BI
gives users some different options when it comes to the
process of extracting data. These options include directly
connecting to databases, importing files from local storage,
or making use of web connectors for web-based data
sources. The nature of the data and the ease with which it
can be accessed are two factors that should guide the
selection of an extraction technique.
3. Transformation of the Data: Once the data has been
retrieved, it often has to be transformed to guarantee that it
is suitable for analysis. Power BI provides users with a
wide variety of options for transforming their data, such as
cleaning, filtering, merging, separating, and aggregating
options. The data is refined as a result of these
modifications, which also prepare it for modeling.
4. Integration of Data: In many instances, the data
necessary for analysis originates from a variety of sources,
and this necessitates data integration. Users can merge
information obtained from a variety of sources into a single
dataset with the help of Power BI, which facilitates the
integration of multiple datasets. To improve the quality of
the data, this procedure may entail combining tables,
establishing relationships, or adding calculated columns.
5. Refreshing the Data: It is important to ensure that the
data in Power BI is always brought up to date so that users
may get accurate and pertinent insights. The imported data
can be kept up to date with the help of Power BI's data
refresh capabilities, which can be set to run on a
predetermined basis. Users can specify refresh rates, and
such frequencies can be based either on the update
frequency of the data source or on the needs of the
company.
6. Data Security: When gathering data, it is essential to keep
data security and privacy considerations in mind at all
times. Power BI has some different security capabilities,
including encryption, row-level security, and connectivity
with Azure Active Directory. These tools are designed to
safeguard sensitive data and limit access to only those
individuals who have been approved.
7. Data Governance: It is very necessary to put in place
appropriate data governance processes if one wants to
maintain both the integrity and consistency of their data.
Power BI makes it possible to put in place data
governance measures inside an organization, such as
defining data lineage, setting data refresh standards, and
putting in place data access restrictions. These methods
help to guarantee that data is used in a trustworthy manner
across the business.

Importing the Data


These steps can be used to import data into a Power BI Desktop
file:

1. Launch Power BI Desktop.


2. Click on "Get Data": The "Get Data" button is accessible
on the Home ribbon on the Power BI Desktop home
screen. Click it to start collecting data. A dialog window
with numerous data source options will then be shown.
3. Choose a Data Source: From the list of options, choose
the best data source. Many different sources are
supported by Power BI, including databases, files, web
services, and more. SQL Server, Excel, SharePoint, Azure
services, Salesforce, and web connections are a few
popular options for data sources.
4. Connect to the Data Source: After choosing the data
source, a connection prompt will appear.

Depending on the source that is chosen, the connecting


procedure differs. Here are a few illustrations:
● Importing from Files: To import data from a file-based data
source, such as Excel or CSV, you can either browse your
local file system or provide a file location.
● Connecting to Databases: If you choose a database source
such as SQL Server, Oracle, or MySQL, you will need to
provide the required connection information, including the
server address, database name, authentication type, and
credentials.
● Connecting to Web Services: Power BI also provides web
connectors for connecting to online services. You may have to
provide API keys, URLs, or other authentication data for
these sources.
Follow the prompts and provide the required information to
establish the connection to your chosen data source.
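Behind the scenes, these connection choices become Power Query M code. A rough sketch of what a SQL Server connection might produce follows; the server, database, schema, and column names are placeholders, not real sources:

```m
// Hypothetical M query: connect to SQL Server and pick one table
let
    Source = Sql.Database("myserver.example.com", "SalesDB"),
    SalesTable = Source{[Schema = "dbo", Item = "Sales"]}[Data],
    // Keep only the columns needed for analysis
    Result = Table.SelectColumns(SalesTable, {"OrderDate", "CustomerID", "Amount"})
in
    Result
```

You rarely need to write this by hand; the Get Data dialogs generate equivalent steps for you.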

5. Specify Data Import Options: After the connection has
been made, you will be given the chance to choose how
the data should be imported. This includes selecting the
tables, views, or queries to import, defining import
parameters, and carrying out any necessary
transformations.
6. Transform and Shape the Data: After importing the data,
you can use data transformations to reorganize and clean
the data as necessary. To meet your analytical needs,
Power BI offers a variety of tools and functions to filter,
combine, divide, pivot, or aggregate the data. After you
import the data, the Power Query Editor launches, and
from there you can access various transformation options.
7. Load the Data: After performing the required
transformations, load the data into the Power BI data
model by selecting "Close & Apply" in the Power Query
Editor. In doing so, Power BI Desktop will make the
imported data accessible for analysis and visualization.
8. Refresh Data: You can set up data refresh settings to
keep the imported data current if it is anticipated that the
data source will change over time. You can schedule data
refreshes with Power BI or set up automated refreshes
based on certain triggers.
9. Explore and Visualize the Data: Now that the data has
been imported into Power BI Desktop, you can use the
visualization tools and capabilities to explore and analyze
it. To learn from your data, create reports, dashboards, and
interactive visualizations.

The Power Query Ribbon


The Home Tab
In Power Query, the Home tab is mostly uncomplicated. We can see
all the icons in their respective groups in the figure below. We'll go
over each one and give you some ideas of how you can utilize it.

As soon as you click the Close & Apply button, all the changes
you've made up to that point are applied, refreshing any affected
data components. When you click the drop-down arrow, a menu
appears with three options: (1) Close & Apply; (2) Apply, which, as
you might expect, applies changes but doesn't close Power Query;
and (3) Close, which closes Power Query without applying your
changes. If you close without applying, Power BI Desktop will warn
you that you have unapplied changes. With the help of the New Source button in the
New Query section, we can add new data. Clicking its icon opens
the full data source selection menu, while clicking the drop-down
arrow shows a shorter menu of the most common data sources.
You can instantly connect to any data source you've previously used
using the Recent Sources button, not simply the one in your active
Power BI Desktop file. This can be helpful if you are doing testing
and want to build a second file and quickly link, or if you are
developing a new model and want to enhance it with a component
from an earlier project. When you enter data, a simple table structure
will appear where you can manually enter data and establish column
names. Although I don't advise utilizing Enter Data often, it can be
helpful if you have several data sources and need to quickly add a
new dimension that might not be present elsewhere in your data.
Although Enter Data's interface resembles Excel quite a bit, it
supports neither formulas nor data validation rules. There are just
rows and columns, and the more data you add, the more manual
management is required. Because of this, I normally advise against
using Enter Data, although it is available if necessary. When you
click the "Data source settings" button, a window will appear. You
can then change sources, export a PBIDS file for that data source,
adjust permissions, and delete permissions. A PBIDS file is
essentially a Power BI Desktop shortcut for that data source that you
can access in another Power BI file. You can add, examine, and
update each of your current parameters using Manage Parameters.
Power Query parameters are settings that can be used in certain
scenarios to alter how particular data is dealt with. There are many
uses for parameters, including parameter rules in deployment
pipelines, what-if analysis, and more.
The Query area of the Home tab lets you manage a query by
refreshing the preview, viewing its properties, or using the
Advanced Editor, though what "managing a query" means isn't
always obvious at first. By using the
Refresh Preview button, you can acquire an updated preview of the
data for the query that is presently highlighted and use it as a
working reference. When you have rapidly updating datasets and
want to make sure that your intended transformation behavior is still
being obtained despite the changing data, this can be useful.
Additionally, you can choose to update the preview of each active
query in your model by clicking the drop-down arrow. When you click
the Properties button, a dialog box appears with the query's name,
description, and two checkboxes for "Enable load to report" and
"Include in report refresh"—both of which are selected by
default. The description box is empty until you fill it in, and a short
explanation of what a query does can be useful.
All visuals that depend on that table will stop working when "Enable
load to report" is disabled since it will regard the whole table as if it
doesn't exist. Consider this to be similar to soft-deleting a table from
your data model. If you wish to re-enable it, it's still available in
Power Query, but Power BI ignores it otherwise. The "Include in
report refresh" option is less self-explanatory. (Note that you
cannot include a table in the report's refresh if "Enable load" is
disabled.) When "Include in report refresh" is deactivated, the data
currently in the model is kept as it stands, but data refreshes do not
update that query. Use this
method when a table is really big and you want to test some other
changes in a data refresh without refreshing the biggest table. In
some use situations, individuals will preserve certain data for some
time for things like audits. You can see the real M that is produced by
each modification you do in the Advanced Editor. In my opinion, it's
fantastic that you can see the code that is created when you make a
change, especially if you're interested in learning the language
yourself. M is a language that is simple to learn but challenging to
master, in my view. Duplicate your working code in a text file or other
backup before making changes if you want to attempt changing the
M straight from the editor here. The reason is that whatever
modifications you make before clicking Done are what you'll get,
regardless of whether your code still works. As with all
programming, a good rule of thumb is to always keep a
backup!
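For a sense of what the Advanced Editor shows, here is a sketch of the M that a couple of simple UI steps might generate; the file path and column name are placeholders:

```m
// Hypothetical M as it might appear in the Advanced Editor:
// each UI transformation becomes one named step in the let expression
let
    Source = Csv.Document(File.Contents("C:\data\orders.csv"), [Delimiter = ","]),
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    Typed = Table.TransformColumnTypes(Promoted, {{"Amount", type number}})
in
    Typed
```

Reading generated code like this, one step per transformation, is a gentle way to start learning M.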
Finally, the Manage button. Clicking it offers three options: Delete,
Duplicate, and Reference. Duplicating a query copies the query exactly
as it currently exists, which is straightforward but does mean you are
storing that data twice. Reference takes a somewhat different
approach: it creates a new query that looks like a duplicate at first,
but isn't. It lets you start from the final output of the query it
refers to and continue from there. Be careful, though: if you update
the parent query, the referencing query changes too, which can cause
issues downstream.
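The difference can be sketched with a loose Python analogy (illustrative only; Power Query itself uses M, and the step names below are made up): a duplicate is an independent copy of the query's steps as they stand, while a reference is defined in terms of whatever the parent currently produces.

```python
# Illustrative analogy, not real Power Query: model a query as a
# list of step names (all names here are invented).
parent_steps = ["load sales.csv", "remove nulls"]

# Duplicate: an independent copy of the steps as they exist right now.
duplicate_steps = list(parent_steps)

# Reference: defined as "the parent's output, plus further steps",
# so it re-reads the parent every time it is evaluated.
def referenced_query():
    return parent_steps + ["group by region"]

# Now edit the parent query after the fact.
parent_steps.append("filter year == 2023")

# The duplicate is unaffected; the reference picks up the change.
assert duplicate_steps == ["load sales.csv", "remove nulls"]
assert "filter year == 2023" in referenced_query()
```

This is why editing a parent query can silently break its referencing queries: they recompute from the parent's new output.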
There are two options in the Manage Columns section: Choose Columns
and Remove Columns. Both end up at the same place, but in different
ways. Clicking the Choose Columns icon opens a dialog box where you
pick which columns to keep in the query and drop the rest. Clicking
the Choose Columns drop-down arrow gives you access to that same
dialog box or lets you jump to a single column in the query; this
second capability is helpful when a query has a large number of
columns and you are unsure of their order. The Remove Columns button,
by contrast, simply removes whatever columns are selected; no dialog
box appears. Clicking the Remove Columns drop-down arrow shows a
second option, Remove Other Columns, which removes every column you
have not selected. The Reduce Rows section has two parts: Keep Rows
and Remove Rows. Each has specific use cases centered on keeping or
removing the first X rows, or the rows between X and Y. One item I do
want to caution you about here is the option to remove duplicate rows,
found under Remove Rows. It's crucial to be aware that it eliminates
duplicates based on the columns you've selected. If you select just
one column and remove duplicates, you risk losing rows you had
intended to keep. The Sort section lets you pick a column and sort it
in ascending or descending order; it does not work when more than one
column is selected.
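The duplicate-removal caution can be illustrated with a small Python sketch (the data and column names are invented; Power Query's own Remove Duplicates works the same way conceptually):

```python
# Hypothetical data: two orders from the same customer.
rows = [
    {"customer": "Ava", "order_id": 101},
    {"customer": "Ava", "order_id": 102},
    {"customer": "Ben", "order_id": 103},
]

def remove_duplicates(rows, key_columns):
    """Keep the first row for each distinct combination of key_columns."""
    seen, kept = set(), []
    for row in rows:
        key = tuple(row[c] for c in key_columns)
        if key not in seen:
            seen.add(key)
            kept.append(row)
    return kept

# Deduplicating on "customer" alone silently drops Ava's second order...
assert len(remove_duplicates(rows, ["customer"])) == 2
# ...while deduplicating on both columns keeps all three rows.
assert len(remove_duplicates(rows, ["customer", "order_id"])) == 3
```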
The Transform section follows, with some straightforward
transformation options. Split Columns lets you divide a chosen column
into multiple columns using a delimiter of your choice; a common use
is splitting first and last names out of a single name column. Group
By lets you aggregate the results of your current query, which is
helpful, for example, when you have a lot of data and want to
pre-aggregate it to reduce the number of rows. The Data Type button
changes the data type of a chosen column. When data first arrives in
Power Query, Power BI attempts to recognize the data types
automatically. It does a good job, but it sometimes gets one wrong,
and then you'll have to make a manual change here. Conveniently, you
can quickly check a column's data type by looking at the icon in its
column header.
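The Group By idea can be sketched in a few lines of Python (a hedged analogy, not Power Query M; the region/amount data is invented): many detail rows collapse into one aggregated row per group.

```python
from collections import defaultdict

# Hypothetical detail rows: (region, sale amount).
sales = [
    ("East", 120.0), ("East", 80.0), ("West", 200.0), ("West", 50.0),
]

# "Group By region, summing amount" - four rows become two.
totals = defaultdict(float)
for region, amount in sales:
    totals[region] += amount

assert dict(totals) == {"East": 200.0, "West": 250.0}
```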
Use First Row as Headers is fairly self-explanatory, and depending on
the data source, Power BI will often try to determine on its own
whether the first row holds column headers. If you click the arrow
next to the Use First Row as Headers button, you will also see an
option to demote your headers back into the first row. Replace Values
lets you search for and replace data in a specific column; one typical
use case, depending on your analysis, is to treat zeros as nulls or
nulls as zeros. It's vital to remember that you cannot update
individual cells: Power Query only understands columns and has no
concept of cells. The Combine section lets us append queries, merge
queries, and combine files. Merge Queries operates like a SQL JOIN
statement. Say you have Column A in Table A and Column A in Table B,
and you want to add Column N from Table B to Table A. A JOIN tells the
engine to look up the value of Column N and add it wherever the values
of Column A match in both tables, and Merge Queries does the same
thing.
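The join logic can be sketched in Python (an analogy under the same made-up Table A / Table B / Column N naming; Power Query's default Merge is a left outer join):

```python
# Table A holds the rows we keep; Table B holds the values to bring in.
table_a = [{"A": 1, "name": "Ava"}, {"A": 2, "name": "Ben"}, {"A": 3, "name": "Cy"}]
table_b = [{"A": 1, "N": "Gold"}, {"A": 2, "N": "Silver"}]

# Build a lookup from the join key to Column N, then attach N to each
# row of Table A wherever Column A matches.
lookup = {row["A"]: row["N"] for row in table_b}
merged = [{**row, "N": lookup.get(row["A"])} for row in table_a]

# Rows without a match get None, like the nulls a left outer join produces.
assert merged[0]["N"] == "Gold"
assert merged[2]["N"] is None
```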
Append Queries is designed to combine multiple queries with the same
kind and structure of data into a single, longer query, resembling a
UNION statement in SQL.
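Appending, by contrast, just stacks rows (sketched below with invented monthly tables; this mirrors SQL's UNION ALL, which keeps duplicates):

```python
# Two queries with the same columns...
q1 = [{"month": "Jan", "sales": 10}, {"month": "Feb", "sales": 12}]
q2 = [{"month": "Mar", "sales": 9}]

# ...appended into one longer table with the same structure.
appended = q1 + q2

assert len(appended) == 3
assert [r["month"] for r in appended] == ["Jan", "Feb", "Mar"]
```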
Combine Files pre-combines numerous files into a single view, saving
you from appending them one at a time. When appending tables, be
careful that you are getting the desired result: if each table has
columns specific to it, the result will be a single table covering all
the columns and values across both tables. Finally, the AI Insights
area holds features your company may use to connect to Azure AI
services. Be aware that you must be using Power BI Premium for these
features. To find out more, ask your Azure administrator whether your
company uses Azure Machine Learning Studio or Cognitive Services.
Transform Tab
Beyond what we saw in the Transform section of the Home tab, the
Transform tab offers further options for converting your data. To
examine the whole Transform tab, though, first look at the picture
below. In the Table section we again see Group By and Use First Row as
Headers. Transpose converts your columns into rows. Reverse Rows
reverses the row order of your data, so the last record comes first
and vice versa. Count Rows shows the number of rows in the query.
The Any Column section offers the option to alter a column's data
type. Values can also be replaced here. Unpivot Columns restructures
the columns you've chosen into an "Attribute" column and a "Value"
column. You may want to reshape data this way when, for example, you
want to collapse many columns into a few columns with a larger number
of rows.
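Unpivoting can be sketched in Python (a hedged analogy with invented region/year data; "Attribute" and "Value" mirror the column names Power Query produces): each wide row fans out into one row per unpivoted column.

```python
# Wide shape: one column per year.
wide = [
    {"Region": "East", "2022": 100, "2023": 130},
    {"Region": "West", "2022": 90, "2023": 95},
]

# Long shape: the year columns become Attribute/Value pairs.
long_rows = [
    {"Region": row["Region"], "Attribute": col, "Value": row[col]}
    for row in wide
    for col in ("2022", "2023")
]

assert long_rows[0] == {"Region": "East", "Attribute": "2022", "Value": 100}
assert len(long_rows) == 4  # 2 rows x 2 unpivoted columns
```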
Based on its presentation of the data, the Detect Data Type button
will go over each column and try to give it the proper data type.
When you initially load data into Power Query, Power BI will often do
this task automatically. However, if you make a lot of transformation
modifications that might alter the data types of several columns, you
can instruct Power BI to perform this task once again. The Fill button
substitutes missing values in a column, filling down from above or up
from below based on the other values in the column. The Move button
repositions a column in the table: to the left, to the right, to the
beginning, or to the end. You can do this with multiple columns
selected.
Rename allows you to rename a chosen column. Pivot Column takes a
column's values, converts them into columns, and recalculates the
values across the new combination. This is helpful when you have
attributes in one column and values in another but want to display the
values with the attributes spread out as columns. Convert to List is
special: it takes one column and converts it into a list, a particular
kind of table in Power Query. Lists can be supplied as arguments to
certain custom functions. The Text Column section is a bit oddly
named, because you can use these functions on more than just text
columns. Thankfully, they are mostly straightforward. Split Column
lets you choose a delimiter and divide a column into multiple columns.
Format changes how the text is presented; you can make all the data
lowercase or uppercase, or add prefixes or suffixes, for example. Note
that using this functionality will change the column to a text-type
column. Merge Columns mashes two or more columns together, with the
option to choose a separator.
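Split and Merge are inverses of each other, as this Python sketch shows (an analogy with an invented "Name" column, using a space as the delimiter/separator):

```python
# Split a "Name" column on its first space into First and Last...
row = {"Name": "Ada Lovelace"}
first, last = row["Name"].split(" ", 1)
split_row = {"First": first, "Last": last}

# ...then Merge the two columns back together with a space separator.
merged_name = " ".join([split_row["First"], split_row["Last"]])

assert split_row == {"First": "Ada", "Last": "Lovelace"}
assert merged_name == "Ada Lovelace"
```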
Extract lets you edit a column or columns to keep only certain
characters within them. Parse, in contrast to everything else here,
does something unique: it converts semi-structured data from a JSON or
XML file into a more conventionally structured format for analysis.
Although we don't use any JSON or XML files in our examples, if you
were dealing with NoSQL databases or receiving results from API calls,
you might need to parse the data before analyzing it, and this is the
function that does it. The Number Column functions let you perform
standard calculations against a column to edit its values, round a
column, determine whether a value is even or odd, or determine its
sign; you can also apply trigonometric functions. Except for the
Statistics functions, I advise duplicating a column before making any
of these changes so that you keep both the original clean value and
the updated value. The Date & Time Column section performs a similar
variety of date and time transformations.
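The Fill button mentioned above is easiest to see in a sketch. Here is fill-down in Python (invented data; fill-up is the same logic run in reverse): each missing value takes the last non-missing value above it.

```python
# A column where only the first row of each group carries the label,
# a common shape in exported spreadsheets (None plays the role of null).
column = ["East", None, None, "West", None]

filled, last_seen = [], None
for value in column:
    if value is not None:
        last_seen = value   # remember the most recent real value
    filled.append(last_seen)  # missing rows inherit it

assert filled == ["East", "East", "East", "West", "West"]
```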
The final section in the Transform tab is the Scripts section, which
offers the ability to run both R and Python scripts against your data.
You need to have R or Python installed to utilize these functions, and
Power BI has to be aware of the location of your installation of those
language libraries. If you're more comfortable working with data
frames in R or Python, you can perform your transformations there as a
single step. You can also mix approaches: pre-prepare the work in
Power Query, run a script task, and then make additional changes in
Power Query afterward.
The Add Column Tab
The first thing to note in the Add Column tab is that except for the
General section, we have seen all the other functions in other places
in the ribbon to this point. However, the General section contains so
much that it still merits a section of its own.
● By selecting the "Custom Column" option, a dialog box will
appear, allowing you to generate a custom column utilizing M.
This approach can prove advantageous if there are
transformations that require execution but are not supported by
the user interface or can be performed more efficiently through
coding. The dialogue box will inform you of any syntax errors in
the code. However, for those who are still in the process of
learning M, the errors may not always be particularly
informative.
● The Invoke Custom Function feature is a highly potent tool
that enables the user to iterate over a function for a designated
column. This process involves identifying a particular value and
subsequently applying a specific logic to it. An elementary
illustration involves the creation of a function that generates a
fresh column. Subsequently, the function multiplies the value of
one column by another to obtain a third value, which is then
inserted into the newly created column. Did you follow that
reasoning? This tool is powerful, but its workings can be hard to
grasp the first time you use it.
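The row-by-row idea behind Invoke Custom Function can be sketched in Python (hedged analogy; the price/quantity column names are invented): define a function of one row, then apply it to every row to fill a new column.

```python
# A custom function of a single row: multiply one column by another.
def line_total(row):
    return row["price"] * row["quantity"]

orders = [{"price": 2.5, "quantity": 4}, {"price": 10.0, "quantity": 2}]

# "Invoke" the function over every row, inserting the result
# into a newly created "total" column.
orders = [{**row, "total": line_total(row)} for row in orders]

assert orders[0]["total"] == 10.0
assert orders[1]["total"] == 20.0
```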
● The Conditional Column feature enables the creation of a
column that follows the structure of an if-then-else statement. If
Column X is smaller than Column Y, the answer is Yes;
otherwise, it is No. It is also possible to include multiple
clauses to create more complex or iterative if functions. If one
is acquainted with SQL, it can be likened to a case-when
statement.
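The Column X / Column Y example in that bullet reduces to a one-line condition, sketched here in Python (the column names are the same hypothetical ones):

```python
# Conditional column: if X is smaller than Y, "Yes"; otherwise "No".
rows = [{"X": 1, "Y": 5}, {"X": 7, "Y": 3}]
rows = [{**r, "Flag": "Yes" if r["X"] < r["Y"] else "No"} for r in rows]

assert [r["Flag"] for r in rows] == ["Yes", "No"]
```

Additional clauses in Power Query's dialog simply chain more `elif`-style branches onto this pattern.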
● The Index Column function generates a column that starts at
either 0 or 1 and increments by one for each row, much like a
row number. The key property of an index column is that it is
guaranteed unique, so it can serve as an improvised key value,
which you can merge with other columns to carry the index or key
value into other tables, thereby establishing relationships.
Ideally, your data already comes equipped with keys, making an
index redundant; if one is needed, however, the process is quite
straightforward.
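In Python terms the index column is just an enumeration (a sketch with an invented table; Power Query likewise lets you pick the starting value):

```python
# Add an index column starting at 1, one unique value per row.
rows = [{"city": "Oslo"}, {"city": "Lima"}, {"city": "Kyoto"}]
indexed = [{**row, "Index": i} for i, row in enumerate(rows, start=1)]

assert [r["Index"] for r in indexed] == [1, 2, 3]
```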
● The Duplicate Column function creates a replica of the
current column. That's all it does: it copies the data from one
column into a new one.
Lastly, the View tab. The options available on the View tab let users
make minor adjustments to their Power Query interaction experience. To
gain a more detailed understanding of your data at the columnar level,
check the Column quality, Column distribution, and Column profile
checkboxes in the Data Preview section.
These options provide additional insight into the specifics of your
data: a percentage breakdown of valid, erroneous, and null values in a
column; a numerical breakdown of the values present in a column; and a
distribution showing the count of distinct values within a column.
Once all modifications have been made and you are ready to incorporate
the changes into your data, select the "Close & Apply" button. Closing
the Power Query window returns you to the Power BI Desktop canvas.
The Model-View
Back in Power BI Desktop, it is time to examine the third view: the
Model view. Why do relationships matter? What do they actually
accomplish? Can you disregard relationships entirely and just build
one giant, comprehensive table? The following section covers these
questions.
Power BI will automatically attempt to detect relationships during the
data import process. This occurs when the data is arranged in a
manner that allows Power BI to accurately identify the relationships
between the various data elements. Regrettably, it does not
consistently perform this task effectively. Consequently, it is
advisable to verify your relationships post-importing data to ensure
that your model is structured as per your intended design.
What Is a Relationship?
In everyday life, a relationship is the connection or association
between two or more people. Within Power BI, a relationship denotes
the linkage forged between two tables based on one or more shared
fields, also known as columns.
Power BI, Microsoft's business intelligence and data visualization
tool, enables users to connect, transform, and analyze data from
multiple sources. In the context of Power BI, it is a
frequent occurrence to encounter related data that is dispersed
across multiple tables. An instance of this could be the presence of a
"Customers" table and an "Orders" table, wherein every order is
linked to a distinct customer.
Through the establishment of a relational connection between said
tables, one can effortlessly merge and scrutinize data from both
sources. In Power BI, a relationship is established by correlating the
values of a column in one table (commonly known as the primary
key) with the values of a column in another table (known as the
foreign key). After the establishment of the relationship, Power BI
can execute diverse operations, including filtering, aggregating, and
generating visualizations that integrate data from multiple tables.
Types of Relationships in Power BI
Power BI facilitates three distinct relationship types, namely one-to-
one, one-to-many, and many-to-many.
Each classification denotes a distinct form of correlation
between tables and carries its ramifications for the analysis of
data.
• One-to-One Relationship: In a one-to-one relationship, every
record present in the primary table has a corresponding record
in the related table, and this correspondence is limited to a
maximum of one record. This association proves beneficial
when one desires to establish a relationship between tables
that share common data or construct a hierarchical structure.
An instance of linking a "Person" table to an "Address" table
is when each individual is associated with a solitary address.
• One-to-Many Relationship: The one-to-many relationship
stands as the most prevalent type within Power BI. In this
particular relationship, each entry within the primary table can
possess multiple associated records located within the related
table. This type of relationship is deemed appropriate when a
singular entity has multiple associated entities. An example of
this would be the establishment of a relationship between a
"Customers" table and an "Orders" table, whereby each
customer is capable of having multiple orders.
• Many-to-Many Relationship: A many-to-many relationship is
present when each record in either of two tables can have
multiple related records in the other table. Although newer
versions of Power BI offer a many-to-many cardinality setting,
the classic way to manage such situations is to use an
intermediary table, commonly referred to as a "bridge table" or
"junction table." The bridge table resolves the many-to-many
relationship by establishing two one-to-many relationships. This
methodology facilitates efficient data analysis while
circumventing the issue of data redundancy.
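The bridge-table idea can be sketched in Python (an analogy with an invented students/courses model): the junction table holds one row per pairing, and queries walk through it via the two one-to-many links.

```python
# Two dimension tables keyed by ID, plus a bridge/junction table that
# records each student-course pairing.
students = {1: "Ava", 2: "Ben"}
courses = {10: "Math", 20: "Art"}
enrollments = [(1, 10), (1, 20), (2, 10)]  # (student_id, course_id)

# "Who takes Math?" - filter the bridge, then follow the keys outward.
math_students = sorted(
    students[sid] for sid, cid in enrollments if courses[cid] == "Math"
)

assert math_students == ["Ava", "Ben"]
```

Each student and course is stored once, so no data is duplicated no matter how many pairings the bridge accumulates.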
Creating and Managing Relationships in Power BI
The process of creating and managing relationships in Power BI is a
crucial aspect of data modeling. It involves defining relationships
between tables in a dataset to enable the creation of accurate and
insightful reports.
Effective relationship management in Power BI requires a thorough
understanding of the data sources and the relationships between
them. The Power BI platform offers a user-friendly interface that
facilitates the creation and management of table relationships in a
seamless manner.
The subsequent steps delineate the procedure:
• Identification of Common Fields: To establish a relationship,
it is imperative to identify the common fields (columns) that are
present in both tables. The aforementioned fields ought to
encompass analogous data that can be utilized to establish
logical connections between the tables.
• Defining Relationships: When identifying common fields,
users can proceed to define relationships by designating the
primary key and foreign key columns. Power BI can
automatically identify potential relationships by analyzing the
column names and proposing possible matches, thereby
streamlining the setup process.
• Managing Relationships: Once relationships have been
established, users can further manage and customize them
according to their preferences. Within Power BI, it is possible
to modify the cardinality and cross-filtering direction for each
relationship. Cardinality is a term used to specify the number of
records from the primary table that can be linked to a single
record from the related table. The process of cross-filtering is
utilized to ascertain the impact of filtering in one table on the
data exhibited in the associated table.
• Handling Ambiguous Relationships: When faced with
ambiguous relationships, Power BI may encounter situations
where multiple relationships exist between tables. In such
circumstances, it is necessary for users to explicitly define the
active relationship. By doing this, it guarantees that Power BI
comprehends the specific relationship to employ while
conducting data analysis and generating reports.
Benefits and Applications of Relationships in
Power BI
• Comprehensive Analysis: Relationships facilitate the
integration of data from various tables, resulting in a cohesive
and consolidated perspective of the information. This enables a
thorough examination of the data, a meticulous exploration of
its intricacies, and the acquisition of valuable perspectives
across various aspects of the dataset.
• Efficient Reporting: Relationships enable users to generate
dynamic reports and visualizations that automatically update as
they interact with the data. Relationships are a key feature of
Power BI, enabling efficient aggregation, filtering, and
summarization of data, which greatly enhances the development of
informative dashboards and reports.
• Data Exploration and Discovery: The utilization of
relationships facilitates the process of data exploration and
discovery, allowing users to uncover concealed patterns,
correlations, and trends present within the data. Through the
strategic utilization of relationships, Power BI enables its users
to seamlessly traverse the data model, discern
interconnections, and execute impromptu analyses with ease.
• Simplified Data Maintenance: The utilization of relationships
in data management enables simplified data maintenance
processes by eliminating the need for redundant data. By
utilizing relationships, users can store data in a structured
manner, thereby reducing data storage requirements and
simplifying updates and modifications. This approach
eliminates the need for duplicating information across multiple
tables.
• Scalability and Flexibility: Power BI boasts the ability to
support relationships across vast datasets, thereby enabling
scalability for intricate data models. Additionally, it offers
flexibility to accommodate diverse data sources and adapt to
changing business needs. Furthermore, relationships provide a
level of adaptability in the integration of data from multiple
sources, allowing users to effectively handle a wide range of
data types and formats.
The Properties Pane
The Properties pane within the Model view is contextual. It is always
located on the right-hand side, and it adjusts according to the
selected data object, displaying only the information relevant to the
chosen data element.
Upon selecting a table, you will observe its Name, Description,
Synonyms, Row label, Key column, the Is hidden flag, and the Is
featured table flag. Within the Advanced section, the pane displays
the Storage mode, which can be Import, DirectQuery, or Dual. It is
important to note that once a table's storage mode is set to Import,
it cannot be changed. Observe an example of the Properties pane in
operation in the image provided below.
The field labeled "Name" is self-explanatory. The field designated for
Description may be left empty, but it is advisable to provide a concise
summary of the table's contents or its role within your data model.
Synonyms exist to support the Q&A feature. The Row label feature lets
you designate a column to serve as the default column during Q&A
sessions involving that particular table.
The Key column facilitates the identification of a column that
exclusively comprises unique values. Upon identification of a column
as the key column, a distinctive icon will appear beside it in the
Fields pane. If the value of "Is hidden" is set to "Yes", the table will
not be visible in the Report view. Consequently, it will not be possible
to include items from that table in report items. It is important to note
that any pre-existing objects created using the aforementioned table
will remain visible. However, the addition of data elements from said
table to new visualizations will no longer be feasible. In conclusion,
when a table is designated as a featured table, individuals within an
organization that has enabled featured tables can locate and access
said table through the Power BI service search function.
Upon selecting an individual column, one can observe the Name,
Description, Synonyms, Display folder, and Is hidden flags.
Within the Formatting section of the pane, one may observe the data
type of the column as well as the format of the data contained
therein. In the Advanced section, users have the option to choose an
additional column for sorting purposes. They may also identify a
relevant data category, which serves as a specific identification for
certain types of data that Power BI may require in particular
situations. Additionally, users can establish a default summarization.
In this section of the pane, it is also possible to determine whether a
column should be permitted to contain null values. All the guidance
provided in the preceding table section is applicable.
Utilizing display folders is an effective method for maintaining a well-
organized data model. This involves grouping related data elements
into a subfolder within a table. No factual data is altered or
manipulated in this process; it is solely intended to enhance visual
clarity. One may also opt to organize measures into display folders,
which can prove to be especially advantageous when attempting to
group select items that may not be conveniently located within
Power BI's larger alphabetical sorting system in the Fields pane.
Conclusion
In summary, we have thoroughly examined some significant features
and functionalities inherent to Power BI. These include the Power
Query Ribbon, Home Tab, Transform Tab, Add Column Tab, Model
View, Relationships, and Properties Pane. Each of these
components plays a pivotal role in the process of transforming,
modeling, and analyzing data. The array of features and
functionalities integrated into Power BI collectively yield a robust and
versatile platform for data analysis and reporting. Through the
effective utilization of these tools, individuals can convert
unprocessed data into practical insights, construct extensive data
models, establish significant connections, and generate visually
captivating reports and dashboards. Power BI enables users to fully
leverage their data, enabling them to make informed decisions and
drive organizational success.
CHAPTER 4
LET’S MAKE SOME PICTURES
(VISUALIZING DATA 101)
Why Visualize Data?
There was a team of committed analysts working in a busy corporate
office whose job was to decipher complicated data sets and figure out
what they meant. They soon realized they were up against a big
obstacle: how to properly convey their results
to decision-makers and other stakeholders. At that point, they
became aware of the usefulness of data visualization, in particular
via the use of a program known as Power BI. The analysts quickly
concluded that data visualization was not limited to the creation of
appealing charts and graphs; rather, it required the transformation of
raw data into an intriguing story. Armed with the collection of
visualization tools Power BI put at their disposal, they set out to
discover the tales buried inside the data.
As soon as they started working on their first project, the group
understood how critical it was to make the content simple to
understand. They were aware of the fact that those responsible for
making decisions did not have the time or desire to go through
spreadsheets packed with rows and columns of statistics. They needed a
visual representation that would let them quickly comprehend the most
important aspects of the data. The
wide variety of visualization styles that are available in Power BI,
such as bar charts, line graphs, and pie charts, gave the team the
tools that they needed to show the data in a way that was both
aesthetically pleasing and simple to comprehend.
The analysts noticed a significant shift in the way decisions were
made as a result of their newly acquired capacity to develop
eye-catching visualizations. While they were presenting their results
using the interactive dashboards that Power BI provides, decision-
makers could explore the data on their own by diving down into
particular features and using filters to acquire deeper insights. The
interactive feature of Power BI made it easier to make decisions
based on data by enabling stakeholders to pose questions, put their
theories to the test, and choose options based on this information.
The group quickly concluded that data visualizations were more than
simply a tool to convey the data; rather, they were a portal to the
exploration and discovery of the data. Thanks to the interactive
capabilities of Power BI, they were able to wander the landscape of
the data, exposing hidden relationships, recognizing outliers, and
spotting patterns that would otherwise have stayed hidden. The
capability to slice and dice the data, filter it based on a variety of
criteria, and watch the visualizations dynamically respond in real-
time proved a very significant tool for them throughout their analysis.
The analysts made considerable headway, and then they realized that
data visualization could also be used to tell compelling stories. They
were able to create a story around the data with the help of Power
BI's tools such as custom layouts, annotations, and tooltips. This
allowed them to guide stakeholders through the insights and
emphasize the most important themes. They concluded that the use
of visualizations could enthrall and involve their audience, making
the data more memorable and significant. The team also took advantage
of Power BI's features for collaboration and sharing information. They
made use of the platform to publish their data visualizations,
dashboards, and reports to the Power BI service. This gave
members of the team and other stakeholders the ability to see and
engage with the data. The newly discovered cooperation
encouraged the sharing of information across the business, as well
as alignment and the making of well-informed decisions.
In addition, the analysts were given the ability to monitor real-time
data streams and to react fast to shifting situations thanks to Power
BI. They were able to construct visualizations that updated in real-
time because they had access to live data connections and refresh
capabilities. These visualizations provided up-to-the-minute insights
into important metrics. Because of this, they were able to make
modifications promptly, deal with growing challenges, and exploit
new possibilities. In the end, the team of analysts concluded that
data visualization using Power BI was not only a tool or method, but
rather a powerful means of sharing, analyzing, and exploiting data to
achieve greater results. It turned their analytical journey from a sea
of statistics into an engaging tale, which allowed them to uncover the
real worth of their data and make a significant contribution to the
decision-making process of their business.
Next, let's discuss the tools for data visualization that Power BI
provides us with so that we can tell that narrative with our data.
Imagine that each of them is a letter of the alphabet. You'll first find
out how to build words out of them, and then get to the point where
you're telling tales with them, and eventually, with time and practice,
you'll figure out how to utilize each visualization to get the most out
of it.
The Visualizations pane
The Visualizations pane is where you create visualizations, edit them
by adding the necessary data points, format them, or add the extra
analytics capabilities Power BI provides for some visuals. Three
primary components make up the Visualizations pane: the Fields,
Format, and Analytics sections.
Fields
Keep in mind that the Fields section looks different for each
visualization, since each visual accepts its own set of inputs. The
fields for a map will not be the same as those for a bar chart, which
will not be the same as those for a matrix. If you have already added
fields to a visual and then change that visual to a different type,
Power BI will do its best to reallocate the chosen fields and measures
to the new visual, but it is not guaranteed to work the way you
intended. To change the type of a visual already on the canvas, select
it and then click a different visualization in the Visualizations
pane. That's all there is to it.
Axis, Legend, and Values are common components found in many of the
visuals in the Fields section. The axis denotes how you categorize
your data. A legend organizes the data into subsections and draws
attention to the differences between them. The values are the actual
numbers that you wish to aggregate across the various categories. A
"Drill through" options section always appears at the very bottom of
the Fields tab. This fantastic feature lets you take the filters
currently applied to one visual and carry them over to a visual on
another page, transporting you straight to that report page and its
set of visuals. Drill through is an extremely helpful way to move
insights gained from one section of your report to another while
keeping the context of the information consistent.
Format
In earlier versions, the Format pane was represented by an icon that
resembled a paint roller. Now, when no visual is selected, the icon
looks like a paintbrush sweeping across a page; when a visual on the
canvas is selected, it appears as a paintbrush superimposed over a
bar graph. Format gives you the ability to modify the overall look
and feel of a given visualization to better suit your particular
requirements.
As with Fields, the settings available under Format are contextual to
the visualization currently being edited. There are a lot of options
to choose from, and if you have ever worked with PowerPoint, many of
them should seem quite familiar. In the Format tab, an arrow next to
each section lets you expand or collapse it, which is a great help
when navigating. Although some of the tools under Format have
specialized uses of their own, for the purposes of this exercise
think of Format as the place you go to make your visualizations truly
stand out.
Analytics
The Analytics features, accessed by clicking the icon of a magnifying
glass examining a graph, do not operate on all visualizations, and no
custom visuals can make use of the Analytics section. For visuals
where Analytics does work, you can add constant lines for comparison,
minimum value lines to flag when something falls below a given
threshold, maximum lines for the opposite purpose, average lines,
median lines, percentile lines, or trend lines, and you can even
perform anomaly detection. There is a lot of capability available
under Analytics, but getting the most out of it depends on the
context, so don't be afraid to experiment to find out what does and
doesn't work for you and the project you're working on.
Visual Interactivity
Inside the context of Power BI, the term "visual interactivity" refers
to the capability of interacting with visual components inside a report
or dashboard, such as charts and tables. Users are given the ability
to explore and study data by choosing and filtering certain data
points or categories, delving further into specifics, and dynamically
altering the visual representations of the data. Power BI has some
interactive elements that improve the overall user experience and
make it possible to delve more deeply into the data.
The following is a list of important visual interactive features
that are available in Power BI:

1. Selection and Highlighting: Users can pick data points or
categories inside a visual by clicking on them, and those
selections are then highlighted. To provide a unified
perspective of the data, Power BI draws attention to the
selected elements and modifies the other visuals in the
report accordingly. This lets users zero in on certain
aspects and understand how those aspects affect the
overall report.
2. Filtering: Power BI users can add filters to
visualizations to zero in on certain subsets of the data
being shown. Data can be filtered by selecting a particular
value or range of values or by using advanced filtering
options. When a filter is applied, all of the corresponding
visuals in the report are updated to reflect the filtered
data, producing a view that is consistent across the
report.
3. Drill-through: Power BI users can drill down into related
information by clicking on data points or categories, which
is made possible by the drill-through actions that can be
defined in the software. Users are thus given the ability to
explore data structures and go more deeply into certain
parts of the data.
4. Cross-filtering and Cross-highlighting: Power BI
supports cross-filtering and cross-highlighting across
different visualizations. When a user selects data in one
visualization, Power BI automatically filters or highlights
the matching data points in other related visuals. This
interactive behavior lets users explore the relationships
and correlations between the various data items.
5. Bookmarks and Buttons: Power BI gives you the ability
to develop interactive navigation via the use of bookmarks
and buttons. Bookmarks save the visuals in their present
form, including any filters, options, and drill-through levels
that have been applied. After that, users can move
between the many bookmarked views using buttons, which
provides a guided experience for the data exploration
process.
6. Tooltips: When users hover their mouse cursor over data
points in a visualization, Power BI tooltips deliver
additional information and insights. Tooltips can be
configured to show a variety of data fields, calculations,
or user-defined expressions. They improve the interactive
experience by providing context-sensitive information
without cluttering the visualizations.
7. Q&A Natural Language Queries: The Q&A feature of Power
BI enables users to ask questions about their data in
natural language, either typed or spoken. Power BI analyzes
the query and produces relevant visual representations and
insights. This feature offers a user-friendly, interactive
way to explore data without relying on pre-built
visualizations.
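Under the hood, cross-filtering amounts to narrowing the rows that feed every other visual's aggregation. The sketch below is a rough illustration in plain Python with made-up sales rows; Power BI itself performs this through its data model, not through code you write:

```python
# Hypothetical sales rows: (region, product, amount) -- illustrative only.
rows = [
    ("East", "Bikes", 120), ("East", "Helmets", 40),
    ("West", "Bikes", 200), ("West", "Helmets", 30),
]

def totals_by(rows, key_index):
    """Aggregate amounts by one column, as a bar chart would."""
    out = {}
    for row in rows:
        out[row[key_index]] = out.get(row[key_index], 0) + row[2]
    return out

# Visual 1 shows totals by region; visual 2 shows totals by product.
print(totals_by(rows, 0))  # {'East': 160, 'West': 230}

# The user clicks the "East" bar in visual 1: visual 2 is cross-filtered
# to only the rows that match the selection.
selected = [r for r in rows if r[0] == "East"]
print(totals_by(selected, 1))  # {'Bikes': 120, 'Helmets': 40}
```

Cross-highlighting works similarly, except the unselected portion of each bar is dimmed rather than removed.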

Enable the visual interaction controls


If you have edit permissions for a report, you can turn on the visual
interaction controls and then customize how the visualizations on the
report page filter and highlight each other.

1. Select a visualization to make it active.
2. Display the Visual Interactions options. In Power BI
Desktop, go to the Format menu and then select Edit
Interactions.
3. Power BI adds filter and highlight icons to every other
visualization on the report page.

The tree map is cross-filtering the line chart and the map, and it is
also cross-highlighting the column chart. You can now adjust how the
chosen visualization interacts with the other visualizations shown on
the report page.
Change the interaction behavior
Get familiar with how your visualizations interact by selecting each
visualization on your report page, one at a time. Choose an
individual data point, a bar, or a shape, and then see how it affects
the other visualizations. If the behavior you observe is not to your
liking, you can change the interactions. These modifications are
saved with the report, ensuring that you and anybody else who
consumes the report will have the same visual interaction experience.
To get started, select a visualization to make it active. Note that
interaction icons have been added to each of the other visualizations
on the page; the bolded icon is the one currently in effect. Next,
decide what influence the selected visualization should have on the
others. You can repeat this process for each of the other
visualizations on the report page.

Options for selected visualizations:


• To have the currently chosen visualization cross-filter another
visualization on the page, click the filter icon in the top-right
corner of that visualization. Line charts, scatter charts, and map
visuals can only be cross-filtered; they cannot be cross-highlighted.
• To have the currently chosen visualization highlight another
visualization on the page, click the highlight icon.
• If you do not want the currently chosen visualization to affect any
of the other visualizations on the page, click the "no impact" icon.
Column and Bar Charts
Column and bar charts are two of the most common types of
visualizations used in Power BI. They are used to show categorical
data and compare results across a variety of categories.
Both kinds of charts present the data as vertical or horizontal
bars, with the length or height of each bar standing in for the
corresponding data value:

1. Column Chart: The bars in a column chart are aligned
vertically along the x-axis and are shown in a column
format. Each bar in the graph represents a different
category, and the height of the bar corresponds to the data
value connected with that category. Column charts are often
used to compare results across a variety of categories or
to monitor changes over time.
2. Bar Chart: A bar chart is characterized by its use of
horizontal bars aligned with the y-axis of the chart. As
with the column chart, each bar represents a different
category, but here the length of the bar reflects the value
for that category. Bar charts are often used when the
category labels are long or when comparing data in a
certain order, such as a ranking. Another typical
application of bar charts is in the presentation of
financial data.

To create a column or bar chart in Power BI, you can follow
these steps:

1. Prepare your data: Make sure that your data has both a
categorical column that represents the categories and a
numerical column that represents the values that are
linked with each category. This is an important step in the
data preparation process.
2. Open Power BI Desktop: Launch the Power BI Desktop
application on your computer.
3. Connect to the data source: Connect to the data source
you want to use, which might be an Excel file, a database,
or a cloud service, and load the data into Power BI.
4. Drag and drop the desired fields: From the Fields pane,
drag the category column (for example, Product Category) to
the Axis field well, and then drag the numerical column
(for example, Sales Amount) to the Values field well.
5. Select the chart type: Choose the chart type by clicking
the "Column Chart" or "Bar Chart" button in the
Visualizations pane, depending on your preference. Power BI
will generate the chart automatically and display it on the
canvas.
6. Customize the chart: Adjust the look of the chart by
making use of the formatting options included in the
Visualizations pane. You can do this by making changes to
the chart's colors, labels, axes, and legends, among other
things. You can also activate extra features like data labels
or trendlines.
7. Interact with the chart: Users can interact with the
chart by selecting certain bars or columns, adding filters,
or drilling down into more detailed views.

In Power BI, column and bar charts let you display and compare data
across categories more effectively, which in turn yields more
insights and a deeper understanding of your data.
The construction of the x- and y-axes in column and bar charts is
usually quite straightforward: we plot the values along two
dimensions and compare them. When we wish to compare one measure
against another, some of these charts also offer a secondary y-axis.

Power BI is equipped with the following visuals by default in the
column and bar chart category (listed in the order that they appear
in the Visualizations pane, from left to right and top to bottom,
skipping past visuals that aren't relevant to this section of the
text):
• Stacked bar chart
• Stacked column chart
• Clustered bar chart
• Clustered column chart
• 100% stacked bar chart
• 100% stacked column chart
• Waterfall chart
Stacked Bar and Column Charts
Stacked bar and column charts achieve the same result with different
orientations; the only real difference is which axis carries the
categories. As a general rule of thumb, use bar charts when comparing
discrete values and column charts when measuring against continuous
data, such as time. This isn't a hard-and-fast rule, though. For this
chart, we are going to look at the difference between students for
whom this is their first class in the department and those for whom
it is not. We would like to know their average score and see how many
office hours were attended by each group.
In the figure below, we can see both the bar and column examples. The
numbers you layer on top of one another in these two charts (in this
example, the average assignment grade and the total number of office
hours spent by each group) form a summation of the elements that make
up the visualization. This provides a combined value that lets us
easily compare several columns and discover outcomes that may not
make sense at first. Seen side by side, both charts take identical
inputs from the Visualizations pane: you set an axis; you can add a
legend as a category to further subdivide each bar or column; and you
supply the values you wish to analyze, along with optional small
multiples and tooltips. The tooltip appears when you move your mouse
cursor over a certain area of the visual. When the tooltip is left
empty, Power BI compiles a list of values for it based on the
contents of the visual. Think of it as an instant table you can
display for a certain collection of facts to aid in reading the
visual. You can also add information to the tooltip that isn't
otherwise shown in the visual.
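The "summation of the elements" described above can be made concrete in a few lines of Python. This is only an illustration of the arithmetic behind a stacked bar, using hypothetical numbers rather than the book's dataset:

```python
# Hypothetical stacked-chart input: category -> {legend value: measure}.
stacked = {
    "First class": {"Avg grade": 78.0, "Office hours": 12.0},
    "Returning":   {"Avg grade": 84.0, "Office hours": 7.0},
}

# Each bar's total height is the sum of its layered segments.
bar_totals = {cat: sum(parts.values()) for cat, parts in stacked.items()}
print(bar_totals)  # {'First class': 90.0, 'Returning': 91.0}
```

This combined value is exactly what lets you compare whole bars at a glance while the legend still shows each segment's contribution.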
Clustered Bar and Column Charts
Power BI has become a potent tool for analyzing and displaying
large, complicated data sets in the area of data visualization. The
best charts for comparing data within categories are clustered bar
and column charts. Each category is represented by a distinct color
or pattern, and the data is shown as vertical columns or horizontal
bars. The bars or columns can be grouped to make it simple to
compare data side by side.
The simplicity of clustered bar and column charts is one of their key
benefits. They provide a simple visual representation of the data,
making it easy for consumers to understand and evaluate. The labels
on the chart's axes denote the categories being compared, and the
bars or columns graphically depict the data for each category. Their
simplicity makes clustered bar and column charts excellent for both
technical and non-technical users.
Here are the steps to create and use clustered bar and column
charts in Power BI:
Step 1: Import or connect to the data source
Open Power BI and connect to or import the data source containing the
information you wish to visualize. This can include various data
sources, such as Excel spreadsheets, CSV files, SQL databases, and
cloud-based services like Azure SQL Database or SharePoint.
Step 2: Select the clustered bar/column chart visualization
Once the data source is connected, select the clustered bar/column
chart visualization from the Visualizations pane on the right. Its
icon is depicted as vertical columns or horizontal bars.
Step 3: Assign data fields to appropriate axes
Drag and drop the category field (such as product categories or
periods) to the axis section of the clustered bar/column chart
visualization. This field determines the categories along the axis.
Then select the numerical field(s) and assign them to the values
section of the visualization (e.g., sales, quantities). These values
determine the height of the columns or the length of the bars.
Step 4: Customize the chart appearance
Power BI offers a wide range of customization options to improve the
clustered bar/column chart's visual appeal. By choosing the chart
and utilizing the formatting options offered in the formatting window,
you can change the colors, fonts, titles, legends, and axis labels.
Step 5: Apply sorting and filtering
You can sort the axis field to arrange the categories sensibly. To do
this, select the axis field in the visualization pane and choose the
appropriate sorting options. You can also apply filters to the data
by opening the filter pane and specifying the filters you wish to
use. This helps focus the data shown in the clustered bar/column
chart on specific conditions.

Step 6: Interact with the chart


You can interact with the clustered bar/column chart once it has
been made and altered to better analyze the data. You can examine
tooltips providing in-depth information about certain data points by
hovering your cursor over the bars or columns. You can also dig
down to further related visualizations or pages in your Power BI
report by clicking on the chart.
100% Stacked Bar and Column Charts
For showing the relative contributions of several categories or
groups to a total, 100% stacked bar and column charts are the best
option. They depict each category or group as a piece of the whole,
with the height of the columns or the length of the bars indicating
each category's percentage of the total. These graphs provide a clear
visual picture of the contributions that the various categories make
to the overall composition. One of their main benefits is the ability
to display relative proportions and trends: by showing the data as a
percentage of the total, these charts make it simple to compare
categories or groups regardless of the overall value. This is
especially helpful when comparing datasets of different sizes.
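The arithmetic behind a 100% stacked chart is a simple per-bar normalization: each segment is divided by the bar's total. A small sketch with hypothetical subscription counts:

```python
def to_percent_stack(segments):
    """Convert raw segment values to percentages of the bar's total."""
    total = sum(segments.values())
    return {k: round(100 * v / total, 1) for k, v in segments.items()}

# Hypothetical subscription counts for one period (one bar in the chart).
bar = {"Silver": 30, "Gold": 50, "Platinum": 20}
print(to_percent_stack(bar))  # {'Silver': 30.0, 'Gold': 50.0, 'Platinum': 20.0}
```

Because every bar is normalized to 100%, bars from periods with very different totals remain directly comparable.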

Small Multiples
When comparing patterns, trends, or distributions within a dataset
across many categories or dimensions, small multiples are very
helpful. Small multiples provide a full perspective of the data and
make it simple to compare the subsets by organizing multiple charts
in a grid or matrix arrangement. More efficiently than a single chart,
this visualization style aids in the discovery of parallels, contrasts,
outliers, and relationships within the data. The capacity of small
multiples to lighten cognitive strain and enhance data understanding
is one of their main benefits. Small multiples divide the data into
smaller, more understandable parts as opposed to depending on a
single complicated display. As a result, users can concentrate on
each subset separately while keeping the larger context in mind,
which makes it simpler for them to grasp and analyze the data.
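Conceptually, a small-multiples view partitions one dataset into per-category subsets and renders each subset with the same chart definition. A minimal sketch of that partitioning step, using hypothetical region/year rows rather than the book's data:

```python
from collections import defaultdict

# Hypothetical rows: (region, year, value) -- illustrative only.
rows = [("East", 2021, 10), ("East", 2022, 14),
        ("West", 2021, 8), ("West", 2022, 12)]

# Split the data into one subset per region; in a small-multiples
# visual, each subset becomes one panel in the grid.
panels = defaultdict(list)
for region, year, value in rows:
    panels[region].append((year, value))

for region, subset in panels.items():
    print(region, subset)
```

Each panel shares the same axes and chart type, which is what makes side-by-side comparison of the subsets so easy.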
Waterfall Chart
The movement of values from an initial level to a final level through
a sequence of intermediate positive and negative elements is visually
represented by a waterfall chart, sometimes referred to as a bridge
chart. It illustrates how each element contributes to the overall
change and to the final number. In financial analysis, waterfall
charts are often used to show income statements, cash flows, and
other financial measures.
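The geometry of a waterfall chart follows from a running total over the signed movements: each floating bar spans the running total before and after its delta. A minimal sketch with hypothetical income-statement figures (Power BI computes this for you; the code only illustrates the idea):

```python
def waterfall_positions(start, movements):
    """Return (label, bar_bottom, bar_top) extents, plus start and end totals."""
    bars = [("Start", 0, start)]
    running = start
    for label, delta in movements:
        # A positive delta floats upward from the running total;
        # a negative delta hangs downward from it.
        bottom, top = sorted((running, running + delta))
        bars.append((label, bottom, top))
        running += delta
    bars.append(("End", 0, running))
    return bars

# Hypothetical movements from a starting balance of 1000.
moves = [("Revenue", 400), ("COGS", -150), ("Opex", -100)]
for bar in waterfall_positions(1000, moves):
    print(bar)
```

The final "End" bar (here spanning 0 to 1150) is the start value plus the sum of all deltas, which is why the chart reads as a bridge from the initial to the final level.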
Benefits of Using Waterfall Charts in Power BI
Waterfall charts provide numerous benefits for data analysis
and reporting purposes.

1. Waterfall charts offer a holistic perspective on the
cumulative effect of both favorable and unfavorable values
on a metric. This helps users comprehend the diverse
factors that impact the ultimate result.
2. By representing each factor as a distinct bar, waterfall
charts make it easy to identify the most significant
contributors to the overall change, which helps you
concentrate on pivotal factors and comprehend their impact.
3. Waterfall charts are a highly effective tool for
enhancing data storytelling. Their visually appealing and
intuitive nature simplifies the communication of complex
data patterns and trends, and they can effectively present
data-driven narratives to diverse audiences.
4. Waterfall charts are a useful tool for comparing
multiple scenarios or periods side by side. This
facilitates quick comparisons and analyses of the effects
of diverse factors across multiple dimensions.

Creating a Waterfall Chart in Power BI


Now let's explore how to create a waterfall chart in Power BI:

1. Data Preparation: Data preparation is a crucial step in
the Power BI process. Ensure that all the required data is
available in the system; this may necessitate importing
data from various sources and performing data cleaning and
transformation operations as required.
2. Select the Visualization: To choose the appropriate
visualization, launch Power BI Desktop and navigate to the
"Visualizations" pane located on the right-hand side of the
screen. From there, opt for the waterfall chart visualization.
3. Drag and Drop Fields: Utilize the drag and drop
functionality to place the necessary data fields onto the
canvas. In general, it is necessary to have a category field,
such as periods or products, and a corresponding value
field, such as revenue or profit.
4. Configure the Chart: To configure the chart, navigate to
the "Visualizations" pane and assign the category field to
the "Axis" section and the value field to the "Values"
section. Power BI will automatically generate a basic
waterfall chart.
5. Customize the Chart: To customize the chart, access the
"Visualizations" pane and utilize the formatting options
provided to modify its visual presentation. One can
customize various properties, including colors, axis labels,
data labels, and tooltips, to align with their preferences and
specific needs.
6. Add Additional Fields: To improve the chart's quality, you
have the option to add supplementary fields into the
"Legend" and "Tooltip" segments of the "Visualizations"
panel. This facilitates a more comprehensive examination
and furnishes a backdrop for the information.
7. Utilize Analytical Features: Power BI provides an array of
analytical features within the "Analytics" pane. Consider
utilizing features such as data labels, constant lines, and
trend lines to enhance the visual representation of the
chart and extract more profound insights from the data.
8. Save and Share: Once you have customized the waterfall
chart to meet your specific requirements, save the Power BI
report. You can then publish the report to the Power BI
service, export it, or share it with colleagues for
collaborative analysis and reporting.

Line and Area Charts
Line and area charts, similar to column and bar charts, are primarily
centered on the x- and y-axes. The disparity lies in the fact that it is
comparatively effortless to discern patterns or conduct comparisons
using lines or areas, enabling the identification of overlapping data in
a time series or the observation of intersections between specific
categories. Furthermore, within this section, we shall delve into a
collection of charts in Power BI that amalgamate column charts with
line charts.
The list of visualizations in this particular category should be
read from left to right, top to bottom, with the exclusion of any
visualizations that do not belong to this section.
• Line chart
• Area chart
• Stacked area chart
• Line and stacked column chart
• Line and clustered column chart
• Ribbon chart
Line Chart
A line chart bears a resemblance to a column chart, albeit with a
more pronounced ability to discern trends due to the visual clarity of
lines as opposed to columns. Line charts are most effective when
used with a continuous axis. This is because, in the event of a gap,
the line chart will discontinue the line and resume it for the
subsequent value in the axis series.

Upon examining the options presented in the Visualizations pane for
column and line charts, one may observe that they bear a striking
resemblance to each other. However, the line chart boasts an
additional option for Secondary values. This section facilitates
plotting two y-axes against each other, a distinctive characteristic
of line-type charts. This feature can prove advantageous when you
intend to compare the trends of two variables that differ
significantly in order of magnitude.
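A secondary y-axis effectively rescales one series into the visual range of the other so their trends can be compared. The min-max rescaling below is an illustrative approximation of that idea, with hypothetical units and revenue series:

```python
def rescale(series, lo, hi):
    """Min-max rescale a series into the [lo, hi] range of another axis."""
    s_min, s_max = min(series), max(series)
    return [lo + (v - s_min) * (hi - lo) / (s_max - s_min) for v in series]

units = [10, 20, 40]          # small-magnitude series (secondary axis)
revenue = [1000, 3000, 5000]  # large-magnitude series (primary axis)

# Map the units series into the revenue axis range so the two trends
# can be drawn on one chart; endpoints land at 1000.0 and 5000.0.
print(rescale(units, min(revenue), max(revenue)))
```

On an actual dual-axis line chart the rescaling is implicit: each series keeps its own axis labels while sharing the same plot area.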
Creating a Line Chart
Choose a report canvas: Create a new report or open an existing
one. The report canvas serves as the workspace where you design
your visualizations.
Select the line chart visualization: To add a blank line chart to the
report canvas, navigate to the "Visualizations" pane located on the
right-hand side of the screen and select the line chart icon.

• Assign data fields: Drag and drop the relevant fields from your
dataset onto the corresponding sections in the "Visualizations"
pane. It is customary to place the date or time field in the "Axis"
section and the numerical values in the "Values" section.
• Customize the chart appearance: Power BI provides several
customization options to improve the visual appeal and
usability of the line chart. These options allow for the
customization of the chart appearance. Customize the chart
attributes, including colors, captions, axes, gridlines, and
legends, to align with your personal preferences and specific
needs.
• Add lines to the chart: Line charts can accommodate numerous
lines. In certain instances, the values depicted by the lines can
be significantly disparate, resulting in an unsatisfactory display
when presented together. Let us examine the process of appending
supplementary lines to our existing chart, and subsequently acquire
the skill of formatting the chart in cases where the values denoted
by the lines are distinct.
• Add more lines: It would be beneficial to segregate the total
units by region instead of presenting it as a single line on the
chart. This approach would provide a more comprehensive
view of the data and enable a better understanding of the
performance of each region. To include more lines, simply drag
the "Geo > Region" option and drop it into the designated
"Legend" well.

• Add additional fields: To enhance context or facilitate
comparison, it is advisable to append supplementary fields to
either the "Legend" or "Tooltip" sections within the
"Visualizations" pane. For example, to compare sales data across
various regions, position the region field within the "Legend"
section.
• Apply sorting and filtering: The functionality of sorting and
filtering is available in Power BI, which enables the user to sort
and filter the data in the line chart. Make use of the sorting
options to arrange the data points either in ascending or
descending order. Utilize filters to concentrate on particular
subsets of data or emphasize specific periods.
Area Chart
The area chart possesses identical category options to those of a
line chart. An area chart can be conceptualized as a type of line
chart that has the space below the line filled in with values. The area
chart serves two purposes that are beneficial in various scenarios.
The first use case is when you have data that changes over time and
you want to give the reader a sense of the proportion of those
changes; a classic example is examining the fluctuation of a
population over the years. An area chart can also be used to
visualize more clearly where two or more values overlap, or to
identify instances where there is no overlap, which I find to be the
more intriguing scenario.
Stacked Area Chart
A stacked area chart is a chart type that exhibits the cumulative
values of various categories along a continuous axis, such as time.
It is commonly used to show how the composition of a total changes
over time, comparing the contribution of each category to the whole
and illustrating the trend of the combined values.
The stacked area chart in Power BI is a visualization tool that
portrays multiple data series in a stacked format, enabling the user
to visualize their individual and combined contributions. Every data
series is depicted by a colored region, while the collective worth of
each classification is exhibited as the summation of the regions at
any designated point on the x-axis.
The x-axis generally denotes time, including dates or periods, while
the y-axis signifies the numerical values linked with each category.
The presented chart showcases the correlation between various
categories and their cumulative values over some time, providing an
opportunity to scrutinize trends, identify patterns, and comprehend
the relative significance of each category in its entirety. Power BI
offers a range of options for tailoring the appearance and
functionality of a stacked area chart.
One can modify color schemes, incorporate data labels, implement
filters, customize legends, tooltips, and axis properties, and utilize
various other formatting features to optimize the visualization and
effectively communicate the intended insights. Consider a scenario
where you are operating a business that offers a range of
subscription models, namely silver, gold, and platinum. The stacked
area chart is a suitable choice for this scenario as it effectively
displays the silver, gold, and platinum areas in a layered manner,
providing a comprehensive view of both the aggregate and individual
subscription categories.
Line and Stacked Column Chart/Clustered
Column Chart
What does adding a line do for stacked and clustered column
charts? The ability to take a column chart and then overlay a line
value on a different axis can expand the analysis. In the image
below, the average score is broken down by race for the line, while a
clustered column is utilized for the total sum of all the scores.
Upon a cursory glance, it becomes apparent that the demographic
makeup of my class is predominantly White. However, upon further
examination, it is noteworthy that the Black or African American
students have achieved the highest average score.
The presence of the second y-axis indicates that the disparity in
mean scores across my ethnic groups is relatively minor, spanning
from approximately 76 to a maximum of 80.
Ribbon Chart
The Ribbon Chart is a visual tool available in Power BI that
facilitates the comparison and analysis of data across various
categories or groups over time. The data is presented in the
form of stacked ribbons, wherein the height of each ribbon denotes
the relative proportion or value of the corresponding category at a
specific point in time, and the ribbons reorder between periods to
show changes in rank. Ribbon Charts prove to be a highly effective
tool for displaying trends and patterns in data, especially when
dealing with categorical data.
Ribbon Charts in Power BI offer a range of key features and benefits
that can enhance data visualization and analysis. These charts
provide a clear and concise representation of data trends, allowing
users to easily identify patterns and insights. With their intuitive
design and interactive capabilities, Ribbon Charts enable users to
explore data dynamically, making it easier to draw meaningful
conclusions and make informed decisions:

1. Comparative Analysis: Ribbon Charts facilitate the effortless
comparison of data across diverse categories or groups. By
representing the proportions or values of each category through
ribbons, it becomes easier to discern trends and patterns
within the data.
2. Time-based Insights: Ribbon Charts possess an inherent
axis that is specifically designed for the representation of
time or dates. This feature renders them highly suitable for
the analysis of data over a particular duration. The chart
can exhibit the alterations in relative proportions of
categories over time, thereby facilitating the
detection of trends, seasonality, or patterns.
3. Interactive Exploration: The Ribbon Charts in Power BI
offer an interactive exploration experience for users,
enabling them to drill down, filter, or highlight specific
categories or periods. The provision of interactivity
facilitates a more profound examination and evaluation of
the data, thereby endowing users with the ability to acquire
valuable insights and address particular inquiries.
4. Flexibility and Customization: Power BI provides a
multitude of customization options for Ribbon Charts,
allowing for enhanced flexibility. Individuals can alter the
visual components of a chart, including colors, labels,
tooltips, and legends, to improve its legibility and
effectively communicate information.
5. Integration with Other Visuals: Ribbon Charts can be
integrated with other visuals and elements on the Power BI
canvas, thereby enabling the creation of comprehensive
dashboards and reports. These can be associated with
slicers, filters, or other visuals to offer a unified analytical
experience.

It is noteworthy that Ribbon Charts, despite their efficacy in data
visualization within Power BI, may not be universally applicable in all
scenarios. When selecting the appropriate visualization type, it is
imperative to take into account the characteristics of the data, the
intended message, and the target audience.
Donuts, Dots, and Maps
Donut charts, dot plots, and maps are three distinct visualization
techniques frequently employed in Power BI to present and evaluate
data.
Let us examine each one of them.

1. Donut Charts: Donut charts are circular visualizations that
bear a resemblance to a donut, featuring a hole in the
center. They are utilized to depict the distribution or ratio of
diverse categories within a whole, making them a valuable
tool for visually representing the proportional composition
of data. Every category is depicted by a segment of the
donut, and its magnitude is proportional to its share of the
entirety. Customization options are available for Donut
Charts, including labels, colors, and tooltips, which can
improve their clarity and visual appeal.
2. Dot Plots: Dot plots are a straightforward yet impactful
means of presenting and contrasting data points along a
singular axis. The chart comprises a line oriented either
horizontally or vertically, accompanied by discrete dots or
markers that symbolize each data point. Dot plots are a
particularly valuable tool for illustrating the distribution,
ranking, or comparison of values within a given dataset.
Categorical or discrete data can be effectively handled
through their utilization. Dot plots can be tailored to specific
needs through the use of labels, colors, and markers. This
allows for the communication of supplementary information
or the emphasis on particular data points.
3. Maps: Maps are a form of visualization that employs
geographic data to depict information spatially. The Power
BI platform provides users with the convenience of built-in
mapping capabilities, enabling them to showcase their
data on interactive maps. Maps are a valuable tool for
displaying data that is regionally or location-based.
This can include information such as sales figures broken
down by region, the distribution of customers, or population
density. Power BI maps can showcase data through a range of
visual elements such as heat maps, bubbles, choropleth maps,
and filled maps. Maps can be enriched with tooltips, drill-down
functionalities, and interactive elements to offer comprehensive
insights and facilitate the exploration of geographical
information.
These visualizations, namely Donut Charts, Dot Plots, and Maps, offer
diverse means of depicting and scrutinizing data in Power BI. This
enables users to effectively communicate information and extract
valuable insights from their datasets.
Through the selection of the most fitting visualization format, taking
into account the characteristics of the data and the desired analytical
outcomes, Power BI users can craft compelling and enlightening
reports and dashboards.

Funnel Chart
The funnel chart is a versatile visualization tool. Fundamentally, a
funnel chart contrasts one set of data against another, typically the
successive stages of a process, to show how close together or far
apart they are. Below are the essential features and characteristics
of Funnel Charts in Power BI.

1. Sequential Representation: Funnel charts portray a
sequence of phases or actions within a given process.
Every stage is depicted by a horizontal bar or trapezoid
shape, wherein the width of the bar is commensurate with
the amount or value of data present at that specific stage.
2. Data Reduction: The utilization of Funnel Charts
showcases the step-by-step decrease of data as it
advances from one phase to another. The bars' width
exhibits a gradual reduction, signifying a corresponding
decrease in the magnitude or value of the data at each
successive phase.
3. Conversion Analysis: The analysis of conversions is
often conducted through the utilization of Funnel Charts.
These charts are a popular tool for assessing the drop-off
rates that occur at various stages of a given process.
Through the comparison of bar widths, it is possible to
readily identify bottlenecks or areas where data or potential
customers are lost during the progression of the process.
4. Focus on Proportions: The emphasis of Funnel Charts
lies in the relative proportions of data between stages,
rather than absolute values. The visual representation
facilitates quick and
effortless comparison of the respective magnitudes of each
stage, thereby accentuating the importance of conversion
rates or drop-off rates.
5. Customization Options: Power BI offers a range of
customization options that can be utilized to elevate the
visual appeal and functionality of Funnel Charts.
Individuals can personalize the colors, labels, tooltips, and
additional visual components to enhance the chart's
aesthetic appeal and informational value.
6. Interactivity and Drill-Down: The Funnel Charts in Power
BI provide interactive capabilities, including drill-down
functionality. The chart allows for user interaction through
the ability to drill down into specific stages or apply filters,
providing a more comprehensive understanding of the
data. The flexibility inherent in this approach enables the
user to engage in dynamic exploration and analysis of the
underlying data.

Funnel charts are a highly effective means of visually representing
and analyzing data in situations where a sequential flow or
conversion process is present. They assist in identifying areas of
improvement, optimizing conversions, and making decisions based
on data. Funnel Charts in Power BI offer a lucid and user-friendly
means of visualizing and comprehending data, whether it pertains to
sales, marketing, or any other process characterized by a defined
flow.
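The stage-to-stage drop-off that a funnel chart encodes can also be computed explicitly as a measure. The sketch below is illustrative only: the Leads table, its Stage column, and the measure names are hypothetical placeholders, not part of any standard model.

```
-- Hypothetical model: a Leads table with one row per lead and a Stage column.
-- Count of leads in the current filter context (e.g., one funnel stage):
Leads in Stage = COUNTROWS ( Leads )

-- Share of all leads that reached the current stage, ignoring the Stage filter:
Stage Conversion % =
DIVIDE (
    [Leads in Stage],
    CALCULATE ( [Leads in Stage], REMOVEFILTERS ( Leads[Stage] ) )
)
```

Placing Stage in the funnel's Category well and a measure like Stage Conversion % in the tooltips would give each bar its share of the starting population.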
Scatter Chart
The Scatter Chart is a data visualization tool in Power BI that
presents data points as distinct markers on a Cartesian coordinate
system. This method is utilized to explore the correlation, patterns,
and trends between two numerical variables. Scatter charts prove to
be highly advantageous in the analysis of extensive datasets or the
comparison of data across diverse dimensions. Scatter charts
employ a two-dimensional Cartesian coordinate system, wherein one
variable is plotted on the x-axis, and the other variable is plotted on
the y-axis. This facilitates effortless visualization and analysis of the
correlation between the two variables. Each data point is
represented as an individual marker or dot on the chart. The location
of the data point on the Cartesian plane is indicative of the values of
the corresponding variables. Scatter charts facilitate the identification
of relationships between two variables. The correlation between
variables can be positive, negative, or nonexistent, thereby indicating
the degree to which changes in one variable are associated with
changes in the other variable.
Scatter charts can exhibit the density or concentration of data points
within a specific area of the chart. This technique can aid in the
identification of regions with significant or insignificant data density
and unveil patterns or clusters within the data. Multiple data series or
categories can be included by assigning distinct colors, shapes, or
sizes to the markers. This feature facilitates the comparison of data
among diverse groups or dimensions within a single chart.
The charts are equipped with tooltips that offer supplementary
details of individual data points upon hovering over them. The chart
can be interacted with by users through various means such as
selecting data points, applying filters, or drilling down to explore
specific subsets of the data.
In addition, it enables the incorporation of trend lines or regression
analysis to offer additional insights into the correlation between the
variables. Trend lines serve to provide a visual representation of the
general trend or trajectory of the data.
Pie and Donut Chart
Both pie charts and donut charts are examples of circular data
visualizations that are commonplace in Power BI and other analytical
software. They are useful for illustrating the composition or
distribution of categorical data, and they are successful in doing so.
An overview of Pie Charts and Donut Charts is as follows:
1. Pie Chart: A pie chart is a circular chart that is subdivided
into sectors, and each sector represents a different
category or a percentage of the entire. The amount of
information that a particular sector "represents" is directly
proportional to its size within the chart.

The relative contribution that various categories provide to a
dataset can be visually represented with the use of pie charts.
They can be personalized in a variety of ways, such as with
labels, colors, and tooltips, which helps to improve their
readability and provide more information.

2. Donut Chart: A Donut Chart is a variation of the Pie Chart
that has a hole in the middle, creating a look similar to that
of a doughnut. The ring as a whole represents the total, or
one hundred percent, while its segments show the distribution
of categories. Donut Charts are comparable to Pie Charts in
terms of the visual representation of proportions; however,
the Donut Chart's central hole can accommodate extra
information, such as a summed-up value, a total, or any
other pertinent figure.
Both Pie and Donut Charts have their uses and considerations:
• Comparison of Proportions: Pie and Donut Charts are useful
tools for making a comparison of the proportions or
percentages of various categories included within a dataset.
They give a visual picture of how each category contributes to
the overall.
• Limitations with Data: It is essential to be aware that Pie and
Donut Charts may not be appropriate for datasets that include
a high number of categories or when the proportions are
comparable. A chart with an excessive number of categories
can seem crowded and difficult to understand. In
circumstances such as these, other kinds of charts, such as
bar charts or treemaps, could be a better fit.
• Data Labels and Exploding Slices: Pie and Donut Charts
can incorporate data labels to illustrate the precise numbers or
percentages associated with each category, which boosts
readability and clarity by presenting the information more
concisely. Additionally, individual sectors or slices can be
"exploded," that is, pulled away from the center, to highlight a
certain category or bring emphasis to particular data points.
• Interactivity and Drill-Down: Power BI offers interaction with
these chart types. Users can interact with the chart by
choosing or filtering certain categories to explore more
comprehensive information or to examine related data in
various visuals. This can be done in a variety of ways.
It is vital to evaluate the dataset, the number of categories, and the
insights you wish to express when using Pie or Donut Charts. It is
helpful to pick the proper chart type to effectively convey the needed
information when one has a good understanding of the context and
purpose of the visualization being used.
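The proportion that a pie or donut slice represents can be made explicit with a percent-of-total measure. This is a sketch only; Sales[Amount] and the measure names are hypothetical placeholders.

```
-- Hypothetical model: a Sales table with an Amount column,
-- sliced by whatever category sits on the chart's legend.
Total Sales = SUM ( Sales[Amount] )

-- Share of the grand total, ignoring all report filters:
Sales % of Total =
DIVIDE (
    [Total Sales],
    CALCULATE ( [Total Sales], REMOVEFILTERS () )
)
```

A measure like this, shown as a data label or tooltip, states the exact proportion that each slice's area only approximates visually.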
Treemap
Displaying hierarchical data in the form of nested rectangles is the
purpose of the Treemap data visualization technique, which can be
found in Power BI and other analytics applications. It presents the
data in a hierarchical format, with several levels of a category or
dimension represented by rectangles of different sizes and colors.
When it comes to effectively depicting the relative proportions and
patterns contained inside hierarchical data structures, treemaps are
among the most useful tools. When visualizing hierarchical data
structures, treemaps are useful because they depict each level of the
hierarchy using nested rectangles. The root node is represented as
the largest rectangle in the display, and its children are shown as
smaller rectangles nested inside it. This nesting continues for
sub-levels, which enables the visual study of
the hierarchical connections between the nodes. The percentage or
weight of the data that is represented by each rectangle in the
Treemap is indicated by the size of that rectangle: the bigger the
rectangle, the greater its value relative to the others.
Because of this, it is possible to easily identify the
categories within the hierarchy that have the greatest significance.
To transmit more information, treemaps can use color as an
additional encoding technique. Color can be used to symbolize a
distinct dimension, or it can be used to emphasize certain categories
according to particular standards or measures. It has interactive
elements in its design.
Users can concentrate on certain subsets of the data by applying
filters, collapsing or expanding branches, or drilling down into
specific levels of the hierarchy. This capacity enables dynamic
exploration and study of the hierarchical data, and it is accessible via
an interactive interface. Options for personalization are available for
Treemaps using Power BI. Users can tailor the colors, labels,
tooltips, and other visual aspects to meet their requirements and
improve the visual attractiveness of the Treemap. When the mouse
is moved over a rectangle in a treemap, a tooltip can be shown that
provides further information about that rectangle. Labels can also be
placed on the rectangles to offer labels or data values. This improves
the readability of the visualization as well as one's capacity to
comprehend it.
When it comes to the visualization of hierarchical data, such as
organizational hierarchies, product categories, portfolio breakdowns,
or file directory structures, treemaps are among the most helpful
tools. Users can spot patterns, outliers, and trends within the
hierarchical structure because they give a method that is both
compact and effective for representing the proportions and
connections contained within the data. When working with
Treemaps, it is vital to take into consideration the amount of
complexity in the hierarchy as well as the clarity of the labels or
tooltips to guarantee that the visualization successfully conveys the
information and insights that you want to express.
Map Visuals
Displaying and analyzing geographical or location-based data can be
accomplished with the help of Power BI's map visuals. Users can
explore geographical correlations, patterns, and trends because
these visuals offer a depiction of data points plotted on a map.
Power BI provides several alternative
map visualizations to accommodate a wide variety of requirements
and kinds of geographic data.
Here are some key map visuals in Power BI:
• Basic Map: The Basic Map visual presents a conventional
map view and includes the capability to plot data points or
shapes on the map. It can handle many different types of
maps, such as road maps, satellite images, or a mix of the two.
Users have the option of representing their data spatially using
individual points, lines, or polygons that can be plotted.
• Shape Map: The Shape Map visual gives users the ability to
plot data on any map shape, such as nations, states, or
regions. Users may also customize the forms of the map. It can
handle individualized polygons as well as predetermined
borders for a variety of geographical locations. Visualizations
using shape maps help show data at various administrative
levels or in areas that have been specifically designated by the
user.
• ArcGIS Map: The ArcGIS Map visual integrates with Esri's
ArcGIS platform to provide sophisticated mapping capabilities.
It provides extra data layers, which may include demographic
data, the locations of businesses, information about the
weather, and perhaps more. Users can make use of ArcGIS
maps to create visualizations that are both interactive and
instructive by combining many layers of geographic data.
• Filled Map: The Filled Map visual is used to show data in a
choropleth style, in which geographic areas are shaded or filled
depending on the value of a given measure. It is very helpful for
comparing data from various places as well as identifying
geographical patterns or trends.
• Heat Map: The Heat Map is a visual representation of data
density that uses colors or gradients to illustrate the intensity or
concentration of the data. It helps identify places on the map
that have a high or low data density. The density of a
population, the distribution of customers, or any other data that
varies spatially in intensity can be analyzed with the use of
heat maps.
• Bubble Map: The Bubble Map is a visualization that presents
data as bubbles or markers on a map, with the size of each
bubble reflecting a particular measure or value. It makes it
easier to see the distribution, size, or comparability of data
over a variety of geographic regions.
Users can zoom in or out, apply filters, and examine comprehensive
information about individual locations or data points on the map
thanks to the interactive and drill-down features offered by the map
visuals in Power BI. They also offer extra features like tooltips
and legends to improve the user's understanding and analysis of
the data. When dealing with the map visuals in
Power BI, it is crucial to make sure that the geographic data is right,
that it is structured appropriately, and that it has been geocoded so
that it plots accurately on the map. Users can successfully analyze
and convey geographic information inside their reports and
dashboards by making use of the appropriate map visual depending
on the nature of the data and the intended insights. This allows users
to effectively communicate geographic information.
The “Flat Visuals”
Flat visuals are meant to display information directly for the reader
to take away at a glance. One of these visuals, the
slicer, is the glaring exception to this rule. In general, these visuals
do not cross-filter one another; rather, they are cross-filtered
themselves. These visuals may seem straightforward, but they
often provide that mystical extra piece of context to a
report, which transforms the facts into a narrative that makes sense.
They do this by emphasizing the particular aspects of what you want
your audience to grasp more than anything else.
The following examples of visuals can be found in this
category:
● Gauge
● Card
● Multi-row card
● KPI
● Slicer
● Table
● Matrix
Gauge
One of the earliest kinds of data visualization that we have at our
disposal is the gauge. For the better part of the 20th century, they
were used in a variety of equipment and engineering projects. In its
most basic form, a gauge provides you with information on the
current status of a value in relation to its lowest value, its maximum
value, and its goal value. The gauge is useful for defining goals and
providing a fast way to determine whether or not you are currently
exceeding or falling short of those goals. Additionally, it can be
helpful when you want to maintain a number within a certain range
so that it is neither too hot nor too cold.
Card/Multi-Row Card
The card visual in Power BI is one of the most straightforward
visuals available. It is also one of the names whose meaning can be
deduced by just looking at it. When a value is provided to it, it takes
that value and places it on a square card along with a short
explanation of what it is that is being seen. In my opinion, the card is
effective in both of these areas. First, because of its straightforward
nature, it is not difficult to comprehend the content that you are
reading. When you combine this with the capability of the card visual
to cross-filter, you have a tool that can simply and rapidly emphasize
a particular value for any combination of data. Second, it assists the
narrative process by directing the reader's attention to certain values
that you believe they should care about. These values can be
significant in and of themselves, or they may give the required
context that makes the other visuals on your report page more
relevant. Either way, it is beneficial to the storytelling process.
The multi-row card visual is analogous to the relative of the card
visual that your aunt and uncle continuously gush about at
Thanksgiving, but you have your doubts about whether or not they
are all that wonderful. Unapologetically cramming several values into
a single card-like area is the goal of the multi-row card. If all of
the numbers are aggregates, then it will have a somewhat clean
appearance. However, if you have aggregations and then some
dimension that categorizes those aggregations, the multi-row card
will construct several rows on the card, showing the aggregated
values for each category value. These rows will be shown in the
order that the categorical values appear on the card.
KPI
The key performance indicator (KPI) visual is one of those things
that, on the surface, seems like a wonderful idea, but ultimately ends
up driving people crazy. Although it is not quite as confusing as the
scatter chart, the KPI visual might be difficult to understand at first.
There is a field that serves as an indication, and the value of this
field is what you are truly measuring. There is something called a
trend axis, which is how the visual will present the findings on an x-
axis that isn't otherwise displayed. Then there is the target, which
we should be either above or below at this point. The
comparison of the current year's revenues to those of the prior year's
revenues broken down by fiscal month is a typical example of a KPI
use case. The most annoying aspect of the KPI is that it always
shows the value for the most recent point on the axis, regardless of
what occurred before it, even though it will depict those earlier data
in the chart portion of the visual. This is a waste of everyone's time.
Table/Matrix
The table is exactly what it sounds like. There are no extra features
or bonuses here. In the Visualizations pane, the table visual has
only one field well, Values. Everything you add is a field or
column, and each value displayed is associated with the column it
belongs to. It could seem contradictory to include a table visual
in a data visualization tool; nonetheless, the table visual can give
some additional particular information that can provide more
contexts. When it comes to emphasizing certain data points, I often
find that a table, as opposed to a multi-row card, works best for me.
Another nice thing done with table visuals is placing them at the end
of reports. This makes it simple for analysts who may want to utilize
the data from that table for other types of analysis to extract the data
from the table quickly and easily.
The matrix is more analogous to a pivot table in Excel since it has
the capability of having many layers of row attributes into which one
may delve. Matrices can be overdone; however, when
they are utilized to highlight certain data sets or combinations of
data, they can help spotlight specific things for readers or offer
further context to analysts, allowing them to figure out what
questions they need to address as rapidly as possible. For a good
number of years, Power BI would first place the data in a table
before exporting it, which caused the matrix formatting to be lost
when the data was exported from a matrix visual. However, as of
recently, this is no longer the case; thus, if you need to export data in
the format of a matrix, maybe for a presentation or anything else,
you can do so.
Slicer
There isn't a huge amount of difference between using a slicer and
selecting a column in the Filters pane by clicking the "Filters on this
page" button. What is different, however, is that as a visual on a
report page, you can alter how it interacts with every other visual on
the report page by utilizing the "Edit interactions" function on the
Format tab of the ribbon, and slicers can be synchronized across
pages of your choice by using the "Sync slicers" pane. Because of
this, slicers can become far more versatile and user-friendly for the
people reading your report. This provides you, the author, with an
additional opportunity to highlight the particular aspects that, in your
opinion, should be emphasized to assist your audience in
comprehending how they should be thinking about the content.
Conclusion
In conclusion, data visualization in Power BI is essential for gaining
relevant insights and making sensible decisions. The Visualizations
Pane acts as a central location for producing and modifying data
visualizations, giving users access to a variety of visual formats for
efficient information communication. Users can explore and analyze
data dynamically and intuitively by using visual interaction, revealing
patterns, trends, and outliers. Power BI's data analysis and decision-
making process both heavily rely on data visualization. Data
visualization improves data interpretation, permits trend detection,
and aids in successful communication, whether it be via column and
bar charts, map visuals, "flat visuals," tables and matrices, slicers, or
other visual kinds. Users may get useful insights from their data,
enhance reasoned decision-making, and promote corporate success
by using the power of visual representations.
CHAPTER 5
AGGREGATIONS, MEASURES, AND
DAX
A Primer on the DAX Language
Data Analysis Expressions, often known as DAX, is a multipurpose
query language that can be used to retrieve particular results or to
generate calculated tables or columns. Analysis Services Tabular
and Power BI both utilize DAX as the language of their respective
formula engines. When you build a new visual in Power BI, the
program automatically creates DAX behind the scenes to get the
data from the Power BI storage engine. However, much like
everything else in Power BI, DAX operates on a
column-by-column or table-by-table basis. DAX has
an extraordinary amount of capability. This is not the case in
software such as Excel, where everything is specified down to the
level of each cell. In contrast to Excel, DAX cannot address
individual cells or points in space; it works on whole tables and
columns. Keep in mind that the fundamental structure
of our database is columnar; thus, we want a language that can edit
and query columns of data rather than individual cells of information.
It is considerably more powerful to be able to change and query
columns since it is quicker, and you don't have all of the overhead of
having to be able to write formulae against particular cells. Having
this ability also makes it easier to work with large amounts of data.
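To make this column-oriented nature concrete, here is a minimal sketch. The Sales table and its column names are hypothetical; the point is that every expression addresses an entire column or table, never an individual cell.

```
-- A calculated column: evaluated once per row of the Sales table,
-- but defined against whole columns rather than cell addresses.
Line Total = Sales[Quantity] * Sales[UnitPrice]

-- A DAX query: EVALUATE returns a table, again built from columns.
EVALUATE
SUMMARIZECOLUMNS (
    Sales[Category],
    "Total Quantity", SUM ( Sales[Quantity] )
)
```

Notice that nothing in either expression resembles Excel's A1-style cell references; the engine resolves everything at the level of columns and tables.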
Measures
The idea of measures is one of the most essential aspects of the
DAX programming language, and it is very important for both the
data analysis and the reporting processes. Measures are, in a
nutshell, calculations that are carried out on the fly at query
time. They allow
us to extract aggregated numbers, run calculations over several
rows or tables, and assess sophisticated business logic. Answers to
inquiries like "What is the total sales revenue?" or "What is the
average customer satisfaction score?" are often provided using
measures.
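Those two example questions translate almost directly into measures. The table and column names below are hypothetical placeholders:

```
-- Hypothetical columns: Sales[Revenue] and Survey[SatisfactionScore].
Total Sales Revenue = SUM ( Sales[Revenue] )

Average Customer Satisfaction = AVERAGE ( Survey[SatisfactionScore] )
```

Dropped onto a card or chart, each measure recalculates automatically under whatever filters the report applies.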
To construct a measure in DAX, we use the DEFINE MEASURE
syntax in a query or, more commonly, the user interface offered by
tools like Power BI. Measures are often connected with a particular table in
the data model and are assessed within the context of that table.
However, they can also reference columns from other tables via the
connections between the tables. Expressions written in DAX, which
include functions, operators, and pointers to columns or other
measures, are used to generate measures. DAX has a
comprehensive set of functions, which can be used to carry out a
variety of calculations, aggregations, filtering, and other operations.
These functions make it possible for us to compose strong
expressions that can extract valuable insights from the data we
provide.
One of the most important characteristics of measures is their
capacity to dynamically adjust to varying degrees of granularity in the
data they summarize. For instance, a measure of sales revenue can
either present the total revenue for the whole dataset or dynamically
alter itself to provide revenue at various levels of aggregation, such
as by year, month, or product type. Because of its adaptability, this
system enables drill-down analyses as well as the examination of
data at varying degrees of detail. The context in which measures are
assessed is also taken into consideration. The term "context" refers
to the collection of filters or criteria that are applied to the data before
any calculations are carried out. DAX contains functions like
CALCULATE and FILTER that allow users to change the context
and exert control over the calculation of measures. Because of this
understanding of context, it is possible to do more complex
calculations and use dynamic filtering. When establishing
measures, it is imperative to consider the data types involved and
to ensure that they correspond to the anticipated
outcomes. DAX is capable of working with a
wide range of data kinds, including numeric, text, Boolean, date/time,
and many more. To guarantee that the calculations are accurate and
consistent, it is essential to choose the proper data type for each
measure.
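As a hedged sketch of context manipulation with CALCULATE and FILTER (the Sales and Customer table and column names are assumptions):

```dax
-- Evaluate total sales under a modified filter context:
-- any existing filter on Customer is removed and replaced
West Region Sales =
CALCULATE (
    SUM ( Sales[Amount] ),
    FILTER ( ALL ( Customer ), Customer[Region] = "West" )
)
```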
Conditional logic, branching statements, and iterators can all be
used to improve measures. DAX provides functions such as IF and
SWITCH, as well as iterators such as SUMX and AVERAGEX,
which allow us to do calculations based on specified circumstances
or traverse over a table while applying filters. Because of these
properties, we can construct measures that are more sophisticated
and dynamic, which allows us to satisfy the criteria of more
complicated analyses. In addition to mathematical calculations,
measures can also perform text concatenation, formatting, and
other string operations. This lets us build measures that return
informative, well-formatted text and that combine values from
multiple columns or tables into a single result.
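A sketch of these features, again with hypothetical table and column names: an iterator combined with a condition, and a measure that returns formatted text:

```dax
-- Row-by-row revenue, restricted to large orders
Large Order Revenue =
SUMX (
    FILTER ( Sales, Sales[Quantity] >= 10 ),
    Sales[Quantity] * Sales[UnitPrice]
)

-- A text-valued measure combining a condition with formatting
Sales Status =
IF (
    SUM ( Sales[Amount] ) >= 100000,
    "On target: " & FORMAT ( SUM ( Sales[Amount] ), "#,0" ),
    "Below target"
)
```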
Lastly, measures are used as the basis for the development of
interactive visualizations and reports. They can be used in charts,
tables, and other types of visual components to highlight important
metrics and make data-driven decision-making easier. Measures
provide stakeholders with the vital information they need to analyze
trends, draw comparisons, and get useful business insights.
Calculated Columns
Calculated columns are a useful feature of the DAX programming
language that provides users the ability to add additional columns to
a table that are determined by the results of their individualized
calculations. A calculated column in DAX is defined by a formula
that is evaluated once for each row of a table. The result for
each row is stored in that row's cell of the new column.
Calculated columns are typically created during the data modeling
process, at which point they become part of the table structure
and are recomputed whenever the model is refreshed.
A formula for a calculated column can make use of DAX functions
and operators, as well as references to other columns included
within the same table. This allows you to execute calculations that
include many columns or apply sophisticated logic to obtain new
values. For instance, you can design a calculated column that
determines the profit margin by deducting the cost from the selling
price and displaying the result. Calculated columns enable users to
generate new data components, which may either give extra context
or additional insights. This is one of the most significant advantages
of using calculated columns. When added to the table, these
columns become a component of it and can be put to use in a variety
of ways, including calculations, filtering, sorting, and so on. When
you need to do calculations that include numerous rows, or when
you want to build dimensions based on existing data, they can be
very helpful. Another use case in which they come in handy is when
you want to create a pivot table.
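The profit-margin example mentioned above could be sketched as a calculated column; the Sales[SellingPrice] and Sales[Cost] column names are assumptions:

```dax
-- Evaluated once per row; DIVIDE returns blank instead of
-- erroring when the selling price is zero
Profit Margin =
DIVIDE ( Sales[SellingPrice] - Sales[Cost], Sales[SellingPrice] )
```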
Calculated columns are evaluated at the row level, which means the
formula is applied to each row in the table individually. This
makes it possible to perform calculations that depend on the
conditions or values of each specific row.
For instance, you can establish a calculated column that provides a
rank to consumers based on a specified set of criteria, or you can
classify customers following the history of their purchases. When
using calculated columns, the impact these columns have on the
size and performance of the data model is an essential aspect to
take into account. Because the calculated values must be stored
for every row, calculated columns increase the table's storage
requirements. In addition, calculated columns are recomputed when
the data is refreshed, and the larger model can slow query
processing. It is essential to use calculated columns judiciously
and to weigh the storage and performance trade-offs associated
with them.
Calculated columns can also be used as inputs for measures,
enabling more complex forms of calculation and analysis. Measures
are dynamic calculations, evaluated at query time, that often
aggregate data from many rows or tables. By using calculated
columns as inputs for measures, you can deepen the analysis and
extract insights that would not be achievable with measures
alone. When constructing calculated columns, it is critical to
choose the right data types for the new column that is being created.
DAX is capable of working with many different kinds of data,
including numeric, text, Boolean, date/time, and more. The correct
representation of data and precise calculations may both be
achieved by carefully selecting the appropriate data type.
Calculated Tables
Calculated tables are a useful feature of the DAX programming
language that provides users the ability to generate new tables in a
data model depending on the results of their custom calculations or
filters. During this conversation, we are going to explore the notion of
calculated tables, as well as their many advantages and the
productive ways in which they can be used while doing data
analysis.
A calculated table in DAX is produced by a table expression (the
same kind of expression that follows an EVALUATE statement in a
DAX query). DAX functions, operators, and references to previously
created tables can all be included in the table expression.
Evaluating the expression produces a new table according to the
logic that has been provided. Calculated tables provide a means of
creating additional tables within your data model. These tables
are not kept in the underlying data source; rather, they are
computed from the model's own data when the model is refreshed.
This gives you the ability to build tables that each represent a
different subset of the data or to apply specialized filters to
the tables that are already available.
Calculated tables may make complicated data modeling situations
much easier to understand and work with, which is one of the
primary advantages of using them. They make it possible for you to
construct tables that compile data from different sources or tables,
apply individualized filters, or carry out calculations based on a set of
predetermined requirements. Calculated tables can be very helpful
for cross-table calculations, such as building a table that
combines sales data with customer demographics.
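One hedged way to sketch such a cross-table summary, assuming hypothetical Sales and Customer tables that are related in the model:

```dax
-- One row per customer segment, with total sales attached
Sales by Segment =
ADDCOLUMNS (
    SUMMARIZE ( Customer, Customer[Segment] ),          -- distinct segments
    "Total Amount", CALCULATE ( SUM ( Sales[Amount] ) ) -- sales per segment
)
```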
Calculated tables can also be used for the creation of summary
tables or aggregate tables, both of which can save results that have
already been calculated. The quantity of data that has to be
processed can be greatly reduced thanks to the use of these
summary tables, which in turn can significantly enhance the speed of
the query. You can speed up searches and give quicker insights if
you aggregate the data at a greater degree of granularity. The
adaptability of calculated tables is yet another benefit. Because
a calculated table is defined by an expression, it is recalculated
whenever the model is refreshed or its source tables change, so it
always stays in step with the data that underlies it. Note,
however, that calculated tables do not respond to report filters
or slicers at query time.
Note that calculated tables provide certain challenges in terms of
both performance and the complexity of the data model. This is an
essential point to keep in mind. Calculated tables add to the overall
memory footprint of the data model since they are responsible for
storing the results of any calculations that have been performed.
Additionally, complicated calculated tables that contain significant
amounts of data can affect the performance of queries. It is essential
to optimize the use of calculated tables and take into account the
trade-offs these tables provide in terms of the amount of memory
they use and the speed at which they process queries.
Calculated tables are one of the DAX capabilities that can be utilized
in combination with others, such as calculated columns and
measures. They can perform the function of a foundation upon which
more complex calculations or analyses can be constructed. You can
construct robust data models that give comprehensive insights and
meet complicated business needs if you combine calculated tables
with other parts of the DAX language. Calculated tables are also
often used in the "role-playing" of dimensions in the creation of
Power BI. In the most frequent case, you have a fact table that has
numerous dates, and you need to be able to interact with different
dates depending on the reporting need. Only one of the relationships
on your date table can be considered active at any one time.
Therefore, you can make yet another group of dates with the help of
a calculated table to construct yet another connection with that fact
table. Then you will have two alternative data tables that you can
utilize, depending on whatever date you might need to use from a
certain fact table. You can choose to use any of these data tables.
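A minimal sketch of this pattern, assuming an existing date table named 'Date':

```dax
-- A second, identical copy of the date table for the ship-date role
Ship Date = 'Date'
```

You would then relate 'Ship Date'[Date] to the fact table's ship-date column, leaving the original 'Date' table related to the order date.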
Types of Functions
DAX functions are organized into families, also known as types,
which describe the tasks each function is meant to carry out. A
DAX formula is built from one or more DAX functions. Since there
are hundreds of them, most people will never use every single DAX
function that is provided. For this reason, it can be useful to
know how DAX functions are organized in case you ever need to
quickly look up how a certain function works, or if you are trying
to discover a function that suits the sort of analysis or data
manipulation you are attempting to perform.
The following is a list of the several ways that DAX functions
can be categorized:
● Aggregation functions
● Date and time functions
● Filter functions
● Financial functions
● Information functions
● Logical functions
● Math and trigonometric functions
● Other functions
● Parent and child functions
● Relationship functions
● Statistical functions
● Table manipulation functions
● Text functions
● Time intelligence functions
Aggregations, More than Some Sums
When doing data analysis, aggregations entail more than just basic
sums; rather, they involve a variety of calculations that give insightful
new perspectives on the information under consideration. It requires
extracting summary or aggregated values from a dataset and then
calculating those values. Although sums are a typical sort of
aggregation, there are a great many additional varieties that give
quite diverse viewpoints on the data.
We can obtain a more in-depth knowledge of patterns, trends,
and correlations by using aggregations, which enable us to
study data at varying degrees of granularity.
1. Sum: A fundamental kind of aggregation, summation
determines the final value by simply adding up the values
of the various components. It is most often used for
presenting numerical information, such as sales income,
quantity, or cost.
2. Average: The arithmetic mean of a group of numbers can
be determined by the use of an aggregation called the
average. It gives an insight into the primary trend of the
data and is often used to examine measures such as
average customer expenditure or average rating.
3. Count: The count aggregation determines the total
number of items included inside a dataset. It helps
comprehend the magnitude or frequency of events, such
as the number of orders, customers, or transactions.
4. Minimum and maximum: These aggregations find the
lowest and highest values in a dataset, respectively. They
compare all of the values in the dataset. They help identify
extremes and limits, such as the lowest and highest
recorded temperatures or the lowest and highest sales
statistics.
5. Median: The number that represents the "middle" position
in a sorted collection of data is called the median. Since it
is less impacted by extreme values, this aggregation is
beneficial for studying data with outliers or skewed
distributions since it reduces the impact of such factors.
6. Mode: The mode of a dataset is the value that appears
most often within that dataset. It is used to determine
which data points are the most prevalent or well-known,
such as the product or method of transportation that is
bought the most often.
7. Percentile: The term "percentile" refers to a method that
segments a dataset into hundredths to analyze the values'
distribution. For example, the median is equal to the value
at the 50th percentile of the distribution. The use of
percentiles is beneficial for comparing data points and
gaining a better understanding of their location within a
dataset.
8. Aggregations with Filters: Calculations can be carried
out on different subsets of data by combining aggregations
with filters. For instance, determining the overall income for
a certain area or the aggregate total of sales for a
particular kind of product category.
9. Weighted Aggregations: Weighted aggregations are
aggregations that provide various weights to individual
data items depending on how significant they are. When
examining data with varied relevance levels, such as a
weighted average or weighted total, they are important
tools to have.
10. Statistical Aggregations: Statistical aggregations, such as
standard deviation, variance, or correlation, provide insights
into the variability, distribution, and interactions between
data points.
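As an example, the weighted aggregation in item 9 can be sketched as a DAX measure; the Sales[Price] and Sales[Quantity] columns are assumptions:

```dax
-- Average price weighted by units sold
Weighted Avg Price =
DIVIDE (
    SUMX ( Sales, Sales[Price] * Sales[Quantity] ),  -- price times weight
    SUM ( Sales[Quantity] )                          -- total weight
)
```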
Sum
The time-tested method. Everyone is aware of what a sum is. It's a
case of adding. You have always had a solid grasp of the art of
adding numbers. You have undoubtedly previously used the sum
functions in Excel or other applications that are comparable to it.
The good news is that Sum works the same way in Power BI as it
does in Excel. For a given column of numbers, we will obtain a sum
of those numerical values for any given combination of data that
is displayed in a visual.
Like so many other data ideas, this one is easiest to grasp
visually. Since our purpose here is to visualize data, it seems
more appropriate to demonstrate with an example than to explain it
verbally.
Two tables can be seen in the picture below. Both have a total
number of office hours that they have been present for. Both of them
arrive at the same total, but their results for each possible
combination of data are unique.
The total number of office hours attended by a variety of grouped
classifications is shown in the table at the top of the page. Therefore,
in this particular instance, the individuals who attended the ISOM
210 course that began on January 4, 2021, who use she/her as
their preferred pronouns, and who had never used Power BI before
the beginning of the course accounted for a total of 22 of the 41
office hours that were available. In this case, the total is the sum of
the values that are included in the OfficeHoursAttended column.
The table that is located at the bottom is different. It displays the
breakdown by a student's last name as well as, for a specific
assignment ID, the number of office hours that the student attended.
We can see that the overall number of hours remains the same, but
they are split up differently. If you were to take a peek at the table
that contains the grades, you would see that not a single student
attended more than one office hour for any particular assignment ID.
In this instance, the sum is still computing a summation; however,
the summation in question reduces all of the data to the level of a
single cell.
The reason I wanted to show you this example is to illustrate that by
default, Power BI will hide data collections that do not produce a
result for a summation that has been specified. Therefore, we are
aware that Ms. Avina only attended one of the office hours, and that
attendance was for Assignment ID 12. By evaluating just the data
where an actual value is provided rather than seeing results that
don't return any data at all, we can determine that a lot more rapidly.
Average
There are several different kinds of averages. For the default
aggregate shown in the options pane under the Visualizations
section, Power BI produces a straightforward average: the total of
a given set of values, filtered by all the relevant categories on
a visual, divided by the count of records filtered by those same
categories. Therefore, if we had, for a particular
combination of data, a total score of 240 and three recordings made
up those 240 points, we would have an average score of 80. Exactly
like the sum, if we arrive at a combination of categories that
returns only one record's worth of data, a value divided by one is
itself; that is still technically an average. Below are two new tables with
average scores. In addition, on the left is the total number of office
hours attended, and on the right, the average number of office hours
attended.
There are two characteristics of the picture that you see above that
could jump out at you. To begin, the cumulative Average Score
across both tables is equivalent to one another. Second, doesn't the
number of AverageOfficeHoursAttended that's shown on the right
appear a little off? The average is 1, although some numbers are
blank. What does it imply? Why are some of the values blank?
Nobody attended more than or less than one office hour for any
particular assignment. However, blanks are not zeros. In many
datasets, it is not unusual to find true blanks, often known as nulls in
database language.
The most essential thing to take away from this is that Power BI
does not consider nulls to be zeros and simply disregards them
when it generates a count of values. If nulls are disregarded while
creating a count, our mathematical description for the average in
Power BI will still be accurate, and the overall average shown in the
table to the right of the picture will continue to make sense.
Now, the most important thing is to figure out how to compute an
average for Ms. Avina. Should it be 1 hour divided by 1 occasion of
attending office hours, or should it be 1 hour divided by 14 occasions
to attend office hours, with 13 zeros? If the latter, then how many
zeros should follow the division? It is a question that only we, as the
analyst or author of the report, can provide an answer to, and the
response will largely rely on what we want the average to represent.
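The two interpretations can be written as two different measures. Assuming a hypothetical Grades table with an OfficeHoursAttended column, a sketch:

```dax
-- Ignores blanks: blank rows are excluded from the count
Avg Hours Ignoring Blanks =
AVERAGE ( Grades[OfficeHoursAttended] )

-- Treats blanks as zeros: divides by every row, blank or not
Avg Hours Counting Blanks as Zero =
DIVIDE ( SUM ( Grades[OfficeHoursAttended] ), COUNTROWS ( Grades ) )
```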
The second question is why the Average of OfficeHoursAttended
column on the right table is returning blank numbers for certain
users. The default behavior is not to return values when there
is no data to show. On the other hand, if there is a second
aggregate on a visual that does yield a value for a certain
combination of data, then all of the other aggregations will still
appear, even though they will return a blank result.
The picture below is a simplified representation that was created by
duplicating the right table from the preceding image and omitting the
average score from the second copy. When you look at the two
tables together, you can confirm that the appearance of blank values
in the visual is due, in fact, to the inclusion of the second aggregate.
It also helps illustrate the point made earlier about discarding
blank values when calculating an average.
Minimum and Maximum
When it comes to data analysis and visualization, Power BI places a
significant emphasis on the ideas of minimum and maximum. They
help determine the borders, range of values, and outliers that exist
within a dataset. Here, we will discuss the relevance of Power BI's
minimum and maximum values, how to compute them using DAX
(Data Analysis Expressions), as well as their uses in data analysis
and visualization. The borders and extremes of a dataset can be
better understood with the help of a dataset's minimum and
maximum values.
Let's get into more depth about their importance and how they
can be applied:
1. Data Range and Boundaries: The minimum and
maximum values determine the range in
which the data points are located. It is vital to have a solid
understanding of the data range to determine the breadth
and variability of the dataset. You can rapidly discover the
lower and upper boundaries by displaying the minimum and
maximum values. This can help you detect possible
outliers or strange data points.
2. Data Cleaning and Quality Control: When it comes to
data cleaning operations, minimum and maximum values
are often employed. Outliers, abnormalities, and erroneous
data items that fall outside of the predicted range are
easier to spot with their assistance. You can assure the
correctness and dependability of the data by locating and
eliminating any outliers that may be present. You'll be able
to highlight or explore data points that considerably depart
from the norm when you use minimum and maximum
values for data quality control. This is because minimum
and maximum values are quite useful.
3. Visualizing Data Extremes: Power BI provides users with
a diverse selection of visualization techniques that can be
used to properly show data. The minimum and maximum
values can be shown using many types of visualizations,
such as line charts, column charts, or scatter plots, to
provide a clear understanding of the dispersion of the data.
In addition, making use of conditional formatting or color
scales that are determined by minimum and maximum
values can assist in bringing attention to patterns and
extremes in the data.
4. Threshold Determination: Minimum and maximum
values are often used in the process of decision-making as
either thresholds or benchmarks. To do a sales study, for
instance, you can decide to establish a minimum sales
objective or locate goods that have the highest sales to
concentrate on. You can assess performance, pinpoint
areas in which there is room for improvement, and make
choices based on this information by comparing real
numbers to these criteria.
5. Data Normalization: The minimum and maximum values
are essential components of the data normalization
process, which entails adjusting the data to fit within a
predetermined range. By ensuring that all variables are
measured on the same scale, normalization prevents
certain variables from being disproportionately influential in
the analysis owing to the bigger values they represent. You
can assist with fair comparisons and reliable analysis by
rescaling the data using the minimum and maximum
values as the scaling factors.
6. Machine Learning and Modeling: When doing activities
related to machine learning, feature scaling is often
required to make fair comparisons across a variety of
features or variables. When you scale features by utilizing
minimum and maximum values, you put them within a
defined range. This reduces bias and makes it possible for
models to accurately evaluate the relevance of the
features. Scaling of features is particularly important for
algorithms like k-means clustering and support vector
machines, which depend on distance-based calculations to
determine outcomes.
7. Conditional Formatting: Power BI has some strong
options for conditional formatting, which enables users to
highlight data depending on how it relates to minimum and
maximum values. You can visually emphasize high or low
values, discover patterns, and call attention to critical data
points by applying conditional formatting rules to tables,
charts, or cards. This allows you to visually highlight high
or low values.
The MIN and MAX functions of the DAX language can be used in
Power BI to do the calculation of minimal and maximum values,
respectively. These functions take a column or expression as their
input and then return either the lowest possible value or the highest
possible value that can be found in the column or expression.
For example, to calculate the minimum and maximum sales
amounts from a sales table in Power BI, you can use the
following DAX expressions:
• Minimum Sales = MIN(Sales[Amount])
• Maximum Sales = MAX(Sales[Amount])
These expressions will calculate the minimum and maximum sales
amounts, respectively, based on the "Amount" column in the "Sales"
table.
Standard Deviation, Variance, and Median
Important statistical measures such as the standard deviation,
variance, and median give useful insights into the distribution,
variability, and central tendency of a dataset. These measures can
be found in the standard deviation, variance, and median.
Standard Deviation
A measurement of the dispersion or spread of data points around the
mean is referred to as the standard deviation. It provides a numerical
representation of how much the data deviates from the mean.
Greater variability is indicated by a standard deviation with a larger
value, whereas a standard deviation with a lower value shows less
dispersion.
The following are the steps by which standard deviation is
calculated:
● Calculate the mean of the dataset.
● For each data point, calculate the difference between that data
point and the mean.
● Square each difference.
● Calculate the mean of the squared differences.
● Take the square root of the mean squared difference to obtain
the standard deviation.
Standard deviation is useful in several ways:
● It helps understand the spread and variability of data, which is
particularly important when comparing datasets or analyzing
the stability of a process over time.
● Standard deviation is often used to assess risk or volatility in
finance, such as measuring the volatility of stock prices.
● In quality control, standard deviation helps determine if a
process is within acceptable limits or if there is excessive
variation.
Variance
In the same vein as standard deviation, variance is an additional
measure of the dispersion of data. It is the average squared
difference between each data point and the mean of the data, and
it quantifies how far individual data points deviate from the
mean. The steps for calculating variance are the same as those for
standard deviation, except that the final square-root step is
omitted; the standard deviation is simply the square root of the
variance.
Variance is helpful in a few different ways:
● It plays a role in the analysis of experimental data to measure
the effect of different factors on the variability of the response
variable.
● Variance is employed in modeling and optimization problems,
such as determining the optimal allocation of resources.
● Variance is used in statistical inference to assess the variability
of a population based on a sample.
Median
The median is a measure of central tendency that is used to reflect
the value that falls exactly in the center of a collection of data that
has been sorted either ascending or descending. It is unaffected by
outlying values or extreme values, which makes it resilient in the
presence of skewed data.
The median is calculated in the manner described below:
● Place the values in either ascending or descending order.
● If the dataset has an odd number of values, the median is the
middle value.
● If the dataset has an even number of values, the median is the
average of the two middle values.
The median is a very helpful statistic in the following
circumstances:
• When working with skewed data or datasets that include
outliers, a better depiction of the core value can be found using
the median as opposed to the mean.
• It is often used in situations involving income distributions,
property prices, or other skewed economic indicators, which
are situations in which extreme values can have a substantial
influence on the mean but a less significant impact on the
median.
• In data analysis, the median is also used to compare the
central tendency of distinct groups or to characterize the
distribution of ordinal or ranked data.
You can compute any of these statistical measures by utilizing
the DAX functions that are available in Power BI:
• Standard Deviation: In Power BI, you can determine the
standard deviation of a column or expression by using either
the STDEV.P or STDEV.S function. STDEV.P is used when the
data represent the complete population, while STDEV.S is used
when the data are a sample drawn from a larger population.
• Variance: In Power BI, the variance of a column or expression
can be computed using either the VAR.P or VAR.S function.
VAR.P is used for data from the complete population, while
VAR.S is used for data from a sample.
• Median: The MEDIAN function in Power BI determines the
median value of a column or expression. It returns the value
that sits in the center when the data are sorted.
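Continuing the Sales[Amount] example used earlier for MIN and MAX, these functions could be applied as follows:

```dax
Sample StdDev of Amount = STDEV.S ( Sales[Amount] )
Population Variance of Amount = VAR.P ( Sales[Amount] )
Median Amount = MEDIAN ( Sales[Amount] )
```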
Count and Count (Distinct)
Count and Count (Distinct) are two very helpful forms of
aggregation that can be used with any data type: you can count
instances of text, dates, or numerical data. The primary
distinction between the two options is that Count will count
duplicate occurrences of a value, while Count (Distinct) will not.
The count is carried out against the column of data being
referenced. There are four examples of
counting and counting (distinct) shown in the image below. To begin,
there is a list of total scores for all of the assignments located on the
top left. Because we are not aggregating the score for that table, we
are first retrieving the raw values and then carrying out a count of the
score as well as a count (Distinct) of the score to determine the total
number of occurrences of each score associated with that
assignment.
One thing to keep in mind is that Power BI will construct an alias for
the majority of aggregations to assist readers in recognizing the
aggregation if it isn't a sum. Nevertheless, Power BI aliases Count
(Distinct) in the same manner as it aliases Count; thus, to ensure
that the visuals are clear, I have aliased Count (Distinct) manually. A
new name can be given to any object in a visual by double-clicking
the item in the Values section of the Visualizations pane and then
putting in the new name. This is how an alias is created.

Note that the example in the upper left corner has three separate
counts: the raw score, the number of times that score occurs in the
dataset, and finally the distinct count, which is one for each value.
In total there are 52 distinct values; because the visual lists every
value, the distinct count of each individual value is always one. The
bottom left corner of the image shows the same visual with the score
value removed. Here we can see the total number of scores, which is
280, as well as the number of unique score values, which is 52.
The example in the upper right corner provides us with a count of the
occurrences of a certain date, which is the term start date in this
scenario. As the dataset only contains information about one
academic year at the moment, we only have a single term start date
to work with. The table it derives from has a total of 20 occurrences
of that value. Keep in mind that, because of how this table was built,
there is only one distinct value, so the distinct count will always
equal 1. An alternative perspective on the situations we've been
discussing is provided by
the example in the bottom-right corner. We have a list of the
students' first names, together with the count of those names, which
is 1 for each name. This is not a count of unique names; it simply
happens that none of the names occur more than once.
However, after that, we can look at the Count (Distinct) of scores and
determine how many distinct scores each person has gotten. Given
that there are 14 assignments, we would anticipate a significant
number of separate counts of scores for each individual, but as you
can see, it is not always 14 either. During the semester, several
students received the same score on many occasions.
First, Last, Earliest, and Latest
In the field of data analysis, certain functions and concepts, such as
first, last, earliest, and latest, are often used to locate certain values
or dates included inside a dataset. These functions provide insights
into the earliest and latest occurrences, in addition to the first and
last values, that occur inside a certain context. During this
conversation, we will explore the meaning of the terms "First,"
"Last," "Earliest," and "Latest," as well as their applications and
how they are applied in a variety of data analysis settings.
• First: The First function identifies the first value or record
inside a dataset or a particular grouping, locating the first
occurrence in a series of events. It is especially helpful when
analyzing time-series data or data that has been sorted
according to a particular criterion.
For example, in Power BI, you can use the First function as
follows:
• First Customer = FIRSTNONBLANK(Customer[CustomerName],
CALCULATE(MAX(Customer[Date])))
This expression returns the first non-blank value in the
"CustomerName" column of the "Customer" table for which a date
exists. One caveat: FIRSTNONBLANK orders by its first argument, so
"first" here means first in the sort order of the customer names
rather than the customer with the earliest date.
• Last: The Last function is the counterpart of the First
function. It denotes the record or value that comes at the very
end of a dataset or a particular grouping, helping determine
the very last occurrence or value in a series of events. It is
often used when analyzing data in the opposite direction, or
when trying to establish which event occurred most recently.
For example, in Power BI, you can use the Last function as
follows:
• Last Sale Amount = LASTNONBLANK(Sales[Amount],
CALCULATE(MAX(Sales[Date])))
This expression retrieves the last non-blank value of the "Amount"
column in the "Sales" table (like FIRSTNONBLANK, the function
orders by its first argument). It helps determine the amount of the
last sale recorded.
• Earliest: The Earliest function focuses on locating the earliest
date or value inside a dataset or a particular grouping. When
dealing with time-series data, it is often used to establish the
earliest occurrence of an event.

For example, in Power BI, you can use the Min function to
calculate the earliest date as follows:
Earliest Date = MIN(DateTable[Date])
This expression will provide the earliest possible date value that can
be found in the "Date" column of the "DateTable" table. It assists in
determining the date that was recorded first inside the collection.
• Latest: The Latest function is used to determine the date or
value that is the most recent occurrence inside a given dataset
or particular grouping. It is especially helpful for finding the
most recent value or the most recent occurrence of a given
event.
For example, in Power BI, you can use the Max function to
calculate the latest date as follows:
Latest Date = MAX(DateTable[Date])
This expression retrieves the most recent possible date value from
the "Date" column of the "DateTable" table. It is useful in
determining which date was the most recent one recorded in the
dataset.
Applications of First, Last, Earliest, and Latest:
• Temporal Analysis: These functions are often used in the
process of evaluating time-series data, which might include
things like sales records, market prices, or weather data. They
help determine whether events came first or last, the earliest or
most recent occurrences, or the first and last values that
occurred over a certain period.
• Customer Analysis: The First and Last functions are often
used in customer analysis to determine the first and last
interactions or purchases made by customers. This information
can shed light on a variety of topics, including customer
behavior, loyalty, and churn.
• Data Monitoring: The Earliest and Latest capabilities help
monitor updates or changes to data over time. Analysts can
follow the development or progression of certain variables or
metrics by determining the earliest and latest values or dates.
• Report Visualizations: The First, Last, Earliest, and Latest
values can be shown in reports and visualizations to either
offer an overview of the information or to emphasize individual
data points. A dashboard may, for instance, display the first
and final sale amounts, the earliest and latest dates recorded,
or the starting and final values in a time series.
Measures and DAX Fundamentals
We aggregate data so that we can better comprehend the data
points themselves. Sometimes we want to ensure that a calculation
is carried out in a certain manner, or that future users of our data
model can see an explicit calculation; at other times, the name of
one of our columns is enough to convey what an aggregation is
doing. What is often misunderstood, however, is that a measure is
used to build the aggregate in either scenario: whether you drag a
column into the Values section to obtain a total or average, or
construct a DAX measure to perform a calculation, an aggregation is
being created.
Implicit and Explicit Measures
Implicit Measures
Calculations that Power BI generates automatically, based on the
data model and visualizations, are referred to as implicit measures
(or automatic measures). They are derived from the fields and
relationships described by the model: Power BI examines the
structure of the data and automatically selects the right aggregation
functions to use.
Characteristics of Implicit Measures in Power BI
• Automatic Generation: Power BI will generate implicit
measures on its own without any explicit user input. They are
created in accordance with the structure of the data model and
the fields that are used in the visualizations.
• Aggregated Functions: Implicit measures use standard
aggregation functions, such as sum, count, average,
maximum, or minimum. The data type of the field is taken into
consideration by Power BI when determining the suitable
aggregation to use.
• Contextual Evaluation: Implicit measures are evaluated
within the context of the visualizations. Power BI takes into
account filters, slicers, and other visual interactions to provide
accurate results.
Applications of Implicit Measures in Power BI
• Quick Analysis: Implicit measures support quick, exploratory
data analysis. Users can simply drag and drop fields into
visualizations, and Power BI will automatically construct the
required calculations, letting users concentrate on visually
representing the data rather than explicitly specifying metrics.
• Standard Aggregations: Implicit measures give basic
aggregations that are often employed in data analysis, such as
adding up the total amount of sales, tallying the total number of
orders, or calculating an average rating for each product. They
do this by automatically applying the proper aggregating
procedures, which in turn makes the process of building basic
visualizations much simpler.

Explicit Measures
Calculations developed by end-users in Power BI using the Data
Analysis Expressions (DAX) language are referred to as explicit
measures (or user-defined measures). Unlike implicit measures,
explicit measures must be created and defined manually, which
gives users greater control and flexibility over the calculation logic.
Characteristics of Explicit Measures in Power BI
• User-Defined: These explicit measures are generated by the
users themselves by utilizing DAX expressions. Users are
responsible for defining the calculation logic, specifying the
aggregation functions, and customizing the calculations
following the analytical needs that are unique to them.
• Customized Calculations: Users can execute complicated
calculations that go beyond the scope of typical aggregations
when using explicit measures. They can make use of
sophisticated functions, develop conditional logic, execute
calculations for time intelligence, or apply rules that are
particular to the company.
• Reusability: Explicit measures can be reused across various
visuals or reports, which ensures consistency in calculations
and reduces the need to duplicate calculations in different
visualizations. Explicit measures can also be exported and
imported into other applications.
Applications of Explicit Measures in Power BI
• Advanced Calculations: To complete complex calculations
and analysis, explicit measures are necessary. The user can
use statistical functions, build complicated business measures,
compute growth rates, do segmentation analysis, and produce
custom KPIs.
• Time Intelligence: Power BI includes a broad collection of
functions for time intelligence, which enables users to define
explicit measures for year-to-date calculations, period
comparisons, moving averages, or other time-based studies.
These functions can be accessed via the Power BI interface.
• Business-Specific Metrics: Explicit measures make it
possible to develop metrics that are unique to the business that
are adapted to the needs of the organization. Calculations for
customer retention rates, profitability ratios, conversion rates,
and any other key performance indicators relevant to the users'
particular company environment can be defined by users.
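As a sketch of such a business-specific metric, a conversion rate could be written as an explicit measure like the one below. The table and column names (Orders, WebVisits) are hypothetical, standing in for whatever your own model contains:

```dax
Conversion Rate =
DIVIDE(
    DISTINCTCOUNT(Orders[CustomerID]),    -- customers who actually purchased
    DISTINCTCOUNT(WebVisits[VisitorID])   -- all distinct visitors
)
```

DIVIDE is preferred over the / operator here because it returns blank instead of an error when the denominator is zero.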
DAX Syntax Fundamentals
To get a grasp on the fundamentals of DAX syntax, we will begin
with a simple calculation that uses the AVERAGE function applied to
a single column.
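Written out, and assuming a Grades table with a Score column (the names used throughout this walkthrough), the measure under discussion is:

```dax
Simple DAX Average Score = AVERAGE('Grades'[Score])
```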

Reading from left to right, we first have the name of our measure,
which in this instance is "Simple DAX Average Score." The equals
sign separates the name from the calculation that follows it. Next
comes the function we are using to arrive at an average. The
opening parenthesis separates the function from the arguments or
criteria it expects; here the function takes a single column reference:
the table name enclosed in single quotes, followed by the column
name enclosed in brackets.
The closing parenthesis indicates that all of the function's arguments
have been supplied. Put all of that together in plain English: there is
a DAX measure called Simple DAX Average Score. It calls the
Average function, using the Score column from the Grades table as
its argument. In our scenario this returns the average score from the
Grades table, subject to change by any filter context that applies to
the situation.
As a point of reference, you should know that any table name that
has a space should always be surrounded by single quotes, and the
names of any column should always be surrounded by brackets.
Whenever you are developing your own DAX, the best technique is
to enclose the names of your tables in single quotes, regardless of
whether or not the table names include any spaces. When viewed
more technically, this reveals that all of the simple aggregations that
we explored previously share the same fundamental DAX syntax for
their respective functions. There is no difference between the Sum,
Average, Count, DistinctCount, Min, and Max values. Simply
calculating the minimum or maximum allows one to determine the
first, last, earliest, and latest positions. The following is how that
syntax looks.
• FUNCTION('TABLE'[COLUMN])
That's all there is to it. The result is a straightforward explicit
measure, filtered by any additional context in the visual as well as
any relevant slicers on your report page.
CALCULATE
When it comes to DAX functions, the CALCULATE function may be
thought of as the equivalent of a Swiss army knife. Similarly to many
other areas of life, 80% of DAX issues can be resolved with 20% of
the knowledge; if you can grasp CALCULATE, you will have made
significant progress toward that early level of expertise. The
CALCULATE function is straightforward in its
operation. It is a wrapper that evaluates an expression in a modified
filter context. What exactly does that mean? Consider a WHERE
clause in SQL: you might wish to restrict the results to rows where
X = Y or Z > 100, for example. CALCULATE lets you feed these
WHERE clause–type conditions into your DAX formula explicitly.
This ensures that
they will always apply, regardless of the context in which they are
used. You can come up with a huge number of other scenarios in
which this might be useful. Do you want to find out how the sales for
this year stack up against the sales at the same point in the prior
year? You can do it with the aid of CALCULATE. It's possible that
you need to evaluate how the sales figure for one product compares
to that of a different group of items. That is something you can do.
Do you wish to provide a dynamic calculation that satisfies the filter
context requirements depending on the outcomes of another
calculation, such as the date from the previous day? You can do
that.
Syntax: The syntax of the CALCULATE function in Power BI is
as follows:
● CALCULATE(<expression>, <filter1>, <filter2>, ...)
● <expression> represents the calculation or measure to be
evaluated or modified.
● <filter1>, <filter2>, and so on are optional parameters that
define the filters or conditions to be applied to the calculation.
Examples of CALCULATE Function Usage
• Applying a Filter to a Calculation: Let's say we have a Sales
table that has columns like "SalesAmount" and
"ProductCategory." We want to determine the total revenue
generated in the "Electronics" category, but we are only going
to include sales made in 2022.
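A sketch of such a measure; the measure name and the exact form of the date filter are assumptions, since the model's columns may differ:

```dax
Electronics Sales 2022 =
CALCULATE(
    SUM(Sales[SalesAmount]),                 -- expression to evaluate
    Sales[ProductCategory] = "Electronics",  -- keep only Electronics sales
    Sales[SalesDate] >= DATE(2022, 1, 1),    -- on or after January 1, 2022
    Sales[SalesDate] <= DATE(2022, 12, 31)   -- on or before December 31, 2022
)
```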

In this demonstration, the CALCULATE function alters the SUM
calculation by adding filters determined by the "ProductCategory"
column as well as the "SalesDate" column. Only sales that fall under
the "Electronics" category and that take place in 2022 are included
in the calculation.
• Applying Multiple Filters: Expanding on the previous
example, we can add a further filter determined by the sales
region. We want to determine the sales amount for the
"Electronics" category in the year 2022, but only for the
"North" region.
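A sketch of that measure, again with the measure name and the column names (Region in particular) as assumptions:

```dax
Electronics North 2022 =
CALCULATE(
    SUM(Sales[SalesAmount]),
    Sales[ProductCategory] = "Electronics",
    Sales[SalesDate] >= DATE(2022, 1, 1),
    Sales[SalesDate] <= DATE(2022, 12, 31),
    Sales[Region] = "North"                  -- the additional region filter
)
```

Each filter argument is combined with the others, so all conditions must hold for a row to be included.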

In this particular instance, the CALCULATE function applies multiple
filters to narrow the calculation. The only sales taken into account
are those in the "Electronics" category, during the year 2022, in the
"North" region.
• Nesting CALCULATE Functions: Nesting CALCULATE
functions allows for more sophisticated calculations. Let's say
we want to determine the average sales amount in the
"Electronics" category, excluding any figures for the year
2023. In addition, we want to contrast this average with the
overall average sales amount.
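One way to sketch this comparison, with the measure name and column names assumed, is to compute the two averages in variables and return the difference:

```dax
Electronics Avg vs Overall =
VAR ElectronicsAvg =
    CALCULATE(
        AVERAGE(Sales[SalesAmount]),
        Sales[ProductCategory] = "Electronics",
        FILTER(ALL(Sales[SalesDate]), YEAR(Sales[SalesDate]) <> 2023)  -- exclude 2023
    )
VAR OverallAvg = AVERAGE(Sales[SalesAmount])  -- average under the current context
RETURN
    ElectronicsAvg - OverallAvg
```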
In this demonstration, the CALCULATE function determines the
average sales amount in the "Electronics" category, omitting sales
from the year 2023. Separately, the overall average sales amount is
calculated. In the last step, the "Electronics" average is compared
with the overall average to determine the difference. The
CALCULATE function gives Power BI users a flexible, dynamic
method for modifying calculations based on particular conditions or
filters; with it, users can perform complex data analysis, produce
tailored calculations, and obtain deeper insights into their data.
We Heard You like DAX, So We Put Some DAX in
Your DAX
If you're a fan of DAX (Data Analysis Expressions) in Power BI, you'll
be delighted to explore some advanced techniques that involve
combining DAX functions within DAX expressions. This allows for
even more powerful and sophisticated calculations and analysis.
Here, we'll dive into the world of nested DAX expressions and
showcase some examples of how you can leverage this capability to
enhance your data modeling and analysis.
Nested DAX Functions
The use of one DAX function as an argument or parameter of
another DAX function is referred to as "nesting." This technique lets
you carry out intricate calculations, manipulate data, and devise
bespoke measures that exceed the capabilities of any individual
function, by mixing and chaining functions together to obtain the
desired result.
Examples of Nested DAX Expressions
Let's explore a few examples to illustrate the power of nested
DAX expressions:
● Calculating a Weighted Average: Imagine that you have a
Sales table with columns such as SalesAmount and Quantity.
You can use a nested DAX expression to determine the
weighted average sales price, where each sale's contribution
is proportional to its quantity:
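A sketch of such an expression (the measure name is an assumption):

```dax
Weighted Avg Sales Price =
DIVIDE(
    SUMX(Sales, Sales[SalesAmount] * Sales[Quantity]),  -- row-by-row weighted sum
    SUM(Sales[Quantity])                                -- total of all quantities
)
```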

The nested DAX expression in this example uses the SUMX function
to loop over each row in the Sales table, multiplying the
SalesAmount by the Quantity for each sale. To calculate the
weighted average sales price, it then divides the sum of these values
by the sum of all the quantities.
Calculating a Rolling Total: You can use the nested DAX
expression that is shown below to generate a running total of sales
for a certain period:
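A common way to write this pattern (names assumed; the 30-day window is measured back from the latest date visible in the current filter context):

```dax
Rolling 30-Day Sales =
VAR CurrentMaxDate = MAX(Sales[SalesDate])  -- latest date in the current context
RETURN
CALCULATE(
    SUM(Sales[SalesAmount]),
    FILTER(
        ALL(Sales[SalesDate]),              -- ignore existing filters on the date column
        Sales[SalesDate] > CurrentMaxDate - 30
            && Sales[SalesDate] <= CurrentMaxDate
    )
)
```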
In this particular example, the nested DAX expression combines the
functionality of the CALCULATE function and the FILTER function. It
calculates the total SalesAmount for each row in the Sales table that
satisfies the condition defined by the FILTER function, which selects
the rows of the Sales table whose date falls within the previous 30
days of the maximum date in the present context.
● Applying Conditional Logic: You can utilize nested DAX
expressions to apply conditional logic and produce dynamic
calculations depending on particular criteria. Consider, for
instance, the following example, which computes a specialized
measure dependent on a condition:
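A sketch of the measure described; the name "Sales Bonus" and the underlying [Total Sales] measure are assumptions:

```dax
Sales Bonus =
IF(
    [Total Sales] > 1000,   -- the condition being tested
    [Total Sales] * 0.1,    -- applied when Total Sales exceeds 1000
    [Total Sales] * 0.05    -- applied otherwise
)
```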

In this nested DAX expression, the IF function applies a condition:
the Total Sales are multiplied by 0.1 if the Total Sales measure is
larger than 1000; otherwise, they are multiplied by 0.05. This
enables dynamic calculations depending on the particular
circumstances.
Advantages and Fields of Application
Nested DAX expressions provide several advantages and open up a
wide range of opportunities for data modeling and analysis in Power
BI:
• Improved Calculations: Nested DAX expressions make it
possible to perform more complex calculations that combine
several functions and logical operations.
• Flexibility: Nesting DAX functions gives you more control
over the calculations, letting you adapt them to particular
needs or business logic.
• Advanced Analysis: Nested DAX expressions allow you to
execute advanced analysis, such as weighted averages, rolling
totals, complicated conditional calculations, and many other
types of analyses as well.
• Reusability: Once you've generated nested DAX expressions,
you can reuse them across many measures, visuals, or
reports, which will save you time and ensure consistency.
Row and Filter Context
The principles of row context and filter context are foundational
components of Power BI and other data analysis tools. They are an
extremely important component in gaining knowledge of how
calculations and aggregations are carried out in relation to the
context in which they are assessed.
Row Context
The term "row context" refers to the piece of data in the current row
that is being considered by a calculation. In Power BI, if you
construct a calculated column or a measure, the calculation will be
carried out for each row in the table or visual that you are working
with. The context of the row is what decides which values from which
rows are utilized in the calculation at any given point in time.
For instance, suppose you have a Sales table with columns
such as SalesAmount and Quantity, and you create a
calculated column to compute the revenue for each row (note
that naked column references like these work in a calculated
column, which has a row context, but not in a measure):
● Total Revenue = Sales[SalesAmount] * Sales[Quantity]
The row context iterates over each row in the Sales table,
computing the revenue by multiplying the SalesAmount by the
Quantity for that particular row.
Filter Context
The term "filter context" refers to the set of filters or conditions
applied to the data when a calculation is evaluated. Filters can be
applied manually by the user, or automatically based on interactions
with visuals, slicers, or other filtering mechanisms.
When performing a calculation, Power BI takes into account the filter
context to decide which rows of data are included in the calculation
and which rows are not included in the calculation. The scope of the
calculation as well as the subset of data that it acts on is both
defined by the filter context.
For instance, if you build a measure to compute the overall
quantity of sales for a certain product category, the following
will occur:
● Total Sales = CALCULATE(SUM(Sales[SalesAmount]),
Sales[ProductCategory] = "Electronics")
The condition Sales[ProductCategory] = "Electronics" establishes
the context for the filter. Only the rows that satisfy this criterion will
be considered for inclusion in the calculation of the total amount
sold, while the results of the calculation will exclude the results of
any other product categories.
Interaction between Row and Filter Context
When doing calculations in Power BI, the row context and the filter
context interact with one another. The calculation assesses each row
in the row context while taking into consideration the filters that have
been applied in the filter context. For instance, if you have a table
visualization that displays total sales for each product category and
you add a slicer to filter the data by a certain region, the filter context
will be altered depending on the region you pick. The total sales
amount is then determined as the row context iterates over each
row, combining the information from the filter context with the row
context.
To guarantee correct calculations and interpretations of data, it is
essential to have a solid understanding of the relationship between
row context and filter context. By modifying the filter context using
functions like CALCULATE, FILTER, or ALL, you can control which
data is included in or excluded from the calculation, giving you
greater flexibility in data analysis.
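As a small sketch of such a filter-context modification (measure and column names assumed), a percent-of-total measure removes the category filter from its denominator with ALL:

```dax
Pct of All Categories =
DIVIDE(
    SUM(Sales[SalesAmount]),             -- numerator respects the current filters
    CALCULATE(
        SUM(Sales[SalesAmount]),
        ALL(Sales[ProductCategory])      -- denominator ignores the category filter
    )
)
```

In a visual grouped by ProductCategory, each row then shows its share of the total across all categories.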
One Final DAX Example
Let's take a look at one more DAX example to demonstrate the
breadth and depth of the language's capabilities. As a hypothetical
scenario, we will calculate a customer retention rate based on sales
data. Imagine that we have a Sales table with columns such as
CustomerID, SalesAmount, and SalesDate. We want to determine
the customer retention rate, a ratio comparing the number of
customers who made purchases in the previous year to the number
who made purchases in the current year.
To calculate the customer retention rate using DAX, we can
follow these steps:
1. Create a measure to count the distinct number of
customers who made purchases in the current year:
● Current Year Customers = DISTINCTCOUNT(Sales[CustomerID])
2. Create a measure to count the distinct number of
customers who made purchases in the previous year:
● Previous Year Customers =
CALCULATE(DISTINCTCOUNT(Sales[CustomerID]),
SAMEPERIODLASTYEAR(Sales[SalesDate]))
In this measure, we use the CALCULATE function along with the
SAMEPERIODLASTYEAR function to calculate the distinct count of
customers for the previous year.
3. Create a measure to calculate the customer retention
rate:
● Retention Rate = DIVIDE([Previous Year Customers],
[Current Year Customers])
The DIVIDE function determines the ratio of customers from the
previous year to customers from the current year, giving the
retention rate as a decimal. By combining these measures, we obtain
the customer retention rate. The measures use DAX functions such
as DISTINCTCOUNT, CALCULATE, SAMEPERIODLASTYEAR, and
DIVIDE to perform the required calculations and comparisons.
The customer retention rate can be shown in a Power BI report using
a card visual or any other sort of visualization that is appropriate
after the measures have been defined. The retention rate will
dynamically alter depending on the filters and slicers that have been
added to the report, which will provide useful insights into the
behavior of customers and their loyalty over time. It is important to
point out that this illustration demonstrates only the fundamental
steps involved in computing the customer retention rate; depending
on your particular data model and needs, you may need to add extra
logic or refine the calculations further.
With a deeper understanding of DAX functions and their application,
you can unlock the full potential of your data and derive meaningful
insights to drive informed decision-making. DAX is a powerful tool for
data modeling and business intelligence in Power BI due to its
flexibility and expressiveness, which allow you to perform complex
calculations and analyses.
Conclusion
Understanding the DAX language is crucial for unlocking the full
analytical potential of Microsoft Power BI. This primer has provided a
comprehensive overview of the key concepts, functions, and syntax
that form the foundation of DAX, including measures, calculated
columns, calculated tables, types of functions, aggregations, implicit
and explicit measures, DAX syntax fundamentals, the CALCULATE
function, row and filter context, and practical examples. With this
understanding, users can harness the power of DAX to transform
their data into actionable insights and drive meaningful business
outcomes in Power BI.
CHAPTER 6
PUTTING THE PUZZLE PIECES
TOGETHER: FROM RAW DATA TO
REPORT
Your First Data Import
1. Launch the Power BI Desktop.
2. To get data, choose the "Home" tab of the Power BI
Desktop application and then select the "Get Data" button.
3. Choose Excel as the data source: in the "Get Data"
dialog, select "Excel" from the list of options. If Excel is
not shown, you can search for it by typing its name into
the search box, or choose "More..." to see further
options.
4. Browse to where your Excel file is stored, select the file,
and then choose "Open" to continue.
5. You will get a preview of the sheets and tables that are
accessible in the Excel file in the "Navigator" box. From
there, you can choose the data that you want to import.
Check the boxes next to the individual sheets and tables
that you wish to import to choose them for import. You can
also examine a preview of the data by clicking on the
"Preview" button.
6. After choosing the sheets or tables that you want to work
with, you have two options available to you: load the data
or edit the data.

● Load: If you want to load the data straight away, you may
do so by clicking the "Load" button. The chosen data will be
imported into Power BI, at which point you can begin the
process of displaying and analyzing the data.
● Edit: Click the "Edit" option if you wish to modify the data in
some way before importing it. This will allow you to do
things like change or shape the data. This will launch the
Power Query Editor, which is where you can execute a wide
variety of data changes, including deleting columns,
renaming columns, altering data types, combining tables,
and more.

7. Visualize and analyze your data: Once the data has
been imported, or after you have finished shaping it in the
Power Query Editor, you can begin building
visualizations, reports, and dashboards using the fields
and tables available in Power BI Desktop.
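The Get Data steps above can be sketched in Power Query's M language, which Power BI Desktop writes for you behind the scenes. The file path, sheet name, and step names below are hypothetical placeholders:

```m
let
    // Read the workbook; the path is a hypothetical placeholder.
    Source = Excel.Workbook(File.Contents("C:\Data\Sales.xlsx"), null, true),
    // Navigate to the sheet named "Sales".
    SalesSheet = Source{[Item="Sales", Kind="Sheet"]}[Data],
    // Promote the first row to column headers.
    Promoted = Table.PromoteHeaders(SalesSheet, [PromoteAllScalars=true])
in
    Promoted
```

You can view or edit the M that any query generates by opening the Advanced Editor in the Power Query Editor.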

Choose and Transform the Data When You Import


When you import data into Power BI Desktop, you have the option
to use the "Choose and Transform Data" feature, also known as
Power Query, to transform and shape your data before loading it.
This feature enables you to perform various data manipulation
tasks, such as cleaning, filtering, merging, and appending data,
as well as creating calculated columns.
The "Choose and Transform Data" option can be used in the
following manner:

1. Launch Power BI Desktop and choose the "Get Data"
option located on the Home tab.
2. Choose the data source that you want to use, such as
Excel, a CSV file, a SQL database, or any other source
that is supported.
3. Choose the tables and sheets from the specified data
source that you wish to import in the window labeled
"Navigator."
4. Instead of selecting "Load," you should click the
"Transform Data" button. The window for the Power
Query Editor will open when you do this.
5. A glimpse of your data will appear for you to peruse in the
Power Query Editor. You can do a variety of data
transformation activities in this section. The following are
examples of frequent tasks:
● Removing columns: Right-click on a column header and
select "Remove" to remove unnecessary columns from
the dataset.
● Filtering rows: Use the filtering options in the column
headers to filter rows based on specific criteria.
● Changing data types: Select a column, right-click, and
choose "Change Type" to convert the data type of a
column.
● Merging tables: If you have multiple tables with related
data, you can merge them by defining relationships
between common columns. Use the "Merge Queries" or
"Append Queries" options under the Home tab to merge
tables.
● Creating calculated columns: Use the "Add Column"
tab to create new columns based on calculations or
expressions using existing columns.
● Splitting columns: If a column contains combined data,
you can split it into multiple columns using the "Split
Column" option.
● Renaming columns: Right-click on a column header and
select "Rename" to provide a more meaningful name to
the column.
● Applying transformations: Power Query provides
numerous transformation options under various tabs,
such as Home, Transform, Add Column, and View.

6. The preview of the data shown in the Power Query
Editor is updated in real-time as you apply
transformations, giving you the ability to evaluate the
effects of your modifications as you make them.
7. When you have finished shaping and transforming the data, go
to the Home tab of the Power Query Editor and click the
"Close & Apply" button. This saves your changes, applies
the transformations, and loads the data into Power BI.
8. Now that the data has been converted, you can use Power
BI to build dashboards, reports, and visualizations using
the data.
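The transformations listed above each generate one step in a query's M expression. A minimal sketch, assuming a previously loaded query named SalesRaw and hypothetical column names, might look like:

```m
let
    Source = SalesRaw,
    // Removing columns
    RemovedNotes = Table.RemoveColumns(Source, {"Notes"}),
    // Filtering rows: keep only rows with a positive amount
    Filtered = Table.SelectRows(RemovedNotes, each [Amount] > 0),
    // Changing data types
    Typed = Table.TransformColumnTypes(Filtered,
        {{"Amount", type number}, {"OrderDate", type date}}),
    // Renaming columns
    Renamed = Table.RenameColumns(Typed, {{"Cust", "CustomerName"}}),
    // Creating a calculated column
    WithTotal = Table.AddColumn(Renamed, "LineTotal",
        each [Quantity] * [UnitPrice], type number)
in
    WithTotal
```

Each entry in the Applied Steps pane corresponds to one of these named expressions.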

Consolidating Tables with Append


In many cases, data originates from multiple tables or sources;
thus, one of the most important steps in preparing data for
analysis is to combine those tables into a single table. The
Append function in Power BI gives users an easy and fast way to
merge tables vertically: appending tables combines the rows of
multiple tables into one table.
This procedure is especially effective when dealing with related
data or when data is scattered across multiple sources or files.
By appending tables, users construct a unified dataset, making it
simpler to evaluate and display the consolidated information.

Here, we will look into the processes and things to keep in mind
while consolidating tables using the Append function in Power
BI.
Step 1: Accessing the Power Query Editor
When you've finished importing the tables you want, Power BI will
show you the data preview pane. In this section, users can pick
certain tables by selecting the boxes next to those tables, or they
can opt to import all of the available tables. To open the Power
Query Editor, choose the "Transform Data" button rather than
loading the data directly. Before appending the tables, this editor
offers a rich set of tools for transforming the data, allowing the
tables to be shaped and refined.
Step 2: Selecting and Appending Tables
Tables are presented in the "Queries" pane, which is located on the
left-hand side of the Power Query Editor. Find the first table that you
want to add to the consolidated table, and then choose it. To attach
tables, go to the Home tab of the Power Query Editor and click on
the "Append Queries" button. This button looks like two tables piled
on top of each other. A dialog box will list the available tables.
Choose "Two tables" or "Three or more tables" depending on how
many you are combining, select the table(s) you wish to append,
and then click "OK" to begin the append operation.

Step 3: Repeating the Append Process


If you have further tables to append, repeat the procedure by
choosing the next table in the "Queries" pane and performing
step 2 again. Power Query adds the rows from each table beneath
the existing consolidated table. By appending tables repeatedly,
users can integrate data from multiple sources, combine similar
information, or blend time-series data for complete analysis.
Step 4: Data Transformation and Cleanup (Optional)
Before beginning the append operation, check that the tables being
consolidated share a comparable format. To bring the data into
harmony, the Power Query Editor provides a comprehensive
collection of transformation and cleansing functions. Users can
remove columns that aren't essential, filter rows based on certain
criteria, change data types, merge or split columns, perform
calculations, and carry out other data preparation operations as
required. These transformation steps help align the tables and
preserve the integrity of the data inside the consolidated table.
Step 5: Applying the Consolidation
Click on the "Close & Apply" button, which is located on the Home
tab of the Power Query Editor, after all of the tables that you want to
attach have been appended and any required transformations have
been performed. The modifications will be saved, and the
consolidation process will be applied when this step is taken. In the
next stages, Power BI will generate a new consolidated table that
has all of the rows that were added from the other tables. This will
make it easier to analyze and visualize the data.
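Behind the scenes, the Append operation produces a single M step. A minimal sketch, assuming two queries named Sales2022 and Sales2023 with matching column layouts (hypothetical names), is:

```m
let
    // Table.Combine appends rows vertically; columns are matched by name,
    // and any column missing from one table is filled with null.
    Appended = Table.Combine({Sales2022, Sales2023})
in
    Appended
```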

Considerations for Consolidating Tables with Append
Consolidating tables using the Append function can give some
advantages; nevertheless, to conduct an accurate and useful
analysis, it is essential to take into consideration the following
factors:

1. Consistent Structure: Verify that the tables being
appended have comparable structures, including column
names, data types, and column order. This ensures that
the data remains consistent; inconsistent structures may
result in misaligned data and incorrect analysis.
2. Data Quality: Carry out any required data cleaning and
quality checks before appending tables. To guarantee that
the analysis is correct, remove duplicate entries, deal
with missing values, and resolve data anomalies.
3. Performance: Appending huge tables that contain a
significant number of rows may affect performance.
Applying filters or transformations before the append can
minimize the size of the dataset and improve query speed.
4. Data Refresh: If the original tables change frequently,
make sure that the consolidated table is kept up to date
as well. To preserve the accuracy of the aggregated data,
configure the data refresh settings in Power BI.
5. Data Relationships: After you have consolidated the data,
you should next build the relationships between the
consolidated table and any additional tables that are
pertinent to the situation. This enables Power BI's
advanced capabilities, like drill-through and cross-filtering,
and makes it possible to analyze seamlessly.

The Append function in Power BI provides a robust method for
consolidating tables, merging data from multiple sources into a
single dataset. Users can change and shape the data using the
Power Query Editor, which helps maintain the consistency and
integrity of the consolidated table. By giving careful attention
to data quality, structure, and performance, users can unleash
the full potential of their data and gain useful insights through
effective analysis and visualization in Power BI.

Using Merge to Get Columns from Other Tables


The Merge function in Power BI enables users to merge columns
from several tables into a single table based on shared fields or
relationships between the datasets. When tables are merged, extra
information can be brought in, and your dataset can be enhanced by
adding data from other tables.

Here's a step-by-step guide on using the Merge function in
Power BI:
Step 1: Open Power BI Desktop and ensure that you have imported
the tables you want to merge using the "Get Data" option.
Step 2: Locate the table to which you wish to add columns from
other tables inside the Power Query Editor.
Step 3: To choose the table, locate the "Queries" pane on the left-
hand side of the screen and click on the table's name.
Step 4: Go to the Home tab of the Power Query Editor and choose
the "Merge Queries" option; its icon shows two tables connected
by a line.
Step 5: You will see that the "Merge" dialog box offers two key
selections: the table to merge with and the "Join Kind."
• Table: From the drop-down list, choose the table you wish
to combine with the currently selected table. This is the
table that contains the extra columns you want to bring in.
• Join Kind: Select the kind of merge operation you wish to
carry out. The options, which include "Inner," "Left Outer,"
"Right Outer," "Full Outer," "Left Anti," and "Right Anti,"
determine how the merge handles rows that match and rows
that do not match between the tables.
Step 6: Select the matching columns in each table to define the
merge criteria. Power BI may automatically suggest column matches
based on column names. Check that the columns you choose contain
matching data so that they can act as the merge keys.
Step 7: Make your selections for the merging process based on the
preferences you have. You have the option of retaining all columns
from both tables, or you can keep just some columns from each
table. The combined columns can be expanded or maintained as a
nested table, depending on your preference.
Step 8: Click on the "OK" button to perform the merge operation.
Step 9: Power Query will perform the merging of the chosen table
with the other table depending on the merge criteria that you have
given. The end product is a merged table that has columns taken
from both of the original tables. The merge key will be used to match
rows, and after a match has been made, the merged columns will be
filled with the correct values.
Step 10: If you still have other tables that need to be merged, you
will need to repeat the procedure by choosing the next table in the
"Queries" pane and then proceeding with steps 3 through 9.
Step 11: When you have finished merging all of the tables that you
want to combine, go to the Home tab of the Power Query Editor and
click on the "Close & Apply" button. This will perform the merge
operation and load the data into Power BI.
The merged columns from other tables give extra context and
insights to strengthen your analysis and decision-making. After
merging tables, you can utilize the combined dataset to create
visualizations, analyze data, and produce reports in Power BI.
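The Merge steps above correspond to two M functions: one to perform the join and one to expand the nested result. A sketch with hypothetical query and column names:

```m
let
    // Left Outer join: keep all Sales rows, matching Customers on CustomerID.
    Merged = Table.NestedJoin(
        Sales, {"CustomerID"},
        Customers, {"CustomerID"},
        "Customers", JoinKind.LeftOuter
    ),
    // Expand the nested table to bring in only the columns we want.
    Expanded = Table.ExpandTableColumn(
        Merged, "Customers", {"CustomerName", "Region"}
    )
in
    Expanded
```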

Building Relationships
If you have multiple tables, there is a good likelihood that you will do
some kind of analysis utilizing the data from all of those tables. The
establishment of relationships between such tables is essential for
doing precise calculations of results and presenting accurate data in
your reports. You won't have to take any action in the vast majority of
situations. This is taken care of for you by the autodetect function.
On the other hand, there are occasions when you may need to
develop relationships on your own or need to make adjustments to
an existing relationship. In any case, it is essential to have a solid
understanding of relationships in Power BI Desktop, as well as how
they can be created and edited.
Autodetect during load
When you query two or more tables at the same time in Power BI
Desktop, it will look for relationships between the tables and try to
construct them for you when the data is loaded. The relationship
options Cardinality and Cross filter direction, as well as the "Make
this relationship active" setting, are already configured by default.
When you run a query in Power BI Desktop, it examines the column
names in the tables you're querying to see if there are any
possibilities for relationships.
If there are, such relationships will be forged on their own. Power BI
Desktop does not establish the relationship if it cannot determine
with a high degree of certainty that there is a match. Nevertheless,
you can still manually establish or update relationships by using the
Manage Relationships dialog box that is available to you.
Create a relationship with Autodetect
● On the Modeling tab, select Manage Relationships >
Autodetect.

Create a relationship manually


1. On the Modeling tab, select Manage relationships, and
then choose New.
2. In the Create Relationship dialog box, pick a table from the
first table drop-down list to create a connection between the
two tables. Choose the column you wish to utilize in the
relationship from the drop-down menu.
3. Make your selection for the other table that will participate in
the relationship from the drop-down menu labeled "second
table." After selecting the other column that you want to
utilize, proceed to click the OK button.

Power BI Desktop automatically configures Cardinality, Cross
filter direction, and Make this relationship active for your new
relationship. Cardinality describes how the rows of the two tables
relate (for example, many-to-one), while Cross filter direction
determines how filters flow between them. Nevertheless, if you feel
the need to make adjustments, you can do so here. If neither of the
tables selected for the relationship has unique values, you'll see
the following error: One of the columns must have unique values. At
least one table in a relationship must have a distinct, unique list
of key values, which is a common requirement for all relational
database technologies.
There are a few different solutions available if you come across
that error:
● To produce a column that has unique data, use Remove
Duplicates. The elimination of duplicate rows presents a
potential risk of information loss, which is why adopting this
strategy is not without its drawbacks. There is often a valid
cause for duplicating a key (row).
● To the existing model, add an intermediate table consisting of
the list of unique key values. This table will then be connected
to the two primary columns that comprise the relationship.
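One way to build such an intermediate table of unique keys is with a small M query; the table and column names here are hypothetical:

```m
let
    // Keep only the key column from the fact table...
    Keys = Table.SelectColumns(Sales, {"DepartmentID"}),
    // ...and remove duplicate rows, leaving one row per key value.
    Departments = Table.Distinct(Keys)
in
    Departments
```

The resulting one-column table can sit between the original tables, relating to each of them on the key column.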
You also have the option of using the Model view diagram layouts,
where you can construct a relationship by dragging and dropping a
column from one table to a column in another table.

Edit a relationship
There are two ways to edit a relationship in Power BI. The first way
is to use the Editing relationships feature found in the Properties
pane in the Model view. This allows you to pick any line that
connects two tables and then inspect the available relationship
options in the Properties pane. Make sure that the Properties
window is expanded so that you can see the relationship options.

The second approach to updating a relationship is to use the
Relationship Editor dialog, which can be opened in a variety of
ways from inside Power BI Desktop. The Relationship Editor dialog
can be opened using any one of the methods presented in the
following list:
From the Report view do any of the following:
• On the Modeling ribbon, pick the Manage Relationships
option, then choose the relationship you want to edit and click
the Edit button.
• First, choose a table from the Fields list. Next, go to the Table
Tools ribbon and pick the Manage Relationships option.
Finally, choose the relationship you want to edit and click the
Edit button.
From the Data view: on the Table tools ribbon, select the Manage
relationships option, then choose the relationship you want to
edit and select Edit.
From the Model view do any of the following:
• On the Home tab of the ribbon, select Manage relationships,
choose the relationship, and then select Edit.
• Double-click any line between two tables to open the
relationship editor.
• Right-click any line that connects the two tables, and then
choose Properties from the context menu.
• First, pick a line that connects two tables, and then choose the
Open relationship editor from the Properties pane.
You can also edit a relationship from any view: right-click a
table or select its ellipsis to open the context menu, choose
Manage relationships, select the relationship, and then select
Edit. The following image shows a screenshot of the Edit
relationship window.

Editing relationships using different methods


When modifying relationships in Power BI, using the Edit
relationships dialog provides a more guided experience that is
presently under preview. You can get a preview of the data in each
table. The window will automatically check the relationship between
the columns as you pick them and will provide suitable cardinality
and cross-filter selections. When it comes to altering relationships in
Power BI, making use of the Properties pane's editing capabilities is
a time-saving option. You are just shown the names of the tables
and columns from which you can pick, you are not shown a preview
of the data, and the relationship options you make are only
evaluated after you hit the Apply Changes button.
When changing a relationship, using the Properties pane and its
simplified method helps decrease the number of queries performed.
This can be a crucial consideration for situations with large amounts
of data, particularly those that include the use of DirectQuery
connections. The relationships that can be generated by using the
Properties pane have the potential to have a higher level of
complexity than those that can be produced by using the Edit
relationships dialog. You can also multi-select relationships in the
Model view diagram layouts by holding down the Ctrl key and
selecting more than one line at a time. The Properties pane is
where you can make changes to common properties, and clicking
Apply Changes will execute those changes in a single transaction.
You can delete either a single relationship or several relationships
at once by selecting them and pressing the Delete key. Because you
can't reverse the delete operation, a dialog box will ask you to
confirm that you want to remove the relationships.
Important
The functionality that allows users to edit relationships inside the
properties pane is presently under preview. As long as the product is
under preview, both its functionality and its documentation may
change. To activate this feature in Power BI Desktop, go to the File
menu, pick Options and Settings, click Options, and then select
Preview features. Finally, in the GLOBAL section, select the
checkbox that is located next to the Relationship pane option.

Configure more options


More parameters can be configured whenever a relationship is
created, edited, or edited again. Power BI Desktop automatically
configures additional options based on its best estimate, which can
be different for each relationship depending on the data that is
included in the columns. This is the default setting for the software.
Cardinality
The Cardinality parameter can be changed to any of the
following values based on your preferences:
● Many-to-one (*:1): The many-to-one relationship is the default
and most prevalent type of relationship. It indicates that a
column in a given table can have many occurrences of a value,
whereas the column in the other linked table, sometimes
referred to as the lookup table, can have only a single
instance of each value.
● One-to-one (1:1): In a one-to-one relationship, each column in
one table only has one occurrence of a given value, and each
column in the other associated table also only has one
instance of a given value.
● One-to-many (1:*): In a relationship that has a one-to-many
structure, a column in one table can only ever have one
occurrence of a given value, but the column in the other
associated table can have many instances of a given value.
● Many to many (*:*): You no longer need to have one-of-a-kind
values in your tables if you use composite models since you
can set up a relationship between tables that is many-to-many.
Additionally, it eliminates the need for past workarounds, such
as creating additional tables only to create relationships.

Cross filter direction


The Cross filter direction option can have one of the following
settings:
● Both: Both tables are processed as though they are a single
table so that filtering can be performed on them more
efficiently. The Both option performs well when applied to a
single table that is surrounded by a large number of lookup
tables; one example would be a sales actuals table with a
lookup table for its department. This structure, which consists
of a core table and many lookup tables, is often referred to as
a star schema configuration.
● On the other hand, you shouldn't utilize the both option if you
have more than one table that also contains lookup tables
(especially if part of those lookup tables are shared). In this
situation, to continue the example from before, you also have a
budget sales table that tracks the intended budget for each
department. In addition, the sales table and the budget table
are both related to the table that details the departments.
When configuring anything like this, you should not use the
both option.
● Single: This is the most common and default direction, which
means that filtering choices made on connected tables affect
the table where values are being aggregated. If you import a
data model from Power Pivot in Excel 2013 or earlier, all of
the relationships will have a single direction.

Make this relationship active


The relationship takes on the role of the current, default relationship
when the checkbox is selected. When there is more than one
relationship that can be established between two tables, Power BI
Desktop is given the ability to automatically produce visualizations
that incorporate both tables via the use of the active relationship.
Understanding additional options
Power BI Desktop will automatically specify extra options depending
on the data in your tables whenever a relationship is made,
regardless of whether the relationship was built using autodetect or
manually by the user. The Create relationship and Edit
relationship dialog boxes each include a section in the bottom part
of their interfaces dedicated to these new relationship options.
In most cases, Power BI will automatically configure these
parameters, and you won't need to make any changes to them.
However, there are some scenarios in which you may find it useful to
specify these parameters on your own.
Automatic relationship updates
You can choose how Power BI handles relationships and makes
automatic adjustments in your reports and models. From Power BI
Desktop, go to File > Options and Settings > Options, and after
that, pick Data Load in the left pane. This is where you determine
how Power BI manages relationships; the menu of available options
for Relationships is shown.

Three options can be selected and enabled:


● Import relationships from data sources on first load: This
option is set by default and allows the import of relationships
from data sources on the very first load. When this option is
enabled, Power BI searches for your data source to identify
any relationships that have been defined there, such as
primary key and foreign key relationships in your data
warehouse. When you first load data into Power BI, any such
relationships that are discovered are replicated in the
software's internal data model. You don't need to locate or
define such relationships on your own if you choose this option
since it allows you to start working with your model right away.
● Update or delete relationships when refreshing data: When
refreshing data, you can choose to either update or remove
existing relationships; by default, this option is deselected. If
you choose to activate this feature, Power BI will check for
changes in the data source relationships whenever you refresh
your dataset. If any of those relationships are modified or
removed, Power BI will reflect those changes in its own data
model by updating or removing the relevant relationships to
ensure consistency.
Caution: If you are using row-level security that is dependent on
created relationships, choosing this option is not something that we
advocate doing. If you get rid of a relationship that your RLS settings
are dependent on, your model can end up being less safe as a
result.
● Autodetect new relationships after data is loaded: When
this option is enabled, Power BI runs the autodetection
described in the Autodetect during load section after data
is loaded, attempting to find and create relationships based
on the column names and data in your tables.
Identifying Our Relationship Columns
Establishing proper relationships between tables in Power BI
requires that you first locate and identify the columns that represent
those relationships. These columns are the keys that link and tie the
many tables to one another, hence they are very important.
Identifying the relationship columns can be done in some
different ways, including the following:
1. Data Understanding: To begin, you should comprehend
both the structure and the content of your tables.
Determine the fields or columns that are shared by many
tables and include information that is connected. Search
for columns that have names, data types, or value ranges
that are similar to one another since this may suggest a
probable relationship.
2. Data Modeling Best Practices: When developing your
data model in Power BI, follow data modeling best
practices. A good example of one of these strategies
is the use of unique identifiers or surrogate keys that can
double as relationship columns. Such columns include
primary keys, foreign keys, and natural keys, all of
which uniquely identify entries in different tables.
3. Source Documentation or Data Dictionary: When
working with your data sources, consult any
documentation or data dictionaries that are currently
accessible. These resources often include information on
the relationships between tables, including the particular
columns that are responsible for establishing the
connections between the tables.
4. Analyzing Data Dependencies: Explore the data
dependencies as well as the business logic included inside
your dataset. It is important to have a conceptual
understanding of the relationships that exist between
tables as well as the columns that make these
relationships possible. For the sake of illustration, the
"CustomerID" column may act as the relationship column
between the "Customers" table and the "Sales" table
in a scenario involving sales data.
5. Visual Exploration: Use visual exploration techniques in
Power BI to visually analyze the data and identify potential
relationship columns. Create visuals such as scatter plots,
bar charts, or matrix visuals to observe patterns and
correlations between columns across tables.
6. Collaboration and Expert Input: Engage with domain
experts, data analysts, or other stakeholders who have a
deep understanding of the data. Collaborate with them to
identify the appropriate relationship columns based on
their expertise and knowledge of the data domain.
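As a quick check while exploring candidate relationship columns, you can test in M whether a column's values are unique, which tells you whether it can serve as the "one" side of a relationship. The query and column names here are hypothetical:

```m
let
    // List.IsDistinct returns true when every value in the column is unique.
    IsCandidateKey = List.IsDistinct(Table.Column(Customers, "CustomerID"))
in
    IsCandidateKey
```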

Time to Get Building


Using the Manage Relationships wizard, which can be accessed
from the ribbon in the Model view, is the first available choice.
After bringing up the wizard and selecting the New button, we
choose our two tables and, by clicking the column names in each
table, determine which columns will serve as the foundation for the
relationship. When we confirm the dialog, the relationship is
established.

Building relationships with the second method is considerably
quicker. When you are in the Model view, you can create
relationships between tables by clicking a column in one table,
holding down the mouse button, and dragging it to the column you
want the relationship to be built into in another table. Then you
release your mouse button and, voila, relationship. The
disadvantage of using this strategy is that it makes it more likely for
errors to occur, such as dragging from or to the incorrect column. It is
also more prone to error when you have tables with numerous
columns and you must drag into areas of a table that have to be
scrolled through to locate the proper column; this can rapidly
become monotonous, and when it becomes tedious, errors arise.
Following the construction of those relationships, we now have a
primary data model that resembles the illustration shown below.
Because of this, we can arrive at a star schema, which
consists of several dimension tables interacting with our fact table
while encircling it like a star. This not only improves optimization but
also makes the model very simple to read and comprehend. You are
immediately aware of which tables can filter the fact table (or tables)
that are of interest to you and how that filter operates, either in a
unidirectional or a bidirectional fashion.

Let’s Get Reporting


We've got our data in, we've had it converted, and we've got it
modeled. Now that we have this opportunity, we can begin gathering
the information that will satisfy our curiosity. There are a plethora of
inquiries that we can make.

We Need a Name...
Here are some words of advice: In the reports that I look at, I often
find that something as basic as a title that reminds people of what
they are looking at can pay off in a significant way. This title can be
something that we repeat on multiple pages, in which case it will
serve as the title of the report; otherwise, we may choose to use
various titles for each page in this report. We also have the option of
having an entire page of the report that is dedicated to acting as a
title introduction of some kind. There are positives and negatives
associated with each method, but for the sake of this example, we
will just offer a straightforward title for the report at the top of the
page. We will do this using the good old-fashioned text box. Inside
the Report view, the "Text box" option can be found inside the Insert
portion of the ribbon. We are aware that a text box does not provide
anything particularly fascinating, yet it is useful. As can be seen in
the picture below, I have positioned the text box so that it is at the
top of my reporting area, justified it in the middle, and added some
wonderful italic to it since I can't get enough of italics. Because I
prefer Segoe, the default font that comes with Power BI, and
because I want the title to be lovely and readable, I enlarged it quite
a bit beyond its original size.

Cards Help Identify Important Data Points


We will begin by highlighting specific values that I believe are
valuable for understanding how our cohort of students performed. I
am also interested in "lines" of information. I just can't help but like
building with blocks. The first card in our deck contains a tally of the
number of students for whom ISOM 210 was the very first class they
took inside the department. This makes use of the column in the
database labeled "1stCourseInDepartment?" However, if you create
a card visual and drop that column into it, you will most likely see
something like the picture below:

Let's take a moment to discuss what just happened here. To begin,
this is not a numeric column, which means
that it does not have a summary that is applied by default. Card
graphics always present some kind of data summary, and in this
instance, it is displaying the first result from the table that meets our
filter context, which in this instance is nothing since there is no filter
context. Therefore, we have to alter the summary so that it reads
"count" rather than "first." To do this, in the Fields list of the
Visualizations pane, we click the down arrow next to the column
name, and then in the drop-down menu that appears, we pick the
kind of summary that we want, which in this instance is Count.
When we take a look at the picture below, we can see that while
we've come a long way, there's still a way to go before we reach our
result.
The fact that the card now shows the count rather than the word first
is a positive development. Next, we must provide filter context. We
would want Power BI to provide us with a count; however, we are
only interested in the count of the "Y" values. We can see that there
is now no meaningful filter context when we look at this visual's Filter
pane at the moment.
Let's provide some more context by dragging the column into the
Filters section of this visual area of the Filter pane. Then, make use
of the Basic filtering option to pick just the values that are "Y." It is
important to point out that the Count version of this field already
exists since it is the one that is being shown here.
To get the column's base values to filter, we will need to add it once
again to the Filter pane. When we carry out those steps, we can
arrive at the number 24 we need. Let's alias the column by
double-clicking its name in the Fields section of the Visualizations
pane, then renaming it to whatever we'd like it to be called.
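The same "count only the Y values" logic can also be written once as a DAX measure, which avoids re-adding the column to the Filter pane on every visual. A minimal sketch, assuming the table holding the flag is named Students (a hypothetical name; substitute your own — the column name comes from the dataset above):

```dax
// Count only the rows whose flag column is "Y".
// 'Students' is an assumed table name for illustration.
First-Course Students =
CALCULATE (
    COUNTROWS ( Students ),
    Students[1stCourseInDepartment?] = "Y"
)
```

Dropping a measure like this onto a card yields the filtered count directly, with no visual-level filter required.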
Bars, Columns, and Lines
Imagine that you are the sales manager for a retail firm and that you
have been tasked with the responsibility of evaluating how various
goods in your inventory have performed.
To bring your data to life, you turn to Power BI and its
visualizations.

1. Bars: Using bars, you can compare the sales numbers of


various items that fall into a variety of categories. For
instance, you can generate a bar chart to show the total
sales for each product category, such as electronics,
clothing, and accessories. The length of each category's
bar provides a visual representation of the category's
overall sales performance. You can easily determine
which categories are doing better than others by
comparing the bars, which gives you the information you
need to make educated decisions about inventory
management and marketing tactics.
2. Columns: Now, let's explore your product
analysis in more depth by employing columns. Suppose
you are interested in analyzing the number of units sold for
each unique product that falls under a certain category.
You can group the items according to their attributes,
such as brand or price range, by building a chart with
vertical columns. The sales figures for each group are
then shown next to one another as columns.
Through the use of this visualization, you will be able to
evaluate the performance of the many product groups that
are included inside a category as well as determine which
particular goods are driving sales. It gives you the ability to
concentrate your efforts on advertising certain items or
improving the variety of products that you provide.
3. Lines: Moving on to lines, let's say you're interested in
monitoring the yearly change in the average monthly
revenue of your organization. You can exhibit the data
points for revenue on the vertical axis of a line chart in
Power BI, and the months can be shown on the horizontal
axis of the chart. The data points are connected by the
line, which illustrates the evolution and variation of
revenue over time. By examining the line chart, you can
discover seasonal trends, pinpoint periods of growth or
decline, and acquire insights into the general trend of
revenue. This information can be
used to help with financial forecasts, the identification of
possible bottlenecks, and the evaluation of how successful
marketing initiatives have been.
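The "average monthly revenue" measure behind such a line chart could be sketched in DAX. This is an illustration, not the book's model: the Sales table, its Revenue column, and the Date table are all hypothetical names:

```dax
// Average revenue per month across the months in the current filter context.
// 'Sales'[Revenue] and 'Date'[Month] are assumed, illustrative names.
Avg Monthly Revenue =
AVERAGEX (
    VALUES ( 'Date'[Month] ),              -- one row per month in context
    CALCULATE ( SUM ( Sales[Revenue] ) )   -- total revenue for that month
)
```

Placing 'Date'[Month] on the X-axis and this measure on the Y-axis yields the trend line the scenario describes.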

Conclusion
The journey from raw data to a comprehensive report in
Power BI involves importing and transforming data,
consolidating tables, establishing relationships,
interface and a robust set of tools and features to facilitate each step
of this process. By leveraging the capabilities of Power BI, users can
turn raw data into actionable insights, enabling data-driven decision-
making and effective communication of information.
CHAPTER 7
ADVANCED REPORTING TOPICS IN
POWER BI
AI-Powered Visuals
Microsoft has already included four AI-powered visuals in Power BI
Desktop, and it is anticipated that the company will continue to invest
in the creation of more AI-powered visuals to further enhance the
capabilities of Power BI. Both the Desktop version and the online
service may make use of these graphics. They are located in the
Visualizations pane, which can be seen in the picture below. The
visuals are "Key influencers," "Decomposition tree,"
"Q&A," and "Smart narratives," in that order from left to right. Now,
let's take a look at each one individually.

Key Influencers
Power BI users are given the ability to do causal analysis by way of
the Key Influencers visual. This entails studying the relationships that
exist between various factors and the influence that these
relationships have on a certain measure.
Power BI makes use of powerful machine learning algorithms to
automatically determine the key influencers based on the data that is
presented to it. Users can identify the most important causes behind
a statistic with the assistance of this graphic, which facilitates data-
driven decision-making.
Benefits and Significance: The Key Influencers visual has a
significant amount of importance for companies and organizations
operating in a wide variety of fields.
It gives users the ability to:

1. Identify influential factors: Users can discover which


variables or dimensions have the most significant effect on
a measure of interest by using the Key Influencers graphic.
Because of this knowledge, firms can concentrate their
efforts on the aspects that have the most impact, which
results in more productive strategies and better outcomes.
2. Acquire actionable insights: Organizations can obtain
actionable insights to enhance their operations, marketing
campaigns, product offerings, and more if they have a
solid awareness of the key influencers in their respective
industries. These insights make it possible to make
educated decisions, which in turn enable organizations to
align their efforts with the factors that are the most
significant in determining their level of success.
3. Optimize your marketing strategies: The Key
Influencers graphic can assist marketers in determining the
aspects that have the most significant influence on the
actions of their customers. Businesses can adjust their
marketing plans to target certain groups more successfully
by evaluating client segments, demographics, campaign
data, and other characteristics, which leads to improved
customer engagement and conversion rates.
4. Improve the overall customer experience: By having an
understanding of the key influencers, businesses can
determine the aspects that have a substantial impact on
the level of consumer satisfaction. Businesses can
prioritize changes and create tailored experiences that
address the most significant pain points by analyzing
customer feedback, support interactions, product use
statistics, and other relevant characteristics.

Using the Key Influencers Visual in Power BI


Now let's explore how to use the Key Influencers visual in
Power BI effectively:

1. Get your data ready by making sure you have a dataset


that has all of the required measures and dimensions. You
should clean and organize your data to get the most
accurate analysis possible.
2. Launch Power BI and either open an existing report or
create a new one.
3. Open the Visualizations pane: Find your way over to the
right-hand side of the Power BI interface, where you'll find
the Visualizations pane.
4. Choose the Key Influencers visual by navigating to the
Visualizations pane and selecting the "Key Influencers"
visual icon with your mouse. It is similar to a scatter plot
that also includes a target symbol.
5. Select the measure that will be analyzed by dragging and
dropping the metric that you want to study into the
"Analyzing" field well. You wish to get a deeper
understanding of this measure or result, such as sales,
revenue, or the level of pleasure experienced by
customers.
6. Choose the influencers by dragging and dropping the
possible dimensions or influencers into the "Explain by"
field well. These are the factors or variables that you
believe may affect the measure, such as product type,
location, or marketing campaign.
7. Visual creation and customization: Power BI will
automatically analyze the data and build a visualization
that displays the key influencers. This visualization may
then be customized by the user. The graphic may present
a ranking of influencers based on their effect, coupled with
a bar chart or table that corresponds to the ranking.
Utilizing the many formatting options available in the
Visualizations tab, you can personalize the look and style
of the graphic.
8. Interpret the results: Analyze the information supplied
by the Key Influencers visual. Use the insights gained to
understand the relationships between variables and their
impact on your chosen measure, and let those insights
guide you toward evidence-based decisions and
successful outcomes.

Let’s work with an example.


We will be demonstrating the “Key influencers” visual from a
separate dataset from Microsoft known as AdventureWorks.
AdventureWorks is a sample database shipped with Microsoft SQL
Server. Additionally, there are several examples of AdventureWorks
being coupled with Power BI. In this particular example, I will
examine the total number of invoices based on the name of the area,
the product category, the product subcategory, the product
description, and the component number. This visual can be broken
up into a few different sections. Let's start by taking a look at the
picture that's been provided below, and then we'll proceed to analyze
the visual. First, we are going to zero in on the factors that affect the
total number of invoices for the organization. Figuring out what
factors contribute to an increase or reduction in that count might
provide important information. In principle, a greater number of
invoices corresponds to a greater number of sales. By moving our
way down the picture from top to bottom, we can acquire a better
understanding of what's going on. To begin, we can choose between
"Key influencers" and "Top segments." At this time, "Key
influencers" are chosen both explicitly and by default. The next
thing you will notice is a question that is structured around the topic
of what causes whatever it is that we are evaluating to either grow or
decrease. In this case, the default setting is Increase, but I changed
it to Decrease so that more results would be shown. Then, on the left
side of the screen, you'll find a list of chosen circumstances along
with an explanation of how each condition influences the value that
I'm examining.
In this particular instance, the count of invoices for the Touring
Frames subcategory is 43 less than the overall average of all other
subcategories. This particular subcategory is highlighted, and you
can find it in the column graph section of the picture, which is located
farthest to the left. Because the dimension in question is
Subcategory, a graph is shown to the right of the screen that
provides information on the typical number of invoices assigned to
each subcategory and underlines the particular group that has been
chosen. You'll see that the graph on the right can be scrolled as well,
which enables me to view all of the subcategories in this manner.

Let's begin by returning to the very top of the visual and selecting the
option labeled "Top segments." In the picture below, you can get a
first glimpse at what it looks like;
The first line has been left the same as it was previously. The second
line, on the other hand, is written in a slightly different way. You'll be
able to see that the question that appears in the selection box has
been rephrased somewhat in its current iteration. The question that
is being asked here is not "What might be causing a value to
increase or decrease," but rather "When is the point of analysis more
likely to be high or low?" In this instance, we will go with Low, even
though High is the setting that comes by default. For this query,
Power BI has shown a tiny scatterplot and provided information on
how it is arranged. The figure illustrates that Power BI has identified
four sections that it considers to be interesting and worth delving into
further. In this instance, it is proceeding from the lowest average
invoice count to the highest, while simultaneously displaying the total
number of records that are included inside each segment.
When you choose one of those bubbles (in this example, we are
going to select segment 2), you'll be able to see on the left all of the
components that go into creating that segment. You can see
segment 2 in contrast to the total value on the right side of the figure
below, and you can also see how much of the data is included in the
segment that you picked to examine.
The visual in and of itself is powerful; nevertheless, using it
can be extremely frustrating. The visual will update itself after each
addition when you add categories to the "Explain by" portion of the
Visualizations pane. It may have difficulty locating combinations of
data that will lead to the identification of noteworthy influencers or
segments. It is not necessarily more beneficial to keep adding
categories though; the more categories you add, the smaller your
population becomes for each combination of categories that you
have added.
To make efficient use of this visual, you will need to have a deep
comprehension of the data as well as some element of human
judgment. What does your gut feeling tell you should be the
most significant? Begin there and see whether your presumptions
are borne out by the evidence. You'll see that this is a recurring motif
throughout the artificial intelligence visuals section. The AI visuals
cannot make these things function on their own; human intellect is
required to make them operate.

Decomposition Tree
The Decomposition Tree visual is a useful component of Power BI
that gives users the ability to study and explore complicated data
structures in a way that is both understandable and interactive.
Users are given the ability to deconstruct a measure into its parts,
which may include categories, dimensions, or characteristics, and to
graphically illustrate the hierarchical connections that exist within the
data. The composition and distribution of a measure over several
dimensions can be comprehended in a complete and approachable
manner with the help of this visual representation.

Key Benefits of the Decomposition Tree


1. Hierarchical Analysis: The Decomposition Tree is a tool
for visually representing data hierarchies, which makes it
much simpler to examine how different measures are
distributed across various levels. It provides consumers
with the ability to go deeper into the data, therefore
illuminating the underlying elements that contribute to a
certain indicator.
2. Interactive Exploration: Users can interactively explore
the decomposition tree by expanding or collapsing nodes
to zero in on certain levels or dimensions. Because of the
interactivity, a more detailed study of the data is possible,
which, in turn, helps unearth insights that would otherwise
likely stay concealed.
3. Contribution Analysis: The visual presents a concise and
understandable description of how each dimension or
category contributes to the overall measure. Users can
determine the elements that have the most impact on a
certain result, which assists them in prioritizing areas in
need of modification or optimization.
4. Comparisons and Trends: The Decomposition Tree is a
useful tool for making comparisons between the many
levels or dimensions that are included within the hierarchy.
To comprehend trends, patterns, and variances throughout
the data, users can quickly compare the contribution of
various categories or dimensions.

Using the Decomposition Tree Visual in Power BI


1. To begin, launch Power BI and then open your report.
2. Click the Visualizations tab: Find your way over to the
right-hand side of the Power BI interface, where you'll find
the Visualizations pane.
3. Select the visual: Look in the Visualizations pane for the
"Decomposition Tree" visual icon and click it. It typically
takes the form of a tree-like structure.
4. Define the measure: To do this, go to the "Analyze" box
and then drag and drop the measure that you want to
analyze into the well. This metric indicates the result or
statistic that you want to dissect to have a deeper
understanding of it.
5. Define the hierarchy by dragging and dropping the
dimensions or characteristics you wish to include in the
decomposition tree into the "Explain by" field well.
This will allow you to define the hierarchy. These are the
categories or elements that have been taken into
consideration in the measurement.
6. Make adjustments to the visual: Power BI will
automatically construct a decomposition tree depending on
the measure and hierarchy that you choose. You can
personalize the visual by modifying the layout, formatting,
and style options that are found in the Visualizations tab.
7. Explore the decomposition: To zero in on a particular
aspect of one of the levels or dimensions, you can interact
with the visual by expanding or collapsing nodes within the
tree. Analyze the contributions, and dig further into the
facts, so that you can get a more in-depth understanding.
8. Make use of extra features: Power BI's Decomposition
Tree visual has additional capabilities that can be used to
further improve the analysis and enhance the presentation
of the findings. These features include the ability to sort the
components, filter the elements, and highlight certain
aspects.

Q&A
The Q&A visual is meant to give end users the ability to ask
questions about the data using natural language. Power BI will make
an effort to take your inquiry and
convert it into a chart or a group of data that provides an answer to
your inquiry. The Q&A format has the potential to be quite effective,
but right out of the box, it can seem like it is missing a few of its
screws. Although Microsoft seems to be making progress on the
Q&A feature with nearly every new update, I still wouldn't call it
"natural language" just yet. Some of the recommended questions
may not seem relevant. However, there are numerous things you can
do to give Power BI just enough context to make the Q&A feature
seem more natural. Putting a question-
and-answer visual onto our canvas, the first thing we notice is the
outcome, which is seen in the picture below;

Simply by glancing at the questions that have been posed, we can


tell that Q&A performs a decent job of identifying the names of our
fields. However, the meanings of those field names may not be
immediately clear to other users. For example, the suggested
question "What is the total grade by the category column?" assumes
the user already knows what "the total grade" means. In addition, what exactly is meant by the
term "category"? We are both aware that this is the criterion that
determines whether a certain assignment is to be considered
homework, an exam, or extra credit. However, can we assume that
our audience is aware of this?
We can see one of the problems with Q&A and the sample questions
it creates by looking at this particular illustration. At first, it can only
work with the names of our columns and the connections between
them. It will try to apply some straightforward aliasing whenever it
can, but that alone won't necessarily get us all the way there.
Nevertheless, we can still make use of one of the sample questions
to illustrate what Q&A can deliver as a consequence. In this
particular instance, I will choose the first question that is located to
the left, which is titled "What is the total grade by type?" In this
scenario, Power BI creates a bar chart that illustrates how the overall
grade metric is distributed among the various types of assignments.
That outcome is shown in the image below. Take note that this visual
interacts with every other visual on the canvas and can have other
visuals interact with it in the same way that everything else can
interact with anything else. When you start cross-filtering across
different visuals using the Q&A, make sure you don't forget what
question you asked!

Breaking down the visual itself, there will always be the option to add
synonyms unless you click the X next to the button. Let's imagine
that you discover a result from Q&A that you truly like and you want
to make it a more permanent fixture in your report. If you want to
transform the Q&A visual into a visual of the kind that was created,
you can do so by clicking the first button to the right of the text box.
The widget right next to it takes you to the settings for the Q&A
section. When hovered over, the "i" in the circle displays a brief
tooltip that confirms the information that the visual is displaying.
We can quickly add to, modify, or eliminate a portion of a question
since it is still in the text field. When you make changes to a
question, the text box will display a drop-down menu that uses a
search engine to try to guess what the question is that you are trying
to ask. This function, as you would assume, improves with usage,
and the more experience you have using it, the more accurate your
synonyms will be. If we intend to make considerable use of the Q&A
feature, we may give the column headings more user-friendly titles.
This could make it easier for the AI to figure out what the meaning of
a column is. We can also add synonyms to particular column
descriptors, which makes the language simpler for users as
they fill in their queries.
In the picture below, there's a button in the upper-right-hand corner
that reads "Add synonyms now." If we choose that option, the box
for configuring the Q&A will pop up, and when we select the Field
synonyms section, we will be sent there immediately, as seen in the
figure below. Take note that it will display all of the tables in the
model, even the ones that we have concealed using the Report view
settings. On the other hand, it will exclude such tables from the Q&A
options by default.

When you choose a table, you will get a list of columns under the
heading "Name," a list of presently recognized synonyms under the
heading "Terms," and a list of suggested terms to the right of that
heading. At the level of each column, you will continue to see a
toggle that gives you the option to include or exclude that column
from the Q&A results. If I click the + button that is located next to any
of the terms that are labeled as "Suggested terms," then that term
will be added to the list of terms that are associated with that column
instantly. The most effective way to illustrate how this works is to
think about the process from right to left. The section referred to as
"Suggested terms" includes a list of potential alternative names for
the Terms section. When a user chooses to use one of the
recommended terms, it is added to the "Terms" section.
When Power BI comes across a question in the Q&A section, it will
utilize those terms to determine which column it should use to
answer the question by using them as real synonyms. The column
labeled "Name" displays the name of the thing that the provided
phrase refers to. The names of tables, columns, or measures can all
be referred to by terms. You will also see that Power BI already has
certain fundamental terms preset; however, these terms are not
especially illuminating. It is my recommendation that, before putting
Q&A into action, you test the system with as many questions as you
can think of. Check out the results that are returned, make any
required changes to include synonyms, and then try again. There are
many options for fine-tuning the outcomes, and enhancing the
quality assurance in your firm is going to be a process that requires
iteration.
The next two aspects of the Q&A setup are the "Review questions"
and "Suggest questions" functions, respectively. Within the
"Review Questions" area, you will have the ability to evaluate every
Q&A question that has been asked in the most recent 28 days
across all of the datasets that you have access to. It is frustrating
that the list shows every dataset you can access, rather than only
those that have actually had Q&A questions asked against them. If,
on the other hand, someone does have a
question, you can analyze it as well as the response that Power BI
provides to verify for correctness and offer suggestions for how you
would have liked the answer to be presented for that specific topic.
"Suggest questions" gives you the ability to provide questions to
your audience, such as the ones that were shown when we first
viewed the visual. The distinction lies in the fact that they are
questions that you have already evaluated. Consider these
recommended questions to be user-friendly cheat instructions that
can be used by other users when constructing reports utilizing Q&A
visuals. You are aware of the outcomes that will be shown when that
particular question is chosen. These can be excellent jumping-off
locations for your business to begin utilizing Q&A in something of a
sandbox, to instill trust in users about the outcomes, and to stimulate
investigation. In addition to this, this can help provide a particular set
of responses to certain inquiries. Users can choose recommended
questions inside a single visual, which can assist them in better
framing additional visuals on a particular report page. The question-
and-answer portion of a conference call is a tricky beast, and if you
want to get the most out of it, it's just like getting to perform at
Carnegie Hall. It requires a lot of practice, practice, and more
practice. You can also deactivate Q&A against datasets in service if
you would rather not enable customers to utilize the Power BI
service's Q&A feature against a dataset.
Smart Narrative
The visual known as "smart narrative" is a little bit different in that it
does not carry out any actions by itself. If you attempt to place this
visual into a blank canvas, you will get an error notice, and a text box
will be produced instead. However, when put on a report page that
already has other visuals on it, the "Smart narrative" will "read" the
other visuals by constructing a narrative based on the data points
included in those visuals, and that narrative helps the audience
"read" the data. It is essential to keep in mind that a visual with a "Smart
narrative" can be cross-filtered and interacted with by any other
visual on the canvas, and the "Smart narrative" visual will update its
story in response to these actions. You can also add comments to
the "Smart narrative," which will be there independent of any
changes to the filter context that would impact the remainder of the
narrative. This is possible since the "Smart narrative" is still
contained inside a functional text box.
In my opinion, the so-called "Smart narrative" is a double-edged
sword. On the one hand, it is an incredible boon for the
advancement of data literacy. This visual does an excellent job of
breaking down the data in a manner that can make it intelligible for
nontechnical users and those who aren't as acquainted with the
data. This can have so much value for report users since it can make
the data more accessible. Having said that, it is also rather large and
cumbersome to use. In the end, it's just a big text box that can
construct narratives so lengthy that you have to scroll up and
down to read them all the way through. Because there is a limited
amount of design space available, determining whether or not the
"Smart narrative" visual is the best option ultimately depends on the
context of your audience and the design options that you make.
What-If Analysis
What-If Analysis is a powerful approach that helps users
comprehend the repercussions of changes in their data by
examining alternative scenarios, posed as "what-if" questions.
Users are given the ability to experiment with a
variety of inputs, variables, and assumptions to assess the many
possible results. Users of Power BI can simulate and display the
effect of changes made to key performance indicators, metrics, and
other data pieces by using the What-If Analysis feature.

Benefits of What-If Analysis in Power BI


● Decision-making: When it comes to making decisions, the
"What-If" Analysis gives decision-makers useful insights that
allow them to evaluate the possible effect of various options or
situations. Users can make better options and reduce risks by
having a better awareness of the whole range of potential
outcomes.
● Forecasting and Planning: What-If Analysis allows users to
model future possibilities based on alternative assumptions and
inputs. Knowing
the probable outcomes that may occur under a variety of
circumstances provides improved methods of forecasting,
budgeting, and strategic planning.
● Sensitivity Analysis: This method helps determine how
sensitive your results are to key variables in a dataset. Users can identify
the variables that have the most significant influence on their data
by manipulating individual inputs and monitoring how those
changes play out.
Techniques for What-If Analysis in Power BI
● Data Table: This Power BI feature lets users specify and alter
individual values within a table. By modifying these values, users
can observe the immediate effect on calculations, measures, and
visualizations.
● Measure Branching: This Power BI technique lets users create
new variants of existing measures. By duplicating a measure and
adjusting its logic, users can compare and evaluate multiple
scenarios within the same report.
● Scenario analysis: Power BI supports this through parameters
and measures. Parameters serve as inputs that users can tweak
to simulate various scenarios, while measures calculate outcomes
based on those parameters, providing insight into the effect of
each change.
● Advanced Analytics: Power BI can also interface with other
advanced analytics tools like R and Python. These tools provide
users with strong statistical modeling and simulation capabilities,
enabling them to conduct complicated What-If Analysis that is
based on advanced algorithmic frameworks.
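To make the parameter-and-measure idea concrete, here is a minimal, hypothetical sketch in plain Python (outside Power BI): the "parameter" is just an input value, and the "measure" is a function that computes an outcome from the data plus that parameter. The sales figures and the price-change scenario below are invented purely for illustration.

```python
def total_revenue(sales, price_change_pct):
    """'Measure': total revenue after applying a what-if price change."""
    return sum(qty * price * (1 + price_change_pct / 100)
               for qty, price in sales)

# Invented sample data: (quantity, unit price) pairs.
sales = [(10, 5.0), (4, 12.5), (7, 8.0)]

baseline = total_revenue(sales, 0)       # no change
optimistic = total_revenue(sales, 10)    # prices up 10%
pessimistic = total_revenue(sales, -10)  # prices down 10%
```

Each call is one scenario; comparing the three results side by side is exactly the comparison a What-If parameter lets report readers perform interactively.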

Best Practices for What-If Analysis in Power BI


• Clearly Outline Your Objectives: One of the first things you
should do is define the goals and questions you want to answer
with the What-If Analysis. This will guide your selection of relevant
variables, assumptions, and parameters for the study.
• Start Simple: Begin with simple scenarios and gradually increase
the complexity. Starting small lets users build confidence and a
better understanding of the mechanics of What-If Analysis before
moving on to more complex examples.
• Work Together and Iterate: Involve stakeholders and subject
matter experts in conducting the What-If Analysis. Collaboration
and iteration contribute to a greater knowledge of the data, as well
as the validation and effective application of the insights gained.
• Document and Communicate: Document the assumptions,
variables, and outcomes of each What-If Analysis scenario.
Communicating insights and findings effectively helps
stakeholders grasp the significance of the results and make
decisions based on that understanding.
To perform What-If Analysis in Power BI, the first step is to
create a What-If parameter.

1. Once your data has been imported, navigate to the Modeling
tab on the Power BI Desktop ribbon. This tab provides access
to a wide variety of data modeling tools and functions.
2. On the Modeling tab, choose the "New Parameter" button. A
dialog box will appear in which you can specify the What-If
parameter you want to create.
3. Within the "New Parameter" dialog box, supply the
following information:

▪ Name: Give your parameter a descriptive name.


▪ Data type: Choose the proper data type for your parameter:
Decimal Number, Whole Number, or Date/Time.
▪ Minimum and maximum values: Set the minimum and maximum
values to determine the value range for your parameter. This
establishes the scope of the What-If analysis you want to carry
out.
▪ Increment: Specify the step by which the parameter's value
should change when it is manipulated.
Parameter Setup
The What-if parameter includes several inputs, each of which needs
to be set carefully. Name is self-evident but also has a distinct
impact: whichever name is chosen, the data model will construct a
pair of objects to accommodate the parameter.
Once it is finished, the What-if parameter builds a DAX table using
the GENERATESERIES function, with the table named after the
parameter. It then generates two separate objects from that table.
The first is the parameter field, which can be placed onto the canvas
as a slicer to modify the parameter's value on the report page. The
second is a measure that uses the SELECTEDVALUE function; you
can reference this measure from other measures to obtain the
parameterized value and use it in other DAX expressions. This will
be illustrated in just a moment.
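As a hedged illustration of what those two objects do, the following plain-Python sketch mimics the behavior of GENERATESERIES and SELECTEDVALUE. It is a stand-in for the DAX, not the DAX itself; the selected values are invented.

```python
def generate_series(start, stop, increment):
    """Mimics DAX GENERATESERIES: the table of candidate parameter values."""
    values, v = [], start
    while v <= stop:
        values.append(v)
        v += increment
    return values

def selected_value(selection, default=0):
    """Mimics DAX SELECTEDVALUE: one value selected in the slicer
    returns that value; anything else falls back to the default."""
    return selection[0] if len(selection) == 1 else default

series = generate_series(-100, 100, 1)  # the parameter's generated table
param = selected_value([25])            # slicer has 25 selected
fallback = selected_value([])           # nothing selected, so the default
```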
After giving our parameter a name, the next step is to specify the
possible forms of data it can take, which include a whole number, a
decimal number, or a fixed decimal number. A fixed decimal number
has a fixed position for the decimal separator. This can be handy
when rounding might cause mistakes, or when you are adjusting
extremely tiny values and errors could compound.
Because we do not dislike ourselves to the extent that we would
need decimal points to grade, we can proceed with selecting "Whole
number." The minimum and maximum values determine the first and
last values in the GENERATES statement, which creates the table
and values that the parameter will use. The minimum and maximum
values are used to establish the first and last values. In my opinion, it
is preferable to maintain the value of zero at some point in the
series. Having zero as part of the series can be beneficial when I
want to illustrate the baseline value without making any adjustments,
even if it doesn't have to be the absolute lowest or highest number.
In this particular scenario, I will choose a minimum of –100 and a
maximum of 100 for the range.
Next, I determined the increment of change. This defines the step
size available when altering the value. For example, if I selected 5,
with a minimum of –100 and a maximum of 100, I could choose
–100, –95, –90, and so on up to 95 and 100, but I could not choose
97, 43, or –37. If my increment were 1, I could pick any integer
between –100 and 100. If your increment depends on decimals, it
goes without saying that you should use a decimal or fixed decimal
data type. For the sake of clarity, I will use an increment of 1 for this
illustration.
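The effect of the increment can be checked with a quick Python sketch of the selectable values:

```python
# With a minimum of -100, a maximum of 100, and an increment of 5,
# the slicer can only land on multiples of 5 in that range:
selectable = list(range(-100, 101, 5))
# -100, -95, -90, ... 95, 100: 41 values in total. Values such as
# 97, 43, or -37 are not selectable; with an increment of 1 they would be.
```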
Finally, I specified a default value. This is the value shown when I
first drag the slicer from the Fields list onto the canvas, and it is also
the value the parameter resets to if someone uses the "Reset to
default" function in the Power BI service. If zero is chosen as the
default, the slicer will initially display a blank value. There is no need
to be concerned about this, since it is the expected behavior. One
last decision remains: whether we want Power BI to automatically
add the slicer it builds to this report page.
DAX Integration of the Parameter
After selecting the OK button, I am now in possession of a
parameter. Nevertheless, that parameter has no effect yet! This is
because nothing references it at this time. We must call the
parameter in with some value through a measure that we then place
on the canvas. Because of this, the measure created alongside our
parameter is of utmost significance. By modifying one of our
measures, the slicer on the report page can add the parameter's
value to the total. Look through your measures and note, in the
figure below, where the new measure built on the What-if parameter
fits. We are working with the "Total Grade" measure.

Keeping this adjustment in mind, I'm going to concentrate on the following question: what if our overall grade was X number of points
higher or lower? What kind of impact would it have on the relevant
letter grade? For the sake of this example, I'm also going to change
the way that my Total Letter Grade measure is shown so that it
shows two decimal places rather than zero. I've put up a very basic
page with a term slicer, a What-if parameter slicer, and a table that
displays the total grade along with the total grade letter. What
happens to our table when we change the grade modifier to a value
between -100 and 100? Take a look at the picture below to find out.
As can be seen from the picture, 100 points in either direction do not
make or break the class. On the other hand, since we are discussing
grades, you might expect that with a smaller population the change
would be more noticeable. In our data, there were only seven
students in the summer 2023 term, although their average grade
was higher. Would 100 points bump summer 2023 up a letter
grade? Let's broaden the scope of this analysis to find out.

If we gave the students in the summer 2023 class an additional 100
points on their overall score, they would have completed the course
with a total grade of 89.49%, which is just 0.01% away from rounding
up to an A–.
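A sketch of the adjusted measure in plain Python: the what-if value from the slicer is added to the points earned before the percentage is computed. The point totals and letter-grade cutoffs below are assumptions for illustration; the book's actual gradebook data is not reproduced here.

```python
def total_grade_pct(points_earned, points_possible, modifier=0):
    """The 'Total Grade' idea with the what-if modifier folded in."""
    return round((points_earned + modifier) / points_possible * 100, 2)

def letter_grade(pct):
    # Assumed cutoffs, purely for illustration.
    if pct >= 90:
        return "A-"
    if pct >= 80:
        return "B"
    return "C"

base = total_grade_pct(8850, 10000)         # 88.5% with no modifier
boosted = total_grade_pct(8850, 10000, 100) # 89.5% with +100 points
```

With these assumed cutoffs, even the boosted grade stays a B, mirroring the "close but not quite" result described above.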

Parameter Modification
1. Locate the Parameter Control: You will need to look for
the visual element that represents the parameter in your
report or dashboard. This control can take the form of a
slicer, a numeric input box, or any other control that enables
users to interact with the parameter.
2. Modify the Value of the Parameter: By interacting with the
parameter control, you can change the value of the
parameter.

You can carry out one of the following actions, depending on the kind of parameter control you have:
▪ Slicer: If the parameter in question is shown as a slicer, you
can change the value of the parameter or the range it falls
within by using the options provided by the slicer. This will
cause the value of the parameter to be updated automatically
and will cause any visualizations related to the parameter to
be recalculated.
▪ Numeric Input Box: If the parameter is represented by a
numeric input box, you can change the value of the parameter
by clicking on the box and then manually entering a new value.
To use the new value, either click outside of the input box or
press the Enter key on your keyboard.
▪ Additional Controls: The parameter control could seem
different depending on how it's incorporated into the report or
the dashboard. To change the parameter value, it is necessary
to first follow the instructions that are included with the control.

3. Watch the Impact: After altering the value of the parameter, watch how the modifications affect the visualizations and
calculations that are connected to the parameter. For Power
BI to accurately represent the changed parameter value, the
impacted measures will undergo a recalculation, and the
visualizations will be updated appropriately. Users can
obtain insights into the What-If scenarios and understand
the dynamic influence that varied parameter values have on
the data as a result of this.
4. Iterative Modification: Users can iterate and adjust the
parameter value several times to explore a variety of use
cases and see the corresponding changes in the report or
dashboard. This repeated procedure
enables a greater comprehension of the data as well as the
possible results that can be achieved using a variety of
parameter settings.

R and Python Integration


In the fields of data analytics and statistical modeling, two of the
most popular computer languages to utilize are R and Python. Users
of Power BI can integrate R and Python scripts without any
complications, giving them the ability to make use of the extensive
libraries and functionality offered by these programming languages.
Installing the appropriate language interpreters and setting up Power
BI so that it can identify and run R or Python scripts are both
required steps in the process of integrating the two programs.
Benefits of R and Python Integration
Data workers stand to profit in a great number of ways from the
incorporation of R and Python into Power BI. First, both languages
provide substantial libraries for statistical analysis, machine learning,
and data manipulation that can be used to perform complex
calculations and gain useful insights, broadening the data
processing and analysis capabilities already available in Power BI.
Second, the connection makes it possible for users to
make use of individualized analytical models and visualizations that
were developed using R and Python. Data professionals can
improve their visualizations and present complicated analyses in a
way that is clearer and aesthetically attractive by using the ability to
include custom visuals and models in Power BI reports.
In addition, the integration of R and Python offers streamlined
processes for the preparation and manipulation of data. When it
comes to handling missing values, transforming variables, or
conducting complex data manipulations, R and Python offer
flexibility and efficiency. These languages provide powerful data
manipulation and cleansing tools, which allow users to preprocess
their data before visualizing or analyzing it in Power BI.
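As a small, stdlib-only sketch of the kind of preprocessing described above (in a real Power BI Python step you would more likely use pandas), here is mean imputation of a missing value; the rows are invented sample data.

```python
# Invented sample rows; None marks a missing value to be filled.
rows = [
    {"region": "North", "sales": 120.0},
    {"region": "South", "sales": None},
    {"region": "East",  "sales": 80.0},
]

# Compute the mean over the known values, then fill the gaps with it.
known = [r["sales"] for r in rows if r["sales"] is not None]
mean_sales = sum(known) / len(known)

for r in rows:
    if r["sales"] is None:
        r["sales"] = mean_sales
```

Cleaning like this before the data reaches your visuals is exactly the preprocessing role the text describes for R and Python scripts.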
Practical Applications
The incorporation of R and Python into Power BI makes available a
vast array of useful applications in the real world. For instance, data
scientists can construct predictive models by using machine learning
algorithms written in R or Python and then integrate those models
into Power BI. This allows real-time predictions and forecasting, which
in turn makes it easier for organizations to make decisions based on
data. Sentiment analysis is another application that includes
analyzing text data to ascertain a person's feelings or opinions on a
topic. Using R or Python, enterprises can construct models for
sentiment analysis, which can then be incorporated into Power BI.
This enables organizations to monitor real-time changes in social
media trends, customer feedback, and brand sentiment.
In addition, R and Python can be used to apply clustering and
segmentation algorithms that find groups or segments within
datasets; Power BI's visualization capabilities then help in
understanding consumer behavior, market segmentation, or product
classification. In
addition, time series analysis, anomaly detection, and forecasting
are some of the other areas where the combination of R and Python
in Power BI can be of tremendous use. Users can find trends, spot
abnormalities, and make accurate predictions based on historical
data by using the powerful time series analysis frameworks that are
available in R or Python.
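A minimal example of the anomaly-detection idea, written in plain Python with invented data; production scripts would use richer time series models from R or Python libraries.

```python
import statistics

def rolling_anomalies(series, window=3, threshold=10):
    """Flag indices whose value deviates from the median of the
    preceding window by more than the threshold."""
    flagged = []
    for i in range(window, len(series)):
        center = statistics.median(series[i - window:i])
        if abs(series[i] - center) > threshold:
            flagged.append(i)
    return flagged

data = [10, 11, 10, 12, 50, 11, 10]  # invented series; index 4 spikes
idx = rolling_anomalies(data)
```

Using the median rather than the mean keeps the spike itself from distorting the baseline once it enters the window.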
Limitations of Using R and Python
Although the incorporation of R and Python into Power BI delivers a
substantial number of benefits, it is essential to be aware of the limits
that are connected with the use of these programming languages.
Take into consideration the following restrictions:

1. Learning Curve: If you want to work with R and Python in Power BI, you'll need to be familiar with those programming
languages. Users who are not acquainted with R or Python
may need to spend some time studying the syntax and
libraries associated with these languages before properly
using their capabilities in Power BI. This may require them
to invest some time and effort.
2. Managing Dependencies: Utilizing R and Python inside
Power BI may need the management of dependencies and
the installation of all required packages and libraries. When
working with unique or sophisticated libraries that have
compatibility concerns or need specific setups, this can be a
hard aspect of the process.
3. Performance Considerations: Scripts written in R and
Python can be executed in Power BI, but doing so may
have an impact on performance. This is particularly true
when working with big datasets or performing operations
that need a lot of calculation. The execution time of R or
Python scripts can be slower compared to the execution
time of calculations performed natively inside Power BI,
depending on the complexity of the study.
4. Security and Governance: When incorporating external R
and Python scripts, extra security issues are brought into
play. This relates to governance as well. Companies must
check that the Power BI scripts they employ are trustworthy,
secure, and compliant with their organizations' data
governance regulations. When bringing in code from outside
sources, precautions should be taken to avoid or minimize
any possible problems.
5. Deployment and Maintenance: Deploying and maintaining
Power BI reports or dashboards that use R or Python scripts
can be more challenging compared to reports that depend
exclusively on Power BI's native capabilities. This is
because R and Python scripts are more sophisticated than
Power BI's native features. The deployment procedure can
get more complicated if there is a need to ensure that the
necessary R or Python environments are appropriately set
up and maintained across all of the various Power BI
setups.
6. Compatibility and Versioning: Compatibility issues
between various versions of R, Python, and Power BI can
be a source of worry. To prevent any compatibility concerns,
it is essential to check that the versions of R and Python
that are being used in Power BI are compatible with the
ones that are supported by Power BI.
7. Support and Community: Even though R and Python both
have sizable user communities, the support and resources
that are uniquely tailored to the usage of R and Python in
Power BI can be quite limited. Further work may be required
to troubleshoot problems or find specific solutions related to
Power BI integration.

Enabling R and Python for Power BI


To utilize R and/or Python with Power BI, the first thing that needs to
be done is to make sure that Power BI is aware of the locations of
the required home directories. This is necessary so that Power BI
can call the appropriate libraries and packages when those
languages are used. To do this, choose File > Options and settings >
Options from the menu bar. This will take you to the whole range of
options and settings for the Power BI Desktop. Some other
advanced options can be found in this menu, but the R scripting and
Python scripting settings are the ones that are relevant to us. When
these languages are installed, Power BI Desktop performs a decent
job of recognizing where their respective home directories are
located.
You will, however, need to browse to them manually inside the
Options dialog if they are not autodetected by the program. Take
note that if you click the hyperlink in either of these two options, you
will be sent to
the appropriate Microsoft page for installing R or Python. Once it is
finished, you are ready to go on to the next step.
R and Python in Power Query
First, we are going to examine how scripts written in R and Python
can be used to manipulate data, and then we can talk about R and
Python on their own as independent data sources. Scripts written in
R and Python can be located in the Transform portion of the ribbon,
which is the area that is the farthest to the right. The subsection is
called Scripts.
Either of these buttons, when clicked, will bring up a dialog window
into which you can paste the corresponding R or Python script. Keep
in mind that, just like every other transformation, an R or Python
script step is applied to the query in sequence. This has a useful
consequence: whatever state the data is in at the step before the
script runs is exposed to the script under the name "dataset," so
your R or Python code can reference it directly. Nor do R and
Python scripts have to be the final transformations.
You can instruct Power Query to perform transformations either
before or after a script written in R or Python is executed. This can
be useful in situations in which you might find it challenging to do a
particularly specific data transformation using the Power Query user
interface (UI), or in situations in which performing a transformation
would involve some sophisticated M.
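In a real Power BI Python transform step, the previous step's table arrives as a pandas DataFrame named dataset. The stdlib stand-in below mimics the same read-transform-expose pattern with a list of dicts so the idea is runnable anywhere; the rows and the derived column are invented for illustration.

```python
# Stand-in for the table Power BI would hand the script as `dataset`
# (in Power BI this is a pandas DataFrame; here, a list of dicts).
dataset = [
    {"product": "A", "units": 3, "price": 4.0},
    {"product": "B", "units": 2, "price": 9.5},
]

# The transformation: add a derived revenue column, as a script
# step might, and expose the result for the next query step.
result = [dict(row, revenue=row["units"] * row["price"]) for row in dataset]
```

Whatever the script leaves behind becomes the output of that step, which subsequent Power Query transformations can then reshape further.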
In addition to this, you can utilize scripts written in R or Python as
your independent data sources. If you choose "Get data" and then
search for "script," you will find possibilities to write scripts in both R
and Python. In situations such as this, where you may already have
a well-defined script written in the integrated development
environment (IDE) of the applicable language, you can simply copy
and paste it into the dialog box that appears.
This will invoke the R or Python script, which will most likely include
a link to some source data, execute that script, and then present the
results as a whole new query. After that, the results are completely
changeable in the same manner as any other Power Query table in
any subsequent transformation stages. This can include, once again,
a script transformation that employs either or both languages in
succession.
R and Python Visuals
R and Python visuals are both supported in Power BI Desktop,
supposing that you have the respective languages installed on your
computer. When you pick an R or Python visual for the first time
from the Visualizations pane, a warning window will appear, asking
whether you want to enable script visuals. Script visuals give you
the opportunity to generate custom visuals using either language,
extending your analysis beyond the visualizations typically available
in Power BI, and even beyond those available via the Power BI
custom visual options.
In addition, these visuals are completely interactive with the other
visuals on the report page as well as the slicers. Cross-filtering from
other visuals can change visuals generated by R and Python as
well. After opening an R or Python visual for the first time,
you will be prompted to drag fields into the Values section of the
Visualization pane using the mouse. When you drag fields into a
Power BI visual, Power BI will utilize those fields to determine what it
refers to as the "dataset" for the visual. In the figure that follows, we
can see what it looks like both before and after the fields have been
added.
You'll also note that, for both the R and the Python visuals, you'll
receive a notification indicating that duplicate rows in the data will be
deleted. To execute a script, first ensure that it loads the libraries it
needs and that it refers to the "dataset" you have created in the
Power BI interface. Once you have done this, you can click the Play
button on the far right of the script editor bar. You can reduce the
size of the window by clicking the arrow next to the menu. If you
need to change the script, you can do so in a more user-friendly
editor by clicking the arrow that points up and to the right, which will
open the IDE for the language you are using.
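The duplicate-row removal mentioned above can be sketched as follows. This plain-Python version, with invented rows, keeps the first occurrence of each distinct row; it approximates, rather than reproduces, Power BI's exact behavior.

```python
def drop_duplicate_rows(rows):
    """Keep only the first occurrence of each distinct row."""
    seen, unique = set(), []
    for row in rows:
        key = tuple(sorted(row.items()))  # hashable fingerprint of the row
        if key not in seen:
            seen.add(key)
            unique.append(row)
    return unique

rows = [{"term": "Fall", "grade": 90},
        {"term": "Fall", "grade": 90},   # exact duplicate, removed
        {"term": "Spring", "grade": 85}]
deduped = drop_duplicate_rows(rows)
```

This is why adding a column that differs between otherwise identical rows (such as an index) is a common trick when a script visual needs to see every row.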

Conclusion
By using artificial intelligence, machine learning techniques, and
interaction with R and Python, AI-powered visualizations in Power BI
improve the data analysis and reporting experience. Users can get
deeper insights, carry out sophisticated analysis, and produce
unique visualizations thanks to these functionalities, which eventually
results in more precise and well-informed decision-making. With the
help of R and Python integration and the power of AI, Power BI
enables users to fully use their data to produce reports that are both
engaging and significant.
CHAPTER 8
INTRODUCTION TO THE POWER BI
SERVICE
The Basics of the Service: What You Need to
Know
Microsoft's Power BI Service is a web-based platform that was
designed in the cloud and provides companies with the ability to do
comprehensive data analysis, data visualization, and report
generation.
The following is a detailed review of the fundamental
information that you need to know about this service:

1. Cloud-Based Accessibility: Because the Power BI Service is hosted in the cloud, users can access their data, reports,
and dashboards at any time and from any location. Users
can access the functions of the service using a web browser
or mobile app, eliminating the requirement for on-premises
infrastructure and increasing flexibility and simplicity of use.
2. Workspaces for Organization: Power BI organizes content
inside workspaces, which serve as containers for connected
dashboards, reports, datasets, and dataflows. Workspaces
make it possible for teams to efficiently cooperate by serving
as a single hub for the teams' project-specific or
departmental data and analytic requirements. In addition to
this, they provide control over access rights and the sharing
of content.
3. Dashboard Visualizations: Dashboards are
representations of data that are visually engaging and give
condensed glimpses of important metrics and insights.
Users can pin different visual representations to
dashboards, such as charts, tables, or tiles, and organize
them in a way that provides relevant summaries of their data
at a glance.
4. Interactive Reports: Interactive reports are created in Power
BI Desktop, a powerful authoring tool, and then published to
the Power BI Service. Users can dig deeper into data
exploration and analysis by using the many visualizations,
filters, and interactive components included in the reports.
5. Connectivity to Data Sources: The Power BI Service can
connect to a broad variety of data sources in a streamlined
manner, including databases, Excel files, cloud services,
and many more. By importing data from various sources or
connecting to them, users can generate datasets that will
serve as the basis for their reports and visualizations.
6. Data Refreshing: The Power BI Service enables users to
set up scheduled data refresh capabilities to guarantee that
their data is accurate and relevant. Reports and dashboards
will always present the most recent information since users
can preset precise intervals for automated dataset updates
or manually initiate a refresh when necessary.
7. Ease of Collaboration and Sharing: The Power BI Service
makes it simple to collaborate with other users. Users can
govern who can access their dashboards, reports, and
datasets by configuring access permissions and sharing
settings. Co-authoring, commenting, and email
sharing are just a few examples of the features that make
seamless collaboration and group decision-making possible.
8. Pre-built Content Packs and Apps: The Power BI Service
provides content packs and apps that have already been
produced and are tailored for certain applications or
services. These packs come with pre-built dashboards,
reports, and datasets that are ready to be used and can be
modified to fit the needs of a particular industry or company.
They provide a good starting point for tailoring reports and
analysis to specific needs and requirements.
9. Natural Language Query: Power BI Service is equipped
with a feature known as "Q&A," which makes it possible for
users to engage with the data they have access to by
using natural language questions. By typing or speaking
questions, users receive visualizations and answers based
on the available data, which facilitates intuitive exploration
and analysis.
10. Mobile Accessibility: To supplement the service, Power BI
offers mobile applications for iOS and Android devices.
These apps guarantee that customers can access and
interact with their reports and dashboards while they are on
the go. Regardless of the location, the availability of data
and the capacity to make flexible decisions are both
guaranteed by this mobility.

To begin, you can sign in to the service by entering the credentials
for your Microsoft account (or work account, if you have one). The
Power BI service is available to anyone with a "free" license. Using
this free license, you will have access to "My Workspace," a
personal development area where you can publish reports and
examine how they function and appear inside the service.
After you have successfully authenticated yourself, you will be
brought to a home page. You'll find that a lot is going on in this area.
On the left side of the screen is a navigation bar, in the center is a
listing of Favorites and Frequents, and just below that is a
connection to current components and applications that you have
access to. Various options allow you to search for items to which you
have access, create new reports, and adjust the settings controls in
the bar that is located on the top right. This first view is the same
view that you would get if you were to select the Home link that is
located at the very top of the menu on the left side of the screen.
The good news is that everything you need to access in the
beginning can be gained from that menu on the left-hand side, which
is called the Navigation menu. Therefore, let's travel through those
regions and get an understanding of the most significant things that
can be found there.
The Navigation Menu
A detailed view of the Navigation menu can be seen in the picture
below. You will see what appear to be a nine-dot box, the Power BI
title, and another link to the Home page in the black bar at the very
top of the screen. The nine-dot box is pretty handy. If you choose it,
a selection of shortcuts to the various Microsoft services available to
you will appear. For
instance, you can access Outlook, OneDrive, Word, Excel,
PowerPoint, OneNote, SharePoint, Teams, and Sway, in addition to
any other Microsoft products that you may have access to via your
Office 365 subscription. When you're working in Power BI and all of
a sudden realize you need to go over to another area of your Office
suite, this feature can come in handy. When you're developing
anything, like a dataflow, and you need to reference where an item
could be that you want to draw in from SharePoint or OneDrive for
Business, it can also be helpful to have this information at your
fingertips.
Clicking on that Home link will do the same action as selecting the
Home button that is shown below, to the right of the symbol that
looks like home (in the lighter area of the menu). Keep in mind that
Microsoft aims to provide you with as many options as possible,
even if those options are only separated by a few pixels. You'll notice
a hamburger menu button just under the nine-dot box in the
Navigation menu. Clicking this icon will allow you to shrink or enlarge
the Navigation menu, depending on how much space you need to
see it. Even when it is in its condensed state, you can still view the
icons that make up the Navigation menu and utilize those icons to
navigate. When you expand the menu again, you will see labels
explaining what those icons mean.

Home and Browse


The Home section of the Power BI service provides users with
convenient access to recommended objects, recently accessed
objects, favorite objects, and accessible apps. The topmost section
of this page features a pleasant greeting and a button labeled "New
report" that allows users to generate a new report. Clicking this
button will redirect users to the same section as the "Create" link
located on the Navigation menu. The image below provides a visual
representation of this feature. Adjacent to it is a combined
ellipsis-and-hamburger menu icon, which lets you choose between the
simplified and expanded layouts. Although the expanded layout
appears largely unchanged, it does add a selection of valuable
learning resources at the bottom of the page. Everything shown in
this section uses the simplified page layout.
At the uppermost part of the page, there is a section labeled
"Recommended" which displays a list of suggested items. Various
items such as workspaces, reports, datasets, and apps can be
included in this category. This list may also comprise data elements
that are endorsed by your organization, including certified datasets
and apps that you may have access to. Displayed below the list of
suggested items, you will find a compilation of recently viewed
elements, favorite elements, and applications. Switching between
views is a breeze - just click the corresponding button. The search
functionality allows for easy navigation of the lists by keyword, while
the Filter button provides the option to narrow down the search by
selecting specific types of objects, filtering by recent activity, or
sorting by endorsement status.
The design of this layout bears a striking resemblance to that of a
SharePoint list. The user can view the titles of their favorite content.
The user can easily identify the type of element, be it a report or a
dashboard, along with its owner, endorsement status, and sensitivity
labels. Within the Favorites list, it is noticeable that Power BI
continues to display the items as starred, despite the possibility of it
seeming repetitive. The aforementioned feature can be found within
the unmarked column located in the Recent section. In case there
are any recently added items that you have marked as favorites, you
will notice that these particular elements are indicated with a star
symbol. In relation to the Home section, the Browse category within
the Navigation menu appears to be redundant. The interface
displays a comprehensive inventory of accessible items, categorized
into subpages such as Recent, Favorites, and Shared with You. The
image below showcases Microsoft's unwavering commitment to
providing users with a multitude of options to accomplish a single
task.
Create
With Power BI, users can construct reports directly within the
service. In all honesty, while the option to create reports directly on
the web does exist, it is not frequently utilized as compared to the
alternative of downloading Power BI Desktop for free and generating
reports through that platform.
It is highly recommended to utilize Power BI Desktop for authoring
purposes. Keeping that in consideration, the Create tab will present
you with two alternatives. There are two ways to input data into a
table sheet: manual entry or copy-pasting. Alternatively, you can
select a pre-existing dataset within the workspace and generate a
new report based on that dataset.

Data Hub
The Data Hub is a centralized place for discovering and managing
the data available to you. The Data Hub section provides users with a
comprehensive view of their available datasets. This includes a
curated list of recommended datasets, a complete list of accessible
datasets, and a personalized list of authored datasets. If you
possess a Premium Per User or Premium Per Capacity license,
you can also observe the Datamarts feature in preview.
It is worth noting that Microsoft provides hyperlinks within this view
that can redirect you to their documentation. This can be helpful if
you have any queries regarding specific elements on the page
related to dataset discovery and understanding the concept of a
dataset. Upon selecting a dataset, a vertical three-dot selection will
appear, providing access to an expanded list of options specific to
that dataset.
This menu provides you with various options to manipulate a
dataset. We will take the options slightly out of order, addressing
the Settings section last since it is a distinct topic on its own.
The first option available to us is to perform an analysis in
Excel. Upon your initial attempt, Power BI will prompt you to install
an update to Excel to properly read the file format. Upon completion
of the task, Power BI will automatically produce an Excel worksheet
that is pre-configured with a live connection to the source dataset.
Connecting to a Power BI dataset to generate a report from Power
BI Desktop is a process that bears resemblance to the one at hand.
The data can be accessed in a pivot-table format, allowing for the
placement of fields into rows and columns, and the inclusion of
measures into the Values section, similar to any other pivot table.
This piece is designed to assist users who may possess the
knowledge to utilize the data model you have established, but may
not yet feel confident using Power BI Desktop. Perhaps the user just
needs quick pivot-table analysis in Microsoft Excel.
From an Excel standpoint, utilizing this function offers the benefit of
storing only the data that is visible in the cell on the local device,
while the primary data is stored in the cloud. Upon selecting "Create
report," users will be directed to the web report authoring interface.
Here, all tables and measures within the dataset will be readily
available for viewing. With the aid of this feature, it is possible to
construct a report with an unlimited number of pages, which will be
accessible on the Power BI service. It is important to note that the
web authoring experience does not provide the option to modify a
dataset. In the web authoring experience, it is not possible to add a
table or any missing measure. To incorporate your pertinent data
elements, you will need to access the primary dataset, append the
required information, and subsequently re-release the dataset to the
platform.
The "Create paginated report" function generates a Report
Definition Language (RDL) file that includes all the essential
connection details required to produce a pixel-perfect report, similar
to those in SQL Server Reporting Services. This report can then be
hosted on the Power BI service. Paginated reports in the Power BI
service are created using a distinct software package called Power
BI Report Builder, rather than Power BI Desktop. This feature is
exclusively available in workspaces with Premium Per User or
Premium Per Capacity plans. By accessing the "Manage
Permissions" feature, you will be able to view a comprehensive list
of objects for which you can add or remove users. It is
possible to view the individuals who possess direct access to a
report, dataset, or workbook. This interface allows for the manual
addition of users and provides visibility on pending access requests.

The "Chat in Teams" feature generates a link that can be easily
shared with individuals, groups, or channels within Microsoft Teams.
Upon accessing the dataset, users are seamlessly directed to the
relevant section of the Power BI service through Teams integration.
In addition to that, it is possible to distribute reports and report pages
within Teams. Incorporating analytics into your organization can be
seamlessly achieved through Microsoft Teams, leveraging its user-
friendly interface that your team is already accustomed to. By
utilizing Teams as a communication channel, you can facilitate
discussions around the findings of a particular report or dataset. This
feature also empowers you, as the author, to promptly offer valuable
insights, clarify any doubts, or incorporate suggestions from users to
enhance the report. By clicking on the "View lineage" option, a fresh
browser window will open up, displaying a comprehensive list of all
the data elements present within a particular workspace, along with
their corresponding dependencies. In the case of a dataset that
employs a SQL Server instance, it is customary to present the SQL
Server first, followed by the dataset, then the reports based on that
dataset, and finally any dashboards that were generated from report
elements. From this perspective, you have the option to generate
various other components by utilizing the "New" button. By utilizing
web-based design tools, you can seamlessly generate fresh
elements within a workspace or upload PBIX files directly into it. This
versatile tool can be utilized for generating reports and paginated
reports, creating scorecards, managing dataflows, and organizing
datasets. I highly recommend utilizing Power BI Desktop over the
web authorship tool whenever possible. This is the stage where you
can create dataflows and scorecards.
From a web authorship perspective, this location may seem unusual
for housing such functionality, particularly for items that are better
suited to Power BI Desktop. However, it is worth noting that this is
the most comprehensive location where these options can be found.
Upon clicking on a dataset, a comprehensive list of tables and
columns contained within the dataset will be displayed. By utilizing
the "View lineage" feature, you can gain insight into the flow of data
elements from one item to another. This allows you to identify
potential impacts that may arise from modifying a dataset, extending
beyond your immediate workspace.
Settings
Upon selecting the Settings option for a dataset, users will be
directed to a properties page that is conveniently preselected to the
Datasets section of the Navigation menu located at the top of the
page. On the left-hand side, there is a comprehensive list of all the
datasets available in the workspace. To effectively manage your
dataset(s), it is crucial to pay attention to the various properties
options located on the right-hand side of the screen. These options
are specifically designed to cater to the needs of the currently
selected dataset. The image below gives us a look at this page.
By clicking on the "View dataset" hyperlink, you will be directed to
the workspace page of the dataset. By selecting the "Refresh
history" option, a pop-up window will appear within the page,
displaying the refresh history of the dataset. This will indicate
whether the refresh was successful or not, and if there was a failure,
it will provide a reason for the failure. Some failure messages are
helpful. Some are not. It’s still a Microsoft product, and Microsoft’s
error messages can be hit-or-miss in terms of their helpfulness;
they’re notoriously unfriendly when it comes to troubleshooting in
Windows. Next, we can create or update the dataset’s description.
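If you check refresh history often, you don't have to click through the UI each time; the Power BI REST API exposes the same information. The following is a minimal sketch rather than production code: it assumes you already hold a valid Azure AD access token (acquiring one is beyond our scope here), and the workspace and dataset IDs are placeholders.

```python
# Minimal sketch of reading a dataset's refresh history through the
# Power BI REST API. Assumes you already hold a valid Azure AD access
# token; the workspace (group) and dataset IDs are placeholders.
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def refresh_history_url(group_id: str, dataset_id: str, top: int = 10) -> str:
    """Build the URL for the Get Refresh History endpoint."""
    return f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes?$top={top}"

def get_refresh_history(token: str, group_id: str, dataset_id: str) -> list:
    """Return recent refresh attempts; each entry includes a 'status'
    field ('Completed', 'Failed', ...) and failure details if any."""
    req = urllib.request.Request(
        refresh_history_url(group_id, dataset_id),
        headers={"Authorization": f"Bearer {token}"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["value"]
```

Each entry in the returned list carries a status such as Completed or Failed, mirroring what the Refresh history pop-up shows in the service.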
In cases where data sources are not hosted in the cloud, particularly
in Azure, it is typically necessary to install a data gateway to enable
the refresh of the corresponding dataset. Suppose that there is a
Power BI dataset that is being refreshed from a few Excel files
located on the network drive of the user's company. Access to
network drives is currently not available on the Power BI service.
The data gateway serves as a virtual private network that connects
the Power BI service with authorized locations, enabling seamless
data retrieval for refreshes. Listing every data source and whether
it requires a gateway would be a time-consuming task; Microsoft's
documentation is the most reliable and current source for this
information.
In most cases, the responsibility of managing gateways falls on the
IT or security team. Therefore, if your organization is currently
utilizing an enterprise gateway, it is advisable to contact the relevant
stakeholders to determine whether you need to include a specific
data source. Microsoft also offers a downloadable personal version
of the data gateway. As long as your local machine is running and the
gateway is operational at refresh time, refreshes will work. It is
important to keep in mind that utilizing R and Python data sources
and transformations necessitates the use of a personal data
gateway, rather than an enterprise one.
The reason for this is that the Power BI service requires the
execution of the corresponding R and Python scripts from the user's
machine. This is because the user's machine is assumed to possess
all the essential packages and libraries. Next, “Data source
credentials” would be where you configure the credentials required
to access the data. Assuming that you are extracting data from on-
premises SQL Server, which is configured in your data gateway, you
will need to furnish the appropriate credentials in this section. This
will enable Power BI to identify the credentials required to pass
through the gateway and access the original data source. When a
Power BI file is uploaded, it typically retains the credentials that were
utilized in Desktop. However, this may not always align with your
preferences. It may be necessary to transfer the refresh credentials
to a pre-established service account. In certain cases, it may be
necessary to modify login credentials, such as when an individual
departs from the company or due to a variety of other factors.
If you encounter refresh issues, verify your data source credentials
first; Power BI will indicate in this section if any action is
required. The concept of parameters is quite
simple and easy to understand. In case your dataset contains
parameters, this is the section where you can adjust their respective
values. A common practice is to parameterize server and database
names, which can be useful in various scenarios. In the event of an
unexpected occurrence such as the need to redirect a Power BI
dataset to a backup server or a database name alteration, it is
possible to promptly modify the connections in the settings, trigger
a refresh, and carry on with minimal disruption. This
platform allows you to efficiently manage and update all Power
Query parameters associated with a specific dataset. Upon updating
the parameter values, a refresh will be triggered to incorporate the
latest changes to the data state.
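For the scripting-minded, parameter values can also be changed programmatically through the REST API's UpdateParameters endpoint. The sketch below is illustrative only: it assumes an Azure AD access token is already in hand, and the parameter name ServerName is a placeholder for whatever your own dataset defines.

```python
# Minimal sketch of updating Power Query parameters through the Power
# BI REST API's UpdateParameters endpoint. The parameter names passed
# in are placeholders; an Azure AD access token is assumed.
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def update_parameters_payload(params: dict) -> dict:
    """Shape a {name: new_value} dict into the request body the
    endpoint expects."""
    return {"updateDetails": [{"name": n, "newValue": v}
                              for n, v in params.items()]}

def update_parameters(token: str, group_id: str, dataset_id: str,
                      params: dict) -> None:
    url = (f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}"
           f"/Default.UpdateParameters")
    req = urllib.request.Request(
        url,
        data=json.dumps(update_parameters_payload(params)).encode(),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    urllib.request.urlopen(req)  # raises urllib.error.HTTPError on failure
```

Just as in the service UI, remember that a refresh is still needed afterwards for the new parameter values to reach the data.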
For me, the true magic of the Power BI service lies in the process of
"scheduled refresh". In this section, you can customize the
frequency and timing of your Power BI dataset's refresh against its
source data. In non-premium environments, users have the option to
schedule up to 8 refreshes, whereas, in premium capacity
environments, they can schedule up to 48 refreshes per day. Users
have the option to customize their time zone settings, whether it's for
daily use or specific days of the week. Once the preferred settings
have been selected, they can be saved for future use. Merely
configuring a refresh feature does not guarantee its functionality.
Before scheduling a refresh, it is imperative to ensure that your
dataset has been successfully refreshed in Power BI Desktop and
that all dataset settings in service are accurate.
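Scheduled refreshes are configured in the UI, but you can also kick off an on-demand refresh from a script, which is handy when testing a refresh configuration. This is again a hedged sketch that assumes you already have an access token; the small helper at the bottom simply encodes the daily caps mentioned above.

```python
# Minimal sketch of triggering an on-demand refresh through the Power
# BI REST API. An Azure AD access token is assumed; the group and
# dataset IDs are placeholders.
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def trigger_refresh(token: str, group_id: str, dataset_id: str) -> int:
    """POST to the Refreshes endpoint; the service answers
    202 Accepted when the refresh request is queued."""
    url = f"{API_ROOT}/groups/{group_id}/datasets/{dataset_id}/refreshes"
    req = urllib.request.Request(
        url,
        data=b"{}",
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def max_daily_refreshes(premium: bool) -> int:
    """The per-day scheduled-refresh cap described above:
    8 on shared capacity, 48 on premium."""
    return 48 if premium else 8
```

Note that on shared capacity, on-demand refreshes triggered this way count toward the same daily limit as scheduled ones.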
Enabling or disabling the Q&A feature on a dataset can be achieved
through the Q&A toggle switch. In addition, this feature enables
users to easily share their synonyms with all members of the
organization. The "Featured Q&A Questions" feature enables you
to showcase select questions to your audience as they peruse
reports generated from this particular dataset. With the
Endorsement feature, you can pinpoint a specific dataset and bring
it to the forefront of your organization's attention. Microsoft offers
three tiers of endorsement: None, Promoted, and Certified. The
default setting is "none." If a dataset lacks endorsement, it will only
appear in search results without any further visibility. Promoting a
dataset can significantly enhance its visibility in search results. This
increased visibility can lead to a higher ranking in the search context,
thereby increasing its chances of being discovered by potential
users.
In addition, there is an option to enhance the visibility of the dataset
by making it discoverable. This feature enables users who do not
have access to the dataset to discover it and request access on their
own. Certification of a dataset is a possible final step in the dataset
creation process. A dataset that has undergone a thorough review
process is referred to as a certified dataset, which serves as a
reliable and trustworthy label, often regarded as a "source of truth."
Certified status is not something an individual simply sets on their
own; it is granted through whatever review process your company has
defined. Every organization has its own set
of standards for determining when a dataset is eligible to be
classified as Certified. The "Request Access" feature allows users
to manage how they can request permission to access content
associated with the chosen dataset. There are two ways to request
access to a dataset. The first option is to send an email request to
the dataset owner. Alternatively, you can receive an automatic
response that includes a set of instructions to follow to gain access
to the dataset. With Dataset Image, users have the option to select
an attractive image that effectively showcases their dataset in
various discoverable locations. Once uploaded, this image will serve
as the universal representation of the dataset across your
organization's various platforms for dataset discovery.
Metrics
The Power BI service has recently introduced metrics as a new
feature. With the Power BI service, users can generate measurable
KPIs or goal metrics that can be tracked using data. Scorecards are
designed to display the performance of a specific metric during a
designated time frame. It is important to note that metrics are closely
tied to individual users or business scenarios. Therefore, it is best
not to delve too deeply into the specifics. If you are considering the
implementation of metrics, give it a try to assess its potential
usefulness. In a recent development, Microsoft has made it possible
for individuals to work with metrics within their workspace. It is
recommended that you begin by setting some personal goals before
exploring this feature for other areas of the organization.
Apps
An app, in the context of the Power BI service, is a collection of
packaged content that can be distributed to a broader
audience. Apps are developed inside of workspaces, and once they
are complete, they can be published to a set of people, an enterprise
as a whole, or to a Microsoft 365 community depending on the
requirements. Apps can also have different rights than those in a
workspace, which makes user management somewhat simpler since
the permissions for an app are maintained in a single area. Users
can create their applications using a workspace. You can utilize any
one of the many applications that are accessible to you, which are
referred to as template apps. These applications are compilations
that were developed by other people and then shared with you and
the other members of your company so that you can utilize them.
In the picture below, we can see the very first Apps view, which will
first give the impression of being empty. After that, we can either add
an organizational app or a template app or then inspect the content
that comes from those applications. We can view the list of available
applications that we can bring into our workspace by using the yellow
"Get apps" button, which is accessible in both places. In our
demonstration, I have developed an application known
as the RLS Test that will serve as an organizational app. The other
applications are examples. Using the COVID-19 US Tracking Report
template app offered by Microsoft, we will guide you through the
process of installing the template app. It is important to note that in
the area labeled "All apps," organizational apps are always shown
first, followed by template apps, though the apps in either category
are not sorted in any obvious order. You'll see that there is
a spot where you can opt to pick just applications that have been
recommended by the company. When you need to find a collection
of data that has been approved by your organization's leadership in
a hurry, this can be a useful tool for you.
To install an app, just click the appropriate tile, and you will be
brought to the right page in AppSource, Microsoft's gallery for
applications and custom visuals. You should examine the description
of the
software to see if there are any licensing requirements before
downloading it, but in most instances, there will be a large button
that says "Get It Now." In the case of Microsoft's template programs,
a pop-up window will appear before the installation process in which
it will request some information from you for the sake of marketing.
This doesn't need to be coupled with the same information as your
Power BI account itself; thus, if you are uneasy about doing so, you
are free to use an email address that you do not often check.
When the software or applications have been successfully installed,
you will see them here. To get access to them, just click on the link
that is provided below its name. Each app will have a list of
information associated with it, which may include the publisher, the
date it was released, the kind of app, the version number (if
applicable), and the endorsement status of the app. You can see the
reports that were provided with an app after it has been installed. It’s
fun for me to browse through some of the template applications to
get ideas for the layout of my work, so don't be embarrassed to steal
creative concepts!
Deployment Pipelines
You are only able to define Development, Test, and Production
environments for a particular Power BI workspace if you have the
Premium per Capacity plan since that plan includes the
Deployment Pipelines functionality. This is a tool that can be used by
developers to assist them with difficulties such as user testing and
version control. To provide support for a pipeline, workspaces can
either be created or assigned, but, as stated before, any such
workspaces must be in the Premium per Capacity tier. In most
cases, administrators at the workspace or tenant level, or whatever
group within an organization is in charge of managing the Premium
per Capacity tenancy, are the ones responsible for managing
deployment pipelines. If you are on a Premium per Capacity tenant
and want to construct a pipeline, work with the appropriate
administrator to have one put together, or ask to have your work
added to an existing pipeline. If you are not on a Premium per
Capacity tenant, this feature simply won't be available to you.
Learn
Microsoft provides you with connections to training resources,
documentation for Power BI, and some example reports to get you
started on your learning journey via the Learn site. Learn is a modest
learning portal. You can also join the bigger communities around
Power BI and Power Platform by using the Learn tab. There are a lot
of very awesome users out there that take the time to provide
content, answer questions on forums, push new ideas for the
product, and host events for practitioners to share what they've
learned with others so that others can benefit from them as well.
Publishing Your Work
You will need to take your PBIX file from Power BI Desktop to
publish your work. You will then have the option to either utilize the
Publish button in Power BI Desktop, which is located on the Home
ribbon, or upload a PBIX file to the Power BI service, which can be
done through the Navigation menu. When we click the "Get data"
button, the page that appears next is as seen in the picture below.

You will see a menu in Power BI Desktop that provides a list of the
workspaces to which you have access. From this menu, you will
choose the workspace in which you want to publish the report. In the
context of the service, its role is more of an upward pull than an
outward push: you navigate to the workspace to which you want to add
the dataset and/or report, then upload the PBIX file from there. On
the left, we can see the Discover content functionalities,
which will make it possible for you to see your organizational apps as
well as other template apps that can be directly integrated into the
workspace. You can upload your PBIX file by selecting the Files
option that is located on the right side of the screen.
The Databases & More option gives you the ability to create a
dataset from a connection to Azure SQL Database, Azure SQL Data
Warehouse (now Azure Synapse), SQL Server Analysis Services, or
Spark on Azure. Because you will want the Files option 99% of the
time, the picture below shows what the service looks like when Files
is selected.

You will notice a few different options once you reach this point. For
the sake of this discussion, "Local File" refers to the PBIX file that
you created using Power BI Desktop. You can establish a dataset in
the Power BI service that is linked to either your OneDrive for
Business account or your OneDrive account by using the OneDrive
connections. These sources are handy from a refreshing standpoint
since they do not call for the use of a data gateway to be refreshed.
Despite this, I continue to recommend that you construct your report
using sources from OneDrive in Power BI Desktop as opposed to
doing it here. In a similar vein, the same may be said about the
SharePoint option.
Keep in mind that even when you are using OneDrive or SharePoint
for version control, you can still upload PBIX files through those
channels; this is one use case that makes these nonlocal file
alternatives worthwhile. When you pick Local File, the classic window for
Windows Explorer will come up for you to browse your files. Find the
right file, and then upload it to the server. Following that step, the
dataset will be visible in the workspace navigation section
corresponding to the chosen workspace.
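The upload flow described above can also be automated. The sketch below uses the REST API's Imports endpoint; it is a simplified illustration (the boundary string, file name, and IDs are placeholders, and an Azure AD access token is assumed), not a full-featured uploader.

```python
# Minimal sketch of uploading a PBIX through the Power BI REST API's
# Imports endpoint, the programmatic equivalent of the upload flow
# described above. IDs, file name, and boundary are placeholders.
import urllib.parse
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def import_pbix_url(group_id: str, dataset_name: str,
                    conflict: str = "CreateOrOverwrite") -> str:
    """Build the Imports URL; nameConflict controls what happens when a
    dataset with the same name already exists in the workspace."""
    return (f"{API_ROOT}/groups/{group_id}/imports"
            f"?datasetDisplayName={urllib.parse.quote(dataset_name)}"
            f"&nameConflict={conflict}")

def import_pbix(token: str, group_id: str, dataset_name: str,
                pbix_bytes: bytes) -> None:
    """Send the PBIX file as a multipart/form-data POST."""
    boundary = "pbix-upload-boundary"
    body = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="file"; filename="report.pbix"\r\n'
        "Content-Type: application/octet-stream\r\n\r\n"
    ).encode() + pbix_bytes + f"\r\n--{boundary}--\r\n".encode()
    req = urllib.request.Request(
        import_pbix_url(group_id, dataset_name),
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": f"multipart/form-data; boundary={boundary}"},
        method="POST",
    )
    urllib.request.urlopen(req)
```

The CreateOrOverwrite conflict mode replaces an existing dataset of the same name, which mirrors what happens when you re-publish from Power BI Desktop.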
One other point: Anytime a dataset is uploaded to a workspace for
the first time, it will also automatically generate a dashboard with the
same name as the dataset. This dashboard will be empty. You can
now see that transferring our work from Power BI into a workspace is
a very straightforward process; however, what exactly is a
workspace and why are they important?
What Is a Workspace?
A workspace in Power BI is nothing more than a storage location for
various data assets: it is where datasets, reports, dashboards, and
visualizations are stored. You can send users to a workspace for them
to get content, and you can also utilize a workspace as the foundation
for an app that distributes that content. When we
publish datasets and the reports associated with them to the service,
we are, in effect, publishing them to these workspaces.

My Workspace
Anyone who uses the Power BI service will automatically have a
"free" personal workspace created for them. You should be aware of
a few significant restrictions that apply to this workspace. First,
although you can share content located in your workspace, both you
and the individuals with whom you share it need at least a Power BI
Pro license.
Second, it is widely regarded as smart practice not to share
anything permanently from your personal workspace, because
access to that workspace can become problematic if you ever leave
the business. Additionally, the content in your workspace cannot be
used to create an application. Sorry, but you can't utilize this to get
around the license requirements set by Microsoft. You also cannot
create a dataflow in My Workspace. Using this workspace as your
private testing environment is the most productive way to utilize
that space. After
constructing your report and determining that it has an appealing
appearance in Power BI Desktop, you publish it to the service and
then see it there to verify that all of the components are operating as
you would want them to. Does the report maintain its professional
appearance when viewed on a variety of screen sizes? When you
don't have access to the additional capabilities provided by Power BI
Desktop, does the navigation of the report flow the way you want it
to? These are questions that can be answered right here, in your
own workspace. You can also test your scheduled refresh here, which
can be beneficial before transferring that dataset into a more
permanent home in a shared workspace.
Shared Capacity Workspaces
Users who have Power BI Pro or Power BI Premium per User
licenses can share Power BI data components with other users who
also have access to a shared capacity workspace if the workspace is
set up with that capability.
The procedure for establishing a workspace is quite straightforward.
You will see a list of the workspaces to which you have access when
you click the Workspaces button, which is located in the Navigation
menu. This will cause a window to appear to the right of the
Navigation menu. In this particular scenario, I've created two other
workspaces in addition to "My Workspace." The first workspace is a
Premium per User workspace, while the second workspace is a
standard shared capacity workspace. A Premium per User
workspace is required for the operation of some functionality. In
contrast to a regular Pro license shared workspace, Premium per
Capacity workspaces will have a diamond symbol next to their
names so you can determine whether it is a Premium per User or
Premium per Capacity workspace. Have a glance at it in the
photograph that is provided down below. You can start a new
workspace by selecting the option that is located at the very bottom
of that list.
The newly generated workspace will provide you with alternatives
that are suitable for your licensing requirements. In this example, we
are going to make use of the premium per-user trial that's free for the
first sixty days. The creation of a workspace will trigger the
appearance of a pane on the right-hand side of the page. There will
be a spot for the name of the workspace and a description of the
workspace, as well as an image that will represent the workspace.
Additionally, some configuration options can be found in the
advanced portion of this pane. We can choose the individuals who
will be included on the contact list for this newly created workspace,
affix a OneDrive location to the workspace so that files can be
hosted there, and determine the kind of licensing mode that is
associated with the workspace (in this example, Pro, Premium Per
User, Premium Per Capacity, or Embedded).
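Workspace creation can be scripted too. In the REST API, workspaces are still called "groups," a holdover from their Office 365 group origins. The following is a rough sketch under the assumption that you hold an access token with the right permissions; check Microsoft's documentation for the exact current parameters.

```python
# Minimal sketch of creating a workspace through the Power BI REST
# API's Groups endpoint (workspaces are "groups" in the API). An
# Azure AD access token is assumed; the workspace name is a placeholder.
import json
import urllib.request

API_ROOT = "https://api.powerbi.com/v1.0/myorg"

def create_workspace_request(name: str) -> tuple:
    """Return the (url, body) pair for creating a new-style (V2)
    workspace."""
    url = f"{API_ROOT}/groups?workspaceV2=True"
    body = json.dumps({"name": name}).encode()
    return url, body

def create_workspace(token: str, name: str) -> dict:
    url, body = create_workspace_request(name)
    req = urllib.request.Request(
        url,
        data=body,
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # response includes the new workspace's id
```

The returned workspace id is what you would then feed into the dataset, refresh, and import calls sketched earlier in this chapter.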
If your company does not have the appropriate license in place, the
Embedded and Premium per Capacity options will be grayed out.
If you do not have a license for Premium per User, then the
Premium per User option will likewise be unavailable to you. You
have the option of designating the workspace as one used to create
a template app, and you can also allow contributors to update any
app that originates from this workspace. The latter is useful when
several people are developing a particular Power BI solution and
you want more than one of them to be able to deploy the updated
app.
It is essential to keep in mind that both the Pro and Premium per
User workspaces can be accessed inside a licensing hierarchy. For
example, if I have a Pro workspace, users who have licenses for
either Premium per User or Pro will be able to access the workspace
and the data items inside it. If the workspace instead requires a
Premium per User license, only users holding that license will be
able to access it. This rule does not apply to organizations with
Premium per Capacity licensing, since that permits an unlimited
number of readers inside the organization to access any workspace
hosted on a Premium capacity node.
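The licensing-hierarchy rule just described can be sketched as a small check. This is an illustrative model only; Power BI enforces these rules server-side, and the license and mode names below are simplified labels, not API values.

```python
# Illustrative model of the Power BI license-hierarchy rule:
# - Premium per Capacity workspaces allow unlimited readers, even Free users.
# - Pro workspaces admit Pro and PPU license holders.
# - PPU workspaces admit only PPU license holders.
LICENSE_RANK = {"Free": 0, "Pro": 1, "PPU": 2}

def can_access(user_license: str, workspace_mode: str) -> bool:
    """Return True if a user's license permits access to a workspace."""
    if workspace_mode == "Premium per Capacity":
        return True  # unlimited readers inside the organization
    if workspace_mode == "Pro":
        return LICENSE_RANK[user_license] >= LICENSE_RANK["Pro"]
    if workspace_mode == "PPU":
        return user_license == "PPU"
    raise ValueError(f"unknown workspace mode: {workspace_mode}")
```

For example, `can_access("Pro", "PPU")` is `False`, while `can_access("Free", "Premium per Capacity")` is `True`, matching the rule above.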
Dataflows in Shared Workspaces
In Power BI, a "dataflow" refers to a shared data element that is
kept in a workspace and that can be called upon to serve as a data
source for the creation of reports.
Consider this to be an ETL process that not only retrieves data from
some source and does certain transformations, but also generates a
data piece that can be utilized for further analysis outside of the
context of a particular model. There are currently two distinct
varieties of dataflows, which we will call "classic" and "streaming."
A workspace that contains one kind of dataflow cannot also contain
the other kind.
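That one-type-per-workspace constraint can be sketched as a tiny validator. This is purely illustrative; the Power BI service enforces the rule itself, and the type names here are just labels.

```python
# Illustrative model of the rule that a workspace may hold classic
# dataflows or streaming dataflows, but never a mixture of both.
def can_add_dataflow(existing_types: set, new_type: str) -> bool:
    """Allow a new dataflow only if the workspace holds no dataflows
    yet, or only dataflows of the same type."""
    if new_type not in {"classic", "streaming"}:
        raise ValueError(f"unknown dataflow type: {new_type}")
    return existing_types <= {new_type}  # subset test: empty or same type
```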
A classic dataflow is a set of tables produced by the Power Query
service inside the Power BI platform. You will find the option to
create a dataflow in the "Get Data" section; when we looked at that
page in My Workspace earlier, this option was absent under the
heading "Create new content," because dataflows live in shared
workspaces. Some data sources available in Power BI Desktop are
not accessible from a dataflow, but the list still covers the most
common sources fairly comprehensively, and Microsoft does
periodically add new sources to the Power Query Online platform.
Dataflows let you define a series of transformations, or components
of a data model, that don't depend on a single Power BI Desktop file
for execution. They also give you reusable data items that other
individuals within the business can use as a foundation for their own
analysis. Classic dataflows are available on Pro-tier licenses.
Streaming dataflows require at least a Premium per User license,
and only users who have a Premium per User license, or who are
working in a Premium per Capacity environment, can consume
reports shared from streaming dataflows. A streaming dataflow lets
Power BI consume events from an Azure Event Hub or an Azure IoT
Hub, so you can perform transformations on that data in real time
and do "real-time" reporting on either of those scenarios. Although
this is a wonderful feature, being limited to Azure Event Hubs and
Azure IoT Hub restricts its utility, and setting up this kind of event
requires a bit more work than usual. I'm hoping that before Microsoft
makes this generally available, it broadens the available options and
makes them simpler to use.
Putting Your Data in Front of Others
Even once our data is in a workspace, we still need to make sure
that other people can see it, and there are a few approaches. We
can add users to the workspaces we create. We can build an app.
We can link reports into Teams or SharePoint as needed. And if we
are using Power BI Embedded, we can embed report parts into
either a website or an application; for that route, it is highly
recommended that you collaborate with an application developer,
who can help you overcome any technical obstacles that may arise.
Otherwise, let's begin sharing.
Adding Users to a Workspace
Simply adding someone to our workspace is the quickest and
simplest way to give them access to our data.
When we pick a workspace from the list of available workspaces, we
are presented with a view. You should notice an Access button in
the upper-right corner of the screen. When you click it, a pane will
open up on the right side of the screen that enables us to add or
delete people and specify the function that each person plays in the
workspace. You will also notice a space where you can add
individuals or organizations by entering their email addresses. This is
helpful since Power BI can very effortlessly interface with your
current directory instance in an organizational context, allowing you
to search for individuals in your company and validate their email
addresses. Simply add them by clicking the Add button.

If you have a group of users that you wish to add to a workspace in
bulk, you can use security groups to accomplish this task instead. A
user can be added at one of these four levels: Admin, Member,
Contributor, or Viewer. There are several degrees of permissions
associated with each role, with Admin having the most privileges
and Viewer having the fewest.
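If you later need to add users programmatically rather than through the UI, the Power BI REST API exposes an endpoint for adding a user to a workspace (called a "group" in the API). The sketch below only builds the request URL and JSON body; it assumes the shape of the documented "Groups - Add Group User" endpoint (`POST .../groups/{id}/users` with `emailAddress` and `groupUserAccessRight` fields), and token acquisition and the actual HTTP call are deliberately omitted. Verify against the current API reference before relying on it.

```python
# Sketch of the request body the Power BI REST API expects when adding
# a user to a workspace. The workspace id and email are placeholders.
VALID_ROLES = {"Admin", "Member", "Contributor", "Viewer"}

def build_add_user_request(workspace_id: str, email: str, role: str) -> dict:
    """Build the URL and JSON body for adding one user to a workspace."""
    if role not in VALID_ROLES:
        raise ValueError(f"role must be one of {sorted(VALID_ROLES)}")
    return {
        "method": "POST",
        "url": f"https://api.powerbi.com/v1.0/myorg/groups/{workspace_id}/users",
        "json": {"emailAddress": email, "groupUserAccessRight": role},
    }
```

In a real script you would send this request with a library such as `requests`, attaching an Azure AD bearer token in the `Authorization` header.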
Sharing via a Link or Teams
Power BI reports and dashboards can also be shared with other
people through a link or through Teams. Sharing lets users view and
engage with the data, whether via a simple link or inside a
collaborative environment such as Microsoft Teams. Here, we will
dig into both approaches and walk through the steps required to
share content through a link and to share content using Teams.
Sharing via a Link
1. Publish your Power BI report to the Power BI service. To
get started, make sure your Power BI Pro subscription is
activated and then publish the report to the Power BI
service. This makes sure that the report is saved in the
cloud and that it can be accessed by other people.
2. Configure the report's sharing settings: Change the
settings for how the report is shared by going to the Power
BI service, selecting the report you wish to share, and then
clicking the "Share" option. You control the sharing
settings, deciding whether recipients can only read the
report or can also edit it and collaborate on it.
3. Generate a shareable link: After configuring the sharing
options, create a link that can be shared. Depending on the
link type you choose, access can be limited to people in
your organization or to people who already have access to
the report.
4. Copy and distribute the link: Once the link has been
generated, copy it and send it to the intended recipients by
email, chat, or any other channel. Clicking the link opens
the report in their web browser; depending on the link type,
they may be asked to sign in first.
5. View and interact with the shared report: Recipients will
have the ability to view and interact with the Power BI report
when they access the link that has been shared with them.
They can explore the visual representations, apply filters,
delve into the data, and get insights from the content that
has been given.

Sharing via Teams


1. Publish your Power BI report to the Power BI service:
Using your Power BI Pro license, begin by publishing the
report to the Power BI service. This step is identical to the
one that is taken when sharing through a link.
2. Launch Microsoft Teams: Start the Microsoft Teams
application, or sign in to the service via the web interface.
3. Create or navigate to the desired channel: Pick the
section of Teams where you want the Power BI report to be
shared and click on its name. It can be a pre-existing
channel, or it can be a brand-new one that was created just
for the report.
4. Add the Power BI tab: Within the chosen channel, locate
the "+" button to include a new tab in the interface. In the
app gallery, look for the term "Power BI," and then choose
the "Power BI" application. This incorporates a tab for
Power BI into the channel.
5. Connect to the Power BI report: Following the addition of
the Power BI tab, choose the report you want to share from
inside the Power BI app. Establish a connection between
the tab and the report by supplying the required credentials.
6. Configure tab settings: Customize the tab settings to
choose what is displayed first, such as a particular report
page or dashboard. You can also choose whether or not
members have read-only access.
7. Save and share with the team: Once the settings are
configured, save the tab. The Power BI report will now be
available within the Teams channel for all members to
access.
8. Collaborate and engage with the report: Members of the
team can launch the Power BI tab inside Teams to
collaborate and interact with the report immediately there.
Through the use of Teams' built-in collaboration tools, users
can see visualizations, apply filters, and debate findings.
9. Receive alerts and stay up to date: Teams offers
notifications for report updates, so members of a team can
be informed when new data is available or when
modifications are made to the report.

Flexible options such as sharing through a link or via Teams make
distributing Power BI reports and dashboards easier. Sharing a
report inside Teams improves collaboration because it incorporates
the report into a team's workflow, while sharing through a link is
quick and straightforward. Select the approach that best fits your
requirements, and start sharing insights with others.
Sharing via SharePoint
Sharing Power BI reports and dashboards through SharePoint is a
handy way to distribute data and collaborate on it inside your
SharePoint environment.
The following is an in-depth tutorial that will walk you through
each stage of sharing content from Power BI using SharePoint:

1. Publish your Power BI report to the Power BI service: To
get started, use your Power BI Pro license to publish your
report to the Power BI service. After this step, your report is
stored in the cloud and available to other users.
2. Navigate to your SharePoint site: Open the SharePoint
site where you wish to share the report. Make sure you
have the rights needed to edit the SharePoint page.
3. Activate the edit mode: To make modifications to the page
layout and content, you must first activate the edit mode on
the SharePoint page.
4. Add the Power BI web part: While in edit mode, click the
"+" or "Edit" button to insert a web part into the page. In the
gallery of available web parts, search for "Power BI," and
then choose the "Power BI" web part.
5. Customize the Power BI web part: Once the Power BI
web part has been installed, you can customize it by clicking
on the "Add report" option located inside the web part. You
have the option of displaying a previously created report, or
you can make a new one. If you choose to use an existing
report, you will be prompted to enter the URL of the Power
BI report located inside the Power BI service.
6. Personalize the display options: Personalize the display
options for the Power BI web component by configuring the
display options, such as the size of the embedded report,
the amount of interaction, and the default report page to
display.
7. Save and publish the SharePoint page: Once you have
finished creating the Power BI web component, save the
changes that were made to the SharePoint page. To make
the changes to the page visible to other users, you must first
click the "Save" or "Publish" button.
8. Verify the embedded Power BI report: Exit edit mode and
check that the embedded Power BI report is displayed
properly on the SharePoint page. Users who have access
to the SharePoint site will now be able to see the
embedded report and interact with it.
9. Collaborate and engage with the report: Users of
SharePoint can explore the integrated Power BI report
immediately inside the SharePoint page. To get insights,
they can interact with the visual representations, apply
filters, and do data analysis.
10. Manage access and permissions: SharePoint offers a
variety of settings for managing access and permissions
for embedded Power BI reports. Depending on the
permission settings of the SharePoint site, you can control
who can read, edit, or collaborate on the report.

Creating an App
Choose the workspace that already contains the components of the
app you want to build. Next to each workspace object you will find a
toggle that controls whether the item is included in the app. Only the
reports and dashboards that use a particular dataset are included in
an app; datasets themselves are not shared.
The app still gets its information from the dataset, but the dataset
itself is hidden from the app's users. Once you have selected the
items you want to include, click "Create app" in the upper-right
corner, and you will be taken to a screen similar to the one shown
in the picture below.
In the app's settings you will find controls to choose the app's name,
enter a description, link to a site where end users can get help or
read documentation, determine app navigation settings (I like the
default navigation builder, so I tend to leave this alone), and set
permissions around who can access the app.
Is the app available to the whole organization, or only to certain
users? With build permissions, who can access the datasets
underneath the application? Can users make copies of the reports?
Can they share the app, and, as a last question, should the app be
installed automatically? You should probably discuss some of these
options with whoever is responsible for data governance in your
business, since that person may already have rules in place to
guide your choices.
Conclusion
Power BI Service offers a comprehensive and robust platform for
data visualization, sharing, and collaboration. With its intuitive
interface and powerful features, users can harness the full potential
of their data to gain valuable insights and make informed decisions.
The Power BI Service serves as a centralized hub for managing and
accessing data, reports, and dashboards. It allows users to connect
to various data sources, including cloud-based services, on-
premises databases, and online services, ensuring that data can be
easily integrated into Power BI for analysis.
CHAPTER 9
LICENSING AND DEPLOYMENT TIPS
Licensing
The concept of a license is straightforward: it is a paid agreement
that gives you the right to use a service for as long as you keep
paying for it. If a product offers multiple tiers of functionality, you
may be able to pay for one level of capability rather than another.
For the majority of customers, Power BI licensing falls into three
primary categories: Pro, Premium per User, and Premium per
Capacity.
Free, Pro, and Premium per User (PPU) are the three varieties of
individual user license available for Power BI. Where your content is
kept, how you want to engage with that content, and whether the
content uses Premium features together determine which license
you need. The other kind of license is the capacity-based Premium
license.
When an organization has a capacity-based license for Power BI
Premium, Pro and PPU users can generate content in workspaces
that have been designated as having Premium capacity. After that,
Pro and PPU users can give their coworkers, including free users,
access to those Premium workspaces.
Power BI service licenses
There are two types of licenses: one that relates to an individual
(per-user licensing), and another license (often also referred to as a
subscription) that applies to the sort of storage capacity that an
organization acquires. It is essential to note the difference between
these two types of licenses when discussing licenses.
Each of the three individual user licenses provides access to a
different subset of the tools and capabilities offered by the Power BI
service. Combining a per-user license with a Premium capacity
unlocks still more: when content is hosted in Premium capacity,
Pro, PPU, and free license holders gain access to extra features
and capabilities, such as sharing, collaboration, and more.

Free per-user license


Users who have free licenses for the Power BI service can use it to
connect to data and generate reports and dashboards for their
usage. They are unable to utilize the Power BI sharing or
collaboration tools with other users, nor can they publish information
to the workspaces of other individuals. However, users with a Pro or
PPU subscription can share content and collaborate with free users
provided the information is kept in workspaces that are hosted with a
Premium capacity subscription.

Pro license
Users are allowed to not only view and interact with content that has
been published by others to the Power BI service but also produce
their content via the use of an individual license called Power BI Pro.
Users with this license can exchange content and collaborate with
other Power BI Pro users.
Power BI Pro users are the only ones who can produce content,
share content with other Power BI Pro users, or consume content
that was developed by other Power BI Pro users unless the content
is hosted by a Power BI Premium capacity. If the content is hosted
by a Power BI Premium capacity, then users with a Pro subscription
can collaborate and share content with users with a free or PPU
subscription as well.

Premium per user (PPU) license


The owner of a PPU license has access to everything Power BI Pro
is capable of, in addition to the vast majority of the Premium
capacity-based features. A Power BI PPU
license enables users to have access to a wide array of tools,
capabilities, and kinds of content that are otherwise restricted to
Premium subscribers only. This access is only available to the
person who has the PPU license as well as any other coworkers who
also own a PPU license. For instance, all users of a PPU workspace
need a PPU license to interact with one another and share
information inside that workspace. If a user has a PPU license, the
content that they generate can only be shared with other users who
also have a PPU license, unless that content has been placed in a
workspace that is hosted in Premium capacity. The following table
provides a summary of the fundamental features that are included
with each level of license.

Premium capacity
Users with a Pro or PPU account can produce and store content in
Premium capacity workspaces if they have a capacity-based
Premium license, which is also often referred to as a Premium
subscription. After that, they can share that workspace with
coworkers who have any sort of license. Only users with a Pro or
PPU license can generate and store content in Premium capacities,
and even then only if their organization has purchased Premium
capacity.

Workspace and App Management


Regardless of the licensing structure your business uses, you will
need to add users to the workspaces and applications you build. I
will be using a Premium per User license to demonstrate all of the
features that are available here.

Workspace Generation and Access Control


Access to a workspace can be gained in a handful of different ways.
Workspaces are represented as a category of objects that can be
added to your Recent list when you access them via the Home
menu screen, as illustrated in the screenshot that follows.
Additionally, a line separates the Learn navigation from the
Workspace Management navigation on the menu on the left.
When we click the arrow next to the Workspaces menu item, a
second gray vertical box will open, displaying a list of the
workspaces to which I have access. A button labeled
"Create a workspace" will also appear in this box, but only if I am
allowed to create one. The most common situation in which you
cannot is a Premium per Capacity tenant where the tenant
administrator has blocked this capability.
Second, look at the upward-pointing arrow in the vertical menu on
the line just below Workspaces. It displays the workspace you are
currently working in, and clicking it lists all of the dashboards,
reports, workbooks, and datasets included inside that workspace.
We can see what this looks like in my
personal workspace example as shown in the image. You can see
that I am working in "My workspace" and have a few datasets. Not
everything in the workspace fits on screen, so a scroll bar appears
alongside the list of workspace contents.
Selecting a workspace from the list of Workspaces shown on the
right-hand side of the menu is yet another method for seeing all of
the items included inside a workspace. This will send you to a
landing page for the workspace that displays all of the different data
components that have been saved in that workspace. We can now
go on to the process of workspace creation. Let's begin by making a
new workspace by clicking on the button that says "Create a
workspace" in yellow. When you select that button, a new window
will appear on the right side of the screen. This window will provide
information on the various possibilities for creating a workspace. To
access the advanced settings, you will first need to click the arrow
next to the advanced menu item. This is because the advanced
settings are hidden by default.
You will see that there is an option to upload a picture for the
workspace, and you can provide a description of the workspace for
your users. Then we get to the advanced settings. To begin, you
will choose the individuals who will be notified about any requests for
access to a workspace. You can also set a Workspace OneDrive
location for the storage of files, the licensing mode of the workspace,
whether or not the workspace will produce a template app, and
whether or not contributors will be able to alter the app that was
generated from the workspace if one already exists. The majority of
the time, the individuals you want alerted about access requests will
be the administrators of the workspace. However, in Premium per
Capacity settings, the tenant administrator can have a larger degree
of control over the workspaces. In these kinds of scenarios, the
business may choose to have all requests of this kind sent to either
an IT group or the tenant administrator.
We can choose a OneDrive for Business location that the
workspace will use to store documents. You cannot simply use your
personal OneDrive storage space, however; that would apparently
be too convenient. Instead, the target here is the SharePoint
document library that belongs to a Microsoft 365 group, and you
need to establish that group elsewhere before coming back to
Power BI.
You can build such a group from inside OneDrive for Business, but
the ability to create Microsoft 365 groups may be limited in your
environment. If access is limited, contact the administrator
of your SharePoint site or your company's information technology
department for information on how to integrate OneDrive into your
new workspace.
The "License mode" menu is next, and it is the most important
choice we make here, because it determines the capabilities of our
workspace. If you choose Premium per User or higher, a new setting
appears that was not shown before (a touch of inconsistent design
language): you will be prompted to choose a default storage format
for your datasets, either the small dataset storage format or the
large dataset storage format. The small dataset storage format is
the default for all Pro workspaces and cannot be changed there.
Make sure "Large dataset storage" is enabled if you want data
models larger than 1 GB.
The next option is whether or not this workspace will house a
template app; a template app is built to share content with people
outside of your business. The final option matters if you plan to
ultimately use pieces created in this workspace in an app: it
determines whether a user with the Contributor role is allowed to
make changes to an app powered by this workspace. (We will talk
about access levels and restrictions shortly.) That is a data
governance question, and the answer should come from the
workspace administrator or the tenant administrator. You might
enable the feature in certain circumstances, such as during
development, so that you can make changes more rapidly. For
regular production use, however, you should probably disable it, to
limit the number of people who could accidentally alter the app and
disrupt workflows.
When all of the information has been entered, the Save button turns
yellow and the workspace can be created. Note that Pro
workspaces do not have a diamond next to them, while a Premium
per User or Premium per Capacity workspace always shows a
diamond next to its name, helping users determine what kind of
workspace they are using.
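Workspace creation can also be scripted. The Power BI REST API has a "Groups - Create Group" endpoint; the sketch below only builds the request, assuming the documented shape (`POST /v1.0/myorg/groups?workspaceV2=True` with a `name` field). Authentication and the HTTP call are omitted, so treat this as a starting point and check the current API reference.

```python
# Sketch of the request for creating a workspace via the Power BI REST API.
# The workspaceV2=True query parameter requests a modern (V2) workspace.
def build_create_workspace_request(name: str) -> dict:
    """Build the URL and JSON body for creating a new workspace."""
    if not name.strip():
        raise ValueError("workspace name must not be empty")
    return {
        "method": "POST",
        "url": "https://api.powerbi.com/v1.0/myorg/groups?workspaceV2=True",
        "json": {"name": name},
    }
```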

Managing Users in a Workspace


View, Filters, Settings, Access, and Search are available from the
home page of any workspace, in the top-right corner of the page.
Access is the one we want. When we select
that button, a pane that looks very much like the Workspace
Generation window slides in from the right side of the screen. In this
pane, we can add people to the workspace.
Power BI integrates nicely with your organization's directory
instance, whether Windows or Azure Active Directory, so as you
type an email address it will search for matching people, or for
Microsoft 365 groups if you have them, as seen in the picture
below. You
can add whole groups to a workspace with only a few clicks. You can
also add several people at once by entering multiple email
addresses. Take note of the line that may be seen
underneath the email address. At this point, we decide what kind of
access a user will have to the workspace that we've made available
to them. Users in a workspace can be given one of these four roles:
Admin, Member, Contributor, or Viewer.
These roles are hierarchical: someone holding a lower role cannot
change the access of someone holding a higher role. For example,
a Member cannot remove an Admin from a workspace, while an
Admin can remove a Member, and only Admins can remove other
Admins.
and Members can take actions that either have an effect on users in
a workspace or alter workspace components that are utilized outside
of the workspace. Contributors can work on items that are contained
inside the workspace, but they often cannot engage with the way
that external viewers interact with workspace data pieces. Viewers
are unable to make any changes to any of the items in a workspace
and can only see the things that are present in the workspace.
When it comes to managing workspaces, set up at least one service
account. A service account is an account that does not belong to a
specific user but rather to the business as a whole, and the
credentials for that account are held by a select group of individuals.
Even though this service account isn't typically logged into, it should
nonetheless be added as an Administrator to every workspace. If for
some reason the lone Admin of a workspace were to leave the
company, you would be required to go via the Power BI Admin API
at that time to promote someone else to the role of Admin for the
workspace. On the other hand, the data governance policies that
your business may have in place will be able to assist you in
determining the best course of action for both you and your
organization in this respect. What matters most is having some plan
in place so that, if an employee departure or dispute disrupts things,
you can retake control of a workspace.
Removing a User or Changing Their Role in a
Workspace
You will see an ellipsis, which is represented by three dots, to the
right of the permission that is stated. If you click it, which acts as a
button, a tiny pop-up window will emerge, displaying a list of the
roles that can be assigned to the user, with a Remove button at the
very bottom of the list. Choosing Remove deletes the user from the
workspace; clicking a different role switches the user to that role.
This part is simple, maybe even a bit too simple, since Power BI
does not show a confirmation prompt asking whether you are
certain you want to perform the action in question. Mistakes are
easy to correct, though, given how straightforward adding users is,
as you can see here.
Adding Users to Roles for RLS Implementation
Your workspace is now organized with the people authorized to
access it. Next, you need to add those users to RLS for each
dataset in the workspace that has roles configured. This is not a
very difficult task; the real challenge is defining the roles in the first
place. On the landing page for a workspace, hover your mouse over
the dataset you wish to add users or groups to for RLS until a
vertical three-dot ellipsis appears next to the name of the dataset.
After clicking the ellipsis, go to the Security tab. If you have
roles defined for your Power BI dataset, you will see the list of
roles as well as the people who are currently members of those roles.
Here you can add Microsoft 365 users or groups to each role. One
thing to keep in mind is that if a person is a member of more than
one role, Power BI grants them the union of the data visible to each
role; the role filters are combined with OR. The user interface for
adding users to RLS is simple. As with adding users to the workspace,
Power BI will attempt to locate people as you type an email address
or group name.
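The multi-role behavior can be illustrated with a small simulation. This is my own sketch, not Power BI internals: each role stands in for a DAX filter expression, and a user in several roles sees the union of the rows that any of their roles allows.

```python
# Illustrative sketch (not Power BI internals): when a user belongs to
# several RLS roles, the role filters combine with OR, so the user sees
# the union of the rows each role allows. Role names and data are made up.

rows = [
    {"region": "East", "amount": 100},
    {"region": "West", "amount": 200},
    {"region": "South", "amount": 300},
]

# Each role is a filter predicate, standing in for a DAX expression
# such as [region] = "East".
roles = {
    "EastSales": lambda r: r["region"] == "East",
    "WestSales": lambda r: r["region"] == "West",
}

def visible_rows(user_roles, rows, roles):
    """Union of rows allowed by any of the user's roles (OR semantics)."""
    return [r for r in rows if any(roles[name](r) for name in user_roles)]

# A member of both roles sees East and West rows, but never South.
both = visible_rows(["EastSales", "WestSales"], rows, roles)
```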
You may want users to be able to interact with the data items you've
developed at some time, even if you don't intend to bring those
elements to the workspace directly. This can be accomplished by
incorporating them into an application that is driven by a workspace.
On the home page for our workspace, there is a large "Create
app" button located in the upper-right-hand corner of the page. On
this very first page of the Setup process, we have a few different
options. We give our app a name and a description, and we can
provide a link to a support or documentation website where users
can get assistance with using the app. In addition, we can provide a
logo for our app, choose the color scheme for the app's interface,
and, finally, we can show contact information for the person who
should be contacted about access to the app. All of this ought to
seem quite familiar by now since it is in accordance with our
previous experience with the construction of workspaces.
Before you go ahead and click the button that says "Publish app,"
you will see that there are a few more things that need our attention.
Following the Setup tab, we will find the Navigation and Permissions
tabs. Let's go over them in the order they were presented, beginning
with Navigation.
The first thing we see is an on/off toggle to let us choose whether
we want to use the New navigation builder. If you are starting fresh
with apps, leave this setting alone so that your consumers get a
consistent experience; the old option will probably be removed at
some point in the future.
Only the data products are included in an application; none of the
underlying core data comes along. For instance, an application does
not include any datasets or dataflows, but it does include reports
and dashboards.
When you look at the Navigation pane on the left, you will see that
all of the dashboards and reports currently present in the workspace
have been brought into the application. In this particular instance,
I have both the actual report and a generic dashboard that was
generated automatically when I published my PBIX file. The grayed-out
menu in the left-hand column shows what those components are. Objects
you exclude from an application are hidden rather than deleted: you
can hide items you do not want included by clicking the little eye
icon next to the up and down arrows. If there is a slash through the
eye, the object will be concealed.
The navigation becomes genuinely useful when you click the +New
button, which lets you add a section or a link. A section functions
much like a display folder, though you cannot build a folder
structure with many levels; there are no folders inside folders here.
A link can point to any web page, inside or outside Power BI, and you
can specify how the web browser should open it.
When you create a section, the area on the right provides a field to
name the section and an option to hide it from the navigation menu.
To construct a link, you give the app element a name, provide the
URL, decide how the link should open when it is chosen, and finally
choose a group for the element to fit in so that it can be navigated.
With your navigation set up and organized, you now need to figure out
who should be able to access the app and what underlying rights come
with access to the app and the datasets that power it behind the
scenes. That is a lot of information to process, so give yourself
some time. The first step is to determine who ought to be allowed
access. Does it apply to the whole organization? Are certain people
or groups inside the organization the target? When we talk about
groups, we're referring to the ones in Microsoft 365. Power BI also
notes that everyone who has access to the workspace will have access
to the application.
For the same reason that a workspace Admin isn't impacted by RLS,
this makes perfect sense. A workspace administrator can easily
download the PBIX, and anybody with access to the workspace can
inspect the content simply by going there. Next, we decide whether
users who have access to the application will be able to connect to
the datasets the app is based on, and whether they will be able to
create copies of the reports inside the application and use them in
their own workspaces. These two options are arranged hierarchically:
reports cannot be copied if the user cannot access the underlying
datasets that the app refers to. However, having access to the
datasets does not guarantee that you can copy the reports; dataset
access is required, but it is not sufficient on its own.
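That hierarchy can be modeled in a few lines. This is my own sketch of the logic described above, not a Power BI API: app access gates dataset connections, and dataset connections in turn gate report copying.

```python
# Sketch of the permission hierarchy described above (my own modeling,
# not a Power BI API): "copy reports" requires "connect to datasets",
# which in turn requires access to the app itself.

def effective_permissions(has_app_access: bool,
                          allow_dataset_connections: bool,
                          allow_report_copies: bool) -> dict:
    can_connect = has_app_access and allow_dataset_connections
    # Dataset access is necessary but not sufficient for copying reports.
    can_copy = can_connect and allow_report_copies
    return {"connect_to_datasets": can_connect, "copy_reports": can_copy}

# Copying is off whenever dataset connections are off, regardless of
# the copy setting itself.
p = effective_permissions(True, False, True)
```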
Next, we need to decide whether or not we want users to be able to
share the app as well as the datasets that are underlying the app by
using the share permission. It seems strange to me that there is no
clear hierarchy here either. In most cases, I do not enable this
option; rather than letting items be shared without any restrictions,
I make sure there is a place where users can request access to the
app from the team that owns it. Lastly, there is a link to more documentation provided by
Microsoft as well as an option of whether or not this application will
automatically install for users who have access to it. Because it can
be so helpful to end users, automated installation is something that I
have no problem leaving switched on. We can then publish the app.
After the application has been packaged, which may take anywhere from
five to ten minutes on average, it will show up in the Apps area of
the Power BI navigation menu. If you need to update the app, go back
into the workspace that hosts it, and you'll see that the button
labeled "Create app" has been replaced with one labeled "Update app."
That about sums it up. You now own an application!
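As an aside, published apps can also be enumerated programmatically. This hedged sketch parses a canned response shaped like the documented "Get Apps" REST call; the app names and IDs are invented, and a real call would need an Azure AD bearer token.

```python
# Hedged sketch: apps installed for the caller can be listed with
# GET https://api.powerbi.com/v1.0/myorg/apps (the documented "Get
# Apps" call). The parsing below runs against a canned sample
# response with invented values; the real call needs authentication.

import json

SAMPLE_RESPONSE = json.dumps({
    "value": [
        {"id": "app-1", "name": "Sales App", "publishedBy": "BI Team"},
        {"id": "app-2", "name": "Finance App", "publishedBy": "Finance"},
    ]
})

def app_names(response_text: str) -> list:
    """Extract app names from a Get Apps response body."""
    return [app["name"] for app in json.loads(response_text)["value"]]

names = app_names(SAMPLE_RESPONSE)
```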
The Golden Dataset(s)
As you continue to handle the deployment of Power BI, at some point
you are going to find that you have accumulated a significant number
of data assets. You will run into computing issues as the
number of these data items increases. If you live in a world where
each report is driven by its dataset and each of those datasets is
always being updated, then you have a lot of moving components to
keep track of.
Heaven forbid you have that many datasets refreshing against
corporate databases, because if those datasets are of any magnitude
at all, you can give your poor DBA a heart attack. And what
percentage of the data inside those datasets is duplicated for no
apparent reason? Before the Power BI service truly came into its own,
a lot of analysts who were using Power BI and pushing this area
forward didn't have very solid answers to these questions.
thousand Excel worksheets that are dispersed around the office; why
should this be any different? The issue is that, at some point in time,
data items become crucial to the operation of the organization, and
we need to handle them as if they were mission-critical assets.
Therefore, to figure out how to handle this issue, we need to take a
peek at the history of Power BI and recall what sits under the hood:
Analysis Services.
How would we go about managing a deployment of Analysis
Services throughout the whole organization? We would either
establish a single, comprehensive master dataset or a limited
number of datasets that we could easily manage. If we did so, it
would reduce the amount of computing work that was required of the
other components of our data pipeline. By building that master
dataset, sometimes known as a golden dataset or whatever name
you like, we can do the same thing with Power BI. If we can limit the
number of times we make large data queries to our databases, we
will be in a better position. The fewer locations at which we are
responsible for managing RLS, the simpler the task will be. The
fewer datasets we have to handle, the more confident we can feel in
putting our finest data pieces in front of our data consumers. This
gives us greater peace of mind, as we are aware that there is a
greater possibility of being able to respond to the inquiries that users
have. We accomplish this goal by making the data more accessible.
Does this imply that you can only work with a single dataset to
manage everything? Certainly not in every case. You could arrange
data into a structure similar to that of a data mart, producing a limited
number of datasets that have been carefully chosen to provide
answers to certain categories of queries while yet retaining their
status as reusable data pieces. RLS offers a good example of why
fewer datasets are easier to manage. Establishing the roles that are present in a
Power BI dataset and defining those roles using DAX are two
separate processes. You will need to remember to go into Power BI
service, find the dataset in the Workspace landing page, hover over
the dataset until the ellipses menu pops up, click that, go into
security, and make sure to include the people or groups into those
roles you've defined. If you have three similar datasets, this means
that you will need to manage those roles in three different locations.
What happens if you miss that step even once, so that one of the
datasets is not updated? Then unauthorized individuals have access to
the data.
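That bookkeeping risk is easy to check for mechanically. This illustrative sketch (dataset and role names invented) flags datasets whose role membership has drifted from a designated reference copy:

```python
# Sketch of the bookkeeping problem described above: if the "same" RLS
# role is maintained on three datasets, a missed update leaves the
# memberships out of sync. Dataset names, role names, and members here
# are all invented for illustration.

def find_role_drift(datasets: dict, role: str, reference: str) -> list:
    """Return datasets whose membership for `role` differs from the reference dataset."""
    expected = set(datasets[reference].get(role, []))
    return sorted(
        name for name, roles in datasets.items()
        if name != reference and set(roles.get(role, [])) != expected
    )

datasets = {
    "SalesGolden": {"EastSales": ["amy@contoso.com", "bo@contoso.com"]},
    "SalesCopy1":  {"EastSales": ["amy@contoso.com", "bo@contoso.com"]},
    "SalesCopy2":  {"EastSales": ["amy@contoso.com"]},  # the forgotten update
}

drifted = find_role_drift(datasets, "EastSales", "SalesGolden")
```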
There is a good chance that this will become a more significant
problem in some settings, such as a Premium Per Capacity
environment, in which you can have hundreds or even thousands of
report viewers who might be members of dozens of groups and for
whom you are attempting to regulate access. Does this imply that
once your company reaches a certain size, you should no longer
consider using Power BI for ad hoc or spontaneous analysis? Of
course not! I hope that you will be able to make use of your dataset
or a limited number of datasets, to obtain the answers you are
looking for more expediently. On the other hand, while looking for
data, you may need to think creatively at times.
In big settings that use Premium per Capacity, Microsoft advises
customers to separate the capacity they use for business-mission-
critical tasks from the capacity they use for ad hoc exploratory tasks.
I believe that this is an excellent concept, and it is simple enough to
implement using workspace management, even in settings that are
not premium. When your users start developing their own reports,
encourage them to connect to datasets in the Power BI service as data
sources in Power BI Desktop so that they can get the data they need.
The number of redundant datasets will decrease as a result. In my
experience, too many companies with enormous numbers of datasets have
concluded that Power BI is just too difficult to use, because they
did not have proper governance policies in place and users eventually
lost track of all the resources that were accessible to them.
On-Premises vs. Cloud Deployment for Power BI
On-Premises Deployment
The term "on-premises deployment" refers to the process through
which a company hosts the Power BI architecture inside its own data
center or on its servers. The following are some important things
to keep in mind while planning an on-premises deployment:

1. Data Security: The protection of sensitive information is
consistently ranked as one of the highest priorities for
businesses. On-premises deployment gives businesses and
other organizations full authority over their data, ensuring
that confidential information is kept inside their network and
is always subject to their direct oversight. This control can
be very important for businesses that must adhere to
stringent compliance rules or those that deal with data that
is strictly sensitive.
2. Data Governance: An on-premises deployment offers
businesses an increased level of control over how their data
is managed. Access controls, data privacy rules, and
compliance procedures can be defined and enforced by IT
teams according to the particular needs of each
organization. This degree of control guarantees that the
data are handled and disseminated in a way that is
consistent with the rules of the company.
3. Connectivity: When working with data sources that are
located behind firewalls or inside private networks, on-
premises deployment can be useful. Organizations can
build direct connections to these sources without having to
expose them to the internet since the Power BI architecture
can be kept on-premises. This strategy offers access to
real-time data as well as analysis of that data, both of which
may be essential in certain types of business
circumstances.
4. Customization: With an on-premises deployment,
businesses can tailor the Power BI environment to meet
their unique requirements via the addition of new
components and the modification of existing ones. This
flexibility is especially advantageous for businesses that
have specific data needs, intricate data models, or that need
to integrate with pre-existing computer systems. It makes it
possible to build customized functions such as custom data
connections, security enhancements, and other specialized
features.

Cloud Deployment
Cloud deployment, on the other hand, refers to the process of
hosting the Power BI infrastructure on the Azure cloud platform
provided by Microsoft.
Let's look at some of the benefits of using this approach:

1. Scalability: Cloud deployment provides an unrivaled level
of scalability, which enables businesses to make use of the
extensive computing resources provided by the cloud. With
cloud-based Power BI, businesses can effortlessly manage
massive amounts of data, support expanding user
populations, and seamlessly scale their analytical
capabilities in response to demand. This scalability offers a
solution that is both effective and economical since it
removes the need for costly infrastructure design.
2. Accessibility: A cloud deployment offers users the ability to
view Power BI dashboards and reports from any location
and on any device, provided that the device has an internet
connection. This accessibility encourages cooperation and
makes remote work possible, which in turn makes it easier
for geographically disparate teams to make decisions in real
time. In addition, mobile apps are supported by cloud-based
Power BI, which makes data insights accessible on mobile
devices such as smartphones and tablets.
3. Maintenance and Updates: Cloud deployment relieves
enterprises of the strain of maintaining their infrastructure
and keeping their software up to date. It is Microsoft's
responsibility to manage and update the underlying
infrastructure, which they do to guarantee high availability,
performance, and security. Because of this, businesses can
focus more of their attention on leveraging Power BI for data
analysis and less on maintaining the technical parts of the
platform.
4. Integration and Ecosystem: Cloud-based Power BI
connects easily with other Microsoft Azure services and a
broad variety of apps from third parties, enabling
businesses to build full data ecosystems. Integration with
Azure services like Azure Data Factory, Azure SQL
Database, and Azure Machine Learning extends the
capabilities of Power BI and makes it possible to implement
complex analytics and machine learning scenarios.

Several aspects should be taken into consideration before settling on
either an on-premises or a cloud deployment for Power BI.

1. Security and Compliance: Conduct an audit of the security
standards and compliance rules that apply to your firm. On-
premises deployment is often preferred in fields like
healthcare and finance, which tend to demand a higher level
of data security. However, cloud providers such as Microsoft
Azure provide sophisticated security protections and
compliance certifications, which enables many enterprises
to consider cloud deployment as a feasible alternative.
2. Cost: Examine both the one-time and recurring expenses
that are associated with on-premises and cloud
deployments. On-premises deployment often necessitates
more upfront expenditures in terms of both hardware and
software licensing, in addition to ongoing infrastructure
upkeep. The deployment of software in the cloud, on the
other hand, utilizes a subscription-based pricing model,
which enables businesses to only pay for the resources they
use.
3. Scalability and Growth: When thinking about the future
expansion of your company's data volume, user base, and
analytics needs, keep in mind the scalability of your
solution. Deployment in the cloud has the benefit of scalability and
flexibility, making it possible for your business to grow smoothly.
If you anticipate substantial growth or need the capacity to scale up
or down quickly, the cloud may be the most suitable option for you.
4. IT Expertise: Analyze the IT capabilities and resources
available to your company. A greater degree of technical
competence is required to operate the on-premises
infrastructure and guarantee its high availability when using
an on-premises deployment model. Cloud deployment,
although lowering the number of duties associated with
infrastructure management, needs skill in both the
administration of cloud resources and the integration with
other services.

Scaling Power BI Deployments for Enterprise Use


Scaling Power BI installations becomes essential for guaranteeing
the platform's efficacy and efficiency as enterprises continue to
develop and their data analysis requirements get more complicated.
When growing Power BI installations for business usage, the
following are some essential considerations and best practices
to keep in mind:
1. Data Modeling and Optimization:
• Implement effective data modeling techniques: An optimal
data model is essential to the system's speed and scalability.
Make use of best practices such as building appropriate
relationships, designing calculated columns judiciously, and,
if required, putting data partitioning into action.
• Make use of DirectQuery and Live Connection: Rather than
importing all of your data into Power BI, consider using
DirectQuery or a Live Connection to build a real-time link to
the data sources. This approach helps to eliminate redundant
data while making it possible to get data dynamically, which
is very beneficial for huge datasets.
2. Data Source Considerations:

• Optimize data sources: Ensure that data sources are
correctly indexed, and tune queries to decrease the amount of
time spent retrieving data.
• Leverage data source-specific optimizations: To
improve speed, you should make use of optimizations that
are unique to the data source. For example, partitioning in
SQL Server, aggregations in Analysis Services, or query
folding in Power Query are examples of such optimizations.
3. Data Refresh and Gateway Management:

• Configure efficient data refresh schedules: Determine the
proper refresh frequency based on how fresh the data needs to
be. Avoid needless refreshes to limit the strain on data
sources.
• Utilize Power BI Gateways effectively: Power BI
Gateways make it possible to link data sources that are
located on-premises to those located in the cloud.
Gateways should be configured to strike a balance between
performance, resource usage, and security.
4. Power BI Premium:

• Evaluate Power BI Premium: Enterprise installations may
benefit from the increased scalability, performance, and
advanced capabilities that are available with Power BI
Premium. You should think about subscribing to Power BI
Premium so that you can make use of its features and
advantages, like expanded data capacity, paginated reports,
and artificial intelligence capabilities.
5. Usage Metrics and Monitoring:
• Keep an eye on both usage and performance: Utilize the
monitoring tools and consumption data provided by Power
BI to obtain insights into the behavior of users, the
performance of reports, and the utilization of resources.
Utilizing these indicators, identify areas that might benefit
from optimization and enhancement.
• Make sure you have alerts and notifications set up:
Establish proactive monitoring alerts so that you can get
information about possible problems or performance
deterioration, which will enable you to respond and resolve
the problem quickly.

6. Collaboration and Governance:


• Establish a Role-Based Security System: Define the
proper roles and permissions to exert control over who may
access the data, reports, and dashboards. Make sure that
sensitive data is safeguarded and that only authorized
persons may access it.
• Encourage collaboration and sharing: Foster an
environment that values collaboration by encouraging the
sharing of reports and dashboards. This will help to create a
culture that values cooperation. Make use of shared
workspaces and other tools to ease the process of
collaborating as a team and exchanging information.
7. Training and Support:

• Provide training and support: Invest in training programs
that will educate users on the best practices for using Power
BI, as well as approaches for data modeling and performance
optimization. Make continual help available to resolve concerns
and queries raised by users.
8. Consider Future Growth:

• Plan for Scalability and Future Expansion: Anticipate future
data volume, user base growth, and analytical needs. Create
the design of your Power BI solution with scalability in mind,
taking into account aspects like the requirements for data
storage, processing resources, and infrastructure.

Best Practices for Power BI Deployment


For an efficient deployment of Power BI, rigorous planning and
adherence to best practices are required to guarantee that
performance, security, and user adoption are at their highest
possible levels.
The following is a list of critical recommended practices for the
deployment of Power BI:
1. Planning and Requirements Gathering:
• Establish the goals of the project very specifically: It is
important to explain the aims and objectives of the Power BI
deployment clearly and concisely. Determine the intended
audience, as well as their requirements for reporting and
analysis, and the expected results.
• Engage stakeholders: Ensure alignment and improve the
effectiveness of requirement gathering by including important
stakeholders, such as business users, IT teams, and
executives, in the planning process from the beginning to the
end.
2. Data Modeling and Design:

• Implement a robust data modeling strategy: Invest some time
and effort into building a well-structured data model that is
in line with the needs of the company. Normalize the data,
define the relationships between tables, and consider
performance optimization strategies such as partitioning and
indexing.
• Use calculated columns and measures judiciously: Try to
limit the usage of calculated columns, since they can harm
performance. Instead, rely on measures to carry out
calculations at query time.
• Apply data cleansing and transformation: Make data quality
your top priority by performing the necessary data cleansing
and transformation steps.
3. Security and Governance:

• Implement role-based security: Define user roles and the
permissions they carry so that data access depends on those
roles. Make use of Row-Level Security (RLS) to restrict user
access to certain data rows depending on the user's context.
• Establish data governance policies: Define data
governance principles, such as naming conventions, data
categorization, and version control. Put in place procedures to
ensure that these rules are followed and that data integrity is
preserved.
4. Performance Optimization:
• Optimize data refresh schedules: Based on the data
freshness requirements, determine the optimal frequency and
time for data refreshes. Avoid needless refreshes to limit
the strain on data sources.
• Utilize Power BI caching: Make use of the caching
mechanism in Power BI to increase the speed of queries and
limit the amount of data source access.
• Make use of aggregations and calculated tables:
Implementing aggregations and pre-calculated tables is one
way to increase query efficiency, particularly when dealing
with huge datasets. Power BI Premium and Azure Analysis
Services are two tools that can be used to accomplish this
goal.
5. Data Source Connectivity and Gateway Management:
• Select the most suitable data connection option: Evaluate
the data connecting options that are available for your data
sources, such as DirectQuery, Import, and Live Connection.
Your evaluation should take into account the data volume, real-
time needs, and performance factors.
• Optimize Power BI Gateway configuration: Configure Power BI
Gateways efficiently to build safe and dependable connections
between on-premises data sources and the cloud. Monitor and
maintain gateways regularly to ensure that data refresh and
communication run well.
6. User Training and Adoption:

• Provide Comprehensive Training: Provide users with
training programs and tools to educate them on the capabilities
of Power BI, report creation best practices, and self-service
analytics. To facilitate adoption, you should encourage users to
explore and experiment using Power BI.
• Encourage user engagement and collaboration: To
cultivate a culture that values cooperation and the exchange of
information, you should encourage the usage of shared
workspaces, encourage comments and conversations, and
share reports and dashboards with other users.
7. Version Control and Deployment Lifecycle:
• Implement version control: Make use of version control
systems like Git to keep track of changes made to the reports
and dashboards in Power BI. This not only assures traceability
but also makes cooperation easier and permits rollbacks if they
are required.
• Create a deployment lifecycle: Define a systematic
deployment process to manage the release and promotion of
Power BI artifacts across multiple environments (for example,
development, testing, and production). This will allow you to
track changes and ensure that they are made in a controlled
way. Put change management principles into action to prevent
interruptions and ensure stability.
8. Continuous Monitoring and Improvement:
• Monitor usage and performance: Consistently monitor usage
trends, report performance, and connectivity to data sources.
Utilize the usage metrics,
audit logs, and monitoring tools that are available in Power BI
to identify problem areas and take preventative measures to fix
them.
• Gather user feedback: Request that users submit feedback
on the dashboards and reports generated by Power BI. It is
important to actively seek feedback to discover usability
concerns, potential for development, and demands for new
features.
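The monitoring advice above can draw on the admin activity log. This hedged sketch summarizes a canned sample of events by user; the field names follow the documented activity-event schema as I understand it, but verify them (and the activity-events REST endpoint itself) before relying on this.

```python
# Hedged sketch: usage monitoring can draw on the Power BI admin
# activity log (the documented admin "activity events" REST call).
# This summarizes a canned sample of events by user; the events and
# user names are invented, and real calls need admin credentials.

from collections import Counter

sample_events = [
    {"UserId": "amy@contoso.com", "Activity": "ViewReport"},
    {"UserId": "amy@contoso.com", "Activity": "ViewReport"},
    {"UserId": "bo@contoso.com",  "Activity": "ExportReport"},
]

def views_per_user(events: list) -> Counter:
    """Count report views per user to spot heavily (or never) used content."""
    return Counter(e["UserId"] for e in events if e["Activity"] == "ViewReport")

usage = views_per_user(sample_events)
```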
Troubleshooting Licensing Issues in Power BI
Common Licensing Issues
• Access Restrictions: When consumers can't access certain
features or reports because of license restrictions, this is a
typical licensing problem. Users of the free version of Power
BI, for instance, have restricted access to the collaborative
capabilities of the platform and may encounter constraints on
the data refresh rates.
• Problems with License Assignment: There are situations
when users may not be allocated the right license type. As a
consequence, they may have restricted access to Power BI or
be unable to use it at all. It is of the utmost importance to check
that the appropriate licenses have been given to users in
accordance with the needs of those users.
• License Expiration: A license that is allowed to lapse, or
is not renewed when it should be, causes yet another problem:
users' access to Power BI tools and reports may be cut off as
a result. To minimize disruptions, routinely check when
licenses are about to expire and renew them in a timely way.
• Sharing and Collaborating Restrictions: If the necessary
licensing is not in place, sharing reports and dashboards with
external users can be a difficult process. Users of Power BI
Free could be subject to restrictions on their ability to share,
which would need an upgrade to Power BI Pro or Premium.

Troubleshooting Licensing Issues


• Checking License Assignments: Verifying that users have
been given the appropriate licenses to use is the first step in
determining how to fix problems related to licensing. Either the
Microsoft 365 admin portal or the Power BI admin portal can
be used to do this task. If there are any inconsistencies, the
licenses can be reallocated as necessary.
• Activation of the License and License Renewal: Make
certain that licenses are both active and renewed at the
appropriate times. Licenses for Power BI are often included as
part of a company's subscription to Microsoft 365. Users run
the risk of losing access to Power BI capabilities if their
licenses are not activated or renewed on time. It is important to
routinely check when licenses are about to expire and to set up
automatic reminders to prevent any interruptions.
• Upgrading Licenses: If customers are encountering
constraints as a result of their current licensing tier, upgrading
their licenses to Power BI Pro or Premium should be
considered. This will allow for more features and capabilities to
become available, such as increased possibilities for
collaborative work and faster data refresh rates.
• Troubleshooting Access Restrictions: If particular features
or reports cannot be accessed, check the licensing limits
associated with the user's current license type. Determine
whether the problem is caused by license constraints or by
configuration settings; the access limitations may be lifted by
adjusting the configuration or changing the license type.
• Seeking Support from Microsoft: If the troubleshooting steps
above do not resolve the licensing difficulties, get in touch
with the Microsoft support team. They can provide further
guidance on complex licensing issues. Be ready to offer specific
details about the problem, including any error messages you
received and the steps you took to reproduce the issue.

Tips for Optimizing Power BI License Costs


• Understand Power BI licensing options: Power BI offers several
licensing options, including Power BI Free, Power BI Pro, and
Power BI Premium. Make sure you are familiar with the
capabilities and restrictions of each license type before
deciding which option best meets your organization's
requirements.
• Evaluate user requirements: Research the typical activities
and needs of your Power BI users. Power BI Free can be
sufficient for many users who only need to consume reports and
dashboards, so not every user may need a Pro license. Reserve
Pro licenses for users who need more sophisticated features
such as content creation, collaboration, and data exploration.
• Leverage Power BI Free: Utilize the Power BI Free license for
users who are mainly concerned with the consumption of
reports and dashboards rather than the production of content
or the facilitation of collaboration. Users can access shared
content and see reports with the help of Power BI Free, which
helps businesses save money by reducing the number of Pro
licenses that are required.
• Share reports and dashboards efficiently: Instead of
distributing Pro licenses to each user, take advantage of Power
BI's built-in sharing features to collaborate on reports and
dashboards. Users who do not have a Pro license can still see
and engage with reports thanks to shared content, which
reduces the number of Pro licenses that are necessary.
• Think About Purchasing Power BI Premium: If you have a
high number of Pro users or want additional features like
paginated reports, AI capabilities, or sophisticated data refresh
options, Power BI Premium may be a cost-effective option for
you to consider. Power BI Premium eliminates the need for
individual Pro licenses by providing users with limitless viewing
and collaboration capabilities.
• Optimize data refresh frequency: Examine and make
necessary adjustments to the frequency of the data refresh
depending on the needs of the users. Refreshing content at a
high frequency can take up extra resources and drive up the
cost of licensing. Analyze how data is used so that you can
establish a suitable refresh schedule and frequency for each
dataset.
• Implement row-level security: Take advantage of row-level
security (RLS) to limit data access according to user roles or
permissions. This lets you build a single report that provides
varying degrees of data access to different user groups,
eliminating the need for separate reports and licenses.
• Monitor license usage: Use the audit logs provided by Power
BI or other specialized monitoring tools to do regular checks on
license use and user activities. Find any Pro licenses that
aren't being used or aren't being used to their full potential, and
then consider reallocating them to other users or, if necessary,
downgrading those users to Free licenses.
• Explore embedded analytics: You should think about using
Power BI Embedded or Power BI Embedded Capacity if you
have users from outside your organization who need access to
reports and dashboards. With the help of these solutions, you
will be able to incorporate content from Power BI into your
apps or portals without the need to purchase Pro licenses for
each external user.
• Negotiate pricing with Microsoft: Engage in price negotiations
with Microsoft or your licensing supplier if you have a large
deployment or are planning a substantial expansion. During these
negotiations you can also explore possible discounts or
licensing solutions tailored to your business's requirements.

Conclusion
In conclusion, understanding the licensing and deployment aspects
of Power BI is crucial for effectively implementing and utilizing the
platform. By selecting the right licensing model and deploying Power
BI in a suitable environment, organizations can optimize their data
analytics and reporting processes.
Choosing the appropriate licensing option for Power BI involves
considering factors such as the number of users, their roles and
requirements, and the desired level of functionality. Power BI offers
various licensing tiers, including free, Pro, and Premium, each with
its own set of features and limitations. Organizations should assess
their needs and budget to determine the most suitable licensing
model.
CHAPTER 10
THIRD-PARTY TOOLS
Get to Know Business Ops
For most tools, you could track down each component or extension
individually, download it, install it one at a time, and then manage
it yourself. With this in mind, PowerBI.tips has developed an
external tool manager called Business Ops, which can be downloaded
at no cost at all. It does the hard work of gathering a large number
of third-party tools into one location, enabling you to rapidly
install and configure the ones you are interested in using. It is
continually updated whenever newer versions of the external tools to
which it connects are made available.
The download arrives as a ZIP file, and the installer is inside the
ZIP file itself. When the installation is finished, you will be
presented with a user interface like the one seen in the picture
below. You can see the release notes for the version of Business Ops
that you've installed on the Home landing page, and because the
project is entirely open source, you can access its Git repository
if you want. Business Ops lets you install a large number of Power
BI add-ons, quickly access learning tools, design Power BI themes,
use a gallery of custom visuals that are not available via
AppSource, and find links to some of the best DAX resources
available.
Add External Tools, Remove External Tools, and
Modify Display Order
From here, we can go to Add External Tools to see the list of
external tools that Business Ops has curated, then upload the
package files so that Power BI recognizes them as external tools.
You do this through a very simple checkbox interface. There is a lot
of information on this page, and although the scroll bar on the
right side of the application may not be the easiest thing to see, I
can assure you that it is there; the scroll wheel on my mouse is my
preferred way to navigate it. The ALM Toolkit, Tabular Editor, DAX
Studio, and Bravo will be the primary focuses of this section. When
you have selected all of the tools that you want to add, click the
blue "Add External Tools" button in the bottom right corner of
the screen and leave the rest of the work to Business Ops.
Software add-ons such as the ones we are discussing are installed
into the same folder as the Business Ops software itself, which is a
disadvantage for a couple of reasons. In addition, Business Ops will
not create individual entries in your Start menu for the software it
installs. This behavior differs from installing the programs
directly from their respective individual installers. To get around
this, once the external tools are installed, I open a new blank
Power BI file and use the External Tools ribbon to launch the
program I need. Then I right-click the program's icon on the Windows
taskbar and pick the option to Pin to Taskbar. If you want separate
desktop shortcuts instead, open Windows Explorer, browse to the
program inside the Business Ops directory, and create the shortcut
the same way you would for any other program in Windows.
You can change the order in which the tools appear in the External
Tools section of the ribbon in Power BI Desktop by navigating to the
Edit External Tools section, and you can even delete individual
items from the list entirely. When you install third-party tools, a
JSON file specific to each tool is generated automatically, and you
can edit the filename by selecting the pencil icon. The tools
available in Power BI Desktop are shown from left to right in
alphabetical order, and by default each JSON filename starts with a
three-digit code that sets the initial order.
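Under the hood, Power BI Desktop discovers external tools through small `*.pbitool.json` registration files, typically under `C:\Program Files (x86)\Common Files\Microsoft Shared\Power BI Desktop\External Tools`, where the three-digit filename prefix (for example `101.daxstudio.pbitool.json`) sets the ribbon order. The sketch below follows the documented field names, but the path and description shown are illustrative, not copied from an actual install; real files also carry an `iconData` field holding a base64-encoded image.

```json
{
  "version": "1.0.0",
  "name": "DAX Studio",
  "description": "Query and analyze the open model with DAX Studio",
  "path": "C:/Program Files/DAX Studio/DAXStudio.exe",
  "arguments": "\"%server%\" \"%database%\""
}
```

The `%server%` and `%database%` placeholders are filled in by Power BI Desktop at launch, which is how a tool like DAX Studio connects straight to the open model.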
If an item was installed using this program, you can also uninstall it
from your External Tools list. To do this, click the garbage can that is
located next to the Edit button. If you delete anything by mistake, you
can always go back to Add External Tools, and everything you
currently have there will be grayed out when you do so. You can
simply choose the tool you wish to restore and then add it once
again.
Learning, Theme Generation, Visual Generation
The custom theme generator is incredible! You can assemble a theme
consisting of colors and individualized visual settings, configured
at either the global or the individual visual level, and then import
it into Power BI Desktop as a custom theme.
tool will also provide you with the hex codes for all of the colors that
you choose, allowing you to keep track of them for any future
reference needs. You will notice that there is a color sliding scale on
the left side of the page. This scale enables you to move from red to
violet and then back again to red, following the color sequence of
red, orange, yellow, green, blue, indigo, and violet. At any position
along that bar, the square corresponding to the color range you have
picked will display a selection of lighter and darker hues for you to
choose from, as well as the hex code for the color that you have
presently selected.
You can enter those codes directly in the input field at the bottom of
the page if you already know your hex codes. For example, if the
marketing department of your company provided you with your
organization's hex-code color scheme, you could enter those codes.
You can access a list of color schemes that have previously been
produced in the Palette area by looking in the Gallery portion of the
app. If you click on a color theme that is shown in this section, you
will be sent to a sample page of the Power BI Report that will
demonstrate how the theme appears when combined with visuals.
This function is of great assistance. In the Charts section, you
will see a selection of custom visuals produced with Charticulator,
a visual creation tool developed by Microsoft that allows users to
build their own visuals. You can download these custom visuals and
import them into the report you are working on in Power BI Desktop
by importing the PBIVIZ file, after which each can be added as a
custom visual.
When you choose the Create My Own option inside the program, a
fully operational version of Charticulator appears for you to use.
Keep in mind that, strictly speaking, this is a fork (a clone of the
source code) of Microsoft Charticulator with a few minor UI
adjustments. However, anything you make in this version of
Charticulator will behave the same as if you had made it on the
Charticulator website.
Additional DAX Resources
Both the DAX Guide and the DAX.do websites are provided free of
charge courtesy of the people at SQLBI. Business Ops provides
direct connections to both of these outstanding DAX resources.
DAX Guide provides a list of every DAX function, which includes its
syntax, the kind of values that function returns, real-world coding
examples written with DAX best practices, and, for many functions, a
YouTube video in the upper-right corner that has even more
information on the function presented in an approachable video
format.
On the homepage of DAX Guide, we can choose a particular
category of DAX functions to explore or learn about the most current
DAX functions to be published and the dates on which they were
made available. As someone who uses DAX in settings other than
Power BI, I find it quite helpful to be able to see, on the left, which
products a certain DAX function will operate with. This feature is not
limited to Power BI.
On the other hand, DAX.do is a fully featured DAX playground that
lets you edit and rearrange parts of its layout to suit the way you
want to test. You have access to two data models inside the
playground, Contoso and DAX Guide, and you can switch between them
at any time. The DAX.do column list supports drag and drop into the
query editor. As you type, it recognizes the functions you are
attempting to use and offers a drop-down list of those functions so
that you can jump to their DAX Guide pages. You can see your
results, and if you make a mistake in the query that you are
writing, you will get the proper error message.
Because Business Ops offers us access to a diverse selection of
third-party tools, we will focus on some of the most advantageous
ones: a DAX editor, a dataset editor, and a tool for analyzing the
health of datasets and generating measures in an easy-to-use manner.
Let's begin with DAX Studio, the most powerful querying tool
available for DAX.
DAX Studio

You can develop, run, and examine DAX queries with the help of
DAX Studio, an open-source tool that is compatible with Power BI.
DAX, which stands for Data Analysis Expressions, is a library of
functions and operators used to carry out data-centric analytical
operations. The Power BI DAX library contains over 200 different
functions, operators, and constants that, when applied to data
analysis activities, give an enormous amount of versatility. In
addition, DAX Studio is constantly updated with additional
capabilities and functionalities to keep pace with new features.
A built-in editor that gives you the ability to develop and run queries
is included with DAX Studio by default. Object browsing, query
editing and execution, syntax highlighting and formatting, formula
and measure editing, integrated tracing, and query execution
breakdowns are some of the features that are facilitated by this. To
put it another way, DAX Studio presents critical facts and information
pertaining to the data model as well as your DAX queries.
What can you do with DAX Studio in Power BI?
• Learn DAX Language: DAX Studio not only assists you in
authoring DAX queries and analyzing the performance of your
data models, but it also aids you in becoming proficient in the
DAX language. If you want to learn more about DAX, you can
do so by navigating to the Home page and exploring the Query
Builder there.
• Optimize your Model Performance: The VertiPaq Analyzer is
a tool that is integrated with DAX Studio that makes it possible
for you to improve the performance of your model easily. It
gives you a quick summary of the data distribution and the
amount of memory that is being used, and it assists you in
solving the problems. You can also run a measure in DAX
Studio and use the Server Timings tool to gain insight
into how the formula is being processed.
• Visualize DAX “table” Functions: The results of DAX
measures that include a table function can be shown in DAX
Studio. This gives you the ability to see the result table,
allowing you to verify that the appropriate table is being
generated. The Power BI Desktop application does not provide
this particular feature.
• Extract your measures into a Spreadsheet: You can quickly
extract a list of your measures from your DAX Studio data
model into a Spreadsheet. This makes it simple for you to
record and reuse the measures you've created.
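One way to pull out such a list is with a Dynamic Management View (DMV) query, which DAX Studio can execute alongside ordinary DAX. The sketch below uses the documented `TMSCHEMA_MEASURES` view; treat it as a hedged example, since the exact columns available can vary with the model's compatibility level.

```sql
-- List every measure's name and DAX expression from the connected model
SELECT [Name], [Expression]
FROM $SYSTEM.TMSCHEMA_MEASURES
```

Run it like any other query, and set the Output destination to Excel or a file to land the results directly in a spreadsheet.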
Download, Install, and Set Up DAX Studio for
Power BI
• To begin, download the most recent version of DAX Studio to
your computer; it is a free, open-source tool. Once the file has
downloaded, you can immediately begin the installation process.
• When the installation is running, you can choose either
"Current User" or "All Users" to continue. It is highly
suggested to make use of the 'All Users' install option as the
default since this option provides the user with the fullest
possible experience.

• Allow Windows to access the application, and then accept the


terms of the license agreement.
• After selecting the target location, click the "Next" button.
• After selecting the components that you want to install, click
the "Next" button.
• Continue with the installation, and when it is finished, click the
"Finish" button at the bottom of the screen.
• The DAX Studio application has been successfully installed on
your system. The next step is to establish a connection between
it and Power BI, which you can do by launching DAX Studio
directly from its install folder or the Start menu.
• When you first run the standalone DAX Studio program, you will
be asked to choose the kind of connection you want to use: a
Tabular Server or a PBI file. Both options are available, and
you can connect to any of the open Power BI Desktop files by
choosing the appropriate data model.

• Once you have chosen a kind of connection, click the


"Connect" button;
• If you are attaching a PBI file, you can skip the "Advanced
Options" step.
DAX Studio UI Basics
Now that DAX Studio is properly linked to Power BI, you can begin
working with your data models using the exciting capabilities of
the DAX Studio user interface (UI). There is a lot to DAX Studio,
but before we get started, let's go through a few key aspects of
the UI.
1. Metadata Panel
2. The Ribbon
3. Query Pane
4. Output Pane

Metadata Panel
The first thing you will notice when you open DAX Studio is the
metadata of the tables in your data model. This is the Metadata
Panel, and in this panel you can locate all of the tables, columns,
and DAX measures included in your data model. Any table that has
been designated as a "Date Table" will have a clock symbol
displayed next to it.
The Ribbon
The next part of DAX Studio is the ribbon, through which you can
access all of the other functions. Let's talk about the many different
significant options that are included inside this ribbon.

1. Clicking the "Run" button will cause your query to be
carried out.
2. If you click the "Clear Cache" option, you will be able to
erase the cache for the database that is now open.
3. When you click the "Output" button, you will be able to
choose the location to which you want the query results to
be sent. From the Output Pane, you can also change the
default output format to one of the other acceptable formats,
such as Excel or a file (CSV or TXT).
4. Selecting this option brings up a drag-and-drop interface for
the Query Builder.
5. The "Format Query" option makes use of the DAX
Formatter service to produce a query that is properly
structured and simpler to understand.
6. Clicking "Load Perf Data" imports performance data
exported from Power BI's Performance Analyzer.
7. Clicking the "Connect" button shows the connection to the
Power BI Desktop files. If you want to link DAX Studio to
a different data model, you can do so by clicking this
button.
8. Clicking this button manually refreshes the connection's
metadata.

Query Pane
You can write, modify, format, and view your queries in the Query
Pane.

Output Pane
The results of your query are shown here by default. The
Output Pane has three tabs:
• Output: You can get some basic information about the query's
execution time in this section.
• Results: This is only a temporary storage place that is used to
return the result table once the query has been executed.
• Query History: This shows a list of the queries that have been
run in the past.

How to Write Queries in DAX Studio?


To write queries in DAX Studio, you can follow these steps:

1. Connect to a data source: Start up DAX Studio and make
a connection to the data source you're using. You can
connect to a variety of data sources, such as an Excel
worksheet, Power BI Desktop, Analysis Services, or Power
Pivot.
2. Open the Query Editor: After you have successfully
connected, open the Query Editor by either clicking the
"New Query" button that is located in the toolbar or by going
to the "File" menu and selecting "New Query" from there.
3. Write your DAX query: You can get started crafting your
DAX query in the Query Editor. DAX is the formula language
used by Power BI, Power Pivot, and Analysis Services. It is
quite similar to the formulas found in Excel, but it has
extra features for dealing with tabular data.

Here's an example of a simple DAX query to retrieve data from a
table:

EVALUATE
TableName

Replace "TableName" with the name of the table you want to
query.

4. Execute the query: After you have finished writing your
DAX query, you can run it by either hitting the F5 key on
your keyboard or choosing the "Run" button located in the
toolbar.
5. View the results: Once the query has been run, the results
are presented in the bottom section of the Query Editor. If
necessary, you can inspect the data and do further analysis.
6. Save the query: If you wish to reuse the query in the
future, go to the "File" menu and choose "Save" or "Save
As" to save the DAX query file (.dax) to a location of your
choice.
That's it! Using DAX Studio, you have successfully authored and run
a DAX query. Congratulations! You can continue to hone the
accuracy of your queries, carry out calculations, implement filters,
and explore the potential of DAX for data analysis.
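As a next step beyond returning a whole table, queries can filter rows and add calculated columns. The following is a hedged sketch; the Sales table and its Amount column are hypothetical names for illustration, not part of any model shown in this book:

```dax
-- Return rows of a hypothetical Sales table where Amount exceeds 100,
-- adding an illustrative tax-inclusive column, sorted largest first
EVALUATE
ADDCOLUMNS (
    FILTER ( Sales, Sales[Amount] > 100 ),
    "AmountWithTax", Sales[Amount] * 1.1
)
ORDER BY Sales[Amount] DESC
```

FILTER keeps only the qualifying rows, ADDCOLUMNS evaluates the new expression once per remaining row, and ORDER BY sorts the query result without affecting the model itself.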
Tabular Editor
It is essential to be aware that Tabular Editor comes in not one but
two unique versions, which are referred to respectively as Tabular
Editor 2 and Tabular Editor 3. Tabular Editor 2 is the first solution that
was created and is available under an open-source license. Tabular
Editor 3 was developed by the same team that was responsible for
Tabular Editor 2, and it has a more aesthetically pleasing user
interface as well as some other enhancements and improvements.
We will be discussing Tabular Editor 2, which is the free,
open-source version. There is currently no functionality in Tabular
Editor 3 that cannot be accomplished with Tabular Editor 2, though
doing so may involve more effort.
When we open Tabular Editor from the External Tools list, just as
with DAX Studio, it instantly connects to the Power BI data model we
currently have open. Tabular Editor is an easy-to-use editor for any
SSAS tabular model, the same kind of model that Power BI data models
are constructed on top of. Let's have a look at the picture below to
see what the interface looks like.
The typical Windows navigation can be seen at the top of the screen
and includes the options File, Edit, View, Model, and Tools.
Following that, there are three icons. The first is an open folder
that gives you access to the BIM files Tabular Editor produces
whenever you save a copy of the model information, letting you
examine the files it generates. Some companies manage their Power BI
datasets fully in Tabular Editor by using BIM files, which are
essentially very large text files describing the model. In
comparison to their PBIX equivalents, these files are of a far more
manageable size, can be subjected to source control, and can be
deposited in repositories.
Next to that is an icon of a transparent cube. Within the Server
section, the server address may be changed to point at any Analysis
Services database to which we have access. If you have the XMLA read
capability enabled, this also covers datasets currently published to
the Power BI service; keep in mind that this requires at least
Premium Per User licensing. If you have a Power BI dataset open in
Power BI Desktop at the moment, you can use the "Local instance"
picker to choose which of those datasets you want to connect to. In
either scenario, you have the option of using Windows or Azure
single-sign-on credentials; alternatively, if you need to pass a
particular set of credentials, select Username and Password and
supply the relevant information. If you choose single-sign-on
credentials, you will be able to sign in with just one click.
Lastly, in that set there is an icon that resembles a save disk. If
you make changes in Tabular Editor and then click this button, those
changes are sent to the database Tabular Editor is linked to. If
that is the data model you're using for Power BI, any changes you
make to the measures, whether additions, deletions, or edits, are
applied all at once. Created some brand-new relationships? Those are
added as well. Defined new roles? You get the point: the current
state of Tabular Editor is written back to the system it is
modifying as soon as the button is pressed. Make sure you have a
backup of a Power BI dataset before making any changes to it,
regardless of whether the information is stored locally or in the
cloud. Keep a Power BI Template (PBIT) file or a BIM file on hand in
case you make a mistake and need to revert to an earlier state
because you changed something you hadn't intended to modify.
Having said that, this approach does have a benefit. Imagine you
want to populate your data model with many different measures. In
Power BI Desktop you have to build each one individually, and the
user interface can be a touch sluggish at times. You create a
measure, discover a mistake in the DAX, and have to go repair it;
once it finally evaluates, you want to move it into a display folder
for the sake of organization, and you have to do all of this one
measure at a time. That's a real bother. With Tabular Editor you can
submit many adjustments to the data model all at once, and while you
are working, the changes you make won't take effect on the data
model until you save them.
The learning curve for several items in Tabular Editor is a little
steeper than it is in Power BI Desktop. Because of this, I suggest a
hybrid approach to using Tabular Editor until you become more
familiar with the specifics required to construct many of these
pieces outside of Power BI Desktop. You can see every aspect of your
model in the Model Navigation window on the left side of the screen.
You will not be able to alter, or even see, the Data Sources for any
dataset that was generated by Power BI Desktop. Perspectives have
one very specific use case in Power BI: they can be applied to
individual visuals when the Personalize Visuals option is turned on.
Aside from that, they don't do anything; Power BI allows you to
construct them, and they can exist in the data model, but it does
not enable their capabilities beyond that scope. To build a new
element relating to a certain section of the data model, simply
right-click that element to bring up a contextual menu. This can be
done at any time. Because I prefer a more visual way to understand
how all the tables fit together, I make it a point to steer clear of
managing relationships via Tabular Editor whenever possible. That
said, you can form new relationships in this space if you'd like,
and if you are building a data model from scratch, you will need to
become acclimated to this. We can also manage roles for RLS using
Tabular Editor, and we can add measures, display folders, and other
computed elements to our tables. The "Shared Expressions" section
lists the parameters found in the dataset, while the "Translations"
section details the language (or languages) that can be used with
the dataset.
Creating Roles
Step 1: Launch Tabular Editor
To get started, open Tabular Editor and load the project that
contains your tabular model. Tabular Editor offers a clean,
intuitive user experience for managing roles and other model
components.
Step 2: Navigate to Roles
Expand the "Roles" folder that is located in the object tree view that
is located on the left-hand side of the Tabular Editor window. You will
see a list of any pre-existing roles in your model if there are any of
those roles in your model. If this does not occur, the folder will be
empty.
Step 3: Create a New Role
To create a new role, right-click the "Roles" folder and pick "New
Role" from the context menu that appears. This adds a new role to
your model. Give the role a name that is informative and reflects
its function, then press Enter to save it.
Step 4: Define Role Membership
The "Membership" tab can be found in the properties pane on the
right-hand side of the Tabular Editor window. Here you identify the
users or groups who will be assigned this role; members of the role
can be individuals or groups. Click the "..." button next to the
"Members" property to add members to the role. This opens a member
selection window in which you can pick users or groups from your
Active Directory, or specify particular Windows users or groups.
Make your selections and click OK to save the changes.
Step 5: Define Role Permissions
In the properties pane, go to the "Permissions" tab. Here you
specify the permissions for the role, establishing which objects and
data its members can access. If you are utilizing row-level
security, permissions can be specified at a variety of levels,
including the model, table, column, and even the individual row
level. To configure permissions, expand the tree view of your model
inside the properties pane and choose the objects to which you want
to grant or restrict access for the role members. For each chosen
item you can declare the permissions that are granted or denied,
such as Read, Write, or Process, and you can also specify
permissions at a more detailed level for certain columns or
measures.
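At the row level, each table filter in a role is simply a DAX expression that must evaluate to TRUE for the rows a member is allowed to see. The following is a hedged sketch; the Sales table, its columns, and the region value are hypothetical names for illustration, not taken from any model in this book:

```dax
-- Static filter: members of this role see only West-region rows
'Sales'[Region] = "West"

-- Dynamic alternative: match rows to the signed-in user's identity,
-- assuming the table stores an owner email per row
'Sales'[OwnerEmail] = USERPRINCIPALNAME()
```

The dynamic form is often preferred because one role can then serve many users, with USERPRINCIPALNAME() resolving each viewer's identity at query time.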
Step 6: Save the Changes
Once you have defined the role membership and permissions, click
the "Save" button in the toolbar or press Ctrl+S to save the changes
to your tabular model.
Step 7: Test the Role
You will need to deploy your tabular model to a server, such as SQL
Server Analysis Services, to evaluate the role. Connect to the model
that has been deployed by using a client tool like Power BI. When
prompted for credentials, use those of a user who is already a
member of the role that you have defined. Check to see whether the
user can access the permitted objects and data but is prevented
from accessing the prohibited ones to validate that the permissions
are being applied as intended.
Step 8: Maintain and Update Roles
Over time, roles may need to be updated in response to shifting user
needs or changing permissions and access levels. Use Tabular
Editor to make any required adjustments to role membership and
permissions. To maintain data security and compliance, you should
routinely review and update your roles.
Table and Measure Management
The next area where Tabular Editor can save any Power BI
developer time and effort is the creation and administration of
DAX objects. If your model contains calculated tables, calculated
columns, or measures, Tabular Editor can make your life easier
almost immediately. Right-clicking any table in the model offers
options to "Create New Measure, Calculated Column, Hierarchy,
Data Column, or Data Partition," the last of which lets you divide a
table into multiple partitions. If you have Incremental Refresh
enabled, you will see that Power BI takes care of establishing the
time-based partitions for you, which lets you focus on analyzing the
data rather than worrying about how to organize it. The context
menu offers many other options for the table as well: you can hide
it, duplicate it, or view its dependents.
When you right-click a column, you can create a new display folder,
measure, calculated column, hierarchy, or data column, and you can
also establish a relationship between one column and another. This
can save a significant amount of time, especially if you also manage
relationships in Tabular Editor. You can select many objects at once
by clicking while holding down the Control key. Suppose you
construct ten new measures using the expression editor on the
right; you can deploy all of them at once by clicking a single button.
Want to move a set of measures into the display folder you just
made? You can do that too. It is fascinating to see how much more
quickly Power BI Desktop applies those adjustments when they are
sent to it via Tabular Editor than when you perform the same tasks
inside Power BI Desktop itself.
One caveat: Tabular Editor 2 does not do version control on its own,
and it does not preserve a history of the modifications you have
made. No matter how proficient you become, at some point you will
make a mistake and need to reverse or alter it, perhaps a change
that was introduced in an earlier deployment. This is why pairing
Tabular Editor with proper version control matters.
The ALM Toolkit for Power BI
The Application Lifecycle Management (ALM) Toolkit for Power
BI is a collection of tools and best practices that are intended to
assist you in the management of the creation, deployment, and
upkeep of Power BI applications in a way that is both regulated and
effective. It offers features for controlling versions of Power BI
assets, automating deployments, testing those assets, and
documenting their use.
The following is an outline of the most important aspects and
components of the ALM Toolkit:
• Version Control Integration: The ALM Toolkit integrates with
common version control systems, such as Git, so you can
manage Power BI files, such as .pbix and .pbit, along with
other relevant artifacts, such as DAX scripts and data
sources, in a version-controlled environment. Collaboration,
change tracking, and rollbacks all become possible as a
result, which supports a controlled development process.
• Automated Deployment: If you have the ALM Toolkit, you
can automate the deployment of Power BI assets across a
variety of environments, including development, testing, and
production. It gives you the ability to manage dependencies
across the many components of your Power BI applications,
establish deployment pipelines, and designate target
environments.
• Testing and Validation: The ALM Toolkit supports automated
testing and validation of Power BI reports, datasets, and data
models. During development and deployment of your Power
BI assets, you can design test cases and scenarios to
validate their correctness and dependability. This guards the
quality of your BI solution and helps uncover problems or
regressions.
• Documentation Generation: Creating documentation is an
essential part of managing applications throughout their life
cycle. The ALM Toolkit helps automatically generate
documentation for your Power BI projects, which may include
report metadata, data lineage, and data dictionaries. This
documentation makes it easier to understand the Power BI
solution and to sustain it over time.
• Data Source Management: The ALM Toolkit makes it easier
to manage data sources, which is one of the toolkit's primary
functions. Your Power BI projects can make use of a variety of
data sources, and you can build and manage connections to
those data sources. Having this capability enables you to
simply switch between many data sources throughout the
deployment process as well as centrally control the settings of
the data sources themselves.
• Collaboration and Team Development: The ALM Toolkit
includes features that support team-based development.
Many developers can work on the same Power BI project
simultaneously, and the tools make it easy to resolve
conflicts and merge changes. It makes working together
easier and better coordinates efforts within development
teams.
• CI/CD Integration: The ALM Toolkit can be connected to
CI/CD pipelines, which enables automation of build, test, and
deployment procedures. This connection streamlines the
entire development and release process, improving
productivity and reducing manual work.
The ALM Toolkit for Power BI improves the development lifecycle of
Power BI projects by offering an organized approach to version
control, deployment, testing, and documentation. This makes it
easier to track changes and improve quality. It encourages
cooperation, consistency, and dependability in the administration of
Power BI assets, which eventually leads to solutions for business
intelligence that are more effective and dependable.
Bravo
Bravo is a user-friendly tool that can help you immediately improve
the performance of your model. If you launch Bravo from Power BI
Desktop, the tool opens in the context of the Power BI Desktop file
that is currently open, consistent with how all of the other external
tools we have used behave. You can log in to the Power BI service
from inside the tool, where you will also be able to see information
on datasets in Premium workspaces.
Analyze Model
The Analyze Model window, shown in the picture below, is the very
first thing you see when you open Bravo. In a relatively short
amount of time, it gives you information on the size of your dataset,
the number of columns it contains, and, most crucially, the number
of columns that are not referenced inside the model. It is important
to be aware that Bravo cannot determine whether any reports built
on the dataset use one of these columns. But if you want to
eliminate columns from your data model, Bravo can point you
toward where to begin looking. This will make the model more
compact while also increasing its overall efficiency.

There is also a wonderful visual that shows how much of the model
is occupied by certain columns or, in the case of my example, a
cluster of smaller columns. If I pick the smaller-columns collection in
the columns list, the visual will break those columns out as well.
Everything marked in yellow is not being used in the model at this
time and might be a candidate for removal. If you have a model with
a lot of columns, the available search and filter functions can be
quite beneficial. There is also the option to download a VPAX file,
which can be read by VertiPaq Analyzer, another tool developed by
the people at SQLBI. When I'm ready to commit my modifications, I
prefer to use Bravo to locate columns that I can get rid of. After
that, I use the ALM Toolkit to record the changes made to the
columns and deploy them to an updated PBIT or BIM file.
DAX Formatting
If you don't use Bravo for anything else, it is worth getting for the
Format DAX page alone. You can see all of your measures, and if
you click Analyze Now, the DAX formatting service will read the
script. It will tell you how many of your measures contain mistakes
and how many are not structured the way the service recommends.
You can select individual measures, or pick all of them at once and
have them formatted in bulk. If you click on a measure, a window
will appear on the right side of the screen with a preview of the
formatted DAX as well as the currently applied format. If you know a
good justification for leaving the formatting of a particular measure
alone, you can do so and focus on correcting the others. There is
no excuse for unformatted DAX when you have something like
Bravo; when your DAX is formatted and someone else's isn't, I
guarantee that you will look better than they do.
Manage Dates
You can ask Bravo to generate a date table for you, as well as a
large number of time intelligence measures based on the measures
already in your model. This tool can save you a significant amount
of time by automatically creating hundreds of correctly structured
measures, which speeds up development. Having said that, there is
a catch. Two things are not allowed in your Power BI data model:
first, you cannot have the auto date/time feature enabled, and
second, you cannot already have another table marked as your date
table. Bravo will let you know whether the Manage Dates
functionality can be used with your current data model. If it can, you
will be able to rapidly design a date table complete with
predetermined time intervals, the language of your choosing, and
even the option to choose which country's holidays should be
included in the model.
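For comparison with what Bravo generates, here is a minimal hand-written calculated date table in DAX. The Dates name and the 2023 range are illustrative assumptions; Bravo's generated table is considerably richer:

```dax
-- A minimal calculated date table, sketched by hand.
-- Bravo builds a richer version of this automatically.
Dates =
ADDCOLUMNS (
    CALENDAR ( DATE ( 2023, 1, 1 ), DATE ( 2023, 12, 31 ) ),
    "Year", YEAR ( [Date] ),
    "Month Number", MONTH ( [Date] ),
    "Month", FORMAT ( [Date], "mmmm" ),
    "Quarter", "Q" & QUARTER ( [Date] )
)
```

Once created, the table should be marked as the model's date table so time intelligence functions work against it.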
If that were the only thing this tool could do, it would already qualify
as fantastic. It gets better. In the time intelligence section, you will
be asked whether you want these measures enabled and, if so,
whether you want time intelligence constructed for all of your
measures or only a subset that you choose. You can get a brief
idea of how deep the measure rabbit hole goes in Bravo by looking
at the image below. Additionally, it will generate display folders for
you, which is a very helpful feature. If you want to become better at
DAX, the generated measures are fantastic examples of how to
write DAX using time intelligence functions, and you can use them
to help push you along on your DAX journey.

Export Data
Exporting data from Bravo to Power BI is a simple procedure that
lets users apply Power BI's sophisticated visualization and
analytical capabilities to the data they have been working with in
Bravo.
Understanding Bravo's Data Export Options
Before exporting data from Bravo, it is necessary to understand the
export options at your disposal and choose the format that is most
compatible with Power BI. Bravo's export features may include CSV
(comma-separated values), Excel, JSON (JavaScript Object
Notation), and possibly other standard formats. These formats
provide flexibility and make importing data into Power BI easier. If
you need further detail on the data that was exported, Bravo can
also offer an export summary page. When the export is finished and
the file is saved, Bravo will even provide a link that you can click to
be taken directly to the file. At present it exports only to Excel or
CSV, which means it cannot export an unlimited number of rows. If
you have a procedure that requires retrieving millions of rows from a
table, you will almost certainly find that using an EVALUATE
statement in DAX Studio and exporting the results from there is the
most efficient course of action. On the other hand, if your table does
not contain millions of rows, Export Data can be an extremely useful
way to export data from a table in your data model.
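The DAX Studio route mentioned above is just a DAX query. Below is a minimal sketch; the Sales table and its Region and Amount columns are illustrative names:

```dax
-- Run in DAX Studio, then use its output options to write the
-- result to a file. Table and column names are illustrative.
EVALUATE
SUMMARIZECOLUMNS (
    Sales[Region],
    "Total Amount", SUM ( Sales[Amount] )
)
ORDER BY Sales[Region]
```

DAX Studio streams query results to the output destination, which is why it handles row counts that exceed Bravo's Excel and CSV limits.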
Preparing Data in Bravo
It is very necessary to prepare the data inside Bravo to guarantee
that the export procedure will go off without a hitch. This requires the
data to be organized and structured in a manner that is compatible
with the reporting requirements of Power BI. To keep the data's
integrity and consistency intact, it is necessary to clean the data,
format the data, and validate the data. Before exporting the data, it
can be refined using the data preparation tools provided by Bravo.
These tools include methods for filtering, sorting, and aggregating
the data.
Exporting Data from Bravo
When the data is ready, Bravo offers an export or download option
that extracts the data for use in Power BI. Find the export option in
Bravo's interface; it is typically located in the menu or toolbar.
Based on your Power BI requirements, choose the format you want
(for example, CSV, Excel, or JSON). Follow the on-screen
instructions to pick the particular data to export, such as specific
tables or queries, and then click "Export."
Conclusion
Utilizing third-party applications extends Power BI's capabilities,
giving users better access to data analysis and visualization tools.
In addition to Power BI's built-in capabilities, customizations, and
connectors, these products provide extra features and functionality.
Users can boost Power BI's functionality in many areas, including
advanced analytics, data preparation, custom visuals, and access to
a variety of data sources, by using third-party applications. These
tools enable users to get deeper insights from their data by offering
sophisticated statistical analysis, predictive modeling, data profiling,
and data cleaning capabilities. Beyond what is offered by Power BI's
basic visuals, custom visualizations developed by third parties
provide distinctive visualization options. By presenting data in
unique and aesthetically pleasing ways, these visuals enable users
to create more engaging and personalized reports and dashboards.
Commonly Used DAX Expressions
Aggregation Functions
Aggregation functions are used in DAX (Data Analysis Expressions)
to perform calculations on a set of values within a column or table.
These functions summarize or aggregate data based on specific
criteria.
Here are some commonly used aggregation functions in Power
BI:

1. SUM: Calculates the sum of a column or expression.
Example: Total Sales = SUM(Sales[Amount])
2. AVERAGE: Calculates the average of a column or
expression. Example: Average Sales =
AVERAGE(Sales[Amount])
3. MIN: Returns the minimum value in a column or
expression. Example: Min Sales = MIN(Sales[Amount])
4. MAX: Returns the maximum value in a column or
expression. Example: Max Sales = MAX(Sales[Amount])
5. COUNT: Counts the number of rows in a table or column.
Example: Number of Orders = COUNT(Orders[OrderID])
6. DISTINCTCOUNT: Counts the number of distinct values in
a column or expression. Example: Number of Customers
= DISTINCTCOUNT(Sales[CustomerID])
7. COUNTROWS: Returns the number of rows in a table or
table expression. Example: Total Rows =
COUNTROWS(Sales)
8. MEDIAN: Calculates the median value in a column or
expression. Example: Median Sales =
MEDIAN(Sales[Amount])
9. VAR.S: Calculates the sample variance of a column.
Example: Variance = VAR.S(Sales[Amount])
10. STDEV.S: Calculates the sample standard deviation of a
column. Example: Standard Deviation =
STDEV.S(Sales[Amount])
11. FIRSTNONBLANK: Returns the first value in a column for
which the supplied expression is not blank. Example: First
Non-Blank Date = FIRSTNONBLANK(Sales[OrderDate], 1)
12. LASTNONBLANK: Returns the last value in a column for
which the supplied expression is not blank. Example: Last
Non-Blank Date = LASTNONBLANK(Sales[OrderDate], 1)
13. DISTINCT: Returns a one-column table with distinct values
from a column or expression. Example: Distinct Products =
DISTINCT(Products[ProductID])
14. CONCATENATEX: Concatenates values from a column or
expression with a delimiter. Example: Concatenated Names
= CONCATENATEX(Customers, Customers[Name], ", ")
15. SUMMARIZE: Groups rows based on specified columns and
applies an aggregation. Example: Sales by Region =
SUMMARIZE(Sales, Sales[Region], "Total Sales",
SUM(Sales[Amount]))
16. RANKX: Calculates the rank of a value within a specified
column, optionally sorted in a given order. Example: Rank =
RANKX(Sales, Sales[Amount], , DESC)
17. PERCENTILEX.INC: Calculates the value at a given
percentile within a column or expression. Example: 90th
Percentile = PERCENTILEX.INC(Sales, Sales[Amount], 0.9)
18. TOPN: Returns the top N rows based on a specified
expression and ranking. Example: Top 5 Customers =
TOPN(5, Customers, Customers[TotalSales], DESC)
19. TOPN (ascending): DAX has no BOTTOMN function; to
return the bottom N rows, use TOPN with ascending order.
Example: Bottom 10 Products = TOPN(10, Products,
Products[Sales], ASC)
20. DISTINCTCOUNTNOBLANK: Counts the number of distinct
non-blank values in a column. Example: Number of
Customers = DISTINCTCOUNTNOBLANK(Sales[CustomerID])
21. SUMX: Calculates an expression for each row in a table and
sums the results. Example: Total Revenue = SUMX(Sales,
Sales[Amount] * Sales[Quantity])
22. AVERAGEX: Calculates an expression for each row in a
table and averages the results. Example: Average Revenue
per Customer = AVERAGEX(Customers,
CALCULATE(SUM(Sales[Amount])))
23. CALCULATE and ALLEXCEPT: Modifies the filter context for
an expression, removing all filters except those specified.
Example: Total Sales All Years =
CALCULATE(SUM(Sales[Amount]), ALLEXCEPT(Sales,
Sales[Product]))
24. FIRSTDATE and LASTDATE: Retrieve the first or last date in
a column, considering the filter context. Example: First Sale
Date = FIRSTDATE(Sales[OrderDate]); Last Sale Date =
LASTDATE(Sales[OrderDate])
25. CONCATENATEX and VALUES: Concatenates distinct
values from a column, considering the filter context.
Example: Concatenated Categories =
CONCATENATEX(VALUES(Products[Category]),
Products[Category], ", ")
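A frequent stumbling block in this list is the difference between SUM and SUMX. A minimal sketch, assuming a Sales table with Quantity and UnitPrice columns (illustrative names):

```dax
-- SUM can only aggregate a single column:
Total Quantity = SUM ( Sales[Quantity] )

-- SUMX iterates row by row, so it can aggregate an expression
-- computed per row before summing:
Total Revenue = SUMX ( Sales, Sales[Quantity] * Sales[UnitPrice] )
```

The same pattern distinguishes AVERAGE from AVERAGEX, MIN from MINX, and so on: the X variants take a table and a per-row expression.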

Date and Time Functions


Date and time functions in DAX (Data Analysis Expressions) are
used to manipulate and extract information from date and time
values. These functions allow you to perform calculations,
comparisons, and transformations on date and time data in Power
BI.
Here are some commonly used date and time functions in
Power BI:

1. TODAY: Returns the current date. Example: Today = TODAY()
2. NOW: Returns the current date and time. Example:
CurrentDateTime = NOW()
3. DATE: Creates a date value based on specified year,
month, and day. Example: OrderDate = DATE(2023, 6,
14)
4. YEAR: Extracts the year from a date value. Example: Year
= YEAR(Sales[OrderDate])
5. MONTH: Extracts the month from a date value. Example:
Month = MONTH(Sales[OrderDate])
6. DAY: Extracts the day from a date value. Example: Day =
DAY(Sales[OrderDate])
7. WEEKDAY: Returns the day of the week as a number,
where Sunday is 1 and Saturday is 7. Example: Weekday
= WEEKDAY(Sales[OrderDate])
8. EOMONTH: Returns the last day of the month for a given
date. Example: EndOfMonth =
EOMONTH(Sales[OrderDate], 0)
9. DATEDIFF: Calculates the difference between two dates in
specified units. Example: DaysBetween =
DATEDIFF(Sales[StartDate], Sales[EndDate], DAY)
10. TOTALYTD: Calculates the year-to-date value for a
specified expression. Example: TotalSalesYTD =
TOTALYTD(SUM(Sales[Amount]), Dates[Date])
11. SAMEPERIODLASTYEAR: Returns a table that includes the
same period as the current context, but in the previous year.
Example: SalesLastYear = CALCULATE(SUM(Sales[Amount]),
SAMEPERIODLASTYEAR(Dates[Date]))
12. CALENDAR: Generates a table with a continuous range of
dates. Example: DateTable = CALENDAR(DATE(2023, 1, 1),
DATE(2023, 12, 31))
13. LASTDATE: Returns the last date from the current context.
Example: LatestDate = LASTDATE(Dates[Date])
14. TIME: Creates a time value based on specified hours,
minutes, and seconds. Example: OrderTime = TIME(9, 30, 0)
15. HOUR: Extracts the hour from a time value. Example: Hour
= HOUR(Sales[OrderTime])
16. MINUTE: Extracts the minute from a time value. Example:
Minute = MINUTE(Sales[OrderTime])
17. SECOND: Extracts the second from a time value. Example:
Second = SECOND(Sales[OrderTime])
18. TIMEVALUE: Converts a text string representing a time to a
time value. Example: OrderTime = TIMEVALUE("09:30 AM")
19. DATEVALUE: Converts a text string representing a date to a
date value. Example: OrderDate = DATEVALUE("2023-06-14")
20. QUARTER: Returns the quarter of the year for a date value.
Example: Quarter = QUARTER(Sales[OrderDate])
21. WEEKNUM: Returns the week number for a date value.
Example: WeekNumber = WEEKNUM(Sales[OrderDate])
22. STARTOFYEAR: Returns the first date of the year for a
given date. Example: YearStartDate =
STARTOFYEAR(Sales[OrderDate])
23. ENDOFYEAR: Returns the last date of the year for a given
date. Example: YearEndDate = ENDOFYEAR(Sales[OrderDate])
24. STARTOFQUARTER: Returns the first date of the quarter for
a given date. Example: QuarterStartDate =
STARTOFQUARTER(Sales[OrderDate])
25. ENDOFQUARTER: Returns the last date of the quarter for a
given date. Example: QuarterEndDate =
ENDOFQUARTER(Sales[OrderDate])
26. STARTOFMONTH: Returns the first date of the month for a
given date. Example: MonthStartDate =
STARTOFMONTH(Sales[OrderDate])
27. ENDOFMONTH: Returns the last date of the month for a
given date. Example: MonthEndDate =
ENDOFMONTH(Sales[OrderDate])
28. NEXTDAY: Returns a table containing the next date after the
date in the current context. Example: NextDay =
NEXTDAY(Sales[OrderDate])
29. PREVIOUSDAY: Returns a table containing the date before
the date in the current context. Example: PreviousDay =
PREVIOUSDAY(Sales[OrderDate])
30. EDATE: Adds or subtracts a specified number of months to a
date value (DAX has no ADDMONTHS function). Example:
FutureDate = EDATE(Sales[OrderDate], 3)
31. DATEADD: Shifts the dates in the current context by a
specified number of intervals, such as years (DAX has no
ADDYEARS function). Example: FutureDates =
DATEADD(Sales[OrderDate], 2, YEAR)
32. DATESBETWEEN: Returns a table of dates between two
given dates. Example: DateRange =
DATESBETWEEN(Dates[Date], DATE(2021, 1, 1),
DATE(2021, 12, 31))
33. TOTALMTD: Calculates the month-to-date value for a
specified expression. Example: TotalSalesMTD =
TOTALMTD(SUM(Sales[Amount]), Dates[Date])

Time Intelligence Functions


Time Intelligence functions in DAX (Data Analysis Expressions) are
specifically designed to perform calculations and analysis on time-
based data in Power BI. These functions help in comparing,
aggregating, and analyzing data over different periods.
Here are some commonly used Time Intelligence functions in
Power BI:

1. TOTALYTD: Calculates the year-to-date value for a
specified expression. Example: Total Sales YTD =
TOTALYTD(SUM(Sales[Amount]), Dates[Date])
2. SAMEPERIODLASTYEAR: Returns a table that includes
the same period as the current context but in the previous
year. Example: Sales Last Year =
CALCULATE(SUM(Sales[Amount]),
SAMEPERIODLASTYEAR(Dates[Date]))
3. PREVIOUSYEAR: Returns a table that includes the data
for the previous year. Example: Sales Previous Year =
CALCULATE(SUM(Sales[Amount]),
PREVIOUSYEAR(Dates[Date]))
4. PREVIOUSQUARTER: Returns a table that includes the
data for the previous quarter. Example: Sales Previous
Quarter = CALCULATE(SUM(Sales[Amount]),
PREVIOUSQUARTER(Dates[Date]))
5. PREVIOUSMONTH: Returns a table that includes the data
for the previous month. Example: Sales Previous Month
= CALCULATE(SUM(Sales[Amount]),
PREVIOUSMONTH(Dates[Date]))
6. DATESYTD: Returns a table that includes all dates from
the start of the year up to the given date. Example: Sales
Dates YTD = DATESYTD(Dates[Date])
7. DATESINPERIOD: Returns a table of dates beginning at a
given start date and spanning the specified number of
intervals. Example: Sales Dates in Q1 =
DATESINPERIOD(Dates[Date], DATE(2023, 1, 1), 3, MONTH)
8. FIRSTDATE: Returns the earliest date from a given
column or table expression. Example: First Sale Date =
FIRSTDATE(Sales[OrderDate])
9. LASTDATE: Returns the latest date from a given column
or table expression. Example: Last Sale Date =
LASTDATE(Sales[OrderDate])
10. OPENINGBALANCEYEAR: Calculates the opening balance
for a specified measure at the beginning of the year.
Example: Opening Balance Year =
OPENINGBALANCEYEAR(SUM(Sales[Amount]), Dates[Date])
11. CLOSINGBALANCEYEAR: Calculates the closing balance
for a specified measure at the end of the year. Example:
Closing Balance Year =
CLOSINGBALANCEYEAR(SUM(Sales[Amount]), Dates[Date])
12. TOTALMTD: Calculates the month-to-date value for a
specified expression. Example: Total Sales MTD =
TOTALMTD(SUM(Sales[Amount]), Dates[Date])
13. DATEADD (previous month): Shifts the dates in the current
context back one month (DAX has no
SAMEPERIODLASTMONTH function). Example: Sales Last
Month = CALCULATE(SUM(Sales[Amount]),
DATEADD(Dates[Date], -1, MONTH))
14. PARALLELPERIOD: Returns a table that includes the data
for the same period in a previous period. Example: Sales
Same Period Last Year = CALCULATE(SUM(Sales[Amount]),
PARALLELPERIOD(Dates[Date], -1, YEAR))
15. DATESBETWEEN: Returns a table of dates between two
specified dates. Example: Sales Dates Between =
DATESBETWEEN(Dates[Date], DATE(2023, 1, 1),
DATE(2023, 12, 31))
16. TOTALQTD: Calculates the quarter-to-date value for a
specified expression. Example: Total Sales QTD =
TOTALQTD(SUM(Sales[Amount]), Dates[Date])
17. DATEADD (previous quarter): Shifts the dates in the current
context back one quarter (DAX has no
SAMEPERIODLASTQUARTER function). Example: Sales
Last Quarter = CALCULATE(SUM(Sales[Amount]),
DATEADD(Dates[Date], -1, QUARTER))
18. DATESINPERIOD (previous N months): Returns a table
covering the specified number of previous months (DAX has
no PREVIOUSNMONTHS function). Example: Sales
Previous 3 Months = CALCULATE(SUM(Sales[Amount]),
DATESINPERIOD(Dates[Date], MAX(Dates[Date]), -3, MONTH))
19. STARTOFYEAR: Returns the first date of the year for a
given date. Example: Year Start Date =
STARTOFYEAR(Dates[Date])
20. ENDOFYEAR: Returns the last date of the year for a given
date. Example: Year End Date = ENDOFYEAR(Dates[Date])
21. STARTOFQUARTER: Returns the first date of the quarter for
a given date. Example: Quarter Start Date =
STARTOFQUARTER(Dates[Date])
22. ENDOFQUARTER: Returns the last date of the quarter for a
given date. Example: Quarter End Date =
ENDOFQUARTER(Dates[Date])
23. STARTOFMONTH: Returns the first date of the month for a
given date. Example: Month Start Date =
STARTOFMONTH(Dates[Date])
24. ENDOFMONTH: Returns the last date of the month for a
given date. Example: Month End Date =
ENDOFMONTH(Dates[Date])
25. NEXTDAY: Returns a table containing the next date after the
date in the current context. Example: Next Day =
NEXTDAY(Dates[Date])
26. PREVIOUSDAY: Returns a table containing the date before
the date in the current context. Example: Previous Day =
PREVIOUSDAY(Dates[Date])
27. NEXTMONTH: Returns a table with all dates in the month
following the current context. Example: Next Month =
NEXTMONTH(Dates[Date])
28. PREVIOUSMONTH: Returns a table with all dates in the
month before the current context. Example: Previous Month
= PREVIOUSMONTH(Dates[Date])
29. NEXTQUARTER: Returns a table with all dates in the quarter
following the current context. Example: Next Quarter =
NEXTQUARTER(Dates[Date])
30. PREVIOUSQUARTER: Returns a table with all dates in the
quarter before the current context. Example: Previous
Quarter = PREVIOUSQUARTER(Dates[Date])
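These functions are usually combined into comparison measures. Below is a minimal sketch of a year-over-year growth calculation, assuming a Sales table and a Dates table marked as the model's date table (illustrative names):

```dax
-- Current-period sales versus the same period last year.
Sales LY =
CALCULATE ( SUM ( Sales[Amount] ), SAMEPERIODLASTYEAR ( Dates[Date] ) )

-- DIVIDE returns blank instead of an error when Sales LY is blank or zero.
YoY Growth % =
DIVIDE ( SUM ( Sales[Amount] ) - [Sales LY], [Sales LY] )
```

The same shape works with DATEADD, PARALLELPERIOD, or the TOTALYTD family, depending on which period you want to compare against.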

Filter Functions
Filter functions in DAX (Data Analysis Expressions) allow you to
apply specific filters to your data and calculate expressions based on
those filters. These functions help you narrow down your data and
perform calculations on specific subsets.
Here are some commonly used filter functions in Power BI:

1. FILTER: Returns a table that includes only the rows that
meet specified criteria. Example: FilteredTable =
FILTER(Sales, Sales[Amount] > 1000)
2. CALCULATE: Modifies the filter context by applying
additional filters to the data. Example: TotalSales =
CALCULATE(SUM(Sales[Amount]), Sales[Region] =
"North")
3. ALL: Removes filters from a table or column, returning the
entire table or column. Example: TotalSalesAll =
CALCULATE(SUM(Sales[Amount]),
ALL(Sales[Region]))
4. ALLEXCEPT: Removes filters from all columns except the
specified columns. Example: TotalSalesAllexcept =
CALCULATE(SUM(Sales[Amount]), ALLEXCEPT(Sales,
Sales[Region]))
5. RELATEDTABLE: Returns a table that is related to the
current table by a defined relationship. Example:
RelatedProducts = RELATEDTABLE(Products)
6. TOPN: Returns the top or bottom n rows from a table
based on a specified expression. Example:
Top5Customers = TOPN(5, Sales, Sales[Amount])
7. RANKX: Calculates the rank of a value within a specified
column. Example: Rank = RANKX(Sales,
Sales[Amount])
8. EARLIER: Refers to a previous row context within an
iteration of a calculation. Example: PreviousRowAmount
= Sales[Amount] - EARLIER(Sales[Amount])
9. USERELATIONSHIP: Changes the active relationship
between two tables for a specific calculation. Example:
TotalSalesWithInactiveRelationship =
CALCULATE(SUM(Sales[Amount]),
USERELATIONSHIP(Sales[Date],
Dates[CalendarDate]))
10. SELECTEDVALUE: Returns the value if there is only one
distinct value in a column within the current filter context.
Example: SelectedRegion = SELECTEDVALUE(Sales[Region])
11. HASONEVALUE: Checks if there is only one distinct value in
a column within the current filter context. Example:
HasOneRegion = HASONEVALUE(Sales[Region])
12. SUMMARIZE: Creates a summary table by grouping data
based on specified columns and calculating aggregations.
Example: SummaryTable = SUMMARIZE(Sales,
Sales[Region], "Total Sales", SUM(Sales[Amount]))
13. NATURALINNERJOIN: Performs a natural inner join
operation between two tables based on common columns.
Example: JoinedTable = NATURALINNERJOIN(Table1, Table2)
14. LOOKUPVALUE: Returns the value from a column in a table
that matches specified search criteria. Example:
ProductCategory = LOOKUPVALUE(Products[Category],
Products[ID], 123)
15. VALUES: Returns a one-column table that contains unique
values from a specified column. Example: UniqueRegions =
VALUES(Sales[Region])
16. ISFILTERED: Checks if a column or table is filtered.
Example: IsRegionFiltered = ISFILTERED(Sales[Region])
17. ISCROSSFILTERED: Checks if a column or table is cross-
filtered by filters applied to other columns or tables.
Example: IsRegionCrossFiltered =
ISCROSSFILTERED(Sales[Region])
18. CROSSFILTER: Modifies the cross-filter direction between
two tables inside a CALCULATE expression. Example:
TotalSalesBothDirections =
CALCULATE(SUM(Sales[Amount]),
CROSSFILTER(Sales[ProductID], Products[ProductID], BOTH))
19. RELATED: Returns a single value from a related table based
on a specified column. Example: ProductName =
RELATED(Products[Name])
20. KEEPFILTERS: Retains the existing filter context while
evaluating an expression. Example: TotalSales =
CALCULATE(SUM(Sales[Amount]),
KEEPFILTERS(Sales[Region] = "North"))
21. FILTERS: Returns a table of the values that are directly
applied as filters to a column. Example: ActiveRegionFilters
= FILTERS(Sales[Region])
22. REMOVEFILTERS: Removes filters from the specified table
or columns inside a CALCULATE expression. Example:
TotalSalesAllRegions = CALCULATE(SUM(Sales[Amount]),
REMOVEFILTERS(Sales[Region]))
23. TREATAS: Applies the values of a table expression as filters
to columns from an unrelated table. Example: FilteredSales
= CALCULATE(SUM(Sales[Amount]),
TREATAS({"ProductA", "ProductB"}, Sales[Product]))
24. ALLSELECTED: Returns all the values that are currently
selected in a column. Example: SelectedRegions =
ALLSELECTED(Sales[Region])
25. CALCULATETABLE: Returns a table that is filtered by one or
more expressions. Example: FilteredTable =
CALCULATETABLE(Sales, Sales[Amount] > 1000,
Sales[Region] = "North")
26. ISEMPTY: Checks if a table is empty. Example:
IsSalesEmpty = ISEMPTY(Sales)
27. USERNAME: Returns the username of the current user
accessing the data. Example: CurrentUsername = USERNAME()
28. USERPRINCIPALNAME: Returns the user principal name
(UPN) of the current user accessing the data. Example:
CurrentUserUPN = USERPRINCIPALNAME()
29. USEROBJECTID: Returns the object ID of the current user
accessing the data. Example: CurrentUserObjectID =
USEROBJECTID()
30. CUSTOMDATA: Returns the content of the CustomData property in the connection string, often used in row-level security scenarios. (There is no USERROLE function in DAX.) Example: CurrentCustomData = CUSTOMDATA()
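To see several of these filter functions working together, here is a short measure sketch; the Sales and Dates tables and their columns follow the hypothetical examples above, so adapt the names to your own model:

```dax
-- Each region's share of sales across all regions,
-- evaluated over the inactive Sales[Date] -> Dates[CalendarDate] relationship.
Region Share % =
VAR RegionSales =
    CALCULATE (
        SUM ( Sales[Amount] ),
        USERELATIONSHIP ( Sales[Date], Dates[CalendarDate] )
    )
VAR AllRegionSales =
    CALCULATE (
        SUM ( Sales[Amount] ),
        ALL ( Sales[Region] ),
        USERELATIONSHIP ( Sales[Date], Dates[CalendarDate] )
    )
RETURN
    DIVIDE ( RegionSales, AllRegionSales )
```

Placed in a table visual with Sales[Region] on rows, each row shows its region's fraction of the total; DIVIDE avoids a division-by-zero error when the denominator is blank.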
Logical Functions
Logical functions in DAX (Data Analysis Expressions) are used to
perform logical operations and evaluate conditions in Power BI.
These functions help in making decisions, creating conditional
expressions, and filtering data based on logical criteria.
Here are some commonly used logical functions in Power BI:
1. IF: Evaluates a condition and returns different results
based on whether the condition is true or false. Example:
Result = IF(Sales[Amount] > 1000, "High", "Low")
2. SWITCH: Evaluates a series of conditions and returns a
result based on the first true condition. Example: Result =
SWITCH(Sales[Region], "North", 1, "South", 2, "West",
3, "East", 4, 0)
3. AND: Checks if all specified conditions are true and
returns true or false. Example: IsAllTrue =
AND(Sales[Amount] > 1000, Sales[Region] = "North")
4. OR: Checks if any of the specified conditions are true and
returns true or false. Example: IsAnyTrue =
OR(Sales[Amount] > 1000, Sales[Region] = "North")
5. NOT: Reverses the logical value of a condition or
expression. Example: IsFalse = NOT(Sales[Amount] >
1000)
6. TRUE: Returns the logical value "true". Example: IsTrue =
TRUE()
7. FALSE: Returns the logical value "false". Example:
IsFalse = FALSE()
8. XOR: Checks if exactly one of the specified conditions is
true and returns true or false. Example: IsExclusive =
XOR(Sales[Amount] > 1000, Sales[Region] = "North")
9. IFERROR: Returns a specified value if an expression
results in an error, otherwise returns the result of the
expression. Example: Result = IFERROR(1 / 0, 0)
10. BLANK: Returns a blank value. Example: Result = BLANK()
11. ISBLANK: Checks if a value is blank and returns true or false. Example: IsBlank = ISBLANK(Sales[Amount])
12. CONTAINSROW: Checks if a table contains a row with the specified value(s). Example: IsFound = CONTAINSROW(VALUES(Products[Category]), "Electronics")
13. ISFILTERED: Checks if a column or table is filtered. Example: IsFiltered = ISFILTERED(Sales[Region])
14. ISCROSSFILTERED: Checks whether a column or table is filtered, directly or through cross-filtering from a related table. Example: IsRegionCrossFiltered = ISCROSSFILTERED(Sales[Region])
15. ISINSCOPE: Checks if a column is the current level of a hierarchy being drilled into and returns true or false. Example: IsInScope = ISINSCOPE(Sales[Product])
16. ISERROR: Checks if an expression results in an error and returns true or false. Example: IsError = ISERROR(1 / 0)
17. ISNUMBER: Checks if a value is a number and returns true or false. Example: IsNumber = ISNUMBER(Sales[Amount])
18. ISTEXT: Checks if a value is text and returns true or false. Example: IsText = ISTEXT(Sales[Product])
19. ISLOGICAL: Checks if a value is a logical value (true or false) and returns true or false. Example: IsLogical = ISLOGICAL(Sales[IsApproved])
20. IF with ISBLANK: DAX has no IFBLANK function; to substitute a value when an expression is blank, combine IF and ISBLANK. Example: Result = IF(ISBLANK(Sales[Amount]), 0, Sales[Amount])
21. COALESCE: Returns the first non-blank value from a list of expressions. Example: Result = COALESCE(Sales[Amount], Sales[Quantity], 0)
22. IN: An operator (not a function) that checks if a value is found in a specified list of values and returns true or false. Example: IsInList = Sales[Region] IN {"North", "South", "West"}
23. ISEVEN: Checks whether a number is even and returns true or false. (There is no INVERT function in DAX; to invert a condition, use NOT.) Example: IsEvenQty = ISEVEN(Sales[Quantity])
24. HASONEVALUE: Checks if a column has only one distinct value in the current filter context and returns true or false. Example: HasOneValue = HASONEVALUE(Sales[Region])
25. NOT ISEMPTY with RELATEDTABLE: DAX has no RELATEDTABLEHASDATA function; to check whether a related table has rows for the current row, combine NOT, ISEMPTY, and RELATEDTABLE. Example: HasData = NOT ISEMPTY(RELATEDTABLE(Orders))
26. AND/OR functions with multiple conditions: You can nest AND and OR functions to combine multiple conditions. Example: IsTrue = AND(Sales[Amount] > 1000, OR(Sales[Region] = "North", Sales[Region] = "South"))
27. SWITCH with multiple conditions: You can use SWITCH(TRUE(), ...) to evaluate a series of conditions and return different results. Example: Result = SWITCH(TRUE(), Sales[Amount] > 1000, "High", Sales[Amount] > 500, "Medium", "Low")
28. && (AND) and || (OR) operators: DAX also supports the && (AND) and || (OR) operators for combining conditions. Example: IsTrue = Sales[Amount] > 1000 && (Sales[Region] = "North" || Sales[Region] = "South")
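Several of the logical functions above are often combined in a single measure. The sketch below, which assumes the same hypothetical Sales table as the examples, classifies sales only when exactly one region is in the filter context:

```dax
Sales Status =
IF (
    HASONEVALUE ( Sales[Region] ),        -- guard: one region selected
    SWITCH (
        TRUE (),                          -- first true branch wins
        ISBLANK ( SUM ( Sales[Amount] ) ), "No data",
        SUM ( Sales[Amount] ) > 1000, "High",
        SUM ( Sales[Amount] ) > 500, "Medium",
        "Low"
    ),
    "Select a single region"              -- fallback for multi-region context
)
```

In a card visual, the fallback branch makes it obvious to the report reader why no classification is shown when several regions are selected at once.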
DAX Operators
DAX (Data Analysis Expressions) includes various operators that
allow you to perform mathematical, comparison, logical, and text
operations on data in Power BI. These operators help you create
expressions, perform calculations, and manipulate data within your
DAX formulas.
Here are some commonly used operators in DAX:
1. Arithmetic Operators:
● Addition (+): Performs addition between two values.
● Subtraction (-): Performs subtraction between two
values.
● Multiplication (*): Performs multiplication between two
values.
● Division (/): Performs division between two values.
● Exponentiation (^): Raises a number to the power of another. (DAX has no modulus operator; use the MOD function to get a remainder.)
2. Comparison Operators:
● Equal to (=): Compares two values for equality.
● Not equal to (<>): Compares two values for inequality.
● Greater than (>): Checks if one value is greater than
another.
● Less than (<): Checks if one value is less than another.
● Greater than or equal to (>=): Checks if one value is
greater than or equal to another.
● Less than or equal to (<=): Checks if one value is less
than or equal to another.
3. Logical Operators:
● && (AND): Performs a logical AND operation between two conditions (also available as the AND function).
● || (OR): Performs a logical OR operation between two conditions (also available as the OR function).
● NOT: Negates a logical condition.
4. Text Operators:
● Concatenation (&): Concatenates two or more text
values.
● Text Comparison Operators: DAX supports comparison
operators for text values, such as =, <>, <, >, <=, >=.
5. Set Functions (DAX implements set operations as table functions rather than operators):
● UNION: Combines the rows of two tables into a single table; duplicates are retained, so wrap the result in DISTINCT to remove them.
● INTERSECT: Returns the rows that appear in both tables.
● EXCEPT: Returns the rows of the first table that do not appear in the second.
6. Membership Operators:
● IN: Checks if a value is a member of a set or a list of
values.
● NOT IN: Checks if a value is not a member of a set or a
list of values.
7. Range Checks: DAX has no BETWEEN or NOT BETWEEN operator; to test whether a value falls within a range, combine comparison operators, for example Sales[Amount] >= 100 && Sales[Amount] <= 500.
8. Parentheses: You can use parentheses to group expressions and control the order of operations.
9. COALESCE for blank handling: DAX has no null-coalescing (??) operator; use the COALESCE function to return the first non-blank value from a list of expressions. Example: Result = COALESCE(Sales[Amount], 0)
10. Unary Plus (+) and Minus (-) Operators: Perform a unary positive or negative operation on a value. Example: PositiveValue = +Sales[Amount]; NegativeValue = -Sales[Amount]
11. Power (^) Operator: Raises a number to the power of another number. Example: Result = Sales[Amount] ^ 2
12. Concatenation Operator (&): Concatenates two or more text values (the + operator performs numeric addition, not concatenation). Example: FullName = Customers[First Name] & " " & Customers[Last Name]
13. Logical Operators in FILTER Function:
● && (AND): Performs a logical AND operation within the FILTER function. Example: FilteredTable = FILTER(Sales, Sales[Amount] > 1000 && Sales[Region] = "North")
● || (OR): Performs a logical OR operation within the FILTER function. Example: FilteredTable = FILTER(Sales, Sales[Amount] > 1000 || Sales[Region] = "North")
14. IN Operator: Checks if a value is in a specified list of values. Example: IsInList = Sales[Region] IN {"North", "South", "West"}
15. CONTAINSSTRING Function: Checks if a text value contains a specified substring. Example: ContainsSubstring = CONTAINSSTRING(Sales[Product], "ABC")
16. DISTINCT Function: Returns distinct values from a column or table. Example: DistinctValues = DISTINCT(Sales[Region])
17. VALUES Function: Returns unique values from a column or table. Example: UniqueValues = VALUES(Sales[Region])
18. SUMMARIZE Function: Creates a summary table by grouping data and calculating aggregations. Example: SummaryTable = SUMMARIZE(Sales, Sales[Region], "Total Sales", SUM(Sales[Amount]))
19. TOPN Function: Returns the top N rows based on a specified column or expression. Example: Top5Customers = TOPN(5, Customers, Customers[Total Sales], DESC)
20. RANKX Function: Assigns a rank to each row in a table based on a specified column or expression. Example: Rank = RANKX(Sales, Sales[Amount], , DESC)
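Several of these operators can appear in one expression. Here is a small calculated-column sketch, again using the hypothetical Sales table from the earlier examples:

```dax
-- Combines the IN operator, comparison operators, && and &
Order Label =
IF (
    Sales[Region] IN { "North", "South" }
        && Sales[Amount] >= 100
        && Sales[Amount] <= 500,
    "Mid-size " & Sales[Region] & " order",
    "Other"
)
```

Note how the paired comparisons stand in for the BETWEEN operator that DAX lacks, and how & builds the label text.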
Some Favorite Custom Visuals


You can enhance the capability of Power BI through the use of custom visuals, developing and inserting your own visualizations into your reports and dashboards. This capability lets you show data in creative and individualized ways, some of which may not be possible with the built-in visualizations. When you know where to look, adding a custom visual is a simple and quick procedure: in the Visualizations pane, look for an ellipsis following the last of the basic Power BI visuals. Clicking it opens a drop-down menu with options to import an existing visual from a file, get more visuals, delete an existing visual, or restore the default visuals.
To use custom visuals in Power BI, you have a few options:
1. AppSource Marketplace: Power BI has a marketplace called AppSource, in which users can explore and download custom visuals developed by other users. A broad variety of organizations and individuals have produced visuals to meet these visualization needs. You can search for visuals that meet your criteria, download them, and then use them in the dashboards and reports you create.
2. Power BI Custom Visuals SDK: If you have certain
visualization requirements that are not satisfied by the
custom visuals that are currently available on the market,
you can create your custom visual by making use of the
Power BI Custom Visuals SDK. This software development
kit gives you access to a collection of tools, libraries, and
sample code that you can use to generate custom visuals
by making use of common web technologies like HTML,
CSS, and JavaScript.
You can create visuals with distinctive features for data processing, interaction, and visualization by using the SDK. After you have created a custom visual, you can save it to your computer as a .pbiviz file and then import it into Power BI for use in your dashboards and reports. Most of the work of building a custom visual with the Power BI Custom Visuals SDK is writing code that describes the visual's behavior, rendering, and interaction with data; the SDK provides APIs and documentation to guide you through the process. The specialized visuals you design with the SDK can be distributed to other people in your company or uploaded to the AppSource marketplace to make them more widely available.
Many custom visuals were distributed as .pbiviz files before AppSource became as pervasive in the Microsoft ecosystem as it is now. If you have a .pbiviz that you wish to import from a file, whether from a custom visual you have developed in TypeScript or from an earlier version of a custom visual, you can pick that option to open an Explorer window, browse to the file, and select it for import.
If you pick the "Get more visuals" option, an overlay appears listing the custom visuals available in AppSource as well as any custom visuals that have been added to your organization. In the list of all visuals, items associated with your organization are always shown more prominently than others.
There are selection options for all visuals, organizational visuals, and visuals exclusive to AppSource, along with a search capability and the ability to filter the visuals by category: Analytics, Advanced Analytics, Change over Time, Filters, Infographics, Key Performance Indicators, and Maps. The filter categories aren't often helpful in identifying what you're searching for; this is especially true of the Analytics category, which is, in my opinion, too wide to be useful. On the other hand, you can combine search and filter into a single operation, which you might find more effective. Another advantage of these AppSource custom visuals is that each comes with an example PBIX you can use to test its functionality and learn about its capabilities. This makes for a very low barrier to entry, since you can view the visual in its most effective form, along with the use case stated by its creator, and then determine whether it may be useful to you.
Here are some popular custom visuals that Power BI users
often find valuable:
SandDance
SandDance is a powerful custom visual for Power BI that enables
you to see and explore your data dynamically and engagingly. You
can access SandDance by going to the Custom Visuals section of
the Power BI menu. It offers a one-of-a-kind visualization toolkit that
can assist you in locating patterns, trends, and insights that might
not be immediately evident when using typical chart formats.
SandDance was created by Microsoft Research as an independent
project, and it has now been included in Power BI as a custom
visual.
The following is a list of some of SandDance's most important
features and capabilities:
1. Multi-dimensional data exploration: SandDance gives you the ability to do analysis and visualization of data concurrently across various dimensions. It features interactive controls that enable you to map distinct data characteristics to visual aspects like location, color, size, and shape, among other things.
2. Fluid and interactive visualizations: SandDance is
capable of supporting a wide variety of visualization formats,
including scatter plots, bar charts, stacked charts, and many
more. You can quickly switch between multiple visual
representations and explore your data from a variety of
viewpoints. The visualizations are quite interactive, enabling
you to zoom in and out, filter the data, and sort it in a very
fluid manner.
3. Data transformations and grouping: SandDance allows
you to conduct data transformations and grouping
operations right inside the visual interface, which is a
significant benefit. To get a more in-depth understanding of
your data, this enables you to compile and summarize the
data depending on a variety of criteria.
4. Facets and insights: SandDance offers the notion of
facets, which are various perspectives or slices of your data.
Insights are a more in-depth analysis of your data. You can
construct facets that are based on various data qualities,
and then compare and analyze subsets of your data by
switching between the aspects you've created. In addition,
SandDance can automatically produce insights and
suggestions based on the patterns it finds in your data by
analyzing it.
You can discover SandDance under the Visualizations pane of the report builder in Power BI, which is where you need to go to utilize it.
You can customize the SandDance visual in Power BI by mapping
data characteristics to visual properties once you have imported your
data into Power BI, selected it, and then configured it. After the
configuration is complete, you can then explore and analyze your
data in a way that is both rich and intuitive by interacting with the
visual.
SandDance is a powerful tool for data exploration and analysis,
especially when working with complicated and multidimensional
datasets. It can assist you in gaining fresh ideas and successfully
communicating your results via the use of visualizations that are both
aesthetically attractive and engaging.
Smart Filter Pro
Smart Filter Pro, a powerful custom visual developed by OKVIZ, transforms the way users interact with and filter data in Microsoft Power BI. It improves on Power BI's default filtering capabilities, adding features and functions that help users obtain deeper insights and extract useful information from their datasets. Smart Filter Pro is designed to simplify and streamline the filtering process, making it easier to understand and more user-friendly; with it, users can quickly browse through vast datasets and zero in on the particular data subsets they want for analysis.
The search capability that Smart Filter Pro provides is one of the
app's most notable features. It gives users the ability to search for
certain values or phrases inside the filter options, which enables
quick data discovery and reduces the amount of time spent manually
reading through extensive lists of value alternatives. When working
with datasets that include a large number of unique values, or when
looking for certain outliers or anomalies, this capability is extremely
helpful. The feature of hierarchical filtering is included in Smart Filter
Pro, in addition to search and multi-select capabilities. Users are
provided with the capability to filter data based on hierarchical
connections, such as filtering data by area, nation, and city, thanks to
this functionality. Users are given the ability to go deeper into their
data and do analysis at a variety of different granularities thanks to
this feature. This skill of hierarchically filtering data is very useful
when working with complicated data structures and when analyzing
data from a variety of organizational or geographic viewpoints. Users
can expand or collapse hierarchical levels, which enables them to
traverse through the data hierarchy and concentrate on certain
subsets of data that are most relevant to the study they are doing.
Users of Smart Filter Pro also get a user interface that is both simple to use and pleasant to the eye: a visually appealing design with a straightforward, up-to-date layout that keeps the experience seamless. Users can readily customize the look of the visual so that it corresponds with the general theme and style of their Power BI reports. Smart
Filter Pro is a tool that can be used in human resources to filter
employee data based on numerous variables such as job roles,
performance ratings, or tenure. This enables HR professionals to
obtain insights into workforce demographics, identify talent
shortages, and design effective strategies for workforce planning.
Chiclet Slicer
The Chiclet Slicer is a slicer with settings for the number of rows and columns into which you wish to compress your options; these controls make it one of my favorites. Another unique feature is that you can associate images with the various "chiclets." A popular illustration would be a report covering numerous countries, with each chiclet showing that country's flag. Even without the extra chiclet formatting, I truly like how the Chiclet Slicer looks and operates. Back in 2015, when Microsoft first announced the Chiclet Slicer, the firm used automobile manufacturer logos as an example. My preferred method for storing images for Chiclet Slicers is in a folder on OneDrive or a SharePoint site; I then add a column to my data model, categorized as Image URL, that refers to those links. The Chiclet Slicer, as shown by Microsoft in the following picture, is displayed below, along with the visualization settings.
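One light sketch of that Image URL pattern (the Countries table, ISO Code column, and SharePoint address here are all hypothetical) is a calculated column that assembles each link with the & operator:

```dax
-- Builds one image link per row from a file name stored in the model
Flag Image URL =
"https://contoso.sharepoint.com/sites/assets/flags/"
    & Countries[ISO Code]
    & ".png"
```

After creating the column, set its Data category to Image URL (Column tools) so the Chiclet Slicer renders the pictures instead of showing the text of the link.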
Timeline Storyteller
The Timeline Storyteller is a custom visual for Power BI that gives
you the ability to build visual tales that are not only interactive but
also interesting by using a timeline layout. It enables you to show
facts and events in chronological order, presenting your audience
with an engaging experience of dynamic storytelling. Your data is
seen as a horizontal timeline, with each data point or event being
depicted at the proper time position. A variety of formatting options,
including colors, fonts, labels, and scales, can be modified to create
the timeline of your choosing.
It enables you to design transitions that are fluid and dynamic
between various time intervals or periods that you have chosen.
Within your timeline, you can establish numerous stages or phases,
and then you can add animated transitions between them to create a
smooth flow of information. You can construct a story with Timeline Storyteller by including text, photographs, and notes at various points along the timeline. This gives you the ability to add context, explanations, and insights connected to the data points or events being shown.
Users can zoom in and out, pan across various periods, and explore
specific data points or events by interacting with the Timeline
Storyteller visual. Users will be able to filter or dig down into
particular data depending on their interactions with the timeline,
which can be done by linking the visual to other visuals on the report
page. The visual look and overall storytelling experience may both
be improved by making use of the many customization options that
Timeline Storyteller provides. To meet your particular narrative
needs, you can adjust the layout, style, and interactivity.
Downloading and importing the Timeline Storyteller as a custom
visual into your Power BI report is required to take advantage of this
feature in Power BI. After the data has been imported, you will be
able to add the Timeline Storyteller visual to your report canvas and
customize it by choosing the data fields that are relevant to the
timeline and the information that is linked with it.
Timeline Storyteller is especially helpful in situations in which you
wish to show a sequence of events, milestones, or historical facts in
the form of a story. It enables you to engage your audience in a
visually attractive way and successfully express the narrative that
lies behind the statistics.
Synoptic Panel
The Synoptic Panel is a specialized custom visual for Power BI that
gives users the ability to present data on individualized floor layouts,
maps, or any other individualized pictures. It gives you the ability to
examine and analyze data in the context of certain places or layouts
that you choose.
Here are some key features and capabilities of the Synoptic
Panel:
1. Custom mapping: The Synoptic Panel gives you the ability to upload custom pictures, such as floor plans, maps, or diagrams, to use as the backdrop for your visualizations. You can plot and show data points by using these images as a reference.
2. Data mapping: After you have uploaded a custom picture, you can specify regions or places on the image where your data points should be displayed. These zones can be drawn and defined with the tools included in the Synoptic Panel.
3. Data binding: You can tie your data to the designated
areas on the custom picture by using the data binding
feature. This indicates that you can link data values or
measures to certain regions or forms that are shown in the
picture. The data will be visualized at that point by the
Synoptic Panel, which will do so by dynamically coloring or
shading the areas according to the data values.
4. Interactivity and drill-through: The Synoptic Panel provides interactivity, enabling users to interact with the visual and drill down into individual data points. You can define actions and tooltips that deliver more information in context when users hover over or click a data point.
5. Customization and formatting: The Synoptic Panel has
some different customization options, which may be used to
improve the data's readability and the way it appears
visually. The colors, legends, labels, and tooltips can all be
customized to meet the specifications of your design and to
make the user experience more satisfying as a whole.
To take advantage of this feature in Power BI, you must download and import the Synoptic Panel into your report as a custom visual. Once imported, you can add the Synoptic Panel visual to your report canvas and configure it by uploading your custom picture, creating areas, and binding data. The Synoptic Panel is
especially helpful in situations in which you wish to superimpose
data over custom floor plans, maps, or photographs to acquire
insights into certain places or layouts. It can be used in a variety of
use cases, such as facility management, retail shop analysis, event
planning, and many more, where it is essential to visualize data
within the context of a particular layout.
Word Cloud
The Word Cloud is a data visualization being used for an increasing number of kinds of information. A word cloud is created by tallying up the frequency with which individual words appear. The area in which we are most likely to use this visual is marketing analytics, when we are attempting to get some insight into consumer sentiment or to determine which subjects predominate a discussion. Let's imagine we have a focus group and we're analyzing what the members said about a product; in that case, we'd be interested in finding out which terms were used most often to describe the product. Survey results also lend themselves to this a good deal; it would not be difficult to create a word cloud representing the selection counts, for example. This visual enables you to pick terms to remove from the word cloud, and it also includes a preset exclusion list that you can deactivate to make the word cloud seem less congested. You can also control which words are omitted by supplying your own words to exclude in the Formatting section of the Visualization pane; by inserting commas between the terms, you can add several words to the list. When analyzing sentiment, specific words may be assigned "weights," which you can add by utilizing the Values section of the Visualization pane. In the following picture, which comes from the download sample, you can see an example of a word cloud with the default exclusions.
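If you want to supply such weights from your model (the Responses table and its Word column here are hypothetical), a minimal measure for the Values well can simply count occurrences:

```dax
-- Assumes one row per word occurrence in a hypothetical Responses table
Word Weight = COUNTROWS ( Responses )
```

Put Responses[Word] in the Category well and this measure in the Values well; the visual then sizes each word by how often it appears.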
Card with States


The "Card with States" custom visual lets Microsoft Power BI users present important measures or figures in a way that is both aesthetically attractive and interactive. By presenting data against a set of states or thresholds, the Card with States visual offers a comprehensive, easy-to-understand way to convey essential information and monitor performance. The visual is especially valuable when you need to emphasize certain data points or values according to specified states or thresholds. It lets users define several states or conditions based on their data, such as good, warning, and critical, and the color corresponding to each state gives a quick, visual representation of a measure's status or performance.
Because the visual uses colors to indicate distinct states, it gives a clear and instant comprehension of the data, letting users rapidly identify areas that need attention or additional study. The versatility with which one may define bespoke states and thresholds is one of the most notable elements of the Card with States visual. Based on their data and business needs, users can establish their own thresholds or ranges; in a sales scenario, for example, users can establish several thresholds for revenue growth, such as low, moderate, and high growth rates, anywhere from zero to one hundred percent. By configuring the states and thresholds, users can tailor the visual to their requirements and make certain that the information presented fits their business context. The Card
with States visual also has a variety of modification options, which
may be used to improve both the aesthetic appeal and the utility of
the product. Users can alter the look of the card, including the font
styles, colors, and backdrop designs, to ensure that the card's
presentation is consistent with the overall theme and style of their
Power BI reports. This customization function helps to guarantee
coherence in the report's visual presentation across its whole, which
in turn improves the quality of the user experience as a whole. The
visual representation of a Card with States can be applied in project
management to monitor the progress of a project, keep track of
important milestones, and detect possible risks or delays. The
proportion of work that has been completed on a project or certain
milestones within a timeframe can both be used by project managers
to define states. The visual gives a condensed and all-encompassing
overview of various projects by visually representing the project state
using color-coded cards. This makes it easier to effectively manage
the projects and make decisions about them.
The Card with States visual is especially helpful in operational
dashboards, where it can be used to monitor key performance
indicators (KPIs) across a variety of departments or processes.
Another place where it is useful is in the context of financial
dashboards. The visual enables users to monitor and analyze key
performance indicators (KPIs) in real time, discover areas for
improvement, and take proactive action to maximize operational
efficiency, since the states can be customized around performance
criteria.
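Conceptually, the state logic is just a threshold lookup: the measure's value is compared against the configured cut-offs and mapped to a state and a color. The following Python sketch illustrates that idea only; it is not Power BI code, and the 5% and 15% growth cut-offs and color choices are invented for illustration:

```python
def classify_state(value, thresholds):
    """Return the first state whose upper bound the value falls under.

    thresholds: list of (upper_bound, state) pairs sorted ascending;
    the last entry's bound may be None, meaning "anything above".
    """
    for bound, state in thresholds:
        if bound is None or value < bound:
            return state
    raise ValueError("thresholds did not cover the value")

# Hypothetical revenue-growth states: below 5% low, below 15% moderate,
# anything higher counts as high growth.
GROWTH_STATES = [(0.05, "low"), (0.15, "moderate"), (None, "high")]

def state_color(state):
    # Example color mapping a Card with States visual might apply.
    return {"low": "red", "moderate": "yellow", "high": "green"}[state]
```

Once the states are configured, rendering the card is just `state_color(classify_state(value, GROWTH_STATES))` applied to the current measure value.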
Radar Chart
The Radar Chart is a visual that gives you the ability to show the
data from numerous variables at once, which enables you to get a
more comprehensive picture that encompasses all of those factors.
In general, the axes should not be regarded as directly comparable,
since each variable's axis may use a different scale. For illustration purposes,
the example shown in the picture below compares the actual number
of new workers hired in each division to the anticipated number of
new employees hired in each division. Although the exact numbers are
not immediately apparent, it is easy to compare, in broad terms, how
each group's actual hiring fared against the forecast.
One of my favorite examples of a radar chart outside Power BI comes
from the video game Street Fighter 5. In this game, each character has
a radar chart showing their "stats" rated out of 5, which makes it
easy to determine which characters have which kinds of overall
advantages.
Do you prefer characters with lots of life that hit hard, or swift
characters with many special moves? The radar chart works perfectly in
that setting as well, demonstrating how versatile this type of chart
is. It is also simple to use in Power BI, since all that is required
is a category list and values for the y-axis.
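Because each axis carries its own scale, a radar chart effectively normalizes every variable onto a shared radial scale before plotting. The sketch below shows that per-axis idea in Python, assuming simple min-max scaling (the actual visual may normalize differently; the column names are made up):

```python
def normalize_axes(rows):
    """Min-max normalize each column (axis) to 0..1 so variables with
    different units can share one radar chart's radial scale.

    rows: list of dicts mapping axis name -> numeric value.
    """
    axes = rows[0].keys()
    lo = {a: min(r[a] for r in rows) for a in axes}
    hi = {a: max(r[a] for r in rows) for a in axes}
    return [
        {a: (r[a] - lo[a]) / (hi[a] - lo[a]) if hi[a] != lo[a] else 0.0
         for a in axes}
        for r in rows
    ]

# Hypothetical hiring data: actual hires vs. planned hires per division.
divisions = [{"hired": 10, "planned": 100}, {"hired": 30, "planned": 50}]
```

After normalization each division's values can be plotted on the same 0-to-1 radial scale, which is why comparisons across axes stay "broad" rather than exact.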
Hexbin Scatterplot
Within Microsoft Power BI, the Hexbin Scatterplot is a very effective
and aesthetically pleasing custom visual that can be used for data
analysis and visualization. It offers a novel and very efficient method
for visualizing and analyzing huge datasets that include a significant
number of data points that overlap one another.
When working with datasets that have a high data density, typical
scatter plots have constraints that the Hexbin Scatterplot was
created to alleviate. In a scatter plot, each data point is drawn as
an individual marker, so when many points overlap it becomes difficult
to recognize patterns, trends, and correlations within the data. This
problem is addressed by the Hexbin
Scatterplot, which organizes the data points into hexagonal bins. The
color intensity or size of each bin signifies the number of data points
that it includes. This method of binning helps decrease clutter and
offers a better depiction of the distribution of the data.
The Hexbin Scatterplot excels in many respects, but one of its most
notable qualities is its capacity to successfully manage enormous
datasets. Traditional scatter plots can become visually confusing and
difficult to comprehend when there are hundreds of thousands or even millions of
data points involved. The Hexbin Scatterplot, on the other hand, can
circumvent this obstacle since it organizes data points into
hexagonal bins. This process of data aggregation not only clears up
the visual clutter, but also maintains the underlying patterns and
trends in the data. By grouping data points, the Hexbin Scatterplot
lets users analyze broad patterns and identify regions of high or low
data density, providing a more thorough perspective of the data.
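The binning itself is standard hexagonal-grid math: convert each point to fractional hex coordinates, round to the nearest hexagon, and count points per hexagon. The Python sketch below illustrates that technique; it is not the visual's actual source code, and the pointy-top orientation and bin size are assumptions:

```python
import math
from collections import Counter

def hex_bin(x, y, size):
    """Map a point to the axial (q, r) coordinates of the pointy-top
    hexagon of the given size that contains it."""
    q = (math.sqrt(3) / 3 * x - y / 3) / size
    r = (2 / 3 * y) / size
    # Round fractional axial coordinates via cube rounding:
    # keep x + y + z == 0 by fixing the coordinate with the largest error.
    cx, cz = q, r
    cy = -cx - cz
    rx, ry, rz = round(cx), round(cy), round(cz)
    dx, dy, dz = abs(rx - cx), abs(ry - cy), abs(rz - cz)
    if dx > dy and dx > dz:
        rx = -ry - rz
    elif dy > dz:
        ry = -rx - rz
    else:
        rz = -rx - ry
    return rx, rz

def bin_points(points, size):
    """Count how many points fall into each hexagonal bin; the counts
    drive the color intensity (or size) of each hexagon."""
    return Counter(hex_bin(x, y, size) for x, y in points)
```

Plotting one colored hexagon per bin, instead of one marker per point, is what keeps the visual readable at any data volume.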
Another important component of the Hexbin Scatterplot is its color
palette. Users can clearly distinguish regions that have a high or low
data density because the intensity of the color used to depict each
hexagonal bin corresponds to the number of data points that are
included inside that bin. This color encoding makes it easier to get a
visual grasp of how the data is distributed and assists in locating
clusters, outliers, and concentration patterns within the dataset.
Because users can choose the color palette to suit their data and how
they want it presented, the result is a visually appealing and
easy-to-read representation of the data.
The Hexbin Scatterplot additionally has interactive elements that
may be used to assist the exploration of the data as well as the study
of it. The user can interact with the visualization by zooming in or
out, moving over the plot, and hovering over individual hexagonal
bins to see more information about the data points that are included
inside them.
Users are given the ability to explore certain regions of interest and
get specific insights from the display thanks to this interactive
feature. Users can detect correlations, clusters, or outliers within the
data by engaging with the Hexbin Scatterplot, which leads to deeper
data analysis and informed decision-making.
Hierarchy Slicer
The Hierarchy Slicer is a custom visual that was developed for
Power BI. It offers a straightforward and engaging method for
filtering data based on hierarchical structures. Users are given the
ability to dig down into a hierarchy and pick certain levels or nodes
within that hierarchy to filter the data that is presented in other
visualizations.

Hierarchical data structures, such as product categories,
organizational hierarchies, geographical areas, or any other
hierarchical data you have in your dataset, may be visualized with
the help of the Hierarchy Slicer. It presents the hierarchy in the form
of a tree view that can be collapsed, allowing users to traverse
between various levels of the hierarchy by expanding or collapsing
individual nodes.
When users want to filter the data shown in other visualizations on
the report page, they can pick one or more nodes from the hierarchy
to do so. The Hierarchy Slicer will transmit the chosen values to
other visuals when a node is selected. This will enable those other
visuals to update and present data that is pertinent to the hierarchy
level or node that has been selected. The Hierarchy Slicer can display
several tree structures inside a single visual, which means you can
have numerous levels of hierarchy inside a single slicer, such as product
categories, subcategories, and individual goods. To develop
complicated filtering combinations, users can traverse and choose
nodes from several hierarchies.
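Conceptually, selecting nodes in the slicer keeps the rows whose values match a selected path from the top of the hierarchy down, with descendants of a selected node included automatically. A Python sketch of that filtering logic follows; it is illustrative only, and the column and node names are made up:

```python
def filter_by_selection(rows, hierarchy, selected_paths):
    """Keep rows whose values along the hierarchy columns start with
    any selected path (selecting a node includes all its descendants).

    hierarchy: column names from the top level down, e.g.
               ["category", "subcategory", "product"].
    selected_paths: list of tuples, each a path of node values from
               the top of the hierarchy, e.g. [("Bikes",)].
    """
    def matches(row, path):
        # A shorter path matches a whole subtree of the hierarchy.
        return all(row[col] == val for col, val in zip(hierarchy, path))
    return [r for r in rows if any(matches(r, p) for p in selected_paths)]
```

Selecting the top-level node `("Bikes",)` therefore passes every bike row to the other visuals, while `("Clothing", "Gloves")` narrows the filter to one subcategory.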
The Hierarchy Slicer comes with a number of customization
options, allowing users to alter both its look and its functionality. To fit
the design and user experience needs of your report, you can adjust
the font, colors, and icons, as well as the behavior of
expand/collapse, and other visual characteristics.
You will need to download the Hierarchy Slicer and then import it as
a custom visual into your Power BI report to take advantage of this
feature in Power BI. After it has been imported, you can add the
Hierarchy Slicer visual to your report canvas and configure it by
dragging the field or fields that represent the hierarchical data in
your dataset into the visual's field well. After that, users can
interact with the Hierarchy Slicer by expanding or collapsing nodes
and choosing certain levels or nodes within the
hierarchy. The chosen hierarchy values will cause the other
visualizations on the report page to be updated, creating a dynamic
and focused view of the data.
The Hierarchy Slicer is especially helpful when dealing with datasets
that contain hierarchical connections and when you require a flexible
and user-friendly approach to filter data based on such hierarchies.
This is because the Hierarchy Slicer allows you to slice data based
on many hierarchical levels at once. It enables users to get deeper
insights and analysis by assisting them in exploring and analyzing
data at multiple degrees of granularity within a hierarchy.
Gantt chart by MAQ Software
The Gantt chart has been around for more than a century and has
become indispensable in the field of project management. It is a
time-series bar chart that lets you lay out tasks against the
expected amount of time it will take to finish them and the number of
steps that are involved.
Although AppSource provides many alternatives for Gantt charts, I
will be referring here to the one developed by MAQ
Software. The fact that you can make this version as
general or as specific as you desire is one of the aspects of it that
people like the most. This Gantt chart provides you with ten different
field options into which you can enter data. Only the task field has to
be filled out, while the others are optional. The visual's sample PBIX
has an easy-to-read hints page that people who do not work in project
management very often reference when they need to remember which
field does what. Some wish that some of the
formatting possibilities that are available in this chart were accessible
in some of the basic visuals. However, you can utilize the Gantt chart
in "off-label" use cases to borrow some of that capability for more
traditional bar chart use cases.
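Under the hood, a Gantt bar is simply a span computed from a start date, a duration, and optionally a percent-complete split. A small Python sketch of that arithmetic is below; the field names approximate the kind of inputs the visual accepts, not its exact API:

```python
from datetime import date, timedelta

def task_bar(start, duration_days, pct_complete):
    """Compute the span a Gantt bar covers, plus where the bar's
    'completed' shading ends, from start/duration/%-complete inputs."""
    end = start + timedelta(days=duration_days)
    completed_until = start + timedelta(days=duration_days * pct_complete)
    return start, end, completed_until
```

For instance, a 10-day task starting January 1 that is 50% complete renders as a bar ending January 11 with its first five days shaded as done.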

Bullet chart
A Bullet Chart is a kind of data visualization tool that is often used in
the production of business dashboards and reports to display the
performance of a particular metric in comparison to a goal or
threshold value. It gives a representation of the data that is
condensed and succinct, which makes it simple to contrast the
actual values with the values that were sought or anticipated.
The Bullet Chart consists of the following elements:

1. Target/Threshold Line: This horizontal line depicts the
value of the target or threshold against which you wish to
evaluate the actual value. It helps determine if the actual
number is lower than, on par with, or higher than the
objective.
2. Actual Value Bar: This bar of a different color indicates the
actual value or measure that you wish to see. The
magnitude of the actual value is represented by the length
of the bar in this graph.
3. Qualitative Ranges: The chart's background is often
separated into several colored ranges or zones that reflect
qualitative levels. For instance, color coding can signify
poor, good, or great levels of performance.
4. Performance Measures: Additional indicators, such as the
performance from the year before or benchmarks for the
industry, can be added to the Bullet Chart to offer additional
context and comparison.
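Reading a bullet chart amounts to two comparisons: which qualitative range the actual value falls into, and how the actual value stands against the target. A Python sketch of that evaluation follows; the range labels and cut-offs are invented for illustration:

```python
def bullet_status(actual, target, ranges):
    """Classify an actual value into its qualitative range and report
    performance against the target line.

    ranges: ascending list of (upper_bound, label); last bound is None,
    meaning "anything above".
    """
    label = next(l for bound, l in ranges
                 if bound is None or actual < bound)
    return {
        "label": label,
        "pct_of_target": actual / target,
        "met_target": actual >= target,
    }
```

Note that the two readings are independent: an actual value can sit in the "great" qualitative zone while still falling short of the target line, which is exactly the nuance the bullet chart is designed to show.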

A Bullet Chart makes it easy to evaluate performance, identify gaps,
and determine whether the actual value meets or exceeds the objective
or threshold. It is especially helpful for monitoring key performance
indicators (KPIs) or tracking progress toward goals. Power BI does
not include a bullet chart among its core visuals, but you can
explore bullet chart custom visuals from AppSource, which provide a
range of capabilities and customization options.
Sunburst Chart
The Sunburst Chart is a captivating and informative custom visual in
Microsoft Power BI that provides a unique and visually appealing
way to represent hierarchical and categorical data. With its circular
layout and interactive features, the Sunburst Chart enables users to
explore and analyze complex data structures, identify patterns, and
gain insights into the relationships between different levels of a
category.

The Sunburst Chart was developed to depict hierarchical data in a
radial pattern, much like a multi-level pie or donut chart. It is
composed of
concentric rings that stand for the many levels of a category or
dimension, and each of these rings is subdivided into sectors that
represent the subcategories or values that are included inside each
level. The size of each sector indicates the proportion or quantity
of the value it represents. Users can
quickly grasp the hierarchical links and move through the data
structure thanks to the circular style of the Sunburst Chart. The
Sunburst Chart provides an interactive experience by way of tooltips
and highlighting. Tooltips are little pop-up windows that appear when
you hover your mouse over a section of data. They provide extra
information about the data point, such as contextual details, which
improve the user's comprehension of the data.
To emphasize certain data points or compare various segments, users
can also highlight particular sectors or pathways within the
graphic. Users are given the ability to engage with the chart to
acquire deeper insights from the data, which is made possible by the
interactivity that helps data exploration and analysis. Another key
component that contributes to a better comprehension of the data is
the color encoding that is included in the Sunburst Chart. Users can
give different colors to the sectors depending on certain data
properties or categories. This makes it easier to differentiate
between the various segments and identify them as separate
entities. Users can more easily see trends, recognize abnormalities,
and compare numbers across several levels and subcategories
thanks to color encoding. Because users can choose the color palette
to suit their data and how they want it presented, the result is a
visually appealing and easy-to-read representation of the data.
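Geometrically, each ring of a sunburst subdivides its parent's angular span in proportion to the children's values. The Python sketch below shows that layout computation; it is illustrative only and assumes a simple nested value/children structure that is not Power BI's internal representation:

```python
def sector_angles(node, start=0.0, sweep=360.0):
    """Assign each node an angular span proportional to its value,
    with children subdividing their parent's span (one ring per level).

    node: {"name": ..., "value": ..., "children": [...]}.
    Returns a flat list of (name, depth, start_angle, end_angle).
    """
    out = []
    def walk(n, start, sweep, depth):
        out.append((n["name"], depth, start, start + sweep))
        total = sum(c["value"] for c in n.get("children", []))
        cursor = start
        for c in n.get("children", []):
            # Each child gets a share of the parent's sweep
            # proportional to its value.
            share = sweep * c["value"] / total
            walk(c, cursor, share, depth + 1)
            cursor += share
    walk(node, start, sweep, 0)
    return out
```

A category holding 75% of its parent's total therefore occupies 75% of the parent's arc in the next ring out, which is what makes proportions readable at every level.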
The Sunburst Chart is a tool that can be used in the field of
financial analysis to assess expenditure breakdowns. In this chart,
each level represents a separate category of expenses. Financial
analysts can find possibilities for cost optimization and budget
allocation by visually representing the proportions of and linkages
between different types of expenses.
This chart can be applied in project management to examine
resource allocation and provide a visual representation of project
hierarchy. Each level can be used to represent different project
stages, activities, or resources, which provides a holistic perspective
of how the project is progressing and how its resources are being
used. The Sunburst Chart is a tool that can be used by project
managers to discover bottlenecks, effectively allocate resources, and
monitor the success of the project across several levels.
Conclusion
Power BI is a versatile and powerful business intelligence tool that
empowers organizations to leverage their data and derive actionable
insights. With its intuitive interface, extensive connectivity options,
robust data modeling capabilities, and rich visualization features,
Power BI enables users to transform raw data into meaningful
reports and dashboards, driving informed decision-making and
fostering a data-driven culture within organizations. Throughout this
guide, we have explored the key features, functionalities, and
benefits of Power BI, as well as discussed various tips and best
practices for effectively utilizing the platform.
Power BI offers a wide range of capabilities, including data
visualization, data modeling, and data analysis, allowing businesses
to transform raw data into actionable insights. Its user-friendly
interface and intuitive drag-and-drop functionality make it accessible
to users of all skill levels, enabling them to create interactive reports
and dashboards without extensive coding knowledge.
Furthermore, we discussed the importance of data visualization in
conveying information effectively and facilitating better decision-
making. Power BI provides a rich set of visualization options,
including charts, graphs, maps, and tables, allowing users to present
data in a visually appealing and interactive manner. Additionally, the
guide has highlighted various design principles and best practices for
creating impactful and user-friendly dashboards and reports. Lastly,
we have touched upon the collaboration and sharing capabilities of
Power BI, which enable users to collaborate with team members,
share insights, and distribute reports securely within the organization
or externally. This fosters a data-driven culture and ensures that
stakeholders have access to the right information at the right time.
By leveraging Power BI's connectivity options, users can seamlessly
integrate data from various sources, such as databases, cloud
services, and spreadsheets, into a single consolidated view. This
unified view enables organizations to gain a holistic understanding of
their data and make informed decisions based on accurate and up-
to-date information. By following the tips and best practices outlined
in this guide, users can maximize the value of Power BI and unlock
its full potential in their business operations.

We’d love to hear from you!


“Thanks for reading. If you enjoyed this book,
please consider leaving an honest review.”
INDEX

1
1stCourseInDepartment , 160

A
A Bullet Chart , 295
A cube-based method , 12
A fundamental kind of aggregation , 114
A Quick Rundown, 50
A Quick Rundown of the Other Panes, 50
A Self-Service BI Tool, 13
A single-line ribbon , 27
A single-line ribbon, when collapsed, saves you space , 27
A stacked area chart , 92, 93
A Top N–type filter , 50
AAD , 8
ability to build connections , 7
ability to publish , 3
ability to refresh data , 2
Access Restrictions , 238, 239
access to real-time analytics , 4
Accessibility , 28, 189, 190, 231
account input from users , 11
accurate and up-to-date information. , 298
Acquire actionable insights , 165
Activation of the License and License Renewal , 239
Actual Value Bar , 295
Add additional fields , 92
Add External Tools, Remove External Tools, and Modify Display Order, 242
Add lines to the chart , 91
Add more lines , 91
Adding Users to a Workspace, 210
Adding Users to Roles for RLS Implementation, 225
Addition , 278
addition of brand-new capabilities , 11
Additional Controls , 182
additional insights , 65, 99, 110
additions to SQL Server , 12
Address , 54, 67
address the issue of global hunger , 66
Advanced Analysis , 136
Advanced Analytics , 14, 177, 282
Advanced Calculations , 19, 129
Advanced options , 33
ADVANCED REPORTING TOPICS, 164
Advanced-Data Analysis , 16
advantage of powerful analytical and data , 8
advantage of powerful analytical and data engineering capabilities , 8
advantages , 5, 8, 17, 26, 27, 110, 111, 112, 136, 145, 234, 290
AdventureWorks , 166
Aggregated Functions , 128
aggregating options , 55
Aggregation functions , 113, 267
Aggregation Functions, 267
AGGREGATIONS, 108
aggregations with filters , 115
Aggregations with Filters , 115
Aggregations, More than Some Sums, 114
AI integration , 11
AI-Powered Visuals, 164
ALM , 243, 260, 261, 262, 263
amount of data points , 292
An improved accessibility , 28
An intuitive Themes gallery , 26
Analysis Services , 12, 17, 108, 204, 229, 234, 237, 253, 259
Analysis Services Multidimensional, 12
analytical queries , 18
Analytics, 47, 75, 76, 89, 282
analytics tools of Power BI. , 2
analyze data , 2, 9, 15, 16, 67, 107, 148, 287
Analyze Model, 262
analyzing data , 14, 72, 97, 271, 284, 294
Analyzing Data , 157
Analyzing Data Dependencies , 157
animations , 5
anthropological categories , 54
appealing and interactive manner , 298
Apple and Google-based mobile , 3
Apple and Google-based mobile devices , 3
apply business rules , 5
apply data transformations , 35
Apply sorting , 85, 92
Applying Conditional Logic , 135
Applying Multiple Filters , 132
Applying the Consolidation , 145
Applying transformations , 143
Apps, 8, 9, 201, 228
AppSource or developing , 8
ArcGIS Maps , 47
Area chart , 46, 89
Area Chart, 92
areas of natural language , 10
areas of natural language comprehension , 10
areas of natural language comprehension and sentiment analysis , 10
Arithmetic Operators , 278
ask questions , 2, 5, 171
Assign data fields , 85, 91
Assign data fields to appropriate axes , 85
assist with navigating the report , 39
Autodetect during load, 148
automate processes , 9
automate repetitive processes , 9
Automated Deployment , 261
Automated Reports, 12
Automatic Generation , 128
Automatic relationship updates, 155
automation , 9, 262
automobile manufacturer logos , 285
Average, 114, 117, 118, 130, 131, 134, 267, 268
average for Ms. Avina , 118
AverageOfficeHoursAttended , 118
axis section , 85
Azure Active Directory , 8, 56, 223
Azure Cognitive Services , 9
Azure Data , 8, 232
Azure Data Factory , 8, 232
Azure Machine Learning , 8, 62, 232
Azure map , 46
Azure services , 8, 14, 56, 232
Azure SQL , 7, 84, 204
Azure Synapse , 8, 204
Azure Synapse Analytics , 8

B
backend SQL Server database , 12
Bar Chart , 80, 81
bar charts , 2, 73, 80, 81, 82, 89, 101, 107, 157, 283
Barcode , 54
Bars, Columns, and Lines, 162
behavior's interactive nature , 77
Benefits and Applications of Relationships, 69
Benefits and Significance , 164
benefits of Power BI , 298
bespoke solutions , 10
BI , 1, 2, 3, 4, 5, 6, 7, 8, 10, 11, 12, 13, 15, 16, 17, 19, 21, 22, 23, 24, 25, 30, 33, 35, 38, 40,
45, 47, 48, 51, 55, 56, 57, 59, 61, 63, 66, 67, 68, 69, 72, 73, 74, 76, 78, 81, 89, 103, 104,
107, 108, 115, 118, 119, 124, 128, 130, 140, 142, 143, 146, 148,152, 155, 156, 163, 164,
165, 168, 171, 174, 180, 183, 184, 185, 186, 187, 188, 189, 191, 194, 195, 196, 197,
198, 199, 201, 203, 204, 205, 206, 207, 209, 211, 212, 213, 214, 216, 217, 218, 223,
225, 228, 229, 230, 232, 234, 236, 237, 239, 240, 241, 242, 244, 245, 247, 249, 254,
255, 256, 257, 259, 260, 261, 262, 265, 266, 280, 281, 283, 284, 286, 287, 289, 294, 298
Black or African American students , 93
Blank page , 37
blank version , 38
bookmarks and buttons , 77
Bookmarks and Buttons , 77
Bookmarks window , 50
Bravo, 243, 262, 263, 264, 265, 266
broad selection of data , 3
bubbles , 96, 104, 168
budget allocation , 297
build prediction models , 5
Building Relationships, 148
Bullet chart, 295
business intelligence , 7, 10, 12, 13, 14, 15, 20, 21, 22, 23, 67, 139, 262, 298
business intelligence platform , 7
business intelligence skills , 15
business objectives and scenarios , 4
business on a holistic level. , 38
Business Ops , 242, 243, 245, 246
Business-Specific Metrics , 130
Buttons element , 39

C
CALCULATE, 109, 131, 132, 133, 135, 137, 138, 139, 140, 268, 269, 271, 272, 273, 274,
275
CALCULATE and FILTER , 109
Calculated Column , 259
Calculated columns , 20, 110, 111
Calculated Columns, 110
Calculated Tables, 111
calculated values , 110
Calculations , 16, 20, 21, 30, 37, 41, 110, 111, 115, 128, 129, 130, 133, 136
capacity for content sharing , 3
capacity for content sharing and collaboration , 3
Card , 46, 105, 160, 289, 290
Card with States , 289
Cardinality, 69, 148, 149, 153
Carnegie Hall. , 175
carry out complex calculations , 2, 13
certified dataset , 200
Change detection , 41
Change the interaction behavior, 79
Changing data types , 142
Charts , 5, 80, 82, 83, 86, 87, 94, 95, 96, 97, 99, 100, 101, 244
chart's colors , 81
charts provide the data , 80
Chat in Teams , 196
Chatbots created using Power Virtual Agents , 10
Chiclet Slicer , 285
Choose a Data Source , 56
Choose a report canvas , 90
Choose and Transform the Data When You Import, 142
choosing the data source , 56
choropleth maps , 96
CI/CD Integration , 262
cleaning activities , 5
clear and succinct , 1
client interaction , 9
Clipboard , 30, 31
Cloud Deployment, 231
cloud services , 1, 4, 7, 189, 298
Cloud sources , 6
cloud-based platform , 11
cloud-based service , 10
Clustered Bar and Column, 83
Clustered bar chart , 46, 82
clustered bar/column , 84, 85, 86
Clustered column chart , 46, 82
Collaborate and engage with the report , 212, 214
Collaboration & Sharing , 16
Collaboration and Expert Input , 157
collaboration and management , 11
collaboration and sharing , 8, 23, 74, 235, 298
Collaboration and Team Development , 262
collaborative capabilities , 8, 238
collected data , 5
collections of dashboards , 8
Column Chart , 80, 81, 93
Column distribution , 65
column format , 80
Column profile checkboxes , 65
Column quality , 65
Column tools , 52, 53
Columnar Storage , 18
columnar structure , 19
columns and measures , 5, 13, 46, 112, 236
columns of data , 6, 17, 62, 108
combining databases , 5
Comma-separated values , 6
Comma-Separated Values , 265
comparing data , 21, 80, 83, 104, 115
comparing data across multiple periods , 21
comparing patterns , 86, 87
Comparison of Proportions , 100
Comparison Operators , 278, 279
Comparisons and Trends , 170
Compatibility and Versioning , 185
compile data , 14, 112
complex parameters , 3
complex Power Query Editor , 5
complicated data models , 3
complicated information , 2
Comprehensive Analysis , 69
comprehensive array of tools , 4
comprehensive array of tools and services , 4
comprehensive collection , 2, 16, 40, 145
comprehensive collection of dynamic visualizations , 2
comprehensive library of functions , 20
comprehensive library of functions and operators , 20
comprehensive overview of Power BI , 298
Compression , 18, 19
computer processing power , 12
Concatenation , 279
Conclusion, 23, 54, 72, 107, 140, 163, 188, 216, 241, 266, 298
conditional formatting , 14, 120
Conditional Formatting , 120
conditional formatting and charting capabilities , 14
Conditional logic , 109
conduct operations such as data purification , 13
Configure efficient data refresh schedules , 234
Configure more options, 153
Configure the Chart , 88
Configure the report's sharing settings , 211
Connect to the Data Source , 56
Connect to the data source you want to use , 81
connecting options , 3, 237
Connecting to Databases , 57
connectivity paths , 7
Connectivity to Data Sources , 189
Consider Future Growth , 235
Consider your options and get new perspectives , 166
Considerations for Consolidating Tables, 145
consistent upgrades , 11
consolidated view of significant metrics , 7
Consolidating Tables with Append, 143, 145
construct bespoke software , 9
construct custom business apps , 9
construct datasets , 7
construct datasets for analysis , 7
construct individualized columns based , 6
construct intelligent chatbots , 9
construct interactive applications , 9
construct relationships , 5
construct workflows , 9
construct your visualizations , 6
Contains (text) , 49
contemporary business environment , 11
content packs , 11, 190
contextual intelligence , 20
Continent , 54
Continuous Monitoring and Improvement , 238
Contribution Analysis , 170
conventional computer code , 9
Conversion Analysis , 97
Copy and distribute the link , 211
Count and Count (Distinct), 123
Country , 54
County , 54
Create, 90, 138, 139, 148, 149, 154, 157, 193, 194, 195, 196, 209, 212, 214, 220, 221,
226, 228, 235, 237, 245, 258, 259
create a comprehensive table , 66
Create a measure to calculate the customer retention rate , 139
Create a relationship manually, 149
create chatbots , 9
create interactive reports , 189, 298
create sophisticated metrics , 5
Creating an App, 214
Creating and Managing Relationships, 68
Creating calculated columns , 142
Creating Roles, 257
creation and design , 3
creation and design of interactive reports , 3
Cross filter direction, 148, 149, 153
CSV , 6, 57, 84, 142, 252, 265, 266
Custom format strings , 28
custom graphics , 36
Custom mapping , 287
Customer Analysis , 127
customer retention rate , 138, 139
CustomerName , 125, 126
customers can extract data , 4
customers can extract data from a wide variety of platforms , 5
Customization and formatting , 287
Customization Options , 97
Customize the chart , 81, 85, 91
Customize the Chart , 88
customized and interactive summary of pertinent insights , 7
Cutting and Pasting , 30

D
Dashboard Visualizations , 189
dashboards , 1, 3, 6, 7, 8, 11, 14, 16, 57, 69, 73, 74, 95, 141, 143, 184, 189, 190, 197, 206,
216, 221, 227, 231, 235, 238, 240, 281, 290, 295
dashboards offer real-time data updates , 7
data analysis , 1, 2, 5, 11, 13, 14, 16, 18, 20, 21, 22, 23, 68, 69, 72, 87, 107, 108, 111, 114,
119, 123, 125, 128, 129, 134, 136, 138, 188, 189, 214, 232, 233, 254, 266, 291, 292, 298
data analysis and business intelligence , 13
data analysis experience , 1
Data Analysis Expressions , 3, 5, 13, 15, 16, 19, 20, 108, 119, 129, 134, 267, 269, 271,
273, 276, 278
Data Analysis Toolpak , 14
data analyst working , 1
Data binding , 287
Data Cleaning and Quality Control , 119
Data Column , 259
data cubes , 12
Data Exploration and Discovery , 69
Data Export Options, 265
Data Governance , 56, 231
Data Hub, 195
Data Import , 13, 15, 57, 141
Data Import and Transformation , 13
Data Labels and Exploding Slices , 101
data labels or trendlines , 81
data management options , 40
data manipulation , 15, 35, 113, 142, 183
Data manipulation , 15
data manipulation capabilities , 15
Data mapping , 287
data mart , 17, 229
Data Modeling , 13, 156, 233, 236
Data modeling and Relationships , 15
Data Monitoring , 127
Data Normalization , 120
data over a particular duration , 95
Data Partition , 259
Data Preparation , 88
Data Preview section. , 65
Data professionals , 183
Data Quality , 145
Data Range and Boundaries , 119
Data Reduction , 97
Data Refreshing , 190
Data Refreshment and Gateway Management , 234
Data Relationships , 146
data retrieval , 1, 18, 19, 198
Data Security , 56, 230
Data Source Connectivity , 237
Data Source Connectivity and Gateway Management , 237
Data Source Considerations , 233
Data Source Management , 261
Data source settings , 35, 59
data sources , 1, 4, 7, 8, 9, 10, 14, 15, 16, 22, 23, 31, 32, 35, 55, 56, 58, 68, 70, 109, 155,
156, 185, 186, 189, 198, 199, 209, 216, 230, 231, 233, 234, 236, 237, 238, 253, 261, 266
data store technology , 17
Data Transformation and Cleanup (Optional) , 145
data transformation capabilities , 3, 23
Data transformations and grouping , 283
Data Understanding , 156
Data View, 52
databases , 1, 4, 7, 13, 15, 22, 55, 56, 63, 84, 189, 216, 228, 229, 298
data-driven , 2, 3, 4, 11, 22, 24, 54, 88, 110, 163, 164, 298
Dataflows in Shared Workspaces, 208
Dataset Image , 200
datasets or dataflows , 226
Dataverse button , 35
Date and time functions , 113, 269
DAX , 3, 5, 13, 15, 16, 17, 19, 20, 21, 37, 41, 42, 44, 51, 108, 109, 110, 111, 112, 113, 119,
121, 123, 127, 128, 129, 130, 131, 134, 135, 136, 138, 139, 140, 178, 180, 229, 242,
243, 245, 246, 247, 248, 249, 250, 251, 252, 253, 254, 256, 259, 261, 263, 264, 265,
267, 269, 271, 273, 276, 278, 279
decision-making , 2, 11, 74, 107, 110, 120, 139, 148, 163, 164, 188, 190, 292, 298
Decision-making , 176
decision-making and fostering , 298
Decomposition tree , 47, 164
Decomposition Tree, 169, 170, 171
Define Role Membership , 258
Define Role Permissions , 258
Define the measure , 170
Defining Relationships , 68
Deployment and Maintenance , 184
Deployment Pipelines, 203
derive actionable insights , 298
design sophisticated queries , 25
Desktop application's seamless integration , 6
Detect Data Type button , 62
develop aesthetically attractive reports , 3
develop complex data models , 22
development of Power BI , 11
development of Power BI is the outcome of Microsoft's commitment , 11
development or progression of certain variables , 127
developments in the industry , 11
dialog flows for the chatbots , 9
dimensional Cartesian coordinate , 98
dimensional Cartesian coordinate system , 98
Direct connections , 7
direct query , 7
DirectQuery , 33, 41, 70, 152, 233, 237
DirectQuery circumstances , 41
discover patterns , 2, 121
DISTINCTCOUNT , 139, 267
distribution , 8, 12, 65, 95, 96, 99, 100, 104, 115, 121, 123, 169, 212, 247, 292
distribution of customers , 96, 104
diverse dimensions , 98
diverse viewpoints on the data , 114
Document and Communicate , 177
documentation , 44, 45, 152, 156, 195, 203, 215, 226, 228, 261, 262, 281
Documentation Generation , 261
Does not contain (text) , 49
Does not start with (text) , 49
Donut chart , 46
Donut Chart , 100
Donut Charts , 96, 99
Donut Chart's central hole , 100
Dot Plots , 96
Downloading and importing the Timeline , 286
Drag and drop the category field , 85
Drag and drop the desired fields , 81
dragging and dropping fields , 5
drill through , 48, 75
Drill-through , 77
Duplicate page , 37
Dynamic ribbon content , 27
Dynamic ribbon content based on your view , 27
Dynamics 365 , 8, 9

E
Ease of Collaboration and Sharing , 190
Edit a relationship, 150
Edit Interactions , 78
edit the measure or column , 28
Editing relationships feature , 150
Editing relationships using different methods, 152
effective business intelligence , 4
effective business intelligence application , 4
efficient memory , 19, 20
efficient processing of massive datasets , 22
emphasis on AppSource , 36
Enable load to report , 59
Enable the visual interaction controls, 78
encoding the data values , 19
Encourage user engagement and collaboration , 237
Engage stakeholders , 236
engaging reports , 1, 5, 54
Enhancing portability , 3
Enhancing portability and adaptability , 3
Enhancing portability and adaptability in data consumption , 3
environments of enterprises , 12
essence of data modeling , 6
establish a connection to a database , 6
Establish data governance policies , 236
establishing associations , 22
Evaluate user requirements , 240
examine the data , 2
Excel and SharePoint , 10
Excel's advanced analytics features , 14
Excel's built-in functions and tools , 14
Excel's built-in version control , 14
exceptional connection , 22
expand Power BI's capabilities , 11
explicit measures , 129, 140
exploration of geographical information , 96
Explore and Visualize the Data , 57
explore data , 3, 21, 77, 94, 120
explore trends , 14
explore your local file system , 57
Export Data, 265, 266
Exporting and Importing , 31
Extensibility with Add-Ins , 14
extensive coding , 298
extensive coding knowledge , 298
extensive connectivity options , 23, 298
extensive data connectivity , 4
External Tools Section, 45
Extraction of Data , 55

F
Facebook Messenger , 10
Facets and insights , 283
favorite objects , 193
few popular options , 56
Fields, 29, 41, 45, 47, 48, 52, 71, 72, 75, 76, 81, 88, 136, 150, 161, 179
Fields and Filters Panes, 48
Filled map , 46
FILTER , 135, 138, 273, 279
filter and highlight icons , 78
Filter Context, 137
Filter functions , 113, 273
Filter Functions, 273
filtering , 5, 16, 18, 19, 35, 49, 55, 67, 68, 69, 76, 77, 79, 85, 92, 101, 109, 110, 131, 137,
142, 146, 153, 154, 161, 172, 187, 193, 266, 276, 284, 293
Filtering , 77, 142
filtering rows , 5
Filters , 29, 44, 45, 48, 49, 107, 161, 223, 282
Filters and Fields panes , 29
Filters on all pages , 49
Filters on this page , 49, 107
financial analysis , 87, 297
Financial analysts , 297
Financial functions , 113
First, Last, Earliest, and Latest, 125, 127
flexibility across platforms , 6
Flexibility and Customization , 95
flexible data , 1
flexible data modeling , 1
flexible data modeling capabilities , 1
flexible formatting options , 3
Fluid and interactive , 283
Focus on Proportions , 97
folder structure , 227
Forecasting and Planning , 176
Format, 39, 40, 47, 75, 76, 78, 107, 252, 263
Format button , 40
format string , 28
Formatting section of the Visualization pane , 288
formula engine , 18, 21
Formula language and calculation engine , 20
formulae , 13, 14, 16, 20, 108, 253
formulas , 6, 34, 59, 278
fosters a data-driven culture , 298
Free per-user license, 218
FROM RAW DATA TO REPORT, 141
fundamental DAX syntax , 131
fundamental level , 13
Funnel chart , 46
Funnel Chart, 96
G
Gantt chart by MAQ Software, 294
Gather user feedback , 238
Gauge , 46, 105
Generate a shareable link , 211
generate a single report , 240
generate data representations , 5
GENERATESERIES statement , 178
geographic viewpoints , 284
Get apps , 201
Get to Know Business Ops, 241
Getting data , 6
Getting data from your sources , 6
Getting Our Data, 55
Google Analytics , 7, 8
graphs , 5, 14, 73, 86, 298

H
Handling Ambiguous Relationships , 69
heavy programming , 15
hectic atmosphere , 1
Help Section, 44
Hexbin Scatterplot, 291, 292
hidden patterns , 1
hidden patterns and correlations , 1
Hierarchical Analysis , 170
Hierarchical data structures , 293
hierarchies , 13, 16, 21, 22, 33, 48, 102, 170, 293, 294
Hierarchy , 259, 293, 294
high data density , 291
highlighted various design principles , 298
high-performance compression methods , 19
holistic understanding of their data , 298
Home and Browse, 193
Home ribbon , 29, 56, 204
Home section , 29, 193, 194
Home Section of the Ribbon, 29
Home tab , 29, 37, 58, 59, 142, 143, 144, 145, 147, 151
Home view , 29
horizontal and vertical bars , 80

I
Icons and functionality , 26
Identification of Common Fields , 68
Identify influential factors , 165
Identifying Data Sources , 55
Identifying Our Relationship Columns, 156
IF and SWITCH , 109
Image URL , 54
Image URLs , 285
impactful and user-friendly dashboards , 298
impactful and user-friendly dashboards and reports , 298
implement custom calculations , 3
implement data security standards , 8
Implement effective data modeling techniques , 233
Implement row-level security , 240
Implement version control , 237
implicit measures , 128, 129
import data , 7, 13, 14, 15, 56, 57, 142
Import relationships from data , 155
Important, 17, 121, 152, 160
Important New Functionality, 17
important visual interactive , 77
importing and linking large volumes of data , 15
IMPORTING AND MODELING, 55
importing and modeling data , 55
IMPORTING AND MODELING OUR DATA, 55
Importing data , 6, 265
Importing from Files , 57
Importing the Data, 56
improve data analysis , 5
improve measures , 109
improve the data's overall aesthetic appeal , 14
Improve the overall customer experience , 165
Improved look, feel, and organization , 26
incremental refresh , 7
independent project , 283
in-depth comprehension of your data , 2
in-depth data modeling , 13
indeterminable amount of time , 294
individual elements , 9
individual user licenses , 217
Infographics , 282
Information functions , 113
information regarding the goals , 8
Inline DAX editor , 41
in-memory columnar data storage , 17
In-Memory Technology , 18
Insert , 29, 30, 36, 37, 38, 159
insights into the variability , 115
inspecting the Date column , 53
install specialist add-ins , 14
Integration and Ecosystem , 232
Integration of Data , 55
Integration of the Parameter, 180
Integration with External Data Sources , 14
Integration with Other Visuals , 95
intelligence visualizations , 38
intelligent business , 4
intelligent business decisions , 4
Interact with the chart , 81, 86
Interaction between Row and Filter, 138
interaction experience , 65, 79
interactive data visualizations , 16
Interactive Exploration , 95, 170
interactive features , 5, 77, 296
interactive reports , 2, 4, 7
Interactive Reports , 189
interactive reports and dashboards , 3, 298
interactive summary , 7
interactive visualizations , 21, 22, 57, 109, 283
Interactivity and Drill-Down , 97, 101
Interactivity and drill-through , 287
interface for customer engagement and support , 10
intuitive interface , 1, 216, 298
intuitive user interface , 5, 35
Is hidden flags , 71
ISOM 210 course , 116
Iterative Modification , 182
Its components, 6

J
JavaScript Object Notation , 265
JSON , 63, 244, 265, 266
JSON or XML file , 63

K
keep the amount of data movement , 19
KEEPFILTERS , 275
Key Benefits of the Decomposition, 170
Key influencers , 47, 164, 166
Key Influencers, 164, 165, 166
key performance indicators , 7, 130, 176, 290, 296
key tool for data modeling and the development , 11
key tool for data modeling and the development of reports , 11
Keytips to traverse and choose buttons , 27
KPI , 46, 105, 106
KPIs , 7, 129, 201, 290, 296

L
large corporation , 1
largely focused on standard database management functions , 12
Latitude , 54
Learn, 203, 219, 247
Learning Curve , 184
Learning, Theme Generation, Visual Generation, 244
length or height of the bars standing , 80
Let’s Get Reporting, 159
LET’S MAKE SOME PICTURES (VISUALIZING DATA 101), 73
letting users explore the data , 8
Leverage data source-specific optimizations , 233
leveraging Power BI's connectivity , 298
leveraging Power BI's connectivity options , 298
License Expiration , 238
Licensing, 217, 238, 239
LICENSING AND DEPLOYMENT, 217
LICENSING AND DEPLOYMENT TIPS, 217
Limitations with Data , 101
Line and area charts , 89
Line and clustered column chart , 46, 89
Line and Stacked Column, 93
Line chart , 46, 89
Line Chart, 89, 90
line graphs , 2, 73
live connection , 33, 195
load (ETL) tool , 15
Load the Data , 57
local servers , 4
local system , 36
locate data anomalies , 2
Locate the Parameter Control , 181
locating patterns , 283
logical functions , 21, 276
Logical functions , 113, 276
Logical Functions, 276
Logical Operators , 278, 279
Longitude , 54

M
Machine Learning and Modeling , 120
maintain a connection with the data insights , 3
Maintain and Update Roles , 259
Maintenance and Updates , 232
Make adjustments to the visual , 171
MAKE SOME PICTURES, 73
Make this relationship active, 148, 149, 154
Manage Dates, 264
Manage relationships , 41, 149, 151
Manage Relationships Wizard , 157
Manage roles , 42
manage user access , 8
manage user access and permissions , 8
Managing Dependencies , 184
Managing Relationships , 68
Managing Users, 223
Managing Users in a Workspace, 223
managing workspaces , 224
many data sources thanks , 1
Many-to-Many Relationship , 68
Map , 46, 103, 104
maps , 2, 5, 95, 96, 103, 104, 287, 288, 298
Mark as date table , 52
Math and trigonometric functions , 113
Matrix , 47, 105, 106
meaningful conclusions , 4, 94
meaningful reports and dashboards , 298
Measure Branching , 176
Measure Management, 259
Measure tools , 28
Measure tools or Column tools , 28
Measures, 108, 110, 111, 127, 128, 129, 177, 296
MEASURES, 108
Median, 114, 122, 123, 267
Membership Operators , 279
Merge Columns , 63
merging , 35, 55, 142, 146, 147, 148
Metadata Panel , 250, 251
method for visualizing and analyzing huge datasets , 291
Metrics, 201, 234
Microsoft built , 17
Microsoft environment , 21, 281
Microsoft Excel , 6, 13, 15, 195
Microsoft Research , 283
Microsoft Teams , 10, 14, 196, 211, 212
Microsoft’s Relational Database, 12
Microsoft's artificial intelligence , 9
Microsoft's artificial intelligence services , 9
Microsoft's online analytical processing , 12
Microsoft's Power BI Service , 3
Minimum and maximum , 114, 120, 178
Minimum and Maximum, 119
Minimum Sales , 121
mobile application , 7
Mobile apps , 11
mobile devices , 6, 11, 43, 232
Model Navigation window , 256
modeling tools , 3, 5
Modify the Value of the Parameter , 181
Monitor license usage , 240
Monitor usage and performance , 238
most recent information , 2, 7, 190
Multi-dimensional data exploration , 283
Multiple data series , 98
Multiplication , 278
Multi-row card , 46, 105
My Workspace, 190, 206, 207

N
natural language queries , 11
Natural Language Query , 190
Navigate to Roles , 258
necessary R or Python environments , 185
Negates a logical condition , 278
Negotiate to price with Microsoft , 241
New approaches to the processing of data , 12
New measure , 37, 41
New Measure , 259
Null-Coalescing Operator , 279
Numeric Input Box , 182

O
obtain fast insights , 5
Obtaining the appropriate data for analysis , 55
OfficeHoursAttended , 116, 118
OLAP , 12
OLAP Services , 12
on-demand refresh , 7
One Small Step into BI, 12
OneDrive , 191, 205, 208, 221, 285
One-to-One Relationship , 67
on-premises deployment , 230, 231, 232, 233
On-Premises Deployment, 230
on-premises version , 4
On-Premises vs. Cloud Deployment, 230
operational reports , 3
Optimize your marketing strategies , 165
Options for selected visualizations , 80
Oracle , 6, 14, 57
organizations can monitor data usage , 8
organization's network , 4
Organizing Copied Items , 30
Other functions , 113
Output Pane , 250, 252, 253

P
Page options , 44
Page view , 43
Paginated report , 47
Parameter Modification, 181
Parameter Setup, 178
Parent and child functions , 113
Parentheses , 279
particular combination of data , 117
PBIT , 256, 263
pbix , 6, 261
PBVIZ , 36, 245, 281
PBVIZ is the standard format , 36
percentage analysis , 65
Percentile , 115, 268
perform a variety of other tasks , 6
perform ad-hoc analysis , 7
perform ad-hoc analysis of the information , 7
perform complicated calculations , 5
perform fundamental , 10
Performance Analyzer , 50, 252
Performance Analyzer pane , 50
Performance Considerations , 184
Performance Optimization , 236
Performs logical AND operation , 278, 279
personalized visualizations , 5
per-user license , 217, 218
pick DirectQuery or Import , 33
Pie and Donut Chart, 99
Pie chart , 46
Pie Chart , 99, 100
Pie Charts , 100
Pixel-Perfect Reporting, 12
Pixel-Perfect Reporting, Automated Reports, and More, 12
Place , 54, 123
popular custom visuals , 282
Power Apps , 9, 10, 47
Power Automate , 9, 10, 47
Power Automate's integration , 9
Power BI , 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 25, 26, 27,
28, 29, 30, 31, 33, 35, 38, 39, 40, 41, 42, 43, 44, 45, 47, 48, 51, 52, 53, 54, 55, 56, 57,
58, 59, 61, 62, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 80, 81, 82, 83, 84,
85, 86, 87, 88, 89, 91, 92, 93, 94, 95, 96, 97, 98, 99, 101, 102, 103, 104, 105, 107, 108,
112, 115, 116, 117, 118, 119, 120, 121, 123, 124, 125, 126, 127, 128, 129, 131, 134, 136,
137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149, 150, 152, 153, 154,
155, 156, 157, 160, 161, 162, 163, 164, 165, 166, 168, 169, 170, 171, 172, 174, 175,
176, 177, 179, 182, 183, 184, 185, 186, 187, 188, 189, 190, 191, 193, 194, 195, 196,
197, 198, 199, 201, 202, 203, 204, 205, 206, 207, 208, 209, 210, 211, 212, 213, 214,
216, 217, 218, 222, 223, 224, 225, 227, 228, 229, 230, 231, 232, 233, 234, 235, 236,
237, 238, 239, 240, 241, 242, 243, 244, 245, 247, 248, 249, 250, 252, 253, 254, 255,
256, 257, 259, 260, 261, 262, 264, 265, 266, 267, 269, 271, 273, 276, 278, 280, 281,
282, 283, 284, 285, 286, 287, 289, 290, 291, 293, 294, 296, 298
POWER BI, 1, 164, 189
Power BI Admin Portal , 8
Power BI Cloud , 3, 7, 11
Power BI components , 4, 16, 40
Power BI Components, 2
Power BI Desktop , 2, 3, 4, 5, 6, 7, 10, 11, 17, 18, 19, 20, 21, 25, 26, 27, 28, 30, 31, 35, 42,
44, 45, 56, 57, 58, 59, 65, 66, 78, 81, 88, 141, 142, 146, 148, 149, 150, 152, 153, 154,
155, 164, 177, 185, 186, 189, 194, 195, 196, 197, 200, 204, 205, 206, 209, 230, 243,
244, 248, 249, 252, 253, 255, 256, 257, 260, 262
Power BI Desktop application , 6, 25, 35, 81, 141, 248
Power BI Desktop files , 6
Power BI Desktop supports natural language queries , 5
Power BI Desktop was developed by Microsoft , 3
Power BI Embedded , 3, 4, 209, 241
Power BI Mobile , 2, 3
Power BI mobile app , 3, 6
Power BI Online , 3, 7, 11
Power BI Report , 3, 4, 196, 244
Power BI Report Builder , 3, 196
Power BI Report Builder's extensive feature , 3
Power BI Report Builder's extensive feature set , 3
Power BI Report Server , 4
Power BI Report Server on-premises , 3
Power BI service , 2, 5, 8, 11, 16, 22, 25, 33, 71, 74, 89, 175, 179, 190, 193, 195, 196, 198,
199, 201, 204, 205, 206, 211, 212, 213, 217, 218, 228, 229, 230, 255, 262
Power BI Service , 3, 7, 8, 189, 190, 216
Power BI Invoices , 3
Power Pivot, 13, 15, 16, 17, 22, 23, 154, 253
Power Platform's worth , 38
Power Query , 5, 6, 13, 15, 16, 17, 22, 23, 35, 48, 52, 57, 58, 59, 61, 62, 63, 64, 65, 72,
141, 142, 143, 144, 145, 146, 147, 185, 186, 199, 209, 234
Power Query supports , 6
Power Query tool , 15
Power Virtual Agents , 9, 10
PowerBI Tips , 242
powerful business intelligence , 11, 298
powerful business intelligence tool , 298
PPU , 217, 218, 219
Practical Applications, 183
Pre-built Content Packs and Apps , 190
preceding image and omitting , 118
Premium capabilities , 219
Premium capacity, 217, 218, 219
Premium capacity-based features , 218
Premium per Capacity , 203, 207, 208, 209, 217, 221, 230
Premium Per Capacity , 195, 196, 208, 220, 223, 230
Premium Per Capacity license , 195
Premium per User , 207, 208, 209, 217, 219, 222, 223, 255
Premium per user (PPU) license, 218
Prepare your data , 81
primary application , 4
primary components , 9, 75
primary development tool , 3
Pro license, 206, 207, 212, 213, 218, 240
Pro or PPU license , 219
Problems with License Assignment , 238
process of retrieving data , 13, 18
product categories , 85, 102, 138, 293
programming languages , 14, 182, 183, 184
Provide Comprehensive Training , 237
published reports , 7
Publishing Your Work, 204
PUTTING THE PUZZLE PIECES TOGETHER, 141
Putting Your Data in Front of Others, 209
Python script visual , 47

Q
Q&A , 5, 39, 42, 47, 70, 78, 164, 171, 172, 173, 174, 175, 190, 200
Q&A and Bookmark buttons , 40
Qualitative Ranges , 295
Queries , 30, 35, 61, 78, 142, 144, 147, 253
Queries Subsection , 35
Query Execution , 19
Query Pane , 250, 252, 253
quick and actionable insight , 8
Quick measure , 37, 41

R
R and Python , 11, 14, 64, 177, 182, 183, 184, 185, 186, 187, 188, 199
R script visual , 47
R scripting languages , 5
Radar Chart, 290
range of visual elements , 96
Range Operators , 279
raw data , 1, 2, 73, 163, 298
real-time updates , 2
Receive alerts and keep up to date , 212
Recent sources , 35
Refresh Data , 57
Refreshing the Data , 56
regulatory standards , 4
regulatory standards and data governance policies , 4
relational database , 12, 149
relational database management , 12
relational database management system , 12
relational online analytical processing , 15
Relationship functions , 113
relative contribution , 99
Remove a user or change their role in a workspace, 224
Removing and Clearing Items , 31
Removing columns , 142
Renaming columns , 143
Repeating the Append Process , 144
Report Page Navigation , 29
report page's settings , 79
Report View, 29, 37, 40, 43, 44, 45
Report Visualizations , 127
reporting , 4, 6, 10, 12, 22, 23, 69, 72, 87, 89, 108, 112, 113, 160, 188, 209, 235, 241, 266
Reporting Services, 10, 12, 196
reports , 2, 3, 4, 5, 7, 8, 11, 14, 16, 22, 25, 30, 31, 39, 43, 54, 57, 68, 69, 72, 74, 95, 96,
104, 106, 109, 127, 129, 136, 141, 143, 148, 155, 159, 175, 183, 184, 188, 189, 190,
193, 194, 196, 200, 203, 206, 208, 209, 211, 212, 213, 214, 215, 216, 218, 221, 226,
227, 228, 230, 231, 234, 235, 237, 238, 239, 240, 241, 261, 262, 266, 280, 281, 285,
289, 295, 298
reports and dashboards , 2, 3, 4, 5, 7, 11, 22, 72, 96, 104, 190, 211, 212, 213, 214, 218,
226, 235, 237, 238, 240, 241, 266, 280
reports and dashboards with coworkers , 2
reports and datasets , 7
respective element , 28
restrict data access , 8
Retention Rate , 139
Reusability , 129, 136
Ribbon chart , 46, 89
Ribbon Chart, 94
rich visualization features , 298
robust data , 112, 236, 298
robust data modeling , 236, 298
robust data modeling capabilities , 298
ROLAP , 15
role-based access management , 8
Row and Filter Context, 136, 138
Row Context, 137
Row-level security , 8

S
SaaS providers , 4
Salesforce , 5, 7, 8, 56
SAMEPERIODLASTYEAR , 139, 269, 271
SandDance, 282, 283, 284
Save and Share , 89
Save and share with the team , 212
Save the Changes , 258
Scalability , 16, 70, 231, 232, 235
Scalability and Flexibility , 70
Scalability and Performance , 16
Scale to fit , 43
Scatter chart , 46
Scatter Chart, 98
Scenario analysis , 177
schedule data refreshes , 3, 57
scheduled refresh , 7, 199, 206
seamless manner , 4, 68
seamless teamwork , 2
Search bar , 26
Secondary values , 90
secure sharing , 4
Security and Compliance , 232
Security and Governance , 184, 236
Seeking Support from Microsoft , 239
Select the chart type , 81
Select the most suitable data connection option , 237
Select the Visualization , 88
SELECTEDVALUE function , 179
Selecting and Appending Tables , 144
Selection and Highlighting , 77
Selection pane , 44, 50, 51
self-service business intelligence , 13
Sensitivity , 30, 176
separate category of expenses , 297
separating , 55
Sequential Representation , 97
Server is a reporting solution , 4
Server provides functionality , 4
set of low-code and no-code , 9
set of low-code and no-code technologies , 9
setting calculations , 22
Settings, 50, 152, 155, 185, 197, 223
several helpful functions , 30
Shape map , 46
shaping within the spreadsheet program , 13
Shared Capacity Workspaces, 207
SharePoint , 1, 5, 7, 8, 9, 10, 14, 15, 21, 56, 84, 191, 194, 205, 209, 213, 214, 222, 285
SharePoint or OneDrive for Business , 192
Sharing and Collaboration , 14
sharing capabilities , 5, 23, 298
sharing capabilities of Power BI , 298
sharing of information , 4, 74
sharing of Power BI content , 11
Sharing via a Link, 211
Sharing via a Link or Teams, 211
Sharing via Teams, 212
Show panes , 44
significant insights , 1
Simplified Data Maintenance , 69
single sign-on , 8
Slicer , 46, 105, 107, 182, 285, 293, 294
slide in PowerPoint , 6
Small Multiples, 86
Smart Filter Pro, 284
smart narrative , 175
Smart narrative , 47, 175
Smart Narrative, 175
smartphones and tablets , 6, 11, 232
software giant , 9
software manufacturers , 4
Some Favorite Custom Visuals, 280
sophisticated calculations , 19, 21, 22, 134
sophisticated tool , 20
Sort by column , 54
Source Documentation or Data Dictionary , 156
Spark on Azure , 204
specific data analysis purposes , 14
Splitting columns , 142
spreadsheets , 1, 4, 13, 55, 73, 84, 298
SQL , 1, 5, 6, 7, 9, 10, 12, 14, 17, 21, 23, 33, 56, 57, 61, 65, 84, 131, 142, 166, 196, 197,
199, 204, 225, 232, 234, 259
SQL Server , 1, 5, 7, 9, 10, 12, 14, 17, 21, 23, 33, 56, 57, 166, 196, 197, 199, 204, 234,
259
SQL Server Analysis Services , 204
SQL Server Reporting Services , 12
SSO , 8
Stacked Area, 92
Stacked area chart , 46, 89
Stacked Bar, 82, 86
stacked bar and column charts , 82, 86
stacked bar chart , 46, 82
Stacked bar chart , 46, 82
stacked column chart , 36, 46, 82, 89
Stacked column chart , 46, 82
Standard Aggregations , 129
standard database management , 12
Standard deviation , 122
Standard Deviation, 121, 123, 267
Standard Deviation, Variance, and Median, 121
Starts with (text) , 49
State or Province , 54
statements , 3, 42, 87, 109, 176
Statistical Aggregations , 115
statistical analysis , 14, 183, 266
statistical functions , 21, 129
Statistical functions , 113
STDEV.P or STDEV.S functions , 123
Storage are Import , 70
store sensitive data , 4
store sensitive data and reports , 4
stores data in memory , 18
Studio UI Basics, 250
Stunning visualizations , 2
subscription categories , 93
subsets of the data , 77, 99, 102
Subtraction , 278
Sum, 114, 115, 131
SUMX and AVERAGEX , 109
Sunburst Chart, 296, 297
Support and Community , 185
support links , 44
surprising speed , 2
surprising speed and agility , 2
Sync slicers , 44, 51, 107
Sync Slicers pane , 50
synchronize data , 9
Synoptic Panel, 287
Syntax Fundamentals, 130

T
Table , 3, 47, 52, 61, 62, 105, 106, 113, 147, 150, 151, 176, 251, 259
Table-based layouts , 3
tables , 1, 5, 6, 13, 14, 15, 16, 20, 21, 25, 41, 48, 52, 55, 57, 61, 65, 67, 68, 69, 71, 76, 107,
108, 109, 111, 112, 115, 117, 118, 121, 131, 140, 141, 142, 143, 144, 145, 146, 147, 148,
149, 150, 151, 152, 153, 154, 156, 157, 158, 163, 173, 174, 189, 195, 197, 209, 237, 251,
257, 259, 266, 274, 275, 298
Tabular Editor, 243, 254, 255, 256, 257, 258, 259, 260
Target/Threshold Line , 295
Temporal Analysis , 127
Teradata , 6
Test the Role , 259
Testing and Validation , 261
text functions , 21
Text functions , 113
Text Operators , 278
The "Drill through" portion , 48
the "Excel workbook" button , 33
The "License mode" menu , 222
The "Sync slicers" pane , 51
The “Flat Visuals”, 104
The Add Column Tab, 64
The AI Visuals Subsection, 38
The ALM Toolkit, 243, 260, 261, 262
the amount of RAM , 18
The Any Column section , 62
The Application Lifecycle Management , 261
The arithmetic mean of a group , 114
the AVERAGE function , 130
The bars in a column chart , 80
The Calculation Subsection, 37
The canvas in Power BI , 6
the Card with States visual , 289
The Card with States visual offers a comprehensive , 289
the chatbot's capabilities , 9
The Chiclet Slicer , 285
The Clipboard Subsection, 30
the cloud and offers real-time data refreshment , 5
The collaboration tools , 2
the color combinations and typefaces. , 27
the company's journey toward the creation , 10
The Conditional Column feature , 65
The Content Packs , 8
The Content Packs and Apps , 8
The contextual nature of the Properties pane , 70
The count aggregation , 114
The Data Subsection, 31
the Datamarts feature , 195
The DAX formula language , 20
The Duplicate Column function , 65
The Elements Subsection, 38
The EVALUATE statement , 111
the FILTER function , 135, 279
The first edition of Microsoft SQL Server Reporting , 12
The Formula Engine, 20
The Gantt chart , 294
The Golden Dataset(s), 228
the Hexbin Scatterplot , 292
The Hexbin Scatterplot , 292
The history of Microsoft's self-service business , 13
The Home Tab, 58
The idea of measures , 108
the Image button , 40
the Import mode , 33
The Index Column function , 65
the installation of Power Pivot. , 15
The integration of Power Virtual Agents , 9
The Invoke Custom Function feature , 64
The Key column facilitates the identification of a column , 71
the key features , 298
The list of visualizations , 89
the low-code/no-code approach , 10
the M programming language , 15
the Microsoft Power BI family of products , 2
The MIN and MAX functions , 121
The mode of a dataset , 114
the Modeling tab , 40, 148, 149, 177, 178
The Modeling Tab, 40
The Model-View, 66
The multi-row card visual , 105
The Navigation Menu, 191
The Page Options Subsection, 44
The Page Refresh Subsection, 41
The Pages Subsection, 37
The Pane Interface, 45
The Power BI project , 10
the Power BI service , 8, 196, 198, 201, 211, 212, 213
the Power BI Service , 3, 4, 7, 8, 189
The Power BI Service can interface without any complications , 8
The Power BI Service provides users with a diverse selection , 7
The Power Platform, 9, 10, 38
the problems of app development , 9
the process of digital transformation , 10
the processing and management of queries , 12
The Properties Pane, 70
The Q&A feature , 5
The Q&A feature of Power BI Desktop , 5
the Q&A subsection , 42
The Q&A Subsection, 42
The Relationships Subsection, 41
THE REPORT AND DATA, 25
THE REPORT AND DATA VIEWS, 25
the Report view , 25, 29, 40, 52, 71, 150, 159, 173
the respective data values , 80
The Ribbon , 94, 95, 250, 252
the right information , 298
the role of the formula engine , 20
The Scale to Fit Subsection, 43
The Security Subsection, 42
The Show Panes Subsection, 44
the storage and management of data and transactions , 12
The Storage Engine, 18
The Sunburst Chart , 296, 297
The tabular model , 17
The Themes Subsection, 43
The time-tested method , 115
the use of Power Query , 6
The video game Street Fighter 5 , 290
The View Tab, 43
The Visualizations pane, 75
The Visuals Subsection, 37
the x- and y-axes , 81, 89
THIRD-PARTY TOOLS, 241
Threshold Determination , 120
Time Intelligence , 129, 271
Time intelligence functions , 113
Time to Get Building, 157
Time-based Insights , 95
Timeline Storyteller, 286
Tooltips , 77, 297
Total Grade Measure , 180
total SalesAmount , 135
TOTALQTD , 272
TOTALYTD , 269, 271
traditional and fixed-layout reporting , 3
traditional and fixed-layout reporting demands , 3
traditional data warehouse , 17
Training and Support , 235
training videos , 44
Transform and Shape the Data , 57
Transform data , 35
transform raw data into actionable insights , 298
Transform Tab, 62, 72
transformation , 3, 13, 16, 22, 23, 57, 59, 61, 63, 73, 88, 142, 143, 145, 186, 236
Transformation of the Data , 55
transforming and modeling data , 5
Transpose converts your columns into rows , 62
Treemap , 46, 101, 102
Types of Functions, 113

U
Understanding additional options, 154
understanding of the dispersion of the data , 120
Unpivot Columns , 62
Upgrading Licenses , 239
usage of a Premium capacity , 217
Use First Row as Headers , 61, 62
use of Content Packs , 8
user management , 8, 201
User Training and Adoption , 237
User-Defined , 129
user-defined expressions , 78
user-defined measures , 129
USERPRINCIPALNAME , 275
users can conduct operations , 13
Users can plan automatic data refreshes , 7
user's interactions and selections , 20
Users of Power BI , 3, 77, 133, 176, 182, 238
Users of Power BI Desktop , 3
Users of Power Query , 5, 16
users' particular company environment , 130
users pre-built templates , 8
user's premises , 7
using a web browser , 3, 189
using Charticulator , 244
Using Merge to Get Columns from Other Tables, 146
Using the Decomposition, 170
using the Refresh Preview button , 59
using the tabular architecture , 17
using typical chart formats , 283
Utilize Analytical Features , 89
Utilizing Excel's functions , 13
utilizing the DAX formula language , 19

V
Variance, 121, 122, 123, 267
variety of data fields , 78
variety of formatting options , 5, 14, 286
variety of new features , 12
variety of sources , 1, 13, 14, 15, 22, 55
various areas of data analysis , 4
various dimensions or sizes , 86
various types of visualizations , 6
Version Control and Deployment Lifecycle , 237
Version Control Integration , 261
VertiPaq , 17, 18, 19, 20, 21, 22, 247, 263
View and interact with the shared report , 211
View dataset , 198
visual characteristics , 5, 294
visual components of a chart , 95
Visual creation and customization , 166
visual creation tool , 244
visual creation tool developed by Microsoft , 244
Visual Exploration , 157
visual formatting options , 40
Visual Interactivity, 76
visual representation , 44, 73, 75, 84, 89, 97, 99, 100, 104, 115, 124, 162, 169, 193, 288,
289, 293, 297
visual representation of project hierarchy , 297
visualization , 3, 4, 5, 7, 9, 10, 11, 14, 15, 18, 23, 36, 38, 47, 48, 51, 54, 57, 67, 73, 74, 75,
76, 77, 78, 79, 80, 82, 83, 84, 85, 87, 88, 89, 90, 92, 93, 94, 95, 96, 98, 101, 102, 104,
105, 106, 107, 108, 119, 120, 138, 139, 146, 162, 166, 183, 189, 216, 265, 266, 280,
281, 283, 285, 288, 291, 292, 295, 298
Visualization and Analysis , 13
visualization and analysis of the correlation , 98
visualization capabilities , 3, 4, 10, 11, 183
visualization options , 15, 266, 298
Visualizations Pane, 46, 107
Visualizations window , 29, 75, 81, 82, 106, 164, 170, 280
Visualize and analyze your data , 141
visualize data , 4, 115, 288
VISUALIZING DATA 101, 73
Visualizing Data Extremes , 120

W
Watch the Impact , 182
Waterfall chart , 46, 82
Waterfall Chart, 87, 88
web browser , 7, 44, 211, 227
web services , 7, 13, 56
web sources , 4
Web URL , 54
web-based platform , 3, 189
websites , 4, 10, 39, 245
Weighted Aggregations , 115
well-known tools , 1
What Is a Relationship?, 66
What Is a Workspace?, 206
What Is Power BI?, 1
What-If Analysis, 176, 177
Why Visualize Data?, 73
wide range of capabilities , 298
wide variety of fields , 4, 13, 164
wizards , 6, 13
Word Cloud, 288
Work Together and Iterate , 177
Workspace and App Management, 219
Workspace Generation and Access Control, 219
Workspaces for Organization , 189
