DP900 ExamTopic Questions - 70 To 200
1. Which command-line tool can you use to query Azure SQL databases?
B. bcp
C. azdata
D. Azure CLI
Explanation: The sqlcmd utility lets you enter Transact-SQL statements, system
procedures, and script files at the command prompt.
Incorrect Answers:
B: The bulk copy program utility (bcp) bulk copies data between an instance of
Microsoft SQL Server and a data file in a user-specified format.
D: The Azure CLI is the de facto cross-platform command-line tool for building
and managing Azure resources.
Reference:
https://docs.microsoft.com/en-us/sql/tools/overview-sql-tools?view=sql-server-ver15
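For illustration, a minimal sqlcmd invocation against an Azure SQL database might look like the sketch below; the server, database, and login names are placeholders, and the server firewall must already allow your client IP:

    sqlcmd -S tcp:myserver.database.windows.net,1433 -d MyDatabase -U myadmin -P '<password>' -Q "SELECT TOP 10 name FROM sys.tables;"

The -Q switch runs the quoted Transact-SQL batch and exits; omit it to get an interactive prompt where statements are entered line by line and executed with GO.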
2.
a. All Yes
Azure Defender provides security alerts and advanced threat protection for
virtual machines, SQL databases, containers, web applications, your
network, and more.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/sql-database-paas-overview
https://azure.microsoft.com/en-us/blog/announcing-sql-atp-and-sql-vulnerability-assessment-general-availability/
https://docs.microsoft.com/en-us/azure/security-center/azure-defender
3.
https://docs.microsoft.com/en-us/azure/azure-sql/database/sql-database-paas-overview
4.
a. https://azure.microsoft.com/en-gb/blog/hot-patching-sql-server-engine-in-azure-sql-database/
b. https://azure.microsoft.com/en-us/services/sql-database/#product-overview
C. database
E. external
D. Manual filters: Manual filters are user-defined filters that allow report
viewers to select specific values or ranges. Users can manually choose which
data they want to include or exclude in the report. These filters are flexible
and customizable, enabling users to focus on relevant information.
6. When you create an Azure SQL database, which account can always connect to the database?
A. the Azure Active Directory (Azure AD) account that created the database
B. the server admin login account of the logical server Most Voted
D. the sa account
a. When you first deploy Azure SQL, you specify an admin login and an associated
password for that login. This administrative account is called Server admin.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/single-database-create-quickstart
6. You need to reduce the amount of time that the IT team spends on user support.
What are three possible ways to achieve this goal? Each correct answer presents a
complete solution.
NOTE: Each correct selection is worth one point.
E. Deploy Microsoft Office 365 Professional Plus to all client devices Most Voted
a. Reference:
https://social.technet.microsoft.com/wiki/contents/articles/35748.office-365-what-is-customer-lockbox-and-how-to-enable-it.aspx
https://docs.microsoft.com/en-us/windows/deployment/windows-autopilot/windows-autopilot
https://www.microsoft.com/en-us/microsoft-365/blog/2015/03/19/office-365-proplus-it-control-and-management-update/
7.
Box 1: No -
Microsoft Sentinel data connectors are available for non-Microsoft services like Amazon Web
Services.
Box 2: Yes -
Once you have connected your data sources to Microsoft Sentinel, you can visualize and monitor
the data using the Microsoft Sentinel adoption of Azure Monitor
Workbooks, which provides versatility in creating custom dashboards. While the Workbooks are
displayed differently in Microsoft Sentinel, it may be useful for you to see how to create
interactive reports with Azure Monitor Workbooks. Microsoft Sentinel allows you to create
custom workbooks across your data, and also comes with built-in workbook templates to allow
you to quickly gain insights across your data as soon as you connect a data source.
Box 3: Yes -
To help security analysts look proactively for new anomalies that weren't detected by your
security apps or even by your scheduled analytics rules, Microsoft
Sentinel's built-in hunting queries guide you into asking the right questions to find issues in the
data you already have on your network.
Reference:
https://docs.microsoft.com/en-us/azure/sentinel/data-connectors-reference
https://docs.microsoft.com/en-us/azure/sentinel/monitor-your-data
https://docs.microsoft.com/en-us/azure/sentinel/hunting
8.
Transparent data encryption (TDE) helps protect Azure SQL Database, Azure SQL
Managed Instance, and Azure Synapse Analytics against the threat of malicious
offline activity by encrypting data at rest.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-tde-overview?tabs=azure-portal
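One quick way to see this setting per database is the is_encrypted flag in sys.databases; the query below is a standard metadata query, shown here as a sketch:

    -- Lists each database and whether TDE encryption at rest is enabled
    SELECT name, is_encrypted
    FROM sys.databases;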
9. You need to ensure that users use multi-factor authentication (MFA) when connecting to an
Azure SQL database.
Which type of authentication should you use?
C. SQL authentication
D. certificate authentication
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/authentication-mfa-ssms-overview
10. What is a benefit of hosting a database on Azure SQL managed instance as compared to an
Azure SQL database?
A. built-in high availability
B. native support for cross-database queries and transactions Most Voted
C. system-initiated automatic backups
D. support for encryption at rest
https://docs.microsoft.com/en-us/azure/azure-sql/database/features-comparison
10.
Box 1: Yes -
The MailItemsAccessed event is a mailbox auditing action and is triggered when mail data is
accessed by mail protocols and mail clients.
Box 2: No -
Basic Audit retains audit records for 90 days.
Advanced Audit retains all Exchange, SharePoint, and Azure Active Directory audit records for
one year. This is accomplished by a default audit log retention policy that retains any audit record
that contains the value of Exchange, SharePoint, or AzureActiveDirectory for the Workload
property (which indicates the service in which the activity occurred) for one year.
Box 3: Yes -
Advanced Audit in Microsoft 365 provides high-bandwidth access to the Office 365 Management
Activity API.
Note: The answers are correct, but the products have been rebranded to
Microsoft Purview Audit (Standard) and Microsoft Purview Audit (Premium).
Reference:
https://docs.microsoft.com/en-us/microsoft-365/compliance/advanced-audit?view=o365-worldwide
https://docs.microsoft.com/en-us/microsoft-365/compliance/auditing-solutions-overview?view=o365-worldwide#licensing-requirements
https://docs.microsoft.com/en-us/office365/servicedescriptions/microsoft-365-service-descriptions/microsoft-365-tenantlevel-services-licensing-guidance/microsoft-365-security-compliance-licensing-guidance#advanced-audit
11. You need to design and model a database by using a graphical tool that supports project-
oriented offline database development.
What should you use?
11. You have a transactional application that stores data in an Azure SQL managed instance.
When should you implement a read-only database replica?
A. You need to generate reports without affecting the transactional workload. Most Voted
Use read-only replicas to offload read-only query workloads. By creating a read-only
replica of the transactional database, you can route reporting and analytical queries to
the replica, ensuring that the performance, availability, and responsiveness of the
primary transactional workload are unaffected.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/database/read-scale-out
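In practice, read scale-out is selected from the connection string by adding ApplicationIntent=ReadOnly, and you can confirm where a session landed with a standard property query such as this sketch:

    -- Returns READ_ONLY when connected to a read-only replica,
    -- READ_WRITE when connected to the primary
    SELECT DATABASEPROPERTYEX(DB_NAME(), 'Updateability');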
12.
13. You need to query a table named Products in an Azure SQL database.
Which three requirements must be met to query the table from the internet? Each correct
answer presents part of the solution.
NOTE: Each correct selection is worth one point.
A. You must be assigned the Reader role for the resource group that contains the
database.
B. You must have SELECT access to the Products table. Most Voted
D. You must be assigned the Contributor role for the resource group that contains the
database.
Reference:
https://docs.microsoft.com/en-us/sql/relational-databases/security/authentication-access/getting-started-with-database-engine-permissions?view=sql-server-ver15
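For option B, SELECT access is granted with ordinary T-SQL; the principal name below is a hypothetical database user:

    -- Grant read access on the Products table to a database user
    GRANT SELECT ON OBJECT::dbo.Products TO [ReportingUser];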
14.
15. Which T-SQL statement should be used to instruct a database management system to use
an index instead of performing a full table scan?
A. SELECT
C. JOIN
Reference:
https://docs.microsoft.com/en-us/sql/t-sql/queries/hints-transact-sql-table
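A table hint is attached in the FROM clause of the SELECT statement; in the sketch below the index name is hypothetical, and hints like this should be a last resort since they override the query optimizer:

    -- Force a specific index instead of letting the optimizer choose a table scan
    SELECT ProductID, Name, ListPrice
    FROM dbo.Products WITH (INDEX (IX_Products_Name))
    WHERE Name = 'Widget';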
Which Azure service provides the highest compatibility for databases migrated from Microsoft
SQL Server 2019 Enterprise edition?
SQL Managed Instance has near 100% compatibility with the latest SQL Server (Enterprise Edition)
database engine, providing a native virtual network (VNet) implementation that addresses common
security concerns, and a business model favorable for existing SQL Server customers.
Note: Azure SQL Managed Instance is the intelligent, scalable cloud database service that combines the
broadest SQL Server database engine compatibility with all the benefits of a fully managed and evergreen
platform as a service.
Reference:
https://docs.microsoft.com/en-us/azure/azure-sql/managed-instance/sql-managed-instance-paas-overview?view=azuresql
Box 1: Key/value -
A key/value store associates each data value with a unique key.
Box 2: Object -
Object storage is optimized for storing and retrieving large binary objects (images, files, video and audio
streams, large application data objects and documents, virtual machine disk images).
Box 3: Graph -
A graph database stores two types of information, nodes and edges. Edges specify relationships between
nodes. Nodes and edges can have properties that provide information about that node or edge, like
columns in a table. Edges can also have a direction indicating the nature of the relationship.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview
You have an Azure Cosmos DB account that uses the Core (SQL) API.
Which two settings can you configure at the container level? Each correct answer presents a
complete solution.
NOTE: Each correct selection is worth one point.
D. the API
https://docs.microsoft.com/en-us/azure/cosmos-db/how-to-manage-database-account
Which storage solution supports role-based access control (RBAC) at the file and folder level?
https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-access-control
You need to store data in Azure Blob storage for seven years to meet your company's compliance
requirements. The retrieval time of the data is unimportant. The solution must minimize storage
costs.
Which storage tier should you use?
A. Archive
B. Hot
C. Cool
The answer (Archive) is correct.
Hot: optimized for storing data that is accessed frequently.
Cool: optimized for storing data that is infrequently accessed and stored for at least 30 days.
Archive: optimized for storing data that is rarely accessed and stored for at least 180 days,
with flexible latency requirements (on the order of hours).
https://cloud.netapp.com/blog/azure-blob-storage-pricing-the-complete-guide-azure-cvo-blg#H1_4
Which type of non-relational data store supports a flexible schema, stores data as JSON files, and
stores all the data for an entity in the same document?
B. columnar
C. graph
D. time series
Document is correct.
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview#column-family-databases
https://docs.microsoft.com/en-us/azure/cosmos-db/faq
A key mechanism that allows Azure Data Lake Storage Gen2 to provide file system performance at object
storage scale and prices is the addition of a hierarchical namespace. This allows the collection of
objects/files within an account to be organized into a hierarchy of directories and nested subdirectories in
the same way that the file system on your computer is organized. With a hierarchical namespace enabled,
a storage account becomes capable of providing the scalability and cost-effectiveness of object storage,
with file system semantics that are familiar to analytics engines and frameworks.
One advantage of a hierarchical namespace is atomic directory manipulation. Object stores without
one merely approximate a directory hierarchy by embedding slashes (/) in object names to denote
path segments, so renaming or moving a "directory" means copying every object under it; with a
hierarchical namespace these become single, atomic metadata operations.
Reference:
https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-namespace
The marked answer is WRONG.
Correct: By using the Gremlin API, you can query a graph database in Azure Cosmos DB
using the Gremlin query language. Gremlin is a graph traversal language and the primary
query language for graph databases. To query a graph database in Azure Cosmos DB with
the Gremlin API, you would typically use a Gremlin console or a Gremlin client library.
The general outline is:
1. Connect to the Cosmos DB account: provide the connection details for your Azure Cosmos DB
account, including the URI and authentication keys or tokens.
2. Instantiate a Gremlin client: use a Gremlin client library for your programming language of
choice (e.g., Java, Python, .NET) to establish a connection to the Cosmos DB account.
3. Execute Gremlin queries: use the methods provided by the client library to run Gremlin
queries against the graph database. These queries can traverse the graph, retrieve vertices
(nodes) and edges (relationships), and perform various graph operations.
4. Process the results returned by the queries as needed for your application logic.
When provisioning an Azure Cosmos DB account, which feature provides redundancy within an
Azure region?
A. multi-master replication
With Availability Zone (AZ) support, Azure Cosmos DB will ensure replicas are placed across multiple
zones within a given region to provide high availability and resiliency to zonal failures.
Note: Azure Cosmos DB provides high availability in two primary ways. First, Azure Cosmos DB
replicates data across regions configured within a Cosmos account. Second, Azure Cosmos DB maintains
4 replicas of data within a region.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/high-availability
Your company needs to design a database that shows how changes in network traffic in one area
of a network affect network traffic in other areas of the network.
Which type of data store should you use?
B. key/value
C. document
D. columnar
Data as it appears in the real world is naturally connected. Traditional data modeling focuses on defining
entities separately and computing their relationships at runtime. While this model has its advantages,
highly connected data can be challenging to manage under its constraints.
A graph database approach relies on persisting relationships in the storage layer instead, which leads to
highly efficient graph retrieval operations. Azure Cosmos
DB's Gremlin API supports the property graph model.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/graph-introduction#introduction-to-graph-databases
Box 1: Yes -
Azure Databricks can consume data from SQL Databases using JDBC and from SQL Databases using the
Apache Spark connector.
The Apache Spark connector for Azure SQL Database and SQL Server enables these databases to act as
input data sources and output data sinks for Apache
Spark jobs.
Box 2: Yes -
You can stream data into Azure Databricks using Event Hubs.
Box 3: Yes -
You can run Spark jobs with data stored in Azure Cosmos DB using the Cosmos DB Spark connector.
Cosmos can be used for batch and stream processing, and as a serving layer for low latency access.
You can use the connector with Azure Databricks or Azure HDInsight, which provide managed Spark
clusters on Azure.
Reference:
https://docs.microsoft.com/en-us/azure/databricks/data/data-sources/sql-databases-azure
https://docs.microsoft.com/en-us/azure/databricks/scenarios/databricks-stream-from-eventhubs
C. a blob container
D. a table
First create an Azure storage account, then use Table service in the Azure portal to create a table.
Note: An Azure storage account contains all of your Azure Storage data objects: blobs, files, queues, and
tables.
Reference:
https://docs.microsoft.com/en-us/azure/storage/tables/table-storage-quickstart-portal
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-create
You need to recommend a data store service that meets the following requirements:
✑ Native SQL API access
✑ Configurable indexes
What should you recommend?
A. Azure Files
Box 2: No -
Box 3: Yes -
Box 4: Yes -
Azure Cosmos DB supports multi-region writes.
Reference:
https://docs.microsoft.com/en-us/azure/storage/common/storage-redundancy
https://manojchoudhari.wordpress.com/2019/12/16/azure-cosmos-db-enable-multi-region-writes
You manage an application that stores data in a shared folder on a Windows server.
You need to move the shared folder to Azure Storage.
Which type of Azure Storage should you use?
A. queue
B. blob
C. file
D. table
Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux,
and macOS. Azure file shares can also be cached on
Windows Servers with Azure File Sync for fast access near where the data is being used.
Reference:
https://azure.microsoft.com/en-us/services/storage/files/
You have an application that runs on Windows and requires access to a mapped drive.
Which Azure service should you use?
A. Azure Files
C. Azure Cosmos DB
Azure Files is Microsoft's easy-to-use cloud file system. Azure file shares can be seamlessly used in
Windows and Windows Server.
To use an Azure file share with Windows, you must either mount it, which means assigning it a drive
letter or mount point path, or access it via its UNC path.
Reference:
https://docs.microsoft.com/en-us/azure/storage/files/storage-how-to-use-files-windows
Box 3: No -
Logical partitions are formed based on the value of a partition key that is associated with each item in a
container.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/partitioning-overview
Your company is designing an application that will write a high volume of JSON data and will
have an application-defined schema.
Which type of data store should you use?
A. columnar
B. key/value
D. graph
Document (the missing option) is correct: document stores hold JSON data with a flexible,
application-defined schema.
You need to recommend a non-relational data store that is optimized for storing and retrieving
text files, videos, audio streams, and virtual disk images. The data store must store data, some
metadata, and a unique ID for each file.
Which type of data store should you recommend?
A. key/value
B. columnar
C. object
D. document
Object storage is optimized for storing and retrieving large binary objects (images, files, video
and audio streams, large application data objects and documents, virtual machine disk images).
Large data files are also popularly used in this model, for example, delimiter file (CSV), parquet,
and ORC. Object stores can manage extremely large amounts of unstructured data.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview
Question #: 134
A. relational
B. time series
C. graph
D. columnar
Time series data is a set of values organized by time. Time series databases typically collect
large amounts of data in real time from a large number of sources.
Updates are rare, and deletes are often done as bulk operations. Although the records written to a
time-series database are generally small, there are often a large number of records, and total data
size can grow rapidly.
Reference:
https://docs.microsoft.com/en-us/azure/architecture/guide/technology-choices/data-store-overview
Question #: 135
Yes: You implement ADLS as a Storage Account. Azure Data Lake Storage is a type of
storage account in Azure, specifically optimized for big data analytics workloads. When
you create an Azure Data Lake Storage Gen2 account, you are essentially creating a
specialized type of Storage Account with additional capabilities tailored for data lake
scenarios. When you create an ADLS Gen2 account, it is provisioned under the hood as a
hierarchical namespace on top of Blob storage, which is part of Azure Storage. This means
that you can use your Azure Data Lake Storage account to store and analyze large
amounts of structured and unstructured data, leveraging features such as fine-grained
access control, hierarchical namespaces, and integration with big data analytics services
like Azure Databricks, Azure HDInsight, and Azure Synapse Analytics.
Reference:
https://docs.microsoft.com/en-us/azure/data-lake-store/data-lake-store-get-started-portal
https://docs.microsoft.com/en-us/azure/storage/common/storage-account-overview
https://azure.microsoft.com/en-us/pricing/details/bandwidth/
Question #: 136
Reference: https://azure.microsoft.com/en-us/blog/a-technical-overview-of-azure-cosmos-db/
API: Gremlin. Container is projected as: graph. Item is projected as: nodes and edges.
Question #: 137
At which two levels can you set the throughput for an Azure Cosmos DB account? Each correct
answer presents a complete solution.
NOTE: Each correct selection is worth one point.
B. item
D. partition
Throughput can be provisioned at two levels: the database and the container.
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/set-throughput
Reference:
https://docs.microsoft.com/en-us/azure/cosmos-db/high-availability
Question #: 139
A. no indexes
C. a well-defined schema
B. batch
C. massively parallel processing (MPP)
D. streaming
Reference:
https://docs.microsoft.com/en-in/azure/azure-monitor/overview
Question #: 141
You need to gather real-time telemetry data from a mobile application.
Which type of workload describes this scenario?
B. batch
D. streaming
Reference:
https://docs.microsoft.com/en-in/azure/azure-monitor/overview
Question #: 142
You have a dedicated SQL pool in Azure Synapse Analytics that is only used actively every
night for eight hours.
You need to minimize the cost of the dedicated SQL pool as much as possible during idle times.
The solution must ensure that the data remains intact.
What should you do on the dedicated SQL pool?
You can control costs for a dedicated SQL pool by pausing the resource when it is not in use.
For example, if you won't be using the database during the night and on weekends, you can
pause it during those times and resume it during the day. Pausing releases the compute
resources while keeping the data intact; for more information, see "Pause and resume compute
in dedicated SQL pool via the Azure portal."
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-manage-compute-overview
https://learn.microsoft.com/en-us/azure/synapse-analytics/plan-manage-costs
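Besides the portal, a Synapse workspace dedicated SQL pool can be paused and resumed from the Azure CLI; a minimal sketch, with all resource names as placeholders:

    # Pause the dedicated SQL pool during idle hours (data remains intact)
    az synapse sql pool pause --name MyPool --workspace-name MyWorkspace --resource-group MyResourceGroup

    # Resume it before the nightly workload starts
    az synapse sql pool resume --name MyPool --workspace-name MyWorkspace --resource-group MyResourceGroup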
Question #: 143
Which Azure Data Factory component initiates the execution of a pipeline?
A. a control flow
B. a trigger
C. a parameter
D. an activity
Pipeline runs are typically instantiated by passing arguments to parameters that you define in the
pipeline. You can execute a pipeline either manually or by using a trigger.
Reference:
https://docs.microsoft.com/en-us/azure/data-factory/concepts-pipeline-execution-triggers#trigger-execution
Question #: 144
Your company has a reporting solution that has paginated reports. The reports query a
dimensional model in a data warehouse.
Which type of processing does the reporting solution use?
A. stream processing
B. batch processing
Reference:
https://datawarehouseinfo.com/how-does-oltp-differ-from-olap-database/
Question #: 146
What are three characteristics of an Online Transaction Processing (OLTP) workload? Each
correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. denormalized data
E. schema on read
B: Transactional data tends to be heavy on writes, moderate on reads.
D: Typical traits of transactional data include a strongly enforced schema on write. Schema on
write means the schema is defined before data is written into the database; if you have done any
development against a relational database (RDBMS), you have relied on this structured nature
whenever you used Structured Query Language (SQL) to read data.
F: Transactional data tends to be highly normalized.
Reference: Online transaction processing (OLTP)
https://docs.microsoft.com/en-us/azure/architecture/data-guide/relational-data/online-transaction-processing
https://learn.microsoft.com/en-us/azure/architecture/data-guide/relational-data/online-transaction-processing
Question #: 148
https://docs.microsoft.com/en-us/azure/data-factory/introduction
Question #: 149
You need to develop a solution to provide data to executives. The solution must provide an
interactive graphical interface, depict various key performance indicators, and support data
exploration by using drill down.
What should you use in Microsoft Power BI?
A. a view
B. a report
C. a dataflow
https://docs.microsoft.com/en-us/power-bi/consumer/end-user-dashboards
https://docs.microsoft.com/en-us/power-bi/visuals/power-bi-visualization-kpi
https://docs.microsoft.com/en-us/power-bi/consumer/end-user-drill
https://learn.microsoft.com/en-us/power-bi/create-reports/service-dashboards#dashboards-versus-reports
Question #: 150
Which two Azure services can be used to provision Apache Spark clusters? Each correct answer
presents a complete solution.
NOTE: Each correct selection is worth one point.
B. Azure HDInsight
C. Azure Databricks
Correct answers:
1. Azure Synapse Analytics
2. Azure HDInsight
3. Azure Databricks
https://www.sqlshack.com/a-beginners-guide-to-azure-databricks/
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview
Question #: 151
You have a quality assurance application that reads data from a data warehouse.
Which type of processing does the application use?
B. batch processing
D. stream processing
The quality assurance application that reads data from a data warehouse typically uses Online
Analytical Processing (OLAP) because it involves querying and analyzing historical data for
reporting and analysis purposes. OLAP is optimized for complex queries and aggregations on
large volumes of data, making it suitable for tasks like data analysis and business intelligence,
which align with quality assurance activities. So, the correct answer is: C. Online Analytical
Processing (OLAP)
Question #: 147
Which two activities can be performed entirely by using the Microsoft Power BI service without
relying on Power BI Desktop? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
C. data modeling
https://docs.microsoft.com/en-us/power-bi/fundamentals/service-service-vs-desktop
Question #: 152
Which three objects can be added to a Microsoft Power BI dashboard? Each correct answer
presents a complete solution.
NOTE: Each correct selection is worth one point.
D. a dataflow
Correct answers: A, C, E.
https://docs.microsoft.com/en-us/power-bi/create-reports/service-dashboard-pin-live-tile-from-report
https://docs.microsoft.com/en-us/power-bi/create-reports/service-dashboard-add-widget
Question #: 153
The right answer is Yes, No, Yes.
A dashboard is associated with a single workspace.
A dashboard can contain visualizations from different datasets and reports.
A dashboard can contain visualizations from other tools, for example Excel.
https://docs.microsoft.com/en-us/power-bi/fundamentals/service-basic-concepts#dashboards
Question #: 155
Paginated reports in Power BI allow users to generate fixed-layout documents optimized for
printing and archiving, such as PDF and Word files.
These document-style reports give additional control over the visualizations, such as tables
that expand horizontally and vertically to display all their data, continuing from page to
page as needed.
Reference:
https://powerbi.microsoft.com/en-us/blog/announcing-paginated-reports-in-power-bi-general-availability/
Question #: 158
Yes, No, Yes. The first one is Yes: you can copy a dashboard between Microsoft Power BI
workspaces. Here's how:
1. Open the dashboard you want to copy.
2. Click the 'File' menu in the upper-left corner of the screen.
3. Select 'Save As' from the dropdown menu.
4. Provide a name for the new dashboard copy.
5. Choose the destination workspace where you want the copy to be saved.
6. Click 'Save'.
https://learn.microsoft.com/en-us/power-bi/connect-data/service-datasets-copy-reports
What should you use to build a Microsoft Power BI paginated report?
A. Charticulator
B. Power BI Desktop
Power BI Report Builder is the standalone tool for authoring paginated reports for the Power BI
service.
Reference:
https://docs.microsoft.com/en-us/power-bi/paginated-reports/paginated-reports-report-builder-power-bi
Question #: 154
Which Azure Data Factory component provides the compute environment for activities?
A. SSIS packages
B. an integration runtime
C. a control flow
D. a pipeline
The answer is correct: an integration runtime (IR) is a compute infrastructure that provides the
data integration capabilities for Azure Data Factory. The integration runtime provides the
compute environment for executing activities, which are the building blocks of data pipelines in
Azure Data Factory.
SSIS packages are a type of data integration solution that can be run on premises or in the cloud.
They can be integrated with Azure Data Factory, but they are not a component of the Data
Factory compute environment. A control flow is a logical representation of the steps that are
required to execute a workflow, and a pipeline is a collection of activities that are organized into
a workflow. While both components are important for building data pipelines in Azure Data
Factory, they do not provide the compute environment for executing activities.
https://docs.microsoft.com/en-us/azure/data-factory/concepts-integration-runtime
Question #: 156
What are two uses of data visualization? Each correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
Data visualization is a key component in being able to gain insight into your data. It helps make
big and small data easier for humans to understand. It also makes it easier to detect patterns,
trends, and outliers in groups of data.
Data visualization brings your data to life, helping you find key business insights quickly and effectively.
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/spark/apache-spark-data-visualization
Question #: 157
You need to use Transact-SQL to query files in Azure Data Lake Storage Gen 2 from an Azure
Synapse Analytics data warehouse.
What should you use to query the files?
A. Azure Functions
C. PolyBase
PolyBase enables your SQL Server instance to process Transact-SQL queries that read data from
external data sources. SQL Server 2016 and higher can access external data in Hadoop and Azure
Blob Storage. Starting in SQL Server 2019, you can now use PolyBase to access external data in
SQL Server, Oracle, Teradata, and MongoDB.
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql/load-data-overview
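To make the idea concrete, the sketch below defines a PolyBase external table over files in Azure Data Lake Storage Gen2 and queries it with ordinary T-SQL. All object, account, and path names are hypothetical, and the database scoped credential needed for non-public storage is omitted for brevity:

    -- External data source pointing at an ADLS Gen2 filesystem (PolyBase)
    CREATE EXTERNAL DATA SOURCE MyDataLake
    WITH (
        LOCATION = 'abfss://data@mystorageaccount.dfs.core.windows.net',
        TYPE = HADOOP
    );

    -- File format describing the CSV files, skipping the header row
    CREATE EXTERNAL FILE FORMAT CsvFormat
    WITH (
        FORMAT_TYPE = DELIMITEDTEXT,
        FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 2)
    );

    -- External table projecting a schema over the files in /sales/
    CREATE EXTERNAL TABLE dbo.SalesExternal (
        SaleId INT,
        Amount DECIMAL(10, 2)
    )
    WITH (
        LOCATION = '/sales/',
        DATA_SOURCE = MyDataLake,
        FILE_FORMAT = CsvFormat
    );

    -- The files can now be queried like any other table
    SELECT TOP 10 * FROM dbo.SalesExternal;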
Question #: 159
What are three characteristics of an Online Transaction Processing (OLTP) workload? Each
correct answer presents a complete solution.
NOTE: Each correct selection is worth one point.
A. denormalized data
A. to provide answers to complex queries that rely on data from multiple sources Most
Voted
Box 1: Batch -
The batch processing model requires a set of data that is collected over time, while the stream
processing model requires data to be fed into an analytics tool, often in micro-batches, and in
real time.
Batch processing handles a large batch of data, while stream processing handles individual
records or micro-batches of a few records.
Batch processing operates over all or most of the data; stream processing operates over a
rolling window or the most recent records.
Box 2: Batch -
Box 3: Streaming -
Reference:
https://k21academy.com/microsoft-azure/dp-200/batch-processing-vs-stream-processing
Question #: 163
Note: The data warehouse workload encompasses:
✑ The entire process of loading data into the warehouse
✑ Performing data warehouse analysis and reporting
✑ Managing data in the data warehouse
✑ Exporting data from the data warehouse
Reference:
https://docs.microsoft.com/en-us/azure/synapse-analytics/sql-data-warehouse/sql-data-warehouse-workload-management
Question #: 164
Box 1: No -
A pipeline is a logical grouping of activities that together perform a task.
Box 2: Yes -
You can construct pipeline hierarchies with data factory.
Box 3: Yes -
A pipeline is a logical grouping of activities that together perform a task.
Reference:
https://mrpaulandrew.com/2019/09/25/azure-data-factory-pipeline-hierarchies-generation-control/
Azure Data Factory has 4 key components:
Datasets: represent data structures within the data stores. An input dataset represents the input
for an activity in the pipeline; an output dataset represents the output for the activity.
Pipeline: a group of activities; used to group activities into a unit that together performs a task.
Activities: the actions to perform on your data. Azure Data Factory supports two types of
activities: data movement and data transformation.
Linked services: the information needed for ADF to connect to external resources. For
example, an Azure Storage linked service specifies a connection string to connect to the Azure
Storage account.
Question #: 165
Box 1: Yes -
Compute is separate from storage, which enables you to scale compute independently of the data
in your system.
Box 2: Yes -
You can use the Azure portal to pause and resume the dedicated SQL pool compute resources.
Pausing the data warehouse pauses compute. If your data warehouse was paused for the entire
hour, you will not be charged compute during that hour.
Box 3: No -
Storage is sold in 1 TB allocations. If you grow beyond 1 TB of storage, your storage account
will automatically grow to 2 TB.
Reference:
https://azure.microsoft.com/en-us/pricing/details/synapse-analytics/
Question #: 168
Box 1: Azure Data factory -
Relevant Azure service for the three ETL phases are Azure Data Factory and SQL Server
Integration Services (SSIS).
Question #: 173
A bar chart showing year-to-date sales by region is an example of which type of analytics?
A. predictive
B. prescriptive
D. diagnostic
Descriptive (the missing option) is correct: descriptive analytics tells you what has happened,
based on historical data such as year-to-date sales totals.
Question #: 174
Yes - Stream processing has access to the most recent data received or data within a rolling time
window. Stream processing operates on data in near real-time, allowing for analysis and
processing of data as it is received or within a defined time window.
No - Batch processing is not required to occur immediately and can have higher latency. Batch
processing typically operates on larger volumes of data and is often performed at regular
intervals or in scheduled batches, which can have latency in the order of minutes, hours, or even
days.
Yes - Stream processing is commonly used for simple response functions, aggregates, or
calculations such as rolling averages. It enables real-time data analysis and enables quick
calculations and aggregations on streaming data as it arrives.
Question #: 175
You need to perform hybrid transactional and analytical processing (HTAP) queries against
Azure Cosmos DB data sources by using Azure Synapse Analytics.
What should you use?
A. Synapse pipelines
C. Synapse Link
D. Synapse Studio
Synapse Link is a feature of Azure Cosmos DB that allows you to enable real-time analytics on
your Cosmos DB data, by creating a seamless connection between Azure Cosmos DB and Azure
Synapse Analytics. It enables you to run analytical queries against your Cosmos DB data using
Synapse SQL pool, and perform complex joins and aggregations across data stored in both
Cosmos DB and other data sources.
Hybrid Transactional and Analytical Processing (HTAP) is a technique for near real time
analytics without a complex ETL solution. In Azure Synapse Analytics, HTAP is supported
through Azure Synapse Link.
https://learn.microsoft.com/en-us/training/paths/work-with-hybrid-transactional-analytical-processing-solutions/
Question #: 176
You need to create a visualization of running sales totals per quarter as shown in the following
exhibit.
A. a waterfall chart
B. a ribbon chart
C. a bar chart
D. a decomposition tree
Reference:
https://docs.microsoft.com/en-us/power-bi/visuals/power-bi-visualization-types-for-reports-and-q-and-a
Question #: 181
You have an on-premises Microsoft SQL Server database.
You need to migrate the database to the cloud. The solution must meet the following
requirements:
* Minimize maintenance effort.
* Support the Database Mail and Service Broker features.
What should you include in the solution?
Azure SQL Database does not support Service Broker or Database Mail; you need a managed instance.
https://docs.microsoft.com/en-us/azure/azure-sql/database/features-comparison?view=azuresql
Question #: 184
Which two features distinguish Delta Lake from Azure Data Lake Storage? Each correct answer
presents a complete solution.
B. schema enforcement
D. transactional consistency
Question #: 185
A company plans to use Power Apps to connect to a series of custom services. There are no
connectors available for the custom services.
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
NOTE: Each correct selection is worth one point.
Hot Area:
Yes - Custom connectors for customer-specific services are only available to the users within the
same tenant or environment as the creator of the custom connector. They do not need to go
through the review and certification process by Microsoft, which is only required for custom
connectors that are intended to be published and shared publicly.
Yes - Custom connectors are reusable components that can be used across Power Apps and
Power Automate. You can build a custom connector once and use it in multiple apps and flows if
you have the appropriate permissions and licenses.
No- Custom connectors that are meant to be publicly available for all Power Platform users need
to be certified by Microsoft before they can be published in the connector gallery.
Question #: 186
What is a function of a modern data warehouse?
Hierarchical namespace
https://docs.microsoft.com/en-us/azure/storage/blobs/create-data-lake-storage-account
Question #: 188
What can be used with native notebook support to query and visualize data by using a web-based
interface?
A. Azure Databricks
B. pgAdmin
C. Microsoft Power BI
Notebooks are a common tool in data science and machine learning for developing code and
presenting results. In Azure Databricks, notebooks are the primary tool for creating data science
and machine learning workflows and collaborating with colleagues. Databricks notebooks
provide real-time coauthoring in multiple languages, automatic versioning, and built-in data
visualizations.
Reference: Introduction to Databricks notebooks
https://learn.microsoft.com/en-us/azure/databricks/notebooks/
Question #: 189
Which format was used?
A. XML
B. HTML
C. YAML
Question #: 190
A. JSON
B. YAML
C. HTML
Question #: 191
Which database transaction property ensures that transactional changes to a database are
preserved during unexpected operating system restarts?
A. consistency
B. atomicity
C is correct: durability. When a transaction has been committed, it remains committed and is
saved to disk; after a system reboot or restart, the data is loaded back from disk, so no
committed changes are lost.
Question #: 194
Which database transaction property ensures that individual transactions are executed only once
and either succeed in their entirety or roll back?
B. durability
C. isolation
D. consistency
Atomicity is the right answer. An atomic transaction is an indivisible and irreducible series of
database operations such that either all occurs, or nothing occurs. A guarantee of atomicity
prevents updates to the database occurring only partially, which can cause greater problems than
rejecting the whole series outright. As a consequence, the transaction cannot be observed to be in
progress by another database client.
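Both properties are easy to see in a T-SQL transaction; a minimal sketch with hypothetical table and column names:

    BEGIN TRY
        BEGIN TRANSACTION;
            -- Atomicity: both updates succeed together or not at all
            UPDATE dbo.Accounts SET Balance = Balance - 100 WHERE AccountId = 1;
            UPDATE dbo.Accounts SET Balance = Balance + 100 WHERE AccountId = 2;
        COMMIT TRANSACTION; -- Durability: once committed, the change survives a restart
    END TRY
    BEGIN CATCH
        ROLLBACK TRANSACTION; -- Any failure undoes every change in the transaction
        THROW;
    END CATCH;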
Question #: 196
Which Azure Storage service implements the key/value model?
A. Azure Queue
B. Azure Files
D. Azure Blob
Has to be C: Azure Table storage is a service that stores non-relational structured data (also
known as structured NoSQL data) in the cloud, providing a key/attribute store with a schemaless
design. Because Table storage is schemaless, it's easy to adapt your data as the needs of your
application evolve. Access to Table storage data is fast and cost-effective for many types of
applications, and is typically lower in cost than traditional SQL for similar volumes of data.
Question #: 197
Answer: Semi-structured
Semi-structured data is data that does not conform to a strict relational model but still has some
organizational structure, often through key-value pairs or graph nodes and edges. In a graph
database, relationships between social media users and their followers are represented as edges
connecting different nodes (users), providing a form of structure but not as rigid as in traditional
relational databases.
Question #: 199
Which node in the Azure portal should you use to assign a user the Reader role for a resource
group? To answer, select the node in the answer area.
NOTE: Each correct selection is worth one point.
Hot Area:
Box 1: Overview -
When you assign roles, you must specify a scope. Scope is the set of resources the access applies
to. In Azure, you can specify a scope at four levels from broad to narrow: management group,
subscription, resource group, and resource.
1. Sign in to the Azure portal.
2. In the Search box at the top, search for the scope you want to grant access to. For example,
search for Management groups, Subscriptions, Resource groups, or a specific resource.
3. Click the specific resource for that scope.
4. The following shows an example resource group.
Box 2: Access control (IAM)
Access control (IAM) is the page that you typically use to assign roles to grant access to Azure
resources. It's also known as identity and access management
(IAM) and appears in several locations in the Azure portal.
1. Click Access control (IAM).
The following shows an example of the Access control (IAM) page for a resource group.
2. Click the Role assignments tab to view the role assignments at this scope.
3. Click Add > Add role assignment.
If you don't have permissions to assign roles, the Add role assignment option will be disabled.
4. The Add role assignment page opens.
Reference:
https://docs.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal
Question #: 200
You plan to deploy an app. The app requires a nonrelational data service that will provide latency
guarantees of less than 10-ms for reads and writes.
What should you include in the solution?
A. Azure Blob storage
B. Azure Files
D. Azure Cosmos DB
The read latency for all consistency levels is always guaranteed to be less than 10 milliseconds
at the 99th percentile; the average read latency, at the 50th percentile, is typically 4
milliseconds or less. The write latency for all consistency levels is always guaranteed to be
less than 10 milliseconds at the 99th percentile; the average write latency, at the 50th
percentile, is usually 5 milliseconds or less. Azure Cosmos DB accounts that span several regions
and are configured with strong consistency are an exception to this guarantee.
https://learn.microsoft.com/en-us/azure/cosmos-db/consistency-levels