Certscare DP-600 Questions by Kirby, 15-04-2024 (12 Q&A)



Free Questions for DP-600


Shared by Kirby on 15-04-2024
For More Free Questions and Preparation Resources

Check the Links on Last Page



Question 1
Question Type: MultipleChoice

Case Study: Mix Questions


You have a semantic model named Model1. Model1 contains five tables that all use Import
mode. Model1 contains a dynamic row-level security (RLS) role named HR. The HR role filters
employee data so that HR managers see only the data of the department to which they are
assigned.

You publish Model1 to a Fabric tenant and configure RLS role membership. You share the model
and related reports to users.

An HR manager reports that the data they see in a report is incomplete.

What should you do to validate the data seen by the HR Manager?

Options:
A- Ask the HR manager to open the report in Microsoft Power BI Desktop.
B- Select Test as role to view the data as the HR role.
C- Select Test as role to view the report as the HR manager.
D- Filter the data in the report to match the intended logic of the filter for the HR department.

Answer:
B

Explanation:
To validate the data seen by the HR manager, you should use the 'Test as role' feature in Power
BI service. This allows you to see the data exactly as it would appear for the HR role, considering
the dynamic RLS setup. Here is how you would proceed:

Navigate to the Power BI service and locate Model1.

Access the dataset settings for Model1.

Find the security/RLS settings where you configured the roles.

Use the 'Test as role' feature to simulate the report viewing experience as the HR role.

Review the data and the filters applied to ensure that the RLS is functioning correctly.

If discrepancies are found, adjust the RLS expressions or the role membership as needed.
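
For context, a dynamic RLS role such as HR is usually defined by a DAX filter expression that resolves the signed-in user. The sketch below is illustrative only; the Employee and HRManager tables and their columns are hypothetical names, not taken from the case study:

```dax
-- Hypothetical dynamic RLS filter applied to the Employee table:
-- each HR manager sees only rows for the department assigned to them.
Employee[DepartmentId]
    = LOOKUPVALUE (
        HRManager[DepartmentId],
        HRManager[Email], USERPRINCIPALNAME ()
    )
```

When you use 'Test as role', USERPRINCIPALNAME() resolves to the identity you are impersonating, which is why the feature reproduces exactly what the HR manager sees.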

Question 2
Question Type: MultipleChoice

Case Study: Mix Questions


You have a Fabric tenant that contains a Microsoft Power BI report.

You are exploring a new semantic model.

You need to display the following column statistics:

* Count

* Average

* Null count

* Distinct count

* Standard deviation

Which Power Query function should you run?

Options:
A- Table.FuzzyGroup
B- Table.Profile
C- Table.View
D- Table.Schema

Answer:
B

Explanation:

The Table.Profile function in Power Query is used to generate column statistics such as count,
average, null count, distinct count, and standard deviation. You can use this function as follows:

Invoke the Power Query Editor.

Apply the Table.Profile function to your table.

The result will be a table where each row represents a column from the original table, and each
column in the result represents a different statistic such as those listed in the requirement.
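
As a sketch of what this looks like in practice (the Orders query name is hypothetical), applying Table.Profile to any table yields one profile row per source column:

```powerquery-m
let
    // Hypothetical source query.
    Source = Orders,
    // One row per column of Orders, with Count, NullCount, DistinctCount,
    // Min, Max, Average, and StandardDeviation columns in the output.
    Profile = Table.Profile(Source)
in
    Profile
```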

Question 3
Question Type: MultipleChoice

Case Study: Mix Questions


You have a Fabric tenant that contains a new semantic model in OneLake.

You use a Fabric notebook to read the data into a Spark DataFrame.

You need to evaluate the data to calculate the min, max, mean, and standard deviation values
for all the string and numeric columns.

Solution: You use the following PySpark expression:

df.explain()

Does this meet the goal?

Options:
A- Yes
B- No

Answer:
B

Explanation:
The df.explain() method does not meet the goal. It displays the physical plan that Spark will
execute for a DataFrame; it does not calculate any statistics. To compute min, max, mean, and
standard deviation for the string and numeric columns, use df.summary() instead. Reference:
the explain() and summary() functions in the PySpark documentation.

Question 4
Question Type: Hotspot

Case Study: Mix Questions


You have a Fabric warehouse that contains a table named Sales.Orders. Sales.Orders contains
the following columns.

You need to write a T-SQL query that will return the following columns.

How should you complete the code? To answer, select the appropriate options in the answer
area.

NOTE: Each correct selection is worth one point.



Answer:
See the Answer in the Premium Version!

Question 5
Question Type: Hotspot

Case Study: Mix Questions


You have a Microsoft Power BI report and a semantic model that uses Direct Lake mode. From
Power BI Desktop, you open Performance analyzer as shown in the following exhibit.

Use the drop-down menus to select the answer choice that completes each statement based on
the information presented in the graphic. NOTE: Each correct selection is worth one point.

Answer:
See the Answer in the Premium Version!

Question 6
Question Type: MultipleChoice

Case Study: Mix Questions


You have a Fabric tenant that contains a warehouse.

Several times a day, the performance of all warehouse queries degrades. You suspect that Fabric
is throttling the compute used by the warehouse.

What should you use to identify whether throttling is occurring?

Options:
A- the Capacity settings
B- the Monitoring hub
C- dynamic management views (DMVs)
D- the Microsoft Fabric Capacity Metrics app

Answer:
D

Explanation:
To identify whether throttling is occurring, use the Microsoft Fabric Capacity Metrics app (D).
The app shows capacity utilization over time, including compute smoothing and throttling
events, so you can confirm whether the warehouse queries are being throttled during the
periods when performance degrades. Reference: the Microsoft Fabric Capacity Metrics app
documentation.

Question 7
Question Type: Hotspot

Case Study: Mix Questions


You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a
table named Nyctaxi_raw. Nyctaxi_raw contains the following columns.

You create a Fabric notebook and attach it to Lakehouse1.

You need to use PySpark code to transform the data. The solution must meet the following
requirements:

* Add a column named pickupDate that will contain only the date portion of pickupDateTime.

* Filter the DataFrame to include only rows where fareAmount is a positive number that is less
than 100.

How should you complete the code? To answer, select the appropriate options in the answer
area. NOTE: Each correct selection is worth one point.

Answer:
See the Answer in the Premium Version!

Question 8
Question Type: MultipleChoice

Case Study: Mix Questions


You are analyzing customer purchases in a Fabric notebook by using PySpark. You have the
following DataFrames:

You need to join the DataFrames on the customer_id column. The solution must minimize data
shuffling. You write the following code.

Which code should you run to populate the results DataFrame?

A)

B)

C)

D)

Options:
A- Option A
B- Option B
C- Option C
D- Option D

Answer:
A

Explanation:
The correct code to populate the results DataFrame with minimal data shuffling is Option A.
Using the broadcast function in PySpark minimizes data movement by broadcasting the smaller
DataFrame (customers) to every node in the cluster. This is ideal when one DataFrame is much
smaller than the other, as customers is here. Reference: the join strategy hints and broadcast
sections of the Apache Spark documentation.

Question 9
Question Type: DragDrop

Case Study: Mix Questions


You have a Fabric tenant that contains a lakehouse named Lakehouse1.

Readings from 100 IoT devices are appended to a Delta table in Lakehouse1. Each set of readings
is approximately 25 KB. Approximately 10 GB of data is received daily.

All the table and SparkSession settings are set to the default.

You discover that queries are slow to execute. In addition, the lakehouse storage contains data
and log files that are no longer used.

You need to remove the files that are no longer used and combine small files into larger files with
a target size of 1 GB per file.

What should you do? To answer, drag the appropriate actions to the correct requirements. Each
action may be used once, more than once, or not at all. You may need to drag the split bar
between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Answer:
See the Answer in the Premium Version!
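
In Spark SQL terms, the two requirements map to OPTIMIZE (compact small files) and VACUUM (remove unreferenced data and log files). The sketch below is an assumption-laden illustration: the readings table name is hypothetical, and whether the Delta maxFileSize setting is honored depends on your runtime:

```sql
-- Assumed table name; raise the compaction target toward 1 GB (value in bytes).
SET spark.databricks.delta.optimize.maxFileSize = 1073741824;

-- Combine small files into larger ones.
OPTIMIZE readings;

-- Remove data and log files that are no longer referenced by the table.
VACUUM readings;
```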

Question 10
Question Type: MultipleChoice

Case Study: Mix Questions




You have an Azure Repos Git repository named Repo1 and a Fabric-enabled Microsoft Power BI
Premium capacity. The capacity contains two workspaces named Workspace1 and Workspace2.
Git integration is enabled at the workspace level.

You plan to use Microsoft Power BI Desktop and Workspace1 to make version-controlled changes
to a semantic model stored in Repo1. The changes will be built and deployed to Workspace2 by
using Azure Pipelines.

You need to ensure that report and semantic model definitions are saved as individual text files
in a folder hierarchy. The solution must minimize development and maintenance effort.

In which file format should you save the changes?

Options:
A- PBIP
B- PBIT
C- PBIX
D- PBIDS

Answer:
A

Explanation:
PBIP (Power BI Project) is the Power BI Desktop save format designed for version control.
Instead of a single binary file, a PBIP saves the report and semantic model definitions as
individual text files in a folder hierarchy, so changes can be diffed, tracked in Git, and built
and deployed by Azure Pipelines with minimal development and maintenance effort. The other
formats do not meet the requirement: PBIX is a single binary file, PBIT is a template, and PBIDS
stores only data source connection information.

Question 11
Question Type: Hotspot

Case Study: Mix Questions




You have a Fabric tenant that contains a workspace named Workspace1. Workspace1 contains a
lakehouse named Lakehouse1 and a warehouse named Warehouse1.

You need to create a new table in Warehouse1 named POSCustomers by querying the customer
table in Lakehouse1.

How should you complete the T-SQL statement? To answer, select the appropriate options in the
answer area.

NOTE: Each correct selection is worth one point.

Answer:
See the Answer in the Premium Version!
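
As a hedged sketch of the pattern this question tests, a Fabric warehouse can create a table from a lakehouse's SQL analytics endpoint with a cross-database CREATE TABLE AS SELECT; the dbo schema on both sides is an assumption, since the actual columns are not shown:

```sql
-- Run in Warehouse1: cross-database query into Lakehouse1 (schemas assumed to be dbo).
CREATE TABLE dbo.POSCustomers
AS
SELECT *
FROM Lakehouse1.dbo.customer;
```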

Question 12
Question Type: MultipleChoice

Case Study: Mix Questions


You have a Fabric tenant that contains a lakehouse named Lakehouse1. Lakehouse1 contains a
subfolder named Subfolder1 that contains CSV files. You need to convert the CSV files into the
delta format that has V-Order optimization enabled. What should you do from Lakehouse
explorer?

Options:
A- Use the Load to Tables feature.
B- Create a new shortcut in the Files section.
C- Create a new shortcut in the Tables section.
D- Use the Optimize feature.

Answer:
A

Explanation:
To convert the CSV files in Subfolder1 into the Delta format, use the Load to Tables feature (A)
from Lakehouse explorer. This loads the files as Delta tables in the Tables section, and Fabric
writes Delta tables with V-Order optimization enabled by default. Shortcuts only reference
existing data without converting it, and the Optimize feature compacts an existing Delta table
rather than converting CSV files. Reference: the Load to Tables and V-Order topics in the
Microsoft Fabric lakehouse documentation.

To Get Premium Files for DP-600 Visit


https://www.p2pexams.com/products/dp-600

For More Free Questions Visit


https://www.p2pexams.com/microsoft/pdf/dp-600
