The latest Microsoft DP-900 Azure Data Fundamentals certification practice exam question and answer (Q&A) dumps are available free, helping you pass the Microsoft DP-900 Azure Data Fundamentals exam and earn the Microsoft DP-900 Azure Data Fundamentals certification.
Table of Contents
- DP-900 Question 31
- Exam Question
- Correct Answer
- DP-900 Question 32
- Exam Question
- Correct Answer
- Explanation
- DP-900 Question 33
- Exam Question
- Correct Answer
- Explanation
- DP-900 Question 34
- Exam Question
- Correct Answer
- DP-900 Question 35
- Exam Question
- Correct Answer
- DP-900 Question 36
- Exam Question
- Correct Answer
- DP-900 Question 37
- Exam Question
- Correct Answer
- DP-900 Question 38
- Exam Question
- Correct Answer
- Explanation
- DP-900 Question 39
- Exam Question
- Correct Answer
- DP-900 Question 40
- Exam Question
- Correct Answer
DP-900 Question 31
Exam Question
Which two activities can be performed entirely by using the Microsoft Power BI service without relying on Power BI Desktop?
A. report and dashboard creation
B. report sharing and distribution
C. data modeling
D. data acquisition and preparation
Correct Answer
A. report and dashboard creation
D. data acquisition and preparation
DP-900 Question 32
Exam Question
What are three characteristics of an Online Transaction Processing (OLTP) workload?
A. denormalized data
B. heavy writes and moderate reads
C. light writes and heavy reads
D. schema on write
E. schema on read
F. normalized data
Correct Answer
B. heavy writes and moderate reads
D. schema on write
F. normalized data
Explanation
B. heavy writes and moderate reads: Transactional workloads tend to involve heavy writes and moderate reads.
D. schema on write: Transactional data typically has its schema enforced on write, i.e., the structure and constraints are applied when the data is written rather than when it is read.
F. normalized data: Transactional data tends to be highly normalized.
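To make these traits concrete, the short sketch below (illustrative only, using SQLite as a stand-in relational store) shows normalized tables whose schema and constraints are declared before any data is written, so an invalid write is rejected at write time.

```python
# Minimal sketch of two OLTP traits: schema on write and normalized data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")

# Schema on write: table structure and constraints are defined up front.
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        amount      REAL NOT NULL
    )
""")

# Normalized data: the customer name is stored once, not repeated per order.
conn.execute("INSERT INTO customers VALUES (1, 'Contoso')")
conn.execute("INSERT INTO orders VALUES (101, 1, 49.90)")
conn.execute("INSERT INTO orders VALUES (102, 1, 12.50)")

# A write that violates the enforced schema is rejected at write time.
try:
    conn.execute("INSERT INTO orders VALUES (103, 999, 5.00)")  # unknown customer
except sqlite3.IntegrityError as e:
    print("Rejected at write time:", e)
```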
DP-900 Question 33
Exam Question
Match the types of activities to the appropriate Azure Data Factory activities. Each activity type may be used once, more than once, or not at all.
Activity Types:
- Control
- Data movement
- Data transformation
Azure Data Factory Activities:
- Copy
- Mapping data flow
- Lookup
Correct Answer
Data movement: Copy
Data transformation: Mapping data flow
Control: Lookup
Explanation
Copy (data movement): The Copy activity moves data from a source data store to a sink data store.
Mapping data flow (data transformation): A pipeline can contain a set of activities that ingest and clean log data and then start a mapping data flow to analyze the log data.
Lookup (control): The Lookup activity is a control flow activity. It reads or looks up a record, table name, or value from an external source, and its output can be referenced by succeeding activities.
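For illustration only, the sketch below expresses such a pipeline as a Python dict that mirrors the pipeline JSON, with one activity of each kind. The activity and data flow names are made up for the example, and the source/sink types are placeholders; the mapping data flow activity appears in pipeline JSON as type ExecuteDataFlow.

```python
# Hedged sketch of an Azure Data Factory pipeline definition as a Python dict.
pipeline = {
    "name": "IngestAndAnalyzeLogs",
    "properties": {
        "activities": [
            {   # Data movement: the Copy activity moves data between stores.
                "name": "CopyLogs",
                "type": "Copy",
                "typeProperties": {"source": {"type": "BlobSource"},
                                   "sink": {"type": "SqlSink"}},
            },
            {   # Data transformation: a mapping data flow (type ExecuteDataFlow).
                "name": "TransformLogs",
                "type": "ExecuteDataFlow",
                "typeProperties": {"dataFlow": {"referenceName": "CleanLogsFlow",
                                                "type": "DataFlowReference"}},
            },
            {   # Control: the Lookup activity reads a value (e.g., a watermark)
                # that succeeding activities can reference.
                "name": "LookupWatermark",
                "type": "Lookup",
                "typeProperties": {"source": {"type": "AzureSqlSource"}},
            },
        ]
    },
}
```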
DP-900 Question 34
Exam Question
Your company has a reporting solution that has paginated reports. The reports query a dimensional model in a data warehouse.
Which type of processing does the reporting solution use?
A. stream processing
B. batch processing
C. Online Analytical Processing (OLAP)
D. Online Transaction Processing (OLTP)
Correct Answer
C. Online Analytical Processing (OLAP)
DP-900 Question 35
Exam Question
Which Azure Data Factory component initiates the execution of a pipeline?
A. a control flow
B. a trigger
C. a parameter
D. an activity
Correct Answer
B. a trigger
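In Azure Data Factory, a pipeline does not start itself; a trigger (schedule, tumbling window, or event) initiates the run. Below is a minimal, hedged sketch of a schedule trigger definition expressed as a Python dict that mirrors the trigger JSON; the names and times are illustrative only.

```python
# Hedged sketch of an ADF schedule trigger that starts a pipeline run daily.
trigger = {
    "name": "NightlyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",        # run once per day
                "interval": 1,
                "startTime": "2024-01-01T02:00:00Z",
            }
        },
        "pipelines": [
            {"pipelineReference": {"type": "PipelineReference",
                                   "referenceName": "IngestAndAnalyzeLogs"}}
        ],
    },
}
```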
DP-900 Question 36
Exam Question
You have a dedicated SQL pool in Azure Synapse Analytics that is actively used only at night, for eight hours.
You need to minimize the cost of the SQL pool during the hours it is idle. The solution must ensure that the data remains intact.
What should you do on the SQL pool?
A. Scale down the data warehouse units (DWUs).
B. Pause the pool.
C. Create a user-defined restore point.
D. Delete the pool.
Correct Answer
B. Pause the pool.
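Pausing releases the compute of the dedicated SQL pool while the underlying storage (and therefore the data) remains intact, so no compute charges accrue during the idle hours. As a rough illustration only, the sketch below calls the Synapse SQL pool pause REST operation from Python; the subscription, resource group, workspace, pool name, and api-version are placeholders and assumptions that may differ in your environment.

```python
# Hedged sketch: pause a Synapse dedicated SQL pool via the ARM REST API.
import requests
from azure.identity import DefaultAzureCredential

# Acquire a management-plane token (assumes you are signed in locally).
token = DefaultAzureCredential().get_token(
    "https://management.azure.com/.default"
).token

# Placeholder identifiers; the api-version is an assumption.
url = (
    "https://management.azure.com/subscriptions/<subscription-id>"
    "/resourceGroups/<resource-group>/providers/Microsoft.Synapse"
    "/workspaces/<workspace>/sqlPools/<sql-pool>/pause"
    "?api-version=2021-06-01"
)

resp = requests.post(url, headers={"Authorization": f"Bearer {token}"})
print(resp.status_code)  # 200/202 indicates the pause request was accepted
```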
DP-900 Question 37
Exam Question
You need to gather real-time telemetry data from a mobile application.
Which type of workload describes this scenario?
A. Online Transaction Processing (OLTP)
B. batch
C. massively parallel processing (MPP)
D. streaming
Correct Answer
D. streaming
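A streaming workload processes events continuously as they arrive rather than in scheduled batches. The sketch below is a minimal, illustrative example (not part of the exam) of sending mobile telemetry events to Azure Event Hubs with the azure-eventhub Python SDK; the connection string, hub name, and payload are placeholders.

```python
# Hedged sketch: send a telemetry event to Azure Event Hubs as it is produced.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    conn_str="<event-hubs-connection-string>",  # placeholder
    eventhub_name="mobile-telemetry",           # placeholder
)

# Each event is sent as it happens rather than collected into nightly batches.
batch = producer.create_batch()
batch.add(EventData(json.dumps(
    {"deviceId": "phone-001", "metric": "latency_ms", "value": 87}
)))
producer.send_batch(batch)
producer.close()
```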
DP-900 Question 38
Exam Question
Which setting can only be configured during the creation of an Azure Cosmos DB account?
A. geo-redundancy
B. multi-region writes
C. production or non-production account type
D. API
Correct Answer
D. API
Explanation
The API is selected when the Azure Cosmos DB account is created and cannot be changed afterward; you must create a separate account for each API.
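The API choice is part of the account definition itself, which is why it is fixed at creation time. As an illustration, the sketch below shows an ARM-style resource for a Cosmos DB account expressed as a Python dict, where the Table API is selected through the EnableTable capability; the account name, region, and apiVersion are placeholders.

```python
# Hedged sketch of an ARM-style Cosmos DB account resource as a Python dict.
cosmos_account = {
    "type": "Microsoft.DocumentDB/databaseAccounts",
    "apiVersion": "2023-04-15",          # placeholder version
    "name": "contoso-table-account",      # placeholder name
    "location": "westus2",                # placeholder region
    "properties": {
        "databaseAccountOfferType": "Standard",
        "locations": [{"locationName": "westus2", "failoverPriority": 0}],
        "capabilities": [{"name": "EnableTable"}],  # selects the Table API
    },
}
```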
DP-900 Question 39
Exam Question
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
Statement 1: Azure Table storage supports multiple read replicas.
Statement 2: Azure Table storage supports multiple write regions.
Statement 3: The Azure Cosmos DB Table API supports multiple read replicas.
Statement 4: The Azure Cosmos DB Table API supports multiple write regions.
Correct Answer
Statement 1: Azure Table storage supports multiple read replicas: Yes
Statement 2: Azure Table storage supports multiple write regions: No
Statement 3: The Azure Cosmos DB Table API supports multiple read replicas: Yes
Statement 4: The Azure Cosmos DB Table API supports multiple write regions: Yes
DP-900 Question 40
Exam Question
At which two levels can you set the throughput for an Azure Cosmos DB account?
A. database
B. item
C. container
D. partition
Correct Answer
A. database
C. container
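Provisioned throughput (RU/s) can be assigned at the database level (shared across its containers) or at the container level (dedicated to that container). The sketch below is a minimal, illustrative example using the azure-cosmos Python SDK; the endpoint, key, and names are placeholders.

```python
# Hedged sketch: set throughput at the database and container levels.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient(
    "https://<account>.documents.azure.com:443/",  # placeholder endpoint
    credential="<account-key>",                    # placeholder key
)

# Database-level (shared) throughput.
database = client.create_database_if_not_exists("sales", offer_throughput=400)

# Container-level (dedicated) throughput.
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,
)
```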