The latest Microsoft DP-900 Azure Data Fundamentals practice exam questions and answers (Q&A) are available free of charge to help you prepare for the Microsoft DP-900 Azure Data Fundamentals exam and earn the certification.
Table of Contents
- DP-900 Question 31
- Exam Question
- Correct Answer
- DP-900 Question 32
- Exam Question
- Correct Answer
- Explanation
- Reference
- DP-900 Question 33
- Exam Question
- Correct Answer
- Explanation
- Reference
- DP-900 Question 34
- Exam Question
- Correct Answer
- DP-900 Question 35
- Exam Question
- Correct Answer
- Reference
- DP-900 Question 36
- Exam Question
- Correct Answer
- Reference
- DP-900 Question 37
- Exam Question
- Correct Answer
- Reference
- DP-900 Question 38
- Exam Question
- Correct Answer
- Explanation
- Reference
- DP-900 Question 39
- Exam Question
- Correct Answer
- Reference
- DP-900 Question 40
- Exam Question
- Correct Answer
- Reference
DP-900 Question 31
Exam Question
Which two activities can be performed entirely by using the Microsoft Power BI service without relying on Power BI Desktop?
A. report and dashboard creation
B. report sharing and distribution
C. data modeling
D. data acquisition and preparation
Correct Answer
A. report and dashboard creation
D. data acquisition and preparation
DP-900 Question 32
Exam Question
What are three characteristics of an Online Transaction Processing (OLTP) workload?
A. denormalized data
B. heavy writes and moderate reads
C. light writes and heavy reads
D. schema on write
E. schema on read
F. normalized data
Correct Answer
B. heavy writes and moderate reads
D. schema on write
F. normalized data
Explanation
B. heavy writes and moderate reads: Transactional workloads tend to involve heavy writes and moderate reads.
D. schema on write: A typical trait of transactional data is a schema that is strongly enforced on write.
F. normalized data: Transactional data tends to be highly normalized.
Reference
Microsoft Learn > Azure > Architecture > Data Architecture Guide > Online transaction processing (OLTP)
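To make the schema-on-write and normalization traits concrete, here is a minimal, hypothetical sketch using Python's built-in sqlite3 module; the table names and columns are invented for illustration and are not part of the exam question. The schema is enforced at write time: the foreign-key constraint rejects an order row that references a non-existent customer, and customer details live in a separate, normalized table instead of being repeated on every order.

```python
import sqlite3

# In-memory database purely for illustration; table names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

# Normalized design: customer attributes stored once, referenced by orders.
conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )""")
conn.execute("""
    CREATE TABLE sales_order (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customer(customer_id),
        amount      REAL NOT NULL
    )""")

conn.execute("INSERT INTO customer VALUES (1, 'Contoso')")
conn.execute("INSERT INTO sales_order VALUES (100, 1, 49.99)")   # accepted

# Schema on write: this violates the foreign-key constraint and is rejected.
try:
    conn.execute("INSERT INTO sales_order VALUES (101, 999, 10.00)")
except sqlite3.IntegrityError as err:
    print("Rejected at write time:", err)
```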
DP-900 Question 33
Exam Question
Match the types of activities to the appropriate Azure Data Factory activities. Each activity type may be used once, more than once, or not at all.
Activity Types:
- Control
- Data movement
- Data transformation
Azure Data Factory Activities:
- Copy
- Mapping data flow
- Lookup
Correct Answer
Data movement: Copy
Data transformation: Mapping data flow
Control: Lookup
Explanation
Copy: Data movement. The Copy activity copies data from a source data store to a sink data store.
Mapping data flow: Data transformation. A pipeline could contain a set of activities that ingest and clean log data, and then kick off a mapping data flow to analyze the log data.
Lookup: Control. The Lookup activity is a control flow activity that reads or looks up a record, table name, or value from an external source; its output can be referenced by succeeding activities.
Reference
Microsoft Learn > Azure > Data Factory > Pipelines and activities in Azure Data Factory and Azure Synapse Analytics > Data transformation activities
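As an illustration of where the three activity types sit in a pipeline, below is an abbreviated, hand-written sketch of a pipeline definition expressed as a Python dict. The dataset and data flow names are placeholders, and many required properties are simplified or omitted; consult the Azure Data Factory documentation for the authoritative schema.

```python
# Abbreviated sketch of an Azure Data Factory pipeline definition as a Python
# dict. Dataset/data-flow names are placeholders and many required properties
# are omitted; see the ADF documentation for full activity schemas.
pipeline = {
    "name": "LogProcessingPipeline",
    "properties": {
        "activities": [
            {   # Control activity: look up a value used by later activities.
                "name": "GetWatermark",
                "type": "Lookup",
                "typeProperties": {"dataset": {"referenceName": "WatermarkTable"}},
            },
            {   # Data movement activity: copy from a source store to a sink store.
                "name": "CopyRawLogs",
                "type": "Copy",
                "inputs": [{"referenceName": "RawLogsBlob"}],
                "outputs": [{"referenceName": "StagingBlob"}],
            },
            {   # Data transformation activity: run a mapping data flow.
                "name": "CleanAndAggregate",
                "type": "ExecuteDataFlow",
                "typeProperties": {"dataflow": {"referenceName": "LogCleanupFlow"}},
            },
        ]
    },
}
```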
DP-900 Question 34
Exam Question
Your company has a reporting solution that has paginated reports. The reports query a dimensional model in a data warehouse.
Which type of processing does the reporting solution use?
A. stream processing
B. batch processing
C. Online Analytical Processing (OLAP)
D. Online Transaction Processing (OLTP)
Correct Answer
C. Online Analytical Processing (OLAP)
DP-900 Question 35
Exam Question
Which Azure Data Factory component initiates the execution of a pipeline?
A. a control flow
B. a trigger
C. a parameter
D. an activity
Correct Answer
B. a trigger
Reference
Microsoft Learn > Azure > Data Factory > Pipeline execution and triggers in Azure Data Factory or Azure Synapse Analytics
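To show how a trigger initiates pipeline execution, here is a simplified, hand-written sketch of a schedule trigger definition expressed as a Python dict. The pipeline name and recurrence values are placeholders, and the properties are abbreviated rather than copied from the official schema.

```python
# Simplified sketch of an Azure Data Factory schedule trigger as a Python dict.
# Names and recurrence values are placeholders; see the ADF documentation for
# the authoritative trigger schema.
schedule_trigger = {
    "name": "NightlyTrigger",
    "properties": {
        "type": "ScheduleTrigger",
        "typeProperties": {
            "recurrence": {
                "frequency": "Day",        # run once per day
                "interval": 1,
                "startTime": "2024-01-01T01:00:00Z",
                "timeZone": "UTC",
            }
        },
        # The trigger references the pipeline whose execution it initiates.
        "pipelines": [
            {"pipelineReference": {"referenceName": "LogProcessingPipeline",
                                   "type": "PipelineReference"}}
        ],
    },
}
```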
DP-900 Question 36
Exam Question
You have a dedicated SQL pool in Azure Synapse Analytics that is actively used for only eight hours each night.
You need to minimize the cost of the dedicated SQL pool as much as possible during idle times. The solution must ensure that the data remains intact.
What should you do on the SQL pool?
A. Scale down the data warehouse units (DWUs).
B. Pause the pool.
C. Create a user-defined restore point.
D. Delete the pool.
Correct Answer
B. Pause the pool.
Reference
Microsoft Learn > Azure > Synapse Analytics > Manage compute for dedicated SQL pool (formerly SQL DW) in Azure Synapse Analytics
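A hedged sketch of automating the pause/resume pattern with the azure-mgmt-synapse Python SDK is shown below. The subscription, resource group, workspace, and pool names are placeholders, and the exact operation names (for example, begin_pause versus pause) can vary between SDK versions, so treat this as an outline rather than a verified script.

```python
# Sketch only: resource names are placeholders, and the pause/resume method
# names may differ between azure-mgmt-synapse versions. Pausing releases the
# compute (so DWU charges stop) while the stored data remains intact.
from azure.identity import DefaultAzureCredential
from azure.mgmt.synapse import SynapseManagementClient

client = SynapseManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Pause the dedicated SQL pool after the nightly eight-hour window...
client.sql_pools.begin_pause("<resource-group>", "<workspace>", "<sql-pool>").result()

# ...and resume it before the next window starts.
client.sql_pools.begin_resume("<resource-group>", "<workspace>", "<sql-pool>").result()
```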
DP-900 Question 37
Exam Question
You need to gather real-time telemetry data from a mobile application.
Which type of workload describes this scenario?
A. Online Transaction Processing (OLTP)
B. batch
C. massively parallel processing (MPP)
D. streaming
Correct Answer
D. streaming
Reference
Microsoft Learn > Azure > Azure Monitor > Azure Monitor overview
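For context, streaming telemetry from a mobile application is typically ingested through a service such as Azure Event Hubs and then consumed by a stream-processing engine. The snippet below is a minimal, illustrative producer using the azure-eventhub Python SDK; the connection string, event hub name, and payload values are placeholders.

```python
# Minimal, illustrative telemetry producer using the azure-eventhub SDK.
# Connection string, event hub name, and payload values are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

producer = EventHubProducerClient.from_connection_string(
    "<event-hubs-connection-string>", eventhub_name="<telemetry-hub>"
)

# Each telemetry reading is sent as it occurs, rather than in periodic batches,
# which is what makes this a streaming workload.
batch = producer.create_batch()
batch.add(EventData(json.dumps({"device": "phone-123", "battery": 0.82})))
producer.send_batch(batch)
producer.close()
```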
DP-900 Question 38
Exam Question
Which setting can only be configured during the creation of an Azure Cosmos DB account?
A. geo-redundancy
B. multi-region writes
C. production or non-production account type
D. API
Correct Answer
D. API
Explanation
The API is selected when the Azure Cosmos DB account is created and cannot be changed afterward; you must create a separate account for each API.
Reference
Microsoft Learn > Azure > Azure Cosmos DB > Manage an Azure Cosmos DB account using the Azure portal
DP-900 Question 39
Exam Question
For each of the following statements, select Yes if the statement is true. Otherwise, select No.
Statement 1: Azure Table storage supports multiple read replicas.
Statement 2: Azure Table storage supports multiple write regions.
Statement 3: The Azure Cosmos DB Table API supports multiple read replicas.
Statement 4: The Azure Cosmos DB Table API supports multiple write regions.
Correct Answer
Statement 1: Azure Table storage supports multiple read replicas: Yes
Statement 2: Azure Table storage supports multiple write regions: No
Statement 3: The Azure Cosmos DB Table API supports multiple read replicas: Yes
Statement 4: The Azure Cosmos DB Table API supports multiple write regions: Yes
Reference
Microsoft Learn > Azure > Azure Cosmos DB > Achieve high availability with Azure Cosmos DB
DP-900 Question 40
Exam Question
At which two levels can you set the throughput for an Azure Cosmos DB account?
A. database
B. item
C. container
D. partition
Correct Answer
A. database
C. container
Reference
Microsoft Learn > Azure > Azure Cosmos DB > Introduction to provisioned throughput in Azure Cosmos DB
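To illustrate the two levels, the sketch below provisions throughput at both the database level (shared across its containers) and the container level, using the azure-cosmos Python SDK; the endpoint, key, and database/container names are placeholders.

```python
# Illustrative sketch using the azure-cosmos SDK; endpoint, key, and names are
# placeholders. Throughput (RU/s) can be provisioned at the database level
# (shared by its containers) or dedicated to an individual container.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("<account-endpoint>", credential="<account-key>")

# Database-level throughput, shared across the containers in the database.
database = client.create_database_if_not_exists("sales", offer_throughput=400)

# Container-level throughput, dedicated to this container only.
container = database.create_container_if_not_exists(
    id="orders",
    partition_key=PartitionKey(path="/customerId"),
    offer_throughput=400,
)
```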