DP-900 Microsoft Azure Data Fundamentals Exam Questions and Answers – Page 2 Part 1

This page provides the latest free practice exam questions and answers (Q&A) for the Microsoft DP-900 Azure Data Fundamentals certification exam, to help you prepare for the exam and earn the Microsoft Certified: Azure Data Fundamentals certification.

DP-900 Question 121

Exam Question

To complete the sentence, select the appropriate option in the answer area.

A visualization that illustrates a university’s current student enrollment per department is an example of __________ analytics.

Answer Area:
A. cognitive
B. descriptive
C. predictive
D. prescriptive

Correct Answer

B. descriptive

Explanation

Generally speaking, data analytics comes in four types (Figure 1):

  • Descriptive, to answer the question: What’s happening?
  • Diagnostic, to answer the question: Why is it happening?
  • Predictive, to answer the question: What will happen?
  • Prescriptive, to answer the question: What actions should we take?

Reference

Describe, diagnose, and predict with IoT Analytics

Figure 1: IoT Analytics Flavors

DP-900 Question 122

Exam Question

To complete the sentence, select the appropriate option in the answer area.

Creating closed caption text for audio files is an example of __________ analytics.

Answer Area:
A. cognitive
B. descriptive
C. predictive
D. prescriptive

Correct Answer

A. cognitive

Explanation

Cognitive analytics applies artificial intelligence techniques, such as speech recognition, to extract information from unstructured data. Generating closed-caption text from audio files requires speech-to-text, which is a cognitive capability rather than a description of existing business data.

Note: Azure Media Indexer enables you to make the content of your media files searchable and to generate a full-text transcript for closed captioning and keywords. You can process one media file or multiple media files in a batch.

DP-900 Question 123

Exam Question

Which statement is an example of Data Manipulation Language (DML)?

A. REVOKE
B. DISABLE
C. CREATE
D. UPDATE

Correct Answer

D. UPDATE

Explanation

Data Manipulation Language (DML) statements affect the information stored in the database. Use these statements to insert, update, and change rows in the database.

  • BULK INSERT
  • DELETE
  • INSERT
  • SELECT
  • UPDATE
  • MERGE
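
As a sketch of the difference, the snippet below runs DDL and DML statements against an in-memory SQLite database (SQLite stands in for SQL Server purely for illustration; the statement categories are the same in T-SQL):

```python
import sqlite3

# In-memory SQLite database used purely as an illustration.
conn = sqlite3.connect(":memory:")

# CREATE is Data Definition Language (DDL): it changes the schema.
conn.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, price REAL)")

# INSERT, UPDATE, DELETE, and SELECT are DML: they work on the rows.
conn.execute("INSERT INTO products (id, price) VALUES (1, 9.99)")
conn.execute("UPDATE products SET price = 12.50 WHERE id = 1")
price = conn.execute("SELECT price FROM products WHERE id = 1").fetchone()[0]
print(price)  # 12.5
```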

Reference

Microsoft Learn > SQL > SQL Server > Transact-SQL statements

DP-900 Question 124

Exam Question

Your company recently reported sales from the third quarter.

You have the chart shown in the following exhibit.

Which type of analysis is shown in the fourth quarter?

A. predictive
B. prescriptive
C. descriptive
D. diagnostic

Correct Answer

A. predictive

Explanation

Predictive, to answer the question: What will happen?

Reference

Describe, diagnose, and predict with IoT Analytics

DP-900 Question 125

Exam Question

To answer, drag the appropriate locations to the correct stages. Each location may be used once, more than once, or not at all.

Your company plans to load data from a customer relationship management (CRM) system to a data warehouse by using an extract, load, and transform (ELT) process.

Where does data processing occur for each stage of the ELT process?

Locations:

  • A standalone data analysis tool
  • The CRM system
  • The data warehouse

Stages:

  • Extract:
  • Load:
  • Transform:

Correct Answer

Extract: The CRM system
Load: The data warehouse
Transform: The data warehouse

Explanation

Extract: The CRM system
Data is extracted from the source CRM system.

Load: The data warehouse
The extracted data is loaded, largely unchanged, into the data warehouse.

Transform: The data warehouse
In an ELT process, transformation takes place in the target data store after loading, using the processing power of the data warehouse itself rather than a separate transformation engine. The transformation usually involves operations such as filtering, sorting, aggregating, joining, cleaning, deduplicating, and validating data.
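
The ELT flow can be sketched in a few lines, with an in-memory SQLite database standing in for the data warehouse (the table and column names here are illustrative, not part of the question):

```python
import sqlite3

# Pretend source: rows extracted from a hypothetical CRM system.
crm_rows = [("Ada", "ada@example.com"), ("Ada", "ada@example.com"), ("Bob", None)]

warehouse = sqlite3.connect(":memory:")  # stand-in for the data warehouse

# Load: raw data lands in the warehouse unchanged.
warehouse.execute("CREATE TABLE raw_contacts (name TEXT, email TEXT)")
warehouse.executemany("INSERT INTO raw_contacts VALUES (?, ?)", crm_rows)

# Transform: runs inside the warehouse (deduplicate, drop rows with no email).
warehouse.execute("""
    CREATE TABLE contacts AS
    SELECT DISTINCT name, email FROM raw_contacts WHERE email IS NOT NULL
""")
count = warehouse.execute("SELECT COUNT(*) FROM contacts").fetchone()[0]
print(count)  # 1
```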

Reference

Microsoft Learn > Azure > Architecture > Data Architecture Guide > Extract, transform, and load (ETL)

DP-900 Question 126

Exam Question

Match the types of workloads to the appropriate scenarios. Each workload type may be used once, more than once, or not at all.

Workload Types:

  • Batch
  • Streaming

Scenarios:

  • Data for a product catalog will be loaded every 12 hours to a data warehouse.
  • Thousands of data sets per second for online purchases will be loaded into a data warehouse in real time.
  • Updates to inventory data will be loaded to a data warehouse every 1 million transactions.

Correct Answer

Batch: Data for a product catalog will be loaded every 12 hours to a data warehouse.
Streaming: Thousands of data sets per second for online purchases will be loaded into a data warehouse in real time.
Batch: Updates to inventory data will be loaded to a data warehouse every 1 million transactions.

Explanation

Batch: Data for a product catalog will be loaded every 12 hours to a data warehouse.
Batch processing refers to the processing of blocks of data that have already been stored over a period of time.

Streaming: Thousands of data sets per second for online purchases will be loaded into a data warehouse in real time.
Stream processing is a big data technology that processes data in real time as it arrives, detecting conditions within a short period of receiving the data. It allows data to be fed into analytics tools as soon as it is generated, producing near-instant results.

Batch: Updates to inventory data will be loaded to a data warehouse every 1 million transactions.
Loading on a transaction-count threshold still accumulates data and processes it together as a block, so it is a batch workload even though the trigger is a count rather than a schedule.
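
The batch/streaming distinction can be sketched in a few lines of Python (a toy illustration, not an Azure API):

```python
# Toy event source: five purchase events (illustrative only).
events = [{"amount": i} for i in range(5)]

# Batch: accumulate events over a period, then process the whole block at once.
def process_batch(batch):
    return sum(e["amount"] for e in batch)

# Streaming: handle each event individually, as soon as it arrives.
def process_stream(event_iter):
    running, totals = 0, []
    for e in event_iter:
        running += e["amount"]  # result is updated per event, with low latency
        totals.append(running)
    return totals

batch_total = process_batch(events)           # one result after the batch closes
stream_totals = process_stream(iter(events))  # a result after every single event
print(batch_total, stream_totals[-1])  # 10 10
```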

Reference

Microsoft Learn > Azure > Architecture > Data Architecture Guide > Choose a batch processing technology in Azure

DP-900 Question 127

Exam Question

Match the Azure Data Factory components to the appropriate descriptions. Each component may be used once, more than once, or not at all.

Components:

  • Dataset
  • Linked service
  • Mapping data flow
  • Pipeline

Descriptions:

  • A representation of data structures within data stores
  • The information used to connect to external resources
  • A logical grouping of activities that performs a unit of work and can be scheduled

Correct Answer

Dataset: A representation of data structures within data stores
Linked service: The information used to connect to external resources
Pipeline: A logical grouping of activities that performs a unit of work and can be scheduled

Explanation

Dataset: A representation of data structures within data stores
In Azure Data Factory, a dataset is a named view of data that simply points to or references the data you want to use in your activities; it represents data structures within data stores.

Linked service: The information used to connect to external resources
Linked services are much like connection strings, which define the connection information needed for Data Factory to connect to external resources.

Pipeline: A logical grouping of activities that performs a unit of work and can be scheduled
A pipeline is a logical grouping of activities that together perform a task.
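
The three components can be illustrated with skeletal JSON definitions in the shape Data Factory uses (shown as Python dicts; the names and connection string are placeholders, and non-essential properties are omitted):

```python
# Linked service: the connection information for an external resource.
linked_service = {
    "name": "MyStorageLinkedService",  # hypothetical name
    "properties": {
        "type": "AzureBlobStorage",
        "typeProperties": {"connectionString": "<connection string>"},
    },
}

# Dataset: a representation of data structures, bound to a linked service.
dataset = {
    "name": "MyInputDataset",  # hypothetical name
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "MyStorageLinkedService",
            "type": "LinkedServiceReference",
        },
    },
}

# Pipeline: a logical grouping of activities that performs a unit of work.
pipeline = {
    "name": "MyCopyPipeline",  # hypothetical name
    "properties": {
        "activities": [{"name": "CopyData", "type": "Copy"}],
    },
}
print(linked_service["name"], dataset["name"], pipeline["name"])
```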

DP-900 Question 128

Exam Question

What are two characteristics of real-time data processing?

A. Data is processed periodically
B. Low latency is expected
C. High latency is acceptable
D. Data is processed as it is created

Correct Answer

B. Low latency is expected
D. Data is processed as it is created

Explanation

Real-time processing deals with streams of data that are captured in real time and processed with minimal latency to generate real-time (or near-real-time) reports or automated responses.

Reference

Microsoft Learn > Azure > Architecture > Data Architecture Guide > Real-time processing

DP-900 Question 129

Exam Question

Match the Azure Data Lake Storage Gen2 terms to the appropriate levels in the hierarchy. Each term may be used once, more than once, or not at all.

Correct Answer

<box 1>: Azure Storage account

<box 2>: File share

Explanation

<box 1>: Azure Storage account

Azure file shares are deployed into storage accounts, which are top-level objects that represent a shared pool of storage.

<box 2>: File share

Reference

Microsoft Learn > Azure > Storage > Files > Create an SMB Azure file share

DP-900 Question 130

Exam Question

Match the types of data to the appropriate Azure data services. Each data type may be used once, more than once, or not at all.

Data Types:

  • Image files
  • Key/value pairs
  • Relationships between employees

Azure Data Services:

  • Azure Blob storage
  • Azure Cosmos DB Gremlin API
  • Azure Table storage

Correct Answer

Image files: Azure Blob storage
Key/value pairs: Azure Table storage
Relationships between employees: Azure Cosmos DB Gremlin API

Explanation

Image files: Azure Blob storage

Azure Blob storage is optimized for storing massive amounts of unstructured data, such as image files.

Key/value pairs: Azure Table storage

Azure Table storage is a NoSQL key/value store: each entity is identified by a partition key and a row key and holds a set of properties.

Relationships between employees: Azure Cosmos DB Gremlin API

The Azure Cosmos DB Gremlin API stores data as a graph of vertices (entities) and edges (relationships), making it a natural fit for modeling relationships between employees. One-to-many relationships, such as one department having many employees, can also be implemented in the Azure Table service, but a graph database represents them directly.
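
The Table storage key/value model can be sketched in plain Python (dictionaries stand in for entities; all names here are illustrative):

```python
# Each Table storage entity is addressed by (PartitionKey, RowKey) and carries
# arbitrary properties -- a key/value model. Dicts stand in for entities here.
entities = {
    ("sales", "emp001"): {"Name": "Ada", "Role": "Analyst"},
    ("sales", "emp002"): {"Name": "Bob", "Role": "Manager"},
    ("hr", "emp003"): {"Name": "Cal", "Role": "Recruiter"},
}

# Point lookup by the full key -- the cheap, canonical key/value access path.
entity = entities[("sales", "emp002")]

# Grouping employees under a department PartitionKey is one simple way to
# model a one-to-many relationship in a key/value store.
sales_team = [v["Name"] for (pk, rk), v in entities.items() if pk == "sales"]
print(entity["Name"], sales_team)  # Bob ['Ada', 'Bob']
```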

Reference

Microsoft Learn > Azure > Storage > Tables > Modeling relationships