Common Technical Interview Questions and Answers, updated November 18, 2021

Exam Question 1

DR insurance companies will pay for costs immediately as they arise.

A. True
B. False

Correct Answer:
B. False
Answer Description:
Don’t count on insurance reimbursement as an immediate influx of funds following a disaster. Having DR insurance often means paying all of your bills, then later being reimbursed for what was covered. While it will vary by company, DR reimbursements can take some time to come in.

Because of this, be sure to have alternative funds readily available for after disaster strikes. You may need to finance your company’s operations with little to no income for an unknown amount of time, so plan ahead. Prior to a disaster, you can also work to develop a relationship between your organization and banking representatives. Doing so not only will make them aware of how you plan to operate in a disaster, but also could enable you to get preapproved for any necessary loans that may be needed in the future.

Exam Question 2

During a disaster, data security should be:

A. Put on the backburner while the recovery is handled
B. Attended to on an as-needed basis
C. Consistent with security during regular operations
D. None of the above

Correct Answer:
C. Consistent with security during regular operations
Answer Description:
After disaster strikes, data is at its most vulnerable, so don’t leave security as an afterthought. The best thing you can do is maintain the regular level of security during a disaster so you can rest assured that your data is protected while a recovery is taking place. This, of course, means that you need to have strong security measures in place prior to an outage or failure that can be kept up in a crisis.

Being mindful of business continuity and DR (BC/DR) while establishing security measures — and vice versa — is the easiest way to ensure that your data is secure when a disaster event takes place. If treated as totally separate entities, BC/DR and security efforts will likely suffer.

Exam Question 3

A phone tree is an outdated and inefficient form of emergency notifications.

A. True
B. False

Correct Answer:
B. False
Answer Description:
Although newer, faster methods of mass notification exist, don’t rule out having an emergency call tree established when checking off the communications box in your disaster recovery planning checklist. While speed is important and automated systems, such as mass text notifications, are available, there is a chance that those systems could go awry, depending on the disaster.

As long as phone lines and cellular networks are available, a phone tree is a reliable method of communicating information to employees during a crisis. Even if it is not the primary system being used, having a phone tree organized and distributed as a backup plan can ensure that everyone will be contacted in a reasonably timely manner. Another advantage to a phone tree over modern automated systems is that it enables two-way communication. While this could result in the callers getting more questions, it lets employees communicate and exchange information in a way an automated system cannot. It will also be immediately clear if an employee does or does not receive the message, so your organization can take further actions to relay the information if necessary.

Exam Question 4

What does RTO specify?

A. How much data you can afford to lose in a disaster
B. How long it should take for a system to fully recover following a disaster
C. Which data is mission-critical in a disaster
D. How badly the business is impacted by a disaster

Correct Answer:
B. How long it should take for a system to fully recover following a disaster
Answer Description:
RTO stands for recovery time objective and refers to the maximum amount of time that a system, network or application can be down after a disaster hits. This is not to be confused with RPO, which stands for recovery point objective. RPO establishes how old the last backup of data can be to resume normal operations after a disaster.

While both RPO and RTO can be expressed in terms of time, they do not necessarily match each other. When determining an RTO, you should take into account any compliance or other business requirements that specify acceptable amounts of downtime, as well as how quickly your current system is capable of recovering. Your RPO should be based on how frequently data needs to be backed up. If backups are performed once every 60 minutes, then the RPO is 60 minutes, and the last usable backup will be, at worst, 60 minutes old at the time of the disaster.

When planning for potential disasters, ensure that your recovery objectives are not only established, but attainable and up to date with your current requirements and capabilities.
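To make the backup-frequency arithmetic concrete, here is a minimal Python sketch of the reasoning above; the dates and the 60-minute interval are hypothetical.

```python
from datetime import datetime, timedelta

def worst_case_data_loss(backup_interval: timedelta) -> timedelta:
    # With periodic backups, the worst case is a disaster striking just
    # before the next backup would have run: one full interval of data.
    return backup_interval

def backup_satisfies_rpo(disaster_time: datetime,
                         last_backup_time: datetime,
                         rpo: timedelta) -> bool:
    # A backup is usable only if, at the moment of the disaster,
    # it is no older than the RPO.
    return disaster_time - last_backup_time <= rpo

rpo = worst_case_data_loss(timedelta(minutes=60))  # hourly backups -> 60-minute RPO
disaster = datetime(2021, 11, 18, 12, 0)
print(backup_satisfies_rpo(disaster, datetime(2021, 11, 18, 11, 30), rpo))  # True
print(backup_satisfies_rpo(disaster, datetime(2021, 11, 18, 10, 30), rpo))  # False
```

The sketch also shows why RPO and RTO are separate numbers: this calculation only bounds data loss, not how long recovery takes.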

Exam Question 5

When creating your disaster recovery planning checklist, what qualities should you prioritize when choosing a DR team leader?

A. Someone who is knowledgeable but open-minded
B. Someone who is remote but strict
C. Someone who is available but fresh to the field
D. Someone who is antisocial but detail-oriented

Correct Answer:
A. Someone who is knowledgeable but open-minded
Answer Description:
While managerial styles vary, there are a few basic traits that you should look for — and some you should avoid — when choosing a team leader. One of the most common mistakes made when choosing someone to head up a DR team is to choose the person who is available at the time. This is the fastest way to fill the role, but a level of expertise may be lacking when you go with the first available option. Availability should not be confused with accessibility, however. Rather than someone who works remotely or travels frequently, choose a leader who is on-site often and available to answer questions, offer suggestions and communicate openly with the team.

Being a strong leader is one thing, and being severe is another. Prioritize a leader who is open to hearing ideas from the team and gets along with colleagues and external partners. Finding someone focused and detail-oriented is helpful, as things can often go overlooked in a crisis situation. However, if candidates cannot communicate that information openly with the team, they may not be suited to a leadership position.

When you choose a DR leader, take the time to find someone who is experienced and knowledgeable in the field but also open to hearing input from the rest of the team.

Exam Question 6

In addition to your internal staffing, having an outside DR adviser can be beneficial to recovery planning.

A. True
B. False

Correct Answer:
A. True
Answer Description:
Having an experienced DR consultant to bring in for DR planning offers a number of advantages. They can offer an outside point of view and identify any weaknesses or areas for improvement that internal staff may miss. However, the emphasis here should be on experienced. Finding an adviser should be relatively simple for most organizations, but finding one who has all of the necessary qualifications and knowledge to aid your business is more of a challenge.

When vetting outside consultants, ensure that candidates are not only knowledgeable of IT, but your industry as well. Taking on an adviser with insight into your particular needs is critical to adequate DR planning. Also, the IT landscape is constantly evolving, so be sure that a potential adviser is up to date on the latest technologies, approaches and compliance needs.

Exam Question 7

This is an approach to software development that encourages developers, operations staff and business teams to work together to be more responsive to user demand and ultimately maximize revenue.

A. ProductDevOps
B. BizDevOps
C. DataOps
D. DevOps 2.0

Correct Answer:
B. BizDevOps
Answer Description:
While folks often refer to DevOps 2.0 as BizDevOps, that doesn’t mean they’re right, and we’re not just going to go along with it. DevOps 2.0 truly aims to expand DevOps-style collaboration, not only to the business side, but also to quality assurance, security, HR and across the whole organization. DataOps is an approach that prioritizes a data analytics pipeline, so the data group can get involved and consistently monitor the pipeline. ProductDevOps is a great name, but it doesn’t exist … yet.

Exam Question 8

Some argue this methodology is redundant because security has always been a part of DevOps since the Phoenix Project.

A. DevBlackOps
B. SciOps
C. DevSecOps
D. DevOpsIoT

Correct Answer:
C. DevSecOps
Answer Description:
Don’t let “DevBlackOps” fool you: DevSecOps is nearly as famous as DevOps itself. DevSecOps aims to integrate security throughout all aspects of the software development lifecycle. Basically, continuous quality should be as important to you as continuous delivery. Sorry, SciOps — Scientific DevOps, i.e., DevOps practices applied to scientific computing to deliver scientific insights — and sorry, DevOpsIoT; better luck in next year’s DevOps quiz.

Exam Question 9

DevOps aims to cut inefficiencies, improve communication and develop collective empathy within an organization. This DevOps methodology emphasizes the empathy portion of the equation.

A. HugOps
B. DadOps
C. HumanOps
D. SnackOps
E. HugOps and HumanOps

Correct Answer:
E. HugOps and HumanOps
Answer Description:
HumanOps and HugOps — one more of a Twitter movement and the other something you could actually pitch to your boss — assert that human health affects business health. Sometimes we forget that shipping software 24/7 means pushing real humans past normal working limits. HumanOps and HugOps are DevOps methodologies cut from the same cloth and work to develop processes that create a balanced workflow. DadOps, on the other hand, is less interested in developing empathy within your organization and more interested in telling you about the World War II book he just read. (If DadOps is new to you, it’s clear you needed to take our DevOps quiz.)

Exam Question 10

What happens when your DevOps process becomes so automated and abstracted that your infrastructure doesn’t even need a dedicated IT ops team? With the rise of IT automation and cloud computing, this methodology is quietly becoming a reality.

A. CloudOps
B. NoOps
C. ProtoOps
D. MicroOps

Correct Answer:
B. NoOps
Answer Description:
In this DevOps scenario, coined by Forrester Research, automated programs deploy and maintain developer code. NoOps faces considerable backlash from operations folks, understandably, and the idea has pivoted from “operations not allowed” to a more business and product-oriented focus. NoOps can mean an organization without any operations professionals at all, but product and PaaS vendors have adopted a more conservative form — with operations professionals residing in product teams — too. We’re not sure what ProtoOps is, but we liked the sound of it in our DevOps quiz.

Exam Question 11

This Agile movement takes the collaborative element of DevOps and applies it to the greater IT and tech community.

A. DevMeetOps
B. HumanOps
C. CoffeeOps
D. SnackOps

Correct Answer:
C. CoffeeOps
Answer Description:
SnackOps is at it again, but it’s CoffeeOps that takes the cake here. This DevOps meetup can bring a diverse group of IT professionals — developers, ops folk, even recruiters — together to talk strategies for tackling particular DevOps issues. Part workshop, part therapy session, these DevOps meetups have sprouted across the country, most popularly in the Bay Area and Seattle. Most follow the lean coffee model and promote an anti-burnout culture.

Exam Question 12

This DevOps methodology strives to speed the production of applications running on big data processing frameworks, using an organization’s data in the most flexible, effective manner possible.

A. DataOps
B. CloudOps
C. NoOps
D. DevOps 2.0

Correct Answer:
A. DataOps
Answer Description:
If you saw “big data,” and thought “DataOps,” well, you didn’t outthink yourself on this DevOps quiz. As we’ve already explored, NoOps imagines a DevOps without operations. And DevOps 2.0 imagines DevOps applied throughout an organization — HR, legal, etc. DataOps, though, is an often overlooked DevOps methodology. In order to run multiple open source tools and frameworks all at once, your organization’s data architecture has to be both complex and flexible. If organizations want to do DataOps — and therefore DevOps — they have to break down silos between data engineers, developers and stakeholders.

Exam Question 13

A classic application of graph analytics in action is:

A. Bank account transactions
B. An inventory list
C. A recommendation engine

Correct Answer:
C. A recommendation engine
Answer Description:
Recommendation engines suggest products to consumers based on previous purchases and purchases made by similar buyers. They examine complex relationships between trends, relationships between products and relationships between comparable customers. The relationships among these different data points are easier to plot and analyze with graph analytics.
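A toy Python sketch of the idea: a recommendation is a two-hop graph traversal from a customer, through a shared product, to a similar customer's other purchases. The customers and products here are made up for illustration.

```python
# Toy purchase graph: customer -> set of products bought.
purchases = {
    "alice": {"keyboard", "mouse"},
    "bob":   {"keyboard", "monitor"},
    "carol": {"mouse", "webcam"},
}

def recommend(customer: str) -> set:
    # Two-hop traversal: customer -> product -> similar customer -> product.
    mine = purchases[customer]
    recs = set()
    for other, theirs in purchases.items():
        if other != customer and mine & theirs:  # shares at least one purchase
            recs |= theirs - mine                # suggest what they bought that we didn't
    return recs

print(recommend("alice"))  # {'monitor', 'webcam'}
```

A graph database runs this kind of traversal natively, instead of via the multiple joins a relational store would need.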

Exam Question 14

How do graph data analytics help with data visualization?

A. They map connections between nodes to show relationships
B. It’s easier to analyze data in a graph format
C. They ignore unusual relationships, making high-level analytics clearer

Correct Answer:
A. They map connections between nodes to show relationships
Answer Description:
Graphs treat the relationships between nodes as additional data, making it easy to visualize the path from point A to point B to point C. Seeing these connections makes visualizations clearer and relationship-driven decision-making easier.

Exam Question 15

True or false? A graph database falls under the categorization of relational databases.

A. True
B. False

Correct Answer:
B. False
Answer Description:
While graph data analytics are best known for prioritizing relationships between data, they are not considered relational databases. Graph databases are categorized as NoSQL.

Exam Question 16

What factor is holding back graph analytics adoption the most?

A. The cost of implementing graph databases typically outweighs the benefits
B. The technology isn’t easy for business users to operate
C. Graph technology hasn’t advanced enough in recent years

Correct Answer:
B. The technology isn’t easy for business users to operate
Answer Description:
Gartner expects the market for graph data analytics to expand in the next few years and many use cases can benefit from their application, but the technology isn’t the easiest for end users. Data scientists are the most common graph database users, not business analysts.

Exam Question 17

What makes graph databases unique?

A. They treat relationships between nodes as data
B. It’s easier to add data to a graph database than any other database
C. Graph databases are the easiest to query

Correct Answer:
A. They treat relationships between nodes as data
Answer Description:
Connections between data points are treated as additional data. Relational databases map the connections between data points using tables, but graph databases prioritize relationships between data points, which often exhibit more complex relationships.
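The "relationships as data" point can be sketched in a few lines of Python: in a property-graph model, an edge carries its own attributes, just like a node. The node IDs, relationship name and attributes below are hypothetical.

```python
# Minimal property-graph sketch: nodes and edges both carry attributes,
# so a relationship ("BOUGHT", with a date and quantity) is first-class data.
nodes = {
    "alice": {"type": "customer"},
    "p100":  {"type": "product", "name": "keyboard"},
}
edges = [
    ("alice", "BOUGHT", "p100", {"date": "2021-11-01", "qty": 2}),
]

def neighbors(node: str, rel: str) -> list:
    # Follow edges of one relationship type from a node; the edge's own
    # attributes come back alongside the target node.
    return [(dst, attrs) for src, r, dst, attrs in edges
            if src == node and r == rel]

print(neighbors("alice", "BOUGHT"))  # [('p100', {'date': '2021-11-01', 'qty': 2})]
```

In a relational database the same information would live in a join table; a graph store makes the edge itself the queryable unit.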

Exam Question 18

True or false? Graph data analytics are only available to large organizations because of their cost.

A. True
B. False

Correct Answer:
B. False
Answer Description:
Larger organizations with larger data sets typically see a benefit from implementing graph technology. But smaller organizations, especially ones focusing on AI, fraud or commerce, can also take advantage of graph analytics.

Exam Question 19

What is the main benefit of having no fixed schema for your database?

A. There are no apparent benefits
B. Reduced overhead costs
C. There’s more flexibility
D. Improved security

Correct Answer:
B. Reduced overhead costs
Answer Description:
While a lack of rigid schema has many benefits, the main one is lower overhead cost. Lower cost also encompasses other benefits, such as easier restructuring and fewer problems, security breaches and crashes, all of which can be costly to fix.

Exam Question 20

How does MongoDB offer high availability for data?

A. Replication
B. Sharding
C. The document-based format
D. The lack of schema

Correct Answer:
A. Replication
Answer Description:
MongoDB duplicates the database across servers, creating replica sets that are organized into primary and secondary servers through an election process among the members of the replica set. This provides continuous availability for the database and has the potential to improve performance.
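The election mechanic can be illustrated with a toy majority-vote simulation in Python. This is a conceptual sketch, not MongoDB's actual election protocol (which also weighs priorities, terms and heartbeats); the node names are invented.

```python
def elect_primary(members: list, votes: dict):
    # Toy majority election: a member becomes primary only with votes
    # from a strict majority of the replica set. This is why replica
    # sets are typically sized with an odd number of members.
    majority = len(members) // 2 + 1
    tally = {}
    for vote in votes.values():
        tally[vote] = tally.get(vote, 0) + 1
    for candidate, count in tally.items():
        if count >= majority:
            return candidate
    return None  # no majority: the set is left without a primary

members = ["node-a", "node-b", "node-c"]
print(elect_primary(members, {"node-a": "node-b",
                              "node-b": "node-b",
                              "node-c": "node-c"}))  # node-b (2 of 3 votes)
```

With an even-sized set, a split vote can return no winner at all, which is the practical argument for odd member counts (or an arbiter).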

Exam Question 21

MongoDB updated ACID transactions how?

A. Only one document can be in one atomic transaction
B. No changes
C. Multiple documents can be included in one atomic transaction
D. It doesn’t allow ACID transactions

Correct Answer:
C. Multiple documents can be included in one atomic transaction
Answer Description:
MongoDB 4 new features include allowing multiple documents in a single atomic transaction. In later versions of MongoDB 4, the limit on transaction size was removed and support for multidocument transactions was added for sharded clusters. While multidocument ACID transaction support is attractive to users and can improve database integrity, it does have some negative impact on support.
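The "all or nothing" property of a multidocument transaction can be sketched in plain Python, independent of MongoDB: stage every change, then commit in one step, so a failure partway through leaves nothing modified. The account documents are hypothetical.

```python
import copy

def atomic_update(collection: dict, updates: dict) -> None:
    # Toy all-or-nothing transaction across multiple documents:
    # stage every change on a copy, then swap in one step. Any
    # failure before the commit leaves the collection untouched.
    staged = copy.deepcopy(collection)
    for doc_id, fields in updates.items():
        if doc_id not in staged:
            raise KeyError(f"unknown document {doc_id!r}: aborting")
        staged[doc_id].update(fields)
    collection.clear()
    collection.update(staged)  # commit point

accounts = {"a": {"balance": 100}, "b": {"balance": 0}}
atomic_update(accounts, {"a": {"balance": 50}, "b": {"balance": 50}})
print(accounts)  # both documents changed together

try:
    atomic_update(accounts, {"a": {"balance": 0}, "missing": {}})
except KeyError:
    pass
print(accounts["a"]["balance"])  # 50: the failed transaction changed nothing
```

Before MongoDB 4, only a single document could be updated atomically, so this cross-document guarantee had to be built in application code.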

Exam Question 22

One important security improvement to MongoDB 4.x is:

A. Client-side field-level encryption
B. Aggregation pipeline type conversions
C. Wildcard indexes
D. Nonblocking secondary reads

Correct Answer:
A. Client-side field-level encryption
Answer Description:
Client-side field-level encryption allows you to provide end-to-end encryption for data all the way down to the field level in transmission. This means your application can have a transmission where specific information is encrypted among otherwise plaintext data.
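A toy Python sketch of the field-level idea: only the sensitive field is ciphertext, while the rest of the record stays plaintext, and the key never leaves the client. The XOR keystream below is NOT real cryptography (MongoDB uses authenticated AES encryption); it exists only so the example runs with the standard library.

```python
import hashlib

def xor_stream(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration only: XOR the data with a
    # SHA-256-derived keystream. Do not use for real encryption.
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

key = b"client-side-key"  # held only by the client, never by the server
record = {
    "name": "A. Patient",                              # stays plaintext
    "ssn": xor_stream(key, b"123-45-6789").hex(),      # field-level ciphertext
}

print(record["name"])                                  # A. Patient
print(xor_stream(key, bytes.fromhex(record["ssn"])))   # b'123-45-6789'
```

The server (and anyone reading its storage or backups) sees only the encrypted field; decryption happens client-side with the key.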

Exam Question 23

MongoDB 4.4 added a hedged read ability.

A. True
B. False

Correct Answer:
A. True
Answer Description:
MongoDB 4 new features include hedged reads on sharded clusters, allowing applications to perform read requests on secondary servers. This allows for improved performance because the application can take the first result, but it does pose a risk of supplying obsolete data.
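The hedging pattern itself is easy to demonstrate in Python with threads: send the same read to two replicas and keep whichever answers first. The replica names and delays are simulated stand-ins, not real MongoDB calls.

```python
import concurrent.futures
import time

def read_from(replica: str, delay: float) -> str:
    # Stand-in for a read against one replica; `delay` models network
    # and load differences between servers.
    time.sleep(delay)
    return f"result-from-{replica}"

# Hedged read: issue the same read to two replicas and take the first
# result, trading a little extra load for lower tail latency.
with concurrent.futures.ThreadPoolExecutor() as pool:
    futures = [pool.submit(read_from, "secondary-1", 0.05),
               pool.submit(read_from, "secondary-2", 0.2)]
    first = next(concurrent.futures.as_completed(futures))
    print(first.result())  # result-from-secondary-1 (the faster replica)
```

The staleness risk the answer mentions follows directly from the pattern: the fastest replica to respond is not necessarily the most up to date.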

Exam Question 24

What is the benefit of including timestamps in the storage engine?

A. Increased data integrity
B. Improved performance
C. Produced more current documents for read operations
D. All the above

Correct Answer:
D. All the above
Answer Description:
MongoDB acquired WiredTiger in 2014 and integrated it into version 3 as the storage engine. In version 4, the internal update mechanism in the storage engine began to include a timestamp for update documents. The overall reason for this update was to support the new feature of allowing multidocument transactions, but the change also improved data integrity during replication and freed up resources on the database since the synchronization is now handled by the storage engine.

Exam Question 25

The shard cluster balancer doesn’t aid in capacity expansion.

A. True
B. False

Correct Answer:
B. False
Answer Description:
Scaling a database up or down would ordinarily require manual intervention to rebalance data between nodes. MongoDB’s shard cluster balancer preallocates chunks by predefined zones and ranges to facilitate capacity expansion, allowing DevOps engineers to scale up or down on demand.

Exam Question 26

One of MongoDB 4’s most popular user security approaches is SCRAM. What does SCRAM stand for?

A. Secure Change Request Authorization Mechanism
B. Salted Challenge Response Authentication Method
C. Security Channel Request Authentication Method
D. Secure Challenge Response Authenticating Means

Correct Answer:
B. Salted Challenge Response Authentication Method
Answer Description:
Challenge responses are a popular approach to database user authentication. MongoDB 4 updated its challenge response mechanism. For users upgrading from version 3, it’s important to upgrade to SCRAM before upgrading to MongoDB 4.x.
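The salted challenge-response idea can be sketched with the Python standard library. This is a simplified illustration in the spirit of SCRAM, not the full protocol (real SCRAM adds client/server nonce exchange, channel binding and separate client/server keys); the salt, nonce and iteration count are demo values.

```python
import hashlib
import hmac

# Server side: store only the salt and a salted, iterated hash of the
# password -- never the password itself.
password = b"s3cret"
salt = b"fixed-demo-salt"  # real servers generate a random salt per user
stored = hashlib.pbkdf2_hmac("sha256", password, salt, 10_000)

def client_proof(pw: bytes, salt: bytes, nonce: bytes) -> bytes:
    # The client recomputes the salted hash locally and uses it to MAC
    # the server's fresh challenge, proving knowledge of the password
    # without ever transmitting it.
    salted = hashlib.pbkdf2_hmac("sha256", pw, salt, 10_000)
    return hmac.new(salted, nonce, "sha256").digest()

nonce = b"server-challenge-nonce"  # fresh per login attempt
proof = client_proof(password, salt, nonce)
expected = hmac.new(stored, nonce, "sha256").digest()
print(hmac.compare_digest(proof, expected))  # True: password never sent
```

Because the challenge nonce changes on every attempt, a captured proof can't be replayed, and the server's stored value alone can't be used to log in.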

Exam Question 27

A healthcare provider is examining a new patient with complaints of severe back pain. The provider suspects substance abuse and dependency on pain medication based on her initial observation. She requests access to previous substance abuse records. The request is denied, to protect patient privacy, by the health information management director who controls access to the electronic health record. He claims to be following governing law and ethical practices. Which of the following principles is impeding patient care?

A. Confidentiality
B. Integrity
C. Availability
D. Accountability

Correct Answer:
C. Availability
Answer Description:
The answer is availability since the health information management director won’t make the records available to the patient, thus hindering the treatment.

Exam Question 28

In a healthcare office environment, which of the following applications may possibly have sensitive data included within its storage media?

A. Email
B. Scheduling
C. Billing
D. All of the above

Correct Answer:
D. All of the above
Answer Description:
Email, scheduling and billing can all have sensitive information pertaining to a patient’s health records.

Exam Question 29

Which of the following are elements of business continuity?

A. Continuing mission capabilities after a power loss
B. Continuing contact with a patient who moves from the care area
C. Keeping in contact with a former business associate
D. Preparing for an external assessment while continuing to see patients

Correct Answer:
A. Continuing mission capabilities after a power loss
Answer Description:
Business continuity ensures an organization can deliver its mission capabilities after an event that results in power loss.

Exam Question 30

Which of the following describes the ability for a user to do something with a computer resource, such as permission to review, edit or delete?

A. Least privilege
B. Logging
C. Monitoring
D. Access control

Correct Answer:
D. Access control
Answer Description:
Access control is the ability to perform a specific function with a computer resource, whereas least privilege is nonspecific, and logging and monitoring do not set any privileges.

Exam Question 31

Which of the following is defined as a condition or weakness in (or absence of) security procedures of technical, physical or other controls?

A. Threat
B. Risk
C. Vulnerability
D. Exploitation

Correct Answer:
C. Vulnerability
Answer Description:
This is the correct definition of a vulnerability, which is exploited by a threat and can become a risk to the system or the organization.

Exam Question 32

The principles of security, often referred to as CIA, are:

A. Confidentiality, integrity, accountability
B. Contingency, integrity, accountability
C. Confidentiality, integrity, availability
D. Confidentiality, interoperability, availability

Correct Answer:
C. Confidentiality, integrity, availability
Answer Description:
While accountability is often confused as a principle, the official key principles of CIA are confidentiality, integrity and availability.

Exam Question 33

To protect health information in an email sent to a colleague, which would be a proper security control?

A. Logical controls
B. Strong authentication
C. Encryption
D. Least privilege

Correct Answer:
C. Encryption
Answer Description:
Encryption is the proper technical security control used to safely and effectively transfer sensitive information via email.

Exam Question 34

Based on some concerns you have with the organization’s firewall, you export a report of data packet transfers during the last 15 minutes. You see evidence of packet loss and a stalled data transfer. Which of these terms best defines the type of information you are reviewing?

A. Incidents
B. Breaches
C. Events
D. Alerts

Correct Answer:
C. Events
Answer Description:
This is an event because it consists of the items or entries that describe the activity within the network or within the device logs. Events are not typically identified as a problem.

Exam Question 35

According to many leading privacy principles and regulations, which concept would be followed to make edits or amendments to the record should a mistake be identified after a patient requested a copy of her last medical procedure?

A. Accuracy
B. Access
C. Individual participation
D. Openness

Correct Answer:
A. Accuracy
Answer Description:
Accuracy is the correct concept. Enabling patients to request edits or amendments to a record will help ensure the record’s accuracy.

Exam Question 36

Which of the following is not an element of the limited data set under HIPAA?

A. Social security numbers
B. First three digits of your postal code
C. Email address
D. Bank personal identification number

Correct Answer:
B. First three digits of your postal code
Answer Description:
A search on the first three digits of your postal code could pull up thousands of people, so it’s not an element of the limited data set under HIPAA. Social security numbers, email addresses and bank personal identification numbers are elements because they’re specific to your health record.

Exam Question 37

Which of the following is not an advantage of object storage?

A. Handles the massive amounts of data that IoT and AI generate
B. Simplifies unstructured data management
C. Offers a high level of resiliency
D. Provides high availability and eventual data consistency
E. Is useful when dealing with frequently modified data

Correct Answer:
E. Is useful when dealing with frequently modified data
Answer Description:
Object storage can expand quickly to match growing volumes of data, making it useful when dealing with the large amounts of data that IoT and AI applications generate. It’s able to manage everything as a single storage resource, making it a good choice for unstructured data management. It uses erasure coding to provide a high level of resiliency. Object storage architecture provides high availability across geographic regions and ensures that data, while not immediately consistent everywhere, is eventually consistent. But because it doesn’t guarantee consistency, object storage usage probably isn’t the right option when dealing with data that’s frequently modified.

Exam Question 38

Which is a major strength of object storage?

A. It’s scalable
B. It has an organized folder structure
C. It’s user-friendly
D. It supports applications where data changes quickly
E. It provides optimal performance

Correct Answer:
A. It’s scalable
Answer Description:
Scalability is object storage’s biggest advantage. It’s easy to increase capacity by adding a new node to a storage cluster. However, object storage’s object naming scheme isn’t particularly easy to use. File storage is organized using a more user-friendly file and folder structure. Another downside to object storage usage is that when a change is made, both the data and the metadata must be updated, making it less-than-optimal for data that changes frequently and slowing performance.

Exam Question 39

__________ is a data type where object storage can be used effectively.

A. Storage for structured databases
B. Transactional data
C. Online video
D. Random read/write workloads
E. Virtual desktops

Correct Answer:
C. Online video
Answer Description:
Data that is written once and seldom modified is a good candidate for an object store. Online videos and images that are posted once and viewed many times are good examples of this type of data. Because it’s accessible as volumes and accessed via the OS, block storage is better suited to structured databases, transactional data, random read/write workloads and virtual desktops.

Exam Question 40

Which of the following is true of object storage?

A. It doesn’t allow for extended or customized metadata
B. It keeps data and metadata together
C. Data and metadata don’t share a common address space
D. It has a higher cost per gigabyte than block and file storage
E. All of the above

Correct Answer:
B. It keeps data and metadata together
Answer Description:
Object storage keeps data and its metadata, along with a unique identifier, together as an object in the system. It allows for extended metadata that users can customize for specific applications. The data and metadata are stored together as individual objects with a common address space or storage pool that eliminates the need to wade through volumes and file hierarchies. It typically has a lower cost per gigabyte than other types of storage.
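A toy Python sketch of that structure: a single flat address space maps a unique ID to an object holding the data and its customizable metadata together, with no volumes or folder hierarchy. The metadata fields shown are invented examples.

```python
import uuid

# Toy object store: one flat address space, keyed by unique object ID.
store = {}

def put(data: bytes, **metadata) -> str:
    # Data, metadata and the identifier live together as one object;
    # callers can attach whatever application-specific metadata they like.
    object_id = str(uuid.uuid4())
    store[object_id] = {"data": data, "metadata": metadata}
    return object_id

oid = put(b"<video bytes>", content_type="video/mp4", retention_days=365)
print(store[oid]["metadata"]["content_type"])  # video/mp4
```

Contrast this with file storage, where metadata is fixed by the filesystem, and block storage, where the storage layer keeps no application metadata at all.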

Exam Question 41

Object storage paired with erasure coding can provide an alternative to block storage and RAID.

A. True
B. False

Correct Answer:
A. True
Answer Description:
RAID technology is paired with block storage to maximize performance and provide a way to recover data if a drive fails. RAID groups an array’s physical disks together and presents them as one logical drive, using some combination of striping, mirroring and parity to store and protect the data. However, as data volumes have grown, RAID has become less effective as a data protection methodology because the drive rebuild time for large volumes of data increases, along with the risk of a secondary failure that could cause permanent data loss.
Using object storage paired with erasure coding is a simpler, more flexible and cost-effective alternative to block storage and RAID. A distributed data pool makes it possible to store large amounts of unstructured data that span geographic boundaries. Objects can be distributed and replicated across multiple devices. If one location goes down or is slow, files can be accessed from another location. The object storage system tracks changes and syncs copies.
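To show the erasure-coding idea in miniature, here is the simplest possible code in Python: a single XOR parity shard that can rebuild any one lost data shard. Production systems use Reed-Solomon-style codes that tolerate multiple simultaneous losses; the shard contents are arbitrary demo bytes.

```python
def xor_parity(shards: list) -> bytes:
    # Byte-wise XOR of equal-length shards: the simplest erasure code,
    # tolerating the loss of any single shard.
    parity = bytes(len(shards[0]))
    for shard in shards:
        parity = bytes(a ^ b for a, b in zip(parity, shard))
    return parity

data = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_parity(data)

# Lose shard 1, then rebuild it from the survivors plus the parity
# (XOR-ing everything that remains cancels out the known shards).
recovered = xor_parity([data[0], data[2], parity])
print(recovered == data[1])  # True
```

Unlike a RAID rebuild, which must reconstruct an entire drive, an object store only re-derives the shards of the objects that were actually lost.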

Exam Question 42

Some vendors integrate file, block and object storage into a single system using which of the following techniques?

A. An object gateway built into the storage system that enables a single system to encompass block, file and object storage
B. A separate appliance that integrates with the storage controller
C. Both of the above
D. This can’t be done

Correct Answer:
C. Both of the above
Answer Description:
In a cloud system, an object gateway can be built into storage, allowing the incorporation of block, file and object storage in a single box. Alternatively, vendors can deploy a standalone appliance that integrates with their storage controller to run object storage along with block and file.

Exam Question 43

How is on-premises object storage being used?

A. For long-term backup and archiving
B. Video streaming
C. Big data analytics
D. High-security applications
E. IoT and AI applications
F. All of the above

Correct Answer:
F. All of the above
Answer Description:
On-premises object storage has traditionally been used for secondary storage. But that’s changing as the technology’s performance improves. It’s now being used for a range of more-demanding applications, including as back-end storage for video streaming services. Object storage’s use of metadata tags enables the addition of analytics tools that make it valuable for big data efforts. Industries with high-security requirements, such as finance and healthcare, are using on-premises object storage to maintain more control over data than is possible in the public cloud. Applications such as IoT and AI that generate large amounts of data are also turning in this direction.

Exam Question 44

Which of the following is not a challenge facing high-performance object storage?

A. Aggregating and managing an object storage system’s many nodes
B. Handling high-scale workloads
C. Managing the metadata
D. Metadata can overwhelm metadata controllers
E. Internode communications

Correct Answer:
B. Handling high-scale workloads
Answer Description:
A high-performance object storage system can require hundreds of nodes. Aggregating and managing these nodes introduces latency into the system. Object storage can require additional time to manage its metadata, and that metadata can, at times, overwhelm metadata controllers and become a drag on the system. Internode communications can also cause performance issues. However, high-performance object storage is suitable for high-scale workloads where many nodes and capacities in the tens to hundreds of petabytes are required.

Exam Question 45

What is next-generation object storage doing to overcome its performance challenges?

A. Eliminating use of the NFS and SMB protocols
B. Cutting ties with public cloud services
C. Using all-flash storage systems
D. Transitioning to software-only products
E. Limiting devices to object storage only

Correct Answer:
C. Using all-flash storage systems
Answer Description:
Object storage use is expanding in the enterprise as all-flash object storage technology becomes prevalent, delivering better performance without sacrificing the flexibility, scalability and cost benefits of object storage. On-premises flash-based object storage can support several types of applications beyond backup and archiving, including AI, machine and deep learning, and big data analytics. These systems typically support NFS and SMB to enable continued use of legacy applications. They can integrate with public cloud services. Many come as physical and virtual appliances in addition to software-only products. Several products combine file and object storage in one device.

Exam Question 46

Data governance institutes accountability for data management and data quality in an organization. Effective data governance encompasses the right levels of:

A. Policies and procedures, structures, and roles and responsibilities
B. Data quality tools and diverse skill sets
C. Policies, data stewards and data governance tools

Correct Answer:
A. Policies and procedures, structures, and roles and responsibilities
Answer Description:
Danette McGilvray, president and principal of Granite Falls Consulting Inc., details what constitutes an effective data governance program in a Q&A with SearchDataManagement, and highlights the need for the right level of each of these elements.

Exam Question 47

Best practices for effective data governance include all of the following EXCEPT:

A. Building a strong relationship between business and IT
B. Combining the use of top-down and bottom-up approaches
C. Trying to address all of an organization’s data problems from the start
D. Providing incentives for complying with data governance rules

Correct Answer:
C. Trying to address all of an organization’s data problems from the start
Answer Description:
IT managers speaking during a conference panel discussion on effective data governance recommended the steps listed above and more, according to our news story, “IBM users reveal five data governance best practices to remember.”

Exam Question 48

True or false? Key enterprise information is often located in “data shadow systems” scattered throughout an organization.

A. True
B. False

Correct Answer:
A. True
Answer Description:
Data governance managers commonly make the mistake of focusing only on the data in transactional “systems of record” and business intelligence systems. But data that should be governed often is also stored in Excel spreadsheets, files and other local repositories set up by individual business users, according to this feature on the worst practices in enterprise data governance.

Exam Question 49

Set up your data stewards for success by finding people with the right skills and:

A. Assigning them business ownership over the data
B. Giving them appropriate tools and performance incentives
C. Allowing them to function in informal roles

Correct Answer:
B. Giving them appropriate tools and performance incentives
Answer Description:
Consultant David Loshin offers guidelines for the successful management of data stewardship programs in his tip, “Data steward role needs some shepherding itself.” He recommends such steps as clearly defining stewardship responsibilities, creating incentives for meeting performance objectives and providing appropriate tools.

Exam Question 50

Data stewards face new challenges when it comes to dealing with big data, including:

A. The need for more meetings with additional stakeholders
B. Living up to long-held expectations of the data steward role
C. Potential resistance to data cleansing and consolidation efforts from data scientists

Correct Answer:
C. Potential resistance to data cleansing and consolidation efforts from data scientists
Answer Description:
Analysts and consultants explain in this feature how big data can further complicate data stewardship efforts. They cite the immature nature of most big data programs, the prevalence of data from external sources, short-term analytics projects that might not be repeated and a preference on the part of analytics team members to maintain data in its raw form for analytical purposes.