Exam 10: Data Quality and Integration
- Exam 1: The Database Environment and Development Process (116 Questions)
- Exam 2: Modeling Data in the Organization (103 Questions)
- Exam 3: The Enhanced E-R Model (103 Questions)
- Exam 4: Logical Database Design and the Relational Model (102 Questions)
- Exam 5: Physical Database Design and Performance (103 Questions)
- Exam 6: Introduction to SQL (105 Questions)
- Exam 7: Advanced SQL (107 Questions)
- Exam 8: Database Application Development (105 Questions)
- Exam 9: Data Warehousing (103 Questions)
- Exam 10: Data Quality and Integration (105 Questions)
- Exam 11: Big Data and Analytics (102 Questions)
- Exam 12: Data and Database Administration (110 Questions)
- Exam 13: Distributed Databases (100 Questions)
- Exam 14: Object-Oriented Data Modeling (105 Questions)
Total quality management (TQM) focuses on defect correction rather than defect prevention.
(True/False)
The uncontrolled proliferation of spreadsheets, databases and repositories leads to data quality problems.
(True/False)
In the ________ approach, one consolidated record is maintained from which all applications draw data.
(Multiple Choice)
Data scrubbing is a technique using pattern recognition and other artificial intelligence techniques to upgrade the quality of raw data before transforming and moving the data to the data warehouse.
(True/False)
The characteristic of quality data that pertains to the expected interval between when data are expected and when they are available for use is:
(Multiple Choice)
An approach to filling a data warehouse that employs bulk rewriting of the target data periodically is called:
(Multiple Choice)
A method of capturing data in a snapshot at a point in time is called static extract.
(True/False)
The process of transforming data from detailed to summary levels is called normalization.
(True/False)
Refresh mode is an approach to filling the data warehouse that employs bulk rewriting of the target data at periodic intervals.
(True/False)
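The refresh-mode and static-extract stems above describe two ways of filling a warehouse target. A minimal sketch, using hypothetical in-memory "tables" (plain Python lists of dicts, not any real ETL API), contrasts refresh mode (bulk rewrite of the whole target at periodic intervals) with update mode (applying only captured changes):

```python
# Hypothetical sketch: refresh mode vs. update mode for loading a
# warehouse target. Table names and record shapes are illustrative only.

def refresh_mode(source_rows, target):
    """Refresh mode: periodically rewrite the entire target in bulk."""
    target.clear()
    target.extend(source_rows)  # bulk rewrite of the target data

def update_mode(changed_rows, target):
    """Update mode: apply only the records that changed since last load."""
    index = {row["id"]: i for i, row in enumerate(target)}
    for row in changed_rows:
        if row["id"] in index:
            target[index[row["id"]]] = row  # overwrite the changed record
        else:
            target.append(row)              # insert a new record

source = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
warehouse = []
refresh_mode(source, warehouse)                               # full load
update_mode([{"id": 2, "amt": 25}, {"id": 3, "amt": 30}], warehouse)
# warehouse now holds ids 1, 2 (updated), and 3
```

Refresh mode is simpler but rewrites everything; update mode touches only changed records, which matters as the target grows.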
Data that are accurate, consistent, and available in a timely fashion are considered:
(Multiple Choice)
Data reconciliation occurs in two stages, an initial load and subsequent updates.
(True/False)
Data propagation duplicates data across databases, usually with near-real-time delay.
(True/False)
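The data reconciliation stem above names two stages: an initial load and subsequent updates. A minimal sketch, with a hypothetical `Warehouse` class invented for illustration, shows the two stages:

```python
# Hypothetical sketch of two-stage data reconciliation:
# stage 1 populates the empty target; stage 2 applies later changes.
from datetime import datetime, timezone

class Warehouse:
    def __init__(self):
        self.rows = {}         # target records keyed by id
        self.last_load = None  # timestamp of the most recent load

    def initial_load(self, source_rows):
        """Stage 1: initial load of the full source into the target."""
        self.rows = {r["id"]: r for r in source_rows}
        self.last_load = datetime.now(timezone.utc)

    def apply_updates(self, changed_rows):
        """Stage 2: reconcile records changed since the initial load."""
        for r in changed_rows:
            self.rows[r["id"]] = r
        self.last_load = datetime.now(timezone.utc)

wh = Warehouse()
wh.initial_load([{"id": 1, "v": "a"}, {"id": 2, "v": "b"}])
wh.apply_updates([{"id": 2, "v": "b2"}, {"id": 3, "v": "c"}])
# wh.rows now holds id 1 unchanged, id 2 updated, and id 3 inserted
```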