Exam 3: Basic Data Mining Techniques
Exam 1: Data Mining: A First View (22 questions)
Exam 2: Data Mining: A Closer Look (16 questions)
Exam 3: Basic Data Mining Techniques (13 questions)
Exam 4: An Excel-Based Data Mining Tool (12 questions)
Exam 5: Knowledge Discovery in Databases (10 questions)
Exam 6: The Data Warehouse (13 questions)
Exam 7: Formal Evaluation Techniques (13 questions)
Exam 8: Neural Networks (10 questions)
Exam 9: Building Neural Networks with iDA (4 questions)
Exam 10: Statistical Techniques (13 questions)
Exam 11: Specialized Techniques (10 questions)
Exam 12: Rule-Based Systems (15 questions)
Exam 13: Managing Uncertainty in Rule-Based Systems (10 questions)
Exam 14: Intelligent Agents (6 questions)
Which statement is true about the decision tree attribute selection process described in your book?
(Multiple Choice)
Correct Answer: B
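For reference, a minimal sketch of one common attribute-selection criterion, information gain (the book's tool may use a gain-ratio or similar variant; the function and column names here are hypothetical):

```python
# Minimal sketch of information-gain-based attribute selection for a decision
# tree. Illustrative only; not necessarily the exact procedure in the book.
from collections import Counter
from math import log2

def entropy(labels):
    counts = Counter(labels)
    total = len(labels)
    return -sum((c / total) * log2(c / total) for c in counts.values())

def information_gain(rows, attr, target):
    """Gain = entropy(target) - weighted entropy after splitting on attr."""
    base = entropy([r[target] for r in rows])
    remainder = 0.0
    for value in {r[attr] for r in rows}:
        subset = [r[target] for r in rows if r[attr] == value]
        remainder += (len(subset) / len(rows)) * entropy(subset)
    return base - remainder

def best_attribute(rows, attributes, target):
    # The attribute with the highest gain becomes the next tree node.
    return max(attributes, key=lambda a: information_gain(rows, a, target))
```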
Which statement is true about the K-Means algorithm?
(Multiple Choice)
Correct Answer: D
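For reference, a minimal K-Means sketch (illustrative only, not the book's implementation): each point is assigned to its nearest centroid, then centroids are recomputed as cluster means, and the two steps repeat until the centroids stop changing.

```python
# Minimal K-Means sketch: assignment step, update step, repeat to convergence.
import random

def kmeans(points, k, iterations=100):
    centroids = random.sample(points, k)
    for _ in range(iterations):
        # Assignment step: nearest centroid by squared Euclidean distance.
        clusters = [[] for _ in range(k)]
        for p in points:
            distances = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[distances.index(min(distances))].append(p)
        # Update step: each centroid becomes the mean of its cluster.
        new_centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else centroids[i]
            for i, cluster in enumerate(clusters)
        ]
        if new_centroids == centroids:  # assignments have stabilized
            break
        centroids = new_centroids
    return centroids, clusters
```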
Use these tables to answer questions 5 and 6.
Single Item Sets                              Number of Items
Magazine Promo = Yes                          7
Watch Promo = No                              6
Life Ins Promo = Yes                          5
Life Ins Promo = No                           5
Card Insurance = No                           8
Sex = Male                                    6

Two Item Sets                                 Number of Items
Magazine Promo = Yes & Watch Promo = No       4
Magazine Promo = Yes & Life Ins Promo = Yes   5
Magazine Promo = Yes & Card Insurance = No    5
Watch Promo = No & Card Insurance = No        5
Based on the two-item set table, which of the following is not a possible two-item set rule?
(Multiple Choice)
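As a reading aid for this question, a small sketch (counts copied from the two-item set table above) that enumerates the candidate rules each two-item set can generate; each set {X, Y} yields at most the two rules "If X Then Y" and "If Y Then X", so a rule whose attribute-value pairs do not appear together in the table is not a possible two-item set rule.

```python
# Enumerate candidate rules from the two-item sets listed in the table above.
two_item_sets = {
    ("Magazine Promo = Yes", "Watch Promo = No"): 4,
    ("Magazine Promo = Yes", "Life Ins Promo = Yes"): 5,
    ("Magazine Promo = Yes", "Card Insurance = No"): 5,
    ("Watch Promo = No", "Card Insurance = No"): 5,
}

candidate_rules = []
for (x, y), count in two_item_sets.items():
    candidate_rules.append(f"If {x} Then {y}")
    candidate_rules.append(f"If {y} Then {x}")

for rule in candidate_rules:
    print(rule)
```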
A genetic learning operation that creates new population elements by combining parts of two or more existing elements.
(Multiple Choice)
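For reference, a minimal sketch of single-point crossover, the kind of genetic learning operation the question describes: parts of two parent elements are combined to form new population elements. The bit-string representation and random cut point are illustrative assumptions.

```python
# Single-point crossover: swap the tails of two equal-length parents.
import random

def crossover(parent_a, parent_b):
    """Cut both parents at a random point and exchange the tails."""
    point = random.randint(1, len(parent_a) - 1)
    child_1 = parent_a[:point] + parent_b[point:]
    child_2 = parent_b[:point] + parent_a[point:]
    return child_1, child_2

# Example with bit-string individuals:
print(crossover([1, 0, 1, 1, 0], [0, 1, 0, 0, 1]))
```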
This approach is best when we are interested in finding all possible interactions among a set of attributes.
(Multiple Choice)
Use these tables to answer questions 5 and 6.
One two-item set rule that can be generated from the tables above is: If Magazine Promo = Yes Then Life Ins Promo = Yes
The confidence for this rule is:
(Multiple Choice)
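As an arithmetic check, a short sketch using the counts from the tables above: 7 instances have Magazine Promo = Yes, and 5 of those also have Life Ins Promo = Yes, so the confidence is the ratio of the two counts.

```python
# Rule confidence = count(antecedent AND consequent) / count(antecedent),
# using the counts from the single-item and two-item set tables above.
antecedent_count = 7  # Single item set: Magazine Promo = Yes
both_count = 5        # Two-item set: Magazine Promo = Yes & Life Ins Promo = Yes

confidence = both_count / antecedent_count
print(f"confidence = {both_count}/{antecedent_count} = {confidence:.2f}")  # ~0.71
```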
Construct a decision tree with root node Type from the data in the table below. The first row contains attribute names. Each row after the first represents the values for one data instance. The output attribute is Class.
Scale   Type   Shade   Texture   Class
One     One    Light   Thin      A
Two     One    Light   Thin      A
Two     Two    Light   Thin      B
Two     Two    Dark    Thin      B
Two     One    Dark    Thin      C
One     One    Dark    Thin      C
One     Two    Light   Thin      C
(Essay)
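As a starting point for the essay, a small sketch (data copied from the table above) that partitions the instances on the given root attribute, Type, and reports the class distribution in each branch; any mixed branch is then split again on one of the remaining attributes until the leaves are class-pure.

```python
# Partition the instances from the table on the root attribute Type and
# inspect each branch's class distribution (a pure branch needs no more splits).
from collections import Counter

rows = [  # (Scale, Type, Shade, Texture, Class), copied from the table above
    ("One", "One", "Light", "Thin", "A"),
    ("Two", "One", "Light", "Thin", "A"),
    ("Two", "Two", "Light", "Thin", "B"),
    ("Two", "Two", "Dark", "Thin", "B"),
    ("Two", "One", "Dark", "Thin", "C"),
    ("One", "One", "Dark", "Thin", "C"),
    ("One", "Two", "Light", "Thin", "C"),
]

for type_value in ("One", "Two"):
    branch = [r for r in rows if r[1] == type_value]
    print(f"Type = {type_value}: classes {Counter(r[4] for r in branch)}")
```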
Given a rule of the form IF X THEN Y, rule confidence is defined as the conditional probability that
(Multiple Choice)
The computational complexity as well as the explanation offered by a genetic algorithm is largely determined by the
(Multiple Choice)