Exam 5: Machine-Learning Techniques for Predictive Analytics
Exam 1: Overview of Business Intelligence, Analytics, Data Science, and Artificial Intelligence: Systems for Decision Support - 21 Questions
Exam 2: Artificial Intelligence Concepts, Drivers, Major Technologies, and Business Applications - 53 Questions
Exam 3: Nature of Data, Statistical Modeling, and Visualization - 33 Questions
Exam 4: Data Mining Process, Methods, and Algorithms - 15 Questions
Exam 5: Machine-Learning Techniques for Predictive Analytics - 30 Questions
Exam 6: Deep Learning and Cognitive Computing - 56 Questions
Exam 7: Text Mining, Sentiment Analysis, and Social Analytics - 13 Questions
Exam 8: Prescriptive Analytics: Optimization and Simulation - 17 Questions
Exam 9: Big Data, Cloud Computing, and Location Analytics: Concepts and Tools - 12 Questions
Exam 10: Robotics: Industrial and Consumer Applications - 64 Questions
Exam 11: Group Decision Making, Collaborative Systems, and AI Support - 26 Questions
Exam 12: Knowledge Systems: Expert Systems, Recommenders, Chatbots, Virtual Personal Assistants, and Robo Advisors - 54 Questions
Exam 13: The Internet of Things As a Platform for Intelligent Applications - 60 Questions
Exam 14: Implementation Issues: From Ethics and Privacy to Organizational and Societal Impacts - 61 Questions
The methodology employed in the traffic case follows a very well-known standardized analytics process known by its acronym _________.
(Short Answer)
4.8/5
(39)
Correct Answer:
CRISP-DM
In the opening vignette, the high accuracy of the models in predicting the outcomes of complex medical procedures showed that data mining tools are ready to replace experts in the medical field.
(True/False)
4.8/5
(34)
Correct Answer:
False
A disadvantage of Hopfield neural networks is that their structure cannot be replicated on an electronic circuit board.
(True/False)
4.8/5
(38)
Correct Answer:
False
The k-nearest neighbor algorithm appears well-suited to solving image recognition and categorization problems.
(True/False)
4.9/5
(29)
The Naïve Bayes method is a powerful tool for representing dependency structure in a graphical, explicit, and intuitive way.
(True/False)
4.8/5
(38)
The most complex problems solved by neural networks require one or more hidden layers for increased accuracy.
(True/False)
4.9/5
(36)
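The hidden-layer item above can be made concrete with a tiny non-linearly-separable problem (XOR), which a network with no hidden layer cannot solve. A minimal sketch, assuming scikit-learn's MLPClassifier; the solver, layer size, and seed are arbitrary illustrative choices, not taken from the question bank:

```python
# Minimal sketch: one hidden layer lets a neural network fit XOR, a problem a
# single linear unit cannot separate. All parameters are illustrative only.
from sklearn.neural_network import MLPClassifier

X = [[0, 0], [0, 1], [1, 0], [1, 1]]
y = [0, 1, 1, 0]  # XOR targets

# One small hidden layer; lbfgs is a reasonable solver for a tiny dataset.
mlp = MLPClassifier(hidden_layer_sizes=(8,), solver="lbfgs",
                    max_iter=5000, random_state=0)
mlp.fit(X, y)
print(mlp.predict(X))  # ideally [0 1 1 0]; a toy run may need a different seed
```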
Model ensembles tend to be more ________ against outliers and noise in the data set than individual models.
(Short Answer)
4.7/5
(37)
The strong assumption of independence among the input variables in the Naïve Bayes method is realistic.
(True/False)
4.9/5
(41)
Naïve Bayes is a simple probability-based classification method derived from the Bayes theorem.
(True/False)
4.9/5
(27)
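As a worked form of the two Naïve Bayes statements above (a sketch added here for context, not part of the question bank): for a class variable C and input features x_1, ..., x_n, Bayes' theorem gives

```latex
P(C \mid x_1,\dots,x_n)
  = \frac{P(C)\, P(x_1,\dots,x_n \mid C)}{P(x_1,\dots,x_n)}
  \;\propto\; P(C) \prod_{i=1}^{n} P(x_i \mid C)
```

The final product form holds only under the naive assumption that the features are conditionally independent given the class, which is the strong (and usually unrealistic) assumption the earlier True/False item refers to.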
Ensemble models can be quickly characterized based on their use of a bagging or boosting method type.
(True/False)
4.8/5
(30)
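The bagging/boosting distinction in the item above can be sketched in a few lines. This assumes scikit-learn's BaggingClassifier and AdaBoostClassifier on a synthetic dataset; all data and parameters are illustrative, not taken from the source:

```python
# Minimal sketch: the two ensemble method types, bagging vs. boosting.
# Dataset and hyperparameters are arbitrary illustrative assumptions.
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# Bagging: base learners are trained independently on bootstrap samples, then voted.
bagging = BaggingClassifier(n_estimators=50, random_state=0)

# Boosting: base learners are trained sequentially, each focusing on the
# examples the previous ones misclassified.
boosting = AdaBoostClassifier(n_estimators=50, random_state=0)

for name, model in [("bagging", bagging), ("boosting", boosting)]:
    print(name, "mean CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))
```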
Judea Pearl won ACM's prestigious A.M. Turing Award for his contributions to the field of artificial intelligence and the development of Bayesian networks (BN).
(True/False)
5.0/5
(32)
BN is a powerful tool for representing dependency structure in a ________, explicit, and intuitive way.
(Short Answer)
4.7/5
(22)
In the mining industry case study, the input to the neural network is a verbal description of a hanging rock on the mine wall.
(True/False)
4.8/5
(39)
The k-nearest neighbor algorithm is overly complex when compared to artificial neural networks and support vector machines.
(True/False)
4.9/5
(40)
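The two k-NN items above hinge on how simple the algorithm is next to ANNs and SVMs: it has essentially no training phase and classifies by majority vote among the k closest stored examples. A minimal sketch, assuming scikit-learn and the Iris toy dataset (both illustrative choices):

```python
# Minimal k-nearest-neighbor sketch; dataset, split, and k are illustrative assumptions.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# "Training" just stores the examples; prediction is a majority vote among the
# k nearest stored neighbors of each query point.
knn = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print("test accuracy:", knn.score(X_test, y_test))
```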
The random forest (RF) model is a modification to what algorithm?
(Multiple Choice)
4.8/5
(35)
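As background for the RF item above: a random forest builds on bagged decision trees, with the extra twist that each split considers only a random subset of the features. A minimal sketch, assuming scikit-learn; the data and parameters are illustrative only:

```python
# Minimal random-forest sketch: many bagged decision trees, each split drawn
# from a random feature subset. Data and parameters are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=300, n_features=10, random_state=1)

rf = RandomForestClassifier(n_estimators=100, max_features="sqrt",
                            oob_score=True, random_state=1).fit(X, y)
print("out-of-bag accuracy estimate:", round(rf.oob_score_, 3))
```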
Backpropagation requires the use of vector pairs, with the pairs consisting of:
(Multiple Choice)
4.9/5
(30)
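The backpropagation item above concerns the vector pairs the algorithm trains on, i.e., matched input vectors and target output vectors. A bare-bones numpy sketch of one backpropagation step over such pairs; the shapes, tanh activation, squared-error loss, and learning rate are illustrative assumptions:

```python
# Bare-bones backpropagation sketch: one hidden layer, one gradient step.
# All shapes and hyperparameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
x = rng.random((4, 3))   # 4 input vectors, 3 features each
t = rng.random((4, 2))   # the 4 matching target output vectors
W1, W2 = rng.random((3, 5)), rng.random((5, 2))
lr = 0.1

# Forward pass through one tanh hidden layer and a linear output layer.
h = np.tanh(x @ W1)
y = h @ W2

# Backward pass: propagate the output error back to both weight matrices.
err = y - t                                   # gradient of 0.5 * squared error
grad_W2 = h.T @ err
grad_W1 = x.T @ ((err @ W2.T) * (1 - h**2))   # tanh'(a) = 1 - tanh(a)^2

W1 -= lr * grad_W1
W2 -= lr * grad_W2
print("loss before update:", round(0.5 * float(np.square(err).sum()), 4))
```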
In 1992, Boser, Guyon, and Vapnik suggested a way to create nonlinear classifiers by applying the kernel trick to maximum-margin hyperplanes. How does the resulting algorithm differ from the original optimal hyperplane algorithm proposed by Vladimir Vapnik in 1963?
(Essay)
4.8/5
(30)
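For the essay item above, the contrast it asks about is that the 1963 algorithm learns a linear maximum-margin hyperplane directly in the original input space, whereas the 1992 kernelized version applies the same maximum-margin machinery in an implicit feature space defined by a kernel, yielding a nonlinear decision boundary in the original space. A minimal sketch of that difference, assuming scikit-learn's SVC and a synthetic dataset (both illustrative choices):

```python
# Minimal sketch: linear vs. kernelized maximum-margin classifiers.
# Dataset and parameters are illustrative assumptions.
from sklearn.datasets import make_circles
from sklearn.svm import SVC

# Concentric circles are not linearly separable in the original 2-D space.
X, y = make_circles(n_samples=300, factor=0.3, noise=0.05, random_state=0)

linear_svm = SVC(kernel="linear").fit(X, y)  # hyperplane in the original space
rbf_svm = SVC(kernel="rbf").fit(X, y)        # kernel trick: implicit nonlinear mapping

print("linear SVM accuracy:", linear_svm.score(X, y))   # near chance here
print("RBF-kernel SVM accuracy:", rbf_svm.score(X, y))  # near perfect here
```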