CS 540: Introduction to Artificial Intelligence, Final Exam: 8:15-9:45 am, December 21, 2016, 132 Noland
CLOSED BOOK
(two sheets of notes and a calculator allowed)
Write your answers on these pages and show your work. If you feel that a question is not fully
specified, state any assumptions that you need to make in order to solve the problem. You may
use the backs of these sheets for scratch work.
Write your name on this page and initial all other pages of this exam. Make sure your exam
contains six problems on eight pages.
Name ________________________________________________________________
UW Net ID ________________________________________________________________
Problem   Score    Out of
   1      ______     17
   2      ______     10
   3      ______     18
   4      ______     15
   5      ______     15
   6      ______     25
[Be sure to show your work for Parts a-c. Put your (numeric) answers on the lines provided.]
a) What is the probability that B and C are true but A and D are false? ______________
c) What is the probability that A is true given that B is false, C is false, and D is true? _____________
Ex # A B Output
1 True False 1
2 True True 0
3 True True 1
Calculate the ratio below, showing your work below it and putting your final (numeric)
answer on the line to the right of the equal sign. Be sure to explicitly show in your work the
counts due to pseudo examples.
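[For reference only: a small Python sketch of how pseudo-example (Laplace-smoothing) counts can be tallied for the dataset above. The convention used here, one pseudo example per feature value per output class, is an assumption; apply whatever convention was given in lecture.]

    from collections import defaultdict

    # Training data from the table above: (A, B, Output).
    examples = [
        (True, False, 1),
        (True, True,  0),
        (True, True,  1),
    ]

    # Tally raw counts of each output class and of each (feature, value, class) triple.
    class_counts = defaultdict(int)
    feature_counts = defaultdict(int)
    for a, b, out in examples:
        class_counts[out] += 1
        feature_counts[("A", a, out)] += 1
        feature_counts[("B", b, out)] += 1

    def smoothed(feature, value, out, pseudo=1, num_values=2):
        # Laplace-smoothed estimate of P(feature = value | Output = out),
        # assuming `pseudo` pseudo examples for each of the `num_values`
        # possible values (True/False) of the feature.
        return (feature_counts[(feature, value, out)] + pseudo) / \
               (class_counts[out] + pseudo * num_values)

    # Example: P(B = True | Output = 1) = (1 + 1) / (2 + 2) = 0.5
    print(smoothed("B", True, 1))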
Students who do not understand kernels can learn about them by taking cs540.
[You must use situation calculus here. Do not use Markov Logic, though.]
Θ={ }
__________________________________________
b) Put the following into clausal form (write your answer on the line below):
∀x [ p(x) ↔ q(2) ]
__________________________________________________
c) Given the following clauses, show that ∃w C(w) must be true by assuming its negation and using
only the resolution inference rule to derive a contradiction. In the clauses below, all
variables are universally quantified.
Use the notation presented in class (and in the book), where the resulting clause is connected
by lines to the two clauses resolved, and indicate any variable bindings needed.
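[For reference, using hypothetical clauses rather than the ones in this problem: resolving P(x) ∨ Q(x) against ¬P(A) with the binding {x/A} yields the resolvent Q(A); in the requested notation, Q(A) would be connected by lines to both parent clauses, with the binding {x/A} written beside it.]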
a) Show how the perceptron learning rule (also called the delta rule) would alter this neural
network upon processing the above training example. Let η (the learning rate) be 0.3.
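[For reference only: a small Python sketch of the perceptron (delta) rule update, w_i ← w_i + η (teacher − output) x_i, with η = 0.3 as specified. The initial weights, inputs, and teacher output used here are hypothetical placeholders, not the values from the network and training example referred to above.]

    # Hypothetical stand-ins for the missing network and training example.
    eta = 0.3                 # learning rate given in the problem
    weights = [0.2, -0.5]     # hypothetical initial weights
    bias = 0.1                # hypothetical bias (threshold) weight
    x = [1.0, 1.0]            # hypothetical training inputs
    teacher = 1               # hypothetical desired output

    # Step-function perceptron output.
    net = sum(w * xi for w, xi in zip(weights, x)) + bias
    output = 1 if net > 0 else 0

    # Perceptron / delta rule: w_i <- w_i + eta * (teacher - output) * x_i
    weights = [w + eta * (teacher - output) * xi for w, xi in zip(weights, x)]
    bias += eta * (teacher - output) * 1.0   # the bias input is fixed at 1

    print([round(w, 3) for w in weights], round(bias, 3))   # [0.5, -0.2] 0.4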
a) Assume you are given this initial dataset and wish to use the Gaussian kernel with variance
(i.e., σ²) equal to 1 to create a new dataset. (If you do not recall the Gaussian kernel, use the
dot-product kernel for partial credit.)
Ex # A B Output
1 0.1 0.9 0
2 0.2 0.8 0
3 0.7 0.4 1
Show the new dataset below. Be sure to clearly label the columns and rows.
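[For reference only: a small Python sketch of one common way to build the new dataset, in which each example is re-represented by its Gaussian-kernel value against every training example. The kernel form used, K(x, z) = exp(-||x - z||² / (2σ²)) with σ² = 1, includes a factor of 2 in the denominator that some texts omit; both that form and the re-representation itself are assumptions, so follow the definitions given in lecture.]

    import math

    # Original examples (A, B) and their labels, from the table above.
    examples = [(0.1, 0.9), (0.2, 0.8), (0.7, 0.4)]
    labels = [0, 0, 1]
    sigma_sq = 1.0   # variance given in the problem

    def gaussian_kernel(x, z):
        # Gaussian (RBF) kernel: exp(-||x - z||^2 / (2 * sigma^2)).
        sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
        return math.exp(-sq_dist / (2.0 * sigma_sq))

    # New dataset: example i gets features (K(x_i, x_1), K(x_i, x_2), K(x_i, x_3)).
    for i, x in enumerate(examples, start=1):
        new_features = [round(gaussian_kernel(x, z), 3) for z in examples]
        print("Ex", i, new_features, "Output:", labels[i - 1])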
What is the probability that P is true? Show your work below. Answer: __________________
i. ____________________________________________
ii. ____________________________________________
c) Two methods covered this semester for clustering unlabeled training examples are:
i. ____________________________________________
ii. ____________________________________________
d) Two machine learning approaches covered this semester that do not require users to represent
examples as fixed-length feature vectors are:
i. ____________________________________________
ii. ____________________________________________
e) Circle the phrase(s) below that is (are) not a valid logical inference rule
i. AND Elimination
ii. OR Elimination
iv. OR Introduction
v. Modus Facto
f) Circle the topic(s) below that was (were) not covered this semester.
i. Byzantine Agreements
v. Simulated Annealing
g) Consider the kernel function below, where examples consist of Boolean-valued features:
Briefly describe below the main reason why this would not be a good kernel function.