
16 September 2014 /R1
Name:
Group:
Pages:
Exam in Intelligent Systems
Lecturer Adrian Groza
Time: 17-20. Write clearly; unreadable answers count as wrong! If we observe any attempt at deception, the whole exam
will be graded with 0 points.
             Available Points   Achieved Points
Exercise 1         30
Exercise 2         30
Exercise 3         30
Sum                90
Midterm            90

1.
Model a small Bayesian network that represents the relationship between yellow
fingers, smoking, cancer, radiation, solar flares, and using a microwave. In this model, smoking can cause
yellow fingers and cancer. Solar flares and making microwave popcorn can cause radiation, and radiation
can cause cancer as well. The prior probability of smoking P(S) is 0.3. The prior probability of solar flares
P(F) is 0.8. The prior probability of using the microwave P(M) is 0.9. The conditional probability
tables for radiation (R), cancer (C), and yellow fingers (Y) are:
F M   P(R)
0 0   0.1
0 1   0.2
1 0   0.2
1 1   0.9

S R   P(C)
0 0   0.1
0 1   0.6
1 0   0.3
1 1   0.9

S   P(Y)
0   0.11
1   0.8
(a) What is the prior probability of cancer?
(b) What is the probability of smoking given cancer?
(c) What is the probability of smoking given cancer and radiation?
(d) What is the Markov blanket of yellow fingers?
(e) Are solar flares and using the microwave independent given cancer?
(f) What is the probability of cancer if you never use a microwave?
Bayesian networks.
Solution:
(a) First P(r = 1) = 0.702; summing out S and R then gives P(c = 1) = 0.53.
(b) The irrelevant variable is yellow fingers. P(s = 1 | c = 1) = 0.40665, P(s = 1 | c = 0) = 0.17874.
(c) When radiation and cancer are given, the irrelevant variables are yellow fingers, solar flares, and
the microwave. P(s = 1 | c = 1, r = 1) = 0.3913.
(d) The Markov blanket of a node includes its parents, its children, and the other parents of all of its
children. For the node Yellow Fingers the Markov blanket contains only the node Smoking.
(e) The topological semantics specifies that each variable is conditionally independent of its non-descendants given its parents, and that a node is conditionally independent of all other nodes given its
Markov blanket (i.e. its parents, children, and children's parents). Here Solar Flares and Microwave use are not independent given Cancer: Cancer is a descendant of their common child Radiation, so conditioning on it activates the path F → R ← M (explaining away).
(f) P(c=1|m=0)=0.2554.
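For checking, the answers above can be reproduced by brute-force enumeration of the joint distribution P(S, F, M, R, Y, C) defined by the priors and tables in the exercise. The following sketch is not part of the exam; the helper names (bern, joint, prob) are only illustrative:

from itertools import product

P_S, P_F, P_M = 0.3, 0.8, 0.9                                  # priors P(S=1), P(F=1), P(M=1)
P_R = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.2, (1, 1): 0.9}     # P(R=1 | F, M)
P_C = {(0, 0): 0.1, (0, 1): 0.6, (1, 0): 0.3, (1, 1): 0.9}     # P(C=1 | S, R)
P_Y = {0: 0.11, 1: 0.8}                                        # P(Y=1 | S)

def bern(p, v):
    # probability that a binary variable with P(X=1) = p takes value v
    return p if v == 1 else 1 - p

def joint(s, f, m, r, y, c):
    # full joint of the network, factored along the arcs S->Y, S->C, F->R, M->R, R->C
    return (bern(P_S, s) * bern(P_F, f) * bern(P_M, m) *
            bern(P_R[(f, m)], r) * bern(P_Y[s], y) * bern(P_C[(s, r)], c))

def prob(query, evidence=lambda w: True):
    # P(query | evidence) by summing the joint over all 2^6 worlds
    worlds = [dict(zip("sfmryc", bits)) for bits in product((0, 1), repeat=6)]
    num = sum(joint(**w) for w in worlds if evidence(w) and query(w))
    den = sum(joint(**w) for w in worlds if evidence(w))
    return num / den

print(prob(lambda w: w["c"] == 1))                                         # (a) ~0.532
print(prob(lambda w: w["s"] == 1, lambda w: w["c"] == 1))                  # (b) ~0.407
print(prob(lambda w: w["s"] == 1, lambda w: w["c"] == 1 and w["r"] == 1))  # (c) ~0.391
print(prob(lambda w: w["c"] == 1, lambda w: w["m"] == 0))                  # (f) ~0.255

Running the script prints approximately 0.532, 0.407, 0.391, and 0.255, matching (a), (b), (c), and (f).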
2.
Construct by hand a neural network that computes the XOR function of two inputs.
Make sure to specify what sort of units you are using.
Neural networks.
Solution: XOR (in fact any Boolean function) is easiest to construct using step-function units.
Because XOR is not linearly separable, we need a hidden layer. It turns out that just one
hidden node suffices. To design the network, we can think of the XOR function as OR with the
AND case (both inputs on) ruled out. Thus the hidden node computes AND, while the output node
computes OR but weights the output of the hidden node negatively.
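A minimal sketch of this construction in Python (not from the exam sheet; the particular weights and thresholds are just one valid choice for step-function units):

def step(x, threshold):
    # step-function unit: outputs 1 iff the weighted input reaches the threshold
    return 1 if x >= threshold else 0

def xor(x1, x2):
    h = step(x1 + x2, 1.5)              # hidden unit computes AND(x1, x2)
    return step(x1 + x2 - 2 * h, 0.5)   # output computes OR(x1, x2) with the AND case ruled out

for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", xor(a, b))    # prints 0, 1, 1, 0 for the four input pairs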
3.
Natural Language for Communication.
Consider the sentence "Someone walked slowly to the supermarket" and a lexicon consisting of the following words:
Pronoun → someone
Adv → slowly
Article → the
Verb → walked
Prep → to
Noun → supermarket
Which of the following three grammars, combined with the lexicon, generates the given sentence? Show
the corresponding parse tree(s).
Grammar (A):
S → NP VP
NP → Pronoun
NP → Article NP
VP → Verb Adv
Adv → Adv Adv
Adv → PP
PP → Prep NP
NP → Noun
Grammar (B):
S → NP VP
NP → Pronoun
NP → Article Noun
VP → VP PP
VP → VP Adv Adv
VP → Verb
PP → Prep NP
NP → Noun
Grammar (C):
S → NP VP
NP → Pronoun
NP → Noun
NP → Article NP
VP → Verb Vmod
Vmod → Adv Vmod
Vmod → Adv
Adv → PP
PP → Prep NP
For each of the preceding three grammars, write down three sentences of English (or Romanian) and
three sentences of non-English (or non-Romanian) generated by the grammar. Each sentence should be
significantly different, should be at least six words long, and should include some new lexical entries.
Suggest ways to improve each grammar to avoid generating the non-English sentences.
Solution: Grammar (B) does not work, because there is no way for the verb "walked" followed by the
adverb "slowly" and the prepositional phrase "to the supermarket" to be parsed as a verb phrase. Grammars (A) and (C) do generate the sentence.
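One way to check the three grammars mechanically is to combine each of them with the lexicon and run a chart parser on the sentence. The sketch below assumes the NLTK library is available; the grammar strings restate the rules listed in the exercise:

import nltk

LEXICON = """
Pronoun -> 'someone'
Verb -> 'walked'
Adv -> 'slowly'
Prep -> 'to'
Article -> 'the'
Noun -> 'supermarket'
"""

GRAMMARS = {
    "A": "S -> NP VP\nNP -> Pronoun | Article NP | Noun\n"
         "VP -> Verb Adv\nAdv -> Adv Adv | PP\nPP -> Prep NP\n",
    "B": "S -> NP VP\nNP -> Pronoun | Article Noun | Noun\n"
         "VP -> VP PP | VP Adv Adv | Verb\nPP -> Prep NP\n",
    "C": "S -> NP VP\nNP -> Pronoun | Noun | Article NP\n"
         "VP -> Verb Vmod\nVmod -> Adv Vmod | Adv\nAdv -> PP\nPP -> Prep NP\n",
}

sentence = "someone walked slowly to the supermarket".split()

for name, rules in GRAMMARS.items():
    grammar = nltk.CFG.fromstring(rules + LEXICON)            # grammar rules plus the shared lexicon
    trees = list(nltk.ChartParser(grammar).parse(sentence))   # all parse trees for the sentence
    print("Grammar (%s): %d parse tree(s)" % (name, len(trees)))
    for tree in trees:
        tree.pretty_print()

Grammar (B) yields no parse tree, while (A) and (C) each yield a parse tree for the sentence.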