


CMSC 475/675 Introduction to Neural Networks Fall 2009

Exam 1

1. Briefly define the following terms (5 points each)

a) Hidden nodes.

b) Radial basis functions (RBF).

c) Momentum term in backpropagation learning.

d) Recurrent network.

2. (10 points) The function NAND(x, y) is defined as NAND(x, y) = NOT(AND(x, y)). Design a single layer neural network of one output node and two input nodes to compute NAND(x, y). The output node should be a threshold node, and the input/output values should be bipolar.
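One admissible design can be checked mechanically. The sketch below assumes the weights w1 = w2 = -1 and bias b = 1, which are one valid choice (not the only one), with bipolar values +1 for true and -1 for false:

```python
# Sketch: a bipolar threshold unit with one hypothetical weight choice
# (w1 = w2 = -1, bias b = 1) that computes NAND.

def threshold_unit(x, y, w1=-1, w2=-1, b=1):
    """Bipolar threshold node: output +1 if net >= 0, else -1."""
    net = w1 * x + w2 * y + b
    return 1 if net >= 0 else -1

def nand_bipolar(x, y):
    """Reference NAND on bipolar values (+1 = true, -1 = false)."""
    return -1 if (x == 1 and y == 1) else 1

# The unit agrees with NAND on all four bipolar input patterns.
for x in (-1, 1):
    for y in (-1, 1):
        assert threshold_unit(x, y) == nand_bipolar(x, y)
```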

3. Short questions (10 points each)

a) What are the major differences between the human brain and the von Neumann architecture?

b) What is the overfitting problem in supervised learning? Can you think of a way to ease this problem?

c) What is the problem of linear separability? (You may use an example to illustrate your points.)
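The classic example for this question is XOR. As a hedged illustration (a brute-force search over a coarse weight grid, not a proof), the sketch below shows that no bipolar threshold unit on the grid reproduces XOR:

```python
from itertools import product

# Sketch: brute-force evidence that bipolar XOR is not linearly
# separable -- no threshold unit w1*x + w2*y + b >= 0 reproduces it
# over a coarse grid of candidate weights. (Weight range and grid
# step are illustrative choices.)

def xor_bipolar(x, y):
    return -1 if x == y else 1

patterns = [(-1, -1), (-1, 1), (1, -1), (1, 1)]
grid = [i / 2 for i in range(-8, 9)]  # candidate weights in [-4, 4]

separable = any(
    all((1 if w1 * x + w2 * y + b >= 0 else -1) == xor_bipolar(x, y)
        for x, y in patterns)
    for w1, w2, b in product(grid, repeat=3)
)
# separable stays False: no line over this grid classifies XOR correctly.
```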

4. (40 points) This question concerns backpropagation (BP) learning

a) Explain how the error is backpropagated down from the output layer to the hidden layer.
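The mechanism asked about in (a) can be sketched numerically. The sketch below assumes one hidden layer of sigmoid units and illustrative values; the names (`delta_out`, `W_out`, `net_hidden`) are assumptions, not the course's exact notation:

```python
import numpy as np

# Sketch of error backpropagation from the output layer to the hidden
# layer. Hypothetical sizes: 2 output nodes, 3 sigmoid hidden nodes.

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

delta_out = np.array([0.1, -0.05])           # output-layer error terms
W_out = np.array([[0.2, -0.4, 0.5],          # weights hidden -> output
                  [0.3,  0.1, -0.2]])
net_hidden = np.array([0.5, -1.0, 0.0])      # hidden-layer net inputs

# Each hidden node's delta is its activation derivative times the
# weighted sum of the deltas of the output nodes it feeds:
#   delta_j = f'(net_j) * sum_k delta_k * w_kj
s = sigmoid(net_hidden)
delta_hidden = s * (1 - s) * (W_out.T @ delta_out)
```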

b) Quickprop is widely used to speed up BP learning. Describe how quickprop determines the weight update (e.g., Δw).
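As a reminder of the rule being asked about: quickprop treats the error curve along each weight as a parabola through the current and previous slopes and jumps toward the parabola's minimum, Δw(t) = S(t) / (S(t-1) - S(t)) · Δw(t-1). A hedged one-dimensional sketch (variable names and the growth cap are illustrative):

```python
# Sketch of the quickprop update for a single weight (names illustrative).
# S(t) is the current error slope dE/dw, S(t-1) the previous slope:
#   delta_w(t) = S(t) / (S(t-1) - S(t)) * delta_w(t-1)

def quickprop_step(slope_now, slope_prev, delta_w_prev, max_growth=1.75):
    """One quickprop weight update; max_growth caps the step ratio."""
    denom = slope_prev - slope_now
    if denom == 0:
        return 0.0  # equal slopes: no parabola jump can be estimated
    step = slope_now / denom * delta_w_prev
    # Common safeguard: limit how much the step may grow per iteration.
    limit = abs(max_growth * delta_w_prev)
    return max(-limit, min(limit, step))
```

For example, with slopes 0.5 (now) and 1.0 (previous) and a previous step of -0.1, the parabola estimate gives another step of -0.1.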

c) What is the network paralysis problem in BP learning? Can you think of a way to ease this problem?

d) What is the reason that BP learning is not guaranteed to reduce the total square error to zero even if the learning rate is very small? Can you think of a way to ease this problem?

5. (20 points) THIS QUESTION IS FOR PEOPLE REGISTERED FOR CMSC675 ONLY.

The network below is for binary pattern classification. It has the same architecture as Adaline except that the output node has the sigmoid activation function S(net) = 1 / (1 + e^(-net)). P training samples {(x^p, d^p), p = 1, ..., P} are used to train the weights, including the bias w_0.

a) Give the formula for net, the net input to the output node.

b) Give the formula for the total square error over all P training patterns.

c) Derive the weight update rule that minimizes the total error defined in (b) based on the gradient descent approach.

d) Does the learning converge when the classification problem is not linearly separable? Justify your answer.
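The gradient-descent rule asked for in (c) can be checked numerically. The sketch below is under assumed notation (net^p = Σ_i w_i x_i^p with x_0^p = 1 carrying the bias, and E = ½ Σ_p (d^p - S(net^p))²; a ½ factor is assumed for convenience):

```python
import numpy as np

# Sketch: verify the analytic gradient for a single sigmoid output node
# against a central finite difference. Data and weights are arbitrary
# illustrative values.

def sigmoid(net):
    return 1.0 / (1.0 + np.exp(-net))

def error(w, X, d):
    """Total square error E = 1/2 * sum_p (d^p - S(net^p))^2."""
    return 0.5 * np.sum((d - sigmoid(X @ w)) ** 2)

def analytic_grad(w, X, d):
    """dE/dw_i = -sum_p (d^p - o^p) * S'(net^p) * x_i^p."""
    o = sigmoid(X @ w)
    return -X.T @ ((d - o) * o * (1 - o))

rng = np.random.default_rng(0)
X = np.hstack([np.ones((4, 1)), rng.normal(size=(4, 2))])  # bias column x_0 = 1
d = np.array([0.0, 1.0, 1.0, 0.0])
w = rng.normal(size=3)

# Central finite-difference check of the bias component of the gradient.
eps = 1e-6
e0 = np.eye(3)[0]
g_num = (error(w + eps * e0, X, d) - error(w - eps * e0, X, d)) / (2 * eps)
assert abs(analytic_grad(w, X, d)[0] - g_num) < 1e-6
```

The gradient-descent update is then Δw_i = -η · dE/dw_i = η Σ_p (d^p - o^p) S'(net^p) x_i^p.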

-----------------------

[Figure for Question 5: a single-output network in the style of Adaline — inputs x_1, ..., x_n plus a constant input 1 (for the bias w_0) feeding one sigmoid output node.]
