Neural Networks - Unit Wise Questions
1. Differentiate between the Quasi-Newton method and the nonlinear conjugate-gradient algorithm for supervised training of a multilayer perceptron, and explain the conjugate-gradient algorithm. (3+7)
1. Differentiate between small-scale and large-scale learning problems. How can heuristics be applied to make the back-propagation algorithm perform better? (4+6)
2. How can the XOR (Exclusive OR) problem be solved by a Radial-Basis Function (RBF) network? Draw the necessary figures and perform the required calculations to complete the specification of the RBF network. (10)
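For reference, a minimal sketch of the idea behind this question (not part of the question itself): using Gaussian hidden units of unit width with centres t1 = (1, 1) and t2 = (0, 0), as in the classic construction, the four XOR inputs become linearly separable in the hidden space. The unit width is an illustrative assumption.

```python
import math

def phi(x, t):
    # Gaussian radial-basis function with unit width (width is an assumption)
    d2 = sum((a - b) ** 2 for a, b in zip(x, t))
    return math.exp(-d2)

centers = [(1, 1), (0, 0)]  # centres from the classic XOR construction
points = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

for x, label in points.items():
    hidden = tuple(phi(x, t) for t in centers)
    print(x, "->", [round(h, 3) for h in hidden], "class", label)
```

Note that (0, 1) and (1, 0) map to the same hidden point while (0, 0) and (1, 1) land away from it on opposite sides, so a single linear output unit can now separate the two classes.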
3. Explain the real-time recurrent learning algorithm for training a recurrent network with a practical example. (10)
2. State Cover's theorem on the separability of patterns. Explain the hybrid learning technique for RBF networks. (3+7)
3. What is the vanishing-gradient problem in recurrent networks? How can it be solved? Explain with the necessary equations. (4+6)
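As an illustration of the phenomenon this question asks about: in a scalar recurrent network, the gradient back-propagated through T time steps is scaled by a product of factors w * sigma'(a_t), and when each factor has magnitude below one the product decays exponentially. The weight and activation values below are illustrative assumptions.

```python
import math

def dsigmoid(a):
    # derivative of the logistic sigmoid
    s = 1.0 / (1.0 + math.exp(-a))
    return s * (1.0 - s)

# Back-propagated factor through T steps of a scalar recurrent unit:
# dL/dh_0 scales like the product over t of w * sigma'(a_t).
w = 0.9      # recurrent weight (assumed value)
grad = 1.0
for t in range(50):
    grad *= w * dsigmoid(0.0)  # sigma'(0) = 0.25, so each factor is 0.225
print(grad)  # vanishingly small after 50 steps
```

Gated architectures such as LSTM mitigate this by letting the error flow through an additive cell state instead of repeated multiplications.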
4. What are the three basic rules that depict the flow of signals in a neural network viewed as a directed graph? (5)
4. Define neural network. Briefly explain the working mechanism of a biological neuron with its related functional units. (1+4)
6. What are the assumptions that need to be considered while estimating parameters in a Gaussian environment? (5)
5. What is a perceptron? Explain the batch perceptron algorithm. (1+4)
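For reference, a minimal sketch of the batch perceptron update: accumulate the correction over all misclassified samples, then apply it once per epoch. The AND data set, learning rate, and bias handling below are illustrative assumptions, not part of the question.

```python
# Batch perceptron sketch; bias is folded into the weight vector as x[0] = 1.
data = [((1, 0, 0), -1), ((1, 0, 1), -1), ((1, 1, 0), -1), ((1, 1, 1), 1)]  # AND
w = [0.0, 0.0, 0.0]
eta = 1.0  # learning rate (assumed)

for epoch in range(100):
    # accumulate the correction over ALL misclassified samples, then update once
    delta = [0.0, 0.0, 0.0]
    errors = 0
    for x, d in data:
        if d * sum(wi * xi for wi, xi in zip(w, x)) <= 0:  # misclassified
            errors += 1
            delta = [dv + eta * d * xi for dv, xi in zip(delta, x)]
    if errors == 0:
        break
    w = [wi + dv for wi, dv in zip(w, delta)]
print(w)
```

This contrasts with the single-sample perceptron rule, which updates the weights after every misclassified example rather than once per pass over the data.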
7. Draw the block diagram of the signal-flow graph representation of the LMS algorithm and express the evolution of the weight vector. (5)
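The weight-vector evolution asked for here is w(n+1) = w(n) + eta * e(n) * x(n), with error e(n) = d(n) - y(n). A minimal sketch follows; the 2-tap target system and step size are illustrative assumptions.

```python
import random

random.seed(0)

# LMS sketch: adapt w to track an (assumed) unknown 2-tap linear system w_true.
w_true = [0.5, -0.3]
w = [0.0, 0.0]
eta = 0.05  # step size (assumed; must be small enough for stability)

for n in range(5000):
    x = [random.uniform(-1, 1) for _ in w]          # input vector x(n)
    d = sum(wt * xi for wt, xi in zip(w_true, x))   # desired response d(n)
    y = sum(wi * xi for wi, xi in zip(w, x))        # filter output y(n)
    e = d - y                                       # error e(n) = d(n) - y(n)
    w = [wi + eta * e * xi for wi, xi in zip(w, x)] # w(n+1) = w(n) + eta*e(n)*x(n)
print(w)  # approaches w_true
```

Unlike the Wiener filter, which computes the optimum weights in closed form from known second-order statistics, LMS estimates them iteratively from the data stream.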
6. Highlight the minimum description-length principle. Explain the instrumental-variables method. (1+4)
8. Describe four heuristics that provide guidelines for accelerating the convergence of back-propagation learning through learning-rate adaptation. (5)
7. Explain the LMS algorithm. How does it differ from the Wiener filter? (4+1)
9. Differentiate between an RBF network and an MLP network. (5)
10. Differentiate between the Willshaw-von der Malsburg model and the Kohonen model. (5)
8. What are the properties of the feature map? Explain the Kernel Self-Organizing Map. (1+4)
11. Explain Theorem 1 with respect to the computational power of a recurrent network. (5)
12. Write short notes on: (2 × 2.5 = 5)
a. Wiener filter
b. Supervised learning
9. What is the universal approximation theorem? How can real-time recurrent learning be achieved? (1+4)
10. Explain the hybrid learning concept in RBF networks. (5)
11. Differentiate between batch learning and on-line learning. How is the learning rate controlled using optimal annealing? Explain the concept of network pruning. (1+2+2)
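One common learning-rate schedule discussed alongside optimal annealing is the search-then-converge schedule of Darken and Moody: the rate stays near eta0 for roughly the first tau iterations, then decays like 1/n. The values of eta0 and tau below are illustrative assumptions.

```python
# Search-then-converge annealing schedule for on-line learning-rate control.
eta0, tau = 0.1, 100.0  # initial rate and search-time constant (assumed)

def eta(n):
    # eta(n) = eta0 / (1 + n/tau): roughly constant for n << tau,
    # decaying like eta0 * tau / n for n >> tau
    return eta0 / (1.0 + n / tau)

for n in (0, 100, 1000):
    print(n, eta(n))
```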
12. Write short notes on: (2 × 2.5 = 5)
a. Convolutional networks
b. Cross-validation