AI Neural Networks Question:


Having multiple perceptrons can solve the XOR problem satisfactorily: each perceptron can partition off a linearly separable part of the space, and their results can then be combined.
a) True - this always works, and the multiple perceptrons learn to classify even complex problems.
b) False - perceptrons are mathematically incapable of solving linearly inseparable functions, no matter what you do.
c) True - perceptrons can do this but are unable to learn to do it; they have to be explicitly hand-coded.
d) False - just having a single perceptron is enough.


Answer:

c) True - perceptrons can do this but are unable to learn to do it; they have to be explicitly hand-coded.
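
To see why option (c) is true, the sketch below (hypothetical, not from the source) hand-wires three perceptrons - one computing OR, one computing NAND, and one combining them with AND - to implement XOR. The weights and thresholds are fixed by hand rather than learned, which is exactly the limitation the answer points to.

```python
def perceptron(inputs, weights, threshold):
    """Output 1 if the weighted sum of inputs exceeds the threshold, else 0."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

def xor(x1, x2):
    # First layer: each perceptron carves out one linearly separable region.
    or_out = perceptron((x1, x2), weights=(1, 1), threshold=0.5)        # x1 OR x2
    nand_out = perceptron((x1, x2), weights=(-1, -1), threshold=-1.5)   # NOT (x1 AND x2)
    # Second layer: intersect the two regions with an AND perceptron.
    return perceptron((or_out, nand_out), weights=(1, 1), threshold=1.5)

for a in (0, 1):
    for b in (0, 1):
        print(f"XOR({a}, {b}) = {xor(a, b)}")
```

Each first-layer perceptron draws one linear boundary; the second layer intersects the two half-planes, which is something a single perceptron cannot do on its own.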


A perceptron adds up all the weighted inputs it receives, and if the sum exceeds a certain threshold value, it outputs a 1; otherwise it outputs a 0.
a) True
b) False
c) Sometimes - it can also output intermediate values
d) Can't say
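
As a quick numeric illustration of the rule described in this question, consider the following; the inputs, weights, and threshold are made up for the example.

```python
# Worked example of the summation-and-threshold rule (numbers are illustrative).
inputs = [1, 0, 1]
weights = [0.5, 0.9, 0.25]
threshold = 0.6

weighted_sum = sum(w * x for w, x in zip(weights, inputs))  # 0.5 + 0.0 + 0.25 = 0.75
output = 1 if weighted_sum > threshold else 0               # 0.75 > 0.6, so output 1
print(weighted_sum, output)  # 0.75 1
```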
The network that involves backward links from the output to the input and hidden layers is called ____.
a) Self-organizing maps
b) Perceptrons
c) Recurrent neural network
d) Multi-layered perceptron
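
A network with such backward (feedback) links is recurrent: the output produced at one step is routed back and combined with the next input. The sketch below is a minimal illustration with made-up weight shapes, not any particular library's API.

```python
import numpy as np

# Minimal sketch of a backward link: the previous output re-enters the network
# alongside the new input at the next time step (weights are random placeholders).
rng = np.random.default_rng(0)
W_in = rng.normal(size=(3, 2))    # weights applied to [current input, previous output]
W_out = rng.normal(size=(1, 3))   # hidden-to-output weights

prev_output = np.zeros(1)
for x in ([0.0], [1.0], [0.5]):                   # a tiny input sequence
    combined = np.concatenate([x, prev_output])   # backward link from output to input
    hidden = np.tanh(W_in @ combined)
    prev_output = np.tanh(W_out @ hidden)
    print(prev_output)
```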