Learn how to identify activation functions in CNN exams. Understand why the graph represents the tanh function and how it differs from sigmoid, ReLU, and ELU.
Question
Which activation function is shown in the graph?
A. sigmoid
B. tanh
C. ReLU
D. ELU
Answer
B. tanh
Explanation
The graph provided in the image represents the tanh (hyperbolic tangent) activation function. Here’s why:
Range of Outputs
- The tanh function outputs values between -1 and 1, as seen in the graph.
- This is distinct from the sigmoid function, which outputs values between 0 and 1 (the sketch after this list compares the two ranges numerically).
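A minimal NumPy sketch makes the range difference concrete (the sigmoid helper is defined here just for illustration). It also checks the identity tanh(x) = 2·sigmoid(2x) − 1, which is why the two curves share the same S-shape but different ranges:

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid: squashes any real input into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

x = np.linspace(-6.0, 6.0, 7)
print(np.round(np.tanh(x), 3))   # spans (-1, 1), centered on 0
print(np.round(sigmoid(x), 3))   # spans (0, 1), centered on 0.5

# tanh is a rescaled, recentered sigmoid: tanh(x) = 2*sigmoid(2x) - 1
assert np.allclose(np.tanh(x), 2.0 * sigmoid(2.0 * x) - 1.0)
```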
Shape of the Curve
- The graph shows a smooth, S-shaped curve that is symmetric about the origin (0, 0). This symmetry is a key characteristic of the tanh function.
- In contrast, ReLU (Rectified Linear Unit) is not S-shaped; it is linear for positive inputs and outputs exactly zero for negative inputs.
- Similarly, ELU (Exponential Linear Unit) is linear for positive inputs, while for negative inputs it follows an exponential curve that saturates at a constant negative floor, so it is not symmetric about the origin either (see the sketch after this list).
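For comparison, here is a short sketch of the shape differences (the relu and elu helpers are written here for illustration; alpha=1.0 is a common ELU default). The oddness check tanh(−x) = −tanh(x) is exactly the origin symmetry the graph exhibits, while ReLU and ELU fail it:

```python
import numpy as np

def relu(x):
    # Linear for x > 0, exactly zero for x <= 0 -- no S-shape, no negative outputs
    return np.maximum(0.0, x)

def elu(x, alpha=1.0):
    # Linear for x > 0; alpha * (e^x - 1) for x <= 0, saturating at -alpha
    return np.where(x > 0, x, alpha * np.expm1(x))

x = np.linspace(-3.0, 3.0, 13)

# tanh is an odd function -- symmetric about the origin, like the graph
assert np.allclose(np.tanh(-x), -np.tanh(x))

# ReLU and ELU are not odd: their negative and positive sides differ in shape
print(relu(-2.0), relu(2.0))   # 0.0 2.0
print(elu(-2.0), elu(2.0))     # about -0.8647 and 2.0
```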
Behavior at Extremes
- For large positive inputs (x → +∞), tanh(x) → 1.
- For large negative inputs (x → −∞), tanh(x) → −1.
- This matches the behavior depicted in the graph; the sketch below checks these limits numerically.
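A quick numerical check of the saturation (the sample points are chosen arbitrarily for illustration):

```python
import numpy as np

# tanh saturates toward +1 and -1 as |x| grows, matching the graph's flat tails
for x in (2.0, 5.0, 10.0):
    print(f"tanh({x:+.0f}) = {np.tanh(x):+.6f}   tanh({-x:+.0f}) = {np.tanh(-x):+.6f}")
# tanh(+2) = +0.964028   tanh(-2) = -0.964028
# tanh(+5) = +0.999909   tanh(-5) = -0.999909
# tanh(+10) = +1.000000  tanh(-10) = -1.000000  (rounds to 1; the true limit is 1)
```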
By analyzing the graph’s range, symmetry, and curve shape, we can confidently conclude that the correct answer is B. tanh.