Exercise 1 – Original Perceptron (Rosenblatt)
The original perceptron, proposed by Frank Rosenblatt in 1958, is a simple single-layer binary classifier. It learns by adjusting its weights based on prediction errors. The output is computed as:

$$ y = \begin{cases} 1 & \text{if } \sum_{i=1}^{n} w_i x_i + b \geq 0 \\ 0 & \text{otherwise} \end{cases} $$

During training, the weights are updated as:

$$ w_i := w_i + \eta \, (y_{\text{true}} - y_{\text{pred}}) \, x_i $$

Where:
- \( x_i \): input feature
- \( w_i \): weight
- \( b \): bias
- \( \eta \): learning rate
- \( y_{\text{true}} \): actual label
- \( y_{\text{pred}} \): predicted output

Which of the following best describes the behavior and training method of the original perceptron? (A short code sketch of this update rule appears after the answer options.) {blank1}
Drag these lines into the blanks:
It adjusts weights based on classification error and uses a step function as activation.
It performs matrix inversion and probabilistic estimation to infer outputs.
It is a trainable neuron that uses sigmoid activation and gradient descent.
It encodes binary logic gates through fixed weights and thresholds.
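As a concrete illustration of the update rule above, here is a minimal Python sketch of a Rosenblatt-style perceptron with a step activation. The function and variable names (`train_perceptron`, `step`, `eta`, `epochs`) are not part of the exercise; they are assumptions chosen for this example.

```python
import numpy as np

def step(z):
    """Heaviside step activation: 1 if z >= 0, else 0."""
    return 1 if z >= 0 else 0

def train_perceptron(X, y, eta=0.1, epochs=10):
    """Train a single-layer perceptron with the error-driven update rule.

    Illustrative sketch only; names and defaults are assumptions,
    not taken from the exercise text.
    """
    n_features = X.shape[1]
    w = np.zeros(n_features)   # weights w_i
    b = 0.0                    # bias b
    for _ in range(epochs):
        for x_i, y_true in zip(X, y):
            y_pred = step(np.dot(w, x_i) + b)
            # w_i := w_i + eta * (y_true - y_pred) * x_i
            error = y_true - y_pred
            w += eta * error * x_i
            b += eta * error
    return w, b

# Usage example: learn the logical AND function (linearly separable).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([0, 0, 0, 1])
w, b = train_perceptron(X, y)
print([step(np.dot(w, x) + b) for x in X])  # expected: [0, 0, 0, 1]
```

Note how the weights change only when a prediction is wrong (the error term is zero otherwise), which is the behavior the correct answer option describes.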