This week we learnt Regression and Neural Networks (NN). Both techniques were taught in year two, but this time the topics are covered in more detail.

Regression analysis relates one or more numeric input attributes to a single numeric output attribute. The focus is on the relationship between a dependent variable and one or more independent variables.

The lecture covered three different models:
- Linear Regression: fits a straight-line graph
- Nonlinear Regression: usually a curve
- Logistic Regression: for categorical output, such as y = 0 or 1
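As a quick illustration of the first type, a straight line can be fitted with ordinary least squares. This is a minimal sketch; the data points here are made up for illustration, not from the lecture:

```python
import numpy as np

# Hypothetical data: one numeric input attribute, one numeric output.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.0, 6.2, 7.9, 10.1])

# Fit a straight line y = a*x + b by least squares.
A = np.column_stack([x, np.ones_like(x)])
(a, b), *_ = np.linalg.lstsq(A, y, rcond=None)

print(f"slope = {a:.2f}, intercept = {b:.2f}")
```

The fitted slope comes out close to 2, matching how the fake data was constructed.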
Output of Regression
R2 is the proportion of the variation in the actual values that the model explains; a model with a higher R2 fits the data more closely. Adding more input attributes will never decrease R2, so a data analyst can inflate it simply by adding variables. Adjusted R2 corrects for this by penalizing the calculation for the number of independent variables.
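Both values follow directly from the residuals; a minimal sketch with made-up actual and predicted values (the data and the variable count k are assumptions for illustration):

```python
import numpy as np

# Hypothetical actual outputs and a model's predictions.
y = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
y_pred = np.array([2.8, 5.1, 7.2, 8.7, 11.3])

n = len(y)  # number of observations
k = 1       # number of independent variables (assumed)

ss_res = np.sum((y - y_pred) ** 2)    # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)  # total sum of squares

r2 = 1 - ss_res / ss_tot
adj_r2 = 1 - (1 - r2) * (n - 1) / (n - k - 1)

print(f"R2 = {r2:.3f}, adjusted R2 = {adj_r2:.3f}")
```

Adjusted R2 is always at or below plain R2, and the gap widens as more variables are added.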
There are a lot of good examples of regression models from here.

The next thing I would like to share is Neural Networks.
A neural network is a computing system that operates a little like a human brain: it stores memory and processes at the same time, and it can work with ambiguous information.
NN can be used for both supervised and unsupervised learning. Only numeric data can be fed into an NN, and the relationships it learns between input and output need not be linear. NNs are often used in areas like loan application approval and fraud prevention.
There are two types of NN:
Feed Forward NN
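A Feed Forward NN passes the inputs through one or more hidden layers of weighted connections to produce the output. A minimal sketch of a single forward pass through a 2-input, 2-hidden-node, 1-output network; all the weight values here are illustrative assumptions, not from the lecture:

```python
import numpy as np

def sigmoid(z):
    """Squash a value into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

# Hypothetical weights and biases (in practice these are learnt from data).
W1 = np.array([[0.5, -0.3],
               [0.8,  0.2]])   # input -> hidden weights
b1 = np.array([0.1, -0.1])    # hidden-layer biases
W2 = np.array([0.7, -0.4])    # hidden -> output weights
b2 = 0.05                     # output bias

def feed_forward(x):
    hidden = sigmoid(x @ W1 + b1)      # hidden-layer activations
    return sigmoid(hidden @ W2 + b2)   # single output in (0, 1)

print(feed_forward(np.array([1.0, 0.0])))
```

The sigmoid output can be read as, for example, the probability of approving a loan application.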


Kohonen Neural Networks

The other type of NN is the Kohonen Neural Network, which performs unsupervised mining. Unlike a Feed Forward NN, it has no hidden layer. Instances fed into the network are assigned to the most appropriate cluster, represented by the nodes at the output layer, based on the input values and connection weights.
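That assign-and-update loop can be sketched in a few lines. The data, the two output nodes, and the learning-rate schedule below are all assumptions for illustration; the node weights are seeded from one instance of each group so the example converges stably:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-D instances forming two loose groups.
data = np.vstack([
    rng.normal([0.0, 0.0], 0.1, size=(20, 2)),
    rng.normal([1.0, 1.0], 0.1, size=(20, 2)),
])

# Two output nodes, each holding a weight vector (no hidden layer).
# Seed them from one instance of each group for a stable illustration.
weights = data[[0, 20]].copy()
lr = 0.5

for _ in range(20):
    for x in data:
        # Assign the instance to the closest output node (the winner)...
        winner = np.argmin(np.linalg.norm(weights - x, axis=1))
        # ...and pull the winner's weights toward the instance.
        weights[winner] += lr * (x - weights[winner])
    lr *= 0.9  # decay the learning rate each pass

# Final cluster assignment for every instance.
clusters = [int(np.argmin(np.linalg.norm(weights - x, axis=1))) for x in data]
print(weights.round(2))
```

After training, each output node's weight vector sits near the centre of one group, so the two groups end up in different clusters.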
You may find this page interesting, as it gives the history and background of NN along with many good examples.