(a) What is the difference between Normalisation and Standardisation? (b) When do we usually prefer Standardisation to Normalisation?

Question


Solution

(a) Normalisation and Standardisation are both techniques used in preparing data for machine learning algorithms. However, they differ in their methods and purposes.

Normalisation, also known as Min-Max Scaling, is a scaling technique where the values are shifted and rescaled so that they end up ranging between 0 and 1. It is done through the following formula:

Normalized Value = (Value - Min Value) / (Max Value - Min Value)
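The formula above can be sketched in a few lines of Python (a minimal illustration using plain lists; in practice a library scaler such as scikit-learn's MinMaxScaler does the same work on arrays):

```python
def min_max_scale(values):
    """Rescale values so the minimum maps to 0 and the maximum to 1."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

data = [10, 20, 30, 40, 50]
print(min_max_scale(data))  # → [0.0, 0.25, 0.5, 0.75, 1.0]
```

Note that every output is guaranteed to lie in [0, 1], since the smallest input maps to 0 and the largest to 1.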

Standardisation, on the other hand, is a scaling technique where the values are centered around the mean with a unit standard deviation. This means that the mean of the attribute becomes zero and the resultant distribution has a unit standard deviation. It is done through the following formula:

Standardized Value = (Value - Mean) / Standard Deviation
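A corresponding sketch for Standardisation (using the population standard deviation; library implementations such as scikit-learn's StandardScaler behave the same way):

```python
def standardise(values):
    """Center values at mean 0 and rescale to unit standard deviation."""
    n = len(values)
    mean = sum(values) / n
    # Population standard deviation
    std = (sum((v - mean) ** 2 for v in values) / n) ** 0.5
    return [(v - mean) / std for v in values]

data = [10, 20, 30, 40, 50]
scaled = standardise(data)
# The result has mean 0 and standard deviation 1,
# but individual values are NOT confined to any fixed range.
print(scaled)
```

Unlike Min-Max scaling, the outputs here can be any real number; only their mean and spread are fixed.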

(b) We usually prefer Standardisation over Normalisation when the data has unknown boundaries, or when we have to deal with outliers. This is because Standardisation does not confine values to a bounded range the way Normalisation does, and it is not as heavily influenced by outliers. Normalisation, in contrast, is sensitive to outliers: a single extreme value stretches the min-max range and squashes all the other values into a narrow band, so if there are extreme values in your data, Normalisation might not be the best approach.
