Convert matrix columns to z-scores - notation
I am taking a course on machine learning in which I am asked to convert a matrix's columns to z-scores. The assignment says: subtract the mean value of each feature from the dataset (columns of X); after subtracting the mean, additionally scale (divide) the feature values by their respective "standard deviations."

This is not hard, and I have written a small Python function for it, but I was wondering what the vectorized version of this function would look like. So I set up some equations that I could then express in Python, but I have done some unconventional things to get there, and I would like to know what the conventional notation would be. Here is what I've got:

$$
z: R^{m\times n}\rightarrow R^{m\times n}\\
z(X) = \left( X - 1_{m}\,\mu(x)^{T} \right) \otimes 1_{m}\,\sigma(x)^{T}
$$

Insecurities...
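For concreteness, here is a minimal NumPy sketch of the column-wise z-score computation described above (the function name `zscore_columns` is just illustrative, not part of the course material):

```python
import numpy as np

def zscore_columns(X):
    """Standardize each column of X: subtract the column mean and
    divide by the column standard deviation."""
    mu = X.mean(axis=0)        # per-column means, shape (n,)
    sigma = X.std(axis=0)      # per-column standard deviations, shape (n,)
    return (X - mu) / sigma    # broadcasting applies the operation column-wise

# Small usage example
X = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])
print(zscore_columns(X))
```

Broadcasting here plays the role of the outer products $1_{m}\,\mu(x)^{T}$ and $1_{m}\,\sigma(x)^{T}$ in the formula: the row vectors `mu` and `sigma` are implicitly replicated across all $m$ rows.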