Creating Variables to Store Results
Now we will initialize some NumPy matrices so that we can store the estimation results in them. Here we set up the sampling range and the zero matrices that will hold the mean squared error (MSE) and shrinkage values for both the Ledoit-Wolf and OAS estimators.
Python3
# specifying the sampling range
numOfSamplesRange = np.arange(8, 40, 1)
repeat = 150

# matrices to store the MSE of each estimator
ledoitWolf_mse = np.zeros((numOfSamplesRange.size, repeat))
oas_mse = np.zeros((numOfSamplesRange.size, repeat))

# matrices to store the shrinkage coefficients
ledoitWolf_shrinkage = np.zeros((numOfSamplesRange.size, repeat))
oas_shrinkage = np.zeros((numOfSamplesRange.size, repeat))
Parameters
- store_precision: bool, default=True - whether the estimated precision matrix is stored.
- assume_centered: bool, default=False - if True, the data are not centered before computing the covariance estimate.
- block_size: int, default=1000 - size of the blocks into which the covariance matrix is split during its Ledoit-Wolf estimation; this is a memory optimization only and does not affect the result.
Attributes
- covariance_: array of shape (n_features, n_features) - the estimated covariance matrix.
- precision_: array of shape (n_features, n_features) - the estimated pseudo-inverse (precision) matrix; stored only if store_precision is True.
- shrinkage_: float - coefficient in the convex combination used to compute the shrunk estimate, in the range [0, 1].
- n_features_in_: int - number of features seen during fit.
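As a quick, standalone illustration of these parameters and attributes, the sketch below fits a LedoitWolf estimator on random data (the array shapes and random seed here are arbitrary choices for the example):

```python
import numpy as np
from sklearn.covariance import LedoitWolf

rng = np.random.RandomState(0)
X = rng.normal(size=(50, 5))  # 50 samples, 5 features

# store_precision=False skips storing the precision matrix;
# assume_centered=True skips mean subtraction before estimation
lw = LedoitWolf(store_precision=False, assume_centered=True).fit(X)

print(lw.covariance_.shape)  # (n_features, n_features) -> (5, 5)
print(lw.shrinkage_)         # a float in [0, 1]
print(lw.n_features_in_)     # 5
```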
Now, we will calculate the MSE and shrinkage values using a nested for loop, fitting both models step by step:
Python3
# nested loop: outer over sample sizes, inner over repetitions
for i, numOfSamples in enumerate(numOfSamplesRange):
    for j in range(repeat):
        X = np.dot(np.random.normal(size=(numOfSamples, numOffeatures)),
                   colorMatrix.T)

        # MSE and shrinkage for the Ledoit-Wolf estimator
        ledoitWolf = LedoitWolf(store_precision=False, assume_centered=True)
        ledoitWolf.fit(X)
        ledoitWolf_mse[i, j] = ledoitWolf.error_norm(realTimeCovariance,
                                                     scaling=False)
        ledoitWolf_shrinkage[i, j] = ledoitWolf.shrinkage_

        # MSE and shrinkage for the OAS estimator
        oas = OAS(store_precision=False, assume_centered=True)
        oas.fit(X)
        oas_mse[i, j] = oas.error_norm(realTimeCovariance, scaling=False)
        oas_shrinkage[i, j] = oas.shrinkage_
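After the loop, each result matrix holds one value per (sample size, repetition) pair. A natural next step is to average over the repetition axis to get one MSE value per sample size. The sketch below assumes result matrices of the same shape as above, filled here with random stand-in values purely for illustration:

```python
import numpy as np

# hypothetical stand-ins for the (n_sample_sizes, repeat) result
# matrices filled in the nested loop; np.arange(8, 40, 1) has 32 entries
rng = np.random.RandomState(42)
ledoitWolf_mse = rng.uniform(1.0, 2.0, size=(32, 150))
oas_mse = rng.uniform(1.0, 2.0, size=(32, 150))

# averaging over the repeat axis gives one MSE value per sample size,
# ready to plot against the sampling range
lw_mean_mse = ledoitWolf_mse.mean(axis=1)
oas_mean_mse = oas_mse.mean(axis=1)
print(lw_mean_mse.shape)  # (32,)
```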
Ledoit-Wolf vs OAS Estimation in Scikit Learn
Generally, shrinkage is used to regularize the usual maximum-likelihood covariance estimate. Ledoit and Wolf proposed a closed-form formula, known as the Ledoit-Wolf covariance estimator, that computes the asymptotically optimal shrinkage parameter by minimizing a mean squared error (MSE) criterion. Later, Chen et al. improved on the Ledoit-Wolf shrinkage parameter by proposing the Oracle Approximating Shrinkage (OAS) coefficient, whose convergence is significantly better under the assumption that the data are Gaussian.
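To make the comparison concrete, here is a minimal sketch that fits both estimators on a small Gaussian sample and reads off their automatically chosen shrinkage coefficients (the true covariance matrix and sample size here are arbitrary choices for the example):

```python
import numpy as np
from sklearn.covariance import LedoitWolf, OAS

rng = np.random.RandomState(0)
true_cov = np.diag([1.0, 2.0, 3.0])
X = rng.multivariate_normal(mean=np.zeros(3), cov=true_cov, size=20)

lw = LedoitWolf().fit(X)
oa = OAS().fit(X)

# both estimators pick their shrinkage coefficient automatically
print("Ledoit-Wolf shrinkage:", lw.shrinkage_)
print("OAS shrinkage:", oa.shrinkage_)

# error_norm measures how far each estimate is from the true covariance
print("Ledoit-Wolf error:", lw.error_norm(true_cov))
print("OAS error:", oa.error_norm(true_cov))
```

With few samples and Gaussian data, OAS typically selects a shrinkage value at least as good as Ledoit-Wolf's, which is what the MSE comparison in this article measures at scale.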
Some basic concepts needed to develop the Ledoit-Wolf and OAS estimators are given below, step by step: