Approach

To find the unmixing matrix that yields independent components, we have to rely on non-Gaussianity. Let's see how we can do this.

Here, we will need to maximize the kurtosis of the projected data, which drives the distribution away from Gaussianity. This will result in independent components. The following diagram illustrates fast ICA:

For this, we have the FastICA library in Python.
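To build intuition for why kurtosis serves as a measure of non-Gaussianity, here is a small sketch (not part of the original walkthrough) using `scipy.stats.kurtosis`: a Gaussian sample has excess kurtosis near 0, while a heavy-tailed Laplace sample has clearly positive excess kurtosis, which is what FastICA exploits:

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.RandomState(0)
gaussian = rng.normal(size=100_000)   # excess kurtosis close to 0
laplace = rng.laplace(size=100_000)   # excess kurtosis close to 3 (super-Gaussian)

# kurtosis() returns *excess* kurtosis by default (Gaussian maps to 0)
print(kurtosis(gaussian))
print(kurtosis(laplace))
```

A projection of mixed signals that maximizes this statistic is, loosely speaking, "as non-Gaussian as possible", which is how FastICA separates sources.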

Let's look at how we can execute this in Python. We will work with the same Iris data. This might not be an ideal dataset for ICA, but we will use it for illustrative purposes. To execute the code in Python, we will need to perform the following steps:

  1. First, we need to load the library:
import numpy as np # linear algebra
import pandas as pd # data processing
import matplotlib.pyplot as plt
from sklearn import datasets
  2. Now, we need to load the data:
iris = datasets.load_iris()
X = iris.data
y = iris.target
  3. Let's partition the data into train and test sets:
from sklearn.model_selection import train_test_split
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.25, random_state = 0)
  4. Let's standardize the data with StandardScaler:
from sklearn.preprocessing import StandardScaler
sc = StandardScaler()
X_train = sc.fit_transform(X_train)
X_test = sc.transform(X_test)
  5. Now, we need to load in the ICA library:
from sklearn.decomposition import FastICA
  6. We carry out ICA as follows. We will stick to three components here:
ICA = FastICA(n_components=3, random_state=10, whiten="unit-variance")  # use whiten=True on older scikit-learn versions
X = ICA.fit_transform(X_train)
  7. We will then plot the results, as follows:
plt.figure(figsize=(8, 10))
plt.title('ICA Components')
plt.scatter(X[:, 0], X[:, 1])  # component 1 vs component 2
plt.scatter(X[:, 1], X[:, 2])  # component 2 vs component 3
plt.scatter(X[:, 2], X[:, 0])  # component 3 vs component 1
plt.show()

The output for this is as follows:

We can see the three different components here (by color).
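Beyond the Iris walkthrough, it can help to see ICA succeed on data it is actually designed for: a known mixture of non-Gaussian sources. The following is a hedged sketch (the toy signals and the mixing matrix `A` are my own assumptions, not from the original example) showing that after fitting, FastICA exposes the estimated mixing matrix as `mixing_`, and `inverse_transform` re-mixes the estimated sources back into the observed space. It assumes scikit-learn >= 1.1 for the `whiten="unit-variance"` argument:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.RandomState(0)
t = np.linspace(0, 8, 2000)
# Three non-Gaussian sources: a sine wave, a square wave, and Laplacian noise
S = np.c_[np.sin(2 * t), np.sign(np.sin(3 * t)), rng.laplace(size=2000)]
A = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])       # ground-truth mixing matrix (hypothetical)
X_mixed = S @ A.T                     # observed mixed signals

ica = FastICA(n_components=3, random_state=0, whiten="unit-variance")
S_est = ica.fit_transform(X_mixed)    # estimated independent sources
X_back = ica.inverse_transform(S_est)  # S_est @ ica.mixing_.T + ica.mean_

# With as many components as features, re-mixing recovers the data exactly
print(np.allclose(X_back, X_mixed, atol=1e-6))
```

Note that the recovered sources come back in arbitrary order and with arbitrary sign, so they should be compared to the originals up to permutation and scaling.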
