Lasso regression using sklearn

The lasso implementation looks very similar to the ridge model we just ran. The main difference is that lasso needs to arrive at a solution using iterative coordinate descent, whereas ridge can rely on a closed-form solution.
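For reference, scikit-learn's Lasso minimizes the objective

    \min_w \; \frac{1}{2 n_{\mathrm{samples}}} \lVert y - Xw \rVert_2^2 + \alpha \lVert w \rVert_1

(this formula is quoted from the scikit-learn documentation rather than from this section). The L1 penalty is not differentiable at zero, so there is no closed-form solution and the solver falls back on coordinate descent. The cross-validation loop otherwise mirrors the ridge workflow: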

import numpy as np
import pandas as pd
from scipy.stats import spearmanr
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error
from sklearn.preprocessing import StandardScaler

idx = pd.IndexSlice  # MultiIndex slicer, as in the preceding ridge example

nfolds = 250
alphas = np.logspace(-8, -2, 13)
scaler = StandardScaler()

lasso_results, lasso_coeffs = pd.DataFrame(), pd.DataFrame()
for alpha in alphas:
    coeffs, test_results = [], []
    lr_lasso = Lasso(alpha=alpha)
    # model_data, dates, features, target, and time_series_split are
    # defined in the preceding ridge example
    for i, (train_dates, test_dates) in enumerate(time_series_split(dates, nfolds=nfolds)):
        # fit the scaler on the training data only to avoid look-ahead bias
        X_train = model_data.loc[idx[train_dates], features]
        y_train = model_data.loc[idx[train_dates], target]
        lr_lasso.fit(X=scaler.fit_transform(X_train), y=y_train)

        # apply the training-set scaling to the test data before predicting
        X_test = model_data.loc[idx[test_dates], features]
        y_test = model_data.loc[idx[test_dates], target]
        y_pred = lr_lasso.predict(scaler.transform(X_test))

        # evaluate each fold by RMSE and the information coefficient (rank correlation)
        rmse = np.sqrt(mean_squared_error(y_pred=y_pred, y_true=y_test))
        ic, pval = spearmanr(y_pred, y_test)

        coeffs.append(lr_lasso.coef_)
        test_results.append([train_dates[-1], rmse, ic, pval, alpha])
    test_results = pd.DataFrame(test_results,
                                columns=['date', 'rmse', 'ic', 'pval', 'alpha'])
    lasso_results = pd.concat([lasso_results, test_results])
    lasso_coeffs[alpha] = np.mean(coeffs, axis=0)
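
The loop collects the out-of-sample metrics for every fold and penalty level in lasso_results and the fold-averaged coefficients per alpha in lasso_coeffs. As a minimal follow-up sketch (the aggregation below is an assumption about how one might compare penalty levels, not the book's exact evaluation; it relies only on the columns created above), the average information coefficient and RMSE per alpha can be compared directly:

    # hypothetical summary: mean out-of-sample IC and RMSE for each penalty level
    summary = (lasso_results
               .groupby('alpha')[['ic', 'rmse']]
               .mean()
               .sort_values('ic', ascending=False))
    print(summary.head())

    # the alpha with the highest average IC is a natural candidate for the final model
    best_alpha = summary['ic'].idxmax()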