# Partial Least Squares (PLS), Kernel-based Orthogonal Projections to Latent Structures (K-OPLS) and NIPALS-based OPLS

- PLS regression algorithm based on the Yi Cao implementation (PLS Matlab code).
- K-OPLS regression algorithm based on this paper (K-OPLS Matlab code).
- OPLS implementation based on the R package MetaboMate, using a NIPALS factorization loop.
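The NIPALS factorization mentioned above extracts latent vectors one at a time by alternating between X- and Y-score updates until the scores converge. A minimal sketch of a single component follows; this is not the ml-pls source and the names are simplified (the real loop also deflates X and Y between components):

```js
// Hedged sketch of the NIPALS iteration for a single PLS latent vector.
// Illustrative only; the actual ml-pls implementation differs in details
// such as deflation, scaling, and convergence criteria.
function nipalsComponent(X, Y, tolerance = 1e-10, maxIterations = 500) {
  const dot = (a, b) => a.reduce((s, ai, i) => s + ai * b[i], 0);
  // Mᵀv for an n×p matrix M and a length-n vector v
  const matTvec = (M, v) =>
    M[0].map((_, j) => M.reduce((s, row, i) => s + row[j] * v[i], 0));
  const matVec = (M, v) => M.map((row) => dot(row, v));
  const normalize = (v) => {
    const n = Math.sqrt(dot(v, v));
    return v.map((x) => x / n);
  };

  let u = Y.map((row) => row[0]); // initialize Y-scores with first column of Y
  let t = new Array(X.length).fill(0);
  let w = [];
  for (let k = 0; k < maxIterations; k++) {
    w = normalize(matTvec(X, u)); // X-weights: w ∝ Xᵀu
    const tNew = matVec(X, w); // X-scores: t = Xw
    const q = normalize(matTvec(Y, tNew)); // Y-loadings: q ∝ Yᵀt
    u = matVec(Y, q); // Y-scores: u = Yq
    const delta = Math.sqrt(tNew.reduce((s, x, i) => s + (x - t[i]) ** 2, 0));
    t = tNew;
    if (delta < tolerance) break; // scores converged
  }
  return { w, t, u };
}

// Example with the toy data from the PLS usage section below
const { w } = nipalsComponent(
  [[0.1, 0.02], [0.25, 1.01], [0.95, 0.01], [1.01, 0.96]],
  [[1, 0], [1, 0], [1, 0], [0, 1]],
);
console.log(w.length); // 2 — one weight per X variable
```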
## Installation

```console
npm i ml-pls
```
## Usage

### PLS

```js
import PLS from 'ml-pls';

const X = [
  [0.1, 0.02],
  [0.25, 1.01],
  [0.95, 0.01],
  [1.01, 0.96],
];
const Y = [
  [1, 0],
  [1, 0],
  [1, 0],
  [0, 1],
];
const options = {
  latentVectors: 10,
  tolerance: 1e-4,
};

const pls = new PLS(options);
pls.train(X, Y);
```
### OPLS

```js
import {
  getNumbers,
  getClassesAsNumber,
  getCrossValidationSets,
} from 'ml-dataset-iris';
import { OPLS } from 'ml-pls';

// reuse precomputed folds so the cross-validation is reproducible
const cvFolds = getCrossValidationSets(7, { idx: 0, by: 'trainTest' });

const data = getNumbers();
// numeric labels select the regression mode
const irisLabels = getClassesAsNumber();

const model = new OPLS(data, irisLabels, { cvFolds });
console.log(model.mode); // 'regression'
```
The OPLS class is intended for exploratory modeling, that is, not for the creation of predictors. Therefore a k-fold cross-validation loop is built in, and Q2y is an average over the folds.
```js
console.log(model.model[0].Q2y); // should give 0.9209227614652857
```
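The Q2y statistic reported per fold is commonly computed as 1 − PRESS/TSS, where PRESS sums the squared errors of the cross-validated predictions and TSS the squared deviations from the mean. A minimal, hypothetical helper (not part of the ml-pls API) illustrating the formula:

```js
// Hypothetical helper illustrating the Q2 statistic: 1 - PRESS / TSS.
// `observed` holds true responses, `predicted` the cross-validated predictions.
function q2(observed, predicted) {
  const mean = observed.reduce((a, b) => a + b, 0) / observed.length;
  const press = observed.reduce((s, y, i) => s + (y - predicted[i]) ** 2, 0);
  const tss = observed.reduce((s, y) => s + (y - mean) ** 2, 0);
  return 1 - press / tss;
}

console.log(q2([1, 2, 3, 4], [1.1, 1.9, 3.2, 3.8])); // ≈ 0.98, close to 1 for a good fit
```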
With string labels the model performs a discriminant analysis instead, and an AUC is computed for each fold:

```js
import {
  getNumbers,
  getClasses,
  getCrossValidationSets,
} from 'ml-dataset-iris';
import { OPLS } from 'ml-pls';

const cvFolds = getCrossValidationSets(7, { idx: 0, by: 'trainTest' });

const data = getNumbers();
// string labels select the discriminant analysis mode
const irisLabels = getClasses();

const model = new OPLS(data, irisLabels, { cvFolds });
console.log(model.mode);
console.log(model.model[0].auc);
```
If for some reason a predictor is necessary, the following code may serve as an example:
```js
import {
  getNumbers,
  getClassesAsNumber,
  getCrossValidationSets,
} from 'ml-dataset-iris';
import { OPLS } from 'ml-pls';

// take a single train/test split
const { testIndex, trainIndex } = getCrossValidationSets(7, {
  idx: 0,
  by: 'trainTest',
})[0];

const irisNumbers = getNumbers();
const testData = irisNumbers.filter((el, idx) => testIndex.includes(idx));
const trainingData = irisNumbers.filter((el, idx) => trainIndex.includes(idx));

const irisLabels = getClassesAsNumber();
const testLabels = irisLabels.filter((el, idx) => testIndex.includes(idx));
const trainingLabels = irisLabels.filter((el, idx) => trainIndex.includes(idx));

// train on the training split only
const model = new OPLS(trainingData, trainingLabels);
console.log(model.mode);

// predict the held-out split and score against the true labels
const prediction = model.predict(testData, { trueLabels: testLabels });
console.log(prediction.Q2y);
```
### K-OPLS

```js
import Kernel from 'ml-kernel';
import { KOPLS } from 'ml-pls';

const kernel = new Kernel('gaussian', {
  sigma: 25,
});

const X = [
  [0.1, 0.02],
  [0.25, 1.01],
  [0.95, 0.01],
  [1.01, 0.96],
];
const Y = [
  [1, 0],
  [1, 0],
  [1, 0],
  [0, 1],
];

const cls = new KOPLS({
  orthogonalComponents: 10,
  predictiveComponents: 1,
  kernel: kernel,
});

cls.train(X, Y);

const { prediction, predScoreMat, predYOrthVectors } = cls.predict(X);

console.log(prediction);
console.log(predScoreMat);
console.log(predYOrthVectors);
```
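The `sigma` option above parametrizes the Gaussian (RBF) kernel, commonly defined as k(x, y) = exp(−‖x − y‖² / (2σ²)). A standalone sketch of that formula follows; the exact parametrization used by ml-kernel may differ:

```js
// Gaussian (RBF) kernel in its common form: exp(-||x - y||^2 / (2 * sigma^2)).
// Illustrative sketch only; check ml-kernel for its exact parametrization.
function gaussianKernel(x, y, sigma) {
  const d2 = x.reduce((s, xi, i) => s + (xi - y[i]) ** 2, 0);
  return Math.exp(-d2 / (2 * sigma * sigma));
}

console.log(gaussianKernel([0.1, 0.02], [0.1, 0.02], 25)); // 1 (identical points)
```

Larger `sigma` values make the kernel flatter, so distant points still receive similarity close to 1; smaller values make it more local.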
## License

MIT