Statistical Learning Theory

The goal of learning is understanding and prediction. Learning falls into many categories, including supervised learning, unsupervised learning, online learning, and reinforcement learning. From the perspective of statistical learning theory, supervised learning is best understood. It involves learning from a training set of data. Every point in the training set is an input-output pair, where the input maps to an output. The learning problem consists of inferring the function that maps between the input and the output, such that the learned function can be used to predict the output from future input.
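To make this setup concrete, the sketch below represents a training set as a list of input-output pairs and a learner as a procedure that returns a prediction function. The 1-nearest-neighbour rule used here is only one illustrative choice of learner, an assumption made for the example rather than something specified above.

```python
# A minimal sketch of the supervised learning setup: a training set of
# input-output pairs, and a learner that infers a function mapping
# inputs to outputs. The 1-nearest-neighbour rule is an illustrative
# choice of learner, not part of the text above.

def learn_nearest_neighbour(training_set):
    """Return a prediction function inferred from (input, output) pairs."""
    def predict(x):
        # Predict the output attached to the training input closest to x.
        nearest_input, nearest_output = min(
            training_set, key=lambda pair: abs(pair[0] - x)
        )
        return nearest_output
    return predict

# Toy training set: each point is an input-output pair.
training_set = [(0.0, 0.0), (1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]

f = learn_nearest_neighbour(training_set)
print(f(2.4))  # predicted output for a future input, here 3.9
```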

Depending on the type of output, supervised learning problems are either problems of regression or problems of classification. If the output takes a continuous range of values, it is a regression problem. Using Ohm’s Law as an example, a regression could be performed with voltage as input and current as output. The regression would find the current to be proportional to the voltage with slope 1/R, such that

I = V/R
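As a rough illustration of the Ohm’s Law example, the sketch below fits a least-squares line with no intercept to noisy voltage-current pairs and recovers a slope close to 1/R. The resistance value and noise level are assumptions made only for the demonstration.

```python
import numpy as np

# Least-squares regression for the Ohm's Law example: voltage as input,
# current as output. With an assumed R = 2 ohms, the fitted slope should
# come out close to 1/R = 0.5.
rng = np.random.default_rng(0)
R = 2.0
voltage = np.linspace(0.0, 10.0, 50)                # inputs V
current = voltage / R + rng.normal(0.0, 0.05, 50)   # outputs I = V/R plus noise

# Fit I = slope * V by least squares (no intercept term).
slope, *_ = np.linalg.lstsq(voltage[:, None], current, rcond=None)
print(slope[0])  # approximately 0.5, i.e. 1/R
```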