I need to build a large margin classifier using the Python library cvxopt, which lets me solve the underlying quadratic program.
I am trying to write a Python function that takes the training data and some test data and returns the support vectors and the distance of each test point from the optimal hyperplane.
However, I am struggling to understand the output of cvxopt's solver and how to use it to find the support vectors and then the distances.
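For context, the quadratic program I believe I am setting up is the standard hard-margin SVM dual (this is my understanding, so please correct me if it's wrong):

    minimise over a:   (1/2) sum_{i,j} a_i a_j t_i t_j (x_i . x_j) - sum_i a_i
    subject to:        a_i >= 0 for all i,   and   sum_i t_i a_i = 0

which maps onto cvxopt's qp form (1/2) a'Pa + q'a with P_ij = t_i t_j (x_i . x_j), q = -1, G = -I, h = 0, A = t', b = 0.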
My function so far looks like this:
from numpy import outer, inner, ones, zeros, diag
from cvxopt import matrix, solvers

def lmc(X, t, Xtest):
    numSamples = len(X)
    # P_ij = t_i t_j (x_i . x_j) for the dual objective
    P = matrix(outer(t, t) * inner(X, X))
    q = matrix(ones(numSamples) * -1)
    # G, h encode the inequality constraints -alpha_i <= 0
    G = matrix(diag(ones(numSamples) * -1))
    h = matrix(zeros(numSamples))
    # A, b encode the equality constraint sum_i t_i alpha_i = 0
    A = matrix(t, (1, numSamples))
    b = matrix(0.0)
    solution = solvers.qp(P, q, G, h, A, b)
    return solution
The output of this function so far includes large vectors s, x and z, and a single value for y, but I don't see how these map back to the support vectors or the distances. My knowledge of support vector machines is pretty sketchy, so any help would be greatly appreciated.