An implementation of Large Margin Nearest Neighbors (LMNN), a distance metric learning technique. Given a labeled dataset, it learns a linear transformation of the data that improves k-nearest-neighbor classification performance; this can be useful as a preprocessing step.
The binding exposes a single predicate that initializes the LMNN model with the given parameters and then performs Large Margin Nearest Neighbors metric learning on the reference data.
```prolog
%% part of the predicate definition
lmnn(+string,                                      % optimizer: "amsgrad", "bbsgd", "sgd", or "lbfgs"
     +pointer(float_array), +integer, +integer,    % data matrix and its dimensions
     +pointer(float_array), +integer,              % labels vector and its length
     +integer,                                     % k: number of target neighbors
     +float32, +float32, +integer, +integer, +float32,  % remaining LMNN hyper-parameters
     +integer, +integer,                           % (see the Python documentation for the
     +integer, +integer, +integer,                 %  full parameter list and ordering)
     -pointer(float_array), -integer, -integer)    % output matrix and its dimensions
```
### Parameters
| Name | Type | Description | Default |
|------|------|-------------|---------|
| optimizer | +string | Optimizer to use; "amsgrad", "bbsgd", "sgd", or "lbfgs". | amsgrad |
| data | +matrix | Input dataset to run LMNN on. | - |
| labels | +vec | Labels for input dataset. | - |
| k | +integer | Number of target neighbors to use for each datapoint. | 1 |
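A minimal usage sketch is shown below. It assumes helper predicates such as `convert_list_to_float_array` and `convert_float_array_to_2d_list` for moving data between Prolog lists and `float_array` pointers, an `lmnn` module name, and a particular order for the scalar hyper-parameter arguments; none of these are confirmed by this section, so treat every name and argument position here as hypothetical and check the binding's full parameter list before relying on it.

```prolog
%% Hypothetical usage sketch; helper predicates, module name, and the
%% order/meaning of the scalar arguments are assumptions, not a confirmed API.
:- use_module(lmnn).

example(Transformation) :-
    % Toy dataset: four 2-dimensional points in two classes.
    Data   = [1.0, 1.1, 5.0, 5.2,
              1.2, 0.9, 5.1, 4.9],
    Labels = [0.0, 0.0, 1.0, 1.0],
    % Assumed helpers that copy Prolog lists into float_array pointers
    % (the 2 is the assumed number of rows of the data matrix).
    convert_list_to_float_array(Data, 2, array(Xsize, Ysize, X)),
    convert_list_to_float_array(Labels, array(Zsize, Z)),
    lmnn(amsgrad,                       % optimizer (atom or string, depending on the binding)
         X, Xsize, Ysize,               % data matrix
         Z, Zsize,                      % labels
         1,                             % k: target neighbors per point
         0.5, 0.01, 50, 100000, 1.0e-7, % assumed: regularization, step size,
         500, 1,                        %   batch size, iteration limits, ...
         0, 0, 42,                      %   flags and random seed (assumed order)
         TransPtr, TransRows, TransCols),
    % Assumed helper that copies the learned matrix back into a list of rows.
    convert_float_array_to_2d_list(TransPtr, TransRows, TransCols, Transformation).
```

The shape of the call follows the partial signature shown above; if the full predicate takes additional arguments, the call will need to be adjusted accordingly.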
For a more detailed explanation, see the Python documentation; it usually gives a good description of how the methods work and what the parameters do.
Some related links from the Python documentation:
* nca
* [Large margin nearest neighbor on Wikipedia](https://en.wikipedia.org/wiki/Large_margin_nearest_neighbor)
* [Distance metric learning for large margin nearest neighbor classification (pdf)](http://papers.nips.cc/paper/2795-distance-metric-learning-for-large-margin-nearest-neighbor-classification.pdf)