Learning Vector Quantization

Learning Vector Quantization (LVQ) is a type of artificial neural network that is also inspired by biological models of neural systems. It is a prototype-based algorithm for supervised learning and classification. It trains its network with a competitive-learning algorithm similar to the Self-Organizing Map, and it can also handle multiclass classification problems. An LVQ network consists of two layers: an input layer and an output layer. The architecture is determined by the number of classes in the input data and the number of features in each input sample.

How does Learning Vector Quantization work?

Suppose we have input data of size (m, n), where m is the number of training samples and n is the number of features in each sample, together with a label vector of size (1, m). The weight matrix of size (c, n) is initialized from the first c training samples that have distinct labels, where c denotes the number of classes; those samples are then removed from the training set. The algorithm then iterates over the remaining input data: for each training example, it finds the winning weight vector, the one with the smallest distance (e.g., Euclidean distance) to the example, and updates it. The weight update rules are:

w_ij(new) = w_ij(old) + α(t) · (x_ik − w_ij(old))   if the training sample is classified correctly
w_ij(new) = w_ij(old) − α(t) · (x_ik − w_ij(old))   if the training sample is classified incorrectly

where α(t) is the learning rate at time t, j denotes the winning weight vector, i denotes the i-th feature of the training example, and k denotes the k-th training sample in the input dataset. After the LVQ network is trained, the trained weights are used to classify new examples: a new example is assigned the class of the winning (closest) weight vector.

Algorithm

The steps involved are:

1. Initialize one prototype (weight) vector per class from the first training samples with distinct labels, and remove those samples from the training set.
2. For each remaining training example, find the winning weight vector, i.e. the one with the smallest Euclidean distance to the example.
3. Update the winning vector: move it towards the example if their labels match, and away from it otherwise.
4. Decrease the learning rate and repeat for the desired number of epochs.
5. Classify new examples by the class of the nearest trained weight vector.
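The steps above can be sketched in Python as follows. This is a minimal illustration of the LVQ1 scheme described in the text, not the original article's code; the function names, the linearly decaying learning rate, and the default hyperparameters are assumptions chosen for clarity.

```python
import numpy as np

def train_lvq(X, y, n_epochs=10, alpha=0.1):
    """Train an LVQ1 network (illustrative sketch, not the article's code).

    X : (m, n) array-like of m training samples with n features.
    y : (m,) array-like of integer class labels.
    Returns the (c, n) array of trained prototype weights and the
    array of class labels, one per prototype.
    """
    X = np.asarray(X, dtype=float)
    y = np.asarray(y)
    classes = np.unique(y)
    # Step 1: one prototype per class, taken from the first sample of
    # that class; those samples are removed from the training set.
    proto_idx = np.array([np.flatnonzero(y == c)[0] for c in classes])
    weights = X[proto_idx].copy()
    mask = np.ones(len(X), dtype=bool)
    mask[proto_idx] = False
    X, y = X[mask], y[mask]

    for epoch in range(n_epochs):
        # Step 4: learning rate alpha(t) decays over time (assumed linear).
        rate = alpha * (1 - epoch / n_epochs)
        for xi, yi in zip(X, y):
            # Step 2: winner = prototype with smallest Euclidean distance.
            j = np.argmin(np.linalg.norm(weights - xi, axis=1))
            # Step 3: move towards the sample if labels match, else away.
            if classes[j] == yi:
                weights[j] += rate * (xi - weights[j])
            else:
                weights[j] -= rate * (xi - weights[j])
    return weights, classes

def classify(weights, classes, sample):
    """Step 5: label a new sample with the class of the nearest prototype."""
    sample = np.asarray(sample, dtype=float)
    j = np.argmin(np.linalg.norm(weights - sample, axis=1))
    return classes[j]
```

On a toy two-cluster dataset, `train_lvq` pulls each prototype towards its own class and pushes it away from the other, after which `classify` assigns a new sample by nearest prototype.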
The original article's implementation, run on its sample dataset, reported the following output:

Sample T belongs to class : 0
Trained weights : [[0.3660931, 0.2816541, 1, 1], [0.33661, 0.1729, 0, 1]]
Original article by ItWorker; if reprinting, please cite the source: https://blog.ytso.com/263150.html