Weighted k-NN

The Weighted k-NN algorithm is an extension of the standard k-NN algorithm. Instead of treating all k neighbors equally, it assigns weights to neighbors based on their distance from the test instance.

The idea is:

  • Closer neighbors have higher weight.
  • Farther neighbors have lower weight.
  • This helps in making more accurate predictions, especially when neighbors are unevenly distributed.

Weights are inversely proportional to the distance between the test instance and the training instance.
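
For example, if a test instance's three nearest neighbors lie at distances 2, 4, and 8, the closest neighbor receives the largest share of the vote (about 0.57 after normalization). A minimal Python sketch of this weighting scheme is shown below; the helper name inverse_distance_weights is illustrative only:

```python
def inverse_distance_weights(distances):
    """Turn neighbor distances into normalized weights: closer neighbors weigh more."""
    inverses = [1.0 / d for d in distances]     # inverse of each distance
    total = sum(inverses)                       # normalizing constant
    return [inv / total for inv in inverses]    # weights sum to 1

print(inverse_distance_weights([2, 4, 8]))      # [0.571..., 0.285..., 0.142...]
```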


Algorithm 4.2: Weighted k-NN

Inputs: Training dataset T, Distance metric d, Weighting function w(i), Test instance t, Number of neighbors k

Output: Predicted class or category

Prediction Steps:

  1. Compute distances: For each instance i in dataset T, compute the distance between the test instance t and instance i using:
    • Euclidean distance for continuous attributes: d(t, i) = sqrt( Σ_j (t_j − i_j)² ), summed over all attributes j.
    • Hamming distance for binary (categorical) attributes.
  2. Sort the distances and select the k nearest neighbors.
  3. Apply weighted voting (see the code sketch after this list):
    • Compute inverse distance for each of the k neighbors.
    • Find the sum of all inverse distances.
    • Divide each inverse distance by the total sum to get the weight.
    • Add the weights class-wise.
    • Predict the class with the maximum total weight.
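
Putting the steps together, here is a minimal Python sketch of Algorithm 4.2 for numeric attributes with Euclidean distance; the function names euclidean and weighted_knn_predict are illustrative, and ties or zero distances are not handled:

```python
import math
from collections import defaultdict

def euclidean(a, b):
    """Euclidean distance between two equal-length numeric vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def weighted_knn_predict(training, test, k=3):
    """training: list of (feature_vector, class_label) pairs; test: feature vector."""
    # Step 1: distance from the test instance to every training instance
    distances = [(euclidean(features, test), label) for features, label in training]
    # Step 2: sort and keep the k nearest neighbors
    nearest = sorted(distances)[:k]
    # Step 3: weighted voting with normalized inverse distances
    inverses = [(1.0 / d, label) for d, label in nearest]
    total = sum(inv for inv, _ in inverses)
    votes = defaultdict(float)
    for inv, label in inverses:
        votes[label] += inv / total        # weight = inverse distance / sum of inverses
    return max(votes, key=votes.get)       # class with the maximum total weight
```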

Weighted k-NN on Student Dataset

Given a test instance: (7.6, 60, 8)
We are to classify it using Weighted k-NN with k = 3


Step 1: Distance Calculation (using Euclidean)

Sl.No   CGPA   Assessment   Project   Result   Euclidean Distance
1       9.2    85           8         Pass     25.05115
2       8.0    80           7         Pass     20.02898
3       8.5    81           8         Pass     21.01928
4       6.0    45           5         Fail     15.38051
5       6.5    50           4         Fail     10.82636
6       8.2    72           7         Pass     12.05653
7       5.8    38           5         Fail     22.27644
8       8.9    91           9         Pass     31.04336
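
These distances can be reproduced with a few lines of Python; the tuple layout (CGPA, Assessment, Project) below is an assumption about how the dataset is stored:

```python
import math

students = [  # (CGPA, Assessment, Project), Result
    ((9.2, 85, 8), "Pass"), ((8.0, 80, 7), "Pass"), ((8.5, 81, 8), "Pass"),
    ((6.0, 45, 5), "Fail"), ((6.5, 50, 4), "Fail"), ((8.2, 72, 7), "Pass"),
    ((5.8, 38, 5), "Fail"), ((8.9, 91, 9), "Pass"),
]
test = (7.6, 60, 8)

for sl_no, (features, result) in enumerate(students, start=1):
    d = math.sqrt(sum((f - t) ** 2 for f, t in zip(features, test)))
    print(sl_no, result, round(d, 5))   # e.g. "1 Pass 25.05115"
```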

Step 2: Select 3 Nearest Neighbors

Instance   Euclidean Distance   Class
5          10.82636             Fail
6          12.05653             Pass
4          15.38051             Fail
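
Selecting the neighbors is just a sort on the distance column; the snippet below reuses the distances from the table in Step 1:

```python
distances = [  # (Euclidean distance, Sl.No, Class) from Step 1
    (25.05115, 1, "Pass"), (20.02898, 2, "Pass"), (21.01928, 3, "Pass"),
    (15.38051, 4, "Fail"), (10.82636, 5, "Fail"), (12.05653, 6, "Pass"),
    (22.27644, 7, "Fail"), (31.04336, 8, "Pass"),
]
nearest = sorted(distances)[:3]
print(nearest)   # [(10.82636, 5, 'Fail'), (12.05653, 6, 'Pass'), (15.38051, 4, 'Fail')]
```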

Step 3: Compute Weights by Inverse Distance

Table: Inverse Distance

Instance   Euclidean Distance   Inverse Distance   Class
4          15.38051             0.06502            Fail
5          10.82636             0.09237            Fail
6          12.05653             0.08294            Pass

Sum of Inverses: 0.06502 + 0.09237 + 0.08294 = 0.24033
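
The inverse distances and their sum can be verified directly:

```python
neighbors = {4: 15.38051, 5: 10.82636, 6: 12.05653}      # instance -> Euclidean distance
inverses = {i: 1.0 / d for i, d in neighbors.items()}
print({i: round(v, 5) for i, v in inverses.items()})      # {4: 0.06502, 5: 0.09237, 6: 0.08294}
print(round(sum(inverses.values()), 5))                   # 0.24033
```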


Step 4: Calculate Weights

Table: Final Weight Calculation

Instance   Inverse Distance   Weight = Inverse/Sum   Class
4          0.06502            0.270545               Fail
5          0.09237            0.384347               Fail
6          0.08294            0.345109               Pass
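
Normalizing by the sum of inverses gives the final weights; a quick check:

```python
inverses = {4: 0.06502, 5: 0.09237, 6: 0.08294}     # from the previous table
total = sum(inverses.values())                      # 0.24033
weights = {i: inv / total for i, inv in inverses.items()}
print({i: round(w, 6) for i, w in weights.items()})
# {4: 0.270545, 5: 0.384347, 6: 0.345109}
```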

Step 5: Class Prediction

  • Total weight for Fail = 0.270545 + 0.384347 = 0.654892
  • Total weight for Pass = 0.345109

Since Fail has the higher total weight →
Predicted class: ‘Fail’
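
The class-wise tally behind this prediction can be checked in a couple of lines:

```python
from collections import defaultdict

weights = [(0.270545, "Fail"), (0.384347, "Fail"), (0.345109, "Pass")]
votes = defaultdict(float)
for w, label in weights:
    votes[label] += w
print({c: round(v, 6) for c, v in votes.items()})   # {'Fail': 0.654892, 'Pass': 0.345109}
print(max(votes, key=votes.get))                    # Fail
```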


The test instance (7.6, 60, 8) is classified as ‘Fail’ using the Weighted k-NN method.
