K-NN Algorithm From Scratch

Anjani Suman
1 min read · Apr 17, 2022

This k-Nearest Neighbors tutorial is broken down into 3 parts:

Step 1: Calculate Euclidean Distance.

Step 2: Get Nearest Neighbors.

Step 3: Make Predictions.

Let’s discuss each of them in detail with Python 3 code.

Step 1: Calculate Euclidean Distance.

import math

def euclidean_distance(p1: list, p2: list) -> float:
    distance = 0.0
    # Skip the last value in each row, which holds the class label.
    for i in range(len(p1) - 1):
        distance += (p1[i] - p2[i]) ** 2
    return math.sqrt(distance)

This function calculates the Euclidean distance between two rows. The loop stops one value short of the row length because the last value in each row is assumed to be the class label, which should not contribute to the distance.
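For instance, a minimal sketch of calling it on two made-up rows (two feature values followed by a class label) might look like this:

row0 = [3.0, 4.0, 0]  # made-up row: two features and a class label
row1 = [0.0, 0.0, 1]
# The distance is computed over the feature values only: sqrt(3^2 + 4^2) = 5.0
print(euclidean_distance(row0, row1))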

Step 2: Get Nearest Neighbors.

# Locate the most similar neighbors
def get_neighbors(train: list, test_row: list, num_neighbors: int) -> list:
    distances = list()
    for train_row in train:
        # Euclidean distance between the test row and each training row
        dist = euclidean_distance(train_row, test_row)
        distances.append((train_row, dist))
    # Sort the training rows by distance, closest first
    distances.sort(key=lambda tup: tup[1])
    neighbors = list()
    for i in range(num_neighbors):
        neighbors.append(distances[i][0])
    return neighbors
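As a rough illustration of how this function might be called, here is a sketch on a small made-up dataset, where each row holds two feature values followed by a class label:

# Made-up two-class dataset: [feature1, feature2, label]
dataset = [
    [1.0, 1.0, 0],
    [1.5, 2.0, 0],
    [3.0, 4.0, 1],
    [5.0, 7.0, 1],
]
test_row = [1.2, 1.1, 0]
neighbors = get_neighbors(dataset, test_row, 2)
print(neighbors)  # the two rows closest to test_row: [1.0, 1.0, 0] and [1.5, 2.0, 0]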

Step 3: Make Predictions.

# Make a classification prediction with neighbors
def predict_classification(train, test_row, num_neighbors):
    neighbors = get_neighbors(train, test_row, num_neighbors)
    # Collect the class labels (last column) of the nearest neighbors
    output_values = [row[-1] for row in neighbors]
    # The prediction is the most common label among the neighbors (majority vote)
    prediction = max(set(output_values), key=output_values.count)
    return prediction
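To tie the three steps together, here is a minimal end-to-end sketch; the small two-class dataset below is made up purely for illustration:

# Made-up training data: [feature1, feature2, label]
dataset = [
    [2.7, 2.5, 0],
    [1.4, 2.3, 0],
    [3.3, 4.4, 0],
    [7.6, 2.7, 1],
    [8.6, -0.2, 1],
    [7.9, 3.5, 1],
]
test_row = [8.0, 3.0]  # an unlabeled point: feature values only
prediction = predict_classification(dataset, test_row, num_neighbors=3)
print(prediction)  # expected: 1, since the three nearest rows all carry label 1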

These are the building blocks of the k-NN algorithm. Try running these functions and experimenting with them to understand how they fit together.
