Top 10 Programming Algorithms Every Programmer Should Know (and When to Use Them)

Jason Kam · Published in Level Up Coding · 9 min read

Introduction

Algorithms are the foundation of programming. They are step-by-step instructions that tell a computer how to solve a problem or perform a task. There are many different types of algorithms, each with its own strengths and weaknesses.

In this article, we will discuss the top 10 programming algorithms that every programmer should know, along with speed-optimized example code.

Follow Me to Stay Up to Date for More Tips & Tricks! 🔔😜

1. Binary search

Binary search is a divide-and-conquer algorithm that finds an element in a sorted array by repeatedly dividing the array in half and comparing the target element to the middle element.

When to Use?

  • When searching for an element in a sorted array.
  • If the data is unsorted, sort it first (for example with quicksort or merge sort); binary search only works on sorted data, and the up-front O(n log n) sort pays off when you search the same array many times.

Python Example Code:

def binary_search(array, target):
    low = 0
    high = len(array) - 1

    while low <= high:
        mid = (low + high) // 2

        if array[mid] == target:
            return mid
        elif array[mid] < target:
            low = mid + 1
        else:
            high = mid - 1

    return -1

# Example usage:

array = [1, 3, 5, 7, 9]
target = 5

index = binary_search(array, target)

if index >= 0:
    print(f"The target element is at index {index}")
else:
    print("The target element is not in the array")

The above binary search is iterative rather than recursive, so it avoids function-call overhead and extra stack space. Each comparison halves the search range, giving O(log n) time on a sorted array.
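
If you need the lookup rather than the exercise, Python's standard library already provides binary search in the bisect module; a minimal sketch (my addition, not from the article):

from bisect import bisect_left

def binary_search_bisect(array, target):
    # bisect_left returns the leftmost insertion point that keeps the array sorted
    index = bisect_left(array, target)
    if index < len(array) and array[index] == target:
        return index
    return -1

print(binary_search_bisect([1, 3, 5, 7, 9], 5))  # 2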

2. Breadth-first search (BFS)

Breadth-first search is a graph algorithm that traverses a graph by visiting all of the nodes at a given level before moving on to the next level.

When to Use?

  • When traversing a graph and visiting all nodes at a given level before moving on to the next level.
  • When finding the shortest path between two nodes in a graph.
  • When detecting cycles in a graph.

Python Example Code:

def bfs(graph, start_node):
    queue = []
    visited = set()

    queue.append(start_node)

    while queue:
        node = queue.pop(0)

        if node not in visited:
            visited.add(node)

            for neighbor in graph[node]:
                queue.append(neighbor)

# Example usage:

graph = {
    0: [1, 2],
    1: [3, 4],
    2: [5, 6],
    3: [],
    4: [],
    5: [],
    6: []
}

start_node = 0

bfs(graph, start_node)

The above BFS code stores the graph as an adjacency list: here, a dictionary that maps each node to a list of its neighbors. Compared with an adjacency matrix, this lets the traversal run in O(V + E) time, which matters most for large, sparse graphs.
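
One practical note: queue.pop(0) on a Python list is O(n). A small variant using collections.deque (my sketch, not part of the original code) pops from the front in O(1) and returns the visit order for the example graph above:

from collections import deque

def bfs_deque(graph, start_node):
    queue = deque([start_node])
    visited = set([start_node])
    order = []  # nodes in the order they are first visited

    while queue:
        node = queue.popleft()
        order.append(node)

        for neighbor in graph[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(neighbor)

    return order

print(bfs_deque(graph, 0))  # [0, 1, 2, 3, 4, 5, 6]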

3. Depth-first search (DFS)

Depth-first search is a graph algorithm that traverses a graph by following one path until it reaches a dead end, then backtracking and trying another path.

When to Use?

  • When traversing a graph and following one path until it reaches a dead end, then backtracking and trying another path.
  • When finding all paths between two nodes in a graph.
  • When detecting strongly connected components in a graph.

Python Example Code:

def dfs(graph, start_node):
    stack = []
    visited = set()

    stack.append(start_node)

    while stack:
        node = stack.pop()

        if node not in visited:
            visited.add(node)

            for neighbor in graph[node]:
                stack.append(neighbor)

# Example usage:

graph = {
    0: [1, 2],
    1: [3, 4],
    2: [5, 6],
    3: [],
    4: [],
    5: [],
    6: []
}

start_node = 0

dfs(graph, start_node)

The above DFS code uses an explicit stack to hold the nodes still to be visited. Replacing recursion with a stack avoids Python's recursion-depth limit on deep graphs, while the traversal still runs in O(V + E) time with an adjacency list.
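
For comparison, the recursive formulation (my sketch, not from the original article) is shorter but is bounded by Python's recursion limit on very deep graphs:

def dfs_recursive(graph, node, visited=None):
    if visited is None:
        visited = set()

    visited.add(node)

    for neighbor in graph[node]:
        if neighbor not in visited:
            dfs_recursive(graph, neighbor, visited)

    return visited

print(dfs_recursive(graph, 0))  # every node reachable from 0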

4. Merge sort

Merge sort is a divide-and-conquer algorithm that sorts an array by repeatedly dividing it into two halves, sorting each half, and then merging the two sorted halves back together.

When to Use?

  • When sorting an array.
  • Merge sort is a stable sorting algorithm, which means that it preserves the original order of equal elements in the array.
  • Merge sort runs in O(n log n) time in every case, which makes it a dependable choice for large arrays.

Python Example Code:

def merge_sort(array):
    if len(array) <= 1:
        return array

    mid = len(array) // 2
    left = merge_sort(array[:mid])
    right = merge_sort(array[mid:])

    # Merge the two sorted halves, keeping equal elements in order (stable)
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
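
A quick usage check in the same style as the other sections (example values chosen arbitrarily):

array = [5, 3, 2, 1, 4]

sorted_array = merge_sort(array)

print(sorted_array)  # [1, 2, 3, 4, 5]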

5. Quicksort

Quicksort is a divide-and-conquer algorithm that sorts an array by partitioning it around a pivot element and then recursively sorting the two subarrays.

When to Use?

  • When sorting an array.
  • Quicksort is a fast sorting algorithm, but it is not stable.
  • Quicksort is O(n log n) on average but degrades to O(n²) in the worst case (for example, already-sorted input with a poor pivot choice), whereas merge sort guarantees O(n log n).

Python Example Code:

def quicksort(array):
    if len(array) <= 1:
        return array

    pivot = array[0]
    less = []
    greater = []

    for element in array[1:]:
        if element < pivot:
            less.append(element)
        else:
            greater.append(element)

    return quicksort(less) + [pivot] + quicksort(greater)

# Example usage:

array = [5, 3, 2, 1, 4]

sorted_array = quicksort(array)

print(sorted_array)

The above quicksort simply takes the first element as the pivot, which is easy to read but hits the O(n²) worst case on already-sorted input. A common refinement is “median-of-three partitioning”: choose the median of the first, middle, and last elements as the pivot, which keeps the partitions reasonably balanced on most real-world inputs.
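
A minimal sketch of that refinement (my addition, not the article's code), using three-way partitioning so the pivot choice stays simple:

def quicksort_median3(array):
    if len(array) <= 1:
        return array

    # Median-of-three: the middle value of the first, middle, and last elements
    candidates = [array[0], array[len(array) // 2], array[-1]]
    pivot = sorted(candidates)[1]

    less = [x for x in array if x < pivot]
    equal = [x for x in array if x == pivot]
    greater = [x for x in array if x > pivot]

    return quicksort_median3(less) + equal + quicksort_median3(greater)

print(quicksort_median3([5, 3, 2, 1, 4]))  # [1, 2, 3, 4, 5]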

6. Heapsort

Heapsort is a comparison-based sorting algorithm that first builds a max-heap from the array and then repeatedly swaps the largest element to the end, shrinking the heap each time until the array is sorted.

When to Use?

  • When sorting an array.
  • Heapsort is an in-place sorting algorithm, which means that it sorts the array in place without creating a new array.
  • Heapsort runs in O(n log n) time even in the worst case, so it stays efficient on large arrays.

Python Example Code:

def heapsort(array):
    def max_heapify(array, heap_size, i):
        # Sift the element at index i down until the max-heap property holds
        left = 2 * i + 1
        right = 2 * i + 2

        largest = i

        if left < heap_size and array[left] > array[largest]:
            largest = left

        if right < heap_size and array[right] > array[largest]:
            largest = right

        if largest != i:
            array[i], array[largest] = array[largest], array[i]
            max_heapify(array, heap_size, largest)

    def build_max_heap(array):
        for i in range(len(array) // 2 - 1, -1, -1):
            max_heapify(array, len(array), i)

    build_max_heap(array)

    # Move the current maximum to the end, then restore the heap property
    # over the remaining (shrinking) prefix of the array
    for i in range(len(array) - 1, 0, -1):
        array[i], array[0] = array[0], array[i]
        max_heapify(array, i, 0)

# Example usage:

array = [5, 3, 2, 1, 4]

heapsort(array)

print(array)

The above heapsort sorts the array in place: aside from the recursion inside max_heapify, it needs only constant extra space, and its worst-case running time is O(n log n), which makes it dependable for large arrays.
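
If in-place sorting is not a requirement, Python's heapq module (a binary min-heap) gives a much shorter heap-based sort; this sketch is added for illustration rather than taken from the article:

import heapq

def heapsort_heapq(array):
    heap = list(array)
    heapq.heapify(heap)  # O(n) build of a min-heap
    # Popping the minimum repeatedly yields the elements in ascending order
    return [heapq.heappop(heap) for _ in range(len(heap))]

print(heapsort_heapq([5, 3, 2, 1, 4]))  # [1, 2, 3, 4, 5]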

7. Radix sort

Radix sort is a non-comparison sorting algorithm that sorts an array by processing the elements digit by digit, from the least significant digit to the most significant, instead of comparing elements directly.

When to Use?

  • When sorting an array of integers.
  • Radix sort is very fast for fixed-length integer keys, and this least-significant-digit version is stable because each pass uses a stable counting sort.
  • Its running time is O(d · (n + k)) for d digits and digit range k, so it is most effective when keys have few digits; for general comparable data, comparison sorts such as merge sort are more flexible.

Python Example Code:

def radix_sort(array):
    def counting_sort(array, exp):
        # Stable counting sort on the digit selected by exp (1, 10, 100, ...)
        output = [0] * len(array)
        counts = [0] * 10

        for element in array:
            counts[element // exp % 10] += 1

        # Prefix sums: counts[d] is now the number of elements with digit <= d
        for i in range(1, 10):
            counts[i] += counts[i - 1]

        # Walk backwards so equal digits keep their relative order (stability)
        for i in range(len(array) - 1, -1, -1):
            digit = array[i] // exp % 10
            counts[digit] -= 1
            output[counts[digit]] = array[i]

        return output

    if not array:
        return array

    # Process digits from least significant to most significant
    exp = 1
    while max(array) // exp > 0:
        array = counting_sort(array, exp)
        exp *= 10

    return array

# Example usage:

array = [5, 3, 2, 1, 4]

sorted_array = radix_sort(array)

print(sorted_array)

The above radix sort delegates each digit pass to counting sort, which tallies how many elements carry each digit value and uses prefix sums to place every element in a stable order. Each pass is linear, so the whole sort runs in O(d · (n + k)) time for d digits and digit range k, which is very effective when keys have few digits relative to the array size.
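
The digit selection itself is plain integer arithmetic; a couple of throwaway examples (values chosen only for illustration):

value = 1734  # hypothetical key
print(value // 1 % 10)    # 4: the ones digit
print(value // 10 % 10)   # 3: the tens digit
print(value // 100 % 10)  # 7: the hundreds digit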

8. A* search

A* search is a graph algorithm that finds the shortest path between two nodes in a graph. It is a heuristic search algorithm, which means that it uses a heuristic function to estimate the cost of reaching the goal node.

When to Use?

  • When finding the shortest path between two nodes in a graph.
  • A* search is a heuristic search algorithm, which means that it uses a heuristic function to estimate the cost of reaching the goal node.
  • With a good (admissible) heuristic, A* search usually expands far fewer nodes than uninformed searches such as BFS and DFS.

Python Example Code:

class Node:
    def __init__(self, state, parent, cost, heuristic):
        self.state = state
        self.parent = parent
        self.cost = cost            # g(n): cost from the start node
        self.heuristic = heuristic  # h(n): estimated cost to the goal

def a_star_search(graph, start_node, goal_node):
    open_list = [Node(start_node, None, 0, heuristic(start_node, goal_node))]
    closed_list = set()

    while open_list:
        # Expand the node with the lowest f(n) = g(n) + h(n)
        current_node = min(open_list, key=lambda node: node.cost + node.heuristic)

        if current_node.state == goal_node:
            return current_node

        open_list.remove(current_node)
        closed_list.add(current_node.state)

        for neighbor in graph[current_node.state]:
            if neighbor in closed_list:
                continue

            new_cost = current_node.cost + 1  # every edge has weight 1 here
            existing = next((n for n in open_list if n.state == neighbor), None)

            if existing is None:
                open_list.append(Node(neighbor, current_node, new_cost,
                                      heuristic(neighbor, goal_node)))
            elif new_cost < existing.cost:
                existing.cost = new_cost
                existing.parent = current_node

    return None

# Example usage:

graph = {
    0: [1, 2],
    1: [3, 4],
    2: [5, 6],
    3: [],
    4: [],
    5: [],
    6: []
}

start_node = 0
goal_node = 6

def heuristic(node, goal_node):
    return abs(node - goal_node)

goal = a_star_search(graph, start_node, goal_node)

if goal is not None:
    # Follow the parent links back to the start to recover the path
    path = []
    node = goal
    while node is not None:
        path.append(node.state)
        node = node.parent
    print("The shortest path is:", path[::-1])
else:
    print("There is no path to the goal node")

The above A* code scans the open list with min() on every iteration, which costs O(n) per step. For large graphs, the open list is usually kept in a binary heap (a priority queue), so the node with the lowest f-value can be retrieved in O(log n); Python's heapq module provides one.
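
Here is a sketch of that heap-based variant (my addition, reusing the Node class and heuristic function from the example above); stale heap entries are simply skipped when popped:

import heapq
import itertools

def a_star_search_heap(graph, start_node, goal_node):
    counter = itertools.count()  # tie-breaker so heapq never compares Node objects
    start = Node(start_node, None, 0, heuristic(start_node, goal_node))
    open_heap = [(start.cost + start.heuristic, next(counter), start)]
    best_cost = {start_node: 0}

    while open_heap:
        _, _, current = heapq.heappop(open_heap)

        if current.state == goal_node:
            return current
        if current.cost > best_cost.get(current.state, float("inf")):
            continue  # a cheaper path to this state was already found

        for neighbor in graph[current.state]:
            new_cost = current.cost + 1
            if new_cost < best_cost.get(neighbor, float("inf")):
                best_cost[neighbor] = new_cost
                node = Node(neighbor, current, new_cost, heuristic(neighbor, goal_node))
                heapq.heappush(open_heap, (new_cost + node.heuristic, next(counter), node))

    return None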

9. Dijkstra’s algorithm

Dijkstra’s algorithm is a graph algorithm that finds the shortest path from a single source node to all other nodes in a weighted graph with non-negative edge weights. It is a greedy algorithm: at each step it finalizes the unvisited node with the smallest known distance, and as long as no edge weight is negative, this is guaranteed to yield the shortest path to every node.

When to Use?

  • When finding the shortest path from a single source node to all other nodes in a weighted graph with non-negative edge weights.
  • Dijkstra’s algorithm is greedy: it repeatedly finalizes the closest unvisited node, and with non-negative weights that choice is always correct.
  • On unweighted graphs, plain BFS already finds shortest paths and is simpler; Dijkstra’s advantage appears once edges carry different weights.

Python Example Code:

def dijkstra(graph, start_node):
    distances = {start_node: 0}
    visited = set()

    while len(visited) < len(distances):
        # Pick the closest node that has not been finalized yet
        current_node = min(
            (node for node in distances if node not in visited),
            key=distances.get
        )
        visited.add(current_node)

        for neighbor in graph[current_node]:
            if neighbor not in visited:
                new_distance = distances[current_node] + 1  # every edge has weight 1 here

                if neighbor not in distances or new_distance < distances[neighbor]:
                    distances[neighbor] = new_distance

    return distances

# Example usage:

graph = {
    0: [1, 2],
    1: [3, 4],
    2: [5, 6],
    3: [],
    4: [],
    5: [],
    6: []
}

start_node = 0

distances = dijkstra(graph, start_node)

print("The shortest distances from the start node are:", distances)

The above code picks the next node with a linear scan (min over the distances dictionary), which is fine for small graphs. Production implementations keep the frontier in a binary heap (priority queue) so the closest node can be popped in O(log n), giving the familiar O((V + E) log V) running time for large graphs.
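
A heap-based sketch for weighted graphs (my addition, not the article's code); it assumes a dict-of-dicts graph that maps each node to {neighbor: weight}, which differs from the unweighted example above:

import heapq

def dijkstra_heap(graph, start_node):
    distances = {start_node: 0}
    heap = [(0, start_node)]  # (distance so far, node)

    while heap:
        dist, node = heapq.heappop(heap)
        if dist > distances.get(node, float("inf")):
            continue  # stale entry; a shorter path was found after this push

        for neighbor, weight in graph[node].items():
            new_distance = dist + weight
            if new_distance < distances.get(neighbor, float("inf")):
                distances[neighbor] = new_distance
                heapq.heappush(heap, (new_distance, neighbor))

    return distances

weighted_graph = {0: {1: 4, 2: 1}, 1: {3: 1}, 2: {1: 2, 3: 5}, 3: {}}
print(dijkstra_heap(weighted_graph, 0))  # {0: 0, 1: 3, 2: 1, 3: 4}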

10. Bellman-Ford algorithm

The Bellman-Ford algorithm is a dynamic programming algorithm, which means that it builds up the solution to the problem by solving smaller subproblems. It is similar to Dijkstra’s algorithm, but it can handle graphs with negative edge weights.

When to Use?

  • When finding the shortest path between a single source node and all other nodes in a graph with negative edge weights.
  • The Bellman-Ford algorithm is a dynamic programming algorithm, which means that it builds up the solution to the problem by solving smaller subproblems.
  • The Bellman-Ford algorithm is slower than Dijkstra’s algorithm (O(V·E) versus roughly O(E log V)), but it is the standard choice when edges can be negative, and it can also report negative-weight cycles.

Python Example Code:

def bellman_ford(graph, start_node):
    distances = {}
    predecessors = {}

    for node in graph:
        distances[node] = float('inf')
        predecessors[node] = None

    distances[start_node] = 0

    # Relax every edge |V| - 1 times
    for _ in range(len(graph) - 1):
        for node in graph:
            for neighbor, weight in graph[node].items():
                new_distance = distances[node] + weight

                if new_distance < distances[neighbor]:
                    distances[neighbor] = new_distance
                    predecessors[neighbor] = node

    # One more pass: any further improvement means a negative-weight cycle
    for node in graph:
        for neighbor, weight in graph[node].items():
            if distances[node] + weight < distances[neighbor]:
                raise ValueError("Graph contains a negative-weight cycle")

    return distances, predecessors

# Example usage:

graph = {
    0: {1: 1, 2: 5},
    1: {2: 3},
    2: {3: 1},
    3: {}
}

start_node = 0

distances, predecessors = bellman_ford(graph, start_node)

print("The shortest distances from the start node are:", distances)
print("The shortest paths from the start node are:", predecessors)

The above Bellman-Ford code relies on edge relaxation: it makes |V| − 1 passes over all edges, and each pass tightens the distance estimates until they converge, for a total running time of O(V·E). The final extra pass detects negative-weight cycles, because any edge that can still be relaxed at that point implies one.
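
To see why the negative-weight support matters, here is a small illustrative graph (my example, not the article's) with one negative edge but no negative cycle. A textbook Dijkstra, which never revisits a finalized node, would report distance 2 for node 1 here, while Bellman-Ford finds the true shortest distance of 1 through the negative edge:

graph_with_negative_edge = {
    0: {1: 2, 2: 5},
    1: {},
    2: {1: -4}
}

distances, predecessors = bellman_ford(graph_with_negative_edge, 0)

print(distances)  # {0: 0, 1: 1, 2: 5} -- the cheapest route to node 1 goes through node 2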

Conclusion

These are the top 10 programming algorithms that every programmer should know. These algorithms are used in a wide variety of applications, from web development to machine learning. By understanding these algorithms and how to implement them, you can become a better programmer and solve problems more efficiently.

Thanks

Writing has always been my passion and it gives me pleasure to help and inspire people. If you have any questions, feel free to reach out!

Thank you for reading this far! If you enjoyed the article, please give it a round of applause or a follow; I will thank you from the bottom of my heart.