
Sorting Algorithms in Python: Crash Tutorial

Sorting algorithms are a vital part of computer science due to their versatile nature, and they also serve as a common teaching tool. These algorithms arrange data in a particular order, such as alphabetical or numerical order. There are also numerous design approaches: brute force, divide and conquer, greedy algorithms, and so on. In this article, we’ll explore three popular sorting algorithms in depth: bubble sort, merge sort, and quicksort, including how to code each in Python. We’ll also look at how each algorithm works, as well as the Big O notation for its time and space complexity.


Bubble Sort: Get Started with Sorting Algorithms

Bubble sort is one of the simplest sorting algorithms: it repeatedly steps through a list, compares adjacent elements, and swaps them if they are in the wrong order. The name comes from the way larger values gradually “bubble up” toward the end of the list with each pass. The pass through the list repeats until the sorting completes. Below is the code for a bubble sort algorithm in Python:

def bubble_sort(arr):
    # sorts arr in place
    n = len(arr)
    for i in range(n):
        # after each pass, the last i elements are already in their final positions
        for j in range(0, n-i-1):
            if arr[j] > arr[j+1]:
                # swap adjacent elements that are out of order
                arr[j], arr[j+1] = arr[j+1], arr[j]

The bubble_sort function takes an array arr as input and sorts it in place using the bubble sort algorithm. The outer loop runs n times, where n is the length of the array. The inner loop runs from 0 to n-i-1 because the last i elements are already in their final positions. In each iteration of the inner loop, adjacent elements are compared and swapped if they are in the wrong order.
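For example, here is a minimal usage sketch (the list of numbers is just illustrative data):

numbers = [5, 1, 4, 2, 8]
bubble_sort(numbers)   # sorts the list in place and returns None
print(numbers)         # [1, 2, 4, 5, 8]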

The time complexity of bubble sort is O(n^2), where n is the array’s length, which makes it inefficient for large arrays. To put it into context, for an array of size 1,000,000, bubble sort would take far longer than either of the sorting algorithms below. On the other hand, the space complexity is O(1) because only a constant amount of extra memory is needed for temporary variables; this is the one place where it beats both merge sort and quicksort in Python.
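If you want a rough feel for the difference yourself, here is a small, unscientific timing sketch; the list size of 5,000 is an arbitrary choice, and it compares bubble_sort against Python’s built-in sorted():

import random
import time

# generate some random example data
data = [random.randint(0, 1_000_000) for _ in range(5_000)]

start = time.perf_counter()
bubble_sort(data.copy())
print("bubble_sort:", time.perf_counter() - start, "seconds")

start = time.perf_counter()
sorted(data)
print("sorted():", time.perf_counter() - start, "seconds")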

Bubble sort is a stable algorithm, which means it keeps the relative order of elements with equal values the same after sorting.
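To see stability in action, here is a hypothetical variation of bubble_sort that sorts (score, name) pairs by score only; the data and the choice of key are illustrative assumptions, not part of the original code:

def bubble_sort_by_score(pairs):
    # same bubble sort, but compares only the score (first item),
    # so pairs with equal scores are never swapped and keep their original order
    n = len(pairs)
    for i in range(n):
        for j in range(0, n - i - 1):
            if pairs[j][0] > pairs[j + 1][0]:
                pairs[j], pairs[j + 1] = pairs[j + 1], pairs[j]

students = [(90, "Ada"), (85, "Bo"), (90, "Cy"), (85, "Di")]
bubble_sort_by_score(students)
print(students)  # [(85, 'Bo'), (85, 'Di'), (90, 'Ada'), (90, 'Cy')]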

Merge Sort in Python: How to Divide and “Merge”

John von Neumann invented the merge sort algorithm in 1945 as a divide-and-conquer algorithm. Merge sort is one of the more complex sorting algorithms: it divides the array into two halves, sorts each half recursively, and then merges the two sorted halves back together. Here is the Python code for merge sort:

def merge_sort(arr):
    # sorts arr in place
    if len(arr) > 1:
        # split the array into two halves
        mid = len(arr) // 2
        left_half = arr[:mid]
        right_half = arr[mid:]
        
        # recursively sort each half
        merge_sort(left_half)
        merge_sort(right_half)
        
        # merge the two sorted halves back into arr
        i = j = k = 0
        
        while i < len(left_half) and j < len(right_half):
            if left_half[i] < right_half[j]:
                arr[k] = left_half[i]
                i += 1
            else:
                arr[k] = right_half[j]
                j += 1
            k += 1
        
        # copy any remaining elements of the left half
        while i < len(left_half):
            arr[k] = left_half[i]
            i += 1
            k += 1
        
        # copy any remaining elements of the right half
        while j < len(right_half):
            arr[k] = right_half[j]
            j += 1
            k += 1

The merge_sort function in Python takes an array arr as input and sorts it using the merge sort algorithm. The function first divides the array into two halves, then uses recursion to sort each half, repeating the division until each piece can no longer be divided. Finally, the sorted halves, held in temporary arrays, are merged back into the original array. That is why it is called a divide-and-conquer algorithm.
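A minimal usage sketch, again with made-up data:

values = [38, 27, 43, 3, 9, 82, 10]
merge_sort(values)   # sorts the list in place
print(values)        # [3, 9, 10, 27, 38, 43, 82]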

The time complexity of merge sort is O(n log n), where n is the array’s length. This is the best asymptotic bound a comparison-based sort can achieve, so merge sort usually stays fast even on large arrays. The space complexity is O(n) because temporary arrays are needed to hold the halves being merged.

Quicksort: Quickly Divide and Conquer Your Sorting Algorithms

Quicksort is also a divide-and-conquer algorithm, invented by Tony Hoare in 1959. The algorithm partitions the array into two subarrays around a pivot, recursively sorts the subarrays, and then combines the sorted subarrays.

Here is an implementation of quicksort in Python:

def quicksort(arr):
    # base case: a list of zero or one elements is already sorted
    if len(arr) <= 1:
        return arr
    
    # use the first element as the pivot
    pivot = arr[0]
    left = []
    right = []
    
    # partition the remaining elements around the pivot
    for i in range(1, len(arr)):
        if arr[i] < pivot:
            left.append(arr[i])
        else:
            right.append(arr[i])
    
    # recursively sort the partitions and combine them with the pivot
    return quicksort(left) + [pivot] + quicksort(right)

The quicksort function takes an array arr as input and sorts it using the quicksort algorithm. First, it selects a pivot element (in this case, the first element of the array) and partitions the remaining elements into two subarrays: one containing elements smaller than the pivot, and one containing elements greater than or equal to the pivot. Then, the function recursively sorts the two subarrays and combines them with the pivot element.
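A minimal usage sketch; note that unlike the bubble_sort and merge_sort functions above, this quicksort returns a new sorted list instead of sorting in place:

values = [10, 7, 8, 9, 1, 5]
print(quicksort(values))  # [1, 5, 7, 8, 9, 10]
print(values)             # [10, 7, 8, 9, 1, 5] -- the original list is unchanged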

The time complexity of quicksort is O(n log n) on average, where n is the length of the array. However, the worst-case time complexity is O(n^2), which occurs when the chosen pivot repeatedly ends up being the smallest or largest element, for example when this first-element-pivot version is run on an already-sorted array. So while it runs very fast on average, it can take much longer on unlucky inputs. For this implementation, which builds new lists at each step, the space complexity is O(n) on average but can be as bad as O(n^2) in the worst case.
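One common mitigation, not part of the implementation above, is to choose the pivot at random so that no particular input ordering reliably triggers the quadratic worst case. Here is a sketch of that variation, assuming random pivot selection is acceptable for your use case:

import random

def quicksort_random_pivot(arr):
    # same partitioning idea as above, but with a randomly chosen pivot
    if len(arr) <= 1:
        return arr
    
    pivot = random.choice(arr)
    left = [x for x in arr if x < pivot]
    middle = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    
    return quicksort_random_pivot(left) + middle + quicksort_random_pivot(right)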

Conclusion

In summary, we have looked at three popular sorting algorithms: bubble sort, merge sort, and quicksort. Bubble sort is a simple algorithm with a time complexity of O(n^2) and a space complexity of O(1). Merge sort is a divide and conquer algorithm with a time complexity of O(n log n) and a space complexity of O(n). Quicksort is also a divide and conquer algorithm with a time complexity of O(n log n) on average and a worst-case time complexity of O(n^2). It has a space complexity of O(n) on average, and O(n^2) in the worst case.

When selecting a sorting algorithm, it’s important to consider the size of the input data and the expected performance characteristics of the algorithm. Bubble sort is generally useful only for small arrays or for educational purposes. In contrast, merge sort and quicksort are more commonly used in real-world applications. If stability is important, merge sort is the better choice, while if in-place sorting is a necessity, quicksort (in its in-place form) may be a better choice.

Disclaimer: ChatGPT wrote the code in this article. If you want to learn more about ChatGPT, articles like ChatGPT: The Next Big Thing in AI and ChatGPT AI: Features to 6X Your Productivity in 2023 will help you get started.

I hope you liked reading this post. Please share your thoughts on these sorting algorithms below and tell me which one is your favorite, if you have one. What was your experience like when you first learned about them? Thank you for reading, and have a good day!

By abel

