and we are back 🙂 , now let's continue with some other sorting algorithms. If you haven't read part one of this blog post, you can find it here: Sorting algorithms

Selection Sort O(n^2):

Selection sort: worst case O(n^2), average case O(n^2), best case O(n^2). Disadvantage: it never benefits from presorted arrays.

Let's start again with a stupid slow sorting algorithm: the selection sort. It's a very basic algorithm, even more basic than bubble sort. Its logic is like when you have a bunch of numbers and you want to sort them: you pick the smallest one, then the second smallest one, then the 3rd smallest one, and so on.

And from here comes the naming (selection). Every time, you have to loop over the array to find the smallest remaining item (complexity O(n)), and you repeat this n times in order to pick n items, so the overall complexity will be O(n^2).
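To make the idea concrete, here is a minimal Python sketch of selection sort (my own illustration, not code from the course or from this post):

```python
# A minimal selection sort sketch: on every pass, scan the unsorted part
# for the smallest element and swap it to the front of that part.
def selection_sort(arr):
    n = len(arr)
    for i in range(n):
        smallest = i
        # inner scan over the remaining items: O(n) work, repeated n times -> O(n^2)
        for j in range(i + 1, n):
            if arr[j] < arr[smallest]:
                smallest = j
        arr[i], arr[smallest] = arr[smallest], arr[i]
    return arr

print(selection_sort([5, 2, 9, 1, 7]))  # -> [1, 2, 5, 7, 9]
```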


Why is the selection turtle dumber than the bubble turtle?

Selection sort never takes any benefit from presorted arrays or even presorted elements: every time, it loops over the whole remaining array to find the minimum element to pick. So if we have a presorted array, which is the best case,
bubble sort will sort it in n iterations, while selection sort will still have a complexity of n^2, which makes it the dumbest ever 😀
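To see the difference concretely, here's a small hedged sketch (Python, my own illustration): a bubble sort with an early-exit flag finishes a presorted array after a single pass, while selection sort always does its full n^2 scan:

```python
# Bubble sort with an early-exit flag: on a presorted array the first
# pass makes no swaps, so it stops after roughly n comparisons.
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:          # no swaps in this pass -> already sorted
            break
    return arr

# Selection sort has no such shortcut: it does ~n^2/2 comparisons
# even on an already sorted input like [1, 2, 3, 4, 5].
print(bubble_sort([1, 2, 3, 4, 5]))  # one pass and done
```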

_____________________________

Quick Sort:

Quicksort: best case O(n log n), average case O(n log n), worst case O(n^2).


Quick sort, who called it Quick?

Dunno, maybe it was the quickest then? 😀

Kidding 😀. Quick sort is one of the most efficient algorithms. It has n^2 as a worst case, but it rarely hits that worst case, and in practice it's often faster than the other n log n algorithms
because its inner loop can be implemented efficiently on most architectures and data structures (arrays, linked lists, etc.).

Actually, quick sort also competes with merge sort: quick sort can be implemented on linked lists as well, and it usually needs only a little extra memory (O(log n) for the recursion), unlike merge sort which needs O(n) extra memory when using arrays.

let’s see how does Quick sort works : 

Quick  sort Algorithm :

it is from (divide and conquer algorithms , like the  Merge sort )

  1. Divide: pick a pivot (a randomly selected element)
  2. Move all the numbers less than the pivot before it
  3. Move all the numbers greater than the pivot after it
  4. Then you have two sub-arrays: one whose elements are all smaller than the pivot, and another whose elements are all larger
  5. Repeat all those steps recursively on the two sub-arrays

Steps 2 and 3 are done as follows:

* We use two pointers, one pointing at the start of the array and the other at the end, with the pivot in between. Move the left pointer until you find an element greater than the pivot (it should go to the other side).

* On the other side, move the right pointer until it hits a number less than the pivot (it should also go to the other side).

* Swap the two numbers.

* Repeat those steps until the two pointers meet at the pivot; then everything to the left of the pivot will be less than it, and everything to the right will be greater than it.
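Here is a short Python sketch of the whole thing (my own illustration, and just one of several partitioning variants: it picks a random pivot and parks it at the end before the two pointers start moving, instead of keeping it in the middle):

```python
import random

def quick_sort(arr, lo=0, hi=None):
    """Sort arr in place between indices lo and hi (inclusive)."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return arr
    # Divide: pick a random pivot and move it to the end for convenience
    p = random.randint(lo, hi)
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot = arr[hi]
    left, right = lo, hi - 1
    while left <= right:
        # move the left pointer until it finds an element >= pivot
        while left <= right and arr[left] < pivot:
            left += 1
        # move the right pointer until it finds an element < pivot
        while left <= right and arr[right] >= pivot:
            right -= 1
        if left < right:
            # both pointers stopped on misplaced numbers: swap them
            arr[left], arr[right] = arr[right], arr[left]
    # put the pivot between the two halves
    arr[left], arr[hi] = arr[hi], arr[left]
    # repeat recursively on the two sub-arrays
    quick_sort(arr, lo, left - 1)
    quick_sort(arr, left + 1, hi)
    return arr

print(quick_sort([9, 3, 7, 1, 8, 2]))  # -> [1, 2, 3, 7, 8, 9]
```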

Check this video, it elaborates well on the use of the two pointers:

You can also check this video for further demonstration and some code snippets:

_____________________________

 

Heap Sort:

Heapsort: best case O(n log n), average case O(n log n), worst case O(n log n).

This post assumes you have some pre-knowledge of what a heap is; if you don't, you can check this video or this one.

Heap sort is an efficient kind of selection sort (yeah, the dumbest one ever) 😀

How??

In selection sort we used to search the whole remaining array for the smallest number, so we ended up with an algorithm of complexity n^2.

But with a heap, getting the smallest remaining element out is easy: it needs a number of iterations at most equal to the height of the tree, which is log(n).

So it gives an overall complexity of O(n log(n)).

This is how it works:

  • Step 1: Build a heap
  • Step 2: removeMin()

The minimum item in the heap will always be the root of the heap, so we keep removing it and reconstructing the heap again; this is super easy when you are dealing with a Heap class that has a remove function.

Removing an item from the heap:

When you remove the root of the heap, you replace it with the last element in the heap.

Then you compare it with its two children, and swap it with the smaller of its two children.

Then repeat those steps with the moved item until it is smaller than both of its children.

(If you don't get this, it's recommended to revise heap removal.)

The worst case here is when the item sinks all the way down to the bottom of the tree; then you do log(n) iterations.

For removing n items, you get an overall complexity of O(n log(n)), which is the complexity of heap sort.

Compared to selection sort (O(n^2)), it's a huge improvement, although both are based on the same logic.
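Here is a small hand-rolled Python sketch that mirrors the two steps above (build a min-heap, then removeMin() n times by moving the last element to the root and sinking it down). Again, this is my own illustration under those assumptions, not code from the course:

```python
def sift_down(heap, i, size):
    """Sink heap[i] down until it is smaller than both of its children."""
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        smallest = i
        if left < size and heap[left] < heap[smallest]:
            smallest = left
        if right < size and heap[right] < heap[smallest]:
            smallest = right
        if smallest == i:            # already smaller than both children
            return
        heap[i], heap[smallest] = heap[smallest], heap[i]
        i = smallest                 # keep sinking: at most log(n) levels

def heap_sort(arr):
    heap = list(arr)
    n = len(heap)
    # Step 1: build the heap by sifting down every non-leaf node
    for i in range(n // 2 - 1, -1, -1):
        sift_down(heap, i, n)
    # Step 2: removeMin() n times, each removal costing O(log n)
    out = []
    size = n
    while size > 0:
        out.append(heap[0])          # the root is always the minimum
        size -= 1
        heap[0] = heap[size]         # replace the root with the last element
        sift_down(heap, 0, size)     # restore the heap property
    return out

print(heap_sort([4, 10, 3, 5, 1]))  # -> [1, 3, 4, 5, 10]
```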

__________________________________________________________________

Till now, we are done 🙂

yet let’s do some conclusions 😀
from different searching algorithms above and in the last post we found that

in the average case :

our three heroes will be: Merge sort, Quick sort, and Heap sort

and our losers will be: Bubble sort, Selection sort, and Insertion sort

in case of a presorted array:

the true heroes will be: Bubble sort and Insertion sort, for having O(n) complexity

and the loser will be: the dumb turtle 😀 the Selection sort

Finally, most of us, before learning these algorithms, were already using bubble sort to sort arrays,

so it once again will be the hero as the most famous sorting algorithm.

______________________________________________________________

DO IT yourself  😀 


These are the people who do the folk algorithmic dances; watch their YouTube channel.

those Guys are awesome 🙂

it’s the best practice now to watch their Dances and try to map them to what you’ve  learnt on different sorting algorithms

also follow their fanpage

________________________________

And once again, credit goes to:

  • MIT OpenCourseWare: the Introduction to Algorithms course, with Charles and Erik 🙂 I recommend watching the full playlist, or taking a look at the full course with the slides and exercises on the MIT OpenCourseWare website here. Thank you MIT 🙂
  • Thanks to Wikipedia for the amazing animated GIF images 🙂
  • Google, for the images of rabbits and bunnies 😀