It's been a while since I've posted here, so here's another ramble aimed at the zero people who read this (i.e. myself).
Currently I'm taking the Design and Analysis of Algorithms course (rerun) on Coursera. It's only on lecture 2 of 6 so far, but I'm getting ahead using the lectures from the initial run of the course.
Lectures 1 and 2 cover Divide and Conquer algorithms (merge sort, quicksort, the ith order statistic, Strassen's matrix multiplication, etc.). It's pretty awesome to think that a lot of O(n^2) algorithms can be split up to run in O(n log n), and it also made me realise that Divide and Conquer is such a basic technique used in everyday life (albeit with the risk of exceeding your function stack's limit due to recursion).
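The merge sort case can be sketched in a few lines (my own toy version, not code from the course): split the list in half, sort each half recursively, then merge the two sorted halves in linear time, which is where the O(n log n) comes from.

```python
def merge_sort(xs):
    """Sort a list via divide and conquer: split, recurse, merge."""
    if len(xs) <= 1:          # base case: 0 or 1 elements are already sorted
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])   # conquer the left half
    right = merge_sort(xs[mid:])  # conquer the right half
    # merge step: repeatedly take the smaller front element, O(n) total
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5, 6]))  # → [1, 2, 5, 5, 6, 9]
```

(And yes, the recursion here is exactly the kind that could blow the stack on a pathological input, as joked about below.)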
"Your mother is so fat, the recursive function computing her mass causes a stack overflow."
It also makes me wonder (though there are probably masses of proofs to end this train of thought): since many O(n^2) algorithms become O(n log n), and O(n) algorithms become O(log n), using D&C, could the D&C algorithms themselves be further Divided and Conquered (with some kind of magic)?
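To make the O(n) → O(log n) case concrete, binary search is the classic example (my own sketch, not from the lectures): instead of scanning every element, you halve the sorted search space each step, so only about log n comparisons are needed.

```python
def binary_search(xs, target):
    """Return an index of target in sorted list xs, or -1 if absent."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2   # split the remaining range in half
        if xs[mid] == target:
            return mid
        elif xs[mid] < target:
            lo = mid + 1       # discard the left half
        else:
            hi = mid - 1       # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9, 11], 7))  # → 3
```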
Taking sorting as an example, proofs are in place showing that data can't be sorted quicker than O(n) (you have to at least look at every element), so our new algorithm's order g(n) would have to satisfy:
O(n) < g(n) < O(n log n)
Let's take g(n) = n·k(n), where for merge sort k(n) = log(n).
So we can split the problem up a bit:
1 < k(n) < log(n)
Obviously, since O(0.5 log(n)) = O(log(n)), k(n) can't just be log(n) multiplied by a constant, so the below is a bit clearer:
O(1) < O(k(n)) < O(log(n))
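Just to eyeball how much room that sandwich leaves, here's a quick numeric sanity check (my own, nothing rigorous) of the gap between constant and logarithmic growth:

```python
import math

# The gap O(1) < O(k(n)) < O(log n): even for huge n,
# log2(n) stays tiny, so k(n) has very little room to live in.
for n in [16, 1024, 1_000_000]:
    print(f"n={n:>9}  constant=1  log2(n)={math.log2(n):.1f}")
```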
Sadly I can't think of a function smaller than log(n), so I can't ramble any further!
Unless...
I invent one,
ALS(n) is smaller than log(n) for all values of n
Therefore a D&C^2 sorting algorithm is O(nALS(n))!