
Talk:Best, worst and average case


Worst case... A person won't know that this refers to sorting algorithms... Does this have any sort of potential as an encyclopedia article? Been a long time since I talked the lingo so I don't feel qualified... --Rgamble


This looks like some kind of sorting routine, possibly in C#.

A Java programmer would never write such a travesty. He'd simply call the Arrays.sort() method in the Java API.

Actually the sort below is written in C, and it isn't a travesty at all. Suppose you know that all the elements in an array are between a and b; then this sort will run in Θ(max(n, b-a)) time (in many cases, this is equivalent to Θ(n) time: not too shabby :-) --Hari

/* Sorts the ints from low to high (both pointers inclusive), all of which
   must lie in [minvalue, maxvalue]. Counting occurrences (rather than a
   boolean "seen" flag) keeps the sort correct when values repeat. */
void pigeonhole_sort ( int *low , int *high , int minvalue , int maxvalue )
{
   /* minvalue and maxvalue can also easily be determined within this function */
   int count , size = maxvalue - minvalue + 1 , *current ;
   int holes[size] ;                          /* one counter per pigeonhole */
   for ( count = 0 ; count < size ; count++ ) /* initializing */
      holes[count] = 0 ;
   for ( current = low ; current <= high ; current++ ) /* counting */
      holes[(*current)-minvalue]++ ;
   for ( current = low , count = 0 ; count < size ; count++ ) /* writing back */
      while ( holes[count]-- > 0 )
         *current++ = count + minvalue ;
}
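
For reference, a minimal driver (mine, not part of the original post), compiled together with the function above; note that high points at the last element, not one past it:

#include <stdio.h>

int main ( void )
{
   int a[] = { 5 , 3 , 9 , 3 , 1 , 7 } ;     /* values known to lie in [1, 9] */
   pigeonhole_sort ( a , a + 5 , 1 , 9 ) ;   /* high = pointer to the last element */
   for ( int count = 0 ; count < 6 ; count++ )
      printf ( "%d " , a[count] ) ;          /* prints: 1 3 3 5 7 9 */
   printf ( "\n" ) ;
   return 0 ;
}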

Are there quicksort implementations which push the worst-case performance below O(n²)? AxelBoldt

Well, the STL uses "introsort" (introspective sort), which basically uses a median-of-three pivot and switches to a true O(n log n) sort if the recursion depth exceeds k·log n, for some k, typically 2. One such O(n log n) algorithm would be a quicksort that uses the true median as the pivot in the partition. Since the true median can be found in O(n) time, and partition is O(n) time, this is O(n log n). However, the constant factor in such a sort would be somewhat large, so it should be used only if the recursion depth gets out of hand (the STL uses heapsort instead). So, you can achieve O(n log n) with the quicksort structure. --Hari
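
To make that concrete, here is a rough C sketch of the introsort idea (the function names and the exact depth budget of 2·log2(n) are illustrative assumptions, not the STL's actual code): quicksort around a median-of-three pivot, falling back to heapsort once the depth budget is exhausted, which caps the worst case at O(n log n).

static void swap_int ( int *a , int *b ) { int t = *a ; *a = *b ; *b = t ; }

/* Standard heapsort on a[0..n-1]: the guaranteed O(n log n) fallback. */
static void sift_down ( int a[] , long root , long end )
{
   while ( 2 * root + 1 <= end )
      {
      long child = 2 * root + 1 ;
      if ( child + 1 <= end && a[child] < a[child + 1] )
         child++ ;                        /* take the larger child */
      if ( a[root] >= a[child] )
         return ;
      swap_int ( &a[root] , &a[child] ) ;
      root = child ;
      }
}

static void heapsort_ints ( int a[] , long n )
{
   for ( long start = n / 2 - 1 ; start >= 0 ; start-- )
      sift_down ( a , start , n - 1 ) ;   /* build a max-heap */
   for ( long end = n - 1 ; end > 0 ; end-- )
      {
      swap_int ( &a[0] , &a[end] ) ;      /* move the max into place */
      sift_down ( a , 0 , end - 1 ) ;
      }
}

/* Hoare partition around a median-of-three pivot; returns j such that
   a[lo..j] <= pivot <= a[j+1..hi]. */
static long partition_ints ( int a[] , long lo , long hi )
{
   long mid = lo + ( hi - lo ) / 2 ;
   if ( a[mid] < a[lo] ) swap_int ( &a[mid] , &a[lo] ) ;
   if ( a[hi] < a[lo] ) swap_int ( &a[hi] , &a[lo] ) ;
   if ( a[hi] < a[mid] ) swap_int ( &a[hi] , &a[mid] ) ;
   int pivot = a[mid] ;                   /* median of first, middle, last */
   long i = lo - 1 , j = hi + 1 ;
   for ( ; ; )
      {
      do { i++ ; } while ( a[i] < pivot ) ;
      do { j-- ; } while ( a[j] > pivot ) ;
      if ( i >= j )
         return j ;
      swap_int ( &a[i] , &a[j] ) ;
      }
}

static void introsort_rec ( int a[] , long lo , long hi , int depth )
{
   while ( hi > lo )
      {
      if ( depth-- == 0 )                 /* recursion got out of hand: */
         {
         heapsort_ints ( a + lo , hi - lo + 1 ) ;
         return ;
         }
      long p = partition_ints ( a , lo , hi ) ;
      introsort_rec ( a , lo , p , depth ) ;  /* recurse on the left part */
      lo = p + 1 ;                            /* iterate on the right part */
      }
}

void introsort ( int a[] , long n )
{
   int depth = 0 ;
   for ( long m = n ; m > 1 ; m >>= 1 )
      depth += 2 ;                        /* depth budget: k*log2(n), k = 2 */
   introsort_rec ( a , 0 , n - 1 , depth ) ;
}

Heapsort plays the same role here as in the STL: never faster than a well-behaved quicksort, but immune to pathological pivot sequences.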

Actually, I am thinking maybe we should just merge this into algorithm or have an article like complexity analysis of algorithms or something. -- Taku 08:15, Mar 30, 2004 (UTC)

Merge with average performance


I merged this with average performance and did some cleanup. I'll leave any extra work (and this article could use it) for others to do. --Phils 16:01, 20 Oct 2004 (UTC)


Linear search on an array has a worst-case performance of O(n) (see Big O notation), because in the worst case, the algorithm has to check every element. However, the average running time is O(n/2).

This is a poor example, because O(n/2) is really just O(n) as well in Big O notation (the 1/2 is just a scalar multiple).--Malcohol 13:48, 31 January 2006 (UTC)[reply]
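
To spell out where the n/2 comes from (a sketch, assuming the target is present and equally likely to be at any of the n positions): the expected number of elements examined is (1 + 2 + ... + n)/n = (n+1)/2. That is Θ(n), so the average case differs from the worst case only by the constant factor 1/2, which Big O notation discards.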

Removed content from best case


This kind of analysis differs from the other two in that we consider not the algorithm but the problem itself. It should really be referred to as 'best worst-case' analysis, because we aim to arrive at bounds on the performance of all possible algorithmic solutions, assuming their worst case.

Best case analysis is based on a consideration of the logical demands of the problem at hand - what is the very minimum that any algorithm to solve this problem would need to do, in the worst case, for an input of size n?

I don't think you ever do best-case analysis on the problem. There will almost always be an algorithm that has a constant best-case performance. Taemyr (talk) 18:52, 1 May 2008 (UTC)[reply]
This content is confusing "best case" with "lower bounds"; you can very well analyze the best case performance of a specific algorithm over all inputs (see the table in sorting algorithm for example). On the other hand, a lower bound can specify the minimum resources required in the worst case, with the minimum taken over all possible algorithms, a good example being the Ω(n log n) lower bound on comparison sorts. Dcoetzee 21:23, 1 May 2008 (UTC)[reply]
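
For completeness, the standard argument behind that Ω(n log n) lower bound: any comparison sort can be viewed as a binary decision tree whose leaves are the n! possible orderings of the input. A binary tree of height h has at most 2^h leaves, so h ≥ log2(n!) ≥ (n/2)·log2(n/2) = Ω(n log n). Note that this bounds the problem over all comparison-based algorithms; it says nothing about any particular algorithm's best case.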

Merger proposal


I propose the three articles Best, worst and average case, Worst-case complexity and Average-case complexity be merged to make one informative and complete article. --Robin (talk) 16:13, 11 August 2009 (UTC)[reply]

I'm removing the merge tag. After over three years, there is no discussion on this topic. If you still want to merge, I suggest that you be WP:BOLD and WP:JUSTDOIT. WTF? (talk) 14:02, 24 January 2013 (UTC)[reply]

WTF? you want a merge 'just to see some action', ...so you remove the 'action'? It's all told here, in the Annals of Wikipedia. TOOFUNNYWTFLOL :D Henrik Erlandsson 03:01, 30 March 2019 (UTC) — Preceding unsigned comment added by HenrikErlandsson (talkcontribs) [reply]

Confusion


What is worst case space complexity?

Worst case space complexity and worst case performance both link here. The article explains what worst-case performance is, but what is worst-case space complexity? They are different things, because the quicksort article has both of them. —Preceding unsigned comment added by Rmasta (talkcontribs) 22:28, 3 January 2010 (UTC)[reply]

I second Rmasta. At the very least, the Space Complexity stub should be linked at its first occurrence. Also, there's not much depth of thought in mentioning it (or calling it "complexity"). Anyone developing or testing a sort algorithm can disregard this factor almost completely, even considering early attempts on low-memory computers. Performance, perhaps measured in the number of "swaps" (though for a good algorithm probably not), is the real unsolved problem since time immemorial.

Henrik Erlandsson 02:39, 30 March 2019 (UTC) — Preceding unsigned comment added by HenrikErlandsson (talkcontribs)