Is big-O notation a tool to do best, worst, & average case analysis of an algorithm? Or is big-O only for worst case analysis, since it is an upper bounding function?

It is called Big O because orders of growth are expressed as O(n), O(log n), and so on.

The best, worst, and average cases of an algorithm can all be expressed with Big O notation.
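To make this concrete, here is a minimal sketch (not from the question itself) using linear search, where the best, worst, and average cases genuinely differ and each one can be stated in Big O terms:

```python
# Linear search: one algorithm, three cases, each with its own Big O bound.
def linear_search(items, target):
    """Return (index, comparison_count); index is -1 if target is absent."""
    comparisons = 0
    for i, value in enumerate(items):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons

data = list(range(1, 11))  # [1, 2, ..., 10]

# Best case: target is the first element -> 1 comparison, O(1).
_, best = linear_search(data, 1)

# Worst case: target is absent -> n comparisons, O(n).
_, worst = linear_search(data, 99)

# Average case: target equally likely at any position -> about n/2
# comparisons, which is still O(n).
avg = sum(linear_search(data, t)[1] for t in data) / len(data)
```

Note that the average case has a smaller constant (n/2 versus n) but the same Big O class as the worst case; Big O deliberately discards constant factors.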

For an example of this applied to sorting algorithms, see

http://en.wikipedia.org/wiki/Sorting_algorithm#Comparison_of_algorithms

Note that an algorithm can be classified according to multiple, independent criteria such as memory use or CPU use. Often, there is a tradeoff between two or more criteria (e.g. an algorithm that uses little CPU may use quite a bit of memory).

Big "O" is a measure of *asymptotic complexity*, which is to say, roughly how an algorithm scales as N gets very large.

If the best and worst cases converge to the same asymptotic complexity, you can use a single value; otherwise you can state them separately (for example, some sorting algorithms have completely different characteristics on sorted or almost-sorted data than on unsorted data).
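Insertion sort is the classic example of that last point. The following sketch (an illustration, not code from the question) counts comparisons on already-sorted versus reverse-sorted input to show the O(n) best case and O(n²) worst case of the same algorithm:

```python
def insertion_sort(items):
    """Sort a copy of items; return (sorted_list, comparison_count)."""
    a = list(items)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if a[j] > key:
                a[j + 1] = a[j]  # shift larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return a, comparisons

n = 100

# Already-sorted input: the inner loop stops immediately,
# so one comparison per element -> n - 1 total, O(n) best case.
_, best = insertion_sort(list(range(n)))

# Reverse-sorted input: every prior element must be compared and shifted,
# so n*(n-1)/2 comparisons -> O(n^2) worst case.
_, worst = insertion_sort(list(range(n, 0, -1)))
```

Random input lands between these, averaging about n²/4 comparisons, which is why the average case of insertion sort is also quoted as O(n²) despite the linear best case.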

The notation itself doesn't convey this, though; how you use it does.

... Or is big-O only for worst case analysis ...

If you give just one asymptotic complexity for an algorithm, it doesn't tell the reader whether (or how) the best and worst case differ from the average.

If you give best-case and worst-case complexity, it tells the reader how they differ.

By default, if a single value is listed, it is probably the average complexity which may (or may not) converge with the worst-case.
