I have a question regarding complexity theory. For a single algorithm like bubble sort, I can find its worst-case running time and conclude that it is O(n^2). But what about a program that performs several different operations, e.g. a sorting algorithm followed by a search algorithm? How do I determine the worst-case running time (Big O) of such a program as a whole?

For example, given a program whose parts have different worst-case running times, say O(n^2), O(1), and O(n), how do I combine them into a single Big O bound for the entire program? Which of these notations is the one that represents the whole program?

How would you find the worst-case running time Big O of this program?

```
import java.util.*;
public class Prog1 {
    public static void main(String[] args) {
        int first = 0;
        int last;
        int middle;
        int search;
        int[] array;
        Scanner input = new Scanner(System.in);

        // Read n values into the array: O(n)
        System.out.println("Number of elements");
        int n = input.nextInt();
        array = new int[n];
        System.out.println("Enter " + n + " values");
        for (int x = 0; x < n; x++) {
            array[x] = input.nextInt();
        }

        System.out.println("Value to search");
        search = input.nextInt();

        // Binary search: O(log n). Note it only works correctly
        // if the values were entered in sorted order.
        last = n - 1;
        middle = (first + last) / 2;
        while (first <= last) {
            if (array[middle] < search)
                first = middle + 1;
            else if (array[middle] == search) {
                System.out.println(" Number " + search + " is in the array");
                break;
            } else
                last = middle - 1;
            middle = (first + last) / 2;
        }
        if (first > last)
            System.out.println(" Number " + search + " is not in the list.");
    }
}
```

The highest one. When the pieces run one after another (sequentially), their costs add, and the sum is dominated by the fastest-growing term: O(n^2) + O(n) + O(1) = O(n^2), asymptotically speaking. For the program above, the input-reading loop is O(n) and the binary search is O(log n), so the whole main method is O(n + log n) = O(n). Strictly, this is how you calculate the complexity of an algorithm; it doesn't make much sense to talk about the "complexity" of a program as such.
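To make the "sum of sequential phases" idea concrete, here is a minimal, hypothetical sketch (the class and method names are invented for illustration): three phases that cost n^2, n, and 1 basic operations respectively, run one after another. Counting the operations shows the total is n^2 + n + 1, and for large n the quadratic term dominates, which is why the whole thing is O(n^2).

```java
// Hypothetical demo: a "program" made of three sequential phases.
// Total work = work of phase 1 + phase 2 + phase 3, so the
// fastest-growing phase dominates asymptotically.
public class ComplexityDemo {

    // O(n^2): nested loop performs n * n basic operations
    static long quadraticPhase(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            for (int j = 0; j < n; j++)
                ops++;
        return ops;
    }

    // O(n): single loop performs n basic operations
    static long linearPhase(int n) {
        long ops = 0;
        for (int i = 0; i < n; i++)
            ops++;
        return ops;
    }

    // O(1): constant work regardless of n
    static long constantPhase(int n) {
        return 1;
    }

    public static void main(String[] args) {
        int n = 1000;
        long total = quadraticPhase(n) + linearPhase(n) + constantPhase(n);
        // total = n^2 + n + 1 = 1_001_001 for n = 1000;
        // the n^2 term contributes ~99.9% of the work, so the
        // program as a whole is O(n^2).
        System.out.println(total);
    }
}
```

Doubling n here roughly quadruples the total, exactly what O(n^2) predicts; the linear and constant phases quickly become irrelevant.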
