I'm wondering what the big O notation is for each of the statements below:

```
Sum = 0;
for i = 1 to N^2 do:
    for j = 1 to N do:
        Sum += 1;
```

Sum = 0 is O(1) for sure, because it is executed only once. But I'm confused by the second statement: should it be O(N) because it's the first loop, or O(N^2) because N^2 is a quadratic function of N?

You'll effectively be looping over N three times, so I'd say O(N^3).

The first loop is O(N^2) because it executes N^2 steps. Each of those steps executes the inner loop, which involves N steps, so there are N^2 * N = N^3 steps in total, and the algorithm is O(N^3).
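As a sanity check, the pseudocode translates directly into a short runnable sketch (Python here, purely illustrative). The final value of the counter equals the total number of times the innermost statement runs, which is N^2 * N = N^3:

```python
def count_ops(n):
    # Direct translation of the pseudocode: the outer loop runs n^2 times,
    # and the inner loop runs n times per outer iteration.
    total = 0
    for i in range(1, n * n + 1):
        for j in range(1, n + 1):
            total += 1
    return total

print(count_ops(4))  # 64 == 4^3
print(count_ops(5))  # 125 == 5^3
```

So the body executes exactly n^3 times, matching the O(N^3) claim.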

Algorithm:

```
Sum = 0;               ~ 1
for i = 1 to N^2 do:   ~ 1 + 2N^2
    for j = 1 to N do: ~ (1 + 2N) * N^2
        Sum += 1;      ~ 1 * N * N^2
```

Time Complexity:

```
Time = 1 + (1 + 2N^2) + (1 + 2N)*N^2 + 1*N*N^2
Time = 2 + 2N^2 + N^2 + 2N^3 + N^3
Time = 2 + 3N^2 + 3N^3
O(Time) = O(2 + 3N^2 + 3N^3) = O(N^3)
```
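The closed form above can be checked empirically. Assuming the same accounting (one unit for each assignment and for each loop test/increment pair per iteration, which is how the per-line costs were derived), a small sketch reproduces 2 + 3N^2 + 3N^3 exactly:

```python
def simulated_cost(n):
    # Count elementary operations using the same accounting as the analysis:
    # 1 unit per assignment, 2 units (test + increment) per loop iteration.
    ops = 0
    ops += 1                  # Sum = 0
    ops += 1                  # i = 1 (outer loop initialization)
    for i in range(1, n * n + 1):
        ops += 2              # outer loop test + increment of i
        ops += 1              # j = 1 (inner loop initialization)
        for j in range(1, n + 1):
            ops += 2          # inner loop test + increment of j
            ops += 1          # Sum += 1
    return ops

def closed_form(n):
    return 2 + 3 * n**2 + 3 * n**3

print(simulated_cost(3), closed_form(3))  # 110 110
```

The two agree for every N, and the N^3 term dominates, confirming O(N^3).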
