What would the Big O notation be for the following nested loops?

```
for (int i = n; i > 0; i = i / 2) {
    for (int j = n; j > 0; j = j / 2) {
        for (int k = n; k > 0; k = k / 2) {
            count++;
        }
    }
}
```

My thoughts are: each loop is `O(log2(n))`,

so is it as simple as multiplying them?

```
O(log2(n)) * O(log2(n)) * O(log2(n)) = O((log2(n))^3)
```

Yes, that is correct.

One way to figure out the big-O complexity of nested loops whose bounds do not depend on one another is to work from the inside out. The innermost loop does O(log n) work. The middle loop runs O(log n) times and does O(log n) work per iteration, so it does O(log^2 n) work. Finally, the outermost loop runs O(log n) times and does O(log^2 n) work on each iteration, so the total work done is O(log^3 n).

Hope this helps!

Yes, you are right.

An easy way to calculate it:

```
for (int i = 0; i < n; i++) {     // n times
    for (int j = 0; j < n; j++) { // n times
    }
}
```

This is a simple nested loop. Here the Big-O of each loop is O(n), and because they are nested the total is O(n * n), which is O(n^2). And in your case:

```
for (int i = n; i > 0; i = i / 2){ // log(n)
for (int j = n; j > 0; j = j / 2){ // log(n)
for (int k = n; k > 0; k = k / 2){ // log(n)
count++;
}
}
}
```

These are nested loops where each loop's Big-O is `O(log(n))`, so altogether the complexity is `O((log(n))^3)`.

Indeed, your assumption is correct. You can show it methodically: each loop variable takes the values n, n/2, n/4, ..., 1, which is floor(log2(n)) + 1 distinct values, so `count` is incremented exactly (floor(log2(n)) + 1)^3 times, which is O((log n)^3).
