Prove that

```
1 + 1/2 + 1/3 + ... + 1/n is O(log n).
Assume n = 2^k
```

I put the series into summation notation, but I have no idea how to tackle this problem. Any help is appreciated.

This follows easily from a simple fact in calculus: since 1/x is positive and decreasing, the partial sum of the harmonic series is bounded above and below by the corresponding integrals of 1/x.

Concretely, we have the following inequality:

ln(n + 1) ≤ 1 + 1/2 + ... + 1/n ≤ 1 + ln(n)

From this we can conclude that S = 1 + 1/2 + ... + 1/n is both Ω(log(n)) and O(log(n)), thus it is Θ(log(n)); the bound is actually tight.
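The bound can also be checked numerically. The following Python sketch (an illustration only, not part of the formal proof; the function name is mine) compares the partial sums against both sides of the inequality:

```python
import math

def harmonic(n):
    """Partial sum H_n = 1 + 1/2 + ... + 1/n."""
    return sum(1.0 / k for k in range(1, n + 1))

for n in (10, 1_000, 100_000):
    h = harmonic(n)
    # ln(n + 1) <= H_n <= 1 + ln(n) holds for every n >= 1.
    assert math.log(n + 1) <= h <= 1 + math.log(n)
    print(f"n = {n}: H_n = {h:.4f}, ln(n) = {math.log(n):.4f}")
```

The gap between H_n and ln(n) stays below 1 no matter how large n gets, which is exactly what a tight Θ(log n) bound predicts.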

**Question**

1 + 1/2 + **1/3** + ... + 1/n

Assume n = 2^k

If the `3rd` term is `1/3` and **not** `1/4` (`1/2^2`), then there are n terms in all, so it will be **O(n)**. It is true that their **sum** may be **log n**, but the complexity is the number of iterations, not the value of the sum. Consider solving it programmatically: you would end up with a loop running from 1 to n, adding 1/i to the sum each time. How many times did it run? `1 to n` times, so **O(n)**.

If the problem were changed to

`1 + 1/2 + 1/4 + ... + 1/n`

the series can *now* be written as 1/2^0 + 1/2^1 + 1/2^2 + ... + 1/2^k, and the hint provided by your teacher also makes sense. How many times will the loop run? `0 to k`, that is `k + 1` times, and from both forms of the series we can see that `2^k = n`, hence `k = log(n)`. So the number of times it ran is `log(n) + 1 = O(log n)`.

I guess you or your teacher made a typo.
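The geometric version of the loop can be sketched in Python to make the iteration count concrete (my own illustration; the function name and the power-of-two assumption are mine, not from the answer above):

```python
import math

def geometric_series_iterations(n):
    """Sum 1 + 1/2 + 1/4 + ... + 1/n by repeated halving.

    Assumes n is a power of two (n = 2^k, as in the hint).
    Returns (sum, number_of_loop_iterations).
    """
    total = 0.0
    term = 1.0
    iterations = 0
    while term >= 1.0 / n:   # terms 1/2^0 through 1/2^k
        total += term
        term /= 2.0
        iterations += 1
    return total, iterations

s, count = geometric_series_iterations(1024)   # n = 2^10, so k = 10
assert count == int(math.log2(1024)) + 1       # loop runs k + 1 = 11 times
```

The loop body executes once per term, and there are only k + 1 = log2(n) + 1 terms, which is where the O(log n) iteration count comes from.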
