Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share
big o - Asymptotic Complexity of Logarithms and Powers

So, clearly, log(n) is O(n). But what about (log(n))^2? What about sqrt(n) versus log(n): which bounds which?

There's a family of comparisons like this:
n^a versus (log(n))^b

I run into comparisons like this a lot, and I've never found a good way to solve them. Any hints or tactics for solving the general case?

Thanks,
Ian

EDIT: I'm not talking about the computational complexity of calculating the values of these functions. I'm talking about the growth of the functions themselves. E.g., f(n) = n is an upper bound on g(n) = log(n) because g(n) <= c*f(n) for c = 1 and all n >= 1.
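The witness in the definition above (c = 1, n0 = 1) can be sanity-checked numerically. This is a quick empirical check, not a proof:

```python
import math

# Sanity check (not a proof): verify the big-O witness
# g(n) = log(n) <= c * f(n) = c * n, with c = 1, for all sampled n >= 1.
c = 1
for n in range(1, 10_000):
    assert math.log(n) <= c * n
print("log(n) <= n holds for all sampled n in [1, 10000)")
```

Of course, a finite check only illustrates the claim; the inequality log(n) <= n for n >= 1 is what the proof must establish.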


1 Answer


log(n)^a is always O(n^b), for any positive constants a, b.

Are you looking for a proof? All such problems can be reduced to seeing that log(n) is O(n), by the following trick:

log(n)^a = O(n^b) is equivalent to log(n) = O(n^(b/a)), since raising to the 1/a power is an increasing function. Substituting m = n^(b/a) (so n = m^(a/b)), this is equivalent to log(m^(a/b)) = O(m). And since log(m^(a/b)) = (a/b)*log(m), this in turn is equivalent to log(m) = O(m).
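The reduction above can be illustrated numerically for one sample pair of exponents. Here a = 2 and b = 0.5 are my own choices for illustration; the ratio log(n)^a / n^b shrinks toward 0 as n grows, which is what log(n)^a = O(n^b) predicts even for this "slow" power:

```python
import math

# Illustration (not a proof): for sample exponents a = 2, b = 0.5,
# the ratio log(n)^a / n^b decreases toward 0 as n grows,
# i.e. (log n)^2 is eventually dominated by sqrt(n).
a, b = 2, 0.5
ratios = [math.log(n) ** a / n ** b for n in (10**6, 10**9, 10**12, 10**15)]
print(ratios)
# The ratios are strictly decreasing over these sample points.
assert all(x > y for x, y in zip(ratios, ratios[1:]))
```

For small n the ratio can exceed 1 (sqrt(n) starts out below (log n)^2); big-O only requires domination beyond some n0.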

You can prove that log(n) = O(n) by induction, focusing on the case where n is a power of 2: for n = 2^k we have log2(n) = k, and k <= 2^k follows by a simple induction on k.
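The power-of-2 case is easy to check mechanically. This sketch verifies the base facts the induction rests on (log2(2^k) = k and k <= 2^k) over a range of k:

```python
import math

# Check (not the induction itself): for n = 2^k,
# log2(n) = k exactly, and k <= 2^k, hence log2(n) <= n.
for k in range(60):
    n = 2 ** k
    assert math.log2(n) == k   # exact for powers of 2 in IEEE doubles
    assert k <= n
print("log2(n) <= n verified for n = 2^0 .. 2^59")
```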

