My question was: Nesterov accelerated gradient descent is described as having an iteration complexity of O(\frac{1}{\sqrt{\varepsilon}}) and a convergence rate of O(1/t^2). Are these the same?

A) Yes; they are two ways of stating the same guarantee, depending on which quantity is held fixed (see the worked inversion after the list below).

  1. Error after k iterations (convergence rate): O(\rho^k) with \rho < 1 (linear), O(1/k^2), O(1/k), O(1/\sqrt{k})
  2. Iterations needed to reach error \varepsilon (iteration complexity): O(log(1/\varepsilon)), O(1/\sqrt{\varepsilon}), O(1/\varepsilon), O(1/\varepsilon^2)
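To make the correspondence concrete, here is the inversion written out as a sketch; C and \rho stand for problem-dependent constants that are not specified in the post. Requiring the error bound after k iterations to be at most \varepsilon and solving for k turns each convergence rate into the matching iteration complexity:

```latex
% If the error after k iterations is bounded by the rate on the left, then
% requiring that bound to be at most eps gives the iteration count on the right.
\begin{align*}
\frac{C}{k^2} \le \varepsilon &\iff k \ge \sqrt{C/\varepsilon}
  && \text{accelerated: } O(1/\sqrt{\varepsilon}) \text{ iterations} \\
\frac{C}{k} \le \varepsilon &\iff k \ge C/\varepsilon
  && \text{gradient descent: } O(1/\varepsilon) \text{ iterations} \\
C\rho^{k} \le \varepsilon \ (\rho < 1) &\iff k \ge \frac{\log(C/\varepsilon)}{\log(1/\rho)}
  && \text{linear convergence: } O(\log(1/\varepsilon)) \text{ iterations}
\end{align*}
```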

The lists above run from fastest to slowest, and the n-th entry of one corresponds to the n-th entry of the other. Note that O(log(1/\varepsilon)) belongs to the iteration-complexity side; its convergence-rate counterpart is linear (geometric) convergence, i.e. an error of O(\rho^k) with \rho < 1 after k iterations, not O(log(1/k)).
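As a further illustration, below is a minimal numerical sketch; the quadratic test problem, its dimension, the step size 1/L, the standard (k-1)/(k+2) momentum schedule, the iteration budget, and the accuracy targets are all my own choices rather than anything from the post. It runs Nesterov accelerated gradient descent once and then reads the same run both ways: the error reached after a given number of iterations, and the first iteration at which a given accuracy \varepsilon is reached.

```python
import numpy as np

# Sketch only: run Nesterov accelerated gradient descent on a smooth convex
# quadratic and report the same run from both viewpoints.
rng = np.random.default_rng(0)
n = 100
M = rng.standard_normal((n, n))
A = M @ M.T / n + np.eye(n)            # symmetric positive definite
b = rng.standard_normal(n)
L = float(np.linalg.eigvalsh(A)[-1])   # smoothness constant = largest eigenvalue

f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
f_star = f(np.linalg.solve(A, b))      # optimal value, via the exact minimizer

# Nesterov accelerated gradient descent:
#   x_k = y_{k-1} - (1/L) grad f(y_{k-1}),   y_k = x_k + (k-1)/(k+2) (x_k - x_{k-1})
x_prev = np.zeros(n)
y = x_prev.copy()
errors = []
K = 50_000
for k in range(1, K + 1):
    x = y - grad(y) / L
    y = x + (k - 1) / (k + 2) * (x - x_prev)
    x_prev = x
    errors.append(f(x) - f_star)
errors = np.array(errors)

# Convergence-rate view: error as a function of the iteration count k.
for k in (10, 100, 1000):
    print(f"k = {k:5d}: f(x_k) - f* = {errors[k - 1]:.3e}")

# Iteration-complexity view: first k whose error is below a target accuracy eps
# (None if the target is not reached within K iterations).
for eps in (1e-2, 1e-4, 1e-6):
    hit = np.flatnonzero(errors <= eps)
    k_eps = hit[0] + 1 if hit.size else None
    print(f"eps = {eps:.0e}: first k with error <= eps: {k_eps}")
```

On a well-conditioned quadratic like this one the error typically falls faster than the worst-case O(1/k^2) bound, so the printed iteration counts should be read as an illustration of the two viewpoints rather than a measurement of the exact 1/\sqrt{\varepsilon} scaling.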

Ref: convergence rate and its comparison with iteration complexity (below).