Analysis of the (lazy) Dijkstra algorithm's running time


I'm trying to figure out the running time of the lazy implementation of Dijkstra's algorithm. Every source I have read says it is O(E * log(E)).

But when we do the math on the loop below, we get O(E * (log(E) + E * log(E))): the outer loop runs O(E) times, and each iteration does one O(log(E)) poll plus an inner loop that we counted as O(E) iterations at O(log(E)) each.
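Expanding that product (just algebra on the expression above) makes the concern concrete:

$$O\bigl(E \cdot (\log E + E \cdot \log E)\bigr) = O\bigl(E \log E + E^2 \log E\bigr) = O\bigl(E^2 \log E\bigr)$$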

Since E isn't a constant, I don't see how this could be reduced to O(E * log(E)).

Are we analyzing this wrong, or is it actually possible to reduce it?

    while (!minPQ.isEmpty()) {                        // <=== O(E)
        Node min = minPQ.poll();                      // <=== O(log(E))
        for (Edge edge : graph.adj(min)) {            // <=== O(E)
            if (min.getId() == target.getId()) {
                // Source and Target = Same edge
                if (edgeTo.size() == 0) edgeTo.put(target, edge);
                return;
            }
            relax(edge, min, vehicle);                // <=== O(log(E)) (because of add method on PQ)
        }
    }
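
For comparison, here is a minimal, self-contained lazy Dijkstra sketch. The Edge/Entry types and shortestPaths method are hypothetical stand-ins of my own, not the Node/relax/vehicle API above; it just makes the loop structure concrete. Note that each successful relaxation adds exactly one entry to the queue, so the queue never holds more than E + 1 entries:

    import java.util.*;

    public class LazyDijkstra {

        // A directed, weighted edge: from -> to with a non-negative weight.
        record Edge(int from, int to, double weight) {}

        // A PQ entry pairs a vertex with the tentative distance it was inserted at.
        record Entry(int vertex, double dist) {}

        /** Shortest distances from source; unreachable vertices stay at +infinity. */
        static double[] shortestPaths(List<List<Edge>> adj, int source) {
            int n = adj.size();
            double[] distTo = new double[n];
            Arrays.fill(distTo, Double.POSITIVE_INFINITY);
            distTo[source] = 0.0;

            PriorityQueue<Entry> pq =
                new PriorityQueue<>(Comparator.comparingDouble(Entry::dist));
            pq.offer(new Entry(source, 0.0));

            while (!pq.isEmpty()) {                   // runs once per PQ entry: at most E + 1 times
                Entry e = pq.poll();                  // O(log(E)) per poll
                if (e.dist() > distTo[e.vertex()]) continue; // stale ("lazy") entry, skip it

                for (Edge edge : adj.get(e.vertex())) {      // each edge is scanned at most once overall
                    double cand = distTo[e.vertex()] + edge.weight();
                    if (cand < distTo[edge.to()]) {
                        distTo[edge.to()] = cand;            // relax the edge
                        pq.offer(new Entry(edge.to(), cand)); // O(log(E)) per offer
                    }
                }
            }
            return distTo;
        }

        public static void main(String[] args) {
            // Tiny example: 0 -> 1 (1.0), 0 -> 2 (4.0), 1 -> 2 (2.0)
            List<List<Edge>> adj = new ArrayList<>();
            for (int v = 0; v < 3; v++) adj.add(new ArrayList<>());
            adj.get(0).add(new Edge(0, 1, 1.0));
            adj.get(0).add(new Edge(0, 2, 4.0));
            adj.get(1).add(new Edge(1, 2, 2.0));
            System.out.println(Arrays.toString(shortestPaths(adj, 0))); // [0.0, 1.0, 3.0]
        }
    }

The stale-entry check (e.dist() > distTo[e.vertex()]) is what makes this "lazy": instead of a decrease-key operation, outdated entries are left in the queue and skipped when polled.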