Synchronization and memory-access costs are becoming major bottlenecks on today's architectures. However, standard algorithm complexity analysis treats these operations as constant-time, i.e., O(1). What is your opinion on this? Are these still reasonable assumptions on today's hardware? Which complexity models assign higher (non-constant) costs to synchronization and memory operations?
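
To make the memory side of this concrete, here is a rough C sketch of my own (the array size, the tiny PRNG, and the POSIX `clock_gettime` timer are arbitrary choices, not from any particular source) that follows a chain of N links twice: once laid out sequentially, once as a random cycle. Under the unit-cost RAM model both traversals are the same O(n) work, yet on typical hardware the random one is usually several times slower because nearly every load misses the cache.

```c
/* Pointer-chase microbenchmark: same number of loads, different locality. */
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>
#include <time.h>

#define N (1u << 24)                         /* ~16M nodes; an arbitrary size */

static uint64_t rng = 88172645463325252ULL;
static uint64_t xorshift64(void) {           /* tiny PRNG, avoids RAND_MAX limits */
    rng ^= rng << 13; rng ^= rng >> 7; rng ^= rng << 17;
    return rng;
}

static double now(void) {                    /* POSIX monotonic clock, in seconds */
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return ts.tv_sec + ts.tv_nsec * 1e-9;
}

static void chase(const char *label, const uint32_t *next) {
    double t0 = now();
    uint32_t idx = 0;
    for (uint32_t i = 0; i < N; i++) idx = next[idx];   /* N dependent loads */
    /* Printing idx keeps the compiler from discarding the loop. */
    printf("%s: %.3f s (last index %u)\n", label, now() - t0, idx);
}

int main(void) {
    uint32_t *next = malloc(N * sizeof *next);
    if (!next) return 1;

    /* Sequential cycle: 0 -> 1 -> 2 -> ... (cache- and prefetch-friendly). */
    for (uint32_t i = 0; i < N; i++) next[i] = (i + 1) % N;
    chase("sequential chase", next);

    /* Random single cycle via Sattolo's algorithm: most loads become cache misses. */
    for (uint32_t i = 0; i < N; i++) next[i] = i;
    for (uint32_t i = N - 1; i > 0; i--) {
        uint32_t j = (uint32_t)(xorshift64() % i);
        uint32_t tmp = next[i]; next[i] = next[j]; next[j] = tmp;
    }
    chase("random chase    ", next);

    free(next);
    return 0;
}
```

Compiled with something like `gcc -O2`, the gap between the two timings is exactly the cost that the O(1)-memory assumption hides; a similar experiment with contended atomic increments would show the synchronization side of the same question.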