### Snow's blog

By Snow, history, 7 weeks ago,

I have been solving the Stone Age Problem and something caught my attention: what is the real complexity of map.clear() and unordered_map.clear()?

The same code passes with map.clear() but TLEs with unordered_map.clear(). However, cppreference says both are linear, so what is the real complexity?

UPD1: I have verified that it isn't related to collisions, because I tested constructing a new unordered_map instead of clearing it, and it passes. AC

• +3

 » 7 weeks ago, # |   0 I think this https://codeforces.cc/blog/entry/62393 is related.
 » 7 weeks ago, # |   0 I think it's not because of the clear() method, but since unordered_map is implemented as a hash table, there might be cases that maximize collisions; that's why you got TLE.
•  » » 7 weeks ago, # ^ |   0 Not sure it is related to collisions; I tested using the custom_hash suggested in that blog and it doesn't work either. PS: the keys are in [1, N] with N <= 2e5, which doesn't look likely to cause many collisions.
 » 7 weeks ago, # |   +1 IIRC, unordered_map::clear() works in O(number of elements + number of buckets) in the implementation I checked. So if you have a lot of buckets (due to reserving, or because you had a lot of elements at some point), it will take long.
•  » » 7 weeks ago, # ^ |   0 What are buckets?
•  » » » 7 weeks ago, # ^ |   +4 Spoiler
 » 7 weeks ago, # | ← Rev. 2 →   0 I noticed the same thing on the same problem. It seems that in competitive programming, map is often the safer choice over unordered_map. According to this discussion on Stack Overflow, unordered_map.clear() may do quite a lot of work across all "buckets". map also offers a stable O(log N) complexity, so you never need to worry about hash collisions (as you proved, this problem is not about collisions, but other problems might be). Or you can always do m = {}; instead of m.clear() (though that still does not protect you from collisions).