I think the problem with size is that there's no __sizeof__ method defined in the Python 2.x implementation of OrderedDict, so it simply falls back to dict's __sizeof__ method. And dict's __sizeof__ knows nothing about the extra bookkeeping OrderedDict keeps as ordinary instance attributes (the self.__map dictionary and the doubly linked list of keys), so that memory is never counted.
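A quick way to see this under Python 2 (a rough sketch; the exact numbers depend on your build, and the contents of the dicts are arbitrary, chosen just for illustration):

import sys
from collections import OrderedDict

d = dict((i, i) for i in range(1000))
od = OrderedDict((i, i) for i in range(1000))

# Both calls end up in dict.__sizeof__, so the linked list and the
# self.__map dict inside the OrderedDict are never counted.
print sys.getsizeof(d), sys.getsizeof(od)

The two numbers should come out nearly identical, even though the OrderedDict is actually holding much more memory through its attributes.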
To demonstrate this, I've created a class A which extends list, and also added an extra method foo to check whether that affects the reported size.
class A(list):
    def __getitem__(self, k):
        # same result as list's __getitem__, but dispatched through Python code
        return list.__getitem__(self, k)
    def foo(self):
        # extra Python-level method; it lives on the class, not the instance
        print 'abcde'
>>> a = A(range(1000))
>>> b = list(range(1000))
Yet sys.getsizeof still reports the same size for both:
>>> sys.getsizeof(a), sys.getsizeof(b)
(9120, 9120)
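That's because sys.getsizeof only measures the object itself, never the objects it references, so nothing stored on the instance shows up in the number. A small sketch of that (payload is just a made-up attribute name for illustration):

>>> a.payload = list(range(1000))   # hypothetical attribute; lives in a.__dict__
>>> sys.getsizeof(a)                # unchanged: the attribute isn't counted
9120

This is exactly why Python 2's OrderedDict reports like a plain dict: all of its ordering machinery hangs off instance attributes that getsizeof never follows.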
Of course A is going to be slower, because its __getitem__ runs in Python while list's runs in pure C:
>>> %%timeit
... for _ in xrange(1000):
...     a[_]
...
1000 loops, best of 3: 449 μs per loop
>>> %%timeit
... for _ in xrange(1000):
...     b[_]
...
10000 loops, best of 3: 52 μs per loop
And this seems to be fixed in Python 3, where there's now a well-defined __sizeof__ method:
def __sizeof__(self):
sizeof = _sys.getsizeof
n = len(self) + 1 # number of links including root
size = sizeof(self.__dict__) # instance dictionary
size += sizeof(self.__map) * 2 # internal dict and inherited dict
size += sizeof(self.__hardroot) * n # link objects
size += sizeof(self.__root) * n # proxy objects
return size
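With that in place, running the same dict vs OrderedDict comparison under Python 3 should report a noticeably larger size for the OrderedDict, since the instance dictionary, the internal map, and the link objects are all counted now (again a sketch; the exact figures vary across Python versions):

import sys
from collections import OrderedDict

d = {i: i for i in range(1000)}
od = OrderedDict((i, i) for i in range(1000))

# OrderedDict.__sizeof__ now adds the map and per-key link objects,
# so od should come out considerably larger than the plain dict.
print(sys.getsizeof(d), sys.getsizeof(od))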