cacheprovider: set: use json.dumps + write

``json.dump`` is slower because it iterates over the encoder's chunks and issues many small ``write`` calls [1].

For 100 ``cache.set`` calls this saved ~0.5 s (2.5 s => 2.0 s), using a dict
with 1500 entries and an encoded size of ~500 KB.

Python 3.7.4.

1: https://github.com/blueyed/cpython/blob/1c2e81ed00/Lib/json/__init__.py#L177-L180
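The claim above can be sanity-checked with a rough micro-benchmark. The payload below is a stand-in guessed from the commit message (a dict with 1500 entries, roughly 500 KB encoded), not the actual cache data, and absolute timings will differ by machine and Python version:

```python
import json
import os
import tempfile
import time

# Hypothetical payload approximating the commit message:
# a dict with 1500 entries, ~500 KB once encoded.
value = {f"key-{i}": "x" * 300 for i in range(1500)}

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "cache.json")

    # Variant 1: json.dumps builds the whole string in one pass,
    # then a single f.write() flushes it out.
    start = time.perf_counter()
    for _ in range(100):
        data = json.dumps(value, indent=2, sort_keys=True)
        with open(path, "w") as f:
            f.write(data)
    dumps_time = time.perf_counter() - start

    # Variant 2: json.dump iterates over the encoder's chunks,
    # calling f.write() once per chunk.
    start = time.perf_counter()
    for _ in range(100):
        with open(path, "w") as f:
            json.dump(value, f, indent=2, sort_keys=True)
    dump_time = time.perf_counter() - start

print(f"dumps+write: {dumps_time:.2f}s  dump: {dump_time:.2f}s")
```

Both variants produce identical file contents; only the number of underlying ``write`` calls differs.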
Daniel Hahler 2019-11-16 17:17:57 +01:00
parent c49c61fdaf
commit 786d839db1
2 changed files with 3 additions and 1 deletion


@@ -0,0 +1 @@
+cacheprovider: improved robustness and performance with ``cache.set``.


@@ -125,13 +125,14 @@ class Cache:
                 return
             if not cache_dir_exists_already:
                 self._ensure_supporting_files()
+            data = json.dumps(value, indent=2, sort_keys=True)
             try:
                 f = path.open("w")
             except (IOError, OSError):
                 self.warn("cache could not write path {path}", path=path)
             else:
                 with f:
-                    json.dump(value, f, indent=2, sort_keys=True)
+                    f.write(data)

     def _ensure_supporting_files(self):
         """Create supporting files in the cache dir that are not really part of the cache."""