More efficient cache system #230
Are you testing on a VPS? Please consider that you're sharing I/O throughput with many others there. Please also consider that your test case is not suitable for evaluating cache efficiency: 25 sec vs. 30 sec is not very significant. But you are right, as my test case will show. My results with the following commands on a CentOS x64 VPS with 2×1.5 GHz:
Memory load on rendering: 814 megabytes. With caching: Without caching: You can see that caching is only effective on very large maps, and only with a very good hard drive and very good disk I/O throughput. Your first and second ideas are the way to improve this, I think. 120,000 files (especially without an SSD) simply can't be fast.
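The small-file overhead can be illustrated with a minimal sketch: writing every tile as its own file costs a metadata update per tile, whereas packing all tiles into one file with an in-memory index turns the same payload into a single sequential write. This is an illustrative micro-benchmark, not the project's actual cache layout; all names are invented.

```python
# Hypothetical comparison: N tiny files vs. one packed file with an index.
import os
import tempfile
import time

def write_many_files(directory, tiles):
    # One file per tile: each write also costs directory/inode metadata I/O.
    for name, data in tiles:
        with open(os.path.join(directory, name), "wb") as f:
            f.write(data)

def write_packed(path, tiles):
    # All tiles appended to a single file; (offset, length) kept in an index.
    index = {}
    with open(path, "wb") as f:
        for name, data in tiles:
            index[name] = (f.tell(), len(data))
            f.write(data)
    return index

tiles = [("tile_%d" % i, b"x" * 512) for i in range(1000)]
with tempfile.TemporaryDirectory() as d:
    t0 = time.perf_counter()
    write_many_files(d, tiles)
    many = time.perf_counter() - t0

    t0 = time.perf_counter()
    index = write_packed(os.path.join(d, "packed.bin"), tiles)
    packed = time.perf_counter() - t0

print(len(index))  # 1000 tiles recorded in the packed index
```

On a mechanical disk the gap between the two strategies grows with the number of tiles, which is why 120,000 individual cache files are hard to make fast.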
I have tested this on my local system. The attributes of the directory were modified with "chattr +A -j /tmp/no-journal" to increase performance. The disk was an ordinary mechanical hard drive.
On testing this again, I'm noticing that using the --cache-key option is much slower even if the cache already exists. I think the main problem could really be the massive number of small files being created, causing high I/O overhead.
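One way to make a cache reusable across runs with the same options is to derive the key from the full option set rather than requiring the user to pass it manually. This is only a hedged sketch of the idea; the option names and the hashing scheme are invented, not what --cache-key actually does.

```python
# Hypothetical sketch: derive a stable cache key from the render options,
# so two runs with identical options land in the same cache directory.
import hashlib
import json

def cache_key(options):
    # Serialize deterministically (sorted keys), then hash.
    canonical = json.dumps(options, sort_keys=True)
    return hashlib.sha1(canonical.encode("utf-8")).hexdigest()[:16]

# Key is independent of dict insertion order (illustrative options only).
opts_a = {"zoom": 2, "night": False, "region": "all"}
opts_b = {"night": False, "zoom": 2, "region": "all"}
print(cache_key(opts_a) == cache_key(opts_b))  # True
```

With a scheme like this, any option change automatically invalidates the old cache instead of silently mixing incompatible tiles.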
I have made a few tests with the cache system:
Without the cache, 30 seconds were needed.
There was no cache, so the first run needed 92 seconds.
With the cache (cache_hits: 24074/24074), only 25 seconds were needed.
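The "14 times" figure below follows directly from these timings: the first cached run costs 62 s more than an uncached run, and every later cached run saves only 5 s. A quick break-even calculation using the numbers above:

```python
# Break-even point for the cache, using the timings reported above (seconds).
uncached = 30          # plain run, no cache
first_cached_run = 92  # first run, cache being built
cached = 25            # later runs with a warm cache

extra_first_cost = first_cached_run - uncached  # 62 s invested up front
saving_per_run = uncached - cached              # 5 s saved per warm run

# 1 expensive first run + enough warm runs to pay back the investment.
runs_to_break_even = 1 + extra_first_cost / saving_per_run
print(runs_to_break_even)  # 13.4 -> about 14 runs before the cache pays off
```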
The cache can only be used with the same options because of the current behaviour of --cache-key. A user needs to look at the worldmap 14 times with the same options before gaining any benefit from the cache. There are 3 things that could be enhanced: