I was trying to explore how a JavaScript `Object` performs in comparison to a `Map` or a `Set` for normal key accesses. I ran the three snippets below on JSBEN.CH.
Objects
```js
const object = {};
for (let i = 0; i < 10000; ++i) {
object[`key_${i}`] = 1;
}
let result = 0;
for (let i = 0; i < 10000; ++i) {
result += object[`key_${i}`];
}
```
Maps
```js
const map = new Map();
for (let i = 0; i < 10000; ++i) {
map.set(`key_${i}`, 1);
}
let result = 0;
for (let i = 0; i < 10000; ++i) {
result += map.get(`key_${i}`);
}
```
Sets
```js
const set = new Set();
for (let i = 0; i < 10000; ++i) {
set.add(`key_${i}`);
}
let result = 0;
for (let i = 0; i < 10000; ++i) {
result += set.has(`key_${i}`);
}
```
As you can check in the test link, `Map` and `Set` seem to perform almost the same, while the `Object` is much slower every time. Can someone explain why an `Object` performs worse than a `Map` or a `Set` for basic key access operations?
Edit 1: Just setting keys on an `Object` is also slower than on a `Map`/`Set`.
1 Answer
Looking at relative numbers only is always dangerous, so here are some absolute numbers, run on Node.js v14.14.0 on an Intel 8350U:
| Iterations | Object write | Object read | Map write | Map read |
|---|---|---|---|---|
| 100 | 0ms | 0ms | 0ms | 0ms |
| 1,000 | 3ms | 1ms | 0ms | 0ms |
| 10,000 | 7ms | 4ms | 8ms | 1ms |
| 1,000,000 | 1222ms | 527ms | 632ms | 542ms |
As one can see, for 10,000 iterations the difference between objects and maps is 1 millisecond in the run above, and since that is the accuracy of the time measurement, we can't really draw any conclusion from that test; the results are essentially random.
For 1 million iterations one can see a clear advantage of Map writes over Object writes, while the read performance is very similar. Looking at absolute numbers, even the slow case still comes out to roughly one million writes per second, so although object writes are a lot slower, they are unlikely to be the bottleneck of your application.
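Note that `Date.now()` only has millisecond resolution, which is part of why the shorter runs above are so noisy. As a minimal sketch (my addition, not part of the original benchmark), Node's `process.hrtime.bigint()` gives nanosecond-resolution timestamps:
```js
// Time the 10,000-key object writes with nanosecond resolution.
function timeObjectWrites(times) {
  const object = {};
  const start = process.hrtime.bigint(); // nanoseconds as a BigInt
  for (let i = 0; i < times; ++i) {
    object[`key_${i}`] = 1;
  }
  const elapsedNs = process.hrtime.bigint() - start;
  console.log(`Object write took ${Number(elapsedNs) / 1e6} ms`);
}

timeObjectWrites(10_000);
```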
For an accurate explanation, one would have to analyze all the steps the engine performs. For that you can run `node --print-code` and analyze the bytecode that gets run. I don't have the time for that, though here are some observations:
- If the object gets constructed with `Object.create(null)` (having no prototype), the performance is about the same, so prototype lookup does not influence performance at all.
- After the 20th iteration, V8 chooses the internal representation `dictionary_map` for `object`, so this is basically one hash map competing with another hash map. One can run `node --allow-natives-syntax` and then use `%DebugPrint(object)` to get the internal representation (see the sketch after this list).
- For objects with more than 2 ** 23 keys, write performance degrades even more, see "Performance degrade on JSObject after 2^23 items" (though maps also can't be much larger, see "Maximum number of entries in Node.js Map?").
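To illustrate the second point, here is a small sketch of how one might inspect the internal representation (my example; `inspect.js` is just a placeholder file name, and the output of `%DebugPrint` is V8-internal, so its format can change between versions):
```js
// Run with: node --allow-natives-syntax inspect.js
const object = {};
for (let i = 0; i < 100; ++i) {
  object[`key_${i}`] = 1;
}
// Prints V8's internal details for the object; after enough dynamically added
// keys the output contains "dictionary_map", i.e. the object has been switched
// to dictionary (hash table) mode.
%DebugPrint(object);
```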
For reference, here is the code used to run the benchmark:
```js
function benchmark(TIMES) {
console.log("BENCHMARK ", TIMES);
const object = Object.create(null);
let start = Date.now();
for (let i = 0; i < TIMES; ++i) {
object[`key_${i}`] = 1;
}
console.log("Object write took", Date.now() - start);
start = Date.now();
let result = 0;
for (let i = 0; i < TIMES; ++i) {
result += object[`key_${i}`];
}
console.log("Object read took", Date.now() - start);
start = Date.now();
const map = new Map();
for (let i = 0; i < TIMES; ++i) {
map.set(`key_${i}`, 1);
}
console.log("Map write took", Date.now() - start);
start = Date.now();
result = 0;
for (let i = 0; i < TIMES; ++i) {
result += map.get(`key_${i}`);
}
console.log("Map read took", Date.now() - start);
}
benchmark(100);
benchmark(1_000);
benchmark(10_000);
benchmark(1_000_000);
```
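The question also measures `Set`; the harness above can be extended with an analogous block (my addition, not part of the original benchmark; note that `set.has()` returns a boolean, which `+=` coerces to 0 or 1):
```js
// Hypothetical extension: measure Set writes and membership checks.
function benchmarkSet(TIMES) {
  const set = new Set();
  let start = Date.now();
  for (let i = 0; i < TIMES; ++i) {
    set.add(`key_${i}`);
  }
  console.log("Set write took", Date.now() - start);

  start = Date.now();
  let result = 0;
  for (let i = 0; i < TIMES; ++i) {
    result += set.has(`key_${i}`); // true coerces to 1
  }
  console.log("Set read took", Date.now() - start);
}

benchmarkSet(1_000_000);
```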
To sum up:
- Use Maps for dictionaries with lots of different, changing keys, as they are slightly faster than objects that are internally represented as hash tables.
- Use Objects for, well, objects. If you have a low number of keys and access them frequently, Objects are way faster, as the engine can use inline caching, hidden classes with a fixed memory layout, etc. (see the sketch below).
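To make that second point concrete, here is a minimal sketch (my example, not from the original answer) of the "few fixed keys, accessed often" pattern where hidden classes and inline caching pay off, compared with the same lookups through a `Map`:
```js
// Fixed-shape object: every access in the hot loop hits the same hidden class.
const point = { x: 1, y: 2, z: 3 };
const pointMap = new Map([["x", 1], ["y", 2], ["z", 3]]);

let sum = 0;
let start = Date.now();
for (let i = 0; i < 10_000_000; ++i) {
  sum += point.x + point.y + point.z;
}
console.log("Object property access took", Date.now() - start, "ms");

sum = 0;
start = Date.now();
for (let i = 0; i < 10_000_000; ++i) {
  sum += pointMap.get("x") + pointMap.get("y") + pointMap.get("z");
}
console.log("Map lookup took", Date.now() - start, "ms");
```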
`set.add` only takes one argument. – Jonas Wilms Commented Apr 3, 2021 at 13:18
`Map` loops over the values in the order that they were added, whereas an object does not. See Map Description and Map vs Object. – Get Off My Lawn Commented Apr 3, 2021 at 13:56