Node.js GPU.js slower using GPU than using CPU
I have run a benchmark to compare CPU and GPU use in Node.js with GPU.js. The NVIDIA icon shows GPU activity during the first console timer, but it is slower than the CPU (second timer).
const {GPU} = require('gpu.js');
const gpu = new GPU();

// Kernel: each GPU thread computes one output cell (thread.x, thread.y)
const multiplyMatrix = gpu.createKernel(function(a, b) {
    let sum = 0;
    for (let i = 0; i < 512; i++) {
        sum += a[this.thread.y][i] * b[i][this.thread.x];
    }
    return sum;
}).setOutput([512, 512]);

// Build two 512x512 input matrices
var a = [];
var b = [];
for (var i = 0; i < 512; i++) {
    a.push([]);
    b.push([]);
    for (var j = 0; j < 512; j++) {
        a[i].push(1);
        b[i].push(-1);
    }
}

console.time("gpu");
const c = multiplyMatrix(a, b);
console.timeEnd("gpu"); // 2148ms

console.time("cpu");
var d = [];
for (var i = 0; i < 512; i++) {
    d.push([]);
    for (var j = 0; j < 512; j++) {
        let sum = 0;
        for (let k = 0; k < 512; k++) {
            sum += a[i][k] * b[k][j];
        }
        d[i].push(sum);
    }
}
console.timeEnd("cpu"); // 710ms
Am I doing something clearly wrong?
Asked Dec 19, 2020 at 13:38 by Eduardo Poço. Comments:
- No expert here, but from what I understand the GPU gives big gains when calculations are kept parallel. `sum +=` is not, so one idea is to use another array to store the loop results and compute the sum from that array. – Keith, Dec 19, 2020 at 13:47
- Wow! I took this example from the GPU.js site, gpu.rocks. I thought it would run in parallel for each pair (i, j). – Eduardo Poço, Dec 19, 2020 at 14:06
- You have a slow GPU or a fast CPU, or both. I have a 1070 and get ~80ms vs 350ms on the CPU. – Sergiu Paraschiv, Dec 19, 2020 at 14:17
- Not sure; if it is on gpu.js, then I would expect it knows how to optimise `+=` and `**`. Running your code on my machine I get 44ms and 239ms, so the GPU is running about 6 times faster. – Keith, Dec 19, 2020 at 14:52
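Keith's suggestion in the first comment can be sketched in plain JavaScript (this is an illustration of the idea, not GPU.js API): instead of accumulating sequentially with `sum +=`, compute the independent per-element products first, then reduce them. The map step is the part a GPU can run fully in parallel.

```javascript
// Sketch of the comment's idea: separate the parallelizable map step
// (independent products) from the sequential reduction (the sum).
function dot(rowA, colB) {
    const products = rowA.map((v, k) => v * colB[k]); // independent per-element work
    return products.reduce((acc, p) => acc + p, 0);   // reduction over the results
}

console.log(dot([1, 2, 3], [4, 5, 6])); // 1*4 + 2*5 + 3*6 = 32
```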
1 Answer
This isn't the way to benchmark CPU vs GPU.
The GPU has warm-up time (kernel compilation and data transfer), so if you really want to benchmark, compare both over 1000 executions, not a single execution.
The GPU won't always be faster: it depends on the task and on the GPU's RAM size.
And finally, as Keith mentioned in the comments, the GPU works better than the CPU on small parallel tasks in large batches.
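The warm-up advice can be sketched in plain Node.js (the `benchmark` helper and the matrix size here are my own, not from GPU.js): run the function a few times first to absorb one-time costs such as kernel compilation and JIT warm-up, then average many timed runs. The same harness can wrap the `multiplyMatrix` kernel from the question.

```javascript
// Hypothetical benchmark helper: discard warm-up runs, then report the
// mean wall-clock time per run over the timed executions.
function benchmark(fn, warmupRuns, timedRuns) {
    for (let i = 0; i < warmupRuns; i++) fn(); // warm-up, timings discarded
    const start = process.hrtime.bigint();
    for (let i = 0; i < timedRuns; i++) fn();
    const elapsedNs = process.hrtime.bigint() - start;
    return Number(elapsedNs) / 1e6 / timedRuns; // mean milliseconds per run
}

// Example workload: a small CPU matrix multiply (n kept small so the
// demo runs quickly; the question uses 512).
function cpuMultiply(a, b, n) {
    const d = [];
    for (let i = 0; i < n; i++) {
        d.push([]);
        for (let j = 0; j < n; j++) {
            let sum = 0;
            for (let k = 0; k < n; k++) sum += a[i][k] * b[k][j];
            d[i].push(sum);
        }
    }
    return d;
}

const n = 64;
const a = Array.from({length: n}, () => Array(n).fill(1));
const b = Array.from({length: n}, () => Array(n).fill(-1));
console.log(`cpu: ${benchmark(() => cpuMultiply(a, b, n), 5, 20).toFixed(3)} ms/run`);
```

With GPU.js you would pass `() => multiplyMatrix(a, b)` as the measured function, so the kernel's one-time compilation happens during the warm-up runs rather than inside the timed region.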