How to avoid SIGKILL when dealing with huge arrays in a C++ program
I have a C++ program that can potentially handle very big data sets. I'm getting a SIGKILL, and I'm trying to avoid that and handle the error correctly.
I'm fairly sure the SIGKILL happens when allocating certain arrays; I debugged the program with gdb and it dies at exactly that point.
The new expression is inside a try...catch statement, but no exception is thrown; the process is simply killed. I would like to handle the case where the requested data is too big in a graceful manner, but it's proving harder than expected.
The code I'm using is more or less like this:
int result = 0;
try
{
m_array = new double[sizeOfArray];
}
catch(const std::bad_alloc &e)
{
result = -1;
}
catch(const std::length_error &e)
{
result = -1;
}
return result;
If result != 0, I handle the situation, write the details to the logs, and so on. Why is no exception thrown, and why is SIGKILL delivered instead? Is there a way to avoid the SIGKILL?
The data size requested is absurdly large for my PC, but would not be unreasonable in higher-performance environments. I just need to handle the error without a crash. I'm running Rocky Linux.
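For reference, here is roughly what the allocation path boils down to as a standalone program. The function name allocateBuffer, the global m_array, and the size used in main are placeholders for illustration, not my actual code:

#include <cstddef>
#include <iostream>
#include <new>        // std::bad_alloc
#include <stdexcept>  // std::length_error

static double *m_array = nullptr;

// Same structure as the snippet above, wrapped in a function so it compiles on its own.
int allocateBuffer(std::size_t sizeOfArray)
{
    int result = 0;
    try
    {
        m_array = new double[sizeOfArray];
    }
    catch (const std::bad_alloc &e)
    {
        result = -1;
    }
    catch (const std::length_error &e)
    {
        result = -1;
    }
    return result;
}

int main()
{
    // Deliberately absurd request (about 8 TB worth of doubles) to exercise the failure path.
    const std::size_t hugeCount = 1000000000000ULL;
    if (allocateBuffer(hugeCount) != 0)
        std::cerr << "allocation failed, handled without crashing\n";
    else
        std::cerr << "allocation reported success\n";
    delete[] m_array;
    return 0;
}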