#include <iostream>
#include <vector>
using namespace std;

int main() {
    vector< vector<int> > dp(50000, vector<int>(4, -1));
    cout << dp.size();
}
This tiny program takes a split second to execute when simply run from the command line. But when run in a debugger, it takes over 8 seconds. Pausing the debugger reveals that it is in the middle of destroying all those vectors. WTF?
Note – Visual Studio 2008 SP1, Core 2 Duo 6700 CPU with 2GB of RAM.
Added: To clarify, no, I'm not confusing Debug and Release builds. These results are from one and the same .exe, without any recompiling in between. In fact, switching between Debug and Release builds changes nothing.
Running under the debugger switches the process to a debug heap that does a lot more checking on every allocation and free. A program that does nothing but memory allocation and de-allocation is going to suffer much more than a 'normal' program.
Edit: Having just tried running your program under VS, I get a call stack that looks like
which shows the debug functions in ntdll.dll and the debug C runtime being used.
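Since the extra checking comes from the NT debug heap in ntdll.dll, the commonly cited workaround is to set the `_NO_DEBUG_HEAP` environment variable before the process is launched under the debugger. A sketch for a Windows command prompt (assumes `devenv` is on the PATH; alternatively, add the variable under the project's Debugging > Environment setting so only the debuggee sees it):

```shell
:: Disable the NT debug heap for processes launched from this shell,
:: then start Visual Studio so debuggees inherit the variable.
set _NO_DEBUG_HEAP=1
devenv
```

With the debug heap off, the program should run under the debugger at roughly command-line speed, at the cost of losing the heap-corruption checks the debug heap provides.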