See the question and my original answer on StackOverflow

Exceptions can notably impact the performance of an application. There are two kinds of exceptions: 1st chance exceptions (those gracefully handled by a try/catch block somewhere on the call stack), and unhandled exceptions (those that will eventually crash the application).
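To make the distinction concrete, here is a minimal C# sketch (my own illustrative example, not code from the original question) showing both cases:

```csharp
using System;

class ExceptionKindsDemo
{
    static void Main()
    {
        // 1st chance exception: thrown, then gracefully handled by a catch block.
        // An attached debugger is notified of it before the application handles it.
        try
        {
            throw new InvalidOperationException("handled in a try/catch block");
        }
        catch (InvalidOperationException ex)
        {
            Console.WriteLine("Recovered from: " + ex.Message);
        }

        // Unhandled exception: no catch block anywhere on the call stack,
        // so it eventually terminates the application.
        throw new InvalidOperationException("nothing catches this one");
    }
}
```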

By default, the debugger does not show 1st chance exceptions; it only shows unhandled exceptions. Also by default, it only shows exceptions occurring in your own code. However, even when 1st chance exceptions are not displayed, they are still thrown and handled, so the application's performance may be impacted (especially in load tests or in big loop runs).
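To get a feel for the cost, here is a rough micro-benchmark sketch (an assumed example; absolute timings depend on the machine, the runtime and whether a debugger is attached) comparing a loop that throws and catches on every iteration with an exception-free equivalent:

```csharp
using System;
using System.Diagnostics;

class ExceptionCostDemo
{
    const int Iterations = 100000;

    static void Main()
    {
        // Loop that throws and immediately catches on every iteration.
        var sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            try { throw new InvalidOperationException(); }
            catch (InvalidOperationException) { /* swallowed, but still expensive */ }
        }
        sw.Stop();
        Console.WriteLine("Throw/catch loop:    " + sw.ElapsedMilliseconds + " ms");

        // Same number of iterations, same control flow, no exceptions involved.
        sw.Restart();
        for (int i = 0; i < Iterations; i++)
        {
            if (i < 0) Console.WriteLine("never happens");
        }
        sw.Stop();
        Console.WriteLine("Exception-free loop: " + sw.ElapsedMilliseconds + " ms");
    }
}
```

The gap typically widens further when a debugger is attached, because every 1st chance exception is also notified to the debugger.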

To display 1st chance exceptions in Visual Studio, click "Debug | Exceptions" to open the Exceptions dialog, and check "Thrown" in the "Common Language Runtime Exceptions" section (you can be more specific and choose which 1st chance exceptions you want to see).
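As a side note, 1st chance exceptions can also be observed from code, without a debugger, through the AppDomain.FirstChanceException event (available since .NET Framework 4). A small sketch (the class name and messages are just illustrative):

```csharp
using System;
using System.Runtime.ExceptionServices;

class FirstChanceLoggerDemo
{
    static void Main()
    {
        // Fires for every exception at the moment it is thrown,
        // before any catch block gets a chance to handle it.
        AppDomain.CurrentDomain.FirstChanceException +=
            (sender, e) => Console.WriteLine(
                "First chance: " + e.Exception.GetType().Name + " - " + e.Exception.Message);

        try
        {
            throw new ArgumentException("demo");
        }
        catch (ArgumentException)
        {
            Console.WriteLine("Handled after the first chance notification.");
        }
    }
}
```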

To display 1st chance exceptions originating from anywhere in the application, not just from your own code, go to "Tools | Options | Debugging | General" and uncheck the "Enable Just My Code" option.

And for these specific "forensics mode" cases, I also strongly recommend enabling .NET Framework Source Stepping (it requires "Enable Just My Code" to be disabled). It is very useful for understanding what is going on; sometimes just looking at the call stack is very inspiring - and helpful, especially in the case of a cosmic radiation mixup :-)

Two related interesting articles: