19 July 2014

The advantage of compiler optimization and why it can sometimes mess up your program

Recently, I was working on a program that required high-precision computations on matrices, stored using an advanced math library.
In Visual C++'s Debug mode, all calculations worked fine. Running the same program in Release mode gave me "1.#QNAN" and "-1.#IND" errors. I was perplexed, because I had made no changes to the code.

A couple of printf's showed me that the matrices, which were supposed to contain small values like 0.5e-03, instead contained zeros: the compiler had treated the small values as negligible and optimized them away. This caused numerous divide-by-zero errors, resulting in the QNAN and IND errors, which are MSVC's way of displaying a NaN.

Here's how you can check for NaN in your program.
  • Boost has template <class T> bool isnan(T t); in the header <boost/math/special_functions/fpclassify.hpp>
  • If TR1 is available, then cmath includes C99 elements like isnan()
  • Here is a small function which should work if your compiler doesn't have the standard function: bool custom_isnan(double var) { volatile double d = var; return d != d; }

The solution
Going to "Project Properties > C/C++ > Optimization" and selecting the "/Od (Disabled)" option prevented the compiler from optimizing the small values away to zero, and the program worked fine in Release mode.

How compiler optimization can be helpful
A bit of background research led me to an article by Scott Robert Ladd on Dr. Dobb's, in which he writes:
An optimizing compiler performs an analysis on a program being compiled, generating a more efficient program. Optimizers can delete unused code and variables, improve register use, combine common subexpressions, precalculate loop invariants, and perform other tasks that improve program performance.

At best, optimizing a well-written program will improve its speed by as much as 25 percent. An optimizer will not replace inefficient algorithms with better ones. As the saying goes, "garbage in, garbage out." Most of the responsibility for a program's performance lies with the programmer. Improving algorithms and manually optimizing a program will often increase program speed by several orders of magnitude. The purpose of an optimizer is to make sure that the compiler is producing the fastest code possible from your source code.
  • /O1 optimizes code for minimum size.
  • /O2 optimizes code for maximum speed.
  • /Ob controls inline function expansion.
  • /Od disables optimization, speeding compilation and simplifying debugging.
  • /Og enables global optimizations.
  • /Oi generates intrinsic functions for appropriate function calls.
  • /Os tells the compiler to favor optimizations for size over optimizations for speed.
  • /Ot (a default setting) tells the compiler to favor optimizations for speed over optimizations for size.
  • /Ox selects full optimization.
  • /Oy suppresses the creation of frame pointers on the call stack for quicker function calls.

Optimization might not always be the best option for you, especially if you're working with high-precision calculations: as I found out, an optimizer can optimize away parts of a calculation to improve speed.
Conversely, some manual micro-optimizations are unnecessary. E.g., writing ++i instead of i++ (traditionally considered faster) is usually of little consequence, because modern compilers optimize a standalone i++ to be equivalent to ++i.

p.s.: In Visual Studio project settings, if you ever need to use similar settings for Debug and Release mode and you want to avoid having to change the string "Debug" to "Release", just use the macro "$(Configuration)". Visual Studio substitutes the string "Debug" or "Release" for "$(Configuration)", depending on which build configuration you choose.
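For example, a single pair of path settings like the following works for both configurations (the bin\ and obj\ folder layout here is my own choice, not something Visual Studio requires):

```
Output Directory:       $(SolutionDir)bin\$(Configuration)\
Intermediate Directory: $(SolutionDir)obj\$(Configuration)\
```

Debug builds then land in bin\Debug\ and Release builds in bin\Release\ automatically.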
