Thursday, December 19, 2013

A Workaround for Type Inference with Expression Templates and Proxies

Back in 2011, Motti Lanzkron wrote an article titled "Inferring Too Much".

The problem brought to light by the article is that C++11 auto interacts badly with expression templates and proxies. Just replacing the declared type with auto can cause undefined behavior, as shown by the following code, adapted from the article:

#include <vector>
#include <iostream>
#include <limits>
std::vector<bool> to_bits(unsigned int n) {
    const int bits = std::numeric_limits<unsigned int>::digits;
    std::vector<bool> ret(bits);
    for(int i = 0; i < bits; ++i)
        ret[i] = (n & (1u << i)) != 0;  // test bit i with an unsigned mask
    return ret;
}

int main()
{
    bool b = to_bits(42)[3];  // fine: the proxy converts to bool right away
    auto a = to_bits(42)[3];  // a is std::vector<bool>::reference, a proxy
                              // into the temporary vector, which is gone by
                              // the next line
    std::cout << std::boolalpha << b << std::endl;
    std::cout << std::boolalpha << a << std::endl;  // undefined behavior
}

So how do we fix it?

There has been some talk about adding an operator auto that you could define in your class. However, it might be some time before we get something like that.

Herb Sutter, in his "Almost Always Auto", says this is a feature and not a bug, "because you have a convenient way to spell both 'capture the list or proxy' and 'resolve the computation' depending which you mean".

Here is some code illustrating this:

auto a = matrix{...}, b = matrix{...}; // some type that does lazy eval
auto ab = a * b;                       // to capture the lazy-eval proxy
auto c = matrix{ a * b };              // to force computation
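
To make this concrete, here is a minimal sketch, with entirely hypothetical names, of the kind of lazy-eval matrix type being discussed: operator* just records references to its operands in a proxy, and the actual multiplication happens only when that proxy is converted to a matrix.

#include <array>

template <int R, int C, class T>
struct matrix {
    std::array<T, R * C> data{};
    T &operator()(int r, int c) { return data[r * C + c]; }
    const T &operator()(int r, int c) const { return data[r * C + c]; }
};

// The proxy returned by operator*: no work is done until conversion.
template <int R, int K, int C, class T>
struct product_proxy {
    const matrix<R, K, T> &lhs;
    const matrix<K, C, T> &rhs;
    operator matrix<R, C, T>() const {
        matrix<R, C, T> result;
        for (int r = 0; r < R; ++r)
            for (int c = 0; c < C; ++c)
                for (int k = 0; k < K; ++k)
                    result(r, c) += lhs(r, k) * rhs(k, c);
        return result;
    }
};

template <int R, int K, int C, class T>
product_proxy<R, K, C, T> operator*(const matrix<R, K, T> &a,
                                    const matrix<K, C, T> &b) {
    return {a, b};
}

With a type like this, auto ab = a * b; deduces product_proxy, which holds references into a and b; if either operand is a temporary, ab is left dangling, just like the std::vector<bool> example above.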

Unfortunately, not only is this potentially dangerous, but it can also be tedious. What if matrix takes template parameters such as its dimensions and element type? Now you have:

auto a = matrix<2,3,double>{...}, b = matrix<3,2,double>{...}; // some type that does lazy eval
auto ab = a * b;                       // to capture the lazy-eval proxy
auto c = matrix<3,3,double>{ a * b };              // to force computation

In this scenario we are fast losing the benefits of auto. Is there some way that we can have our auto and our expression templates too? Here is a workaround, which admittedly is not perfect, but I think it is the best we can do without changing the language.

We are going to simulate operator auto:

#include <utility>  // for std::forward

namespace operator_auto {
    // Maps a proxy type to the type auto should actually deduce.
    // The primary template is the identity mapping.
    template <class T> struct operator_auto_type {
        using type = T;
    };

    struct operator_auto_imp {
        template <class T>
        typename operator_auto_type<T>::type operator=(T &&t) {
            return std::forward<T>(t);
        }
    };

    namespace {
        // The variable we assign through; internal linkage, so each
        // translation unit gets its own copy.
        operator_auto_imp _auto;
    }
}

All this does is create a variable _auto whose assignment operator returns whatever was assigned to it, converted to another type; in the default case, that is the same type.
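
For example, here is a quick sketch of the default, unspecialized behavior (the variable names are just for illustration):

    using operator_auto::_auto;
    auto n = _auto = 42;  // T deduces to int; operator_auto_type<int>::type
                          // is int, so n is an int holding 42
    int i = 5;
    auto j = _auto = i;   // T deduces to int&, which maps to int&; auto then
                          // deduces j as a plain int copied from i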

Then we specialize operator_auto_type like this:

// For the lazy concatenation proxy of my::string from Motti's example
namespace operator_auto {

    template <class T> 
    struct operator_auto_type<my::string::concat<T> > 
    {
       using type = my::string;
    };
}

// For std::vector<bool>
namespace operator_auto {

    template <> 
    struct operator_auto_type<std::vector<bool>::reference>
    {
        using type = bool;
    };
}
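
With the std::vector<bool> specialization in place, the dangerous line from the opening example becomes safe:

    using operator_auto::_auto;
    bool b = to_bits(42)[3];          // fine, as before
    auto a = _auto = to_bits(42)[3];  // also fine now: operator= returns bool,
                                      // so a is a bool copied out before the
                                      // temporary vector is destroyed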

Now, whenever we use auto with an expression that might yield a proxy, we just include an additional assignment to _auto. Here is how we would use it with my::string:

    using operator_auto::_auto;
    my::string a("hello"), b(" "), c("world"), d("!");
    auto s = _auto = a + b + c + d;  // s is deduced as my::string, not as
                                     // the lazy concat proxy
    auto a1 = _auto = a;             // a1 is my::string, just as before
    std::cout << s << std::endl;

Notice that for a1 we are actually assigning a my::string, not a proxy. In this case the assignment to _auto is effectively a no-op: T deduces to my::string&, the primary template maps it to itself, and the reference is passed straight through, so a1 is copy-constructed just as it would be with plain auto.

For the full source code, take a look at https://gist.github.com/jbandela/8042689. For a runnable version, see http://ideone.com/eLyg7T.

As for the name _auto, I chose it because it is short, and the underscore kind of suggests "flatten" or "collapse", leading to the mnemonic "collapse auto", which is suggestive of what you want. However, you can easily change it if you wish.

Let me know what you think in the comments. I welcome your suggestions and ideas.

John Bandela