For decades, the overriding message from the financial services industry, from Wall Street to your friendly local financial advisor, has been that you need to "put your money in the stock market and always invest for the long run." Well, the economic times we're in, combined with a stock market that's gone nowhere for the past decade, have forced that message to change. They have no choice. It's become apparent to the average investor that the stock market has been, and still is, a bad bet. As a result, the financial services industry realized that if it didn't change its sales pitch, it would end up with less of your money in its pockets. Is the change a good thing? Does it mean they are now investing smarter and selling safer products?