It is interesting to listen to President Obama (and occasionally others) talk of "corporate responsibility". He talks as if bankers should be "socially responsible" and do their best to help our economy. Companies selling health insurance are somehow expected by many to be "responsible".
It seems very strange to me! Corporate leaders have a responsibility to maximize profits and to minimize the risk of financial losses. Where serving the "greater good" is not perfectly congruent with maximizing profits, why would anyone expect a CEO or corporate board to try to help you and me?
I can easily understand the perspective of the corporate leadership. It seems less clear to me why anyone else would be so naive, or talk as if they were. It seems obvious to me that if we wish corporations to act in certain ways, there need to be laws or regulations that force them to do so.
Thanks!
Tuesday, December 22, 2009