Dismal savings rates among the majority of the public put people decidedly in the grasshopper camp of Aesop's fable, but the reason might have more to do with evolution than indolence.
The paper, with the snappy title “Differential temporal salience of earning and saving,” was recently published in the journal Nature Communications.
“People are often characterized as poor savers,” the authors noted. “Here we examined whether cues associated with earning and saving have differential salience for attention and action. We first modeled earning and saving after positive and negative variants of monetary reinforcement, i.e., gains versus avoiding loss.”
The result (revealed in classic academic geek-speak)?
“Despite their equivalent absolute magnitude in a monetary incentive task, colors predicting saving were judged to appear after those that predicted earning in a temporal-order judgment task.”
It didn’t matter how “saving” was framed, whether traditionally defined or “as earnings that come slightly later”: a human bias against deferred gratification, and hence against saving, persisted.
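To make the jargon concrete: a temporal-order judgment task flashes two cues in quick succession and asks which appeared first, and the standard readout is the point of subjective simultaneity, the lag at which the two cues are reported first equally often. Here is a minimal sketch in Python of how that readout could be estimated; all the numbers are hypothetical, not the paper's data, and the logistic fit is a common convention rather than the authors' exact analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical illustration only -- these are not the paper's data.
# soa = onset(saving color) - onset(earning color), in milliseconds;
# negative values mean the saving-predictive color appeared first on screen.
soa = np.array([-120.0, -80.0, -40.0, 0.0, 40.0, 80.0, 120.0])

# Hypothetical proportion of trials where the saving color was reported first.
p_saving_first = np.array([0.95, 0.85, 0.60, 0.35, 0.20, 0.10, 0.05])

def psychometric(x, pss, slope):
    """Decreasing logistic; crosses 0.5 exactly where x equals the PSS."""
    return 1.0 / (1.0 + np.exp((x - pss) / slope))

(pss, slope), _ = curve_fit(psychometric, soa, p_saving_first, p0=(0.0, 30.0))

# A negative PSS means the saving cue must be shown EARLIER than the earning
# cue before observers report the two as first equally often -- i.e., saving
# cues are perceived as arriving later, matching the quoted result.
print(f"Point of subjective simultaneity: {pss:.1f} ms")
```

A negative estimate here would mirror the paper's finding: the saving-predictive color has to be shown earlier just to be perceived on equal footing with the earning-predictive one.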
“An attentional asymmetry away from money-saved relative to money-earned potentially contributes to decreased everyday salience and future wealth … The way we attend has important interactions with value, perception, decision making, and ultimately behavior.”
That behavior, as every 401(k) advisor can attest, is poor.
The paper cites an analysis of the Federal Reserve’s 2013 Survey of Consumer Finances, which found that the median American working-age couple has saved only $5,000 for retirement, and that an estimated 43 percent of working-age families have no retirement savings at all.
“On a long downward trend, the personal savings rate (expressed as a percentage of disposable personal income) dropped to less than 3 percent at the close of 2017. Contrasting with a near 96 percent employment rate, we have an ant-like work ethic, yet earnings are rarely converted into savings.”
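For reference, the personal saving rate quoted above is simple arithmetic: personal saving (disposable income minus outlays) divided by disposable personal income. A minimal sketch of that calculation, using hypothetical round numbers rather than Federal Reserve figures:

```python
def personal_saving_rate(disposable_income, outlays):
    """Personal saving rate: saving as a percentage of disposable income."""
    saving = disposable_income - outlays
    return 100.0 * saving / disposable_income

# Hypothetical household: $50,000 of disposable income, $48,600 spent.
rate = personal_saving_rate(50_000, 48_600)
print(f"{rate:.1f}%")  # 2.8% -- below the "less than 3 percent" 2017 figure
```

At that rate, a household clearing $50,000 banks only $1,400 a year, which is exactly the earning-without-saving gap the paper's attentional account tries to explain.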