Not so long ago, we were a very confident people; we believed that the present was excellent and that the future looked even better! Nowadays, however, most Americans--at least among the MSM--seem to believe that the present is bleak and the future looks even worse! I am wondering what explains this. Could it be the recent appearance of the COVID-19 virus? Or the 2016 election of Donald Trump to the presidency? Or--related to this--fears that the coastal elite are no longer in charge? Or fears of impending doom (whether from nuclear war, climate change, or...well, you fill in the blank)? Thoughts?