It seems that no group in the U.S. is more discriminated against than white Christians; for example, people of color enjoy significant advantages in admission to prestigious colleges, in hiring, and so on. America is becoming a society in which white men, and often white women, are increasingly discriminated against and pushed out of good jobs. Also, do you support abolishing affirmative action programs or not? You can read more about this in this book.