Many Americans believe the United States was founded as a Christian nation, and the idea is energizing some conservative and Republican activists.
What does it mean to say America is a Christian nation?
Was it only conservatives citing the idea of a Christian nation?
Forty-five percent said the U.S. should be a Christian nation, but only a third thought it was one currently.
Sources: Pew Research Center; Public Religion Research Institute/Brookings; “Was America Founded as a Christian Nation?” by John Fea.