The idea that Islam itself, as a religion, is the enemy is gaining ground here in the United States.
Is this a radical idea that has drifted into the mainstream? Or has the constant silence from the “moderate Muslim world” whenever radical Islam rears its head, time and time again, simply worn thin on the American psyche? Or is it something else entirely?
Bonus Question:
Is choosing not to appease the same thing as antagonizing?