American conservative Christians believe the US is a Christian nation founded on Christianity. Because of this, some of them believe everyone either needs to be a Christian, or leave. If this is the case, why do they believe it's ok for them to go into other countries, whose primary religion isn't Christianity, and attempt to make them all Christian? And why do they believe that they can attempt to suppress the religious beliefs of non-Christians here, but cry foul when Christians are persecuted in non-Christian nations? Isn't this incredibly arrogant, or am I missing something?
From my perspective, the Christians who believe the U.S. is a Christian nation founded on Christian principles (or, as I hear more often, "Judeo-Christian" principles) believe that the country falls further into decay the more diverse it becomes, as more religions, ethnicities, genders, and orientations are accepted into the mainstream.
To them, Christianity is the ideal that freedom ought to emulate. One has the free will to accept or reject Jesus, and to suffer the consequences or rejoice in the bliss after making such a choice. They don't feel as if they are suppressing the religions of others, but liberating them from what they perceive as oppressive ideologies. According to these Christians, other religions are either too burdensome or too morally lax; Christianity, in their opinion, offers the perfect balance.
I think they believe that what they wish to bestow on the world isn't domination, but freedom and compassionate justice.