Friday, October 3, 2014

When did America become "the homeland"?

I don't know if people have noticed, but this odd term popped up right around 9/11, and seems to be thrown around more and more. I can't seem to find any use of the word before 9/11.



Was it ever used before? Historically, America has been the Republic and the Union, but this word, like post-9/11 flag pins, seems to be here to stay.



It's such an Orwellian word, like "fatherland."



I just heard a (Republican) congressman saying "secure the homeland" (talking about Ebola), and it sounds like something out of a dystopian science fiction story.



Does anyone else find the word un-American?



