Kit Kasuboski
Friday, July 15, 2011
Are Americans taught in school that they won the War of 1812?
I keep hearing Americans perpetuate the myth that they won the War of 1812, and I was wondering whether this is a result of their education.