What are some of the most interesting or shocking things Americans believe about themselves or their country?
American exceptionalism is the big one. It's just so, well, selfish. The rest of us like ourselves and our countries just fine without needing to feel we're the best in the world, y'know. And from the other side: What are the most unexpected, shocking, or baffling things people encounter when visiting the USA for the first time?