I've never been to America, so it's the news and media that shape my understanding of it. For those of us on the outside, patriotism, militarism, gun ownership, etc. are what define America.
Looking a bit deeper, though, especially in these polarized times, it seems like there is a Metropolitan America and a Rural America with contrasting views on almost everything and almost nothing in common, so it's hard to pinpoint what American culture actually is. I'd be interested to know what you (or any of our resident Americans) think is universal enough across the states to be considered American culture.