Here's How America Destroyed Hawaiian Culture
When it comes to Hawaii, the average American pictures an idyllic paradise and an ideal vacation spot. But what exactly happened when the United States took over the Hawaiian islands? And what happened to the culture? Today, we're taking a deep and honest dive into American history.