The first time I read The Hunger Games, I was horrified at the thought of kids being forced to kill each other. That same feeling came back when I was watching the movie last night. Is it not sick that all these people from the Capitol are cheering, laughing, getting sucked into the TV story of the Games while these kids are going in there to kill each other?? How can you not think there's something really wrong with this society, where the main entertainment is an annual war between kids?
There's a John Green quote about the difference between lying and writing fiction that I love. It highlights something about fiction that I think matters (and that I don't think my non-fiction-loving father will ever understand).
"...The other big difference, I would argue, is that lies are attempts to hide the truth by willfully denying facts. Fiction, on the other hand, is an attempt to reveal the truth by ignoring facts." (Source)
Fiction can be used to reveal the truth. I think different types of books do this in different ways. Contemporary novels, for example, can reveal truths about people and individuals. Dystopian novels have such potential in this area, I think. They have the potential to awaken people to the realities of our world today.
Yes, The Hunger Games is fiction. But today's society's obsession with entertainment and reality TV is real. Violence and war in this day are very, very real. Heck, even kids killing each other is a reality in some places around the world. Lots of dystopian fiction shows us things about our world by taking them to extremes, or even just by weaving realities into the story. Think of The Handmaid's Tale by Margaret Atwood, Feed by M.T. Anderson, and others.
It just makes me really frustrated. How can you read books like The Hunger Games and not think beyond the romance?? How can you not ask questions like "How far-fetched is that, really? Could it happen? How different are we really from the crazy entertainment-addicted Capitol citizens?" and then answer them with things like "You know, maybe it could happen. Maybe it is happening. Maybe we're not that different."
The Hunger Games is so, so much more than the romance, or the stupid "love triangle". Dystopians should, I think, wake us up to harsh realities in our own society, our world today. They should make us think. Every book should make us think. But I think (whoa, lots of thinking... haha) that the relationship in The Hunger Games, or any book, should not be our focal point.
What do you think? Are we missing the point of The Hunger Games?