It's The End of the World as We Know It
Fiction trends, like fashion, tend to be cyclic. When I was a kid, horror novels were the hot thing. Every time you turned around, there was a Stephen King book or movie coming out.
Later, we saw a surge in western books, movies, and television shows. More recently we've had an explosion of fantasy and vampires, which makes my little heart happy.
But there's a darker trend that's sneaking up on us, one that is speaking to both our fears and hopes: Apocalyptic/Dystopian fiction.
There have always been disaster movies and books about survival after some sort of apocalyptic event, but now they're everywhere -- stories about zombies, natural disasters, nuclear wars, disease.
Cable channels are swollen with documentaries about end-of-the-world prophecies and speculative scenarios. Reality shows such as Survivorman, The Colony, and Apocalypse Man abound. We have a new hit called The Walking Dead mixed in with movies such as The Road, 2012, and The Book of Eli.
In bookstores, shelves are laden with the same, and the Young Adult section is especially bursting with dystopian fiction, which is awesome, since my son can't get enough of it. Heck, I can't get enough of it either. I enjoy it so much that I'm writing my own apocalyptic fiction series about the Four Horsemen of the Apocalypse.

So what do you think about this trend? Love it? Hate it? What are your favorite books/movies/TV shows that embrace it? And what do you think is driving it?