I have always believed books are important. With the state of American politics at the moment, I believe they are even more important now. Books give us a place to escape the harsh realities around us. Regardless of your political leanings, I think it is undeniable that there is so much negativity on the news, on TV, on social media, and everywhere else that we could all use a moment to step outside of it and immerse ourselves in fiction.
I think romance, more than any other genre, is what this world needs now. I’m not saying that simply because I read and write in the genre, but because it’s one of the most diverse genres. By diverse I’m not only talking about the cultural backgrounds or sexual orientations of characters, but also the authors themselves. This community is primarily women, but there is also a growing number of men writing romance. With each new voice added to the genre, we as readers are introduced to a new worldview, whether we realize it or not.
Over the years I’ve seen article after article denigrating the romance genre and/or romance writers, most of them written by people who don’t read the genre. From the outside, people look at romance as “mommy porn” and “bodice rippers,” nothing more. They miss the point altogether. It is so much more than those things.
Romance gives women and men alike a fantasy of what’s possible. It gives its readers an escape. It shows that women can be strong and still find good men who will love them for that strength. It shows men that they can be strong, but that it’s also okay to be vulnerable, and that allowing a woman to be strong doesn’t make him any less of a man.
This genre reflects the changes in society that have happened, that are happening, and that will happen. And it teaches us that while love might not conquer all, it makes those hard times more bearable.