
Women's Fiction

Women’s fiction is a general term for books centered on women’s issues and life experiences. These books are usually written by women, are addressed to women, and tap into the hopes, fears, and dreams of women today.