The upcoming premiere of the Sex and the City movie got me thinking about the series (which, for the most part, I loved), and I find it really sad that a show meant to portray single women as strong and independent, and one that was groundbreaking in so many ways, still had to end its final season with all four women in relationships.
Even for a show that discussed just about every taboo subject in the book, it was considered too shocking to portray a woman who doesn't find "Mr. Right" yet still lives happily ever after. I just find it so disappointing that the show couldn't depict at least one "happy ending" that didn't involve a relationship with a man.
So what's the moral, then? "It's OK to be single and have fun for a while, but in the end you've got to settle down"? It's just a sad commentary on how little our society has progressed on this issue.