Several very popular US shows come to mind: Weeds, Game of Thrones, and Orange Is the New Black, just to take the first three.
I admittedly don't watch a whole lot of TV, so I may be off base here, but my impression is that onscreen depictions of rape are more common on US television than onscreen depictions of positive sexual activity.