A question for those who complain about Hollywood films being "woke": can a mainstream film have gay characters or a strong female lead and not be "woke"?


I'm asking because whenever a mainstream film has gay characters or focuses on a woman, some people come out of the woodwork and accuse it of being woke, and I wonder: can you ever have gay characters, a woman lead, or a black lead and not be woke? Was "Aliens" woke? Was "The Wire" woke? Was "Independence Day" woke?

I understand when some films contrive to be inclusive, but many have forgotten what the point of "woke" was as a criticism, and it has taken on a life of its own. It stopped being about anything specific and turned into a weapon for bigots to complain about women or minorities invading their territory.

via /r/movies https://ift.tt/3Eoq9Q8