I hate to say it, but EVERY movie has pretty much been 'woke'.
What 'woke' really means is that elements are crafted specifically to resonate with a certain subset of the audience and stir up emotion in them, because it shows them in a good light and gives them things that appeal to them.
Today's woke is focused more on women and POC, because they are emerging as the audiences PAYING for more and more of it, but the past 100 years were all woke too, just catering 99% to white males.
We had 100 years in which almost every movie and TV show centered on white males, usually surrounded by the hottest women, and always winning, whether they were fighting aliens or robots, or playing the average husband with the hot wife in a sitcom.
That is WOKE. It was all meant to get white males excited and feeling like they always win.
As women and POC have become bigger purchasers, the woke is just being spread around a little. As humans we notice when OTHERS are being pandered to, but we do not notice when we are.
But that is what Hollywood does and has always done: look at who is buying and pander to them.
If you are being honest, you have no idea what is going on. Hollywood has been wrecked; the stuff they make now is mostly dreary Soviet-style propaganda, which is why Americans are increasingly giving up and not seeing new movies.