I am speaking to any woman who is desperately trying to embrace the whole 'body acceptance'/'body love' thing.
(I will use both of these terms throughout this piece, but they represent the same thing).
I am speaking to any woman who totally gets and appreciates the concept but... holy shitballs, it's a struggle.
This 'body love' thing should be empowering, right?
It should feel good, right?
Isn't it supposed to free us from the societal and cultural pressures to look a certain way in order to feel... in order to feel... in order to feel, what exactly?
Good about ourselves?
To look like:
We totally have our shit together?