Let’s talk about the most harmful issue in the health industry: sex. The health industry is selling sex in the name of health. As a personal trainer and a woman, I am done with it. Sex cannot be the focus of our health journey. It’s time to take sex out of the health industry and create a positive body image for women to work towards. And here’s why:
Sex Does Not Equal Health
Any medical professional can tell you that physical appearance does not determine your health. Just because someone is beautiful and fits societal beauty standards does not mean they are healthy. So why are we selling sex as the perfect image of health? The health industry should not present health as looking a certain way. Yes, weight loss, fat loss, and muscle building can all be results of getting healthy. But health looks different on every single body. It is false advertising to promise a sexy body as the result of a certain product, especially when so many advertisements feature models who either don’t follow a healthy lifestyle or follow an extreme one.
Health Goes Deeper Than Physical Appearance
Women have a deep need to be healthy. This is rooted in caring for the vessel that allows them to work, care for others, and build relationships. It goes far deeper than appearing as a sex symbol. Ladies, if the health industry has convinced you that the only reason to eat healthy and work out is your appearance, take a step back. You are a talented, ambitious, and (probably) in-demand woman. You need to be healthy as an individual so that you can accomplish your goals and build the life you are meant to live.
Harmful Effects of Our Sex Culture
The harmful effects of our media have been known for many years, and they continue to worsen with Photoshop and editing tools. Only a small handful of women can attain the standards set by the health industry and the media. Yet women compare themselves to these images, feel dissatisfied with their own appearance, and even edit their own photos on social media to better fit the societal mold. The health industry makes this worse by letting potential customers believe they are not beautiful or healthy until they look a certain way. Poor body image is already common in our society, and the industry is deepening it. The health industry is responsible for helping people get healthy, but if we are creating poor emotional health, then we have failed our objective.
Physical Appearance Alone Is Not Satisfying
Physical appearance alone is not enough to keep people healthy. Ladies, you will not be satisfied by a six-pack, a Kim K butt, or a certain weight if you do not care for yourself as a person. Stop idolizing a thigh gap, thigh brow, or cellulite-free booty, and care for your body! Invest time and energy in yourself, your health, and furthering your life. Spend time speaking positive words to yourself and believing in who you are. Learn to be satisfied with WHO you are; that satisfaction is lasting and fulfilling.
Women, we have a deep need to be loved, honored, and cherished. But the health industry and the media have corrupted this by making you believe it will come from being a sex symbol. Your health is important so that you are ABLE to live, not so that you can change your body to look a certain way. A healthy body and appearance will come from living a healthy lifestyle, even if it does not look exactly like what the media has presented. Work towards being healthy, ladies. The health industry should be teaching you to care for your body with safe workouts and nutrition. Find trainers, coaches, and nutritionists who want to help you transform your health. The physical transformation will occur, but in your own way. And don’t be disappointed when you don’t look like the models in the media; your body, proportions, and training are all unique to you. When in shape, you will look like the best version of yourself, rather than an image someone else has demanded you to be.
Ladies, I would love to hear your thoughts on this post about taking sex out of the health industry. Do you think it’s time the health industry sold just health? How do you feel about a health transformation over just a physical transformation?