While I've always known that corporations have a say in our food and how it's labeled, thinking about all the different lobbyists pushing their own agendas to sell their products genuinely made me sick. You'd like to think food companies are trying to produce the healthiest or most nutritious foods, when in actuality they're producing the cheapest products and misleading us about what's in our food so that we keep buying. I can understand that these are businesses and their goal is to profit, but it still blows me away that they're allowed to bend the truth in order to sell to consumers. In allowing this, we've let the government and these companies have too much power, and rather than food being viewed as a basic necessity, food is a business. This perpetuates a lot of the stereotypes we hold about people from lower economic backgrounds; healthy foods tend to be more expensive and junk food is cheap, but rather than addressing this issue, we as a society blame the lower class for not working harder to afford more expensive foods.
I'm curious as to why society allows this. Why does the health of the general population take a backseat to profit? If more people were educated about the food industry, would we change our relationship with food and move toward a healthier, more organic diet? Or are we so stuck in this system that we wouldn't change even if the general public were knowledgeable about it?