When did we as a society normalize visiting the doctor and/or pharmacy every week? When did we normalize reaching first for a pill to mask symptoms instead of finding the root cause? Doctors and the American medical system are lifesavers, and I am so grateful to have access to them. At the same time, I am horrified that so many people accept such treatment and give their power away to a system that is clearly not working. You have the agency to change how you feel and to heal yourself. One step at a time, one bite at a time.