Funny how the medical field in America became so powerful after WWII.
A lot of doctors and medical professionals don't have the answers. The cancer research world will never admit it, but the truth is they want people to have cancer; otherwise, they become unnecessary. There is nothing to study, nothing that gives you physically painful, invasive hope for a cure. No one gets praised or "paid." A price tag is on every one of us, and it's only a matter of time before we're expected to pay up for existing. Being American makes us the world's medical experiments.
Medical societies, pharmaceutical companies, organized institutions (government, judicial systems, organized religion): the list goes on.
We are forced into believing this is the only way; have faith that it is not.