The Sad State of the Health Care Industry

It is in the interest of all of us to take the best possible care of ourselves. The medical establishment will only take care of us if we have good insurance or the cash or credit card to pay up front. Therefore, we must make sure we know how to tend to ourselves as well as we can.
We need to understand how the food we eat affects our daily lives and health. We need to know where our food comes from and what has been added to it before it reaches our plates.
We need to understand that there are natural ways to manage our health and illnesses, and to give up our dependency on the large drug companies that pay doctors handsomely to fill us with their medications.
We need to understand HOW we become ill with colds, the flu, and other infectious diseases, and how to keep from becoming ill in the first place.
We need to get our own immune systems healthy enough to keep us from the illnesses that lurk around every corner. We need to understand that every antibiotic we take, for whatever reason, makes it harder for our immune system to work and to develop into the germ-fighting system it is supposed to be.
It's all about the profit margin, not the best interest of the public. Did I mention that the drug companies and health care businesses have far more money than we do to pay lobbyists in Washington to look after their own private interests?
When are we going to wake up? We need to grow safe food, take natural, God-made (not man-made) remedies and vitamins, and get the exercise we all so desperately need for optimum health. The next time you MUST see a doctor, remember that the choice about your health is up to YOU, not him. He's in it for the money; you cannot afford to be.