As I was watching TV the other day, I realized that American society is a complete disaster. Every other commercial was about how to stay young, or about some new medicine that could help lower cholesterol while carrying a hundred side effects. Right after a new drug commercial aired, another one played stating you may be entitled to money if a drug you were taking caused kidney failure or death. These commercials ran during a show I was watching on the History Channel about the invention of nanobots. The scientists seemed so excited that they could place robots inside the human body to basically keep it healthy. This concept makes me cringe! Really? What about exercising and eating healthy? What about not using toxic products containing carcinogens that cause cancer and other diseases? What about skipping the drug for heart disease or heartburn and using natural remedies instead? I know a person's genetics play a role in how the immune system functions, but I think the onset of certain diseases is preventable.
We are exposed to toxic chemicals in our environment every day; they contaminate our food, air, water, and homes. If that isn't enough, toxic chemicals are used in the manufacturing and maintenance of pretty much everything we associate with modern life. These toxins accumulate in our body fat and can cause a host of health problems, including cancer and birth defects. On top of that, we stuff our faces with foods lacking in nutrition that are loaded with trans fats and toxins. Foods stocking the shelves of corporate grocery stores in America are outlawed in other countries. If people are wondering why we have diseases such as cancer, autism, and heart disease, all they have to do is open their eyes and look at what they are putting into their bodies.
People take better care of their cars than they take care of themselves. Seriously, they will spend tons of money on material things and spend next to nothing on nutritious food. We are a processed country.
A drug company invents a drug for arthritis, a pharmaceutical "salesman" sells the drug to a doctor, and the doctor gives the drug to his patients. The patient trusts that his doctor is doing what he can to heal him, but is that really the case? Not in my opinion. It seems people will put anything in their bodies just because someone tells them it is helpful.