
Why Most Doctors Don't Promote Natural Remedies
Jul 17
1 min read
U.S. medical schools give doctors very little information and little to no training on natural remedies or plant-based medicines.
Many doctors were taught to think plant-based remedies weren't powerful enough to balance the body (all thanks to Rockefeller).
Because of Rockefeller's influence on medical schools, natural remedies have been suppressed and given little to no funding for testing.
" THE MEDICAL'S INDUSTRY IS HIGHLY INTERTWINED WITH PHARMACEUTICAL COMPANIES"
"NATURAL REMEDIES HAVE BEEN AROUND FOR CENTURIES"
The truth is that natural remedies taken orally in liquid form work faster, allowing for easy and rapid absorption into the bloodstream. The body absorbs liquids more readily because the digestive system doesn't have to break down any plant fibers first.
So why would you want to take something made in a lab when you can take something made in nature? Think about it.
Make sure to check out all of my natural remedies for sale!
Thank you!