
Herbal Medicine

Herbal medicine is a traditional system of medicine that uses selected plants, taken internally or applied externally, to stimulate the body's own powers of healing. For many centuries, plant remedies were the main medicines used to treat disease throughout Europe.

Used appropriately, herbal remedies can offer a safe and effective alternative to conventional pharmaceutical medicines.