What Is a Whole Foods Diet?

A whole foods diet is a long-term way of eating rather than a temporary diet. Because this lifestyle emphasizes minimally processed, nutrient-dense foods, people switching to a whole foods diet from a standard American diet high in processed foods and saturated fats may lose weight and improve their overall health.