Webster’s Dictionary online defines greenwashing (noun) as the practice of promoting environmentally friendly programs to deflect attention from an organization’s environmentally unfriendly or less savory activities. The word is derived from green (“environmentally friendly”) + (white)wash (“conceal flaws”).
In other words, it’s about corporate America jumping on the “environmentally friendly” bandwagon. Herbs are good for you, right? Throw a few botanicals into the cosmetic formula and call it a “natural” product. You can also call it greenwashing.
I first heard the term greenwashing while interviewing the general manager at Aubrey Organics for an article I wrote for Natural Family Online called Defining “Organic” in Cosmetics. We were discussing what the terms “natural” and “organic” actually mean. I learned that the United States, unlike Europe, has no regulatory standards for manufacturing natural cosmetics and skin care products.
Fortunately, Aubrey Organics, Dr. Bronner’s, and other companies are working hard to change this. Some cosmetics and skin care products really are made from natural ingredients, but others…well, they definitely are not. Read the label!
All this business about greenwashing reminds me of the corporatization (is that a word?) of the organic food and dairy industry. Remember the big stink about Horizon Organic Dairy? The milk may be organic, but the cows are not treated humanely. Likewise with produce: corporate giants are slowly buying out the smaller organic farms. The produce may still be organic, but it’s definitely not “home grown” anymore, is it?