I for one get really annoyed when they say something. I feel that people either won't take it seriously, or little girls will see their favorite movie star or whoever and see EDs as something as petty as a fashion statement. I don't know why, but sometimes I feel like it gives regular people with EDs a bad name. When a celebrity says something, I actually become more ashamed of my problem.
What do you guys think?