I’m trying to get perspective on this particular beauty standard and how I want to approach it. Do people whiten their teeth where you live? Is it seen as expected to do so? Do you live in a city?
I have healthy teeth that have nevertheless seen a lot of tea and coffee. I have generally thought of the staining as similar to wrinkles, i.e. a natural thing bodies do that I don’t want to pay money to fix since it isn’t broken. I still think this. But lately I’ve been feeling like there might be more actual social stigma attached to discolored teeth than I assumed. Is that real? Has teeth whitening become an expected thing for all adults to do now? I thought I’d ask how other people feel and think about this and what the general norm is in your social circle.
Edit: thanks for the responses everybody.
No, you’re objectively wrong on this. It is more akin to cosmetic surgery because it is harmful to your teeth and potentially dangerous. This isn’t a normal hygiene standard.
My dentist disagrees. He recommends moderation, but says it is not harmful, much less dangerous.
I do it myself, about once a year, and I don’t have any issues at all.
My dentist said I could do it more often if I felt I needed to, 3-4 times a year, and my enamel would be fine, as long as I followed the directions.
I tried googling it and found no source that corroborated your statement. (I did find Mayo Clinic and NYTimes articles that both support my dentist’s claim.)
I’m willing to listen if you can provide an article, but your “objectively wrong” comment seems a little, well, objectively wrong.