Is dentistry still worth it?

I’ve been going down a rabbit hole of YouTube videos of dentists saying they were excited at first but now hate it: the field has gone corporate, it’s not what they thought it would be, they’re not making as much as they expected, they can’t own a practice, etc. I’m feeling a little down because of the time and money commitment, and I don’t want to pick the wrong career.