Does anyone actually like their job?

I’ve been in the corporate America workforce for six years, mostly in different HR-type roles. I am so tired of the grind and how fake people feel in corporate America. I am bored and feel underutilized. I want to do meaningful work that matters and is fulfilling. I am considering a career change or leaving corporate America altogether, but I want to be strategic about it since there are a lot of benefits/perks to corporate America.

TL;DR: If you like your job, can you share what you do and why you like it?

Thanks!