do you believe in the term "girl's girl"?
do most women actually want the best for you and are kind, or should i keep to myself because most women don't care about you and are secretly bitter? i'm not gonna lie, i've had some awful and traumatizing experiences with women, so i'm about to give up on making friends and just stick to myself.