A Wound is a Sign You Fought for Something Meaningful


Our wounds matter because they help a better version of ourselves emerge, one that otherwise would never be born or see the light of day. Yes, a wound is more than the pain it inflicts. A wound is a sign that you lived and that you are still part of the school of life. Now it's time to treat your older wounds, so you can advance further in your life education - there is so much more to learn.