What does the word Indie even mean? Google says, “not belonging to or affiliated with a major record or film company.” Or “an indie pop group, record label, or film company.” An urban dictionary says, “an obscure form of rock which you only learn about from someone slightly more hip than yourself.”
So what does this mean for Indie Authors? What are we?
We are a force to be proud of. For a lot of people, there is a negative stigma attached to the term Indie. Many run away from self-publishing because of it. Maybe you are wondering if you should do the same.
But I am here to tell you: being Indie is nothing to be ashamed of. It is something better than that. Don’t hang your head in shame because of it.
We are independent. That’s where the word Indie comes from. We are writers, publishers, and authors who are not controlled by anyone else. Yes, it means we have to do the work ourselves, but we should be proud of it.
The United States of America became what it is today because of what? It looked at Great Britain and said, “I don’t want to be part of the established way anymore. I want to be an Indie nation.”
Okay, okay, I know there is a lot more to it than that, but you get the idea. They didn’t want to do it the way everyone else did. They wanted to be independent; they wanted to be Indie.
So maybe you are starting to feel a bit ashamed of your route. Maybe you are struggling and wondering if it’s worth the work to happily be an Indie, or even more, to try to do it differently than everyone else. DO NOT! You are Indie. You are different. You are free. You are independent. You are divergent. (Sorry, I could not resist.)
Seriously. You are Indie. You are free. You are unique. Don’t let anyone take that away. It is a proud word. So let’s make sure the world knows it.