November 16th, 2016 by mysteriousvisitor
This is not an attempt to influence anyone’s religious beliefs; I have neither a reason nor a need to do such a thing. Rather, it is an observation on something that I have seen many, many people talk about on this forum.
While this teaching has probably always existed in some form, it has become very prevalent in our lifetime, and I think it is extremely harmful: the idea that there is a God whose job it is to serve humans by taking away all suffering and problems and making life easier. What happened to the concept of God standing back, allowing people to make their own choices in life, and allowing nature to take its course? What happened to the idea of learning and becoming stronger through suffering? What happened to the idea of becoming closer to God by turning to Him through suffering? What happened to the idea of suffering consequences when we do something wrong? What happened to the idea of this life being the test and learning experience we go through before entering the afterlife?
I’m mentioning this because I think it is extremely harmful to teach people that there is a Divine Being who is simply going to reach down and magically make all their problems go away. That teaching only increases our suffering by making us angry, bitter, and resentful when it doesn’t happen. It makes us look in the mirror and ask, “Why doesn’t God care about me?” because we’ve been so thoroughly convinced that He should intervene.
I’m not calling anyone’s religious beliefs wrong, nor am I objecting to teaching that there is a God who intervenes sometimes. But I think it’s pretty obvious that there’s no Divine Being whose purpose is to fix everything for us, because if that were the case it would already have been done and we would be living in a perfect world.