Jesus has provided physical healing to us through His death, burial, and resurrection. He took our sicknesses into His physical body. I have heard Christians
God desires for us to be healthy and experience His healing power. Far too many Christians think otherwise because of religious teaching. Some even think
"Making the word of God of none effect through your tradition, which ye have delivered: and many such like things do ye" (Mark 7:13). The
Each one of us must come to a place of conviction that healing is not optional. Christians tend to place it at a lower level
The world needs to know that God will heal our bodies, deliver us, prosper us, and save us. I believe that this is the message of Scripture.