Understanding Jesus

I'm a bit confused about some of the things Jesus said. Whenever he healed someone, he told them, "Your faith has restored you." Did this mean that he wasn't taking credit for the healing?

Also, He warned people not to expose Him as the Son of God. Why? I'm confused about that, because He was sent here by God. Wouldn't he want people to believe? If He healed someone, and they truly believed in their hearts that He was the Son of God, then wouldn't He want that person to go out and spread the good news?

Thanks for your input on helping me to understand this.