I suffer from a serious chronic disease. I have become extremely dismayed both at how limited medicine is in its ability to help me and at how consistently wrong the doctors I've consulted have been about everything they've ever told me. I have come to believe that doctors are poorly trained in medical school and that most people in the profession are basically second-handed. I attribute this situation to the extreme degree of government control over the medical profession, especially licensing laws and FDA controls. Is my attitude justified, or am I being overly negative?