The NYT had a very good article this morning on the overuse of CT scanners, a new device for examining the condition of the heart. The article points out that the scans are of little or no use for the vast majority of patients who receive them. They are also somewhat risky, because they expose patients to large doses of radiation. Nonetheless, many doctors administer them to patients, either because they think the scans are actually helpful or because they receive large fees for them.

It would have been useful if the article had examined more carefully how the way the United States finances research into drugs and medical devices creates this incentive structure. The resources devoted to medical technology (either devices or drugs) are expended at the point where it is developed. By the time a medical device like a CT scanner or a new drug reaches patients, all the scientific expertise and capital involved has already been spent. The cost of actually administering a scan (or producing a new drug) is very modest -- a bit of electricity and perhaps 20 minutes of a skilled technician's time. However, because of patent monopolies, these devices (and drugs) can command huge fees, and these fees provide enormous incentives to use them in cases where they may be of little use or even harmful.

On the other hand, if the cost of the research were paid up front (e.g., through publicly provided funds), then medical devices and drugs would be available at their marginal cost. In that event, a CT scan might cost less than $100, and most drugs would sell for less than $10 a prescription. There would be no incentive for manufacturers to misrepresent the benefits of these treatments.

A piece of this length should have spent some time explaining how the incentive structure of the drug/device development process leads to the sort of problems it highlights.
--Dean Baker