The Role of Information Technology in Medical Research
Using information technology (IT) to modernize our health care system will lead to improvements in medical research. Health informatics will allow medical researchers to determine the effectiveness of a particular treatment for a given population or to discover the harmful side effects of a drug. While some of this research will occur in the private sector, public investment in this area will play a major role.

This report finds that the United States and the United Kingdom commit roughly the same percentage of total public medical research funds to health informatics. However, the United Kingdom is uniquely positioned to benefit from advances in health informatics research because it is significantly ahead of the United States in its transition to electronic health records among primary care providers. More importantly, the National Health Service (NHS) has made a strategic decision to make medical research one of its core missions. Thus, as the NHS continues to develop its IT infrastructure, it will be able to make technical upgrades and policy changes that improve information sharing and strengthen its information base for research. The United States currently lacks the capacity being developed by the NHS to turn its existing or future electronic health records into a usable database for medical research. To realize the full potential of health informatics, the United States should develop the capability to share medical data for authorized research in a timely and efficient manner.