Whatever the shape of the information network, collection imposes special burdens on the patients who are the subjects of the information and who form the data pool for queries and analytics. Data storage, transfer, and queries create privacy risks: “No security measures…can ever completely safeguard against…release…or inappropriate use.” Patients therefore face actual or feared employment or insurance discrimination, reputational loss, or identity theft. Many patients also believe they have a claim to their information, of which collection deprives them. They may find research projects repugnant to their moral or religious beliefs, and may suffer dignitary harm when their information is used without their consent, whether identified or deidentified.

The key problem is this: increasingly, these burdens are being placed on individuals who enjoy less social welfare, because information collection efforts focus on those groups. Government programs aim to enhance data collection from these groups; by law, these data are available for public research to produce the secondary research breakthroughs on which current medicine is based. By contrast, those with higher levels of welfare can keep their information out of the communal pot, although they are well-positioned to reap health benefits as the learning health system gets off the ground.

This Article argues that these burdens should be distributed in a just manner. This entails taking into account the social welfare of the individual patient, where possible, when imposing information burdens on them and distributing information benefits. In concrete terms, this would require altering the points at which we collect information, focusing less on public benefit programs like Medicare and Medicaid, and looking to other sources such as wearable devices and wellness programs. It would also require us to alter research methods by broadening and, where possible, shifting the data pool that is queried or distributed for research. And it involves ensuring that benefits in the form of clinical decision support and smart devices that learn from iterative algorithms are accessible to, and used by, the care providers of worse-off populations.

Craig Konnoth, Health Information Equity, 165 U. Pa. L. Rev. 1317 (2017).