The degree to which virtually all Americans—young and old, rich and poor—use technology to access information continues to astonish baby boomers like me, and certainly our parents. When we gathered around our black-and-white television sets to watch The Jetsons, none of us actually thought we would live long enough to video chat, much less do it on a handheld device. The development and use of technology in our everyday lives apparently knows no bounds. As the costs of health care continue to escalate, albeit at a slower pace than in recent years, a question occurred to me: what if this collective passion for mastering technology could be harnessed to help reduce health care costs?

A large and growing body of work produced by my colleagues at the RAND Corporation may provide some answers and a possible path forward. In their recent report, Redirecting Innovation in U.S. Health Care, Steven Garber and his co-authors argue that inventors, investors, payers, providers, and patients need new and better economic incentives if health care technologies that reduce costs or increase value are to be developed and used. Earlier RAND work, including a 2012 brief entitled Skin in the Game and the results of the landmark RAND Health Insurance Experiment conducted in the 1970s and 1980s, provides additional insight into the role that economic incentives play in individuals’ health care decisions.

Specifically, the analysis discussed in Skin in the Game found that consumers who switched from traditional health plans to consumer-directed health plans, which are characterized by high deductibles and low premiums, reduced their health care expenditures by 21 percent. They achieved this by initiating fewer episodes of care and spending less per episode. Unfortunately, they also cut back on preventive care, including vaccinations. Similarly, the Health Insurance Experiment found that cost sharing reduced both necessary and unnecessary care.

These results, and those of many other studies, pose an important challenge: what can be done to encourage patients to seek necessary care but avoid care of little or no value, and how might technology help them in the process? Many possibilities exist that could leverage today’s information-sharing technology.

One might involve offering positive economic incentives for people to choose cost-effective care. For example, in another recent RAND study, researchers concluded that approximately $1 billion a year could be saved through reduced use of anesthesia providers in routine gastroenterological procedures. In practical terms, this means having colonoscopy patients forgo being knocked out by propofol in favor of a combination of drugs that induces a twilight state and can safely be administered without an anesthesiologist. Studies have shown that the two approaches are equally safe and comfortable (or uncomfortable). So let’s imagine that there were a good way to deliver that information to patients due for a colonoscopy and, at the same time, encourage them to choose the more efficient path—forgoing the propofol and the anesthesiologist—by offering a financial reward large enough to capture their attention but small enough that significant cost savings still occur.

One could imagine many treatment decisions in which patients could choose among alternative therapies or diagnostic procedures that produce equivalent outcomes but differ greatly in cost. Moreover, if patients had access to appropriate information and knew they would receive a financial reward for selecting cost-effective treatments, aggregate demand for low-cost, high-value care might increase. In fact, an application (“app”) could be developed that would enable patients to enter their diagnosis or procedure and see a list of treatment options ranked by value—that is, the best outcome for the lowest cost. If they then chose a high-value approach, they would receive some form of compensation from their insurer or payer.
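To make that concrete, here is a minimal sketch, in Python, of how such an app might rank options. The TreatmentOption fields, the outcome scores, the costs, and the simple outcome-per-dollar value score are illustrative assumptions of mine, not elements of the RAND analyses.

```python
from dataclasses import dataclass


@dataclass
class TreatmentOption:
    """One hypothetical treatment record for a given diagnosis or procedure."""
    name: str
    outcome_score: float  # illustrative 0-1 measure of expected outcome quality
    cost: float           # expected cost in dollars (illustrative)


def rank_by_value(options: list[TreatmentOption]) -> list[TreatmentOption]:
    """Return options sorted by a simple value score: outcome per dollar, best first."""
    return sorted(options, key=lambda o: o.outcome_score / o.cost, reverse=True)


# Illustrative example: two sedation approaches for a routine colonoscopy,
# with made-up numbers chosen only to show how the ranking would work.
if __name__ == "__main__":
    options = [
        TreatmentOption("Propofol with anesthesiologist", outcome_score=0.95, cost=2000.0),
        TreatmentOption("Moderate ('twilight') sedation", outcome_score=0.95, cost=1200.0),
    ]
    for option in rank_by_value(options):
        print(f"{option.name}: outcome {option.outcome_score}, cost ${option.cost:,.0f}")
```

In practice, of course, the outcome measures would have to come from clinically validated evidence and the costs from actual prices; the point is simply that ranking by value is straightforward once that information is available.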

Some might argue that this type of tool could never, and should never, replace physician judgment, and it wouldn’t. It might, however, produce informed patients who are empowered by the empirical knowledge their “apps” provide to question physician decisions—e.g., “Why repeat a colonoscopy in five years if my first one was normal?” It could also lead patients to engage physicians in useful discussions of the costs and benefits of different treatments. In fact, patients armed with cost-effectiveness information and a set of appropriate economic incentives would likely make more-informed choices than they currently do when faced with economic penalties, generally in the form of cost sharing. As has been seen, such blunt instruments lead people to cut preventive and necessary care as well as unnecessary care.

Devising ways to encourage patients to use computers, smartphones, and other technology to become more engaged in their health care could usher in an era in which better health is just a click or tap away.