What these factors do not take into account, residents say, is exposure to patients with covid-19. That means the algorithm did not distinguish between staff who caught covid from patients and those who got it through community spread, a group that includes employees working remotely. And, as first reported by ProPublica, residents were told that because they rotate between departments rather than holding a single assignment, they lost out on the points associated with the departments where they worked.
The algorithm’s third category refers to the California Department of Public Health’s vaccine allocation guidelines, which treat exposure risk as the single most important factor in vaccine prioritization. The guidelines are intended primarily to help county and local governments decide how to prioritize the vaccine, rather than how a hospital should prioritize among its own departments, but they do specifically place residents, along with the departments where they work, in the highest-priority tier.
It may be that the “CDPH range” factor gives residents a higher score, but that this is still not enough to counteract the points awarded for the other criteria.
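To make the arithmetic concrete, here is a minimal sketch, in Python, of how a weighted point system like this can shortchange rotating staff. The category names, weights, and point values are assumptions for illustration only; Stanford has not published its actual formula.

```python
# Hypothetical weighted scoring sketch; all weights and point values are
# invented for illustration and do not come from Stanford's algorithm.

def allocation_score(age_points: float,
                     department_points: float,
                     cdph_tier_points: float) -> float:
    """Sum weighted category points; higher totals get vaccinated earlier."""
    weights = {"age": 1.0, "department": 1.0, "cdph": 0.5}  # assumed weights
    return (weights["age"] * age_points
            + weights["department"] * department_points
            + weights["cdph"] * cdph_tier_points)

# A rotating resident earns no fixed-department points, so even a top CDPH
# tier score may not offset the points an older, department-assigned
# employee accumulates in the other categories.
resident = allocation_score(age_points=1, department_points=0, cdph_tier_points=10)
senior_staff = allocation_score(age_points=8, department_points=6, cdph_tier_points=4)
print(resident, senior_staff)  # 6.0 vs 16.0: the resident ranks lower
```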
Stanford tried to factor in a lot more variables than other medical facilities, but Jeffrey Kahn, the director of the Johns Hopkins Berman Institute of Bioethics, says the approach was overcomplicated. “The more there are different weights for different things, it then becomes harder to understand, ‘why did they do it that way?'”
Kahn sat on Johns Hopkins’ 20-member committee on vaccine allocation, and says his university allocated vaccines based simply on job and risk of exposure to covid-19.
He says that decision was based on discussions that purposefully included a range of perspectives, including those of residents, and was made in coordination with other hospitals in Maryland. Elsewhere, the University of California San Francisco’s plan is based on a similar assessment of exposure risk. Mass General Brigham in Boston categorizes employees into four groups based on department and job location, according to an internal email reviewed by MIT Technology Review.
“It’s really important [for] any approach like this to be transparent and public…and not something really hard to figure out,” Kahn says. “There’s so little trust around so much related to the pandemic, we cannot squander it.”
Algorithms are commonly used in healthcare to rank patients by risk level in an effort to distribute care and resources more equitably. But the more variables used, the harder it is to assess whether the calculations might be flawed.
For example, in 2019, a study published in Science showed that an algorithm widely used to distribute care in the US ended up favoring white patients over Black ones. The problem, it turned out, was that the algorithm’s designers assumed that patients who spent more on health care were sicker and needed more help. In reality, higher spenders are also richer and more likely to be white. As a result, the algorithm allocated less care to Black patients who had the same medical conditions as white patients.
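A toy example, with made-up numbers, shows how such a proxy goes wrong: when a score is computed from spending rather than from medical need, two equally sick patients can end up with very different rankings.

```python
# Toy illustration of proxy bias; the numbers and field names are invented.
from dataclasses import dataclass

@dataclass
class Patient:
    name: str
    chronic_conditions: int   # a direct, if crude, measure of medical need
    annual_spending: float    # the proxy the flawed scoring relied on

def risk_by_spending(p: Patient) -> float:
    # Proxy-based score: assumes more spending means more sickness.
    return p.annual_spending / 1000.0

def risk_by_need(p: Patient) -> float:
    # Score based directly on condition count instead.
    return float(p.chronic_conditions)

a = Patient("patient_a", chronic_conditions=4, annual_spending=9000)
b = Patient("patient_b", chronic_conditions=4, annual_spending=4500)  # equally sick, spends less

print(risk_by_spending(a), risk_by_spending(b))  # 9.0 vs 4.5: proxy misranks them
print(risk_by_need(a), risk_by_need(b))          # 4.0 vs 4.0: equal need
```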
Irene Chen, an MIT doctoral candidate who studies the use of fair algorithms in healthcare, suspects this is what happened at Stanford: the formula’s designers chose variables that they believed would serve as good proxies for a given staffer’s level of covid risk. But they didn’t verify that these proxies led to sensible outcomes, or respond in a meaningful way to the community’s input when the vaccine plan came to light on Tuesday last week. “It’s not a bad thing that people had thoughts about it afterward,” says Chen. “It’s that there wasn’t a mechanism to fix it.”
After the protests, Stanford issued a formal apology, saying it would revise its distribution plan.
Hospital representatives did not respond to questions about who would be included in the new planning process, or whether the algorithm will continue to be used. An internal email summarizing the medical school’s response, shared with MIT Technology Review, states that neither program heads, department chairs, attending physicians, nor nursing staff were involved in the original algorithm design. Now, however, some faculty are pushing for a bigger role: scrapping the algorithm’s results entirely and instead giving division chiefs and chairs the authority to make decisions for their own teams.
Other department chairs have encouraged residents to get vaccinated first. Some have even asked faculty to bring residents with them when they get vaccinated, or to delay their own shots so that others can go first.
Some residents are bypassing the university healthcare system entirely. Nuriel Moghavem, a neurology resident who was the first to publicize the problems at Stanford, tweeted on Friday afternoon that he had finally received his vaccine, not at Stanford but at a public county hospital in Santa Clara County.
“I got vaccinated today to protect myself, my family, and my patients,” he tweeted. “But I only had the opportunity because my public county hospital believes that residents are critical front-line providers. Grateful.”