Stanford Hospital staff protesting the decision by higher ups to give vaccines to admins at home from r/pics
“There is an enormous demonstration going on at Stanford Hospital right now carried out by staff, who are protesting the decision by higher ups to give vaccines to some administrators and physicians who are at home and not in contact with patients INSTEAD of frontline workers.” Twitter
Only Seven of Stanford’s First 5,000 Vaccines Were Designated for Medical Residents. Stanford Medicine officials relied on a faulty algorithm to determine who should get vaccinated first, and it prioritized some high-ranking doctors over patient-facing medical residents.
Algorithm issue my ass.
You prioritized age, and in doing so seniority; you didn't prioritize frontline work, and you didn't deprioritize those working from home.
It doesn’t matter if you asked a computer to then run the numbers, you set the rules.
The sentence should be “Hospital administration did not prioritize frontline workers but instead accounted for seniority in distributing the vaccine. As a result, only 7 of the first 5,000 vaccines for staff will go to frontline workers. These results were accepted without further scrutiny or adjustment by the administrators in charge of doing so.”
“The algorithm did it” is increasingly an excuse used for shitty management decisions.
Yup. Algorithms are created by people. The correct phrasing is “the algorithm was written to do it.”
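That point — the computer only runs the numbers; humans choose the weights — can be sketched with a toy scoring rule. Everything here (names, weights, the scoring formula) is invented for illustration and is not Stanford's actual algorithm; it just shows how weighting age over patient contact mechanically pushes young frontline residents to the bottom of the list.

```python
# Hypothetical illustration, NOT the real Stanford algorithm: the weights
# below are a human policy choice, even though a computer does the ranking.
from dataclasses import dataclass

@dataclass
class Employee:
    name: str
    age: int
    patient_facing: bool  # currently in contact with patients?

# Policy knobs set by people (assumed values for illustration):
AGE_WEIGHT = 1.0          # one point per year of age
FRONTLINE_WEIGHT = 10.0   # only a small bonus for patient contact

def priority_score(e: Employee) -> float:
    # Age dominates the score; frontline status barely moves it.
    return AGE_WEIGHT * e.age + FRONTLINE_WEIGHT * e.patient_facing

staff = [
    Employee("senior administrator (working from home)", 62, False),
    Employee("attending physician", 55, True),
    Employee("medical resident", 29, True),
]

# Highest score first: the young, patient-facing resident ranks last.
ranked = sorted(staff, key=priority_score, reverse=True)
for e in ranked:
    print(f"{priority_score(e):5.1f}  {e.name}")
```

With these assumed weights the work-from-home administrator (score 62.0) outranks the patient-facing resident (score 39.0) — an outcome fully determined by the rule-setters, not by anything the computer decided on its own.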
She doesn’t remember exactly when she realized that some eligibility decisions were being made by algorithms. But when that transition first started happening, it was rarely obvious. Once, she was representing an elderly, disabled client who had inexplicably been cut off from her Medicaid-funded home health-care assistance. “We couldn’t find out why,” Gilman remembers. “She was getting sicker, and normally if you get sicker, you get more hours, not less.”
Not until they were standing in the courtroom in the middle of a hearing did the witness representing the state reveal that the government had just adopted a new algorithm. The witness, a nurse, couldn’t explain anything about it. “Of course not—they bought it off the shelf,” Gilman says. “She’s a nurse, not a computer scientist. She couldn’t answer what factors go into it. How is it weighted? What are the outcomes that you’re looking for? So there I am with my student attorney, who’s in my clinic with me, and it’s like, ‘Oh, am I going to cross-examine an algorithm?’”
The coming war on the hidden algorithms that trap people in poverty
A growing group of lawyers are uncovering, navigating, and fighting the automated systems that deny the poor housing, jobs, and basic services.
MIT Technology Review