I wonder whether the exact phrasing varies from the source, and even then whether "consultation partners" is doing the heavy lifting there. If it was something like "useful consultation partners", I can absolutely see the value as an extra opinion that is easy to override. "Oh yeah, I hadn't thought about that option - I'll look into it further."
I imagine we're talking about it as an extra resource rather than trusting it as final in a life or death decision.
> I imagine we're talking about it as an extra resource rather than trusting it
> as final in a life or death decision.
I'd like to think so. Trust is also one of those non-concrete terms that mean different things to different people. I'd like to think that doctors use their own judgement when weighing the output of their trained models; I just wonder how long it will be until the model becomes the default judgement once humans get lazy.
I think that's a fair assessment of trust as a term, and of incorporating the output via personal judgement. If this were any public story, I'd also factor in breathless reporting about new tech.
Black-box decisions I absolutely have a problem with. But an extra resource, considered by people who understand the risks, is fine by me. Like I've said in other comments, I understand what it is and isn't good at, and have a great time using ChatGPT for feedback or planning or extrapolating or brainstorming. I automatically filter out the "Good point! This is a fantastic idea..." response it inevitably starts with...
Because LLMs, with something like a 20% hallucination rate, are more reliable than overworked, tired doctors who can spend only an ounce of their brainpower on the patient they're currently helping?
In fact, the phenomenon of pseudo-intelligence scares those who were hoping for tools that would limit the original problem, not potentially amplify it.
The claim seems plausible because it doesn't say there was any formal evaluation, just that some doctors (who may or may not understand how LLMs work) hold an opinion.
Code created by LLMs doesn't compile: hallucinated APIs, invalid syntax, completely broken logic. Why would you trust it with someone's life?!