AI cancer tools risk “shortcut learning” rather than detecting true biology
(www.eurekalert.org)
There’s a story about how researchers were training a model to detect tanks and thought it worked great… and then they found out it had learned that particular types of trees correlated with the type of tank in the training data, so the AI just looked at the trees and based its answer on that.
There was actually a research paper where they showed they could predict whether people drink wine or beer with a CNN that looks at knee X-rays. It turned out that part of their data came from beer-drinking regions and another part from wine-drinking regions, and that the images were ever so slightly distorted depending on which physical machine they were made with.
The paper pointed that out; its whole purpose was to show how much bullshit you can get out of AI if you are not careful what you train it with.
There were many other examples in the paper of things they could “predict” just from people’s knee X-rays, all with non-medical explanations such as the one above.
That’s fantastic, thanks for sharing!
There was something similar with a cancer-detecting model somewhat recently. It was trained on a bunch of scans, both positive and negative. However, positive scans had to be signed off by a doctor, so the model just ended up looking for the signatures.
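All three stories are the same failure mode: the label leaks into the input through a side channel, and the model takes the shortcut. You can reproduce it in a few lines of NumPy. This is just a toy sketch with synthetic data; the “artifact” feature is my invented stand-in for the signature/scanner tell, not anything from the actual papers:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Weak "real" signal with a noisy label (stand-in for actual biology)
signal = rng.normal(0.0, 1.0, n)
y = (signal + rng.normal(0.0, 2.0, n) > 0).astype(float)

# Spurious feature that leaks the label in the training data
# (my stand-in for signatures / scanner distortion)
artifact = y + rng.normal(0.0, 0.1, n)
X = np.column_stack([signal, artifact])

# Plain logistic regression fit by gradient descent
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= (X.T @ (p - y)) / n
    b -= (p - y).mean()

# New data where the leak is gone: the artifact no longer tracks the label
signal_t = rng.normal(0.0, 1.0, n)
y_t = (signal_t + rng.normal(0.0, 2.0, n) > 0).astype(float)
artifact_t = rng.normal(0.5, 0.5, n)
X_t = np.column_stack([signal_t, artifact_t])

def accuracy(X, y):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    return ((p > 0.5) == (y > 0.5)).mean()

acc_train, acc_test = accuracy(X, y), accuracy(X_t, y_t)
print(f"train accuracy: {acc_train:.2f}")  # high: the model rides the artifact
print(f"test accuracy:  {acc_test:.2f}")   # drops toward chance once the leak is gone
```

Train accuracy looks great, then collapses on data where the confound no longer correlates with the label. That gap is exactly what external validation on data from a different site or scanner is supposed to catch.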