[HN Gopher] A university got itself banned from the Linux kernel...
___________________________________________________________________
A university got itself banned from the Linux kernel (2021)
https://archive.md/cBIzm
Author : italophil
Score : 135 points
Date : 2026-01-13 18:58 UTC (21 hours ago)
web link (www.theverge.com)
w3m dump (www.theverge.com)
| gnabgib wrote:
| (2021) Discussion at the time (3025 points, 1954 comments):
| https://news.ycombinator.com/item?id=26887670
| jovial_cavalier wrote:
| The authors were 100% in the right, and GKH was 100% in the
| wrong. It's very amusing to go back and read all of the
| commenters calling for the paper authors to face criminal
| prosecution. The fact is that they provided a valuable service
| and exposed a genuine issue with kernel development policies.
| Their work reflected poorly on kernel maintainers, and so those
| maintainers threw a hissy fit and brigaded the community
| against them.
|
| Also, banning umn.edu email addresses didn't even make sense,
| since the hypocrite commits were all from Gmail addresses.
| yjftsjthsd-h wrote:
| > Also, banning umn.edu email addresses didn't even make
| sense, since the hypocrite commits were all from Gmail
| addresses.
|
| The blanket ban was kicked off by another incident _after_
| the hypocrite commit incident.
| caycep wrote:
| I mean...there is a whole discussion about the questionable
| ethics of the research methods in the Verge article. And
| human-subjects and consent questions aside, they are also
| messing with a mission-critical system (the Linux kernel), and
| apparently left crappy code in there for all the maintainers
| to go back and weed out.
| jovial_cavalier wrote:
| 1) Once hypocrite commits were accepted, the authors would
| immediately retract them.
|
| 2) I don't think it's unethical to send someone an email
| that has bad code in it. You shouldn't need an IRB to send
| emails.
| wtallis wrote:
| > I don't think it's unethical to send someone an email
| that has bad code in it.
|
| It's unethical because of the bits you left out: sending
| code you _know_ is bad, and doing so under false
| pretenses.
|
| Whether or not you think this rises to the level of
| requiring IRB approval, surely you must be able to
| understand that wasting people's time like this is going
| to be viewed negatively by almost anyone. Some people
| might be willing to accept that doing this harm is worth
| it for the greater cause of the research, but that
| doesn't _erase_ the harm done.
| mmooss wrote:
| Sending bad code wastes time; investigating the security of
| the Linux patch-approval process is a good use of time.
| AlotOfReading wrote:
| 1) How did they hit stable then? [0]
|
| 2) Yes, emails absolutely need IRB sign-off too. If you
| emailed a bunch of people asking for their health info or
| sent out a survey, the IRB would smack you for doing
| unapproved human-subjects research without consent. Consent
| was _obviously_ not given here.
|
| [0] https://lore.kernel.org/linux-
| nfs/CADVatmNgU7t-Co84tSS6VW=3N...
| alphager wrote:
| Fun fact: one of the researchers removed any reference to this
| from their publications page: https://www-users.cse.umn.edu/~kjlu/
| gweinberg wrote:
| Yeah, given that it's been 5 years I would think there would be
| some followup.
| letmetweakit wrote:
| Imo the experiment was worthwhile: it exposed a risk, and
| hopefully the kernel is better armed against similar attacks now.
| knowitnone3 wrote:
| They retaliated against the entire university. I don't think
| they learned anything.
| imtringued wrote:
| What's the alternative to banning bad actors? Making Linux
| maintainers take every spam commit 100% seriously, as if it
| were legitimate? All that would accomplish is inviting a
| second "research project" about spamming commits to the
| Linux kernel to DDoS the maintainers.
| arjie wrote:
| The ultimate problem is that it's easy to fake stuff so you have
| to use heuristics to see who you can trust. You sort of sum up
| your threat score and then decide how much attention to apply.
| Without doing something like that, the transaction costs dominate
| and certain valuable things can't be done. It's true that Western
| universities are generally a positive component of that score, and
| students working under a professor there are another positive
| component.
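|
| As a sketch, that heuristic might look something like the
| following (all signals, weights, and thresholds here are
| invented for illustration; nothing below is anyone's actual
| policy):
|
|     from dataclasses import dataclass
|
|     @dataclass
|     class Contributor:
|         affiliated_with_university: bool   # e.g. a .edu address
|         vouched_by_known_maintainer: bool  # inside the trust boundary
|         accepted_patches: int              # prior track record
|
|     def trust_score(c: Contributor) -> int:
|         score = 0
|         if c.affiliated_with_university:
|             score += 2
|         if c.vouched_by_known_maintainer:
|             score += 5
|         score += min(c.accepted_patches, 10)  # cap the track record
|         return score
|
|     def review_depth(score: int) -> str:
|         # Spend scarce reviewer attention where trust is lowest.
|         if score >= 10:
|             return "light review"
|         if score >= 3:
|             return "normal review"
|         return "deep review, run analyzers, verify identity"
|
|     # A new university student with no track record scores a 2,
|     # which lands in the deepest review tier:
|     print(review_depth(trust_score(Contributor(True, False, 0))))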
|
| It's like if my wife said "I'm taking the car to get it washed"
| and then she actually takes the car to the junkyard and sells it.
| "Ha, you got fooled!". I mean, yes, obviously. She's on the
| inside of my trust boundary and I don't want to live a life where
| I'm actually operating in a way immune to this 'exploit'.
|
| I get that others object to the human experimentation part of
| things and so on, but for me that could be justified with a
| sufficiently high bar of utility. The problem is that this
| research is useless.
| jovial_cavalier wrote:
| No, random anonymous contributors with
| [email protected] as their email address are not as
| trustworthy as your wife, and blindly merging PRs from them
| into some of the most security-critical and widely used code in
| the entire world without so much as running a static analyzer
| is not reasonable.
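|
| (For what it's worth, the kernel tree has a hook for exactly
| that: "make C=1" reruns sparse over files being recompiled. A
| minimal sketch of gating a patch on it, assuming a configured
| tree with sparse installed; the wrapper script itself is
| invented:)
|
|     import subprocess
|     import sys
|
|     def sparse_diagnostics(tree: str) -> list[str]:
|         """Rebuild with 'make C=1' and collect sparse output."""
|         result = subprocess.run(
|             ["make", "C=1", "-j8"],
|             cwd=tree, capture_output=True, text=True,
|         )
|         return [line for line in result.stderr.splitlines()
|                 if "warning:" in line or "error:" in line]
|
|     if __name__ == "__main__":
|         found = sparse_diagnostics(sys.argv[1])  # path to the tree
|         for line in found:
|             print(line)
|         # Nonzero exit fails the gate when sparse complains.
|         sys.exit(1 if found else 0)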
| arjie wrote:
| Oh I misunderstood the sections in the article about the
| umn.edu email stuff. My mistake. The actual course of events:
|
| 1. Prof and students make fake identities
|
| 2. They submit these secret vulns to Greg KH and friends
|
| 3. Some of these patches are accepted
|
| 4. They intervene at this point and reveal that the patches
| are malicious
|
| 5. The patches are then not merged
|
| 6. This news comes out and Greg KH applies a big negative
| trust score to umn.edu
|
| 7. Some other student submits a buggy patch to Greg KH
|
| 8. Greg KH assumes that it is more research like this
|
| 9. Student calls it slander
|
| 10. Greg KH institutes policy for his tree that all umn.edu
| patches should be auto-rejected and begins reverts for all
| patches submitted in the past by such emails
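|
| (A minimal sketch of what the step-10 filter could look like,
| assuming patches arrive as plain RFC 2822 mail and the filter
| is written in Python; only the umn.edu domain comes from the
| actual events:)
|
|     from email import message_from_string
|     from email.utils import parseaddr
|
|     BANNED_DOMAINS = {"umn.edu"}
|
|     def should_auto_reject(raw_mail: str) -> bool:
|         msg = message_from_string(raw_mail)
|         _, addr = parseaddr(msg.get("From", ""))
|         domain = addr.rsplit("@", 1)[-1].lower()
|         # Match the domain itself and any subdomain (cs.umn.edu).
|         return any(domain == d or domain.endswith("." + d)
|                    for d in BANNED_DOMAINS)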
|
| To be honest, I can't imagine any other outcome. No one
| likes being cheated out of work that
| they did, especially when a lot of it is volunteer work. But
| I was wrong to say the research was useless. It does
| demonstrate that identities without provenance can get
| malicious code into the kernel.
|
| Perhaps what we really need is a Social Credit Score for OSS
| ;)
| yjftsjthsd-h wrote:
| > 3. Some of these patches are accepted
|
| > 4. They intervene at this point and reveal that the
| patches are malicious
|
| > 5. The patches are then not merged
|
| It's not clear to me that they revealed anything, just that
| they did fix the problems:
|
| > In their paper, Lu and Wu claimed that none of their bugs
| had actually made it to the Linux kernel -- in all of their
| test cases, they'd eventually pulled their bad patches and
| provided real ones. Kroah-Hartman, of the Linux Foundation,
| contests this -- he told The Verge that one patch from the
| study did make it into repositories, though he notes it
| didn't end up causing any harm.
|
| (I'm only working from this article, though, so feel free
| to correct me)
| jovial_cavalier wrote:
| I don't believe they revealed that these were hypocrite
| commits at the time of their acceptance; that was only
| revealed when the paper was put on a preprint server. But
| they did point out the problems to maintainers before the
| changes were mainlined.
| arjie wrote:
| You know, there's a lot of he-said-she-said here. The
| truth is that I was just repeating what they claimed in
| the paper: that they intervened prior to the merge to
| mainline.
| yjftsjthsd-h wrote:
| My point was that (the article claims that) they didn't
| "reveal that the patches are malicious" at that point.
| Revert yes, reveal no.
| arjie wrote:
| Man, what a mess.
| worthless-trash wrote:
| IIRC one of them actually introduced a memory-corruption
| bug. I don't know if it got accepted or not. I
| remember seeing the issue and rejecting the patch for
| RHEL.
| caycep wrote:
| Actually, I think #7 is one of the same students working
| for the professor. So GKH is correct in assuming it's more
| of the same.
|
| Research can be non-useless but also unethical at the same
| time...
| jovial_cavalier wrote:
| >No one likes being cheated out of work that they did,
| especially when a lot of it is volunteer work.
|
| You know what would really be wasteful of volunteer hours?
| Instituting a policy whereby the community has to trawl
| through 20 years of commits from umn.edu addresses and
| manually review them for vulnerabilities, even though you
| have no reasonable expectation that such commits are likely
| to contain malicious code and you're actually just
| butthurt. (They found nothing after weeks of doing this,
| btw.)
| dessimus wrote:
| But what if the next paper is about the bad patch they put
| in 15 years ago that still hasn't been noticed? UMN has
| created a situation that now calls into question everything
| that has been contributed by UMN, by showing bad faith in
| retroactively approving Lu's actions.
| yjftsjthsd-h wrote:
| > even though you have no reasonable expectation that
| such commits are likely to contain malicious code and
| you're actually just butthurt
|
| Other than the tiny bit where that's not true. An
| institution just demonstrated that they are willing to
| submit malicious code, and don't feel any need to tell
| you that they did so (even after the fact). It's
| perfectly reasonable to ask if they've done this before.
| imtringued wrote:
| That professor just destroyed the ability to trust public
| institutions like universities to not be malicious
| actors. You can't restore that trust unless you comb
| through everything. If you just let them go, you now have
| to distrust every single university by default, which is
| even more expensive.
| paultopia wrote:
| Woah, the thing that leapt out at me, as a professor, is that
| they somehow got an exemption from the UMN institutional review
| board. Uh, how?? It's clearly human subjects research under the
| conventional federal definition[1] and obviously posed a
| meaningful risk of harm, in addition to being conducted
| deceptively. Someone at that IRB must have been massively
| asleep at the wheel.
|
| [1] https://grants.nih.gov/policy-and-compliance/policy-
| topics/h...
| tptacek wrote:
| The whole story is a good example of why there are IRBs in the
| first place --- in any story _not_ about this Linux kernel
| fiasco people generally cast them as the bad guys.
| NetMageSCW wrote:
| Since this IRB approved the study, what good were they?
| margalabargala wrote:
| That person died in a car accident and they were wearing a
| seatbelt! Why would anyone wear a seatbelt? They are
| clearly useless.
| Consultant32452 wrote:
| If a lot of money is involved, it's only a matter of time
| before all oversight is corrupt. Similarly, you can
| safely assume all data that is on an important (big
| money) topic is fake.
| jujube3 wrote:
| But a lot of money was not involved here.
| stinkbeetle wrote:
| That seems like a bad faith reinterpretation of the
| context that the question was being asked in. The
| statement that the question pertained to was, "in any
| story not about this Linux kernel fiasco people generally
| cast them as the bad guys."
| elsjaako wrote:
| > That person died in a car accident and they were
| wearing a seatbelt! But in any story not about this car
| accident people generally cast them as useless.
|
| This story isn't evidence that IRBs are always useless,
| but also it's not an example of them being useful. The
| thing this story shows is they are sometimes useless.
| margalabargala wrote:
| Yeah, that's reasonable.
| advisedwang wrote:
| A _retroactive_ exemption!
| something765478 wrote:
| I think they should have gotten permission from the IRB ahead of
| time, but this doesn't sound like they were researching human
| subjects? They were studying the community behind the Linux
| kernel, and specifically the process for gatekeeping bad
| changes from making it to the kernel; they weren't
| experimenting on specific community members. Would you consider
| it human experimentation if I was running an experiment to see
| if I could get crappy products listed on Amazon, for example?
| firefax wrote:
| >I think they should have gotten permission from IRB ahead of
| time, but this doesn't sound like they were researching human
| subjects?
|
| I assure you that it falls under the IRB's purview -- I came
| into the thread intending to make the grandparent's comment. When
| using deception in a human subjects experiment, there is an
| additional level of rigor -- you usually need to debrief the
| participant about said deception, not wait for them to read
| about it in the press.
|
| (And if a human is reviewing these patches, then yes, it is
| human subjects research.)
| dessimus wrote:
| > Would you consider it human experimentation if I was
| running an experiment to see if I could get crappy products
| listed on Amazon, for example?
|
| Yes, if in the course of that experimentation, you also
| shipped potentially harmful products to buyers of those
| products "to see if Amazon actually let me".
| nearlyepic wrote:
| > they weren't experimenting on specific community members.
|
| Yes, they were. What kind of argument is this? If you submit
| a PR to the kernel you are explicitly engaging with the
| maintainer(s) of that part of the kernel. That's usually not
| more than half a dozen people. Seems pretty specific to me.
| fwip wrote:
| A community is made out of humans.
| lawejrj wrote:
| Maybe you're overestimating how much universities actually
| care about ethics and IRBs.
|
| I reported my advisor to university admin for gross safety
| violations, attempting to collect data on human subjects
| without any IRB oversight at all, falsifying data, and
| falsifying financial records. He brought his undergrad class
| into the lab one day and said we should collect data on them
| (low-hanging fruit!) with machinery that had just started
| working a few days prior; we hadn't even begun developing basic
| safety features for it, let alone discussed experimental design
| or requested IRB approval. We
| (grad students) cornered the professor as a group and told him
| that was wildly unacceptable, and he tried it multiple more
| times before we reported him to university admin. Admin ignored
| it completely. In the next year, we also reported him for
| falsifying data in journal papers and falsifying financial
| records related to research grants. And, oh yeah, assigning
| Chinese nationals to work on DoD-funded work that explicitly
| required US citizens and lying to the DoD about it. University
| completely ignored that too. And then he got tenure. I was in a
| Top-10-US grad program. So in my experience, as long as the
| endowment is growing, university admin doesn't care about much
| else.
| derbOac wrote:
| I've also had to deal with the IRB a lot as a professor. The
| retroactive application is extremely weird (although maybe
| better than nothing?).
|
| This seems like one of those situations that would usually
| require regular review to err on the side of caution if nothing
| else. It's worth pointing out there are exceptions though:
|
| https://grants.nih.gov/sites/default/files/exempt-human-subj...
|
| Generally those exceptions fall into "publicly observable
| behavior", which I guess I could see this falling into?
|
| It's ethically unjustifiable how the whole thing actually
| happened, but I guess I can see an IRB coming to an exemption
| decision. I would probably disagree with that decision but I
| could see how it would happen.
|
| In some weird legalistic sense I can also see an IRB exempting
| it because the study already happened and they couldn't do
| anything about it. It's such a weird thing to do and IRBs do
| weird things sometimes.
| amypetrik214 wrote:
| >I've also had to deal with the IRB a lot as a professor. The
| retroactive application is extremely weird (although maybe
| better than nothing?).
|
| I mean I feel like the IRB is mostly dealing with medical
| stuff. "I want to electrocute these students every week to
| see if it cures asthma". "No that's too much.. every other
| week at most". "Great I'll charge up the electrodes"
|
| So if a security researcher rolls in after the fact and says
| "umm yea so this has to do with nerd stuff, computers and
| kernels, no humans, and I just want it all to be super secure
| and nobody gets hacked, sound good" "ok sure we don't care if
| no people are involved and don't really understand that nerd
| stuff, but hackers bad and you're fighting hackers"
| jjmarr wrote:
| Any undergrad doing a survey at my university has to get
| IRB approval.
| harvey9 wrote:
| I can see the IRB giving retroactive approval solely because
| of political pressure. Which is why legitimate studies seek
| approval in advance.
| samgranieri wrote:
| This is retroactive ass-covering by the UMN IRB.
| knallfrosch wrote:
| Don't worry, the university investigated itself (again) and
| (again) found no wrongdoing. /s
| jruohonen wrote:
| > Woah, the thing that leapt out at me, as a professor, is that
| they somehow got an exemption from the UMN institutional review
| board. .... in addition to being conducted deceptively
|
| There are cases where deception (as they call it) can be
| approved (even by ethics boards). Based on the Verge article,
| this research setup should not have been approved even then.
| But the topic itself seems as relevant as ever, given the _xz_
| case and all.
| tdeck wrote:
| Right, that's something to discuss at the IRB review. But
| they didn't even do an IRB review before conducting the
| experiment. After the outcry, they went back to the IRB and
| said "was this OK?"
| jmclnx wrote:
| Did they ever get un-banned? IIRC, that university has/had a
| great Computer Science department.
|
| But there are always the BSDs.
| opan wrote:
| The Gopher protocol was made there!
| 9cb14c1ec0 wrote:
| The stupid thing about the experiment was that it's never been a
| secret that the kernel is vulnerable to malicious patches. The
| kernel community understood this long before these academics
| wasted kernel maintainer time with a silly experiment.
| hamstergene wrote:
| Agree; to me this "research" is like proving grocery stores are
| vulnerable to theft by sending students to shoplift. If the
| review process guaranteed that vulnerabilities couldn't pass,
| wouldn't that mean the current kernel should be pristinely
| devoid of them?
| aucisson_masque wrote:
| Well, I didn't know, and thanks to them now I do.
|
| I believe most people assume that the Linux kernel couldn't be
| compromised because there are multiple approval processes and
| highly professional people vetting patches.
|
| It seems like a big vulnerability: if a teaching assistant could
| do that, there is no doubt that government agencies can too.
| something765478 wrote:
| While I did see some problems with their approach (i.e. doing the
| IRB reviews retroactively instead of doing them ahead of time,
| and not properly disclosing the experiments afterwards), I think
| this research is valuable, and I don't think the authors were too
| unethical. The event this most reminds me of is the Sokal
| Squared scandal, where researchers sent bogus papers to journals
| in order to test those journals' peer-review standards.
| firefax wrote:
| >Then, there's the dicier issue of whether an experiment like
| this amounts to human experimentation. It doesn't, according to
| the University of Minnesota's Institutional Review Board. Lu and
| Wu applied for approval in response to the outcry, and they were
| granted a formal letter of exemption.
|
| I had to apply for exemptions often in grad school. You must do
| so _before_ performing the research -- it is not ethical to wait
| for outcry then apply after the fact. Any well run CS department
| trains it 's incoming students on IRB procedures during
| orientation, and Minnesota risks all federal funding if they
| continue to allow researchers to operate in this manner.
|
| (Also "exempt" usually refers to exempt from the more rigorous
| level of review used for medical experiments -- you still need to
| articulate _why_ your experiment is exempt to avoid people just
| doing whatever they want then asking for forgiveness after the
| fact)
| samgranieri wrote:
| I was honestly surprised the University of Minnesota didn't
| part ways with the teacher and students who performed this
| bullshit research.
|
| This level of malfeasance strikes me as something akin to
| plagiarism for a professional writer.
| cmxch wrote:
| A/B testing the insertion of vulnerable code is not a good idea.
| aetherspawn wrote:
| The uni should just donate to the Linux maintainers for damages -
| however much time was wasted - and move on its merry way.
|
| Money is money and buys time, no harm done, useful research
| conducted, and a whole lot of publicity gained.
| aucisson_masque wrote:
| > "If a sufficiently motivated, unscrupulous person can put
| themselves into a trusted position of updating critical software,
| there's honestly little that can be done to stop them," says
| White, the security researcher.
|
| That says a lot about Linux kernel safety.
| agent013 wrote:
| Wasting time like this just burns trust, and the reaction was
| entirely predictable.
| fennecfoxy wrote:
| Pretty ridiculous. If I send them an email with a stupid
| question, wasting their time on purpose just to see if they'll
| reply, is that "human experimentation"? What a loose definition.
|
| More to the point: are they salty because the author has
| arguably proved that it's entirely possible to get critical
| flaws into the Linux kernel with social engineering? How else is
| something like that meant to be tested?
|
| If you give them a heads-up, they'll pay more attention for a
| short time.
___________________________________________________________________