Can I ask you all for a quick review of this idea?
Threat Model and Attack Scenario:
PC members A and B agree not to mark each other as a conflict, to write each other favorable reviews, and to fight for each other's papers.
Proposed Mitigation:
Each PC member can only see the papers assigned to them, and reviews and reviewer identities are anonymized within the PC.
Conclusion:
Our approach solves the problem. PC members A and B can no longer write each other favorable reviews or fight for each other's papers.
Hoping for typical reviews from security folks here that identify any weaknesses or potential side effects ;)
=> More informations about this toot | More toots from lavados@infosec.exchange
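For readers outside the thread, here is a minimal toy sketch of what the mitigation is assumed to look like, based on the description above and the clarifications later in the thread. All names and data structures are made up for illustration; this is not any real conference system.

```python
# Toy model of the assumed mitigation: each PC member sees only the papers
# assigned to them, and reviewer identities are hidden from other reviewers.
from dataclasses import dataclass, field

@dataclass
class Paper:
    paper_id: int
    assigned_reviewers: set = field(default_factory=set)  # real names, kept server-side only
    reviews: dict = field(default_factory=dict)           # real name -> review text

def visible_papers(papers, member):
    """A PC member can list only the papers they are assigned to."""
    return [p.paper_id for p in papers if member in p.assigned_reviewers]

def visible_reviews(paper, member):
    """Assigned reviewers see review texts, but never the reviewers' names."""
    if member not in paper.assigned_reviewers:
        raise PermissionError("not assigned to this paper")
    return [f"Reviewer {i + 1}: {text}" for i, text in enumerate(paper.reviews.values())]
```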
@lavados what is "PC" here? Can this mitigation be applied to any pair of members, or does it rely on an authority singling out A and B specifically? And if so, is there a good reason to not just declare them abusers and ban them altogether?
=> More informations about this toot | More toots from siguza@infosec.space
@siguza the program committee. We're trying to mitigate potential collusion without knowing who is colluding. The mitigation is simply applied to all PC members, as anyone could be colluding.
=> More informations about this toot | More toots from lavados@infosec.exchange
@lavados so in short, anyone can only see the papers they're assigned to, and they can't see any names?
=> More informations about this toot | More toots from siguza@infosec.space
@siguza yes, that's the idea.
=> More informations about this toot | More toots from lavados@infosec.exchange
@lavados ok, but A and B can still share with each other which papers are theirs out-of-band (and there is no way to prevent this). Is it possible to get assigned to a certain paper if you have some pre-shared info about it, or is the process completely random? Can the process be brute-forced until one ends up assigned to a chosen paper?
=> More informations about this toot | More toots from siguza@infosec.space
@siguza sure, they can have pre-shared info. Brute-forcing doesn't work, though. There's a bit of luck involved for them to get assigned.
=> More informations about this toot | More toots from lavados@infosec.exchange
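To put a rough number on that "bit of luck" (illustrative figures only, not taken from the thread): under purely random assignment, the chance that A lands on one specific paper of B's is small but not negligible.

```python
# Back-of-the-envelope: if each paper gets k reviewers drawn roughly uniformly
# from n eligible PC members, P(A is assigned B's specific paper) is about k/n.
# The values of n, k and m below are made-up examples.
n = 100   # eligible PC members for B's paper
k = 3     # reviewers per paper
m = 4     # papers B submitted

p_single = k / n
p_any = 1 - (1 - p_single) ** m   # treating assignments as independent (approximation)

print(f"P(A gets B's specific paper)       ~ {p_single:.1%}")   # ~3.0%
print(f"P(A gets at least one of B's {m})   ~ {p_any:.1%}")      # ~11.5%
```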
@lavados I think removing access/visibility will just lead to other side effects, like a program chair with insider information (i.e. the mapping of reviewer ID to actual name) suddenly becoming a person of interest (to hack, or simply to suspect of interference). Lack of knowledge creates new grounds for mistrust.
It also sidesteps the core issue once again. Publish all reviews publicly and we will see who has actual pro/contra arguments instead of "favorable statements"! But the academic industry just does not care about that.
#academia #transparency
=> More informations about this toot | More toots from meisterluk@mathstodon.xyz
@lavados does the conference include paper bidding? If so, what prevents A and B from bidding for each other’s papers?
Still, highly in favor of the mitigation.
=> More informations about this toot | More toots from murgi@infosec.exchange
@murgi Yes, there is paper bidding. And yes, A and B can bid on each other's papers and are likely to get them assigned.
But the problem is solved, because no other reviewer can see that A and B are reviewing each other's papers :)
=> More informations about this toot | More toots from lavados@infosec.exchange
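A tiny sketch of why bidding defeats the randomness here (a toy assignment rule, not how any specific conference system works): if papers simply go to the highest bidders, a colluder who max-bids on a known paper almost always gets it.

```python
# Toy bid-driven assignment: each paper goes to its k highest bidders.
# "A" knows which paper is B's (shared out-of-band) and bids the maximum on it.
def assign(bids, k):
    """bids: PC member -> bid score for one paper; returns the k assignees."""
    return sorted(bids, key=bids.get, reverse=True)[:k]

bids_for_Bs_paper = {"A": 3, "C": 1, "D": 0, "E": 2}   # scores are made up
print(assign(bids_for_Bs_paper, k=2))                   # ['A', 'E'] -> A gets the paper
```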
@lavados Assumptions: (1) you still allow bidding (as claimed later); (2) the second mitigation means that all review(er)s are anonymized (not just A and B). I don't think this helps: what stops A from bidding for B's paper, and fighting for it? A can still write favorable reviews, A can still fight; the only thing is that other reviewers now no longer know who's fighting for it?
This toot needs at least a major revision with better argumentation.
Score: 2. Weak reject
Expertise: 3. Knowledgeable
=> More informations about this toot | More toots from bartcopp@mastodon.social
@bartcopp Rebuttal:
We believe, however, that the reviewer has overlooked the subtleties of our design. In fact, we have proof that the system can be deployed as-is, because it is already deployed at conferences in practice. ;)
Also, at least it doesn't make things worse, right?
=> More informations about this toot | More toots from lavados@infosec.exchange
@lavados It might actually make things worse: other reviewers who might be aware of the conflict can no longer notice the conflicting paper assignment and report it to the chairs. (I'm not sure this would actually happen in practice if one assumes the bystander effect, but I'd be happy to be proven wrong.)
While I now see the subtleties of the design and why the authors are proposing it, unfortunately the rebuttal made it clear that this technique is not novel ;)
Score: 1. Reject
=> More informations about this toot | More toots from bartcopp@mastodon.social
@bartcopp oh damn it, the classic "no novelty" reject :D
=> More informations about this toot | More toots from lavados@infosec.exchange
@lavados Rebuttals, not even once, right? ;) (The alternative would have been the minor revision/shepherding to make sure you cite this definitely very relevant paper that just happens to be co-authored by a random member of the PC ;) )
=> More informations about this toot | More toots from bartcopp@mastodon.social
@bartcopp maybe a different direction would be better... making conflicts of interest completely visible within the PC (so that other PC members can check and see if something is missing?)
=> More informations about this toot | More toots from lavados@infosec.exchange
@lavados That's definitely an interesting idea. I'm not sure how well it would scale to the extremely large PCs... But then again, if only the PC chairs can check them, that definitely doesn't scale.
=> More informations about this toot | More toots from bartcopp@mastodon.social
@bartcopp I would go for all PC members (chairs can already see the conflicts anyway). What is secret about a conflict of interest?
=> More informations about this toot | More toots from lavados@infosec.exchange
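A sketch of what that could look like (made-up data structures, just to illustrate PC-wide conflict visibility): every member can read the declared conflict list, and anyone can flag a pairing that looks like a missing conflict to the chairs.

```python
# Sketch of PC-wide conflict visibility: the declared conflicts are readable
# by every PC member, and anyone can flag a suspected undeclared conflict.
from collections import defaultdict

declared_conflicts = {frozenset({"A", "some_author"})}   # example data
flags = defaultdict(list)

def flag_missing_conflict(reporter, reviewer, author):
    """Any PC member can report a suspected undeclared conflict to the chairs."""
    if frozenset({reviewer, author}) not in declared_conflicts:
        flags[reviewer].append(f"{reporter} suspects an undeclared conflict with {author}")

flag_missing_conflict("C", reviewer="A", author="B")
print(dict(flags))   # {'A': ['C suspects an undeclared conflict with B']}
```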
@lavados Yes, that's how I understood it, and it makes sense to me :) I agree that conflicts of interest in principle shouldn't be secret. One exception I can imagine is people who otherwise would have no publicly known (valid) conflict but are having a secret affair...? (But that can be dealt with in an ad-hoc fashion, I would argue.)
=> More informations about this toot | More toots from bartcopp@mastodon.social