II. Corona-Cup 2020 Rules

For discussing go rule sets and rule theory
Post by John Fairbairn »

I strongly believe the organisers are oblivious to our posts here
I'm sure you're right. Their loss? At any rate, this too speaks of L19's rapidly fading place in the go world.
Welcome to the ivory tower!
With this lockdown, my hair is growing long enough to do a Rapunzel and escape. But to what? Do I want to escape...
Post by Javaness2 »

When trying to establish which rules apply in EGF tournaments, it is always possible to clock up a fair flurry of posts quickly, but that doesn't make this a very crucial issue.
Post by NordicGoDojo »

My apologies, it has been some time since I last checked L19.

For anybody who missed my announcement on nordicgodojo.eu (apparently everyone), this article describes the anti-cheating tools in use to some degree. For now there is no exact description, mainly because the tools are still a work in progress.

When the above model brings a suspicious game to my attention, the anti-cheating team analyses the game in detail, judging whether a player of the given level should be able to perform as well as shown. If the team unanimously decides that a player is suspicious and most likely used an AI for help, the team asks the player whether they have video footage to present as counter-evidence. If they don't, or if the counter-evidence is not enough to show that no cheating happened (for example, footage that does not clearly show the player's hands or screen), the player is disqualified. We recognise that this may result in innocent participants being disqualified, since we are working from 'mere probabilities', but so far this is the best solution we have found.

Anecdotally, I have tested the model on a semi-random collection of 20 games, 5 of which were human v. human, 5 AI v. AI, and 10 AI v. human (in which the AI was always AlphaGo). In this test, I got a hit rate of 87.5%; there was one false positive, Ke Jie's masterpiece against AlphaGo.
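For illustration only, a much-simplified version of the move-matching idea behind such detectors can be sketched in a few lines. This is my own toy sketch, not the model discussed in this thread (which is not public); the function names, the baseline match rate, and the flagging threshold are all invented:

```python
# Toy sketch of AI-match flagging. NOT the actual anti-cheating model
# discussed here; the baseline and threshold numbers are invented.

def top_k_match_rate(player_moves, engine_top_moves, k=3):
    """Fraction of the player's moves that appear in the engine's top-k choices.

    player_moves: list of moves, e.g. ["Q16", "D4", ...]
    engine_top_moves: list of lists; engine_top_moves[i] holds the engine's
    ranked candidate moves for position i.
    """
    if not player_moves:
        return 0.0
    hits = sum(
        move in candidates[:k]
        for move, candidates in zip(player_moves, engine_top_moves)
    )
    return hits / len(player_moves)

def is_suspicious(match_rate, baseline=0.45, threshold=0.25):
    """Flag a game when the match rate exceeds a typical human baseline
    by more than a margin (both numbers invented for illustration)."""
    return match_rate - baseline > threshold

rate = top_k_match_rate(["Q16", "D4", "C3"],
                        [["Q16", "R16"], ["D4"], ["D3", "C4"]])
# rate == 2/3: two of the three moves appear in the engine's top-3 lists
```

A real detector would of course need far more than raw match rates — which is precisely the point NordicGoDojo makes later in the thread about tracking general performance rather than move convergence.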
Post by RobertJasiek »

I will read the paper later; for now I only discuss the procedural aspects of using the tools, irrespective of their quality.


You describe the following process:

1) Software suggests a suspicious game.

2) The anti-cheating team then analyses the game and judges if a player is still suspicious.

3) The player may provide counter-evidence, which can be a video recording or something else.

4) The anti-cheating team judges whether the player is deemed likely to have cheated and, if so, disqualifies them.


What is good about this process is that human arbiters override software. However, the process should be improved as follows:

- Human arbiters should have at least the same power as the software in step (1); that is, they should also be able to suggest initially suspicious games.

- Step (2) is described as depending on a player's given level (such as rank or rating). However, this involves prejudice because it overlooks the possibility that a player may have learnt a great deal from AI before the game and therefore play similarly to the AI on many moves. Furthermore, a player can have a particular strength, such as the endgame, where good play often results in many moves identical to the AI's.

- In step (3), there is too little description of what I have called the "something else" evidence. A player can, for example, provide counter-evidence by explaining his thinking and decision-making in as much detail as the dispute schedule allows. Such evidence can be very strong but is not properly mentioned in the description of the process.

- Step (4) pretends that the anti-cheating team is the only arbitration body. It is not. There are also the arbitration bodies "referee" and "appeals committee" and, if the EGF General Tournament Rules apply (I cannot know, because this has not been clarified yet), the "EGF [Tournaments and] Rules Commission". The relations between the anti-cheating team and the other arbitration bodies are unclear: with each other, in relation to the EGF General Tournament Rules (if they apply), and in relation to the player's right to a fair trial (he has a right to know in advance which arbitration body decides at which procedural stage, and why that is, if applicable, in accordance with the EGF General Tournament Rules, which do not refer to an anti-cheating team at all).

- See also my earlier remarks on open decision-making and impact on player reputation.


The tournament announcement speaks of "state-of-the-art anti-cheating tools". I do not understand why the mentioned software tools should be state of the art. It would be easier to understand if they were just described as "whatever tools the anti-cheating team wants to use". Furthermore, an earlier claim was made that such tools have identified many cheaters, but I see no evidence for that claim, and in particular none for the tools as applied to go.
Post by NordicGoDojo »

RobertJasiek wrote:However, the process should be improved as follows:

- Human arbiters should have at least the same power as software in step (1), that is, also have the possibility to suggest initially suspicious games.
This is exactly how the process works. We generate graphs only for games that are brought to our attention one way or another. I then analyse the graphs to see whether the game looks even the tiniest bit suspicious, and if it does, I analyse the game itself. Su Yang does not operate in the same fashion.
RobertJasiek wrote:Step (2) is described to depend on a player's given level (such as rank or rating). However, this involves prejudice because it overlooks the possibility that a player can have learnt very much from AI before the game and therefore play similar to AI on many moves. Furthermore, a player can have a particular strength, such as the endgame, where good play can often result in many same moves by AI and the player.
This applies very little to our model. We only compare the convergence of a player's moves with the AI's at a very late stage of the analysis, at which point we already have an idea of whether cheating has happened. More important are metrics showing a player's general performance, regardless of whether the player's chosen move is the AI's first, third, or tenth choice.
RobertJasiek wrote:In step (3), there is too little description of the, what I have called, "something else" evidence. A player can, e.g., provide counter-evidence by explaining his thinking and decision-making as detailed as time allows him in a dispute schedule. Such evidence can be very strong but is not properly mentioned in the description of the process.
We are in general very happy to hear any counter-evidence that the suspect can provide, but, in my experience, a player's verbal 'evidence' is worth much less than the information that our graphs give. In the first Corona Cup, we let several players 'off the hook' after listening to their explanations, and by this point two of them have been confirmed to have cheated. This is why we recommend video footage as the players' 'protection'.
RobertJasiek wrote:Step (4) pretends that the anti-cheating team would be the only arbitration body. It is not. There are also the arbitration bodies "referee" and "appeals committee" and, if the EGF General Tournament Rules should apply (I cannot know because this has not been clarified yet), the "EGF [Tournaments and] Rules Commission". The relation between the anti-cheating team and the other arbitration bodies are unclear with each other, in relation to the EGF General Tournament Rules (if they apply) and in relation to the player's right to a fair trial (he has a right to know in advance which arbitration body decides at which procedural order and why that, if applying, is according to the EGF General Tournament Rules, which do not refer to an anti-cheating team at all).
I cannot account for all the details, as I am not the main tournament organiser. However, I can mention that we are currently handling the first cheating case of the tournament, for which the procedure has been as follows:
  • We (the anti-cheating team) receive a cheating accusation.
  • We analyse the game in question and (this time) unanimously find it to be suspicious.
  • We contact the player, tell them of our suspicion, and ask for any possible counter-evidence they may have.
  • Upon receiving no counter-evidence necessitating rejudgement, we then check with the tournament referee that he accepts our judgement.
  • Upon receiving the referee's approval, we tell the player of our decision and briefly explain how we came to find their play suspicious, and instruct them to contact the appeals committee if they disagree with the decision.
The above procedure may not unfold exactly the same way each time, because it has not been set in stone. Still, in general, players can expect roughly this level of communication from the organisers and this degree of opportunity to influence the outcome.
RobertJasiek wrote:See also my earlier remarks on open decision-making and impact on player reputation.
I am not the main tournament organiser, so I cannot comment on this. However, in general I will try to influence the procedure so that any reputation damage to participants is minimised.
RobertJasiek wrote:The tournament announcement speaks of "state-of-the-art anti-cheating tools". I do not understand why the mentioned software tools should be the state-of-the-art. It would be easier to understand if they were just described as "whatever tools the anti-cheating team wants to use". Furthermore, an earlier claim was made that such tools have identified many cheaters, but I see no evidence for that claim and in particular none for the tools applied to go.
I do not intend to discuss the semantics of 'state-of-the-art'. To the best of my knowledge, no current go anti-cheating tools can compete with the level of analysis and precision of my graphing tools combined with my experience. For example, Yike's Hawkeye is, from what I know, a much simpler and more error-prone solution whose main benefit is that it is automatic. I have worked on the model for almost a year, constantly improving it, and I will continue improving it from now on too.

How should I show that my tools have identified many cheaters while protecting their reputation?

In addition to the first Corona Cup and a larger number of non-tournament online games, my model has also been used in the 15th Korea Prime Minister Cup and for example the Canada Open Online tournament. There were no cheating cases in the KPMC, but you can ask the organisers of the Corona Cup and the Canada Open Online tournament (the webpage has an email address) if you want evidence of the model working.
Post by RobertJasiek »

Thank you for your replies! I have a few questions about some aspects of your last reply.

Since you have worked on your tool for a long time, I wonder whether you do so as part of some university work or purely as a hobby?

Maybe it will become clear from your paper but what do you refer to as "a player's general performance"?

During the first Corona Cup, how have players been confirmed to have cheated? Or at which URL can we read about that?

You can protect reputations while reporting about detected cheaters by giving the players aliases, such as Player000001, and mentioning tournament and date of judgement.
Post by NordicGoDojo »

RobertJasiek wrote:Since you have worked for a long time on your tool, I wonder whether you do so as part of some task at university or is it purely your hobby?
At this point, the answer is sort of 'both'. Personally, I am not affiliated with any university, but the Associate Professor I am collaborating with is (you can read a bit more about him in the paper).
RobertJasiek wrote:Maybe it will become clear from your paper but what do you refer to as "a player's general performance"?
Most importantly, I am tracking the effect of a player's moves on their winrate and estimated score lead. From the latter, we can further track the size of a player's average mistake and how it develops throughout the game. Comparing this with the development of the winrate already gives a lot of information, and there are a few more experimental metrics in use that I will keep secret for now.
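As a rough illustration of the 'average mistake' idea (my own sketch; the model's actual metrics are not public), given an engine's estimated score lead before and after each of a player's moves, one can compute per-move point losses and average them over the game. The function names and numbers below are invented:

```python
# Hypothetical sketch of an "average mistake" metric derived from
# score-lead estimates. The real model's metrics are not public.

def move_losses(score_leads):
    """Per-move point loss for one player.

    score_leads: engine-estimated score lead (from this player's
    perspective) before and after each of their moves, as
    (before, after) pairs. A perfect move loses 0 points.
    """
    return [max(0.0, before - after) for before, after in score_leads]

def average_mistake(score_leads):
    """Mean point loss per move; 0.0 for an empty game record."""
    losses = move_losses(score_leads)
    return sum(losses) / len(losses) if losses else 0.0

# Example: three moves losing 0.5, 0.0 and 2.0 points respectively
leads = [(7.0, 6.5), (6.5, 6.5), (6.5, 4.5)]
avg = average_mistake(leads)  # 2.5 / 3, roughly 0.83 points per move
```

Tracking how such an average develops over the course of a game, and comparing it against the winrate curve, is the kind of combined signal described above.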
RobertJasiek wrote:During the first Corona Cup, how have players been confirmed to have cheated? Or at which URL can we read about that?
Nowhere – and because of the topic's sensitivity, I will not give any further details.
RobertJasiek wrote:You can protect reputations while reporting about detected cheaters by giving the players aliases, such as Player000001, and mentioning tournament and date of judgement.
So, to superficially anonymise the cases while still making it possible to ferret some of them out? In my opinion, the risk of the cheaters' identities being figured out far outweighs any possible PR gain for the model.
Post by RobertJasiek »

There are different uses of public versus alias names for (alleged) cheaters. Depending on the use, the best choice might be one of: a) public name; b) alias name but with tournament and date stated; c) alias name without further information, provided care is taken not to count any case more than once; d) statistical summaries. The less specific the information, the less credit it might get and the more reputation is preserved.
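Option (b) amounts to keeping a stable alias table. A minimal sketch follows; the class name and mechanics are illustrative only, though the 'Player000001' format follows the suggestion above:

```python
# Illustrative alias table for option (b): stable pseudonyms, with
# tournament and judgement date kept separately by the organisers.

class AliasTable:
    """Maps real names to stable, sequential pseudonyms."""

    def __init__(self):
        self._aliases = {}

    def alias(self, real_name):
        """Return a stable alias like 'Player000001' for a real name."""
        if real_name not in self._aliases:
            # Zero-padded counter keeps aliases uniform and unordered
            # with respect to the underlying identities.
            self._aliases[real_name] = f"Player{len(self._aliases) + 1:06d}"
        return self._aliases[real_name]

table = AliasTable()
table.alias("Alice")  # "Player000001"
table.alias("Bob")    # "Player000002"
table.alias("Alice")  # "Player000001" again (stable)
```

As NordicGoDojo notes above, even this weak anonymisation may be traceable when the tournament and date are also published, which is the trade-off between options (b) and (c).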
Post by lightvector »

Just to chime in here as a bystander: I'd like to say that I've generally been impressed by @NordicGoDojo's level of thoughtfulness and care in the go community. For this particular issue, the line being walked between transparency about the general approaches and methods and willingness to engage with the community, versus maintaining privacy and pushing back on giving too much detail about any specific people, players, or instances, seems not unreasonable to me. I don't imagine this is always an easy line to walk, and with the wide range of opinions on the internet it can be easy to draw criticism, so anyway, thanks for doing this work.
Post by RobertJasiek »

The paper is discussed here: https://www.lifein19x19.com/viewtopic.p ... 9&start=10
Conclusion: the theory of the software tools is far from ready for distinguishing cheating from no cheating, and judgement about cheating is prejudiced depending on players and towards finding alleged cheating.
Post by SoDesuNe »

RobertJasiek's conclusion: the theory of the software tools is far from ready for distinguishing cheating from no cheating, and judgement about cheating is prejudiced depending on players and towards finding alleged cheating.

; )
Post by jlt »

Read in the Discord chat of the Corona Cup:

Lukan: there has been exactly one proved case of AI use so far.

Antti: If I were to nitpick, 'proved' is not the correct word here – but we found it so probable that we went ahead and disqualified the player.