I've added a missing anchor in the EN page; please send it for translation to propagate the correction. Thank you.
Talk:Code of Conduct
send to translation
Sorry, no progress since my last update of 5 May:
The anchor is to be declared in the English page, not the French one!
Text from the Contributor Covenant is licensed under a CC BY 4.0 license, rather than the MIT license.
At the time the Code of Conduct was created, the Contributor Covenant was licensed under MIT.
That is true. Can we change it now to CC BY 4.0?
StackOverflow Code of Conduct
Stack Overflow (or, to be more precise, the Stack Exchange network) just created their Code of Conduct. It's quite well done, IMO (of course very specific to what they are doing). The expectations part (i.e. starting with something positive) is something I miss from ours.
I like that Expectations section very much too. I also like the general presentation of their CoC. It looks easier to consume, and less like a legal document.
I agree, it is friendly, positive and easy to understand.
👍, we should take inspiration and make our page better.
I see a profound problem when the aspirational (to cite a common example - "A Scout is: Trustworthy, Loyal, Helpful, Friendly, ...") is conflated with quasi-judicial concepts along the lines of "misconduct". This can lead to broad and vague speech codes, which may be selectively enforced in capricious ways. Serious question to those who find the prohibitions clear: Would you say any of the controversial tweets of a recently hired New York Times editorial board member are in violation of this CoC's provision of "No bigotry. We don't tolerate any language likely to offend or alienate people based on race, gender, sexual orientation, ..."? Would she be expelled for "display[ing] a pattern of harmful destructive behavior toward our community"?
Seth, respectfully, I don't think your comment relates to this topic; it should be discussed in a separate topic.
@Huji, what it has to do with this topic is that my assessment differs from the views above, which say "quite well done", "easy to understand", and so on. I don't think it's well done or easy to understand at all, from the crucial perspective of what is a violation as opposed to an aspiration. Hence I ask: if it is well done, or easy to understand, please tell me whether the case I give above is, overall, a violation.
Oh, I see.
So let me clarify my opinion then: I don't know if their CoC is something I would approve of or not. But for what it is, the method of delivery is appealing to me. I think we could/should try to take our own CoC and make it more appealing like that as well. My opinion has nothing to do with the content of their CoC. Yours, apparently, does. And that's okay, but I wanted to clarify that we might be talking about different things.
Anyone willing to take the lead on turning the good parts of Stack Overflow's CoC into amendments of our CoC?
Recommendation to modify the appeals process
Hi, the Technical Collaboration’s Community Health group wants to share some thoughts about the appeal process that we are currently handling. The text describing the appeal process itself is fine. The problematic part is to have an appeals team other than the CoC Committee itself without defining the relationship and governance between both teams.
The core of the problem is that it is not defined who has the ultimate decision on the resolution of an appeal. This is fine when both teams agree on a resolution, but what if they don’t? The options are
- They have to keep discussing until there is consensus. This would put both teams on an equal footing, which is fine but needs to be documented.
- The Committee has the last word. This means that the Appeals team has an advisory function, which is fine but needs to be documented.
- The Appeals team has the last word. This might even be the default expectation (?), but it is actually the most problematic one because it means that the Appeals team has more power than the Committee itself.
If we want to go for the third option anyway, then that Appeals body cannot be a team like we have now, formed by Wikimedia Foundation members by design. There were good reasons to make this choice (leaving tough situations to paid professionals, sparing volunteers some trouble), but having a team of WMF employees with more power than the Committee is a setup that we don't want to have.
The current text states: "These [appeals] will be considered by the Committee, which may alter the outcome." This suggests to me that the Committee has the last word. I believe this makes perfect sense, since the foundation should only override community-elected structures for legal reasons (in which case the Community Health group doesn't sound like the right group to make a decision anyway).
Can you link to the pages for each of these two committees or teams? I want to see a page for each, listing who the members are, and stating how anyone comes to be on these teams.
Based on what you say here and my browsing around I cannot quickly come to understand the differences in the nature of these two teams.
Can the auxiliary members of the CoC be the Appeals team? In which case I think option 1 above makes the most sense.
I'm not excited by having the auxiliary members be the Appeals team. Speaking as a former auxiliary member, I'd prefer to keep that function strictly as a fallback in case of conflict of interest of active CoC committee members.
An additional factor to be considered. The Technical Collaboration team doesn't exist as such anymore. The people who form the Community Health group are all active, so if we receive a new appeal we can still handle it. However, we would welcome a decision on our proposal.
Tracked: task T199086
I think the Appeals team should have the final word on cases submitted to their consideration. Thank you.
Giving a non-elected (WMF-appointed) team power to veto and repeal decisions made by a community committee seems contrary to being a community-driven organization. WMF staff should not have "benevolent dictator" powers in social processes.
In order to help the discussion, I think two aspects should be considered:
- Should the Appeals team be nominated by the Wikimedia Foundation or not? (and if not, how is this team nominated)
- Who should have the last word, the Committee or the Appeals team?
The combination of these points offers four scenarios. A fifth would be that there is no Appeals team.
So far there are two things that can be seen here:
- A slight majority believes a WMF-based team should not have power to overrule CoCC remedies. If there are no strong objections in the next 7 days, I will make this clear in the CoC.
- We no longer have a "Community Health group" at WMF. The functionality needs to be given to another body. I don't know the WMF's internal structure well enough to suggest an alternative body for these cases; I will reach out to people for suggestions.
I still think that an appeals body should exist and be able to overturn a decision submitted to their consideration. Otherwise, what'd be the point of having one? It'd be bureaucracy for the sake of bureaucracy and a false appearance of an appeal process. If the problem is that we don't want to grant such power to a WMF team for whatever reason, then I suggest that the appeals body be formed by community members instead, elected in the same way the CoCC is. Thank you.
I don't have any better proposal, but making a committee just to check appeals seems too much overhead to me. There are several committees/groups/teams to which we can delegate this responsibility.
The WMF operates most technical spaces, sponsors most development etc. so ultimately it is the WMF's responsibility to ensure technical spaces have a healthy culture. Having it as the decisionmaker of last resort makes sense.
OTOH if most decisions get appealed and some WMF team has to second-guess the CoC committee all the time, that seems like a bad situation. Rather than setting up another committee, I think it might be better to restrict appeals to situations where the committee made some objectively identifiable mistake (and then the WMF team's involvement would be limited to verifying that the mistake indeed happened).
Dead link: 'Open Code of Conduct' at todogroup.org
At paragraph 'Attribution and re-use', the item 'the Open Code of Conduct' has a dead link to 'https://todogroup.org/opencodeofconduct/'
Thanks, I've made it point to their Github repository for their CoC instead.
Despite my work I still don't have the rights: can you push the page for translation to propagate the correction? Thanks.
Proposal amendment: Committee should serve for one year
At the hackathon we (Committee members) talked about extending the term to one year: it takes rather a long time for the committee to learn how to work together, so it makes sense to work together for a longer period of time.
Even a year seems short to me. If the term is one year, then I hope people will stay longer when their committee service is working well.
+1 from me for terms of 1-2 years. My only fear in longer terms is the extra stress placed on the committee members.
I agree on one year, would however disagree on 2 years. I believe the committee should have some dynamic to exchange members. It seems likely that after 2 years, it will be hard to change things up and add new people to a group that has been working together this long. I think there should be an incentive to add new members from time to time.
A year is fine. More than that should only be based on re-election to the position. I'd rather have more people from the community who are interested in keeping our spaces welcoming rotate through, than have it turn into the same folks year after year.
In the name of knowledge sharing and continuity, I propose an "interleaved" mode: only re-elect half the committee every year. That way, half the committee stays on and can onboard the other half. IIRC KDE's board operates this way. In any case, re-electing an entire body is always extremely disruptive, and should be avoided.
+1. This model makes sense.
+1 I think this one sounds good
I also +1 this proposal.
+1 for 1 year. The members have underlined often that membership takes a toll on them and we wouldn't want to see any burn-outs.
Proposal amendment: Make WMDE receive a notification when targets of reports are WMDE employees
This proposal has been rejected.
Per Code of Conduct/Committee#Confidentiality, we report to WMF HR and their managers when the target of the report is an employee of the WMF. Since WMDE is the only other organization that has a software engineering department, it makes sense to have a similar policy.
I cannot say how to execute this but thanks for posting the note.
I cannot say how often it happens, but it seems like having an affiliation with a Wiki organization makes people more likely to be the targets of reports. I know that this process is for software engineering but I appreciate the precedent and foundation of discussion that this process is setting.
Previous discussion about the requirement about reporting to WMF HR: Talk:Code of Conduct/Archive 2#Confidentiality
If you re-read that section, I think you'll find nearly everyone in opposition to it, except that it was forced in by WMF Legal.
I suggest we reject this proposal and instead propose an amendment that completely removes that sentence.
I went back through the archive and I found a couple relevant points made by WMF Legal:
"The Wikimedia Foundation has an interest in being informed of potential misconduct by its employees and contractors in the workspace and faces legal risks if HR is not informed about matters related to employee/contractor harassment in its workspaces. These workspaces include not only physical but also virtual spaces. The Wikimedia Foundation has this interest uniquely as a host of the website whose employees are using these technical spaces as their workspace."
"This section isn't us trying to say that having a WMF reporting exception makes this CoC policy better in the ideal. Rather, it's that we looked at the policy, at how the technical spaces are used (in particular the combination of lots of WMF employees and WMF as the host of the space) and concluded that the legal risk under HR law is too high for the WMF to do this without having the reporting exception."
So, in my non-lawyer understanding: because of its unique position as the host of these technical spaces, the WMF has a unique legal obligation to its employees and contractors who participate in these sites. Even the WMF Legal department sees the downsides of this obligation, but that doesn't change the legal position.
To me, that explains why the provision applies to the WMF only and why it's not something that we the technical community can change. Obviously, you're free to disagree—but we should keep in mind that none of us are lawyers.
As far as I know, there's no legal obstacle to us adding a similar provision regarding WMDE. But I think that would have all the downsides of the WMF provision (the potential to break confidentiality against the reporter's wishes), but without the legal requirement to force its inclusion.
Code of conduct committee call for new members
It's coming close to time for annual appointments of community members to serve on the Code of Conduct (CoC) committee. The Code of Conduct Committee is a team of five trusted individuals plus five auxiliary members with diverse affiliations responsible for general enforcement of the Code of conduct for Wikimedia technical spaces. Committee members are in charge of processing complaints, discussing with the parties affected, agreeing on resolutions, and following up on their enforcement. For more on their duties and roles, see https://www.mediawiki.org/wiki/Code_of_Conduct/Committee
This is a call for community members interested in volunteering for appointment to this committee. Volunteers serving in this role should be experienced Wikimedians or have had experience serving in a similar position before.
The current committee is doing the selection and will research and discuss candidates. Six weeks before the beginning of the next Committee term, meaning 8th of April 2018, they will publish their candidate slate (a list of candidates) on-wiki. The community can provide feedback on these candidates, via private email to the group choosing the next Committee. The feedback period will be two weeks. The current Committee will then either finalize the slate, or update the candidate slate in response to concerns raised. If the candidate slate changes, there will be another two-week feedback period covering the newly proposed members. After the selections are finalized, there will be a training period, after which the new Committee is appointed. The current Committee continues to serve until the feedback, selection, and training process is complete.
If you are interested in serving on this committee or would like to nominate a candidate, please write an email to techconductcandidates AT wikimedia.org with details of your experience on the projects, your thoughts on the code of conduct and the committee, what you hope to bring to the role, and whether you have a preference for being an auxiliary or constant member of the committee. The committee consists of five members plus five auxiliary members, and they will serve for six months; all applications are appreciated and will be carefully considered. The deadline for applications is end of day on 5th of April, 2018.
Please feel free to pass this invitation along to any users who you think may be qualified and interested.
Best, Amir on behalf of the CoC committee
For reference, the list has been posted to Code of Conduct/Committee/Candidates/2018-I.
Suggested amendment: Public logging of bans made by CoC
Given the discussion that happened over the past couple of days, here's my proposal to amend the CoC's Cases section:
Any ban made by the Code of Conduct Committee will be logged publicly in Code of Conduct/Cases/Log/2018 with user name, space of ban (Phabricator, mediawiki.org, etc.), duration, and, if not private, reasoning, unless either of these two conditions applies:
- The reporter asks that the ban not be public.
- Or the Code of Conduct Committee decides not to disclose anything because it might reveal private information. For example, logging harassment cases at Wikimedia events can lead to the real-world identity of users being disclosed.
It goes without saying that this is not applicable retrospectively.
Just to clarify, this would apply to bans specifically, not cases themselves?
I'm against this. Even with the option suggested on wikitech-l of only making it public while the ban is in effect, if the log is kept on a wiki, the (history of the) ban list can (and will) be used for naming and shaming, long after punishment has been dealt and served.
If any transparency is required, I would counter-propose that an aggregate report be published periodically (quarterly, bi-annually, or yearly, depending on the number of expected cases; max 10 or so per period would be nice), reporting the number of actions per platform and the duration of the actions, where applicable.
Also, the ban might be worn as a badge of honor, which would encourage further abuse. :(
Is having a list of bans performed by CoC any different than a wiki having Special:BlockList?
No, but perhaps that shouldn't exist either. Most websites don't list all of the users who have been banned, and I don't think that is necessarily a bad thing.
Most websites are not collaborative projects supported by a huge community.
That's true, but as an example: GitHub, Stack Overflow, or any and all open source projects (including projects much, much larger than ours). I can't think of a website/project (other than Wikimedia's) that publicly lists bans. I think the public log is an assumed virtue, when the existence of the list may, in fact, encourage more harassment. I have no evidence to suggest this; my point is only that the existence of such a list may have unintended consequences.
The non-existence of such a list is already having consequences. And bans on the wikis have been public since forever, as far as I know, without any evidence that this encouraged anyone toward more harassment.
As a somewhat trivial example, being blocked by the President of the United States is seen by some as a badge of honor, and therefore a prize to be sought. https://www.washingtonpost.com/news/the-intersect/wp/2017/06/07/how-getting-blocked-by-trump-on-twitter-became-a-badge-of-honor/
At least on enwiki, I don't think many people use Special:BlockList unless they're looking up a block ID to investigate an autoblock. It chiefly consists of drive-by vandals, so it doesn't really serve as a wall of shame or badges of honour, as you'll quickly be buried in there and forgotten. The block log itself obviously is necessary to see prior abuse. At any rate, admins can redact the log entries, but on the English Wikipedia this practice is prohibited.
Yes. Neither warnings nor any mention of the reporter will be logged. Hope that helps.
I'd propose one additional clause that requires:
- In the first month of each year, last year's log is amended one final time to disclose the number of non-public bans by space of ban. This number cannot be opted out of by the reporter, the committee, or anyone else. And this statistic would not contain any start date, duration, user, or reasoning. Just the numbers by space of ban. (Bans affecting multiple spaces could be counted multiple times, or we could decide that "Multiple spaces" be one of the counted spaces.)
- In the first month of each year, last year's log should be linked to from a public announcement. I have no preference for which platform this post would be on, but it should be decided ahead of time and included in the policy. Perhaps Wikitech-l, or another public wikimedia.org list, or a mediawiki.org newsletter, or a Phabricator Phame post, etc.
Lastly, I would recommend that this clause does apply retroactively, treating all prior cases as non-public. Thus disclosing this year's complete counts by January 2019.
EDIT: I wrote this comment at the same time as @Siebrand wrote his. I did not see his until after I submitted mine. I would support doing only aggregated reports.
Anonymous statistics are in theory already part of the CoC (see the last paragraph of https://www.mediawiki.org/wiki/Code_of_Conduct/Cases#Responses_and_resolutions ).
I would prefer that this applies not just to bans but all sanctions by the committee (e.g. asking someone to apologize doesn't need to be logged, but doing something to someone against their will does. e.g. deleting a phab comment does require logging).
I even wonder if maybe the enforcement role of the committee should be separated from its judgment role. Right now I feel like the CoC committee is judge, jury and executioner, and lacks any effective oversight due to secrecy. I consider this state of affairs dangerous.
I would also like to suggest that for any sanction, the person being sanctioned must be notified of the action, its length, the rationale, and how to appeal. I've heard complaints that in some cases when comments were deleted the commenter was never notified, and only found out by accident (which is supposed to deter them how?). In the recent MZMcBride case, the notification was rather lacking (assuming the email posted by MZMcBride is accurate): it failed to disclose the length and how to appeal, and the rationale, while not totally missing, was lacking IMO (that's more debatable though).
I agree with your thoughts.
I agree with the publishing of the bans, but I am not sure Wiki (and specifically Mediawiki) is the best place because, as correctly noted, the wiki infrastructure ensures the information is preserved forever. While in theory everything public on the internet is forever, in practice there might be benefits to not carrying old grudges around.
The problem to solve here is not statistics, though. That's nice, but it's a different case. What is really needed is that while an action is in force, I (as a participant of the collaborative platform) can see that it is, and adjust workflows accordingly. It is a collaborative environment, so the ban would affect collaboration. An aggregated report is useless for this case: if I worked with X and their account suddenly shows up as "disabled", it won't help me that in two months' time I'd see "number of bans: 7" in an aggregated report. What I need to know is: a) was X banned? and b) for how long. Most community members, I suspect, would also want c) by whom and d) for what.
I am not sure how it can be possible that the ban won't be public: an account being disabled on Phabricator is pretty public, and we'd be kidding ourselves if we ignored the fact that this is what people will assume (especially now). If we do it in a non-transparent way, people will just assume more and create a narrative in their heads which might not necessarily even be true, but is inevitable. I do not think the fact of a ban on a public collaborative platform can be hidden. Or should be. We might not disclose any details about it beyond the duration (I think this is a must), though in the general case not involving sensitive matters I would advise opting for more disclosure (again, secrecy breeds mistrust, and mistrust makes people construct narratives that we'd want to avoid). But I do not see how hiding bans is practical, or desirable.
I would discourage using wikitech-l to publish ban reports. This mailing list is read by newbies and other developers who just want to stay informed about technical matters. It is not very welcoming to see such drama and it might incite more lengthy and toxic discussion. I think an on-wiki page is fine, or somewhere that's not as "in your face". I should not be forced into seeing this side of the technical community.
Edit -- I meant to post under my volunteer account, for the record!
I think using wikitech-l is a consequence of a lack of pre-designated forum for such discussions. When there's a need for discussion and community does not have a Schelling point for it, the next best thing is taken to be it. To avoid it, we need to arrange such space in advance, and specify it when CoC decision is communicated, and maybe also in CoC pages where we establish the rules.
I join Bawolff on the lack of separation of powers. I would add that it would be good to have well-defined processes and publicly documented rules stating which kinds of sanction can or cannot be applied in which cases. That doesn't mean absolute transparency on every detail of the execution of these processes, which is certainly not a required or desirable level of transparency for trust to grow, and can itself be a source of legitimate concerns for people's safety, as was pointed out. Although I am not sure that this is the most appropriate place to talk about this aspect of the issue.
As there is a collaboration aspect, as has been indicated, I think Meta would be a more appropriate place to publish a report, if any.
Regarding feedback on the status of collaborators when they are banned, I don't have a perfect idea so far. My first thought would be something like integrating a feature into Phabricator itself, showing on the user page that "this account has been (temporarily) disabled (for 3 eons); for more information consult this page/contact that referent".
I support the original amendment. I don't care much about the venue for publishing the bans, but it is tremendously important that they are visible at least for the duration of the ban. Not everyone who needs to be aware is in the CoC committee (phabricator.wikimedia.org and wiki admins, event organizers etc)
This seems like a solution in need of a problem. Would you mind clearly stating what the problem is so we can determine if this is the appropriate solution?
The problem is very simple. The way it works now, there's no indication to anyone (even Phabricator admins!) of what happened when an account is disabled. Not who, not why, not for what duration, nothing. It is unacceptable that on a collaborative platform people just disappear without any notice and any possible explanation (and since the account is disabled, they can't communicate what happened either). This is not how open collaboration should be done, and not how a collaborative environment should be administered. I understand the privacy concerns, but my firm belief is that it is completely possible to produce a transparent and accountable system without revealing any private details. Just hiding everything and not providing any information at all is not a good solution.
Yes. CoC sanctions must all be publicly logged for accountability, transparency and awareness. This doesn't mean that we must require full disclosure in the most sensitive cases, but at least username, type of sanction and duration must be made publicly available. With regard to the case that caused the opening of this thread, we already have two Phabricator administrators who didn't know where the block came from, one of them overturning the ban not knowing that it was a CoC sanction. This is not optimal, as it can lead to involuntary reversals and misunderstandings. It is to the benefit of both the CoC and the users to know all of this. As some have pointed out above, wiki blocks are kept logged for the said reasons on a personal log, and active blocks and bans are listed on a special page. Why the secrecy here? Of course reports should be managed with confidentiality, but actions by the CoC based on said reports should not fully be. I must note that you can find at List of globally banned users a list of users subject to community and WMF-Legal global bans. Those lists aren't aimed at being walls of shame, but help users know why something has happened. I insist, the log doesn't have to have the full details, but at least username, space, duration (and reasoning if that's appropriate) should be published. Thank you.
I don't believe there is value in transparency for the sake of transparency. Especially when that transparency comes at the cost of someone's privacy. Also, it may have unintended consequences that have not been explored.
Perhaps an acceptable compromise would be to permission the ban/block log to the administrators of the domain? (i.e. Phabricator/Gerrit Administrators). This way, it isn't a public log (that can be used for shame/honor), but it would resolve the problem of administrators not being aware of bans/blocks that they should be aware of.
How about bans for real-life events? How can other event organizers be aware of a ban if no public log exists?
I'd like to insist that I am not asking for full disclosure of the case details. That'd be crazy. The log can be as brief as "User X banned from (platform/WMF technical spaces) for (a week/indefinitely) by decision of the Code of Conduct Committee [optional: because ...]". This does not IMHO disclose any personal information other than what the user might have disclosed themselves voluntarily (i.e. if you use a username which is also your real name [Bad Idea (tm)]), helps people understand why an account has suddenly become deactivated, inactive or unresponsive, and improves the transparency and accountability of the CoC. Of course cases and investigations should continue to happen in camera, and there might be cases where revealing even that a certain user is subject to a CoC sanction might be a bad idea. If that situation ever comes, I'd certainly support not making it fully public, but at least the administrators of the platform where the user got banned should be informed (e.g. "Please note that the CoCCom has banned X from Y; please do not overturn") so as to avoid involuntary reversals or misunderstandings. Best regards.
I understand what you are asking for, but I do not understand why a ban/block log needs to be public, when it seems that the administrators are the ones who need this log. I understand that it would not include personally identifiable information. My argument is that the block itself ought to remain private. Or are we saying that people who are blocked/banned no longer have the right to keep that information private?
Is there a problem with making this log permissioned to admins?
> Or are we saying that people who are blocked/banned no longer have that right to keep that information private?
How can you keep it private if your account is disabled and this fact is publicly visible? If an account becomes disabled and then enabled again a week later, it doesn't take Sherlock Holmes to see what happened.
Perhaps instead of having a log, admins should be trained to contact CoCC before unbanning/unblocking a user?
On every disabled account, even one having nothing to do with CoCC, just on the chance it might be another of the ineffable ways of CoCC? That sounds completely backwards - instead of CoCC plainly announcing the action, we have to ask it on every action that happened - is this your action? What about this? What about this? And this? And this? I don't think CoCC members would want this, and IMHO this is not the right way to manage a community. CoCC should not be a black box oracle.
If that's the case, then what's the current problem we are trying to solve?
As I described above, the current setup only shows the account is disabled. It does not show the fact of the ban, the duration, the contact person, the discussion venue etc. Anything that is needed for a proper community management is hidden. The only fact that you consider to be private is not. One could guess, post-factum or after a long discussion on the mailing list and sometimes a comedy of errors with different admins undoing actions of each other, what happened - but that should not be the modus operandi of a healthy community. We shouldn't waste a lot of time and effort of multiple people to figure out what happened. Transparency and user-friendliness should be part of the system functionality, not possible emergent result of hard work by multiple people. That's the problem we are trying to solve.
Those are valid reasons, and I understand the rationale.
As I asked above, is there a problem with making this log permissioned to admins? That should resolve everyone's concerns, I think.
It'd be good if you could specify a reason in the "disable account" function. The log of account enables and disables is already visible only to admins. The problem is that I am not sure Upstream could be convinced that this is a priority to introduce in their software, and I am not sure whether we can make a local "hack" in our install to do so (or even whether that's advisable). Manual logging is the only solution for now; where to log seems to be the problematic part here.
As for other venues: Gerrit, for example, requires running a CLI command to disable an account (and of course there's no public log either); IRC has its +b lists; mailing lists have the moderation function; Wikitech and MediaWiki have Special:Log/block for blocks; and as for other technical spaces, I don't know.
There is no single "administration" covering all the managers of WMF technical spaces. Either all of them are informed (thus making the log visible to everyone who can issue restrictions on WMF technical spaces), or we still run the risk of the left hand not knowing what the right hand did.
It would solve half of the problem. It would still not be transparent to 99.99% of the users of the system (who still wouldn't know what happened to the person they were collaborating with), but it would at least allow admins to perform their functions efficiently. As a partial solution, it would be better than nothing, but not entirely satisfactory.
Some things to consider:
- CoC bans could involve at least Phabricator, GitHub, Gerrit, mediawiki.org, Mailman, and IRC. All of those have different admin groups who need to be aware (if nothing else, to avoid being tricked by the blocked person into removing the block). Some of those places have public ban lists (IRC; mediawiki.org, unless suppressed), some don't have a list but show the block in the account status (Phabricator; mediawiki.org, when suppressed), and some have no public block info at all (GitHub, Mailman, maybe Gerrit?).
- Suppose the banned user does not want the ban to be public - they promise to fix their ways and do not want a stain on their record (public logs can be removed but people's memories of them can't). What level of dissemination would be appropriate then? Admins of the affected spaces? Admins of any space? Temporary public record? Permanent public record?
- If there are public records, is it important that people find it easily? If "collaborators should be aware of what happened" is a real problem to solve then putting a block log on some wiki page no one ever heard of is not really a solution.
It makes me wonder how publication of this kind of log would be impacted by the w:GDPR. Who would be the relevant people/team to contact for advice on that matter?
I think the idea that logs on sensitive topics (such as any kind of abuse at Wikimedia events) should be private for both the victim and the abuser is laudable. You can probably achieve both goals in different ways: a public task with a log, containing a reasonably concrete description of CoCC actions without naming anyone, and a private log, visible to moderators on Phabricator and to the CoCC, where the action would be logged in more detail (not enough to harm anyone, of course).
The reputational damage caused by public logging of blocks in Wikimedia projects is real, and the consequences of applying such logging to a real-world abuse case would probably be more damaging to both parties, so finding a more sophisticated approach should not be scoffed at.
Unfortunately this thread seems to focus solely on bans for online activities. For offline activities, there is no "admin" group that can be notified. There are all kinds of independent events at universities and other venues that are in no way affiliated with the WMF, and which often do not interact with WMF community liaisons, but are still bound by the CoC. None of the alternative solutions proposed above considers these events.
I would ask people opposing a public ban list to consider how their proposals fit with offline events. Thank you.
If it's not being done already, maybe the organizers of offline events should be privately provided, by someone at the WMF/CoCC, with a list of banned people, so they can manage attendance at the event safely. Best regards.
In theory this works, sure, but there are two issues I can think of right now:
- Is anyone tracking tech events throughout the world? Would it scale to do that?
- Would that require event organizers to sign some kind of NDA? If so, that would be a huge turn-off...
Or make IRL bans public but other bans not. I imagine most bans will be about online spaces and not events (as most people are more likely to misbehave online than face to face).
Sorry to repeat myself (and maybe this is slightly off-topic), but if you noticed we recently had one or two new contributors welcomed on wikitech-l. The long nasty "disabled phab account" thread seems to have died down, and I hope it stays that way. Am I the only one concerned about how bad it appears to newbies? "Welcome to Wikimedia! Hope you like drama!"
It's a good thing the thread has died down, since the case itself was not worth it. However, this proposal is worth adopting.
This comment is in regards to the original amendment. Each venue should have a place where bans are listed, for the benefit of other administrators or persons who might be asked to lift the ban. So I would suggest that phabricator have such a place where the action, the target, the actor, the duration and (potentially) the reasoning are made available to phab admins. This does not necessarily need to be public.
There are two things being conflated in this discussion: 1) the need for other venue admins to know what's happening when someone is banned, 2) the desire of the public (contributors to Wikimedia projects including MediaWiki) to know what's going on. I'd rather deal with each separately.
I think you nailed the issue; personally I think the CoCC can't really be effective until problem 1 has been solved.
"Problem" 2 is not really a problem as far as I'm concerned - or at least it's something we are not in a hurry to solve.
I'm kind of irritated that most people here seem to believe there is no need to disclose ban information to the general public. How are you supposed to work like that in a community-driven project?
When a user is blocked on Phabricator, this is visible to the general public and prominently marked wherever the account appears (marked with a dot or greyed out, depending on the form). If I see this happened to a user I worked with, how am I supposed to know what happened? Did this user just quit his MediaWiki work and leave? Did he stop working for the WMF? Or is he just temporarily gone? If I don't know what has happened, how shall I decide which actions to take when I need to contact this user?
I do believe we should disclose bans. A block summary and a log on Phabricator would be ideal, something more than just saying "disabled". I understand this probably isn't feasible to implement right now, but I assume Phab admins can at least edit the "blurb" or "title" on the user's profile? You could link to the diff of the log entry that lives somewhere on mediawiki.org.

This need extends well beyond CoC transparency. For instance, for a long time I collaborated with a particular user on Phabricator, and one day he disappeared. I saw his account was disabled and assumed he had done something very wrong. Sadly, it turns out he had passed away, which I didn't discover until months later. So we should log all blocks, regardless of whether they are CoC-related. The block summary should be as descriptive as possible, barring privacy concerns. If we need to be vague, "code of conduct violation" (or what have you) would suffice.

I'll note that blocks, logs, internal drama, etc., are well hidden on the wiki, and I only hope the same will be true for Phabricator and other technical spaces. I'm all for transparency, but please, pretty please, don't use the highly visible mailing list to announce or debate these actions, scaring off uninvolved contributors or distracting them from their work. No one should be forced into this drama.
- Any ban made by the Code of Conduct Committee will be logged publicly in Code of Conduct/Cases/Log/2018 with the user name, the space of the ban (Phabricator, mediawiki.org, etc.), the duration, and, if not private, the reasoning, unless either of these two conditions applies:
- The reporter asks for the ban not to be made public.
- Or the Code of Conduct Committee decides not to disclose anything because it might reveal private information. For example, logging harassment cases at Wikimedia events could lead to the real-world identities of users being disclosed.
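To make the proposal concrete, here is a minimal sketch of what one entry in such a log could look like as a data structure, together with the two exception conditions from the amendment. All names here (the class, its fields, the rendered format) are illustrative assumptions, not an existing implementation:

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class BanLogEntry:
    """One entry in the proposed public ban log (field names are hypothetical)."""
    user: str                     # user name of the banned account
    space: str                    # e.g. "phabricator", "mediawiki.org"
    duration: str                 # e.g. "2 weeks", "indefinite"
    reason: Optional[str] = None  # None when the reasoning is private

    def is_publishable(self, reporter_requested_privacy: bool,
                       committee_withheld: bool) -> bool:
        # The two exceptions from the amendment: the reporter asked for
        # privacy, or the committee withheld details to protect someone.
        return not (reporter_requested_privacy or committee_withheld)

    def render(self) -> str:
        # One log line; private reasoning is replaced by a placeholder.
        reason = self.reason if self.reason is not None else "(private)"
        return f"* {self.user} | {self.space} | {self.duration} | {reason}"
```

For example, a two-week Phabricator ban with private reasoning would render as `* ExampleUser | phabricator | 2 weeks | (private)`, and would be omitted from the public log entirely if either exception applies.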
Weak support, but I'd prefer to see a third condition there: the banned user asks for the ban not to be made public.
Also, I think there should be a guideline on how to handle bans in the various spaces, as the log is not really useful if it's not exposed when someone tries to interact with a banned user. E.g. it should be linked from the MediaWiki block summary, the Phabricator user profile, the Gerrit status message, etc. Of course, that's not part of the CoC amendment itself.
Converted to Structured Discussions
Following on from Talk:Code of Conduct/Archive 4#Flowify?, and because there's been no active conversation here for several months, I've converted the page to use Structured Discussions.