Talk:Personal image filter/Wikitext

Name

Brandon raised a good point about the name, saying that "Controversial content management system" could be read as "Controversial (content management system)". To avoid the ambiguity with a CMS, what about "Controversial content hiding system"? guillom 07:21, 1 March 2011 (UTC)

I like "obfuscation" better than "hiding"; I'm not sure "system" is needed at all. --MZMcBride 19:14, 1 March 2011 (UTC)Reply
I don't think "obfuscation" is a good term. "Controversial Content Display Behavior System" is probably best and most accurate.--Jorm (WMF) 19:15, 1 March 2011 (UTC)Reply
Obfuscation cracks me up. I would actually like to get away from the term "controversial content" in general; it's a useful shorthand but also has pejorative connotations. Image hiding system? -- Phoebe 05:15, 4 March 2011 (UTC)Reply
I'm thinking of going with "Personal image filter". It seems to me this is really what the feature allows: to empower the user to enable or disable filters, according to their preference, to hide images or not. guillom 09:18, 4 March 2011 (UTC)
It's applicable to videos as well. That may not be a good enough reason to rename the project, but it's something to bear in mind in user interfaces and such.--Eloquence 22:16, 16 March 2011 (UTC)
I think "Personal Media Filter" is probably best.--Jorm (WMF) 00:32, 18 March 2011 (UTC)
I also like "Personal Media Filter" because of its emphasis on individual users - and their choices. Bishdatta 06:24, 21 March 2011 (UTC)
I think "image" is more understandable than "media", even if not totally accurate, but I won't oppose renaming from "Personal image filter" to "Personal media filter". guillom 08:25, 21 March 2011 (UTC)Reply
Remember the target for this feature, more than most, includes a large proportion of non-computer minded and non-technical people. Keep it extremely simple. "Content options" is all that's needed, and keeps it open-ended. "Options" are widely understood to apply just to your own access or use and this is simple, intuitive, helpful, and doesn't imply anything related to censorship or other "filters". FT2 (Talk | email) 03:55, 22 March 2011 (UTC)Reply
The name of the extension is not user-facing. The internal terminology doesn't need to be user-friendly; just accurate. Since the technology is personal, involves filters, and is applicable to media files, "Personal Media Filter" seems right to me. Also, "P.I.F." is a used, albeit archaic, acronym in computers: PIF files are Program Information Files, used by older versions of Microsoft Windows.--Jorm (WMF) 05:23, 22 March 2011 (UTC)Reply
Hello, the name should be simple and should connect to terms already in use. So if the user has to click on "Display settings", why not call it "Display system" or "Image display system"? --Ziko 19:12, 30 June 2011 (UTC)

Objections

This is way more restrictive than would be acceptable in at least the part of the enWP community with which I am familiar. First, I do not think the entire project acceptable as a matter of principle. We have the obligation to create properly descriptive metadata for material; we have the obligation to aggregate the descriptors in logical and useful ways. We recognize that these descriptors can and will be used by outside parties to filter our material according to their own purposes, but the possible misuse of the descriptors should no more prevent their making than the possible misuse of any other content prevents it. We probably even have the obligation not to construct our system in such a way as to frustrate this use (we do not judge the virtue of a type of use), and possibly even to give links to the various systems available for those who want to use them (we provide information to help our users make use of the material in whatever way they want to).

But for the WMF itself to make such filters is a gross abuse and contradiction to our principles. Our business is the provision of information, not its restriction. We do not censor, except as we are compelled to. Therefore, we do not make filters for censoring. We do not implement filters others may make. We do not make categories designed specifically to facilitate censoring that we would not make otherwise.

I shall do what I can to urge the community to reject this entire proposal. My idea of the proper objectionable content policy is just two provisions: 1. we do not host material that is illegal according to the governing law where our servers are located; 2. we do not force any project to include material it does not want to include, but each project must adopt a policy as close to NOT CENSORED as its principal reader and editor communities permit.

The most objectionable part

The most objectionable part is the default filter. Even were we to have this project, there is one default filter setting, and one only, that is consonant with the principles of NOT CENSORED -- that is, "Allow everything". Otherwise we are censoring in the sense of segregating material in a private back room.

The second most objectionable part

The second most objectionable part is to compel the different Wikipedias to implement the necessary overhead for this policy. If the policy is thought necessary for some culture areas, a position I disagree with very strongly but can imagine the WMF adopting, that still does not give those areas the right to impose their standards on others, or to compel them to participate in a project against their own mores. In other words, if the xxWP decides to use such categories, the enWP does not have to provide equivalents of them. In such a case, what to do with Commons is a difficult question, but I would still say the Commons should not use it. The various WPs can use such images from Commons as they choose, and not use others. (I believe this is already the case with respect to copyright, and even the enWP has long required permission to use certain particularly provocative images.) DGG 23:01, 19 March 2011 (UTC)


Hi DGG, Thanks for your thoughts.
  1. There's no default filter; default is the status quo. (I think the spec says it could be possible to make a default if wikis wanted to, but most of this spec is about how to enable readers to filter images only for themselves, in which case the default for the whole project stays the same).
  2. You're objecting to people hiding images *for themselves* (not for others?). I think the system described is so that readers can make their own choices about what to see. Editorial policy remains the same, of course.

-- Phoebe 23:15, 19 March 2011 (UTC)

I'm certainly glad to hear that no default use is intended. I'm certainly in favor of Wikipedia users being able to choose for themselves what to see, but I'm not in favor of having the WMF produce anything that might be seen as a concession to censorship--next thing, we'll have proposals like the default settings. I would certainly prefer that such a function be implemented as a variety of available add-on products through which one reaches Wikipedia--and produced by someone else. Unfortunately, I'm aware this might not be the simplest solution. I find the present concentration on sexuality a little absurd--of all articles, these are the least likely to be found by people who are not looking for them. I'd suggest we want to enable people to be more selective in a variety of ways, not limited to "controversial" content, and not putting undue focus on sexuality or violence: we might have a page labelled something like Image Selection that would present optional selection buttons to not immediately show:
a. all images--replace them by the alt text unless clicked on (a rough sketch of this appears after this message). This would of course be the safest setting and, quite independently of that, the best for low speed connections. (I have set such an option for a number of sites I look at in order to get them to work quicker without the background nonsense. It can obviously be done on the browser side, but many of our users won't know how to do this.)
b. all images over a certain size. Again, for lower speed connections. Ideally, we'd have images served at a default resolution depending on the connection, like commercial sites often do.
c. images showing humans
d. images showing violence towards humans
e. images dealing with sexually related subjects
f. images dealing with medically related subjects
g. images dealing with religiously-related subjects
h. images of political emblems
I can't think how to deal with unclothed humans. Some might not want any images that show even the faces, or might not want to show the faces of those of a particular gender. Some might not want pictures that show the sexually charged parts of the body--but I cannot find a culture-free definition of that. Some might not want images that show such parts even if clothed, or it might depend upon the nature of the clothing. Most available commercial filters are designed with a particular cultural sensitivity in mind--typically the images that a middle-aged, very socially conservative American does not want young teenagers to see. I don't see how we can make this determination on a world-wide basis.

It's a tricky question whether this would work without being implemented for all Wikipedias. If it were limited to Commons content, but the same interface were shown to everyone, then a person using, say, the English Wikipedia and selecting option d would still see the images of violence hosted on enWP, but not those hosted on Commons. But to use it on each WP would require some Wikipedias to use a particular form of metadata for their content that they do not find helpful. But if it could be used on an interface specific to each WP, the way the language is localized, then it would not matter. Any particular WP could choose not to use it, or to use it, in which case they would need to code their own images, if any, and then it would affect all images they display on their WP. (I'm assuming no particular Wikipedia directly uses images from another Wikipedia.) DGG 15:48, 20 March 2011 (UTC)
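
For concreteness, here is a minimal client-side sketch of option (a) above: each image is swapped for its alt text and restored when the reader clicks on it. It is written in TypeScript; the function name and behaviour are illustrative assumptions, not part of any actual MediaWiki extension or gadget.

  // Sketch only: replace every image with its alt text until clicked.
  // Note that a real gadget would also need to stop the image downloading
  // in the first place (e.g. server-side) to actually help slow
  // connections; this version only hides images that have already loaded.
  function hideImagesUntilClicked(root: ParentNode): void {
    root.querySelectorAll('img').forEach((img) => {
      const placeholder = document.createElement('a');
      placeholder.href = '#';
      placeholder.textContent = '[' + (img.alt || 'image') + ']';
      placeholder.addEventListener('click', (event) => {
        event.preventDefault();
        placeholder.replaceWith(img); // put the original image back
      });
      img.replaceWith(placeholder);
    });
  }

  // Example use: hide all images on the page.
  hideImagesUntilClicked(document);

This covers only option (a); options (c) through (h) would additionally need per-image metadata, which is the category question discussed in the next section, to decide which images to pass to such a function.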

Why create a new category system for this?

Hello, I am confused about this part:

Because of the way categories work (or don't work, as the case may be), this system will only work:
  • if it has to deal with a limited number of categories -- somewhere between 5-10 -- which are placed into a global content filter category

Can you please explain in detail why this is needed? I understand that the proposed implementation involves cache invalidation. What is the imagined bottleneck if users are able to tag any set of categories to filter (along with their subcategories)? What is the 'category-checking' function that would be made more efficient by having only 10 new categories? Notsuohs 17:57, 21 March 2011 (UTC)

It is not a question of "new" or "old" categories; it is a question of the number of categories:
  • For every category that is filterable, the cache invalidation load grows.
  • Since this feature is designed to work for anonymous users, there is a hard limit of around 15 categories that we could possibly filter (we run up against the cookie limit).
There are also significant usability concerns. The feature is designed to be used by people who don't know what categories even are, let alone how to use them. With a small category set, we can allow filtering from the perspective of the images; with a large set, we will be required to do filtering from the perspective of the category. Since categories are an esoteric part of MediaWiki (from an inexperienced user's point of view), we have to go with the "image" perspective.
Hope that answers your question.--Jorm (WMF) 19:33, 21 March 2011 (UTC)
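
To make Jorm's distinction concrete, here is what "filtering from the perspective of the images" could look like with a small fixed set. The category names and the function are invented purely for illustration:

  // Assumed small, fixed set of filterable categories (names invented).
  const FILTER_CATEGORIES = new Set(['violence', 'sexuality', 'medical']);

  // With a handful of known categories, each image can be checked directly
  // against the reader's choices; no category browsing is ever exposed.
  function shouldHide(
    imageCategories: string[],
    readerChoices: Set<string>
  ): boolean {
    return imageCategories.some(
      (c) => FILTER_CATEGORIES.has(c) && readerChoices.has(c)
    );
  }

With an open-ended category set, readers would instead have to find and name categories themselves, which is the usability problem Jorm describes.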
Why a hard limit? Surely MW can store all categories in one cookie, for example as a 15-character string of "1"s and "0"s. (This isn't to deny caching issues or to suggest it's desirable; I'm just curious why it's a hard limit.) FT2 (Talk | email) 03:50, 22 March 2011 (UTC)
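
FT2's single-cookie idea is easy to sketch. This TypeScript fragment packs all filter choices into one cookie value as a string of "1"s and "0"s; the cookie name "pifFilters" and the count of 15 are assumptions for illustration, not anything the developers have specified:

  // Assumed cookie name and category count, purely for illustration.
  const FILTER_COUNT = 15;
  const COOKIE_NAME = 'pifFilters';

  // Store one character per filter category: '1' = hide, '0' = show.
  function saveFilters(enabled: boolean[]): void {
    let bits = '';
    for (let i = 0; i < FILTER_COUNT; i++) {
      bits += enabled[i] ? '1' : '0';
    }
    document.cookie = COOKIE_NAME + '=' + bits + '; path=/; max-age=31536000';
  }

  // Read the choices back, defaulting to "show everything".
  function loadFilters(): boolean[] {
    const match = document.cookie.match(
      new RegExp('(?:^|;\\s*)' + COOKIE_NAME + '=([01]+)')
    );
    const bits = match ? match[1] : '0'.repeat(FILTER_COUNT);
    return Array.from(bits, (c) => c === '1');
  }

So cookie storage alone does not force a limit of around 15; the constraint Jorm describes presumably comes from the caching side, since, as noted above, the cache invalidation load grows with every filterable category regardless of how the choices are stored.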
How does the cookie limit require only providing 10 options in the first place? You could let people choose the first N categories they want to filter, then say "if you want to use this feature more fully, please log in to an account". That's a bog-standard way of inviting heavy users to create accounts on complex sites. This "create a classification scheme" step seems like the obvious weak point in what otherwise seems a workable scheme. I can't see any way to implement it that isn't at once flawed and controversial in its own right. Please fix this. Notsuohs 03:02, 1 July 2011 (UTC)