Talk:InstantCommons/Archive 1

Latest comment: 15 years ago by Brion VIBBER in topic ForeignAPIRepo
The following discussion has been transferred from Meta-Wiki.
Any user names refer to users of that site, who are not necessarily users of MediaWiki.org (even if they share the same username).

Thumbnail generation?

The Commons server should not generate thumbnails for other servers. It should just transfer the full resolution image and let the requesting wiki engine do any wanted resizing. --Pmsyyz 05:41, 17 February 2006 (UTC)

There are two things to keep in mind here:
1) The Commons server may already have a thumbnail of the right size. No need to generate it twice.
2) The client wiki may have no or limited thumbnail generation capability (e.g. no SVG rendering).
If 1) and 2) are not true, the thumbnail can be generated on the client side.--Eloquence 06:39, 17 February 2006 (UTC)
I second this; having the sophisticated thumbnail generation and SVG rendering capabilities of Commons available to third-party wikis would be immensely useful. Torfason 09:46, 14 March 2007 (UTC)

Costs

"While it is unlikely that individual wikis using the InstantCommons feature would cause a significant increase in cost for the Wikimedia Foundation"

What is this assumption based on? Has anyone actually tried working out the costs associated with this? Angela 09:34, 17 February 2006 (UTC)
The specs say we could make a distinction between free content and proprietary wikis. However, we should also distinguish the philosophical from the practical. I run a small wiki hosting company, and with my cheapest hosting partner, I pay about $0.20 for every GB of traffic; I would imagine that Wikimedia pays significantly less. But even at this rate, $10 would get me 50 GB of images, i.e. about 50,000 1 MB-sized pictures (which is fairly large for an image). In the InstantCommons model, every image would only have to be downloaded once and be cached from then on (updates are requested explicitly).
So even an extremely large wiki (larger than any hosted by Wikia currently) using 50,000 images would cost us no more than $5-$10 over the course of a year or so. The CPU cost of serving these files is negligible, since they are not dynamic content (thumbnail generation is a different issue and will have to be discussed).
The cost is no higher than a user manually downloading the image from Commons, and uploading it to the wiki, with the main difference being ease of use (and better license compliance). It seems to me that a simple notice on the description page with a link to the Wikimedia fundraising page would be sufficient and probably generate more money than it costs Wikimedia to serve the files. However, we could always think about additional charges if it turns out to be a problem.--Eloquence 09:59, 17 February 2006 (UTC)
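The arithmetic above can be sanity-checked in a few lines. The $0.20/GB rate and the 1 MB average file size are the figures quoted in the comment, not measured values; at those numbers, 50,000 one-time downloads come to about $10.

```python
# Back-of-the-envelope bandwidth cost check for the figures quoted above.
# The rate and average size are the comment's assumptions, not measurements.

COST_PER_GB = 0.20      # USD per GB of traffic (quoted hosting rate)
AVG_IMAGE_MB = 1.0      # "fairly large" average image size

def transfer_cost(num_images, avg_mb=AVG_IMAGE_MB, rate=COST_PER_GB):
    """Cost of serving each image exactly once (the InstantCommons model)."""
    total_gb = num_images * avg_mb / 1000.0
    return total_gb * rate

# 50,000 one-off downloads: roughly $10, matching the estimate above.
print(round(transfer_cost(50_000), 2))
```

The key point of the model is that this is a one-time cost per file, since every later pageview is served from the client wiki's cache.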

Some notes

First of all, I think it's a good idea, although I don't know about its feasibility.

  1. This should be implemented on a separate domain, say instant.commons.wikimedia.org. It would serve the same content as commons.wikimedia.org, but the special page would refuse to export any data unless called as instant.commons.wikimedia.org. Why? Other wikis will be eating your resources. You could easily move it to another server, or point the DNS entry at nothing. That would break all InstantCommons downloads (of course, "can't connect" must be handled), but Wikimedia Commons itself would keep working :) Note that clients could edit their hosts files to point at commons.wikimedia.org directly; this should be taken into account too. Of course, we must be able to disable exporting without having to change DNS, but DNS could stop large queries, DoS attacks, and so on.
  2. instant.commons has no obligation to generate thumbnails. The MediaWiki installation will send its name, URL, version, extensions, and so on (if a site doesn't want us to know these, it should deactivate InstantCommons). It asks for both the thumbnail and the original file (that is, file + size). Then instant.commons can decide: it may conclude that it has already sent that file to this IP (it's not caching!) and refuse, send the original file, send the thumbnail only, or even both if the installation can't generate thumbnails (this last case should be mandatory for instant.commons, but well...). If a wiki receives only a thumbnail, the text will then include something like "see this image in full at its description page..." (not a direct image URL).
  3. The image text crediting us should be part of the response, so that other wikis could offer the export function too.
  4. Take into account that many MediaWiki installations will have URL fopen disabled for security reasons, so this extension won't work there.
  5. Uploading to Commons from other wikis may be problematic, as they may have a different image policy.
  6. I like the idea of different wikis sharing their media with each other. Something like agreements to send TCP packets...
  7. Be aware that Wikipedia mirrors will want to get our images too.
  8. Some kind of query must be done to check for modifications/deletions too, such as asking once a month whether there have been changes to any of the images.
  9. Someone could make image spam: create an image on Commons, say [[image:Spam.png]], then go around lots of little wikis posting that image. They all download it. The image is deleted on Commons, but the spammer now has dozens of copies on the internet :S
  10. Even worse, the image could carry some kind of virus (like the WMF bug, now fixed).
  11. Some kind of permission flag for non-shareable media?
  12. Images with a wrong extension should be non-shareable by default.
  13. Asking a small wiki to download a big image could exhaust that wiki's disk space.
  14. A wiki shouldn't have InstantCommons enabled if it hasn't enabled uploads (of course, the wiki admin could configure this ;), I'm talking about defaults).
  15. Asking a lot of wikis to download big files from us could overload us.
  16. InstantCommons shouldn't be the only source of remote images. A wiki could have a list of wikis to query for an image (it shouldn't be too long, for performance reasons, but there could be several, each checked only if the image wasn't found on the previous one).

Platonides 23:27, 17 February 2006 (UTC)
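Item 16's check-the-next-wiki-only-if-not-on-the-previous lookup amounts to a loop over an ordered repository list. A minimal sketch, in which the dict-based "repos" are placeholders rather than any real MediaWiki interface:

```python
# Sketch of list item 16: try each configured remote repository in order
# and stop at the first one that has the file. The dicts stand in for
# whatever lookup the real extension would perform.

def find_file(title, repos):
    """Return (repo_name, file_data) from the first repo holding title."""
    for name, files in repos:
        if title in files:
            return name, files[title]
    return None, None

repos = [
    ("instant.commons", {"Foo.png": b"\x89PNG..."}),
    ("partner.wiki",    {"Bar.jpg": b"\xff\xd8..."}),
]

print(find_file("Bar.jpg", repos)[0])   # only the second repo answers
print(find_file("Baz.svg", repos)[0])   # not found anywhere
```

Keeping the list short, as the item suggests, matters because a miss costs one round trip per configured repository.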

The purpose of our projects is to deliver information to people around the world. This functionality will fetch pictures that exist on Commons once; the chance of this mechanism crashing Commons is therefore negligible. Thumbnails of a standard size have already been generated for MANY pictures, so this is mostly tapping into an existing resource. The text associated with a picture that is to be fetched will be the existing text; we cannot assume that people speak English, and we should not.
This function is to promote the use of Commons. It is not to create another Commons.
If people have multiple wikis, they can set up their local resource already. As for your suggestion that Wikipedia mirrors will want the pictures too: if they are legal mirrors, they can have them already, so what is your point?
This software will not solve hunger and war and bad breath. It allows other MediaWiki wikis to make use of Commons. That is a perfectly reasonable objective. When there are problems with pictures, like with text, we have no mechanism to manage these when they are on another website. We should not even consider if we want to be in that position.
As to "non-sharable" media... I think that is covered by the rules for the admission of material to Commons already.

Thanks, GerardM 08:14, 18 February 2006 (UTC)

The thumbnails may be provided. It's just that the specification should leave it up to Commons whether it wants to share them at that moment or not, and how much. I have rewritten it to "Commons has no obligation to generate thumbnails for other wikis" to make this clearer.
Sorry, by the associated text I meant the "This image comes from Commons" message, which would come in the XML rather than being hardcoded in MediaWiki. Of course the page text would be shared too.
I am not talking about creating another Commons. It would be the same website, just accessed from another domain, making it easy to split off (onto other servers, shutting down exporting...).
I have nothing against Wikipedia mirrors wanting the images; I am only pointing it out. Remember there have been some voices against mirrors which grab everything, so a hungry mirror may start asking for image after image...
"As to "non-sharable" media.. I think that is in the rules for the admission of material to Commons already." Yes, but what happens between the moment a file is uploaded and the moment it gets removed? There is some time before someone finds it, sees it shouldn't be on Commons, and tags it, and only after a week does it get deleted. So it makes sense that files which may be copyrighted do not enter the export system (maybe even a specific message: Not allowed to export. <reason>{{Copyvio}}</reason>?), at least once they are tagged. Maybe even some waiting period before a file can be shared, depending on how it works. It's undesirable, but maybe good for Commons folk. See the spammer example.
"When there are problems with pictures, like with text, we have no mechanism to manage these when they are on another website. We should not even consider if we want to be in that position." I don't fully understand you. Do you mean that if someone gets an image from us which is then deleted from Commons, we are not responsible for it? If that wiki is sued for copyright violation, the wiki will say it's our fault, so we should care. We sent them that image, we made it possible (just as we made the wiki's existence possible), and we didn't set up a mechanism to restrict it. That's why I talk about some kind of syncing between wikis to keep the children up to date. If a wiki admin decides it's not good to resync even once a month and disables it, he can. But we did provide the tools.
Platonides 16:08, 18 February 2006 (UTC)
The person who uploads content to Commons, or who writes in a project, is the one responsible for that action. The WMF has to make a reasonable effort to keep our content free of illegal content.
There are no purge mechanisms for text, and there is no obvious need for a purge mechanism for pictures. GerardM 20:03, 18 February 2006 (UTC)

Hello Platonides,

thanks for your comments. Implementation on a separate domain is not necessary as we are talking about single-loading operations. You should note that there is currently, by default, in every MediaWiki installation the ability to embed images from external servers as URLs. Try it yourself: Edit the Wikicities sandbox and paste in this URL:

http://upload.wikimedia.org/wikipedia/commons/2/23/Georgia_Aquarium_Tropical_Tank.jpg

You will note that the image is directly embedded into the page in full resolution. Not only that, the image is loaded every single time the page is viewed from Wikimedia's servers. It is also completely possible for me to generate a thumbnail on Commons and load it in the same way, as well as link to the full-size version. Wikimedia is doing nothing to prevent this except in egregious cases, and it hasn't taken down the servers yet as far as I'm aware. ;-)

Now, InstantCommons is a much more efficient method to load images from Commons, since every file only has to be loaded once (future updates notwithstanding - every updated version only has to be loaded once as well). What we're trying to do here is to make it more convenient to do the right thing than it is to do the wrong thing; we provide easy and full support for cached loading operations while only providing simple support for expensive loading operations.
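The download-once, serve-locally behaviour described above can be sketched as a thin caching layer. Everything here (the cache directory layout, the injected fetch function) is illustrative, not the extension's actual design:

```python
import hashlib
import os
import tempfile

def cached_fetch(url, cache_dir, fetch):
    """Return the local path for url, downloading it only on first use."""
    os.makedirs(cache_dir, exist_ok=True)
    # Hash the URL so any remote file name is safe on the local filesystem.
    key = hashlib.sha1(url.encode("utf-8")).hexdigest()
    path = os.path.join(cache_dir, key)
    if not os.path.exists(path):          # first request: hit the remote server
        data = fetch(url)                 # e.g. urllib.request.urlopen(...).read()
        with open(path, "wb") as f:
            f.write(data)
    return path                           # every later request is served locally

# Demonstrate that the remote server is contacted exactly once.
calls = []
def fake_fetch(url):
    calls.append(url)
    return b"image-bytes"

d = tempfile.mkdtemp()
p1 = cached_fetch("https://commons.example/Foo.jpg", d, fake_fetch)
p2 = cached_fetch("https://commons.example/Foo.jpg", d, fake_fetch)
print(p1 == p2, len(calls))   # same local path, one remote hit
```

This is the contrast with plain inline embedding, where every pageview re-fetches the file from Wikimedia's servers.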

As for fopen being disabled, we may be able to override this with ini_set; if not, there may still be the final route of allowing external tools like "wget" to be used. But it's true that not all installations will be able to support something like InstantCommons. In any case, in a shared hosting context something that can easily max out the available bandwidth like this is probably not going to be too popular.
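The fall-back chain hinted at here (try the normal in-process download first, then fall back to an external tool like "wget") can be sketched as follows. This is Python rather than the extension's PHP, and the list of methods is invented for illustration:

```python
import subprocess
import urllib.request

def download(url, dest, fetchers=None):
    """Try each available download method in turn; the first success wins."""
    def via_urllib(u, d):
        with urllib.request.urlopen(u) as r, open(d, "wb") as f:
            f.write(r.read())

    def via_wget(u, d):
        # External-tool fallback for hosts where in-process HTTP is blocked.
        subprocess.run(["wget", "-q", "-O", d, u], check=True)

    for fetch in fetchers or (via_urllib, via_wget):
        try:
            fetch(url, dest)
            return True
        except Exception:
            continue                      # this method unavailable; try the next
    return False

# Demonstrate the fallback order with injected stand-in methods.
import os, tempfile
attempts = []
def failing(u, d):
    attempts.append("failing")
    raise OSError("method unavailable")
def writing(u, d):
    attempts.append("writing")
    with open(d, "w") as f:
        f.write("ok")

dest = os.path.join(tempfile.mkdtemp(), "img")
ok = download("https://commons.example/x.jpg", dest, fetchers=(failing, writing))
print(ok, attempts)   # first method fails, second succeeds
```

As the comment notes, on a locked-down shared host every method in the chain may be blocked, in which case the feature simply cannot be supported there.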

Uploading to Commons from other wikis may be problematic, as they may have a different image policy.

Yes, obviously. But it is possible to make a very clear distinction between free and non-free media in the user interface. This is a future idea in any case.

Some kind of query must be done to check for modifications/deletions too, such as asking once a month whether there have been changes to any of the images.

I have added the need for a maintenance script to the specs. In addition, wiki admins should be able to delete images and prevent them from being re-created under the same name.
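A maintenance script of this kind boils down to diffing the local cache against the remote state. This sketch assumes the remote state is available as a name-to-checksum mapping; how the real script would obtain it (e.g. via an API query) is left open:

```python
def plan_sync(local, remote):
    """Compare cached files against the remote wiki's current state.

    Both arguments map file name -> content hash. A name missing from
    `remote` means the file was deleted on Commons; a differing hash
    means it was updated. Returns (to_update, to_delete).
    """
    to_update = [n for n, h in local.items() if n in remote and remote[n] != h]
    to_delete = [n for n in local if n not in remote]
    return sorted(to_update), sorted(to_delete)

local  = {"A.png": "111", "B.jpg": "222", "C.svg": "333"}
remote = {"A.png": "111", "B.jpg": "999"}   # B changed, C was deleted

print(plan_sync(local, remote))
```

Separating the planning step from the actual re-download/deletion also makes it easy for an admin to review the plan before anything is touched.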

Some kind of permission of non-shareable media?

No. Wikimedia Commons media are free by definition.

Images with a wrong extension should be non-shareable by default.

Files with the "wrong extension" cannot be uploaded in the first place.

Asking a small wiki to download a big image could exhaust that wiki's disk space.

Hence the ability to configure a per-file maximum limit.

A wiki shouldn't have InstantCommons enabled if it hasn't enabled uploads (of course, the wiki admin could configure this ;), I'm talking about defaults).

The wiki needs to have a writable directory to store the files, but I don't see why uploads should have to be enabled for other reasons.

InstantCommons shouldn't be the only source of remote images. A wiki could have a list of wikis to query for an image (it shouldn't be too long, for performance reasons, but there could be several, each checked only if the image wasn't found on the previous one).

Possibly, but not in this implementation. Another possibility opened up by InstantCommons is to utilize the users (using wikis) of InstantCommons content as mirrors in the Wikimedia context, a first step towards a more "peer-to-peer" infrastructure. But key to success is to define reasonable milestones. This is the first one.--Eloquence 06:20, 19 February 2006 (UTC)


Of course an "include from the other server" implementation is much easier; you have already explained that. But notice that Wikimedia could easily block almost all of these transcluded images, replacing them with "This is a Wikimedia image. We are having server problems. Please donate." for any referring URL other than Wikimedia projects.
Again, there's no need for any special setup on Wikimedia's site. As InstantCommons talks to Wikimedia's servers via XML-RPC before requesting images, Wikimedia's servers can see what hosts cause how much traffic, and either disable the API completely or for particular hosts, should it become a problem -- which, as already explained multiple times, is extremely unlikely.--Eloquence 15:51, 19 February 2006 (UTC)
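The per-host visibility described here could, server-side, be as simple as byte accounting with a cut-off quota. The class shape and the numbers below are invented for illustration:

```python
class HostTrafficMeter:
    """Track bytes served per requesting host and flag hosts over a quota."""

    def __init__(self, quota_bytes):
        self.quota = quota_bytes
        self.used = {}

    def record(self, host, nbytes):
        self.used[host] = self.used.get(host, 0) + nbytes

    def allowed(self, host):
        # Hosts that exceeded their quota get the API disabled.
        return self.used.get(host, 0) <= self.quota

meter = HostTrafficMeter(quota_bytes=10_000_000)   # 10 MB quota, illustrative
meter.record("smallwiki.example", 2_000_000)
meter.record("hungry.example", 50_000_000)
print(meter.allowed("smallwiki.example"), meter.allowed("hungry.example"))
```

Because clients announce themselves through the API before fetching, this kind of accounting needs no DNS tricks or separate domain.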
The fopen problem is only a possible issue I'm recalling. Of course if it's my own computer I can configure it, or I could write a script to download the file, but on a more restrictive host maybe I couldn't, much less run external programs.
"No. Wikimedia Commons media are free by definition." Already answered above. What happens if we end up redistributing undesirable images to other wikis? And yes, GerardM, the person who uploads it is the one responsible. But we should take all of this into account too. If someone spams www.example.com we can delete the image (here, not on the sites that imported it!) and ban the account, but he will come back under another username if it works. And we know he comes from www.example.com, but... we should be able to stop it from here.
Image spam is likely to be less of a problem, since the whole point of spamming is to boost Google ranking through regular hyperlinks. A non-visible image that is easily reverted from a page does nothing for a spammer. But in any case, both for copyright and for spam, the maintenance script I have added to the specs could fix those issues on a regular basis.--Eloquence 15:51, 19 February 2006 (UTC)
Well, I was thinking more of spam as a big Coca-Cola image rather than a simple link to www.cocacola.com. Or you could use unattended MediaWiki installations to store images like the (sadly) famous Muhammad images. And what about virus images? Until some admin of that wiki notices, you get infected, even the admin. And not only IE is vulnerable (though it usually is the most); there were some Mozilla problems where the browser (computer?) hung while asking for a lot of memory. So IE users can get a virus, Mozilla users can't remove it because it hangs... and no wiki-savvy user around to change the URL or use Opera :D Platonides 15:58, 19 February 2006 (UTC)
Sorry, but that's pure paranoia. If there's an issue with JPEG and GIF security exploits, you're going to be in trouble one way or another: through direct uploads to the wiki, through embedded inline links, and so forth.--Eloquence 17:01, 19 February 2006 (UTC)
Maybe I'm a bit paranoid. That's why I prefer to have everything under control, a second domain for the API queries, etc. During the WMF bug, fake WMF images could have another extension, and IE would read them, see that they were WMF, and trigger the exploit. If you can avoid its propagation, you are halfway there. :-) Platonides 17:28, 19 February 2006 (UTC)
Again, in this case you should be much more worried about inline embedding of images, as there's no way we can detect whether these truly are JPEGs or GIFs. We can check the material that is uploaded to Commons. So a mechanism like InstantCommons provides the infrastructure for more security, not less.--Eloquence 17:41, 19 February 2006 (UTC)
"Files with the "wrong extension" cannot be uploaded in the first place." You can't upload a .wma file, but you could upload WMA audio with a .ogg extension (and some players even treat it as WMA). I think MediaWiki can detect this automatically and display an alert. So it could also set shareable to false for such a file until a person verifies it.
Either Wikimedia detects false extensions or it doesn't; that is a security consideration on the Wikimedia side. There's no need for this "sharable" flag either way.--Eloquence 15:51, 19 February 2006 (UTC)
"The wiki needs to have a writable directory to store the files, but I don't see why uploads should have to be enabled for other reasons." Because an admin who doesn't want people uploading files probably won't want people pulling in files from InstantCommons either. Admins who know what they want can override it, of course.
InstantCommons cannot be enabled without a writable upload directory, so the point is moot.--Eloquence 15:51, 19 February 2006 (UTC)
Again, the server quota may be filled with one big file or several small ones. Or were you talking about a maximum per file set by the remote wiki, not by Commons?
Platonides 15:20, 19 February 2006 (UTC)
A total InstantCommons cache size in addition to bandwidth limits may make sense.--Eloquence 15:51, 19 February 2006 (UTC)
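A total cache size on top of the per-file limit could work as simple least-recently-used eviction. All limits below are invented for illustration:

```python
from collections import OrderedDict

class InstantCommonsCache:
    """Per-file size limit plus a total cache budget with LRU eviction."""

    def __init__(self, max_file_bytes, max_total_bytes):
        self.max_file = max_file_bytes
        self.max_total = max_total_bytes
        self.files = OrderedDict()          # name -> size, oldest first
        self.total = 0

    def add(self, name, size):
        if size > self.max_file:
            return False                    # refuse oversized files outright
        while self.total + size > self.max_total and self.files:
            _, evicted = self.files.popitem(last=False)   # drop least recent
            self.total -= evicted
        self.files[name] = size
        self.total += size
        return True

cache = InstantCommonsCache(max_file_bytes=5, max_total_bytes=10)
print(cache.add("big", 6))              # over the per-file limit: refused
cache.add("a", 4); cache.add("b", 4); cache.add("c", 4)
print(list(cache.files), cache.total)   # "a" was evicted to stay in budget
```

The per-file limit answers item 13 (one huge image filling a small wiki's disk); the total budget answers the quota concern raised just above.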

Stupid questions

A couple of really stupid questions I was just thinking about: What about video? If you put a link like this: , would it be downloaded (since it's a movie and not really embedded)? What about a link like this (with a colon in front of the name): Image:Electricity substation danger.jpg, would that be downloaded to the other wiki? Finally, what about material under Commons:template:CopyrightByWikimedia? Bawolff 19:59, 19 February 2006 (UTC)

Non-embedded files would only be downloaded when you view the file description page, provided they fall within the size limits specified by the site administrator. As for copyright by Wikimedia stuff, I don't think it's worth worrying too much about, but if it becomes an issue, we can implement the sharable flag suggested above.--Eloquence 03:46, 20 February 2006 (UTC)

Perspective from MediaWiki site owners:

I am a bit of an outsider here, but I like the whole concept of having a central repository of images. I'd like to help it along, but my primary goal is to have a good MediaWiki site for my users, and if helping Commons gets in the way, I'm sorry, but the Commons feature loses. I don't mean to be rough; I want people to understand the perspective. We have a ton of images coming into our site and, to be honest, maybe only 10% of them are of interest to the audience of Commons. But some of them are good photos, e.g. scenic views of the Hawaiian islands, Hawaiian craft items, religious art.

I'd like to encourage my folks to upload images to Commons. I have no problem caching the images on my server; my goal is not to conserve storage space, my goal is to contribute stuff. As a side benefit, it would be nice if any further annotations of the Commons version could be mirrored to my site. That is a minor benefit and one we can do without, but there is beauty in the collaboration concept and the mutual benefits.

I think this is a magnificent idea, however you folks choose to implement it. I am willing to go a few extra miles even if this means a slightly greater admin burden on my site. But you know, egregious things like my cached copy being deleted if the Commons copy gets deleted would be a nonstarter for us.

Good luck with this proposal. I think this could really open Commons to a floodgate of further contributions of high quality content. MakThorpe 22:48, 22 March 2006 (UTC)

Opening a floodgate to the Commons is something I am slightly worried about. We have enough trouble convincing Wikimedia users of the importance of copyright, sources, licensing etc, let alone other wikis. But if this is implemented, it would be good to work with the external site owners to try and minimise problems like this. pfctdayelise 10:49, 23 March 2006 (UTC)
It won't open a floodgate unless we explicitly want it to, that is, simply making images from Commons transparently available to other wikis will not automatically lead to thousands of people creating accounts on Commons and uploading files there. We can (and should anyway) emphasize the free/non-free distinction in the user interface if we seek to encourage cross-wiki uploading. The more important question, in my opinion, is whether or not we want images that do not necessarily have relevance to any existing Wikimedia project. This is a question for the Wikimedia community to decide. But I think the two goals of making the files available for use and getting users to upload new files are distinct.--Eloquence 15:17, 23 March 2006 (UTC)
Right. Simply because attaining one goal benefits a second goal does not make the second subordinate in any way. It is my conjecture that making Commons more relevant to external wikis would have this effect, but I really have nothing but my faith in people's generous nature and common sense to base that on. Just to clarify, we are non-profit, not interested in retaining anything more than copyleft, and have three professional photographers who generate hundreds of photos per month. We know that most have no relevance to the Commons community, and personally I think the graphics folks would err on the side of not uploading the 5 or 10% that are of general interest, because I would imagine that putting them on Commons would be more of a hassle. So I don't think anyone should fear folks uploading tons of useless pictures just because it is in their workflow. I was saying floodgates because I see a LOT of MediaWiki-based sites popping up, so even if you are only getting a small percentage of their good stuff, that is a nice feed of content to jack into. I presume the implementation would safeguard against nefarious schemes. Regardless, the motive of bandwidth stealing is a transitory concern. In our case, we have 150 GB and 1.5 TB/month for $20/month. For us that is way more than we need, but it is dirt cheap to add more. So to me, bandwidth stealing in the context of a discussion about typical bitmaps is already a bit antiquarian. In our case, we don't need any storage or bandwidth from anyone. -MakThorpe 01:45, 26 March 2006 (UTC)
I'm absolutely for encouraging and simplifying uploads of relevant content from other communities. One important aspect of this will be cross-wiki authentication; if you don't even have to log in in order to upload to Commons, the incentive to do so will be even greater. I will write some more soon about the problems and potential of cross-wiki uploads, and how we can encourage the use of correct licensing through the user interface. But the core functionality of InstantCommons is what I would like to focus on for the time being.--Eloquence 22:34, 28 March 2006 (UTC)

Instant Mirror

Why limit this principle to Commons? It could be used to build a live mirror with caching of Wikipedia or any MediaWiki website. Normally live mirrors of Wikipedia will be blocked, but that's because they don't do any caching. --LA2 09:09, 31 March 2006 (UTC)

Wikinfo does just that. However, the strategic benefits of providing such functionality are much less clear to me. Wikinfo is a fork, and such cached live-mirror functionality would be of most interest to forks, especially if no publish/subscribe model is implemented. Encouraging forks is, of course, not necessarily a good thing.--Eloquence 09:36, 31 March 2006 (UTC)

Great Idea

This idea is revolutionary, and I would like to support it wholeheartedly. It takes at least several minutes to properly prepare an image for upload to Commons or any other image database: you have to take the picture, secure licence clearance (if you are not the author), add context information and a description, rename the file to a human-readable form, tag it, etc., and hit the upload button. In contrast, on Wikimedia projects it takes only 3 seconds to put the address of a Commons image into an article or wiki page. The latest-files patrol on Wikimedia Commons, as a method of preventing copyright infringement, is perhaps not perfect, but pretty good in comparison to other user-generated (not-for-profit) sites. In short, this is a very efficient way to project high-quality, checked multimedia content across the entire wiki sphere / internet. If there were a possibility to place a link to Commons on the target wikis, this could be powerful public relations. Great thing. Greetings, Longbow4u 19:13, 31 March 2006 (UTC)

I can also say that I think it is a great idea, and I hope you don't stop the project. Kolossos 22:42, 3 July 2006 (UTC)

Automatic email notification to author

Perhaps it would be possible to send an automatic system email to the Commons author of a media file when their image or other media is used on another wiki, with the corresponding IP address where the image is located, so they can check it out (if it is not access-restricted). This could come as a monthly digest, to avoid too much traffic, and could optionally be turned off in the user settings. Longbow4u 11:54, 31 May 2006 (UTC)

Future potential

Symmetrical file sharing

Why should only other wikis benefit from Wikimedia Commons? If we provide the other direction as well, then for example (skipping implementation details) an author on Wiki A might find a useful resource file on Wiki B and reference it as if it were available from his local Wiki A. After saving his edit, he follows the red link to the resource and is automatically directed to an upload form, where he enters only the reference to the file on Wiki B. Provided all three wikis are enabled to share content via InstantCommons, then, depending on installation details, either:

  • Wikimedia Commons requests the file, with all of its history, uploader, and other metadata, from Wiki B, stores it locally, and makes it available to Wiki A, or
  • Wiki A requests the file from Wiki B exactly the way it would from Wikimedia Commons.

Some functionality pointing in this direction has recently been added to MediaWiki. --Purodha Blissenbach 18:29, 2 September 2006 (UTC)

Peer-bridging down times

When a server becomes (partially or temporarily) unavailable due to scheduled maintenance, hardware malfunction, network outages, or overload, and a MediaWiki installation using it is informed of the situation (whether by an automated monitoring process or by human intervention), it could easily emit (all or some) media URLs referring to a peer server, e.g. Wikimedia Commons, as a measure to bridge the outage with considerably less service degradation for end users.

If Wikimedia Commons keeps track of the latest (or latest N) external mirror(s) holding a copy of any given file, and these data are accessible to other servers in the MediaWiki server farm even when the image server(s) themselves are not functioning properly, the inverse support is possible as well. Provided a sensible way of load balancing can be implemented that promises not to overload small sites, I'm pretty confident site managers will be willing to negotiate and agree to such a mutual support scheme.

Big enough sites could of course act like Wikimedia Commons as described above, as a central repository with fallback to others.

I think transparently redirecting media requests to secondary or fallback servers could also be integrated with page caching, but that is a task of its own. --Purodha Blissenbach 18:29, 2 September 2006 (UTC)

Quick!!

InstantCommons is a proposed feature for MediaWiki to allow the usage of any uploaded media file from the Wikimedia Commons in any MediaWiki installation world-wide. InstantCommons-enabled wikis would cache Commons content so that it would only be downloaded once, and subsequent pageviews would load the locally existing copy.

Wow!! Excellent feature! But I know that this feature exists already. I want to use Commons images on my MediaWiki... make it quick, please... :) -- WonYong (Talk / Contrib) 01:19, 25 September 2006 (UTC)

Project Status (?)

Does anybody know whether this project is going to be implemented, or is it just open for discussion? As one can read here, there are many things to be taken into account, but it might be best to distinguish between two cases:

Shared-ReadOnly:
  • Traffic problem (using an image link already permits using Commons uncached!)
  • ...

Shared-ReadWrite:
  • Different upload policies
  • ...

If it is too critical to enable the Shared-ReadWrite case, one could still permit Commons to be used as a resource for the user wikis.

What do you think?

--141.41.37.95 10:46, 25 September 2006 (UTC)

According to GerardM, the current implementation works to some degree. I don't know to what degree, and this page should link to the code... +sj | help with translation |+ 21:02, 25 December 2007 (UTC)
Instantcommons branches

Re: Maintenance script for updates and copyright issues

IMHO it is likely not generally acceptable to some wiki operators to have media deleted locally once their copies on Commons have been deleted. I do not mind this strategy as a first crude implementation, since local admins have the choice of running this script or not. I suggest giving it a "test and report only" mode from the beginning, so that admins can review deletions before they are actually made and take appropriate action if they disagree.

IMHO, giving reasons and additional information for deletions would be vital for a successful propagation strategy. (This does not differ much from the current state of affairs when propagating deletions from Commons to other Wikimedia projects; see also: commons::Village_pump#Handling_delections_of_files_in_use_more_gracefully.)
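The "test and report only" mode suggested above is essentially a dry-run switch on the deletion pass. In this sketch the list of pending deletions, with their reasons, is assumed to have been computed elsewhere:

```python
def propagate_deletions(pending, delete_fn, report_only=True):
    """Apply (or merely report) deletions propagated from Commons.

    `pending` is a list of (file_name, reason) pairs computed elsewhere;
    with report_only=True nothing is touched and the admin just gets the
    report, including the reason for each deletion.
    """
    report = []
    for name, reason in pending:
        report.append(f"{name}: {reason}")
        if not report_only:
            delete_fn(name)
    return report

deleted = []
pending = [("Foo.png", "copyright violation"), ("Bar.jpg", "superseded")]

report = propagate_deletions(pending, deleted.append, report_only=True)
print(len(report), deleted)   # two lines reported, nothing deleted
```

Carrying the reason through to the report is what lets a local admin decide, per file, whether the Commons rationale applies under local policy.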

There are several worries and grievances a local admin may have before letting the Commons community and/or Commons admins govern part of their local repository of media files. For example:

  • accepting a deletion when a proper replacement is available may be much easier than having to fear losing a potentially vital piece of information;
  • being informed of a pending deletion that could cause harm, properly in advance, should increase confidence and the feeling of being recognized;
  • being able to preserve media which may not meet the policies of Commons but do meet local wiki policies is likely a necessity for the acceptance of automated deletions in general;
    • likely also for a local suggestion to upload appropriate content to Commons;
  • etc.

These need to be addressed if InstantCommons is to be accepted by wiki operators and wiki communities outside WMF projects and closely related ones.

Since policies and concern for copyright issues vary widely across the world, and thus between wikis, design decisions need to be made under the premise that wiki editors likely do not, and need not, understand some of the copyright questions that are not relevant under their locale. This shall not limit Commons, of course, but it may limit the use of information about why such and such a decision was made on Commons, which requires some special scrutiny when localising or regionalising it.

Greetings --Purodha Blissenbach 17:37, 23 September 2007 (UTC)

From other Wikis

Don't get me wrong: this sounds like a really cool feature. What I'm interested in is this same inter-wiki sharing from somewhere OTHER than Commons. For example, I run several corporate internal wikis; I'd love to have a central wiki for all shared images, much like Commons does now. Is that in the specs? (Also, if it happens to be already available, could someone point me in the right direction? Thanks) --ShakataGaNai 07:32, 10 January 2008 (UTC)

I Want This

If Commons is truly to be common, then let it be accessible. I will use this extension. Beware creeping featuritis. When the specification grows to include opening beer cans, it's gone too far. Implement, please. Xiong 11:41, 14 May 2008 (UTC)

ForeignAPIRepo

For testing, I've tossed a ForeignAPIRepo class in for 1.13; this allows the "native" use of files from another wiki, fetching info via MediaWiki's web app API.

Currently this doesn't do local caching or thumbnailing of files, pulling everything from the remote server. Transparent caching of files could be a rather icky DoS issue (filling up servers quick!), so who knows if anyone'll bother implementing this. :)

But for small-scale or testing use, it does appear to do the job so far, and won't necessarily require any extensions. --brion 17:46, 22 May 2008 (UTC)
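A ForeignAPIRepo-style lookup against MediaWiki's web API might look roughly like this: build an action=query request for a file's imageinfo and read the answer. The sample response below is abbreviated to the fields used here; the real api.php returns more:

```python
from urllib.parse import urlencode

def imageinfo_url(api_base, title):
    """Build a MediaWiki API request for a file's URL and dimensions."""
    params = {
        "action": "query",
        "titles": f"File:{title}",
        "prop": "imageinfo",
        "iiprop": "url|size",
        "format": "json",
    }
    return api_base + "?" + urlencode(params)

def parse_imageinfo(response):
    """Pull the first imageinfo record out of an action=query response."""
    pages = response["query"]["pages"]
    page = next(iter(pages.values()))
    return page["imageinfo"][0]

url = imageinfo_url("https://commons.wikimedia.org/w/api.php", "Foo.jpg")

# Abbreviated response of the general shape api.php returns for this query.
sample = {"query": {"pages": {"123": {"title": "File:Foo.jpg",
    "imageinfo": [{"url": "https://upload.example/Foo.jpg",
                   "width": 800, "height": 600}]}}}}

info = parse_imageinfo(sample)
print(info["width"], info["height"])
```

Fetching metadata this way on every request is what makes the no-local-cache mode simple, and also why caching would be the natural next step.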
