Project:Support desk

About this board

Welcome to the MediaWiki Support desk. This is a place where you can ask any questions you have about installing, using or administering the MediaWiki software.

Before you post

Post a new question

  1. To help us answer your questions, please indicate which version of MediaWiki you are using, as found on your wiki's Special:Version page:
  2. If possible, add $wgShowExceptionDetails = true;, error_reporting( -1 ); and ini_set( 'display_errors', 1 ); to LocalSettings.php in order to make MediaWiki show more detailed error messages (see the snippet after this list).
  3. Please include the web address (URL) to your wiki if possible. It's often easier for us to identify the source of the problem if we can see the error directly.
  4. To start a new thread, click the box with the text "Start a new topic".
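
For reference, the debug settings from item 2 as they would appear near the top of LocalSettings.php:

error_reporting( -1 );             // report every category of PHP error
ini_set( 'display_errors', 1 );    // print errors to the browser instead of hiding them
$wgShowExceptionDetails = true;    // include exception backtraces on MediaWiki error pages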

Upgrade from 1.35.14 to 1.39.10 or 1.42.3 stops at ExistingWiki

89.17.154.63 (talkcontribs)

Hi

I am in the process of upgrading my old wiki to the latest version.

I am using the WebUpdater, as I don't have command-line access; I have a cPanel interface.

PHP 8.3.12

MySQL 8.0.40

I successfully upgraded from 1.24 to 1.35.14 without many issues.

I have tried a few times to upgrade from 1.35.14 to 1.42.3.

The web updater stops in step 2 at ExistingWiki with a blank page.

Then I tried to upgrade from 1.35.14 to 1.39.10 with the same result: it stops at ExistingWiki with a blank page and no errors.

When viewing my wiki at this stage, the code seems updated to the correct version, but the database is probably still on the 1.35 schema, so no content is accessible, only database errors.

I have no idea how to continue from here.

I have asked my hosting support to provide error logs, but nothing there seems relevant.

This wiki dates back to 2005 so LocalSettings.php has been updated a lot.

Any suggestions?

Thanks in advance.

Osnard (talkcontribs)

Using PHP 8.3.12 on the MediaWiki 1.39.10 codebase may be an issue. Try this step of the update with PHP 8.1.

In general: if a blank screen (internal server error) is shown on the client, there is also an entry in the server-side PHP error log. You (or your hoster) may need to tweak error_reporting first, but you should be able to get an actual error message in such a case.

If at all possible, use the command-line updater rather than the web updater.

213.181.100.121 (talkcontribs)

Hi,

the expert support team at my hosting company says no PHP error logs are to be found, and the setup does not appear to use any of the normal PHP log locations. Where would those PHP logs be located?

I added the following lines at the top of the LocalSettings file

error_reporting( -1 );

ini_set( 'display_errors', 1 );

$wgShowExceptionDetails = true;

I don't know if this is the correct way and would appreciate any help as I am completely stuck.

Note that prior to getting to this point, I got on-screen errors for issues that I had to fix in LocalSettings.php, as well as having to update the PHP version, to reach the first step of the WebUpdater.
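
One option when the host's logs turn up empty is to point PHP at a log file you control. A minimal sketch for LocalSettings.php, assuming the web server can write to the chosen path (the path below is a hypothetical placeholder):

error_reporting( -1 );
ini_set( 'log_errors', 1 );  // write errors to a file, not only to the screen
ini_set( 'error_log', '/home/youraccount/logs/php-error.log' );  // placeholder path, ideally outside the web root
$wgShowExceptionDetails = true;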

Reply to "Upgrade from 1.35.14 to 1.39.10 or 1.42.3 stops at ExistingWiki"

Why does this wiki have registering disabled?

2601:C6:D200:E9B0:C71:F201:A4A6:5316 (talkcontribs)

I'm reposting this after about two months because I didn't get an answer, but I'm curious to know why the Touhou wiki (https://en.touhouwiki.net/wiki/Touhou_Wiki) has registration disabled. They say you should request an account on their Discord, but I don't have a Discord account and am not willing to make one I will never use, so I don't see the point of making an account just to request another account on a different platform. There is nowhere on the Touhou wiki itself to discuss this, because you need an account to edit, and I don't know how else to get in touch with the admins. I'm wondering if this is maybe to prevent vandals, or to stop people from mass-creating accounts. I've always been looking forward to editing on this wiki, but unfortunately I can't. I would most likely make blog-post pages about my own fanworks and fix things that are outdated, typos, etc.

And I've never been able to do these because they say you have to request your account be made via Discord.

I'm not asking FOR an account, just asking why, and whether there could be an alternate way or something, I dunno.

Thanks! <3

2601:C6:D200:E9B0:C71:F201:A4A6:5316 19:47, 4 November 2024 (UTC)

Tropicalkitty (talkcontribs)

To answer the technical-setup side of this question, see Manual:Preventing access. As for why they chose to disable account creation, we are unable to assist.
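
For reference, the technical switch described in Manual:Preventing access is a single permission setting. A minimal sketch of how a wiki disables open account creation in LocalSettings.php:

// Visitors can no longer create accounts themselves; users with the
// appropriate right (sysops by default) can still create them.
$wgGroupPermissions['*']['createaccount'] = false;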

Reply to "Why does this wiki have registering disabled?"

How to hide "Special:ShortUrl" from new pages

2tapadm (talkcontribs)

Greetings:

I'm running MediaWiki 1.40.4 on Ubuntu 22.04 with Nginx 1.26.2.

Today, after enabling the "ShortUrl" extension, I started seeing "Special:ShortUrl/#" on each newly created page.

I read the docs on Extension:ShortUrl, but it's not clear to me how I can hide this value (the short URL) from being displayed under the page name.

Any guidance or hints will be greatly appreciated.

--2TA

Reply to "How to hide "Special:ShortUrl" from new pages"

How to hide 'notice 1' from MediaWiki:Editnotice-0 in VE?

Rob Kam (talkcontribs)

I can add a message for editors at MediaWiki:Editnotice-0, but in Visual Editor it's headed by 'notice 1'. How do I remove this, so as to have only the plain message?

Ammarpad (talkcontribs)
Rob Kam (talkcontribs)

I deleted it but it comes back with the same "$1 {{PLURAL:$1|notice|notices}}" content?

Ammarpad (talkcontribs)

Sorry, I should have said blank it rather than delete it. That is, edit the message page and save it with nothing there (remove the text you see and save).

Rob Kam (talkcontribs)

Thanks. I copied some other text in there and then blanked that. That seems to have got rid of the number.

Reply to "How to hide 'notice 1' from MediaWiki:Editnotice-0 in VE?"
Mediawiki PHP stubs

Lwangaman (talkcontribs)

Would it be possible to have some official stubs, at least for MediaWiki core? I have made an attempt here: https://github.com/JohnRDOrazio/mediawiki-stubs; however, I have not been very successful in getting it to work in VSCode with intelephense. Perhaps someone more experienced with PHPStan or `nikic/php-parser` or similar tools could provide a repository with official stubs that can be used with VSCode intelephense? It would make development of plugins or code contributions to core that much easier...

And perhaps this should be another topic, but along the same lines of development environment: when using Composer to add the MediaWiki codesniffer rules to my project, i.e.

        "require-dev": {
                "mediawiki/mediawiki-codesniffer": "43.0.0",
                "mediawiki/mediawiki-phan-config": "0.14.0",
                "mediawiki/minus-x": "1.1.3",
                "php-parallel-lint/php-console-highlighter": "1.0.0",
                "php-parallel-lint/php-parallel-lint": "1.4.0"
        }

I get the following error:

phpcs: Trait "MediaWiki\Sniffs\PHPUnit\PHPUnitTestTrait" not found in {path-to-project}/vendor/mediawiki/mediawiki-codesniffer/MediaWiki/Sniffs/PHPUnit/SetMethodsSniff.php on line 15

What am I missing to get this to work nicely with VSCode?

I have a .phpcs.xml file in the root folder of my project:

<?xml version="1.0"?>
<ruleset>
    <rule ref="./vendor/mediawiki/mediawiki-codesniffer/MediaWiki" />
    <file>.</file>
    <arg name="extensions" value="php"/>
    <arg name="encoding" value="UTF-8"/>
</ruleset>
MarkAHershberger (talkcontribs)

I'm not using VSCode, but I do use intelephense and I'm having some success. Since it works for me without stubs, I'm curious what you mean about stubs? That said I do use "intelephense.environment.includePaths" to point to the mediawiki core when I'm working on extensions.
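
For instance, a minimal settings.json sketch for VSCode (the include path is a placeholder for wherever a MediaWiki core checkout lives on disk):

{
    "intelephense.environment.includePaths": [
        "/path/to/mediawiki-core"
    ]
}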

Does that help?

Lwangaman (talkcontribs)

A stubs file is a PHP file containing all of the function, method, and constant signatures for a codebase, together with their doc blocks, so that even without a local copy of MediaWiki core you won't get "undefined method" or similar errors in the code editor, and you get information about a class, method, or constant when you hover over it (so you can see, for example, which parameters a method expects, or which methods are available on a class instance).
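
For illustration, a hand-written sketch of what one entry in such a stubs file might look like (signatures and doc blocks only, no method bodies; the class and signature below are from MediaWiki core's Title, shown here purely as an example):

namespace MediaWiki\Title;

class Title {
    /**
     * Create a new Title from text, such as what one would find in a link.
     * @param string|null $text the link text
     * @param int $defaultNamespace namespace to use if none is given
     * @return Title|null a new Title, or null on malformed input
     */
    public static function newFromText( $text, $defaultNamespace = NS_MAIN ) {}
}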

I think that the trouble I was having may be due to the memory limit imposed by intelephense (see https://github.com/bmewburn/vscode-intelephense/issues/3083#issuecomment-2394779492). My stubs file is 9MB, so `intelephense.files.maxSize` needs to be set to a value greater than 9MB. I haven't gotten around to testing this yet, but I wouldn't be surprised if that were the issue. My attempt at producing the stubs file is fairly rudimentary, and winds up being more bloated than necessary (classes within the same namespace should be grouped together within the same namespace block instead of being each in their own namespace block, for example). 9MB seems a bit much for a stubs file, and it could perhaps be broken down into multiple stubs files.

An example of a similar stubs file for WordPress: https://raw.githubusercontent.com/php-stubs/wordpress-stubs/refs/heads/master/wordpress-stubs.php. This stubs file is 4.66MB.

Lwangaman (talkcontribs)

I have finally fixed generation of the MediaWiki intelephense stubs, after spending a whole day fine-tuning the generation script. The updated stubs: https://github.com/JohnRDOrazio/mediawiki-stubs/tree/main/stubs. I have also published the generation script on Packagist: https://packagist.org/packages/johnrdorazio/mediawikistubs.

I manually added the stubs to the VSCode intelephense `node_modules` folder and enabled `mediawiki` as an entry in the extension's `package.json`, and I now finally have intelligent recognition of core classes, methods, definitions, etc. I requested that the stubs be added by default to the extension stubs, see https://github.com/bmewburn/vscode-intelephense/issues/3083#issuecomment-2455722065.

Reply to "Mediawiki PHP stubs"

minervaneue disabled after install

2A02:AB88:5687:8500:6533:C042:8E92:C72F (talkcontribs)

Hello guys. I'm trying to install and run MediaWiki on a second hosting service. MediaWiki installs successfully, but it refuses to use the MinervaNeue skin. Even though all the settings in LocalSettings.php are configured correctly, it still doesn't work. The installer was downloaded from the official MediaWiki site, and the skin is also present. I can't figure out what the problem is. Can anyone please help me?

https://kaszinowiki.org/

I'll translate the error message displayed by the site:

Oops! The default wiki interface, which is set to MinervaNeue according to $wgDefaultSkin, is not available.

The installation includes the following interfaces. For more information on configuring skins and setting the default skin, see Manual: Skin configuration.

  • minervaneue / MinervaNeue (disabled)
  • monobook / MonoBook (enabled)
  • timeless / Timeless (enabled)
  • vector / Vector (enabled)

If you just installed MediaWiki:

  • You likely installed it from Git or in some other way directly from the source. In this case, this is expected. Try installing skins from the MediaWiki skin directory on mediawiki.org in one of the following ways:
    • Download the tarball installer, which includes many skins and extensions. Simply copy the skins/ directory from it.
    • Download individual skin packages from mediawiki.org.
    • Download skins using Git
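
If the MinervaNeue files are already present under skins/, the usual cause of the "(disabled)" state is that the skin is never loaded. A minimal sketch of the relevant LocalSettings.php lines (per Skin:Minerva Neue, the registered skin name is 'minerva', not 'minervaneue'):

wfLoadSkin( 'MinervaNeue' );  // directory name under skins/
$wgDefaultSkin = 'minerva';   // the skin's registered name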
Reply to "minervaneue disabled after install"
Memory Leak

Mkulawin (talkcontribs)

Hello, I updated my MediaWiki from 1.35.14 to 1.42.3.

After running:

php maintenance/run.php update.php

The wiki is not running. The php-fpm logfile shows the error:

PHP Fatal error:  Allowed memory size of 52428800 bytes exhausted (tried to allocate 40960 bytes) in /var/www/mediawiki-1.42.3/includes/config/GlobalVarConfig.php on line 75

I increased the memory limit from 20M to 100M, but I got the same error with the other limit.

Can you help?

Bawolff (talkcontribs)

The error says the memory limit is only 50 MB. This is way too low for MediaWiki; please increase it to about 256 MB.
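
A sketch of where the limits can be raised, assuming the host allows overriding them; note that MediaWiki also applies its own per-request cap, $wgMemoryLimit, which defaults to 50M:

$wgMemoryLimit = '256M';            // MediaWiki's own per-request limit (default: 50M)
ini_set( 'memory_limit', '256M' );  // the underlying PHP limit, if the host permits setting it here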

Mkulawin (talkcontribs)

Thanks for the info. I changed it to 256 MB, but this did not solve the issue.

[25-Oct-2024 08:21:52 UTC] PHP Fatal error:  Allowed memory size of 268435456 bytes exhausted (tried to allocate 40960 bytes) in /var/www/mediawiki-1.42.3/includes/config/GlobalVarConfig.php on line 75

I increased the memory to 512 MB; then the limit is not reached, but I got this error instead:

[25-Oct-2024 08:30:40 UTC] PHP Fatal error:  Maximum execution time of 30 seconds exceeded in /var/www/mediawiki-1.42.3/includes/libs/rdbms/database/DatabaseMySQL.php on line 756.

I use the database version:

Server version: 10.3.39-MariaDB MariaDB Server

Bawolff (talkcontribs)

Normally people disable the max execution time for command-line scripts. It is normal for update.php to take more than 30 seconds.

Mkulawin (talkcontribs)

update.php doesn't need more than 30 seconds; the update runs without issue. The problem is:

When I go to our MediaWiki web page and index.php, I get a timeout because index.php never comes back. When I increase the timeout, you can see in the php-fpm logfile that we reach the memory limit (see the posts above). When I increase the memory enough, I get a timeout from DatabaseMySQL.php instead, but I see no long-running queries on the database. To me it looks like we have a memory leak when index.php accesses the database, but I don't see the reason.

This was a test:

When I updated only the MediaWiki PHP part, not the database part, the index.php page opened without errors. The issue starts after running php maintenance/run.php update.php.

Mkulawin (talkcontribs)

I tested the update to version mediawiki-1.41.4. That works well; I don't have issues with that version.

Mkulawin (talkcontribs)

Can you help me with version 1.42.3 and the memory leak? I haven't found a solution; using 1.41.4 is only a workaround.

Reply to "Memory Leak"

How to make thumbnails of .tiff, .pdf, .mp4, .webm, .mov, etc files appear on their respective media pages

Guillaume Taillefer (talkcontribs)

I'm trying to upload different files to my MediaWiki site to test whether everything will work or not. The uploads were successful, but when I went to the file page for each file I uploaded, it appears it wasn't showing the thumbnail of the file. For some weird reason, whenever I uploaded JPEGs or PNGs the thumbnails would show up, but when I tried to upload PDFs, TIFFs, MP4s, WEBMs, MOVs, etc., the thumbnails wouldn't show up (or, in the case of media files, the media player didn't show up). I don't know if there is some sort of extension I have to install or something else, but for PDFs I was expecting to see something like this: https://en.wikipedia.org/w/index.php?title=File%3ABonaparte_-_Acte_de_M%C3%A9diation%2C_1803.pdf&page=1. As for movie or audio files, just the standard Wikimedia Commons players. And for TIFF, just the way it is done with JPEG and PNG. If anyone could help with this I would be grateful, thanks

Bawolff (talkcontribs)
Guillaume Taillefer (talkcontribs)

Thank you for the response! I downloaded each one (except PdfHandler, because it was already downloaded) and added their respective wfLoadExtension() calls in LocalSettings.php. However, when I tried PdfHandler, it wouldn't show the PDF like in the link I sent you (I even downloaded that PDF and uploaded it to my site, but the problem remained). All that is added is a warning about PDFs. Then I tried the TIFF handler, but the same problem happened. Finally I tried TimedMediaHandler, but now whenever I try to go to the pages for those files, my web browser gives me an error that the page is inaccessible. Thanks

Bawolff (talkcontribs)

These extensions require some additional programs installed on the server, which might be the issue (although often they are already installed); check their docs for details. Other than that, are there any error messages? What does MediaWiki say the dimensions of the file are?

Also, if newly uploaded files work but old ones don't, you might need to purge the old pages.


For the page-inaccessible thing: make sure that PHP error reporting is enabled. See How to debug for details.

Guillaume Taillefer (talkcontribs)

No, there aren't any error messages for the TIFF and PDF file formats, both of which show 0 x 0 dimensions. However, for the media player (a problem I was about to post about in its respective discussion page), my browser either tells me that the page is inaccessible, or the page loads and gives me this error message: [YrucAQ2Mk-EubQoECBuYnwAAAM8] 2022-06-29 00:25:37: Fatal exception of type "Wikimedia\Rdbms\DBQueryError"

Before, I didn't have FFmpeg, but then I tried installing the 5.0.1 standard Linux version from this page (which was linked from the official website): https://johnvansickle.com/ffmpeg/. I extracted the files into a folder, which I pointed to with the $wgFFmpegLocation setting, but the same problems happened.

Jonathan3 (talkcontribs)

"No there isn't any error messages for the TIFF and PDF file formats both of which show 0 x 0 dimentions."

I had this problem with PDFs on a shared server and couldn't get to the bottom of it, despite advice from here. When I set up a VPS from scratch it just started to work. That knowledge might or might not help you!

Bawolff (talkcontribs)

So a 0 x 0 dimension usually means there is a problem extracting the width and height. For PDFs, this usually means the pdfinfo command is not installed. For TIFF, I think it means identify is not installed. Sometimes these values get saved, so after changing something you should upload a new file to test (or run refreshImageMetadata.php). It can also mean MediaWiki cannot run external programs (check the PHP error log; the MediaWiki debug log may also have more info on precisely what command is run).

For the TimedMediaHandler error: most commonly that means you need to run update.php (or the web installer). Also consider enabling $wgShowExceptionDetails = true; in LocalSettings.php.
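
Where the helper programs live in non-standard places (common on shared hosting), Extension:PdfHandler can be pointed at them explicitly. A sketch with placeholder paths; substitute the real locations (e.g. from which gs convert pdfinfo pdftotext):

$wgPdfProcessor = '/usr/bin/gs';           // Ghostscript, renders PDF pages
$wgPdfPostProcessor = '/usr/bin/convert';  // ImageMagick, scales the rendered pages
$wgPdfInfo = '/usr/bin/pdfinfo';           // extracts page count and dimensions
$wgPdftoText = '/usr/bin/pdftotext';       // extracts text for search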

Guillaume Taillefer (talkcontribs)

I'm sorry for not responding for months, but do you know exactly how to download xpdf and configure it for MediaWiki? Which directory do I put the file(s) in, do I need to extract anything, run anything, etc.? Thanks

Bawolff (talkcontribs)

It depends on your operating system. On Debian, typically via apt-get. Nowadays people tend to use poppler-utils instead of xpdf.

Guillaume Taillefer (talkcontribs)

I am using Bluehost, which uses CentOS, unfortunately on a shared hosting account, so I can't use yum, sudo, apt-get, etc. I did manage to download the tar archive from xpdf, put it into the usr/bin/pdfinfo sub-directory, and unpack it. However, whenever I run the command which gs convert pdfinfo pdftotext, it only gives me the results of:

/usr/bin/gs

/usr/bin/convert

In the INSTALL file, in the folder created from the archive, it says:

There is no need to "install" anything from this package. You can unpack the tar file and run the binaries directly from there.

However, "running the binaries directly from there" is what I don't exactly know how to do.

Bawolff (talkcontribs)

If the folder you are installing it to is not in your $PATH, it won't show up in which, and you have to use the full path to run it.

E.g. if it is in a subdirectory usr/bin/pdfinfo of the current directory, you would run it as ./usr/bin/pdfinfo

You may have to use the chmod command to set the execute bit (chmod u+x path/to/file), although that may already have been done for you if it came from a tar file.

Guillaume Taillefer (talkcontribs)

So, I think I know what you mean, but I want to make sure and ask some questions. The bin folder that I put the xpdf files under is under a directory (let's just call it "site"), which is in turn under a directory called "home", so /home/site/bin/. However, through SSH I figured out where the other two programs (convert and gs) were actually located, and they weren't in /home/site/bin/. Instead there is a parent directory above home/ with no name (let's just call it "parent"). Under parent (which, as I found out, doesn't allow the ls command and gives you an error), I found the true usr directory, and when you go to bin under that folder, that's where all the other programs such as convert and gs are: so parent/home/site/bin and parent/usr/bin.

The problem is that usr is a read-only directory (because I'm on shared hosting), so I can't put, delete, or modify anything under usr. Therefore I can't put pdfinfo or pdftotext in there. I then tried setting $wgPdfInfo and $wgPdftoText to the directories where pdfinfo and pdftotext are located, but when I go to any PDF files (even ones I uploaded after modifying the variables), the 0 x 0 problem still shows. I'm not sure if the solutions you gave have to do with this, but I wanted to explain the full problem to better solve it. Again, thank you for all your help.

Bawolff (talkcontribs)

For reference, what you are referring to as "parent" is usually called the root directory, or just /.

That all sounds like the normal setup for shared hosting. Most programs are in /usr/bin (or a few other more obscure directories), but only the system administrator can add things there.

Programs can normally be anywhere, but if they aren't in the normal directory, you have to type out the full path instead of just the program name.

The one extra thing is that the program has to have execute permissions in order to run. You can change a program's permissions with the chmod command.

Some shared hosts don't let you run custom programs (perhaps via SELinux rules or something), so that's a possibility that may or may not apply to you, although most hosts do allow it.

If you enable the MediaWiki debug log (see How to debug), it should include what it tried to run and the errors encountered, which might help with debugging.


There are also many different kinds of Linux. If you got precompiled binaries, it's possible you have the wrong kind. I would try running these commands with the full path (so if the binary is in the site directory, typing /home/site/pdfinfo) from SSH to make sure the programs work.

Guillaume Taillefer (talkcontribs)

So I was finally able to find a version of xpdf that supports glibc 2.12... through the Wayback Machine (I got 4.02 from mid-late 2020). I then replaced 4.04 with 4.02, and now the new pdftotext and pdfinfo work, except for a few problems. The new path they are under is home/site/bin/xpdf-tools-linux-4.02/bin64/(pdfinfo/pdftotext). I ran each one with ./ before home/site/etc. and they worked, giving me a list of options for pdfinfo and pdftotext. However, that's basically where it ends. I tried to point the $wgPdftoText and $wgPdfInfo variables at the file paths I mentioned, but when I type which pdfinfo pdftotext into my SSH session, nothing shows. I then tried your chmod u+x command on each one, and both worked without errors, but the which command still shows nothing. Each file has all permissions set to 777, and still nothing.

Guillaume Taillefer (talkcontribs)

Never mind, this part I actually solved. Since then I added

$wgPdfProcessor = '/usr/bin/gs';

$wgPdfPostProcessor = $wgImageMagickConvertCommand;

to LocalSettings.php, above the pdfinfo and pdftotext variables (I don't know if this contributed to solving the problem, but I just wanted to let future users know).

I then ran both the refreshImageMetadata.php and rebuildImages.php scripts, and then I went back to the pdf files, and they worked!!! :)

Guillaume Taillefer (talkcontribs)

I tried the debugging, but no errors pop up. I have both programs' permissions set to 0777 so they can do as they please. I then tried executing each program via ./pdfinfo and ./pdftotext (I was in bin) and they gave me this error: ./pdfinfo: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by ./pdfinfo/pdfinfo). I looked into it and found that I needed to download glibc 2.14, and found where I could download it from GNU. However, I couldn't find where exactly to extract it for MediaWiki. Thanks

Bawolff (talkcontribs)

You probably don't want to just download your own libc; that gets really complicated. It's probably better to find a copy of pdfinfo that has been compiled for the version of glibc included with your OS, or to try compiling it yourself (also rather complex).

If you really want to try a custom libc, read up on LD_LIBRARY_PATH, but I really recommend against it.

Guillaume Taillefer (talkcontribs)

I also have a problem with TIFF files. The extension seems to be working, but I have three different issues. The first two concern TIFF files whose title ends with .tiff. First, some TIFF files do give me a thumbnail, but it has a grey background with a message saying: Error creating thumbnail: File with dimensions greater than 12.5 MP. The second case is that when I go to the files, at the top (I have debugging available now lol) I get the following repeating error message:

Notice: Undefined index: first_page in /home/site/public_html/extensions/PagedTiffHandler/includes/PagedTiffHandler.php on line 518

Notice: Undefined index: last_page in /home/site/public_html/extensions/PagedTiffHandler/includes/PagedTiffHandler.php on line 531

At one point in the middle of the repetition (it repeats line 531 one more time than 518), it shows the following:

Notice: Undefined index: page_count in /home/site/public_html/extensions/PagedTiffHandler/includes/PagedTiffHandler.php on line 505

The full page below, with the rest of the page's content, shows a blank thumbnail at 0 x 0.


Then there are the TIFF files which, when I upload them, get a .tif title instead of .tiff. When I go to their pages, even though the files do not have two pages (like a PDF file would), it shows the dropdown to flip to other pages (as in PdfHandler). There are two pages: the first one shows a thumbnail displaying the same 12.5 MP error message from above, and the second one is the actual image file.

Bawolff (talkcontribs)
Guillaume Taillefer (talkcontribs)

I tried increasing it to as much as $wgMaxImageArea = 10e7;, but in all cases the message now says 100 MP instead of 12.5 MP. By the way, these are scans of documents at around 3200 dpi and above. I know that MediaWiki (mine especially) is able to reduce the size of the image to fit on the page, because, as I mentioned, with a bunch of the TIFFs it gives me two pages (when there's only one) of the same thing: one with the image itself, and the other, exactly the same size, with the error message. Also, for the images that are displayed, if I click on "open media viewer", it gives me an error that it couldn't be displayed, retry, etc. I have no idea what I'm supposed to do to fix all that.

Guillaume Taillefer (talkcontribs)

For the double-paged one as well, even if I click on the page that shows the proper thumbnail, if I go into MediaViewer it gives me an error message saying: There seems to be a technical issue. You can retry if it persists. Error: error in provider, thumb info not found

Abhishek.lal (talkcontribs)

TimedMediaHandler breaks VisualEditor? Can you please explain why this happens?

{ "error": { "code": "internal_api_error_TypeError", "info": "[f16b2d79985533c31ed7fe29] Exception caught: DOMElement::setAttribute(): Argument #2 ($value) must be of type string, null given", "errorclass": "TypeError", "trace": "TypeError at

Bawolff (talkcontribs)
Reply to "How to make thumbnails of .tiff, .pdf, .mp4, .webm, .mov, etc files appear on their respective media pages"

Export wiki tables into CSV or other formats

Abdeaitali (talkcontribs)

This is my first topic here, so apologies if it is not well formulated!

I am wondering if there is a tool to simply extract data tables from Wikipedia articles (or other wiki projects) to other formats such as CSV.

Thanks!

AhmadF.Cheema (talkcontribs)
Abdeaitali (talkcontribs)

Thanks @AhmadF.Cheema for your comment!

Yeah, this is exactly the tool I use now, but I was wondering if there is some tool integrated into the wiki, or if there is a plan to integrate one.

A right-click on a wiki table offering export options would be a great integration.

Cheers!

Ciencia Al Poder (talkcontribs)
Mateussf (talkcontribs)

Hello. I've tried using the wikitable2csv.ggor.de tool, but I found a problem. The wikitable uses commas in its cells (for example, one cell is "sculptor, scientist"), and the tool also uses commas as the separators between cells. This means I can't open the CSV file properly in R or Excel, because there are more columns than column names. Is there any way to change the cell separator in the wikitable2csv tool? Thanks!
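
As an aside, properly quoted CSV can carry commas inside cells: the writer wraps any such cell in double quotes, and R and Excel both understand that. A minimal PHP sketch of the two approaches (quoting versus switching the separator):

$row = [ 'Leonardo', 'sculptor, scientist' ];
$out = fopen( 'php://stdout', 'w' );
fputcsv( $out, $row );       // Leonardo,"sculptor, scientist" (the comma cell is quoted)
fputcsv( $out, $row, ';' );  // Leonardo;sculptor, scientist (semicolon as separator)
fclose( $out );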

Reply to "Export wiki tables into CSV or other formats"

How to save a Wikipedia article as an ODT file?

223.24.170.80 (talkcontribs)

Hello!

I want to save an article with the most common Free and Open Source Software file extension I know of, which is ODT.

I want ODT and not PDF, so the file will be easily editable in any common rich text editor.

Wikipedia only allows me to save an article as a webpage or as a PDF file, even for a printable version, hence I ask:

How to save a Wikipedia article as an ODT file?

No conversions, no complex tasks, just simply saving a printable version (in HTML) to an ODT (in XML).

Thank you!

Gryllida (talkcontribs)

copy/paste?

Reply to "How to save a Wikipedia article as an ODT file?"