Manual talk:Pywikibot/2013
This page used the LiquidThreads extension to give structured discussions. It has since been converted to wikitext, so the content and history here are only an approximation of what was actually displayed at the time these comments were made.
I read the Traditional Chinese version; it says you must use the Monobook skin to run Pywikipediabot. Is that still true? 一個正常人 (talk) 13:43, 18 January 2013 (UTC)
- It seems it can be used with the Vector skin. 一個正常人 (talk) 11:58, 19 January 2013 (UTC)
- I propose using the core branch of the bot. It always uses the API for retrieving data, whereas the older compat branch still uses screen scraping in a few parts, which may not work with the Vector skin. But most of its methods have been ported to the API too. @xqt 13:22, 23 December 2013 (UTC)
Sorry, can pywikipediabot automatically add templates or other text after reflinks on certain sites? For example: [1], [2]. Thanks. Ворота рая Импресариата (talk) 10:44, 20 January 2013 (UTC)
- Your example is just your user page. 一個正常人 (talk) 12:42, 20 January 2013 (UTC)
- Yes, I know. This is just an example. But is it possible in principle to search through the links on a web site and add text after each link? The rules on the Russian Wikipedia have changed, and it is now impossible to use spamremove.py without an administrator's consent. After a link to a site that may spread malicious software, it is necessary to put the template. Thank you. Ворота рая Импресариата (talk) 15:07, 20 January 2013 (UTC)
I use this software on this site as Abcbot.
I input the text below in the cmd:
replace.py -ns:0 樹 树 -search:樹
and it found this page. I pressed "y". Then it returned:
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://zh.minecraftwiki.net/api.php'. Maybe the server is down. Retrying in 1 minutes...
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://zh.minecraftwiki.net/api.php'. Maybe the server is down. Retrying in 2 minutes...
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://zh.minecraftwiki.net/api.php'. Maybe the server is down. Retrying in 4 minutes...
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://zh.minecraftwiki.net/api.php'. Maybe the server is down. Retrying in 8 minutes...
HTTPError: 500 Internal Server Error
WARNING: Could not open 'http://zh.minecraftwiki.net/api.php'. Maybe the server is down. Retrying in 16 minutes...
After a while, it returned:
Traceback (most recent call last):
  File "C:\Documents and Settings\user\My Documents\pywikipedia\wikipedia.py", line 8902, in async_put
    page.put(newtext, comment, watchArticle, minorEdit, force)
  File "C:\Documents and Settings\user\My Documents\pywikipedia\wikipedia.py", line 2056, in put
    sysop = sysop, botflag=botflag, maxTries=maxTries)
  File "C:\Documents and Settings\user\My Documents\pywikipedia\wikipedia.py", line 2147, in _putPage
    response, data = query.GetData(params, self.site(), sysop=sysop, back_response = True)
  File "C:\Documents and Settings\user\My Documents\pywikipedia\pywikibot\support.py", line 115, in wrapper
    return method(*__args, **__kw)
  File "C:\Documents and Settings\user\My Documents\pywikipedia\query.py", line 143, in GetData
    res, jsontext = site.postForm(path, params, sysop, site.cookies(sysop = sysop))
  File "C:\Documents and Settings\user\My Documents\pywikipedia\wikipedia.py", line 6107, in postForm
    cookies=cookies)
  File "C:\Documents and Settings\user\My Documents\pywikipedia\wikipedia.py", line 6151, in postData
    f = MyURLopener.open(request)
  File "C:\Python27\lib\urllib2.py", line 406, in open
    response = meth(req, response)
  File "C:\Python27\lib\urllib2.py", line 519, in http_response
    'http', request, response, code, msg, hdrs)
  File "C:\Python27\lib\urllib2.py", line 444, in error
    return self._call_chain(*args)
  File "C:\Python27\lib\urllib2.py", line 378, in _call_chain
    result = func(*args)
  File "C:\Python27\lib\urllib2.py", line 527, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
HTTPError: HTTP Error 503: Service Unavailable
What should I do? 一個正常人 (talk) 12:53, 20 January 2013 (UTC)
- Then I used another word for the replacement. It succeeded. So strange! 一個正常人 (talk) 12:55, 20 January 2013 (UTC)
- The second bug is that the bot changes my words into question marks. See the contributions of my bot. 一個正常人 (talk) 08:58, 21 January 2013 (UTC)
- I don't see any question marks at the given link or the difflinks. Please give a difflink where you see them. This must be a character encoding problem in your browser (or in your command line). Bináristalk 22:01, 25 March 2013 (UTC)
- That is a contributions page; every edit with question marks in the summary has this problem. 一個正常人 (talk) 09:45, 8 August 2013 (UTC)
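The question marks are usually an encoding artifact, not something the bot writes deliberately: when a console (or any text layer) encodes CJK text with a codec that cannot represent it, each unencodable character is replaced by '?'. A minimal sketch of the mechanism, in plain Python with no pywikibot involved:

```python
def to_console(text, encoding="ascii"):
    # Mimic printing an edit summary to a console whose encoding
    # cannot represent CJK characters: each one becomes '?'.
    return text.encode(encoding, errors="replace").decode(encoding)

print(to_console(u"樹 -> 树"))  # prints "? -> ?"
```

This is why the fix is usually on the console/locale side (e.g. the console_encoding setting in user-config.py), not in the script itself.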
- These are clearly HTTP errors as stated. The problem is with the server, not the bot. Try again later or contact the maintainer of the wiki. Bináristalk 21:59, 25 March 2013 (UTC)
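The retry behaviour shown in the log above (waiting 1, 2, 4, 8, 16 minutes) is plain exponential backoff. A minimal sketch of the idea, with a hypothetical `fetch` callable standing in for the API request:

```python
import time

def fetch_with_backoff(fetch, max_retries=5, first_wait=60, sleep=time.sleep):
    """Call fetch(), retrying on I/O errors with doubling waits
    (60 s, 120 s, ... mirroring the bot's 1, 2, 4, 8, 16 minutes)."""
    wait = first_wait
    for attempt in range(max_retries):
        try:
            return fetch()
        except IOError:
            if attempt == max_retries - 1:
                raise  # give up after the last retry
            sleep(wait)
            wait *= 2
```

The `sleep` parameter is only there so the waiting can be stubbed out in tests; the bot's own retry loop is built in and needs no such helper.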
I'm not sure how useful these links are, or if they should be included as translations somehow, but I am just copying the template m:Template:Pywiki-lang's content:
es - fa - fr - hu - ja - nl - pt
Since the template will be deleted on Meta-Wiki, I figured I should at least archive the translations somewhere more relevant, maybe here. πr2 (t • c) 04:47, 9 March 2013 (UTC)
- I'm porting my script from trunk to rewrite branch. How can I create item in wikidata? ChongDae (talk) 05:39, 30 April 2013 (UTC)
import pywikibot

site = pywikibot.Site('en', 'wikipedia')
repo = site.data_repository()
data = {'sitelinks': {'site': 'enwiki', 'title': 'Main Page'}}
something = repo.editEntity({}, data, bot=True)
print something
- It is in the roadmap to make this an ItemPage constructor in the future. Legoktm (talk) 20:44, 30 April 2013 (UTC)
- thanks... been looking a while for this! Edoderoo (talk) 20:42, 19 January 2017 (UTC)
What needs to be done for the bot to work on the Min Wikipedia? Kolega2357 (talk) 15:40, 1 May 2013 (UTC)
- What error are you having? Legoktm (talk) 19:32, 2 May 2013 (UTC)
Hi,
I ran into an issue with pywikipediabot today while using the pagefromfile.py script to generate pages on my SMW box from template files. Please see below:
mediawiki@ontodiaAZ:~/core/pywikipedia$ python pagefromfile.py -appendtop
No handlers could be found for logger "pywiki"
Reading 'dict.txt'...
>>> Resource/postalCode/12698 <<<
Traceback (most recent call last):
  File "pagefromfile.py", line 365, in <module>
    main()
  File "pagefromfile.py", line 361, in main
    bot.run()
  File "pagefromfile.py", line 159, in run
    self.put(title, contents)
  File "pagefromfile.py", line 188, in put
    if appendtops.find(self.nocontents)==-1 and appendtops.find(self.nocontents.lower())==-1:
NameError: global name 'appendtops' is not defined
I have checked the script itself but cannot see the declaration of variable 'appendtops', can anybody help? Much thanks ahead! Akkking (talk) 22:12, 7 May 2013 (UTC)
- The issue has been resolved.
- If anyone has a similar issue and cannot resolve it, please kindly let me know. Akkking (talk) 21:03, 20 May 2013 (UTC)
- I get the same error when I use the -appendbottom option. How did you fix it? HerrMister (talk) 15:41, 24 May 2013 (UTC)
- Could you write the solution to the problem here?
- What I did was I changed these lines of code:
if self.append == "Top":
if appendtops.find(self.nocontents)==-1 and appendtops.find(self.nocontents.lower())==-1:
contents=contents +appendtops
pywikibot.output(u"Page %s already exists, appending on top!"
% title)
else:
pywikibot.output(u'Page had %s so it is skipped' % (self.nocontents))
return
contents = contents + page.get()
comment = comment_top
elif self.append == "Bottom":
if appendtops.find(self.nocontents)==-1 and appendtops.find(self.nocontents.lower())==-1:
contents=contents +appendtops
pywikibot.output(u"Page %s already exists, appending on bottom!"
% title)
else:
pywikibot.output(u'Page had %s so it is skipped' % (self.nocontents))
return
contents = page.get() + contents
comment = comment_bottom
- to
if self.append == "Top":
contents = contents + page.get()
comment = comment_top
elif self.append == "Bottom":
contents = page.get() + contents
comment = comment_bottom
Bennylin (talk) 13:57, 27 May 2013 (UTC)
- aji.mariyil@gmail.com / AjikkuttanMariyil 106.76.110.14 09:02, 22 May 2013 (UTC)
- Bug reported at bugzilla:58892 @xqt 13:30, 23 December 2013 (UTC)
- Now it's fixed Ladsgroup (talk) 23:27, 20 June 2014 (UTC)
MW 1.19.3
When I want to login with bot to my wiki I see this error:
No handlers could be found for logger "pywiki"
Logging in to westeros:fa as SiteBot via API.
Traceback (most recent call last):
  File "C:\pywikipedia\login.py", line 436, in <module>
    main()
  File "C:\pywikipedia\login.py", line 432, in main
    loginMan.login()
  File "C:\pywikipedia\login.py", line 319, in login
    cookiedata = self.getCookie(api)
  File "C:\pywikipedia\login.py", line 181, in getCookie
    response, data = query.GetData(predata, self.site, sysop=self.sysop, back_response = True)
  File "C:\pywikipedia\pywikibot\support.py", line 121, in wrapper
    return method(*__args, **__kw)
  File "C:\pywikipedia\query.py", line 143, in GetData
    res, jsontext = site.postForm(path, params, sysop, site.cookies(sysop = sysop))
  File "C:\pywikipedia\wikipedia.py", line 6460, in postForm
    cookies=cookies)
  File "C:\pywikipedia\wikipedia.py", line 6514, in postData
    raise PageNotFound(u'Page %s could not be retrieved. Check your family file ?' % url)
pywikibot.exceptions.PageNotFound: Page http://www.westeros.ir/w/api.php could not be retrieved. Check your family file?
My family file is this:
# -*- coding: utf-8 -*-
import family

# westeros
class Family(family.Family):
    def __init__(self):
        family.Family.__init__(self)
        self.name = 'westeros'
        self.langs = {
            'fa': 'www.westeros.ir',
        }

        def version(self, code):
            return "1.19.3"

        def scriptpath(self, code):
            return '/wiki'

        def apipath(self, code):
            return '/wiki'
api.php is in the '/wiki' folder and is defined correctly in the family file, so I don't know why it says "Page http://www.westeros.ir/w/api.php could not be retrieved". Where should I change "/w" to "/wiki" other than in the family.py file? 3dmahdi (talk) 06:58, 2 July 2013 (UTC)
- Hi. The issue is in your indentation. Try:
# -*- coding: utf-8 -*-
import family

# westeros
class Family(family.Family):
    def __init__(self):
        family.Family.__init__(self)
        self.name = 'westeros'
        self.langs = {
            'fa': 'www.westeros.ir',
        }

    def version(self, code):
        return "1.19.3"

    def scriptpath(self, code):
        return '/wiki'

    def apipath(self, code):
        return '/wiki'
- Rather than tabs, you should use 4 spaces, which will keep everything consistent over multiple computers/operating systems. Legoktm (talk) 15:25, 4 July 2013 (UTC)
- Thank you very much! That was the problem. Now my bot can login and edit pages! :) 3dmahdi (talk) 20:08, 5 July 2013 (UTC)
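Why indentation breaks the family file can be shown with a small illustration (hypothetical classes, not the real family code): a def nested inside __init__ only creates a local function, so the subclass never overrides the parent's method and the default path keeps winning.

```python
class Base(object):
    def scriptpath(self):
        return '/w'

class Broken(Base):
    def __init__(self):
        # Indented one level too deep: this is just a local function
        # inside __init__, so Base.scriptpath still wins.
        def scriptpath(self):
            return '/wiki'

class Fixed(Base):
    def scriptpath(self):  # proper method override at class level
        return '/wiki'

print(Broken().scriptpath())  # prints "/w" -- the override is ignored
print(Fixed().scriptpath())   # prints "/wiki"
```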
The following discussion is closed. Please do not modify it. Subsequent comments should be made on the appropriate discussion page. No further edits should be made to this discussion.
Hello, I am consistently getting this error, and not just with category.py.
C:\Documents and Settings\(me)\Desktop\pywikipedia>category.py add -links:"Cloth Armor"
Category to add (do not give namespace): Items
No handlers could be found for logger "pywiki"
Getting 60 pages from requiem_wiki:en...
ERROR: URLError: <urlopen error [Errno 11001] getaddrinfo failed>
WARNING: Could not open 'http://requiem.irowiki.orgNone/index.php?title=Special:Export&useskin=monobook'. Maybe the server or your connection is down. Retrying in 1 minutes...
Basically, I have no idea what to fix. I'm just trying to add a category to all the links found on a specific page. Any help is more than appreciated. Thank you very much. TheMobius (talk) 09:38, 4 August 2013 (UTC)
- Solved; I played around with family.py for a good amount of time. It ended up being a simple issue that I kept overlooking:
def scriptpath(self, code):
    return '/w/'
- I didn't have "/w/" set properly. TheMobius (talk) 11:34, 4 August 2013 (UTC)
(End of the closed discussion.)
New script
Would be great if someone created a script that removes unused parameters from templates in articles :o 200.127.196.161 22:45, 11 August 2013 (UTC)
- +1 Helder 00:18, 30 August 2013 (UTC)
- You can use the mwparserfromhell library (https://github.com/earwig/mwparserfromhell). Note that you need to be using at least v0.3.2 (the latest release):
# Authors: Legoktm, Earwig
import pywikibot
import mwparserfromhell as mwp

site = pywikibot.Site()
for page in site.allpages():
    code = mwp.parse(page.get())
    for temp in code.filter_templates():
        for param in temp.params:
            if not param.value.strip():
                temp.remove(param)
    page.put(code, 'Removing empty parameters')
- Untested, but should work. Legoktm (talk) 10:28, 31 August 2013 (UTC)
Can't login
(Summary: maybe https://sourceforge.net/p/pywikipediabot/bugs/1673/ ?)
Hi, I'm working on the German Minecraft Wiki (http://minecraft-de.gamepedia.com/) and I've got a bot. This wiki was moved to another URL recently and the login changed to the site curse.com. Now I can't log in with this bot. The log is the following:
(myComputer) ~/pywikipedia $ python login.py
Password for user XuBot on mc:de: ********
No handlers could be found for logger "pywiki"
Logging in to mc:de as XuBot via API.
Error downloading data: No JSON object could be decoded
Request de:/api.php?
Retrying in 1 minutes...
Why isn't it working? The api.php URL is right. 84.183.255.160 09:41, 22 September 2013 (UTC)
I am trying to localise the messages Pywikibot:Delete-referring-pages ("Robot: Deleting all pages referring from %(page)s") and Pywikibot:Delete-linked-pages ("Robot: Deleting all pages linked from %(page)s") at translatewiki.net. Apparently the documentation at delete.py states:
- -links: Delete all pages linked from a given page.
- -ref: Delete all pages referring from a given page.
Can someone please explain the difference between "link" and "refer"? Lloffiwr (talk) 10:00, 6 October 2013 (UTC)
- link is all the links on a page. So if your page content was [[Page1]] [[Page2]], it would use those two pages. refer is basically Special:WhatLinksHere, or all the pages that link to the page provided. Legoktm (talk) 21:10, 6 October 2013 (UTC)
- Done. Thanks; documented at translatewiki.net. Lloffiwr (talk) 21:00, 8 October 2013 (UTC)
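In other words, -links follows outgoing wikilinks while -ref follows incoming ones. A toy illustration of the difference with plain string parsing (not the actual pywikibot implementation):

```python
import re

def links_on(wikitext):
    """-links: titles linked *from* the page's own text."""
    return [m.group(1).split('|')[0].strip()
            for m in re.finditer(r'\[\[([^\]]+)\]\]', wikitext)]

def referring_to(pages, target):
    """-ref: pages that link *to* target, like Special:WhatLinksHere.
    `pages` is a hypothetical {title: wikitext} mapping."""
    return sorted(title for title, text in pages.items()
                  if target in links_on(text))
```

So for a page containing [[Page1]] [[Page2]], -links acts on Page1 and Page2, while -ref acts on whichever pages contain a link to the given page.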
This post by Nemo bis was moved on 2013-11-06. You can find it at Manual talk:Pywikibot/weblinkchecker.py#c-Mdann52-2013-11-06T08:35:00.000Z-Weblinkchecker.py. Nemo 23:19, 6 November 2013 (UTC)
Hello.
Is there a script to remove wanted (i.e. nonexistent) files from galleries?
<gallery>
Wiki.png
DOESNTEXIST.png|Hello
DOESNTEXIST.png
</gallery>
Thanks. An unknown anonymous user 11:35, 25 November 2013 (UTC)
- The delinker.py script could perhaps be used for this, though (afaik) it does require running continuously in order for it to get notified of deletions (it makes the assumption, mainly for scalability reasons, that existing documents don't reference wanted files).
- This is also the script used on Wikipedia to unlink files following a delete action on Wikimedia Commons. Krinkle (talk) 01:57, 6 December 2013 (UTC)
- Hi, you can use delinker.py or image.py (I don't know whether image.py supports removing from galleries or not).
- Sorry for delay because I didn't notice until now (is there any kind of mail notification here?) Thank you Krinkle for answering :)
- Best Ladsgroup (talk) 14:13, 10 December 2013 (UTC)
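If neither script fits, the gallery case can also be handled standalone: parse each <gallery> block and drop the lines whose file is not in a known-to-exist set. A minimal sketch; in a real bot the `existing` set would come from page-existence queries rather than being passed in:

```python
import re

def clean_gallery(wikitext, existing):
    """Drop gallery entries whose file name is not in `existing`."""
    def fix(match):
        kept = [line for line in match.group(1).splitlines()
                if not line.strip()                        # keep blank lines
                or line.split('|')[0].strip() in existing]  # keep existing files
        return '<gallery>' + '\n'.join(kept) + '\n</gallery>'
    return re.sub(r'<gallery>(.*?)</gallery>', fix, wikitext, flags=re.S)
```

Captions after the pipe (as in DOESNTEXIST.png|Hello) are ignored when checking the file name, so captioned entries are removed along with their captions.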
Hello, could anyone add the Tyvan language (tyv) to the source code? The Tyvan Wikipedia opened in August 2013, but it is still unable to get the data here. Soul Train (talk) 12:47, 16 December 2013 (UTC)
- tyv-wiki was activated on 13 September with gerrit:84114. @xqt 13:11, 23 December 2013 (UTC)
How do I stop the bot from running a script? It went wrong, creating articles with wrong names, and the only solution I found was to close the cmd window. Is there a better way that doesn't require exiting cmd? XXN (talk) 02:42, 22 December 2013 (UTC)
- Press Ctrl-C to stop the script with a KeyboardInterrupt. @xqt 13:06, 23 December 2013 (UTC)
- Does this solution work only on Unix-like OSes, or on Windows too? (I use Windows.) Thank you. XXN (talk) 02:25, 25 December 2013 (UTC)
- It works on Windows too :) @xqt 19:01, 28 December 2013 (UTC)
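On both platforms, Ctrl-C raises KeyboardInterrupt inside the Python process, so a script can also catch it to stop cleanly between pages instead of mid-edit. A minimal sketch with a hypothetical `process` callable:

```python
def run_pages(titles, process):
    """Process pages one at a time; Ctrl-C (KeyboardInterrupt) then
    stops cleanly between pages instead of killing the console."""
    done = 0
    try:
        for title in titles:
            process(title)
            done += 1
    except KeyboardInterrupt:
        print("Interrupted after %d pages." % done)
    return done
```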
Is the installation tutorial not updated for PWB 2.0+? I have an older version (with all scripts in the root folder by default, and no installation required [on Windows]), and it worked. But now I tried to install the newer version of PWB, and it is causing me trouble :/
Why is it now necessary to install PWB, with its files distributed across various folders (including AppData)? XXN (talk) 14:11, 26 December 2013 (UTC)
- You may install the core release (pwb 2.0), but you may also run the bot without installing it. In that case you must run scripts through the pwb wrapper script, i.e. you have to run
- pwb.py <script> <options>
- instead of
- <script> <options>
- The ".py" suffix may be omitted when running through the wrapper. @xqt 19:06, 28 December 2013 (UTC)
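For example (the replace options here are illustrative, borrowed from a run discussed earlier on this page):

```shell
# core (pwb 2.0): run scripts through the wrapper, from the core directory;
# the ".py" suffix on the script name may be omitted
python pwb.py replace -ns:0 foo bar

# old compat style: call the script directly
python replace.py -ns:0 foo bar
```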
When I try to enter, I get this message:
$ python login.py
WARNING: Skipped '/cygdrive/c/pywikipedia/user-config.py': owned by someone else.
Traceback (most recent call last):
  File "login.py", line 58, in <module>
    import re, os, query
  File "/cygdrive/c/pywikipedia/query.py", line 29, in <module>
    import wikipedia as pywikibot
  File "/cygdrive/c/pywikipedia/wikipedia.py", line 9056, in <module>
    getSite(noLogin=True)
  File "/cygdrive/c/pywikipedia/wikipedia.py", line 8807, in getSite
    _sites[key] = Site(code=code, fam=fam, user=user)
  File "/cygdrive/c/pywikipedia/pywikibot/support.py", line 121, in wrapper
    return method(*__args, **__kw)
  File "/cygdrive/c/pywikipedia/wikipedia.py", line 5989, in __init__
    % (self.__code, self.__family.name))
pywikibot.exceptions.NoSuchSite: Language language does not exist in family wikipedia
my user-config:
mylang = 'he'
family = 'wikivoyage'
usernames['wikivoyage']['he'] = u'DekelBot'
console_encoding = 'utf-8'
What should I do? Dekel E (talk) 19:22, 27 December 2013 (UTC)
- Have you updated your code?
- But the problem lies here:
WARNING: Skipped '/cygdrive/c/pywikipedia/user-config.py': owned by someone else.
- on linux you'd do "sudo chown <youruser> user-config.py" or whatever, dunno in your case. Nemo 19:35, 27 December 2013 (UTC)
- Seems you have to update your bot first. @xqt 19:09, 28 December 2013 (UTC)