If I copy hieroglyphs from WikiHiero, I get Gardiner's codes but not hieroglyph symbols with Egyptian Hieroglyph Format Controls.
Copying from WikiHiero
Hello, I would like to run CommonsNotifier on fawiki. I have followed the instructions, and everything seems OK except for the database part. When I run bin/first-run, I get the following error:
Traceback (most recent call last):
  File "/mnt/nfs/labstore-secondary-tools-project/nn1l2bot/bot/virtualenv/lib/python3.5/site-packages/pymysql/connections.py", line 920, in connect
    **kwargs)
  File "/usr/lib/python3.5/socket.py", line 712, in create_connection
    raise err
  File "/usr/lib/python3.5/socket.py", line 703, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "./make-list.py", line 11, in <module>
    userdb = mysql.connect()
  File "/mnt/nfs/labstore-secondary-tools-project/nn1l2bot/bot/commonsbot/mysql.py", line 33, in connect
    use_unicode=True)
  File "/mnt/nfs/labstore-secondary-tools-project/nn1l2bot/bot/virtualenv/lib/python3.5/site-packages/pymysql/__init__.py", line 90, in Connect
    return Connection(*args, **kwargs)
  File "/mnt/nfs/labstore-secondary-tools-project/nn1l2bot/bot/virtualenv/lib/python3.5/site-packages/pymysql/connections.py", line 699, in __init__
    self.connect()
  File "/mnt/nfs/labstore-secondary-tools-project/nn1l2bot/bot/virtualenv/lib/python3.5/site-packages/pymysql/connections.py", line 967, in connect
    raise exc
pymysql.err.OperationalError: (2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)")
CRITICAL: Exiting due to uncaught exception <class 'pymysql.err.OperationalError'>
I want to run the code by User:4nn1l2bot from toolforge. Currently, my.cnf file reads:
[client]
password = <redacted>
user = s53099
host = tools.db.svc.eqiad.wmflabs
database = s53099__commonsbot
I have spent several hours getting familiar with the process, but I'm stuck. I would appreciate your help.
I am sorry that I accidentally sent the above message with my bot account. I am new to Flow.
Problem solved with copying my.cnf from bot directory to its parent directory.
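For anyone hitting the same error: pymysql falls back to connecting to 'localhost' when it cannot find the credentials file, which matches the ConnectionRefusedError above. A minimal sketch (the inline file contents mirror the my.cnf quoted earlier; reading it from a string is just for illustration) of inspecting which host the [client] section actually points at before connecting:

```python
import configparser

# Parse a my.cnf-style file and check which host/database the
# [client] section points at before attempting any connection.
cnf = configparser.ConfigParser()
cnf.read_string("""\
[client]
user = s53099
host = tools.db.svc.eqiad.wmflabs
database = s53099__commonsbot
""")

params = dict(cnf["client"])
print(params["host"])  # expected: tools.db.svc.eqiad.wmflabs
```

If the host printed here is not the Toolforge database host, the client would silently default to localhost, producing exactly the error in the traceback above.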
Hey, did you know you can just ask us to run it for you? We'd be happy to help.
I would be more than happy to run it on the Persian Wikipedia (fawiki) and maybe the Arabic Wikipedia (arwiki). I will also complete all the formalities, such as filing a Phabricator ticket (phab:T190233), asking for approval, etc. Furthermore, this is the first edit by the bot on fawiki: fa:special:diff/26557322. However, if you believe that running two bots would be a waste of resources, or that for whatever reason the bot should be run by User:Community Tech bot, I understand and will abide by that. Please let me know.
@4nn1l2 Is there a reason you want to run this by yourself? Keeping it all in one place is much simpler.
@NKohli (WMF): I stopped running 4nn1l2bot. Please see m:Special:Diff/19194399. Phab ticket created too: phab:T227622. You guys should have announced the creation of the bot to the Technical Village Pump of different projects much sooner. One year after the creation of the bot, it has been deployed on only 4 Wikipedias. I'm sorry to be blunt, but Community Tech seems to have a communication problem.
@4nn1l2 I am sorry about that. This was announced on the project page for the wishlist project and messaged to people who voted for this project and also announced on our newsletter. We didn't announce it on all village pumps because we did not have the bandwidth to translate messages into so many languages. We will try to do a better job of it in future.
A barnstar for you!
|The Technical Barnstar|
|Thanks for addressing the WP0 issues on Phab recently. I really appreciate it.|
Piling on this. Using plain text in case you prefer it to graphic/pics :p
+1. Thanks so much.
Wow, thank you! It's an honor, ladies and gentlemen.
Country and city attribution of geolocated item
I found your GeoData API very clean, simple and user-friendly for retrieving various data according to the geolocation of an item. But there are difficulties with retrieving an item's country/city affiliation. While the country can theoretically be obtained in a single request (and even then not always, but only if it has been specified, and not by name but by its alphabetic code), the city can only be obtained for items which are cities themselves. Imagine I want to determine, in a single request providing the coordinates of the Sagrada Família temple, the name of the item and the fact that it is located in Barcelona, Spain. As far as I understand, there is no way to do that. On the other hand, this information does exist for every geotagged item and is available, for example, through the Wikidata SPARQL query service. But then I would need to perform a second request to Wikidata, which I would have liked to avoid by all means.
Can you advise on the optimal strategy to achieve this? If not, is it possible to add country/city attributes to the GeoData list=geosearch results?
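To make the question concrete, here is a sketch of the single geosearch request in question (the coordinates and radius are illustrative values for the Sagrada Família, not taken from any particular documentation). The response lists nearby pages, but carries no country/city attribution, which is what currently forces the second query:

```python
from urllib.parse import urlencode

# Build a single GeoData geosearch request for the coordinates of the
# Sagrada Família. The response will include nearby page titles and
# coordinates, but not the country or city the item belongs to.
params = {
    "action": "query",
    "list": "geosearch",
    "gscoord": "41.4036|2.1744",  # lat|lon, illustrative values
    "gsradius": 500,              # metres
    "gslimit": 10,
    "format": "json",
}
url = "https://en.wikipedia.org/w/api.php?" + urlencode(params)
print(url)
```

Resolving "this point is in Barcelona, Spain" from that response alone is not possible today; it would need either the requested new geosearch attributes or a follow-up Wikidata SPARQL query.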
Bot activity on the Dutch Wikipedia
Your bot account hasn't made any edits on the Dutch Wikipedia for at least three years. In accordance with the local bot policy, the bot flag will be removed in three months. To avoid losing the bot flag, you can confirm that you want to retain it by going to this page.
With kind regards, Kippenvlees1
New sections on translatable pages
Hi! You mustn't do that. If you remove an element, remove it together with its marker. If you add new ones, add them without a marker (no need to set any numbers manually). The engine will automatically mark up new paragraphs (one element = one paragraph). If you need them packed tighter, put each new element inside its own pair of <translate></translate> tags (also without markers).
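A minimal wikitext illustration of the rule above (the marker number T:1 is hypothetical; the Translate extension assigns real numbers on save):

```
<translate>
<!--T:1-->
Existing paragraph: if you delete it, delete the <!--T:1--> marker with it.

A newly added paragraph goes in with no marker at all;
the engine numbers it automatically on the next save.
</translate>
```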
Honestly, this wonder of an extension has annoyed me so much that I now simply try to avoid it :P
Removal of wfgeoLink function hook from the MapSources extension
Last night the wfgeoLink function hook was removed from the MapSources extension. Why did you do this, and why was the community not told? On the German Wikivoyage, #geolink is used about 30,000 times.
This hook has nothing to do with the slippymap.
You did us a disservice.
By the way, I am one of the former authors of this extension, but nobody added me as a subscriber.
Replied at phab:T149288.
You are laughable...
Unbelievable. As a new user of MediaWiki, I am really put off by this product. To delete pages on a whim, without informing the contributor in any way beforehand, is unprecedented and completely ridiculous. This is truly a first. I will never use this product again, and I will go out of my way to ensure that people avoid using it at all costs.
Have you seen the warning "Attention visitors: This site is for documentation on the MediaWiki software. If you are trying to add a page to your company's internal wiki site, you're in the wrong place. Just hit "back" in your browser until you find yourself back home." at the top of the edit page?
I am not questioning your reason behind deleting my page; I am questioning your process. Have you no notion of consideration? Is sending a warning message a difficult undertaking for you? Are you so irritated with your life that you feel the need to be a detriment to others? My advice is to take a vacation, find a new line of work, or see a therapist. You strike me as a very angry person.
Looks like you need to take a vacation too:)
I have to agree with SWEngineering here.
I think you should really stop deleting pages this fast and put a template first!
GeoData index problem
Hey Max. Having run into Bugzilla 49893 (the index problem with GeoData), I was wondering if there's any timetable for fixing it? I can get round the way it makes gslimit useless (or rather, a big over-estimate of the actual number of results returned), but it seems to drop results, which makes GeoData near-useless for my purposes. If there's a fix on the horizon then I don't mind waiting, but otherwise I'm going to have to start looking at other ways of extracting a comprehensive list of geotagged articles, like hacking through an entire database dump. (ugh!) Cheers.
This depends on switching all wikis to the Elasticsearch-based CirrusSearch, so it's probably a matter of weeks or a couple of months. Meanwhile, I manually perform a full reindex from time to time to reduce the impact of this bug.