Talk:Wikidata Query Service/User Manual
Item count
How can I just count the number of items that match a certain query? I mean, I get the total with the result set, but if the total is very large, the browser tab explodes, and loading the entire dataset is unnecessary anyway, since all I want is the number.
I tried to use SPARQL's COUNT() but got a syntax error. Grepping the User Manual for 'count' returned no results either. I'm guessing this is a FAQ, so perhaps an answer should be added to the manual, even if it is "currently you can't". Ijon (talk) 22:01, 30 October 2015 (UTC)
This is standard SPARQL. Try something like: SELECT (COUNT(DISTINCT ?x) AS ?count) WHERE { ... } Cheers, Bovlb (talk) 01:59, 17 October 2016 (UTC)
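For illustration, a minimal sketch that counts items (using wdt:P31 wd:Q5, "instance of human", purely as an example condition):

SELECT (COUNT(DISTINCT ?x) AS ?count) WHERE {
  ?x wdt:P31 wd:Q5 .   # every item that is an instance of human
}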
Ranks
Is there a way to take ranks into account, like ignoring deprecated statements in d:Wikidata:Database reports/items with P569 greater than P570? --Zolo (talk) 08:16, 27 January 2016 (UTC)
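As a side note, a sketch of one way to do this: the truthy wdt: prefix already skips deprecated statements, and ranks can also be filtered explicitly on statement nodes:

SELECT ?person ?birth WHERE {
  ?person p:P569 ?statement .
  ?statement ps:P569 ?birth ;
             wikibase:rank ?rank .
  FILTER(?rank != wikibase:DeprecatedRank)   # ignore deprecated birth-date statements
}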
Sitelinks
The manual still doesn't explain how to get sitelinks. And is it possible to get only the sitelinks of a certain namespace? --Tobias1984 (talk) 13:05, 27 May 2016 (UTC)
- I remember writing such a query and posting it to the #wikidata IRC channel, but I couldn't easily recreate it. The problem is that the contains function does not directly work on the URLs. Currently I'm a bit too swamped to search in the logs. -- Jan Zerebecki 14:30, 1 June 2016 (UTC)
- Querying for sitelinks is described here: https://www.wikidata.org/wiki/Wikidata:SPARQL_query_service/queries#Working_with_sitelinks Bovlb (talk) 02:12, 17 October 2016 (UTC)
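For reference, a minimal sitelink sketch (restricted to one example item and to English Wikipedia; it does not answer the namespace part of the question):

SELECT ?item ?article WHERE {
  VALUES ?item { wd:Q42 }                                  # example item: Douglas Adams
  ?article schema:about ?item ;
           schema:isPartOf <https://en.wikipedia.org/> .   # only the English Wikipedia sitelink
}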
Translations
Hi
Thank you for this amazing tool!
To make it easier to use, have you considered marking this manual for translation?
Thanks, Trizek (talk) 13:29, 18 June 2016 (UTC)
- @Trizek: Since no one has replied to this section in six months and a few changes have been made to the page, I think there should be no problem. If there is, anyone can revert and we can discuss.
- Ogoorcs (talk) 21:30, 5 March 2017 (UTC)
Invalid example
editThe example "showing the list of US presidents and their spouses" (section "Label service") results in the following error (when trying to evaluate it following the link "Try it"):
Query is malformed: QName 'v:P6' uses an undefined prefix
The prefix "v" is not defined in the menu "Prefixes" in the query tool, and I cannot find it in this page either.
--FGeorges (talk) 05:49, 6 October 2016 (UTC)
- I have fixed it. It should have been ps: . Bovlb (talk) 02:10, 17 October 2016 (UTC)
- Just tried it, works for me indeed. Thanks!
- --FGeorges (talk) 16:54, 21 October 2016 (UTC)
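For the record, the corrected pattern looks roughly like this (a sketch of the fixed example, using p:/ps: instead of the undefined v: prefix):

SELECT ?p ?pLabel ?w ?wLabel WHERE {
  wd:Q30 p:P6 ?statement .    # United States: head-of-government statements
  ?statement ps:P6 ?p .       # the president
  ?p p:P26 ?statement2 .
  ?statement2 ps:P26 ?w .     # the spouse
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
}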
Map view, multiple columns
For the map view, when there are multiple co-ordinate columns, it would be good to display them in distinct colors. Example query: http: //tiny url.com/h35p92u Bovlb (talk) 01:57, 17 October 2016 (UTC)
- @Bovlb: please just use full URLs, not shorteners. URL shorteners are blacklisted for a reason.
- Here's a full URL. [1] Bovlb (talk) 20:20, 13 December 2018 (UTC)
Label_service returns IllegalArgumentException
@Smalyshev (WMF): Wikidata_query_service/User_Manual#Label_service could be more robust. This query
select ?x ?Label where { ?x wdt:P31 wd:Q328468. SERVICE wikibase:label {bd:serviceParam wikibase:language "en,de,pl"} }
returns "Unknown error: java.lang.IllegalArgumentException".
The reason is that there's nothing between "?" and "Label" in the second variable. (It's looking for "x" or "xAlt".) --Vladimir Alexiev (talk) 19:59, 13 March 2017 (UTC)
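For comparison, renaming ?Label to ?xLabel (so there is a base variable name before "Label") avoids the error:

SELECT ?x ?xLabel WHERE {
  ?x wdt:P31 wd:Q328468 .
  SERVICE wikibase:label { bd:serviceParam wikibase:language "en,de,pl" . }
}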
Items with no value
For maintenance reasons it could be useful to make a query listing all articles in a given category that don't have a value set for a certain statement, e.g. all cities in the USA with no value set for P164/Flag. How do I make a query that only picks articles with no value for a certain statement? --82.134.16.126 20:38, 23 May 2017 (UTC)
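A common pattern for this is FILTER NOT EXISTS; a sketch, using P41 "flag image" purely as a stand-in for the property of interest:

SELECT ?city WHERE {
  ?city wdt:P31/wdt:P279* wd:Q515 ;             # instance of (a subclass of) city
        wdt:P17 wd:Q30 .                        # country: United States of America
  FILTER NOT EXISTS { ?city wdt:P41 ?flag . }   # keep only cities with no flag image set
}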
Label service - same effect without service
On my local Wikidata I would like to get the label information using direct Blazegraph queries.
https://www.mediawiki.org/wiki/Wikidata_query_service/User_Manual#Label_service mentions that it would be possible "to achieve the same effect" as with the label service using a plain SPARQL query. But how? --WolfgangFahl (talk) 23:06, 5 January 2018 (UTC)
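One way to do it in plain SPARQL, with no label service involved (a minimal sketch; the example class Q5 is arbitrary):

SELECT ?x ?xLabel WHERE {
  ?x wdt:P31 wd:Q5 .                                          # example: humans
  OPTIONAL { ?x rdfs:label ?xLabel . FILTER(LANG(?xLabel) = "en") }
}

The OPTIONAL keeps items that have no English label at all, which is roughly what the label service does for you.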
Example for label service in automatic mode is missing
The section about the label service ends with "e.g.:" but there is no example.
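Presumably the intended example was something along these lines (a sketch of automatic mode, where any variable ending in "Label" is filled in without mentioning rdfs:label inside the service call):

SELECT ?country ?countryLabel WHERE {
  ?country wdt:P31 wd:Q3624078 .   # sovereign states
  SERVICE wikibase:label { bd:serviceParam wikibase:language "[AUTO_LANGUAGE],en" . }
}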
I'm using the MWAPI service to run a query on a series of items and I'm confused by the number of rows I'm getting. Rather than aggregating after the fact, I'd like to reduce the number of rows at the source: is srlimit supposed to be respected? I also tried removing the quotes so that it's not a string, but that didn't change anything. --Nemo 08:14, 4 September 2018 (UTC)
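For context, the call in question presumably looks something like the sketch below; whether mwapi:srlimit is actually forwarded to the API is exactly the open question, so treat that parameter as an assumption:

SELECT ?title WHERE {
  SERVICE wikibase:mwapi {
    bd:serviceParam wikibase:endpoint "www.wikidata.org" ;
                    wikibase:api "Search" ;
                    mwapi:srsearch "haswbstatement:P31=Q5" ;   # example search string
                    mwapi:srlimit "10" .                       # assumption: passed through like other sr* parameters
    ?title wikibase:apiOutput mwapi:title .
  }
}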
Source code for label and geospatial services
Can anyone point me at the source code for the label and geospatial extensions? Thanks, Bovlb (talk) 19:47, 13 December 2018 (UTC)
Typo in sample "statement subject" query
The sample query at the end of this section currently has this line:
<random_URI_1> ps:P580 "1800-11-17" . # The X's start date is 1800-11-17
Instead of the "ps:" namespace, it should use "pq:", because "start date" is a qualifier here, not a statement.
My ideal outcome, I think, would be to duplicate a little more of the information from the SPARQL tutorial's Qualifiers section, mentioning the distinction between the two namespaces, and to emphasize the link to the tutorial more heavily; the tutorial is great, but this user manual led me down the wrong path for a while.
I don't understand why I can't fix it myself; perhaps because the page has been translated?
--JameySharp (talk) 17:16, 10 February 2019 (UTC)
- Fixed the typo; you couldn't do it yourself because the page is protected. Matěj Suchánek (talk) 10:38, 11 February 2019 (UTC)
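In short, the distinction is (a sketch, not the manual's exact query):

?x p:P39 ?statement .          # position held, reified as a statement node
?statement ps:P39 ?position ;  # ps: gives the statement's own (main) value
           pq:P580 ?start .    # pq: gives a qualifier value, here the start time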
Clearer value of '290B years' for translations
Please, what is the meaning of the suffix B in '290B'? — Preceding unsigned comment added by Wladek92 (talk • contribs) 5. 7. 2019, 14:04 (UTC)
- "bilion", I suppose. --Matěj Suchánek (talk) 08:56, 8 July 2019 (UTC)
Query timeout limit for important query reached
See here. --jobu0101 (talk) 21:56, 10 August 2019 (UTC)
Export LDF results
I am looking to export all the results of this LDF request, which contains over 94G results, into one or more files. Right now, the only output I can get is a web page, 100 results at a time. To get all the results client side, I would have to download 940K web pages, which is not convenient. I tried adding different data formats (application/ld+json and the other documented formats) to the URL, but to no avail.
Is there a way to export the results into a file, client side, or is the web interface the only way of getting access to the results? Dirac (talk) 17:18, 14 August 2019 (UTC)
DCAT AP query is not working anymore
The DCAT AP query is not working anymore. PAC2 (talk) 20:16, 5 May 2021 (UTC)
Query Execution Limits
Hi, I am programmatically running a batch of queries successfully, but after several hours I start to get connection errors. I have read the information about the processing-time limit ("60 seconds each 60 seconds") and about the number of error queries per minute ("30 error queries per minute"), and my code follows these limits. Is there any other limit (per day?) defined by the Wikidata Query Service that should be taken into account? Best, Criscod (talk) 12:01, 21 August 2023 (UTC)
Examples with prefixed names
It appears that the LDF service accepts prefixed names. As these are easier to enter, why not use them in the examples? Peter F. Patel-Schneider (talk) 14:14, 11 October 2024 (UTC)
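For example (a sketch, assuming the standard WDQS LDF endpoint at https://query.wikidata.org/bigdata/ldf), a fragment request could be written as
https://query.wikidata.org/bigdata/ldf?predicate=wdt:P31&object=wd:Q146
instead of passing the full, URL-encoded URIs http://www.wikidata.org/prop/direct/P31 and http://www.wikidata.org/entity/Q146.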
Can SPARQL return the hash of an item-valued qualifier?
I see that pqv: links statements to qualifier value nodes, but it doesn't return anything for item-valued qualifiers [e.g.]. I am using qualifier hashes in wikibase-cli commands, and I'm having to fetch the JSON for each statement from the database and extract the qualifier hash from there. In thousands-long batches, those database calls take a lot of extra time; it would help a lot if I could just get the hashes in my SPARQL query. Thanks! Swpb (talk) 19:08, 21 November 2024 (UTC)