Extension talk:External Data/Archive 2013

Can't use array of data as array of strings

I use #get_db_data for fetching an array of strings T12U from a database. It works just fine, since I'm able to show the data using {{#for_external_table:{{{T12U}}}, }}, e.g.: M,M,D,D,H,Y,M,Y,. I prepared a template Time unit for converting these data to a more readable form:

{{#switch:{{{1|}}}|NS=ns|US=μs|MS=ms|S=s|M=min|H=h|D=day|Y=year}}

However, the following code {{#for_external_table:{{Time unit|{{{T12U}}} }} }} shows nothing. Though the template itself works fine, e.g. {{Time unit|M}} produces min. What's wrong? --176.104.0.186 07:25, 16 January 2013 (UTC)

The issue is just that #for_external_table can't handle templates very well; it's something I need to look into. Yaron Koren (talk) 16:23, 16 January 2013 (UTC)
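
(A possible workaround, sketched but untested here: #display_external_table hands each row to a template, so a small wrapper template, named "Time row" below purely for illustration, could call Time unit for you:)

{{#display_external_table:template=Time row
|data=unit=T12U}}

where Template:Time row would contain something like {{Time unit|{{{unit|}}} }}.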

Stricter Parameter in New Version causes much work

I didn't find any hints in the release notes (maybe I didn't look properly?).

Empty MySQL access password doesn't work anymore

In older versions it was possible to access MySQL with an empty user password (which was OK for a completely internal wiki in an environment where all participants have access at script level). Now I had to add a password in MySQL and LocalSettings.php.

Changes in get_db_data syntax

The following once worked:

{{#get_db_data:
|server = adr
|from = PersonAdresse
|where = VN_NN_Name LIKE 'G%'
|data=
 VN_NN=VN_NN_Name,
 NN_VN=NN_VN_Name,
 COMP=Company,
 PH=Phone
}}

Now this has to be changed to:

{{#get_db_data:
|server=adr
|from=PersonAdresse
|where=VN_NN_Name LIKE 'G%'
|data=VNNN=VN_NN_Name,NNVN=NN_VN_Name,COMP=Company,PH=Phone
}}

Otherwise you'd get
"Unknown column 'NNVN' in 'field list'"
with
"SELECT VN_NN_Name,NN_VN=NN_VN_Name,Company,Phone FROM `PersonAdresse` WHERE `VN_NN_Name` LIKE 'G%'

Means, "VNNN=" vanished and "NNVN=" is misinterpreted, therefore:

  • no space in combination with "=" and ","
  • no internal names with underscore "_"
  • I'll have to change a lot :-(

The second topic seems to reopen "Issue with parameters that contain '_'" --Hk sup (talk) 09:46, 23 January 2013 (UTC)

Hi - I can't think of any change to the External Data code that would have caused either issue. Perhaps the real cause in both cases was an upgrade of MediaWiki itself? Yaron Koren (talk) 13:25, 23 January 2013 (UTC)
Simply true, a new MediaWiki needed a new External Data extension. But who should be addressed about this problem? I updated to the newest version of MediaWiki and the Semantic MediaWiki bundle and expected a less massive change in behaviour. Was my old way of using no MySQL password, and my old get_db_data syntax, that far from the intended standard?
--Hk sup (talk) 11:00, 24 January 2013 (UTC)
Well, the values of "data=" are meant to be on a single line, yes - although support for putting them on separate lines could probably easily be added to the code. As for underscores in the values, "A_B" and "A B" are two different things (unless it's part of a wiki page name), so I'm not surprised that it doesn't work to use the wrong one. And connecting to MySQL with an empty password - unfortunately I know nothing about that; the External Data code just calls MediaWiki's own database-connection code. You might ask on the mediawiki-l mailing list, or #mediawiki IRC channel. Yaron Koren (talk) 13:59, 24 January 2013 (UTC)
Thanks for the clarification --Hk sup (talk) 14:41, 24 January 2013 (UTC)
Hello, it wasn't clear to me from this discussion whether or not underscores can be used in internal names. However, what I found is that the following works for all variables:

{{#get_db_data:
db=market_data1
|from=sub_core_data c, sub s
|where=s.sub_id=c.sub_id 
|data=par=c.par_12,numberactive=c.number_active,grossportfolio=c.gross_portfolio
}}

Whereas the following only works with the first data variable, par_12 (the difference being that the internal variables now use underscores):


{{#get_db_data:
db=market_data1
|from=sub_core_data c, sub s
|where=s.sub_id=c.sub_id 
|data=par_12=c.par_12,number_active=c.number_active,gross_portfolio=c.gross_portfolio
}}

Underscores and dashes in variable names break queries

I'm not sure in what version things broke; however, I just upgraded to 1.5 (from 1.3.3, I believe) and all my previous DB queries stopped working. This is because I prefix and postfix my variable names so that I don't get collisions.

In 1.5 if in your data= section you have

data=username=user_name,realname=user_real_name,email=user_email

everything is peachy.

data=username_postfix=user_name,realname=user_real_name,email=user_email

is also cool with an underscore on the first variable

data=username-postfix=user_name,realname=user_real_name,email=user_email

or a dash on the first variable

data=username_postfix=user_name,realname_postfix=user_real_name,email=user_email
data=username-postfix=user_name,realname-postfix=user_real_name,email=user_email

Doing it on the second or third variable, however, breaks the query. You'll get an error for "EDUtils:SearchDB" DB error "1056: Unknown column 'realname' in 'field list' (192.168.1.1)".

What is also interesting is that in the output of the SQL query, the first field 'user_name' will not have a variable assigned to it; only the second and later ones do.

No changes were made to the wiki itself; all these queries and templates used to work. It looks like the error is in the parsing code for variables, which must have changed.

--Finlay Beaton (talk) 19:26, 1 February 2013 (UTC)

I downgraded External Data to 1.4.3 and the problem still occurs.
I then downgraded to 1.2.2 (the only tar I had lying around from pre-Git days) and the problem does not occur; everything works as expected. I'll dig around and see if I have 1.3.3 lying around and use that, but until it's fixed I can't run the >1.4 series.
--Finlay Beaton (talk) 19:35, 1 February 2013 (UTC)
Yes, I'm aware of that problem - it was just discovered on the SMW mailing list two days ago, by coincidence. It's due to a change that was made in version 1.4.2. I hope to have it fixed soon. Yaron Koren (talk) 19:46, 1 February 2013 (UTC)
Hi Yaron. We're using version 1.6.1 and still having problems with multiple parameters in the data field. Has this issue been fixed yet?
~Aykut Simsek (talk) 16:21, 5 June 2013 (UTC)

Why does get_db_data not support GROUP BY?

Why does get_db_data not support GROUP BY? Are there any plans to introduce this option? --178.214.207.74 13:31, 8 February 2013 (UTC)

No strong reason - it just didn't seem that helpful, I guess, to get a breakdown of values without actually getting the values themselves. Yaron Koren (talk) 03:33, 13 February 2013 (UTC)
I had the same issue but was able to get around it by creating a view with the desired query and calling it instead. --144.58.97.83 18:08, 15 October 2013 (UTC)
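
(For anyone trying that workaround, a minimal sketch; the view, table, column and database-ID names below are only placeholders:)

CREATE VIEW sales_by_region AS
SELECT region, SUM(amount) AS total
FROM sales
GROUP BY region;

{{#get_db_data:
db=my-db
|from=sales_by_region
|data=region=region,total=total
}}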

Max column characters

I am having an unknown issue with only one column where the name is 32 characters long. Could the length of the column name be the issue? --Dgennaro 17:34, 8 March 2013 (UTC)

Hi - it's possible, although I don't think there's anything in the External Data code that restricts the column name length... Yaron Koren (talk) 20:31, 8 March 2013 (UTC)

Error: Incomplete information for this database ID

I installed the extension without any problem.
I created a new database on the same SQL server that MediaWiki uses.
I placed the information below in LocalSettings.php:
$edgDBServer['employee-db'] = "localhost";
$edgDBServerType['employee-db'] = "mysql";
$edgDBName['employee-db'] = "test";
$edgDBUser['employee-db'] = "testuser";
$edgDBPass['employee-db'] = "testpassword";

Then I placed the following code in one of the wiki pages to test (with the {{ }} markup before and after):
get_db_data:
db=employee-db
|from=agenter
|data=firstname=firstname

firstname is the name of one of the columns in the table I'm trying to get data from.

The message I receive is the following: "Error: Incomplete information for this database ID."
I am not sure what the problem might be!
Any ideas?


Never mind.
It was me being dumb.
I placed the database settings above the line that adds the extension, which caused the problem.
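
(For later readers, a sketch of the ordering that works, reusing the example values from above; the database settings have to come after the line that loads the extension:)

require_once( "$IP/extensions/ExternalData/ExternalData.php" );
$edgDBServer['employee-db'] = "localhost";
$edgDBServerType['employee-db'] = "mysql";
$edgDBName['employee-db'] = "test";
$edgDBUser['employee-db'] = "testuser";
$edgDBPass['employee-db'] = "testpassword";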


Table not updated directly

I use the External Data extension to display a table of users, which works just fine.
But if I add a new user or delete a user, I need to press edit and then save to update the table.
How can I get around that problem?

You don't need to save the page - you just need to refresh/purge it. You can do that by adding "action=purge" to the URL; or, conversely, you can just wait about 24 hours, for the page cache to expire. Yaron Koren (talk) 21:07, 13 March 2013 (UTC)

I saw some information regarding the cache. Some managed to get a workaround by installing the MagicNoCache extension, but that didn't solve it for me. I made a change yesterday at 4:00 PM and today at 08:00 AM it still hadn't picked up the change. Is there a possibility to minimize the cache time for that extension only? As a workaround I changed $wgParserCacheExpireTime in DefaultSettings.php (found in includes) to 60 seconds. It would be best to have the change apply only to the External Data extension, if that were possible.

Hi - for MagicNoCache to work, you need to add the "__NOCACHE__" tag to that page - did you try that? That's actually the best way to ensure that External Data calls are always up-to-date. Yaron Koren (talk) 14:10, 14 March 2013 (UTC)
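
(That is, something like the following at the top of the page that makes the call; the database ID, table and column names here are just placeholders:)

__NOCACHE__
{{#get_db_data:
db=employee-db
|from=users
|data=name=name,email=email
}}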

I missed that part, thanks a lot!


Connecting to an SQL server instance

We have an SQL Server instance running on our local network that can normally be accessed (for example, using ADO connections) by entering the server name in the format hostname/instancename. Using #get_db_data, though, this doesn't work. I've tried setting $edgDBServer['databaseID'] to all non-silly combinations of hostname, host IP address and instance name, but haven't had any success - either there's a timeout, or a "No such host is known" error. Is it possible to connect to a specific SQL Server instance, and if so, what is the syntax? Monomix (talk) 17:16, 15 March 2013 (UTC)

Did you follow the special instructions for connecting to SQLServer? Yaron Koren (talk) 22:14, 15 March 2013 (UTC)
The problem turned out to be a version issue with the Microsoft drivers for PHP for SQL Server. After updating and going through the special instructions, everything's working nicely. Thanks! Monomix (talk) 12:58, 18 March 2013 (UTC)

Retrieving data from Restful webservice format

Hello,

I would like to use data from the ParisOpenData website, some of which is in RESTful web service format (example).

Does anyone know how to handle this format?

Cheers

Nicolas NALLET (talk) 09:04, 4 April 2013 (UTC)

"RESTful" isn't a format, it's an approach; and it's actually the only kind of API/web service that External Data can deal with. What actual format is the data in, do you know? Yaron Koren (talk) 12:26, 4 April 2013 (UTC)

Some are in "esri shapefile" format exemple.

Thanks Nicolas NALLET (talk) 17:44, 4 April 2013 (UTC)

Ah - External Data can't handle the Shapefile format, although in theory someone could add support for it to the code. Yaron Koren (talk) 17:57, 4 April 2013 (UTC)

Created new OdbcDatabase extension to help with SQL Server data on Linux

As a result of my own previous request in the 2012 archives of this discussion, I created a way to access MS SQL Server from Linux using ODBC. I had trouble using other methods, including MSSQLBackCompat, so I created a new extension to solve this problem: OdbcDatabase. I have put it to good use already in our wiki. Hopefully it will help someone else.

Roger Cass

Cool, that's great! Could this extension potentially be used to access other database systems that MediaWiki doesn't support, like Sybase? Yaron Koren (talk) 16:51, 10 April 2013 (UTC)

Numbered rows?

Is there a way to get automatic numbering on the rows? I have found some JavaScript functions that do this in static tables, but they don't work on a table that is being created using the External Data extension. I would like to have the possibility to show numbers for all the rows. MWMP (talk) 12:24, 22 April 2013 (UTC)

Are you talking about the #for_external_table function? Yaron Koren (talk) 14:10, 22 April 2013 (UTC)
I´m sorry. Yes I mean for the #for_external_table function! MWMP (talk) 11:06, 24 April 2013 (UTC)
The NumerAlpha extension may allow you to do that. Yaron Koren (talk) 13:13, 24 April 2013 (UTC)
Hi again, thanks for the quick reply! I tried NumerAlpha but it doesn't manage to count all the rows.

It looks like this (with the "a." in the middle only):

Nr.  Name         Skillset
     John Doe     2 DK, 1 SE 
     Karl Doe     4 DK, 2 SE
a.   Moe Doe      8 SE, 3 DK
     Joe Doe      5 FI, 7 NO
     Soe Doe      2 NO, 6 SE

I don't know, maybe my #for_external_table call is wrong? It looks like this:

{{#get_db_data:
db=employee-db
|from=agents
|order by=country ASC
|data=firstname=firstname,lastname=lastname,country=country,skillset=skillset
}}{| class="wikiTable"
! Nr.
! Name
! Skillset
|-
|<ia />
|{{#for_external_table:{{{firstname}}} {{{lastname}}} 
}} |{{#for_external_table:{{{skillset}}}
}} |}
Well, I would think you should have only one #for_external_table call, and the <ia /> should be within it. See the main documentation for how to generate a table within a #for_external_table call. Yaron Koren (talk) 20:18, 24 April 2013 (UTC)
I have tried to follow your examples, but it wouldn't work with a table.
I am able to print everything as long as I don't try to add a table.
The above example was the only way I managed to print it within a table.
Even if I just copy from your example it looks weird.
I'm out of ideas how to write it in order to get it to work. MWMP (talk) 08:46, 25 April 2013 (UTC)
I found an old question and created the template page (I didn't understand that from the examples). Now it renders a table correctly.
{{#get_db_data:
db=employee-db
|from=agents
|order by=country ASC
|data=id=id,firstname=firstname,lastname=lastname,country=country,skillset=skillset
}}
{| class="numberedTable"
! Nr.
! Name
! Skillset {{#for_external_table:
|-
| <ia />
| {{{firstname}}} {{{lastname}}}
| {{{skillset}}} }}
|}
Although when trying to use NumerAlpha within one #for_external_table call, it just labels every line "a.":
Nr.  Name         Skillset
a.   John Doe     2 DK, 1 SE 
a.   Karl Doe     4 DK, 2 SE
a.   Moe Doe      8 SE, 3 DK
a.   Joe Doe      5 FI, 7 NO
a.   Soe Doe      2 NO, 6 SE

MWMP (talk) 09:41, 25 April 2013 (UTC)

That's unfortunate. I can't think of a good solution, then. Perhaps you could add an "ID" field to the database table that holds the row number, and then query on that? Yaron Koren (talk) 18:13, 25 April 2013 (UTC)
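
(A sketch of that approach, reusing the id column already retrieved in the query above and assuming it holds consecutive row numbers:)

{| class="numberedTable"
! Nr.
! Name
! Skillset {{#for_external_table:<nowiki/>
{{!}}-
{{!}} {{{id}}}
{{!}} {{{firstname}}} {{{lastname}}}
{{!}} {{{skillset}}} }}
|}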

LDAP Query not displaying certain attributes

I've got the LDAP query working (for the most part) to Active Directory. However, a few attributes refuse to either be read from LDAP or displayed on the page (not sure where the problem is). Specifically the streetAddress and sAMAccountName attributes. I've confirmed there is data in these attributes (obvious for sAMAccountName). Other data such as name, title, etc works perfectly.

{{#get_ldap_data:
domain=domainname
|filter=(sAMAccountName=myusername)
|data=sAMAccountName=sAMAccountName,streetAddress=streetAddress
}}

{{#external_value:sAMAccountName}}
{{#external_value:streetAddress}}

Any suggestions?

Unfortunately, I know very little about the LDAP code - that part was written by David Macdonald, who I don't think is involved in MediaWiki development any more. My only guess is that it might be an uppercasing/lowercasing issue. Yaron Koren (talk) 13:12, 1 May 2013 (UTC)
Indeed, it was a casing issue. The attributes in AD are sAMAccountName and streetAddress; however, the query only works with samaccountname and streetaddress. Thanks. --Shizzire (talk) 19:18, 1 May 2013 (UTC)
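
(For reference, a sketch of the call with the attribute names lowercased in the data mapping, per that finding; the local variable names on the left-hand side are arbitrary:)

{{#get_ldap_data:
domain=domainname
|filter=(sAMAccountName=myusername)
|data=account=samaccountname,address=streetaddress
}}

{{#external_value:account}}
{{#external_value:address}}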

#arraymap inside of #for_external_table

I'm getting data from the database where Parameter2 is in the format:

data1, data2, data3, data4

And I'd like to use #for_external_table to populate a wikitable with results and have the individual data items wikilinked to their respective pages. I've tried the following code to interpret the Parameter2 data as an #arraymap:

{{#get_db_data:
db=DataBase1
|from=AssetTable
|order by=Parameter1
|data=Parameter1=Parameter1,Parameter2=Parameter2,Parameter3=Parameter3,Parameter4=Parameter4
}}
{| class = "wikitable"
! align="left"|Parameter 1
! Parameter 2
! Parameter 3
! Parameter 4
|-
{{#for_external_table:<br/>
{{!}}-
{{!}} {{{Parameter1}}}
{{!}} {{#arraymap:{{{Parameter2}}}|,|x|[[x]]|<br/>}}
{{!}} {{{Parameter3}}}
{{!}} {{{Parameter4}}}
}}
|}

I've tried different combinations of the pipe and Template:!, and this one seems to work the best (but not completely). I'm expecting the result to be like:

[[data1]]
[[data2]]
[[data3]]
[[data4]]

But the result is:

[[data1, data2, data3, data4]]

I'll also note that this works relatively well using the #external_value with a single database result:

{{#arraymap:{{#external_value:parameter2}}|,|x|[[x]]|<br/>}}

But using this inside of #for_external_table has this result:

[[data1]]
[[data1]]
[[data1]]
[[data1]]
I don't think that can be made to work, unfortunately. You could try calling #store_external_table to do the data storage, using the "#list" syntax - it's not mentioned in this documentation, but you can see it described in the #set_internal documentation. Yaron Koren (talk) 13:35, 7 May 2013 (UTC)

Bug on page with category

I have this strange problem when I add a category to any page that uses this extension:
"WikiPage::updateCategoryCounts". Database returned error: "25P02: ERROR: current transaction is aborted, commands ignored until end of transaction block ".

I use PostgreSQL, so this could be related. If the page does not use the extension, I have no problem with categories.

I assume this only happens on pages that call #get_db_data? Yaron Koren (talk) 16:47, 31 May 2013 (UTC)
Yes, only when I call #get_db_data
One more question: is the database you're querying the MediaWiki database, or is it a different one? Yaron Koren (talk) 15:40, 2 June 2013 (UTC)
Sorry, I gave too little information. PostgreSQL is version 9.2. It is just a single Postgres installation, but the wiki DB and the database I get data from are different ones. The user used has the superuser role. Sadly it is an internal wiki, not accessible from outside, so I cannot show it :-( The MediaWiki version is 1.21.1 and the extension version is 1.6.1. I had a similar problem on another wiki with an older version, but there I managed to work around it: first create the page with the #get_db_data call, save, and then edit and add the category. But in this case that does not work and I get the error (I can show you this old wiki if interested).
I can guess what's causing the issue - the same Postgres-accessing code is called for both #get_db_data and the category save, and there's some sort of conflict there - but I don't know if there's a solution. One possibility you can try is using the OdbcDatabase extension, which will use different code for #get_db_data. Another is just not to have additional categories... I know it sounds extreme, but if you use Semantic MediaWiki, they mostly become unnecessary. Yaron Koren (talk) 21:58, 3 June 2013 (UTC)
I investigated again. After obtaining the PostgreSQL trace, I see an error where MediaWiki seems to add the category with the same name twice, and I get: CEST ERROR: duplicate key value violates unique constraint "category_title". Anyway, I will dig deeper :-)

Does #get_db_data support "union"?

{{#get_db_data:
  db=mongo-prod                                        
  |from=a,b,c,d,e,f,g,h,i,j,k,l
  |find query={"hello": "world" }
  |limit=3
  |data=id=id,data=data
}} 

I want to query a lot of collections and union the results together. Does this extension support union, and how can I write the statement? Thanks.

Hi. I'm the guy who expanded on Yaron's great MongoDB integration work to allow native mongodb.find support. In a nutshell, ExternalData has a very thin wrapper around MongoDB. The find query is simply passed through (be careful about adding whitespaces around your curly braces so as not to confuse the MW parser!) as a JSON fragment, so whatever you can do in mongodb find, you can do in ED! So this is more of a MongoDB question. Have you tried looking at the MongoDB aggregation framework? Or perhaps, if you're using ED with SMW, taking advantage of #store_external_table and just using SMW ask queries to do disjunctions? --Jnatividad (talk) 21:35, 13 June 2013 (UTC)

Error message when using #for_external_table

Hi all,

I just started using External Data and created a simple Mysql table (testdata) with two columns (id and description). When I try the following statement I get correct results:

{{#get_db_data:db=ID |from=testdata |data=nummer=id,beschreibung=description}}

{| class="wikitable"
|-
! ID
! Beschreibung {{#for_external_table:<nowiki/>
{{!}}-
{{!}} {{{nummer}}}
{{!}} {{{beschreibung}}} }}
|}

I get my table and all looks fine. When I try to edit the page again it looks like this:

{{#get_db_data:db=ID |from=testdata |data=nummer=id,beschreibung=description}}

{| class="wikitable"
|-
! ID
! Beschreibung {{#for_external_table:This page contains the following errors:error on line 1 at column 133: expected '>'
Below is a rendering of the page up to the first error.}}
|}

Can someone give me a hint on this?

Thanks in advance!

Sorry, I don't understand - the error message literally shows up in the page's wikitext? That seems impossible... Yaron Koren (talk) 20:50, 8 August 2013 (UTC)
Thanks already for your time to reply. I created three screenshots to make my problem more visible. Best regards, Stefan Molls 13:50, 9 August 2013 (CST).
- Initial Edit
- Result
- Click on Edit again
Okay - well, it still seems impossible. :) If you look in the page history, who made the edit that changed that text? Yaron Koren (talk) 20:58, 9 August 2013 (UTC)
I love to create impossible stuff ;-). The history just shows the change I made. Maybe this error is connected to another extension that has problems displaying the edit text (the WYSIWYG editor). I will be off for the next 3 weeks. Stefan Molls 11:42, 14 August 2013 (CST).

display_external_table not showing anything

I'm trying to use {{#display_external_table: but the template is never shown. I'm using MediaWiki 1.19, PHP 5.4.11 and an Oracle DB.

Here's the call I'm trying to make:

{{#get_db_data:
db=mydb
|from=mytable at
|where=at.table_name='something'
|order by=at.app_db_cd
|data=lvAppDbCd=DISTINCT at.app_db_cd}}

{{#display_external_table:template=TableTab|data=table=something,lang=F,source=lvAppDbCd}} 

My template TableTab accepts 3 parameters: table, lang and source.

Is it because I'm using Oracle? The {{#for_external_table: works very well with Oracle... Thanks.

Are you saying that #for_external_table works with that exact #get_db_data call? Yaron Koren (talk) 23:36, 14 August 2013 (UTC)
Yes, #for_external_table works perfectly. I'm using it quite often with Oracle. Here's what I came up with instead of using #display_external_table to call my template:
{{#get_db_data:
db=mydb
|from=mytable at
|where=at.table_name='something'
|order by=at.app_db_cd
|data=lvAppDbCd=DISTINCT at.app_db_cd}}
{{#vardefine:varList|{{#for_external_table:{{{lvAppDbCd}}},}}}} <!--varList will contain the list of sources separated by commas -->
{{#vardefine:varList|{{#sub:{{#var:varList}}|0|-1}}}} <!--removing the last comma in varList-->
<!-- Going through varList and calling the template:TableTab for each source -->
{{#while:
| {{#ifexpr: {{#expr:{{#pos:{{#var:varList}}|,}}+1}} > 1 | true}}
| {{#vardefine:varSource|{{#sub:{{#var:varList}}|0|{{#expr:{{#pos:{{#var:varList}}|,}} }} }} }} 
{{TableTab|table=something|lang=F|source={{#var:varSource}}}}
{{#vardefine:varList|{{#sub:{{#var:varList}}|{{#expr:{{#pos:{{#var:varList}}|,}} + 1}} }} }}
}}
<!-- Calling the template:TableTab for the last source found in varList (or the only source) -->
{{TableTab|table=something|lang=F|source={{#var:varList}}}}
I have the same problem. #for_external_table works perfectly, but when I try to use #display_external_table for the exact same #get_web_data call I get nothing. What am I supposed to put inside the template? Can you give us an example of what the Country info row template should look like in this case? --Ioannis Protonotarios 07:39, 27 August 2013 (UTC)

Umlaut Issue

Undefined offset: 1 in /var/www/source/php/mediawiki/extensions/ExternalData/ED_ParserFunctions.php on line 278

appears when the XML data being collected contains umlauts, e.g.

...
<Strasse>Münchnerstrasse 122c</Strasse>
...

will not work but

...
<Strasse>Muenchnerstrasse 122c</Strasse>
...

will. The encoding is set to UTF-8 and seems to be correct. Debugging the situation, indeed there is:

doForExternalTables: variable 'street' has 2 values: M,ünchnerstrasse 122c

This happens already in the getContent function of ED_Utils:

getContent: tag='strasse' value='M'
getContent: tag='strasse' value='ünchnerstrasse 122c'

adding

 xml_parser_set_option($xml_parser,XML_OPTION_TARGET_ENCODING,'ISO-8859-1'); 

will lead to

getContent: tag='strasse' value='M'
getContent: tag='strasse' value='ünchnerstrasse 12'

Googling for the issue, there is a whole bunch of articles on this.

The manual article has a lot of workarounds. Most of them seem to be better than what was introduced in version 1.6.2 for $ampersandReplacement; e.g., this would fix the issue:

diff ED_Utils.php 
11c11
<   private static $inside_data = false;
---
> 
26d25
< 		self::$inside_data = false; 
32d30
< 		self::$inside_data = false; 
44,53c42,45
< 		if (self::$inside_data) {
< 			$l_index=count($edgXMLValues[$edgCurrentXMLTag])-1;
< 		  $edgXMLValues[$edgCurrentXMLTag][$l_index] .= $content;
< 		} else {
< 		  if ( array_key_exists( $edgCurrentXMLTag, $edgXMLValues ) )
< 			  $edgXMLValues[$edgCurrentXMLTag][] = $content;
< 		  else
< 			  $edgXMLValues[$edgCurrentXMLTag] = array( $content );
< 		}
< 		self::$inside_data = true; 
---
> 		if ( array_key_exists( $edgCurrentXMLTag, $edgXMLValues ) )
> 			$edgXMLValues[$edgCurrentXMLTag][] = $content;
> 		else
> 			$edgXMLValues[$edgCurrentXMLTag] = array( $content );

and in ED_ParserFunctions.php add an isset ...

if ( array_key_exists( $var, $edgValues ) && count( $edgValues[$var] > $i ) && isset($edgValues[$var][$i]) )
That's some great detective work - this patch you came up with is potentially very useful. Is there a public API of some sort, that has an umlaut in its data, that I can access to see this problem directly? Yaron Koren (talk) 20:21, 19 August 2013 (UTC)
Since I ran into the same problem, I have made some tests and provided a test page with data with umlauts in different encodings: https://publicwiki-01.fraunhofer.de/Testwiki/index.php/Extension-ExternalData. Links to test data are included. ED 1.6.2. The tests show:
  • UTF-8 and ANSI encodings work to some extent in V 1.6.2
  • the wiki page is loaded despite some defects in the display of the text (this was not the case in version 1.6.1)
  • umlauts at the beginning of a word are not shown
I hope this data helps a bit. Kappa (talk) 16:06, 6 December 2013 (UTC)
Thanks, that helps a lot! It helped me find the issues in the External Data code. I just checked in a change that I believe fixes most of the issues relating to umlauts and/or encodings. The only thing I couldn't get working was the lone "€"s in Unicode - that's a tough nut to crack. Yaron Koren (talk) 14:39, 9 December 2013 (UTC)
I tried the patch, but the queried string is still truncated at the first point where an acute accent appears. It's good to see, though, that you've been able to make some progress! Cavila (MW 1.19.7, MySQL 5.1.72-2, Php 5.3.3-7, SMW 1.8.0.5, SF 2.5.2) 11:31, 11 December 2013 (UTC)
Can you point to the data set you're using, or copy it somewhere? Yaron Koren (talk) 12:01, 11 December 2013 (UTC)
It's the same dataset that I mentioned below, although it does not present any trouble here: http://smw.referata.com/wiki/User:Cavila/ED_tests (example 1; don't know about the xpath equivalent). As you suggested yourself, the issue may be related to specific conditions in my Php/cURL configuration. Unfortunately, I don't have direct access to the machine (shared host), so it's difficult to diagnose this. Cavila (MW 1.19.7, MySQL 5.1.72-2, Php 5.3.3-7, SMW 1.8.0.5, SF 2.5.2) 14:13, 11 December 2013 (UTC)

Truncated variables and other stuff (ED 1.6.2)

Hi Yaron. I'm slowly beginning to get to grips with the XML part of this extension. Apart from the odd "XML error: Undeclared entity" warning and the like (which supposedly may just be a sign of malformed XML), I notice that values tend to get clipped a lot. For instance, "Bibliothèque de l'université de Virginie" in [1] simply becomes "Biblioth". Also, is there a way to retrieve the complete French poem from the same link using this extension? Cavila (MW 1.19.7, MySQL 5.1.66, Php 5.3.3-7, SMW 1.8, SF 1.5.2) 18:57, 20 August 2013 (UTC)

It sounds like you're having similar problems with accented characters (most likely due to file encoding) as the person above. I can't reproduce those errors, though, hitting that URL. What is the exact #get_web_data call you're making? Yaron Koren (talk) 20:32, 21 August 2013 (UTC)

The call used for the result above ("Biblioth") is written like this:

{{#get_web_data: url=http://www.bvh.univ-tours.fr:8080/xtf/data/tei/XUVA_Gordon1578_L47/XUVA_Gordon1578_L47_tei.xml
|format=XML
|data=repository=repository
}}
{{#external_value:repository}}

Nothing fishy so far, right? Cavila (MW 1.19.7, MySQL 5.1.66, Php 5.3.3-7, SMW 1.8, SF 1.5.2} 06:57, 22 August 2013 (UTC)

Okay, that's strange - that exact same call works for me. I assume the issue, then, is some sort of PHP/cURL configuration, related to encoding. I don't know anything more than that, though. There's no way I could access your system, is there? Yaron Koren (talk) 14:12, 22 August 2013 (UTC)
Thanks for replying. I've just sent you a link so that you can see the results for yourself. Check your inbox. Cavila (MW 1.19.7, MySQL 5.1.66, Php 5.3.3-7, SMW 1.8, SF 1.5.2} 10:32, 25 August 2013 (UTC)

THIS IS A LONG STRING USED AS A REPLACEMENT FOR AMPERSANDS 55555555

I made a simple page (see snapshot) to display the latest 15 new article titles on my wiki:

{{#get_web_data:
url={{SERVER}}{{SCRIPTPATH}}/api.php?action=query&list=recentchanges&rctype=new&rcnamespace=0&rclimit=15&format=xml
|format=XML
|data=QQtype=type,QQns=ns,QQtitle=title,QQrcid=rcid,QQdpageid=pageid,QQrevid=revid,QQold_revid=old_revid,QQtimestamp=timestamp
}}
{| class="wikitable"
! Title
! Date {{#for_external_table:<nowiki/>
{{!}}-
{{!}} [[{{{QQtitle}}}]]
{{!}} {{{QQtimestamp}}} }}
|}

But in one page's title there is an ampersand (&), which is rendered as "THIS IS A LONG STRING USED AS A REPLACEMENT FOR AMPERSANDS 55555555"! I may be able to work around it with string functions, but what's the point of this message anyway? Was it a temporary programming comment that was forgotten in the code, maybe?

  P.S. for all users: This example works as is. Use it if you want to have a blog-like latest articles page. Just don't forget to create the pipe template {{!}} with just the pipe character | inside. Change rclimit to the number of titles you want to be displayed.

--Ioannis Protonotarios 18:24, 26 August 2013 (UTC)

That's certainly not something that users are supposed to see! It's not a comment - it's a string that ampersands are supposed to be replaced with before being replaced back. Clearly the second replacement isn't happening; I'll look into it. Yaron Koren (talk) 19:30, 26 August 2013 (UTC)
I just made a change in the Git code that I believe finally fixes the problem; this will go into the next version as well. Yaron Koren (talk) 17:45, 1 November 2013 (UTC)

Retrieving data from a non-public wiki using #get_web_data

I am trying to retrieve the 'author' of a page in a wiki that requires login.

I am using #get_web_data with the full URL of the page, like this (I am making the query from the page itself):

{{#get_web_data:url=http://exmple.com/api.php?action=query&titles=External Data/Archive 2013&prop=revisions&format=xml&rvdir=newer&rvlimit=1
 |format=xml
 |use xpath
 |data=author=/api//rev/@user
}}
The author of this page is: {{#external_value:author}} 

But nothing is returned. Any chance to make this work? Is there any better way? Thanks. --J. Xavier Atero (talk) 18:32, 31 August 2013 (UTC)

If that API URL itself requires a login to display its results, then that unfortunately won't work. There are two solutions I can think of: (1) create a small, publicly-accessible web page that logs in to the wiki and displays the contents of that URL; or (2) it may be possible to get the data using #get_db_data instead. Yaron Koren (talk) 13:09, 1 September 2013 (UTC)
Thanks, (2) worked!
{{#get_db_data: db=my-db
 | from=revision
 | where=rev_page=181406
 | limit=1
 | data=author=rev_user_text
}}

--J. Xavier Atero (talk) 21:23, 12 December 2013 (UTC)

Problem with get_db_data when it returns just one row.

MediaWiki: 1.21.1, PHP: 5.4.14 (cgi-fcgi), MySQL: 5.6.12-log

External Data (Version 1.6)

I have the following:

{{#get_db_data:
db=Qondoc
|from=search_host2
|data=Name=Name,OS=OS,Version="OS Version",CPU="CPU Speed"
}}

{{#external_value:Name}}
{{#external_value:OS}}
{{#external_value:Version}}
{{#external_value:CPU}}

This produces the expected result and lists the first row from the database table, and debug shows the query:

SQL: [SELECT /* EDUtils::searchDB Rb99mrs */ Name,OS,"OS Version","CPU Speed" FROM search_host2 ]

which indeed returns all the entries in the table.

However, when I try this:

{{#get_db_data:
db=Qondoc
|from=search_host2
|where=Name='PAPP-QDOC'
|data=Name=Name,OS=OS,Version="OS Version",CPU="CPU Speed"
}}

{{#external_value:Name}}
{{#external_value:OS}}
{{#external_value:Version}}
{{#external_value:CPU}}

I get no data returned in the wiki, but the SQL query shown in the debug log:

SELECT /* EDUtils::searchDB Rb99mrs */ Name,OS,"OS Version","CPU Speed" FROM search_host2 WHERE Name='PAPP-QDOC'

returns 1 entry as expected.

I am sure I'm missing something blindingly obvious...

Hello. I assume Qondoc connects to MySQL? To make sure your query works, first show all results in a table without the WHERE clause. Don't use capitalized characters for the parameter names; when you are in the testing phase it only makes it easier to make a mistake. Note: make sure to create a template called "!" with only a "|" in it. So: Template:! with: |
{{#get_db_data:
db=Qondoc
|from=search_host2
|data=name=Name,os=OS,version="OS Version",cpu="CPU Speed"
}}
{| class = "wikitable" width = "100%"
! Name
! OS
! Version
! CPU{{#for_external_table:<nowiki/>
{{!}}-
{{!}} {{{name}}}
{{!}} {{{os}}}
{{!}} {{{version}}}
{{!}} {{{cpu}}}
}}
|}
This should list all entries (I hope there are not millions!) in their appropriate columns. If this does not produce the same results as the values in the original table, there is probably something wrong with the naming of the columns. We never use spaces in column names, so there is no need to quote column names in query calls with {{#get_db_data:. We also use {{#external_value: only when the result of a query produces a single row of data. If you get the correct results, try adding the WHERE clause and go from there. Regards --Jongfeli (talk) 11:31, 18 September 2013 (UTC)

Thanks for that; the thick plottens... It's actually an MS SQL DB, but I don't think that is the issue. I do already have the ! template. I thought it was working correctly with the #for_external_table mode, but now I realise that what is happening is that one row of data (the first record returned from the database) is missing. So if there is just one row, I get nothing; if there are two rows of SQL data, then just one row (the second) is displayed... so now I am utterly confused.

Just so I understand you correctly: when you query the table without the WHERE clause and, let's say, the actual table has 30 rows, on the wiki you only get 29 rows? --Jongfeli (talk) 14:16, 18 September 2013 (UTC)

Exactly. n-1 rows...which is especially unlucky when n=1 ;-)

When you can, try to reproduce this by querying the same data on a MySQL database. --Jongfeli (talk) 12:20, 19 September 2013 (UTC)

Well, damn my old boots... MySQL works correctly. So it must be something in the MS SQL portion of the code, which, as I (probably mis)understand it, is not part of External Data?

No, it is not part of External Data (Yaron, correct me if I am wrong); PHP takes care of the database connection. If you have not given it a try already, you could run PHP with the Microsoft Drivers 3.0 for PHP for SQL Server. We also run a SQL Server 2012 database and had the same dilemma when we started. We chose to automatically copy the data we need on our wiki from our MS SQL database to the MySQL database on our wiki server. This can be achieved with MySQL Connectors. It works like a charm. --Jongfeli (talk) 11:53, 20 September 2013 (UTC)

Many thanks for your help (PS: I will investigate the connector anyway), but I found the issue. I downloaded a fresh MediaWiki (this one is very old, just upgraded from 1.12...), did some comparisons, and DatabaseMssql.php was not the same. I can only think I screwed something up in the (multiple) upgrade processes. Once it was replaced, it now works like a charm. Once again, many thanks for your time and assistance.

[SOLVED] get_web_data Returns Nothing After Migration to Linux

First, I love this extension. It's totally changed my workplace. I'm moving my MW 1.18.1 to a Linux environment, but can't seem to get this extension working there and am having a terrible time troubleshooting it. It shows up in Special:Version OK. cURL is installed and working (at least, I have a short PHP script that seems to use cURL successfully). I've commented out the setting of $edgAllowExternalDataFrom and disabled caching with MagicNoCache as discussed here. It's like nothing happens when the function is called. Ideas? I'm wondering if there's some kind of permissions thing I'm missing. I'm behind a proxy, but should that matter if I'm getting data from the machine itself? Also, I have $wgShowExceptionDetails set to true; no errors display. - Lbillett (talk) 16:59, 18 September 2013 (UTC)

My guess is that it's the proxy that's somehow the issue. You can use cURL, but can you use cURL to retrieve the URL in question? Yaron Koren (talk) 18:19, 18 September 2013 (UTC)
The URL in question is on the IP of the server itself. I don't think it needs to go through the proxy to get to it? cURL seems to work fine when called from standalone PHP scripts, and can get internet web content by setting CURLOPT_PROXY. I guess what's got me so miffed is how it works fine on my two-bit Windows installation. Which makes me wonder if I've missed some kind of permissions setting, or perhaps an important component of the cURL install? - Lbillett (talk) 18:44, 18 September 2013 (UTC)
Sorry to belabor the point, but have you actually tried using cURL to get that URL? Yaron Koren (talk) 18:50, 18 September 2013 (UTC)
It seems I did manage to avoid your pointed question. Yes, I can use cURL (with no proxy settings) to get content at the target URL. I've tried using a static .htm file as the URL to read, temporarily set rwx permissions on my whole file structure, and tried zip and Git snapshots of the extension. Really running out of ideas. PHP 5.3.10-1, now with every cURL package apt can find. I set up a get_db_data example which throws a MySQL connection error. Wondering if that's pointing somewhere. - Lbillett (talk) 02:58, 19 September 2013 (UTC)

I don't know, then - my best advice would be to go into the code and add some printout statements, to try to debug the problem, if you know how to do that - you should especially look into the Http::get() calls within fetchURL() in ED_Utils.php. Yaron Koren (talk) 14:23, 19 September 2013 (UTC)

Ah. Looks like $contents isn't getting set by Http::get() in fetchURL(). Upgraded to a barebones 1.21.2 to see if I messed something up in the core during copying. Same null result. It must be some kind of cURL configuration issue on my server. Thanks for pointing the way. Will leave a signpost if I make it to the other side. - Lbillett (talk) 00:09, 20 September 2013 (UTC)
Turns out the Http class was getting into trouble because the http_proxy environment variable had been set to some boilerplate stuff. However, it seems the class doesn't volunteer errors unless you get them yourself. Un-setting it made everything work. My custom scripts worked because they don't try to set anything based on the environment vars. Thanks for pointing me in the right direction Yaron. I made it. - Lbillett (talk) 14:01, 21 September 2013 (UTC)
Cool. If you know of any way to improve ED to report such errors, please let me know. Yaron Koren (talk) 15:21, 22 September 2013 (UTC)
Well, not being very good with objects (or programming on the whole), I'm not sure how to implement it. But it looks like, near line 760 in includes/HttpFunctions.php, part of the CurlHttpRequest object will set some kind of public fatal status variable to the error message. We're 3 or 4 objects deep at this point... one could maybe tweak ED's fetchURL() to test for it and return it if found. Sounds tricky (to me). - Lbillett (talk) 23:45, 22 September 2013 (UTC)

Only one variable per #get_web_data call works

Hi Yaron and team, I have a problem with JSON data retrieved from an Icinga server. The output looks like this one: http://classic.demo.icinga.org/icinga/cgi-bin/tac.cgi?jsonoutput (user: guest, pass: guest)

With this I get no results:

{{#get_web_data:url=https://icinga-server/icinga/cgi-bin/tac.cgi?jsonoutput
|format=json
|data=total_services=total_services,total_hosts=total_hosts
}}

This one works:

{{#get_web_data:url=https://icinga-server/cgi-bin/icinga/tac.cgi?jsonoutput
|format=json
|data=total_hosts=total_hosts
}}
{{#get_web_data:url=https://icinga-server/cgi-bin/icinga/tac.cgi?jsonoutput
|format=json
|data=total_services=total_services
}}

As you can see, the only difference is that I used only one local-to-external declaration after data=. I used this table to check for values in the variables:

{|class="wikitable"
! Name
! Value
|-
| total_hosts
| {{#external_value:total_hosts}}
|-
| total_services
| {{#external_value:total_services}}
|}

What is wrong here? Thanks, Lars

Hi - sorry about that; the issue is a bug in External Data, where the "data" param doesn't get split up correctly by comma if any of the "local variable" names contain an underscore. That bug has been in place for nearly a year; it's due to a fix for another parsing bug. I just checked in a fix, something I should have done a long time ago. You can either get the new code via Git, or just change the local variable names to not contain underscores. Yaron Koren (talk) 19:46, 13 October 2013 (UTC)
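
(In other words, until the fix is in a release, the failing call above can be written with underscore-free local names, e.g.:)

{{#get_web_data:url=https://icinga-server/icinga/cgi-bin/tac.cgi?jsonoutput
|format=json
|data=totalservices=total_services,totalhosts=total_hosts
}}

{{#external_value:totalservices}}
{{#external_value:totalhosts}}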

No return from #get_web_data with JSON arrays (multiple values)

Hello Yaron, I'm having the same issue with JSON arrays, but am not using underscores. Versions are ED 1.6.2 & MediaWiki 1.21.2. I've tried numerous combinations of the #external_value, #for_external_table and #arraymap methods. I also used templates and was successful with single values but not multiple values. Any help here would be greatly appreciated. Thank you for this great contribution and your time!

Single values work as expected:

state.json example - {"state1":"utah", "state2":"texas"}
{{#get_web_data: url=http://www.someurl.com/state.json
  | format=json
  | data=state1=state1,state2=state2 }}
{{#external_value:state1}}
{{#external_value:state2}}

Multiple values do not display anything besides empty white space, whether within or outside of a table:

states.json example - {"states": ["utah","texas"]}
{{#get_web_data: url=http://www.someurl.com/states.json
  | format=json
  | data=states=states }}
{{#external_value:states}}

This doesn't work either:

states.json example - {"states": ["utah","texas"]}
{{#get_web_data: url=http://www.someurl.com/states.json
  | format=json
  | data=states=states }}
{{#for_external_table:
{{!}}-
{{!}} {{{states}}} }}
I'm aware of this issue, and I hope to look into it at some point soon. Yaron Koren (talk) 09:12, 25 October 2013 (UTC)

“Error: Could not connect to database.”

Hi. I have installed the External Data extension, but when I attempt to get data from the external database I get this message: “Error: Could not connect to database.” I have IIS 7, MediaWiki 1.21, PHP 5.3.25 and MySQL 5.611 running on a 64-bit Windows 7 computer. The external database is a SQL Server 2008 R2 on a Windows 2008 server. I have installed and am running the Microsoft Drivers for PHP for SQL Server. I have a number of web pages on the same web server, outside of the wiki, that use PHP to connect to the same database without a problem. To test the connection I am using the SA account and password. I would be grateful for any help.

I guessed that it was SQLServer just from seeing the subject line... unfortunately, I don't think I can help that much. It looks like you followed the steps correctly to get SQLServer working, so I don't know what's going wrong. Hopefully someone else here can help. You could also try asking on a general MediaWiki forum, since it sounds like the issue is with the core MediaWiki code. Yaron Koren (talk) 15:38, 7 November 2013 (UTC)

Magic word 'get_external_data' not found

The docs state that the current version of ExternalData works with MediaWiki 1.1.17 and above. Do you have a version that might be compatible with MediaWiki 1.1.15 please? When loading the extension at the moment, the Magic word error displays.

I assume you mean MediaWiki 1.15. It looks like ED 1.3.5 was the last version that still supported 1.15 - you can download it here. Yaron Koren (talk) 13:54, 21 November 2013 (UTC)
Yes sorry, I did mean MediaWiki 1.15. Thanks for putting up ED 1.3.5. I downloaded it but got the same error below:

Magic word 'get_external_data' not found

Backtrace:

#0 /apps/wiki/htdocs/w/includes/MagicWord.php(244): Language->getMagic(Object(MagicWord))
#1 /apps/wiki/htdocs/w/includes/MagicWord.php(197): MagicWord->load('get_external_da...')
#2 /apps/wiki/htdocs/w/includes/parser/Parser.php(4034): MagicWord::get('get_external_da...')
#3 /apps/wiki/htdocs/w/extensions/ExternalData/ExternalData.php(49): Parser->setFunctionHook('get_external_da...', Array)
#4 [internal function]: edgRegisterParser(Object(Parser))
#5 /apps/wiki/htdocs/w/includes/Hooks.php(117): call_user_func_array('edgRegisterPars...', Array)
#6 /apps/wiki/htdocs/w/includes/parser/Parser.php(178): wfRunHooks('ParserFirstCall...', Array)
#7 /apps/wiki/htdocs/w/includes/parser/Parser.php(190): Parser->firstCallInit()
#8 /apps/wiki/htdocs/w/includes/parser/Parser.php(312): Parser->clearState()
#9 /apps/wiki/htdocs/w/includes/Article.php(3557): Parser->parse('[[Category:Web ...', Object(Title), Object(ParserOptions), true, true, 59578)
#10 /apps/wiki/htdocs/w/includes/Article.php(979): Article->outputWikiText('[[Category:Web ...')
#11 /apps/wiki/htdocs/w/includes/Wiki.php(450): Article->view()
#12 /apps/wiki/htdocs/w/includes/Wiki.php(63): MediaWiki->performAction(Object(OutputPage), Object(Article), Object(Title), Object(User), Object(WebRequest))
#13 /apps/wiki/htdocs/w/index.php(116): MediaWiki->initialize(Object(Title), Object(Article), Object(OutputPage), Object(User), Object(WebRequest))
#14 {main}

--Equick (talk) 15:25, 21 November 2013 (UTC)

Oh, oops - 1.3.4 was the last ED version that supported 1.15. Try downloading that instead; just change the URL. Yaron Koren (talk) 15:28, 21 November 2013 (UTC)
Cool thanks. ED 1.3.4 works :-) Thanks Yaron! --Equick (talk) 15:37, 21 November 2013 (UTC)

Outcome of source outages

We are using MW 1.17.5 with ED 1.3.1 and SMW 1.6.1. Updating is near impossible right now. I don't understand exactly how caching works in conjunction with SMW, so my question might be stupid.

We do have a remote web server that holds some master data for pages we have in our local wiki. We are preparing to use ED to retrieve said data and store it in properties. We don't plan on caching, as we need the most current master data.

Now what will happen if the remote server has an outage? We would need to have the latest available results displayed. Is that possible? What would we have to do?

I read about some problems updating the fetched information when stored within properties. Is there a way of forcing an update other than SMW_refreshData.php?

-- Kind regards Ciannicay (talk) 16:43, 26 November 2013 (UTC)

You would need to use caching of one kind or another; I don't believe there's any way to have logic of the form, "if the outside data exists, use it; otherwise use a stored version". As for updating external data stored via SMW - besides SMW_refreshData.php there's also the SemanticUpdateOnPurge extension. Yaron Koren (talk) 18:00, 26 November 2013 (UTC)
Sadly, the extension you named seems to have ceased further development. It would work for my outdated version of MW, though. I am staggered by the fact that the described behavior - e.g. better data from yesterday than none at all - is so unusual. I really had hoped this would be a more common feature request. As the data we fetch and show is mission-critical for our maintenance teams, especially if there already is an outage in our services, we would really need such behavior. Do you happen to know any other way to achieve this? -- Ciannicay (talk) 15:54, 27 November 2013 (UTC)
That's true - there is a good amount of logic to having that kind of functionality. I guess in most cases people expect the data source to always be available. I don't know of any way to achieve it; although that kind of functionality could potentially be added in to the code. Yaron Koren (talk) 18:01, 27 November 2013 (UTC)
Here's a crazy idea. I haven't used External Data in a while, so I have no idea if this will work. Also, the method of setting a property based upon the previous value of that property seems like it could be error-prone... but maybe it'll work. You could extend this to check for preexisting values or to loop (with #arraymap) through multiple values if you needed to.
{{#ifeq: {{#get_db_data: ... }} 
  | ...Some error message (or would it just be blank if connection failed?)
  | <!-- 
    If you get an error message, then set the property based on the existing value 
    
    -->{{#set: My Property={{#show: {{FULLPAGENAME}} |?My Property |link=none }} }}
  | <!-- 
    else, set the properties from the external db 

    -->{{#set: My Property={{#external_value:MyPropertyValue}} }}
}}
--Jamesmontalvo3 (talk) 19:59, 27 November 2013 (UTC)
That's a really nice idea! Thanks. I will try that as soon as my web services department fixes the permissions bug we encountered. To avoid the direct "read the property to set it" mechanism I could put the values into some #vardefine and use #var later on. Sounds as if it might work out. As I said, I will try as soon as I can and report here.
@Yaron: Including this in your code should only be done if more people than me would benefit from it. But if you really want to do it, I will check with my web services department whether they can do an upgrade. Thanks! -- Ciannicay (talk) 11:42, 28 November 2013 (UTC)

ExternalData.sql doc needs an update?

In case this is not related to my host/setup:

I just enabled URL caching as described in the docs. After importing the SQL and setting the cache DB and expiration time, it failed asking for 'myproject.wiki_ed_url_cache'. Changing the first line in the SQL and dropping the wrong table fixed everything.

Not sure where this wiki_ prefix comes from. The configuration line states ed_url_cache....

--Hippie2000 (talk) 04:32, 2 December 2013 (UTC)

I assume "wiki_" is your wiki's own database table prefix. So, was there a table called "wiki_ed_url_cache"? If so, what failed? If not, what was the table you dropped? Yaron Koren (talk) 16:38, 3 December 2013 (UTC)
You are right. I noticed that "wiki_" is the DB prefix my hoster configured. So it is maybe worth noting in the docs that the SQL needs to be adapted before importing it if a prefix is used. However, it was no big issue to drop the non-prefixed table and re-import the SQL with the prefix added. --Hippie2000 (talk) 15:16, 4 December 2013 (UTC)
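
(An alternative to dropping and re-importing, assuming MySQL and a "wiki_" prefix, would be to simply rename the mis-created table:)

RENAME TABLE ed_url_cache TO wiki_ed_url_cache;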
Return to "External Data/Archive 2013" page.