Content translation/Deployments/How-to/TPA

This is a how-to document for updating the Template Parameter Alignment (TPA) database in cxserver.

Connect to stat100x


ssh -N stat100X -L 8880:
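The tunnel target after -L 8880: is elided above. As an illustration only (the host name and forward target below are assumptions, not taken from this page), a full invocation typically looks like:

```shell
# Hypothetical example: forward local port 8880 to JupyterHub listening on
# the stat host; replace stat1008.eqiad.wmnet with the host you actually use.
ssh -N stat1008.eqiad.wmnet -L 8880:127.0.0.1:8880
```

-N keeps the connection open for port forwarding only, without starting a remote shell.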

Open http://localhost:8880/ in your browser.

This will open JupyterHub, which requires your LDAP password to log in.

Starting a notebook


Make sure to check the Kerberos authentication timeout first; it currently defaults to 48 hours.


Extend it by running kinit:
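A typical session: check the remaining ticket lifetime with klist, then renew with kinit (which prompts for your Kerberos password):

```shell
# Show current Kerberos tickets and their expiry times
klist
# Obtain a fresh ticket; prompts for your password
kinit
# Verify the new expiry time
klist
```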


Running scripts

  1. Open a terminal and clone:
  2. Update config.json with the language pairs required to generate the template parameter alignments.
  3. Run all the notebooks in order.
  4. 00ExtractNamedTempates.ipynb overwrites its existing output files when re-run, so save the produced JSON files (eg: templates-articles_xx.json and templates-summary_xx.json) in another directory to avoid losing data. For large languages like en, the output can be reused if the process is re-run within a few days, which saves time.
  5. While running 02alignmentsSpark.ipynb, make sure the Wikidata partition is up to date.
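The exact schema of config.json is not documented here; as a sketch only (the field names are assumptions for illustration), a pairs section might look something like:

```json
{
  "pairs": [
    { "source": "en", "target": "vec" },
    { "source": "en", "target": "es" }
  ]
}
```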

Updating the database


Run scripts/ from cxserver, pointing it at all the files generated by the process above.

This will write a new templatemapping.db in the same folder. Use the sqldiff command (available in the sqlite3-tools package on Linux) to see the differences between the old and new databases.
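For example (the paths below are assumptions based on the description in this section):

```shell
# Compare the currently deployed database with the newly generated one
sqldiff config/templatemapping.db templatemapping.db
```

sqldiff prints the SQL statements that would transform the first database into the second, which makes the changed rows easy to review.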

Copy it to config/templatemapping.db and submit a patch for review. The database can be opened with the sqlite3 command to check how many template parameters were updated.

eg: sqlite> select count(*) from templates where source_lang='en' and target_lang='vec';



Known issues

1. 02alignmentsSpark.ipynb needs the fastText_multilingual module to be installed manually in the conda environment; the module is available at:

a. Find the conda environment directory using conda list (the environment path is shown in the output header).

b. Copy the module into the environment manually, eg /home/kartik/.conda/envs/2023-06-08T01.31.46_kartik/lib/python3.10/site-packages/fastText_multilingual

2. requires instead of version provided by pip.

3. Might throw the error IndexError: list index out of range when the language has no {{Cite web}} template available or it is not linked to Wikidata. Try fixing the Wikidata entry; if that is not possible, skip that language.
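That IndexError typically comes from indexing into an empty result list. A minimal sketch of the kind of guard that avoids it, using a hypothetical helper and data shape (not taken from the actual notebooks):

```python
def cite_web_title(sitelinks):
    """Return the local {{Cite web}} template title, or None when the
    language has no sitelink on Wikidata, instead of raising IndexError.

    sitelinks: list of template titles linked to the Cite web item for
    one language; hypothetical data shape, for illustration only.
    """
    if not sitelinks:   # empty list: sitelinks[0] would raise IndexError
        return None     # caller should skip this language
    return sitelinks[0]


# Usage: a language with a sitelink vs. one without
print(cite_web_title(["Cita web"]))  # → Cita web
print(cite_web_title([]))            # → None
```

Returning None lets the surrounding loop skip the language explicitly rather than crashing mid-run.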

Useful resources