Talk:Collaboration/Team/Processes

Debugging

MattF says:
The full command for IDE debugging (with Xdebug) of a command-line script is something like:

$ sudo apt-get install php5-xdebug
$ XDEBUG_CONFIG="remote_enable=1 idekey=emacs \
    remote_host=localhost remote_port=9001" php some-script.php

If running under Vagrant, replace localhost with the host's IP (usually 10.0.2.2). If using mwscript or sudo, pass --preserve-env, e.g.:

XDEBUG_CONFIG="remote_enable=1 idekey=emacs remote_host=10.0.2.2" sudo -u www-data --preserve-env mwscript extensions/Flow/maintenance/convertLqtPage.php --convertnotifications --srcpage Talk:Yet_Another_LQT Talk:Yet_Another_LQT_Output

I don't think the IDE key matters, and you can leave the default port (9000) if HHVM isn't already using it.
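
If for some reason XDEBUG_CONFIG isn't picked up, the same settings can be passed as ini overrides with php -d (a sketch reusing the values above; under Vagrant substitute the host IP as noted):

$ php -d xdebug.remote_enable=1 -d xdebug.remote_host=localhost \
      -d xdebug.remote_port=9001 -d xdebug.idekey=emacs some-script.php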

Scripts and hacks

Things we do unrelated to the code in git/gerrit.

Working with UUIDs

w:Base 36#Conversion has code to convert to/from Flow's alphadecimal representation of UUIDs in URLs, e.g. rs714d9sds75b0e7. Note that JavaScript numbers have insufficient precision for these values, e.g.

> parseInt( "rs714d9sds75b0e7", 36);
6.142141599954396e+24

There are online base36 converters, e.g. this one with ads.
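
If you need to convert outside MySQL and Flow, GMP keeps full precision; a minimal PHP sketch, assuming the gmp extension is installed:

// Convert Flow's alphadecimal (base 36) form to hex and back without losing
// precision -- 88 bits don't fit in a plain PHP integer or float.
$alpha = 'rs714d9sds75b0e7';
$hex = gmp_strval( gmp_init( $alpha, 36 ), 16 );
echo $hex, "\n";
echo gmp_strval( gmp_init( $hex, 16 ), 36 ), "\n"; // back to alphadecimal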

mysql

We store 88-bit UUIDs in binary(11) columns, so they appear as garbled strings such as:

 �#�����o~p

In mysql, use LOWER( HEX() ) and UNHEX(), e.g.:

mysql> SELECT LOWER(HEX(definition_id)), definition_name FROM flow_definition;
mysql> UPDATE flow_workflow
    ->   SET workflow_definition_id = UNHEX('0503AC33637ABBA2F8A9FA')
    ->   WHERE workflow_type = 'discussion';

There is no (?) MySQL function to convert to/from Flow's alphadecimal representation. A UUID object can be converted to hex with getHex(). For example ("Topic:Se5ulm02fl2zddtu" would be the page name in this example; only the first character differs in case):

$uuid = Flow\Model\UUID::create( strtolower( 'Se5ulm02fl2zddtu' ) );
var_export( $uuid->getHex() );

Output: '05313710f23bfce993e242'

(20:00) root@localhost:[wiki]> SELECT workflow_namespace, workflow_page_id, workflow_title_text, workflow_name FROM flow_workflow WHERE LOWER(HEX(workflow_id)) = '05313710f23bfce993e242';
+--------------------+------------------+---------------------+---------------+
| workflow_namespace | workflow_page_id | workflow_title_text | workflow_name |
+--------------------+------------------+---------------------+---------------+
|                  1 |               75 | Sandbox             |               |
+--------------------+------------------+---------------------+---------------+
1 row in set (0.00 sec)

Lowercase hex is also an allowed input format for UUID::create.
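
So a hex value pulled from MySQL can be turned back into the URL form in eval.php; a short sketch (assuming getAlphadecimal() is the method that produces the URL representation):

$uuid = Flow\Model\UUID::create( '05313710f23bfce993e242' );
var_export( $uuid->getAlphadecimal() ); // 'se5ulm02fl2zddtu', per the example above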

Determining database, gaining access

The configuration of Flow's cross-wiki external database can be opaque.

ssh into a MediaWiki server (e.g. deployment-bastion for the Beta cluster or wikitech:terbium for production), then:

 $ mwscript eval.php --wiki=enwiki
 var_dump( Flow\Container::get( 'db.factory' )->getDB( DB_SLAVE ) );

This will give you mServer, mDBname, mUser, and mPassword to connect from the mysql command line.

E.g., from a labs instance, to access the beta cluster:

 $ mysql -u mUser -p --host=mServer mDBname
 (enter mPassword)

Use DB_MASTER if you need to write to the database.
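
To avoid digging through the var_dump output, the same eval.php session can print just the server and database name (a sketch; getServer() and getDBname() are standard Database getters, but the user and password still come from the var_dump):

 $db = Flow\Container::get( 'db.factory' )->getDB( DB_SLAVE );
 echo $db->getServer(), "\t", $db->getDBname(), "\n";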

Clearing cache

Flow caches topic data, which can prevent changes and fixes from showing up.

If you're using memcached:

 sudo restart memcached

If you're using redis (the default in MediaWiki-Vagrant):

 echo flushdb | redis-cli

However, we can't do this in production. If code you introduce needs to clear the cache, your patch should instead bump $wgFlowCacheVersion. But note that in production the single Flow DB is accessed by wikis running different versions of the Flow code...
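
The bump itself is a one-line config change, something like the following (the exact file and next version number are assumptions; the current value shows up as the :4.5 suffix on the cache keys in the purge output below):

 $wgFlowCacheVersion = '4.6';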

Purging problem posts

If viewing a board or topic throws an exception and the exception log on fluorine contains "Did not load root post UUID>", you can run ErikB's purge script at tin.eqiad.wmnet:/home/ebernhardson/purge.php. For example, if Talk:BOARDNAME on mediawiki.org has problems:

tin:$ mwscript eval.php mediawikiwiki Talk:BOARDNAME
> require_once "/home/ebernhardson/purge.php"

Sample output

 Skipping mediawikiwiki:lag_times:10.64.16.18:lock
 Skipping mediawikiwiki:lag_times:db1038:lock
 Skipping mediawikiwiki:lag_times:10.64.32.18:lock
 Skipping mediawikiwiki:revisiontext:textid:1133331
 Skipping mediawikiwiki:lag_times:10.64.16.154:lock
 Purging keys: flowdb:flow_workflow:title:v2::BOARDNAME:discussion:mediawikiwiki:1:4.5, flowdb:flow_topic_list:list:rvnx8zltkabnji5j:4.5, flow:tree:subtree:rvnx8zm95io4xdbb:4.5, ...

Counting talk pages

mysql> SELECT COUNT(*)
    ->   FROM page WHERE page_namespace MOD 2 = 1;
mysql> SELECT page_namespace, COUNT(1)
    ->   FROM page WHERE page_namespace MOD 2 = 1 GROUP BY 1;

Updating labs instances

You need to configure ssh in order to log in to hosts on the labs network (.eqiad.wmflabs); see wikitech:Help:Access#Accessing instances with ProxyCommand ssh option (recommended).

ee-flow

spage has a cron job that updates Flow and Echo; see /etc/motd.

Don't run git as root. We have git configured for group sharing, so it shouldn't be necessary. If you get permission errors, run sudo chmod -R g+w . && sudo chgrp -R project-editor-engagement .
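
The group-sharing setup being referred to is, I believe, git's core.sharedRepository option, roughly:

$ git config core.sharedRepository group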

flow-tests

flow-tests runs wikitech:Labs-vagrant, so you can just run labs-vagrant git-update!

Manually

To update to the latest master, run something like:

$ ssh flow-tests.eqiad.wmflabs
$ cd /vagrant/mediawiki/extensions/Flow/ && sudo su vagrant
$ make master

To update Flow to a particular gerrit commit:

$ ssh flow-tests.eqiad.wmflabs
$ cd /vagrant/mediawiki/extensions/Flow && sudo su vagrant
$ # Paste the "Download Anonymous HTTP" command line from the gerrit patch
$ git fetch https://gerrit.wikimedia.org/r/mediawiki/extensions/Flow refs/changes/11/166711/4 && git checkout FETCH_HEAD

Beta cluster

Job runner health:

ssh deployment-jobrunner01
less /var/log/mediawiki/jobrunner.log
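
To also check how deep the job queues are (assuming mwscript is available from wherever you're logged in; showJobs.php is a standard maintenance script):

$ mwscript showJobs.php --wiki=enwiki --group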